
Hey, I'm using the RagDatasetGenerator to generate a set of questions, but I'm using a local LLM (llama3:70b-instruct) served with Ollama, and I get these as my questions (see attachment). Is there a way around it?
Attachment: image_49.png
Oof, llama3 is not following the prompt instructions. It's supposed to write one question per line.
You probably need to prompt engineer a bit:
from llama_index.core.prompts import PromptTemplate

text_question_template = PromptTemplate("""\
Context information is below.
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge,
generate only questions based on the below query.
Number the questions and put exactly one question per line.
{query_str}
""")

RagDatasetGenerator.from_documents(..., text_question_template=text_question_template)
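If the model still ignores the formatting instructions, another option is to post-process its raw output yourself before building the dataset. This is just a sketch, not part of llama_index; `split_questions` is a hypothetical helper that pulls individual questions out of a messy completion:

```python
import re

def split_questions(raw: str) -> list[str]:
    """Split an LLM's raw output into individual questions.

    Handles numbered lists ("1. ...", "2) ..."), leading bullets,
    and models that cram several numbered questions onto one line.
    """
    questions = []
    for line in raw.splitlines():
        # Split a line like "1. Foo? 2. Bar?" at each numbered marker.
        # The marker must sit at the start of the line or after whitespace,
        # so version strings like "llama3.1" are not split.
        parts = re.split(r"(?:^|\s)\d+[.)]\s+", line)
        for part in parts:
            q = part.strip().lstrip("-*• ").strip()
            if q.endswith("?"):  # keep only things that look like questions
                questions.append(q)
    return questions
```

This trades strict prompt compliance for a tolerant parser, which tends to be more robust with smaller or less instruction-tuned local models.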