
Hey, I'm using the RagDatasetGenerator to generate a set of questions

At a glance

A community member is using RagDatasetGenerator to generate a set of questions, but with a local LLM (llama3:70b-instruct) served via Ollama the generated questions do not follow the expected one-question-per-line format. Other community members suggest that some prompt engineering is needed, and one provides a sample text_question_template to pass to RagDatasetGenerator.from_documents().

Hey, I'm using the RagDatasetGenerator to generate a set of questions, but I'm using a local LLM (llama3:70b-instruct) served via Ollama, and I get these as my questions (see attachment). Is there a way around it?
[Attachment: image_49.png, a screenshot of the malformed generated questions]
3 comments
Oof, llama3 is not following the prompt instructions. It's supposed to write one question per line.
You'll probably need to prompt engineer a bit, for example:
from llama_index.core.prompts import PromptTemplate

text_question_template = """\
Context information is below.
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge,
generate only questions based on the below query. Questions should be numbered, one per line.
{query_str}
"""

# from_documents expects a PromptTemplate object rather than a raw string
RagDatasetGenerator.from_documents(..., text_question_template=PromptTemplate(text_question_template))
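
For anyone landing here later, below is a minimal end-to-end sketch of how this could be wired up with an Ollama-served model. It assumes the llama-index-core and llama-index-llms-ollama packages are installed; the "./data" path, request_timeout, and num_questions_per_chunk values are illustrative placeholders, not from the thread.

from llama_index.core import SimpleDirectoryReader
from llama_index.core.llama_dataset.generator import RagDatasetGenerator
from llama_index.core.prompts import PromptTemplate
from llama_index.llms.ollama import Ollama

# Template from the suggestion above: numbered questions, one per line.
text_question_template = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge,\n"
    "generate only questions based on the below query. "
    "Questions should be numbered, one per line.\n"
    "{query_str}\n"
)

# Local model served by Ollama (model name taken from the thread).
llm = Ollama(model="llama3:70b-instruct", request_timeout=120.0)

# "./data" is a placeholder for wherever the source documents live.
documents = SimpleDirectoryReader("./data").load_data()

dataset_generator = RagDatasetGenerator.from_documents(
    documents,
    llm=llm,
    num_questions_per_chunk=2,  # placeholder value
    text_question_template=text_question_template,
)

# Generate the questions only (no reference answers) and inspect one.
rag_dataset = dataset_generator.generate_questions_from_nodes()
print(rag_dataset.examples[0].query)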