Prevent Hallucinations
Nenuko
2 years ago
Hello, I have a question: how can I get the LLM to prioritize the LlamaIndex context, so I can prevent hallucinations?
Logan M
2 years ago
This is mainly a prompt engineering thing. You'll want to modify the text_qa_template and the refine_template in the query call:
https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_prompts.html
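A minimal sketch of what this can look like. Note the linked docs describe the older gpt-index API, where the templates were passed directly to index.query(...); the sketch below assumes a more recent LlamaIndex release (llama_index.core imports, as_query_engine), so the module paths and argument names may differ in your version. The ./data directory and the question string are placeholders.

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex, PromptTemplate

# Custom QA prompt: tells the LLM to answer only from the retrieved context.
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using ONLY the context above and no prior knowledge, answer the question. "
    "If the answer is not in the context, say you don't know.\n"
    "Question: {query_str}\n"
    "Answer: "
)

# Custom refine prompt: used when the answer is refined over additional chunks.
refine_prompt = PromptTemplate(
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Here is more context:\n"
    "---------------------\n"
    "{context_msg}\n"
    "---------------------\n"
    "Refine the existing answer using ONLY this new context. "
    "If the context is not useful, keep the existing answer.\n"
)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The templates override the defaults used by the response synthesizer.
query_engine = index.as_query_engine(
    text_qa_template=qa_prompt,
    refine_template=refine_prompt,
)

response = query_engine.query("What does the document say about X?")
print(response)
```

The key idea is that both templates explicitly restrict the model to the retrieved context, which is the prompt-engineering lever Logan is pointing at for reducing hallucinations.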
Nenuko
2 years ago
Okay, I will check it out and try it, thanks.