
Updated 5 months ago

At a glance
Hey folks.. I have a question about the expected behavior of LlamaIndex when the answer to a query is NOT part of the provided context. I have a file with a bunch of product information (pricing, descriptions, etc.). I use LlamaIndex + OpenAI to ask questions about it and get back very accurate results.

Sometimes, I want to ask questions whose answers are not in the provided files/data. OpenAI certainly has the information, but I don't get these answers. Instead, I get back a response saying that the information is NOT part of the context provided.

Is there a way to get around this? Right now, I don't do any prompt engineering. I simply send the user-provided question as-is to the model.
7 comments
The default prompts tell it to only use the context provided
So you'd have to modify those with other instructions
That would be expected default behaviour though
Is there a way to allow responses outside of the context?
Yes, by modifying the prompts
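For reference, a minimal sketch of what that prompt modification could look like. The `{context_str}`/`{query_str}` placeholders are the names LlamaIndex's QA templates use; the commented usage at the bottom assumes the `llama_index.core` import layout and an `index` built elsewhere from your product files, so adjust for your version:

```python
# Custom QA prompt string. Unlike the default prompt, it explicitly allows
# the model to fall back on its own knowledge when the retrieved context
# does not contain the answer.
QA_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using the context above when it is relevant.\n"
    "If the context does not contain the answer, you may answer from\n"
    "your own general knowledge instead of refusing.\n"
    "Query: {query_str}\n"
    "Answer: "
)

# Hypothetical usage (assumes llama_index is installed and `index` exists):
# from llama_index.core import PromptTemplate
# query_engine = index.as_query_engine(
#     text_qa_template=PromptTemplate(QA_TMPL),
# )
# response = query_engine.query("A question not covered by the files")
```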
Awesome.. thanks for sharing this. I will try these out.