PierreAumont
Joined September 25, 2024
Hello everyone. A newbie question I couldn't find an answer to in the docs, though it should be simple. I use LlamaIndex with the OpenAI API as the engine for a chatbot over my personal data. But when a request is poorly formulated, the LLM returns an answer drawn from its own training data rather than from the data indexed when my chatbot starts up. My question: is there a way to force the LLM to look for answers only in the data I provide?
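One common approach (a sketch, not an official recommendation): override the question-answering prompt so the model is explicitly told to answer only from the retrieved context and to refuse otherwise. LlamaIndex's `as_query_engine` accepts a `text_qa_template` parameter for this. The prompt wording below is my own assumption, and the LlamaIndex wiring is left commented out so the sketch runs without an API key:

```python
# A restrictive QA prompt: the model may only use the retrieved context.
# The exact wording is an assumption; adjust to your needs.
STRICT_QA_PROMPT = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using ONLY the context above. "
    "If the answer is not in the context, reply exactly: "
    "\"I don't know based on the provided documents.\"\n"
    "Query: {query_str}\n"
    "Answer: "
)

def build_prompt(context_str: str, query_str: str) -> str:
    """Fill the template the same way the query engine would."""
    return STRICT_QA_PROMPT.format(context_str=context_str, query_str=query_str)

# Wiring into LlamaIndex (needs an OpenAI key, so commented out here):
# from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
# from llama_index.core.prompts import PromptTemplate
#
# documents = SimpleDirectoryReader("data").load_data()
# index = VectorStoreIndex.from_documents(documents)
# query_engine = index.as_query_engine(
#     text_qa_template=PromptTemplate(STRICT_QA_PROMPT),
# )
# print(query_engine.query("What does my document say about X?"))

print(build_prompt("Paris is the capital of France.",
                   "What is the capital of France?"))
```

Note that this only instructs the model; it does not guarantee grounding. Pairing the strict prompt with a similarity cutoff on retrieval (so that off-topic queries return no context at all) tightens it further.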