Hallucinations and no sources

Heya,
I'm currently developing an AI assistant for a company. It works as of right now, but I have a few issues:

  • It hallucinates American answers even though we need German ones. For example, the German "Pflanzenschutzmittel" and the American "Pesticides" are different things.
  • It hallucinates follow-up questions from the user that were never asked. E.g. when I ask a question, it answers it, but then immediately assumes I asked a follow-up (which I never did), mentions it in the same response, and answers that too.
  • I'm trying to display its sources, but response.source_nodes doesn't contain anything.
Is there anything I can try to fix these issues?
Thanks in advance
For issue no. 2 I found this article:
http://antirez.com/news/142
Maybe that's a start.
Are you using an agent? If response.source_nodes is empty, that's probably because it didn't use any query engine tools for that response, I think.
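Since the agent can legitimately answer without calling a tool, the source display needs to handle an empty `response.source_nodes`. A minimal sketch, assuming the attribute name from the thread; the `"file_name"` metadata key is an assumption, adjust it to whatever your nodes actually carry:

```python
# Hedged sketch: render sources only when the agent actually called a
# query engine. response.source_nodes is taken from the thread; the
# "file_name" metadata key is an assumption for illustration.

def format_sources(response) -> str:
    nodes = getattr(response, "source_nodes", None) or []
    if not nodes:
        return "No sources (the agent answered without querying the index)."
    return "\n".join(
        f"- {node.metadata.get('file_name', 'unknown source')}" for node in nodes
    )
```

That way the UI degrades gracefully instead of crashing or showing an empty source list when the agent responds directly.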
For issue 2, that sounds quite common for open-source LLMs πŸ€”
Yes, I use a normal chat agent
Let's see if that also gets fixed by switching to Llama 2
or at least happens less
Just so the threads are open again
@Logan M do you think there is a fix?
Or rather, what happened?
The default chat engine is an agent. It likely responded without choosing to use the query engine
But I asked questions that were specifically answered in a PDF
Right, but maybe it would help to explain how agents work.

Agents work by looking at the chat history and a list of tool names/descriptions, and deciding whether to invoke a tool or respond directly.

The default index.as_chat_engine() gives the index a generic name/description, which might not be specific enough for the LLM to know when it actually needs to use the tool/query engine.