At a glance

The community member has a private dataset in a vectorsearch database and can query it fine using a chat engine in "context" mode, but gets hallucinations when the chat engine is switched to "best" or "ReAct" mode. Another community member suggests forcing the chat engine to use the query engine while looking up the answer, which may prevent the hallucinations. However, when the original poster forces the OpenAI agent to use the tool, the tool returns empty results; they only get results when switching back to the local chat engine in context mode.

I have a private dataset in a vectorsearch database which I can query fine using a chat engine in "context" mode, but I get hallucinations when the chat engine is switched to "best" or "ReAct" mode. Any advice on how to address this?
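
Roughly, the setup being described looks like the sketch below. This is illustrative only: the real data lives in a vectorsearch database, so the directory loader, in-memory index, path, and question here are all placeholders.

```python
# Minimal sketch of the setup in the question (placeholders throughout).
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./private_data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)

# "context" mode retrieves from the index on every turn -- the mode that works fine.
context_chat = index.as_chat_engine(chat_mode="context")
print(context_chat.chat("What does the dataset say about topic X?"))

# "best" / "react" modes put the index behind an agent as a tool; the LLM decides
# whether to call it, which is where the hallucinations show up.
agent_chat = index.as_chat_engine(chat_mode="react")  # or chat_mode="best"
print(agent_chat.chat("What does the dataset say about topic X?"))
```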
2 comments
You can try forcing the chat engine to use the query engine while looking up the answer. That way it won't hallucinate (rough sketch below): https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_openai/#force-chat-engine-to-query-the-index


With open-source models, ReAct mode might not work well if the LLM is not very good at following instructions about whether or not to use the query engine.
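
The linked docs page boils down to something like the following sketch. It assumes an OpenAI function-calling model behind the chat engine; "query_engine_tool" is the default name LlamaIndex gives the query engine tool it wraps around the index, and the path and question are placeholders.

```python
# Sketch of the "force the index lookup" approach from the linked docs page.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./private_data").load_data()  # placeholder path
)
chat_engine = index.as_chat_engine(chat_mode="openai")

# tool_choice forces the agent to call the query engine tool instead of
# answering from the model's own knowledge.
response = chat_engine.chat(
    "What does the dataset say about topic X?",
    tool_choice="query_engine_tool",
)
print(response)
```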
Thank you. I can force the OpenAI agent to use the tool, but the tool returns empty. If I switch back to the local chat engine in context mode, I get some results.
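
For reference, the tool the agent is being forced to call is just a query engine over the same index, so it can also be exercised directly to see whether retrieval returns any nodes at all. A sketch (placeholder path and question, in-memory index standing in for the vectorsearch database):

```python
# Call the underlying query engine directly, bypassing the agent.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./private_data").load_data()  # placeholder path
)
query_engine = index.as_query_engine(similarity_top_k=3)

response = query_engine.query("What does the dataset say about topic X?")
print(response)               # the answer text
print(response.source_nodes)  # empty here would point at retrieval, not the agent
```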