Hi, do you know how to put a RouterQueryEngine in a chat engine? Or how to achieve the same result?
The only alternative is to use an agent, I think. Otherwise you need to switch to a router retriever with the context or condense_plus_context chat modes.
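To illustrate the routing idea itself, independent of LlamaIndex: a router picks one query engine per message based on how well the query matches each engine's description, while the chat engine keeps the history. All names below are hypothetical stand-ins, not the LlamaIndex API:

```python
# Toy sketch of routing inside a chat loop. Everything here is
# hypothetical and only illustrates the pattern; it is NOT the
# LlamaIndex API (which would use a selector LLM, not keywords).

class ToyQueryEngine:
    def __init__(self, name, keywords, answer):
        self.name = name
        self.keywords = keywords  # crude stand-in for a tool description
        self.answer = answer

    def query(self, text):
        return f"[{self.name}] {self.answer}"

class ToyRouter:
    """Pick the engine whose keywords overlap the query the most."""
    def __init__(self, engines):
        self.engines = engines

    def route(self, text):
        words = set(text.lower().split())
        return max(self.engines, key=lambda e: len(words & set(e.keywords)))

class ToyChatEngine:
    """Keep chat history and send each user turn through the router."""
    def __init__(self, router):
        self.router = router
        self.history = []

    def chat(self, text):
        engine = self.router.route(text)
        reply = engine.query(text)
        self.history.append((text, reply))
        return reply

sales = ToyQueryEngine("sales", ["revenue", "sales"], "Q3 revenue was flat.")
docs = ToyQueryEngine("docs", ["install", "setup"], "Run pip install.")
chat = ToyChatEngine(ToyRouter([sales, docs]))
print(chat.chat("how do I install this?"))  # routed to the docs engine
```

In the real library the selection step is done by an LLM reading each engine's description, which is exactly why the tool/engine descriptions matter so much.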


https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html
yes, I just tried an agent, but it doesn't work very well
I also tried changing the condense prompt; doing that gave the best results
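For context, the condense step mentioned here just rewrites a follow-up message into a standalone question (using chat history) before retrieval. A stripped-down illustration, with a made-up template rather than the real LlamaIndex condense prompt:

```python
# Minimal illustration of the "condense question" step: fold chat
# history plus the new message into one standalone query string.
# The template is a hypothetical stand-in for the real condense prompt.

CONDENSE_TEMPLATE = (
    "Given the conversation:\n{history}\n"
    "Rewrite the follow-up as a standalone question: {question}"
)

def condense(history, question):
    """Build the prompt the condense LLM would see; here we just return it."""
    lines = "\n".join(f"{who}: {msg}" for who, msg in history)
    return CONDENSE_TEMPLATE.format(history=lines, question=question)

history = [("user", "What is a RouterQueryEngine?"),
           ("assistant", "It routes queries to sub-engines.")]
prompt = condense(history, "How do I use it in a chat engine?")
```

Tuning this template changes what query actually reaches the retriever or router, which is why editing it can improve results noticeably.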
The agent takes a lot of tuning -- system prompts, tool descriptions, very tedious haha
yeah, I've followed your doc on the LlamaIndex docs website, using both the OpenAI agent and the ReAct agent
the OpenAI agent simply didn't work at all
like it showed the first step, but then nothing else, and the answer didn't use any context from the index
the ReAct agent gave some good answers, but when asking follow-up questions in a chat it doesn't send the right query to the router query engine
finally, the router retriever didn't work very well either
yea agents decide whether to use a tool or not based on the tool descriptions -- so the tool descriptions often need a lot of tweaking
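A toy picture of why descriptions matter (hypothetical names, not an actual agent loop): the "agent" scores each tool description against the query and only calls a tool when the score clears a threshold, so vague descriptions make it skip tools entirely:

```python
# Hypothetical sketch: tool choice driven purely by description
# overlap, to show why weak descriptions make agents skip tools.
# A real agent uses an LLM for this decision, not word counting.

def pick_tool(query, tools, threshold=1):
    """Return the best-matching tool name, or None (answer directly)."""
    words = set(query.lower().split())
    best, score = None, 0
    for name, description in tools.items():
        s = len(words & set(description.lower().split()))
        if s > score:
            best, score = name, s
    return best if score >= threshold else None

tools = {
    "docs_engine": "answers questions about installation and setup docs",
    "sales_engine": "answers questions about revenue and sales figures",
}
pick_tool("what were the sales figures last quarter?", tools)  # -> "sales_engine"
```

The same failure mode shows up with the LLM-based version: if two descriptions are vague or overlapping, the agent routes wrongly or answers without using any tool, which matches the behavior described above.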