Hi, do you know how to put a router query engine in a chat engine?
RAPHCVR · last year
Hi, do you know how to put a router query engine in a chat engine? Or how to achieve the same result?
10 comments
Logan M · last year
The only alternative is to use an agent, I think. Otherwise you need to switch to using a router retriever with the context or condense_plus_context chat mode
https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html
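For readers landing here later, a minimal sketch of that agent approach, adapted to wrap a RouterQueryEngine as a single tool. The index names, data paths, and tool descriptions below are placeholders, and the import paths assume llama_index >= 0.10 (older releases use llama_index.* instead of llama_index.core.*):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

# Hypothetical indexes -- build them however fits your data.
docs_index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./docs").load_data()
)
api_index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./api").load_data()
)

# The router picks one query engine per question, based on the descriptions.
router_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[
        QueryEngineTool.from_defaults(
            query_engine=docs_index.as_query_engine(),
            description="Answers questions about the product documentation.",
        ),
        QueryEngineTool.from_defaults(
            query_engine=api_index.as_query_engine(),
            description="Answers questions about the API reference.",
        ),
    ],
)

# Wrap the router itself as a single tool and hand it to the agent,
# so every chat turn can route through it.
agent = OpenAIAgent.from_tools(
    [
        QueryEngineTool.from_defaults(
            query_engine=router_engine,
            name="knowledge_base",
            description="Useful for any question about the indexed documents.",
        )
    ],
    verbose=True,
)

print(agent.chat("How do I authenticate against the API?"))
```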
RAPHCVR · last year
yes, I just tried an agent, but it does not work very well
RAPHCVR · last year
I also tried changing the condense prompt; doing this provided the best results
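For context, customizing the condense prompt on a condense-style chat engine looks roughly like the sketch below. CondenseQuestionChatEngine accepts any query engine, including a RouterQueryEngine, so it is one direct way to get routing inside a chat engine. The prompt wording here is illustrative, not the one used in the thread, and router_engine is the engine from the earlier sketch:

```python
from llama_index.core import PromptTemplate
from llama_index.core.chat_engine import CondenseQuestionChatEngine

# Illustrative condense prompt: push the rewriter to keep names and
# technical terms so the standalone question routes to the right index.
custom_condense_prompt = PromptTemplate(
    "Given the conversation below and a follow-up message, rewrite the "
    "follow-up as a single standalone question that preserves all names "
    "and technical terms, so it can be routed to the right index.\n"
    "<Chat History>\n{chat_history}\n"
    "<Follow Up Message>\n{question}\n"
    "<Standalone question>\n"
)

chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=router_engine,  # the RouterQueryEngine from the earlier sketch
    condense_question_prompt=custom_condense_prompt,
    verbose=True,
)

print(chat_engine.chat("And how does that compare to the REST endpoints?"))
```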
Logan M · last year
The agent takes a lot of tuning -- system prompts, tool descriptions, very tedious haha
RAPHCVR · last year
yeah, I followed your guide on the LlamaIndex docs website, using both the OpenAI agent and the ReAct agent
RAPHCVR · last year
the OpenAI agent simply didn't work at all
RAPHCVR · last year
like it was showing the current one, but then nothing else, and the answer was not using any context from the index
RAPHCVR · last year
the ReAct agent gave some good answers, but when asking more questions in a chat it does not send the right query to the router query engine
RAPHCVR · last year
finally, the router retriever didn't work very well either
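For completeness, the router-retriever route Logan suggested, feeding a condense_plus_context chat engine, looks roughly like this sketch. The retriever descriptions and the docs_index/api_index names are placeholders carried over from the earlier sketch:

```python
from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.retrievers import RouterRetriever
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import RetrieverTool

# The router picks one retriever per condensed question.
router_retriever = RouterRetriever(
    selector=LLMSingleSelector.from_defaults(),
    retriever_tools=[
        RetrieverTool.from_defaults(
            retriever=docs_index.as_retriever(),
            description="Retrieves passages from the product documentation.",
        ),
        RetrieverTool.from_defaults(
            retriever=api_index.as_retriever(),
            description="Retrieves passages from the API reference.",
        ),
    ],
)

# condense_plus_context: condense the chat turn into a standalone question,
# retrieve with it, then answer with the retrieved context inlined.
chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=router_retriever,
)

print(chat_engine.chat("What auth schemes does the API support?"))
```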
Logan M · last year
yea agents decide whether to use a tool or not based on the tool descriptions -- so the tool descriptions often need a lot of tweaking
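A sketch of the kind of tuning being described here: a pointed system prompt plus a verbose, unambiguous tool description. All of the wording below is illustrative and would need iteration against real data; agent and router_engine come from the earlier sketch:

```python
from llama_index.agent.openai import OpenAIAgent
from llama_index.core.tools import QueryEngineTool

agent = OpenAIAgent.from_tools(
    [
        QueryEngineTool.from_defaults(
            query_engine=router_engine,  # the RouterQueryEngine from above
            name="knowledge_base",
            # Spell out exactly when to call the tool and what input it
            # expects -- vague descriptions are what make agents skip tools.
            description=(
                "ALWAYS use this tool to answer any question about the "
                "indexed documents. Input must be a fully self-contained "
                "question; never pass pronouns like 'it' or 'that'."
            ),
        )
    ],
    # The system prompt reinforces the tool-use policy.
    system_prompt=(
        "You are a support assistant. For every user question, call the "
        "knowledge_base tool first and base your answer only on its output."
    ),
    verbose=True,
)
```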