
Router

@Logan M - I am trying to understand how I can use RouterQueryEngine with chat history. If I just stuff the query with all the chat messages, will it work, or do I need to do something else? How can I achieve this?
Yeah, that'll probably work fine. You could also just call the LLM directly and use function calling, too:

https://docs.llamaindex.ai/en/stable/examples/llm/openai/#structured-prediction
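For reference, that function-calling route might look roughly like the sketch below. It assumes two pre-built query engines (the `vector_query_engine` and `summary_query_engine` names are placeholders) and uses `structured_predict`, which drives function calling under the hood, to pick an engine with the chat history included in the prompt.

```python
# Sketch only: vector_query_engine / summary_query_engine are assumed to
# exist already; the routing schema and prompt are illustrative.
from pydantic import BaseModel, Field

from llama_index.core.llms import ChatMessage
from llama_index.core.prompts import ChatPromptTemplate
from llama_index.llms.openai import OpenAI


class RouteChoice(BaseModel):
    """Structured output: which engine should answer the latest message."""
    engine: str = Field(description="Either 'vector' or 'summary'")


llm = OpenAI(model="gpt-4o-mini")

chat_history = [
    ChatMessage(role="user", content="What does the 2023 report cover?"),
    ChatMessage(role="assistant", content="Revenue, hiring, and product plans."),
]
question = "Give me a summary of the hiring section."

# The prior turns go straight into the prompt, followed by the new question,
# so the routing decision is made with the full conversation in view.
route_prompt = ChatPromptTemplate(
    message_templates=[
        *chat_history,
        ChatMessage(
            role="user",
            content="{question}\n\nPick the engine best suited to answer this.",
        ),
    ]
)
choice = llm.structured_predict(RouteChoice, route_prompt, question=question)

engines = {"vector": vector_query_engine, "summary": summary_query_engine}
response = engines[choice.engine].query(question)
print(response)
```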
We have an example of a Router Query Engine built from scratch using workflows; you might find it easier to modify that to include chat history: https://docs.llamaindex.ai/en/stable/examples/workflow/router_query_engine/
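A heavily stripped-down sketch of that idea (not the docs' implementation): a single-step workflow that keeps its own chat history, asks the LLM to pick an engine by name, and then queries it with the prior turns prepended. The `query_engines` dict and the selection prompt are made up for illustration.

```python
# Illustrative only: a single-step workflow that routes with chat history.
# `query_engines` maps a name to (engine, description); the engines are
# assumed to be built elsewhere.
from llama_index.core.llms import ChatMessage
from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step


class RouterWithHistory(Workflow):
    def __init__(self, llm, query_engines: dict, **kwargs):
        super().__init__(**kwargs)
        self.llm = llm
        self.query_engines = query_engines
        self.history: list[ChatMessage] = []

    @step
    async def route_and_answer(self, ev: StartEvent) -> StopEvent:
        question = ev.query
        history_str = "\n".join(f"{m.role.value}: {m.content}" for m in self.history)
        choices = "\n".join(
            f"- {name}: {desc}" for name, (_, desc) in self.query_engines.items()
        )
        # Let the LLM pick an engine, with the prior turns in the prompt.
        pick = await self.llm.acomplete(
            f"Chat history:\n{history_str}\n\nQuestion: {question}\n\n"
            f"Choose exactly one engine by name:\n{choices}\nAnswer with the name only."
        )
        engine, _ = self.query_engines.get(
            pick.text.strip(), next(iter(self.query_engines.values()))
        )
        response = await engine.aquery(f"{history_str}\nuser: {question}")
        # Remember this turn for the next run on the same instance.
        self.history.append(ChatMessage(role="user", content=question))
        self.history.append(ChatMessage(role="assistant", content=str(response)))
        return StopEvent(result=str(response))


# Usage (history persists across runs of the same instance):
# wf = RouterWithHistory(
#     llm=your_llm,
#     query_engines={"vector": (vector_engine, "fact lookup"),
#                    "summary": (summary_engine, "summaries")},
#     timeout=60,
# )
# answer = await wf.run(query="Summarize the hiring section.")
```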
In the long run, I definitely want to use agentic approaches like workflows, but right now I just want to get the RouterQueryEngine working with chat history.
I will try stuffing the chat history into the RouterQueryEngine query and report back.
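A minimal sketch of that stuffing approach, assuming the router's QueryEngineTools are already built (`vector_tool` and `summary_tool` below are placeholders): the prior turns are simply flattened into the query string, so both the selector and the chosen engine see them.

```python
# Sketch only: vector_tool / summary_tool stand in for whatever
# QueryEngineTools you have already built.
from llama_index.core.llms import ChatMessage
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.selectors import LLMSingleSelector

router = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[vector_tool, summary_tool],
)

chat_history = [
    ChatMessage(role="user", content="What does the 2023 report cover?"),
    ChatMessage(role="assistant", content="Revenue, hiring, and product plans."),
]

def query_with_history(question: str):
    # Flatten prior turns into the query text; the selector routes on the
    # whole string, and the chosen engine answers with the context included.
    history_str = "\n".join(f"{m.role.value}: {m.content}" for m in chat_history)
    response = router.query(
        f"Chat history:\n{history_str}\n\nFollow-up question: {question}"
    )
    chat_history.append(ChatMessage(role="user", content=question))
    chat_history.append(ChatMessage(role="assistant", content=str(response)))
    return response

print(query_with_history("Summarize the hiring section."))
```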