
Updated 5 months ago

Router

At a glance

The community member asks how to use the RouterQueryEngine with chat history: can they simply include all the chat messages in the query, or is something else needed? The replies suggest that stuffing the chat history into the query will probably work, and that they could also call the LLM directly and use function calling. Another reply points to an example of building a Router Query Engine from scratch using workflows, which may be easier to modify to include chat history. The community member says that for now they just want to get the RouterQueryEngine working with chat history, and will try stuffing the chat history into the query and report back.

Useful resources
@Logan M - I am trying to understand how I can use RouterQueryEngine with chat history. If I just stuff the query with all the chat messages, will it work, or do I need to do something else? How can I achieve this?
4 comments
Yeah, that'll probably work fine. You could also just call the LLM directly and use function calling too

https://docs.llamaindex.ai/en/stable/examples/llm/openai/#structured-prediction
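A minimal sketch of the "stuff the chat history into the query" approach suggested above. The helper name `condense_history_into_query` and the `role: content` transcript format are illustrative assumptions, not LlamaIndex APIs; the resulting string is what you would pass to `RouterQueryEngine.query()`.

```python
# Toy sketch: flatten prior chat messages into a single query string that
# can then be passed to RouterQueryEngine.query(). The helper and the
# "role: content" transcript format are assumptions for illustration.

def condense_history_into_query(chat_history, latest_query):
    """Prepend the chat transcript to the new question so the router's
    selector LLM sees the full conversational context."""
    transcript = "\n".join(
        f"{role}: {content}" for role, content in chat_history
    )
    return (
        "Given the conversation so far:\n"
        f"{transcript}\n\n"
        f"Answer the latest question: {latest_query}"
    )

history = [
    ("user", "What products do you sell?"),
    ("assistant", "We sell laptops and phones."),
]
stuffed = condense_history_into_query(history, "Which one is cheapest?")
# `stuffed` now holds the transcript plus the new question, ready to be
# sent as router_query_engine.query(stuffed).
```

One caveat with this approach: long histories inflate the prompt the selector sees, so in practice you may want to truncate or summarize older turns before stuffing.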
We have an example of a Router Query Engine built from scratch using workflows; you might find it easier to modify that to include a chat history: https://docs.llamaindex.ai/en/stable/examples/workflow/router_query_engine/
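To show what the routing step in the linked workflow example is doing, here is a toy stand-in: in the real example an LLM (via function calling) picks which query-engine tool handles the query, while below a simple keyword scorer makes the choice so the control flow is visible. All names here are illustrative, not LlamaIndex APIs.

```python
# Toy illustration of the router's selection step. In the linked workflow
# example an LLM picks the query engine via function calling; here a
# keyword-overlap scorer stands in for the LLM so the routing logic is
# visible. All names are illustrative, not LlamaIndex APIs.

def select_tool(query, tools):
    """Pick the tool whose description shares the most words with the query."""
    words = set(query.lower().split())

    def score(tool):
        return len(words & set(tool["description"].lower().split()))

    return max(tools, key=score)

tools = [
    {"name": "sales_docs", "description": "questions about pricing and sales"},
    {"name": "tech_docs", "description": "questions about setup and configuration"},
]
chosen = select_tool("How do I handle setup and configuration", tools)
# With chat history, you would score the stuffed query (transcript plus
# new question) instead, so earlier turns can influence the selection.
```

This is also why stuffing the history into the query affects routing: the selector only sees the query string, so any context it should consider has to be in that string.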
In the long run, I definitely want to use agentic approaches like workflows, but right now I just want to get the RouterQueryEngine working with chat history
I will try stuffing the chat history into the RouterQueryEngine query and report back