hey y'all, sorry if this is a simple question, is there a way to get a RouterQueryEngine behind a chat_engine? I've tried going the ObjectIndex route, but no luck...
Or is this a perfect LangChain -> LlamaIndex opportunity?
You should be able to create the router query engine, wrap it in a query engine tool, and give it to an agent πŸ‘€
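Something along these lines (a rough sketch only; docs_index/api_index and the tool descriptions are placeholders for whatever indexes you already have, and import paths can vary between llama_index versions):

```python
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import QueryEngineTool
from llama_index.core.agent import ReActAgent

# docs_index / api_index are assumed to be indexes you already built
docs_tool = QueryEngineTool.from_defaults(
    query_engine=docs_index.as_query_engine(),
    name="docs",
    description="Answers questions about the product docs",
)
api_tool = QueryEngineTool.from_defaults(
    query_engine=api_index.as_query_engine(),
    name="api",
    description="Answers questions about the API reference",
)

# the router picks the right query engine for each question
router_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[docs_tool, api_tool],
)

# wrap the router itself as a tool and hand it to an agent
router_tool = QueryEngineTool.from_defaults(
    query_engine=router_engine,
    name="knowledge_base",
    description="Answers questions about the docs and API",
)
agent = ReActAgent.from_tools([router_tool], verbose=True)
```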
as soon as I wrap it in a query engine tool and pass it into a ContextChatEngine as a retriever, the answers that come back are about how I should use the tools to get the answer I want, not the answer itself :/
Ohhh it won't work in context chat engine. That one uses a retriever only, not a full query engine.

You would need to use a router retriever for that one
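For the ContextChatEngine route, roughly something like this (a sketch assuming the same placeholder indexes as above; exact imports depend on your version):

```python
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.retrievers import RouterRetriever
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import RetrieverTool

# one retriever tool per index (placeholder indexes/descriptions)
docs_retriever = RetrieverTool.from_defaults(
    retriever=docs_index.as_retriever(),
    description="Retrieves context from the product docs",
)
api_retriever = RetrieverTool.from_defaults(
    retriever=api_index.as_retriever(),
    description="Retrieves context from the API reference",
)

# the router picks a retriever per message, so the chat engine
# still only ever sees a retriever interface
router_retriever = RouterRetriever(
    selector=LLMSingleSelector.from_defaults(),
    retriever_tools=[docs_retriever, api_retriever],
)

chat_engine = ContextChatEngine.from_defaults(
    retriever=router_retriever,
    memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
)
response = chat_engine.chat("What does the API reference say about auth?")
```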
An agent would work fine with a query engine though
what I'm looking for is the memory bit that the chat engine offers
such that I can put a chat UI on top of it
thank you Logan!
An agent also has memory πŸ™‚
Which tbh with an agent, you might not even need the router (just a query engine tool for each underlying query engine)
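i.e. roughly this (same placeholder indexes as above; just a sketch to show the memory and per-engine tools, not exact code):

```python
from llama_index.core.agent import ReActAgent
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.tools import QueryEngineTool

# skip the router entirely: the agent decides which tool to call
tools = [
    QueryEngineTool.from_defaults(
        query_engine=docs_index.as_query_engine(),
        name="docs",
        description="Answers questions about the product docs",
    ),
    QueryEngineTool.from_defaults(
        query_engine=api_index.as_query_engine(),
        name="api",
        description="Answers questions about the API reference",
    ),
]

# the agent keeps chat history, so a chat UI can sit on top of it
agent = ReActAgent.from_tools(
    tools,
    memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
    verbose=True,
)
response = agent.chat("How do I authenticate against the API?")
```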
just looking at the classes for agents, am I limited to OpenAI with agents?
Nope, you can use a ReAct agent for everything else non-OpenAI
You got it.

But your mileage may vary. OpenAI is quite good compared to open source for agent-like applications
from what I was reading, the Zephyr models are my best bet?
Nailed it again, was just going to suggest that πŸ‘
I think we even have a notebook for setting it up
Agents at the bottom, LLM config at the top
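The LLM config part looks roughly like this (a sketch assuming the HuggingFace LLM integration and the HuggingFaceH4/zephyr-7b-beta checkpoint; the notebook may add quantization and Zephyr-specific prompt formatting on top of this):

```python
from llama_index.core.agent import ReActAgent
from llama_index.llms.huggingface import HuggingFaceLLM

# load Zephyr locally; in practice you'd likely also set generate_kwargs
# and a messages_to_prompt function matching the Zephyr chat template
llm = HuggingFaceLLM(
    model_name="HuggingFaceH4/zephyr-7b-beta",
    tokenizer_name="HuggingFaceH4/zephyr-7b-beta",
    context_window=3900,
    max_new_tokens=512,
    device_map="auto",
)

# reuse the query engine tools from the earlier sketch with a ReAct agent
agent = ReActAgent.from_tools([docs_tool, api_tool], llm=llm, verbose=True)
print(agent.chat("What do the docs say about getting started?"))
```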
lovely... I was working from this very notebook last week, missed the agent stuff entirely 😂
thank you again Logan