Is it just me, or is the chat_engine weaker than the query_engine?
Depends on the chat engine. In general, the approaches for including chat history on top of a query engine all have their own pros and cons
imo an agent is probably the most flexible, but it can take some time to configure nicely
Ooo, I haven’t gotten to that section yet, I’ll get there eventually
in general, there are trade-offs between being fast but basic and being slow but accurate
Yeah, because most of the docs mention RetrieverQueryEngine and I can’t find a chat equivalent. I just want previous context sent in to create a basic chat experience.
I’m trying to set up https://docs.llamaindex.ai/en/stable/examples/retrievers/reciprocal_rerank_fusion.html

But I’d like to use a chat engine, so I’ll see if I can’t figure something out. I’ll use a query engine if I have to
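
For reference, a minimal sketch of the fusion retriever from that linked example (not verbatim from the docs). It assumes `index` (a `VectorStoreIndex`) and `bm25_retriever` were built elsewhere, and uses the post-v0.10 `llama_index.core` import paths:

```python
# Rough sketch of the reciprocal rerank fusion retriever from the linked example.
# Assumes `index` and `bm25_retriever` exist already; names are illustrative.
from llama_index.core.retrievers import QueryFusionRetriever

vector_retriever = index.as_retriever(similarity_top_k=2)

fusion_retriever = QueryFusionRetriever(
    [vector_retriever, bm25_retriever],
    similarity_top_k=2,
    num_queries=4,  # generates extra query variations; set to 1 to disable
    mode="reciprocal_rerank",
    use_async=True,
)

nodes = fusion_retriever.retrieve("example question")
```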
"I just want to have previous context sent in to create a basic chat experience" -- turns out when doing that on top of RAG, there's many approaches to do that, and none are perfect 😄

You can use that retriever directly with the context chat engine, or with the condense plus context chat engine.
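
A minimal sketch of the first option, assuming the `fusion_retriever` from the snippet above (the memory token limit is an illustrative choice, not a recommendation):

```python
# Sketch: plugging a custom retriever into ContextChatEngine.
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.core.memory import ChatMemoryBuffer

chat_engine = ContextChatEngine.from_defaults(
    retriever=fusion_retriever,  # assumed from the earlier snippet
    memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
)

print(chat_engine.chat("What do the docs say about X?"))
print(chat_engine.chat("And after that?"))  # chat history carries over
```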

You could also wrap the retriever in a tool and use it with an agent, or put the retriever into a query engine and use that with an agent.
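
A hedged sketch of the agent route; the tool name and description are made up for illustration, and `ReActAgent` is just one of several agent types:

```python
# Sketch: wrapping the retriever in a tool and handing it to an agent.
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import RetrieverTool

retriever_tool = RetrieverTool.from_defaults(
    retriever=fusion_retriever,  # assumed from the earlier snippet
    name="docs_retriever",
    description="Retrieves relevant passages from the indexed documents.",
)

agent = ReActAgent.from_tools([retriever_tool], verbose=True)
response = agent.chat("Summarize what the docs say about X")
```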
There's a description of all the chat modes here.

You'll just have to initialize the chat engine of your choice from its constructor instead of using `as_chat_engine`:
https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/usage_pattern.html#available-chat-modes
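
Putting it together, a sketch of constructing one of those chat engines directly, since `as_chat_engine` can't take a custom retriever (the system prompt is an illustrative assumption):

```python
# Sketch: building CondensePlusContextChatEngine from its constructor
# so a custom retriever like the fusion retriever can be passed in.
from llama_index.core.chat_engine import CondensePlusContextChatEngine

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=fusion_retriever,  # assumed from the earlier snippet
    system_prompt="Answer using only the retrieved context.",
)

print(chat_engine.chat("How do the fused results get ranked?"))
```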