Is it just me, or is the `chat_engine` weaker than the `query_engine`?

At a glance

The community members are discussing the relative strengths of LlamaIndex's chat_engine and query_engine. Some feel the chat engine is weaker, while others note that it depends on which chat engine is used and that there are tradeoffs between speed and accuracy. The discussion covers several ways to incorporate chat history into a query engine: using a retriever directly with a context chat engine, wrapping the retriever in a tool and using it with an agent, or putting the retriever into a query engine that an agent can call. The documentation linked in the thread describes the available chat modes, each of which can also be initialized directly from its constructor.

Depends on the chat engine. In general, the approaches for including chat history on top of a query engine all have their own pros and cons
imo an agent is probably the most flexible, but it can take some time to configure nicely
Ooo, I haven’t gotten to that section yet, I’ll get there eventually
in general, there are tradeoffs between being fast but basic and being slow but accurate
Yah, cause most of the docs mention RetrieverQueryEngine and I can’t find any chat equivalent. I just want to have previous context sent in to create a basic chat experience.
I’m trying to set up https://docs.llamaindex.ai/en/stable/examples/retrievers/reciprocal_rerank_fusion.html

But I’d like to use a chat engine, so I’ll see if I can’t figure something out. I’ll use the query engine if I have to
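For reference, a rough sketch of the fusion retriever from that linked example (assumes a recent llama_index, the separate llama-index-retrievers-bm25 package, and a configured embedding model/LLM; the data path is illustrative):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.retrievers import QueryFusionRetriever
from llama_index.retrievers.bm25 import BM25Retriever  # pip install llama-index-retrievers-bm25

# Build an index over local documents.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

vector_retriever = index.as_retriever(similarity_top_k=2)
bm25_retriever = BM25Retriever.from_defaults(
    docstore=index.docstore, similarity_top_k=2
)

# Fuse both retrievers with reciprocal rank fusion; num_queries > 1 also
# generates query variations with the LLM (set to 1 to disable that step).
fusion_retriever = QueryFusionRetriever(
    [vector_retriever, bm25_retriever],
    similarity_top_k=2,
    num_queries=4,
    mode="reciprocal_rerank",
)
```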
"I just want to have previous context sent in to create a basic chat experience" -- turns out when doing that on top of RAG, there's many approaches to do that, and none are perfect 😄

You can use that retriever directly with the context chat engine, or the condense plus context chat engine.
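A minimal sketch of that first route, reusing the fusion_retriever built above (class and parameter names follow the chat engine docs; exact signatures may vary by version):

```python
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.core.memory import ChatMemoryBuffer

# The memory buffer is what carries "previous context" between turns.
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

chat_engine = ContextChatEngine.from_defaults(
    retriever=fusion_retriever,
    memory=memory,
)

print(chat_engine.chat("What do the docs say about X?"))
print(chat_engine.chat("Can you expand on that?"))  # prior turn is in memory
```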

You could also wrap the retriever in a tool and use it with an agent. Or put the retriever into a query engine and also use with an agent
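And a sketch of the agent route (the tool name and description are illustrative; agent imports have moved between llama_index versions, and this follows the ReActAgent API in llama_index.core):

```python
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import RetrieverTool

# Wrap the retriever so the agent can decide when to search the docs.
docs_tool = RetrieverTool.from_defaults(
    retriever=fusion_retriever,
    name="docs_search",
    description="Fetches relevant passages from the indexed documents.",
)

agent = ReActAgent.from_tools([docs_tool], verbose=True)
print(agent.chat("Summarize what the documents say about X"))
```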
There's a description of all the chat modes here.

You'll just have to initialize your chat engine of choice from its constructor instead of using as_chat_engine:
https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/usage_pattern.html#available-chat-modes
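Concretely, that looks something like this (the chat_mode string and class name are per the linked docs page; the rest is illustrative):

```python
from llama_index.core.chat_engine import CondensePlusContextChatEngine

# Convenience path: as_chat_engine wires everything up from the index.
chat_engine = index.as_chat_engine(chat_mode="condense_plus_context")

# Constructor path: lets you pass a custom retriever, e.g. the fusion
# retriever from above.
chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=fusion_retriever,
)
```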