Hi guys, I am building a chatbot with RetrieverQueryEngine

At a glance

The community member is building a chatbot with a RetrieverQueryEngine and wants to know whether chat history can be added as 'context' so the chatbot can answer questions based on conversation memory. The comments point to the Chat Engine from the LlamaIndex documentation as a way to handle chat history, but there does not appear to be a corresponding RetrieverChatEngine for use with a hybrid retriever. Some community members have tried a ReActAgent with tools, which exposes chat, but note it is not quite the same. There is no explicitly marked answer, and the community members are still looking for a solution.

Hi guys, I am building a chatbot with RetrieverQueryEngine, which can talk with my documents. Is it possible to add the chat history as 'context' so that the chatbot can answer questions based on conversation memory?
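One way to get conversation memory around a retriever is to wrap it in a chat engine rather than a query engine. A minimal sketch, assuming a recent llama-index release where these classes live under llama_index.core, a default LLM configured via Settings, and placeholder values for the data path and token limit:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.chat_engine import CondensePlusContextChatEngine

# Build an index and a plain retriever over the documents.
documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=4)

# Rolling chat history, trimmed to a token budget.
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

# Condenses the conversation into a standalone question, retrieves
# context for it, and answers with the chat history attached.
chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=retriever,
    memory=memory,
)

print(chat_engine.chat("What does the document say about pricing?"))
print(chat_engine.chat("And how did that change over time?"))  # follow-up uses history
```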
10 comments
I had the same question, thanks!
Unfortunately, there doesn't seem to be a corresponding RetrieverChatEngine to keep track of chat history when building a hybrid retriever.
I just quickly verified that you cannot substitute a chat engine for a query engine in a QueryFusionRetriever object.
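That said, the retriever-based chat engines accept any BaseRetriever, so a QueryFusionRetriever can be passed in directly instead of looking for a dedicated RetrieverChatEngine. A sketch under the same assumptions as above, plus the separately installed llama-index-retrievers-bm25 package for the keyword side of the hybrid:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.node_parser import SentenceSplitter
from llama_index.core.retrievers import QueryFusionRetriever
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.retrievers.bm25 import BM25Retriever  # separate package

documents = SimpleDirectoryReader("./data").load_data()
nodes = SentenceSplitter(chunk_size=512).get_nodes_from_documents(documents)

index = VectorStoreIndex(nodes)
vector_retriever = index.as_retriever(similarity_top_k=4)
bm25_retriever = BM25Retriever.from_defaults(nodes=nodes, similarity_top_k=4)

# Hybrid retriever: fuse vector and keyword results.
fusion_retriever = QueryFusionRetriever(
    [vector_retriever, bm25_retriever],
    similarity_top_k=4,
    num_queries=1,  # skip LLM query generation
    mode="reciprocal_rerank",
)

# The chat engine wraps the fusion retriever and carries the history,
# so no RetrieverChatEngine class is needed.
chat_engine = ContextChatEngine.from_defaults(
    retriever=fusion_retriever,
    memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
)
response = chat_engine.chat("Summarize the key points so far.")
```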
I'm using ReActAgent with tools, which exposes chat
but yeah, not quite the same
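For comparison, the agent route mentioned above looks roughly like this; the tool name and description are made up, and a default LLM is assumed:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
query_engine = index.as_query_engine(similarity_top_k=4)

# Expose the query engine as a tool the agent can call.
doc_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name="docs",  # hypothetical tool name
    description="Answers questions about the loaded documents.",
)

# The agent keeps its own chat history, so follow-ups work, but each
# retrieval happens through a tool call rather than being injected
# directly as context the way a chat engine does it.
agent = ReActAgent.from_tools([doc_tool], verbose=True)
print(agent.chat("What are the main topics in the documents?"))
print(agent.chat("Go deeper on the second one."))
```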
I have tried several approaches to this issue, and still no luck...
@Teemu Would you happen to know whether, when you enable chat history, the history increasingly reduces the context window buffer with each subsequent query, or whether it gets incorporated automatically into the vector store?
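The thread leaves this unanswered, but one relevant knob is the memory buffer's token limit: as far as I can tell, the history is sent as part of the prompt (it is not written into the vector store), and a ChatMemoryBuffer drops the oldest turns once the limit is exceeded rather than letting the history grow without bound. A small sketch with an arbitrary limit:

```python
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer

# History lives in a buffer that is sent along with each prompt;
# nothing here touches the vector store.
memory = ChatMemoryBuffer.from_defaults(token_limit=1500)  # arbitrary budget

memory.put(ChatMessage(role="user", content="What is in chapter 1?"))
memory.put(ChatMessage(role="assistant", content="Chapter 1 covers ..."))

# get() returns only the most recent messages that fit within token_limit,
# so older turns fall off instead of consuming the whole context window.
print(memory.get())
```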