The community member is building a chatbot using a RetrieverQueryEngine and wants to know if it is possible to add chat history as 'context' so the chatbot can answer questions based on conversation memory. The replies suggest that the Chat Engine from the LlamaIndex documentation can handle chat history, but there does not appear to be a corresponding RetrieverChatEngine for building a hybrid retriever-backed chat engine. Some community members have tried the ReActAgent with tools, which exposes a chat interface, but it is not quite the same. There is no explicitly marked answer, and the community members are still looking for a solution.
Hi guys, I am building a chatbot with RetrieverQueryEngine, which can talk with my documents. Is it possible to add the chat history as 'context' so that the chatbot can answer questions based on conversation memory?
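For anyone looking for a concrete starting point: one option is to reuse the same retriever inside a chat engine that carries a memory buffer, so each turn sees both the retrieved nodes and the recent conversation. The sketch below is only an illustration, assuming LlamaIndex's ContextChatEngine and ChatMemoryBuffer with `llama_index.core`-style import paths (names and paths may differ across versions); the `./data` folder is hypothetical.

```python
# Minimal sketch: reuse the same retriever behind a chat engine that keeps
# conversation memory. Import paths assume llama_index.core-style packaging;
# adjust to match the installed LlamaIndex version.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.core.memory import ChatMemoryBuffer

# Build an index over local documents ("./data" is a hypothetical folder).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=3)

# The memory buffer keeps prior turns up to a token budget; each prompt then
# contains the retrieved context plus this trimmed history.
memory = ChatMemoryBuffer.from_defaults(token_limit=1500)

chat_engine = ContextChatEngine.from_defaults(
    retriever=retriever,
    memory=memory,
    system_prompt="Answer using the retrieved documents and the conversation so far.",
)

response = chat_engine.chat("What does the report say about Q3 revenue?")
print(response)
# A follow-up can rely on the previous turn being in memory.
print(chat_engine.chat("And how does that compare to Q2?"))
```

Alternatively, an existing RetrieverQueryEngine can be wrapped in a CondenseQuestionChatEngine, which rewrites each follow-up into a standalone question using the chat history before it hits the query engine; which approach fits better depends on how much the answers should depend on earlier turns.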
@Teemu Would you happen to know whether, when chat history is enabled, the history increasingly reduces the context window budget with each subsequent query, or whether it gets incorporated automatically into the vector store?
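As far as I understand (an assumption, not something confirmed in this thread), the history is injected into the prompt on each turn rather than being written into the vector store, so it does consume context window tokens; a token-limited memory buffer such as ChatMemoryBuffer is what typically caps how much of it is carried along. A rough sketch:

```python
# Sketch: chat history lives in an in-memory buffer that is trimmed to a
# token budget and prepended to each prompt. Nothing here touches the
# vector store; only the retrieved nodes come from it.
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer

memory = ChatMemoryBuffer.from_defaults(token_limit=1000)

# Simulate a few turns being recorded.
memory.put(ChatMessage(role="user", content="What is in the design doc?"))
memory.put(ChatMessage(role="assistant", content="It covers the ingestion pipeline."))

# get() returns only as much recent history as fits in the token limit,
# so older turns are dropped instead of shrinking the window indefinitely.
print(memory.get())
```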