I have a query engine as follows
Plain Text
query_engine = RetrieverQueryEngine.from_args(
    retriever=hybrid_retriever,
    node_postprocessors=[reranker],
    service_context=service_context,
    streaming=True,
)

how can I add memory to it?
3 comments
We don't support memory for query engines at the moment. You can add memory to Chat Engines and Agents, though.

https://docs.llamaindex.ai/en/stable/module_guides/storing/chat_stores.html
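For context, the idea behind chat-engine memory is simple: keep a rolling buffer of (role, message) pairs and prepend it to each new prompt so the LLM can see earlier turns. Here is a minimal, self-contained sketch of that pattern in plain Python; the `ChatMemory` class is made up for illustration and is not the LlamaIndex API (LlamaIndex's equivalent is `ChatMemoryBuffer`, documented at the link above):

```python
class ChatMemory:
    """Toy rolling chat-history buffer (illustrative only, not the LlamaIndex API)."""

    def __init__(self, max_messages=20):
        self.max_messages = max_messages
        self.messages = []  # list of (role, text) tuples

    def add(self, role, text):
        self.messages.append((role, text))
        # drop the oldest messages once the buffer exceeds its limit
        self.messages = self.messages[-self.max_messages:]

    def render(self):
        # flatten the history into a prompt prefix the LLM can condition on
        return "\n".join(f"{role}: {text}" for role, text in self.messages)


memory = ChatMemory(max_messages=4)
memory.add("user", "Tell me about recommendation engines?")
memory.add("assistant", "A recommendation engine suggests items to users...")
memory.add("user", "Tell me more about it")
prompt = memory.render()
```

A chat engine does essentially this on every turn: render the buffer, prepend it to the new user message, and append both the message and the response back into the buffer afterward.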
now i use
Plain Text
chat_engine = ContextChatEngine.from_defaults(
    hybrid_retriever,
    service_context=service_context,
    memory=memory,
    node_postprocessors=[reranker],
)

However, it seems unable to remember the previous context, even though the last exchange was added to memory.
For example,
Plain Text
Question: Tell me about recommendation engine?

then second question
Question: Tell me more about it

For the second question, it cannot connect "it" back to the recommendation engine. Any solution to this?
Ohh, that shouldn't be the case. Perhaps there is a bug. Do you mind creating an issue with code to reproduce it?
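One common way to handle follow-ups like "Tell me more about it" is the condense-question pattern: before retrieval, ask the LLM to rewrite the follow-up into a standalone question using the chat history (LlamaIndex ships this pattern as `CondenseQuestionChatEngine`). A self-contained sketch of just the prompt-building step, assuming the resulting string is then sent to whatever LLM you use:

```python
def build_condense_prompt(history, follow_up):
    """Pack the chat history and the follow-up question into a rewrite
    instruction for an LLM (illustrative sketch, not the LlamaIndex prompt)."""
    lines = [f"{role}: {text}" for role, text in history]
    return (
        "Given the conversation below, rewrite the last question as a "
        "standalone question.\n\n"
        + "\n".join(lines)
        + f"\nuser: {follow_up}\n\nStandalone question:"
    )


history = [
    ("user", "Tell me about recommendation engines?"),
    ("assistant", "A recommendation engine suggests items based on user behavior."),
]
prompt = build_condense_prompt(history, "Tell me more about it")
# `prompt` now carries enough context for the LLM to resolve "it";
# the rewritten standalone question is then passed to the retriever as usual.
```

Because the rewritten question mentions the recommendation engine explicitly, the retriever gets a useful query even though the raw follow-up was ambiguous.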