
Updated 6 months ago

Can we add conversational memory in the llama-index?

At a glance

The community member asked if conversational memory can be added to the llama-index. Another community member responded that yes, the Chat Engine can be used in place of the query engine to enable conversational memory, and provided a link to the relevant documentation. The original poster thanked the other community member and said they would look into it.

Useful resources
Can we add conversational memory in the llama-index
2 comments
Yes. You can use the Chat Engine in place of the query engine to get conversational memory.

https://gpt-index.readthedocs.io/en/latest/core_modules/query_modules/chat_engines/root.html
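To illustrate what the Chat Engine adds over a plain query engine, here is a toy sketch of conversational memory. This is not the llama-index API (per the linked docs, the real entry point is roughly `index.as_chat_engine()` instead of `index.as_query_engine()`); `ToyChatEngine` is a hypothetical stand-in that just shows the idea of carrying prior turns between calls.

```python
class ToyChatEngine:
    """Hypothetical stand-in for a chat engine: keeps the running
    conversation history, which a stateless query engine does not."""

    def __init__(self):
        self.history = []  # list of (role, message) tuples

    def chat(self, message):
        # A real chat engine condenses `self.history` plus the new
        # message into a retrieval query; here we only track turns.
        self.history.append(("user", message))
        n_user_turns = sum(1 for role, _ in self.history if role == "user")
        reply = f"reply #{n_user_turns}"
        self.history.append(("assistant", reply))
        return reply


engine = ToyChatEngine()
engine.chat("Can we add conversational memory?")
engine.chat("How would that work?")
# The second turn still has access to the first via engine.history,
# which is the "memory" a query engine lacks.
```

The key point is that each `chat()` call sees everything said before it, so follow-up questions can be resolved against earlier turns.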
Thanks, I will have a look