
Updated 6 months ago

Chat engines

How do I implement conversation memory in LlamaIndex with RAG?
Hi, you can use chat engines from LlamaIndex to keep conversational memory across queries. A chat engine wraps your index's retrieval step and carries the chat history into each new turn, so follow-up questions are answered in context.

https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_context/
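To illustrate what the chat engine's memory does under the hood, here is a minimal self-contained sketch: a rolling buffer of chat turns combined with a retrieval step before each query. The `ChatMemory` class, the toy word-overlap `retrieve` function, and the `chat` function are illustrative stand-ins, not LlamaIndex APIs.

```python
class ChatMemory:
    """Rolling buffer of (role, message) turns, capped at max_turns."""

    def __init__(self, max_turns=10):
        self.max_turns = max_turns
        self.turns = []

    def add(self, role, message):
        self.turns.append((role, message))
        # Drop the oldest turns once the cap is exceeded.
        self.turns = self.turns[-self.max_turns:]

    def as_prompt(self):
        return "\n".join(f"{role}: {msg}" for role, msg in self.turns)


def retrieve(query, docs):
    """Toy retriever: keep docs sharing at least one word with the query."""
    words = set(query.lower().split())
    return [d for d in docs if words & set(d.lower().split())]


def chat(query, docs, memory):
    """One RAG turn: retrieve context, prepend history, record the turn."""
    context_block = "\n".join(retrieve(query, docs))
    prompt = (
        f"Context:\n{context_block}\n\n"
        f"History:\n{memory.as_prompt()}\n\n"
        f"User: {query}"
    )
    memory.add("user", query)
    # A real engine would call an LLM here; we return the assembled
    # prompt so the effect of memory on later turns is visible.
    memory.add("assistant", "(model reply)")
    return prompt


docs = ["llamaindex supports chat engines", "vector stores hold embeddings"]
mem = ChatMemory()
chat("what are chat engines", docs, mem)
followup = chat("tell me more", docs, mem)
# The follow-up prompt now contains the first question via the memory buffer.
```

In LlamaIndex itself, the linked docs show the same idea via `ChatMemoryBuffer.from_defaults(...)` passed to `index.as_chat_engine(chat_mode="context", memory=...)`; exact names may vary between versions, so check the page above for your installed release.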