Chat engines
demigod2466
6 months ago
How to implement conversation memory in LlamaIndex with RAG?
1 comment
WhiteFang_Jr
6 months ago
Hi, you can use chat engines from LlamaIndex to get conversational memory while querying:
https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_context/
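The idea behind the linked chat engine is that prior turns of the dialogue are kept in a memory buffer and supplied alongside each new query, with the oldest turns evicted once a token budget is exceeded. Here is a minimal, self-contained sketch of that buffering principle in plain Python (the class name and token counting here are illustrative, not LlamaIndex's actual implementation):

```python
# Conceptual sketch of the memory a chat engine keeps between turns:
# store (role, message) pairs and trim the oldest once a token budget
# is exceeded. This mirrors the principle, not LlamaIndex internals.

class SimpleChatMemory:
    """Keep recent (role, message) turns under an approximate token limit."""

    def __init__(self, token_limit=1000):
        self.token_limit = token_limit
        self.turns = []  # list of (role, message)

    @staticmethod
    def _tokens(text):
        # Crude whitespace split stands in for a real tokenizer.
        return len(text.split())

    def put(self, role, message):
        self.turns.append((role, message))
        # Evict oldest turns until the history fits the budget again.
        while sum(self._tokens(m) for _, m in self.turns) > self.token_limit:
            self.turns.pop(0)

    def history(self):
        # This string would be prepended to the next RAG query.
        return "\n".join(f"{role}: {msg}" for role, msg in self.turns)


memory = SimpleChatMemory(token_limit=8)
memory.put("user", "What is RAG?")
memory.put("assistant", "Retrieval augmented generation.")
memory.put("user", "Can you elaborate on that?")
# The oldest turn was evicted to stay under the 8-token budget.
print(memory.history())
```

In LlamaIndex itself, the linked example wires this up with `ChatMemoryBuffer.from_defaults(token_limit=...)` passed to `index.as_chat_engine(chat_mode="context", memory=memory)`, so follow-up questions like "Can you elaborate on that?" are answered with the earlier turns in context. See the documentation page above for the exact API.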