Is there any clear-cut documentation on ChatMemoryBuffer or how to use it?
Abhiram_AI_Guy · 9 months ago
Is there any clear-cut documentation on ChatMemoryBuffer or how to use it?
Logan M · 9 months ago
https://docs.llamaindex.ai/en/stable/module_guides/storing/chat_stores.html
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)
memory.put(ChatMessage(role="user", content="hello"))
memory.get_all()  # full stored history
memory.get()  # history trimmed to the token limit
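For context, a minimal sketch of wiring the buffer into a chat engine, per the "context" chat mode in the docs above. The sample document and question are made up, and it assumes a configured LLM and embedding model (OpenAI by default):

from llama_index.core import Document, VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer

# Toy index so the chat engine has something to ground its answers on.
index = VectorStoreIndex.from_documents(
    [Document(text="LlamaIndex chat engines can reuse a ChatMemoryBuffer across turns.")]
)
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

# Passing the buffer via the memory keyword keeps conversation state
# between successive chat() calls.
chat_engine = index.as_chat_engine(chat_mode="context", memory=memory)
print(chat_engine.chat("What can chat engines reuse?"))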
Abhiram_AI_Guy · 9 months ago
Figured it out, thanks. Would appreciate something that works alongside semantic memory once the context exceeds a certain token limit.
Logan M · 9 months ago
I think someone would have to implement a semantic buffer memory then -- we really only have the single memory class, contributions definitely welcome
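A rough sketch of how such a semantic buffer could be composed from existing pieces: recent turns stay in a token-limited ChatMemoryBuffer, while every turn is also archived in a vector index and recalled by similarity. The SemanticBufferMemory class below is purely illustrative (not a LlamaIndex API) and assumes a configured embedding model:

from llama_index.core import Document, VectorStoreIndex
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer

class SemanticBufferMemory:  # hypothetical helper, not part of LlamaIndex
    def __init__(self, token_limit: int = 1500):
        self.buffer = ChatMemoryBuffer.from_defaults(token_limit=token_limit)
        self.archive = VectorStoreIndex.from_documents([])  # long-term store

    def put(self, message: ChatMessage) -> None:
        # Archive every turn for semantic lookup, and keep it in the buffer.
        self.archive.insert(Document(text=f"{message.role}: {message.content}"))
        self.buffer.put(message)

    def get(self, query: str, top_k: int = 3) -> list[ChatMessage]:
        # Recent, in-budget history plus semantically similar older turns.
        recent = self.buffer.get()
        retriever = self.archive.as_retriever(similarity_top_k=top_k)
        recalled = [
            ChatMessage(role="system", content=f"(recalled) {node.get_content()}")
            for node in retriever.retrieve(query)
        ]
        return recalled + recent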