Updated 2 months ago

Is there any clear-cut documentation on ChatMemoryBuffer, or how to use it?
3 comments
https://docs.llamaindex.ai/en/stable/module_guides/storing/chat_stores.html

# Import paths may vary by llama-index version
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer

memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

memory.put(ChatMessage(role="user", content="hello"))  # append a message

memory.get_all()  # full history, ignoring the token limit

memory.get()  # recent history, truncated to fit the token limit
Figured it out, thanks! I'd appreciate something that works alongside semantic memory once the context exceeds a certain token limit.
I think someone would have to implement a semantic buffer memory then -- we really only have the single memory class. Contributions definitely welcome πŸ™‚
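To make the idea concrete, here is a minimal, self-contained sketch of what such a "semantic buffer memory" could look like: recent messages stay in a token-limited buffer, and messages evicted from the buffer are embedded into a toy vector store so they can be recalled by similarity. Everything here (class names, the bag-of-words "embedding", the whitespace token count) is illustrative and is not part of the LlamaIndex API.

```python
# Hypothetical semantic buffer memory sketch -- NOT LlamaIndex code.
from collections import Counter
from dataclasses import dataclass
import math


@dataclass
class Message:
    role: str
    content: str


def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real implementation would use a model.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class SemanticBufferMemory:
    def __init__(self, token_limit: int = 50):
        self.token_limit = token_limit
        self.buffer: list[Message] = []  # recent, verbatim history
        self.archive: list[tuple[Counter, Message]] = []  # evicted messages

    def _tokens(self) -> int:
        # Crude whitespace token count; swap in a real tokenizer as needed.
        return sum(len(m.content.split()) for m in self.buffer)

    def put(self, msg: Message) -> None:
        self.buffer.append(msg)
        while self._tokens() > self.token_limit and len(self.buffer) > 1:
            evicted = self.buffer.pop(0)  # spill oldest into the archive
            self.archive.append((embed(evicted.content), evicted))

    def get(self, query: str, top_k: int = 2) -> list[Message]:
        # Top-k semantically similar archived messages, then the recent buffer.
        q = embed(query)
        ranked = sorted(self.archive, key=lambda e: cosine(q, e[0]), reverse=True)
        return [m for _, m in ranked[:top_k]] + list(self.buffer)
```

Usage: `put()` messages as the chat proceeds; once the buffer overflows the token limit, old messages move to the archive, and `get("what is my cat named")` would surface the relevant archived message alongside the recent buffer.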