Hi there! I observed that ChatSummaryMemoryBuffer crashes for Anthropic because it sends a single message with role system and no message with role user. Reading the code here: https://github.com/run-llama/llama_index/blob/90761a9f789bb7628d4faf40ae900d93f16065b7/llama-index-core/llama_index/core/memory/chat_summary_memory_buffer.py#L272 I'm seeing that it sends the context as role system but doesn't send the system prompt instructing the model to summarize that context. In the attached image I fixed it and it works perfectly for me. Is the current implementation bugged?
Attachment: Screenshot_2025-02-17_at_4.13.46_PM.png
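For reference, here is a minimal sketch of the kind of fix being described, not the library's actual implementation. It assumes an `llm` that follows llama_index's `LLM.chat()` interface, and `summarize_prompt` / `chat_history_str` are hypothetical names for the summarization instruction and the serialized chat history the buffer already builds. The idea is to put the instruction in the system message and the conversation to summarize in a user message, so providers like Anthropic that reject system-only requests accept the call.

```python
from llama_index.core.llms import ChatMessage, MessageRole


def summarize_chat_history(llm, summarize_prompt: str, chat_history_str: str) -> str:
    """Sketch of the workaround: pair the system instruction with a user message
    containing the conversation, instead of sending a lone system message."""
    messages = [
        # System message carries only the instruction to summarize.
        ChatMessage(role=MessageRole.SYSTEM, content=summarize_prompt),
        # The conversation being summarized goes in a user-role message,
        # so the request always contains at least one user message.
        ChatMessage(role=MessageRole.USER, content=chat_history_str),
    ]
    return llm.chat(messages).message.content
```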
3 comments
Feel free to make a PR -- probably a bug tbh
cool thanks πŸ™‚