I just updated to the latest llama_index

At a glance

The community member updated to the latest version of llama_index and hit an error on the line response = context_agent.stream_chat("HI", chat_history=[]), which had not occurred with previous versions. Other community members replied that a new version had just been released to fix the issue and suggested updating again, noting that the underlying agents had been refactored. One member reported that messages got stuck in memory, while another said the code worked fine for them.

I just updated to the latest llama_index version and suddenly got an error from this line: response = context_agent.stream_chat("HI", chat_history=[]). This didn't happen with previous versions; am I missing something? 🤔
Attachment: image.png
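For context, a minimal sketch of the call pattern being discussed, assuming the llama_index >= 0.10 package layout and an OpenAI API key in the environment. The thread never shows how context_agent was built, so a ReActAgent over a single toy tool stands in for it here:

# Minimal sketch, not the thread's actual setup: the original context_agent
# construction is not shown, so a ReActAgent with one example tool is assumed.
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

llm = OpenAI(model="gpt-3.5-turbo")
agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,  # surface the agent's step-by-step logging
)

# The call from the question; chat_history=[] starts from an empty history.
response = agent.stream_chat("HI", chat_history=[])

# Consume the streamed reply token by token.
for token in response.response_gen:
    print(token, end="", flush=True)

An equivalent way to consume the stream is response.print_response_stream(), which prints the tokens as they arrive.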
5 comments
we JUST released a version that should fix this
can you update again?
we refactored the underlying agents
I get "Added user message to memory: hi" and it's stuck there forever
Works fine for me?
Attachment: image.png