A community member updated to the latest version of llama_index and encountered an error on the line response = context_agent.stream_chat("HI", chat_history=[]), which had not occurred in previous versions. Other community members responded that a new version had just been released that should fix the issue and suggested updating again. Some community members mentioned that the underlying agents had been refactored and reported issues with messages getting stuck in memory, while one community member said the code worked fine for them.
I just updated to the latest llama_index version and suddenly got an error from this line: response = context_agent.stream_chat("HI", chat_history=[]). This didn't happen with previous versions. Am I missing something? 🤔