Hello, I am trying to build an agent

Hello, I am trying to build an agent around a query pipeline by following the document below. Everything seems to be working fine, but for some reason the state object gets reset every time I ask a question using agent.chat. Can anyone suggest how I can troubleshoot this?

https://docs.llamaindex.ai/en/stable/examples/agent/agent_runner/query_pipeline_agent/
5 comments
I'm not sure I totally follow

What are you checking to determine this? imo agent.chat_history or agent.memory.get_all() should be maintained just fine
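
A quick way to check is to print what the agent actually retains between turns (a minimal sketch, assuming `agent` is the AgentRunner built in the linked notebook; the follow-up question is just an illustration):

```python
# Minimal sketch: inspect what the agent keeps around between turns.
# Assumes `agent` is the AgentRunner from the linked query pipeline example.

response = agent.chat("What was my previous question?")  # illustrative follow-up
print(response)

# Every message recorded in the conversation so far
for msg in agent.chat_history:
    print(msg.role, ":", msg.content)

# Everything stored in memory, regardless of the token limit
print(agent.memory.get_all())
```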
Thanks @Logan M. The main issue I am facing is that the chatbot is not able to maintain context. If I ask a fresh question it works fine, but if I ask a follow-up it just fails. So I thought it could be because it is not able to maintain the state.
Maybe the memory buffer is filling up too quickly and cutting off the previous history? i.e., agent.memory.get() will return the current buffer
Optionally, you can increase the token limit of the memory, or use a summary memory buffer to help retain older context a little better
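
Something along these lines might work (a sketch, not the exact setup from the notebook; the token limit value, the `llm` variable, and the `agent_worker` name are placeholders):

```python
from llama_index.core.memory import ChatMemoryBuffer, ChatSummaryMemoryBuffer

# Option 1: a plain buffer with a larger token limit than the default
memory = ChatMemoryBuffer.from_defaults(token_limit=8000)

# Option 2: a summary buffer that condenses older turns instead of dropping them
# memory = ChatSummaryMemoryBuffer.from_defaults(llm=llm, token_limit=8000)

# Hand the memory to the agent when constructing it, e.g.
# agent = AgentRunner(agent_worker, memory=memory)
```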
Got it, let me try that. Thank you!