Hello there! Does anyone have pointers on adding streaming support to the 'error response' agent? https://docs.llamaindex.ai/en/latest/examples/agent/custom_agent.html# I see elsewhere that OpenAIAgent supports it, but I'd like to learn how to add it to a CustomSimpleAgent implementation.

Streaming is pretty complex to add to an agent, so be warned.

You need to both stream the result AND write that result to the chat memory, but you can only write it to memory once it has finished streaming.
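Here's a minimal sketch of that pattern, assuming the `llama_index.core` import paths from recent versions (the `stream_and_record` helper name is just for illustration, not an actual LlamaIndex function):

```python
from typing import Generator

from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core.memory import ChatMemoryBuffer


def stream_and_record(
    token_gen: Generator[str, None, None],
    memory: ChatMemoryBuffer,
) -> Generator[str, None, None]:
    """Yield tokens to the caller, then persist the full response."""
    collected: list[str] = []
    for token in token_gen:
        collected.append(token)
        yield token
    # Only once the stream is exhausted is the full text known,
    # so this is the earliest safe point to write to chat memory.
    memory.put(
        ChatMessage(role=MessageRole.ASSISTANT, content="".join(collected))
    )
```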
You can reference the code in the existing OpenAIAgent -- it basically uses a thread with a queue to keep track of the streamed tokens, then writes the full response to history once the stream is exhausted. It does this by wrapping the original response generator.
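A rough, self-contained sketch of that thread-plus-queue idea (pure stdlib; `StreamingResponse`, `response_gen`, and `on_finish` are illustrative names I made up, not LlamaIndex's actual API):

```python
import threading
from queue import Queue
from typing import Callable, Generator, Optional


class StreamingResponse:
    """Sketch of the thread-plus-queue pattern (illustrative names only).

    A background thread drains the LLM's token generator into a queue so
    the agent can return this object immediately; response_gen() replays
    the tokens to the caller and fires on_finish once the stream is
    exhausted (e.g. to write the final text into chat memory).
    """

    _SENTINEL = object()

    def __init__(
        self,
        token_gen: Generator[str, None, None],
        on_finish: Optional[Callable[[str], None]] = None,
    ) -> None:
        self._queue: Queue = Queue()
        self._on_finish = on_finish
        self._parts: list[str] = []
        # Drain the source generator on a separate thread so the
        # caller isn't blocked waiting for the whole response.
        self._thread = threading.Thread(target=self._drain, args=(token_gen,))
        self._thread.start()

    def _drain(self, token_gen: Generator[str, None, None]) -> None:
        for token in token_gen:
            self._parts.append(token)
            self._queue.put(token)
        self._queue.put(self._SENTINEL)  # signal end of stream

    def response_gen(self) -> Generator[str, None, None]:
        # Replay tokens as they arrive; get() blocks until the next one.
        while True:
            token = self._queue.get()
            if token is self._SENTINEL:
                break
            yield token
        self._thread.join()
        if self._on_finish is not None:
            # Stream exhausted -- now the full text can go to memory.
            self._on_finish("".join(self._parts))
```

Your custom agent's chat method could then return this object right away and pass something like `lambda text: memory.put(ChatMessage(role=MessageRole.ASSISTANT, content=text))` as `on_finish`, so the memory write happens only after the caller has consumed the whole stream.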