Streaming

Hi, when I run stream_chat and chat on an OpenAIAgent created via https://github.com/run-llama/llama_index/blob/a0d793aa07c8baf9683cf682f07b712f56971db5/llama-index-integrations/agent/llama-index-agent-openai/llama_index/agent/openai/base.py#L33, I don't see any difference in behavior.

PS: I'm trying to debug why the agent I'm creating doesn't yield a streaming response at the frontend, i.e. the frontend receives the full response all at once from the backend after I set up a WebSocket.
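
As a sanity check, it helps to first confirm the agent streams at all, separate from any WebSocket plumbing. A minimal sketch (the import paths follow the linked llama-index-agent-openai package; the empty tool list and model name are just placeholders):

Python
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI

# Minimal agent with no tools; swap in your own tools/model.
agent = OpenAIAgent.from_tools([], llm=OpenAI(model="gpt-4o-mini"))

# stream_chat returns a StreamingAgentChatResponse whose response_gen
# yields tokens as they arrive, so output should appear incrementally.
streaming_response = agent.stream_chat("Tell me a short story.")
for token in streaming_response.response_gen:
    print(token, end="", flush=True)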
Not sure what you mean?

Python
response = agent.stream_chat("hello")
for r in response.response_gen:
    print(r, end="", flush=True)
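
If that prints token by token, the buffering is likely happening in the WebSocket layer: each token needs to be sent as it is generated, not after the loop finishes. A rough sketch with FastAPI (the endpoint shape here is an assumption, not something from this thread; astream_chat and async_response_gen are the async counterparts in recent llama_index versions):

Python
from fastapi import FastAPI, WebSocket

app = FastAPI()

@app.websocket("/chat")
async def chat_ws(websocket: WebSocket):
    await websocket.accept()
    user_message = await websocket.receive_text()
    # astream_chat is the async variant; async_response_gen() yields
    # tokens incrementally, and each one is sent to the client
    # immediately instead of accumulating the full response first.
    response = await agent.astream_chat(user_message)  # `agent` as built above
    async for token in response.async_response_gen():
        await websocket.send_text(token)
    await websocket.close()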