stream_chat and chat show no difference in behavior for an OpenAIAgent

For an OpenAIAgent created via https://github.com/run-llama/llama_index/blob/a0d793aa07c8baf9683cf682f07b712f56971db5/llama-index-integrations/agent/llama-index-agent-openai/llama_index/agent/openai/base.py#L33, I don't see any difference in behavior between `chat` and `stream_chat`:

```python
response = agent.stream_chat("hello")
for r in response.response_gen:
    print(r, end="", flush=True)
```
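For context, the expected contract is that `chat` blocks and returns the complete response, while `stream_chat` returns a streaming response object whose `response_gen` yields token deltas as they arrive. A minimal self-contained sketch of that contract, using a stub in place of a real agent (the `StubAgent` class and its canned tokens are illustrative, not part of llama_index):

```python
from dataclasses import dataclass
from typing import Iterator, List


@dataclass
class StreamingResponse:
    """Stand-in for a streaming chat response: tokens arrive incrementally."""
    _tokens: List[str]

    @property
    def response_gen(self) -> Iterator[str]:
        # Yield one token at a time, as a real stream would.
        for tok in self._tokens:
            yield tok


class StubAgent:
    """Illustrative stand-in for an agent exposing chat/stream_chat."""
    _tokens = ["Hello", ", ", "world", "!"]

    def chat(self, message: str) -> str:
        # Blocking call: the full response string is returned at once.
        return "".join(self._tokens)

    def stream_chat(self, message: str) -> StreamingResponse:
        # Returns immediately; tokens are consumed via response_gen.
        return StreamingResponse(self._tokens)


agent = StubAgent()

# Blocking call: one complete string.
full = agent.chat("hello")

# Streaming call: iterate the generator, printing deltas as they arrive.
parts = []
for r in agent.stream_chat("hello").response_gen:
    parts.append(r)
    print(r, end="", flush=True)
print()

assert full == "".join(parts)
```

With a real LLM backend, the streaming path should print tokens progressively rather than all at once; if both calls produce output only after the full response is ready, the streaming path is not being exercised.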