
Updated 5 months ago

Hi! I'm having an issue with the ReAct agent: when streaming, the first few tokens are cut off from every response after the first one as I iterate through the response generator. I'm using AzureOpenAI as the LLM object for the agent. Has anyone been able to solve this issue?
3 comments
Probably some small issue with streaming. I've seen a few PRs for this, but I'm not convinced what the right solution is.
If you're using OpenAI though, I would use an OpenAIAgent
so that it uses function calling
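For what it's worth, the symptom described (leading tokens missing from every streamed response) is commonly caused by code that peeks at a generator before handing it off, so the peeked tokens never reach the final consumer. This is a minimal, self-contained Python sketch of that pitfall and one fix; it is illustrative only and not tied to any particular llama-index internals:

```python
from itertools import chain

def token_stream():
    # Stand-in for a streamed LLM response generator.
    yield from ["Hel", "lo", " wor", "ld"]

# Buggy pattern: peek at the first token (e.g. to check the stream
# started), then join only the remainder. The peeked token is lost.
gen = token_stream()
first = next(gen)
buggy = "".join(gen)            # "lo world" — leading tokens cut off

# Fix: chain the peeked token back in front before consuming the rest.
gen = token_stream()
first = next(gen)
fixed = "".join(chain([first], gen))   # "Hello world"
```

If something in the agent's streaming path does an equivalent peek without re-chaining, every response after the first could show exactly this truncation.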