Hi! I’m having an issue with the ReAct agent while streaming: the first few tokens are cut off from every response after the first one as I iterate through the response generator. I’m using AzureOpenAI as the LLM object for the agent. Has anyone been able to solve this?