I'm using an AzureOpenAI agent with agent.stream_chat. In my setup there is a middleware proxy that adds a header to the response containing useful usage information, which I would like to access in the agent, but from what I can see there are no events or callbacks that expose the underlying response object. Would someone be able to suggest a way?
Agents don't return the completion object directly; the type of the response object depends on how you call the agent. A normal chat call returns an object of type Response, while streaming (your case) returns a StreamingAgentChatResponse. Neither exposes the raw HTTP response or its headers.
You might be able to get at the raw response (response.raw_response) if you write a hook using the instrumentation module. Alternatively, you can pass a custom httpx.Client to the OpenAI llm object with an event hook on the response.