Hi everyone!

I'm trying to access the response headers from the completions endpoint for an AzureOpenAI agent using agent.stream_chat. In my setup, a middleware proxy adds a header to the response containing useful usage information, which I would like to access in the agent. From what I can see, there are no events or callback events that expose the response object. Could someone suggest a way?
5 comments
Agents don't return the completion object directly. The response object type depends on the call: a normal call returns an object of type Response, while a streaming call (your case) returns a StreamingAgentChatResponse.
You should be able to access `response.raw_response` if you write a hook using instrumentation:
https://docs.llamaindex.ai/en/stable/examples/instrumentation/observe_api_calls/
Thanks @WhiteFang_Jr and @Logan M !

I'll look into your suggested solution.

Alternatively, I found that I can pass a custom httpx.Client to the OpenAI LLM object with an event hook on the response.
Yea that would work too