I have a strange issue - trying to get llama-index working with Azure

At a glance

The community member is having an issue getting the llama-index library working with Azure OpenAI: streaming responses fail with an AttributeError because the streamed delta arrives as a plain dict, so the delta.role and delta.content attribute accesses fail. The community members discuss that the issue may be an incompatibility between the latest OpenAI Python library and the Azure service. They note that Azure sometimes returns malformed dictionary packets, and adding checks to handle this fixes the issue. One community member provides a specific fix: check that choices is non-empty and that choices[0] actually carries content before processing the packet, otherwise skip it. This seems to resolve the problem for the community members.

I have a strange issue - trying to get llama-index working with Azure. I've managed to get embedding working fine, and when I run the query I get the nodes from the query fine, but at the point that I get a streaming response, I get an error from llms\openai.py (which is already odd, as I thought it should be using azure_openai.py?)

if (delta.role == MessageRole.ASSISTANT) and (delta.content is None):
AttributeError: 'dict' object has no attribute 'role'

I'm using
llm = AzureOpenAI(...)
embed_model = AzureOpenAIEmbeddings(...)

service_context = ServiceContext.from_defaults(embed_model=embed_model, llm=llm)
...
response = query_engine.query(query)

for text in response.response_gen:
    ...


I get back a response fine and response.source_nodes returns some nodes, but I can't seem to iterate the response_gen.
21 comments
AzureOpenAI is just a light wrapper around the OpenAI LLM, so it makes sense that's where the error is (openai.py implements almost all the logic)

We recently updated to the new OpenAI client -- it should be returning pydantic objects, not dicts 🤔
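e.g. with the 1.x client, delta should come back as a pydantic ChoiceDelta with attribute access (a minimal sketch, assuming openai>=1.0 and a placeholder model name):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta
    print(type(delta))  # openai.types.chat.chat_completion_chunk.ChoiceDelta
    print(delta.role, delta.content)  # attribute access, not dict keys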

If you run pip show openai what do you get?
actually nvm it would error out way earlier if you had the wrong openai version
does a non-streaming query work?
It's version 1.2.3, as required I think by the latest llama-index. I think they (openai) have made a tonne of changes; my app has a regular retriever-less non-llama-index chat, and I had to do things like convert the old:
for chunk in response:
    ...
    content = json.dumps(chunk)

which changed to

    content = chunk.model_dump_json()

since the response was returning pydantic objects.
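(In loop form, a minimal sketch of the before/after, assuming the openai>=1.0 stream object:)

for chunk in stream:
    # old (<1.0) client: chunk was dict-like, so json.dumps(chunk) worked
    # new (>=1.0) client: chunk is a pydantic model, so serialize with:
    content = chunk.model_dump_json()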

So you reckon this might be an incompatibility between openai's latest python library and the Azure services?

I don't see azure_openai.py in the call tree, which is what I'd have expected to see.
Some additional clues:
- I'm using CitationQueryEngine (if that makes any difference?)

Will attempt a non-streaming response, but it's a bit of butchery to make it do that, as this sits behind FastAPI and is called from a React app
...
\venv\Lib\site-packages\starlette\routing.py app 69
\venv\Lib\site-packages\starlette\responses.py call 270
\venv\Lib\site-packages\starlette\responses.py wrap 273
\venv\Lib\site-packages\starlette\responses.py stream_response 262
\Nova.API\nova\main.py stream_response 118
\venv\Lib\site-packages\llama_index\llm_predictor\utils.py gen 28
\venv\Lib\site-packages\llama_index\llms\base.py wrapped_gen 193
\venv\Lib\site-packages\llama_index\llms\openai.py gen 306
e.g. stream_response (the little func that iterates the returned response.response_gen) seems to be hitting openai.py, not azure_openai.py as I'd expect
Index flow: [Attachment: 0.png]
Query flow: [Attachment: 0.png]
Yea as I mentioned, AzureOpenAI extends OpenAI, so that makes sense

https://github.com/run-llama/llama_index/blob/29ef306ae0536de44840ca5acfdf93d84b9a560c/llama_index/llms/azure_openai.py#L16

We've made a ton of changes to accommodate the new openai client version. But it seems to be breaking its own rules by returning a dict for delta.
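For reference, the class at that link is roughly this (abridged; see the URL above for the real thing):

class AzureOpenAI(OpenAI):
    # only Azure-specific configuration/validation lives here;
    # all the chat and streaming logic is inherited from llms/openai.py
    ...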
Very sus, I don't have azure to debug with at the moment 🤔
So something I noticed with Azure that is different to openai: it does return the dicts, but sometimes there is a precursor message with the content-filter stuff that doesn't necessarily have the delta or content included. I had to put in checks to make sure I only process the streamed-back packet if it actually has those keys.
So in my chat thing I do this
[Attachment: IMG_0044.jpg]
This fixes it: e.g. adding a check for choices > 0 and for the existence of choices[0].content. Without this we sometimes get a packet missing them.
Also, if there is no content, setting an empty object... why not just continue?
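In code, the two continues look something like this (a sketch only; variable names are illustrative, and the real edit goes inside the streaming loop in llama_index/llms/openai.py):

for response in stream:
    # Azure sometimes sends a precursor packet (content-filter metadata)
    # with an empty choices list: skip it
    if len(response.choices) == 0:
        continue
    delta = response.choices[0].delta
    # ...and sometimes the delta arrives as a plain dict missing the
    # expected keys: skip those packets too
    content = delta.get("content") if isinstance(delta, dict) else delta.content
    if content is None:
        continue
    # normal handling of content continues from here
    ...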
In some ways, switching from a JSON response to a pydantic model of Python dicts makes it more brittle.
I'm having this issue too after switching to Azure. Is modifying the lib the only way?
Currently, yes. I will do a PR at the weekend; my fix above works.
Basically azure sends back some weird malformed packets without content fields
Malformed dicts.
Thanks a lot! I added the two continues and things work (I'm using chat_engine, not query_engine, but same issue)