The community member is having an issue with the llama-index library not working correctly with Azure OpenAI. They are getting an AttributeError because the streamed delta arrives as a dict rather than an object with role and content attributes. The community members discuss that the issue may be due to an incompatibility between the latest OpenAI Python library and the Azure services. They suggest that Azure sometimes returns malformed dictionary responses, and that adding checks to handle this can fix the issue. One community member provides a specific fix: checking for the existence of choices and of the delta content before processing each streamed chunk. This seems to resolve the problem for the community members.
I have a strange issue: I'm trying to get llama-index working with Azure OpenAI. I've managed to get embedding working fine, and when I run the query I get the nodes back fine, but at the point where I get a streaming response, I get an error from llms\openai.py (which is already odd, as I thought it should be using AzureOpenAI.py?):
if (delta.role == MessageRole.ASSISTANT) and (delta.content is None):
AttributeError: 'dict' object has no attribute 'role'
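For reference, this is roughly how I'd expect the Azure side to be wired up (a minimal sketch only; the deployment names, endpoint and api_version are placeholders, and the exact import paths depend on the llama-index version):

# Sketch: explicitly pointing llama-index at Azure OpenAI.
# Deployment names, endpoint and api_version below are placeholders.
from llama_index import ServiceContext
from llama_index.llms import AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding

llm = AzureOpenAI(
    model="gpt-35-turbo",
    deployment_name="my-gpt35-deployment",
    api_key="...",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2023-07-01-preview",
)

embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",
    api_key="...",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2023-07-01-preview",
)

service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)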
It's version 1.2.3, which I think is what the latest llama-index requires. I think OpenAI have made a tonne of changes: my app also has a regular retriever-less, non-llama-index chat, and there I had to do things like convert the old: for chunk in response: ... content = json.dumps(chunk)
which changed to content = chunk.model_dump_json()
since the response now returns Pydantic objects.
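In other words, the streaming loop now looks roughly like this (a sketch assuming the v1.x openai client; the model name and what you do with each chunk are placeholders):

# Sketch of the v1.x streaming loop; old (<1.0) chunks were plain dicts
# you could pass to json.dumps, the new ones are Pydantic models.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)

for chunk in response:
    content = chunk.model_dump_json()  # Pydantic object -> JSON string
    # ... forward `content` to the caller ...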
So you reckon this might be an incompatibility between OpenAI's latest Python library and the Azure services?
I don't see AzureOpenAI.py in the call tree, which is what I'd expect to be there. Some additional clues: I'm using CitationQueryEngine (if that makes any difference?)
I'll attempt a non-streaming response, but it's a bit of butchery to make it do that, as this sits behind a FastAPI app that's called from a React app.
e.g. stream_response (a little function that iterates the returned response.response_gen) seems to be hitting openai.py, not the AzureOpenAI.py I'd expect.
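For context, the shape of that streaming path is roughly this (a sketch only; stream_response, the route and query_engine are illustrative names, not the real code):

# Sketch: surfacing a llama-index streaming response through FastAPI.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()
query_engine = ...  # assumed: a CitationQueryEngine built elsewhere with streaming enabled

def stream_response(streaming_response):
    # Iterate the llama-index generator and yield each token to the client.
    for token in streaming_response.response_gen:
        yield token

@app.get("/query")
def query(q: str):
    streaming_response = query_engine.query(q)
    return StreamingResponse(stream_response(streaming_response), media_type="text/plain")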
So, something I noticed with Azure that is different to OpenAI: it does return the dicts, but sometimes there is a precursor message with the content-filter stuff that doesn't necessarily have the delta or content included. I had to put in checks to make sure I only process a streamed-back packet if it actually has those keys.
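Roughly like this (a sketch; the exact chunk shapes can vary, so treat the dict/attribute handling as an assumption rather than the definitive fix):

# Sketch: skip Azure's content-filter precursor chunks and anything
# that doesn't actually carry a delta with content.
# `response` is the iterator returned by chat.completions.create(..., stream=True).
for chunk in response:
    if not chunk.choices:
        continue  # Azure can send chunks with an empty choices list

    delta = chunk.choices[0].delta

    # Azure occasionally hands back a plain dict instead of a Pydantic object,
    # which is what triggers "'dict' object has no attribute 'role'".
    content = delta.get("content") if isinstance(delta, dict) else delta.content

    if content is None:
        continue  # role-only or content-filter precursor packet

    handle_token(content)  # hypothetical: whatever your app does with each token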