

Having an issue with Azure OpenAI

Having an issue with Azure OpenAI, but only when running in a thread executor. Anyone else dealt with this? Is the AzureOpenAI object no longer thread-safe? It was working before I upgraded the llama-index and openai packages.

'AzureOpenAI' object has no attribute '_client'
It should be even more thread-safe than before 😅 no more using globals for Azure

That's a super unexpected error... if you do pip show openai and pip show llama-index what do you get?
and you followed the latest docs? Some kwargs might have changed ever so slightly
https://docs.llamaindex.ai/en/stable/examples/customization/llms/AzureOpenAI.html
Yeah I had to change those last night to get it working again
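For reference, the setup from the linked docs at the time looked roughly like this; the deployment name, key, endpoint, and API version below are placeholders, not values from this thread:
Python
from llama_index.llms import AzureOpenAI

# Placeholder values -- substitute your own Azure deployment details
llm = AzureOpenAI(
    engine="my-deployment-name",
    model="gpt-35-turbo-16k",
    api_key="<api-key>",
    azure_endpoint="https://<resource-name>.openai.azure.com/",
    api_version="2023-07-01-preview",
)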
So it works fine if I’m just running something standalone, but if I pass the object to the asyncio event loop like loop.run_in_executor(llm…), that is when the error happens
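A minimal sketch of that failing pattern, assuming an llm configured as in the snippet above and a blocking llm.complete call handed to the default thread pool:
Python
import asyncio

async def main() -> None:
    loop = asyncio.get_running_loop()
    # Dispatch the blocking completion call to a worker thread;
    # this is the call path that raised the '_client' AttributeError
    response = await loop.run_in_executor(None, llm.complete, "Say one word!")
    print(response)

asyncio.run(main())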
ah I wonder if this is related to some changes I made to pickling/deepcopy.... let me double check something
Also was looking at the AzureOpenAI class and I couldn’t really follow the ‘_client’ instantiation 😅 I see the SyncAzureOpenAI but don’t know where it’s actually being called
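For what it's worth, the wiring is roughly this shape (a simplified paraphrase, not the verbatim llama-index source): the base OpenAI LLM builds its private client in __init__ through a factory method, and the Azure subclass overrides that factory to return a SyncAzureOpenAI.
Python
from openai import AzureOpenAI as SyncAzureOpenAI

class BaseLLM:
    def __init__(self, **kwargs):
        # Private attr set at construction time; anything that strips
        # private attrs (e.g. a lossy __getstate__) leaves it missing
        self._client = self._get_client(**kwargs)

class AzureLLM(BaseLLM):
    def _get_client(self, **kwargs):
        # Called from BaseLLM.__init__ -- this is where the
        # SyncAzureOpenAI client actually gets instantiated
        return SyncAzureOpenAI(**kwargs)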
ah crap it is related to those changes. Noooooo
Plain Text
>>> import pickle
>>> data = pickle.dumps(llm)
>>> new_llm = pickle.loads(data)
>>> new_llm.complete("Say one word! ")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/loganm/llama_index_proper/llama_index/llama_index/llms/base.py", line 313, in wrapped_llm_predict
    f_return_val = f(_self, *args, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/loganm/llama_index_proper/llama_index/llama_index/llms/openai.py", line 203, in complete
    return complete_fn(prompt, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/loganm/llama_index_proper/llama_index/llama_index/llms/generic_utils.py", line 153, in wrapper
    chat_response = func(messages, **kwargs)
                    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/loganm/llama_index_proper/llama_index/llama_index/llms/openai.py", line 238, in _chat
    response = self._client.chat.completions.create(
               ^^^^^^^^^^^^
AttributeError: 'OpenAI' object has no attribute '_client'. Did you mean: '_aclient'?
Yeah, that’s the exact traceback, just async
So I made indexes/query engines picklable, but it seems like this destroys private variables
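That failure mode is easy to reproduce: a __getstate__ that drops an unpicklable member (an HTTP client, a socket, a lock) lets pickle.dumps succeed, but unless something rebuilds that member on load, it is simply gone after pickle.loads. A sketch with an illustrative stand-in class:
Python
import pickle

class LLMLike:
    def __init__(self):
        self._client = lambda prompt: "ok"  # stand-in for the real client

    def __getstate__(self):
        # Drop the unpicklable client so pickle.dumps succeeds
        state = self.__dict__.copy()
        state.pop("_client", None)
        return state

llm = LLMLike()
restored = pickle.loads(pickle.dumps(llm))
print(hasattr(restored, "_client"))  # False -- the private attr is destroyed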

Let me see if I can fix this
and satisfy everyone
Ah this makes sense
Way back I was trying to pass an index this way and was getting the non-picklable error
So I switched to just passing the llm
Ok, fixed for the LLM.... but now to check the query engine/index lol
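One common shape for that kind of fix (an illustrative sketch, not necessarily the exact patch that shipped): keep dropping the client in __getstate__, but rebuild it in __setstate__ so restored objects work again.
Python
import pickle

class LLMLike:
    def __init__(self):
        self._client = self._make_client()

    def _make_client(self):
        return object()  # stand-in for SyncAzureOpenAI(...)

    def __getstate__(self):
        state = self.__dict__.copy()
        state.pop("_client", None)  # still drop the unpicklable client
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self._client = self._make_client()  # rebuild the client on load

llm = LLMLike()
restored = pickle.loads(pickle.dumps(llm))
print(hasattr(restored, "_client"))  # True -- the client is recreated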
huzzah, it seems to work! query engine, index, llm, and nodes/documents all pickle
need to head out for a flu shot lol but will push soon
Thanks!! Greatly appreciate it
Ok sorry for the delay
0.9.3.post1 is out
and I pray it works
We are back in business, amazing, thank you!