Azure openai path settings issue with query parameter

I would appreciate some help with AzureOpenAI path settings. Our proxy requires a URL like https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106/chat/completions?api-version=2023-12-01-preview.
With OPENAI_API_BASE set to https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106, LlamaIndex calls https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106/chat/completions, but the query parameter ?api-version=2023-12-01-preview is missing and the call fails. How can I add it? Thank you for any help.
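To make the expected request shape concrete: the api-version belongs in the query string, not in the base path, so it has to be appended to the completions endpoint rather than baked into OPENAI_API_BASE. A minimal stdlib sketch of the URL the proxy expects (values taken from the question above):

```python
from urllib.parse import urlencode

# Base path from the question; adjust for your own deployment.
base = "https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106"
api_version = "2023-12-01-preview"

# The chat-completions endpoint the proxy expects, with api-version
# sent as a query parameter appended to every request.
url = f"{base}/chat/completions?{urlencode({'api-version': api_version})}"
print(url)
# -> https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106/chat/completions?api-version=2023-12-01-preview
```

The Azure-specific clients add this query parameter for you when you pass an api_version argument at construction time, which is why setting only the base URL is not enough.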
The only difference from that code is that I use from llama_index.llms.azure_openai import AzureOpenAI. Going to try from openai import AzureOpenAI per that example.
That was it. Wrong import. Thank you.
Celebrated too early: SchemaLLMPathExtractor does not work with the openai-package version of AzureOpenAI.
Developer dilemma: it's working now and I don't know why... πŸ‘€ Thank you for your help, though.