I am calling an Azure OpenAI deployment through a company proxy. The full endpoint I need to hit is:

https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106/chat/completions?api-version=2023-12-01-preview

I set OPENAI_API_BASE to https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106. LlamaIndex then calls https://ai-proxy.lab.mycompany.com/openai/deployments/gpt-35-turbo-1106/chat/completions, but the query parameter ?api-version=2023-12-01-preview is missing and the call fails. How can I add that? Thank you for any help.

Try from llama_index.llms.azure_openai import AzureOpenAI instead; it accepts an api_version argument, so the query parameter is added to each request for you.
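Here is a minimal sketch of that approach. It assumes the proxy mirrors the standard Azure OpenAI path layout (the client builds /openai/deployments/<engine>/chat/completions itself) and accepts a regular Azure API key; the host and deployment name are taken from your URLs, everything else is illustrative.

```python
import os
from llama_index.llms.azure_openai import AzureOpenAI

# Endpoint is the host only; the client appends
# /openai/deployments/<engine>/chat/completions and the ?api-version=... query
# parameter on every request.
llm = AzureOpenAI(
    engine="gpt-35-turbo-1106",          # deployment name from your URL
    model="gpt-35-turbo",                # underlying model family
    azure_endpoint="https://ai-proxy.lab.mycompany.com",
    api_version="2023-12-01-preview",    # becomes ?api-version=2023-12-01-preview
    api_key=os.environ["AZURE_OPENAI_API_KEY"],  # assumed auth; adjust for your proxy
)

print(llm.complete("Hello"))
```

With azure_endpoint set to just the proxy host (not the full deployment path), the resulting request URL should match the full endpoint you listed, including the api-version query parameter.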
Going to try from openai import AzureOpenAI per that example.
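In case it helps, a sketch of the same configuration with the raw openai client, again assuming the proxy takes a standard Azure API key and path layout:

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://ai-proxy.lab.mycompany.com",  # host only
    azure_deployment="gpt-35-turbo-1106",                 # deployment segment of the path
    api_version="2023-12-01-preview",                     # added as the query parameter
    api_key=os.environ["AZURE_OPENAI_API_KEY"],           # assumed auth; adjust for your proxy
)

response = client.chat.completions.create(
    model="gpt-35-turbo-1106",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```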