
Updated last year

Hi, can anyone help me with this issue?

Dependencies:
- llama-index 0.9.5
- openai 1.3.4

I set up the API on Azure OpenAI with these deployments:
  1. model = gpt-4(0613)
  2. model = gpt-4(1106-Preview)
  3. model = gpt-35-turbo(0613)
I initialize the llm with:
Plain Text
from llama_index.llms import AzureOpenAI

llm = AzureOpenAI(
    model="gpt-35-turbo",  # or model="gpt-4" for the 0613 / 1106-Preview deployments
    deployment_name="NAME",
    api_key="API",
    azure_endpoint="ENDPOINT",
    api_version="2023-08-01-preview",
)


When I set up the PydanticSingleSelector on the Router Query Engine:

Plain Text
from llama_index.selectors.pydantic_selectors import PydanticSingleSelector

selector = PydanticSingleSelector.from_defaults(llm=llm)


It works well with the gpt-35-turbo and gpt-4 (1106-Preview) deployments,
but with gpt-4 (0613) it returns this error:
Plain Text
BadRequestError: Error code: 400 - {'error': {'message': 'Unrecognized request arguments supplied: tool_choice, tools', 'type': 'invalid_request_error', 'param': None, 'code': None}}
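For context, PydanticSingleSelector relies on OpenAI tool calling: it sends the selection schema as a tool definition, and the 400 error means the deployment (or that api-version) does not recognize those fields. A hypothetical sketch of the request body involved; the `SingleSelection` name and its fields are assumptions about llama-index internals, while the `tools`/`tool_choice` field names follow the OpenAI chat completions schema:

```python
# Hypothetical illustration of the request body a tool-calling selector sends.
# Deployments/api-versions without tool support reject the `tools` and
# `tool_choice` fields with the 400 error shown above.
payload = {
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Pick the best choice ..."}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "SingleSelection",  # assumed pydantic output class
            "parameters": {
                "type": "object",
                "properties": {
                    "index": {"type": "integer"},
                    "reason": {"type": "string"},
                },
                "required": ["index", "reason"],
            },
        },
    }],
    "tool_choice": {"type": "function", "function": {"name": "SingleSelection"}},
}

# These are exactly the "unrecognized request arguments" from the 400 error.
offending = [k for k in ("tools", "tool_choice") if k in payload]
print(offending)  # ['tools', 'tool_choice']
```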
3 comments
For whatever reason, gpt-4 (0613) doesn't support the tool-calling arguments from the new openai client πŸ€”

Need to use 1106 or gpt-35-turbo
openai really made a mess with this new client -- but that's just how it is right now
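If you must stay on the 0613 deployment, llama-index also ships a prompt-based selector (`LLMSingleSelector.from_defaults(llm=llm)` in 0.9.x) that avoids tool calling entirely by asking for the choice as plain text and parsing it. A minimal self-contained sketch of that idea, with the hypothetical `fake_llm` standing in for the real AzureOpenAI call:

```python
import re

# Sketch of what a prompt-based selector does instead of tool calling:
# ask for the chosen index as plain text, then parse it from the reply.
CHOICES = ["vector index for detail questions", "summary index for overviews"]

PROMPT = (
    "Some choices are given below, numbered from 1 to {n}:\n{choices}\n"
    "Return only the number of the best choice for the question: {question}"
)

def fake_llm(prompt: str) -> str:
    # Stand-in for a chat-completion call; a real model returns e.g. "1".
    return "1"

def select(question: str) -> int:
    numbered = "\n".join(f"{i + 1}. {c}" for i, c in enumerate(CHOICES))
    reply = fake_llm(PROMPT.format(n=len(CHOICES), choices=numbered, question=question))
    match = re.search(r"\d+", reply)
    if match is None:
        raise ValueError(f"could not parse a choice from: {reply!r}")
    return int(match.group()) - 1  # zero-based index into CHOICES

print(select("What does section 2 say exactly?"))  # 0
```

Since this path never sends `tools` or `tool_choice`, it works on models that reject those arguments, at the cost of less structured output.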
Thanks for the information. I hope they fix this soon. πŸ˜‡