Hi,

I'm trying out SubQuestionQueryEngine to connect heterogeneous indexes (code, SQL, and documentation). When running a query I get this error:

Error code: 404 - {'error': {'message': 'Unrecognized request argument supplied: tools', 'type': 'invalid_request_error', 'param': None, 'code': None}}

These are my dependencies:

llama-index==0.8.69.post2
llama-hub >= 0.0.44
tree_sitter_languages >= 1.8.0
openai~=1.2.4
numexpr~=2.8.7
tiktoken~=0.5.1

I'm using Azure OpenAI GPT-3.5 Turbo.

Is there a limitation on Azure OpenAI for using this, or should I take a different approach for querying?

Thanks!

Fran
13 comments
Are you using Azure?
I have updated the dependencies and the same thing is still happening 😩

llama-index==0.9.2
llama-hub==0.0.45
tree_sitter_languages >= 1.8.0
openai==1.3.2
numexpr~=2.8.7
tiktoken~=0.5.1
jinja2
SQLAlchemy~=2.0.23
Try with a fresh venv. Also make sure you are using the newest Azure API version.
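
In case it helps, here's a rough sketch of constructing the Azure LLM with an explicit, recent api_version; the endpoint, key, and deployment name below are placeholders rather than values from this thread, and a 404 complaining about an unrecognized tools argument usually points at a preview api-version that predates tool calling:

from llama_index.llms import AzureOpenAI

# Placeholder Azure resource values; substitute your own deployment details.
azure_llm = AzureOpenAI(
    engine="gpt-35-turbo",                # your Azure deployment name
    model="gpt-35-turbo",
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-key>",
    api_version="2023-12-01-preview",     # older previews reject the "tools" argument
)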

I ran this notebook just yesterday and it worked fine for function/tool calling
https://docs.llamaindex.ai/en/stable/examples/customization/llms/AzureOpenAI.html
The problem comes when using the SubQuestionQueryEngine

from llama_index.query_engine import SubQuestionQueryEngine

query_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=query_engine_tools,
    service_context=azure_service_context,
)
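
For reference, here's roughly how the query_engine_tools and azure_service_context used above could be wired up; the index variables and tool names are assumptions for illustration, not from the original post:

from llama_index import ServiceContext
from llama_index.tools import QueryEngineTool, ToolMetadata

# Assumes azure_llm (and ideally an Azure embedding model) is already configured.
azure_service_context = ServiceContext.from_defaults(llm=azure_llm)

# One tool per heterogeneous index (code, SQL, documentation); code_index and
# docs_index are hypothetical indexes built elsewhere.
query_engine_tools = [
    QueryEngineTool(
        query_engine=code_index.as_query_engine(),
        metadata=ToolMetadata(name="code", description="Answers questions about the codebase"),
    ),
    QueryEngineTool(
        query_engine=docs_index.as_query_engine(),
        metadata=ToolMetadata(name="docs", description="Answers questions about the documentation"),
    ),
]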
I will try the fresh venv
Still having the same issue 😩
https://github.com/jbergant/LlamaIndexCourse/blob/main/queryYoutubeAndData.ipynb is a working sample with OpenAI... with Azure OpenAI it does not work
What api_version are you using?
2023-07-01-preview
Thanks Logan for helping me think 🙂