Hi - I'm trying to get some query_engine sample code to work from this page:
https://docs.llamaindex.ai/en/stable/examples/usecases/10k_sub_question.html
llama-index is version 0.9.13.
One key difference between the sample and my code is that I am using a local LLM rather than OpenAI. I am setting a dummy API key and an alternative base_url that points at my local server, and my embedding model is also a local Hugging Face one. It builds the indexes fine and makes a call to the local LLM, but the query engine then fails.
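A minimal sketch of my setup, roughly following the linked notebook. The model name, data path, and base URL below are placeholders for my local environment, not the exact values I use:

```python
# Sketch of the setup (llama-index 0.9.x API); placeholders marked in comments.
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings import HuggingFaceEmbedding
from llama_index.llms import OpenAILike
from llama_index.query_engine import SubQuestionQueryEngine
from llama_index.tools import QueryEngineTool, ToolMetadata

# Local LLM behind an OpenAI-compatible endpoint; key is a dummy value.
llm = OpenAILike(
    api_key="sk-dummy",                    # placeholder local key
    api_base="http://localhost:8000/v1",   # placeholder local server URL
    model="local-model",                   # placeholder model name
)

# Local Hugging Face embedding model (placeholder name).
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)

# Index building works fine against the local models.
docs = SimpleDirectoryReader("data/10k").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(docs, service_context=service_context)

tool = QueryEngineTool(
    query_engine=index.as_query_engine(),
    metadata=ToolMetadata(name="sec_10k", description="10-K filing content"),
)

# The failure happens when this engine runs a query.
engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[tool],
    service_context=service_context,
)
response = engine.query("Compare revenue growth across the filings.")
```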
The error is "ValueError: Expected tool_calls in ai_message.additional_kwargs, but none found.".
Source and error trace are in the included error.txt file.