I was following the https://docs.llamaindex.ai/en/stable/examples/query_engine/sub_question_query_engine.html tutorial and received this error output:
**********
Trace: query
|_query -> 6.062075 seconds
|_templating -> 0.0 seconds
|_llm -> 6.062075 seconds
**********
Traceback (most recent call last):
File "S:\Gemini-Coder\local-indexer\cmd_local_index_chat.py", line 83, in <module>
respnose = query_engine.query(
File "C:\Users\thecr\AppData\Local\Programs\Python\Python310\lib\site-packages\llama_index\core\base_query_engine.py", line 40, in query
return self._query(str_or_query_bundle)
File "C:\Users\thecr\AppData\Local\Programs\Python\Python310\lib\site-packages\llama_index\query_engine\sub_question_query_engine.py", line 129, in _query
sub_questions = self._question_gen.generate(self._metadatas, query_bundle)
File "C:\Users\thecr\AppData\Local\Programs\Python\Python310\lib\site-packages\llama_index\question_gen\llm_generators.py", line 78, in generate
parse = self._prompt.output_parser.parse(prediction)
File "C:\Users\thecr\AppData\Local\Programs\Python\Python310\lib\site-packages\llama_index\question_gen\output_parser.py", line 13, in parse
raise ValueError(f"No valid JSON found in output: {output}")
ValueError: No valid JSON found in output: Understood! I'll do my best to help you with your questions and provide relevant sub-questions based on the tools provided. Please go ahead and ask your user question, and I'll generate the list of sub-questions accordingly.
I am using a local embedding model and a local language model, but kept everything else the same. I didn't see anything in that doc about linking a JSON file.
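For context on the error above: the traceback shows that `output_parser.py` raises `ValueError` when it can't find JSON in the LLM's reply. The sub-question generator expects the model to answer with a JSON payload of sub-questions, but my local model replied conversationally instead. The following is only an illustrative sketch (using stdlib `re` and `json`, not the actual llama_index parser code) of that kind of check, to show why a chatty reply with no JSON in it fails:

```python
import json
import re

def parse_sub_questions(output: str) -> list:
    """Hypothetical sketch of a sub-question output parser: it looks for
    a JSON array in the raw LLM reply and fails if none is present."""
    match = re.search(r"\[.*\]", output, re.DOTALL)  # find a JSON array anywhere in the reply
    if match is None:
        # Same failure mode as the traceback above
        raise ValueError(f"No valid JSON found in output: {output}")
    return json.loads(match.group(0))

# A well-formed reply parses fine:
ok = parse_sub_questions('[{"sub_question": "What is X?", "tool_name": "docs"}]')

# A conversational reply with no JSON reproduces the error class seen above:
try:
    parse_sub_questions("Understood! I'll do my best to help you with your questions.")
except ValueError as e:
    print("parser raised:", e)
```

So the issue is not a JSON file to link; it's that the local model's text output doesn't contain the JSON the question generator's parser expects.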