Hello, did anybody else encounter an error with the guidance question generator for the SubQuestionQueryEngine?

Example of error:
raise OutputParserException(
llama_index.output_parsers.base.OutputParserException: Got invalid JSON object. Error: Expecting property name enclosed in double quotes: line 2 column 14 (char 15) while parsing a flow mapping
in "<unicode string>", line 2, column 14:
"items": [{{#geneach 'items' stop=']'}}{{#u ...
^
expected ',' or '}', but got '<scalar>'
in "<unicode string>", line 3, column 48:
... ": "{{gen 'sub_question' stop='"'}}",
6 comments
interesting, I haven't seen this one πŸ€” Seems like the JSON maybe got messed up because of quotes?
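The quotes theory matches the first traceback: "Expecting property name enclosed in double quotes" is exactly what Python's JSON parser raises when keys are single-quoted. A minimal reproduction with plain `json` (the input string is an illustrative guess, not the actual model output):

```python
import json

# Single-quoted keys are invalid JSON and reproduce the first error
# message from the traceback above.
bad = "{ 'items': [] }"

try:
    json.loads(bad)
except json.JSONDecodeError as e:
    print(e.msg)  # Expecting property name enclosed in double quotes
```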
I am getting the same error.
Hi, I have the same issue using SubQuestionQueryEngine, but only if it needs to hit two indexes; if it uses one tool there's no problem. It used to work, but I upgraded to 0.9.42.
This is the error: raise OutputParserException(
llama_index.output_parsers.base.OutputParserException: Got invalid JSON object. Error: Expecting ',' delimiter: line 22 column 10 (char 670) while parsing a flow sequence
in "<unicode string>", line 2, column 14:
"items": [
^
expected ',' or ']', but got '<stream end>'
in "<unicode string>", line 22, column 10:
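That second traceback ("expected ',' or ']', but got '<stream end>'") is what a truncated completion looks like to the parser: the model hit its token limit before closing the "items" array. A minimal sketch of the failure mode, where the JSON string is an illustrative guess at the shape of the output, not the real completion:

```python
import json

# Simulated completion cut off mid-stream: the closing brackets for the
# object, the array, and the outer object never arrive.
truncated = '{\n  "items": [\n    {"sub_question": "What is X?", "tool_name": "vector_index"'

try:
    json.loads(truncated)
except json.JSONDecodeError as e:
    print(f"parse failed: {e.msg}")
```

This is why raising the generation token budget (rather than changing the prompt) resolves it: the JSON itself was well-formed up to the point it was cut off.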
OK, sorted: I increased max_new_tokens to 512. However, it did slow things down.
from llama_index.llms import LlamaCPP

def createLLM(path: str, temp: float):
    llm = LlamaCPP(
        model_url=None,
        model_path=path,
        temperature=temp,
        # Larger output budget so the generated JSON isn't cut off mid-stream
        max_new_tokens=512,
        context_window=20000,
        generate_kwargs={},
        model_kwargs={"n_gpu_layers": 4, "n_threads": 10, "n_threads_batch": 10},
        # messages_to_prompt=messages_to_prompt,
        # completion_to_prompt=completion_to_prompt,
        verbose=True,
    )
    return llm