llama_index.output_parsers.base.OutputParserException: Got invalid JSON object. Error: Extra data: line 15 column 1 (char 423) while scanning for the next token found character '`' that cannot start any token in "<unicode string>", line 15, column 1:
The code seems correct, so maybe it's because I'm using offline Llama-2 through HuggingFaceLLM, if that has anything to do with it? I'm now also having trouble with RouterQueryEngine invoking the LLMSingleSelector built from an LLMPredictor that wraps this HuggingFaceLLM. Are there any tutorials, instructions, or anything else on how to use LlamaIndex with Llama-2 together with these nice features like the subquestion/router query engines?
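For what it's worth, a guess at what's happening here (not confirmed): the "Extra data" / backtick error usually means the model wrapped its JSON answer in a markdown code fence, which the parser then chokes on. A minimal sketch of a workaround, stripping fences before parsing (the function name is made up for illustration):

```python
import json
import re

def parse_fenced_json(raw: str):
    # Strip a markdown code fence (``` or ```json) that weaker LLMs
    # often wrap around their JSON output before parsing it.
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", raw, re.DOTALL)
    text = match.group(1) if match else raw.strip()
    return json.loads(text)

# Typical Llama-2 style output: valid JSON buried inside a fence.
raw_output = '```json\n[{"choice": 1, "reason": "most relevant tool"}]\n```'
print(parse_fenced_json(raw_output))  # → [{'choice': 1, 'reason': 'most relevant tool'}]
```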
Usually the tools that use output parsing (like all the ones you listed here) require pretty smart LLMs. In general, open-source LLMs are not great at this yet.
You can customize the underlying selectors/question generators, but there's no tutorial (yet) on that. You kind of have to track things down in the code and subclass the respective generator/selector so that it works the way you want.
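To make the subclassing idea concrete, here's a rough, self-contained sketch with hypothetical class names (the real base classes live under llama_index's selectors/output parsers, so check the source for the exact interfaces): the idea is to override the parsing step so it salvages JSON out of the noisy text a smaller model produces:

```python
import json
import re

class SelectionOutputParser:
    """Stand-in for the library's selection parser (hypothetical, for illustration)."""
    def parse(self, output: str):
        return json.loads(output)

class TolerantSelectionParser(SelectionOutputParser):
    """Subclass that salvages the first JSON array/object from noisy LLM output."""
    def parse(self, output: str):
        # Grab the outermost [...] or {...} region and ignore surrounding chatter.
        match = re.search(r"(\[.*\]|\{.*\})", output, re.DOTALL)
        if match is None:
            raise ValueError(f"No JSON found in: {output!r}")
        return json.loads(match.group(1))

noisy = 'Sure! Here is my selection:\n[{"choice": 2, "reason": "summaries"}]\nHope that helps.'
print(TolerantSelectionParser().parse(noisy))  # → [{'choice': 2, 'reason': 'summaries'}]
```

Same pattern applies to the question generator: subclass it, swap in a more forgiving parser, and pass your subclass where the query engine takes a selector/generator.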
Yeah, unfortunately I've just noticed this. Btw, there used to be a class called HuggingFaceLLMPredictor that was deprecated, but the migration guide is gone now (404 link). Do you know if there is any way to initialize LLMPredictor so it treats the input HuggingFaceLLM as some kind of special LLM that can predict? It's giving me empty JSONs as output. Maybe it's the LLM, or maybe it's the predictor.
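In case it helps, a sketch of the setup as I understand the current llama_index API (versions differ, so treat this as an assumption to verify): you shouldn't need the old HuggingFaceLLMPredictor at all, because ServiceContext.from_defaults accepts the HuggingFaceLLM directly and wraps it in an LLMPredictor internally:

```python
from llama_index import ServiceContext
from llama_index.llms import HuggingFaceLLM

# HuggingFaceLLM runs a local model; the model name here is just a placeholder.
llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-13b-chat-hf",
    tokenizer_name="meta-llama/Llama-2-13b-chat-hf",
    context_window=4096,
    max_new_tokens=256,
)

# from_defaults wraps the llm in an LLMPredictor for you, so any
# query engine or selector built from this context will use it.
service_context = ServiceContext.from_defaults(llm=llm)
```

If that still yields empty JSONs, printing the raw LLM completion before parsing should tell you whether it's the model or the predictor layer.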
Llama 7B was just too weak indeed: it generated JSONs however it wanted and not according to the LLMSelector instructions. Llama 13B handles it well enough. So that was actually the reason behind both the subquestion query engine and router query engine problems.