Find answers from the community

Hayago
Offline, last seen 3 months ago
Joined September 25, 2024
Hayago · Stack

Hi, I got this error when running the sub-question query engine. I tried to debug it myself, but it seems the global_stack_trace becomes empty after the first sub-question query completes. I have no idea how to go about fixing it.

Traceback (most recent call last):
  File "/opt/conda/envs/xxx/lib/python3.10/site-packages/llama_index/indices/query/base.py", line 23, in query
    response = self._query(str_or_query_bundle)
  File "/opt/conda/envs/xxx/lib/python3.10/site-packages/llama_index/query_engine/sub_question_query_engine.py", line 142, in _query
    qa_pairs_all = [
  File "/opt/conda/envs/xxx/lib/python3.10/site-packages/llama_index/query_engine/sub_question_query_engine.py", line 143, in <listcomp>
    self._query_subq(sub_q, color=colors[str(ind)])
  File "/opt/conda/envs/xxx/lib/python3.10/site-packages/llama_index/query_engine/sub_question_query_engine.py", line 238, in _query_subq
    with self.callback_manager.event(
  File "/opt/conda/envs/xxx/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/opt/conda/envs/xxx/lib/python3.10/site-packages/llama_index/callbacks/base.py", line 169, in event
    event.on_start(payload=payload)
  File "/opt/conda/envs/xxx/lib/python3.10/site-packages/llama_index/callbacks/base.py", line 242, in on_start
    self._callback_manager.on_event_start(
  File "/opt/conda/envs/xxx/lib/python3.10/site-packages/llama_index/callbacks/base.py", line 105, in on_event_start
    parent_id = global_stack_trace.get()[-1]
IndexError: list index out of range
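For context on the failure: the IndexError means the callback manager's trace stack was already empty when a new event tried to read its parent frame. A stdlib-only sketch of that mechanism (all names here are hypothetical stand-ins, not llama_index's actual internals) shows how a ContextVar-backed stack pushed in one context can come up empty in another:

```python
from contextvars import ContextVar, copy_context

# Hypothetical stand-in for llama_index's global_stack_trace ContextVar.
global_stack_trace: ContextVar[list] = ContextVar("trace", default=[])

def start_trace():
    # Push a root frame, as a callback manager might at the start of a query.
    global_stack_trace.set(["root"])

def on_event_start():
    stack = global_stack_trace.get()
    if not stack:
        # Same situation as the traceback above: stack[-1] on an empty list.
        raise IndexError("list index out of range")
    return stack[-1]

# Setting the root frame inside a copied context leaves the outer
# context's stack untouched, so the next event sees an empty stack.
ctx = copy_context()
ctx.run(start_trace)           # root frame exists only inside ctx
try:
    parent = on_event_start()  # outer context: stack is still []
except IndexError as e:
    print("reproduced:", e)
```

If the real bug follows this pattern, the second sub-question's events are starting in a context (or thread) that never saw the root frame pushed for the first one.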
4 comments
Did anybody have any luck using StarChat as the LLM in a query engine? Given the same query, StarChat doesn't answer the question at all, while the OpenAI model provides one.
2 comments
Hayago · Custom

I would like to create a custom agent using a local LLM. I have already implemented a local LLM using the CustomLLM class. Is there a similar guide on implementing a custom agent?
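No guide is linked in this thread, but the general shape of a custom agent is an agent loop that delegates text generation to a pluggable LLM callable. A stdlib-only sketch of that pattern (the class, tool-matching logic, and fake LLM below are all hypothetical, not the llama_index agent API):

```python
from typing import Callable

class SimpleAgent:
    """Minimal illustrative agent: ask the LLM which tool to use, then call it."""

    def __init__(self, llm: Callable[[str], str], tools: dict):
        self.llm = llm      # e.g. a local CustomLLM's completion function
        self.tools = tools  # tool name -> callable

    def run(self, query: str) -> str:
        # A real agent would parse a structured response; here we just
        # look for a tool name in the LLM's free-text decision.
        decision = self.llm(f"Which tool answers: {query}?")
        for name, tool in self.tools.items():
            if name in decision:
                return tool(query)
        return decision  # fall back to the raw LLM answer

# Fake local LLM standing in for a CustomLLM subclass.
def fake_local_llm(prompt: str) -> str:
    return "use calculator" if "2+2" in prompt else "no tool"

agent = SimpleAgent(fake_local_llm, {"calculator": lambda q: "4"})
print(agent.run("what is 2+2?"))  # -> 4
```

The point is only the delegation boundary: whatever agent framework you use, the LLM is just a `str -> str` dependency, so a local model can slot in wherever an OpenAI-backed one did.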
2 comments
Hello, did anybody else encounter an error with the guidance question generator for the sub-question query engine?

Example of error:
raise OutputParserException(
llama_index.output_parsers.base.OutputParserException: Got invalid JSON object. Error: Expecting property name enclosed in double quotes: line 2 column 14 (char 15) while parsing a flow mapping
in "<unicode string>", line 2, column 14:
"items": [{{#geneach 'items' stop=']'}}{{#u ...
^
expected ',' or '}', but got '<scalar>'
in "<unicode string>", line 3, column 48:
... ": "{{gen 'sub_question' stop='"'}}",
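Judging from the error text, the raw guidance template (with its unrendered {{...}} handlebars) is reaching the JSON parser instead of a rendered result. A minimal stdlib reproduction of that class of failure (the template string below is illustrative, not the actual prompt):

```python
import json

# If the guidance program never renders, the handlebars placeholders
# survive into the string handed to the JSON parser.
unrendered = '{"items": [{{#geneach \'items\'}}...{{/geneach}}]}'

try:
    json.loads(unrendered)
except json.JSONDecodeError as e:
    # Fails the same way as above: "Expecting property name enclosed
    # in double quotes", because '{{' opens an object with no key.
    print("invalid JSON:", e.msg)
```

So the thing to check is why rendering failed (model output, guidance version, stop tokens), not the JSON parsing step itself.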
6 comments
Is anybody else facing inconsistency issues with OpenAI models? I've been using the sub-question query engine, which was supposed to route questions to different tools, but this morning it started routing all queries to one tool only, despite no changes to the prompt or data.
4 comments
Hayago · Persona

Hi guys, a really newbie question, but is there any way to get a query engine to assume a certain persona? In particular, a sub-question query engine. I tried to include it in the prompt, but the LLM doesn't follow the persona instruction well. Was wondering if anyone has had success providing a custom text_qa_template in the response synthesizer for a custom persona.
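One way to try this: prepend the persona instruction to the QA template itself, so it is repeated for every sub-question rather than stated once in the top-level query. A sketch, assuming the {context_str}/{query_str} placeholder convention that llama_index QA prompts use (the persona text and template wording here are made up):

```python
# Persona-prefixed QA template; how you pass it in (text_qa_template=...)
# depends on your llama_index version, so this only builds the string.
PERSONA_QA_TEMPLATE = (
    "You are a seasoned financial analyst. Always answer in that persona.\n"
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context and not prior knowledge, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

prompt = PERSONA_QA_TEMPLATE.format(
    context_str="Revenue grew 12% year over year.",
    query_str="How did revenue change?",
)
print(prompt.splitlines()[0])  # persona instruction comes first
```

Putting the persona line at the very top of the per-chunk QA prompt tends to work better than burying it in the user query, since the synthesizer sees it on every call.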
3 comments
Hi guys, pretty new here. Wanted to check if anyone knows whether it's possible to compose a bunch of composable graphs?
4 comments