
At a glance

The community member is trying to retrieve the source nodes and node IDs from the sub-questions in the sub-question query engine. They have found a notebook that shows how to get the sub-query questions, but they are unsure how to get the source nodes for those questions. The comments suggest that the source nodes may be accessed through qa_pair.sub_q.sources, but the community member is not seeing the sub-questions in the callback debugger. They are using Anthropic for synthesis and OpenAI GPT-4 for question generation, and they have tried setting up the callback manager on the LLM instances, but the issue seems to be with the sub-question query engine itself. The community members discuss various approaches to resolving the issue, but there is no explicitly marked answer.

Useful resources
Hi team, I'm trying to retrieve the source nodes and node IDs from the sub-questions in the sub-question query engine. I see that this notebook shows how to get the sub-query questions, but what about the source nodes for those questions?

https://github.com/jerryjliu/llama_index/blob/main/docs/examples/query_engine/sub_question_query_engine.ipynb
20 comments
I thiiiink it's

qa_pair.sub_q.sources
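
(For context, here's a rough sketch of where qa_pair comes from, following the callback pattern in the linked notebook; the event and payload names are assumptions based on that ~0.8.x-era API, and note that sources sits on the pair itself rather than on sub_q:)

from llama_index.callbacks import CallbackManager, LlamaDebugHandler
from llama_index.callbacks.schema import CBEventType, EventPayload

llama_debug = LlamaDebugHandler(print_trace_on_end=True)
callback_manager = CallbackManager([llama_debug])

# ... build the engine with this callback manager and run query_engine.query(...) ...

for start_event, end_event in llama_debug.get_event_pairs(CBEventType.SUB_QUESTION):
    qa_pair = end_event.payload[EventPayload.SUB_QUESTION]
    print("sub question:", qa_pair.sub_q.sub_question)
    # sources should be the NodeWithScore objects used to answer this sub-question
    for source in qa_pair.sources or []:
        print("  node id:", source.node.node_id, "score:", source.score)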
@Logan M I'm not seeing the sub q. I have a response synthesizer on the subq engine, and I don't see SUBQ in the callback debugger at all
Hmm, even if you follow the notebook?
@Logan M ya, not seeing it at all. Could it be that I'm using different LLMs? I didn't see an option to make the debugger global
Which LLM are you using?
Anthropic for synthesis, OpenAI GPT-4 for question gen
Assuming you are using the LlamaIndex LLM classes for each LLM, maybe throw the callback manager on them directly too:

Anthropic(..., callback_manager=callback_manager)
OpenAI(..., callback_manager=callback_manager)
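
(A minimal sketch of that wiring, assuming the same LlamaDebugHandler setup as above; "claude-2" is just an illustrative model name:)

from llama_index import ServiceContext
from llama_index.callbacks import CallbackManager, LlamaDebugHandler
from llama_index.llms import Anthropic, OpenAI

llama_debug = LlamaDebugHandler(print_trace_on_end=True)
callback_manager = CallbackManager([llama_debug])

# hand the same callback manager to both LLMs and to the service context
llm_synth = Anthropic(model="claude-2", callback_manager=callback_manager)
llm_qgen = OpenAI(model="gpt-4-0613", temperature=0, callback_manager=callback_manager)

service_context = ServiceContext.from_defaults(
    llm=llm_synth,
    callback_manager=callback_manager,
)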
the issue was the subq engine not the LLM lol
can you share how you set this up?
right the sub q
query_engine = SubQuestionQueryEngine.from_defaults(
    question_gen=OpenAIQuestionGenerator.from_defaults(
        llm=OpenAI(model="gpt-4-0613", temperature=0),
        verbose=True,
    ),
    query_engine_tools=[tool],
    service_context=service_context,
    response_synthesizer=response_synthesizer_compact,
    verbose=True,
)
the service context comes from the global one, and it's Anthropic
hmmm. And the service context you passed in there for sure has the callback manager set on it?
ya, I have events in the debugger, but not SUBQ
ok was able to recreate it in a simple version
[{'qa_pair': 'sub_q=SubQuestion(sub_question="What is the patient's medical history?", tool_name='use_this') answer=None sources=None'}]
looks like the answer, since it's being synthesized by a different LLM instance, is not being attached
so that's the event start
but there should be an event end with more info
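
(In other words, a small sketch of the start/end distinction, assuming the same llama_debug handler as above:)

from llama_index.callbacks.schema import CBEventType, EventPayload

for start_event, end_event in llama_debug.get_event_pairs(CBEventType.SUB_QUESTION):
    # start event: the pair right after question generation, answer=None, sources=None
    # end event: the same pair after the sub-query ran, with answer and sources filled in
    print(start_event.payload[EventPayload.SUB_QUESTION])
    print(end_event.payload[EventPayload.SUB_QUESTION])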