Updated 2 months ago

What is the difference between the Multi-Step Query Engine and the Sub-Question Query Engine? Is there any implementation where I can have control over query decomposition, to add or remove a question from the sub-questions generated by these engines before they are sent to the LLM?
4 comments
@Logan M I'm delving into the functionalities of the MultiStepQueryEngine, particularly focusing on the task of decomposing complex queries into simpler, manageable sub-questions with the help of StepDecomposeQueryTransform.

Firstly, I am curious to know if there is an implementation within these engines that allows for detailed control over the decomposition of queries. Essentially, I'm looking for a way to validate and adjust the decomposed steps – such as adding or removing sub-questions – before they are processed by a Large Language Model (LLM).

Secondly, and most importantly, I am interested in understanding how to effectively orchestrate these steps or sub-questions. My goal is to receive responses from the LLM sequentially for each sub-question, ensuring that there is a coherent and logical relation between each step. This sequential processing is crucial for the integrity of the final answer, especially when dealing with complex queries.

Is there a method or an existing implementation in these engines that not only allows for modification of the sub-questions but also facilitates their sequential orchestration to garner a cohesive response from the LLM?

Any insights, experiences, or suggestions regarding this would be immensely helpful. Thank you in advance for your assistance!
I would either subclass the question generator class to give you control over the process, or implement the entire pipeline yourself using lower-level features.
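
A rough sketch of that subclassing approach, assuming the legacy llama_index layout where LLMQuestionGenerator.generate(tools, query) returns a list of SubQuestion objects (module paths and signatures may differ in newer versions, and ReviewedQuestionGenerator plus its filter rule are purely illustrative):

```python
from typing import List, Sequence

from llama_index.question_gen.llm_generators import LLMQuestionGenerator
from llama_index.question_gen.types import SubQuestion
from llama_index.schema import QueryBundle
from llama_index.tools import ToolMetadata


class ReviewedQuestionGenerator(LLMQuestionGenerator):
    """Question generator with a hook to edit sub-questions before they run."""

    def generate(
        self, tools: Sequence[ToolMetadata], query: QueryBundle
    ) -> List[SubQuestion]:
        # Let the default generator propose sub-questions first.
        proposed = super().generate(tools, query)

        # Hook point: drop, rewrite, or append sub-questions here, before the
        # SubQuestionQueryEngine executes them. The placeholder rule below just
        # drops empty questions; swap in manual review, a rules check, or
        # another LLM call.
        reviewed = [q for q in proposed if q.sub_question.strip()]

        # Example of adding your own question routed to a specific tool:
        # reviewed.append(SubQuestion(sub_question="...", tool_name=tools[0].name))
        return reviewed
```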

Although our latest agent refactors also sound like they could be useful -- they allow you to run things step by step

https://docs.llamaindex.ai/en/stable/examples/agent/agent_runner/agent_runner.html
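
The step-wise execution described in that notebook looks roughly like this; OpenAIAgent is just one example agent, `tools` is assumed to be a list of tool objects you have already built, and the exact imports depend on your llama_index version:

```python
from llama_index.agent import OpenAIAgent

# `tools` is assumed to be a list of QueryEngineTool / FunctionTool objects.
agent = OpenAIAgent.from_tools(tools, verbose=True)

# Create a task instead of calling agent.chat(), then drive it one step at a time.
task = agent.create_task("Compare revenue and headcount trends across the two reports")

step_output = agent.run_step(task.task_id)
while not step_output.is_last:
    # Between steps you can inspect intermediate results and decide how to proceed.
    step_output = agent.run_step(task.task_id)

response = agent.finalize_response(task.task_id)
print(str(response))
```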
Can you give me more details about the question generator class and where it is configured?
query_engine = SubQuestionQueryEngine.from_defaults(question_gen=question_gen)

We have a few question gen modules here. They rely on either function-calling APIs or parsing structured outputs. Basically, it gets the LLM to generate the questions and decide which sub-index to send each question to.
https://github.com/run-llama/llama_index/tree/main/llama_index/question_gen
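
For context, a minimal wiring sketch under the same legacy-import assumption; sales_index and hr_index stand in for whatever indexes you already have, and the tool names and descriptions are placeholders:

```python
from llama_index.query_engine import SubQuestionQueryEngine
from llama_index.question_gen.llm_generators import LLMQuestionGenerator
from llama_index.tools import QueryEngineTool, ToolMetadata

# One QueryEngineTool per sub-index; the question generator decides which tool
# each generated sub-question is routed to.
query_engine_tools = [
    QueryEngineTool(
        query_engine=sales_index.as_query_engine(),
        metadata=ToolMetadata(name="sales", description="Quarterly sales reports"),
    ),
    QueryEngineTool(
        query_engine=hr_index.as_query_engine(),
        metadata=ToolMetadata(name="hr", description="HR policy documents"),
    ),
]

# Swap in a custom generator (e.g. the ReviewedQuestionGenerator sketched above)
# to control which sub-questions are actually executed.
question_gen = LLMQuestionGenerator.from_defaults()

query_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=query_engine_tools,
    question_gen=question_gen,
)
response = query_engine.query("How did sales headcount change last quarter?")
```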