are there independently usable modules

Are there independently usable modules for the components that underlie the sub-question query engine?

Specifically, I’m trying to play around with the component that does the sub-query generation. Is there anything special to it, or under the hood is it as simple as directly prompting a base LLM to decompose your query?
2 comments
Pretty sure it just breaks down the original query into sub-queries with an LLM call, gathers the answers, and then synthesizes a final response: https://github.com/run-llama/llama_index/tree/9b798f819bb0afa6dabf418f8f2db87a31125d5e/llama_index/question_gen
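
To illustrate the "it's basically one LLM call" point, here is a minimal, LlamaIndex-free sketch of the decomposition step using the OpenAI chat API. The model name, prompt wording, and the `decompose` helper are illustrative assumptions, not what the library actually ships:

```python
import json
from openai import OpenAI  # assumes the openai>=1.x client

client = OpenAI()

def decompose(query: str, sources: dict[str, str]) -> list[dict]:
    """Ask the LLM to split a query into sub-questions, one per relevant source.

    `sources` maps a source name to a short description of what it contains.
    Hypothetical helper for illustration only.
    """
    prompt = (
        "Break the user question into sub-questions, each targeted at one of "
        f"these sources: {json.dumps(sources)}.\n"
        'Respond with only a JSON list of {"sub_question": ..., "source": ...} objects.\n'
        f"Question: {query}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    # Assumes the model returned well-formed JSON; real code should handle parse errors.
    return json.loads(resp.choices[0].message.content)

sub_questions = decompose(
    "Compare revenue growth of Uber and Lyft in 2021",
    {"uber_10k": "Uber's 2021 annual report", "lyft_10k": "Lyft's 2021 annual report"},
)
print(sub_questions)
```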
There is a standalone question generator component

It breaks an initial question down into sub-questions and decides where to send each one

You can use OpenAI function calling or plain JSON output parsing for this
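
A minimal sketch of using that generator on its own, assuming the pre-0.10 module layout in the repo link above (import paths and class locations may differ in newer releases; the tool names and descriptions are just placeholders):

```python
from llama_index.question_gen.llm_generators import LLMQuestionGenerator
# OpenAI-function-calling variant lives in llama_index.question_gen.openai_generator
from llama_index.tools import ToolMetadata
from llama_index.indices.query.schema import QueryBundle

# Describe the places a sub-question could be routed to.
tools = [
    ToolMetadata(name="uber_10k", description="Uber's 2021 annual report"),
    ToolMetadata(name="lyft_10k", description="Lyft's 2021 annual report"),
]

# Plain JSON-output-parsing generator; OpenAIQuestionGenerator uses function calling instead.
question_gen = LLMQuestionGenerator.from_defaults()

sub_questions = question_gen.generate(
    tools=tools,
    query=QueryBundle("Compare revenue growth of Uber and Lyft in 2021"),
)
for sq in sub_questions:
    print(f"{sq.tool_name}: {sq.sub_question}")
```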