
@kapa.ai

I'm currently developing a complex RAG Pipeline.
This pipeline needs to answer all user inquiries about a specific topic.

I currently have the following tools:

Plain Text
get_trim_list = FunctionTool.from_defaults(
    fn=trim_list_for_specific_model,
    name="trim_list_for_specific_model",
    description="Returns the list of valid trims for a specified model. Use it to verify that the trim the user asks about exists, and use the returned trim name for further processing.",
)

# Tool mainly used to query specific specifications
specific_trim_specs_tool = FunctionTool.from_defaults(
    fn=get_specific_trim_specs,
    name="specific_trim_specs_tool",
    description=(
        "Provides data about product specifications. "
        "Returns the requested specifications as a dictionary for a given trim and category. "
        "If the user question contains 'and', split it into multiple questions first. "
        "Use a detailed plain-text question as input to the tool."
    ),
)
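
For illustration, the functions behind these tools might look like the stubs below. This is a minimal sketch with hypothetical data (`TRIM_DATA`, the model and trim names, and the spec fields are all assumptions); the real implementations would query an actual vehicle database.

```python
# Hypothetical in-memory data standing in for a real vehicle database.
TRIM_DATA = {
    "Model X": {
        "Base": {"price": "$39,900", "engine": "2.0L I4", "horsepower": 201},
        "Sport": {"price": "$45,500", "engine": "2.5L turbo", "horsepower": 275},
    }
}

def trim_list_for_specific_model(model: str) -> list[str]:
    """Return the valid trim names for a model (empty list if unknown)."""
    return list(TRIM_DATA.get(model, {}))

def get_specific_trim_specs(model: str, trim: str, category: str) -> dict:
    """Return the requested spec category for a given model and trim.

    Falls back to the full spec dict when the category is not recognized.
    """
    specs = TRIM_DATA.get(model, {}).get(trim, {})
    return {category: specs.get(category)} if category in specs else specs
```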


I aim to use OpenAIAgent as the entry point for questions.

Plain Text
open_AI_agent = OpenAIAgent.from_tools(
    [specific_trim_specs_tool, get_trim_list],
    verbose=True,
    callback_manager=CallbackManager([token_counter]),
)
response = open_AI_agent.chat("user_question_here")


However, if the user question is extremely complex, such as:
"Can you provide a detailed comparison of the different trim levels available for the specific model, including their respective prices, key specifications, and unique features? Additionally, how do these trims compare in terms of total cost of ownership, considering potential incentives, maintenance costs, and resale value?"

I want OpenAIAgent to use SubQuestionQueryEngine to break the question into several sub-questions, then use specific_trim_specs_tool to retrieve the specifications data, and finally synthesize the responses in English.
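
Conceptually, the decompose-then-synthesize flow described above can be sketched in plain Python. This is illustrative only: the real SubQuestionQueryEngine uses an LLM to generate sub-questions and to synthesize the final answer, whereas here both steps are hard-coded stand-ins (all function names and the naive `" and "` splitting are assumptions).

```python
def decompose(question: str) -> list[str]:
    """Naively split a compound question on ' and ' boundaries.

    Stand-in for the LLM-driven sub-question generation step.
    """
    parts = [p.strip(" ?") for p in question.replace("?", " and ").split(" and ")]
    return [p + "?" for p in parts if p]

def answer_sub_question(sub_q: str) -> str:
    """Stand-in for routing each sub-question to the right tool."""
    return f"[answer to: {sub_q}]"

def synthesize(question: str, answers: list[str]) -> str:
    """Stand-in for LLM synthesis over the intermediate answers."""
    return f"Final answer to '{question}':\n" + "\n".join(answers)

question = "What trims are available and what does each cost?"
sub_questions = decompose(question)
final = synthesize(question, [answer_sub_question(q) for q in sub_questions])
```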
3 comments
Please stand by @歐東 as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
To handle complex queries, you can use the SubQuestionQueryEngine from LlamaIndex. This engine breaks down a complex query into sub-questions for each relevant data source, gathers all the intermediate responses, and synthesizes a final response.

First, you need to import the necessary modules and set up your environment:

Plain Text
import os
import nest_asyncio
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.callbacks import CallbackManager, LlamaDebugHandler
from llama_index.core import Settings

os.environ["OPENAI_API_KEY"] = "sk-..."
nest_asyncio.apply()


Next, you can use the LlamaDebugHandler to print the trace of the sub-questions captured by the SUB_QUESTION callback event type:

Plain Text
llama_debug = LlamaDebugHandler(print_trace_on_end=True)
callback_manager = CallbackManager([llama_debug])
Settings.callback_manager = callback_manager


Then, you can create a SubQuestionQueryEngine. Note that `from_defaults` expects QueryEngineTool instances (each wrapping a query engine), not FunctionTools, and its `llm` argument takes a plain LLM rather than an OpenAIAgent:

Plain Text
# Wrap each underlying engine as a QueryEngineTool first, e.g.:
# trim_specs_engine_tool = QueryEngineTool(
#     query_engine=your_specs_query_engine,
#     metadata=ToolMetadata(name="specific_trim_specs_tool", description="..."),
# )
query_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[trim_list_engine_tool, trim_specs_engine_tool],
    verbose=True,  # llm defaults to Settings.llm
)


Finally, you can use the query_engine to handle complex queries:

Plain Text
response = query_engine.query("Your complex question here")
print(response)


This will break down the complex question into sub-questions, use the specified tools to answer each sub-question, and then synthesize the responses into a final answer.

Please note that you need to replace "Your complex question here" with the actual complex question you want to answer.
For more details, you can refer to the SubQuestionQueryEngine tutorial and the SubQuestionQueryEngine API reference in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------