Reading the docs about the Agentic

Reading the docs about the Agentic Strategies, I created a RouterQueryEngine and that worked very well. But I ended up realizing I wanted a chat-style interface, via chat engines or agents. So I wrapped that RouterQueryEngine in a QueryEngineTool and passed it into an OpenAIAgent. There's just this single tool for the agent, but the agent always has to make an LLM call to decide to use this single tool. Is there any way of setting this QueryEngineTool as the default to be executed and skipping this LLM call? Thanks!
5 comments
```python
# Import paths shown for recent llama-index versions; older releases differ.
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

qt_router_query_engine = QueryEngineTool.from_defaults(
    query_engine=router_query_engine,
    description="This is a general tool made to answer all questions",
)

agent = OpenAIAgent.from_tools(
    [qt_router_query_engine],
    default_tool_choice="query_engine_tool",
    verbose=True,
)
```
I tried using default_tool_choice, but I still get an LLM call to use this tool and then my other tools inside the router are executed.
The chat history influences how the tool is called

You can force the tool to be called with agent.chat("Hello", tool_choice="query_engine_tool"), but it will still require an LLM call

Without the LLM call, it would be the same as using the router query engine on its own πŸ‘€
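To make the tradeoff concrete, here is a toy sketch of a single-tool agent loop. This is not LlamaIndex internals, just a simplified model: even with one tool, the agent's LLM call decides whether to call the tool and what input to pass it (typically condensing chat history into a standalone query), so the call cannot simply be skipped.

```python
def fake_llm_select(chat_history, tools):
    # Stand-in for the agent's LLM call: it picks a tool and writes the
    # tool input from the conversation so far. A real agent LLM could
    # also choose to answer directly without calling any tool.
    query = " ".join(msg for role, msg in chat_history if role == "user")
    return tools[0], query

def agent_chat(message, chat_history, tools):
    chat_history.append(("user", message))
    # This is the "extra" LLM call the question is about: it is what
    # turns chat history into a tool input, not just a tool selector.
    tool, tool_input = fake_llm_select(chat_history, tools)
    answer = tool(tool_input)
    chat_history.append(("assistant", answer))
    return answer

# Stand-in for the RouterQueryEngine wrapped as a tool.
def router_tool(query):
    return f"router answer for: {query!r}"

history = []
print(agent_chat("What is X?", history, [router_tool]))
```

Dropping `fake_llm_select` and calling `router_tool` directly would indeed skip the LLM call, but then the agent adds nothing over the router query engine on its own.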
You could use a router retriever (slightly different) with a context or condense plus context chat engine
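The suggested alternative can be sketched as a toy model (again, not the actual LlamaIndex code): a condense-plus-context chat engine makes one LLM call to condense history plus the new message into a standalone question, runs a retriever directly (here a stand-in for a router retriever, with no tool-selection step), then makes a second LLM call to answer from the retrieved context.

```python
def condense(chat_history, message):
    # Stand-in for the condense LLM call: fold history into one question.
    past = " ".join(msg for _, msg in chat_history)
    return f"{past} {message}".strip()

def router_retrieve(question):
    # Stand-in for a router retriever: routes to a sub-retriever and
    # returns context nodes; no agent-style tool choice involved.
    return [f"context for: {question!r}"]

def synthesize(question, context):
    # Stand-in for the answering LLM call.
    return f"answer to {question!r} using {len(context)} context node(s)"

def chat(message, chat_history):
    question = condense(chat_history, message)
    context = router_retrieve(question)
    answer = synthesize(question, context)
    chat_history.append(("user", message))
    chat_history.append(("assistant", answer))
    return answer

history = []
print(chat("What is X?", history))
```

In LlamaIndex this roughly corresponds to building a CondensePlusContextChatEngine (or ContextChatEngine) around a RouterRetriever; exact constructor arguments and import paths vary by version, so check the docs for your release.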
You're right Logan, it doesn't make sense to skip that LLM call. It's helpful to condense the chat history. Thanks!