Reading the docs about the Agentic Strategies

At a glance

The community member created a RouterQueryEngine that worked well, but then wanted to use a chat-style interface. They wrapped the RouterQueryEngine in a QueryEngineTool and passed it to an OpenAIAgent. However, the agent always has to make an LLM call to decide to use this single tool, and the community member wants to know if there's a way to set the QueryEngineTool as the default and skip the LLM call.

In the comments, another community member suggests using default_tool_choice to set the QueryEngineTool as the default, but this still requires an LLM call. Another community member explains that the chat history influences how the tool is called, and you can force the tool to be called with agent.chat("Hello", tool_choice="query_engine_tool"), but this will still require an LLM call. The final comment suggests that it doesn't make sense to skip the LLM call, as it's helpful to condense the chat history.

There is no explicitly marked answer in the comments.

Reading the docs about the Agentic Strategies, I created a RouterQueryEngine and that worked very well. But I ended up realizing I wanted to use a chat-style interface, via chat engines or agents. So I wrapped that RouterQueryEngine in a QueryEngineTool and passed it into an OpenAIAgent. There's just this single tool for the agent, but the agent always has to make an LLM call to decide to use this single tool. Is there any way of setting this QueryEngineTool as the default to be executed and skip this LLM call? Thanks!
5 comments
Python
# Imports assume the llama_index.core package layout (v0.10+)
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

# Wrap the existing RouterQueryEngine as a tool; with no name given,
# from_defaults assigns the default name "query_engine_tool".
qt_router_query_engine = QueryEngineTool.from_defaults(
    query_engine=router_query_engine,
    description="This is a general tool made to answer all questions",
)

# Point the agent at that single tool by its default name.
agent = OpenAIAgent.from_tools(
    [qt_router_query_engine],
    default_tool_choice="query_engine_tool",
    verbose=True,
)
I tried using default_tool_choice, but I still get an LLM call to select this tool, and only then are the tools inside the router executed.
The chat history influences how the tool is called

You can force the tool to be called with agent.chat("Hello", tool_choice="query_engine_tool"), but it will still require an LLM call
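A minimal sketch of forcing the tool on a single call, reusing the agent from above (the message text is illustrative):
Python
# Force the agent to call the tool named "query_engine_tool" for this
# one message; the LLM call still happens, but the tool choice is fixed.
response = agent.chat(
    "Summarize the key agentic strategies",
    tool_choice="query_engine_tool",
)
print(response)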

Without the LLM call, it would be the same as using the router query engine on its own πŸ‘€
You could use a router retriever (slightly different) with a context or condense-plus-context chat engine
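A minimal sketch of that alternative, assuming two existing indexes (vector_index and summary_index are illustrative names) and the llama_index.core package layout:
Python
from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.retrievers import RouterRetriever
from llama_index.core.tools import RetrieverTool

# One RetrieverTool per underlying index; the router selects between them.
retriever_tools = [
    RetrieverTool.from_defaults(
        retriever=vector_index.as_retriever(),
        description="Useful for specific factual questions",
    ),
    RetrieverTool.from_defaults(
        retriever=summary_index.as_retriever(),
        description="Useful for summarization questions",
    ),
]

router_retriever = RouterRetriever.from_defaults(retriever_tools=retriever_tools)

# The chat engine condenses the chat history into a standalone question,
# retrieves with the router, and answers over the retrieved context.
chat_engine = CondensePlusContextChatEngine.from_defaults(retriever=router_retriever)

response = chat_engine.chat("What do the docs say about agentic strategies?")
print(response)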
You're right Logan, it doesn't make sense to skip that LLM call. It's helpful to condense the chat history. Thanks!