I was using a `RouterQueryEngine` and that worked very well. But I ended up realizing I wanted to use a chat style, via chat engines or agents. So I wrapped that `RouterQueryEngine` in a `QueryEngineTool` and passed that into an `OpenAIAgent`. There's just this single tool for the agent, but the agent always has to make an LLM call to decide to use this single tool. Is there any way of setting this `QueryEngineTool` as the default to be executed and skipping that LLM call? Thanks!

```python
qt_router_query_engine = QueryEngineTool.from_defaults(
    query_engine=router_query_engine,
    description="This is a general tool made to answer all questions",
)
agent = OpenAIAgent.from_tools(
    [qt_router_query_engine],
    default_tool_choice="query_engine_tool",
    verbose=True,
)
```
You can also call `agent.chat("Hello", tool_choice="query_engine_tool")`, but it will still require an LLM call.
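That LLM call is hard to avoid with an agent, because the LLM is what generates the tool's input arguments, not just the choice of tool. If the goal is purely "one tool, always run it," the idea can be sketched without an agent at all: dispatch every message straight to the single tool. This is a minimal plain-Python sketch of that pattern; `SingleToolChat` and `fake_router_query` are illustrative stand-ins, not real LlamaIndex APIs.

```python
class SingleToolChat:
    """Wraps exactly one 'tool' (any callable) and always invokes it,
    skipping the LLM round trip an agent would make to pick a tool
    and generate its arguments."""

    def __init__(self, tool):
        self.tool = tool

    def chat(self, message: str) -> str:
        # No tool-selection step: pass the user message straight through.
        return self.tool(message)


# Stand-in for router_query_engine.query(...)
def fake_router_query(question: str) -> str:
    return f"answer to: {question}"


chat = SingleToolChat(fake_router_query)
print(chat.chat("Hello"))  # → answer to: Hello
```

In real code the equivalent shortcut is just calling `router_query_engine.query(message)` directly for turns where you know the tool must run, and reserving the agent for turns that need its reasoning.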