Router

Using a router, I would like the router to be able to tell whether a request is meant for one of the retriever_tools or is just a standard ChatGPT-style query. How can I do that?
Current code:

```python
c_tool = RetrieverTool.from_defaults(
    retriever=retriever_c,
    description="Useful to retrieve any information related to c",
)
router_retriever = RouterRetriever(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    retriever_tools=[a_tool, b_tool, c_tool],
)
```
You could have a top-level router with two tools -- a simple GPT query, or a retriever router

You can turn any function into a tool using FunctionTool (see Add numbers here)


https://docs.llamaindex.ai/en/stable/module_guides/deploying/agents/tools/usage_pattern.html#using-with-our-agents
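A framework-agnostic sketch of that top-level setup, assuming stand-in names (`llm_answer`, `retriever_router`, and the keyword-based `select_tool` are illustrative placeholders, not LlamaIndex APIs; in LlamaIndex you would wrap the plain-LLM function with `FunctionTool` and let an LLM-based selector such as `LLMSingleSelector` do the choosing):

```python
# Sketch of a two-way top-level router: route either to a plain LLM
# answer or to a retriever router. The selector here is a trivial
# keyword stand-in for an LLM-based selector.

def llm_answer(query: str) -> str:
    # Placeholder for a direct LLM call (e.g. llm.complete(query)).
    return f"LLM answer to: {query}"

def retriever_router(query: str) -> str:
    # Placeholder for the RouterRetriever over a_tool/b_tool/c_tool.
    return f"Retrieved context for: {query}"

TOOLS = {
    "llm": llm_answer,
    "retrieve": retriever_router,
}

def select_tool(query: str) -> str:
    # Stand-in selection logic: a real LLM selector would read each
    # tool's description and pick one; here we just match keywords.
    domain_keywords = ("a", "b", "c")  # topics the retrievers cover
    words = query.lower().split()
    return "retrieve" if any(k in words for k in domain_keywords) else "llm"

def top_level_router(query: str) -> str:
    # Dispatch the query to whichever tool the selector picked.
    return TOOLS[select_tool(query)](query)
```

With this shape, a domain question lands in the retriever branch while an open-ended request falls through to the plain LLM tool.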
got it. But which router should I use? Those I see get either a retriever or a query engine as tools.
@Logan M sorry for the follow up but it's unclear for me in the doc how to pass a normal tool to a router. I only see retriever or query engine tools being able to be passed. Am I missing something?
yea I guess the routers only accept query engine tools as inputs. You could make a fake query engine that just queries the LLM directly I suppose
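A minimal sketch of that "fake query engine" idea (framework-agnostic: `StubLLM` is a hypothetical stand-in for a real LLM client, and the class just mimics the single `.query` method a router expects; in LlamaIndex you would more likely subclass its custom query engine support instead):

```python
class StubLLM:
    # Hypothetical stand-in for a real LLM client.
    def complete(self, prompt: str) -> str:
        return f"completion for: {prompt}"

class LLMOnlyQueryEngine:
    """A 'query engine' that skips retrieval and asks the LLM directly.

    It mirrors the query-engine interface (a single .query method) so
    it can sit behind a router next to real retrieval-backed engines.
    """

    def __init__(self, llm):
        self._llm = llm

    def query(self, query_str: str) -> str:
        # No retrieval step: just forward the question to the LLM.
        return self._llm.complete(query_str)
```

Usage: `LLMOnlyQueryEngine(StubLLM()).query("hello")` behaves like any other query engine from the router's point of view.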
thanks, just built a custom retriever (my other tools are retrievers) where the init param is a chat_engine and _retrieve just queries it:

```python
def _retrieve(self, query_bundle) -> list:
    response = []
    response.append(self._chat_engine.chat(query_bundle.query_str))
    return response
```
that seems to work
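Fleshed out as a framework-agnostic sketch (`QueryBundle` and `StubChatEngine` here are minimal stand-ins, not the library's classes; in LlamaIndex a custom retriever would subclass its base retriever class, and `_retrieve` conventionally returns a list of scored nodes rather than raw chat responses):

```python
from dataclasses import dataclass

@dataclass
class QueryBundle:
    # Minimal stand-in for the query object passed to _retrieve.
    query_str: str

class StubChatEngine:
    # Hypothetical stand-in for a real chat engine.
    def chat(self, message: str) -> str:
        return f"chat reply to: {message}"

class ChatEngineRetriever:
    """Wraps a chat engine so it can be used wherever a retriever is
    expected, e.g. as one branch of a router over retriever tools."""

    def __init__(self, chat_engine):
        self._chat_engine = chat_engine

    def _retrieve(self, query_bundle: QueryBundle) -> list:
        # Query the chat engine and return its reply as a one-element
        # list, mimicking a retriever's list-of-results contract.
        return [self._chat_engine.chat(query_bundle.query_str)]
```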