The community member is trying to create a router that can distinguish between a request for one of the retriever tools and a standard ChatGPT query. They share sample code for a RouterRetriever configured with several retriever tools. The comments suggest creating a top-level router with two tools, one for a plain GPT query and one for the retriever router, but it is unclear to the community member how to pass a normal tool to a router, since routers appear to accept only retriever or query engine tools.
The community members discuss a potential solution: a "fake" query engine that queries the language model directly. The community member ends up building a custom retriever whose init parameter is a chat engine and whose retrieve method simply queries it, and this appears to work for them.
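A minimal sketch of the suggested top-level router, assuming the pre-0.10 llama_index API (ServiceContext-based) that the thread uses; DirectLLMQueryEngine, llm, retriever_query_engine, top_router, and the tool descriptions are illustrative assumptions, not code from the thread:

from typing import Any

from llama_index.query_engine import (
    CustomQueryEngine,
    RetrieverQueryEngine,
    RouterQueryEngine,
)
from llama_index.selectors.llm_selectors import LLMSingleSelector
from llama_index.tools import QueryEngineTool


class DirectLLMQueryEngine(CustomQueryEngine):
    """A "fake" query engine that sends the query straight to the LLM."""

    llm: Any  # any LLM object exposing .complete()

    def custom_query(self, query_str: str) -> str:
        return str(self.llm.complete(query_str))


# wrap the existing retriever router in a query engine so both options look alike
retriever_query_engine = RetrieverQueryEngine.from_args(
    router_retriever, service_context=service_context
)

top_router = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    query_engine_tools=[
        QueryEngineTool.from_defaults(
            query_engine=DirectLLMQueryEngine(llm=llm),
            description="Useful for general questions with no document context",
        ),
        QueryEngineTool.from_defaults(
            query_engine=retriever_query_engine,
            description="Useful for questions about a, b, or c",
        ),
    ],
)

Routing at the query engine level sidesteps the "normal tool" question: both options are wrapped as QueryEngineTools, and the selector picks between them based on their descriptions.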
Using a router, I would like the router to be able to understand whether a request is for one of the retriever_tools or just a standard ChatGPT query. How can I do that? Current code:
# imports assume the pre-0.10 llama_index API, matching the service_context usage below
from llama_index.tools import RetrieverTool
from llama_index.retrievers import RouterRetriever
from llama_index.selectors.llm_selectors import LLMSingleSelector

c_tool = RetrieverTool.from_defaults(
    retriever=retriever_c,
    description=("Useful to retrieve any information related to c"),
)

router_retriever = RouterRetriever(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    retriever_tools=[
        a_tool,
        b_tool,
        c_tool,
    ],
)
@Logan M sorry for the follow-up, but it's unclear to me from the docs how to pass a normal tool to a router. I only see retriever or query engine tools being passed. Am I missing something?
Thanks, just built a custom retriever (my other tools are retrievers) where the init param is a chat_engine and _retrieve just queries it:

def _retrieve(
    self,
    query_bundle,
):
    # send the raw query string to the chat engine and return its response in a list
    response = []
    response.append(self._chat_engine.chat(query_bundle.query_str))
    return response
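For completeness, a hedged sketch of how that chat-engine-backed retriever could be defined and plugged into the existing RouterRetriever as just another RetrieverTool; ChatEngineRetriever, chat_engine, chat_tool, and the tool description are assumed names rather than code from the thread:

from llama_index.retrievers import BaseRetriever, RouterRetriever
from llama_index.selectors.llm_selectors import LLMSingleSelector
from llama_index.tools import RetrieverTool


class ChatEngineRetriever(BaseRetriever):
    """Custom retriever that forwards the query straight to a chat engine."""

    def __init__(self, chat_engine):
        self._chat_engine = chat_engine
        super().__init__()

    def _retrieve(self, query_bundle):
        # return the chat response wrapped in a list, mirroring the hack above
        return [self._chat_engine.chat(query_bundle.query_str)]


# register it alongside the other retriever tools so the selector can route to it
chat_tool = RetrieverTool.from_defaults(
    retriever=ChatEngineRetriever(chat_engine),
    description="Useful for general questions not related to a, b, or c",
)

router_retriever = RouterRetriever(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    retriever_tools=[a_tool, b_tool, c_tool, chat_tool],
)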