Updated 2 months ago

is there any class that uses agents as tools?

5 comments
Please stand by @tomi as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, LlamaIndex lets you use agents as tools. Because agents inherit from BaseQueryEngine, you can wrap any agent in a QueryEngineTool and pass it to another agent. Here is an example of how you can do this:

Plain Text
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.core.agent import ReActAgent

query_engine_tools = [
    QueryEngineTool(
        query_engine=sql_agent,
        metadata=ToolMetadata(
            name="sql_agent", description="Agent that can execute SQL queries."
        ),
    ),
    QueryEngineTool(
        query_engine=gmail_agent,
        metadata=ToolMetadata(
            name="gmail_agent",
            description="Tool that can send emails on Gmail.",
        ),
    ),
]

# `llm`, `sql_agent`, and `gmail_agent` are assumed to be defined earlier
outer_agent = ReActAgent.from_tools(query_engine_tools, llm=llm, verbose=True)


In this example, sql_agent and gmail_agent are agents that are being used as tools. They are wrapped in the QueryEngineTool class and then used to create a new agent outer_agent.
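The delegation pattern described above can be sketched in plain Python without the LlamaIndex dependency. All class and method names here are illustrative stand-ins, not the library's actual API; the point is only to show how an outer agent routes a request to a wrapped inner agent:

```python
class ToyAgent:
    """Stand-in for an inner agent (e.g. sql_agent, gmail_agent)."""

    def __init__(self, name):
        self.name = name

    def query(self, text):
        # A real agent would run its own reasoning loop here.
        return f"{self.name} handled: {text}"


class ToyTool:
    """Stand-in for QueryEngineTool: wraps an agent plus metadata."""

    def __init__(self, agent, name, description):
        self.agent = agent
        self.name = name
        self.description = description


class ToyOuterAgent:
    """Stand-in for the outer agent: picks a tool whose name matches
    the request. A real ReAct agent would let the LLM choose instead."""

    def __init__(self, tools):
        self.tools = tools

    def chat(self, text):
        for tool in self.tools:
            if tool.name in text:
                return tool.agent.query(text)
        return "no matching tool"


tools = [
    ToyTool(ToyAgent("sql_agent"), "sql_agent", "Executes SQL queries."),
    ToyTool(ToyAgent("gmail_agent"), "gmail_agent", "Sends Gmail emails."),
]
outer = ToyOuterAgent(tools)
print(outer.chat("ask sql_agent for the top customers"))
```

In the real library, the tool-selection step is performed by the LLM driving the ReAct loop, using each tool's name and description, which is why descriptive ToolMetadata matters.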

You can find more details in the source documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai how do i find out what the exact model is that an LLM is using
Let me check the knowledge base for you. One moment! :books:
----------------------
You can find out the exact model that an LLM (Large Language Model) is using by accessing the .model attribute of the LLM object. Here's an example:

Plain Text
from llama_index.llms.nvidia import NVIDIA

llm = NVIDIA()
print(llm.model)
print(llm.model)


In this example, llm is an instance of the NVIDIA LLM. By printing llm.model, you can see the exact model that the LLM is using.

This is also applicable when you load a specific model:

Plain Text
llm = NVIDIA(model="mistralai/mistral-7b-instruct-v0.2")
print(llm.model)


In this case, llm.model will output "mistralai/mistral-7b-instruct-v0.2", the model you loaded.
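LlamaIndex LLM integrations typically follow the same constructor pattern: a default model name is stored unless you pass one in. A plain-Python sketch of that pattern (the class name and default model string here are illustrative, not the library's):

```python
class ToyLLM:
    """Illustrative stand-in for an LLM integration such as NVIDIA:
    the constructor records a default model unless one is passed in."""

    DEFAULT_MODEL = "example/default-model"

    def __init__(self, model=None):
        # Falls back to the class default when no model is given.
        self.model = model or self.DEFAULT_MODEL


print(ToyLLM().model)  # the class default
print(ToyLLM(model="mistralai/mistral-7b-instruct-v0.2").model)
```

Because `.model` is a plain attribute, inspecting it after construction always tells you which model the instance will actually call.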

You can find more details in the source documentation.
----------------------
----------------------