Updated 8 months ago

After doing OpenAIAgent.from_tools how to get a list of tools that have already been added, and add more without redefining the agent entirely?
32 comments
re-defining the agent is a no-op, so adding more tools that way should be fine?
what do you mean? I've got a whole chat history going. It seems terribly inefficient to call OpenAIAgent.from_tools constantly in a chat
Chat history is already in memory and can also be passed in? It really is a no-op

I suppose a method could be added for adding additional tools, but it really is not that much of a difference to re-init the agent

This is a pretty common pattern especially for stateless server apis, etc. All the state is in the chat history
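The stateless re-init pattern described above can be sketched in plain Python. Note this is a hypothetical `Agent` class for illustration only, not the llama-index API: the point is that all mutable state lives in the shared chat history, so the agent object itself is cheap to rebuild with a new tool list every turn.

```python
# Hypothetical sketch of the stateless re-init pattern (NOT the llama-index API):
# all conversational state lives in chat_history, so the agent object is cheap
# to throw away and rebuild with a different tool list on every turn.

class Agent:
    def __init__(self, tools, chat_history):
        self.tools = tools                # list of callables registered as tools
        self.chat_history = chat_history  # shared, mutable conversation state

    def chat(self, message):
        self.chat_history.append(("user", message))
        reply = f"echo: {message}"        # stand-in for the real LLM call
        self.chat_history.append(("assistant", reply))
        return reply

history = []
tools = [len]
agent = Agent(tools, history)
agent.chat("hello")

# "re-defining" the agent with one more tool: the history carries over untouched
tools = tools + [sum]
agent = Agent(tools, history)
agent.chat("again")
```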
The reason is I have a tool for creating new tools. I have done this in autogen and coded it myself from nothing, but would like to port it all to llama-index. This is one of the more important tools to me. Here's the autogen version: https://github.com/microsoft/autogen/blob/main/notebook/agentchat_inception_function.ipynb
I could get it to work if I could see an example of how to define a llama-index tool without using the function inspection method llama-index is using.
Plain Text
_function_config = {
    'type': 'function',
    'function': {
        "name": name,
        "description": description,
        "parameters": {"type": "object", "properties": json_args},
        "required": ["url"],
    },
}

llama_tool = FunctionTool.from_defaults(
    lambda **args: self.execute_func(name, packages, code, **args),
    name,
    description,
    _function_config,
)


that didn't work. AttributeError: 'dict' object has no attribute 'schema'
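That AttributeError suggests `fn_schema` is expected to be a pydantic model (something with a `.schema()` method), not a raw dict. One way to build such a model at runtime, assuming pydantic is installed, is `create_model` — a sketch of the idea, not the exact llama-index contract:

```python
# Sketch: fn_schema apparently needs a pydantic model (something with .schema()),
# not a plain dict. pydantic.create_model builds one dynamically from a
# {field_name: (type, default)} mapping; ... marks a field as required.
from pydantic import create_model

json_args = {"url": (str, ...)}  # illustrative: one required string field
FnSchema = create_model("FnSchema", **json_args)

# works on pydantic v1 (.schema) and v2 (.model_json_schema)
schema = (FnSchema.model_json_schema()
          if hasattr(FnSchema, "model_json_schema")
          else FnSchema.schema())
```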
from_defaults rly can't do it? It seemed like it might since it also accepts name, description, fn_schema, tool_metadata
FunctionTool looks relatively promising
depending on what ToolMetadata wants. checking
Since you are defining a lambda, from_defaults won't work (like you mentioned, the function introspection seems to be incompatible with lambdas)

I would just do

Plain Text
llama_tool = FunctionTool.from_defaults(
    lambda **args: self.execute_func(name, packages, code, **args),
    tool_metadata=ToolMetadata(description=description, name=name, fn_schema={...}),
)
hard part would be manually getting the schema
but its doable, more annoying than hard lol
what is the schema supposed to be though? the openai json spec? I'm already generating that
it looks like it. The default is like:
Plain Text
parameters = {
    "type": "object",
    "properties": {
        "input": {"title": "input query string", "type": "string"},
    },
    "required": ["input"],
}
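A default like the one above can be generated from a simple name-to-type mapping. A stdlib-only sketch of assembling the OpenAI-style `parameters` object (`build_parameters` and the argument names are illustrative, not part of any library):

```python
# Stdlib-only sketch: assemble an OpenAI-style "parameters" JSON schema
# from a simple {arg_name: json_type} mapping (all names here are illustrative).
def build_parameters(json_args, required):
    return {
        "type": "object",
        "properties": {k: {"title": k, "type": t} for k, t in json_args.items()},
        "required": list(required),
    }

parameters = build_parameters({"url": "string"}, required=["url"])
```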

though i've never seen "object" before
it's always "type": "function"
everything else looks the same
Plain Text
{
  'type': 'function',
  'function': {
      "name": name,
      "description": description,
      "parameters": {"type": "object", "properties": json_args},
      "required": ["url"],
}


oh it's obviously parameters lol. I've been awake too long
no way this is gonna work haha
[Attachment: image.png]
otherwise i might have to sleep before recreating it all
hahaha might be good to get some sleep first πŸ™
I think I got it
it's just having trouble passing an argument to the tool now but I think that's just the prompting
behold the complex mind-bending solution, long theorized by science
Plain Text
dynamic_function.__name__ = name
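In other words: wrap the generated code in a real (non-lambda) function and rename it, so inspection-based tool registration sees the intended name. A small sketch of that trick, where `make_tool_fn` and `impl` are illustrative names:

```python
# Illustrative sketch of the __name__ trick: give a dynamically created
# function the tool's real name so introspection-based registration finds it.
def make_tool_fn(name, impl):
    def dynamic_function(**kwargs):
        return impl(**kwargs)
    dynamic_function.__name__ = name  # the one-line "mind-bending solution"
    return dynamic_function

fetch = make_tool_fn("fetch_url", lambda **kw: kw)
```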
ayyy nice :dotsHARDSTYLE:
AGI solved. we can all go on our UBIs now
thx for help!