
Is there a way to use a REST API as a tool, in other words, to orchestrate different LLMs?
You can wrap API endpoints in a FunctionTool
@Logan M, can you direct me to a tutorial on this subject?
Doesn't really need a tutorial, just an example πŸ™‚

Plain Text
from llama_index.core.tools import FunctionTool


def add(a: int, b: int) -> int:
    """Useful for adding two numbers."""
    return a + b


tool = FunctionTool.from_defaults(fn=add)


It uses the function name as the tool name, and docstring as the tool description. (Or you can override in the constructor)
So you could wrap anything in a function like this
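For a REST API it would be the same idea. Here's a rough sketch (the endpoint, params, and function name are just placeholders, not any particular API):

Plain Text
import requests

from llama_index.core.tools import FunctionTool


def get_weather(city: str) -> str:
    """Useful for getting the current weather for a city."""
    # Placeholder endpoint -- swap in whatever REST API you actually want to call
    resp = requests.get("https://example.com/api/weather", params={"city": city})
    resp.raise_for_status()
    return resp.text


weather_tool = FunctionTool.from_defaults(fn=get_weather)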
Thank you @Logan M, I'm learning and your help is really precious.
Just another question, @Logan M: is it correct to call the function tool the same way I call the others? Thank you so much
Plain Text
query_engine_tools = [
    QueryEngineTool(
        query_engine=uncensoredRequest,
        metadata=ToolMetadata(
            name="uncensored_request",
            description=(
                "Provides uncensored information. "
                "Use a detailed plain text question as input to the tool."
            ),
        ),
    ),
]
Hmm, I'm not sure what you mean.

I think the interface is (sadly) slightly different: FunctionTool.from_defaults(fn=add, name="optional", description="optional")
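So with the overrides it would look something like this (using the same add function from above):

Plain Text
tool = FunctionTool.from_defaults(
    fn=add,
    name="add_numbers",
    description="Adds two integers and returns the sum.",
)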
I attempted it, but it didn't work as expected. Any suggestions?
Plain Text
async def process_with_llm(query):

    def uncensoredRequest(query) -> str:
        """Useful when asked for an uncensored opinion"""
        llm2 = Ollama(model="wizard-vicuna-uncensored:7b", request_timeout=1000.0)
        return llm2.complete(query)

    from llama_index.core.tools import FunctionTool

    tool = FunctionTool.from_defaults(fn=uncensoredRequest)

    agent = OpenAIAgent.from_tools(tool, llm=llm, verbose=True)

    response = agent.query(query)

    return response
What do you mean by "doesn't work"?

btw, OpenAIAgent is only for OpenAI (not sure if you are using OpenAI or not for the llm)
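If you end up using a non-OpenAI LLM for the agent itself, ReActAgent is the LLM-agnostic option. A minimal sketch, assuming tool and llm are already defined:

Plain Text
from llama_index.core.agent import ReActAgent

# Works with any LLM, not just OpenAI models
agent = ReActAgent.from_tools([tool], llm=llm, verbose=True)
response = agent.query(query)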
I'm using OpenAI as an agent, and my idea is to orchestrate the Ollama instance of the uncensored Vicuna as a tool to obtain different perspectives on the matter I'm discussing with GPT-4. Unfortunately, it threw me this error in the current setup.
Plain Text
discord.ext.commands.errors.CommandInvokeError: Command raised an exception: TypeError: object of type 'FunctionTool' has no len()
ERROR      - discord.client  : Attempting a reconnect in 1.81s
agent = OpenAIAgent.from_tools([tool], llm=llm,verbose=True)
needs to be a list
@Logan M , amazing! It works. Thank you so much for your patience.
@Logan M How do you think I can use both? I can't insert the query_engine into the list either.
Plain Text
 agent = OpenAIAgent.from_tools([tool], query_engine_tools, llm=llm,verbose=True)


Plain Text
ValueError: Cannot specify both tools and tool_retriever
You can combine them into a single list πŸ™‚

tools = query_engine_tools + [tool]

OpenAIAgent.from_tools(tools, ...)
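Putting it all together, it would look roughly like this (just a sketch, assuming query_engine_tools and uncensoredRequest are defined as above):

Plain Text
from llama_index.core.tools import FunctionTool

tool = FunctionTool.from_defaults(fn=uncensoredRequest)

# Combine the query engine tools and the function tool into one list
tools = query_engine_tools + [tool]

agent = OpenAIAgent.from_tools(tools, llm=llm, verbose=True)
response = agent.query(query)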
I feel so foolish; thank you again. @Logan M