Using Llama-index-agent-openai with Ollama Backend

At a glance

The community member is trying to use llama-index-agent-openai with the Ollama backend, but encounters a ValueError stating that the llm must be an OpenAI instance. A comment explains that OpenAIAgent only works with OpenAI and recommends FunctionCallingAgent instead, provided the Ollama LLM supports function calling.

I am trying to use llama-index-agent-openai with Ollama as the backend. The following code
Plain Text
from llama_index.agent.openai import OpenAIAgent

agent = OpenAIAgent.from_tools(
    tools=[my_tool], llm=llm, verbose=True, system_prompt=SCHEMA_PROMPT
)

raises the exception: ValueError("llm must be a OpenAI instance") when the llm is provided by:
Plain Text
from llama_index.llms.ollama import Ollama

llm = Ollama(
    model=settings.ollama_model,
    base_url=settings.ollama_base_url,
    request_timeout=120.0,
)

Is this a bug or is there a workaround? Thanks for any help!
1 comment
The openai agent indeed only works with openai πŸ‘€ Use FunctionCallingAgent (assuming your ollama LLM supports function calling)
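A minimal sketch of that suggestion, assuming a function-calling-capable model (e.g. llama3.1) is pulled in Ollama; the model name and base URL below are placeholders, and my_tool / SCHEMA_PROMPT are the names from the question above:
Plain Text
from llama_index.core.agent import FunctionCallingAgent
from llama_index.llms.ollama import Ollama

# Assumption: the Ollama model supports tool/function calling
# (e.g. llama3.1); chat-only models will not work here.
llm = Ollama(
    model="llama3.1",  # placeholder; use your settings.ollama_model
    base_url="http://localhost:11434",  # placeholder; default Ollama endpoint
    request_timeout=120.0,
)

# Unlike OpenAIAgent, FunctionCallingAgent accepts any LLM that
# supports function calling, not just OpenAI.
agent = FunctionCallingAgent.from_tools(
    tools=[my_tool],  # my_tool as defined in the question
    llm=llm,
    verbose=True,
    system_prompt=SCHEMA_PROMPT,  # SCHEMA_PROMPT as defined in the question
)

response = agent.chat("...")  # then query the agent as usual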