Using Llama-index-agent-openai with Ollama Backend

I am trying to use llama-index-agent-openai with Ollama as the backend. The following code
Plain Text
from llama_index.agent.openai import OpenAIAgent

agent = OpenAIAgent.from_tools(
    tools=[my_tool], llm=llm, verbose=True, SystemMessage=SCHEMA_PROMPT
)

raises the exception ValueError("llm must be a OpenAI instance") when the llm is created with:
Plain Text
from llama_index.llms.ollama import Ollama

llm = Ollama(
    model=settings.ollama_model,
    base_url=settings.ollama_base_url,
    request_timeout=120.0,
)

Is this a bug or is there a workaround? Thanks for any help!
1 comment
The OpenAI agent indeed only works with OpenAI 👀 Use FunctionCallingAgent instead (assuming your Ollama model supports function calling).
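For example, here is a minimal sketch of that workaround (assuming a recent llama-index that ships FunctionCallingAgent, and a tool-calling-capable Ollama model such as llama3.1; the multiply tool and SCHEMA_PROMPT are placeholders standing in for your own tool and prompt):
Plain Text
from llama_index.core.agent import FunctionCallingAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.ollama import Ollama

# Placeholder system prompt; substitute your own SCHEMA_PROMPT.
SCHEMA_PROMPT = "You are a helpful assistant."

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Wrap the plain function as a llama-index tool.
my_tool = FunctionTool.from_defaults(fn=multiply)

llm = Ollama(
    model="llama3.1",  # must be a model that supports tool calling
    base_url="http://localhost:11434",
    request_timeout=120.0,
)

agent = FunctionCallingAgent.from_tools(
    tools=[my_tool],
    llm=llm,
    verbose=True,
    system_prompt=SCHEMA_PROMPT,  # note: system_prompt, not SystemMessage
)

print(agent.chat("What is 6 times 7?"))

If your installed version predates FunctionCallingAgent, the equivalent spelling is FunctionCallingAgentWorker.from_tools(...).as_agent().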