I'm using `llama-index-agent-openai` with Ollama as the backend. This code:

```python
agent = OpenAIAgent.from_tools(
    tools=[my_tool],
    llm=llm,
    verbose=True,
    system_prompt=SCHEMA_PROMPT,
)
```

raises

```
ValueError("llm must be a OpenAI instance")
```

when the llm is provided by:

```python
llm = Ollama(
    model=settings.ollama_model,
    base_url=settings.ollama_base_url,
    request_timeout=120.0,
)
```
`OpenAIAgent` explicitly requires an OpenAI LLM instance, so it rejects Ollama. Use `FunctionCallingAgent` instead (assuming your Ollama model supports function calling).
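A minimal sketch of that fix, assuming a locally running Ollama server and a tool-capable model; the model name, base URL, tool, and system prompt below are illustrative placeholders, not values from the original post:

```python
from llama_index.core.agent import FunctionCallingAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.ollama import Ollama

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Assumption: a model that supports tool/function calling (e.g. llama3.1)
llm = Ollama(
    model="llama3.1",
    base_url="http://localhost:11434",
    request_timeout=120.0,
)

# Unlike OpenAIAgent, FunctionCallingAgent accepts any LLM that implements
# the function-calling interface, so an Ollama instance is allowed here.
agent = FunctionCallingAgent.from_tools(
    tools=[FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,
    system_prompt="You are a helpful assistant.",
)

response = agent.chat("What is 6 times 7?")
print(response)
```

`from_tools` takes a `system_prompt` argument, so the `SCHEMA_PROMPT` from the original snippet can be passed the same way.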