Agent

Hi, I wonder whether there is a guide for adding a custom Agent based on AgentRunner and AgentWorker? Currently only OpenAIAgent and ReActAgent are supported, but I want to add a custom agent for Ollama as well.
You can use any LLM (including Ollama) with the ReAct agent
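Roughly like this (a sketch, assuming llama_index 0.9.x import paths; the multiply tool and model name are just examples):

```python
# Sketch: ReActAgent driven by a local Ollama model (llama_index ~0.9.x paths assumed).
from llama_index.llms import Ollama
from llama_index.tools import FunctionTool
from llama_index.agent import ReActAgent


def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


llm = Ollama(model="mistral")  # any model you have pulled locally
tool = FunctionTool.from_defaults(fn=multiply)

agent = ReActAgent.from_tools([tool], llm=llm, verbose=True)
print(agent.chat("What is 21 * 2?"))
```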
But I don't think the ReActAgent output is production-ready
That is why I want to create something similar to OpenAIAgent that works with other LLMs
It's supposed to be lol, but most open-source LLMs are bad at acting as agents, following complex instructions, and producing structured outputs

If you have an idea, I recommend just implementing the base class your own way

https://github.com/run-llama/llama_index/blob/4269179c58ad73e2007be9d125bff8cafe968807/llama_index/agent/types.py#L173
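A rough sketch of the shape that could take (not a complete implementation: the exact abstract methods and constructor requirements are in the linked types.py, so check that file; the OllamaAgent name here is just illustrative):

```python
# Shape sketch of a custom agent on top of the base class in agent/types.py.
# Check the linked file for the exact abstract methods and constructor
# (e.g. whether super().__init__ needs a callback_manager).
from typing import List, Optional

from llama_index.agent.types import BaseAgent
from llama_index.chat_engine.types import AgentChatResponse
from llama_index.llms import ChatMessage, MessageRole, Ollama


class OllamaAgent(BaseAgent):
    """Minimal agent that just forwards the conversation to a local Ollama model."""

    def __init__(self, llm: Ollama, chat_history: Optional[List[ChatMessage]] = None) -> None:
        self._llm = llm
        self._chat_history = chat_history or []

    @property
    def chat_history(self) -> List[ChatMessage]:
        return self._chat_history

    def reset(self) -> None:
        self._chat_history = []

    def chat(
        self, message: str, chat_history: Optional[List[ChatMessage]] = None
    ) -> AgentChatResponse:
        history = chat_history if chat_history is not None else self._chat_history
        history.append(ChatMessage(role=MessageRole.USER, content=message))
        response = self._llm.chat(history)
        history.append(response.message)
        return AgentChatResponse(response=response.message.content or "")

    # The async and streaming variants (achat, stream_chat, astream_chat)
    # are omitted here; the base class requires them as well.
```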
Thanks, is there any documentation on this? How about the AgentRunner? What I want is for the open-source LLM to retry the previous step with human feedback until it gets the correct input and function name.
Not really any docs on it yet, it's quite new. I would read the source code I linked above and look at the ReAct and OpenAI implementations as a reference
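For the human-feedback retry idea, the kind of loop that could live inside a custom agent's chat/step logic might look like this (purely illustrative, none of these helper names are llama_index APIs):

```python
# Illustrative human-feedback retry loop for tool selection -- this is the
# kind of logic you could put inside your custom agent's chat/step method.
import json

from llama_index.llms import ChatMessage, MessageRole


def run_with_human_feedback(llm, messages, max_retries: int = 3):
    """Ask the model for a JSON tool call and retry with human feedback until accepted."""
    for _ in range(max_retries):
        raw = llm.chat(messages).message.content or ""
        try:
            call = json.loads(raw)  # expect e.g. {"tool": "...", "input": {...}}
        except json.JSONDecodeError:
            feedback = "That was not valid JSON. Reply with only a JSON tool call."
        else:
            print(f"Proposed call: {call}")
            feedback = input("Press enter to accept, or type feedback to retry: ").strip()
            if not feedback:
                return call  # human accepted the tool name and input
        # Feed the correction back to the model and try the step again.
        messages.append(ChatMessage(role=MessageRole.ASSISTANT, content=raw))
        messages.append(ChatMessage(role=MessageRole.USER, content=feedback))
    raise RuntimeError("No accepted tool call after retries")
```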
I'm doing that atm. Would you also be interested if I create a PR for an OllamaAgent?
Would definitely be interested -- happy to review any PR you make! πŸ’ͺ
@Logan M I started the first step by refactoring the Ollama LLM to use the chat API: https://github.com/run-llama/llama_index/pull/9685/files
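For anyone following along, once the Ollama LLM speaks the chat endpoint, multi-turn usage looks like any other LLM (model name is just an example):

```python
# Multi-turn chat with the Ollama LLM once it uses the chat endpoint.
from llama_index.llms import ChatMessage, MessageRole, Ollama

llm = Ollama(model="mistral")
messages = [
    ChatMessage(role=MessageRole.SYSTEM, content="You are a terse assistant."),
    ChatMessage(role=MessageRole.USER, content="Name one prime number above 10."),
]
print(llm.chat(messages).message.content)
```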
https://github.com/run-llama/llama_index/pull/9689/files
@Logan M I have a follow-up PR to refactor to httpx
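For reference, the underlying call against Ollama's /api/chat endpoint with httpx looks roughly like this (host, model, and timeout are just examples):

```python
# Direct call to Ollama's /api/chat endpoint with httpx.
import httpx

payload = {
    "model": "mistral",
    "messages": [{"role": "user", "content": "Say hello in one word."}],
    "stream": False,
}
with httpx.Client(base_url="http://localhost:11434", timeout=60.0) as client:
    resp = client.post("/api/chat", json=payload)
    resp.raise_for_status()
    print(resp.json()["message"]["content"])
```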
https://github.com/run-llama/llama_index/pull/9833/files
@Logan M Another PR to allow kwargs for the Pydantic program. This will help with passing params to llm.chat/achat/complete/acomplete. This is required if we want to use format="json" for Ollama.
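Illustratively, with extra kwargs forwarded into the request, Ollama's JSON mode could be requested like this (the format kwarg pass-through is what the PR is meant to enable, not an existing guaranteed signature):

```python
# Assumed usage once extra kwargs are forwarded into the request body:
# Ollama's /api/chat accepts a top-level "format": "json" field.
from llama_index.llms import ChatMessage, MessageRole, Ollama

llm = Ollama(model="mistral")
messages = [
    ChatMessage(role=MessageRole.USER, content="Return a JSON object with a 'city' key."),
]
# 'format' here is an assumed pass-through kwarg, not a guaranteed signature.
print(llm.chat(messages, format="json").message.content)
```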