Building a Generic Agent with Various Language Models

At a glance

The community member is working on a project that uses OpenAIAgent with language models from OpenAI and Azure OpenAI. They would like to use agents outside of the OpenAI ecosystem, such as Claude and Bedrock deployments, but couldn't find a "generic" way to build an agent with other language models. The community members in the comments explain that to use a language model other than OpenAI's, the new language model reference can be passed directly or defined globally. They also recommend using the FunctionCallingAgent when the language model has a function calling API, and the ReActAgent otherwise.

Hi guys! In my project I've been using OpenAIAgent with LLMs from OpenAI and, in some cases, Azure OpenAI. Now I would like to be able to use agents outside OpenAI's ecosystem, like Claude, Bedrock deployments and so on. I couldn't find a "generic" way of building an agent with other LLMs. I understand the possibility is also constrained in some cases by the availability of function calling, but maybe I didn't search enough. Has anyone faced a similar challenge?
2 comments
Hey, for using an LLM other than OpenAI's, you just need to pass the new LLM reference or define it globally:
Plain Text
from llama_index.core import Settings
Settings.llm = llm  # your Claude LLM reference
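
For context, here is a minimal sketch (not from the thread) of how that Claude reference could be created with the llama-index-llms-anthropic integration; the model name is only an example, and an analogous llama-index-llms-bedrock integration exists for Bedrock deployments:
Plain Text
# Sketch: build a Claude LLM and make it the global default for llama_index.
from llama_index.core import Settings
from llama_index.llms.anthropic import Anthropic  # pip install llama-index-llms-anthropic

# Example model name -- swap in whichever Claude model you have access to.
# The API key is read from the ANTHROPIC_API_KEY environment variable.
llm = Anthropic(model="claude-3-5-sonnet-20240620")
Settings.llm = llm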


Also, for creating agents, you can use the ReActAgent: https://docs.llamaindex.ai/en/stable/examples/agent/react_agent/

There are lots of other examples there as well; just check the navigation on the left side of that page.
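
A rough sketch along the lines of the linked ReAct example, assuming `llm` is the Claude reference from above and using a toy tool for illustration:
Plain Text
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)

# `llm` is assumed to be the non-OpenAI LLM set up earlier.
# ReAct drives tool use through prompting, so it works with any LLM,
# whether or not the model exposes a native function-calling API.
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)
print(agent.chat("What is 21 multiplied by 2?"))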
Assuming the LLM has a function calling API, you can use the FunctionCallingAgent. Otherwise, use the ReActAgent.
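
A sketch of the FunctionCallingAgent variant, reusing the same toy tool and assuming `llm` is a model whose API supports native function/tool calling (recent Claude models do via the Anthropic integration):
Plain Text
from llama_index.core.agent import FunctionCallingAgent
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

# `llm` is assumed to be a function-calling-capable model, e.g. the Claude
# reference above. Unlike ReAct, this agent relies on the LLM's native
# function-calling API, so it only works with models that support tool calls.
agent = FunctionCallingAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)], llm=llm, verbose=True
)
print(agent.chat("What is 21 multiplied by 2?"))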