OpenAIAgent
I have been building agents with LLMs from OpenAI and, in some cases, AzureOpenAI. Now I would like to be able to use agents outside OpenAI's ecosystem, e.g. Claude, Bedrock deployments, and so on. I couldn't find a "generic" way of building an agent with other LLMs. I understand the possibility is also constrained in some cases by the model's function-calling capability, but maybe I didn't search enough. Has anyone faced a similar challenge?

Answer: you can point LlamaIndex at any supported LLM through the global `Settings` object:

```python
from llama_index.core import Settings

Settings.llm = llm  # your Claude llm reference
```