----------------------
Yes, based on the information provided in the knowledge sources, the OpenAI Agent from LlamaIndex is specifically designed to work with OpenAI models. For instance, in the OpenAI Agent + Query Engine Experimental Cookbook, the OpenAI Agent is initialized with an OpenAI model. Similarly, the OpenAI function calling for Sub-Question Query Engine documentation notes that the OpenAIQuestionGenerator only works with the latest OpenAI models that support the function calling API.
Here is an example of how the OpenAI Agent is used with an OpenAI model:
```python
from llama_index.agent.openai import OpenAIAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

# define a tool for the agent (illustrative example; the original
# snippet assumed `tools` was defined elsewhere)
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

tools = [FunctionTool.from_defaults(fn=multiply)]

# initialize llm
llm = OpenAI(model="gpt-3.5-turbo-0613")

# initialize openai agent
agent = OpenAIAgent.from_tools(tools, llm=llm, verbose=True)
```
This does not mean that LlamaIndex cannot be used with other models; rather, the OpenAI Agent specifically is designed around the OpenAI function calling API. If you want to use a different model, you will need to use a different agent type that does not depend on that API.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------