
@kapa.ai does it mean OpenAI Agent can't be used with open source models like llama3 as it is specific to openai models ?
Please stand by @praveenks30#120294 as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, based on the knowledge sources, the OpenAI Agent in LlamaIndex is designed specifically to work with OpenAI models. For instance, in the OpenAI Agent + Query Engine Experimental Cookbook, the OpenAI Agent is initialized with an OpenAI LLM. Similarly, the OpenAI function calling for Sub-Question Query Engine notebook notes that the OpenAIQuestionGenerator only works with the latest OpenAI models that support the function calling API.

Here is an example of how the OpenAI Agent is used with an OpenAI model:

from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI

# initialize an OpenAI LLM that supports function calling
llm = OpenAI(model="gpt-3.5-turbo-0613")

# initialize the OpenAI agent with a list of tools defined elsewhere
agent = OpenAIAgent.from_tools(tools, llm=llm, verbose=True)


This does not mean that LlamaIndex cannot be used with other models; rather, the OpenAI Agent specifically is built around OpenAI's function calling API. To use a different model, you would need to use a different agent or modify the OpenAI Agent to support it.
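For example, to run an open-source model such as llama3 locally via Ollama, a ReAct-style agent can be used in place of the OpenAI Agent. The following is a minimal sketch rather than an excerpt from the docs; it assumes the llama-index-llms-ollama integration is installed, a local Ollama server is running, and `tools` is the same list of tools used above:

from llama_index.core.agent import ReActAgent
from llama_index.llms.ollama import Ollama

# initialize a local llama3 model served by Ollama
# (assumes the model has been pulled and the Ollama server is running)
llm = Ollama(model="llama3", request_timeout=120.0)

# ReActAgent drives tool use through prompting rather than
# the OpenAI function calling API, so any LLM can be plugged in
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)

response = agent.chat("What is 2 + 2?")
print(response)

Because the ReAct agent relies on prompting rather than OpenAI function calling, it is not tied to any particular model provider.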
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------