
Updated last year

When I tried to use the AzureOpenAI LLM

At a glance

The community member is having an issue using the AzureOpenAI language model (LLM) as input for the OpenAI agent: it raises an error stating that the LLM must be an OpenAI instance. Another community member suggests using the Azure LLM from either LangChain or llama-index and provides an example of using the AzureOpenAI LLM with the OpenAI agent. However, there is no explicitly marked answer to the original question.

When I tried to use the AzureOpenAI LLM as input for the OpenAI agent, it says the LLM must be an OpenAI instance. Does it not take an Azure instance as the LLM?
2 comments
Did you use the Azure LLM from LangChain or llama-index?
Not sure if you resolved this @Milkman, but I'm using the OpenAI agent with llama-index's AzureOpenAI LLM:
Plain Text
# Import paths assume a recent llama-index release; older
# versions expose these under llama_index.llms / llama_index.agent.
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.agent.openai import OpenAIAgent

self.llm = AzureOpenAI(
    model=self.deployment_model,
    engine=self.deployment_name,  # the Azure deployment name
    temperature=self.temperature,
    max_tokens=self.num_output,
)
............
agent = OpenAIAgent.from_tools(
    tools=tools,
    llm=self.llm,
    verbose=True,
)