When I tried to use the AzureOpenAI llm

When I tried to use the AzureOpenAI llm as the input for the OpenAIAgent, it says the LLM must be an OpenAI instance. Does it not take an Azure instance as the llm?
2 comments
Did you use the Azure llm from langchain or llama-index?
Not sure if you resolved this @Milkman, but I'm using the OpenAIAgent with llama-index's AzureOpenAI llm:
# Import paths assume the llama-index package layout current at the time of this thread.
from llama_index.agent import OpenAIAgent
from llama_index.llms import AzureOpenAI

# Configure the Azure deployment as the llm
self.llm = AzureOpenAI(
    model=self.deployment_model,
    engine=self.deployment_name,
    temperature=self.temperature,
    max_tokens=self.num_output,
)
# ...
# Pass the AzureOpenAI instance to the agent as its llm
agent = OpenAIAgent.from_tools(
    tools=tools,
    llm=self.llm,
    verbose=True,
)
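For context, here is a minimal self-contained sketch of that setup (the import paths, toy tool, and Azure credential values are illustrative assumptions, not taken from this thread). As far as I know, llama-index's AzureOpenAI subclasses its OpenAI llm, which is why OpenAIAgent's "llm must be an OpenAI instance" check accepts it, while langchain's AzureOpenAI would not:

from llama_index.agent import OpenAIAgent
from llama_index.llms import AzureOpenAI
from llama_index.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Toy tool so the example is runnable."""
    return a * b

tools = [FunctionTool.from_defaults(fn=multiply)]

# Placeholder Azure deployment details -- replace with your own.
llm = AzureOpenAI(
    model="gpt-35-turbo",
    engine="my-deployment-name",
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-azure-openai-key>",
    api_version="2023-07-01-preview",
)

agent = OpenAIAgent.from_tools(tools=tools, llm=llm, verbose=True)
print(agent.chat("What is 6 times 7?"))

If you still see the "LLM must be an OpenAI instance" error, double-check that AzureOpenAI is imported from llama-index rather than langchain.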