
#❓issues-and-help can we use any other model instead of the OpenAI API?

At a glance

The community members discuss the possibility of using models other than the OpenAI API. One community member confirms that LlamaIndex supports other LLM (Large Language Model) modules and provides a link to the documentation. Another community member asks what specific changes are required to use a different model and is directed to an example that demonstrates how to use the Azure OpenAI model. The discussion continues with the community members working through how to replace the OpenAI model, and one member offers to help with the code changes.

Can we use any other model instead of the OpenAI API?
7 comments
Yes, you can!
These are the LLM modules that LlamaIndex currently supports: https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#modules
OK, what changes do I have to make in the code to use a different model? It's not mentioned in the link above.
Just create the LLM object and pass it to the service_context.

This example will help you understand how to use a different LLM: https://docs.llamaindex.ai/en/stable/examples/customization/llms/AzureOpenAI.html
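For illustration, here is a minimal sketch of that pattern using the legacy ServiceContext API; Ollama is used here only as a stand-in for "any other LLM", and the model name and data directory are placeholders:

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import Ollama

# Create a non-OpenAI LLM object (a locally running Ollama model as an example).
llm = Ollama(model="llama2")

# Pass the LLM to the service context. embed_model="local" makes LlamaIndex use a
# local HuggingFace embedding model, so no OpenAI key is needed for embeddings either.
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

# Build an index and query it with the custom LLM ("./data" is a placeholder path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
print(index.as_query_engine().query("What do these documents cover?"))
```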
I am not quite able to understand this. Any YouTube video ideas so I can get help with this?
Have you written the code for the LLM change? Maybe I can help you there.
The only change needed to replace OpenAI as the LLM is to pass the new LLM and embed model in the service_context.

In the link shared above, we are using the Azure LLM, which is different from OpenAI.
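As a rough sketch of that Azure setup (the deployment names, endpoint, key, and API version below are placeholders for your own Azure OpenAI resources):

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding

# Chat/completion model served from your Azure OpenAI deployment.
llm = AzureOpenAI(
    model="gpt-35-turbo",
    deployment_name="my-gpt35-deployment",
    api_key="<azure-api-key>",
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Embedding model also served from Azure, so nothing falls back to api.openai.com.
embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",
    api_key="<azure-api-key>",
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Both the LLM and the embedding model go into the service context.
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
print(index.as_query_engine().query("Summarize the documents."))
```

Any of the other LLM modules listed in the documentation link above can be swapped into the llm slot the same way.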