
Updated last year

#❓py-issues-and-help Can we use any other model instead of the OpenAI API?
7 comments
Yes, you can!
These are the LLM modules that LlamaIndex currently supports: https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#modules
OK, what changes do I have to make in the code to use a different model?
It's not mentioned in the link above.
Just create the LLM object and pass it to the service_context.
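A minimal sketch of that step, assuming the legacy LlamaIndex `ServiceContext` API (pre-0.10) and Ollama as an example non-OpenAI LLM; the model name and data directory are placeholders:

```python
from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms import Ollama  # any supported non-OpenAI LLM works here

# Create the LLM object for the model you want to use
# ("llama2" is a placeholder model name served by a local Ollama instance)
llm = Ollama(model="llama2")

# Pass the LLM into the service context so the index uses it instead of OpenAI
service_context = ServiceContext.from_defaults(llm=llm)

# Build the index with that service context
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
```

The key idea is that everything downstream (index, query engine) picks up whichever LLM the service context carries.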

This example will help you understand more about using a different LLM: https://docs.llamaindex.ai/en/stable/examples/customization/llms/AzureOpenAI.html
I'm not quite able to understand this. Any YouTube video ideas so I can get help with this?
Have you written the code for the LLM change?
Maybe I can help you there.
The only change needed to replace OpenAI as the LLM is to pass the new LLM model and embedding model into the service_context.

In the link shared above, we are using the Azure LLM, which is different from OpenAI.
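For the Azure case, a hedged sketch of passing both the LLM and the embedding model (again assuming the legacy `ServiceContext` API; deployment names, endpoint, and API version are placeholders you'd replace with your Azure resource's values):

```python
from llama_index import ServiceContext
from llama_index.llms import AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding

# Hypothetical Azure deployment names and endpoint -- substitute your own
llm = AzureOpenAI(
    engine="my-gpt-deployment",        # your chat model deployment name
    api_key="...",                     # your Azure OpenAI key
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

embed_model = AzureOpenAIEmbedding(
    deployment_name="my-embedding-deployment",  # your embedding deployment name
    api_key="...",
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Both models go into the service context, replacing the OpenAI defaults
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
```

Passing the embedding model matters too: if you only swap the LLM, the index will still try to call OpenAI for embeddings.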