#❓issues-and-help can we use any other
CHAITANYA DESHPANDE
last year
#❓py-issues-and-help Can we use any other model instead of the OpenAI API?
7 comments
WhiteFang_Jr
last year
Yes, you can!
These are the LLM modules that LlamaIndex currently supports:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#modules
CHAITANYA DESHPANDE
last year
Thanks mate
CHAITANYA DESHPANDE
last year
OK, what changes do I have to make in the code to use a different model?
It's not mentioned in the link above.
WhiteFang_Jr
last year
Just create the LLM object and pass it to the service_context.
This example will help you understand more about using a different LLM:
https://docs.llamaindex.ai/en/stable/examples/customization/llms/AzureOpenAI.html
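The advice above can be sketched in code. This is a minimal example assuming the legacy LlamaIndex `ServiceContext` API (current when this thread was written) and the `llama-index` package installed; the Hugging Face model name and the `"data"` directory are illustrative assumptions, not from the thread.

```python
# Sketch: replacing the default OpenAI LLM in LlamaIndex (legacy ServiceContext API).
# The model name below is an illustrative assumption.
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import HuggingFaceLLM

# Create the LLM object for a non-OpenAI model.
llm = HuggingFaceLLM(model_name="StabilityAI/stablelm-tuned-alpha-3b")

# Pass it to the service_context; "local" swaps the embedding model too,
# so nothing falls back to the OpenAI defaults.
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local",
)

# Use the service_context when building the index.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
```

Any index or query engine built from this `service_context` will then route completions to the configured model instead of OpenAI.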
CHAITANYA DESHPANDE
last year
I'm not quite able to understand this. Any YouTube video suggestions so I can get help with this?
WhiteFang_Jr
last year
Have you written the code for the LLM change?
Maybe I can help you there.
WhiteFang_Jr
last year
The only change needed to replace OpenAI as the LLM is to pass the new LLM model and embed model in the service_context.
In the link shared above, we're using the Azure LLM, which is different from OpenAI.
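For the Azure case the linked example mentions, the wiring looks like this. A sketch only: the deployment names, endpoint, and API version are placeholders you would replace with your own Azure resource details, again assuming the legacy `ServiceContext` API.

```python
# Sketch: Azure OpenAI as the LLM + embed model (legacy ServiceContext API).
# All credentials and deployment names below are placeholders.
from llama_index import ServiceContext
from llama_index.embeddings import AzureOpenAIEmbedding
from llama_index.llms import AzureOpenAI

llm = AzureOpenAI(
    engine="my-gpt35-deployment",       # your Azure deployment name (assumed)
    model="gpt-35-turbo",
    api_key="<your-api-key>",
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",  # assumed
    api_key="<your-api-key>",
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Both the LLM and the embed model go into the service_context,
# replacing the OpenAI defaults end to end.
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
```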