Updated last year

OpenAI

When using LlamaIndex, do we have control over which OpenAI model is used? I ask because I'm clueless about which model my LlamaIndex setup is using. Is this managed on the OpenAI side or on my application side? Thank you!
Yes, by default LlamaIndex uses gpt-3.5-turbo from OpenAI.

You can change it as needed when defining the LLM:

Plain Text
from llama_index.llms import OpenAI

llm = OpenAI(model="DEFINE_YOUR_MODEL", temperature=0.1)
@WhiteFang_Jr Thanks for letting me know! Sorry I missed that in the docs.