how to set query engine to the latest LLM version

Hi guys,
I am trying to configure the LlamaIndex query engine to use the latest OpenAI model, but the documentation doesn't point this out. Has anyone built a RAG pipeline where the query engine uses the latest OpenAI model?
Kapa AI points out a discrepancy between the model names and the LlamaIndex documentation itself. The snippet below sets the default LLM globally:

Plain Text
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="gpt-4o-mini")
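
If you'd rather not change the global default, recent llama_index.core versions also accept an llm argument when you build the query engine, so the model applies to that engine only. A minimal sketch, assuming a local ./data folder and a placeholder question:

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

# Load documents from a local folder (the path is a placeholder)
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Pass the LLM to this query engine only; the global default is untouched
query_engine = index.as_query_engine(llm=OpenAI(model="gpt-4o"))
print(query_engine.query("What does the document say about pricing?"))

Either approach works; model names like "gpt-4o" and "gpt-4o-mini" are just examples, so check OpenAI's model list for whatever is current.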