
Updated 5 months ago

how to set query engine to the latest LLM version

At a glance

The community member is trying to configure the LlamaIndex query engine to use the latest OpenAI model, but the documentation does not provide clear guidance. The community members discuss how to point the query engine at the latest language model version, and one community member provides a code snippet that sets the default OpenAI model to "gpt-4o-mini". However, there is no explicitly marked answer in the comments.

Hi Guys,
I am trying to configure the LlamaIndex query engine to use the latest OpenAI model. The documentation doesn't point this out. Has anyone built a RAG pipeline where the query engine uses the latest OpenAI version?
4 comments
Kapa AI points out a discrepancy between the model and the LlamaIndex documentation itself.
This sets the default LLM globally:

Plain Text
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="gpt-4o-mini")
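The LLM can also be passed to the query engine directly instead of relying on the global default. A minimal sketch, assuming a recent llama-index version where as_query_engine accepts an llm argument; the "data" folder path and the query string are illustrative, not from the thread:

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

# Load documents from a local folder (path is illustrative)
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Pass the LLM directly to the query engine instead of using the global Settings default
query_engine = index.as_query_engine(llm=OpenAI(model="gpt-4o-mini"))
response = query_engine.query("What does this document cover?")
print(response)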