
Updated 2 years ago


This is probably a simple question, but I cannot find a good source in the documentation on how to switch from davinci to gpt-3.5 or gpt-4 as a query engine.
6 comments
Both gpt-3.5 and gpt-4 use the ChatOpenAI class.

Search for gpt-3.5 in the docs; there are quite a few examples.

The most useful example is probably the one on setting a global service context (that way, you don't have to pass service_context everywhere).
That is very useful - thanks!
Examining whether LlamaIndex can better support data-query use cases than pure LangChain / Chroma.
We have quite a few advanced modules: sub-question query engine, router query engine, graphs, SQL + vector query engine, etc.

Happy to answer any questions that come up!
Use this:
Plain Text
from llama_index import ServiceContext, LLMPredictor
from langchain.chat_models import ChatOpenAI

# Use a chat model (gpt-3.5-turbo) in place of the default davinci completion model
llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0, max_tokens=1024, model_name="gpt-3.5-turbo"))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)


Pass this predictor into your service context and you are all set with gpt-3.5-turbo.