
Updated last year

At a glance

The post asks whether it is possible to query OpenAI directly with LlamaIndex, without using RAG. The comments confirm that it is: a community member shows how to use the OpenAI class from the llama_index.llms module to complete a prompt, and another community member confirms that this is the correct approach.

Is it possible to query OpenAI with LlamaIndex directly, without using RAG?
4 comments
Yeah, sure, just see our LLM abstraction. We have one for OpenAI, and you can call the complete method on it.
Do you have an example?
Thanks, I found it

from llama_index.llms import OpenAI

resp = OpenAI().complete("Paul Graham is ")
print(resp)
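The same LLM abstraction also exposes a chat-style interface alongside complete. A minimal sketch, assuming the legacy llama_index.llms import path used above (newer releases moved the class to llama_index.llms.openai) and that an OPENAI_API_KEY is set in the environment; the model name is an illustrative choice, not from the thread:

```python
from llama_index.llms import ChatMessage, OpenAI

# Chat-style call: pass a list of ChatMessage objects instead of a raw prompt.
llm = OpenAI(model="gpt-3.5-turbo")
messages = [
    ChatMessage(role="system", content="You are a terse assistant."),
    ChatMessage(role="user", content="Who is Paul Graham?"),
]
resp = llm.chat(messages)

# The chat endpoint returns a ChatResponse; the text lives on resp.message.
print(resp.message.content)
```

This still queries OpenAI directly, with no index or retrieval involved, which is the point of the original question.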
perfect! yup