The post asks whether it is possible to query OpenAI directly with LlamaIndex, without using RAG. The comments confirm that this is possible: a community member shares an example using the `OpenAI` class from the `llama_index.llms` module to complete a prompt, and another community member confirms that this is the correct approach.