Hi everyone.
I'm trying to use LlamaIndex as a middleman between my LLM and the user.
Since I don't have any documents, I thought that using an EmptyIndex would be the appropriate solution, but when I query it I get an empty result: the retrieved nodes are an empty list, and by default if nodes == [] the return is "Empty Response".
Does anyone have an idea of how it should be done?
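For reference, a minimal sketch of the setup described above (assuming the 0.9-era top-level EmptyIndex import; exact defaults may vary between llama_index versions):

Python
from llama_index import EmptyIndex

# An index with no documents: its retriever returns an empty node list
index = EmptyIndex()
query_engine = index.as_query_engine()

# With no retrieved nodes, the default behaviour is to return "Empty Response"
print(query_engine.query("Hello"))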
1 comment
You can just use an LLM directly:

Python
from llama_index.llms import OpenAI  # 0.9-style import; in llama_index 0.10+ use llama_index.llms.openai

# Call the LLM directly, with no index or retrieval step in between
llm = OpenAI()
resp = llm.complete("Hello")
print(str(resp))
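If the goal is to sit between the user and the LLM in a conversation, the same LLM object also exposes a chat interface. A minimal sketch, assuming the same 0.9-era import style as above (in 0.10+ the imports move to llama_index.core.llms and llama_index.llms.openai):

Python
from llama_index.llms import ChatMessage, OpenAI

llm = OpenAI()

# Keep the running conversation as a list of messages and send it on each turn
messages = [
    ChatMessage(role="system", content="You are a helpful assistant."),
    ChatMessage(role="user", content="Hello"),
]
resp = llm.chat(messages)
print(str(resp))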