Updated 3 months ago

Hello, I want to query the LLM directly

Hello, I want to query the LLM directly. Any advice on how I can configure the system prompt?

Plain Text
from llama_index.llms.openai import OpenAI

response = OpenAI().complete("Paul Graham is ")
print(response)
2 comments
Use .chat() and send the system prompt:
Plain Text
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI

llm = OpenAI()

# The system message in the list sets the system prompt for this call
llm.chat([
    ChatMessage(role="system", content="..."),
    ChatMessage(role="user", content="Hello!"),
])
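For reference, a minimal end-to-end sketch of the same idea. The model name and system prompt text are placeholder example values; the part I'd rely on is that .chat() returns a ChatResponse whose message.content holds the assistant's reply text (true in current llama_index versions).
Plain Text
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI

# Example values only: pick whatever model and system prompt you need
llm = OpenAI(model="gpt-4o-mini")

response = llm.chat([
    ChatMessage(role="system", content="You are a pirate with a colorful personality."),
    ChatMessage(role="user", content="Hello!"),
])

# response is a ChatResponse; .message is the assistant's ChatMessage
print(response.message.content)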