Generate info about prompt

@kapa.ai Generate info about prompt
Please stand by @Matthews as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Prompts are a fundamental input that gives Large Language Models (LLMs) their expressive power. In LlamaIndex, prompts are used to build the index, perform insertions, aid traversal during querying, and synthesize the final answer.

LlamaIndex comes with a set of built-in, battle-tested prompts that do the tricky work of getting a specific LLM to correctly process and format data. This is one of the biggest benefits of using LlamaIndex.

You can also define your own custom prompts. This is as simple as creating a format string. Here's an example:

Python
from llama_index.prompts import PromptTemplate

template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
qa_template = PromptTemplate(template)

# you can create a text prompt (for the completion API)
prompt = qa_template.format(context_str=..., query_str=...)

# or easily convert to message prompts (for the chat API)
messages = qa_template.format_messages(context_str=..., query_str=...)

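To actually use such a template, one common pattern is to pass it to a query engine when it is created. Here is a minimal sketch, assuming you already have an index built (the index variable and the query string are illustrative):

Python
# assumes an existing index, e.g. index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine(text_qa_template=qa_template)
response = query_engine.query("What did the author do growing up?")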

LlamaIndex also provides advanced prompt capabilities such as partial formatting and template variable mappings; both are sketched below.
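
A rough sketch of those two features, reusing the qa_template from above (the template strings and variable names here are illustrative, not taken from the docs):

Python
# partial formatting: pre-fill one variable now, supply the rest later
partial_template = qa_template.partial_format(context_str="<some fixed context>")
prompt = partial_template.format(query_str="What is LlamaIndex?")

# template variable mappings: keep your own placeholder names while mapping
# them onto the {context_str}/{query_str} names LlamaIndex expects
custom_template = PromptTemplate(
    "Context: {my_context}\nQuestion: {my_query}\n",
    template_var_mappings={"context_str": "my_context", "query_str": "my_query"},
)
prompt = custom_template.format(context_str="some context", query_str="some question")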

If you want to access or customize prompts within higher-level modules, you can use get_prompts to retrieve the set of prompts for any module (including nested ones) and update_prompts to swap in your own.
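
For example, a sketch of that round trip on a query engine (the prompt key shown is a typical one, but the exact keys depend on the module, so it is worth printing the dict first):

Python
# inspect the prompts used by a module (keys vary by module)
prompts_dict = query_engine.get_prompts()
print(list(prompts_dict.keys()))  # e.g. ['response_synthesizer:text_qa_template', ...]

# swap in the custom template defined earlier
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": qa_template}
)
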
For more detailed information, refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai Defining a custom prompt
Let me check the knowledge base for you. One moment! :books:
----------------------
Defining a custom prompt in LlamaIndex is as simple as creating a format string with the PromptTemplate class, exactly as in the example shown above.

In that example, {context_str} and {query_str} are placeholders that are replaced with the actual context and query when the prompt is used.
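
For instance, a call with literal values might look like this (the strings are made up for illustration):

Python
prompt = qa_template.format(
    context_str="Paris has been the capital of France since 508 AD.",
    query_str="What is the capital of France?",
)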

You can also define a template from chat messages:

Python
from llama_index.llms import ChatMessage, MessageRole
from llama_index.prompts import ChatPromptTemplate

message_templates = [
    ChatMessage(content="You are an expert system.", role=MessageRole.SYSTEM),
    ChatMessage(
        content="Generate a short story about {topic}",
        role=MessageRole.USER,
    ),
]
chat_template = ChatPromptTemplate(message_templates=message_templates)

# you can create message prompts (for the chat API)
messages = chat_template.format_messages(topic=...)

# or easily convert to a text prompt (for the completion API)
prompt = chat_template.format(topic=...)

In this case, {topic} is a placeholder that will be replaced with the actual topic when the prompt is used.
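
For instance (the topic value is illustrative, not from the docs):

Python
messages = chat_template.format_messages(topic="a robot learning to paint")
# messages now holds two ChatMessage objects: the system message
# and the user message with {topic} filled in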
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------