Prompts are a fundamental input that gives large language models (LLMs) their expressive power. In LlamaIndex, prompts are used to build the index, perform insertions, aid in traversal during querying, and synthesize the final answer.
LlamaIndex comes with a set of built-in, battle-tested prompts that handle the tricky work of getting a specific LLM to correctly handle and format data. This is one of the biggest benefits of using LlamaIndex.
You can also define your own custom prompts. This is as simple as creating a format string. Here's an example of how to do it:
```python
from llama_index.prompts import PromptTemplate

template = (
    "We have provided context information below.\n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
qa_template = PromptTemplate(template)

# you can create a text prompt (for the completion API)
prompt = qa_template.format(context_str=..., query_str=...)

# or easily convert to message prompts (for the chat API)
messages = qa_template.format_messages(context_str=..., query_str=...)
```
LlamaIndex also provides advanced prompt capabilities such as partial formatting (pre-filling some template variables while leaving others to be supplied later) and template variable mappings (adapting a template whose variables are named differently from what a module expects).
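To illustrate the idea behind partial formatting, here is a minimal plain-Python sketch. It mimics the behavior conceptually rather than calling the LlamaIndex API, so the class and method names below are illustrative assumptions, not library code:

```python
class PartialTemplate:
    """A format string that can be filled in stages (partial-formatting sketch)."""

    def __init__(self, template, **kwargs):
        self.template = template
        self.partial_kwargs = kwargs  # variables pre-filled so far

    def partial_format(self, **kwargs):
        # return a new template with additional variables pinned down
        return PartialTemplate(self.template, **{**self.partial_kwargs, **kwargs})

    def format(self, **kwargs):
        # merge pre-filled and final variables, then render the string
        return self.template.format(**{**self.partial_kwargs, **kwargs})


qa = PartialTemplate("Context: {context_str}\nQuestion: {query_str}\n")

# pin down the context now; the query is supplied at call time
qa_with_context = qa.partial_format(context_str="LlamaIndex ships default prompts.")
answer_prompt = qa_with_context.format(query_str="What does LlamaIndex ship?")
```

This two-stage pattern is useful when one part of a pipeline knows the context and a later part knows the query.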
If you want to access or customize prompts within higher-level modules, you can use `get_prompts` to retrieve the set of prompts for any module (including nested sub-modules) and `update_prompts` to update these prompts easily.
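The pattern behind this pair of methods can be sketched in plain Python. This is a conceptual toy, not the actual LlamaIndex implementation: each module collects its own prompts plus those of nested sub-modules under namespaced keys, and updates are routed back to the owning module by key:

```python
class Module:
    """Toy module tree illustrating the get_prompts/update_prompts pattern."""

    def __init__(self, prompts=None, sub_modules=None):
        self.prompts = dict(prompts or {})          # this module's own prompts
        self.sub_modules = dict(sub_modules or {})  # name -> nested Module

    def get_prompts(self):
        # collect own prompts, plus nested ones under "subname:key" keys
        all_prompts = dict(self.prompts)
        for name, sub in self.sub_modules.items():
            for key, prompt in sub.get_prompts().items():
                all_prompts[f"{name}:{key}"] = prompt
        return all_prompts

    def update_prompts(self, prompts_dict):
        # route each update to the owning module via its namespaced key
        for key, prompt in prompts_dict.items():
            if ":" in key:
                name, sub_key = key.split(":", 1)
                self.sub_modules[name].update_prompts({sub_key: prompt})
            else:
                self.prompts[key] = prompt


synthesizer = Module(prompts={"text_qa_template": "default QA prompt"})
engine = Module(sub_modules={"response_synthesizer": synthesizer})

# inspect, then override the nested prompt through the top-level module
engine.update_prompts({"response_synthesizer:text_qa_template": "my custom QA prompt"})
```

The namespaced keys are what let you target one prompt deep inside a composed module without touching the sub-module directly.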