Yes, you can use Prompt Templates in LongRAG. The PromptTemplate class from the llama_index.core module is used to create and manage prompts. Here's a basic example of how to use it:
```python
from llama_index.core import PromptTemplate

template = (
    "We have provided context information below.\n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
qa_template = PromptTemplate(template)

# you can create a text prompt (for the completion API)
prompt = qa_template.format(context_str=..., query_str=...)

# or easily convert to message prompts (for the chat API)
messages = qa_template.format_messages(context_str=..., query_str=...)
```
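To make the placeholder substitution concrete, here is a minimal sketch of what `format` does with the template string, using only Python's built-in `str.format` (no llama_index required; the context and query strings are illustrative):

```python
# Sketch: PromptTemplate.format substitutes the named placeholders in the
# template string, much like str.format does here.
template = (
    "We have provided context information below.\n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)

prompt = template.format(
    context_str="LongRAG groups documents into long retrieval units.",
    query_str="What does LongRAG retrieve?",
)
print(prompt)
```

The resulting string is what gets sent to the completion API; `format_messages` produces the same content wrapped in chat-style message objects instead.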
You can also customize your prompts. For example, if you want to use different template variables than the standard `context_str` and `query_str`, you can define `template_var_mappings`:
```python
# template written with custom variable names
qa_prompt_tmpl_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{my_context}\n"
    "---------------------\n"
    "Query: {my_query}\n"
    "Answer: "
)

# map the standard names (keys) to the names used in the template (values)
template_var_mappings = {"context_str": "my_context", "query_str": "my_query"}

prompt_tmpl = PromptTemplate(
    qa_prompt_tmpl_str, template_var_mappings=template_var_mappings
)
```
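As a rough sketch of how the mapping behaves (plain Python, no llama_index; the template and strings are illustrative): the keys are the standard names you pass when formatting, and the values are the names actually used inside the template string:

```python
# Sketch of template_var_mappings: remap caller-supplied keyword names
# (context_str / query_str) to the names used inside the template
# (my_context / my_query) before formatting.
tmpl_str = "Context: {my_context}\nQuery: {my_query}\n"
var_mappings = {"context_str": "my_context", "query_str": "my_query"}

def format_with_mappings(template, mappings, **kwargs):
    # rename each keyword argument according to the mapping, then format
    remapped = {mappings.get(k, k): v for k, v in kwargs.items()}
    return template.format(**remapped)

prompt = format_with_mappings(
    tmpl_str,
    var_mappings,
    context_str="some retrieved text",
    query_str="some question",
)
```

This lets callers keep using the standard `context_str`/`query_str` keywords even though the template itself names its variables differently.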
For more advanced usage and examples, you can refer to the Prompt Engineering for RAG notebook on GitHub. This notebook shows various prompt techniques you can try to customize your LlamaIndex RAG pipeline, including getting and setting prompts for query engines, defining template variable mappings, adding few-shot examples, and performing query transformations/rewriting.