----------------------
In LlamaIndex, you can pass additional variables to your prompt template at query time by supplying them as `**kwargs` to `get_response`. Here is an example:
```python
from llama_index.core import PromptTemplate
from llama_index.core.response_synthesizers import TreeSummarize

# NOTE: we add an extra tone_name variable here
qa_prompt_tmpl = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query.\n"
    "Please also write the answer in the tone of {tone_name}.\n"
    "Query: {query_str}\n"
    "Answer: "
)
qa_prompt = PromptTemplate(qa_prompt_tmpl)

# initialize response synthesizer
summarizer = TreeSummarize(verbose=True, summary_template=qa_prompt)

# get response
response = summarizer.get_response(
    "who is Paul Graham?", [text], tone_name="a Shakespeare play"
)
```
In this example, an extra `tone_name` variable is added to the prompt template, and its value is passed as a keyword argument in the `get_response` call.
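The snippet assumes a `text` variable holding the context to summarize; any string works, for example content loaded from your documents. A minimal placeholder:

```python
# Placeholder context for the example above; in practice this would come from
# your loaded documents or retrieved nodes.
text = (
    "Paul Graham is a programmer, essayist, and investor who co-founded the "
    "startup accelerator Y Combinator."
)
```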
If you have a prompt template whose variable names differ from the ones LlamaIndex expects (e.g., `context_str` and `query_str`), you can use `template_var_mappings` to map your variable names to the expected ones. Here is an example:
```python
template_var_mappings = {"context_str": "my_context", "query_str": "my_query"}
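
# The keys above are the variable names LlamaIndex expects; the values are the
# names used in your own template. A minimal sketch (names below are illustrative)
# of wiring the mapping into a PromptTemplate whose template string uses
# {my_context} and {my_query}:
my_prompt_tmpl_str = (
    "Here is some context: {my_context}\n"
    "Answer the following question given the context: {my_query}\n"
)
my_prompt = PromptTemplate(
    my_prompt_tmpl_str, template_var_mappings=template_var_mappings
)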