If my prompt template in my chat engine consists of several other variables that I want to pass in, how can I do that?
Please stand by @Abhiram_AI_Guy as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
In LlamaIndex, you can pass additional variables to your prompt template at query time by supplying them as extra keyword arguments (`**kwargs`) to `get_response`. Here is an example:

```python
from llama_index.core import PromptTemplate
from llama_index.core.response_synthesizers import TreeSummarize

# NOTE: we add an extra tone_name variable here
qa_prompt_tmpl = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query.\n"
    "Please also write the answer in the tone of {tone_name}.\n"
    "Query: {query_str}\n"
    "Answer: "
)
qa_prompt = PromptTemplate(qa_prompt_tmpl)

# initialize response synthesizer
summarizer = TreeSummarize(verbose=True, summary_template=qa_prompt)

# get response (`text` is a context string assumed to be defined elsewhere)
response = summarizer.get_response(
    "who is Paul Graham?", [text], tone_name="a Shakespeare play"
)
```

In this example, an extra variable `tone_name` is added to the prompt template, and its value is passed in the `get_response` call.
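Mechanically, this is the same behavior as Python's built-in `str.format`: every `{placeholder}` in the template is filled from a keyword argument of the same name. A minimal stdlib-only sketch of the idea (not LlamaIndex's actual code):

```python
# Every {name} placeholder in the template is filled from a keyword
# argument, exactly like str.format on a plain string.
qa_prompt_tmpl = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Please also write the answer in the tone of {tone_name}.\n"
    "Query: {query_str}\n"
    "Answer: "
)

prompt = qa_prompt_tmpl.format(
    context_str="Paul Graham co-founded Y Combinator.",
    query_str="who is Paul Graham?",
    tone_name="a Shakespeare play",
)
print(prompt)
```

Any placeholder left without a matching keyword argument raises a `KeyError`, which is why every extra template variable must be supplied at query time.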

If your prompt template uses different variable names than the ones LlamaIndex expects (e.g., `context_str` and `query_str`), you can use `template_var_mappings` to map the expected names to yours. Here is an example:

```python
# the template itself uses `my_context` / `my_query` instead of the expected names
qa_prompt_tmpl_str = "Context: {my_context}\nQuery: {my_query}\nAnswer: "
template_var_mappings = {"context_str": "my_context", "query_str": "my_query"}
prompt_tmpl = PromptTemplate(
    qa_prompt_tmpl_str, template_var_mappings=template_var_mappings
)
```

In this example, `my_context` and `my_query` are the variable names used in your template, and the mapping ties them to `context_str` and `query_str`, the names LlamaIndex expects.
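The remapping itself is simple to picture: before the template is formatted, each expected variable name is translated into the name the template actually uses. A stdlib-only sketch of the idea (the helper `format_with_mappings` is hypothetical, not LlamaIndex's implementation):

```python
def format_with_mappings(template, var_mappings, **kwargs):
    # Translate expected variable names (e.g. context_str) into the
    # names the template actually uses (e.g. my_context), then format.
    remapped = {var_mappings.get(k, k): v for k, v in kwargs.items()}
    return template.format(**remapped)

template = "Context: {my_context}\nQuery: {my_query}\nAnswer: "
mappings = {"context_str": "my_context", "query_str": "my_query"}

# Callers keep using the expected names; the template's own names get filled.
prompt = format_with_mappings(
    template, mappings, context_str="Some context.", query_str="who is Paul Graham?"
)
print(prompt)
```

This lets the rest of the pipeline keep passing `context_str` and `query_str` without knowing what your template calls them internally.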
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
I initialized a chat engine, but I want to pass this variable as well while keeping my chat history intact.
As replied by @Logan M in a previous response, the chat engine only has a system prompt and chat history: https://discord.com/channels/1059199217496772688/1253543135217188924/1253548026006470811
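Since the chat engine only accepts a system prompt and chat history, one workaround is to fill the extra variables into the prompt string up front and pass the result as the system prompt (LlamaIndex's `PromptTemplate` also exposes a `partial_format` method for pre-filling some variables). A stdlib-only sketch of partial filling, where placeholders without a value are kept for later:

```python
class KeepMissing(dict):
    # str.format_map calls __missing__ for absent keys; re-emit the
    # placeholder so it can be filled later instead of raising KeyError.
    def __missing__(self, key):
        return "{" + key + "}"

tmpl = "Answer in the tone of {tone_name}.\nQuery: {query_str}\nAnswer: "

# Fill tone_name now; leave {query_str} to be filled at query time.
system_prompt = tmpl.format_map(KeepMissing(tone_name="a Shakespeare play"))
print(system_prompt)
```

The pre-filled string can then be handed to the chat engine as its `system_prompt`, leaving the chat history untouched.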