Updated last year

How to change prompt templates in RetrieverQueryEngine.from_args
18 comments
@kapa.ai How to change prompt templates in RetrieverQueryEngine.from_args?
Do I have to set both text_qa_template and refine_template when response_mode is set to "refine"?
Let me check the knowledge base for you. One moment! :books:
----------------------
Based on the information provided in the extracts, it's not explicitly mentioned how to change prompt templates in RetrieverQueryEngine.from_args(). However, it's clear that when using the "refine" response mode, two types of prompts are used: text_qa_template and refine_template.

The text_qa_template is used for the initial query, and the refine_template is used for refining the answer with additional context from subsequent nodes. Therefore, it's reasonable to assume that both templates would need to be set when using the "refine" response mode.

Here's an example of how these templates are defined:

Plain Text
DEFAULT_TEXT_QA_PROMPT_TMPL = (
    "Context information is below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given the context information, answer the following question "
    "(if you don't know the answer, use the best of your knowledge): {query_str}\n"
)
TEXT_QA_TEMPLATE = PromptTemplate(DEFAULT_TEXT_QA_PROMPT_TMPL)

DEFAULT_REFINE_PROMPT_TMPL = (
    "The original question is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context and using the best of your knowledge, improve the existing answer. "
    "If you can't improve the existing answer, just repeat it again."
)
DEFAULT_REFINE_PROMPT = PromptTemplate(DEFAULT_REFINE_PROMPT_TMPL)

(Source: LlamaIndex Documentation)
However, to get a definitive answer on how to change these templates in RetrieverQueryEngine.from_args(), I would recommend referring to the official LlamaIndex documentation or the source code.
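To make the interplay between the two templates concrete, here is a minimal sketch of the "refine" loop using plain str.format, independent of LlamaIndex (the templates are abridged and toy_llm is a stand-in for a real LLM call, not LlamaIndex API):

```python
# Sketch of "refine" mode: the first node is answered with the QA template,
# and every subsequent node refines the previous answer via the refine template.
TEXT_QA_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the question: {query_str}\n"
)
REFINE_TMPL = (
    "The original question is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "Refine it (only if needed) with this new context:\n"
    "{context_msg}\n"
)

def refine_answer(query, node_texts, llm):
    """llm is any callable mapping a prompt string to an answer string."""
    # First retrieved node: use the QA template.
    answer = llm(TEXT_QA_TMPL.format(context_str=node_texts[0], query_str=query))
    # Remaining nodes: feed the previous answer back through the refine template.
    for text in node_texts[1:]:
        answer = llm(REFINE_TMPL.format(
            query_str=query, existing_answer=answer, context_msg=text))
    return answer

# A toy "LLM" that records the prompts it receives.
prompts_seen = []
def toy_llm(prompt):
    prompts_seen.append(prompt)
    return f"answer-{len(prompts_seen)}"

result = refine_answer("What is X?", ["node 1", "node 2", "node 3"], toy_llm)
```

This is why both templates matter in refine mode: with three retrieved nodes, the QA template is formatted once and the refine template twice.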
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai where to import PromptTemplate from?
Let me check the knowledge base for you. One moment! :books:
----------------------
You can import PromptTemplate from the llama_index.prompts module. Here is the import statement:

Plain Text
from llama_index.prompts import PromptTemplate


This was shown in the example in the GitHub notebook on customizing prompts.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai how to pass the templates to the query_engine?
Let me check the knowledge base for you. One moment! :books:
----------------------
You can pass the custom prompt templates to the query engine in two ways:

  1. Via the high-level API:
Plain Text
query_engine = index.as_query_engine(
    text_qa_template=custom_qa_prompt, refine_template=custom_refine_prompt
)


  2. Via the low-level composition API:
Plain Text
retriever = index.as_retriever()
synth = get_response_synthesizer(
    text_qa_template=custom_qa_prompt, refine_template=custom_refine_prompt
)
query_engine = RetrieverQueryEngine(retriever, response_synthesizer=synth)


In both cases, custom_qa_prompt and custom_refine_prompt are instances of PromptTemplate that you have defined.

The two approaches are equivalent: the high-level API is a shortcut recommended for quickly setting common parameters, while the low-level composition API gives you more granular control over the retriever and response synthesizer.

For more details, please refer to the LlamaIndex Documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai I instantiate my query engine with from_args method
Let me check the knowledge base for you. One moment! :books:
----------------------
If you are using the from_args method to instantiate your query engine, you can pass the custom prompt templates as arguments. Here's an example:

Plain Text
from llama_index.query_engine import RetrieverQueryEngine
from llama_index.prompts import PromptTemplate

# Define your custom templates
custom_qa_prompt = PromptTemplate("Your custom QA template here")
custom_refine_prompt = PromptTemplate("Your custom refine template here")

# Instantiate the query engine
query_engine = RetrieverQueryEngine.from_args(
    retriever=your_retriever,  # replace with your retriever
    text_qa_template=custom_qa_prompt,
    refine_template=custom_refine_prompt,
    # include other parameters as needed
)


In this example, your_retriever should be replaced with the retriever you are using. The text_qa_template and refine_template parameters are used to pass the custom prompt templates.

Please note that the exact parameters you need to pass to from_args may vary depending on your use case. For a full list of parameters, refer to the LlamaIndex Documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
how to give ServiceContext a system prompt?
@kapa.ai how to give ServiceContext a system prompt?
Let me check the knowledge base for you. One moment! :books:
----------------------
To give a ServiceContext a system prompt, you can pass it directly to the from_defaults constructor. Here's an example:

Plain Text
from llama_index import ServiceContext

# The system prompt is a plain string, not a PromptTemplate
service_context = ServiceContext.from_defaults(
    system_prompt="Your system prompt here"
)


In this example, replace "Your system prompt here" with the actual system prompt you want to use. Note that system_prompt is an ordinary string, not a PromptTemplate instance; it is prepended to the LLM calls made through this service context.

Please note that the exact way to set a system prompt may vary depending on your use case and the specific configuration of your ServiceContext. For more details, refer to the LlamaIndex Documentation.
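For intuition, a system prompt plays a different role than the QA/refine templates: it is a fixed preamble sent as the system message on every LLM call, while the formatted template becomes the user message. A minimal sketch of that assembly (build_messages is a hypothetical helper for illustration, not LlamaIndex API):

```python
# Sketch: the system prompt is a fixed preamble attached to every request,
# while the QA template is formatted per query.
SYSTEM_PROMPT = "You are a helpful assistant that answers only from the given context."
QA_TMPL = "Context:\n{context_str}\n\nQuestion: {query_str}\n"

def build_messages(system_prompt, qa_template, context_str, query_str):
    """Assemble chat-style messages: system preamble + formatted user prompt."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user",
         "content": qa_template.format(context_str=context_str,
                                       query_str=query_str)},
    ]

msgs = build_messages(SYSTEM_PROMPT, QA_TMPL,
                      "Paris is in France.", "Where is Paris?")
```

The same system message would accompany both the initial QA call and any refine calls, which is why it lives on the service context rather than in the per-step templates.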
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------