The provided extracts don't explicitly state whether both templates must be passed to `RetrieverQueryEngine.from_args()`. However, it's clear that when using the "refine" response mode, two types of prompts are used: `text_qa_template` and `refine_template`. The `text_qa_template` is used for the initial query, and the `refine_template` is used for refining the answer with additional context from subsequent nodes. Therefore, it's reasonable to assume that both templates need to be set when using the "refine" response mode. For reference, the defaults look like this:

```python
from llama_index.prompts import PromptTemplate

DEFAULT_TEXT_QA_PROMPT_TMPL = (
    "Context information is below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given the context information answer the following question "
    "(if you don't know the answer, use the best of your knowledge): {query_str}\n"
)
TEXT_QA_TEMPLATE = PromptTemplate(DEFAULT_TEXT_QA_PROMPT_TMPL)

DEFAULT_REFINE_PROMPT_TMPL = (
    "The original question is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context and using the best of your knowledge, improve the existing answer. "
    "If you can't improve the existing answer, just repeat it again."
)
DEFAULT_REFINE_PROMPT = PromptTemplate(DEFAULT_REFINE_PROMPT_TMPL)
```
However, for the exact behavior of `RetrieverQueryEngine.from_args()`, I would recommend referring to the official LlamaIndex documentation or the source code.

You can import `PromptTemplate` from the `llama_index.prompts` module. Here is the import statement:

```python
from llama_index.prompts import PromptTemplate
```
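For example (a small sketch; the template text here is made up), a `PromptTemplate` wraps a format-string template whose variables are filled in later:

```python
from llama_index.prompts import PromptTemplate

qa_tmpl = PromptTemplate(
    "Answer using only this context:\n{context_str}\nQuestion: {query_str}\n"
)

# PromptTemplate supports str.format-style substitution of its variables
print(qa_tmpl.format(context_str="LlamaIndex is a data framework.",
                     query_str="What is LlamaIndex?"))
```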
You can pass your custom templates when building a query engine directly from an index:

```python
query_engine = index.as_query_engine(
    text_qa_template=custom_qa_prompt,
    refine_template=custom_refine_prompt,
)
```

Or, at a lower level, through the response synthesizer:

```python
from llama_index import get_response_synthesizer
from llama_index.query_engine import RetrieverQueryEngine

retriever = index.as_retriever()
synth = get_response_synthesizer(
    text_qa_template=custom_qa_prompt,
    refine_template=custom_refine_prompt,
)
query_engine = RetrieverQueryEngine(retriever, response_synthesizer=synth)
```
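Either way, the resulting engine is queried the same:

```python
response = query_engine.query("Your question here")
print(str(response))
```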
In these examples, `custom_qa_prompt` and `custom_refine_prompt` are instances of `PromptTemplate` that you have defined.

If you are using the `from_args` method to instantiate your query engine, you can pass the custom prompt templates as arguments. Here's an example:

```python
from llama_index.query_engine import RetrieverQueryEngine
from llama_index.prompts import PromptTemplate

# Define your custom templates
custom_qa_prompt = PromptTemplate("Your custom QA template here")
custom_refine_prompt = PromptTemplate("Your custom refine template here")

# Instantiate the query engine
query_engine = RetrieverQueryEngine.from_args(
    retriever=your_retriever,  # replace with your retriever
    text_qa_template=custom_qa_prompt,
    refine_template=custom_refine_prompt,
    # include other parameters as needed
)
```
In this example, `your_retriever` should be replaced with the retriever you are using. The `text_qa_template` and `refine_template` parameters are used to pass the custom prompt templates.

Please note that the exact parameters accepted by `from_args` may vary depending on your use case. For a full list of parameters, refer to the LlamaIndex documentation.

To give the `ServiceContext` a system prompt, pass it to `from_defaults`. (Note that `ServiceContext` itself does not expose an `update_prompts` method; that method belongs to prompt-bearing modules such as query engines.) Here's an example:

```python
from llama_index import ServiceContext

# Instantiate the service context with a system prompt (a plain string)
service_context = ServiceContext.from_defaults(
    system_prompt="Your system prompt here"
)
```
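As a usage sketch (assuming `documents` have been loaded elsewhere), the service context is then passed wherever the index is built, so the system prompt travels with it:

```python
from llama_index import VectorStoreIndex

index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()
```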
The `update_prompts` method takes a dictionary where the keys are the names of the prompts and the values are `PromptTemplate` instances; it updates the prompts of the module it is called on, such as a query engine or response synthesizer. For more details, refer to the LlamaIndex documentation.
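For instance (a sketch reusing the `index` and `custom_qa_prompt` from the earlier examples), you can swap the QA template inside a query engine's response synthesizer after construction; the colon-separated key is the prompt name LlamaIndex exposes for that sub-module:

```python
query_engine = index.as_query_engine(response_mode="refine")

# Keys are namespaced as "<sub-module>:<prompt name>"
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": custom_qa_prompt}
)
```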