How does one replace the contents of TEXT_QA_SYSTEM_PROMPT? Looking at the docs, I can only see how to replace response_synthesizer:text_qa_template, which comes later in the pipeline.
Those are indeed one and the same, i.e. the text_qa_template contains the system prompt.

You can just update the text_qa_template to be a new chat template.

A small example

Python
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

# Text QA prompt: the system message here replaces the default
# TEXT_QA_SYSTEM_PROMPT, and the user message carries the QA template.
chat_text_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "Always answer the question, even if the context isn't helpful."
        ),
    ),
    ChatMessage(role=MessageRole.USER, content=qa_prompt_str),
]
text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)
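Under the hood, the response synthesizer fills the {context_str} and {query_str} placeholders when it formats the prompt. A plain-Python sketch of that substitution (the context and question strings here are made up for illustration):

```python
qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

# The synthesizer performs an equivalent substitution when it
# builds the final LLM prompt from the retrieved nodes and the query.
filled = qa_prompt_str.format(
    context_str="LlamaIndex is a data framework for LLMs.",
    query_str="What is LlamaIndex?",
)
print(filled)
```

To actually apply the new template, pass it when building the query engine, e.g. index.as_query_engine(text_qa_template=text_qa_template), or set it on an existing engine with query_engine.update_prompts({"response_synthesizer:text_qa_template": text_qa_template}).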