Updated 3 months ago

You may see references to legacy prompt subclasses such as QuestionAnswerPrompt and RefinePrompt. These have been deprecated and are now type aliases of PromptTemplate, so you can construct custom prompts directly with PromptTemplate(template). You still have to make sure the template string contains the expected parameters (e.g. {context_str} and {query_str}) when replacing a default question-answer prompt.
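To illustrate the requirement above: PromptTemplate fills its named fields with Python str.format-style substitution, so a replacement template must still declare every expected parameter. This is a minimal plain-Python sketch of that mechanism (the template text here is hypothetical, not a LlamaIndex default):

```python
# Hypothetical custom QA template; {context_str} and {query_str} are the
# parameters the default question-answer prompt expects.
QA_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the question using only the context above.\n"
    "Question: {query_str}\n"
)

# PromptTemplate(QA_TMPL).format(context_str=..., query_str=...) performs this
# kind of named substitution; plain str.format shows the mechanism:
filled = QA_TMPL.format(
    context_str="Paris is the capital of France.",
    query_str="What is the capital of France?",
)

# A template missing a required field raises KeyError when formatted:
try:
    "Question: {query_str}".format(context_str="irrelevant")
except KeyError as exc:
    missing = str(exc)  # "'query_str'"
```

If your custom template drops one of the expected fields, you will hit exactly this kind of missing-key failure when the query engine formats the prompt.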

Since RefinePrompt has been deprecated, how can I modify the following code:
Plain Text
CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)
Please stand by @fengxiaoyang1998 as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
The RefinePrompt class has been deprecated and is now a type alias of PromptTemplate. This means you can directly use PromptTemplate to construct your prompts.

The ChatPromptTemplate class is used for chat prompts. It takes a list of ChatMessage objects, where each ChatMessage has a content and a role.

Here's how you can modify your code:

Plain Text
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

CHAT_REFINE_PROMPT_TMPL_MSGS = [
    ChatMessage(content="{query_str}", role=MessageRole.USER),
    ChatMessage(content="{existing_answer}", role=MessageRole.ASSISTANT),
    ChatMessage(
        content="We have the opportunity to refine the above answer "
        "(only if needed) with some more context below.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context and using the best of your knowledge, improve the existing answer. "
        "If you can't improve the existing answer, just repeat it again.",
        role=MessageRole.USER,
    ),
]

CHAT_REFINE_PROMPT = ChatPromptTemplate(CHAT_REFINE_PROMPT_TMPL_MSGS)


In this code, CHAT_REFINE_PROMPT is a ChatPromptTemplate object that you can use to format your chat prompts.

Please note that the template string should contain the expected parameters (e.g. {context_str} and {query_str}) when replacing a default question answer prompt.

For more information, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai
How can I update this code:
Plain Text
# refine prompt selector
DEFAULT_REFINE_PROMPT_SEL_LC = ConditionalPromptSelector(
    default_prompt=DEFAULT_REFINE_PROMPT.get_langchain_prompt(),
    conditionals=[(is_chat_model, CHAT_REFINE_PROMPT.get_langchain_prompt())],
)
REFINE_TEMPLATE = RefinePrompt(langchain_prompt_selector=DEFAULT_REFINE_PROMPT_SEL_LC)
Let me check the knowledge base for you. One moment! :books:
----------------------
The RefinePrompt class has been deprecated and is now a type alias of PromptTemplate. The ConditionalPromptSelector class is used to select a prompt based on a condition.

Here's how you can modify your code:

Plain Text
from llama_index.core.prompts import SelectorPromptTemplate
from llama_index.core.prompts.utils import is_chat_model

# refine prompt selector
REFINE_TEMPLATE = SelectorPromptTemplate(
    default_template=DEFAULT_REFINE_PROMPT,
    conditionals=[(is_chat_model, CHAT_REFINE_PROMPT)],
)


In this code, REFINE_TEMPLATE is a SelectorPromptTemplate object that you can use to select a prompt based on whether the model is a chat model. If is_chat_model is True, it will use CHAT_REFINE_PROMPT; otherwise, it will use DEFAULT_REFINE_PROMPT.
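The selection behavior described above can be sketched in plain Python. This is a hedged sketch of the idea, not LlamaIndex's actual implementation; the FakeLLM class and the simplified is_chat_model predicate are stand-ins for illustration:

```python
# Assumption: a conditional prompt selector walks its (predicate, template)
# pairs in order and returns the template for the first predicate that
# matches, falling back to the default template otherwise.
def select_template(default_template, conditionals, llm):
    for predicate, template in conditionals:
        if predicate(llm):
            # First matching predicate decides the template.
            return template
    # No predicate matched: use the default template.
    return default_template

# Hypothetical stand-ins for is_chat_model and the two prompt objects:
def is_chat_model(llm):
    return getattr(llm, "is_chat", False)

class FakeLLM:
    def __init__(self, is_chat):
        self.is_chat = is_chat

DEFAULT_REFINE_PROMPT = "text-completion refine prompt"
CHAT_REFINE_PROMPT = "chat refine prompt"

chosen_for_chat = select_template(
    DEFAULT_REFINE_PROMPT,
    [(is_chat_model, CHAT_REFINE_PROMPT)],
    FakeLLM(is_chat=True),
)
chosen_for_completion = select_template(
    DEFAULT_REFINE_PROMPT,
    [(is_chat_model, CHAT_REFINE_PROMPT)],
    FakeLLM(is_chat=False),
)
```

A chat model selects the chat-style refine prompt, while a plain completion model falls back to the default.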

Please note that the template string should contain the expected parameters (e.g. {context_str} and {query_str}) when replacing a default question-answer prompt.

For more information, you can refer to the LlamaIndex documentation.
----------------------