----------------------
The `RefinePrompt` class has been deprecated and is now a type alias of `PromptTemplate`. This means you can construct your prompts directly with `PromptTemplate`.
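For example, if a plain string prompt is all you need, you can build a refine prompt directly with `PromptTemplate`. This is a minimal sketch; the template text is illustrative rather than the library's exact default:

```python
from llama_index.core import PromptTemplate

# Illustrative refine template; the variables {query_str}, {existing_answer},
# and {context_msg} are the ones the refine step fills in.
refine_tmpl_str = (
    "The original query is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context, refine the original answer to better answer the query. "
    "If the context isn't useful, return the original answer."
)
refine_prompt = PromptTemplate(refine_tmpl_str)
```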
The `ChatPromptTemplate` class is used for chat prompts. It takes a list of `ChatMessage` objects, where each `ChatMessage` has a `content` and a `role`.
Here's how you can modify your code:
```python
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

CHAT_REFINE_PROMPT_TMPL_MSGS = [
    ChatMessage(content="{query_str}", role=MessageRole.USER),
    ChatMessage(content="{existing_answer}", role=MessageRole.ASSISTANT),
    ChatMessage(
        content=(
            "We have the opportunity to refine the above answer "
            "(only if needed) with some more context below.\n"
            "------------\n"
            "{context_msg}\n"
            "------------\n"
            "Given the new context and using the best of your knowledge, "
            "improve the existing answer. If you can't improve the "
            "existing answer, just repeat it again."
        ),
        role=MessageRole.USER,
    ),
]

CHAT_REFINE_PROMPT = ChatPromptTemplate(CHAT_REFINE_PROMPT_TMPL_MSGS)
```
In this code, `CHAT_REFINE_PROMPT` is a `ChatPromptTemplate` object that you can use to format your chat prompts.
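For instance, you can render the template into a list of chat messages with `format_messages`. The values below are placeholders just to illustrate the call; in a real query pipeline the response synthesizer fills these variables in for you:

```python
messages = CHAT_REFINE_PROMPT.format_messages(
    query_str="What does LlamaIndex do?",
    existing_answer="LlamaIndex is a data framework for LLM applications.",
    context_msg="LlamaIndex provides tools to connect LLMs to external data.",
)
for message in messages:
    print(f"{message.role}: {message.content}")
```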
Please note that the template strings must contain the parameters the response synthesizer expects: a refine prompt like the one above uses `{query_str}`, `{existing_answer}`, and `{context_msg}`, while a default question-answering prompt expects `{context_str}` and `{query_str}`.
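As a sketch of how this is typically wired up, you can pass the template as the `refine_template` of a query engine (this assumes a standard `VectorStoreIndex` over documents loaded from a local `data` directory):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load documents and build an index (assumed setup for illustration).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# `refine_template` is forwarded to the response synthesizer, so the custom
# chat prompt is used whenever an answer is refined across chunks.
query_engine = index.as_query_engine(refine_template=CHAT_REFINE_PROMPT)
response = query_engine.query("What does LlamaIndex do?")
print(response)
```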
For more information, you can refer to the LlamaIndex documentation.
----------------------