----------------------
To change the prompt format for the Zephyr 7b beta model in LlamaCPP, you can use the update_prompts method. It is available on any module that implements get_prompts, and it lets you customize prompts by passing in a dictionary whose keys match the keys of the prompt dictionary returned by get_prompts.
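If you are not sure which keys exist, you can inspect them first by calling get_prompts. A minimal sketch, assuming the query engine built on top of your model is stored in a variable named query_engine (an illustrative name):

# Inspect the prompt keys available on the module
prompts_dict = query_engine.get_prompts()
print(list(prompts_dict.keys()))
# e.g. ['response_synthesizer:text_qa_template', 'response_synthesizer:refine_template']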
Here is an example of how you can do it:
from llama_index.prompts import PromptTemplate

# Define your custom prompt template
qa_prompt_tmpl_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query in the style of a Shakespeare play.\n"
    "Query: {query_str}\n"
    "Answer: "
)
qa_prompt_tmpl = PromptTemplate(qa_prompt_tmpl_str)

# Update the prompts on your query engine
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": qa_prompt_tmpl}
)
In this example, query_engine is the LlamaIndex module whose prompts you want to change, typically the query engine built on top of your Zephyr 7b beta LLM. Replace it with the actual variable name in your code. Note that update_prompts is called on the module that owns the prompts, such as a query engine, rather than on the LLM itself.
Please note that the keys in the dictionary passed to update_prompts must match the keys in the prompt dictionary returned by get_prompts. In the example above, "response_synthesizer:text_qa_template" is the key a default query engine uses for its question-answering prompt; replace it with the actual key of the prompt you want to customize.
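Putting it together, here is a minimal end-to-end sketch. The model path and data directory are placeholders, and the legacy ServiceContext API is assumed:

from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import LlamaCPP
from llama_index.prompts import PromptTemplate

# Load Zephyr 7b beta through LlamaCPP (the model path is a placeholder)
llm = LlamaCPP(model_path="./models/zephyr-7b-beta.Q4_K_M.gguf")

# Build an index and a query engine on top of the model
# (embed_model="local" assumes a local HuggingFace embedding model)
documents = SimpleDirectoryReader("./data").load_data()
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()

# Swap in the custom prompt and query
qa_prompt_tmpl = PromptTemplate(qa_prompt_tmpl_str)  # template string as defined above
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": qa_prompt_tmpl}
)
response = query_engine.query("What did the author do growing up?")
print(response)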
For more information, you can refer to the LlamaIndex documentation on accessing and customizing prompts.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------