how to change prompt format for zephyr 7b beta model in llamacpp

Please stand by @adeelhasan as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
To change the prompt format for the Zephyr 7B beta model in LlamaCPP, you can use the update_prompts function. It is available on any module that implements get_prompts: you pass in a dictionary whose keys match the keys of the prompt dictionary returned by get_prompts.

Here is an example of how you can do it:

Plain Text
from llama_index.prompts import PromptTemplate

# Define your custom prompt template
qa_prompt_tmpl_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query in the style of a Shakespeare play.\n"
    "Query: {query_str}\n"
    "Answer: "
)
qa_prompt_tmpl = PromptTemplate(qa_prompt_tmpl_str)

# Update the prompts of your model
model.update_prompts(
    {"response_synthesizer:text_qa_template": qa_prompt_tmpl}
)


In this example, model stands for the module built on top of your Zephyr 7B beta model that implements get_prompts, typically a query engine. Replace model with the actual variable name in your code.

Please note that the keys in the dictionary passed to update_prompts must match the keys in the prompt dictionary returned by get_prompts. In the example above, "response_synthesizer:text_qa_template" is used as a placeholder; replace it with the actual key of the prompt you want to customize.
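
To see which keys are available, you can inspect the prompt dictionary first. Here is a minimal sketch, assuming the module is a query engine held in a variable called query_engine (a hypothetical name for whatever you built over your Zephyr model):

Plain Text
# List the prompt keys that can be overridden via update_prompts
prompts_dict = query_engine.get_prompts()
print(list(prompts_dict.keys()))
# e.g. ['response_synthesizer:text_qa_template', 'response_synthesizer:refine_template']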

For more information, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai but I am asking how to define a custom messages_to_prompt function, which we pass to the LlamaCPP class, for the Zephyr model
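
For reference, the LlamaCPP class accepts messages_to_prompt and completion_to_prompt callables, which can be used to render chat messages in Zephyr's <|system|> / <|user|> / <|assistant|> format. Below is a minimal sketch, not a definitive implementation, assuming the llama_index LlamaCPP wrapper and a local zephyr-7b-beta GGUF file (the model_path value is a placeholder):

Plain Text
from llama_index.llms import LlamaCPP

# Render chat messages in Zephyr's chat format:
# <|system|> ... </s> <|user|> ... </s> <|assistant|>
def messages_to_prompt(messages):
    prompt = ""
    for message in messages:
        if message.role == "system":
            prompt += f"<|system|>\n{message.content}</s>\n"
        elif message.role == "user":
            prompt += f"<|user|>\n{message.content}</s>\n"
        elif message.role == "assistant":
            prompt += f"<|assistant|>\n{message.content}</s>\n"
    # Ensure a (possibly empty) system block, then open the assistant turn
    if not prompt.startswith("<|system|>"):
        prompt = "<|system|>\n</s>\n" + prompt
    prompt += "<|assistant|>\n"
    return prompt

def completion_to_prompt(completion):
    return f"<|system|>\n</s>\n<|user|>\n{completion}</s>\n<|assistant|>\n"

llm = LlamaCPP(
    model_path="zephyr-7b-beta.Q4_K_M.gguf",  # placeholder path to your GGUF file
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
)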