
Updated 2 months ago

How to Set a Custom Prompt for a Chat Engine

At a glance

The community member is having trouble using a custom prompt with a chat engine. They provide code examples of how they previously used a custom prompt with a query engine, and ask how they can set the prompt for the chat engine. In the comments, another community member suggests two options: adding a system prompt to the chat engine, or accessing and updating the prompt being used by the engine. The community member responds that they will try these suggestions, but there is no explicitly marked answer.

Hi there! I can't use my custom prompt with a chat engine.
In this code, how can I set my prompt?
chat_engine = vector_index.as_chat_engine(chat_mode="condense_question")
response = chat_engine.chat("Question.... bla bla bla")

In my previous query engine I used:
query_engine = summary_index.as_query_engine(
    streaming=True,
    response_mode="tree_summarize",
    verbose=True,
    text_qa_template=custom_prompt,
)
query_engine.update_prompts(
    {"response_synthesizer:summary_template": custom_prompt}
)
Thanks!
2 comments
You can add a system prompt to your chat engine:
chat_engine = vector_index.as_chat_engine(chat_mode="condense_question", system_prompt="Add system prompt for llm here")


Or you can access the prompt being used by your engine and update it: https://docs.llamaindex.ai/en/stable/module_guides/models/prompts/usage_pattern/#accessing-prompts
I'll try, thanks!