How to Set a Custom Prompt for a Chat Engine

Hi there! I can't use my custom prompt with a chat engine.
In this code, how can I set my prompt?
chat_engine = vector_index.as_chat_engine(chat_mode="condense_question")
response = chat_engine.chat("Question.... bla bla bla")

In my previous query engine I used:
query_engine = summary_index.as_query_engine(
    streaming=True,
    response_mode="tree_summarize",
    verbose=True,
    text_qa_template=custom_prompt,
)
query_engine.update_prompts(
    {"response_synthesizer:summary_template": custom_prompt}
)
Thanks!
2 comments
You can add a system prompt to your chat engine:
chat_engine = vector_index.as_chat_engine(
    chat_mode="condense_question",
    system_prompt="Add system prompt for LLM here",
)


Or you can access the prompts being used by your engine and update them: https://docs.llamaindex.ai/en/stable/module_guides/models/prompts/usage_pattern/#accessing-prompts
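For the condense-question mode specifically, the prompt being customized is the one that rewrites the latest message plus the chat history into a standalone question. As a minimal sketch (assumptions flagged in comments: the `{chat_history}` and `{question}` template variables, shown with plain `str.format` so the template shape is clear without llama_index installed):

```python
# Assumption: llama_index's condense-question prompt template uses the
# {chat_history} and {question} variables. Rendered here with plain
# str.format purely to illustrate the template's shape.
CUSTOM_CONDENSE_TEMPLATE = (
    "Given the conversation below and a follow-up message, rewrite the "
    "follow-up as a single standalone question.\n\n"
    "Chat history:\n{chat_history}\n\n"
    "Follow-up message: {question}\n\n"
    "Standalone question: "
)

# Example of what the engine would send to the LLM after filling the slots.
rendered = CUSTOM_CONDENSE_TEMPLATE.format(
    chat_history=(
        "user: What is LlamaIndex?\n"
        "assistant: A data framework for LLM applications."
    ),
    question="Does it support streaming?",
)
print(rendered)
```

With llama_index installed, you would wrap that string in a `PromptTemplate` and attach it to the engine, e.g. via `update_prompts` as in the docs link above (print `chat_engine.get_prompts().keys()` first to confirm the exact prompt key in your version, since key names vary between releases).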
I'll try, thanks!