Hello, I am testing the context chat engine, which works fine with OpenAI. To use it with local models, I want to adapt the prompt to the specific prompt template the model was trained on (as is generally recommended), but I am not sure how to customize the full prompt of the chat engine, not just the system prompt.

Are there any recommendations on how to customize the full prompt of the context chat engine (system prompt, context, history, query) so that it matches a model's prompt template?

I know how to get the full prompt of a query engine, and I found a prompt template for the condense question chat engine, but I cannot figure out how this is supposed to work for the context chat engine, where both the retrieved context and the chat history are passed to the model. Am I missing anything?
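For concreteness, here is a rough sketch (plain Python, no LlamaIndex) of the kind of full prompt I would like the chat engine to produce. I am using the Llama-2 chat template purely as an example; the exact tags (`[INST]`, `<<SYS>>`, etc.) and the placement of the retrieved context are assumptions and would differ per model:

```python
def build_llama2_prompt(system_prompt, context_str, history, query):
    """Assemble a full prompt in (roughly) the Llama-2 chat format.

    The retrieved context is folded into the system block of the first
    turn, and each (user, assistant) pair of the history is wrapped in
    its own [INST] ... [/INST] segment.
    """
    system_block = (
        f"<<SYS>>\n{system_prompt}\n\nContext:\n{context_str}\n<</SYS>>\n\n"
    )
    prompt = ""
    for i, (user_msg, assistant_msg) in enumerate(history):
        # Only the very first user turn carries the system/context block.
        prefix = system_block if i == 0 else ""
        prompt += f"<s>[INST] {prefix}{user_msg} [/INST] {assistant_msg} </s>"
    # Current query goes in the final, unanswered [INST] segment.
    prefix = system_block if not history else ""
    prompt += f"<s>[INST] {prefix}{query} [/INST]"
    return prompt


prompt = build_llama2_prompt(
    "You are a helpful assistant.",
    "Paris is the capital of France.",
    [("Hi!", "Hello, how can I help?")],
    "What is the capital of France?",
)
print(prompt)
```

Essentially, I am looking for the supported way to make the context chat engine emit something shaped like this, instead of hand-rolling the string myself.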