Hi guys, a really newbie question, but is there any way to get a query engine to assume a certain persona? In particular, a sub-question query engine. I tried to include the persona in the prompt, but the LLM doesn't follow the instruction well. I was wondering if anyone has had success providing a custom text_qa_template in the response synthesizer for a custom persona.
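Roughly what I have in mind is something like the sketch below, assuming the llama_index.core imports and that text_qa_template gets passed through to the response synthesizer (the pirate persona and the "docs" tool name are just placeholders):

```python
from llama_index.core import (
    VectorStoreIndex,
    SimpleDirectoryReader,
    ChatPromptTemplate,
    get_response_synthesizer,
)
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.tools import QueryEngineTool, ToolMetadata

# Persona lives in the system message; context/query stay in the user message.
chat_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "You are Captain Morgan, a cheerful pirate historian. "
            "Always answer in character, using only the provided context."
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "Context information is below.\n"
            "---------------------\n"
            "{context_str}\n"
            "---------------------\n"
            "Given the context and not prior knowledge, answer the query.\n"
            "Query: {query_str}\n"
        ),
    ),
]
text_qa_template = ChatPromptTemplate(chat_qa_msgs)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Plain query engine: template is forwarded to its response synthesizer.
query_engine = index.as_query_engine(text_qa_template=text_qa_template)

# Sub-question query engine: give the final synthesis step the same persona template.
sub_question_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[
        QueryEngineTool(
            query_engine=query_engine,
            metadata=ToolMetadata(name="docs", description="Project documents"),
        )
    ],
    response_synthesizer=get_response_synthesizer(text_qa_template=text_qa_template),
)

print(sub_question_engine.query("What happened in 1720?"))
```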
Hi @Logan M, thank you so much. This definitely helped! Btw, since we're using a chat model, is there a way to incorporate memory into the chat_qa_prompt? Currently it only takes the context_str and query_str variables. I was wondering whether we could use LangChain's ConversationBufferMemory and include the chat history in the chat_qa_prompt?
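Something along these lines is what I was picturing, a rough sketch assuming ConversationBufferMemory's load_memory_variables/save_context and partial_format on the template (the {chat_history} variable and the ask helper are my own additions, not built-ins):

```python
from langchain.memory import ConversationBufferMemory
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

memory = ConversationBufferMemory()  # keeps the running chat history as plain text

# {chat_history} is a custom template variable alongside the required
# {context_str} and {query_str}.
chat_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "You are Captain Morgan, a cheerful pirate historian.\n"
            "Previous conversation:\n{chat_history}"
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "Context information is below.\n"
            "---------------------\n"
            "{context_str}\n"
            "---------------------\n"
            "Query: {query_str}\n"
        ),
    ),
]
chat_qa_template = ChatPromptTemplate(chat_qa_msgs)


def ask(index, question: str) -> str:
    # Fill {chat_history} up front so the response synthesizer only has to
    # supply {context_str} and {query_str} at query time.
    history = memory.load_memory_variables({})["history"]
    qa_with_history = chat_qa_template.partial_format(chat_history=history)

    query_engine = index.as_query_engine(text_qa_template=qa_with_history)
    answer = str(query_engine.query(question))

    # Record the turn so the next question sees it in {chat_history}.
    memory.save_context({"input": question}, {"output": answer})
    return answer
```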