A few questions about prompts in query engines and agents:
- text_qa_template - if I'm using an OpenAI chat model, can I use a ChatPromptTemplate (and its format_messages) as input to this? The examples I've seen just pass a single huge string prompt with context_str and query_str variables. I'd like to use separate system and user messages with my own custom variables (which I format and fill in before passing the template as text_qa_template when creating the query engine and its tool), in addition to the context and query variables. There's a rough sketch of what I mean after this list.
- How do I add prompts to agents? I see there are prefix_messages and system_prompt, and I'm not sure which to use. Does prefix_messages basically get prepended before every call? The second sketch below shows the two options I'm comparing.
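
For the first question, here's roughly what I'm trying to do (just a sketch, not working code I've verified: the import paths, the partial_format helper, and the product_name variable are my assumptions and may differ by LlamaIndex version):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.prompts import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

# System + user messages instead of one big string; context_str and
# query_str are the standard template variables, product_name is my own.
qa_messages = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "You are a support assistant for {product_name}. "
            "Answer only from the given context."
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content="Context:\n{context_str}\n\nQuestion: {query_str}\n",
    ),
]
qa_template = ChatPromptTemplate(message_templates=qa_messages)

# Pre-fill my custom variable, leaving context_str / query_str
# for the query engine to fill at query time.
qa_template = qa_template.partial_format(product_name="Acme Widgets")

index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("data").load_data()
)
query_engine = index.as_query_engine(text_qa_template=qa_template)
```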
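
And for the agent question, this is the choice I'm unsure about (again only a sketch; the OpenAIAgent import path depends on version, and the tool name/description are placeholders):

```python
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name="docs",
    description="Answers questions about the product docs.",
)

# Option A: a single system prompt string.
agent_a = OpenAIAgent.from_tools(
    [tool],
    system_prompt="You are a terse support agent.",
)

# Option B: explicit prefix messages. Do these get prepended to every call?
agent_b = OpenAIAgent.from_tools(
    [tool],
    prefix_messages=[
        ChatMessage(
            role=MessageRole.SYSTEM,
            content="You are a terse support agent.",
        )
    ],
)
```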