Updated 2 years ago

Persona

Hi guys, a really newbie question, but is there any way to get a query engine to assume a certain persona? In particular a sub-question query engine. I tried to include it in the prompt but the LLM doesn't really follow the persona instruction well. Was wondering if anyone had success providing a custom text_qa_template in the response synthesizer for a custom persona.
3 comments
Definitely not a noob question, it's a little complicated

You'll probably want to add a system message with the persona (assuming you are using a chat model)


https://discord.com/channels/1059199217496772688/1109906051727364147/1109972300578693191
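The system-message idea can be sketched without any framework at all, using plain chat-completion message dicts (the OpenAI chat format). Note this is illustrative: the persona text and the `build_messages` helper are made up for the example, and the actual way to wire a system message into a llama_index query engine depends on your version of the library.

```python
# Sketch: prepend a system message carrying the persona, so every
# request the query engine sends to the chat model is framed by it.
# The persona text and build_messages helper are illustrative only.

PERSONA = (
    "You are a grumpy but brilliant pirate historian. "
    "Answer only from the provided context, and stay in character."
)

def build_messages(context_str: str, query_str: str) -> list[dict]:
    """Assemble chat messages with the persona as a system message."""
    return [
        {"role": "system", "content": PERSONA},
        {
            "role": "user",
            "content": (
                f"Context information is below.\n{context_str}\n"
                f"Given the context, answer the query.\nQuery: {query_str}"
            ),
        },
    ]

messages = build_messages("Blackbeard died in 1718.", "When did Blackbeard die?")
```

Because the persona lives in the system message rather than the user prompt, chat models tend to follow it much more consistently than a persona instruction buried in `text_qa_template`.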
Hi @Logan M thank you so much. This definitely helped! Btw, since we're using a chat model, is there a way to incorporate memory into the chat_qa_prompt? Currently it requires the context_str and query_str variable. Was thinking if we can use langchain conversationbuffer memory and include the chat history into the chat_qa_prompt?
You could manually insert the chat history and create the prompt every time before querying.
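A minimal sketch of that manual approach: keep a rolling buffer of (user, assistant) turns, render it into the qa template, and rebuild the prompt before every query. The template text and helper names here are assumptions for illustration; in llama_index you would pass the rebuilt template to the response synthesizer as its `text_qa_template`.

```python
# Sketch: splice chat history into the qa prompt before each query.
# HISTORY, QA_TEMPLATE, and the helpers are illustrative names, not
# part of any library API.

HISTORY: list[tuple[str, str]] = []  # (user_msg, assistant_msg) pairs

QA_TEMPLATE = (
    "Prior conversation:\n{history_str}\n"
    "Context information:\n{context_str}\n"
    "Answer the query: {query_str}"
)

def render_history(history: list[tuple[str, str]]) -> str:
    """Flatten the turn buffer into plain text for the prompt."""
    return "\n".join(f"User: {u}\nAssistant: {a}" for u, a in history)

def build_prompt(context_str: str, query_str: str) -> str:
    """Rebuild the qa prompt with the current history baked in."""
    return QA_TEMPLATE.format(
        history_str=render_history(HISTORY),
        context_str=context_str,
        query_str=query_str,
    )

HISTORY.append(("Hi there!", "Ahoy, what do ye want to know?"))
prompt = build_prompt("Blackbeard died in 1718.", "When did Blackbeard die?")
```

The trade-off is that history grows without bound, so in practice you would truncate or summarize old turns, which is roughly what a conversation-buffer memory does for you.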

It's not quite the same, but you could also look into using our chat engine stuff too

https://gpt-index.readthedocs.io/en/latest/how_to/chat_engine/root.html