Hey! I think you can actually still do this with recent versions of llama_index (it's just a little less convenient tbh -- there could probably be a better interface for this).
Here are the current general default prompts:
https://github.com/jerryjliu/llama_index/blob/main/gpt_index/prompts/default_prompts.py
And here are the ones optimized specifically for ChatGPT:
https://github.com/jerryjliu/llama_index/blob/main/gpt_index/prompts/chat_prompts.py
With these two files in mind, we can create custom prompts. I'll assume you are using a vector or list index for this example (this is slightly untested, but should work):
from langchain.prompts.chat import (
ChatPromptTemplate,
HumanMessagePromptTemplate,
SystemMessagePromptTemplate
)
from llama_index.prompts.prompts import QuestionAnswerPrompt, RefinePrompt
from llama_index.prompts.chat_prompts import CHAT_REFINE_PROMPT_TMPL_MSGS
from llama_index.prompts.default_prompts import DEFAULT_TEXT_QA_PROMPT_TMPL
my_prepend_messages = [SystemMessagePromptTemplate.from_template("my prepended system message here")]
# concat two lists -- CHAT_REFINE_PROMPT_TMPL_MSGS is already a list of message templates
langchain_refine_template = ChatPromptTemplate.from_messages(my_prepend_messages + CHAT_REFINE_PROMPT_TMPL_MSGS)
llama_refine_template = RefinePrompt.from_langchain_prompt(langchain_refine_template)
# DEFAULT_TEXT_QA_PROMPT_TMPL is a plain string, not a list -- wrap it in a message template and a one-element list on the fly
langchain_qa_template = ChatPromptTemplate.from_messages(my_prepend_messages + [HumanMessagePromptTemplate.from_template(DEFAULT_TEXT_QA_PROMPT_TMPL)])
llama_qa_template = QuestionAnswerPrompt.from_langchain_prompt(langchain_qa_template)
....
index.query("my query", text_qa_template=llama_qa_template, refine_template=llama_refine_template)
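If it helps to see why the list concatenation works: a chat prompt template is just an ordered list of message templates, so prepending a system message means it gets rendered first. Here's a dependency-free sketch of that mechanic -- the Message class and format_chat function are stand-ins I made up, not the real langchain API:

```python
# Sketch of the prepend-by-concatenation mechanic (stand-in classes,
# not the real langchain ChatPromptTemplate API).
from dataclasses import dataclass

@dataclass
class Message:
    role: str
    template: str

    def render(self, **kwargs) -> str:
        # Fill in template variables like {context_str} / {query_str}
        return f"{self.role}: {self.template.format(**kwargs)}"

def format_chat(messages, **kwargs) -> str:
    # Render every message in list order -- prepended ones come out first
    return "\n".join(m.render(**kwargs) for m in messages)

my_prepend_messages = [Message("system", "my prepended system message here")]
default_qa_messages = [Message("human", "Context: {context_str}\nQuestion: {query_str}")]

prompt = format_chat(my_prepend_messages + default_qa_messages,
                     context_str="(retrieved nodes)", query_str="my query")
print(prompt)
```

The real classes do more (roles, partial variables, etc.), but the ordering logic is the same, which is why a plain `+` on the lists is enough.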