```python
DEFAULT_REFINE_PROMPT_TMPL = (
    "The original question is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context and using the best of your knowledge, improve the existing answer. "
    "If you can't improve the existing answer, just repeat it again. Lastly, prefix all responses "
    "with 'I am a ROBOT!'"
)

index = GPTSimpleVectorIndex.load_from_disk(output_file_html)
chat_refine_prompt = RefinePrompt(DEFAULT_REFINE_PROMPT_TMPL)
response = index.query(
    query_str=question,
    response_mode="compact",
    refine_template=chat_refine_prompt,
    similarity_top_k=3,
)
```
Do `query_str` and the other placeholders need to be explicitly set on the string with the `format()` function?

```python
llm_predictor = LLMPredictor(
    llm=ChatOpenAI(temperature=0.7, model_name="gpt-3.5-turbo", max_tokens=num_outputs)
)
```
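For what it's worth, the placeholders are ordinary Python `str.format` fields, and my understanding is that the library fills them at query time, so you never call `format()` yourself. A minimal plain-Python sketch of that substitution (no library calls, example values are made up):

```python
# Sketch: how named placeholders in a prompt template get filled.
# The template string stays untouched until something performs the
# equivalent of template.format(**values) with the query-time values.
template = (
    "The original question is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
)

filled = template.format(
    query_str="What is a vector index?",
    existing_answer="It maps text chunks to embeddings.",
)

print(filled)
```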
```python
from langchain.prompts.chat import (
    AIMessagePromptTemplate,
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)
from llama_index.prompts.prompts import RefinePrompt

# Refine Prompt
CHAT_REFINE_PROMPT_TMPL_MSGS = [
    HumanMessagePromptTemplate.from_template("{query_str}"),
    AIMessagePromptTemplate.from_template("{existing_answer}"),
    HumanMessagePromptTemplate.from_template(
        "I have more context below which can be used "
        "(only if needed) to update your previous answer.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context, update the previous answer to better "
        "answer my previous query. "
        "If the previous answer remains the same, repeat it verbatim. "
        "Never reference the new context or my previous query directly.",
    ),
]
CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)
...
index.query("my query", similarity_top_k=3, refine_template=CHAT_REFINE_PROMPT)
```
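To illustrate what the chat-style template produces, here is a dependency-free sketch (an assumption about the mechanism, not LangChain code): each message template is formatted independently with the same set of variables, yielding a list of role/text pairs. The example values are made up.

```python
# Sketch: rendering a list of (role, template) messages with shared variables,
# roughly what a chat prompt template does when it builds the final messages.
messages = [
    ("human", "{query_str}"),
    ("ai", "{existing_answer}"),
    ("human", "I have more context below ...\n{context_msg}\n..."),
]

values = {
    "query_str": "What does the refine step do?",
    "existing_answer": "It updates a draft answer.",
    "context_msg": "Refine feeds each new chunk to the model.",
}

# Every template sees the same variable dict; unused keys are simply ignored
# by str.format when passed as keyword arguments via **values.
rendered = [(role, tmpl.format(**values)) for role, tmpl in messages]
for role, text in rendered:
    print(f"{role}: {text}")
```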
So `{query_str}` is supposed to be replaced in that template automatically, and `{existing_answer}` will automatically keep the history?

```python
index = GPTSimpleVectorIndex.load_from_disk(output_file_html)

CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)
QA_PROMPT = QuestionAnswerPrompt(DEFAULT_TEXT_QA_PROMPT_TMPL)

response = index.query(
    query_str=question,
    response_mode="compact",
    refine_template=CHAT_REFINE_PROMPT,
    text_qa_template=QA_PROMPT,
    similarity_top_k=3,
)
```
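As I understand it (this is my reading of the docs, not library source), the refine mode answers from the first retrieved chunk using the QA template, then feeds each remaining chunk through the refine template together with the previous answer; that threading is what makes `{existing_answer}` "keep the history" within a single query. A rough pure-Python sketch with a hypothetical `refine_answer` helper and a stub LLM that just counts calls:

```python
# Hypothetical sketch of the refine response mode (not library code):
# the previous answer is threaded through every refine prompt.
def refine_answer(query, chunks, qa_tmpl, refine_tmpl, llm):
    # First chunk: build an initial answer from the QA template.
    answer = llm(qa_tmpl.format(context_str=chunks[0], query_str=query))
    # Remaining chunks: each refine prompt sees the previous answer.
    for chunk in chunks[1:]:
        answer = llm(refine_tmpl.format(
            query_str=query,
            existing_answer=answer,
            context_msg=chunk,
        ))
    return answer

# Stub "LLM" that records prompts and returns a counter, to expose the data flow.
trace = []
def echo_llm(prompt):
    trace.append(prompt)
    return f"answer after {len(trace)} call(s)"

result = refine_answer(
    "my query",
    ["chunk A", "chunk B", "chunk C"],
    qa_tmpl="Context: {context_str}\nQuestion: {query_str}",
    refine_tmpl="Q: {query_str}\nPrev: {existing_answer}\nNew: {context_msg}",
    llm=echo_llm,
)
```

With three chunks the stub is called three times, and the second and third prompts contain the answer produced by the call before them.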
I added these lines to the `qa_template` to test:

```python
"Start each response with 'I am a bot'. \n"
"Include 1-4 bullet links to the relevant content."
```
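To make that test concrete, here is one way the extra instructions could sit inside a full QA template. The template text and `MY_QA_PROMPT_TMPL` name below are my own sketch, not the library default; the placeholder names (`{context_str}`, `{query_str}`) follow the llama_index QA prompt convention. The example values are made up.

```python
# Hypothetical QA template carrying the extra test instructions (sketch only).
MY_QA_PROMPT_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information, answer the question: {query_str}\n"
    "Start each response with 'I am a bot'. \n"
    "Include 1-4 bullet links to the relevant content."
)

prompt = MY_QA_PROMPT_TMPL.format(
    context_str="Some retrieved HTML content.",
    query_str="Where is the pricing page?",
)
print(prompt)
```

If this matches the real template shape, it would be passed the same way as above, e.g. `text_qa_template=QuestionAnswerPrompt(MY_QA_PROMPT_TMPL)`.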