My responses keep starting with commentary like:

"The new context does not provide any additional information, so the original answer remains the same."

This messes it up when I try loading the response into a JSON file. Do you know how to remove this additional commentary it keeps returning?

You can customize the refine template to tell the model to repeat the answer verbatim and never mention the new context:

```python
from langchain.prompts.chat import (
    AIMessagePromptTemplate,
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)
from llama_index.prompts.prompts import RefinePrompt

# Refine prompt: instruct the model to repeat an unchanged answer
# verbatim and never reference the new context or the query itself.
CHAT_REFINE_PROMPT_TMPL_MSGS = [
    HumanMessagePromptTemplate.from_template("{query_str}"),
    AIMessagePromptTemplate.from_template("{existing_answer}"),
    HumanMessagePromptTemplate.from_template(
        "I have more context below which can be used "
        "(only if needed) to update your previous answer.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context, update the previous answer to better "
        "answer my previous query. "
        "If the previous answer remains the same, repeat it verbatim. "
        "Never reference the new context or my previous query directly."
    ),
]

CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)

...
query_engine = index.as_query_engine(..., refine_template=CHAT_REFINE_PROMPT)
```
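If the model still occasionally wraps the JSON in commentary despite the refined prompt, a post-processing step can strip it before parsing. A minimal sketch, assuming the payload is a single JSON object (the helper name `extract_json` is illustrative, not part of LlamaIndex):

```python
import json

def extract_json(response_text: str) -> dict:
    """Parse only the outermost {...} span in the response,
    ignoring any leading or trailing commentary the model added."""
    start = response_text.find("{")
    end = response_text.rfind("}")
    if start == -1 or end < start:
        raise ValueError("No JSON object found in response")
    return json.loads(response_text[start : end + 1])

raw = (
    "The new context does not provide any additional information. "
    '{"answer": "42", "source": "doc1"}'
)
print(extract_json(raw))  # {'answer': '42', 'source': 'doc1'}
```

This is more robust than prompt instructions alone, since it tolerates commentary on either side of the JSON.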
For a composable graph, pass the refine template through each index type's query config:

```python
# Set query config: attach the custom refine template to every index type.
query_configs = [
    {
        "index_struct_type": "simple_dict",
        "query_mode": "default",
        "query_kwargs": {
            "similarity_top_k": 1,
            "response_mode": "compact",
            "refine_template": CHAT_REFINE_PROMPT,
        },
    },
    {
        "index_struct_type": "list",
        "query_mode": "default",
        "query_kwargs": {
            "response_mode": "tree_summarize",
            "use_async": True,
            "verbose": True,
            "refine_template": CHAT_REFINE_PROMPT,
        },
    },
    {
        "index_struct_type": "tree",
        "query_mode": "default",
        "query_kwargs": {
            "verbose": True,
            "refine_template": CHAT_REFINE_PROMPT,
        },
    },
]

try:
    print("BEFORE Querying Graph.query: ", graph)
    response = graph.query(
        query_str=query_str,
        query_configs=query_configs,
        service_context=service_context_chatgpt,
    )
    ...
```
Can I define CHAT_REFINE_PROMPT_TMPL_MSGS globally, or do I have to define my query_str first and then define CHAT_REFINE_PROMPT_TMPL_MSGS for each separate query?
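Since the template messages only contain placeholders (`{query_str}`, `{existing_answer}`, `{context_msg}`) that are filled in at query time, the template can normally be defined once at module level and reused across queries. A minimal sketch of the substitution mechanics using plain `str.format` (LlamaIndex performs the actual filling internally; the names below are illustrative):

```python
# Define the template once, globally; placeholders are filled per query.
REFINE_TEMPLATE = (
    "I have more context below which can be used "
    "(only if needed) to update your previous answer.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context, update the previous answer to better "
    "answer my previous query."
)

# Each query substitutes its own values into the same global template.
for context in ["Context for query A", "Context for query B"]:
    prompt = REFINE_TEMPLATE.format(context_msg=context)
    print(prompt.splitlines()[2])  # prints "Context for query A", then "Context for query B"
```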