How to get the completed prompt? LlamaIndex uses a set of default prompt templates.
To get the prompts from the query engine, I do this:
```python
# define prompt viewing function
from IPython.display import Markdown, display

def display_prompt_dict(prompts_dict):
    for k, p in prompts_dict.items():
        text_md = f"**Prompt Key**: {k}<br>" f"**Text:** <br>"
        display(Markdown(text_md))
        print(p.get_template())
        display(Markdown("<br><br>"))

prompts_dict = query_engine.get_prompts()
display_prompt_dict(prompts_dict)
```
which gives me this view of the prompts:
**Prompt Key:** response_synthesizer:text_qa_template
**Text:**
Context information is below.
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge, answer the query.
Query: {query_str}
Answer:
**Prompt Key:** response_synthesizer:refine_template
**Text:**
The original query is as follows: {query_str}
We have provided an existing answer: {existing_answer}
We have the opportunity to refine the existing answer (only if needed) with some more context below.
------------
{context_msg}
------------
Given the new context, refine the original answer to better answer the query. If the context isn't useful, return the original answer.
Refined Answer:
=> Is it possible to get the fully completed prompt, i.e. with `context_msg` and `query_str` actually filled in?
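(For what it's worth, one rough way to preview what a completed prompt would look like is to fill the placeholders in the template text yourself with plain `str.format`. This is just a sketch, not the library's own mechanism, and the context/query strings below are made-up placeholders:)

```python
# Hedged sketch: manually filling the text_qa_template placeholders to
# preview a "completed" prompt. `template_text` is the string that
# p.get_template() printed above; the values passed in are invented.
template_text = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

completed = template_text.format(
    context_str="LlamaIndex ships with default prompt templates.",
    query_str="What templates does LlamaIndex use by default?",
)
print(completed)
```

This only shows what the prompt *would* look like for given inputs; it does not capture the actual context the query engine retrieved at runtime, which is what the question is really after.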