SimpleInputPrompt is not working
from llama_index.prompts.prompts import SimpleInputPrompt

DEFAULT_SIMPLE_INPUT_TMPL = (
    "{query_str} \n"
    "by using words 'permission'"
)
DEFAULT_SIMPLE_INPUT_PROMPT = SimpleInputPrompt(DEFAULT_SIMPLE_INPUT_TMPL)
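For reference, the template above is plain str.format-style text with query_str as its only variable; a minimal sketch (standard library only, no llama_index needed, with a made-up query) of what it renders to:

```python
# Minimal sketch of what the template renders to, using plain str.format.
# {query_str} is the only template variable; the query string is hypothetical.
DEFAULT_SIMPLE_INPUT_TMPL = (
    "{query_str} \n"
    "by using words 'permission'"
)

rendered = DEFAULT_SIMPLE_INPUT_TMPL.format(query_str="What are the access rules?")
print(rendered)
```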
retriever = VectorIndexRetriever(
    index=index,
    similarity_top_k=10,
    vector_store_query_mode=VectorStoreQueryMode.HYBRID,
)
response_synthesizer = ResponseSynthesizer.from_args(
    streaming=True,
    service_context=service_context,
    simple_template=DEFAULT_SIMPLE_INPUT_PROMPT,
)
query_engine = RetrieverQueryEngine(
    retriever=retriever,
    response_synthesizer=response_synthesizer,
)

# I also tried building the query engine from the index directly:
query_engine = index.as_query_engine(
    streaming=True,
    simple_template=DEFAULT_SIMPLE_INPUT_PROMPT,
)
response = query_engine.query(query_str)
In both cases the output is unchanged: the SimpleInputPrompt is not being used. I verified this by setting the logging level to DEBUG:

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)

The logs show that the default prompt is being used.
LOGGING:
DEBUG:llama_index.indices.response.response_builder:> Initial prompt template: Context information is below.
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge, answer the question: {query_str}
I inspected the response synthesizer's response builder: it has refine_template and text_qa_template attributes, but no simple_template attribute.

print(vars(response_synthesizer._response_builder))
{'_service_context': ,
'_streaming': ,
'text_qa_template': <llama_index.prompts.prompts.QuestionAnswerPrompt at 0x7fde16b0dfc0>,
'_refine_template': <llama_index.prompts.prompts.SimpleInputPrompt at 0x7fde09d6bd90>}
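My guess (an assumption, not confirmed against the llama_index source): a from_args-style factory that accepts **kwargs silently drops keyword arguments it does not recognize, so an unsupported simple_template keyword would be ignored without raising any error. A hypothetical sketch of that failure mode:

```python
# Hypothetical sketch (NOT the actual llama_index code): a factory taking
# **kwargs drops unrecognized keywords such as simple_template, so the
# default template is used and no error is raised.
DEFAULT_QA_TMPL = "Given the context information ... answer the question: {query_str}"

def from_args(text_qa_template=None, **kwargs):
    # simple_template lands in kwargs here and is never read.
    return text_qa_template if text_qa_template is not None else DEFAULT_QA_TMPL

template = from_args(simple_template="{query_str} \nby using words 'permission'")
print(template)  # falls back to the default template
```

If that is what is happening, it would explain why the DEBUG log still shows the default prompt and why vars() shows no simple_template attribute.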
Can anyone please help me with this?