I'm using `HuggingFaceLLMPredictor` with `StabilityAI/stablelm-tuned-alpha-3b`. I'm currently using this prompt, since StableLM requires this format:
```python
query_wrapper_prompt = SimpleInputPrompt(
    "<|SYSTEM|>Below is an instruction that describes a task. "
    "Write a response that adequately completes the request.\n\n"
    "<|USER|>{query_str}\n<|ASSISTANT|>"
)
```
My question is: can the template take more input variables than just `query_str`? For example, could I provide the retrieved context separately and then the user query?
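To illustrate what I mean, here is a plain-Python sketch of the two-variable formatting I have in mind. The variable names `context_str` and `query_str` follow the convention I've seen in `text_qa_template` examples; whether `HuggingFaceLLMPredictor` actually accepts a wrapper prompt with two variables is exactly what I'm asking.

```python
# Hypothetical two-slot StableLM prompt: one slot for retrieved context,
# one for the user's question. This only demonstrates the string
# formatting, not the actual HuggingFaceLLMPredictor API.
TWO_VAR_TEMPLATE = (
    "<|SYSTEM|>Below is an instruction that describes a task. "
    "Write a response that adequately completes the request.\n\n"
    "<|USER|>Context:\n{context_str}\n\n"
    "Question: {query_str}\n<|ASSISTANT|>"
)

prompt = TWO_VAR_TEMPLATE.format(
    context_str="Paris is the capital of France.",
    query_str="What is the capital of France?",
)
print(prompt)
```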
Also, if I'm using `HuggingFaceLLMPredictor` and I pass a `text_qa_template` when creating the query engine instance, will it make any difference?