To set a `system_prompt` and use it with `as_query_engine()`, first define your `system_prompt` and `query_wrapper_prompt` using the `PromptTemplate` class from `llama_index.prompts`. Here is an example of how you can do this:
```python
from llama_index.prompts import PromptTemplate

system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
"""

query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")
```
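To see what the wrapper template does, here is a minimal sketch using plain Python string formatting (no llama_index required): each user query is substituted into `{query_str}` and surrounded by StableLM's special chat tokens before it reaches the model.

```python
# Sketch of the wrapping behavior of query_wrapper_prompt:
# the query is substituted into {query_str} and surrounded by
# StableLM's chat tokens.
template = "<|USER|>{query_str}<|ASSISTANT|>"
wrapped = template.format(query_str="What did the author do growing up?")
print(wrapped)  # <|USER|>What did the author do growing up?<|ASSISTANT|>
```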
However, it's important to note that `as_query_engine()` does not directly accept `system_prompt` or `query_wrapper_prompt` as parameters. Instead, these prompts are used when defining the behavior of the language model you are using.
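For instance, with a local HuggingFace model, both prompts are typically attached to the LLM wrapper itself, and a service context is then built from it. The sketch below follows the pattern used by llama_index versions that expose `HuggingFaceLLM` and `ServiceContext`; the model name, context window, and generation settings are illustrative assumptions, not requirements:

```python
from llama_index import ServiceContext
from llama_index.llms import HuggingFaceLLM

# The system_prompt and query_wrapper_prompt defined earlier are attached
# to the LLM itself, not to the query engine.
llm = HuggingFaceLLM(
    context_window=4096,           # assumption: StableLM's context size
    max_new_tokens=256,            # illustrative generation limit
    system_prompt=system_prompt,
    query_wrapper_prompt=query_wrapper_prompt,
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
    device_map="auto",
)

# The service context carries the configured LLM into the index and,
# from there, into any query engine built on it.
service_context = ServiceContext.from_defaults(llm=llm, chunk_size=1024)
```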
For example, when calling `as_query_engine()`, you can specify a `text_qa_template` or `refine_template`, which could incorporate the `system_prompt` or `query_wrapper_prompt` you defined.
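As a sketch of that approach, assuming a llama_index version where `as_query_engine()` accepts a `text_qa_template` keyword, you could fold the system instructions into the QA prompt yourself. The template text below is an illustrative assumption, not the library's built-in default:

```python
from llama_index.prompts import PromptTemplate

# Hypothetical QA template that prepends the system instructions to the
# usual context/question layout expected by text_qa_template.
qa_template = PromptTemplate(
    system_prompt
    + "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

query_engine = index.as_query_engine(text_qa_template=qa_template)
```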
Here is an example of how you might use `as_query_engine()`:
```python
query_engine = index.as_query_engine(
    service_context=service_context,
    similarity_top_k=3,
    streaming=True,
)
response = query_engine.query(
    "What did the author do growing up?",
)
```
In this example, the `service_context` contains the information about the language model and its behavior, which could include the use of the `system_prompt` and `query_wrapper_prompt`.