Updated last year

How do you set a system_prompt for queries when using as_query_engine()?

@kapa.ai how do you set a system_prompt for queries when using as_query_engine()?
Please stand by @holodeck as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
To set a system_prompt and use it with as_query_engine(), first define your system_prompt and query_wrapper_prompt using PromptTemplate from llama_index.prompts. For example:

Python
from llama_index.prompts import PromptTemplate

system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
"""

query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")


However, note that as_query_engine() does not accept system_prompt or query_wrapper_prompt as parameters. These prompts are instead attached to the language model itself when it is defined, and they reach the query engine through that model's configuration.

Separately, as_query_engine() does accept a text_qa_template or refine_template, which control how the retrieved context and the query are presented to the model and can incorporate similar instructions.
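As a sketch of that wiring (assuming a legacy 0.x llama_index install with the HuggingFaceLLM integration; the model names follow the StableLM example from the docs of that era and may need changing for your setup):

```python
# Sketch only -- assumes a legacy 0.x llama_index install with the
# HuggingFaceLLM integration; model/tokenizer names follow the StableLM
# example and are placeholders for whatever model you actually use.
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import HuggingFaceLLM
from llama_index.prompts import PromptTemplate

system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model.
"""
query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")

# The prompts attach to the LLM, not to as_query_engine():
llm = HuggingFaceLLM(
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    system_prompt=system_prompt,
    query_wrapper_prompt=query_wrapper_prompt,
)

# The service context carries the LLM (and its prompts) into the index.
service_context = ServiceContext.from_defaults(llm=llm)
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine(similarity_top_k=3)
```

Because the prompts live on the LLM, every query made through this engine is framed by them automatically.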

Here is an example of how you might use as_query_engine():

Python
query_engine = index.as_query_engine(
    service_context=service_context,
    similarity_top_k=3,
    streaming=True,
)
response = query_engine.query(
    "What did the author do growing up?",
)


In this example, the service_context carries the language model and its configuration, which is where the system_prompt and query_wrapper_prompt you defined would take effect.
For more detailed information, you can refer to the LlamaIndex documentation and the LlamaIndex GitHub repository.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa - define a simple query scenario that uses system_prompt
@kapa.ai - can you please define a simple query scenario that uses system_prompt
Let me check the knowledge base for you. One moment! :books:
----------------------
Sure, here's a simple scenario where you might use a system_prompt with LlamaIndex. In this example, we're using a hypothetical language model from HuggingFace, and we're setting up a system_prompt and a query_wrapper_prompt to customize the behavior of the model.

First, we define the system_prompt and query_wrapper_prompt:

Python
from llama_index.prompts import PromptTemplate

system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
"""

query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")


Next, these prompts would be passed to the language model when it is constructed. The provided extracts do not include a direct end-to-end example of a query that uses system_prompt. The system_prompt supplies initial context or instructions to the language model, and the exact wiring depends on the model integration you are using.
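To make the mechanics concrete without needing a model, here is a small pure-Python sketch (the helper name build_model_input is hypothetical, not a LlamaIndex API) of how the system_prompt and query_wrapper_prompt combine into the text the model actually receives:

```python
# Hypothetical illustration, not a LlamaIndex API: shows how the two
# prompts combine into the final string handed to the model.
system_prompt = "<|SYSTEM|># StableLM Tuned (Alpha version)\n"
query_wrapper_template = "<|USER|>{query_str}<|ASSISTANT|>"

def build_model_input(query_str: str) -> str:
    # The system prompt is prepended once; the wrapper template frames
    # each individual query for the model's chat format.
    return system_prompt + query_wrapper_template.format(query_str=query_str)

print(build_model_input("What did the author do growing up?"))
# -> <|SYSTEM|># StableLM Tuned (Alpha version)
#    <|USER|>What did the author do growing up?<|ASSISTANT|>
```

The query engine does this kind of assembly for you; the sketch only shows why the prompts belong to the model rather than to as_query_engine().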

For more detailed information, you can refer to the LlamaIndex documentation and the LlamaIndex GitHub repository.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------