----------------------
The default system and user prompts are not explicitly mentioned in the provided extracts. However, an example of a system prompt and a user prompt is given in the context of using a HuggingFace LLM. Here's the example:
```python
from llama_index.prompts import PromptTemplate

system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
"""

# This will wrap the default prompts that are internal to llama-index
query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")
```
(Source: LlamaIndex Documentation)
In this example, the `system_prompt` is a preamble that sets the context for the LLM, and the `query_wrapper_prompt` is a template for wrapping user queries. The `{query_str}` placeholder in the `query_wrapper_prompt` is replaced with the actual user query when the prompt is used.
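A minimal sketch of that substitution, using plain `str.format` as a stand-in for `PromptTemplate` (the template string uses ordinary Python format-string syntax, so this illustrates the same behavior without requiring llama-index to be installed):

```python
# Stand-in for PromptTemplate: the wrapper is an ordinary Python
# format string, so str.format shows how {query_str} is filled.
query_wrapper_template = "<|USER|>{query_str}<|ASSISTANT|>"

# Filling the placeholder with an actual user query:
prompt = query_wrapper_template.format(query_str="What is LlamaIndex?")
print(prompt)  # <|USER|>What is LlamaIndex?<|ASSISTANT|>
```

With `PromptTemplate` itself, the equivalent call would be `query_wrapper_prompt.format(query_str=...)`.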
For the default prompts used by LlamaIndex, you can refer to the default prompt templates and the chat-specific prompts as mentioned in the extracts.
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------