Please give the default system prompt and user prompt.

----------------------
The default system and user prompts are not explicitly mentioned in the provided extracts. However, an example of a system prompt and a user prompt is given in the context of using a HuggingFace LLM. Here's the example:

Plain Text
from llama_index.prompts import PromptTemplate

system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
"""

# This will wrap the default prompts that are internal to llama-index
query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")

(Source: LlamaIndex Documentation)

In this example, the system_prompt is a preamble that sets the context for the LLM, and the query_wrapper_prompt is a template for wrapping user queries. The {query_str} placeholder in the query_wrapper_prompt is replaced with the actual user query when the prompt is used.
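To illustrate how the two pieces fit together, here is a sketch of passing both prompts to a HuggingFaceLLM and filling the placeholder by hand. This part is not from the extracts: the keyword arguments and the model/tokenizer names are assumptions based on the same StableLM example and a pre-0.10 llama-index layout (matching the import above), so they may differ in your version.

Plain Text
from llama_index.llms import HuggingFaceLLM

# Sketch only: reuses system_prompt and query_wrapper_prompt from the block above.
# Argument names and model/tokenizer choices are assumptions, not confirmed by the extracts.
llm = HuggingFaceLLM(
    context_window=4096,
    max_new_tokens=256,
    system_prompt=system_prompt,
    query_wrapper_prompt=query_wrapper_prompt,
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
    device_map="auto",
)

# The wrapper template can also be filled directly to see the final prompt text:
print(query_wrapper_prompt.format(query_str="What does LlamaIndex do?"))
# -> <|USER|>What does LlamaIndex do?<|ASSISTANT|>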

For the default prompts used by LlamaIndex, you can refer to the default prompt templates and the chat-specific prompts mentioned in the extracts.
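If you want to inspect or override those defaults in code, something like the following is a common pattern. Treat the module path and the text_qa_template keyword as assumptions for a pre-0.10 llama-index layout (matching the imports above); they may have moved in newer releases, and index is assumed to be an index you have already built.

Plain Text
# Sketch: inspect a built-in default prompt (module path assumed for pre-0.10 llama-index)
from llama_index.prompts.default_prompts import DEFAULT_TEXT_QA_PROMPT_TMPL
print(DEFAULT_TEXT_QA_PROMPT_TMPL)

# Sketch: override the default QA prompt on a query engine
from llama_index.prompts import PromptTemplate

custom_qa_tmpl = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using only the context above.\n"
    "Query: {query_str}\n"
    "Answer: "
)

# `index` is assumed to be a VectorStoreIndex (or similar) built earlier.
query_engine = index.as_query_engine(text_qa_template=custom_qa_tmpl)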
----------------------