how to give ServiceContext a system prompt?

@kapa.ai how to give ServiceContext a system prompt?
Let me check the knowledge base for you. One moment! :books:
----------------------
To give a ServiceContext a system prompt, you can define the prompt strings and pass them when creating the ServiceContext. Here's an example from the knowledge sources:

Python
from llama_index import ServiceContext
from llama_index.prompts import PromptTemplate

system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
"""

# This will wrap the default prompts that are internal to llama-index
query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")

# Pass both prompts when creating the ServiceContext
service_context = ServiceContext.from_defaults(
    system_prompt=system_prompt,
    query_wrapper_prompt=query_wrapper_prompt,
)

In this example, system_prompt is a plain string that defines the system prompt, and query_wrapper_prompt is a PromptTemplate that wraps each user query in the model's expected format. Passing both to ServiceContext.from_defaults() makes them available to the underlying LLM for every query.
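
As a usage sketch, the resulting service_context can then be supplied when building an index. The "./data" path and the question below are illustrative placeholders:

Python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Illustrative usage: build an index that uses the service_context above
documents = SimpleDirectoryReader("./data").load_data()  # "./data" is a placeholder path
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

query_engine = index.as_query_engine()
response = query_engine.query("Summarize the documents.")  # example question
print(response)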

Please note that the actual implementation might vary depending on your specific use case and the LLM you are using.
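
For instance, with a local Hugging Face model such as StableLM, the same two prompts can instead be passed to the LLM wrapper itself. A minimal sketch, assuming the StableLM example setup from the docs (model name and generation settings are illustrative):

Python
from llama_index import ServiceContext
from llama_index.llms import HuggingFaceLLM

# Sketch: HuggingFaceLLM also accepts system_prompt and query_wrapper_prompt
# and applies them when formatting each request to the model.
llm = HuggingFaceLLM(
    context_window=4096,
    max_new_tokens=256,
    system_prompt=system_prompt,
    query_wrapper_prompt=query_wrapper_prompt,
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
)

service_context = ServiceContext.from_defaults(llm=llm)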

For more details, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------