
Updated 2 months ago

what is the custom system prompt argument in chat engine

7 comments
Please stand by @shakedbuk as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Based on the extracts provided, there is a specific example of creating a custom system prompt for a chat engine in the context of building a custom agent. It comes from the GPT Builder Demo, where a custom system prompt is created for an agent that can provide information about Toronto. The prompt is generated by a function named create_system_prompt, which takes a task description as its argument and returns a system prompt tailored to that task. Here's the relevant code snippet:

Plain Text
# import path for recent llama_index versions
from llama_index.llms.openai import OpenAI

def create_system_prompt(task: str):
    """Create system prompt for another agent given an input task."""
    llm = OpenAI(model="gpt-4")  # the keyword is `model`, not `llm`
    fmt_messages = GEN_SYS_PROMPT_TMPL.format_messages(task=task)
    response = llm.chat(fmt_messages)
    return response.message.content


This function utilizes a template (GEN_SYS_PROMPT_TMPL), which is formatted with the specific task to generate the system prompt. The template itself is defined earlier in the document but not included in the extracts provided. The function then uses this template to create a prompt that is sent to a chat model (in this case, specified as gpt-4), and the response from this model is considered the system prompt for the agent.
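As a rough illustration of that formatting step (the template string below is hypothetical, standing in for the GEN_SYS_PROMPT_TMPL that the extracts do not show), the substitution amounts to filling the task into a prompt string:

```python
# Hypothetical stand-in for GEN_SYS_PROMPT_TMPL (not shown in the extracts):
# the real template is a chat prompt template, but the formatting step
# reduces to substituting the task into a prompt string.
GEN_SYS_PROMPT_STR = (
    "Task information is given below.\n"
    "Given the task, generate a system prompt for a bot to solve it:\n"
    "{task}\n"
)

def format_task_prompt(task: str) -> str:
    """Substitute a task description into the prompt template."""
    return GEN_SYS_PROMPT_STR.format(task=task)

prompt = format_task_prompt("Tell me about Toronto")
```

The formatted prompt is then what gets sent to the chat model as the user message.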

Unfortunately, the exact argument or structure used within the chat engine to incorporate this custom system prompt is not detailed in the extracts. The extracts focus more on the creation and utilization of custom prompts within the context of LlamaIndex's framework rather than the specific implementation details within a chat engine.
For more detailed information on how to integrate custom system prompts within a chat engine using LlamaIndex, I recommend consulting the official documentation or examples provided by LlamaIndex, which might offer more comprehensive guides or examples relevant to your query.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai what is the "content" in the system prompt
Let me check the knowledge base for you. One moment! :books:
----------------------
In the context of LlamaIndex and the provided extracts, the "content" within a system prompt refers to the textual information or instructions that are part of a ChatMessage object, which is used to construct prompts for interaction with language models. This content is designed to guide the language model in generating responses or performing tasks based on the given instructions or context.

For example, in the GPT Builder Demo extract, the content of a ChatMessage is used to define the role and the specific instructions or information that should be conveyed to the language model. Here's a snippet illustrating how the content is defined within a ChatMessage:

Plain Text
# import path for recent llama_index versions
from llama_index.core.llms import ChatMessage

gen_sys_prompt_messages = [
    ChatMessage(
        role="system",
        content="You are helping to build a system prompt for another bot.",
    ),
    ChatMessage(role="user", content=GEN_SYS_PROMPT_STR),
]


In this example, the content for the first ChatMessage is a string that instructs the language model about its role in the interaction, specifically indicating that it is helping to build a system prompt for another bot. The second ChatMessage uses a variable (GEN_SYS_PROMPT_STR) as its content, which likely contains further instructions or context for generating a system prompt based on a given task.

The content is essentially the message or instruction part of a prompt that is directed towards the language model, guiding it on how to respond or what action to take. This is a crucial component of constructing prompts in LlamaIndex, as it directly influences the behavior and output of the language model.
For more detailed examples or information on how content is used within system prompts in LlamaIndex, you would need to refer to the official documentation or source code, as the provided extracts do not offer further specifics on the implementation or usage beyond these examples.
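Concretely, the content field is the payload that ends up in the role/content pairs most chat APIs consume. A small pure-Python sketch of that mapping (the helper name here is hypothetical, for illustration only):

```python
def to_openai_messages(chat_messages):
    """Convert (role, content) pairs into the dict format chat APIs expect;
    the 'content' field carries the actual instruction or message text."""
    return [{"role": role, "content": content} for role, content in chat_messages]

messages = to_openai_messages([
    ("system", "You are helping to build a system prompt for another bot."),
    ("user", "Here is the task: tell me about Toronto."),
])
```

The role tells the model how to weight the message; the content is what it actually reads.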
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------