LlamaIndex itself provides robust prompt abstractions that capture the common interaction patterns with LLMs. It ships with a set of default prompt templates that work well out of the box; you can inspect the ones a given component is using, as sketched below.
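A minimal sketch of inspecting the defaults, using the get_prompts() method available on query engines and assuming a hypothetical data/ folder with a vector index built over it:

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Build a small index over local documents ("data" is a placeholder folder)
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# List the default prompt templates the query engine is currently using
prompts_dict = query_engine.get_prompts()
for name in prompts_dict:
    print(name)
# Typically includes keys such as
# "response_synthesizer:text_qa_template" and
# "response_synthesizer:refine_template"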
You can also provide your own prompt templates to further customize the behavior of the framework. For example, you can define a prompt template using the ChatPromptTemplate class in LlamaIndex:
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage

prompt = ChatPromptTemplate(
    message_templates=[
        ChatMessage(
            role="system",
            content=(
                "You are an expert assistant for summarizing and "
                "extracting insights from sales call transcripts."
            ),
        ),
        ChatMessage(
            role="user",
            content=(
                "Here is the transcript: \n"
                "------\n"
                "{transcript}\n"
                "------"
            ),
        ),
    ]
)
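Once defined, the template can be filled in and passed to a chat model. A brief sketch of one way to use it, assuming an OpenAI LLM and a transcript string loaded elsewhere:

from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo")  # any chat-capable model would do

# "transcript_text" is a placeholder for the raw call transcript
transcript_text = "Rep: Thanks for joining today...\nCustomer: Glad to be here..."

# Fill in the {transcript} variable and send the resulting messages to the LLM
messages = prompt.format_messages(transcript=transcript_text)
response = llm.chat(messages)
print(response.message.content)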
You can also specify a system prompt to give the agent additional instructions or a personality, as shown in this example:
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI
from llama_index.core.prompts.system import SHAKESPEARE_WRITING_ASSISTANT

llm = OpenAI(model="gpt-3.5-turbo-0613")
agent = OpenAIAgent.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    system_prompt=SHAKESPEARE_WRITING_ASSISTANT,
)
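The example above assumes that multiply_tool and add_tool already exist. A minimal sketch of how they might be created with FunctionTool (this would run before the agent is constructed), followed by a sample call to the agent:

from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

# Wrap plain Python functions as tools the agent can call
multiply_tool = FunctionTool.from_defaults(fn=multiply)
add_tool = FunctionTool.from_defaults(fn=add)

# With the system prompt applied, the agent answers in a Shakespearean voice
response = agent.chat("What is 123 multiplied by 4, plus 5?")
print(response)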