To send a custom prompt to a ReactAgent in LlamaIndex, you can use the `as_chat_engine` method of the index and pass your custom prompt via the `system_prompt` parameter. Here's an example:
```python
from llama_index.prompts.system import SHAKESPEARE_WRITING_ASSISTANT

# `index` is an existing LlamaIndex index (e.g. a VectorStoreIndex)
chat_engine = index.as_chat_engine(
    chat_mode="react",
    system_prompt=SHAKESPEARE_WRITING_ASSISTANT,
    verbose=True,
)
response = chat_engine.chat("Hi, tell me a tragic story")
```
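Note that `SHAKESPEARE_WRITING_ASSISTANT` is just a string constant, so `system_prompt` also accepts any plain string you write yourself. A minimal sketch (the persona text below is an illustrative assumption, not something from LlamaIndex):

```python
# A hand-written persona string used as the system prompt
# (illustrative example; any string works here)
CUSTOM_SYSTEM_PROMPT = (
    "You are a dramatic storyteller. Answer every question "
    "with a short, tragic tale."
)

# Passed the same way as the predefined constant
# (commented out because `index` must already exist):
# chat_engine = index.as_chat_engine(
#     chat_mode="react",
#     system_prompt=CUSTOM_SYSTEM_PROMPT,
#     verbose=True,
# )
```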
In this example, `SHAKESPEARE_WRITING_ASSISTANT` is a predefined prompt that ships with LlamaIndex. If you want to use a custom prompt, you can define it using the `Prompt` class and then pass it to the `system_prompt` parameter. Here's an example of how to define a custom prompt:
```python
from llama_index import Prompt

# Template with placeholders that LlamaIndex fills in at query time
template = (
    "We have provided context information below.\n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
custom_prompt = Prompt(template)
```
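The template is ordinary Python `str.format`-style text, so you can sanity-check that the `{context_str}` and `{query_str}` placeholders behave as expected without LlamaIndex at all (a minimal sketch using only the standard library; the sample context and question are made up for illustration):

```python
# Same template string as above, filled with plain str.format
template = (
    "We have provided context information below.\n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)

filled = template.format(
    context_str="Romeo and Juliet are star-crossed lovers.",
    query_str="Who are the lovers?",
)
print(filled)
```

If the placeholders are misspelled, `str.format` raises a `KeyError`, which is a quick way to catch template typos before wiring the prompt into a chat engine.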
You can then use `custom_prompt` in place of `SHAKESPEARE_WRITING_ASSISTANT` in the `as_chat_engine` call.
For more details, you can refer to the LlamaIndex documentation.