To update the system prompt of a `ReActAgent`, you can use the `update_prompts` method. Here's a step-by-step guide:

1. First, create your `ReActAgent`. You can do this with the following code:

```python
from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo-0613")  # or any other model you prefer

agent = ReActAgent.from_tools(
    query_engine_tools,  # replace with your list of tools
    llm=llm,
    verbose=True,
)
```

2. Next, create a `PromptTemplate` with your custom string:

```python
from llama_index.core import PromptTemplate

react_system_header_str = """\
You are designed to help with a variety of tasks, from answering questions \
to providing summaries to other types of analyses.
# Add more details to your prompt here...
"""

react_system_prompt = PromptTemplate(react_system_header_str)
```

3. Finally, update the `ReActAgent` with the `update_prompts` method:

```python
agent.update_prompts({"agent_worker:system_prompt": react_system_prompt})
```
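As a side note on the snippet above: the backslash immediately after the opening triple quote in `react_system_header_str` is a line continuation, which keeps the prompt from starting with an empty line. A quick standalone demonstration:

```python
# The backslash after the opening triple quote suppresses the newline
# that would otherwise begin the string.
with_continuation = """\
You are designed to help with a variety of tasks.
"""

# Without the backslash, the string starts with '\n'.
without_continuation = """
You are designed to help with a variety of tasks.
"""

print(repr(with_continuation))     # starts with 'You are...'
print(repr(without_continuation))  # starts with '\nYou are...'
```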
Remember to replace `react_system_header_str` with your own custom prompt and `query_engine_tools` with your list of tools.

If you want to append an arbitrary prompt to the default one, combine the two strings and build a `PromptTemplate` from the result:

```python
from llama_index.core import PromptTemplate

# This is an example. Replace this with the actual default prompt
default_prompt_str = "This is the default prompt.\n"

# This is your arbitrary prompt
arbitrary_prompt_str = "This is an arbitrary prompt.\n"

# Combine the default prompt and the arbitrary prompt
combined_prompt_str = default_prompt_str + arbitrary_prompt_str

# Create a PromptTemplate with the combined prompt
combined_prompt = PromptTemplate(combined_prompt_str)
```

Then, use the `update_prompts` method to update the system prompt:

```python
agent.update_prompts({"agent_worker:system_prompt": combined_prompt})
```
Remember to replace `default_prompt_str` with the actual default prompt and `arbitrary_prompt_str` with your arbitrary prompt.

To do the same with a `ReActAgentWorker`, build `combined_prompt` exactly as above, then use the `update_prompts` method to update the system prompt:

```python
agent_worker.update_prompts({"system_prompt": combined_prompt})
```
Again, replace `default_prompt_str` with the actual default prompt and `arbitrary_prompt_str` with your arbitrary prompt.

Note that the `update_prompts` method is available on both `ReActAgent` and `ReActAgentWorker`. The difference is that for `ReActAgentWorker`, you don't need to prefix the key with `agent_worker:`.
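To see why the prefix matters, here is a minimal sketch of the dispatch pattern. This is not LlamaIndex's actual source — the class names below are hypothetical stand-ins — but the key-splitting behavior, and the `Module ... not found` error it can raise, mirror the behavior described above:

```python
class PromptHolder:
    """Stand-in for a ReActAgentWorker: stores prompts under plain keys."""

    def __init__(self):
        self.prompts = {}

    def update_prompts(self, prompts_dict):
        # Plain keys, no prefix needed.
        self.prompts.update(prompts_dict)


class AgentLike:
    """Stand-in for a ReActAgent: owns an `agent_worker` sub-module."""

    def __init__(self):
        self.modules = {"agent_worker": PromptHolder()}

    def update_prompts(self, prompts_dict):
        for key, prompt in prompts_dict.items():
            # The part before ":" names the sub-module; the rest names
            # the prompt inside it.
            module_name, _, sub_key = key.partition(":")
            if module_name not in self.modules:
                raise ValueError(f"Module {module_name} not found.")
            self.modules[module_name].update_prompts({sub_key: prompt})


agent = AgentLike()

# On the agent, the key needs the "agent_worker:" prefix...
agent.update_prompts({"agent_worker:system_prompt": "my prompt"})

# ...but calling the worker directly needs no prefix.
agent.modules["agent_worker"].update_prompts({"system_prompt": "my prompt"})

print(agent.modules["agent_worker"].prompts)  # {'system_prompt': 'my prompt'}
```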
If you then see an error like this:

```
raise ValueError(f"Module {module_name} not found.")
ValueError: Module agent_worker not found.
```

it means the code couldn't find a module named `agent_worker`. This could be due to a few reasons:

1. The module `agent_worker` does not exist in your current project or environment. Make sure that you have correctly named your modules and that the module is in the correct directory.

2. You are trying to use the `agent_worker` module before it has been properly imported or defined. Ensure that you have imported the module using the correct syntax. For example:

```python
from llama_index.core.agent import ReActAgentWorker
```

3. If you're calling `update_prompts` on a `ReActAgentWorker` instance, make sure you're not prefixing the key with `agent_worker:`. The correct syntax is:

```python
agent_worker.update_prompts({"system_prompt": combined_prompt})
```