
How to use PromptTemplate to pass input as variables with ReActAgent?

At a glance

The post asks how to use PromptTemplate to pass input as variables with ReActAgent. The comments provide a detailed explanation, including the steps to follow:

1. Import the PromptTemplate from llama_index.core.

2. Define a prompt string with placeholders for variables.

3. Create a PromptTemplate object using the prompt string.

4. Use the get_prompts() method of the ReActAgent to get the current prompts.

5. Update the prompts using the update_prompts() method of the ReActAgent, passing in a dictionary where the key is the name of the prompt to update and the value is the PromptTemplate object.

The community members also note that the keys in the dictionary passed to update_prompts() should match the prompt names returned by get_prompts(), while the placeholders inside the prompt string itself are filled in when the agent formats the prompt.

Additionally, a community member asks what input agent.chat() expects and whether JSON data can be passed as input, but there is no explicit answer provided.

@kapa.ai How to use PromptTemplate to pass input as variables with ReActAgent?
Please stand by @praveenks30#120294 as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
To use PromptTemplate to pass input as variables with ReActAgent, you need to follow these steps:

1. Import PromptTemplate from llama_index.core.
Plain Text
from llama_index.core import PromptTemplate


2. Define your prompt string. This string can contain placeholders for variables that you want to pass in later.
Plain Text
react_system_header_str = """\
You are designed to help with a variety of tasks, from answering questions
to providing summaries to other types of analyses.
"""  # abbreviated; the full default ReAct system header is much longer

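If you want the header itself to contain variables, add them in curly braces. For example (a sketch based on the default ReAct system header, which uses the {tool_desc} and {tool_names} placeholders that the agent fills in at runtime):
Plain Text
react_system_header_str = """\
You are designed to help with a variety of tasks, from answering questions
to providing summaries to other types of analyses.

## Tools
You have access to the following tools:
{tool_desc}

When you decide to act, use one of the tool names from: {tool_names}.
"""
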
3. Create a PromptTemplate object using the string you defined.
Plain Text
react_system_prompt = PromptTemplate(react_system_header_str)
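
If you want to pass your own values into the prompt, the simplest approach is to substitute them with ordinary string formatting before building the template (my_domain below is a hypothetical variable, not something the agent fills in for you):
Plain Text
# Bake your own variable into the header up front, then build the template
my_intro = "You are an assistant specialised in {my_domain}.\n".format(my_domain="finance")
react_system_prompt = PromptTemplate(my_intro + react_system_header_str)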


4. Use the get_prompts() method of the ReActAgent to get the current prompts.
Plain Text
agent.get_prompts()
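
For example, printing the keys shows which prompt names can be overridden (the exact names can vary between versions, so check your own output):
Plain Text
# List the prompt names this agent exposes
print(agent.get_prompts().keys())
# expected to include something like 'agent_worker:system_prompt'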


5. Update the prompts using the update_prompts() method of the ReActAgent. Pass in a dictionary where the key is the name of the prompt you want to update and the value is the PromptTemplate object you created.
Plain Text
agent.update_prompts({"agent_worker:system_prompt": react_system_prompt})


Here is the complete code:

Plain Text
from llama_index.core import PromptTemplate

react_system_header_str = """\
You are designed to help with a variety of tasks, from answering questions
to providing summaries to other types of analyses.
"""  # abbreviated; the full default ReAct system header is much longer

react_system_prompt = PromptTemplate(react_system_header_str)

# `agent` is an existing ReActAgent instance
agent.get_prompts()

agent.update_prompts({"agent_worker:system_prompt": react_system_prompt})


Please note that the keys in the dictionary you pass to update_prompts() should match the prompt names returned by get_prompts() (for example, "agent_worker:system_prompt"); the placeholders inside the prompt string itself are filled in separately when the agent formats the prompt.
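
Putting it together, here is a minimal end-to-end sketch. The tool, LLM, and question are illustrative, and the agent construction assumes ReActAgent.from_tools from llama_index.core.agent; in practice you would start from the full default system header (visible via get_prompts()) and edit it, so the ReAct output-format instructions are preserved:
Plain Text
from llama_index.core import PromptTemplate
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

# Illustrative tool; replace with your own
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)

# Abbreviated custom header; {tool_desc} and {tool_names} are filled in by the agent
react_system_header_str = """\
You are designed to help with a variety of tasks, from answering questions
to providing summaries to other types of analyses.

## Tools
You have access to the following tools:
{tool_desc}

When you decide to act, use one of the tool names from: {tool_names}.
"""
react_system_prompt = PromptTemplate(react_system_header_str)

agent = ReActAgent.from_tools([multiply_tool], llm=OpenAI(model="gpt-4o-mini"), verbose=True)

# Inspect the current prompts, then swap in the custom system prompt
print(agent.get_prompts().keys())
agent.update_prompts({"agent_worker:system_prompt": react_system_prompt})

response = agent.chat("What is 2123 * 215123?")
print(response)
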
For more details, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai what does agent.chat() expect as input? Can I pass JSON data as input?