
Updated 10 months ago

If ReAct agentic chat is not working

If ReAct agentic chat is not working well with Gemini, what's a good way to debug and potentially optimize the prompts, or is it just generally not great?
4 comments
Probably the best way to change it is to modify the ReAct system prompt

The default header is here
https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/agent/react/prompts.py#L7

Plain Text
from llama_index.core.agent import ReActAgent
from llama_index.core.agent.react.formatter import ReActChatFormatter

# my_header is your customized ReAct system header string
my_formatter = ReActChatFormatter(system_header=my_header)

agent = ReActAgent.from_tools(..., react_chat_formatter=my_formatter)
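If you'd rather tweak the default header than write one from scratch, here's a minimal sketch, assuming the constant name matches the prompts.py file linked above; the appended instruction is just an illustrative nudge for Gemini, not a known fix

Plain Text
from llama_index.core.agent.react.prompts import REACT_CHAT_SYSTEM_HEADER

# Start from the default ReAct system header and append model-specific
# guidance (illustrative example, adjust wording to whatever Gemini
# keeps getting wrong in the Thought/Action/Observation loop)
my_header = REACT_CHAT_SYSTEM_HEADER + (
    "\nAlways format Action Input as valid JSON on a single line."
)

From there you can iterate on the instructions until Gemini follows the Thought/Action/Observation format reliably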
(side note, I'm so surprised no other LLM APIs are copying OpenAI's function/tools API, it's so much better than this prompting nightmare lol)
Yeah it's really annoying