Trying the new AgentRunner (0.9.16) and I'm getting very different behaviors between GPT-4 Turbo and Gemini-Pro.

llm = OpenAI(model="gpt-4-1106-preview")
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
response = agent.chat("hi I'm Andrew")

Output:
Thought: (Implicit) I can answer without any more tools!
Response: Hello Andrew! How can I assist you today?

But if I switch to Gemini Pro it's no longer conversational (below):
llm = Gemini(model="models/gemini-pro", api_key=userdata.get('GOOGLE_API_KEY'))
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
response = agent.chat("hi I'm Andrew")

Output:
Thought: I need to use a tool to help me answer the question.
Action: analyze_image (this is my tool)

Is there a way I can make Gemini behave in a similar conversational manner and not always default to using a tool even when I'm not asking a question?
8 comments
I think a lot of this comes down to prompt engineering.

Likely the ReAct prompt isn't ideal for Gemini? Sadly the ReAct prompt is still a little hard to customize -- we're working on making this easier in the future.

One quick fix is setting a system prompt informing the LLM to act more conversationally.
@Logan M thanks for the quick reply, and yeah, I saw that OpenAIAgent has a system_prompt field

agent = OpenAIAgent.from_tools(tools, llm=llm, verbose=True, system_prompt=SYSTEM_PROMPT)

but ReActAgent.from_tools() is missing that field. I'm guessing the system prompt is OpenAI-specific, unless I'm missing something.
oh that's kind of annoying -- it shouldn't be unique to openai. Seems like an oversight

As a workaround, you can do something like

from llama_index.llms import ChatMessage

agent = ReActAgent.from_tools(..., chat_history=[ChatMessage(role="system", content="...")])
@Logan M It seems there's no system role in Gemini (gemini_utils.py)

ROLES_TO_GEMINI = {
MessageRole.USER: "user",
MessageRole.ASSISTANT: "model",
## Gemini only has user and model roles. Put the rest in user role.
MessageRole.SYSTEM: "user",
}
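Based on that mapping, a system message just gets sent to Gemini as an extra user turn. A standalone sketch of that conversion (illustrative plain Python mimicking the snippet above, not the actual llama_index internals):

```python
# Illustrative sketch of how a role mapping like ROLES_TO_GEMINI collapses
# a "system" message into a "user" turn before sending to Gemini.
# Names mimic the snippet above; this is not the real llama_index code.

ROLES_TO_GEMINI = {
    "user": "user",
    "assistant": "model",
    # Gemini only has user and model roles, so "system" falls back to user.
    "system": "user",
}

def to_gemini_history(messages):
    """Convert (role, content) pairs into Gemini-style turns."""
    return [
        {"role": ROLES_TO_GEMINI[role], "parts": [content]}
        for role, content in messages
    ]

history = to_gemini_history([
    ("system", "You are a friendly, conversational assistant."),
    ("user", "hi I'm Andrew"),
])
# The "system" instruction ends up as an ordinary user turn, back to back
# with the real user message, rather than as a dedicated system slot.
```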

Someone made this suggestion, since apparently context isn't a thing in Gemini: https://www.googlecloudcommunity.com/gc/AI-ML/Gemini-Pro-Context-Option/m-p/684917
ha right right, I guess it gets converted to a user message. It will be an extra instruction in any case, which should help somewhat
Yeah, it still behaves the same way when I pass a user message in the chat history. Hopefully they add context support.
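For what it's worth, since Gemini flattens the system role into a user turn anyway, another workaround is to fold the instruction into every outgoing user message instead of relying on chat history. A hedged sketch (ConversationalWrapper is a hypothetical helper, not a llama_index class; whether this actually tames the tool-calling would need testing):

```python
# Hypothetical helper, not part of llama_index: prefix an instruction to
# each user message ourselves, so it stays prominent in the prompt even
# though Gemini has no dedicated system role.

INSTRUCTION = (
    "You are a conversational assistant. Only use a tool when the user "
    "asks a question that requires one; otherwise just reply directly."
)

class ConversationalWrapper:
    def __init__(self, agent, instruction=INSTRUCTION):
        self.agent = agent
        self.instruction = instruction

    def chat(self, message):
        # Prepend the instruction to every user turn before delegating
        # to the wrapped agent (e.g. a ReActAgent).
        return self.agent.chat(f"{self.instruction}\n\n{message}")
```

Usage would look like ConversationalWrapper(agent).chat("hi I'm Andrew"), leaving the underlying agent untouched.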