
Updated 11 months ago

Gemini

At a glance

A community member is trying to use the Tree Summarize feature with the Gemini Pro model, but is encountering an error stating that the Gemini model does not support system messages. The community member has tried to look at the GitHub code to find the source of the system message, but has been unable to locate it.

In the comments, another community member shares their code and notes that the same code works when using GPT-3.5 instead of Gemini Pro. They also mention that Gemini Pro works fine for a simple completion call.

Another community member suggests that there is a pull request (PR) to fix this issue soon. The original community member then shares that they were able to get it to work by following a different example in the documentation, which uses a PromptTemplate defined in the code, rather than the default prompt template that uses a system prompt.

The final comment notes that this makes sense, as the default prompt template uses a system prompt, which the Gemini Pro model does not support.

There is no explicitly marked answer in the comments.

Hi, I'm trying to use Tree Summarize with Gemini Pro. I'm getting this error:
"raise ValueError("Gemini model don't support system messages")
ValueError: Gemini model don't support system messages"

It seems that the Gemini Pro model does not accept system messages. I've tried to look at the GitHub code to see where the system message comes from, but I haven't been able to find it.

Would anyone have a fix? Thanks!!
6 comments
Hi, can you share your code?
Thanks for the quick reply. My code:

from llama_index.core import SimpleDirectoryReader
from llama_index.core.response_synthesizers import TreeSummarize
from llama_index.llms.vertex import Vertex
from google.oauth2 import service_account
from llama_index.core import Settings
import asyncio

filename = "test_service_account.json"
credentials: service_account.Credentials = (
    service_account.Credentials.from_service_account_file(filename)
)

llm = Vertex(
    model="gemini-pro",
    project=credentials.project_id,
    credentials=credentials,
    temperature=0.1,
)

Settings.llm = llm

reader = SimpleDirectoryReader(input_files=["LWST.docx"])

docs = reader.load_data()

text = docs[0].text

summarizer = TreeSummarize(verbose=True)

async def summ():
    # return the response so the print below doesn't show None
    return await summarizer.aget_response("what is this judgment about?", [text])

print(asyncio.run(summ()))
The above works if I use GPT-3.5 instead.

Also, Gemini is working fine if I do a simple completion like print(llm.complete("Hello Gemini").text)
There's a PR to fix this soon I believe
Great, thanks for all your super quick answers.

I got it to work. There are two different examples in the docs:

https://docs.llamaindex.ai/en/stable/module_guides/querying/response_synthesizers/root.html

^ This works for Gemini Pro: TreeSummarize with a PromptTemplate defined in the code.

https://docs.llamaindex.ai/en/stable/examples/response_synthesizers/tree_summarize.html

^ This does not work for Gemini Pro: no PromptTemplate is defined in the user's code. This is the first Google result for "tree summarize llamaindex", and it is what I was following when I hit the Gemini system message error.
That makes sense, since the default prompt template uses a system prompt.