iach
Joined September 25, 2024
I've been trying Gemini Pro on Vertex AI, using LlamaIndex's Vertex AI integration. I've been defining it as "llm" and setting Settings.llm = llm, and I use Gemini Pro as the LLM for LlamaIndex's Tree Summarize.

Is there a way to specify the "model.start_chat(response_validation=False)" parameter for Gemini? This should disable Gemini's response validation, which blocks potentially "risky" material from appearing in the output by raising an error. For example, see this article: https://medium.com/@its.jwho/errorhandling-vulnerability-tests-on-gemini-19601b246b52.

Gemini's response validation is oversensitive. I'm summarising a court judgment in which there was a "harassment" claim, but Gemini is giving me a ResponseValidationError ("category: HARM_CATEGORY_HARASSMENT, probability: MEDIUM"), with the error message stating "To skip the response validation, specify `model.start_chat(response_validation=False)`".

I see that there is a "start_chat" call in LlamaIndex's utils.py for the Vertex integration: https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/llms/llama-index-llms-vertex/llama_index/llms/vertex/utils.py

So in short: using LlamaIndex, how do I set "model.start_chat(response_validation=False)" as a parameter for Vertex AI's Gemini Pro? If there is no easy way, is there a way to call Gemini Pro through Vertex AI's API directly, and then have it take effect as usual via LlamaIndex's "Settings.llm = llm"? Thanks!
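For reference, another direction I've been considering is relaxing the safety thresholds that appear to trigger the block, rather than skipping validation entirely. Here is a rough sketch of the mapping I mean (the category and threshold strings come from Gemini's safety-settings documentation; the real Vertex AI SDK uses HarmCategory/HarmBlockThreshold enum members rather than strings, and I have not confirmed whether LlamaIndex's Vertex wrapper accepts such a mapping directly):

```python
# Sketch only: a maximally permissive safety-settings mapping for Gemini.
# The category/threshold names mirror the Gemini safety-settings docs as
# plain strings; the actual SDK expects enum members. Whether LlamaIndex's
# Vertex integration accepts a safety_settings argument is an assumption.

def relaxed_safety_settings():
    """Map every harm category to BLOCK_NONE so that a MEDIUM-probability
    harassment match (as in my court-judgment summary) is not blocked."""
    categories = [
        "HARM_CATEGORY_HARASSMENT",
        "HARM_CATEGORY_HATE_SPEECH",
        "HARM_CATEGORY_SEXUALLY_EXPLICIT",
        "HARM_CATEGORY_DANGEROUS_CONTENT",
    ]
    return {category: "BLOCK_NONE" for category in categories}
```

If something like this can be threaded through to the underlying GenerativeModel, it would avoid the ResponseValidationError without disabling validation wholesale.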
1 comment
iach
Gemini

Hi, I'm trying to use Tree Summarize with Gemini Pro. I'm getting this error:
raise ValueError("Gemini model don't support system messages")
ValueError: Gemini model don't support system messages

It seems that the Gemini Pro model does not accept system messages. I've tried looking at the GitHub code to see where the system message comes from, but I haven't been able to find it.

Would anyone have a fix? Thanks!!
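In the meantime, the workaround I'm experimenting with is folding the system message into the first user message before the request goes out. A rough sketch of the idea, using plain dicts to stand in for LlamaIndex's ChatMessage objects (the helper name and message shape are illustrative only, not LlamaIndex API):

```python
def merge_system_into_user(messages):
    """Workaround sketch: since this Gemini integration rejects the system
    role, prepend any system messages' text to the first user message and
    drop the system messages themselves. Messages are plain dicts with
    "role" and "content" keys, standing in for ChatMessage objects."""
    system_texts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [dict(m) for m in messages if m["role"] != "system"]
    if system_texts and rest and rest[0]["role"] == "user":
        prefix = "\n".join(system_texts)
        rest[0]["content"] = prefix + "\n\n" + rest[0]["content"]
    return rest
```

Applying something like this to the prompt before it reaches the Gemini LLM should sidestep the ValueError, at the cost of the instructions no longer being a true system message.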
6 comments