
**Bug Description**

I'm encountering an issue when calling the `achat` interface to chat with the Gemini model (gemini-1.5-flash-002) through Vertex AI. The error message I'm receiving is as follows:

Plain Text
ERROR: Unknown field for GenerationConfig: safety_settings

 File "/workspaces/CORTEX/.venv/lib/python3.10/site-packages/llama_index/llms/vertex/base.py", line 384, in achat
   generation = await acompletion_with_retry(
 
 File "/workspaces/CORTEX/.venv/lib/python3.10/site-packages/llama_index/llms/vertex/utils.py", line 148, in acompletion_with_retry
   return await _completion_with_retry(**kwargs)


I suspect this may be an issue with LlamaIndex: the error suggests that `safety_settings` is being passed as a field of `GenerationConfig`, which a recent Google SDK update may no longer accept. However, I am unsure of the root cause.

Version Details:
  • llama-index==0.11.14
  • llama-index-llms-vertex==0.3.6
  • google-ai-generativelanguage==0.6.4
  • google-generativeai==0.5.4
Steps to Reproduce:
Python
llm = Vertex(...)
chat = await llm.achat(...)


Error: See above.

Relevant Logs/Tracebacks: No response.
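
For reference, a more complete version of the repro snippet above might look roughly like the sketch below; the project, location, prompt, and temperature are placeholder values, not taken from the original report:

Python
import asyncio

from llama_index.core.llms import ChatMessage
from llama_index.llms.vertex import Vertex

# Placeholder GCP settings; substitute your own project and region.
llm = Vertex(
    model="gemini-1.5-flash-002",
    project="my-gcp-project",
    location="us-central1",
    temperature=0.2,
)

async def main():
    # With the versions listed above, this call raises
    # "Unknown field for GenerationConfig: safety_settings".
    response = await llm.achat([ChatMessage(role="user", content="Hello")])
    print(response.message.content)

asyncio.run(main())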
10 comments
Yeah, someone raised an issue on GitHub for this. Would love a PR; otherwise I'll try to get to it at some point

I think downgrading may actually fix it: `pip install -U llama-index-llms-vertex==0.3.4`
That was me; I was hoping to get some help here so I could close the issue there
Do I have to pass any extra parameters?

Plain Text
ResponseValidationError: The model response did not complete successfully.
Finish reason: 2.
Finish message: .
Safety ratings: [].
To protect the integrity of the chat session, the request and response were not added to chat history.
To skip the response validation, specify `model.start_chat(response_validation=False)`.
Note that letting blocked or otherwise incomplete responses into chat history might lead to future interactions being blocked by the service.
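
For context, the `response_validation` toggle mentioned in that message comes from the Vertex AI SDK's chat session rather than from LlamaIndex itself; a minimal sketch of setting it when using the SDK directly, with placeholder project settings, looks something like this:

Python
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder GCP settings; substitute your own project and region.
vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-flash-002")
# Skip response validation, as the error message suggests; blocked or
# otherwise incomplete responses will then still enter the chat history.
chat = model.start_chat(response_validation=False)
print(chat.send_message("Hello").text)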
Was this integration working before? Or is it a problem on my end?
Seems like Google maybe updated and broke something? I really have no idea actually 😥
Actually, I think this was a problem of mine
I had set MAX_TOKENS too low; now it works
As for the first problem, I still have no idea why it's not working
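
For anyone who hits the same error: finish reason 2 corresponds to MAX_TOKENS, so the fix described above amounts to raising the output token limit. A hedged sketch with the LlamaIndex Vertex wrapper (values are examples only, not from this thread):

Python
from llama_index.llms.vertex import Vertex

# Example values only: a larger max_tokens keeps generation from being cut
# off, which is what produces finish reason 2 (MAX_TOKENS) and the
# ResponseValidationError shown earlier.
llm = Vertex(
    model="gemini-1.5-flash-002",
    project="my-gcp-project",
    location="us-central1",
    max_tokens=2048,
)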