Updated 5 months ago

Enable Function calling and agent runner...

I can see this code in https://github.com/run-llama/llama_index/pull/14088, but when I use it in my code I get:

File "/home//anaconda3/envs//lib/python3.11/site-packages/llama_index/core/agent/function_calling/step.py", line 158, in from_tools
return cls(
^^^^
File "/home//anaconda3/envs//lib/python3.11/site-packages/llama_index/core/agent/function_calling/step.py", line 102, in __init__
raise ValueError(
ValueError: Model name models/gemini-1.5-flash-latest does not support function calling API.
7 comments
Did you update your vertexai integration package?

pip install -U llama-index-llms-vertexai
I did.
pip freeze | grep llama-index
llama-index==0.10.55
llama-index-agent-openai==0.2.8
llama-index-cli==0.1.12
llama-index-core==0.10.55
llama-index-embeddings-openai==0.1.10
llama-index-experimental==0.1.3
llama-index-indices-managed-llama-cloud==0.2.3
llama-index-legacy==0.9.48
llama-index-llms-anthropic==0.1.15
llama-index-llms-gemini==0.1.11
llama-index-llms-openai==0.1.25
llama-index-llms-vertex==0.2.1
llama-index-multi-modal-llms-openai==0.1.7
llama-index-postprocessor-cohere-rerank==0.1.7
llama-index-program-openai==0.1.6
llama-index-question-gen-openai==0.1.3
llama-index-readers-file==0.1.29
llama-index-readers-llama-parse==0.1.6
llama-index-vector-stores-chroma==0.1.10
I think it should be the gemini package, not vertex.
Oh, that PR was for vertex, not gemini.
So we can't use function calling with LlamaIndex and Gemini, right?
That's a shame; on LMSYS, Flash 1.5 is much better than GPT-3.5 :/
Not unless someone implements it, or you use gemini through vertexai