Tool use issues after update

It seems that something changed in llamaindex (I think this: https://github.com/run-llama/llama_index/pull/14885) and now I get ValueError: Expected at least one tool call, but got 0 tool calls. from every query. What am I supposed to be doing? The changelog doesn't mention a migration path for fixing existing code.
Hmm there shouldn't be any migration needed
but taking a look
Oh interesting
So you aren't using an agent, just a query engine with an output_cls?
I don't think this error would be caused by the above PR you linked. But let me see if I can replicate

(I think what might be happening is the LLM is failing to predict an output class?)
Hmm, I was not able to replicate (but did notice a small unrelated bug with the latest ollama)

With my changes, this is what I used

Plain Text
from typing import List

from pydantic.v1 import BaseModel

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.ollama import Ollama


class Biography(BaseModel):
    """Data model for a biography."""

    name: str
    best_known_for: List[str]
    extra_info: str


# load the documents to index; SimpleDirectoryReader is just one way to get them
documents = SimpleDirectoryReader("data").load_data()

# json_mode=True asks Ollama for valid JSON; is_function_calling_model=False
# routes structured prediction through text completion instead of tool calls
llm = Ollama(model="mistral:latest", is_function_calling_model=False, json_mode=True)

index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine(
    output_cls=Biography, response_mode="compact", llm=llm
)

response = query_engine.query("Who is Paul Graham?")


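For reference, with output_cls set the query result wraps the parsed Pydantic object, so you can read fields off it. Roughly like this; the .response attribute access is from memory, so treat it as a sketch:

Plain Text
# the query engine should return the parsed Biography on .response
# (attribute name assumed, not verified against the current release)
bio = response.response
print(bio.name)
print(bio.best_known_for)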
Without json_mode=True, I was getting parsing errors (because mistral wasn't writing JSON)
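(For context, json_mode maps to Ollama's server-side JSON mode, i.e. the request sets format to json; the under-the-hood mapping is my assumption. A raw sketch against a local Ollama using its public REST API:)

Plain Text
import requests

# raw Ollama generate call with JSON mode enabled; llama-index's
# json_mode=True should set this same "format": "json" field (assumption)
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral:latest",
        "prompt": "Describe Paul Graham as a JSON object.",
        "format": "json",
        "stream": False,
    },
)
print(resp.json()["response"])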
Will push the patch for function calling in a sec (ollama supports function calling now, but not for mistral; it works for llama3.1 and a few others)
Is the patch you'll push just changing the default to False, or adding the ability to set is_function_calling_model at all?
If it's the former I can set that manually and see if it helps me too
Hmm, I guess it's adding is_function_calling_model, because I currently get [reportCallIssue]: No parameter named "is_function_calling_model"
Yea the param isn't there yet, v0.2.1 of ollama hardcodes it to True (whoops)
But in any case, I wasn't able to get the same issue you had (but again, I suspect the LLM just isn't outputting the output class)
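If you want to check which path your installed version will take, you can inspect the metadata flag directly; my note on the branching below is an assumption about the internals rather than a quote of the source. With ollama v0.2.1 this prints True no matter the model (the hardcoding mentioned above):

Plain Text
from llama_index.llms.ollama import Ollama

llm = Ollama(model="mistral:latest", json_mode=True)

# structured prediction branches on this flag: when True the query
# engine expects the model to answer with a tool call (hence the
# "Expected at least one tool call" error when none comes back);
# when False it parses a plain completion into the output class
print(llm.metadata.is_function_calling_model)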
I've reduced the example code and re-ran it to make sure I still get the problem I described. I also get complaints about the weaviate client not being closed properly, but that's just because the crash in llamaindex stops execution before the cleanup I added at the end (see the cleanup sketch after the traceback).
Plain Text
 ValueError: Expected at least one tool call, but got 0 tool calls.
 /usr/local/lib/python3.11/site-packages/weaviate/warnings.py:303: ResourceWarning: Con004: The connection to Weaviate was not closed properly. This can lead to memory leaks.
             Please make sure to close the connection using `client.close()`.
 Traceback (most recent call last):
   File "src/python/grpcio/grpc/_cython/_cygrpc/aio/grpc_aio.pyx.pxi", line 110, in grpc._cython.cygrpc.shutdown_grpc_aio
   File "src/python/grpcio/grpc/_cython/_cygrpc/aio/grpc_aio.pyx.pxi", line 114, in grpc._cython.cygrpc.shutdown_grpc_aio
   File "src/python/grpcio/grpc/_cython/_cygrpc/aio/grpc_aio.pyx.pxi", line 78, in grpc._cython.cygrpc._actual_aio_shutdown
 AttributeError: 'NoneType' object has no attribute 'POLLER'
 Exception ignored in: 'grpc._cython.cygrpc.AioChannel.__dealloc__'
 Traceback (most recent call last):
   File "src/python/grpcio/grpc/_cython/_cygrpc/aio/grpc_aio.pyx.pxi", line 110, in grpc._cython.cygrpc.shutdown_grpc_aio
   File "src/python/grpcio/grpc/_cython/_cygrpc/aio/grpc_aio.pyx.pxi", line 114, in grpc._cython.cygrpc.shutdown_grpc_aio
   File "src/python/grpcio/grpc/_cython/_cygrpc/aio/grpc_aio.pyx.pxi", line 78, in grpc._cython.cygrpc._actual_aio_shutdown
 AttributeError: 'NoneType' object has no attribute 'POLLER'
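For what it's worth, wrapping the query in try/finally so the client always closes silences that Con004 warning even when llamaindex raises. A minimal sketch assuming the v4 weaviate client, a local instance, and the query_engine from above (connection details are placeholders):

Plain Text
import weaviate

client = weaviate.connect_to_local()  # placeholder; use your real connection

try:
    response = query_engine.query("Who is Paul Graham?")
    print(response)
finally:
    # runs even when the query raises, so the client is always
    # closed and the Con004 ResourceWarning goes away
    client.close()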
If I uncomment the is_function_calling_model=False argument, then it works. So thanks.