
Updated 2 months ago

llm.complete failing with ValueError: Chat response object has no field usage when using Ollama #17035

At a glance

A community member reported a bug where llm.complete failed with ValueError: Chat response object has no field usage when using Ollama. Other community members tried to reproduce the issue, and one suggested the problem was introduced by the 0.4.0 release of the ollama Python client. The discussion covered potential fixes: setting up a virtual environment with minimal packages, downgrading the ollama client to a 0.3.x release, and updating the llama-index Ollama integration to work with the new client. The issue remains open, and a community member has created a pull request to address the problem.
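The thread suggests the breakage tracks the installed version of the ollama Python client rather than the Ollama server itself. A minimal sketch of how one might check for the suspect client version, assuming (as the thread implies) that the breaking change landed in 0.4.0; the function names here are illustrative, not part of any library:

```python
# Hedged sketch: detect whether the installed ollama Python client falls in
# the version range reported to break llama-index's llm.complete.
# The 0.4.0 boundary is an assumption based on the thread's "4.0" remarks.
from importlib.metadata import PackageNotFoundError, version

BROKEN_SINCE = (0, 4)  # assumed first broken release: 0.4.0


def parse_major_minor(ver: str) -> tuple:
    """Parse 'X.Y.Z' into (X, Y); non-numeric parts default to 0."""
    parts = []
    for piece in ver.split(".")[:2]:
        parts.append(int(piece) if piece.isdigit() else 0)
    return tuple(parts)


def ollama_client_affected(pkg: str = "ollama") -> bool:
    """Return True if the installed client version is >= the assumed boundary."""
    try:
        return parse_major_minor(version(pkg)) >= BROKEN_SINCE
    except PackageNotFoundError:
        # Package not installed locally, so it cannot be the culprit.
        return False
```

If the check fires, the workaround discussed in the thread is pinning the client below the breaking release, e.g. `pip install "ollama<0.4"`, until the integration is updated.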

Useful resources
[Bug]: llm.complete getting failed with Value error: Chat response object has no field usage while using ollama #17035

Raised a bug. If this is not a bug, let me know and I will close it immediately.
11 comments
I can't reproduce πŸ€”
pip freeze | grep llama -- what versions of things do you have?
ah, I think the ollama 0.4.0 update broke this
Will set up a venv with minimal packages
I am running ollama in Docker; should I downgrade to 3.x and try?
I would give it a shot. I tried it, was working, updated, it broke
Ok, downgraded, it's working. So it's nothing to do with the inference engine? We only need to change the pip installation of ollama?
Shall I close the bug?
nah, keep the issue open. I can update it to work with 0.4.0
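The exact error message in the title matches how pydantic models reject assignment to attributes that are not declared fields. A minimal sketch under the assumption that the 0.4.x client switched its response objects to pydantic models; `ChatResponse` here is a stand-in class, and the `usage` assignment illustrates what the integration is reported to do, not the actual library code:

```python
# Hedged reproduction of the reported error class. ChatResponse is a stand-in
# for the client's response type, not the real ollama type.
from pydantic import BaseModel


class ChatResponse(BaseModel):
    message: str


resp = ChatResponse(message="hi")
try:
    # Assigning an attribute that is not a declared field: pydantic models
    # reject this, raising roughly: "ChatResponse" object has no field "usage"
    resp.usage = {"prompt_tokens": 3}
    err_msg = ""
except ValueError as exc:
    err_msg = str(exc)

print(err_msg)
```

If this is indeed the mechanism, it would explain why downgrading the client (whose older releases returned plain dict-like responses) makes the error disappear without touching the Ollama server.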