[Bug]: llm.complete failing with ValueError: "ChatResponse" object has no field "usage" when using Ollama #17035

Raised a bug. If this isn't actually a bug, let me know and I'll close it immediately.
11 comments
I can't reproduce πŸ€”
`pip freeze | grep llama` -- what versions of things do you have?
Uh oh, I think the ollama 0.4.0 client release broke this
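For context, the error in the title matches how pydantic models reject assignment of undeclared fields; the 0.4.0 ollama client moved to typed pydantic response objects, so code written against the old dict-style responses can hit exactly this ValueError. A minimal sketch of the failure mode (the class here is a stand-in, not the actual ollama model -- field names are illustrative):

```python
from pydantic import BaseModel


class ChatResponse(BaseModel):
    # Stand-in for the typed response model the ollama 0.4.0 client returns.
    # The real model has more fields; this only demonstrates the failure mode.
    content: str


resp = ChatResponse(content="hello")

try:
    # Code that used to attach extra data to a plain dict response
    # (e.g. token usage) now hits pydantic's strict __setattr__.
    resp.usage = {"prompt_tokens": 1}
except ValueError as e:
    print(e)  # "ChatResponse" object has no field "usage"
```

The fix on the llama-index side would be to read the fields the model actually declares instead of attaching new attributes to the response object.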
Will set up a venv with minimal packages.
I am running Ollama in Docker. Should I downgrade the client to 0.3.x and try?
I would give it a shot. I tried it and it was working; after updating, it broke.
Ok, downgraded and it's working. So it's nothing to do with the inference engine? We only need to change the pip-installed ollama package?
Shall I close the bug?
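For anyone else landing here in the meantime, the workaround described above can be applied by pinning the client below 0.4 (assuming the pip package name `ollama`; no test included since this is an environment change, not runnable logic):

```shell
# Pin the ollama python client to the last 0.3.x release
pip install "ollama<0.4"

# Confirm what actually got installed
pip freeze | grep -i ollama
```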
Nah, keep the issue open. I can update the integration to work with 0.4.0.