Getting started with local llama: troubleshooting a usage error

I'm just getting into AI and following the quick start with a local Ollama model, but every time I run it I get the error below. I don't know what it means or what I'm doing wrong. I've spent two to three hours on this, so I hope somebody can help.

ValueError: "ChatResponse" object has no field "usage"
3 comments
pip freeze | grep llama would list all the relevant package versions
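If shell tools aren't handy, the same check can be done from Python with only the standard library. This is a sketch, not an official diagnostic; the `"llama"` substring filter is just an assumption about which packages matter here:

```python
from importlib import metadata

def llama_versions():
    """Return {package_name: version} for installed packages with 'llama' in the name."""
    versions = {}
    for dist in metadata.distributions():
        name = dist.metadata["Name"] or ""
        if "llama" in name.lower():
            versions[name] = dist.version
    return versions

if __name__ == "__main__":
    # Prints lines like "llama-index==0.x.y" for whatever is installed.
    for name, version in sorted(llama_versions().items()):
        print(f"{name}=={version}")
```

Pasting that output into the thread would show at a glance whether `llama-index` and `llama-index-llms-ollama` versions are mismatched.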
The code is literally this starter example: https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/

I'll post the dependency versions in a bit.