Error Loading Vocabulary from Local HF Embed Model File

I used the notebook at https://docs.llamaindex.ai/en/stable/examples/embeddings/huggingface/ to convert a HF embedding model to ONNX. The produced model works fine in the notebook, but when I download the files to my local machine and try to run it, it throws this error: "OSError: Unable to load vocabulary from file. Please check that the provided vocabulary is accessible and not corrupted."
[Attachment: image.png]
Seems you are missing some files.
I can see the vocab.txt file in the BGE model is replaced by model.onnx_json in the Snowflake model. But Snowflake runs fine (attached pic) as long as I run it in the same Colab where I generated it; the moment I download all 6 files manually to my local machine, it throws "OSError: Unable to load vocabulary from file. Please check that the provided vocabulary is accessible and not corrupted."
[Attachment: image.png]
It's working now. Turns out I was missing tokenizer.json.
[Attachment: image.png]
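For anyone hitting the same error: a quick sanity check before loading is to verify the model directory actually contains the tokenizer files. This is a minimal sketch, not part of the notebook; the `EXPECTED_FILES` list is an assumption (the exact set varies by model and tokenizer), and `missing_model_files` is a hypothetical helper name.

```python
from pathlib import Path

# Files a locally downloaded ONNX-exported model typically needs alongside
# model.onnx; assumption -- adjust for your specific model/tokenizer.
EXPECTED_FILES = ["config.json", "tokenizer_config.json", "tokenizer.json"]

def missing_model_files(model_dir):
    """Return the expected files that are absent from model_dir."""
    model_dir = Path(model_dir)
    return [name for name in EXPECTED_FILES if not (model_dir / name).exists()]

# Example: fail fast with a clear message instead of the opaque OSError.
# missing = missing_model_files("./my_onnx_model")
# if missing:
#     raise FileNotFoundError(f"Model folder is missing: {missing}")
```

Running this against the downloaded folder would have pointed straight at the absent tokenizer.json instead of the generic "Unable to load vocabulary" error.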