I am using this Notebook (https://docs.llamaindex.ai/en/stable/examples/embeddings/huggingface/) to convert a HF embedding model to ONNX. The produced model works fine inside the Notebook, but when I download the files to my local machine and try to run it, it throws the error below: "OSError: Unable to load vocabulary from file. Please check that the provided vocabulary is accessible and not corrupted."
I can see that the vocab.txt file present in the BGE model gets replaced by model.onnx_json in the Snowflake model. The Snowflake model still runs fine (screenshot below) as long as I run it in the same Colab session where I generated it, but the moment I manually download all 6 files to my local machine, it throws the same OSError about the vocabulary being inaccessible or corrupted.