
Updated 11 months ago

At a glance

The community member is experiencing issues generating embeddings with the "sentence-transformers/all-distilroberta-v1" model, which has 768 dimensions. They are encountering a CUDA indexing error. The community member notes that models with 384 and 512 dimensions work fine, but the 768-dimensional models cause problems.

In the comments, another community member suggests using the latest version of the "llama-index-embeddings-huggingface" library, which is v0.2.0. The community member who is experiencing the issue had installed v0.1.4.

Another community member suggests switching to using the "sentence-transformers" library as the backend, instead of pure PyTorch. This seems to work fine, as demonstrated by the code example provided.

There is no explicitly marked answer, but the community members have provided suggestions that may help resolve the issue.

I'm running into errors with some of the embedding models

sentence-transformers/all-distilroberta-v1
It states it has 768 dimensions, and then while generating the embeddings I get this error:
../aten/src/ATen/native/cuda/Indexing.cu:1290: indexSelectLargeIndex: block: [59,0,0], thread: [64,0,0] Assertion `srcIndex < srcSelectDimSize` failed.
It happens with all of the embedding models I have tested so far that state they have 768 dimensions; models with 384 and 512 dimensions were no issue and the embedding worked fine.

I'm running tests to find which embedding model would be best for our data, and that's how I came across this issue. Am I doing something wrong, or do 768-dimension models not work at all?
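As background on the error message: the CUDA assertion `srcIndex < srcSelectDimSize` fires when an index-select into an embedding table receives an index outside the table's bounds (for example, a position id beyond the model's maximum sequence length). The following pure-Python sketch is only an illustrative analogy of that bounds check, not code from any of the libraries involved; the function name `embedding_lookup` is hypothetical.

```python
# Illustrative sketch (pure Python, no GPU needed): the CUDA assertion
# "srcIndex < srcSelectDimSize" is the device-side equivalent of this
# host-side bounds check on an embedding-table lookup.
def embedding_lookup(table, indices):
    """Select rows from `table` by index; raises when an index is out of
    range, analogous to the CUDA device-side assert."""
    rows = len(table)
    for i in indices:
        if not (0 <= i < rows):
            raise IndexError(f"srcIndex {i} >= srcSelectDimSize {rows}")
    return [table[i] for i in indices]

# e.g. a table of 512 position embeddings, 4 values each
table = [[0.0] * 4 for _ in range(512)]
embedding_lookup(table, [0, 511])    # in range: fine
# embedding_lookup(table, [0, 514])  # out of range: raises IndexError
```

On a GPU the out-of-range index is only detected inside the kernel, which is why it surfaces as an opaque device-side assertion rather than a Python IndexError.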
8 comments
How are you setting up the embed model? Do you have the latest version of llama-index-embeddings-huggingface installed?
I did pip install it last week
I do it like this: embed_model = HuggingFaceEmbedding(model_name=embedding_models[model]['path'])

Then when it comes to the embedding part, after the parsing, it crashes with that error.
I got this version llama-index-embeddings-huggingface==0.1.4
Latest is v0.2.0
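If the installed version is the older 0.1.4, a minimal upgrade command (assuming pip manages the environment) would be:

```shell
# Upgrade to the latest release of the HuggingFace embeddings integration
pip install --upgrade "llama-index-embeddings-huggingface>=0.2.0"
```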
Switched to using the sentence-transformers library as the backend, instead of pure PyTorch
Seems to work fine

Plain Text
>>> from llama_index.embeddings.huggingface import HuggingFaceEmbedding
>>> embed_model = HuggingFaceEmbedding(model_name="sentence-transformers/all-distilroberta-v1")
>>> embeds = embed_model.get_text_embedding("Test")
>>> embeds = embed_model.get_query_embedding("Test")
>>> 
ok I will try that