Hi all, I am trying to load a Hugging Face embedding model from local storage using the HuggingFaceEmbedding wrapper from here: https://docs.llamaindex.ai/en/stable/examples/embeddings/huggingface/. However, no matter what I try, I can't make it load the model properly from local storage, and I get the following warning: WARNING - No sentence-transformers model found with name local_storage_for_embedding_model. Creating a new one with mean pooling.
The code I use is:
from pathlib import Path
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

embed_model_local_path = Path("local_storage_for_embedding_model")
Settings.embed_model = HuggingFaceEmbedding(
    model_name=str(embed_model_local_path),  # pass the local directory path
    local_files_only=True,
)
and I am trying to load BAAI/bge-small-en-v1.5 (I have downloaded all the necessary files from Hugging Face to my local system, into a folder named "local_storage_for_embedding_model" in the root directory of my repository).
Can anyone please help? Thanks in advance.
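For what it's worth, that warning is typically raised when the folder doesn't look like a saved sentence-transformers model, in which case the library falls back to a plain transformers checkpoint with mean pooling. A quick sanity check you could run (the folder name is taken from the question; the exact file list is an assumption about what a sentence-transformers save usually contains, modules.json being the key one):

```python
from pathlib import Path

# Files a sentence-transformers snapshot typically contains; if
# modules.json is absent, the library emits the "Creating a new one
# with mean pooling" warning. (The list here is an assumption.)
EXPECTED = ["modules.json", "config_sentence_transformers.json", "config.json"]

def missing_files(folder: str) -> list:
    """Return the expected files that are not present in `folder`."""
    root = Path(folder)
    return [name for name in EXPECTED if not (root / name).is_file()]

# Hypothetical check against the folder from the question; an empty
# list would mean all expected files are in place.
print(missing_files("local_storage_for_embedding_model"))
```

If modules.json shows up as missing, re-downloading the full repo (not just the weights and tokenizer files) should resolve the warning.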
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# loads BAAI/bge-small-en
# embed_model = HuggingFaceEmbedding()

# loads BAAI/bge-small-en-v1.5
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
It will first look in the cache folder; if the model is there, it will use that copy. Only if it is not found will it download the model.
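That cache-first lookup can be sketched roughly like this (illustrative only; `resolve_model` is a hypothetical helper, but the models--&lt;org&gt;--&lt;name&gt; folder naming matches how the Hugging Face Hub cache stores repos):

```python
from pathlib import Path

def resolve_model(repo_id: str, cache_dir: str) -> str:
    """Sketch of the cache-first lookup: return the cached repo folder
    if present, otherwise signal that a download is needed."""
    # The Hub cache stores each repo under models--<org>--<name>.
    snapshot = Path(cache_dir) / ("models--" + repo_id.replace("/", "--"))
    if snapshot.is_dir():
        return str(snapshot)   # cache hit: reuse the local copy
    return "needs download"    # cache miss: the library would fetch from the Hub
```

Passing an explicit local path (as in the question) bypasses this cache lookup entirely, which is why the folder itself must contain the full sentence-transformers snapshot.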