Embeddings

With LangChain it's possible to set the device to CPU for embeddings.
For example:
Plain Text
from langchain.embeddings.huggingface import HuggingFaceEmbeddings

embedding_model = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-mpnet-base-v2",
    model_kwargs={"device": "cpu"},  # Use CPU for embedding
)

I couldn't find anything about this in the documentation for LlamaIndex's HuggingFaceEmbedding (from llama_index.embeddings.huggingface import HuggingFaceEmbedding).
Is this possible with LlamaIndex's HuggingFaceEmbedding?
3 comments
It's directly a kwarg

HuggingFaceEmbedding(..., device="cpu")
Thanks a lot.
And do you know if it is possible to specify the number of CPU cores to use? (e.g. os.cpu_count())
Nope (I don't think HuggingFace even exposes an option like that?)
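That said, as a workaround outside of LlamaIndex or HuggingFace, you can cap the OpenMP/MKL thread pools that PyTorch uses for CPU inference via environment variables, set before torch is first imported. A sketch, using os.cpu_count() as the question suggests:

```python
import os

# Cap the thread pools PyTorch (and its BLAS backend) will spawn on CPU.
# These must be set before torch is first imported to take effect.
n_threads = str(os.cpu_count() or 1)
os.environ["OMP_NUM_THREADS"] = n_threads
os.environ["MKL_NUM_THREADS"] = n_threads
```

If torch is already imported, torch.set_num_threads(n) is the equivalent in-process knob for intra-op parallelism.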