
Embedding

At a glance
Is there any way to use "HuggingFaceEmbedding" with a private repository? I tried something like "model = AutoModel.from_pretrained(model, token="")", but it fails with the error "Bert Model does not have an attribute get_text_embedding.".
1 comment
You can load the model and tokenizer directly yourself and pass them in:

embeddings = HuggingFaceEmbedding(model=model, tokenizer=tokenizer, device="cpu", max_length=512)
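
For context, here is a minimal sketch of what "load it yourself" could look like. It assumes a llama-index version whose HuggingFaceEmbedding still accepts model and tokenizer arguments; the import path, repo name, and token value are placeholders you would replace with your own.

from transformers import AutoModel, AutoTokenizer
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Authenticate against the private/gated repo with a Hugging Face access token.
repo = "your-org/your-private-model"  # placeholder repo name
model = AutoModel.from_pretrained(repo, token="hf_...")       # placeholder token
tokenizer = AutoTokenizer.from_pretrained(repo, token="hf_...")

# Pass the already-loaded objects in, instead of letting HuggingFaceEmbedding
# download the model by name (which is where a private repo would fail).
embeddings = HuggingFaceEmbedding(
    model=model,
    tokenizer=tokenizer,
    device="cpu",
    max_length=512,
)

# get_text_embedding lives on the embedding wrapper, not on the raw AutoModel;
# calling it on the BertModel directly is what produced the attribute error.
vector = embeddings.get_text_embedding("hello world")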