Embedding
Eternity · last year
Is there any way to use "HuggingFaceEmbedding" with a private repository? I tried something like model = AutoModel.from_pretrained(model, token=""), but it fails with the error "Bert Model does not have an attribute get_text_embedding.".
Logan M · last year
You can load the model and tokenizer directly yourself and pass them in:
embeddings = HuggingFaceEmbedding(model=model, tokenizer=tokenizer, device="cpu", max_length=512)
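For context, a fuller sketch of what that answer describes might look like the snippet below. It assumes the HuggingFaceEmbedding wrapper accepts model and tokenizer objects as shown in the answer (newer llama-index releases may only accept model_name/tokenizer_name, and the import path varies by version); the repository id "my-org/my-private-embedding-model" and the HF_TOKEN environment variable are placeholders for your own private repo and Hugging Face access token.

import os
from transformers import AutoModel, AutoTokenizer
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Placeholder: read a Hugging Face access token with read access to the private repo.
hf_token = os.environ["HF_TOKEN"]

# Load the private model and its tokenizer directly with transformers,
# authenticating with the token (this is where private-repo access happens).
repo_id = "my-org/my-private-embedding-model"  # placeholder repo id
model = AutoModel.from_pretrained(repo_id, token=hf_token)
tokenizer = AutoTokenizer.from_pretrained(repo_id, token=hf_token)

# Hand the already-loaded objects to the LlamaIndex wrapper, as in the answer above.
embeddings = HuggingFaceEmbedding(model=model, tokenizer=tokenizer, device="cpu", max_length=512)

# Use it like any other LlamaIndex embedding model.
vector = embeddings.get_text_embedding("hello world")

The point of this approach is that authentication is handled once by transformers when the weights are downloaded, so the embedding wrapper itself never needs to know about the token.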