Higgingface
DangFutures · last year
Hi, does the Hugging Face wrapper support using flash attention 2 or rope scaling?
Logan M · last year
The LLM wrapper? It uses whatever you want.
It's probably easiest to load the model with whichever settings you want and pass it in directly:
HuggingFaceLLM(model=model, tokenizer=tokenizer, ...)
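Put together, that looks roughly like the sketch below. This is a minimal sketch, not code from the thread: the model id, the rope_scaling values, and the exact import path and flash-attention flag are assumptions that depend on your transformers and llama-index versions.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from llama_index.llms.huggingface import HuggingFaceLLM  # older releases: from llama_index.llms import HuggingFaceLLM

model_name = "meta-llama/Llama-2-7b-chat-hf"  # illustrative model id

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    attn_implementation="flash_attention_2",         # older transformers: use_flash_attention_2=True
    rope_scaling={"type": "linear", "factor": 2.0},  # example RoPE scaling config
    device_map="auto",
)

# Pass the pre-configured model and tokenizer straight into the wrapper
llm = HuggingFaceLLM(model=model, tokenizer=tokenizer)
```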
DangFutures · last year
:0
DangFutures · last year
ty ty