can i use llama-cpp-python with
aqsa · 7 months ago
Can I use llama-cpp-python with LlamaIndex to run a Llama 3 fine-tuned model in GGUF format from Hugging Face?
3 comments
Logan M · 7 months ago
Hugging Face just added support for loading GGUF files, so yes.
Logan M · 7 months ago
Using HuggingFaceLLM
Logan M · 7 months ago
Just load the model and tokenizer, and pass them in:
HuggingFaceLLM(model=model, tokenizer=tokenizer, ...)
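A fuller sketch of the steps above, assuming transformers >= 4.41 (which added the `gguf_file` argument to `from_pretrained`) and the `llama-index-llms-huggingface` integration package. The helper name and the repo id / filename in the usage note are placeholders, not anything from the thread:

```python
def load_gguf_llm(repo_id: str, gguf_file: str):
    """Load a GGUF checkpoint via transformers and wrap it for LlamaIndex.

    Hypothetical helper: repo_id and gguf_file should point at your own
    fine-tuned model on the Hugging Face Hub.
    """
    # Imports are kept inside the function so the sketch can be read
    # (and the helper defined) without the heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from llama_index.llms.huggingface import HuggingFaceLLM

    # transformers dequantizes the GGUF weights on load when gguf_file is set.
    tokenizer = AutoTokenizer.from_pretrained(repo_id, gguf_file=gguf_file)
    model = AutoModelForCausalLM.from_pretrained(repo_id, gguf_file=gguf_file)

    # Pass the in-memory model and tokenizer straight to LlamaIndex,
    # as in Logan's one-liner above.
    return HuggingFaceLLM(model=model, tokenizer=tokenizer)
```

Usage would look something like `llm = load_gguf_llm("your-name/llama3-ft-gguf", "model.gguf")` (placeholder names). Note that this path runs the model through transformers, not llama-cpp-python; LlamaIndex also has a separate `LlamaCPP` integration if you want llama.cpp as the backend.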