Lmamacpp
woojim · last year
or change downloading default model to be GGUF models TheBloke uploads
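(For readers landing here: GGUF is the file format that replaced GGML in llama.cpp. Below is a minimal sketch of what the requested default could look like, fetching a GGUF build from one of TheBloke's Hugging Face repos with `huggingface_hub`; the repo id and filename are illustrative assumptions, not the project's actual defaults.)

```python
# Minimal sketch (assumption, not the project's actual downloader) of switching
# the default download to a GGUF build from one of TheBloke's Hugging Face repos.
# The repo id and filename are illustrative; pick the quantization you need
# from the model card.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="TheBloke/Llama-2-13B-chat-GGUF",    # assumed repo
    filename="llama-2-13b-chat.Q4_K_M.gguf",     # assumed quant file
)
print(f"GGUF model saved to {model_path}")
```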
Logan M · last year
Yea, I've been meaning to update the download path (and maybe add a warning print or something if a user loads a GGML file). I'll get that done today.
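(A rough sketch of the kind of guard described here, under the assumption that the check boils down to inspecting the file's magic bytes; the helper name and warning text are hypothetical, not the library's actual code.)

```python
# Peek at the file's first four bytes and warn if it looks like a legacy GGML
# file instead of GGUF. Illustrative only.
import struct

GGUF_MAGIC = b"GGUF"
# Legacy llama.cpp magics, read as little-endian uint32: "ggml", "ggmf", "ggjt".
LEGACY_GGML_MAGICS = {0x67676D6C, 0x67676D66, 0x67676A74}

def warn_if_ggml(path: str) -> None:
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic == GGUF_MAGIC:
        return  # modern GGUF file, load as usual
    if len(magic) == 4 and struct.unpack("<I", magic)[0] in LEGACY_GGML_MAGICS:
        print(
            f"Warning: {path} appears to be a legacy GGML file. "
            "Recent llama.cpp only loads GGUF; please download a GGUF model instead."
        )
```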
Logan M · last year
Do you know of a good GGUF chat model? Not many online yet.
woojim · last year
Sorry, I don't really know. I actually learned of this change today. I was planning on converting TheBloke's GGML models but have been too busy with other work.
Logan M · last year
I found one GGUF for llama2-13b-chat, but it only had two downloads and zero documentation, so it seemed sketchy.