Updated 3 months ago

Llamacpp

or change the default model download to the GGUF models TheBloke uploads
4 comments
Yeah, I've been meaning to update the download path (and maybe add a warning print or something if a user loads a GGML file)

I'll get that done today
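For the warning idea above, a minimal sketch of how the check could work: GGUF files begin with the ASCII magic bytes `GGUF`, so a loader can peek at the first four bytes and warn if they're absent (suggesting a legacy GGML file). The function names here are hypothetical, not from any actual codebase:

```python
def is_gguf(path: str) -> bool:
    """Return True if the file starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

def check_model_file(path: str) -> None:
    # Hypothetical helper: warn when the file lacks the GGUF magic,
    # since it is likely a legacy (pre-GGUF) GGML model.
    if not is_gguf(path):
        print(f"Warning: {path} does not look like a GGUF file; "
              "it may be a legacy GGML model that llama.cpp no longer loads.")
```

This only inspects the header, so it's cheap to run before handing the path to the actual loader.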
Do you know of a good GGUF chat model? Not many online yet
Sorry, I don't really know. I actually learned of this change today. I was planning on converting TheBloke's GGML models but have been too busy with other work.
I found one GGUF for llama2-13b-chat, but it only had two downloads and zero documentation 😂 seemed sketchy