Find answers from the community

Updated 6 months ago

Llamacpp

At a glance

The post suggests changing the default downloading model to be GGUF models uploaded by TheBloke. In the comments, a community member mentions they plan to update the download path and add a warning if a user loads a GGML file. Another community member asks if anyone knows of a good GGUF chat model, as there are not many available online yet. The responses indicate that the community members are still exploring and working on converting GGML models to GGUF format, but have not found a reliable GGUF chat model to recommend.

or change the default downloading model to be the GGUF models TheBloke uploads
4 comments
Yeah, I've been meaning to update the download path (and maybe add a warning print or something if a user loads a GGML file)

I'll get that done today
Do you know of a good GGUF chat model? Not many online yet
Sorry, I don't really know. I actually learned of this change today. I was planning on converting TheBloke's GGML models but have been too busy with other work.
I found one GGUF for llama2-13b-chat, but it only had two downloads and zero documentation 😂 seemed sketch
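The warning print discussed above could be sketched by checking a model file's magic bytes before loading. This is a minimal illustration, not the actual implementation from the thread: the function name `check_model_format` is hypothetical, but the magic values are the ones the llama.cpp file formats use on disk (GGUF files begin with the ASCII bytes `GGUF`; legacy GGML files begin with `lmgg`, the little-endian encoding of 0x67676d6c).

```python
GGUF_MAGIC = b"GGUF"  # GGUF header magic, 0x46554747 little-endian
GGML_MAGIC = b"lmgg"  # legacy GGML magic, 0x67676d6c little-endian

def check_model_format(path):
    """Read the first four bytes of a model file and report its format.

    Hypothetical helper sketching the proposed GGML warning; returns
    "gguf", "ggml", or "unknown".
    """
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic == GGUF_MAGIC:
        return "gguf"
    if magic == GGML_MAGIC:
        # The warning the thread proposes: legacy GGML files are no
        # longer supported by newer llama.cpp releases.
        print(f"Warning: {path} looks like a legacy GGML file; "
              "consider downloading a GGUF version instead.")
        return "ggml"
    return "unknown"
```

A loader could call this before handing the path to llama.cpp and refuse (or just warn) on anything other than `"gguf"`.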