Nope. I don't use llama-cpp either; it's a nightmare to use
I use purely ollama
Lol okay! Why do you say that? Are there any advantages of using ollama over llama_cpp?
Ollama requires basically zero config/setup. It automatically detects and uses whatever GPU you have, it downloads and manages your models for you, and it automatically formats your prompts into the LLM-specific prompt format.
With llama-cpp, all of that has to be done manually.
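As a rough illustration, here is a minimal sketch of pointing LlamaIndex at a local Ollama server, assuming the llama-index-llms-ollama package is installed, the Ollama server is running, and a model has already been pulled (the model name below is just an example):

```python
# Assumes: `pip install llama-index-llms-ollama` and a local Ollama server
# with a model already pulled, e.g. `ollama pull llama3`.
from llama_index.llms.ollama import Ollama

# Ollama itself handles GPU detection, model downloads, and prompt
# formatting; here we only point LlamaIndex at the local server.
llm = Ollama(
    model="llama3",         # example model name; use whatever you pulled
    request_timeout=120.0,  # first generation can be slow while the model loads
)

response = llm.complete("Explain what a vector index is in one sentence.")
print(response)
```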
Cool! I'll give that a try. Thanks @Logan M !!