Anurag Agrawal
9 months ago
7 comments
Logan M
9 months ago
Nope. I don't use llama-cpp either, it's a nightmare to use
Logan M
9 months ago
I use purely ollama
Anurag Agrawal
9 months ago
Lol okay! Why do you say that? Are there any advantages of using ollama over llama_cpp?
Logan M
9 months ago
ollama requires like zero config/setup. It automatically installs to use the GPU you have, it downloads and manages your models, and it automatically formats your prompts into the LLM-specific format
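[Editor's note: for anyone skimming this thread, the zero-setup workflow described above boils down to a couple of CLI commands. This is a sketch; `llama3` is just an example model name, and the install script is the one published on ollama.com at the time of writing.]

```shell
# Install ollama (macOS/Linux one-liner; see https://ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model -- ollama fetches and manages the weights for you
ollama pull llama3

# Chat -- ollama applies the model's own prompt template automatically,
# no manual formatting or GPU configuration required
ollama run llama3 "Why is the sky blue?"
```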
Logan M
9 months ago
its beautiful
Logan M
9 months ago
llama-cpp requires all of that to be manually done
Anurag Agrawal
9 months ago
Cool! I'll give that a try. Thanks @Logan M !!