Anurag Agrawal
7 months ago
7 comments
Logan M
7 months ago
Nope. I don't use llama-cpp either; it's a nightmare to use.
Logan M
7 months ago
I use purely Ollama.
Anurag Agrawal
7 months ago
Lol okay! Why do you say that? Are there any advantages of using Ollama over llama_cpp?
Logan M
7 months ago
Ollama requires like zero config/setup. It automatically installs to use the GPU you have, it downloads and manages your models, and it automatically formats your prompts into the LLM-specific format.
Logan M
7 months ago
It's beautiful.
Logan M
7 months ago
llama-cpp requires all of that to be done manually.
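To make the point about manual prompt formatting concrete: with a raw llama-cpp setup you typically have to assemble the model-specific chat template yourself before every call, whereas Ollama ships the template with each model and applies it for you. A minimal sketch, using the Llama 2 chat format purely as an illustrative example (the exact template varies by model family):

```python
def format_llama2_chat(system: str, user: str) -> str:
    """Hand-build a Llama 2-style chat prompt, as a raw llama-cpp
    workflow often requires. Template shown is an example only."""
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = format_llama2_chat(
    "You are a helpful assistant.",
    "Why is the sky blue?",
)
print(prompt)
```

With Ollama, none of this boilerplate is needed: something like `ollama run llama3 "Why is the sky blue?"` (model name is just an example) formats the prompt with the model's own template automatically.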
Anurag Agrawal
7 months ago
Cool! I'll give that a try. Thanks @Logan M !!