Hacker News

cientifico today at 6:57 AM

For most users who wanted to run LLMs locally, Ollama solved the UX problem.

One command, and you are running models, even with the ROCm drivers pulled in without you having to know about them.

If llama.cpp provides that UX, they failed terribly at communicating it, starting with the name. Llama.cpp: that sounds like a C++ library! Ollama is the wrapper. That's the mental model. I don't want to build my own program! I just want to have fun :-P


Replies

anakaine today at 7:10 AM

Llama.cpp now ships with a GUI by default. It previously lacked one. Times have changed.

omgitspavel today at 9:28 AM

Agreed. We can easily compare it with Docker. Of course people can use runc directly, but most choose not to and use `docker run` instead.

And you could blame Docker in a similar manner: LXC existed for at least five years before Docker, but Docker was just much more convenient for the average user.

UX is a huge factor for adoption of technology. If a project fails at creating the right interface, there is nothing wrong with creating a wrapper.
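The Docker/runc comparison above can be made concrete. As a sketch (image and container names illustrative; this assumes a running Docker daemon and the skopeo/runc tools, and elides the unpack step, so it is not meant to be run verbatim):

```shell
# High-level UX: one command pulls the image and runs it.
docker run --rm alpine echo hello

# Roughly the same outcome via the lower-level stack:
# fetch the image, then hand an OCI bundle to runc directly.
skopeo copy docker://alpine:latest oci:alpine-oci
# ...unpack the rootfs and write a config.json into a bundle dir...
runc run mycontainer
```

The point being: both paths end in the same container, but only one of them fits in a single memorable command.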

samus today at 8:38 AM

How about kobold.cpp then? Or LMStudio (I know it's not open source, but at least they give proper credit to llama.cpp)?

Re curation: they should strive not to integrate broken support for models, and avoid uploading broken GGUFs.

ekianjo today at 8:41 AM

> For most users that wanted to run LLM locally, ollama solved the UX problem

This does not absolve them of the license violation.

amelius today at 8:00 AM

Whip that llama! Oh wait, that's a different program.

croes today at 9:06 AM

But if you're just a GUI wrapper, then at least attribute the library you created the GUI for.

well_ackshually today at 8:01 AM

> solved the UX problem.

> One command

Notwithstanding the fact that there's about zero difference between `ollama run model-name` and `llama-cli -hf model-name`, and that running things in the terminal is already a gigantic UX blocker (Ollama's popularity comes from the fact that it has a GUI), why are you putting the blame back on an open-source project that owes you approximately zero communication?
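For comparison, here are the two invocations side by side, as a sketch (the model names are illustrative, and both tools download the model on first run, so this needs a network connection and the respective binaries installed):

```shell
# Ollama: pulls the model from Ollama's registry, then drops into a chat.
ollama run llama3.2

# llama.cpp: -hf pulls a GGUF from a Hugging Face repo, then drops into a chat.
llama-cli -hf ggml-org/gemma-3-1b-it-GGUF
```

In both cases it's one command from nothing to a working chat prompt; the difference is packaging and defaults, not capability.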

FrozenSynapse today at 7:30 AM

But if Ollama is much slower, that cuts into your fun, and you'll have more fun with a faster GUI.
