
nine_k · yesterday at 5:14 AM

Users have to pay for the compute somehow. Maybe by paying for models run in datacenters. Maybe paying for hardware that's capable enough to run models locally.


Replies

Bootvis · yesterday at 5:52 AM

I can upgrade to a bigger LLM with one click if I use it through an API. If it runs on my device, I need to buy a new phone.

lostlogin · yesterday at 6:28 AM

But also: if Apple's on-device approach works, it's incredibly wasteful.

Server-side means shared resources, shared upgrades, and shared costs. The privacy aspect matters, but at what cost?
