Hacker News

supermdguy, yesterday at 2:12 AM

Interesting to see this after the recent post about Chrome’s on-device model using up 4GB of storage, which frustrated a lot of people [1].

I agree local models are great, and it’s cool that Apple has models built in now. But I feel like it basically has to be an OS-level feature, or users are going to get upset. I’d certainly rather have a small utility call out to OpenAI than download its own model.

[1]: https://news.ycombinator.com/item?id=48019219


Replies

appreciatorBus, yesterday at 1:11 PM

The way I interpret the drama over the Chrome model is that for a large chunk of users, perhaps the majority, Chrome is the OS, and this 4GB model will be their OS-level feature for local AI.