Interesting to see this after the recent post about Chrome’s on-device model using up 4GB of storage, which frustrated a lot of people [1].
I agree local models are great, and it’s cool that Apple has models built in now. But I feel like it basically has to be an OS-level feature, or users are going to get upset. I’d certainly rather have a small utility call out to OpenAI than download its own model.
The way I interpret the drama over the Chrome model is that for a large chunk of users, perhaps the majority, Chrome effectively is the OS, and this 4GB model will be their OS-level feature for local AI.