Do you have a source for this local stuff?
I can kind of see it: they spent a lot of time making Gemma 4 pretty efficient, then watched everyone buy Macs to run it and realized it's maybe a real moat, since Apple doesn't make any AI.
Would be an interesting product if it could actually give you GPT performance locally, but it will be an awful experience if it's essentially just cloud AI. A premium laptop where most of the features are locked behind a subscription would be wild.
They're already rolling it out in Chrome: https://www.pcmag.com/news/chrome-is-quietly-downloading-4gb... It won't work on a Chromebook with 4 GB of RAM, so they need beefier hardware.