Hacker News

imoverclocked, today at 3:29 AM (3 replies)

It's pretty great that, despite having large data centers capable of this kind of computation, Apple continues to make things work locally. There is a lot of value in being able to hold the entirety of a product in your hand.


Replies

xnx, today at 4:15 AM

Google has a family of local models too! https://ai.google.dev/gemma/docs

coliveira, today at 5:06 AM

It's very convenient for Apple to do this: lower spending on costly AI chips, and more reasons to push customers to buy their latest hardware.

v5v3, today at 11:21 AM

With no company holding a clear lead in everyday AI for the non-technical mainstream user, there is only going to be a race to the bottom on subscription and API pricing.

Running locally costs the company nothing, and it raises the minimum hardware customers need to buy.