Hacker News

Apple's accidental moat: How the "AI Loser" may end up winning

377 points by walterbell | today at 2:53 AM | 337 comments | view on HN

Comments

Traster | today at 2:36 PM

People can correct me if I'm wrong, but I think the core logic behind OpenAI's valuation was essentially that AI would work like search. Google had the best search engine; it became a centre of gravity that sucked everything in, and suddenly network effects meant it was the centre of the universe. There seem to be two big problems with that, though. The first is that for search, queries are both demand for the product and a way of making the product better. The second is that Google was genuinely the best product for a very long time.

Maybe point (1) was unclear at some point, but I think it's mostly clear today that it's not happening. Training the model is largely distinct from inference.

Point (2) is really funny - because sure, at some point OpenAI was the best, and then Sam Altman blew the place up and spawned a whole host of competitors who could replicate and eventually surpass OpenAI's state of the art.

It now looks like AI is a death march. You must spend billions of dollars to have the best model or you won't be able to sell inference. But even if you do, a whole host of better-funded competitors are going to beat you within months, so your inference charges had better pay off extremely quickly. And when the gap between models starts to close, distribution becomes king, and OpenAI can't compete in that field either.

Google can do that. Meta can do that. MSFT probably can do that. Amazon can do that. OpenAI cannot. They do not have the cash to do it.

amazingamazing | today at 4:10 AM

Gemma4, in my view, is good enough to do things similar to Gemini 2.5 Flash: if I point it at code and ask for help, and there is a problem with the code, it'll answer correctly in terms of suggestions. But it's not great at using all the tools, or at one-shotting things that require a lot of context or "expert knowledge".

If, a couple more iterations from now, say Gemma6 is as good as the current Opus and runs completely locally on a Mac, I won't really bother with the cloud models.

That’s a problem.

For the others anyway.

grtteee | today at 3:30 AM

This is the classic Apple approach: wait to understand what the thing is capable of doing (i.e., let others make the sunk investments), envision a solution that is way better than the competition, and then architect a path to a leapfrog product that builds a large lead.

hapticmonkey | today at 3:55 AM

Apple aren't in the business of building chatbots to impress investors (other than some WWDC 2024 vaporware they'd rather not talk about any more). They're in the business of consumer hardware.

Consumers want iPhones and (if Apple are right) some form of AR glasses in the next decade. That’s their focus. There’s a huge amount of machine learning and inference that’s required to get those to work. But it’s under the hood and computed locally. Hence their chips. I don’t see what Apple have to gain by building a competitor to what OpenAI has to offer.

an0malous | today at 12:32 PM

The best part is that it’ll all run on your device, instead of siphoning off your data to the provider. Local first AI.

I think the creatives will also turn their seething hatred of AI around for Apple AI, because it uses more ethical training data and it feels more like they own their AI: no one is charging them a subscription fee to use it and then using their private data for training.

pjmlp | today at 6:26 AM

What I don't get about Apple: when everyone else was giving up on yet another VR attempt and moving into AI, they decided AI wasn't worth it and that it was the right time for a me-too VR headset.

So: no VR, given the price and lack of developer support, and a late arrival to AI.

pram | today at 3:55 AM

I've had Apple Intelligence turned off since Sequoia, and this I truly appreciate: it hasn't nagged me once to turn it or Siri on, and it isn't mandatory.

In comparison, when I open up JIRA or Slack I am always greeted with multiple new dialogues pointing at some new AI bullshit. We hates it, precious.

int32_64 | today at 4:15 AM

Nvidia restricts gamer cards in data centers through licensing; if they feel too threatened by Apple, they will probably release a cheaper consumer AI card, one that can't be used in data centers, to corner the local AI market.

Imagine a future where Nvidia sells the exact same product at completely different prices: cheap for those running local models, expensive for those deploying proprietary models in data centers.

Ifkaluva | today at 3:51 PM

I’m confused why he keeps calling out “the Mac Mini craze after claw went viral”. I thought the various versions of claw used remote models, not local models, and I thought the point of using a Mac mini was that it can send and receive iMessages, not anything about the hardware.

andsoitis | today at 7:49 AM

Using the author’s logic, it is Google then that will lead.

Unlike Apple, they have even more devices in the field PLUS they have strong models PLUS Apple uses Google models.

schnitzelstoat | today at 1:23 PM

Using Siri recently, it really struck me how much worse it feels after using ChatGPT. It struggles to correctly understand what I say, and you have to give commands in more of a 'computer-friendly' form.

I hope they can at least fix this, as I really only use it as a hands-free system while driving.

jayd16 | today at 4:45 AM

My capex is even less than Apple's, I can ship to users' Apple hardware, and I can't access iPhone users' photos either... so really I'm the winner.

harrouet | today at 8:07 AM

Thing is, Apple never considered racing the LLM vendors. Apple's success comes from human-centered design; it is not trying to launch a me-too product just because it would increase the stock price. The iPod was not the first MP3 player. The iPhone was not even 3G at launch, in the middle of the 3G marketing craze.

They sure got lucky that unified memory is well suited to running AI, but they simply focused on having cost- and energy-efficient computing power. They've had glasses in their sights for the last 10 years (when was Magic Leap's first product?), and these chips have been developed with that in mind. And not only the chips: nothing was forcing Apple to spend the extra money on blazing-fast SSDs, but they did.

So yes, Apple is a hardware company. All the services it sells run on their hardware. They've just designed their hardware to support their users' workflows, ignoring distractions.

With that said, LLMs make GPU plus memory bandwidth fun again. Nvidia can't do it alone, Intel can't do it alone, but Apple positioned itself for it. It reminds me of how everyone was surprised when they introduced 64-bit ARM for everyone: very few people understood what they were doing.

To be honest, there are Nvidia GPUs that beat Apple's performance by 2x or 3x, but those are desktop or server chips consuming 10x the power. Now all Apple needs to do is keep delivering Apple Silicon performance at good prices and best-in-class energy efficiency. Local LLMs make sense when you need them immediately, anywhere, privately; hence you need energy efficiency.

bawana | today at 12:27 PM

Any field with abstraction becomes susceptible to AI disruption; in fact, AI susceptibility is proportional to the amount of abstraction. In this sense, the more abstraction, the more AI will displace people (my observation). This turns the millennia-old model upside down: traditionally, more abstraction required more schooling and experience and was rewarded with more money. Until robots and world models become safe, affordable, and ubiquitous, the financial apex of careers will be those that are abstraction-resistant (technicians, EMTs, trades, etc.) and those protected by regulation and the regulators (politicians, CEOs).

ianbooker | today at 12:14 PM

Why is Nvidia so central to LLMs? Because they embraced ML a decade ago. Apple did as well; machine learning is central to so many things in the iPhone. It's not so surprising, then, that a strong showing in ML sets you up well for LLMs.

pdhborges | today at 8:13 AM

Apple's accidental moat now is to let the AI-driven rise in hardware prices eat into their margins and just expand the Mac user base.

-1 | today at 5:11 AM

Maybe they thought an investment in a product with lots of substitutes & high capital requirements wasn't very attractive.

sky2224 | today at 6:07 AM

Honestly, I think Apple hasn't jumped deep into AI for two big reasons:

1) Apple is not a data company.

2) Apple hasn't found a compelling, intuitive, and, most of all, consistent user experience for AI yet.

Regarding point 2: I haven't seen anyone share a hands-down improved UX for a user-driven product beyond some variation of a chatbot. Even the main AI players can't advertise anything more than "have AI plan your vacation".

mring33621 | today at 4:57 PM

The moat is that they saved their money and can remain in business indefinitely!

46493168 | today at 3:59 AM

Apple is almost two years out from their announcement of Apple Intelligence, and it has barely delivered on any of the hype. The new Siri was delayed and barely mentioned at the last WWDC, and none of the features have been released in China.

In other news, people keep buying iPhones, and Apple just had its best quarter ever in China. AAPL is up 24% from last year.

nielsbot | today at 5:51 AM

> I am actually of the opinion that without some kind of bailout, OpenAI could be bankrupt in the next 18-24 months, but I am horrible at predictions

I find this intriguing. Does anyone here have enough insight to speculate further?

nottorp | today at 9:18 AM

> Think about the App Store. Apple didn’t build the apps, they built the platform where apps ran best, and the ecosystem followed.

As far as I remember, Apple was basically forced into opening the platform to third-party developers, not by regulation but by public pressure. It wasn't their initial intention to allow it.

10keane | today at 6:05 AM

There are always three elements in the equation of a business model: 1. marginal cost, 2. marginal revenue, 3. value created.

For LLM providers, I've always believed the key is to focus on high-value problems such as coding or knowledge work, because of the high marginal cost of each new customer (the tokens burnt) and the low marginal revenue if the problem is not valuable enough. In this sense, no LLM provider can scale like the earlier social media platforms without taking huge losses. No meaningful user stickiness can be built unless you have users' data, and there is no meaningful business model unless people are willing to pay a high price for the problem you solve, in the same way as paying for a SaaS.

I'm really not optimistic about the LLM providers other than Anthropic. It seems the rest are just burning money, and for what? There is no clear path to monetization.

And when local LLMs are powerful enough, they will soon be made obsolete by the cost and the unsustainable business model. At the end of the day, I do agree that it is the consumer hardware providers that can win this game.
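The marginal-cost argument above can be sketched with a toy calculation. Every number below is hypothetical, chosen only to illustrate the shape of the problem, not taken from any provider's actual pricing:

```python
# Toy model of flat-fee LLM economics; all figures are made up for illustration.
def monthly_margin(subscription: float, tokens_used: int,
                   cost_per_million_tokens: float) -> float:
    """Provider's margin on one user for one month."""
    inference_cost = tokens_used / 1_000_000 * cost_per_million_tokens
    return subscription - inference_cost

# A light chat-style user vs. a heavy coding user on the same $20 flat fee:
light = monthly_margin(20.0, 2_000_000, 3.0)    # 20 - 6   = 14.0  (profit)
heavy = monthly_margin(20.0, 50_000_000, 3.0)   # 20 - 150 = -130.0 (loss)
print(light, heavy)
```

Unlike a social network, where the marginal user is nearly free to serve, here the heaviest users are the most expensive to serve, which is why flat-fee scaling bleeds money unless the problem solved commands SaaS-like prices.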

bigyabai | today at 3:33 AM

I just realized that next year Apple's Neural Engine will be 10 years old, and so will the "NPUs will change AI forever!" puff pieces.

Here's to another 10 years of scuffed Metal Compute Shaders, I guess.

javchz | today at 3:53 AM

What I think was a wasted opportunity was not bringing the Xserve back; it could have been one of the few end-to-end solutions out there at scale.

oliver236 | today at 10:17 AM

The whole premise is that if you don't get to AGI first, you lose. The idea is that Anthropic, with AGI, could build a better version of Apple, or whatever it wants.

This was the conversation like 1 year ago. What has changed?

jbverschoor | today at 6:08 AM

So Apple's AI acceleration and memory architecture are accidental, but Nvidia's are not?

m3kw9 | today at 4:19 PM

Looks like Apple fell into a winning position in the AI wars. Their privacy- and safety-first culture is why they didn't embrace AI as aggressively as their more maverick rivals. Their AI was always hindered by privacy, and local-first AI is their saviour.

asdev | today at 4:34 AM

Apple is just waiting for all the slop to inevitably crash to see what actually works

amelius | today at 8:40 AM

In the larger scheme of things, the great winner will be open source, as we'll simply use AI to recreate the entire MacOS ecosystem :)

f_allwein | today at 12:15 PM

maybe “The Only Way to Win is Not to Play”

sublinear | today at 4:31 AM

> Pure strategy, luck, or a bit of both? I keep going back and forth on this, honestly, and I still don’t know if this was Apple’s strategy all along, or they didn’t feel in the position to make a bet and are just flowing as the events unfold maximising their optionality.

Maximizing the available options is in fact a "strategy", and often a winning one when it comes to technology. I would love to be reminded of a list of tech innovators who were first and still the best.

Anyway, hasn't this always been Apple's strategy?

gambutin | today at 6:13 AM

That's actually by design. Apple never jumps on the tech hype bandwagon.

They wait until the dust settles before making their well-thought-out moves.

Every time they've jumped on the hype train too quickly it hasn't worked out (Siri, for example).

rickdeckard | today at 7:56 AM

I think the article is missing a whole aspect of how Apple is making sure it faces no actual competition while "playing it safe":

Even if the investment is overblown, there is market demand for the services the AI industry offers. On a competitive playing field with equal opportunities, Apple would suffer for not participating. But they are once again establishing their digital-market concept, in which they prevent a level playing field for Apple users.

Just as they did with the App Store (where Apple owns the marketplace but also competes in it), they are setting themselves up as the "the bank always wins" gatekeeper for AI services in the Apple ecosystem, by making "Apple Intelligence" the ecosystem's orchestration layer.

1. They made a deal with OpenAI to close Apple's competitive gap on consumer AI, allowing users to upgrade to paid ChatGPT subscriptions from within the iOS menu. OpenAI has to pay at least (!) the usual revenue share for this, but considering that Apple integrated them directly into iOS, I'm sure OpenAI has to pay MORE than that (also supported by the fact that OpenAI doesn't allow users to upgrade to the $200 Pro tier via this path, only the $20 Plus tier) [1]

2. Apple's integration is set up to collect data from this AI digital market they created: Their legal text for the initial release with OpenAI already states that all requests sent to ChatGPT are first evaluated by "Apple Intelligence & Siri" and "your request is analyzed to determine whether ChatGPT might have useful results" [2]. This architecture requires(!) them to not only collect and analyze data about the type of requests, but also gives them first-right-to-refuse for all tasks.

3. Developers are "encouraged" to integrate Apple Intelligence right into their apps [3]. This will have AI tasks evaluated by Apple first.

4. Apple has confirmed that they are interested to enable other AI-providers using the same path [4]

--> Apple will be the gatekeeper to decide whether they can fulfill a task by themselves or offer the user to hand it off to a 3rd party service provider.

--> Apple will be in control of the "Neural Engine" on the device, and I expect them to use it to run inference models they created based on statistics of step#2 above

--> I expect that AI orchestration, including training those models and distributing/maintaining them on devices, will be a significant part of Apple's AI strategy. This could cover a lot of text and image processing and already significantly reduce their datacenter cost for cloud-based AI services. For the remaining, more compute-intensive AI services, they will be able to closely monitor (via step #2 above) when it becomes most economic to in-source a service instead of "just" taking a revenue share for it (via step #1 above).

So the juggernaut Apple is making sure to get the reward from those taking the risk. I don't see the US doing much about this anti-competitive practice so far, but at least in the EU this strategy has been identified and is being scrutinized.

[1] https://help.openai.com/en/articles/7905739-chatgpt-ios-app-...

[2] https://www.apple.com/legal/privacy/data/en/chatgpt-extensio...

[3] https://developer.apple.com/apple-intelligence/

[4] https://9to5mac.com/2024/06/10/craig-federighi-says-apple-ho...

boxed | today at 6:40 AM

It's the same everywhere: great fundamentals pay off. It's true of martial arts, of dance, and absolutely of software platforms. You just have to trust the process and invest in it, which Apple does (although frustratingly not enough!).

nl | today at 4:01 AM

> Then Stargate Texas was cancelled, OpenAI and Oracle couldn’t agree terms, and the demand that had justified Micron’s entire strategic pivot simply vanished. Micron’s stock crashed.

Well, no. The Stargate expansion was cancelled, but the originally planned 1.2 GW (!) datacenter is going ahead:

> The main site is located in Abilene, Texas, where an initial expansion phase with a capacity of 1.2 GW is being built on a campus spanning over 1,000 acres (approximately 400 hectares). Construction costs for this phase amount to around $15 billion. While two buildings have already been completed and put into operation, work is underway on further construction phases, the so-called Longhorn and Hamby sections. Satellite data confirms active construction activity, and completion of the last planned building is projected to take until 2029.

> The Stargate story, however, is also a story of fading ambitions. In March 2026, Bloomberg reported that Oracle and OpenAI had abandoned their original expansion plans for the Abilene campus. Instead of expanding to 2 GW, they would stick with the planned 1.2 GW for this location. OpenAI stated that it preferred to build the additional capacity at other locations. Microsoft then took over the planning of two additional AI factory buildings in the immediate vicinity of the OpenAI campus, which the data center provider Crusoe will build for Microsoft. This effectively creates two adjacent AI megacampus locations in Abilene, sharing an industrial infrastructure. The original partnership dynamics between OpenAI and SoftBank proved problematic: media reports described disagreements over site selection and energy sources as points of contention.

https://xpert.digital/en/digitale-ruestungsspirale/

> Micron’s stock crashed. [the link included an image of dropping to $320]

Micron’s stock is back to $420 today

> One analysis found a Max-plan subscriber consuming $27,000 worth of compute with their $200 Max subscription.

Actually, no. They'd miscalculated; the subscriber had consumed $2,700 worth of tokens.

The same place that checked that claim also points out:

> In fact, Anthropic’s own data suggests the average Claude Code developer uses about $6 per day in API-equivalent compute.

https://www.financialexpress.com/life/technology-why-is-clau...
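A quick back-of-the-envelope check, using only the figures quoted in this comment (treat them as illustrative, not authoritative):

```python
# Sanity-check the corrected Claude Code numbers quoted above.
avg_daily_compute = 6.0       # USD/day, the reported average per Claude Code dev
days_per_month = 30
max_subscription = 200.0      # USD/month for the Max tier

avg_monthly_compute = avg_daily_compute * days_per_month
print(avg_monthly_compute)    # 180.0 -- just under the $200 subscription price

# The viral "$27,000" figure was a 10x miscalculation of $2,700:
print(27_000 / 2_700)         # 10.0
```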

I like Apple's chips, but why do we put up with crappy analysis like this?

ajross | today at 4:38 AM

This seems mistaken to me. The core idea is that LLMs are commoditizing and that the UI (Siri in this case) is what users will stick with.

But... what's the argument that the bulk of "AI value" in the coming decade is going to be... Siri Queries?! That seems ridiculous on its face.

You don't code with Siri, you don't coordinate automated workforces with Siri, you don't use Siri to replace your customer service department, you don't use Siri to build your documentation collation system. You don't implement your auto-kill weaponry system in Siri. And Siri isn't going to be the face of SkyNet and the death of human society.

Siri is what you use to get your iPhone to do random stuff. And it's great. But ... the world is a whole lot bigger than that.

rvz | today at 4:09 AM

Apple never competed in the "AI race" in the first place, because they knew they were already at the finish line.

This was really unsurprising [0].

[0] https://news.ycombinator.com/item?id=40278371

hansmayer | today at 8:11 AM

For the love of all that's holy, folks: please stop using AI to publish smart-sounding texts. While you may think you are "polishing" your text, you are just disrespecting your readers. Write in your own words.

livinglist | today at 3:49 AM

But why do I feel like the quality of Apple's software has declined sharply in recent years? The Liquid Glass design feels unpolished and not well thought out almost everywhere... it seems even Apple can't resist falling victim to AI slop.

Kevin_VAI | today at 10:38 AM

[dead]

gayboy | today at 3:55 PM

[flagged]

worthless-trash | today at 3:36 AM

Don't worry: when Apple introduces it, it'll be revolutionary and 10% thinner.

microslop2026 | today at 4:32 AM

I like how we are acting like this market is so novel and emergent, revering the luck of some while lamenting the failures of others, when it was all "roadmapped" a decade ago. It's like watching a Shaanxi shadow-puppet show with artificial folklore about the origins of the industry. I hate reality television!

dzonga | today at 12:29 PM

One day people will realize that Tim Cook is one of the great killer CEOs.

By now he has more hits than Steve Jobs. His precision and his ability to manage risk, maybe thanks to his supply-chain background, have made Apple into the killer it is today.

If we were in the age of robber barons, he would've been up there with them.