
barrkel today at 11:00 AM · 22 replies

I don't buy the central thesis of the article. We won't be in a supply crunch forever.

However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.

Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.

I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.

This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.
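The plumbing for that is minimal (the hostname here is just whatever you've named the workstation in your tailnet):

```shell
# Run once on both the laptop and the workstation to join the same tailnet
tailscale up

# From the laptop, reach the workstation by its tailnet hostname
# ("workstation" is an example name)
ssh me@workstation
# or, with Tailscale SSH enabled on the workstation:
tailscale ssh workstation
```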

I could sell the RAM alone now for the price I paid for it.


Replies

raincole today at 11:45 AM

We won't be in a supply crunch forever. We'll have a demand crunch. Demand for powerful consumer hardware will shrink so much that producing it loses its economies of scale. It was always bound to happen; it was just delayed by the trend of pursuing realistic graphics in games.

People who are willing to drop $20k on a computer might not be affected much tho.

(7 replies)
kace91 today at 11:27 AM

The thing is, other than AI stuff, where does an underpowered computer limit you?

My phone has 16 gigs of RAM and a terabyte of storage; laptops today are ridiculous compared to anything I studied with.

I'm not arguing, mind you, just trying to understand the use cases people are thinking of here.

(9 replies)
NoSalt today at 2:10 PM

> "I personally dropped $20k on a high end desktop"

This absolutely boggles my mind. Do you mind if I ask what type of computing you do in order to justify this purchase to yourself?

(2 replies)
guessmyname today at 11:13 AM

> I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked […]

768GB of RAM is insane…

Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.
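As a sanity check, the rent-vs-buy break-even is easy to estimate. A sketch in Python, where every number is a made-up assumption rather than a real quote:

```python
# Does buying a ~$10k machine beat renting comparable cloud compute?
local_cost = 10_000       # assumed one-off purchase price (USD)
cloud_rate = 2.50         # assumed hourly rate for a comparable cloud instance (USD)
hours_per_week = 20       # assumed workload

weekly_cloud_cost = cloud_rate * hours_per_week
break_even_weeks = local_cost / weekly_cloud_cost
print(f"Break-even after {break_even_weeks:.0f} weeks (~{break_even_weeks / 52:.1f} years)")
```

At those (made-up) rates the machine only pays for itself after nearly four years, which is roughly its useful life anyway.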

(6 replies)
Aurornis today at 1:31 PM

> Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node.

How can you say this when Apple is releasing extremely fast M5 MacBook Pros? Or the $600 MacBook Neo that has incredible performance for that price point?

Even x86 is getting some interesting options. The Strix Halo platform has become so popular with LLM users that the parts are selling in high numbers for little desktop systems.

(2 replies)
TheRoque today at 1:54 PM

We are on borrowed time. Most of the world runs on oil, and that resource is not unlimited at all. A lot of countries have gone past their production peak, meaning it's only downhill from here. Everything is gonna get more costly, more expensive; our lavish "democratic" lifestyles are only possible because we have (had) this amazing, freely available resource, and without it things are gonna change. Even at a geopolitical scale you can see this pretty obviously: countries that talked about free markets and free exchange are now starting to close the doors and play individually. Anyway, my point is, we are in for decades, if not a century, of slow decline.

(1 reply)
rkagerer today at 8:30 PM

Care to share some more detailed specs?

clusterhacks today at 4:07 PM

"I personally dropped $20k on a high end desktop . . . "

This is where I think current hackers should be headed. I grew up with lots of family who were backyard mechanics, wrenching on cars and motorcycles. Their investment in tools made my occasional PC purchase look extremely affordable. Based on what I read, senior mechanics often have five-figure US dollar investments in tools. Of course, I guess high quality torque wrenches probably outlast current GPU chips? I'd hate to be stuck making a $10K investment every 24 months on a new GPU . . .

I have been renting GPU resources and running open weight models, but recently my preferred provider simply doesn't have hardware available. I'm now kicking myself a little for not simply making a big purchase last fall when prices were better.

nayuki today at 5:06 PM

> Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.

It really feels like we're slowly marching back to the era of mainframe computers and dumb terminals. Maybe the democratization of hardware was a temporary aberration.

picture today at 11:11 AM

It seems like you largely agree with the article - people shall own nothing and be happy. Perhaps the artificially induced supply crunch could go on indefinitely.

Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly, I wouldn't even know what to do with all the RAM - should I just ramdisk every program I use and every digital thing I've made in the last five years?
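(For what it's worth, a ramdisk on Linux is a single mount command; the size here is arbitrary:)

```shell
# Create a tmpfs-backed ramdisk; contents are lost on reboot
sudo mkdir -p /mnt/ramdisk
sudo mount -t tmpfs -o size=64G tmpfs /mnt/ramdisk
```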

Anyhow, I suppose for the folks who can't afford hardware (perhaps by design), one ought to own nothing and be happy.

(3 replies)
motbus3 today at 12:40 PM

Superficially speaking, you could be right. But I think people have realised that engineering scarcity of products and commodities is a power move.

We live in a world optimised for globalization: industry in China, oil in the Middle East, etc.

This approach has proved fragile in the hands of people with enough money and/or power to tilt the scale.

(1 reply)
randusername today at 4:37 PM

This will be me. Bestowing upon my descendants a collection of Mighty Beanz, a few unkillable appliances, and the best consumer computing hardware the early 2020s could buy.

And I fear they will be equally confused and annoyed by disposing of all of them.

efficax today at 7:50 PM

I thought I was crazy with a $7k Threadripper w/ 128 GB of RAM.

kgeist today at 11:59 AM

>we're at an inflection point where DC hardware is diverging rapidly from consumer compute.

I thought the trend was in the opposite direction, with the RTX 5x series converging with server architectures (Blackwell-based, such as the RTX 6000 Pro+). They just have less VRAM and fewer tensor cores, artificially limited.

Where is the divergence happening? Or you don't view RTX 5x as consumer hardware?

(1 reply)
doctorpangloss today at 4:37 PM

You're responding to an LLM authored article that doesn't know anything. "Let that sink in for a moment."

porkeynon today at 1:28 PM

$20k?

People laugh at young men for looksmaxxing. And then there’s this. I dunno. As someone who has been playing computer games since the 70s, I clearly do not understand the culture anymore. But what forces would drive a young man to spend the price of a used car to play a derivative FPS? It seems heartbreaking. Just like the looksmaxxer.

(2 replies)
wat10000 today at 5:27 PM

I think you're probably right, but I'm not so confident the supply crunch will end.

Tech feels increasingly fragile with more and more consolidation. We have a huge chunk of advanced chip manufacturing situated on a tiny island off the coast of a rising superpower that hates that island being independent. Fabs in general are so expensive that you need a huge market to justify building one. That market is there, for now. But it doesn't seem like there's much redundancy. If there's an economic shock, like, I dunno, 20% of the world's oil supply suddenly being blockaded, I worry that could tip things into a death spiral instead.

dmitrygr today at 2:24 PM

> Laptops are increasingly just clients for someone else's compute

Are you kidding? Apple's mobile chips are now delivering perf that AMD and Intel desktop chips never could.

(1 reply)
danaris today at 2:04 PM

> Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.

What are you talking about?

My laptops are, and always have been, primarily places where I do local computing. I write code there, I watch movies there, I listen to music there, I play games there...all with local storage, local compute, and local control (though I do also store a bunch of my movies on a personal media server, housed in my TV stand, because it can hold a lot more). My smartphone is similar.

If you think that the vast majority of the work most people do on their personal computers is moving to LLMs, or cloud gaming, then I think you are operating in a pretty serious bubble. 99.9% of all work that most people do is still best done locally: word processing, spreadsheets, email, writing code, etc. Even in the cases where the application is hosted online (like Google Docs/Sheets), the compute is still primarily local.

The closest to what you're describing that I think makes any sense is the proliferation of streaming media—but again, while they store the vast libraries of content for us, the decoding is done locally, after the content has reached our devices.

It doesn't matter if a cutting-edge AI-optimized server can perform 10, 100, or 1000 times better than my laptop at any particular task: if the speed at which my laptop performs it is faster than I, as a human, can keep up (whatever that means for the particular task), then there's no reason not to do the task locally.

shevy-java today at 11:50 AM

> We won't be in a supply crunch forever.

I don't share the article's opinion 1:1, but it is absolutely clear that RAM prices have gone up enormously. Just compare them; that is a fact.

It may be cheaper later on, but ... when will that happen? Is there a guarantee? A supply crunch can also mean that fewer people can afford something because prices are now much higher than before. Add to this the oil crisis Trump started, and we are now suddenly having to pay more just because a few mafiosi benefit from it. (See Krugman's analysis of the recent stock market flow of money/stocks.)

(2 replies)

echelon today at 11:39 AM

Local is a dead end.

Open source efforts need to give up on local AI and embrace cloud compute.

We need to stop building toy models to run on RTX and instead try to compete with the hyperscalers. We need open weights models that are big and run on H200s. Those are the class of models that will be able to compete.
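For a sense of scale, here's back-of-envelope math on why that's a different hardware class. The parameter count, quantization, and overhead factor below are illustrative assumptions; the 141 GB figure is the H200's actual HBM capacity:

```python
# Rough VRAM estimate for serving a large open-weights model.
params_billion = 405     # assumed model size, in billions of parameters
bytes_per_param = 1      # assumed 8-bit quantization
overhead = 1.2           # ~20% extra for KV cache and activations

vram_gb = params_billion * bytes_per_param * overhead
h200_vram_gb = 141       # H200 ships with 141 GB of HBM3e

gpus_needed = -(-vram_gb // h200_vram_gb)  # ceiling division
print(f"~{vram_gb:.0f} GB of VRAM -> about {gpus_needed:.0f} H200s")
```

No consumer card comes close to that; a 3090 has 24 GB.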

When the hyperscalers reach takeoff, we're done for. If we can stay within ~6 months, we might be able to slow them down or even break them.

If there was something 80-90% as good as Opus or Seedance or Nano Banana, more of the ecosystem would switch to open source because it offers control and sovereignty. But we don't have that right now.

If we had really competitive open weights models, universities, research teams, other labs, and other companies would be able to collaboratively contribute to the effort.

Everyone in the open source world is trying to shrink these models to fit on their 3090 instead, though, and that's such a wasted effort. It's short term thinking.

An "OpenRunPod/OpenOpenRouter" + one click deploy of models just as good as Gemini will win over LMStudio and ComfyUI trying to hack a solution on your own Nvidia gaming card.

That's such a tiny segment of the market, and the tools are all horrible to use anyway. It's like we learned nothing from "The Year of Linux on Desktop 1999". Only when we realized the data center was our friend did we frame our open source effort appropriately.

(3 replies)