Hacker News

Hold on to Your Hardware

511 points | by LucidLynx today at 10:10 AM | 425 comments | view on HN

Comments

barrkel today at 11:00 AM

I don't buy the central thesis of the article. We won't be in a supply crunch forever.

However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.

Most consumers are using laptops, and laptops are not keeping pace with where the frontier is in a single compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice of with your eyeballs, much like smartphones pretty much always have been.

I personally dropped $20k on a high-end desktop - 768 GB of RAM, 96 cores, a 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.

This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.

I could sell the RAM alone now for the price I paid for it.

show 22 replies
bluejay2387 today at 11:43 AM

The general take here seems to be "everything eventually passes". That isn't always true. I wonder how many people have a primary computing device that they don't even have full control over now (Apple phones, tablets...). Years ago, the concept of spending over $1k on a computer that I didn't even have the right to install my own software on was considered ridiculous by many people (myself included). Now many people primarily consume content on a device controlled almost entirely by the company they bought it from. If the economics lead to a situation where it's more profitable to sell you compute time than to sell you computers, then businesses will choose to not sell you computers. I have no idea if that is what ends up happening.

show 5 replies
rswail today at 11:58 AM

A long article begging the question, when the last paragraph or two countered the panic of the beginning. Two Chinese firms are ramping up production of consumer RAM/SSDs because they see a market opening as the existing producers move to selling to enterprise/hyperscalers.

There have been memory chip panics before; the US funded RAM production back in the '80s/'90s in competition with Japan at the time.

The current AI/"hyperscale" boom is almost exactly like the dotcom boom.

It's already starting to shake down. Anthropic is occupying the developer space, OpenAI has just exited the video/media production space. More focused and vertical market AI is emerging.

The current vortex of money between OpenAI <-> Microsoft <-> Oracle <-> Nvidia <-> Google <-> etc. etc. is going to break.

show 2 replies
BLKNSLVR today at 1:26 PM

This may not be entirely appropriate to the reasons behind the article, but it feels tangentially related:

I'd like to say a brief thank you for what the brief, golden period of globalisation was able to bring us.

I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future, but it seems the pendulum is swinging the other way for the time being.

I hope that, wherever the current direction ends up, there are lessons that can be learnt about what we had, and somehow fumbled, such that there is motivation enough to get back there.

show 3 replies
meindnoch today at 11:49 AM

I know this may sound ridiculous, but m-maybe... maybe it's time for us to make software... less bloated?

Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?

show 5 replies
lenova today at 1:48 PM

Oh man, I've come across this person's blog before and I love it, not just because of the personalization/personality they've put into the site's design, but because of all of the random CLI/TUI-based tools they've developed. Examples:

- https://xn--gckvb8fzb.com/projects/

Their github repos:

- https://github.com/mrusme

They even built a BBS-style reader client that supports Hacker News:

https://github.com/mrusme/neonmodem

I miss the days of the web being weird like this :-)

dust42 today at 11:11 AM

Just to mention one thing: helium - which is a necessity for chip production - is a byproduct of LNG production. And 20% of that is just gone (Qatar), and the question is how long it will take to get it back. So not only is there a chip shortage because of AI buying chips in huge volumes, but also because production will be hampered.

Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.

show 4 replies
saadn92 today at 1:22 PM

The article's dystopia section is dramatic, but the practical point is real. I've been self-hosting more and more over the past year, specifically because I got uncomfortable with how much of my stack depended on someone else's servers.

Running a VPS with Tailscale for private access, SQLite instead of managed databases, flat files synced with git instead of cloud storage. None of this requires expensive hardware; it just requires caring enough to set it up.

show 3 replies
upofadown today at 12:00 PM

This article inspired me to look and see what this computer is. Apparently it is an "AMD Athlon(tm) II X2 250 Processor" from 2009. So 17 years old. It has 8 GB of DDR3 memory and runs at 3 GHz. It currently has OpenBSD on it, but at least one source thinks it could run Windows 10.

The fact that I didn't know any of this is what is significant here. At some point I stopped caring about this sort of thing. It really doesn't matter any more. Don't get me wrong, I am as nerdy as they come. My first computer was a wire-wrapped 8080-based system. That was followed by an also wire-wrapped 8086-based system of my own design that I used for day-to-day computing tasks (it ran Forth). If someone like me can get to the point of not caring, there is no real reason for anyone else to care.

show 3 replies
CraigJPerry today at 10:49 AM

The article's entire thesis looks like it can be completely derailed if one thing happens: AI infrastructure firms cease to be able to secure more capital.

Is that likely? History says it's inevitable, but the timeframe is an open question.

show 3 replies
QuiEgo today at 7:42 PM

On the plus side, we've reached the end of Moore's law and are living in an amazing age of personal computing devices.

M1 Apple Silicon MacBook Airs are still good computers 5+ years after release.

Many games are still playable on (and still being released for!) the PS4, which is almost 12 years old.

The iPhone 15 Pro has 8 GB of RAM, which will likely be sufficient for a long time.

Don't get me wrong, this whole parts shortage is exceptionally annoying, but we're living in a great time to weather the storm.

xbmcuser today at 12:37 PM

In the last month, 20-30% of oil supply, 30% of gas supply, and 30-40% of fertilizer production has been destroyed, and it could take anywhere from 8 months to 5 years to come back online. Governments are acting as if everything is okay so that there is no panic, but we have crossed the point of no return: even if the war ends today, food and energy shortages are on the horizon. If you can get an EV, solar, heat pumps, battery storage, etc., get it now, as fossil-fuel-based energy prices are going to go through the roof. I see similarities to when covid hit: people kept looking at things happening in other countries and not preparing for the shit to hit their own cities and countries.

focusgroup0 today at 5:25 PM

Not to mention Age Verification / KYC being baked into every future OS and device. Buy and hodl to have a hope of independent, censorship-resistant computing in the future.

2716057 today at 12:36 PM

As long as there are consumers paying for hardware ownership, there will be businesses willing to sell it to them. The worst scenario I could imagine is that one has to pay a premium for fully-owned hardware simply because consumers' desire for it becomes an oddity and it is thus sold in low quantities.

The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.

show 1 reply
anonzzzies today at 11:15 AM

I do not see this from an infinite-shortage point of view; I see it from a locked-down-hardware point of view. Old hardware is hackable, new hardware mostly not. That is for me where the real pain is, and why I just buy old computers and phones that are rootable.

commandlinefan today at 1:56 PM

When I started programming in the early 80's, personal computing had just recently become a thing. Before that, if you wanted to learn to program, you first needed access to a very rare piece of hardware that only a select few were granted access to. But when personal computing became a reality, programming exploded - anybody could learn it with a modest investment.

I suspect we're trending back to the pre-personal computing era where access to 'raw' computing power will be hard to come by. It will become harder and harder to learn to program just because it'll be harder and harder to get your hands on the necessary equipment.

darkwater today at 12:19 PM

> For the better part of two decades, consumers lived in a golden age of tech. Memory got cheaper, storage increased in capacity and hardware got faster and absurdly affordable.

I got my first PC circa 1992 (a 2nd-hand IBM PS/2, 80286 processor with 2MB RAM and 30MB HDD) and the "golden age" was already there. We are well over 40 years into almost uninterrupted "pay less for more performance" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the earthquake affecting HDD prices a few years ago?) but demand was there and manufacturing tech became more efficient.

The actually important change is that, for most consumer uses, the perf improvements stopped making sense, what, over 10 years ago?

show 1 reply
abmmgb today at 12:33 PM

I actually think the central thesis is thought-provoking. We have shifted far away from locally installed shit to remote data centre access; this was initially driven by cloud-based initiatives and is now spiralling upwards with AI. For any researchers, hackers, or builders wanting to play with locally installed AI, hardware could become a bottleneck, especially as many machines, such as the beloved Macs, are not upgradable.

the__alchemist today at 1:13 PM

It is wild thinking how, a few years ago, I didn't buy a 4090 direct from Nvidia because "$1600 (USD) is too much to pay for a graphics card; if I need a better one, I'll upgrade in a few years." (Went with a 4080, which is substantially slower and was $1200.) Joke's on me!

It will be scarcity mindset from here on out; I'll always buy the top-tier thing.

jleyank today at 11:12 AM

Hold onto your hardware. Hold on to your existing software and the current version. Don’t upgrade without a specific need. None of the “progress” is actually helpful to hackers and I’m not sure it’s even helpful to typical users. There’s enough information being given to and slurped by others, don’t make it more effective.

show 2 replies
drillsteps5 today at 4:45 PM

I very much would like to know how much of this presumably ordered (and backordered) hardware (RAM/SSD/.../wafers) is going to end up being released back to the market when the dust settles. I haven't seen any estimates, but in order to put all this hardware to work, the hyperscalers need to be building data centers at ludicrous speed. That should be appearing in construction data, jobs data, and many other places. Are we actually seeing any of that? Or is it all just based on the back-of-the-napkin math of Mr Altman and Co, and they put all the money they got towards future projects?

adamwong246 today at 1:05 PM

I have often imagined writing a book, roughly "Fahrenheit 451 but with computers instead of books". Imagine a world where you do not buy an iPhone, one is assigned to you at birth; a world where "installing software" on "a computer you own" is not just antiquated or taboo, but unthinkable.

show 2 replies
mememememememo today at 10:31 AM

In such a future, the iPhone and Android ecosystem is dead? Because a single $1k phone is a hell of a computer. So if you can still buy a phone, you can still get a computer. Local AI aside, these are very capable.

show 3 replies
vladde today at 12:23 PM

When you click away to another tab, the title and favicon of the page change to something weird, but really legit-looking.

A couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"
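
(For the curious, the trick needs only a few lines of client-side script. A minimal sketch, assuming a visibilitychange listener and a swapped favicon link element; not the blog's actual code:)

    // Swap the tab title and favicon when the tab is hidden.
    const decoys = [
      "rust programming socks - Google",
      "Amazon.com: waifu pillow",
      "Rick Astley - Never Gonna Give You Up",
    ];

    document.addEventListener("visibilitychange", () => {
      if (!document.hidden) return;
      document.title = decoys[Math.floor(Math.random() * decoys.length)];
      // Reuse the page's favicon <link>, or create one if missing.
      let icon = document.querySelector<HTMLLinkElement>("link[rel~='icon']");
      if (!icon) {
        icon = document.createElement("link");
        icon.rel = "icon";
        document.head.appendChild(icon);
      }
      icon.href = "/decoy.ico"; // hypothetical decoy favicon path
    });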

show 2 replies
MisterTea today at 2:29 PM

To the people saying "The shortage won't last forever": yes, you might be right. However, such a supply crunch creates a perfect vacuum for rapid change to fill the hardware landscape of computing and shift the balance of power.

Think about it like this: imagine the AI/cloud/crypto companies who are buying up all these compute and storage resources realize they now control the compute hardware market, becoming compute lords. What happens when Joe/Jane Sixpack or company XYZ needs a new PC, or two thousand, but can't afford them due to the supply crunch? Once the compute lords realize they control the compute supply, they will move to rent you their compute, trapping users in a walled garden. And the users won't care, because they aren't computer enthusiasts like many of us here. They only need a tool that works. They *do not* care about the details.

The hardware lords could further this by colluding with the vendors they have exclusivity with to build proprietary, weaker terminal devices with just enough local RAM and storage to connect to a remote compute cluster. Hardware shortage solved!

All they need to do is collude with the hardware makers on circular contracts to keep buying hardware in "anticipation of the AI-driven cloud compute boom." The hardware demand cycle is kept up, and consumers are purposefully kept out of the market to push people into walled gardens.

This is unsustainable of course and will eventually fall over, but it could tie up computing resources for well over a decade as compute lords dry up the consumer hardware market, pushing people to use their hoarded compute resources instead of owning their own. We are in a period where computing serfdom is a plausible outcome, one that could do a lot of damage to freedom of use, hardware availability, and the future ability to use the internet freely.

pmdr today at 12:13 PM

I've seen comments on here before that went somewhere along the lines of "adults don't care about RAM prices." HN is no stranger to siding with the oppressors.

bob1029 today at 2:41 PM

I am still rocking that 5700XT 50th anniversary edition. I see no reason it won't make it to 2030 at this point. There was a moment where I thought it was dying, but it was a combination of dust and a shader bug in BF6 that caused the concern. I've also got a 1080ti in case of disaster.

Newer graphics hardware is pointless to me. The expensive new techniques I find incredibly offensive from an interactivity standpoint (temporal AA, nanite & friends). I run Battlefield 6 at 75% render scale with everything set to low. I really don't care how ass the game looks as long as it runs well. I much more enjoy being able to effectively dispatch my enemies than observe aesthetic clutter.

tmtvl today at 11:41 AM

I grabbed an upgrade at the end of last year because my ~10-year-old workhorse is starting to show signs of aging. Despite 16 gigs of RAM having lasted me thus far, I decided to bite the bullet and get 32; so I expect this new machine to last me another 10 years (although I now have a full SSD, whereas my old workhorse had an SSD for the OS and a hybrid drive for /home, so we'll see whether or not it will actually last).

show 1 reply
mmackh today at 11:52 AM

We are in a renaissance of computing right at this moment. If we expand our definition of computers beyond screens and traditional input devices, microcontrollers are capable of so much more, with so much less (energy consumption | RAM | storage).

The tipping point for MCUs was WiFi - which allows you not only to speak multiple protocols (UDP/Zigbee/HTTP/etc.) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment.

So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!

show 1 reply
duskdozer today at 10:59 AM

    uBlock Origin has prevented the following page from loading:

    https://xn--gckvb8fzb.com/hold-on-to-your-hardware/

    This happened because of the following filter:

    ||xn--$document
    The filter has been found in:   IDN Homograph Attack Protection - Complete Blockage

show 2 replies
tangotaylor today at 1:40 PM

I'm going to fight pessimism with cynicism here: the Department of Defense is not going to let everything move to the cloud, because they need compute at the edge for AI-enabled weapons and R&D. For example: Anduril's products, Eric Schmidt's secretive Bumblebee project, or startups like Scout AI. Communications and GPS are just too easy to jam, and their answer is giving weapons more last-mile autonomy to operate in radio silence.

War aside, I also bet there's going to be huge demand for edge compute in other kinds of robotics: self-driving cars, delivery robots, factory robots, or general-purpose humanoids (Tesla Optimus, Boston Dynamics Atlas, 1X NEO, etc). Moving that kind of compute to the cloud is too laggy and unreliable. I know researchers who've tried it; the results were mixed.

Also, the engineers working on these platforms aren't going to reinvent the wheel every time they need to connect hardware together and they're going to use interoperable standards, like PCIe for storage or GPUs, DIMM slots for memory, ATX for power, etc. So I don't see general-purpose computing dying.

arexxbifs today at 2:55 PM

It's not that I disagree with the basic premise and concern of the text, but I'm not convinced by the "RAM shortage will lead to thin clients" argument, because the thin client is going to be a browser.

Everything today is a web app. If it doesn't exist and you want to vibe code it? It's probably going to become a web app, vibed using a web app.

The problem is, web apps are stupendous memory hogs. We're even seeing Chromebooks with 8 gigs of RAM now. LLMs are all trained for and implemented in apps assuming the user can have $infinity browsers running, whether it's on their PC or on their phone. It's going to be very hard to change that in a way that's beneficial to what passes for business models at AI companies.

Ah, the paradoxes of modern software.

show 1 reply
n2j3 today at 3:27 PM

I wish I were well versed in dialectic/Hegelian thought, as I am sure there's a way of seeing this as a step towards the abolition of private property altogether. The question is who owns the means of production (computation), I suppose.

Kiboneu today at 2:40 PM

The other side of this is that we can still make software more efficient, and make better use of the old hardware than we had ever thought possible.

I’m doing more with a decade old GPU, which was manufactured before “Attention is all you need“, than I could 5 years ago, when quantization techniques were implemented.

I’m holding on to my 32 bit machines.

Most linux distributions dropped support for them (for good reason). But at the end of the day these machines are a fabric of up to ~ 4 billion bytes that can be used in a myriad of ways, and we only covered a fraction of the state space before we had moved on.

shusaku today at 11:07 AM

> These days, the biggest customers are not gamers, creators, PC builders or even crypto miners anymore. Today, it's hyperscalers. […]

> These buyers don't care if RAM costs 20% more and neither do they wait for Black Friday deals. Instead, they sign contracts measured in exabytes and billions of dollars.

Does all this not apply to businesses buying computers for their employees?

kelvinjps10 today at 4:13 PM

I was about to upgrade, because I'm using a ThinkPad T480, but I decided to optimize my computer instead. I run i3, and a couple of native apps and Chromium web apps run fast enough. And some kernel and other tweaks + gamemode on Arch make gaming better.

I must admit that my workflow isn't that heavy.

pcblues today at 1:01 PM

I think what many people don't realise is that there will be a glut of cheap computer parts including CPUs, GPU cards, and memory when the AI and AI-adjacent businesses go bust and a bunch of data centres get pulled down.

show 1 reply
politelemon today at 2:03 PM

> If you need a new device, buy it;

I would specifically add: whatever you have, or whatever you choose to buy, it would greatly benefit you to ensure a degree of Linux compatibility, so that its lifespan can be extended further than the greed enthusiasts at MS, Apple, and Google would like. They will be facing the same declines in purchasing habits and are further incentivised to assert their ownership over what you might mistakenly consider your devices.

G_o_D today at 2:12 PM

> Memory got cheaper, storage increased in capacity.

In my country, the only USB HDD available for purchase in a physical store is the 4TB Seagate variant. That's 15,000 in our currency, almost 1.5 months' salary in the private sector.

Anything larger has to be imported, and with forex applied, prices go up to four zeroes. When I read people on YouTube or blogs saying they rotate 15TB and higher on their NAS RAIDs, that seems like a dream for us, never to be fulfilled.

pjmlp today at 12:57 PM

I have been holding on to my hardware for decades; some of my private hardware traces back to 2009.

Phones and tablets only get replaced when they die.

Why should I throw away stuff that still works as intended?

redbell today at 3:08 PM

In a matter totally unrelated to the subject: I found the linked website's name very strange! Visiting the website, I can see in the address bar that the name is in Chinese or Japanese!! This is the first occurrence of this kind I've witnessed.
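
(It's punycode, the ASCII encoding used for internationalised domain names. If you're curious, Node's built-in url module can show the Unicode form:)

    // Decode the punycode (IDNA) hostname to the Unicode form the browser shows.
    import { domainToUnicode } from "node:url";

    console.log(domainToUnicode("xn--gckvb8fzb.com"));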

show 1 reply
vjerancrnjak today at 10:55 AM

haha, all of a sudden I see a tab "waifu pillow" on Amazon, and think I have a split personality that runs searches in between consciousness shifts, and then I come back to a funny message.

show 1 reply
solomonb today at 4:18 PM

I'm not saying we are in one, but isn't a RAM shortage like this exactly what one would expect at the early stages of a takeoff scenario?

cirelli94 today at 2:05 PM

Okay, but what about the icon and tab name changing, and the pop-up about disabling JavaScript?!

show 1 reply
rolandhvar today at 12:42 PM

So what happens when the datacenters need to upgrade (new hardware, or stupid enterprisey reasons like "must be new when replacing broken stuff")? Surely there remains a secondary market for the enthusiasts?

altcognito today at 1:14 PM

Part of this is that memory companies recognize that nobody is going to enforce antitrust law for the foreseeable future, so collusion to raise prices is the norm now.

Velocifyer today at 5:08 PM

Why doesn't Hacker News render punycode in domains?

usrbinbash today at 11:24 AM

As the old saying goes: "This too will pass."

Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product.

If the existing companies are unwilling to make a sale, I am sure new players will arise picking up their slack.

https://www.youtube.com/watch?v=SrX0jPAdSxU

evanwolf today at 6:06 PM

so… Hold on to Your Medical Devices. (And everything else with a chip in it.)

View 31 more comments