> Computers have been running thousands of times slower than they should be for decades
I've been hearing this complaint for decades and I'll never understand it. The suggestion seems completely at odds with my own experience. Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.
I remember a time when I could visibly watch the screen repaint after minimizing a window, waited 3 minutes for the OS to boot, or waited 30 minutes to install a 600 MB video game from local media. My M2 Air with 16 GB of memory only has to reboot for updates; I haphazardly open 100 browser tabs, run Spotify, Slack, and an IDE, build whatever project I'm working on, and the machine occasionally gets warm. Everything works fine, and I never have performance issues. My Linux machines, gaming PC, and phone feel just as snappy. It feels to me that we are living in a golden age of computer performance.
My M4 Max 128GB ... 90% of the time is like you say.
10% of the time, WindowServer takes off and burns 150% CPU. Or I develop keystroke lag. Or I can't get a terminal open because Time Machine has the backup volume in a half-mounted state.
It's thousands of times faster than the Ultra 1 that was once on my desk, and I can certainly run workloads that fundamentally take thousands of times more cycles. But I usually spend a greater proportion of this machine's speed on the UI, and its responsiveness doesn't always beat what I had 30 years ago.
Ok. Today we have multi-Ghz processors, with multiple cores at that.
Photons travel about 1 foot per nanosecond ... so the CPU can execute MANY instructions between the time photons leave your screen and the time they reach your eyes.
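To put rough numbers on that, here's a back-of-the-envelope sketch. Every figure (viewing distance, clock, IPC, core count) is a ballpark assumption, not a measurement:

```python
# How many instructions a CPU retires while light travels from the
# screen to your eyes. All numbers are rough, illustrative assumptions.

SPEED_OF_LIGHT_FT_PER_NS = 0.98   # light covers roughly 1 foot per nanosecond
viewing_distance_ft = 2.0         # typical distance from a monitor (assumption)
clock_ghz = 3.0                   # 3 GHz = 3 cycles per nanosecond
ipc = 4.0                         # instructions per cycle on a modern core (assumption)
cores = 8

flight_time_ns = viewing_distance_ft / SPEED_OF_LIGHT_FT_PER_NS
instructions = flight_time_ns * clock_ghz * ipc * cores

print(f"photon flight time: {flight_time_ns:.1f} ns")
print(f"instructions retired meanwhile: {instructions:.0f}")
```

With these assumed numbers it comes out to a couple hundred instructions in the ~2 ns the light is in flight, which is the point: the hardware is absurdly fast on any human timescale.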
Now, on Windows start Word (on a Mac start Writer) ... come on ... I'll wait.
Still with me? Don't blame the SSD: quit it and load it again, this time from the cache.
Weep.
>Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.
One analogy is that the distance between two places in the world hasn't changed, but we're not arriving significantly faster than we did once modern jetliners were invented. There was a burst of new technology, followed by rapid incremental progress toward shorter travel times, until it leveled off.
However, the number of people able to consistently travel between more places in the world has continued to increase. New airports open regularly, and airliners have been optimized to fit more people, at the cost of passenger comfort.
Similarly, computers, operating systems, and their software aren't aligned in optimizing for user experience. Until a certain point, user interactions on macOS took highest priority, which is why a single- or dual-core Mac felt more responsive than today's machines do, despite the capabilities and total work capacity of new Macs being orders of magnitude higher.
So we're not really even asking for the equivalent of faster jet planes, here, just wistfully remembering when we didn't need to arrive hours early to wait in lines and have to undress to get through security. Eventually all of us who remember the old era will be gone, and the next people will yearn for something that has changed from the experiences they shared.
> Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.
This very much depends on what hardware you have and what you're doing on it (how much spare capacity you have).
Back in university I had a Techbite Zin 2, it had a Celeron N3350 and 4 GB of LPDDR4. It was affordable for me as a student (while I also had a PC in the dorm) and the keyboard was great and it worked out nicely for note taking and some web browsing when visiting parents in the countryside.
At the same time, the OS made a world of difference, and it was anything but fast. Windows was pretty much unusable, and it was the kind of hardware where you started to wonder whether you really needed XFCE or whether LXDE would be enough.
I think both of the statements can be true: that Wirth's law holds and computers run way, way slower than they should due to bad software... and that normally you don't really feel it, because we throw so much hardware at the problem that we can ignore it.
It's largely the same as with modern video game graphics and engines like UE5, where only now are we seeing horrible performance across the board that mainstream hardware often can't make up for, so devs reach for upscaling and frame generation as something they demand you use (e.g. Borderlands 4), instead of just something for mobile gaming.
It's also like running ESLint and Prettier on your project and having a full build-and-format iteration take around 2 minutes without cache (faster with cache); HOWEVER, then you install Oxlint and Oxfmt and are surprised to find that the whole codebase takes SECONDS. Maybe the "rewrite it in Rust" folks had a point. Bad code in Rust and similar languages will still run badly, but a fast runtime will make good code fly.
I could also probably compare the old Skype against modern Teams, or probably any split between the pre-Electron and modern day world.
Note: runtime in the loose sense, e.g. compiled native executables, vs the kind that also has GC, vs something like the JVM and .NET, vs other interpreters like Python and Ruby, and so on. Idk what you'd call it more precisely, execution model?
> Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.
The modern throughput is faster by far. However, what some people mean when they talk about "slower" is the snappy latency that characterized early microcomputer systems. That has definitely gotten way worse, in an empirically measurable fashion.
Dan Luu's article explains this very well [1].
It is difficult to go through that lived experience of low latency today, because you don't appreciate it until you've lived with it for years. Few people have access to an Apple ][ rig with a composite monitor for years on end any longer. The hackers who experienced that low latency never forgot it, because the responsiveness feels like a fluid extension of your thoughts in a way higher-latency systems cannot match.
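You can't measure true keystroke-to-photon latency in software alone (Dan Luu measured it with a high-speed camera), but here's a crude software-side proxy: ask the OS for a 1 ms sleep and record how late it actually wakes you, since your input handlers pay the same scheduler tax. This is only a sketch of the idea, not his methodology:

```python
import time

# Crude proxy for one component of input latency: request a short sleep
# and measure the overshoot, i.e. how much later than asked the OS
# scheduler actually resumes us.

REQUESTED_MS = 1.0
samples = []
for _ in range(200):
    start = time.perf_counter()
    time.sleep(REQUESTED_MS / 1000)
    overshoot_ms = (time.perf_counter() - start) * 1000 - REQUESTED_MS
    samples.append(overshoot_ms)

samples.sort()
print(f"median overshoot: {samples[len(samples) // 2]:.3f} ms")
print(f"worst overshoot:  {samples[-1]:.3f} ms")
```

On a loaded machine the worst-case overshoot can be dramatically higher than the median, which is exactly the tail latency you feel as occasional keystroke lag.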
I think the best example is in iOS. On old iOS versions, the keyboard's responsiveness took precedence over everything, no matter what. If you touched the keyboard, it would respond with an animation indicating what you were doing. The app itself might be frozen, but the self-contained keyboard process would carry on, letting you know the app you were using was a buggy mess.
Now in iOS 26, you can just be typing in Notes, or in the Safari address bar, and the keyboard will randomly lag behind and freeze, likely because it is waiting on some autocomplete task running on the keyboard process itself. And this is on top-of-the-line modern hardware.
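The old design can be sketched in miniature: keep input handling on its own thread (standing in for iOS's separate keyboard process) so keystrokes get acknowledged immediately even while the app is stuck. Names here are illustrative, not any real iOS API:

```python
import queue
import threading
import time

# Input handling on a dedicated thread: it keeps acknowledging
# keystrokes even while the "app" thread is busy or hung.

events = queue.Queue()
acks = []  # (key, timestamp) recorded as each keystroke is acknowledged

def input_thread():
    # Acknowledge each keystroke the moment it arrives.
    while True:
        key = events.get()
        if key is None:  # shutdown sentinel
            break
        acks.append((key, time.perf_counter()))

def busy_app_thread():
    # Stand-in for an app frozen in a long-running task.
    time.sleep(0.5)

t_input = threading.Thread(target=input_thread)
t_app = threading.Thread(target=busy_app_thread)
t_input.start()
t_app.start()

start = time.perf_counter()
for key in "hi":
    events.put(key)        # "keystrokes" arrive while the app is hung
time.sleep(0.05)           # give the input thread a moment to drain the queue
events.put(None)
t_input.join()
t_app.join()

worst_lag = max(t - start for _, t in acks)
print(f"keystrokes acknowledged within {worst_lag * 1000:.1f} ms despite a busy app")
```

The point of the isolation is visible in the numbers: the keystrokes are acknowledged in milliseconds even though the app thread stays blocked for half a second. Collapse the two onto one thread and the keystroke latency becomes the app's latency.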
A lot of the fundamentals that developers focused on in the past to ensure responsiveness to user input, fundamentals that should never have been lost, have been lost. And lost for no good reason, other than lazy development practices, unnecessary abstraction layers, and other modern developer conveniences.