I sometimes wonder what the alternate reality where semiconductor advances ended in the eighties would look like.
We might have had to manage with just a few MB of RAM and efficient ARM cores running at maybe 30 MHz or so. Would we still get web browsers? How about the rest of the digital transformation?
One thing I do know for sure: LLMs would have been impossible.
We had web browsers, kinda, in that we'd call up BBSes and use ANSI for menus and such.
My VIC-20 could do this, and a C64 easily; really, it was just graphics that were wanting.
I was sending electronic messages around the world via FidoNet and PunterNet, downloading software, and posting on forums, all on BBSes.
When I think of the web of old, it's the actual information I love.
And a terminal connected to a bbs could be thought of as a text browser, really.
I even connected to CompuServe in the early 80s via my C64 through "datapac", a dial gateway via telnet.
ANSI was a standard too; it could have evolved further.
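The ANSI being discussed here is the escape-sequence standard (ANSI X3.64 / ECMA-48) that terminals still honor today. A minimal sketch of how a BBS menu drove colors and cursor position with it (the helper names are mine, just for illustration):

```python
# ECMA-48 / ANSI X3.64 escape sequences, as used for BBS menus:
#   ESC[2J  clears the screen, ESC[H homes the cursor,
#   ESC[1;<n>m sets a bold foreground color, ESC[0m resets attributes.
ESC = "\x1b"

def clear_screen() -> str:
    """Clear the terminal and move the cursor to the top-left corner."""
    return f"{ESC}[2J{ESC}[H"

def colored(text: str, fg: int) -> str:
    """Wrap text in a bold foreground color, then reset attributes."""
    return f"{ESC}[1;{fg}m{text}{ESC}[0m"

# A tiny BBS-style menu: cyan title, yellow hotkeys, CRLF line endings.
menu = (
    clear_screen()
    + colored("== Main Menu ==", 36) + "\r\n"
    + colored("[M]", 33) + "essages  "
    + colored("[F]", 33) + "iles\r\n"
)
print(menu)
```

The same handful of codes covered color, cursor movement, and screen clearing, which is why a plain terminal plus ANSI was enough for full-screen menus and crude "pages".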
Teletext existed in the 80s and was widely in use, so we'd have some kind of information network.
BBSes existed at the same time and if you were into BBSes you were obsessive about it.
I remember using the web on 25 MHz computers. It ran about as fast as it does today with a couple GHz. Our internet was a lot slower then as well.
Apart from transputers mentioned already, there’s https://greenarrays.com/home/documents/g144apps.php
Both the hardware and the Forth software.
B2B-style APIs would likely be much more prevalent, with less advertising (yay!) and less money on the internet, so it'd be more like the original internet, I guess.
GUIs like https://en.wikipedia.org/wiki/SymbOS
and https://en.wikipedia.org/wiki/Newton_OS
show that we could have had quality desktops and mobile devices.
I always think the Core 2 Duo was the inflexion point for me. Before that, current software always seemed to struggle on current hardware, but after it things were generally fine.
As much as I like my Apple Silicon Mac I could do everything I need to on 2008 hardware.
I don't think there's really a credible alternate reality where Moore's law just stops like that when it was in full swing.
The ones that "could have happened" IMO are the transistor never being invented, or even mechanical computers becoming much more popular much earlier (there's a book about this alternate reality, The Difference Engine).
I don't think transistors being invented was that certain to happen, we could've got better vacuum tubes, or maybe something else.
This is basically the premise of the Fallout universe. I think in that story it was the transistor that was never invented, though.
We did have web browsers; I had Internet Explorer on Windows 3.1, on a 33 MHz machine with 8 MB of RAM.
There are web browsers for 8-bits today, and there were web browsers back in the day for e.g. Amigas with the 68000 CPU, which dates to 1979.
And imagine if telecom had topped out around ISDN somewhere, with perhaps OC-3 (155Mbps) for the bleeding-fastest network core links.
We'd probably get MP3 but not video to any great or compelling degree. Mostly-text web, perhaps more gopher-like. Client-side stuff would have to be very compact, I wonder if NAPLPS would've taken off.
Screen reader software would probably love that timeline.
> One thing I do know for sure. LLMs would have been impossible.
Maybe they could, as ASICs in some laboratories :)
tbh we'd probably just have really good Forth programmers instead of LLMs. same vibe, fewer parameters.
Actually, real AI isn't going to be possible unless we return to this architecture. Contemporary stacks are wasting 80% of their energy, which we now need for AI. Graphics and video are not a key or necessary part of most computing workflows.
> Would we still get web browsers?
Yes, just that they would not run millions of lines of JavaScript for some social media tracking algorithm, newsletter signup, GDPR popup, newsletter popup, ad popup, etc. and you'd probably just be presented with the text only and at best a relevant static image or two. The web would be a place to get long-form information, sort of a massive e-book, not a battleground of corporations clamoring for 5 seconds of attention to make $0.05 off each of 500 million people's doom scrolling while on the toilet.
Web browsers existed back then; the web in the days of NCSA Mosaic was basically exactly the above.
For me the interesting alternate reality is one where CPUs got stuck in the 200-400 MHz range for speed but somehow continued to become more efficient.
It’s kind of the ideal combination in some ways. It’s fast enough to competently run a nice desktop GUI, but not so fast that you can get overly fancy with it. Eventually you’d end up with OSes that look like highly refined versions of System 7.6/Mac OS 8 or Windows 2000, which sounds lovely.