Fascinating. We hear that the leaps in AI have been made possible by orders-of-magnitude increases in compute and data availability, and of course that's substantially true, but exactly how true? It's a nice exercise in perspective to see how much, or how little, modern machine learning methods would have been capable of if you brought them by time machine to the '70s and optimized them for that environment.
I like how even the author's "modern" machine used to connect to it is itself 20 years old.
With a concave trackpoint, respect.
BTW, I nag Framework at every conference I go to that people want this shell and keyboard. It's been years. I think it's time to go through the effort of figuring out how to do a production run of the case myself. Framework actually wants people to do things like this, but you know, manufacturing is hard. Anyone wanna help?
The fact that it is possible at all says more about how simple transformers actually are underneath than it does about the hardware.
> I don't have an actual paper tape reader, so the object code is directly deposited in memory through the console.
So, really, a Turing Machine is all you need?
Woah. Dude has a running PDP-11/34 in 2026? Personally, I find that more impressive than the program.
Thanks for reposting! I'm the author of ATTN-11. Happy to answer any questions about the fixed-point arithmetic, the PDP-11 hardware, or the training process.
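For readers curious what fixed-point arithmetic looks like on a 16-bit word machine, here is a generic Q8.8 multiply sketch in C. To be clear, the Q8.8 format, the rounding choice, and the names here are my assumptions for illustration, not details taken from ATTN-11:

```c
#include <stdint.h>

typedef int16_t q8_8;          /* 8 integer bits, 8 fraction bits */
#define Q_FRAC 8
#define Q_ONE  (1 << Q_FRAC)   /* 1.0 in Q8.8 is the integer 256 */

/* Multiply two Q8.8 values: widen to 32 bits so the product doesn't
   overflow, add half an LSB to round, then shift back to Q8.8. */
static q8_8 q_mul(q8_8 a, q8_8 b) {
    int32_t wide = (int32_t)a * (int32_t)b;
    return (q8_8)((wide + (1 << (Q_FRAC - 1))) >> Q_FRAC);
}

static q8_8 q_from_double(double x) { return (q8_8)(x * Q_ONE); }
static double q_to_double(q8_8 x)   { return (double)x / Q_ONE; }
```

The widen-multiply-shift pattern maps nicely onto the PDP-11's 16x16→32 MUL instruction; the interesting design questions are where to saturate and how many fraction bits the softmax and layer norms need.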