I wonder if EWD would have had the same opinion if he were alive today, with every Unicode font having the APL characters immediately available on the screen.
Did he feel the language design itself was bad, or would TTF fonts able to show "rho", "iota", and "grade up" have removed one or more of his objections?
APL suffers from the same apparent problem as Perl: friction from an unconventional syntax that's hard to understand without knowing the language beforehand. When faced with competition, people took the path of least resistance.
* Out of all people, and especially in the newer generations, it is increasingly uncommon to find someone with a desktop or even a laptop.
* Out of them, very few decide to do anything with it besides checking mail, social media, and the web, or playing games.
* Out of them, very few decide to learn a programming language.
* Out of them, very few decide to learn anything besides JavaScript or maybe Python.
* Out of them, very few decide to learn anything besides Java/C#/C++, learn algorithms, or learn tools like Vim or Emacs.
* Out of them, very few decide to learn anything besides Rust/Go/Haskell/Lisp/Scheme or even Fortran.
* Out of them, very few decide to learn a language with an alien, symbolic notation that resembles a code golfing language, and which may also require them to learn a completely new keyboard layout to type with proficiency.
Not trying to discredit APL's contributions to functional programming and the like, but from the letter it is pretty obvious Dijkstra had little regard for friction. Not saying that he's right to dismiss it outright, though.
It has been a long time since I used APL (in college)
We had APL terminals which had APL keys and would print APL characters. It was significantly more immersive that way.
Looking at this letter, I start to vaguely recall things.
Decades later, I recall the output operator, not shown anywhere here:

⎕←<something>

which would print whatever <something> was. (Or am I misremembering?)
I do recall using matrix operations in a similar way to the math classes I was taking at the same time. Matrix multiplication, inversion, and dot products seemed more "math oriented" than in other computer languages.
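For contrast, here is how far those same operations sit from the mathematical notation when written out by hand in a general-purpose language (a plain-Python sketch; the helper names are made up for illustration):

```python
# 2x2 matrix operations spelled out explicitly -- what a single
# symbol expresses in APL becomes loops and formulas here.

def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(A):
    """Invert a 2x2 matrix via the closed-form adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det],
            [-c / det, a / det]]

def dot(u, v):
    """Dot product of two vectors."""
    return sum(x * y for x, y in zip(u, v))

A = [[2.0, 1.0],
     [1.0, 3.0]]
print(matmul(A, inv2(A)))   # the identity matrix, up to rounding
print(dot([1, 2], [3, 4]))  # 11
```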
In other computer languages, you had to adapt to the language. For example:
x = x + 1
y = mx + b
In these two statements, one only makes sense in math class; the other only makes sense as incrementing a variable in a computer language.

One can appreciate striving for simplicity (a programming language that can be taught and explained with pen and paper), but one must also consider that computers are meta-devices.
Before computers, we could write things only on paper, either with our hands or a typewriter. So, naturally, when computers came about, the way of thinking about programming was very text-driven, with an emphasis on what a typewriter could represent.
But then, code could be written directly with computers, opening up more typesetting possibilities thanks to keyboards not being bound anymore by the mechanical limitations of typewriters. You could add keys and combinations to your heart's desire, and they would be natively digital and unlimited.
Now, with graphics, both 2D and 3D, and a myriad of other HIDs, shouldn't we try to make another cognitive jump?
Reminds me of when Russ Cox (Go, Google, Bell Labs) used Rob Pike's APL-like language Ivy to solve the 2021 Advent of Code puzzles:
https://www.reddit.com/r/apljk/comments/uccbd6/russ_cox_solv...
I wrote a lot of APL for my undergraduate Senior Project in 1978/1979.
I really enjoyed it because it was fun. You could do an incredible amount of work in a single line of code.
The only problem was that the line would then be almost impossible to read and understand! It could easily be a "write-only" language even without a separate obfuscation step.
When I became a professional programmer right after college, I never used it again, and learned to write code that was readable above all else.
As if medieval math notation was not weird enough, people decided to invent APL to be even more bizarre. As a proud Perl5 dev, I totally don't buy it. Neither do I buy into Raku's brave use of all possible Unicode symbols. Perhaps I'm ageing.
https://news.ycombinator.com/item?id=16246544 - 27 Jan 2018, 26 comments.
APL was the first language to have operators for "do this to all that stuff". They were headed for functional programming. But the syntax was too weird.
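Those "do this to all that stuff" operators correspond fairly directly to what later languages call reduce and elementwise (mapped) operations; for instance, APL's +/v (sum-reduce a vector) and 2×v (double every element) look roughly like this in Python (a loose analogy, not a literal translation):

```python
from functools import reduce
import operator

v = [3, 1, 4, 1, 5]

# APL  +/v : fold the whole vector with addition
total = reduce(operator.add, v)

# APL  2×v : apply an operation elementwise across the array
doubled = [2 * x for x in v]

print(total)    # 14
print(doubled)  # [6, 2, 8, 2, 10]
```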
Ironically, I think the examples given in the post validate Dijkstra’s points, instead of disproving them, as the author intended.
Dijkstra's go-to language (pun intended) was Algol 60 (& Pascal) – everything else was shit in his view. Some of his comments:
FORTRAN — "an infantile disorder"
COBOL — "the use of COBOL cripples the mind"
BASIC — students exposed to it are "mentally mutilated beyond hope of regeneration"
PL/I — "the fatal disease"
APL — "a mistake, carried through to perfection"
He liked his languages and programs to be easily traceable with pen and paper. He always wrote programs on paper (and proved their correctness) before entering them into a computer. REPL-driven development (which APL pioneered) was a foreign concept to him. He would be so appalled by LLM code generation.
> Your writings made me wonder in which discipline you got your doctor’s degree.
Is this a horrible sideswipe, or do people think it was intended frankly?
The opening paragraphs, about how people enamoured of a shiny gadget will overlook a terrible interface, immediately bring to mind modern-day LLMs.
APL is the only language I've ever dreamt about writing (as in: I could see the characters); I'd dreamt about programming in the past, but those dreams were usually what I would categorize as nightmares: desperately trying to fix a bug that I couldn't figure out.
Due to my affinity for the language, and my wish to have worked in its heyday (would love to have an APL gig someday), I have been exposed to various writings and recordings of Ken Iverson. I've also been exposed to a few of Dijkstra's thoughts on APL.
I have to say that Iverson generally comes across as a very generous and curious individual while Dijkstra seems to have been a miserable ass. Maybe, given the lens, I've not given Dijkstra a proper chance to demonstrate a more positive attitude, so I'm open to any suggestions of writings where he doesn't seem like such a grump.