I like Ada. I can’t believe this whole discussion about how types are handled missed the entire ML family of languages. ML, Standard ML, Concurrent ML, Caml, OCaml, and more have structural types, supported and enforced by the compiler.
Ada shares a primary problem with PL/I, PHP, and Perl: as much as one might like it, it's a huge language with loads of syntax and semantics baked into the core. The article keeps saying that's a selling point. To some extent, and to some people, that's true. However, it also touts the annexes as something wonderful. That's also true, and more true in my opinion. If only more of the language had been in standardized annexes around a smaller core, it might have seen far more adoption.
I find multiple "strange" flaws in the article, even granting my appreciation of Ada _and_ of the article as an essay:
* The article claims only Ada has true separation of implementation from specification (the interface), but as far as I can reason, e.g. JavaScript is also perfectly able to define "private" elements (not exported by an ES6 module) while still using them inside the module that declares them -- if this isn't "syntactical" (and semantic) separation of the kind ascribed to Ada, what difference is the article trying to point out?
* Similarly, Java is mentioned, where `private` apparently (according to the article) makes the declaration "visible to inheritance, to reflection, and to the compiler itself when it checks subclass compatibility" -- much of which is false if I remember my Java correctly: a private declaration is _not_ visible to inheritance, and consequently the compiler can largely ignore it when checking a subclass, which makes the "compatibility" a guarantee by much the same consequence
I am still reading the article, but the points above detract from my taking it as seriously as I set out to -- wanting to identify value in Ada that we "may have missed", a view the article very much wants to front.
I like the article overall but the continually repeated 'Language X didn't have that until <YEAR>' is very grating after the first ten or so.
I also wish there were concrete code examples. Show me what you are talking about rather than just telling me how great it is. Put in some side-by-side comparisons!
The Twitter account is from April 2026:
https://xcancel.com/Iqiipi_Essays
There is no named public author. Truly amazing productivity for such a short time period, and, generously, the author does not take any credit.
> Every language that has added sum types in the past twenty years has added, with its own syntax, what Ada's designers put in the original standard.
While true, that doesn't mean that other languages' sum types originated in Ada. As [1] states,
> NPL and Hope are notable for being the first languages with call-by-pattern evaluation and algebraic data types
and a modern language like Haskell has origins in Hope (from 1980) through Miranda.
[1] https://en.wikipedia.org/wiki/Hope_(programming_language)
I really don't want this to be AI writing because I enjoyed it, but as other commenters have pointed out, the rate of publishing (according to the linked Twitter account) is very rapid. I'm worried that I can't tell.
From the main page of this website:
"These are not positions. They are proposals — structures through which a subject might be examined rather than verdicts about it."
The entire site is AI written.
The US Air Force intended to use Ada, but had to use JOVIAL instead because Ada took so long to develop. Most people have never heard of JOVIAL, but it still lives on in the USAF as a legacy language.
I worked with JOVIAL on my first project as a programmer in 1981, even though we didn't yet have a full JOVIAL compiler on site (it existed elsewhere). I remember all the talk about the future being Ada, but at the time it was only an incomplete specification.
My work on DoD Ada projects tended to focus on DOD-STD-2167 (mid to late 1980s).
Sadly, the review meetings focused on document structure instead of thoughtful software design and analysis. Ada didn't help; it was cumbersome to get working well, and Ada experience in the contracting agencies was low. The waterfall approach made the projects slow to implement.
Ada is underrated. I am spending lots of my time writing tons of open source software in Ada, mostly for myself, though.
The article states, quoting:
"JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers."
Then:
"in Ada, the implementation of a private type is not merely inaccessible, it is syntactically absent from the client's view of the world."
Am I missing something? A JavaScript module is perfectly able to declare a private element by simply not exporting it, accomplishing what the author ascribes to Ada as "is not merely inaccessible, it is syntactically absent from the client's view of the world". The same would go for some of the other languages the author somewhat carelessly lumps together with JavaScript.
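As a minimal sketch of what I mean (file layout and names invented for illustration): a module that exposes only operations, while its representation never leaves the module.

```typescript
// counter.ts -- hypothetical module; nothing about the representation is exported.
type CounterState = { value: number };      // not exported: an importer cannot name this type
const state: CounterState = { value: 0 };   // module-private simply by not being exported

// The only surface an importer can see or reference.
export function increment(): number {
  state.value += 1;
  return state.value;
}

export function current(): number {
  return state.value;
}
```

An importer can call `increment()` but cannot name, construct, or inspect a `CounterState`; whether that amounts to the article's "syntactically absent" is exactly my question.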
I loved the article, and I have always been curious about Ada -- more than about some of the modern languages, in fact -- but I just don't see where Ada separates interface from implementation in a manner that's distinctly better than, or different from, e.g. JavaScript modules.
> The verbosity was deliberate — Ichbiah wanted programs to be readable by people other than their authors, and readability over time favours explicitness — but it was experienced as bureaucratic and un-hacker-like, and the programming culture that formed in the 1980s and 1990s was organised around the proposition that conciseness was sophistication. Ada was the language of procurement officers. C was the language of people who understood machines. The cultural verdict was delivered early and never substantially revisited.
IMO, this was the telling paragraph.
I remember learning Ada at uni in the 90s and not loving it, because of the syntax and because it was slow to work with. I also remember the Ariane 5 rocket crash in 1996 being blamed on a software bug, and the software being written in Ada. Now I understand that it was not purely a software issue, but still, all that safety did not prevent the major disaster that it was.
I've written a few small projects in Ada, and it's a better language than it gets credit for.
Yes, it's verbose. I like verbosity; it forces clarity. Once you adjust, the code becomes easier to read, not harder. You spend less time guessing intent and more time verifying it. Or you verify it, ignore what you verified, then go back and remind yourself you're an idiot when you realize the code you ignored was right. That might just be me.
In small, purpose-built applications, it's been pleasant to code with. The type system is strict but doesn't yell at you a lot. The language encourages you to be explicit about what the program is actually doing, especially when you're working close to the hardware, which is a nice feature.
It has quirks, like anything else. But most of them feel like the cost of writing better, safer code.
Ada doesn't try to be clever. It tries to be clear, even if it is as clear as mud.
imo, the real value of Ada/SPARK today is that it enforces a clear split between specification and implementation, which is exactly what your LLM needs.
You define the interface, types, and pre/post conditions you want in the .ads file, then let the agent loose writing the .adb body file. The language's focus on readability means your agent has no problem reading and cross-referencing specs. The compiler and proof tools verify that the body implements the spec.
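The shape of that split can be sketched in other languages too. Here is a loose TypeScript analogy (names invented, and without the pre/post conditions that SPARK can actually prove): the "spec" is a checked contract and the "body" is whatever satisfies it.

```typescript
// "Spec": the contract, loosely analogous to an .ads file.
interface Stack {
  push(item: number): void;
  pop(): number | undefined;
  depth(): number;
}

// "Body": an implementation checked against the contract, loosely analogous
// to an .adb file; tsc rejects it if it drifts from the interface.
function makeStack(): Stack {
  const items: number[] = [];  // representation hidden behind the interface
  return {
    push: (item) => { items.push(item); },
    pop: () => items.pop(),
    depth: () => items.length,
  };
}
```

The compiler only checks shapes here, not behaviour; the point of Ada/SPARK is that the proof tools go further and check the body against the stated conditions.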
I absolutely love this article.
When I was a young lad, must have been 20, I came across some programming books, including one on programming in Ada.
I read so much of it but never wrote a line of code in it, despite trying. Couldn't get the build environment to work.
But the idea of contracts in that way seemed so logical. I didn't understand the distinction this article draws, though. I learned Java and thought interfaces were the same thing.
Great article, great language.
Not going with C/C++/Rust style brace syntax probably cost them more devs than anything else...
Well, that and the proprietary compilers
Ada is a language that had a lot of useful features much earlier than any of the languages that are popular today, and some of those features are still missing from the languages easily available today.
In the beginning, Ada was criticized mainly for two reasons: it was claimed to be too complex, and it was considered too verbose.
Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada. In many cases this is because they started as simpler languages to which extra features were added later; since the need for those features had not been anticipated during the initial design, adding them afterwards was difficult and increased the complexity of the updated language.
The criticism about verbosity is correct, but it could easily have been addressed by preserving the abstract Ada syntax and simply replacing many tokens with less verbose symbols. This can easily be done with a source preprocessor, but it is avoided in most places, because the source programs then have a non-standard appearance.
It would have been good if the Ada standard had been updated to specify a standardized abbreviated syntax besides the classic syntax. This would not have been unusual, because several old languages have specified abbreviated and non-abbreviated syntactic alternatives, including languages like IBM PL/I or ALGOL 68. Even the language C had a more verbose syntactic alternative (with trigraphs), which has almost never been used, but nonetheless all C compilers had to support both the standard syntax and its trigraph alternative.
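As a toy sketch of such a preprocessor (the abbreviation table is invented, not taken from any real proposal; a real tool would need a proper lexer so it does not rewrite tokens inside strings and comments):

```typescript
// Toy "abbreviator": maps a few verbose Ada-like keywords to shorter symbols.
// Purely illustrative; operating on raw text like this is too naive for real use.
const abbreviations: Array<[RegExp, string]> = [
  [/\bprocedure\b/g, "proc"],
  [/\bbegin\b/g, "{"],
  [/\bend;/g, "};"],
];

export function abbreviate(source: string): string {
  // Apply each rewrite rule in order over the whole source text.
  return abbreviations.reduce(
    (text, [pattern, replacement]) => text.replace(pattern, replacement),
    source,
  );
}
```

The reverse mapping would recover the standard syntax, which is what made the abbreviated/non-abbreviated pairs in PL/I and ALGOL 68 workable: both forms denote the same abstract syntax.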
However, the real defect of Ada has been neither complexity nor verbosity, but expensive compilers and software tools, which have ensured its replacement by the free C/C++.
The so-called complexity of Ada has always been mitigated by the fact that, besides the reference manual, Ada has always had an accompanying design rationale document, which explained the reasons for the choices made when designing the language.
Such a rationale document would have been extremely useful for many other programming languages, which frequently include some obscure features whose purpose is not obvious, or which look like mistakes, even if sometimes there are good reasons for their existence.
When Ada was introduced, it was marketed as a language similar to Pascal. The reason is that at that time Pascal had become the language most frequently used for teaching programming in universities.
Fortunately the resemblances between Ada and Pascal are only superficial. In reality the Ada syntax and semantics are much more similar to earlier languages like ALGOL 68 and Xerox Mesa, which were languages far superior to Pascal.
The parent article mentions that Ada includes the handling of concurrent tasks in the language specification itself, instead of delegating such things to a system library (task = the term used by IBM since 1964 for what is now normally called a "thread", a term first used in 1966 in some Multics documents and popularized much later by the Mach operating system).
However, I do not believe that this is a valuable feature of Ada. You can indeed build any concurrent application around the Ada mechanism of task rendezvous, but I think that this concept is a little too high-level.
It combines two lower-level actions, and for the highest efficiency it may sometimes be necessary to access those lower-level actions directly. This means that using a system library for communication between concurrent threads may sometimes provide higher performance than the built-in Ada concurrency primitives.
but why does "the industry ignored it" hold as the central framing when the actual story seems to be "the DoD mandated it, contractors used it, and it worked fine for exactly what it was built for"? the implicit assumption is that widespread adoption is the metric for a language succeeding, but ada wasn't trying to win over web developers, it was trying to stop missiles from being maintained in 450 incompatible dialects, which... it actually did?
It'd be a neat trick to have a single unified language which could bridge the gap between software and hardware description languages.
>Ada's deployment domain meant that Ada's successes were invisible. A software project that compiles without error, runs without race conditions, and has been formally verified to satisfy its specification does not generate incident reports or post-mortems or conference talks about what went wrong. Ada's successes — the aircraft that have not crashed, the railway signalling systems that have not failed, the missile guidance software that has not misguided — are invisible precisely because they are successes.
Um... this is most certainly not true. Back in the late 1990s and early 2000s, Ada was the language of choice at my Australian university for both computer science and software engineering degrees.
I distinctly recall my lecturer telling us a story about a fancy presentation of Ada in military tank (AFV) systems for the DoD. The story goes that during the presentation, in front of a live audience, the presenter AND the audience had to duck after the tank's turret began spinning around and around. The code had entered an infinite loop!
The next language ought to ensure memory-safe conditions across the network.
Reading the Steelman document is like reading a shopping list of everything that's gone into modern Fortran.
I am wondering what the Ada equivalent of affine types is. What feature solves the problem that affine types solve in Rust?
Every time Ada is mentioned here, I start a quest - how to try it for free on Windows.
And every time I fail.
off-topic, this article has almost the same theme as dawnfox/dayfox which I love. It fits nicely with my terminal on the left. Cool stuff
Wonderful article and a good fit with HN’s motto of “move slowly and preserve things” as opposed to Silicon Valley’s brash “move fast and break things”.
It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and restart instead? In doing so, we often make mistakes that could have been avoided if we’d taken the time, or had the curiosity and humility, to learn from others. This seems particularly prevalent in software, where “standing on the feet of giants”, rather than on their shoulders, is the default rather than the exception.
That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight into, and admiration for, those involved in researching, designing and building the language. I’ve resolved to find and read the referenced “steelman” and language-design rationale papers.
> JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers.
What?
#1 JavaScript doesn't have formal types. What does "representation" even mean here?
#2 You can just define a variable and not export it. You can't import a variable that isn't exported.
There are several little LLM hallucinations like this throughout the article. It's distracting and annoying.
Edit: Look, I know that complaining about downvotes is annoying, but I find this genuinely perplexing. Could someone just explain what the hell that paragraph was supposed to mean instead of downvoting me?
It looks like OpenClaw started blogging. :D
I would never work on the projects Ada is used for.
1. I would never work on missile tech or other "kills people" tech.
2. I would never work on (civil) aircraft tech, as I would probably burn out from the stress of messing something up and causing an airplane crash.
That said, I'm sure it's also used for things that don't kill people, or that don't carry a high stress level.
Ada was also ignored because the typical compiler cost tens of thousands of dollars. No open-source or free compiler existed during the decades when popular languages could be had for free.
I think that is the biggest factor of all.