First counterexample that comes to mind: Rails vs 90s networked/shared line-of-business CRUD app development was a 10x factor. It also enabled a lot of internal tools that wouldn't have been worth building without it.
But after people's expectations adjusted, it was just back on the treadmill.
I don't think we've found a new steady-state yet, but I have some gut feeling guesses about where it's going to be.
In particular, I suspect that steady state will require either a 4 to 40GB blob of binary code to be installed locally, or an internet connection to an AI SaaS provider and a credit card.
I remember when coding was free as in beer and freedom!
Ah, Fails. "Before we made x improvement, the app had to restart 400x a day, now it's only 10x!"
For all the complaining we do about "enshittification", we (Hacker News, the broader industry, whatever) are perfectly willing to trade stability and performance for a little development speed. That's one prong of how enshittification happens: "I can compromise on the quality of my product, because time to market is the one thing, the only thing, that matters in this move-fast-break-things economy, and pass my savings on to the customer (in the form of hidden costs)!"
90% of my experience has been with large-ish corporate systems. I am in Europe, so YMMV even when talking about corporate rather than smaller-scale projects.
In my experience, stuff like Rails had negligible impact in my field because companies would always require solid backing from some big-name vendor (MS, Oracle, IBM, Sun back in the day, or even SAP).
So most if not all of the smaller silver bullets did not even make a blip on the radar... and stuff like Java or .NET, while definitely better than C or COBOL, did not really deliver in terms of a productivity boost (in part because, as noted in the message I am replying to, expectations kept growing at the same pace).