> We're thinking about AI wrong.
And this write-up is no exception.
Why even bother thinking about AI when the Anthropic and OpenAI CEOs openly tell us what they want? Quoting a recent Dwarkesh interview: "Then further down the spectrum, there’s 90% less demand for SWEs, which I think will happen but this is a spectrum."
So save yourself the thinking and just listen to the stated intent: replace 90% of SWEs in the near future (6-12 months, according to Amodei).
Not without some major breakthrough. What's hilarious is that all the developers building these tools are going to be the first ones out of a job. Their kids will be ecstatic: "Tell me again, dad, so you had this awesome, well-paying, easy job and you wrecked it? Shut up, kid, and tuck in that flap, there's too much wind in our cardboard box."
If the goal is to reduce the need for SWEs, you don’t need AI for that. I suspect I’m not alone in observing how companies are often very inefficient, so that devs end up spending a lot of time on projects of questionable value—something that seems to happen more often the larger the organization. I recall at one job my manager insisted I delegate building a React app for an internal tool to a team of contractors rather than let me focus for two weeks and knock it out myself.
It’s always the people management stuff that’s the hard part, but AI isn’t going to solve that. I don’t know what my previous manager’s deal was, but AI wouldn’t fix it.
The funny thing is I think these things would work much better if they WEREN'T so insistent on the agentic thing. Like, I find in-IDE AI tools a lot more precise, and I usually move just as fast as with a TUI, with a lot less rework. But Claude is CONSTANTLY pushing me to "one shot" a big feature while asking me for as little context as possible. I'd much rather it work with me instead of wandering off and writing a thousand lines. It's obviously designed for Anthropic's best interests rather than mine.
Where is this "90% less demand for SWEs" going to come from? Are we going to run out of software to write?
Historically, when SWEs became more efficient, we just started building more complicated software (and SWE demand actually increased).
I sort of agree the random pontification and bad analogies aren't super useful, but I'm not sure why you would believe the intent of the AI CEOs has more bearing on outcomes than, you know, actual utility over time. I mean, those guys are so far out over their skis in terms of investor expectations that theirs is the last opinion I would take seriously as a best-effort prediction.
I don't think anyone serious believes this. Replacing developers with a less costly alternative is obviously a very bullish dream for the market, and it has existed for as long as I've worked in the field. First it was supposed to be UML-generated code by "architects", then developers from developing countries, then no-code frameworks, etc.
AI will be a tool, no more, no less. Most likely a good one, but there will still need to be people driving it, guiding it, fixing its output, etc.
All this talk from CEOs is just that: stock-market pumping. Tech is the most profitable sector and software engineers are costly, so having investors dream about scale plus lower costs is good for the stock price.