In my experience, it's been the complete opposite. The very experienced engineers that are actually willing to use top of the line tooling are much better than they were before, including those that are over 40, and over 50.
Part of the practical degradation of traditional programmers over time has always been concentration and deep calculation, just like in chess. The old chess player knows chess much better than a 19-year-old phenom, but they cannot calculate for that many hours at the same speed as before, so their experience eventually loses to the raw calculation. Maybe at 35, or at 45, but you are just not as good. Claude Code and Codex save you the computation, while every single instinct and 2-second "intuition", which is what you build with experience, is still online.
It's not just that it's a more fair competition: it's now unfair in the opposite direction. The senior who before could lead a team of 6 is now leading a team of agents, and reviewing their code just as before. Hell, it's easier to get the agent to change direction than most juniors around me, who are not easy to redirect with just plain, low-judgement feedback.
> AI-users thus become less effective engineers over time, as their technical skills atrophy
Based on my experience, I think this will prove more true than not in the long run, unfortunately.
Professionally, I see people largely falling into two camps: those who augment their reasoning with AI, and those who replace their reasoning with AI. I'm not too worried about the former; it's the latter I'm worried about.
My mom is a (US public school) high school teacher, and she vents to me about the number of students who just take “Google AI overview” as an absolute source of truth. Maybe it’s just the new “you can’t cite Wikipedia”, but she feels that since the pandemic, there’s a notable decline in the critical thinking skills of children coming through her classes.
We have a whole generation (or two) of kids that have grown up being told what to like, hate, believe, etc. by influencers and anonymous people on the internet. They’d already outsourced their reasoning before LLMs were a thing. Most of them don’t appear to be ready to constructively engage with a system that is designed to make them believe they are getting what they want with dubious quality.
Anecdotally, it feels to me like something materially changed in the US software hiring market at the start of this year. It feels like more and more businesses are taking a wait-and-see approach to avoid over-investing in human capital in the next few years.
It also feels like the hiring "signal", which was always weak before, is just completely gone now, when every job you do advertise receives over 500 LLM-written applications and cover letters that all look and feel the same.
The pro-athlete comparison in this article is a bit silly IMO: there are obvious physical issues that come with aging if you rely on your muscles etc. to make money. If you compare to other fields of knowledge work, such as, say, law or medicine, there are loads of examples of very experienced, very sharp operators in their 40s and 50s.
I keep reading about how AI will be fine because people can just retrain for different careers. However, I never read what those careers are or who is going to pay for retraining.
I certainly don't have the money or time to go back to college and start a new career at the bottom.
I really wish seemingly intelligent people would stop using the abstraction analogy (like the article does). The key word is: determinism. Every level of abstraction (inc. power tools, C, etc.) added a deterministic layer you can rely on to more effectively do whatever it is that you're doing - same result, every time. LLMs use natural language to describe programming and the result is varied at the very best (hence agents, so we can brute-force the result instead). I think the real moat is becoming the person who can actually still program.
> If you work in construction, you need to lift and carry a series of heavy objects in order to be effective. But lifting heavy objects puts long-term wear on your back and joints, making you less effective over time. Construction workers don’t say that being a good construction worker means not lifting heavy objects. They say “too bad, that’s the job”.
My parents were both construction workers. There is an understanding that you cannot lift heavy objects forever. You stop lifting objects and move to being a foreman, a supervisor... and if you are uncomfortable learning to get others to do the work you once did yourself, you burn out your body entirely and the consequences are horrible.
This is factual reality, but it is also a parable that has been important for me to internalize about delegation in my own career. It is not irrelevant to AI use, but I don't think it maps onto it quite as neatly.
Unless I'm missing something, there's an obvious logic issue here.
If we truly need to sacrifice our skill to be productive by using LLMs that atrophy us, then the only devs that have a limited lifespan are us. The next ones won't have a skillset to atrophy since they won't have built it through manual work.
Also, I hereby propose to publicly ban the "LLMs generating code are like compilers generating machine code" analogy, it's getting old to reargue the same idea time after time.
If by software engineering, one means typing code character by character into a text editor, sure it's going to be difficult to find someone to pay you for it.
If you mean creating software, well we are creating more software than ever before and the definition of what software is has never been so diverse. I can see many different careers branching off from here.
From Reddit:
> After being laid off, a programmer becomes a welder. One day while working, he suddenly muttered to himself, "It's been so long, I've even forgotten how to solve three sum". A coworker next to him quietly replied, "Two pointers".
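For anyone whose three-sum has similarly rusted: the two-pointer approach the coworker is alluding to looks roughly like this (a minimal sketch, not any particular canonical solution):

```python
def three_sum(nums):
    """Return unique triplets summing to zero, via sort + two pointers."""
    nums = sorted(nums)
    out = []
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue  # skip duplicate anchor values
        lo, hi = i + 1, len(nums) - 1
        while lo < hi:
            s = nums[i] + nums[lo] + nums[hi]
            if s < 0:
                lo += 1        # sum too small: move left pointer right
            elif s > 0:
                hi -= 1        # sum too large: move right pointer left
            else:
                out.append((nums[i], nums[lo], nums[hi]))
                lo += 1
                hi -= 1
                while lo < hi and nums[lo] == nums[lo - 1]:
                    lo += 1    # skip duplicate second elements
    return out
```

Sorting first is what makes the two pointers work: with the array ordered, you always know which pointer to move to push the sum toward zero, giving O(n²) instead of the brute-force O(n³).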
> The career of a pro athlete has a maximum lifespan of around fifteen years. You have the opportunity to make a lot of money until around your mid-thirties,
This sounds ageist - I'm around 40 and feel I am at my mental peak, compared even to my mid-20s. This isn't a good analogy at all; the brain doesn't "wear out" like a professional athlete's body does, it just changes its structure. The brain is a remarkable organ.
We're forgetting one thing: we (mere engineers) have control over nothing. The vast majority of us are at the mercy of executives and investors. Before AI we had some sort of grip because our skills weren't so much a commodity, and yeah, dealing with code and systems architecture and data and distributed systems wasn't that easy. Now AI is a tool not for us but for the higher-ups, they can finally commoditize software engineering and need only a small fraction of us. I see engineers around here fighting and discussing who'll be left behind (the 80%) and who'll remain because they're "more than mere coders" (the 20%)... what we don't discuss here is that we're all now at the mercy of Anthropic et al, and that's bad. The irony is that the vast majority of us use Anthropic, so we are just loading the guns for them to use them. It's sad, but we call it progress. Nuts
Was it ever a lifetime career? Haven't most people looked around and asked themselves where are all the 50+ engineers? They basically don't exist in large numbers. Ageism is real in this industry. You either save up enough money to retire early, switch into management, or get forced out of the industry eventually. AI is just accelerating the trend. I see very few junior engineers resisting AI. I see a LOT of staff+ engineers resisting it. Just look at the comments on HN. Anti-AI sentiment is real.
"No longer"? When was it exactly?
The truth is that software engineering, as a profession, is not even a full hundred years old. Even if someone spent their whole career in it, the job has probably changed so much over time that it became a completely different one.
So far, we have barely scratched the surface.
This is such a misleading title. The post isn't about software engineering not being a lifetime career, it's about this:
> If AI does turn out to make you dumber, why can’t we just keep writing code by hand? You can! You just might not be able to earn a salary doing so, for the same reason that there aren’t many jobs out there for carpenters who refuse to use power tools.
The argument the piece makes is that being a software engineer who insists on writing code by hand may no longer be a lifetime career.
I think the definition of "software engineer" is changing, and it's not even changing that much. We construct software to help solve human problems. We can keep on doing that, just now we get to do it more.
Just stop calling everyone a software engineer. You can be a script kiddie, a coder, a developer - but those are not engineers. Engineering is much more than just writing the code. There is a reason you can earn a degree in engineering - not that a degree automatically makes you an engineer, or that it's even needed.
People who think that "software engineering" and "writing code" are the same thing will indeed be out of a job. People who understand the difference will continue to thrive.
Thinking that software engineers can be replaced by AI is like thinking that mathematicians can be replaced by calculators.
Argument A: AI means you don't learn as much, so even though you are more effective, it inhibits your growth and you shouldn't use it. However, on a pragmatic level, it's effective to hire a bajillion people, fire them at will, and get AI to do everything. You will get so many JIRA tickets closed and so many lines of code written.
Argument B: AI means you don't learn as much, and the single most useful work product of a software engineer is knowing how the code functions, so it's depriving your company of the main benefit of your work. Also, layoffs are terrible business strategy because every lost employee is years of knowledge walking out the door, every new hire is a risk, and red PRs are derisking the business.
Institutional and personal knowledge seem similar, but the implications of each are radically different.
I'm repeating what others have essentially said, but ask yourself what's on your resume. If it says "Software Engineer" and that's all it talks about, then yea you might not find it's a lifetime career.
But if it's a diversity of things (that use or leverage software development) then you probably have a lifetime career ahead of you.
I've been writing software for over 40 years but I've never seen myself as having a software engineering career. I've been a research assistant in geophysics, a marine technician on research ships, a game developer, an advisor to the UN, and a lot more. Yes all through that I used software, but I did a lot of other things in the process of using it.
Even if a manager can just conjure the software they want instantly using AI, they are still going to prefer having a nerd to manage it for them - to know how to prompt engineer or even just organise it all.
It might not look much like software engineering, but it's still going to be nerd stuff that most people don't want to bother with.
The final verdict that software engineers won't exist x years from now is a bit contrived. Today, my team is looking to hire an AI software engineer. I reached out to my close group of developer friends to gauge interest, only to find out that each one of them is ALSO trying to find/hire software engineers, all looking for this new "paradigm" of programming knowledge. Maybe the role itself looks different today than it did 5 years ago, but it seems like every company is trying to accelerate its development and finding new opportunities that didn't exist before the AI craze.
More than anything, I believe that AI is pushing out those who enjoyed the *act* of programming more than the product being delivered itself. Mostly because those individuals might have the hardest time adopting this new way of getting things done.
And honestly, I feel for them. Coding has always felt like an art form to me. Nothing feels better than someone commenting on the elegance/beauty of something you've written.
Seems the solution here is the same it's actually always been if you want career progression: be more than just a code jockey. The true value of an engineer is to be plugged into overall roadmaps, broader thinking around product, how to achieve company goals, etc etc.
Yes, LLMs might dramatically reduce the amount of code we write by hand. But I'm a lot less convinced they'll solve all of the amorphous, human-interacting aspects of the job.
I am just not seeing a future where a product manager opens their laptop and says "build me a self driving car company" and then gets one
The differentiator is augmenting reasoning with AI versus replacing reasoning with AI. But those who choose to replace their reasoning with AI probably weren't good at it to begin with; if they were, they'd choose not to replace it. The exception is if AI can actually replace reasoning (which it can't, yet) - then it's game over for a career in software engineering anyway.
I don't understand it. The time-limited career would work if we were born with innate ability for software engineering and would lose it over time by using AI. Most people are not born with that ability though, it needs to be developed first.
And read Programming as Theory Building already, it's not that long
80% of my day-to-day job has never been pumping out lots of code. It's a complicated career, isn't it? We do a lot of alignment, design, and thinking. I can't even agree with the idea of outsourcing thinking; I think AI is very good at helping us think clearly, but it doesn't really "think" for us.
If you do outsource it, then... you're likely very replaceable.
I don’t understand why so many people are convinced that “this time is different.” New tools raise the ceiling of what’s possible. New jobs emerge at the limit of what’s possible with new tools. The jobs doing what we do today will disappear. New jobs with greater complexity and specialization will emerge. I have watched this happen in the software industry in my lifetime. I expect that it will continue to happen.
From all I can see, things for US citizen developers are all but over.
Not AI, offshoring combined with downsizing of US based engineering orgs.
Corporate America has finally figured it out, after two decades of entitled developers turning 2-day tasks into 2-week tasks in the name of "best practices", "architecture", "Doing It Right!", etc., all while commanding high salaries.
It turns out that Good Enough is in fact good enough, and the people who write the checks are onto it. Even if it's not quite good enough, cheap offshore resources can just be sent back to make it work. A US-based staff of 5 people who can be held responsible for guiding a much larger offshore group seems to be the common pattern.
All of this was imparted to me by a CIO in a recent interview with a financially strong mid-sized company in the eastern US. The developers I interviewed with were EXCEPTIONALLY COMFORTABLE and displayed zero signs of any kind of stress from maintaining their literally 20-years-out-of-date infra. It was insinuated that the team I interviewed with "probably won't look the same in 6 months" too.
"We may be in the first generation of software engineers in the same position. If so, it’s probably a good idea to plan accordingly."
He compares software engineers to pro athletes. What does it mean to plan accordingly? Start working with the mob to fix poker games? I don't know what "plan accordingly" means at all, but it is a thought-provoking statement. Here's a better comparison to pro athletes: their work output is winning games. How do they get good at (and stay good at) that? Is it by playing real games for points?
That's a part of it, but only a small part. They don't get good at the thing mainly by doing the thing. They get good at it by training to do the thing.
An NFL football player does a ton of things other than playing in games. They have practice scrimmages. They do drills like throwing, catching, running patterns, tackling, reading quarterbacks, stripping balls, picking up fumbles, etc. They work with coaches on their technique. They watch film. They spend many hours in the gym and on the track building their strength, speed, cardio, and stamina.
Yes, it's true that your software skills will atrophy if you don't use them. But that doesn't mean your skills have to get worse and worse causing you to eventually quit the job. It means you need to set aside time to maintain your skills. It may no longer happen automatically as a side effect of your work, but it can happen intentionally instead.
There’s a hierarchy amongst knowledge work and AI hasn’t yet been able to do the work that is rare and valuable.
Over the past two decades, there have been a lot of solved problems, like building boring scalable web apps, UX design, etc., and AI is fairly good at these, enough so that good prompting can get you very far. This shouldn't be a surprise; there's a lot of publicly available data for this (GitHub repos etc.).
On the other hand, there are rarer computer science problems, like designing efficient datacenters, GPUs, and DL models. Think about the problems that someone of Jeff Dean's or James Hamilton's (AWS SVP) ability, or a skilled computer architecture researcher like David Patterson, would solve. These are incredibly hard and rare problems, and AI hasn't been able to make much progress in these areas. That's true for other sciences as well.
If you’re a regular Joe like me who builds boring CRUD apps, AI is coming for you.
What I mean is if you are working on incredibly hard and rare problems that require rare skills and also those problems don’t have publicly available data that LLMs can be trained on, you’re safe from being “automated” away. If not, you must plan accordingly. Also if you’re a skilled manager (in any field) AI cannot replace you, highly skilled managers that can get the best out of their teams have rare skills that aren’t easily replicable even amongst humans much less AI. Although, if going forward we need fewer developers we will need fewer managers too.
Comparing software development to carrying heavy things at a construction site feels like a real stretch to me.
'If you work in construction, you need to lift and carry a series of heavy objects in order to be effective. But lifting heavy objects puts long-term wear on your back and joints, making you less effective over time. Construction workers don’t say that being a good construction worker means not lifting heavy objects. They say “too bad, that’s the job”.'
On another note, for sure software developers are saying things like "this is the part of the job that I like" or "if you aren't doing the work, you won't be good at the work." But other people are saying this, too. I just saw an episode of "Hacks" ("QuickScribbl") where the writers say pretty much this exact thing when confronted with AI tooling designed to "make their job easier". Is writing comedy also like lifting heavy objects at a construction site?
I think the polarizing response regarding AI depends on which lenses you are looking through. For junior roles, yes, the job is rapidly disappearing. But for senior roles, experience and judgment are more important than ever.
So yes, software engineering may no longer be a lifetime career for a lot of people, much like elite sport is not a viable career for most—but still, some will, and must, make it their career.
"You can! You just might not be able to earn a salary doing so, for the same reason that there aren’t many jobs out there for carpenters who refuse to use power tools."
I've long regarded myself as more a master craftsman than an engineer, and I've had the pleasure of working on one-of-a-kind or first-of-a-kind things. Perhaps fortunately I'm near retirement. But I genuinely enjoy the coding: it's how I engage with the problem and learn to understand it. It's also how I ensure that I'll be able to read the code and find things in the code base when I come back to it years later. Last thing I want to do is spend my days overseeing someone (or something) else's code. If I wanted to be a manager of programmers I could have done that years ago.
Following this logic, mathematicians disappear first.
Screwdriver -> power drill
Hand-coding -> llms/agents
Sometimes the only thing that can fit into a tricky spot is a screwdriver. The power drill didn't make screwdrivers obsolete, it just made them less necessary day-to-day.
Same thing here. LLMs are power tools, but sometimes, the only thing that can fit into a "tricky spot" with code/systems is knowing how to do it by hand.
If you don't know how to code then you can't really influence the AI technically and that can result in everything being the same.
Maybe you want a React app and using Redux for state would be best for the specific case, but the AI doesn't recommend it and you don't know; then you are missing out and can end up with something suboptimal. This was just an example.
The majority of my activity today, as a professional software engineer with two decades of experience, is trying to get Team Sales to express what they want. It's so hard. I see no way an LLM can do this. I could possibly be replaced with someone who spent their time begging Team Sales to type what they want into an LLM.
>If the models are good enough, you will simply get outcompeted by engineers willing to trade their long-term cognitive ability for a short-term lucrative career
> (2) AI-users thus become less effective engineers over time, as their technical skills atrophy
Wouldn't (2) imply that if everyone just used AI there eventually would come a time when there aren't engineers who will outcompete you (because their skills are so atrophied)?
Ageism is alive and well in our industry.
People need to learn the difference between fluid intelligence and crystalized intelligence.
People need to hear that startup success is maximal when the founders are older, not younger. VCs chasing youth are statistics deniers.
Tbh I don’t know if any careers are lifetime these days. Maybe plumbers have job security. GPUs need cooling…
Software engineering today is almost nothing like the role it was 30 years ago.
Maybe if you somehow stick with the same company for your entire career, it could feel somewhat similar... but I doubt it, as 'best practices' and many other things cause it to change.
The days of 'lifetime career' had already gone for most people, way before AI arrived.
Agent-assisted programming is fundamentally the skill of directing and supervising agents. I don't see any reason to believe that working a job where you direct and supervise agents will make you any worse at directing and supervising agents long term.
The practical lesson is probably to build adjacent judgment: product sense, domain expertise, systems taste, and communication. Pure implementation may be the most exposed slice.
Such an incoherent argument.
> Professional athletes & construction workers work in physical fields, which means there are physical limits on what they do & what their bodies can take.
> Software engineering is an art & an engineering discipline, which means that as long as you're of sound mind, you can do it until you die of old age, or even if, say, you go blind - because your ability to refine / your taste is not dependent on your physical capabilities.
> LLMs one-shotting things is not engineering, because engineering is about compromising within constraints & using rules of thumb. So if you have no constraints, you are not engineering.
It won't be a career if AI gets good enough that you don't have to read / understand the code - otherwise, AI won't have much impact on jobs I don't think.
I take issue with the premise that "Using AI means you don’t learn as much from your work" With AI assistance, I tackle far more tasks than I would without it. Learning per task goes down, but cumulative learning does not.
Totally agreed and on point. You don't see many calculator operators around anymore.
Multiple times per week I have the same conversation. It goes something like this:
The developers who still think their job is about writing code will perhaps not have a job in the future. Brutal as it may sound: I'm fine with that. I'm getting old and I value my remaining time on the planet.
Business owners who think they can do without developers because they think LLMs replace developers are fine by me too. Natural selection will take care of them in due course.