Junior devs have always been useless. You used to give them tasks that take them a week or two even though a senior engineer could do it in a couple hours, not because you wanted them to contribute, but because you wanted them to learn to contribute.
The same ethos makes sense with AI; it's just that every company is trying to avoid paying that training tax. Why turn a junior into a senior yourself if you can get the competition to pay for it instead?
As I tell my students: juniors, you must write the code
https://htmx.org/essays/yes-and/
Everyone else: we must let the juniors write the code.
Seniors come from juniors. If you want seniors, you must let the juniors write the code.
This is 100% an issue on the side of the senior developers. Imagine saying "these juniors are useless" because you are making them work in assembly when C has just been released. You are giving them menial work that humans no longer need to do. Instead of giving them the task "update these email templates", the norm should be: "create this new service that automates an internal process". They will make mistakes and they will learn - but what they will be doing will be genuinely useful, and it will give them a chance to grow the skills needed for this new era, under the supervision of a senior.
My nightmare scenario (which might be starting to materialize) is that our last years in the industry will be spent as prompt monkeys / agent "managers", working on codebases we barely understand at such velocity that there's no way we can gain real understanding. Whenever something breaks (and it will, a lot), AI will fix it - or so we'll hope. And the sad thing is, this might work; you'll get more stuff done with fewer people. Sure, we didn't sign up for this, and the job I've described isn't fun, but why should management care? They have their own problems, and AI is threatening their jobs as well.
Actually, the truth is that a lot of senior devs are not very good either, and have negative value. But they have an inflated opinion of themselves that does not reflect reality.
Pretty much all software projects seem to peak, and then decline in quality. There are only a handful of senior devs in the world who are actually good programmers.
This post, ironically, seems very likely to have been written by an LLM :/
"it's not x, but y", with bonus em-dash:
> your value as a developer is not in your ability to ship code. It’s in your ability to look at code
"But here’s the thing."
"And honestly?"
I can't wait until the AI people realize that without developers' original ideas, AI has nothing new to steal from. If we don't create, AI will just spit out the same old concepts. What, you're gonna create the next generation of AI by training it on what the very same AI has already produced? C'mon now.
You don't develop technical creativity reflexes by using AI. This is technical stagnation in the making. By cannibalizing its own sources, AI is ensuring that future generations are locked into subscription models just to do the most basic technical tasks. This is all obvious, yet we speed up every chance we get.
I can't seem to get the article to load, but I think I get the gist from the title.
I hired a junior "dev" who literally hadn't even completed an HTML course. Before AI I could not have hired them because they literally did not know how to dev. After AI, anyone with a little grit can push themselves into the field pretty easily.
As with everything in life: you can choose the hard route or you can choose the easy route, and your results will follow accordingly.
I think this concern is overblown. AI is an incredible teaching tool. It's probably better for teaching/explaining than for writing code. This will make the next generation of junior devs far more effective than previous generations. Not because they're skipping the fundamentals, but because they have a better grasp of the fundamentals thanks to back-and-forth with infinitely patient AI teachers.
It feels to me like junior devs don't even understand what they need to learn. They just use agentic coding to get things done, without any deeper knowledge.
The worst part is, they think they know exactly what they need to learn, and they also think they can make good decisions.
AI might bring about the standardisation we never had. If coding dynamics shift enough, then all the opinions about libraries and engines and frameworks might become less focused on readability and more on efficiency and easy composition by AI.
Security gets outsourced to audited layers, and AI does the stupid, boring job of gluing them together. Some developers become more specialised and niche, some pivot to product, some pivot to other areas.
There are plenty of people who joined software for the payout and hate it, and plenty who have grown to hate it over time.
I've been enjoying using it to figure out toy projects, but paying for an API and depending on a service in order to code leaves a very sour taste. I really hope hardware specialises and local models become good enough. Gatekeeping development behind centralised services would be a loss for everyone and ripe for dystopian outcomes.
I see so much creativity coming from young developers I just can’t agree. Yes most developers in the past 20 years who were only chasing big tech money were useless. Good riddance
Is this writeup AI generated?
It also makes junior devs unobtainable. Because who in their right mind is going to start a career in CS these days?
Copying homework and cheating on exams don't make students learn.
It takes time to become a junior, too. The emerging tech landscape could change the skills and knowledge expected of entry-level job applicants.
I recently read a similar discussion in the context of AI in science and PhD students. The point the author was making was that the goal of having PhD students is NOT to produce academic research, but to train people. I think the same idea applies here. Somebody still needs to train people, and companies will probably need to ensure they have resources for that, as there will not be enough senior people for all the tasks.
Every week I read an article on the consequences of reliable coding agents for the SWE industry. All such discussions on HN lead to a fundamental suspicion of either the empirical scaling laws of LLMs or the infinite greed and short-sightedness of a market inflating a bubble. I'm tired.
It's interesting to watch industry after industry hollow itself out from the inside and then inevitably die, long after all the financial people, investment bankers and management consultants have cashed their checks.
Steve Jobs famously, and accurately, called this out years ago [1].
Xerox, Boeing, PC manufacturers (who basically created the Taiwanese makers through a series of short-term outsourcing steps), etc. But there are two examples I want to talk about specifically.
First, one lasting impact of the 2008 GFC was that entry-level jobs disappeared. This devastated a generation of millennial college graduates who suddenly had a mountain of student loan debt (thanks to education costs far outpacing inflation) but no jobs. It became a bit of a joke to poke fun at such people who had a ton of debt and worked as baristas, but this was a shallow "analysis". It was really a systemic collapse. Those entry-level workers are your future senior workers and leaders. Those jobs have never come back.
The rise of DVR/TiVo and ultimately streaming brought on a golden age of TV in the 2000s. It was kind of the last hurrah for network shows that produced 22 episodes a year before streamers instead produced 8 episodes every 4 years.
But what made this system work was an ecosystem. Living in LA, Atlanta and a few other places was relatively cheap, so aspiring actors, writers and entertainment professionals could get by on second jobs and relatively low income. These people became the future headline actors and senior professionals. Background work and odd jobs were sufficient, and background work also taught people how to be on a set.
Studios still had large writing staffs. Some writers would be on set. Those writers were your future producers and showrunners.
Part of what supported all of this was syndication. That is, networks produced shows and basic cable channels would pay to rerun them. Syndicating some shows was incredibly profitable in some cases (eg Seinfeld).
So the streamers came along and stripped things down. They got rid of junior positions. They adopted so-called "mini writing rooms". Those writers tended never to be on set. The runs were shorter, and an 8-episode series couldn't support a writer the way a 22-episode series could. The streamers were then largely showing only their own content, so residuals and syndication fees just went away.
All of this is short-term thinking. Hollywood has been both a massive industry and a source of American soft power internationally by spreading culture, basically.
I think the software engineering space is going through a transformation similar to what happened to the entertainment industry. A handful of people will do very well. AI will destroy entry-level jobs and, with them, those companies' and the industry's future.
I predict in 10-20 years we'll see China totally dominating this space, and a bunch of LinkedIn "thought leaders" and politicians will be standing around scratching their heads asking "what happened?"
Yes and no. Oftentimes managers are now saying "ask Claude Code to write it, but I want it delivered tomorrow". This forces us to use LLM-generated code without enough time to review or understand it.
"AI is making junior devs useless" is a dangerous and incorrect conclusion. If this idea is repeated too often, people may start to believe it and even quit studying computer science altogether.
First of all, developers who only learn to code in a short bootcamp are often not well prepared — but that was already true before AI. In the past, many junior developers were students who were learning programming while studying, not just people who took a quick Python course on Udemy.
Instead of declaring junior developers useless, we should raise the standard: learn how to code properly, how to maintain code, understand networks, and build strong foundations in math and computer science. A well-trained junior developer is still extremely valuable and will always be needed.
I assume junior devs can at least search; AI often doesn't even do that. That's why there are things like context7, which helps in a narrow context but isn't perfect.
There are lots of ambiguous situations where a search and human "inference" can solve that AI still can't.
I can tell the AI to do something, and it uses the worst approach; I tell it a better way exists, and it says it has validated that one does not; I link to a GitHub issue saying it can be done via workarounds, and it still fails. It's worse on longer tasks, where if it fails it always shortcuts to a "safe" approach (including not doing the task at all).
Funny enough we need the junior to guide the AI.
Junior devs: you have an oracle you can pester incessantly. Make the most of it so you can learn to detect its mistakes, know when to push back, and what to ask of it. That's when you are in the clear. Juniors who merely parrot the LLM get fired.
This is going to be music to deaf ears.
Companies will continue to demand it (I know people working at companies that are literally looking at AI usage as an individual performance metric, puke emoji), and probably 95% of humans using pretty understandable human logic aren’t going to work harder than they need to on purpose.
I wish I had a solution. I think the jury is still out on whether programming will be a dead profession in a short number of years, replaced by technical project operators.
Problem is not making juniors useless. They kind of are by definition. Problem is that now they have very little chance to become seniors.
> If I’m reviewing your code and I ask you why you went with a certain approach, and you tell me “the AI suggested it”, I’ve immediately lost confidence in you.
I’ve experienced similar things and so understand the feeling, but this is poor leadership. If someone on your team makes it all the way to a code review and still thinks ‘the AI suggested it’ is an answer, you failed to train them and failed to set expectations, and they have justifiably lost more confidence in you than vice versa.
If we analyze the rest of the article through the lens of weak leadership, it sounds less like an AI problem and more like a corporate leadership problem.
Useless? Where do they expect the senior engineers to come from in the future?
AI made juniors without potential useless, not all juniors.
This is good advice for seniors too.
E.g. when using AI Deep Research for hard-to-debug issues, asking for the why makes for a much better response.
You're going to have to do the unthinkable:
Invest in the training of your junior employees.
The cost of generating code is now laughable, so that's not the economic value brought to the table by a junior engineer, or really, any engineer. The value now comes from knowing what good code is. You're going to have to have talks, book clubs, hackathons, and the like to get your juniors to know what good code is. Do they know what design patterns are? How about good architecture? If they can't name a few design patterns, you're not investing enough.
Just another silly uninformed take.
This is ridiculous. New developers will learn a completely different skill path from what we learned, and they will get where we are faster than we did.
tl;dr ask why
When I started my career I heard people say, almost verbatim, "Stack Overflow is making junior devs useless", with the idea that all we did was copy-paste scripts. The same kind of people failed then, and the same kind of people who can actually use the tools will succeed now.
I did my first completely vibe-coded implementation, without looking at a line of code, last year, and my second this year.
I couldn't care less about why Claude, Codex or, before that, a developer was using a for loop or a while loop. I did and do care about architecture.
I’m no more going to review every line of AI-written code than I did when I was delegating to more junior developers. I’m going to ask Claude Code how it implemented something where I know there's an efficient way vs a naive way, find and test corner cases via manual and automated tests, and do the same for functional and non-functional requirements.
The "Junior Trap" is real: if you offload your thinking to Claude or GPT-4, you’re hitting "Done" for the day, but you’re accruing massive Learning Debt. You aren't building the failure-pattern recognition that actually makes an engineer valuable.
In a world where "Code is no longer a skill," the only way to survive is to stop being a "Prompt Operator" and start being a "System Auditor." If you can’t explain the trade-offs of the architectural pattern the AI just gave you, you aren't an engineer, you're just the person holding the screwdriver while the machine builds the house.
Nah, it makes teams useless. Maybe not quite yet, but soon, one engineer will be able to do a few sprint teams' worth of work, and deliver features orders of magnitude faster than a team working in parallel. Yeah, generally at first this will be seniors only. But before long, a junior will be able to come in and learn to manage one sprint team's worth of work under the guidance of a senior and partnered with a PM, and grow the product from there. Long term, I imagine 90% workforce reduction will be the norm. Just about all software is a rinse and repeat of some other software, not much true innovation, so picking and choosing and implementing some other software's feature into your own will start to become trivial single-day projects from start to finish. Hopefully AI creates some new industries that SWEs can roll into, but I'm feeling more doomer every day.
I maintain that in the future, any person wishing to learn any skill (not just coding!) will need to willingly eschew the use of AI when learning until they have "built the muscles". The literature is clear that repeated, hands-on practice is really the only way to build skills.
I suspect the progression will be "No AI until intuition (whatever that is for that skill)" -> "Gradual use of AI to understand where it falls short" -> "AI native expert".
How to actually implement this at scale is still TBD ;-) Ironically, AI will be invaluable for this e.g. as a hyper-personalized tutor but it will also present an irresistible temptation to offload the hands-on practice. We already have studies indicating the former is helpful but the latter stifles mastery. At this point I can only see self-discipline as a mechanism to willingly avoid AI.
Unfortunately, our testing-oriented education system only serves to incentivize over-reliance on AI (Goodhart's Law etc.) None of our current institutions and processes are suited for what is already happening and will only accelerate from here on. Things will need to change radically.
For this reason, I once predicted apprenticeships will be a thing again, and already there are signs with Microsoft's preceptorship proposal: https://dl.acm.org/doi/10.1145/3779312
This is highly encouraging because a tech giant is not only acknowledging the problem, but proposing a solution. Not a complete solution by far but at least a start.