Personally, I prefer vibe coding in the sense of stitching things together at the function-to-method level.
Unlike people who take the extreme position that vibe coders are useless, I do think LLMs often write individual functions or methods better than I do. But in a way, that does not fundamentally change the nature of the work. Even before LLMs, many functions and methods were effectively assembled from libraries, Stack Overflow snippets, documentation examples, and copied patterns.
The real limitation comes from the nature of transformer-based LLMs and their context windows. Agentic coding has a ceiling. Once the codebase reaches a scale where the agent can no longer hold the relevant structure in context, you need a programmer again.
At that point, software engineering becomes necessary: knowing how to split things according to cohesion and coupling, using patterns to constrain degrees of freedom, and designing boundaries that keep the system understandable.
In my experience, agentic coding is useful for building skeletons. But if you let the agent write everything by itself, the codebase tends to degrade. The human role is to divide the work into task units that the agent can handle well.
Eventually, a person is still needed.
If you make an agent do everything, it tends to create god objects, or it strangely glues things together even when the structure could have been separated with a simpler pattern. Thinking about it now, this may be exactly why I was drawn to books like EIB: they teach how to constrain freedom in software design so the system does not collapse under its own flexibility.
I know this is anecdotal but after almost 2 years of no activity, I have been absolutely hounded by recruiters for nearly a month. They show up in my LinkedIn feed and I get multiple emails a week asking to interview. What in the world changed? It doesn't look like the job market's improved much. In fact I see more layoffs than ever before.
Basically, I anticipate a 3x rise in software engineering salaries over the next five years if the dumb "oh, coding is a solved problem" rhetoric continues, because of the collapse on the supply side.
What I see is an immense number of bugs and security issues that can be found much more easily now than ever before, because of AI. I also see less trust in using AI for direct coding, because there are many examples of code additions that breach the safety of enterprise software. Solving this requires actual humans to do the coding. And with that, it is probably true that more use of AI in coding leads to more SEs being required to oversee it and ensure security. I personally see the big benefit of AI tooling in testing, security checking, documenting, etc., rather than in coding itself.
This just means big layoffs are coming in this sector, and they are astroturfing beforehand so that they can point to this stat showing that jobs are available. Meta and Microsoft just started the ball rolling, and it will accelerate over the next 2 years.
I foresee the need for engineers to be really "wavy".
I have personally never been busier or more productive. It's like all the "work" of my work has disappeared. There are no more blockers and I can just run free and get as much done as I want and the only thing slowing me down is Jira.
The real downturn is going to be the SaaS apocalypse. In the next year or two there will be a reckoning where all these expensive low-code/no-code middleware applications suddenly don't make sense.
So I think it will be less about the ranks of engineers being thinned out unilaterally, and more about large swathes of products being obsolete.
We also try to understand it here: https://hugston.com/news/the-west-strategic-battle-defeat-th...
Nobody wants to hire a new team member when it takes 3 months to train them, and a new Opus will have come out by then anyway.
I suspect hiring will pick up when the capability of the models stops growing so quickly, or when the gaps between releases start widening. The problem, obviously, is that capabilities are not slowing down and the gaps keep getting shorter…
Companies hiring more people to build AI-based, self-healing, and self-developing systems faster? "We don't need those old programmers, we need new people who know how to build harnesses around AI." And then hiring those "old" programmers anyway, but from other companies.
The title of the submission is an almost comical example of HN navel-gazing: of the many interesting things in the article, surely the job prospects of HN readers should not be near the top of the list.
On the off chance you care: you can keep JavaScript disabled on this article and just use a No Style page style to read it.
90% of the job ads I see have the word "AI" in them. It can be a startup hoping for a get-rich-quick opportunity from the AI hype, or an established company.
Both types expect you to spend as many tokens as possible so that the AI bubble doesn't burst (presumably because leadership has a financial interest in this).
Your actual productivity isn't important. If you point out that you're much faster writing code on your own in 90% of cases, you will be told you're not good at AI, you're not prompting it correctly and that generally you're not AI-native and that you'll be left behind. To be precise, token usage is a performance metric, so you'll be let go if Claude is not running continuously 8 hours a day.
I'd like to know how many places have mandates to write 100% of your code using AI, as well as to max out your AI agent's plan. For some reason nobody talks about it even though I know several companies around the world that are forcing this on their employees.
If you're looking for a job, then you don't have a choice; it's better to have an income. But if you're looking to change jobs to get away from AI, to actually be productive and gain experience, then it's a very bad job market.
So there will again be waves of hiring developers, only for companies to realize after 5 years that they have too many employees and fire them again?
Title is editorialized and the report is from two months ago.
Are salaries rising too?
Our labor market is cyclic: relatively short busts and long booms that start slowly and then accelerate faster and faster. We had busts in 2000-2003, 2008-2010 (11?), and 2022 to, I guess, 2026. I wasn't in the US in the 1990s, but I guess the beginning of the 1990s was also a bit tough.
Unavoidable AI-based productivity growth, in software and in all other industries, will lead to software, specifically AI in this case, not just eating the world but devouring it. Such an AI revolution will mean even more need for software engineers, just as the personal computer revolution and the Internet revolution did in their times. Of course, software engineering will change, just as it did in those previous revolutions.
What did they write that article with?
The year is 2026. The unemployment rate just printed 4.28%, AI capex is 2% of GDP (650bn), AI adjacent commodities are up 65% since Jan-23 and approximately 2,800 data centers are planned for construction in the US. In spite of the current displacement narrative – job postings for software engineers are rising rapidly, up 11% YoY. ... We wrote last week that we see the near-term dynamics around the AI capex story as inflationary, but given markets are focused on the forward narrative, we outline a more constructive take on the end state below. Before that, however, it’s worth reflecting that the imminent disintermediation narrative rests on the speed of diffusion.
The chart "Job Postings For Software Engineers Are Rapidly Rising" seems to show a rise from 65 to 71 in the "Indeed job postings" index from October 2025 to March 2026. That's a 9% increase. Then they inflate that by extrapolating it to a full year. The graph also exaggerates the change by pushing the zero line way off the bottom and expanding the vertical scale. This could just be noise.
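A quick sanity check on that arithmetic (the index values 65 and 71 are read off the chart by eye, and the truncated axis floor of 60 is a hypothetical value, so treat all of this as approximate):

```python
# Sanity-checking the chart's numbers. The endpoints (~65 -> ~71)
# are eyeballed from the graph, not exact figures from the report.

start, end = 65.0, 71.0   # Indeed postings index at the two endpoints
months = 5                # roughly October through March

# Raw change over the five-month window.
pct_change = (end - start) / start * 100
print(f"raw change over {months} months: {pct_change:.1f}%")  # ~9.2%

# Naive annualization: compound the implied monthly rate to 12 months.
monthly_rate = (end / start) ** (1 / months) - 1
annualized = ((1 + monthly_rate) ** 12 - 1) * 100
print(f"annualized: {annualized:.1f}%")  # ~23.6%

# A truncated y-axis inflates the visual change: with a hypothetical
# axis floor at 60, the plotted bar heights differ by far more than 9%.
axis_min = 60.0
visual_ratio = (end - axis_min) / (start - axis_min)
print(f"apparent ratio with axis starting at {axis_min:.0f}: {visual_ratio:.1f}x")  # ~2.2x
```

A 9% move reads as a doubling once the axis starts at 60 instead of 0, which is exactly the exaggeration the comment describes.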
The chart "Adoption Rate of Generative AI at Work and Home versus the Rate for Other Technologies" has one (1) data point for Generative AI.
This article bashes some iffy numbers into supporting their narrative.
Suggested reading: [1]
[1] https://en.wikipedia.org/wiki/How_to_Lie_with_Statistics