Hacker News

HarHarVeryFunny · yesterday at 12:59 PM

> Software is going to pile up because developing it is now cheap.

Software to do what, though?!

Coding, maybe 10% of a developer's job (Brooks's "No Silver Bullet" estimates 1/6), was never the bottleneck, and even if you automated it away entirely you'd only have reduced development time by about 10% (assuming you're not doing human code review of the output, etc.).
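Back-of-envelope, Amdahl's-law style (just a sketch of the arithmetic above; the helper function and its name are mine, for illustration):

    # If coding is the only stage that speeds up, the overall speedup is
    # capped by the fraction of the job that coding represents.
    def overall_speedup(coding_fraction, coding_speedup):
        return 1.0 / ((1.0 - coding_fraction) + coding_fraction / coding_speedup)

    # Even with coding made effectively free:
    print(overall_speedup(0.10, 1e9))  # ~1.11x overall, i.e. ~10% time saved
    print(overall_speedup(1/6, 1e9))   # ~1.20x overall, i.e. ~17% time saved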

I would also argue that software development as a whole (not just the coding part) was typically never the bottleneck to companies shipping product faster, and maybe not to automating their business faster (internal IT systems) either: the rest of the company isn't moving that fast, business needs aren't changing that fast, and neither are the external factors that might drive change.

I think that when the dust settles we'll find that LLM-assisted coding has had far less impact than those trying to sell it to us are forecasting. There will be exceptions of course, especially in terms of what a lone developer can do, or how fast a software startup can get going, but in terms of impact on larger, established companies I expect not so much.


Replies

rafterydj · yesterday at 2:56 PM

+1 for any mention of Fred Brooks. I like your point about software as a whole not being the bottleneck. In the 1970s, hardware was co-evolving with business uses (it still is, but the constraints were much more severe), leading to large headcounts on software projects that _absolutely_ had to work and _absolutely_ required uncommon expertise. Most people had no concept of a computer's capabilities, and computer science was not as widely distributed.

One thing I would point to today to show that the landscape is different: the average programmer/engineer/developer today has no actual admin staff. Fred Brooks's example team setup, "The Surgical Team", has more support staff than programmers. Anyone who responds to questions like "who manages the calendar?" and "who manages the documentation?" will say that the engineers doing it themselves gives the best results. The same goes for designing test cases, performing rollbacks, etc.

The fact of the matter is that any self-respecting engineer today works in an environment where proactivity and self-sufficiency are prerequisites. Managing your calendar and workload, communicating with leadership and users: these are all common tasks that would have been another person's job a generation ago.

So when we discuss writing code more efficiently and aiding software development, what I am essentially seeing is people trying everything they can to offload work that used to be another person's job anyway. If you care about communication, you offload coding standards. If you care about security, you offload feature refactors, and so on.

In my opinion, at some point we'll either realize that we need highly competent people _and also_ regular people to help ensure the work gets done to a good standard. Or we will each end up working alone in a room with a suite of AI tools, wondering why we're still making software in the first place.

g42gregory · yesterday at 11:04 PM

> Software to do what, though?!

Replace all Oracle Applications in the Enterprise, for example. That will keep Corporate IT/Dev teams busy for quite a while.

Of course, this does not include Oracle infrastructure such as the Database.

tracker1 · yesterday at 8:14 PM

That's kind of the point of the GP... everything around the code has improved: the workflows, definitions, documentation, process. I'd say all of those things are improving and expanding faster than code output is, and the improvements in code output are in turn coming at a faster turnaround than actual people can manage.

I've said several times that when I use an agent, I get about 2-4x the value and about 10x the output... the "value" is features landing in code, and the difference up to 10x is documentation and testing. While a lot of that may not get reviewed by every person who touches the product, it helps with further AI-based feature development.

I'm not a big fan of running many agents or of outright vibe-coded slop... but you can definitely leverage coding agents and get a lot of improved output.

philwelch · yesterday at 3:44 PM

As I recall, “No Silver Bullet” fundamentally rested on the assumption that the subroutine was the last word in abstractions for making programming more efficient, which probably wasn’t even defensible at the time because Lisp had already been invented, and is even less defensible after the past several decades of programming language research. Brooks was still onto something when it came to irreducible complexity, but offloading to an LLM whatever complexity it can tackle still saves time.

One of the lesser-discussed Brooks essays is actually the best description of AI-first development: “The Surgical Team”. It just turns out the surgeon is the only human, and, as in many modern surgeries, the surgeon is controlling a robot instead of operating by hand.

It would be interesting to reread The Mythical Man-Month and see how each essay applies to AI-first development.

nsxwolf · yesterday at 9:41 PM

Everybody’s cooking, nobody’s eating.

pojzon · yesterday at 9:08 PM

We can all agree that a very big portion of the time spent on product engineering goes to... syncing progress, requirements, plans, etc. And we have to do it over and over because of how big teams are.

Fast forward: fire half of those people, and certainly fire all the middle managers, scrum masters, coaches, and "wooden" architects.

Suddenly you save so much time on syncing that you can ship twice as fast.

And NO, quality and impact don't go down. They actually go up.

This is probably something you did not want to hear :)

A few competent people with AI are much, much better than dozens of mediocre teams.

What we need now are "Product Builders" and "Product Maintainers". All the other roles have lost value.

ikrenji · yesterday at 5:01 PM

You can't keep shipping code the same way post-LLM as you did pre-LLM and expect a huge speed gain. The trick is abandoning the old models and bottlenecks and embracing the new possibilities LLMs enable. That requires a high-trust environment.
