
Niko901ch · yesterday at 1:16 PM · 14 replies

AI coding tools are making this problem worse in a subtle way. When an agent can generate a "scalable event-driven architecture" in 5 minutes, the build cost of complexity drops to near zero. But the maintenance cost doesn't.

So now you get Engineer B's output even faster, with even more impressive-sounding abstractions, and the promotion packet writes itself in minutes too. Meanwhile the actual cost - debugging, onboarding, incident response at 3am - stays exactly the same or gets worse, because now nobody fully understands what was generated.

The real test for simplicity has always been: can the next person who touches this code understand it without asking you? AI-generated complexity fails that test spectacularly.


Replies

slfnflctd · yesterday at 2:46 PM

> now nobody fully understands what was generated

To be fair, a lot of the on call people being pulled in at 3am before LLMs existed didn't understand the systems they were supporting very well, either. This will definitely make it worse, though.

I think part of charting a safe career path now involves evaluating how strong any given org's culture of understanding the code and stack is. I definitely do not ever want to be in a position again where no one in the whole place knows how something works while the higher-ups are having a meltdown because something critical broke.

__MatrixMan__ · yesterday at 3:01 PM

I think we'll see a decline of software as a product for this reason. If your job is to solve a problem, and you use AI to generate a tool that solves that problem, or you use money to buy a tool that solves that problem, well then it's still your job to solve that problem regardless of which tool you use.

But given how poorly bought software tends to fit the use case of the person it was bought for... eventually generate-something-custom will start making more and more sense.

If you end up generating something that nobody understands, then when you quit and get a new job, somebody else will probably use your project as context for generating something that suits the way they want to solve that problem. Time will have passed, so the needs will have changed, and they'll end up with something different. They'll also only partially understand it, but the gaps will be in different places this time around. Overall I think it'll be an improvement, because there will be less distance (both in time and along the social graph) between the software's user and its creator, who will most of the time be the same person.

mattcollins · yesterday at 1:44 PM

On the other hand, AI coding tools make it relatively easy to set and apply policies that can help with this sort of thing.

I like to have something like the following in AGENTS.md:

    ## Guiding Principles

    - Optimise for long-term maintainability
    - KISS
    - YAGNI
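For anyone who wants a starting point, here is a sketch of what a fuller version of that section could look like. Everything beyond the three original bullets is illustrative, not something the agent tooling requires; adjust the wording to your own stack:

```markdown
## Guiding Principles

- Optimise for long-term maintainability
- KISS: prefer the simplest design that meets the actual requirement
- YAGNI: no abstractions, config options, or layers for hypothetical future needs
- Before adding a dependency, explain why the standard library is insufficient
- Prefer code the next maintainer can understand without asking the author
```

The point of keeping it this short is that agents reliably follow a handful of blunt rules far better than a long style guide.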

BloondAndDoom · yesterday at 3:26 PM

This is something I keep thinking about while coding with AI, and the same goes for introducing library dependencies for the simplest problems. It's not about how quickly I can get there, but about how I can keep it simple to maintain, not only for myself but for the next AI agent.

The biggest problem is that the next person is me 6 months later :) But even when it's not a next-person problem, it's a question of how much of the design I can keep in my mind at a given time. Ironically, AI has the exact same problem, aka the context window.

mrweasel · yesterday at 3:57 PM

There's also the operational cost of running whatever is churned out. I wouldn't exactly blame that on AI, but a large contingent of developers optimize for popular tech stacks rather than ease of operations, and I don't think that will change just because they start using AI. In my experience the AI won't tell you that you're massively overbuilding something, or that if we did this in C and used PostgreSQL we'd be able to run it on an old Pentium III with 4GB of RAM. If you want Kubernetes and Elasticsearch, you'll get exactly that.

johnfn · yesterday at 7:49 PM

Was this comment written by an LLM? It has a lot of the tell-tale signals, and pangram gives it a 100% chance of being written by AI.

andix · yesterday at 5:38 PM

> When an agent can generate a "scalable event-driven architecture" in 5 minutes

Currently they can't. Anyone with a basic understanding of software engineering will find numerous issues with the result of such a prompt within minutes.

Cthulhu_ · yesterday at 4:25 PM

AI generators only generate that if you tell them to, though. As a developer (especially a senior one), it's your job to know what you want and tell the AI coding tools that.

dude250711 · yesterday at 1:29 PM

It's a bad time to be an altruistic perfectionist, tell you what.

Avoid hands-on tech/team lead positions like hell.

kaleidawave · yesterday at 3:02 PM

The "coding benchmarks" should be heavily biased by what dependencies the generated code pulls in, how many characters it adds, etc.

brightball · yesterday at 1:40 PM

The flip side of this is that languages whose major selling point is maintainability just had their value increase dramatically.

amelius · yesterday at 1:40 PM

Simplicity is a driver for better abstractions. But now with AI, will we even develop new abstractions?

jasondigitized · yesterday at 3:02 PM

What is the maintenance cost when Opus 6 or whatever is available?

an0malous · yesterday at 3:44 PM

Agreed, but I wouldn't say it's subtle; the slop builds up quickly.