Hacker News

bredren · yesterday at 5:14 PM · 1 reply

> ...this is equivalent to providing all of your employees with a flamethrower, and then saying they bear all responsibility for the fires they start.

This is essentially the policy of most SWE groups with respect to merged PRs, though, right?

You can and should use AI to accelerate SWE workflows and assist in reviews, but if you merge something bad that breaks production, that is on you.

> "Hey, don't blame us for giving them flamethrowers, it's company policy not to burn everything to the ground!".

Flamethrowers are inherently dangerous to the operator and are ~intended to be used to burn things to the ground.

I'm no expert on arms, but there is probably an analogy with a better fit out there.


Replies

applfanboysbgon · yesterday at 5:25 PM

The way LLMs are being forced upon the workforce in tech is just as bad, actually, yes.

> Flamethrowers are inherently dangerous to the operator and are ~intended to be used to burn things to the ground.

I actually think bringing up this point reinforces the analogy rather than undercutting it. LLMs are ~intended to spread disinformation, e.g. DeepSeek on 1989, Grok going full Mecha-Hitler, ChatGPT selling out prompts to advertisers. One of the biggest impacts LLMs will have on human society is as a propaganda tool with a reach of billions.