The way LLMs are being forced upon the workforce in tech is just as bad, actually, yes.
> Flamethrowers are inherently dangerous to the operator and are ~intended to be used to burn things to the ground.
I actually think bringing up this point reinforces the analogy rather than undercutting it. LLMs are ~intended to spread disinformation, e.g. DeepSeek on 1989, Grok going full Mecha-Hitler, ChatGPT selling out prompts to advertisers. One of the biggest impacts LLMs will have on human society is as a propaganda tool with a reach of billions.