A transmission error does not have a strictly contained blast radius.
A bad packet could tell a flying probe to fire all of its thrusters at once and deplete its fuel in fifteen minutes.
What keeps a transmission error contained is all the protection mechanisms layered on top of it. Likewise, an LLM cannot delete a production database unless you give it the access to do so.
My hot take is that many people are naturally more comfortable with deterministic systems that have clearly analyzable outcomes. Software engineering has historically been oriented around deterministic systems, and it has attracted that type of thinker.
But many of us, myself included, prefer chaotic systems where you can't fully nail down every cause and effect. The challenge of building a prediction model on top of chaos is exhilarating. I don't find nearly as many people like me in SWE as I do in, say, the graphic design department.
To me, that's the underlying threat here: LLMs are rewriting a field that has previously self-selected for a certain type of person, and this, quite understandably, rubs them the wrong way.
I don't need to be able to write proofs about my maths using logic and determinism. If the answer comes out in a way that I like, then it has to be correct!
Insightful.
Feels like this maps to the J/P axis of Myers-Briggs.
Yes, but when all it takes to avoid this chaos is hiring someone with five or ten years of experience for a reasonable wage, this entire perspective looks insane.
It's... just... not that hard to write code, nor does it cost that much. There are millions of us working quietly at places that aren't "big tech". We all shrugged our shoulders, took a sip of coffee, and went back to our Teams meetings, where the only LLM usage is still just Copilot.