Hacker News

misterflibble · today at 11:52 AM · 12 replies

Subtly? I beg to differ. My team leader only communicates to me using his LLM and so his "thoughts" are not his own!


Replies

jerrygarcia · today at 12:17 PM

I often wonder if the popularity of LLMs among company executives comes down to their being the perfect yes-men.

They rarely disagree with any idea or proposal, providing a salve for the insecurities of their users.

oddsockmachine · today at 4:29 PM

I'm dealing with the same nonsense. I get LLM-generated reviews of my work, documents, and plans that are not grounded in reality or nuance, and I regularly have to explain why the AI is wrong. I was told I should run my docs through the LLM to make them read better. But they're not even being read by humans at this point.

beached_whale · today at 12:35 PM

This is one of my fears with all this: losing one's voice, everyone's expression distilled to the mean. It also has ramifications for things like recognizing whether a person is who they say they are. At least currently, sounding like an LLM is punished and shunned, but it's well within reason to see that shift to individuality being penalized.

nidnogg · today at 12:59 PM

Guilty as charged. When I'm insecure about a response, or I don't have enough expertise in the topic at hand, I end up running it through an LLM. Lately I've been trying harder to keep my original ideas as much as possible. I'm seeing a bit of improvement, but it's still too early to tell.

ge96 · today at 3:28 PM

Man, that's so annoying. I have a similar problem: our devops person, when I ask a question, literally gives me AI responses.

Also annoying: I'm working with a non-technical "partner" who just sends me an LLM dump of how to do something.

I was trying to explain it to them with an analogy: it's like showing up to a mechanic and telling them what to do based on what ChatGPT said.

eru · today at 12:25 PM

Well, has it been an improvement?

avaer · today at 12:40 PM

Just because thoughts are translated doesn't mean they are consumed in the process.

However, I don't doubt that many "team leaders" can and should be replaced with LLMs.

ModernMech · today at 11:58 AM

AI doesn't have to be conscious or sentient to take over; all that needs to happen is for politicians, law enforcement, journalists, educators, etc. to uncritically parrot everything it outputs. The military is already using AI to make targeting decisions. If they just go with whatever strikes the AI recommends, then AI is already fighting our wars.

SecretDreams · today at 12:33 PM

I would be looking for another job.

I'm fine with using LLMs as coding tools. But I find it deeply offensive when someone is very explicitly using them to communicate with me.

Communication is such a deeply human experience. It lets people feel each other out and learn things beyond just the words being said. To have that filtered out by an LLM is just disgraceful.

MattGaiser · today at 12:17 PM

And I would bet he judges your work with AI, assigns you work generated by AI, and perhaps evaluates whether you yourself use enough AI.

tcp_handshaker · today at 12:29 PM

[dead]