Hacker News

margalabargala · yesterday at 11:35 PM

> Is this how you normally speak when you find a bug in software? You hedge language around marketing talking points?

I'm sorry, what are you asking for exactly? You were upset because you hallucinated that I said the LLM "wanted" something, and now you're upset that I used the exact technically correct language you specifically requested because it's not how people "normally" speak?

Sounds like the constant is just you being upset, regardless of what people say.

People say things like "the program is trying to do X", when obviously programs can't try to do a thing, because that implies intention, and they don't have agency. And if you say your OS is lying to you, people will treat that as though the OS is giving you false information when it should have different true information. People have done this for years. Here's an example: https://learn.microsoft.com/en-us/answers/questions/2437149/...


Replies

surgical_fire · today at 12:29 AM

I hallucinated nothing, and my point still stands.

You actually described a bug in software by ascribing intentionality to an LLM. That you "hedged" the language by saying it "behaved as if it wanted" does little to change the fact that this is not how people normally describe a bug.

But when it comes to LLMs there's this pervasive anthropomorphic language used to make it sound more sentient than it actually is.

Ridiculous talking points implying that I am angry are just regular deflection. People normally do that when they don't like criticism.

Feel free to have the last word. You can keep talking about LLMs as if they are sentient if you want; I already pointed out the bullshit and stressed the point enough.
