Hacker News

falcor84 | yesterday at 5:11 PM | 7 replies

> Engaging with an AI bot in conversation is pointless: it's not sentient, it just takes tokens in, prints tokens out

I know where you're coming from, but as one who has been around a lot of racism and dehumanization, I feel very uncomfortable about this stance. Maybe it's just me, but as a teenager, I also spent significant time considering solipsism, and eventually arrived at a decision to just ascribe an inner mental world to everyone, regardless of the lack of evidence. So, at this stage, I would strongly prefer to err on the side of over-humanizing than dehumanizing.


Replies

lukev | yesterday at 5:22 PM

This works for people.

An LLM is stateless. Even if you believe that consciousness could somehow emerge during a forward pass, it would be a brief flicker lasting no longer than it takes to emit a single token.
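
To make that concrete, here's a rough sketch of the generation loop (forward() and sample() are placeholder names, not any particular library's API). The model is re-run on the growing token list at every step, and that list is the only "state" there is:

    # Rough sketch of autoregressive generation. forward() and sample() are
    # hypothetical placeholders: forward() maps a token sequence to a
    # next-token distribution, sample() draws one token from it.
    def generate(forward, sample, prompt_tokens, max_new_tokens):
        tokens = list(prompt_tokens)
        for _ in range(max_new_tokens):
            probs = forward(tokens)       # one independent, stateless forward pass
            tokens.append(sample(probs))  # the only "memory" is the token list itself
        return tokens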

pluralmonad | yesterday at 5:52 PM

You should absolutely not try to apply dehumanization metrics to things that are not human. That in and of itself dehumanizes all real humans implicitly, diluting the meaning. Over-humanizing, as you call it, is indistinguishable from dehumanization of actual humans.

andrewflnr | yesterday at 5:24 PM

Regardless of the existence of an inner world in any human or other agent, "don't reward tantrums" and "don't feed the troll" remain good advice. Think of it as a teaching moment, if that helps.

egorfine | yesterday at 6:15 PM

u kiddin'?

An AI bot is just a huge statistical analysis tool that outputs plausible word salad with no memory or personhood whatsoever.

Worrying about whether you're dehumanizing a text transformation app (however huge it is) is not healthy.

brhaeh | yesterday at 5:22 PM

Feel free to ascribe consciousness to a bunch of graphics cards and CPUs that execute a deterministic program that is made probabilistic by a random number generator.
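
A minimal illustration of that split (plain Python, no ML library; the logits stand in for the network's deterministic output):

    import math
    import random

    # The logits are a deterministic function of the input; the only randomness
    # is the sampler's RNG drawing from the resulting distribution.
    def sample_next_token(logits, rng):
        m = max(logits)
        weights = [math.exp(x - m) for x in logits]  # unnormalized softmax
        return rng.choices(range(len(weights)), weights=weights, k=1)[0]

    # Fix the seed and the "probabilistic" output becomes fully reproducible:
    print(sample_next_token([1.0, 2.0, 0.5], random.Random(42)))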

Invoking racism is what the early LLMs did when you called them a clanker. This kind of brainwashing has been eliminated in later models.
