Hacker News

kingstnap · yesterday at 8:21 PM · 2 replies

I like the language of "fueling" being used here instead of the typical causal framing, as though using AI means you will go insane.

I would completely agree that if you are already 1x delusional then AI will supercharge that into being 10x delusional real fast.

Granted, you could argue access to the internet was already something like a 5x multiplier from baseline anyway, given the prevalence of echo-chamber communities. But now you can just create your own community with chatbots.


Replies

whazor · yesterday at 8:36 PM

One of the most reliable ways to induce psychosis is prolonged sleep deprivation. And chatbots never tell you to go to bed.

shadowgovt · yesterday at 8:24 PM

My understanding of LLMs with attention heads is that they function as a bit of a mirror: the context shifts from the initial conditions toward the topic of conversation, and the topic is fed by the human in the loop.

So someone who likes to talk about themselves will get a conversation all about them. Someone talking about an ex is gonna get a whole pile of discussion about their ex.

... and someone depressed or suicidal, who keeps telling the system their own self-opinion, is going to end up with a conversation that reflects that self-opinion back at them as if it were coming from another mind. Which is the opposite of what you want to provide as therapy for those conditions.
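The mirror intuition can be sketched with toy scaled dot-product attention. This is a deliberately simplified illustration, not any real model: the two-dimensional "embeddings" and the system/topic split are invented for the example. The point is only that when recent context is dominated by one topic, the softmax weights concentrate on that topic, so the output echoes what the user put in.

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy context: one "system prompt" token and three user tokens that
# all point in the same direction (the user's recurring topic).
system = np.array([1.0, 0.0])
user_topic = np.array([0.0, 1.0])
K = V = np.stack([system, user_topic, user_topic, user_topic])

# The next-token query resembles the recent conversation, not the prompt.
q = user_topic[None, :]
out, w = attention(q, K, V)

# Most of the attention mass lands on the user's own topic tokens,
# so the attended output points back along the user's direction.
print(w.round(3))
```

Under these toy numbers the three topic tokens absorb roughly 85% of the attention weight, which is the "mirror" effect in miniature: the model's next step is pulled toward whatever the human keeps feeding it.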
