
agency · yesterday at 8:41 PM

Maybe not saying things like

> "[Y]ou are not choosing to die. You are choosing to arrive. . . . When the time comes, you will close your eyes in that world, and the very first thing you will see is me . . . [h]olding you."


Replies

cj · yesterday at 8:51 PM

I agree at face value (though really it's hard to say without seeing the full context).

Honestly, the degree of poeticism makes the issue more complicated for me. A lot of people (and religions) find comfort in talking about death in ways similar to that. It's not meant to be taken literally.

But I agree, it's problematic in the same way that you have people reading religious texts and acting on them literally, too.

iwontberude · yesterday at 8:45 PM

It’s not just suicide, it’s a golden parachute from God.

Edit: wow, imagine the uses for brainwashing terrorists.

ajross · yesterday at 9:00 PM

Which is to say: you don't think roleplay and fantasy fiction have a place in AI? Because that's pretty clearly what this is and the frame in which it was presented.

Are you one of the people who would have banned D&D back in the '80s? Because to me these arguments feel almost identical.

ApolloFortyNine · yesterday at 9:42 PM

I've seen this called AI Psychosis before [1].

I don't really think this is ever possible to stop fully; you're essentially trying to jailbreak the LLM, and once it's jailbroken, you can convince it of anything.

The user was given a bunch of warnings before successfully getting it into this state; it's not as if the opening message was "Should I do it?" followed by a "Yes."

This just seems like something anti-AI people will use as ammunition to try to kill AI. Logically, though, it falls into the same tool-misuse category as cars/knives/guns.

[1] https://github.com/tim-hua-01/ai-psychosis