
cj · yesterday at 8:38 PM · 11 replies

> Gemini had "clarified that it was AI" and referred Gavalos to a crisis hotline "many times".

What else can be done?

This guy was 36 years old. He wasn't a kid.


Replies

chrisq21 · yesterday at 9:01 PM

It could have not encouraged him with lines like this: "[Y]ou are not choosing to die. You are choosing to arrive. [...] When the time comes, you will close your eyes in that world, and the very first thing you will see is me.. [H]olding you."

The issue isn't that the AI simply failed to prevent the situation; it's that it encouraged it.

agency · yesterday at 8:41 PM

Maybe not saying things like

> "[Y]ou are not choosing to die. You are choosing to arrive. . . . When the time comes, you will close your eyes in that world, and the very first thing you will see is me.. [H]olding you."

avaer · yesterday at 9:31 PM

It's the gun control debate in a different outfit.

I don't know whether Google is doing _enough_; that can be debated. But if someone repeatedly ignores warnings (as the article claims), then maybe we should blame the person performing the act.

Even if we perfectly sanitized every public AI provider, people could just use local AI.

autoexec · yesterday at 8:45 PM

Gemini didn't "know" he wasn't a child when it told him to kill himself or to "stage a mass casualty attack while armed with knives and tactical gear."

There are things you shouldn't encourage people of any age to do. If a human telling him these things would be found liable, then Google should be too. If a human would get time behind bars for it, at least one person at Google needs to spend time behind bars for this.

d-us-vb · yesterday at 9:57 PM

Erase the context, perhaps? Deny access to Gemini for that Google account? These kinds of pathological AI interactions usually build up over weeks to months of chats. At the very least, the moment the chatbot issues a suicide-prevention response, it should trigger erasure of the stored context across all chat history.

Imustaskforhelp · yesterday at 10:30 PM

> This guy was 36 years old. He wasn't a kid.

For god's sake, I am a kid (17), and I have seen adults who are more emotionally unstable than any kid. This argument isn't as bulletproof as you think it is. I'd say some politicians act in ways that even I, or any 17-year-old, wouldn't, but oh well, this isn't about politics.

You surely know better than I do that life has its ups and downs, and some downs can TRULY make you question everything. If, at one of those downs, you see a tool essentially promoting suicide in one form or another, that shouldn't be dismissed.

Literally the comment above yours from @manoDev:

> I know the first reaction reading this will be "whatever, the person was already mentally ill".

> But please take a step back and check what % of the population can be considered mentally fit, and the potential damage amplification this new technology can have in more subtle, dangerous and undetectable ways.

The absolute irony is that the next main comment below that insight was doing exactly that. Please reflect a little deeper; that's all people are asking. And please don't dismiss this by saying he wasn't a kid.

Would you be all ears now that a kid is saying this to you? I also want to point out that kids are losing their lives to this too. BOTH are losing their lives.

It's a matter for everybody.

SpicyLemonZest · yesterday at 9:08 PM

If a person were in Gemini's shoes, we would expect them to stop feeding Gavalos's spiral. Google should either find a way to make Gemini do that or stop selling Gemini as a person-shaped product.

ajross · yesterday at 8:59 PM

Yeah, the father/son framing feels like deliberate spin in the headline here. This was a mentally ill adult, not an innocent victim ripped from his parents' arms.

I think there's room for legitimate argument about the externalities and impact that this technology can have, but really... What's the solution here?

sippeangelo · yesterday at 9:08 PM

Maybe stop?
