Hacker News

Forgeties79 · today at 1:55 AM

ChatGPT is not a human being, let alone a licensed therapist. You don’t call a therapist at 3 in the morning. You go to a hospital. If you are literally about to kill yourself, Sam Altman is not your answer.

Hell, call a crisis hotline. Talk to a person, not a potential (bot) enabler.


Replies

projectazorian · today at 4:50 AM

> ChatGPT is not a human being, let alone a licensed therapist. You don’t call a therapist at 3 in the morning. You go to a hospital. If you are literally about to kill yourself Sam Altman is not your answer.

You know that mental health is a continuum, right? There are a lot of problems people have that fall far short of active suicidal ideation. Maybe you think they should just add them to their journal for discussion at their regularly scheduled therapy session, but the world doesn't work that way. The "ruminating at 3am" headspace can be a productive one, and it's difficult to access in a normal therapy session.

Not to mention that many people who have actually called suicide hotlines will tell you that they aren't terribly helpful. (edit: not saying that they're always unhelpful, but many people have unhelpful experiences, or have, e.g., social anxiety that stops them from calling)
