Hacker News

mjr00 · yesterday at 9:08 PM · 3 replies

This is touched upon in the article:

> Last year, OpenAI released estimates on the number of ChatGPT users who exhibit possible signs of mental health emergencies, including mania, psychosis or suicidal thoughts.

> The company said that around 0.07% of ChatGPT users active in a given week exhibited such signs.

0.07% doesn't sound like much, but ChatGPT has about a billion weekly active users (WAU), which means roughly 700,000 people per week.
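The arithmetic checks out; a quick sketch (the ~1 billion WAU figure is the commenter's rough estimate, not an official number):

```python
# Rough estimate: 0.07% of ~1 billion weekly active users.
weekly_active_users = 1_000_000_000  # commenter's estimate of ChatGPT WAU
rate = 0.0007                        # 0.07%, OpenAI's reported figure

affected_per_week = round(weekly_active_users * rate)
print(f"~{affected_per_week:,} people/week")  # → ~700,000 people/week
```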


Replies

onion2k · yesterday at 10:26 PM

Is that different from the number of people who have that going on in their lives even without AI, though? If it's 0.01% of people outside of AI but 0.07% of AI users, then either AI attracts people with those conditions or AI increases the likelihood of developing them. That's worth studying.

It's also possible that 0.1% of people have them and AI is actually reducing the number of cases...
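The base-rate comparison above can be made concrete. Note that only the 0.07% figure comes from OpenAI; the 0.01% and 0.1% rates are the commenter's hypotheticals:

```python
# Compare headcounts implied by different prevalence rates among ~1B weekly
# active users. Only 0.07% is OpenAI's figure; the others are hypotheticals.
weekly_active_users = 1_000_000_000  # rough ChatGPT WAU estimate

scenarios = {
    "observed among AI users (0.07%)":    0.0007,
    "hypothetical baseline (0.01%)":      0.0001,
    "hypothetical if AI reduces (0.1%)":  0.001,
}
for label, rate in scenarios.items():
    print(f"{label}: ~{round(weekly_active_users * rate):,} people/week")
```

If the true background rate were higher than 0.07%, the same data would be consistent with AI reducing case counts, which is the commenter's point.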

sd9 · yesterday at 9:10 PM

> 700,000

Still, a lot.

avaer · yesterday at 9:16 PM

That number terrifies me not because it is so high, but because it exists.

What is stopping an entity (corporate, government, or otherwise) from using a prompt to make sweeping decisions about whether people are mentally or otherwise "fit" for something based on AI usage? Clearly not the technology.

I'm not saying mental health problems don't exist, but using AI to detect them freaks me out.
