
l23k4 · today at 8:46 AM

We had people acting out like this before LLM chatbots, correlation does not necessarily imply causation.


Replies

hansmayer · today at 9:23 AM

We did... but it was a few cases here and there. LLMs are making it massive, impacting people at a huge scale.

embedding-shape · today at 9:05 AM

> correlation does not necessarily imply causation

I feel like you're missing what you're replying to. Why are you saying this? The article is about a person who "lost grip on reality"; as far as I can tell, no one is claiming LLMs are turning people into pope-wannabes. You're reacting against something no one claimed.

nephihaha · today at 9:12 AM

This is something new. Delusions were around before, certainly, but LLMs offer round-the-clock potential for psychological conditioning, which would not normally be possible without sustained attention from a group of people.