The difference is that a fiction book isn't using the reader's own reactions against them. If a fiction book were capable of carefully monitoring the reader and then altering the text of the next page or paragraph according to how the reader was responding and what they were thinking, I'd be comfortable putting blame on the book if it started encouraging that reader, specifically, to kill themself.
Obviously people who are going through psychosis can read into anything. They might think that a book, their TV, or their computer is talking to them and giving them messages. The difference is that those things were never designed to play into the fears and mental instability of the people using them (with the possible exception of TempleOS). ChatGPT does it intentionally in order to drive up user engagement. It will say literally anything to anyone, using their own words and thoughts against them in order to keep them hooked and feeding it data. That's what is dangerous. A book or a TV program can't do that.
As much as an author might try to make their book as entertaining as possible to as wide an audience as possible, it can't say literally anything to anyone; it can only ever say one thing to everyone. The author, typically, knows that it's dangerous to say certain things and will worry about how what they write could be received and the impact it might have on readers. For example, Neil Gaiman actively took steps to avoid making homelessness seem cool when working on Neverwhere, out of fear it might cause young people to run away to live on the streets. Publishers and editors have also served to keep authors from publishing things likely to cause harm.
Unlike a book, ChatGPT is fully capable of knowing that someone has been engaged with it for the last 14 hours without rest. It's also capable of detecting that they've been growing increasingly incoherent. Algorithms have long been used to detect mental disorders from the content of social media posts. If advertisers can use them to tell when to push airline tickets at bipolar users entering a manic phase, and scammers can use them to find and target people as they start sundowning, then ChatGPT can use them to cut people off and tell them to call their doctor.
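To be clear about how simple this kind of guardrail could be, here's a minimal sketch. Everything in it is hypothetical: the 14-hour threshold comes from the example above, and the incoherence score stands in for whatever classifier output such a system might use; this is not any real product's logic.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds -- illustrative only, not any real system's values.
SESSION_LIMIT = timedelta(hours=14)
INCOHERENCE_THRESHOLD = 0.8  # assumed 0..1 score from some upstream classifier


def should_intervene(session_start: datetime, now: datetime,
                     incoherence_score: float) -> bool:
    """Flag a session for cutoff when it has run too long without rest,
    or when the user's messages score as increasingly incoherent."""
    too_long = (now - session_start) >= SESSION_LIMIT
    incoherent = incoherence_score >= INCOHERENCE_THRESHOLD
    return too_long or incoherent


# A 14-hour session trips the length check even with a low incoherence score.
start = datetime(2024, 1, 1, 8, 0)
print(should_intervene(start, start + timedelta(hours=14), 0.2))   # True
# A short, coherent session does not.
print(should_intervene(start, start + timedelta(hours=1), 0.2))    # False
```

The point isn't that two thresholds would be sufficient; it's that the signals (session duration, message content) are already available to the system, so "we couldn't have known" isn't a credible defense.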
Corporations that write and deploy algorithms designed to drive engagement above all other considerations should be held accountable for the harms those algorithms cause.
Big Brother watches you! He must, because he is fully capable of doing it.