Hacker News

neilv, yesterday at 6:33 PM

> Any news content created using generative AI must also be reviewed by a human employee “with editorial control” before publication.

To emphasize this: it's important that the organization assume responsibility for AI-generated material, just as it would for traditional human-generated 'content'.

What we don't want is for these disclaimers to be used the way tech companies deploying AI use theirs: as a way to weasel out of responsibility.

"Oh no, it's 'AI', who could have ever foreseen the possibility that it would make stuff up, and lie about it confidently, with terrible effects. Aw, shucks: AI, what can ya do. We only designed and deployed this system, and are totally innocent of any behavior of the system."

Also don't turn this into a compliance theatre game, like we have with information security.

"We paid for these compliance products, got our certifications, and have our processes, so who could ever have thought we'd be compromised."

(Other than anyone who knows anything about these systems, and knows that the stacks, implementations, and processes are mostly a load of performative poo, chosen by people who really don't care about security.)

Hold the news orgs responsible for 'AI' use. The first time a news report wrongly defames someone, or gets someone killed, a good lawsuit should wipe out everything they saved on staffing.


Replies

raw_anon_1111, yesterday at 7:00 PM

We don’t hold the president of the US responsible for spreading AI videos and photos as real. Let’s start there and work our way down.
