Hacker News

michaelmior · yesterday at 10:41 AM

Yes, but then the only way to identify this behavior is a report from a minor. I'm not saying I trust TikTok to only do good things with access to DMs, but I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted.

I'm not saying no E2E messaging apps should exist, but maybe they don't need to exist for minors in social media apps. Alternatively, the encryption key could be shared with a parent, so that someone retains the ability to monitor the messages.


Replies

danlitt · yesterday at 11:03 AM

> I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted

Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People are paranoid about this sort of thing not because they think law enforcement is somehow more effective when it is constrained, but because how easily crimes can be prosecuted is only one dimension of safety.

> However, an alternative could be allowing the sharing of the encryption key with a parent

Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?

EmbarrassedHelp · today at 3:04 AM

> I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.

The problem with that idea is that it implies E2E should require age verification. Everyone should have access to secure end-to-end encryption.

hogwasher · yesterday at 10:42 PM

Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case they depict a nude minor? No matter how you do that, you get false positives, and then either unfair auto-bans and erroneous reports to law enforcement (if no human reviews the images), or human employees viewing other adults' consensual nudes that were meant to be private. Or it results in adult employees viewing nudes sent from one minor to another, which would also be a major breach of those minors' privacy.

There is a program whereby police generate hashes from CSAM images, and those hashes are then automatically compared against the hashes of photos uploaded to websites, identifying known CSAM without any investigator having to actually view the images and further infringe on the victim's privacy. But that only works against already-known images, and it can be done automatically whenever an image is uploaded, prior to encryption. The encryption doesn't prevent it.
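A minimal sketch of that mechanism. Real systems (e.g. Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding; an exact SHA-256 stands in here, and the "known image" byte strings are placeholders, not real data:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Exact cryptographic hash; real deployments use perceptual hashing."""
    return hashlib.sha256(data).hexdigest()

# In practice these hashes would be supplied by law enforcement / NCMEC;
# the placeholder byte strings below only illustrate the mechanism.
known_hashes = {sha256_hex(b"known-image-1"), sha256_hex(b"known-image-2")}

def matches_known_image(image_bytes: bytes) -> bool:
    """Run on upload, before the message is encrypted: compare the
    image's hash against the known-hash set. No human ever views the
    image; only a hash membership test is performed."""
    return sha256_hex(image_bytes) in known_hashes
```

Because the check happens client-side at upload time, before encryption, end-to-end encrypting the message afterwards doesn't defeat it, which is the point being made above.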

Point being, disallowing encryption sacrifices a lot, while potentially not even being that useful for catching child abusers in practice.

I'm sure some offenders could be caught this way, but it would also cause so many problems itself.
