> Discord seems to have intentionally softened its age-verification steps so it can tell regulators, “we’re doing something to protect children,” while still leaving enough wiggle room that technically savvy users can work around it.
...source?
I sincerely doubt that Discord's lawyers advocated for age verification that was hackable by tech-savvy users.
It seems more likely that they are trying to balance two things:
1. Age verification requirements
2. Not storing or sending photos of people's (children's) faces
Both of these are legally important for protecting the company. It is highly unlikely that anyone in Discord's leadership, let alone compliance, is advocating for backdoors (at least not for users like us).
Usually in cases like this there is no source; there can't be. Long ago, long enough to be past the statute of limitations, I was involved in a similar regulatory compliance situation. We deliberately communicated in a way that kept "actual effectiveness" out of the discussion, and we set that up in a single verbal-only, unrecorded meeting between the team and one of the lawyers.
The point is, these kinds of schemes exist: internal communication is deliberately hobbled so the company can maliciously comply with requirements while staying completely in the clear as far as any recorded evidence goes. And there's always at least one person piping in with a naïve "source?", as if people would keep recorded evidence of their criminal conspiracies.