Hacker News

Heliodex | yesterday at 7:26 PM

> But they aren't finding the kids on discord because Discord is not a social network that links pedophiles with children. Roblox is that.

If being more open and public correlates strongly enough with "links pedophiles with children", then yeah, true. Given its track record, I expect Roblox to keep doing plenty to improve its platform safety. The recent introduction of its ID verification system, which prevents communication between users outside of specified age buckets and exists solely to improve child safety, is working and significantly reducing cases of child exploitation both on and off the platform.

> it facilitates pedophiles predating on children.

I don't agree that it facilitates this kind of behaviour, nor is there enough evidence to support such a claim. Try red-teaming it: take the place of a bad actor who aims to harm a child on the Roblox platform.

First, the actor will need an account. Next, they'll need to join a game (one with a low content maturity rating) and find a vulnerable child. Of course, the actor can't actually communicate with the child yet, so they need to verify their age. For this, they will need either the face of a child, a verifiable ID document belonging to a child, or the ability to defraud the system (most likely by forging an ID document).

Assume they can get past that, are now placed in the same age bracket as the child, and can thus communicate with them. They now need to direct the child off-platform to avoid being caught easily by Roblox's safety teams and systems. They won't be able to use a social media link, since the actor isn't permitted to send one and the child isn't permitted to receive one, both being classified as minors by Roblox. So the actor will need to use chat or private messages to give the child specific instructions, which will of course be heavily censored because both parties are minors.

Okay, perhaps an easier route would be for the actor to create an experience with an unfiltered messaging system and have the child join them in it (if the child's parental experience restrictions prevent this, the actor is out of luck). They'll of course need to get the experience approved at a sufficient content maturity level. However, any unfiltered messaging system will be caught quickly by automated checking; all experience communications must go through TextService:FilterStringAsync(). Roblox's text filtering is industry-leading, and much more protective than that of almost any other platform allowing free-form text communication.
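To make the filtering requirement concrete, here's a minimal Luau sketch of how a custom chat system in an experience is expected to route text through TextService:FilterStringAsync() on the server. This is an illustrative sketch, not a definitive implementation; the RemoteEvent name "CustomChat" is invented for the example.

```lua
-- Server-side script: every message from a client is filtered before relay.
local TextService = game:GetService("TextService")
local Players = game:GetService("Players")
local ReplicatedStorage = game:GetService("ReplicatedStorage")

-- Hypothetical RemoteEvent used by the experience's custom chat UI.
local chatEvent = ReplicatedStorage:WaitForChild("CustomChat")

chatEvent.OnServerEvent:Connect(function(sender, message)
	-- FilterStringAsync can throw (e.g. service hiccup), so wrap in pcall
	-- and fail closed: never relay text that wasn't filtered.
	local ok, filterResult = pcall(function()
		return TextService:FilterStringAsync(message, sender.UserId)
	end)
	if not ok then
		return
	end

	for _, recipient in ipairs(Players:GetPlayers()) do
		-- The filtered form can differ per recipient (age settings, etc.),
		-- so each recipient gets their own filtered string.
		local ok2, filtered = pcall(function()
			return filterResult:GetChatForUserAsync(recipient.UserId)
		end)
		if ok2 then
			chatEvent:FireClient(recipient, sender.Name, filtered)
		end
	end
end)
```

The per-recipient GetChatForUserAsync call is the relevant point here: filtering is applied based on who is receiving the text, which is why an experience can't simply relay raw strings between users and stay within the rules.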

Regardless, we'll assume the actor somehow manages to direct the child off-platform. If a member of the safety team follows up on the chat/message history and finds a Terms of Service violation, they'll request that the alternative platform take action, which will of course have no effect, as the communications aren't against that platform's Terms of Service.

> And if Roblox was IRL, it would've already been sued into oblivion

I hope I've effectively demonstrated that it's much more difficult to abuse or exploit users on the platform after the recent updates. If Roblox were IRL, they wouldn't be able to implement any of these safety checks or features, and they probably would be sued into oblivion.


Replies

freejazz | yesterday at 8:56 PM

>If being more open and public correlates strongly enough with "links pedophiles with children", then yeah, true. Given its track record, I expect Roblox to keep doing plenty to improve its platform safety. The recent introduction of its ID verification system, which prevents communication between users outside of specified age buckets and exists solely to improve child safety, is working and significantly reducing cases of child exploitation both on and off the platform.

What are you, their PR agent?

>I don't agree that it facilitates this kind of behaviour.

It doesn't really matter if you agree that it does. It is a fact that it does.
