Most of this debate makes more sense if the actual goal is liability reduction, not child safety. If it were genuinely about protecting kids, you'd regulate infinite scroll and algorithmic engagement optimization, not who can log in.
Most people have only a light grasp of what infinite scroll and algorithmic engagement optimization mean. They know they like the scrolling apps more, but it takes some research and education to really understand the specific mechanics and the alternatives. Those of us who are tech literate get this, but many people using these apps today are neither tech literate nor old enough to remember a world before infinite-scrolling media. The mechanism seems incredibly obvious, but I've explained it to people and it takes a few times for it to really sink in and become a specific mental model for how they see the world.
I think this would be good for everybody, not just kids. It doesn't even have to be complicated: just that after a certain amount of time scrolling/watching, put in a message asking if it's maybe time to stop, with some information about how these algorithms try to keep you engaged for as long as possible. Maybe a link to a government page with more information.
It doesn't have to be perfect, and there will of course be easy workarounds to hide the warnings for people who want them. The goal is to improve the situation, though, not solve it perfectly. Like putting information about the dangers of smoking on cigarette packages: it doesn't stop people from smoking, but it does make the danger very easy to learn about.
I'm happy they don't, because they don't know what they're doing. Hopefully countries that prioritize public health will implement a social media ban for the vulnerable population, which gives them some time to grow up without all that garbage poisoning their brains. Then by the time they're 16 or whatever age, hopefully we'll have realized that this is actually like cigarettes, and all age groups will treat it that way.
Better than muddying the waters trying to make it less addictive but then letting them on there when their brains aren't ready.
If the concern is time-wasting, even having upvotes or likes and sorting on them is plenty engaging. I spent thousands of hours as a teenager on Reddit, HN, and the old blue Facebook chronological feed.
I think it's because there's always a group of nosy busybodies finger-wagging about protecting the children, and we have to do decorative theatrics to satiate whatever narratives they've convinced themselves of.
Pretty sure they're doing both of those things, but it takes a long time for the regulation to reach the final stage.
Interestingly, regulating these would be good for adults as well. A lot of these very large online companies enjoy an asymmetric power advantage. We should aim to protect ourselves against them, in addition to our children.
If the US really cared about child safety, they'd go after the people in the Epstein files.