Hacker News

Jury finds Meta liable in case over child sexual exploitation on its platforms

436 points by billfor last Tuesday at 9:54 PM | 510 comments

Comments

Aurornis last Tuesday at 10:50 PM

Many will cheer for any case that hurts Meta without reading the details, but we should be aware that these cases are one of the key reasons why companies are backtracking from features like end-to-end encryption:

> The New Mexico case also raised concerns that allowing teens to use end-to-end encryption on Instagram chats — a privacy measure that blocks anyone other than sender and receiver from viewing a conversation — could make it harder for law enforcement to catch predators. Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

The New York case has explicitly gone after their support of end-to-end encryption as a target: https://www.reuters.com/legal/government/meta-executive-warn...

show 16 replies
dwedge yesterday at 10:50 AM

Maybe I'm just getting old and cynical but, while I think current social media is bad for children, I'm very suspicious of the current international agreement that it's time to take action, especially with all the ID verification coming from multiple avenues.

show 16 replies
nclin_ yesterday at 6:22 PM

$375 million awarded at $5,000 per child harmed implies that only 75,000 children were harmed.

Got away with it again, good profit, will repeat.

show 5 replies
sharkjacobs last Tuesday at 11:25 PM

> The New Mexico attorney general’s office created multiple fake Facebook and Instagram profiles posing as children as part of its investigation into Meta. Those test accounts encountered sexually suggestive content and requests to share pornographic content, the suit alleges.

> The fake child accounts were allegedly contacted and solicited for sex by the three New Mexico adult men who were arrested in May of 2024. Two of the three men were arrested at a motel, where they allegedly believed they would be meeting up with a 12-year-old girl, based on their conversations with the decoy accounts.

and

> “The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls,” Bejar said.

This is what it's about, right? The article doesn't make it seem like encryption is meaningfully part of this case at all.

> Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

There's no indication that the decision, or the announcement, was directly related to the trial; they just happened at the same time. It's a link drawn by CNN without presenting any clear connection.

show 1 reply
spenvo today at 6:08 PM

Meta's own research (and its use of it) has shown that it repeatedly ignores well-substantiated facts about the harms of its products. Now that Section 230 seems like a flawed shield, I fear the takeaway for other companies will be: never conduct honest research in the first place to preserve plausible deniability.

Meta has always wanted the appearance of caring about safety (it helps them attract talent and keep mission-related morale high) while nearly always prioritizing growth (save for tiny blips, like in 2017 when the fallout from the Cambridge Analytica scandal was hitting a crescendo). Companies like X, by contrast, are run by people explicitly disinterested in putting significant resources into safety, especially research.

I will also add that, for the past few years, Meta and X both have become extremely hostile to external researchers of their platforms, shutting down access to tools and data.

bradley13 yesterday at 6:11 AM

We don't want age verification, and we do want E2E encryption. Yet, because Meta is an evil company, we cheer on this judgement.

Reality, folks: you can't have both.

show 6 replies
lunias today at 12:29 PM

Social media for children should be moderated by their parents, full stop. End-to-end encryption exists. You cannot un-invent it. It is trivial to roll your own encrypted chat service.
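The claim above, that rolling your own encrypted chat is trivial, can be illustrated with a toy sketch in Python's standard library. This assumes the two parties already share a secret key out of band; the construction (SHA-256 counter-mode keystream plus HMAC) is purely illustrative, and a real service would use a vetted library such as libsodium or an established protocol like Signal's:

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy stream cipher: SHA-256 over (key, nonce, counter) blocks.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, plaintext: bytes) -> bytes:
    # Encrypt-then-MAC: XOR with keystream, then authenticate nonce + ciphertext.
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def open_sealed(key: bytes, blob: bytes) -> bytes:
    # Verify the tag in constant time before decrypting.
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered ciphertext or wrong key")
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# The shared key is agreed out of band by the two endpoints; the server
# relaying `blob` never sees it, which is the essence of end-to-end encryption.
shared = secrets.token_bytes(32)
blob = seal(shared, b"meet at noon")
assert open_sealed(shared, blob) == b"meet at noon"
```

The point is only that the relay in the middle carries opaque bytes; everything hard about a real E2E system (key exchange, identity verification, forward secrecy) is deliberately left out of this sketch.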

zeeshana07x yesterday at 1:23 PM

Fines like this only work if they're large enough to change behavior. $375M for a company Meta's size is more of an accounting entry than a deterrent.

show 3 replies
fny yesterday at 3:15 PM

This fine from New Mexico is about 0.6% of Meta's annual profit.

If all 50 states sue at the same rate, that'll be a 30% dent, and I'm sure states can sue for more than 0.6% too. That would be historic action against malfeasance and would send a strong FAFO signal to all corporations.

Let's lobby for it.
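The back-of-the-envelope arithmetic above can be checked in a few lines; the annual profit figure here is a rough assumption (on the order of $62B) chosen to be consistent with the commenter's 0.6% claim, not a number from the article:

```python
fine = 375_000_000
annual_profit = 62_000_000_000  # rough assumption, roughly Meta-scale annual profit

share = fine / annual_profit    # one state's fine as a fraction of annual profit
all_states = share * 50         # if all 50 states fined at the same rate

print(f"one state: {share:.1%}, fifty states: {all_states:.0%}")
```

Running this prints roughly `one state: 0.6%, fifty states: 30%`, matching the comment's numbers.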

show 1 reply
exabrial yesterday at 12:32 PM

That fine is missing a few zeros on the right side

show 1 reply
tombert yesterday at 5:10 PM

They had to pay about $375 million. That's a lot of money, but I suspect that Facebook has made considerably more than that on targeting children.

I'm hardly the first person to use this logic, but if they make more money breaking the law than they have to pay in fines, then it's not a fine, it's a business expense.

show 2 replies
sarbanharble yesterday at 12:25 PM

It takes 7 clicks to turn off ads that promote eating disorders. That's enough proof.

show 2 replies
CrzyLngPwd yesterday at 6:23 PM

The fine is just one of the costs of doing business for these megacorps.

show 1 reply
ourmandave yesterday at 10:16 AM

Do we have to wait for any appeals before the performative "mail out settlement checks for $1" routine?

show 1 reply
HardwareLust yesterday at 12:07 PM

$375M isn't even a slap on the wrist for a company that raked in $60B last year.

someguynamedq today at 3:06 PM

People place too much trust in third parties to manage the security of their communications.

cedws yesterday at 6:12 PM

Wasn't Zuckerberg caught red handed in emails signing off on this? When is he going to be facing consequences?

show 2 replies
deepsun yesterday at 12:22 AM

I cheer any decision that holds any private web property (like Facebook) accountable for its users' actions.

It helps reduce the hegemony of large social platforms and promotes privately owned websites. For example, I know everyone who has permission to post on my website (and I pre-moderate strangers' comments), and I am ready to take responsibility for what my website publishes.

Currently the legal stance seems strange to me: large media platforms are allowed to store, distribute, rank, and sell strangers' data, while at the same time claiming they are not responsible for it.

show 1 reply
awongh yesterday at 5:53 PM

As part of the ongoing enshittification of the internet, tragedy of the commons etc., these big centralized internet platforms decided that instead of being responsible and making their products *slightly* less terrible it was better to maximize short term engagement metrics, and that, egotistically, the chance of there being real consequences for their actions was near zero. (Or, even more cynically, that their yearly performance review was more important).

Now I'm afraid they've screwed everyone over, and the idea of an anonymous open internet is dead: we're gonna see age (read: real ID) verification gating on every site and app soon.

The dumb thing is to look back and see how unimportant it was for Facebook's feed algorithm to be this addictive. They already had the network effects and no real competitors. They could have just left it alone.

show 5 replies
throw7 yesterday at 1:59 PM

If Meta did advertise the "safety of its platforms for young users," then they should be held accountable for that. It seems clear from the whistleblowers that Meta had internal data showing its platforms were not safe for young users, but Zuck gotta get those ads ($$$) in front of young kids.

show 2 replies
Aboutplants yesterday at 1:13 PM

Why can't penalties be tied to a percentage of revenue?

show 2 replies
billfor yesterday at 5:12 PM

and also https://news.ycombinator.com/item?id=47514916 It might be good to roll all the comments together.

show 1 reply
elwebmaster yesterday at 7:23 AM

Can one be opposed to age verification in the OS and yet totally happy that Meta got this fine? There is a very big difference between e2e-encrypted messaging/telephony and social media. Social media is more akin to a phone book, and I do not recall there ever being any phone books listing minors. That's completely unacceptable and unnecessary. I am totally OK with phone books (or their modern digital equivalents, which enable people discovery and user-generated content discovery) abiding by the same KYC rules as banks, and being only for adults. Your kids using e2e-encrypted messaging to communicate with friends they have met in person? Nothing wrong with that; we all have the right to privacy. Kids listing their contact information publicly? Absolutely not.

0ckpuppet yesterday at 11:27 AM

The leaders of these companies don't let their kids use it.

show 1 reply
bandie91 today at 9:02 AM

What is so hard about teaching children not to e-message strangers, just like not snail-mailing strangers? Also, the parents should be able to join the conversation, just like in the analogue world. Call me backward, but I don't want to outsource parenting to either the government or remote businesses.

show 1 reply
muskyFelon yesterday at 1:07 PM

Regulate and fine social media and adtech companies until it's no longer economically feasible to generate the massive profits and stock valuations that are prompting this garbage.

show 2 replies
elAhmo yesterday at 2:26 PM

They earn this in around 16 hours.

montroser yesterday at 10:41 AM

Cost of doing business...

show 2 replies
badpenny yesterday at 1:13 PM

0.6% of last year's profits.

show 1 reply
electric_muse yesterday at 10:39 AM

The same company intentionally driving minors towards this content (despite claiming to care about them) is also lobbying in secrecy for requiring all of us to scan our ID and face in order to use our phones and computers.

Their stated reason? Child safety.

Their actual reason? You can figure that out.

show 16 replies
nixass yesterday at 11:22 AM

Oh no those pesky Europeans extorting money from US tech companies. No, wait..

CobrastanJorji yesterday at 4:49 PM

"We remain confident in our record of protecting teens online," said the company that clearly was not punished enough to hurt their confidence.

Alen_P yesterday at 1:27 PM

Most Facebook users are basically teenagers, so it's no wonder it took them this long to add any real restrictions...or maybe they just wanted us to think they cared.

WarcrimeActual yesterday at 5:22 PM

I haven't read this article, but I can tell you for certain that no verdict was handed down that will punish them in any way that matters. They have and generate more money than they could ever spend and they're functionally above the law because of the money and lawyers they can afford. The law itself is broken in this country and when you get big enough you can literally get away with murder.

show 5 replies
bilekas today at 11:38 AM

If I were to put my tinfoil hat on, one could see a world where Facebook let this happen in the first place in order to have a case to make for less security in communications.

I don't like Meta in any sense of the word, and I think they've degraded humanity and society as a whole significantly for generations to come. But I hope my conspiratorial mind is just overreacting.

mattfrommars yesterday at 2:49 PM

That’s good! We need to protect our children.

But who gets the $375 million? Does anyone know what cut the law firm will get from this incredible amount of money?

tremon yesterday at 5:38 PM

"told to pay"? As in, they're not even fined? What a horrible choice of headline.

girishso yesterday at 3:37 PM

Why do we call this company "Meta"? It's the same old "Facebook".

show 3 replies
maqnius yesterday at 6:15 PM

Tsk tsk... it's only allowed to harm adults and the environment for profit.

show 1 reply
SlightlyLeftPad yesterday at 5:53 AM

$375M - That’s it?!

show 1 reply
fragmede yesterday at 4:30 AM

So... end-to-end message encryption means Meta can't see the messages child molesters are sending to children.

show 2 replies
fridder yesterday at 5:18 PM

I wonder if this will stand, and if it will lead to more suits against Meta.

paxys last Tuesday at 10:49 PM

Happy to see it, but if a fine is the only consequence then they’re going to go back to doing the exact same thing tomorrow.

cwmoore yesterday at 10:39 AM

Seems insufficient to keep Social Security solvent after 2040.

Are the kids alright?

camillomiller today at 10:20 AM

I really would love to be inside the minds of the Meta spokespeople who have to craft messages that completely hide the truth, sound convincing, and then live with it, just to understand how they do it without blowing up. I think that's also quite damaging to someone's mental health.

randycupertino yesterday at 5:08 AM

This is one of the first times a jury has found the platform itself liable, cutting against the frequent industry claim that platforms merely host content and are never responsible for it. $375 million sounds big but is peanuts compared to their annual revenue. And of course Meta will appeal and then try to drag everything out for years and years. Expect copycat lawsuits.

These platforms expose minors to predators and bad actors, and Meta was shown to have lied about safety.

show 2 replies
vpShane yesterday at 3:34 PM

Make the fine scale to fit the severity of the issue. This should be $375 billion, not $375 million. These are our future generations they're destroying.

notnullorvoid yesterday at 6:52 PM

As usual the company is going to financially shield those responsible, while they in turn shield the company from societal blame.
