Hacker News

TikTok will not introduce end-to-end encryption, saying it makes users less safe

378 points by 1659447091 today at 1:31 AM | 366 comments

Comments

Traster today at 9:27 AM

I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app" - as long as there are relatively good options among apps that do have privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much notion of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it, you can move to Snapchat or Signal or whatever platform of your choice.

Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them.

In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace 'privacy' I do think it's on you to also then put additional resources into tackling the downsides of that.

xeckr today at 4:25 AM

Brilliant. They're repackaging the argument governments have long made about E2EE being dangerous to children.

ThoAppelsin today at 6:16 AM

DMs are akin to private conversations in real life. Thus, every DM feature should entail E2EE.

It’s ok for a platform to not feature private conversations. They should just have no DM feature at all, then; make all messages publicly visible.

Private conversations are indeed not for all ages. Parents should be able to grant access to that on an individual basis.

ranyume today at 4:20 AM

This might be off-topic, but it's on-topic for child safety: I'm surprised people are being myopic about age verification. Age verification should be banned, but people ignore that nowadays most widely used online services already ask for your age and act accordingly: Twitter, YouTube, Google in general, any online marketplace. They already have so much data on their users and optimize their algorithms for those groups in an opaque way.

So yeah, age verification should be taken down, as well as the data mining these companies do and the opaque tuning of their algorithms. It baffles me: people are concerned about their children's DMs but are not concerned about what companies serve them and what they do with their data.

MetaWhirledPeas today at 5:45 PM

"makes users less safe"

They don't believe that. It makes it more difficult to deal with governments, is all. Big Brother needs your messages from time to time, and TikTok is not willing to risk getting shut down to argue against that. We can't have pesky principles getting in the way of money.

computerex today at 5:03 AM

TikTok is a front for government surveillance, so it's not really surprising that this is their position.

beaker52 today at 7:50 PM

Someone in the UK government is furiously writing this down.

gorgoiler today at 9:16 AM

I don’t really understand how we are supposed to believe in e2ee in closed proprietary apps. Even if some trusted auditor confirms they have plumbed in libsignal correctly, we have no way of knowing that their rendering code is free of content scanning hooks.

We know the technology exists. Apple had it all polished and ready to go for image scanning. I suppose the only thing in which we can place our faith is that it would be such an enormous scandal to be caught in the act that WhatsApp et al daren’t even try it.

(There is something to be said for e2ee: it protects you against an attack on Meta’s servers. Anyone who gets a shell will have nothing more than random data. Anyone who finds a hard drive in the data centre dumpster will have nothing more than a paperweight.)

swiftcoder today at 10:07 AM

> the controversial privacy feature used by nearly all its rivals

"controversial" according to who? The NSA / GCHQ?

ronsor today at 3:45 AM

Why would you use TikTok for private communications anyway? It's mostly a public short video sharing platform.

cdrnsf today at 6:13 PM

TikTok and other social media apps' business models are antithetical to privacy.

zzo38computer today at 7:06 PM

In my opinion, the end-to-end encryption should be done by separate software from the one used for the communication, although there are other things needed for security beyond programming the computer correctly (such as securely agreeing on the keys and ciphers in person).
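As a toy illustration of that separation (my own sketch, not any real tool: the function names and the one-time-pad choice are assumptions for demonstration), the encryption can happen in a standalone script, with only ciphertext ever pasted into the chat app and the key exchanged out of band. A real deployment would use an audited tool such as GPG or age rather than anything hand-rolled.

```python
import base64
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: key is truly random, as long as the message, and never reused."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same pad recovers the plaintext
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# Encrypt outside the chat app; paste only the ciphertext into the DM.
# The key travels over a separate channel (e.g. agreed in person, as above).
key, ct = otp_encrypt(b"meet at the usual place")
wire = base64.b64encode(ct).decode()  # what the chat platform would actually see
assert otp_decrypt(key, base64.b64decode(wire)) == b"meet at the usual place"
```

The point of the sketch is that the platform only ever relays opaque bytes; whether it scans, logs, or hands over the DM, there is nothing useful to read without the out-of-band key.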

hexage1814 today at 8:05 AM

It doesn't matter. Web-based cryptography is always snake oil

https://web.archive.org/web/https://www.devever.net/~hl/webc...

maxdo today at 7:45 PM

People seriously discuss privacy in a Chinese app. With all respect, their government will not allow you even a hint of privacy.

dlev_pika today at 6:50 PM

lol

It makes sense - they extract every possible bit of personal information from your device - why would they make you believe they care about your privacy?

You want to communicate privately? TikTok is not the place, and that’s ok. shrugs

matricaria today at 6:03 AM

Since when is E2EE controversial? Not using E2EE should be controversial.

gradientsrneat today at 4:58 PM

A middle ground would be to implement E2EE but have messages signed (and ideally organized in a Merkle tree), so that if a DM is reported there's cryptographic proof that the accounts sent the messages.
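The Merkle-tree half of that idea is mechanically simple. Here's a minimal sketch (the hash choice, padding rule, and function names are my own assumptions, not any platform's scheme) of building a root over message hashes and verifying an inclusion proof; the signature layer would sit on top of this.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    # Pair up hashes level by level; duplicate the last node on odd levels.
    level = leaves[:]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves: list[bytes], index: int) -> list[tuple[bytes, bool]]:
    # Collect the sibling hash at each level, noting whether it sits to the left.
    proof, level, i = [], leaves[:], index
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = i ^ 1
        proof.append((level[sibling], sibling < i))
        level = [h(level[j] + level[j + 1]) for j in range(0, len(level), 2)]
        i //= 2
    return proof

def verify(leaf: bytes, proof: list[tuple[bytes, bool]], root: bytes) -> bool:
    node = leaf
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

# A reported DM can be checked against the published root without
# revealing any of the other messages in the conversation.
msgs = [b"hi", b"meet at 5", b"ok", b"bye"]
leaves = [h(m) for m in msgs]
root = merkle_root(leaves)
proof = inclusion_proof(leaves, 1)
assert verify(leaves[1], proof, root)
```

The appeal of the design is that the proof leaks only the reported message plus a handful of sibling hashes, so the rest of the thread stays private while the report is still verifiable.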

pothamk today at 5:02 AM

The core tension here isn’t really about encryption itself, it’s about moderation models.

Most large platforms rely heavily on server-side visibility for abuse detection, spam filtering, recommendation systems, and safety tooling. End-to-end encryption removes that visibility by design. Once a platform is built around centralized analysis of user content, adding strong E2EE later isn’t just a feature toggle — it conflicts with large parts of the existing architecture.

maest today at 3:25 AM

Do you feel safer knowing DMs are not encrypted?

lucasfin000 today at 5:50 PM

I don't think the argument is really about child safety. If it were, TikTok would also be working on fixing their algorithm that can send minors toward harmful content, which is a far larger documented vector than encrypted DMs. This is about preserving access.

krickelkrackel today at 8:23 AM

Just like door locks are making the world less free!

2OEH8eoCRo0 today at 8:08 PM

What unsafe things are users most likely to encounter?

matesz today at 5:42 AM

Fun fact - there is a big correlation between World Wars and compulsory education. Of course governments and big corporations "care" about children. Of course!

Schlagbohrer today at 10:10 AM

This BBC article is insanely written.

sheept today at 3:43 AM

I feel like this makes sense for a platform that targets teens. Plus, I wouldn't trust TikTok to implement E2E encryption properly—who knows what they've snuck into their client.

SuperSandro2000 today at 8:15 PM

hahaha, good one

blackqueeriroh today at 5:09 AM

There is no way to do E2EE on a traditional social media platform with user-generated content and comply with existing US law.

You can’t moderate an E2EE platform.

_el1s7 today at 9:34 AM

That's good, people who need E2EE shouldn't use TikTok either way, there are plenty of other secure apps for that.

TikTok is a social media app, and it gets heavily abused as it is.

zthrowaway today at 12:48 PM

Making users less safe from… letting us snoop on all your communications for “national security”.

0xbrayo today at 7:46 AM

Unrelated, but I'm always surprised by the number of people who don't know that Instagram DMs are not encrypted by default.

1970-01-01 today at 1:27 PM

I see it like this: Taking in the totality of the danger, they're right. If the source (social network) and the destination (child brain) cannot be treated as trustworthy, then you must control the content for overall safety. If you could trust either end, then you could dismiss the argument. But you cannot trust children to be cognizant of abuse, and you already know social media literally reinvented abusive behaviors for the 21st century. Do nothing and children will be harmed. Overreach by any amount and you have destroyed freedom. The only middle ground is weaker encrypted E2E comms. Something that creates a forcing function with very high cost (an electric bill or SaaS service) for the sniffer but can be broken with enough horsepower. Think about what millions of dollars per character would do. Good luck codifying that insane compromise into a law.

gnarlouse today at 8:33 AM

Maybe just don't use TikTok. Shocking that adults use a platform for children.

lwansbrough today at 8:59 AM

The Chinese spyware app won’t do E2EE? I can’t believe what I’m reading.

dev_l1x_be today at 8:34 AM

I take privacy suggestions from social media companies on a daily basis.

insane_dreamer today at 5:21 PM

I'll never let my kids have a TikTok account anyway (once they're adults they can have one of course if they want to).

9864247888754 today at 7:56 AM

And their target audience won't question it.

edarchis today at 9:21 AM

> But critics have said E2EE makes it harder to stop harmful content spreading online, because it means tech firms and law enforcement have no way of viewing any material sent in direct messages.

Like they give a damn. I report accounts that explicitly sell fake credit cards, citing the laws that make it illegal, and 95% of the time it's "we checked and there is no violation here; we know you're not happy, but we don't give a crap".

So the argument of security is utter bullshit and they just want to snoop.

bas today at 4:58 AM

Fascinating. What a time to be alive.

hd4 today at 8:12 AM

I hate the BBC so much - "controversial privacy tech" "E2EE ... the best way to protect conversations from .. even repressive authorities" "End-to-end encryption has been criticised by governments, police forces"

They're saying this at the same time as they're clutching pearls over Iran's repression of protestors. Typical of the ethical consistency I would expect from them.

tw04 today at 4:53 AM

Reminder: Larry "citizens shouldn't get any privacy" Ellison now owns TikTok. If you're still using it, or have friends and family using it, you should stop immediately. It WILL eventually be used against you if this regime gets its way.

https://digitaldemocracynow.org/2025/03/22/the-troubling-imp...

iso1631 today at 10:56 AM

The actual headline is currently

> TikTok won't protect DMs with controversial privacy tech, saying it would put users at risk

Not sure if this was changed since first posting. I don't mind updates, but unless it's redacting for legal purposes (which should then itself be clearly mentioned), the BBC should provide a public changelog like Wikipedia.

crest today at 10:00 AM

A Chinese company saying you don't need encryption. Why should anyone waste time debunking their bad faith "arguments"?

camillomiller today at 8:31 AM

Doublespeak. War is peace.

Tyrubias today at 4:25 AM

TikTok’s stance against end-to-end encryption is unsurprising but still concerning. TikTok is a source of information on many topics, such as the genocide in Gaza, which traditional media underreport and many governments try to suppress. The network effect of big social media platforms means many people will likely talk about these topics in TikTok DMs. No matter what legal controls TikTok claims to enforce, there is no substitute for technological barriers for preventing invasions of privacy and government overreach. This is yet another example where corporations and governments sacrifice people’s autonomy and privacy in the name of security.

show 3 replies
burnt-resistortoday at 5:00 AM

It's the Max app for Americans, now with 900% more US and IL government spying.

blueTiger33 today at 10:26 AM

So we need no encryption? ...At the end of the day, we have nothing to hide, right, CIA, FBI? :D
