Hacker News

coppsilgold · last Sunday at 7:39 PM

Requiring authorized silicon (and software) isn't even the biggest problem here.

They do not use zero-knowledge proof systems or blind signatures. So every time your device attests, you leave behind an artifact (the attestation packet) that can be used to link the action to your device. They put on a show about how much they care about your privacy by introducing indirection into the process (a static device 'ID' is used to acquire an ephemeral 'ID' from an intermediate server), but it's just a show, because you don't know what those intermediary servers are doing: you should assume they log everything.

And this is just the remote attestation vector; the DRM 'ID' vector is even worse (no meaningful indirection: every license server has access to your burned-in-silicon static identity). And the Google account vector is what it is.

Using blind signatures for remote attestation has actually been proposed, but no one notable is currently using it: <https://en.wikipedia.org/wiki/Direct_Anonymous_Attestation>

There are several possible reasons for this. The obvious one is that they want to be able to violate your privacy at will, or are mandated to have that capability. The other is that because it's not possible to link an attestation to a particular device, the only feasible mitigation against abuse is rate limiting, which may not be good enough for them: an adversary could set up a farm where every device generates $/hour by providing remote attestations to 'malicious' actors.
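For reference, the unlinkability trick underlying schemes like Direct Anonymous Attestation can be illustrated with Chaum-style RSA blind signatures: the signer vouches for a value it never sees, so the resulting signature can't be tied back to the signing session. A toy sketch with throwaway textbook numbers (this is not the actual DAA protocol, just the blinding idea):

```python
# Toy Chaum-style RSA blind signature (illustrative parameters only).
# The signer never sees the message it signs, yet the unblinded
# signature verifies like an ordinary RSA signature.
N, E, D = 3233, 17, 2753  # tiny textbook RSA key (p=61, q=53)

msg = 42        # value the client wants signed (e.g. a token)
r = 7           # blinding factor, must be invertible mod N

# Client blinds: m' = m * r^e mod N
blinded = (msg * pow(r, E, N)) % N

# Signer signs the blinded value without learning `msg`
blind_sig = pow(blinded, D, N)

# Client unblinds: s = s' * r^-1 mod N, which equals m^d mod N
sig = (blind_sig * pow(r, -1, N)) % N

# Anyone can verify against the signer's public key (N, E)
assert pow(sig, E, N) == msg
```

Real deployments (e.g. RFC 9474 blind RSA) add padding and full-size keys, but the linkage-breaking step is exactly this unblinding.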


Replies

AnthonyMouse · last Sunday at 9:22 PM

> The other is that because it's not possible to link an attestation to a particular device the only mitigation to abuse that is feasible is rate limiting

I still don't see how you can keep something anonymous and still rate limit it. If a service can tell that two requests came from the same party (in order to count them), then two services can tell that two requests came from the same party (by both pretending to be the same service) and can therefore correlate them.
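One way designs like Privacy Pass try to square this circle is to enforce the quota at issuance time, where the issuer already knows the device identity, while redemption uses only unlinkable blind-signed tokens. A toy sketch reusing textbook RSA blinding; the function names, quota, and key sizes are invented for illustration:

```python
# Toy sketch: quota enforced by the issuer, unlinkability at redemption.
# Chaum-style RSA blinding with throwaway textbook parameters.
import secrets
from math import gcd

N, E, D = 3233, 17, 2753  # tiny textbook RSA key (never use sizes like this)
LIMIT = 3                 # hypothetical per-device, per-epoch token quota

issued = {}  # device_id -> tokens issued this epoch

def _random_unit():
    """Random value invertible mod N."""
    while True:
        x = secrets.randbelow(N - 2) + 2
        if gcd(x, N) == 1:
            return x

def issue(device_id, blinded_msg):
    """Issuer sees the device identity and enforces the quota,
    but signs a blinded message it cannot later recognize."""
    if issued.get(device_id, 0) >= LIMIT:
        raise PermissionError("quota exceeded")
    issued[device_id] = issued.get(device_id, 0) + 1
    return pow(blinded_msg, D, N)

def get_token(device_id):
    """Client blinds a random token, has it signed, then unblinds."""
    token, r = _random_unit(), _random_unit()
    blinded = (token * pow(r, E, N)) % N
    sig = (issue(device_id, blinded) * pow(r, -1, N)) % N
    return token, sig

def redeem(token, sig):
    """Verifier checks the signature, but sees nothing that links
    the token to a device or to a particular issuance request."""
    return pow(sig, E, N) == token

t, s = get_token("device-1")
assert redeem(t, s)
```

The correlation attack above still works against the issuer (it sees identities by design), but two colluding verifiers only ever see independent random tokens, so they have nothing to join on. The residual abuse channel is exactly the farm scenario: LIMIT valid tokens per compromised device per epoch.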

xinayder · last Sunday at 9:08 PM

Can we stop normalizing being surveilled online and on our devices?

Saying something like "the problem is not hardware attestation, but that they don't use ZKP" normalizes the new behavior. You shouldn't do that. It doesn't matter if they use ZKP or the latest, most secure technology for hardware attestation: the issue is hardware attestation itself. It's the same with age ID. The issue is not that age ID is prone to data leaks; the problem is age ID itself.

zx8080 · yesterday at 2:00 AM

> Requiring authorized silicon (and software) isn't even the biggest problem here.

It is indeed the biggest issue. It prevents me from owning and using the hardware I pay for, own, or make myself. It's switching personal computers as we know them from being open to proprietary, owned by 2 large US corporations.

I don't agree that it's not a problem.

to11mtm · yesterday at 4:14 PM

> Requiring authorized silicon (and software) isn't even the biggest problem here.

I agree, except I worry it's a bigger concern than we realize.

I still remember what CableCard (and the hoops needed for HW manufacturers to get certified) did to the DIY DVR Market...

Hoodedcrow · last Sunday at 7:44 PM

I'd like to read a writeup on this; from the app's announcement, I was certain it was going to be something like this.

Also, I recall a discussion on Graphene's forums that the DRM ID is not only retained there but stays the same across profiles.

miki123211 · yesterday at 4:39 PM

Ultimately, the point of hardware attestation isn't to ensure that your device is trusted, but that the action you're trying to perform was done by a human, not a bot doing millions of them per second. It's just another CAPTCHA mechanism in disguise, required because bots have gotten so good at solving the existing ones.

With a secure device, the only way to get an attestation for an account signup is to do the signup on that device, with real fingers clicking real buttons on a real screen. There's no way to short-circuit the process by automatically sending a JSON request and bypassing the actual signup flow from a Python script, like you can do with an insecure endpoint.

With blind signatures, a single compromised device destroys the value of the entire scheme, as it can be used to issue an unlimited number of attestations with zero human oversight.

What we need is a blind signature construction where the verifier can revoke a signature, each signature carries proof that none of the revoked signatures comes from the same signer, and where it is impossible for one signer to issue more than n distinct signatures during one time window. Not sure if this would be possible with ZKPs; my cryptography knowledge doesn't extend that far.

vbezhenar · yesterday at 9:44 AM

Can you revoke the certificate for a specific device under these privacy schemes?

Imagine someone managed to extract the key from a specific device and distributed it in a software implementation that fakes attestation. Now Google needs to revoke that particular key to disallow its use. This is an obvious requirement.

willis936 · last Sunday at 7:58 PM

Are these the kinds of issues Privacy Pass intends to fix? If so, what carrot and/or stick will get it adopted?