Ultimately, the point of hardware attestation isn't to ensure that your device is trusted, but that the action you're trying to perform was done by a human, not a bot doing millions of them per second. It's just another CAPTCHA mechanism in disguise, required because bots have gotten so good at solving the existing ones.
With a secure device, the only way to get an attestation for an account signup is to do the signup on that device, with real fingers clicking real buttons on a real screen. There's no way to short-circuit the process by automatically sending a JSON request and bypassing the actual signup flow from a Python script, like you can do with an insecure endpoint.
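The short-circuit described above is trivial against an unattested endpoint. A minimal sketch of what the bot side looks like; the URL and payload fields here are entirely hypothetical, for illustration only:

```python
# What attestation is meant to prevent: driving the signup endpoint
# directly instead of the real UI. Endpoint and payload are made up.
import json
import urllib.request

def build_signup_request(username: str) -> urllib.request.Request:
    """Build a raw JSON signup request -- no human, no device, no CAPTCHA."""
    return urllib.request.Request(
        "https://example.com/api/signup",   # hypothetical endpoint
        data=json.dumps({"user": username, "pw": "hunter2"}).encode(),
        headers={"Content-Type": "application/json"},
    )

# A bot can build and send millions of these per second. Hardware
# attestation breaks the loop by requiring each request to carry a
# signature that only a real, user-operated device can produce.
req = build_signup_request("bot-000001")
```

(`urllib.request.urlopen(req)` would fire it off; the point is that nothing in the request proves a human was involved.)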
With blind signatures, a single compromised device destroys the value of the entire scheme: it can be used to issue an unlimited number of attestations with zero human oversight.
What we need is a blind signature construction where the verifier can revoke a signature, each signature carries proof that none of the revoked signatures comes from the same signer, and where it is impossible for one signer to issue more than n distinct signatures during one time window. Not sure if this would be possible with ZKPs; my cryptography knowledge doesn't extend that far.
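For reference, the basic blind-signature flow being discussed is Chaum's textbook RSA construction. A toy sketch (tiny key, no padding, not production crypto; the names are illustrative) that also shows why one compromised signer is fatal, since the unblinded signature is unlinkable to the signing session:

```python
# Textbook Chaum RSA blind signature (toy parameters, illustration only).
import hashlib
from math import gcd

# Toy RSA key pair held by the attestation issuer.
p, q = 999983, 1000003             # small known primes; real keys are ~2048-bit
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def h(msg: bytes) -> int:
    """Hash the message into Z_n (toy full-domain hash)."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# --- Client: blind the message so the signer never sees it ---
msg = b"signup-token-12345"
r = 123457                         # blinding factor, must be invertible mod n
assert gcd(r, n) == 1
blinded = (h(msg) * pow(r, e, n)) % n

# --- Signer: signs the blinded value, learning nothing about msg ---
blind_sig = pow(blinded, d, n)     # = h(msg)^d * r  (mod n)

# --- Client: unblind, yielding an ordinary RSA signature on h(msg) ---
sig = (blind_sig * pow(r, -1, n)) % n

# --- Verifier: standard RSA check. Nothing ties sig back to the signing
# session, which is exactly why a compromised signer can mint unlimited,
# untraceable attestations -- the revocation/rate-limit gap noted above.
assert pow(sig, e, n) == h(msg)
```

Adding the properties wished for above (verifier-side revocation, proofs of non-membership in a revocation set, and a per-signer rate limit per epoch) is precisely what this plain construction lacks.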
> Ultimately, the point of hardware attestation isn't to ensure that your device is trusted, but that the action you're trying to perform was done by a human, not a bot doing millions of them per second. It's just another CAPTCHA mechanism in disguise, required because bots have gotten so good at solving the existing ones.
...no? Maybe this is true of end-user device attestation. But there are other use cases for attestation.
Server device attestation is an entirely different thing. It's used in e.g. IaaS "Confidential VM" offerings, where the audience for the attestation information is the customer, rather than the server host. It's a very pro-privacy / pro-data-sovereignty feature.
And while embedded device attestation is sometimes about preventing customers from tampering with IoT stuff you "sold" them, more often it's about being able to trust, and confidently assert, that e.g. the climate sensors you've deployed all over a forest as part of a research project haven't been fucked with by someone with an agenda to report false data. (Or to "apply denial" to your unmanned military satellite downlink station the moment you detect that some unknown person out there is futzing with it.)