Hacker News

A cryptography engineer's perspective on quantum computing timelines

527 points | by thadt | yesterday at 3:31 PM | 224 comments

Comments

adrian_b | yesterday at 5:47 PM

It should be noted that if there really is not much time left before a usable quantum computer becomes available, the priority is deploying FIPS 203 (ML-KEM) to establish the secret session keys used in protocols like TLS and SSH.

ML-KEM is intended to replace both the traditional and the elliptic-curve variants of the Diffie-Hellman algorithm for creating a shared secret value.

Where FIPS 203, i.e. ML-KEM, is not used, adversaries can record data transferred over the Internet today and may become able to decrypt it some years from now.

On the other hand, there is much less urgency to replace the certificates and digital signature methods used today. In most cases it would not matter if someone became able to forge them in the future, because an attacker cannot go back in time to use a forged signature for authentication.

The only exception is digital documents that completely replace traditional paper documents of legal significance, such as digitally signed proofs of ownership. Forging those in the future could be useful to somebody, so a future-proof signing method makes sense for them.

OpenSSH, OpenSSL and many other cryptographic libraries and applications already support FIPS 203 (ML-KEM), so it could easily be deployed, at least for private servers and clients, without also replacing the existing methods used for authentication, e.g. certificates, where post-quantum signing methods would add a lot of overhead due to much bigger certificates.
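As a rough illustration of the hybrid pattern that deployments like X-Wing use for key establishment: concatenate the classical and post-quantum shared secrets and run them through a KDF, so the session key stays safe as long as either component holds. This is a sketch only; the function names, fixed label, and stand-in secrets below are made up, and real protocols define exact encodings.

```python
import hashlib
import hmac

def hkdf_sha256(secret: bytes, info: bytes, length: int = 32) -> bytes:
    # HKDF (RFC 5869) with SHA-256, empty salt, single-block expand.
    prk = hmac.new(b"\x00" * 32, secret, hashlib.sha256).digest()
    okm = hmac.new(prk, info + b"\x01", hashlib.sha256).digest()
    return okm[:length]

def hybrid_shared_secret(ss_ecdh: bytes, ss_mlkem: bytes) -> bytes:
    # Concatenating both shared secrets before the KDF means the output
    # is unpredictable as long as EITHER input remains secret.
    return hkdf_sha256(ss_ecdh + ss_mlkem, b"hybrid-kem-demo")

# Stand-ins: a real handshake would take these from an X25519 exchange
# and an ML-KEM-768 encapsulation respectively.
ss_x25519 = b"\x11" * 32
ss_mlkem768 = b"\x22" * 32
key = hybrid_shared_secret(ss_x25519, ss_mlkem768)
```

The design point is that the hybrid costs one concatenation and one KDF call on top of running both exchanges, which is why skipping it saves so little.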

show 3 replies
phicoh | yesterday at 6:40 PM

What surprises me is how non-linear this argument is. For a classical attack on, for example, RSA, it is very easy to factor an 8-bit composite. It is a bit harder to factor a 64-bit composite. For a 256-bit composite you need some tricky math, etc. And people did all of that. People didn't start out speculating that you could factor a 1024-bit composite and then one day, out of the blue, somebody did it.

The weird thing we have right now is that quantum computers are absolutely hopeless at doing anything with RSA and, as far as I know, nobody has even tried EC. And that state of the art has not moved much in the last decade.

And then suddenly, in a few years there will be a quantum computer that can break all of the classical public key crypto that we have.

This kind of stuff might happen in a completely new field. But people have been working on quantum computers for quite a while now.

If this is easy enough that in a few years we can have a quantum computer that breaks everything, then people should be able to build something in a lab that breaks RSA-256. I'd like to see that before jumping to conclusions about how well this works.
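The incremental classical progression described above is easy to demonstrate on a laptop: tiny composites fall to trial division, and 64-bit ones to slightly cleverer algorithms such as Pollard's rho (a standard technique, sketched here; not something from the thread):

```python
import math
import random

def pollard_rho(n: int) -> int:
    # Find a non-trivial factor of composite n using Pollard's rho
    # with Floyd cycle detection; expected cost ~ n^(1/4) operations.
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n)
        y, c, d = x, random.randrange(1, n), 1
        while d == 1:
            x = (x * x + c) % n            # tortoise: one step
            y = (y * y + c) % n
            y = (y * y + c) % n            # hare: two steps
            d = math.gcd(abs(x - y), n)
        if d != n:                         # retry on rare failure
            return d

# 2^63 - 1 is a known 63-bit composite; rho cracks it in milliseconds,
# whereas 256-bit composites already need the "tricky math" (QS/GNFS).
n = 9_223_372_036_854_775_807
f = pollard_rho(n)
assert 1 < f < n and n % f == 0
```

Each order-of-magnitude jump in input size needed a genuinely new algorithm, which is exactly the continuity the comment says is missing on the quantum side.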

show 5 replies
tux3 | yesterday at 5:02 PM

This is a good take, there's really not much to argue about.

>[...] the availability of HPKE hybrid recipients, which blocked on the CFRG, which took almost two years to select a stable label string for X-Wing (January 2024) with ML-KEM (August 2024), despite making precisely no changes to the designs. The IETF should have an internal post-mortem on this, but I doubt we’ll see one

My kingdom for a standards body that discusses and resolves process issues.

show 2 replies
kro | yesterday at 7:11 PM

The argument to skip hybrid keys sounds dangerous to me. These algorithms are not widely deployed and thus not real-world tested at all. If there is a simple flaw, suddenly any cheap crawler pwns you, while you were trying to protect against state actors.

ggm | yesterday at 11:07 PM

This is the first well reasoned write up which makes me walk back from my "QC is irrelevant, and RSA is fine" position a bit. Well done! Thank you for putting this into terms a skeptic can relate to and understand. It helped me re-frame my thinking on risks here.

show 1 reply
janalsncm | yesterday at 5:52 PM

Building out a supercomputer capable of breaking cryptography is exactly the kind of thing I expect governments to be working on now. It is referenced in the article, but the analogy to the Manhattan Project is clear.

Prior to 1940 it was known that clumping enough fissile material together could produce an explosion. There were engineering questions around how to purify uranium and how to actually construct the weapon etc. But the phenomenon was known.

I say this because there’s a meme that governments are cooking up exotic technologies behind closed doors, which I personally tend to doubt.

This is an almost perfect analogy to the MP, though. We know exactly what would happen if we clumped enough qubits together. There are hard engineering challenges in actually doing so, and governments are pretty good at clumping dollars together when they want to.

show 4 replies
wuiheerfoj | today at 1:02 AM

I buy the argument "we should prepare for Q-Day, as crypto agility is hard", but the newest paper doesn’t change the timeline meaningfully.

Given that TFA accepts that error correction is the bottleneck for progress, that the gap between any error correction and lots of it is small, and that we presently have close to zero error correction, nothing has practically changed with the reduced qubit requirements.

Of course, it’s totally fine to have and announce a change of view on the topic, though I don’t see how the Google paper materially requires it.

xoa | yesterday at 6:49 PM

Yeah, it sounds like it's time to take this very seriously. A sobering article to read, practical and to the point on risk posture. One brief paragraph, though, I think deserves extra emphasis and I don't yet see in the comments here:

>In symmetric encryption, we don’t need to do anything, thankfully

This is valuable because it offers a non-scalable but very important extra layer that a lot of us can implement in a few important places today, or could have for a while already. Many people and organizations here may have critical systems where they can trade meat-space manpower for security by using pre-shared keys and symmetric encryption instead of the more convenient and scalable normal PKI. For me personally the big one is WireGuard: as of a few years ago I've been able to switch the vast majority of key site-to-site VPNs to using PSKs. This of course requires out-of-band distribution, i.e., huffing it over to every single site and sharing every single profile via direct link in person, versus conveniently deployable profiles. But for certain administrative setups, where the magic circle in our case isn't very large, this has been doable, and it gives some leeway there, as any traffic being collected now or in the future will be worthless without actual direct hardware compromise.
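For WireGuard specifically, the pre-shared key is nothing exotic: 32 bytes of OS randomness, base64-encoded, the same format `wg genpsk` emits. A minimal sketch (the config fragment in the comments is illustrative, not a complete peer definition):

```python
import base64
import os

def gen_wireguard_psk() -> str:
    # Same format as `wg genpsk`: 32 random bytes, base64-encoded.
    # The PSK is mixed into WireGuard's handshake, so captured traffic
    # stays opaque even if the Curve25519 exchange is someday broken.
    return base64.b64encode(os.urandom(32)).decode("ascii")

psk = gen_wireguard_psk()
# Distributed out of band, it drops into each peer's config, e.g.:
# [Peer]
# PublicKey = <peer public key>
# PresharedKey = <the generated psk>
```

The catch the comment describes is exactly the distribution step: the string has to reach both endpoints without ever touching the network it protects.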

That doesn't diminish the importance of PQE and industry action in the slightest, and it can't scale to everything, but you may have software you're using that is capable of adding a symmetric layer today without any other updates. Might be worth considering as part of the low-hanging immediate fruit for critical stuff. And depending on organization and threat posture, it might be worth imagining a worst-case world where symmetric crypto and OTPs are all we have that is reliable over long time periods, and how we'd deal with that. In principle, sneakernetting around gigabytes or even terabytes of entropy securely, with a hardware and software stack that automatically takes care of the rough edges, should be doable, but I don't know of any projects that have even started on that idea.

PQE is obviously the best outcome, we ""just"" switch, albeit with a lot of increased compute and changed-protocol-assumptions pain, but we're necessarily going to be leaning on a lot of new math and systems whose tires haven't been kicked nearly as long as all the conventional ones have. I guess it's all feeling real now.

btdmaster | yesterday at 9:39 PM

> “Doesn’t the NSA lie to break our encryption?” No, the NSA has never intentionally jeopardized US national security with a non-NOBUS backdoor, and there is no way for ML-KEM and ML-DSA to hide a NOBUS backdoor.

The most concrete issue for me, as highlighted by djb, is that when the NSA insists against hybrids, vendors like telecommunications companies will hand-write poor implementations of ML-KEM, to save memory/CPU time etc. on their constrained hardware, and those implementations will have stacks of timing side channels for the NSA to break. Meanwhile, X25519 already has deployed standard implementations that don't have such issues, which the NSA presumably cannot break (without spending millions of dollars per key on a hypothetical quantum attack, a lot more expensive than side channels).
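The timing-side-channel concern comes down to details as small as how byte strings get compared. A minimal Python illustration of the difference (stand-in function names, not from any ML-KEM codebase):

```python
import hmac

def leaky_check(tag: bytes, expected: bytes) -> bool:
    # `==` on bytes may bail out at the first mismatching byte, so the
    # running time leaks how much of the attacker's guess was correct.
    return tag == expected

def constant_time_check(tag: bytes, expected: bytes) -> bool:
    # hmac.compare_digest touches every byte regardless of mismatches,
    # so timing reveals nothing about where the guess went wrong.
    return hmac.compare_digest(tag, expected)
```

Hand-written ML-KEM has many more such hazards (secret-dependent branches and table lookups in the NTT and sampling code), which is why reusing hardened reference implementations matters so much.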

show 2 replies
aborsy | yesterday at 7:06 PM

I don’t know why the author is so attached to AES-128. AES-256 adds little additional cost, and protects against store-now-decrypt-later attacks (and situations like: “my opinion suddenly changed in a few months”). The industry standard and general recommendation for quantum-resistant symmetric encryption is 256-bit keys, so just follow that. Yet every time, he comes up with all sorts of arguments that AES-128 is good enough.
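The usual arithmetic behind the 256-bit recommendation: Grover's search gives at best a quadratic speedup against a k-bit key, halving the effective security level. A back-of-envelope sketch (ignoring the large constant factors and serialization limits that make Grover far less practical than this suggests):

```python
# Grover's algorithm needs ~sqrt(2^k) oracle evaluations to brute-force
# a k-bit key, so the effective post-quantum level is roughly k/2 bits.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

assert grover_effective_bits(128) == 64   # marginal over decades
assert grover_effective_bits(256) == 128  # comfortable; hence AES-256
```

The author's counterargument (per TFA) is that those ignored constant factors make 2^64 sequential quantum steps unrealistic, which is what this dispute actually turns on.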

Age should be using 256-bit file keys, and default to PQ keys in asymmetric mode.

show 2 replies
kwar13 | today at 3:18 AM

One of the authors is from the Ethereum Foundation. Super interesting paper to read. This goes beyond just cryptocurrency. I wrote about it a while ago: https://kaveh.page/blog/bitcoin-quantum-threat

codethief | yesterday at 8:28 PM

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f*d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

Slightly off-topic, but: does anyone know what the Signal developers plan to do to replace SGX? I mean, it's not like outside observers haven't been looking very critically at SGX usage in Signal for years (which the Signal devs have ignored), but this does seem to put additional pressure on them.

show 2 replies
Tyyps | today at 4:13 AM

I think people have to be extremely careful with this kind of opinion, particularly given such a push for post-quantum crypto while the current state of the art in quantum factorisation is 15 and 21, and given that the current assumptions (for KEMs in particular) are clearly not as well studied as dlog.

It's maybe good to remember that SIDH was broken in polynomial time by a classical computer 3 years ago... I'm really concerned by the current rush to PQ solutions and what the real intentions behind it are. On a side note, there might even be a world where a powerful enough quantum computer to break 2048-bit RSA will never exist ('t Hooft, Palmer... recent quantum gravity theories).

show 2 replies
bhaak | today at 12:06 AM

> They weirdly[1] frame it around cryptocurrencies and mempools and salvaged goods or something [...]

> [1] The whole paper is a bit goofy: it has a zero-knowledge proof for a quantum circuit that will certainly be rederived and improved upon before the actual hardware to run it on will exist. They seem to believe this is about responsible disclosure, so I assume this is just physicists not being experts in our field in the same way we are not experts in theirs.

The zero-knowledge proof may come across as something of a gimmick, but two of the authors (Justin Drake and Dan Boneh) have strong ties to cryptocurrency communities, where this sort of thing is not unusual.

I also don’t think it’s particularly strange to focus on cryptocurrencies. This is one of the few domains where having access to a quantum computer ahead of others could translate directly into financial gain, so the incentive to target cryptocurrencies is quite big.

Changing the cryptographic infrastructure we rely on daily is difficult, but still easier than, for example, in Bitcoin, where users would need to migrate their coins to a quantum-resistant scheme (whenever such a scheme is implemented). Given the limited transaction throughput, migrating all vulnerable coins would take years, and even then there would remain all the coins whose keys have been lost.

Satoshi is likely dead, incapacitated, or has lost or destroyed his keys, and thus will not be able to move his coins to safety. Even if he still has access, moving an estimated one million BTC, which the market currently prices in as permanently lost, would itself be a disruptive price event, regardless of whether it was done with good or bad intentions.

If you know which way the price will go (obviously way down in this case), you can always profit from such a price move, even if Satoshi's coins were blacklisted and couldn't be sold directly.

show 2 replies
palata | yesterday at 6:10 PM

What is the consequence for e.g. Yubikeys (or, say, the Android Keystore)? Do I understand correctly that those count as "signature algorithms" and are a little less at risk than "full TEEs" because there is no "store now, decrypt later" for authentication?

E.g. can I use my Yubikey with FIDO2 for SSH together with a PQ encryption, such that I am safe from "store now, decrypt later", but can still use my Yubikey (or Android Keystore, for that matter)?

show 2 replies
upofadown | yesterday at 8:19 PM

So this is the exciting paper:

* https://arxiv.org/pdf/2603.28627

The new thing here seems to be the use of the neutral-atom technique. Supposedly we are up to 96 entangled qubits, for a second or two, based on neutral atoms.

Shouldn't that be enough capability to factor 15 using Shor's?

OhMeadhbh | yesterday at 5:29 PM

In rebuttal, Peter Gutmann seems to think that progress towards quantum computing devices that can break commonly used public-key cryptosystems is not moving especially quickly: https://eprint.iacr.org/2025/1237

show 1 reply
NeoBild | today at 10:10 AM

The BLAKE3 angle here is interesting — we switched from SHA-256 to BLAKE3 for hash-chaining in a local multi-agent security orchestrator precisely because of this kind of forward-looking pressure. Not the same threat model, but the instinct to not build new systems on classical primitives feels validated by this post.
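The hash-chaining pattern mentioned here is simple to sketch. BLAKE3 isn't in the Python standard library, so this uses `hashlib.blake2b` as a stand-in (names are illustrative, not from the commenter's system):

```python
import hashlib

def chain_append(prev_digest: bytes, entry: bytes) -> bytes:
    # Each link commits to the previous digest plus the new entry, so
    # tampering with any entry invalidates every later link in the chain.
    return hashlib.blake2b(prev_digest + entry, digest_size=32).digest()

genesis = b"\x00" * 32
d1 = chain_append(genesis, b"event-1")
d2 = chain_append(d1, b"event-2")
```

Note the threat models really do differ, as the comment says: hash chains lean only on preimage/collision resistance, which Grover dents far less than Shor breaks public-key schemes.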

Animats | yesterday at 6:35 PM

We'll know it's been cracked when all the lost Bitcoins start to move.

show 3 replies
EdNutting | today at 12:45 AM

The OP should take a look at Secqai: potentially server-class motherboard management processors and beyond, which will implement PQ security and hardware-enforced memory safety (from what I recall):

https://www.secqai.com/

show 1 reply
thesz | today at 7:42 AM

Given that quantum computing (QC) can speed up training of neural networks (LLMs), it would be wise for Google to invest in QC as much as possible.

Google with SoftBank invested about $230M into QC last year. Microsoft, IBM and Google have spent $15B on QC combined, over all the time they have researched it. $15B in 20 years, less than $1B per year, across three companies.

Google spent upwards of $150B last year in datacenters.

This may tell us something about how close we are to a working quantum computer.

kro | yesterday at 7:18 PM

I wonder, what is the impact of this on widely deployed smartcards like credit cards / eID passports?

Aren't they relying on asymmetric signing as well?

show 1 reply
elwray | today at 7:54 AM

For the uninitiated, could you share your perspective on how feasible quantum computing is? Isn't it built on quantum entanglement, which seems to break the universal speed limit? Is this feasible engineering or just a scientist's imagination?

amluto | yesterday at 6:07 PM

I was in this field a while back, and I always found it baffling that anyone ever believed in the earlier large estimates for the size of a quantum computer needed to run Shor's algorithm. For a working quantum computer, Shor's algorithm is about as difficult as modular exponentiation or elliptic curve scalar multiplication: if it can compute or verify signatures or encrypt or decrypt, then it can compute discrete logs. To break keys of a few hundred bits, you need a few hundred qubits plus not all that much overhead. And the error correction keeps improving all the time.

Also...

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f**d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

This part is embarrassing. We’ve had hash-based signatures that are plenty good for this for years, and they inspire more confidence for long-term security than the lattice schemes. Sure, the private keys are bigger. So what?

We will also need some clean way to upgrade WebAuthn keys, and WebAuthn key management currently massively sucks.
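Hash-based signatures really are conceptually simple, which is part of why they inspire that long-term confidence: security reduces to the preimage resistance of the hash. A minimal Lamport one-time signature sketch (illustrative only; a real hardware root of trust would use a standardized stateless scheme like SLH-DSA/SPHINCS+, and a Lamport key must never sign twice):

```python
import hashlib
import os

H = lambda b: hashlib.sha256(b).digest()

def lamport_keygen():
    # Secret key: 256 pairs of random 32-byte values.
    # Public key: the hashes of those values.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def lamport_sign(msg: bytes, sk):
    # Reveal one preimage per bit of H(msg); this consumes the key.
    return [pair[bit] for pair, bit in zip(sk, msg_bits(msg))]

def lamport_verify(msg: bytes, sig, pk) -> bool:
    # Each revealed value must hash to the published half for that bit.
    return all(H(s) == pair[bit] for s, pair, bit in zip(sig, pk, msg_bits(msg)))

sk, pk = lamport_keygen()
sig = lamport_sign(b"firmware-v2.img", sk)
assert lamport_verify(b"firmware-v2.img", sig, pk)
```

The "bigger keys" complaint is visible right in the sketch: 16 KiB of secret material to sign a single message, which is the trade the comment is shrugging off for attestation roots.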

show 1 reply
krunck | yesterday at 6:30 PM

This would also be a good time for certain governments to knowingly push broken PQ KE standards while there is a panicked rush to get PQ tech in place.

show 2 replies
pdhborges | yesterday at 4:45 PM

What do you recommend as reading material for someone who was in college a while ago (before AE modes got popular) to get up to speed with the new PQ developments?

show 1 reply
griffzhowl | yesterday at 8:00 PM

Noob question: can't we just use longer classical keys, at least as a stopgap?
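For RSA the short answer is no: the best classical attack (GNFS) is subexponential in the key size, but Shor's runs in polynomial time, so doubling the modulus buys decades against classical attackers and almost nothing against a quantum one. A back-of-envelope sketch using the standard asymptotic cost formulas (rounded; my own illustration, not from TFA):

```python
import math

def gnfs_cost_bits(n_bits: int) -> float:
    # log2 of the classical GNFS cost for an n_bits-bit modulus:
    # exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))
    ln_n = n_bits * math.log(2)
    return ((64 / 9) ** (1 / 3)) * (ln_n ** (1 / 3)) \
        * (math.log(ln_n) ** (2 / 3)) / math.log(2)

def shor_cost_bits(n_bits: int) -> float:
    # log2 of the quantum gate count, roughly O(n^3) for textbook Shor.
    return 3 * math.log2(n_bits)

# Doubling the modulus adds tens of bits of classical security
# but only ~3 bits of quantum cost:
for n in (2048, 4096, 8192):
    print(n, round(gnfs_cost_bits(n)), round(shor_cost_bits(n)))
```

(Symmetric keys are the exception: there, moving from 128- to 256-bit keys genuinely does restore the security margin against Grover, which is the stopgap that actually works.)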

show 2 replies
sans_souse | today at 4:44 AM

I know this may be outside of scope, but I am very curious about any thoughts you may have on the potential for a ternary system at the hardware level.

scorpionfeet | yesterday at 6:51 PM

This is exactly how customers who do threat modeling see PQC. HN can armchair-QB this all they want; the real money is moving fast to migrate.

The analogy to a small atomic bomb is on point.

vasco | today at 5:25 AM

> I simply don’t see how a non-expert can look at what the experts are saying, and decide “I know better, there is in fact < 1% chance.” Remember that you are betting with your users’ lives.

Problem is, the experts don't tell the truth; they say whatever the game-theory version of the world they came up with suggests will make people do what they think people should do. If experts just told the literal truth it would be different, and then when they walked it back it would be understandable.

But when it later becomes clear the experts told outright lies because they thought it would induce the right behavior, that goes out the window.

nodesocket | yesterday at 9:25 PM

The first and most obvious target will be Bitcoin. Its market cap today is $1.4T. That’s a gigantic reward for any state actor or entity with the resources and budget to break it.

Does this mean Bitcoin is going to $0? Absolutely not; it’s just going to take the community organizing and putting in the gigantic effort to make the changes. Frankly, I’m not personally clear whether that means all existing cold wallets need to be flashed/replaced, all existing Bitcoin miner software needs to be updated, or all existing Bitcoin node software needs to be updated.

show 2 replies
enesz | today at 10:07 AM

[flagged]

atlasagentsuite | yesterday at 11:54 PM

[flagged]

burnerRhodov2 | today at 4:38 AM

TL;DR:

The real problem is building a system that can survive noise, errors, and decoherence. Once you solve that, scaling it up is non-trivial but has a very exponential path.

commandersaki | yesterday at 7:18 PM

RemindMe! 3 years "impending doom"

bjourne | yesterday at 6:27 PM

> Traveling back from an excellent AtmosphereConf 2026, I saw my first aurora, from the north-facing window of a Boeing 747.

Given the author's "safety first" stance on PQC, it seems a bit incongruous to keep flying to conferences...

vonneumannstan | yesterday at 4:59 PM

This seems like something uniquely suited to the startup ecosystem, i.e. offering PQ Encryption Migration as a Service. PQ algorithms exist, and now there's a large lift required to get them into the tech, with substantial possible value.

show 1 reply
OsrsNeedsf2P | yesterday at 5:30 PM

Why do we "need to ship"? 1,000-qubit quantum computers are still decades away at this point

show 1 reply
Sparkyte | yesterday at 5:38 PM

There is always a price to encryption. The cost goes up the more you have to cater to different and older encryption schemes while supporting the latest.

munrocket | yesterday at 5:39 PM

Yes, this is why I invested in QRL crypto. With the latest updates and no T1 exchange listing, it looks like a good opportunity to grow.