Some browsers and some end-user devices get upgraded quickly, so make it easy for any site to enable PQ optionally. As that rollout extends, specialty sites can make it mandatory, browser/device UX can start showing soft warnings to users (or taking other action like downranking), and at some point something like STS Strict can be exposed and then largely become a default (and maybe the non-PQ algorithms just get removed entirely from many sites).
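To make the "optional now, mandatory later" step concrete on the server side: in a stack where you control the TLS group preferences it's roughly a one-line change. A minimal sketch, assuming Go 1.24+ (where crypto/tls ships the X25519MLKEM768 hybrid group); the cert/key paths are placeholders:

```go
package main

import (
	"crypto/tls"
	"log"
	"net/http"
)

func main() {
	// "Optional" phase: prefer the PQ hybrid, but keep classical X25519 for old clients.
	optional := &tls.Config{
		CurvePreferences: []tls.CurveID{tls.X25519MLKEM768, tls.X25519},
	}
	// "Mandatory" phase would simply drop the fallback:
	//   CurvePreferences: []tls.CurveID{tls.X25519MLKEM768}

	srv := &http.Server{Addr: ":8443", TLSConfig: optional}
	log.Fatal(srv.ListenAndServeTLS("cert.pem", "key.pem"))
}
```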
I definitely was on team "the risks of a rushed upgrade might outweigh the risks of actual quantum breaks" until pretty recently -- rushing to upgrade has lots of problems always and is a great way to introduce new bugs, but based on the latest information, the balance seems to have shifted to doing an upgrade quickly.
Updating websites is going to be so much easier than dealing with other systems (bitcoin probably the worst; data at rest storage systems; hardware).
Maybe it'll require TLS 1.4/QUIC 2, with no changes but the cipher specifications, but it can happen in two or three years. Certificates themselves don't last longer than a year anyway. Corporations running ancient software that doesn't support PQ TLS will have the same configuration options to ignore the security warnings already present for TLS 1.0/plain HTTP connections.
The biggest problem I can imagine is devices talking to the internet no longer receiving firmware updates. If the web host switches protocols, the old clients will start dying off en masse.
Leaf certificates don't last long, but root CAs do. An attacker can just mint new certs from a broken root key.
Hopefully many devices can be upgraded to PQ security with a firmware update. Worse than not receiving updates is receiving malicious firmware updates, which you can't really prevent without first upgrading to something safe.
In Chrome at the very least, the certificate not being in the certificate transparency logs should throw errors and report issues to the mothership, and that should detect abuse almost instantly.
You'd still be DoSing an entire certificate authority, because a factored CA private key makes the whole CA instantly untrustworthy, but it wouldn't allow attacks to last long.
The hard part is certificate authentication. And that's not included in the cipher suite setting.
Rephrased, they meant to say "there is no reason to remove support for quantum-vulnerable algorithms in the near future."
IMO that's much less likely to be accidentally misinterpreted.
IPv6 deserves a prominent spot there
I can use myself as an example here. IPv6 is supported by all my hardware, all the software I use, and my ISP provides it. Yet my LAN intentionally remains IPv4 only with NAT. Why? Because adding IPv6 to my LAN would require nonzero effort on my part and has (at least for now) quite literally zero upside for me. If I ever need something it offers I will switch to it but that hasn't happened yet.
PQC is entirely different in that the existence of a CRQC immediately breaks the security guarantee.
Which one do you think is PQ-secure?
This is the result of Cloudflare's test "Check if a host supports post-quantum TLS key exchange" offered on https://radar.cloudflare.com/post-quantum.
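If you'd rather check from your own machine, a rough equivalent (not Cloudflare's actual test) is to offer only the PQ hybrid group and see whether the handshake completes. A minimal sketch, assuming Go 1.24+ where crypto/tls defines tls.X25519MLKEM768:

```go
package main

import (
	"crypto/tls"
	"fmt"
)

// supportsPQKex offers only the X25519MLKEM768 hybrid group; if the handshake
// still completes, the server can do PQ key exchange. (A failure can of course
// also mean DNS or network problems, so treat this as a rough check.)
func supportsPQKex(host string) bool {
	conf := &tls.Config{
		MinVersion:       tls.VersionTLS13,
		CurvePreferences: []tls.CurveID{tls.X25519MLKEM768},
	}
	conn, err := tls.Dial("tcp", host+":443", conf)
	if err != nil {
		return false
	}
	conn.Close()
	return true
}

func main() {
	fmt.Println(supportsPQKex("news.ycombinator.com"))
}
```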
Hoping there is already a migration plan. Fortunately many modern tools make it easy to switch to PQ, maybe someone knows which stack HN is running and if it would be possible.
AKA “we want more funding.”
To avoid the scenario where for a prolonged period of time the intelligence community has secret access to QC, researchers against that type of thing are incentivized to shout fire when they see the glimmerings of a possibly productive path of research.
If the intelligence community is going to nab the first team that has a quantum computing breakthrough, does it actually help the public to speed up research?
It seems like an arms race the public is destined to lose because the winning team will be subsumed no matter what.
Luckily, in this particular arms race, all we the public need to do is swap encryption algorithms, and there's no risk of ending global civilization if we mess up. So we get the best of both worlds: Quantum computing for civilian purposes (simulations and whatnot), while none of the terrifying surveillance capabilities. We just need to update a couple of libraries.
By what margin? An active push can minimize the gap.
However I think you're confusing the existence of a CRQC with adoption of PQC algorithms. The latter can be done in the absence of the former.
Filippo Valsorda (maintainer of Golang's crypto packages, among other things) published a summary yesterday [0] targeted at relative laypeople, with the same "we need to target 2029" bottom line.
AES probably can't be broken, but that's irrelevant because in this scenario you have the key in plaintext from the key exchange.
The algorithm everyone tends to be thinking of when they bring this up has literally nothing to do with any cryptography used anywhere ever; it was wildly novel, and it was interesting only because it (1) had really nice ergonomics and (2) failed spectacularly.
It's hubris to say there are no questions, especially for key exchange. The general classes of mathematical problems for PQC seem robust, but that's generally not how crypto systems fail. They fail in the details, both algorithmically and in implementation gotchas.
From a security engineering perspective, there's no persuasive reason to avoid general adoption of, e.g., the NIST selections and related approaches. But when people suggest not to use hybrid schemes because the PQC selections are clearly robust on their own, well then reasonable people can disagree. Because, again, the devil is in the details.
The need to proclaim "no questions" feels more like a reaction to lay skepticism and potential FUD, for fear it will slow the adoption of PQC. But that's a social issue, and indulging that urge may cause security engineers to let their guard down.
SIKE: not lattices. Literally moon math. Do you understand how SIKE/SIDH works? It's fucking wild.
I'm going to keep saying this: you know the discussion is fully off the rails when people bring SIKE/SIDH into it as evidence against MLKEM.
DJB himself seems to prefer hybrid over non-hybrid precisely over concern about the unknowns: https://blog.cr.yp.to/20260219-obaa.html
These doubts may not be the kind curious onlookers have in mind, but to say there are no doubts among researchers and practitioners is a misrepresentation. In fact, you're flatly contradicting what DJB has said on the matter:
> SIKE is not an isolated example: https://cr.yp.to/papers.html#qrcsp shows that 48% of the 69 round-1 submissions to the NIST competition have been broken by now.
https://archive.cr.yp.to/2026-02-21/18:04:14/o2UJA4Um1j0ursy...
Unqualified assurances is what you hear from a salesman. You're trying to sell people on PQC. There's no reason to believe ML-KEM is a lemon, but you're effectively saying, "it's the last KEX scheme we'll ever need", and that's just not honest from an engineering point of view, even if it's what people need to hear.
And, if we're on the subject of how trustworthy Bernstein's concerns are, I'll note again: in his own writing about the potential frailty of MLKEM, he cites SIKE, because, again, he thinks you're too dumb to understand the difference between a module lattice and a generic lattice.
Finally, I'm going to keep saying this until I don't have to say it anymore: PQC is not a "kind" of cryptography. It doesn't mean anything that N% of the Round 1 submissions to the NIST PQC Contest were cryptanalyzed. Multivariate quadratic equation cryptography, supersingular isogeny cryptography, and F_2^128 code-based cryptography are not related to each other. The point of the contest was for that to happen.
The elephant in the room in these conversations is Daniel Bernstein and the shade he has been casting on MLKEM for the last few years. The things I think you should remember about that particular elephant are (1) that he's cited SIDH as a reason to be suspicious of MLKEM, which indicates that he thinks you're an idiot, and (2) that he himself participated in the NIST PQC KEM contest with a lattice construction.
1. cryptography developed across the world,
2. the actual schemes were overwhelmingly by European authors,
3. standardized by the US,
4. other countries' standardizations have been substantially similar (e.g. the ongoing Korean one, the German BSI's recommendations; China's CACR [had one with substantially similar schemes](https://www.sdxcentral.com/analysis/china-russia-to-adopt-sl...). Note that this is separate from a "standardization", which sounds like it is starting soon).
In particular, given that China + the US ended up with (essentially the same) underlying math, you'd have to have a very weird hypothetical scenario for the conclusion to not be "these seem secure", and instead "there is a global cabal pushing insecure schemes".
1. the main "eye catching" attack was the [attack on SIDH](https://eprint.iacr.org/2022/975.pdf). it was very much a "thought to be entirely secure" to "broken in 5 minutes with a Sage (python variant) implementation" within ~1 week. Degradation from "thought to be (sub-)exp time" to "poly time". very bad.
2. the other main "big break" was the [RAINBOW attack](https://eprint.iacr.org/2022/214.pdf). this was a big attack, but it did not break all parameter sets, e.g. it didn't suddenly reduce a problem from exp-time to poly-time. instead, it was a (large) speedup for existing attacks.
anyway, someone popular among some people in tech (the cryptographer Dan Bernstein) has been trying (successfully) to slow the PQC transition for ~10 years. His strategy throughout has been complaining that a very particular class of schemes ("structured LWE-based schemes") is suspect. He has had several complaints that have shifted throughout the years (Galois automorphism structure for a while, then whatever his "spherical models" stuff was lmao). There have been no appreciably better attacks (nothing like the above) on them since then. But he still complains, saying that instead people should use
1. NTRU, a separate structured lattice scheme (that he, coincidentally, submitted a variant of for standardization). Incidentally, it had [a very bad attack](https://eprint.iacr.org/2016/127) ~2016. Didn't kill PQC, but killed a broad class of other schemes (NTRU-based fully homomorphic encryption, at least using tensor-based multiplication).
2. McEliece, a scheme from the late 70s (that has horrendously large public keys --- people avoid it for a reason). He also submitted a version of this for standardization. It also had a [greatly improved attack recently](https://eprint.iacr.org/2024/1193).
Of course, none of those are relevant to improved attacks on the math behind ML-KEM (algebraically structured variants on ring LWE). there has been some progress on these, but not really. It's really just "shaving bits", e.g. going from 2^140 to 2^135 type things. The RAINBOW attack (of the first two, the "mild" one) reduced things by a factor of ~2^50, which is clearly unacceptable.
Unfortunately, adherents of Dan Bernstein will pop up and start confidently saying a bunch of stuff that is much too annoying to refute, as they have no clue what the actual conversation is. So the conversation becomes
1. people who know things, who tend to not bother saying anything (with rare exceptions), and 2. people who parrot Dan's (very wrong at this point honestly, but they've shifted over time, so it's more of 'wrong' and 'unwilling to admit it was wrong') opinions.
the dynamic is similar to how when discussions of vaccines on the internet occur, many medical professionals may not bother engaging, so you'll get a bunch of insane anti-vax conspiracies spread.
> anyway, someone popular among some people in tech (the cryptographer Dan Bernstein) has been trying (successfully) to slow the PQC transition for ~10 years
Sounds enough like throwing shade to make me doubt its value, in the absence of other signals.
My point was your history of posting knowledgeably about security and cryptography provides the credibility for me to go do more reading about the stuff in mswphd's post.
Seeing that many are already moving to QC-resistant cryptography and that more are shifting by the day... I've got a question: what are the implications of quantum computers going to be once the entirety of cryptography has moved to quantum-resistant cryptography?
In other words: I only ever read about quantum computing when it's about breaking cryptography. But what if all cryptography moves to quantum-resistant schemes, all of it... Then what are the uses of quantum computing? Protein folding? Logistics?
Basically, so far, the main effect of quantum computing research has been that many companies and projects are adding quantum-resistant cryptographic schemes.
If, say, we've got a $10 million quantum computer that can break one 256 bit elliptic curve key in an hour... Great, EC is broken. But what if browsers, SSH, auth, etc. just about everything moves to PQ schemes...
Then what are those quantum computers useful for?
I understand that breaking even a single EC 256 bit key in a few hours on a $$$ machine is a very big deal.
But what else are they going to be useful for? For breaking ECC doesn't help humanity. It doesn't bring anything. It only destroys.
EDIT: for example I read stuff like: "Estimates are about three years to break a single 256-bit EC key on a 10,000-qubit quantum computer". What's a 10,000-qubit quantum computer going to be used for when everybody has moved to quantum-resistant algos?
There are tons of hypothesized applications for quantum computing based on the expectation it will provide better simulation of quantum effects for e.g. chemistry, and offer major speedups of highly parallel simulation problems like nuclear plasma or some things in finance. Easy to Google to learn more about these.
But keeping the focus squarely on the military and intelligence services, one answer to your question is that everyone is not going to switch to post-quantum cryptography instantaneously. It’s going to take a while, especially for a long tail of “infrastructure” type things like networking gear, “internet of things,” industrial sensors, etc. Things that national intelligence services might like to break into to enable breaking into other things.
Quantum breaks may also still succeed against stored encrypted data from before the switch to PQ. And for at least a couple decades, national intelligence services have been scaling up their storage resources. So they might have a “backlog” they can work through.
Finally, things don’t have to last forever. Everything the military / government builds has an expected lifespan, and it only has to be valuable during that lifespan. And risks can be rare but huge in national security. So if quantum code-breaking computers only help the NSA learn a few very important things for a limited time, that still might be “worth it” to them. Or if a quantum computer doesn’t break any important cryptography, but helps advance the engineering and enables better quantum computers in the future for other applications, that again still might be worth it.
Practically though, there are some downsides. Elliptic curves tend to have smaller ciphertexts/keys/signatures, so they are better on bandwidth. If you do everything right with elliptic curves, we're also more confident in the hardness of the underlying problems (cf. "generic group lower bounds" and other extensions of that model).
The new algorithms tend to be easier to implement (important, as a big source of practical insecurity is implementation issues, historically much more than the underlying assumption breaking). This isn't uniform, e.g. I still think that the FN-DSA algorithm will have issues of this type, but ML-DSA and ML-KEM are fine. They're also easier to "specify", meaning it is much harder to accidentally choose a "weak" instance of them (in several senses: the "weak curve" attacks are not really possible, and there isn't really a way to hide a NOBUS backdoor like there was for DUAL_EC_DRBG). They also tend to be faster.
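On the "easy to use" side at least, the API surface of ML-KEM ends up tiny. A minimal round trip, assuming the crypto/mlkem package that landed in Go 1.24 (method names per my recollection of that API, so check the docs before relying on it):

```go
package main

import (
	"bytes"
	"crypto/mlkem"
	"fmt"
	"log"
)

func main() {
	// Receiver: generate a key pair; the encapsulation key is the public half.
	dk, err := mlkem.GenerateKey768()
	if err != nil {
		log.Fatal(err)
	}
	ek := dk.EncapsulationKey()

	// Sender: derive a shared secret plus a ciphertext to send to the receiver.
	sharedSender, ciphertext := ek.Encapsulate()

	// Receiver: recover the same shared secret from the ciphertext.
	sharedReceiver, err := dk.Decapsulate(ciphertext)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("shared secrets match:", bytes.Equal(sharedSender, sharedReceiver))
}
```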
Sorry, I'm just very out of the loop on some of this stuff and I'm trying to play a game of catchup.
This page lists some numbers for different PQ signature algorithms: https://blog.cloudflare.com/another-look-at-pq-signatures/#t... Right now NIST has selected three different ones (ML-DSA, SLH-DSA, and Falcon a.k.a. FN-DSA) which each have different trade-offs.
SLH-DSA is slow and requires a large amount of data for signatures, however it's considered the most secure of the algorithms (since it's based on the well-understood security properties of symmetric hash algorithms) so it was selected primarily as a "backup" in case the other two algorithms are both broken (which may be possible as they're both based on the same mathematical structure).
ML-DSA and Falcon are both fairly fast (within an order of magnitude of Ed25519, the Curve25519-based signature algorithm), but both require significantly larger keys (41x/28x) and signatures (38x/10x) compared to Ed25519. Falcon has the additional constraint that achieving the listed performance in that table requires a hardware FPU that implements IEEE-754 with constant-time double-precision math. CPUs that do not have such an FPU will need to fall back to software emulation of the required floating-point math (most phone, desktop, and server CPUs have such an FPU, but many embedded CPUs and microcontrollers do not).
The net result is that TLS handshakes with PQ signatures and key exchange may balloon to high single- or double-digit kilobytes in size, which will be especially impactful for users on marginal connections (and may break some "middle boxes" https://blog.cloudflare.com/nist-post-quantum-surprise/#dili...).
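To see roughly where "double-digit kilobytes" comes from, here's a back-of-the-envelope sum assuming ML-DSA-44 everywhere (leaf key, chain signatures, SCTs, CertificateVerify) and ML-KEM-768 for key exchange; the chain shape below is an assumption for illustration, not a measurement of any real deployment:

```go
package main

import "fmt"

func main() {
	const (
		mldsaPub = 1312 // ML-DSA-44 public key (bytes)
		mldsaSig = 2420 // ML-DSA-44 signature (bytes)
		kemKey   = 1184 // ML-KEM-768 encapsulation key in the ClientHello (bytes)
	)
	pubKeys := 2 * mldsaPub // leaf + intermediate certificate keys
	sigs := 5 * mldsaSig    // sig on leaf, sig on intermediate, CertificateVerify, 2 SCTs
	total := pubKeys + sigs + kemKey
	// Prints roughly 15.5 KiB, before counting the classical crypto and cert metadata.
	fmt.Printf("~%.1f KiB of post-quantum material per handshake\n", float64(total)/1024)
}
```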
Post-quantum signatures will require rotating your keys, but that's less urgent.
The Internet was not created for this.
One could argue that they are very good at preventing DDoS attacks, and yes, they are; however, they have always loved control and kept their technology proprietary to lock their customers into their systems. And one day, a single line of code disrupted many services on the web.
Centralization and monopolies are much bigger threats to the future of the internet, IMHO. (Which always follows the same pattern: give your customers free or unbelievably cheaper services, even at a loss, lock them in, then jack up the price.)
The big change here is that we're going to roll out PQ authentication as well.
One important decision was to make this "included at no extra cost" with every plan. The last thing the Internet needs is blood-sucking parasites charging extra for this.
Context, two nearly identical comments from different users.
hackerman70000 at 16:09 https://news.ycombinator.com/item?id=47677483 :
> Cloudflare pushing PQ by default is probably the single most impactful thing that can happen for adotpion. Most developers will never voluntarily migrate their TLS config. Making it the default at the CDN layer means millions of sites get upgraded without anyone making a decision
valeriozen at 16:17 https://news.ycombinator.com/item?id=47677615 :
> cloudflare making pq the default is the only way we get real adoption. most devs are never going to mess with their tls settings unless they absolutely have to. having it happen at the cdn level is the perfect silent upgrade for millions of sites without the owners needing to do anything
So practically immediately after DES was standardized, people realized that NSA had crippled it by limiting the key length to 56 bits, and they started to use workarounds.
Before introducing RC2 and RC4 in 1987, Ronald Rivest had used since 1984 another method of extending the key length of DES, named DESX, which was cheaper than DES-EDE as it used a single block cipher invocation. However, like RC4, DESX was kept as an RSA trade secret until it was leaked (also like RC4) during the mid-nineties.
IDEA (1992, after a preliminary version was published in 1991) was the first block cipher that was more secure than DES and was also publicly described.
Quantum computers are not a threat for spies or for communications within private organizations where security is considered very important: there, the use of public-key cryptography can easily be avoided entirely, and authentication and session-key exchange can be handled with pre-shared secret keys used only for that purpose.
Most likely the NSA or someone else is ahead of the game and already has a quantum computer. If the tech news rumors are true, the NSA has a facility in Utah that can gather large swaths of the internet and process the data.