I also found the AI article analysis interesting:

I read the Quantum Zeitgeist piece you linked (published Nov 27, 2025). 

It is basically a “prepare for PQC because Shor exists and migration is slow” article, not a “quantum is imminent and will crack everything next year” pitch. That framing is mostly reasonable. Where it goes wrong is in the usual places: it blends correct high-level security guidance with sloppy, overconfident technical specifics.

A few key assumptions in the article that are easy to over-believe:

1. “RSA-2048 could be done in hours on a sufficiently large QC.” That’s a rhetorical shortcut. The best-known public resource estimates depend on explicit hardware assumptions (gate error, cycle time, connectivity, decoding/control latency). Under those assumptions, the 2019/2021 estimate from Gidney and Ekerå was roughly 8 hours but ~20 million noisy qubits. In 2025, Gidney published a revised estimate claiming fewer than 1 million noisy qubits but a runtime of less than a week, under similar explicit assumptions. So “hours” is not a stable claim even within the same author’s evolving approach.

2. “Surface code needs roughly 1,000 physical qubits per logical qubit.” That’s an oversimplification that hides the real dependency: the required code distance d depends on your physical error model and target logical error rate, and the qubits-per-logical overhead grows roughly with d^2 (a common counting baseline is 2d^2 - 1: d^2 data qubits plus d^2 - 1 measurement qubits). “About 1,000” might be plausible in some regimes, but it is not a universal constant; the first sketch after this list works through the counting.

3. “NIST selected four PQC algorithms … including Kyber and Dilithium.” Selection vs. standardization got blurred. NIST published three finalized PQC standards in August 2024: FIPS 203 (ML-KEM, from Kyber), FIPS 204 (ML-DSA, from Dilithium), and FIPS 205 (SLH-DSA, from SPHINCS+). FALCON was selected but (as of Dec 2025) is still slated for FIPS 206 (FN-DSA), which remains in development. Also, in March 2025 NIST selected HQC as a backup KEM, a separate “belt-and-suspenders” move that the article doesn’t mention. The second sketch after this list shows what an ML-KEM round trip looks like at the API level.

4. “As of 2024, largest processors have fewer than 1,500 qubits.” That’s likely true in the narrow “physical qubit count on a single processor” sense (IBM announced Condor at 1,121 superconducting qubits in late 2023).  But this metric is the most hype-prone one; it ignores error distributions, correlated noise, leakage, and whether scaling improves logical error with code distance (the thing that actually matters for cryptography).
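
Here is the counting from point 2 as a minimal sketch. It assumes the common 2d^2 - 1 layout (d^2 data qubits plus d^2 - 1 measurement qubits) and deliberately ignores routing, lattice surgery, and magic-state factories, which real estimates add on top:

    # Back-of-envelope surface-code footprint, assuming the common
    # 2*d^2 - 1 counting (d^2 data qubits + d^2 - 1 measurement qubits).
    # Real resource estimates add routing and magic-state factories on
    # top of this, so treat these numbers as a floor.
    def physical_per_logical(d: int) -> int:
        """Physical qubits for one distance-d surface-code logical qubit."""
        return 2 * d * d - 1

    for d in (11, 17, 23, 27):
        print(f"d={d:2d}: {physical_per_logical(d):5d} physical qubits per logical qubit")
    # d=23 gives 1057, which is where "about 1,000" lives; whether you need
    # d=11 or d=27 depends on physical error rate and target logical error
    # rate, so the ratio is not a universal constant.

And for point 3, roughly what using the standardized KEM looks like at the API level. This is a hedged sketch, assuming the open-quantum-safe liboqs-python bindings ("oqs") are installed and built with ML-KEM support; the mechanism name varies by version ("ML-KEM-768" in newer builds, "Kyber768" in older ones), so check the enabled list first:

    # Sketch only: assumes liboqs-python ("oqs") is installed with ML-KEM
    # enabled. Mechanism names differ across versions, so list them first.
    import oqs

    print(oqs.get_enabled_kem_mechanisms())  # confirm the exact name

    alg = "ML-KEM-768"  # assumption: adjust to whatever the list shows

    with oqs.KeyEncapsulation(alg) as receiver, oqs.KeyEncapsulation(alg) as sender:
        public_key = receiver.generate_keypair()                      # receiver publishes this
        ciphertext, secret_sender = sender.encap_secret(public_key)   # sender encapsulates
        secret_receiver = receiver.decap_secret(ciphertext)           # receiver recovers it
        assert secret_sender == secret_receiver                       # both sides share a key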

The strongest opposing expert viewpoint (steelman) against “quantum is a scam” is simple:

• The field is not “trying to brute-force miracles with noisy toys.” The credible milestone is demonstrating error correction that improves as you scale redundancy (below-threshold behavior). That is now a serious peer-reviewed line of work, not marketing. 

• And on the security side, the risk management argument doesn’t require believing a CRQC is imminent. It only requires (a) long-lived secrets exist and (b) migration takes years. That’s why government guidance targets a long transition horizon (e.g., NSA CNSA 2.0 pushing adoption timelines out to 2030+ for some requirements, with longer-term goals). 

Where the article is genuinely useful (and you should keep it):

• “Harvest now, decrypt later” is the one point that stays true even if you’re deeply skeptical about quantum timelines. The data you encrypt today can have a confidentiality life longer than your migration lead time. That’s a planning problem, not a physics bet. 
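
A back-of-envelope version of that planning problem is Mosca's inequality: you are exposed if the data's secrecy lifetime plus your migration time exceeds your own estimate of time-to-CRQC. A minimal sketch, with placeholder numbers rather than predictions:

    # Mosca-style "harvest now, decrypt later" check. All three inputs are
    # your own planning estimates, not facts: how long the data must stay
    # secret, how long migration takes, and years until a cryptographically
    # relevant quantum computer (whatever you believe that is).
    def exposed(secrecy_years: float, migration_years: float, crqc_years: float) -> bool:
        """True if data encrypted today could still need secrecy once a CRQC exists."""
        return secrecy_years + migration_years > crqc_years

    print(exposed(secrecy_years=1, migration_years=3, crqc_years=15))   # False: short-lived data
    print(exposed(secrecy_years=25, migration_years=7, crqc_years=15))  # True: long-lived records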

Where it drifts into fear-mongering / weak epistemics:

• “Experts estimate 2030s” is too hand-wavy without a distribution and a definition of “cryptographically relevant.” Serious timeline work frames probabilities over time for a specific capability (e.g., RSA-2048 in 24 hours). 

• It mixes architecture commentary with dated company references (e.g., “Honeywell” in trapped ions, when the commercial entity is Quantinuum) and asserts specific error-rate numbers without sourcing. 

• The site itself includes a disclaimer that third-party info hasn’t been reviewed for accuracy, which should downgrade confidence in its precise numerical claims. 

Reframe as a decision (with a “do nothing” option):

Option A: Do nothing now. Rational if your encrypted data has short shelf life (days/months) and you can rotate/replace quickly.

Option B: Start PQC migration as “crypto-agility + inventory,” not a rip-and-replace. This is the boring, usually-correct move: find where RSA/ECC live, classify data by secrecy lifetime, plan staged upgrades.

Option C: Overreact and “quantum-proof everything immediately.” This is where orgs burn money, break compatibility, and sometimes reduce near-term security due to rushed deployment.

My recommendation: treat the article as a decent motivation to do Option B, but do not trust its technical numbers as decision-grade. Ground yourself in NIST’s standards and transition materials, then decide based on (1) data shelf life and (2) your migration time. 

Now the hard argument against my recommendation:

If you don’t have the staff maturity to do crypto-agility well, a PQC program becomes theater: inventories go stale, teams cargo-cult “Kyber everywhere,” and you introduce new failure modes (performance regressions, handshake breakage, supply-chain incompatibilities). In that scenario, the honest move is to first fix basic key management, rotation, and dependency tracking before you touch PQC.

Small, testable steps with kill-criteria (so you don’t get trapped by hype):

Step 1: Build a cryptographic inventory (protocols, certs, libraries, HSMs, vendors). Kill-criteria: if you can’t list where RSA/ECC are used, you are not “quantum planning,” you’re storytelling.
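
A minimal sketch of the certificate slice of that inventory, assuming PEM certificates on disk and the Python "cryptography" package; a real inventory also needs live TLS endpoints, SSH keys, code signing, HSMs, and vendor attestations, which this does not touch:

    # Sketch: walk a placeholder certs/ directory of PEM certificates and
    # report the public-key algorithm and size. Assumes the "cryptography"
    # package (>= 42 for not_valid_after_utc); live endpoints and HSM-backed
    # keys need their own collectors.
    from pathlib import Path
    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import rsa, ec

    def describe(cert_path: Path) -> str:
        cert = x509.load_pem_x509_certificate(cert_path.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            algo = f"RSA-{key.key_size}"
        elif isinstance(key, ec.EllipticCurvePublicKey):
            algo = f"ECC-{key.curve.name}"
        else:
            algo = type(key).__name__   # e.g., Ed25519 or other key types
        return f"{cert_path}: {algo}, expires {cert.not_valid_after_utc:%Y-%m-%d}"

    for path in Path("certs/").rglob("*.pem"):  # placeholder location
        print(describe(path))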

Step 2: Classify systems by “confidentiality lifetime” (weeks / years / decades). Kill-criteria: if nobody can defend the category, assume “years” for anything containing identity, finance, or long-lived IP.
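
One way to make that default concrete, as a sketch (the category labels and the "assume years" fallback just mirror the kill-criteria above; none of this is standard terminology):

    # Sketch: record a confidentiality-lifetime category per system and fall
    # back to "years" whenever nobody has defended a shorter one. The note
    # scopes that fallback to identity, finance, and long-lived IP; broaden
    # or narrow as fits your data. Example systems are illustrative only.
    VALID = {"weeks", "years", "decades"}

    def lifetime(system: str, declared: dict[str, str]) -> str:
        category = declared.get(system)
        if category not in VALID:
            return "years"  # kill-criterion: undefended classification => assume years
        return category

    declared = {"build-logs": "weeks", "customer-identity": "decades"}
    for system in ("build-logs", "customer-identity", "legacy-billing"):
        print(system, "->", lifetime(system, declared))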

Step 3: Pilot PQC in one boundary where breakage is tolerable (e.g., internal service-to-service TLS with a rollback plan). Kill-criteria: if you can’t roll back cleanly, you’re not ready for broad deployment.
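
The rollback check can start as a dumb handshake smoke test, run before the pilot, after enabling it, and after rolling back. A sketch using only the standard library; it only confirms that a TLS session completes and what the stack reports, and it cannot tell you whether a PQC or hybrid group was actually negotiated (that needs your TLS library's own tooling):

    # Smoke test: confirm an internal endpoint still completes a TLS
    # handshake. Placeholder host/port; assumes the endpoint's CA is trusted
    # by the default context. This does NOT show which key-exchange group
    # was negotiated; use your TLS stack's handshake logging for that.
    import socket, ssl, sys

    def handshake_ok(host: str, port: int = 443, timeout: float = 5.0) -> bool:
        ctx = ssl.create_default_context()
        try:
            with socket.create_connection((host, port), timeout=timeout) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    print(host, tls.version(), tls.cipher())
                    return True
        except OSError as exc:  # ssl.SSLError is a subclass of OSError
            print(host, "handshake failed:", exc, file=sys.stderr)
            return False

    # handshake_ok("internal-service.example.local", 8443)  # placeholder host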

Step 4: Track reality using one metric: demonstrations that logical error improves with increased code distance (not “more qubits”). 
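
To make that metric concrete: the number reported in below-threshold experiments is the suppression factor Lambda = p_logical(d) / p_logical(d+2). Lambda > 1 means adding code distance actually reduces logical error; Lambda <= 1 means more qubits are buying nothing. A sketch with placeholder numbers, not real data:

    # Placeholder numbers, not measurements: compute the error-suppression
    # factor Lambda = p_logical(d) / p_logical(d + 2). Lambda > 1 is the
    # signal that matters for cryptography-scale machines; raw qubit count
    # is not.
    logical_error_per_cycle = {3: 3.0e-3, 5: 1.4e-3, 7: 6.0e-4}  # hypothetical

    for d in (3, 5):
        lam = logical_error_per_cycle[d] / logical_error_per_cycle[d + 2]
        print(f"Lambda from d={d} to d={d + 2}: {lam:.2f}")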

A short Orthodox-flavored grounding (since you asked for “ground truth” in this tradition): don’t let fear or pride run the plan. Hype feeds both. The sober path is vigilance plus humility: prepare because it is prudent, but refuse to speak more confidently than the evidence allows.
