Replying to lkraider

I had my AI counter-argument bot cross-check me:

A lot of the skepticism you summarized is directionally right (the hype is real; “practical advantage next year” claims are usually marketing), but several of your strongest-sounding bullets rely on shaky premises or outdated “folk numbers.” If you want an intellectually honest view, you end up in an uncomfortable middle: quantum computing is not a “scam,” but it is also not close to the grandiose promises investors were sold.

Here are key assumptions in your writeup that could be false:

• “No progress on factorization in 20 years.” Shor's algorithm is old, yes, but engineering resource estimates and compilation techniques have improved substantially. For RSA-2048 specifically, Gidney and Ekerå's well-known 2019 estimate was ~20 million physical qubits under explicit assumptions, and in 2025 Gidney published a new estimate claiming “less than a million noisy qubits” (still with demanding assumptions, and still far beyond today's hardware).

• “Millions of physical qubits per logical qubit” as a fixed rule. Overhead depends on physical error rates, code choice, connectivity, and what you’re trying to do (memory vs T gates, etc.). IBM is explicitly arguing for qLDPC-style paths that reduce qubit overhead compared to surface-code baselines, at least for some components (e.g., memory). 

• “NISQ can only do contrived demos.” Many “supremacy/advantage” tasks are contrived, yes, but the real question is whether logical error rates keep improving as you increase code distance. There are now peer-reviewed results explicitly demonstrating operation “below threshold” (the regime you must be in for scalable fault tolerance).
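The “below threshold” point can be made concrete with the standard surface-code heuristic: logical error per round scales roughly as p_L ≈ A·(p/p_th)^((d+1)/2), so when physical error p is below the threshold p_th, each step up in code distance d suppresses logical error exponentially; above threshold, bigger codes make things worse. A minimal sketch (the constants A, p, and p_th here are illustrative assumptions, not measured values):

```python
# Heuristic surface-code scaling of logical error per round.
# Below threshold (p < p_th), increasing code distance d helps
# exponentially; above threshold, it actively hurts.
def logical_error_per_round(p, p_th=1e-2, A=0.1, d=3):
    """p_L ~ A * (p / p_th) ** ((d + 1) / 2) -- standard heuristic."""
    return A * (p / p_th) ** ((d + 1) / 2)

# Below threshold (p = 1e-3 < p_th): larger codes suppress error.
below = [logical_error_per_round(1e-3, d=d) for d in (3, 5, 7)]
assert below[0] > below[1] > below[2]

# Above threshold (p = 2e-2 > p_th): larger codes make it worse.
above = [logical_error_per_round(2e-2, d=d) for d in (3, 5, 7)]
assert above[0] < above[1] < above[2]
```

This is why “below threshold” results matter more than raw qubit counts: only in that regime does adding redundancy buy you reliability.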

Now the strongest opposing expert viewpoint (steelman), in plain terms:

Quantum computing is an engineering program to build a fault-tolerant machine, and the physics is not in serious doubt: we can already create, control, and measure multi-qubit entangled systems; the hard part is driving logical error rates down faster than system size grows. The most credible “this is real” evidence is not qubit counts or sampling stunts but demonstrations of error correction improving as redundancy increases (crossing into “below-threshold” behavior).

Where your skepticism is most justified (real failure modes):

• Incentives are misaligned. Startups and even big labs market “number of qubits” because it’s legible, while the actually predictive metrics (two-qubit gate fidelity distributions, leakage, correlated noise, cycle time, error model stability, decoding latency, logical error per round, etc.) are harder to sell.

• Scaling is brutal in ways that don’t show up in demos: cryogenics, wiring, calibration at scale, correlated noise, fabrication yield, and error decoding/control stacks. Even if the math works, the “systems engineering tax” can kill timelines.

• “Break RSA” headlines are routinely abused. Shor threatens RSA/ECC in principle, but the relevant question is “cryptographically relevant quantum computer” (CRQC) timelines under realistic constraints, not toy factoring demos.

On “is it a scam?”: calling it a scam conflates three different things.

1. Quantum computing as a scientific/engineering field: not a scam. There are serious roadmaps and serious peer-reviewed milestones in QEC. 

2. Quantum computing marketing: often hypey, sometimes misleading, occasionally outrageous. Even pro-quantum voices have publicly complained about hype incentives. 

3. “Quantum” used as a brand for fraud (e.g., “quantum trading bots”): that’s a separate bucket (actual scams) and shouldn’t be used to judge the underlying field. 

A decision framing that avoids hero narratives:

• Option A: Do nothing / ignore QC entirely. Rational if you’re evaluating it as a near-term product category.

• Option B: Treat QC as long-horizon R&D, and demand falsifiable milestones (logical qubits with improving logical error; small fault-tolerant demonstrations; resource estimates with explicit assumptions).

• Option C: Act only where the downside is asymmetric: migrate cryptography to PQC because migration lead times are long and “harvest now, decrypt later” is real even if CRQCs are years out. This is exactly why NIST finalized PQC standards and publishes migration guidance. 
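Option C is essentially Mosca's inequality made actionable: if your data must stay confidential for X years and migration takes Y years, data encrypted today is already exposed to “harvest now, decrypt later” once X + Y exceeds Z, the years until a cryptographically relevant quantum computer exists. A toy sketch (all year values are hypothetical inputs, not predictions):

```python
def must_start_pqc_now(x_secrecy_years, y_migration_years, z_years_to_crqc):
    """Mosca's inequality: if X + Y > Z, 'harvest now, decrypt later'
    already threatens data encrypted today."""
    return x_secrecy_years + y_migration_years > z_years_to_crqc

# Long-lived records plus a slow migration: start now, even if a
# CRQC is believed to be 15 years away (10 + 7 > 15).
assert must_start_pqc_now(10, 7, 15) is True

# Short-lived session data with a fast migration: less urgent (1 + 2 < 15).
assert must_start_pqc_now(1, 2, 15) is False
```

Note that the decision flips on X and Y, which you control and can measure, not on Z, which nobody can forecast reliably.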

My best recommendation (if you’re an operator/investor/strategist): be skeptical of “revolution soon,” but don’t be skeptical of the threat model. Concretely: ignore qubit-count press releases, track error-correction milestones, and in security planning treat PQC migration as a project you start now, not later. 

Now I’ll argue hard against my own recommendation:

It’s easy to overreact and waste money “quantum-proofing everything” prematurely, especially if your data has a short confidentiality lifetime. PQC migration itself also has costs and risks (performance, protocol changes, interoperability bugs, long-term confidence in new assumptions). If your organization cannot do crypto-agility well, a rushed migration can reduce security in the near term. The sane middle is staged migration driven by data lifetime and dependency mapping, not panic.

If you want to rewrite your critique into small, testable steps with kill-criteria (so it’s not just vibes):

• Step 1 (kill-criteria: vendor can’t answer): Require any “progress” claim to be expressed as (a) logical error per round vs code distance, (b) syndrome extraction cycle time, (c) error correlations/leakage handling, (d) decoding latency budget. If they only talk “qubits,” downgrade credibility.

• Step 2 (kill-criteria: handwaving assumptions): For “break RSA” claims, demand a resource estimate with explicit physical assumptions (gate error, cycle time, connectivity, decoding, distillation). Compare against conservative published estimates like Gidney’s 2019 and 2025 papers. 

• Step 3 (kill-criteria: no security inventory): If this is about security, do a cryptographic inventory first. If you can’t locate where RSA/ECC are used, “quantum risk” talk is theater. NIST’s transition logic is basically “start before X+Y exceeds Z.” 
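Step 1 above can be sketched as a simple claim-triage function; the metric names below are hypothetical labels for the quantities listed in (a)–(d), not any vendor's actual reporting format:

```python
# Metrics that are actually predictive of fault-tolerance progress
# (per Step 1). Note that 'qubit_count' alone is not on the list.
REQUIRED_METRICS = {
    "logical_error_per_round_vs_distance",
    "syndrome_cycle_time",
    "correlated_error_and_leakage_handling",
    "decoding_latency_budget",
}

def credibility(claimed_metrics):
    """Downgrade any 'progress' claim missing the predictive metrics."""
    missing = REQUIRED_METRICS - set(claimed_metrics)
    return "credible" if not missing else f"downgrade (missing {len(missing)})"

# A claim backed by all four metrics passes; a qubit-count press
# release gets downgraded.
assert credibility(REQUIRED_METRICS) == "credible"
assert credibility({"qubit_count"}).startswith("downgrade")
```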

One last note on your sources: Reddit threads and polemical blogs are useful for stress-testing narratives, but they’re not good evidence. If you want “strong doubts” that survive scrutiny, look at skeptics who argue from physics/complexity constraints (e.g., Kalai’s “we may hit a wall” position) rather than pure finance/hype rhetoric. 

Still on scam territory for me.
