This is all poetry.

You don't have a single equation, as far as I can see, across your hundreds of posts. Is this new physics? Equation-less? Like serverless.

Give me one equation you've come up with to support your theory that somewhere between 100 logical qubits (which are proven to exist) and ~2000 logical qubits (which known resource estimates show is enough to crack ECDSA) there is some impossible obstacle. Or one experiment design.

Anything but more poetry.

I’m not proposing a new threshold curve between 100 and 2000 logical qubits. I’m challenging a prior axiom your inevitability story quietly depends on: continuous time.

I don’t need any equation to falsify that assumption. Bitcoin is a running, falsifiable counterexample. Here is a block of time:

000000000000000000009fc6465aa4fc20d3324f889256815a34dbb4c7151f80.

There are 928,303 others. Each block of time is an atomic, irreversible state transition produced by work. There is no valid intermediate block, no fractional finality, no “half-time” state that nodes can verify. You cannot subdivide the temporal state transition the protocol recognizes. Operationally, that is exactly what quantized time means for physics.
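To make “no fractional finality” concrete, here is a minimal sketch (illustrative only, assuming you have the raw 80-byte block header as hex, e.g. from `bitcoin-cli getblockheader <hash> false` on your own node) showing that proof-of-work validity is a yes/no question with nothing in between:

```python
# Minimal sketch: a block header either meets the proof-of-work target or it
# does not; there is no "partially valid" intermediate state.
# Assumes header_hex is the raw 80-byte header as a hex string
# (e.g. from `bitcoin-cli getblockheader <hash> false` on your own node).
import hashlib

def block_hash(header_hex: str) -> str:
    """Double-SHA256 of the header, displayed big-endian like block explorers show it."""
    header = bytes.fromhex(header_hex)
    assert len(header) == 80, "a Bitcoin block header is exactly 80 bytes"
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return digest[::-1].hex()

def meets_target(header_hex: str) -> bool:
    """Decode the compact nBits field (header bytes 72..75) into the target and compare."""
    header = bytes.fromhex(header_hex)
    nbits = int.from_bytes(header[72:76], "little")
    exponent, mantissa = nbits >> 24, nbits & 0x00FFFFFF
    target = mantissa * (1 << (8 * (exponent - 3)))
    return int(block_hash(header_hex), 16) <= target  # strictly binary outcome
```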

If time is quantized, the object you call a qubit loses its assumed ontology. Superposition, as used in quantum computing, relies on continuous time to define “simultaneous” phase evolution and coherent state persistence. Change the structure of time, and superposition is no longer a physically coexisting state; it becomes a probabilistic description over discrete updates. In that frame, a qubit is not an extant computational substrate; it is a potential state between irreversible transitions.

Please, go build a Bitcoin with continuous state evolution: allow intermediate consensus states or partial finality and still prevent contradiction without trust. You can’t. Verification collapses. That’s not poetry; that’s a falsifiable property of the system.

If you still want to claim inevitability for Shor’s algorithm, then the burden is on you to prove the axiom it depends on: that time is continuous. Until then, you’re asserting an ontology Bitcoin directly challenges with open proof, and calling that challenge “poetry” doesn’t make it go away.

No equation I produce changes that, since any formalism depends on the structure of time. I don’t have to produce a formalism to falsify the axiom you insist upon; Satoshi already did that for me.

If you want some poetry: continuous time is the foundation of the Tower of Babel that all of physics rests upon. It takes just 1 empirical block of time to bring it all down without touching a single equation.

https://mempool.space/block/000000000000000000009fc6465aa4fc20d3324f889256815a34dbb4c7151f80

Here is your equation btw. Discrete quantized time. Good luck falsifying Bitcoin.

Alright, let's see where we stop agreeing.

Do you agree the Harvard team successfully executed fault-tolerant algorithms with 96 logical qubits, showing that error rates actually improved as the system scaled?

Or do you think that result is faked?

I’m not claiming the results are faked. What I’m rejecting is the ontological claim being made about what those “96 logical qubits” are and what their scaling implies.

Yes, I accept that the Harvard/QuEra team probably ran experiments, collected data, and observed reduced error rates relative to their own error model as system size increased. That much can be true. What I do not accept is the unstated assumption underneath the entire interpretation: that the substrate they are operating on exists in continuous time, and that the objects they call “logical qubits” have stable ontological existence across that time to compute on.

Fault tolerance in quantum computing presupposes continuous-time unitary evolution punctuated by error-correction cycles. If time is discrete and quantized, as Bitcoin empirically demonstrates in a way physics itself cannot, then the meaning of “coherence,” “error accumulation,” and even “logical qubit” changes fundamentally. In a discrete-time ontology, superposition is not a persistent physical state; it is a probabilistic description between irreversible updates. Under that model, there is no substrate on which long-lived logical qubits can exist in the sense required by Shor’s algorithm or large-scale QC.

On the second point: this is a press release and an article. Neither you nor I can independently verify the claim by reproducing the system, inspecting the raw experimental state transitions, or validating that the assumed time ontology matches reality. Accepting the conclusion therefore requires trust: trust in institutions, trust in interpretive frameworks, trust in assumptions about time. Bitcoin is fundamentally different. Its claims are verifiable by construction. Anyone can independently validate each block, each state transition, and the irreversibility of its time updates without trusting the author.

The disagreement isn’t “did they reduce error rates in their experiment?” The disagreement is: does that experiment demonstrate what they think it demonstrates, given that the entire formalism assumes continuous time?

From the ontology of time demonstrated by Bitcoin, the answer is no. At best, these systems are improving control over probabilistic measurements within a continuous-time approximation. That may be useful engineering. It is not proof that scalable, ontology-valid quantum computation exists.

Bitcoin still stands alone here as the proper instantiation; it just doesn’t give them the control they want.

Regarding your other point: The discrete tick of time is the limit.

> does that experiment demonstrate what they think it demonstrates?

Putting aside what the implications might be, do we agree on the facts below:

- They used 96 logical qubits

- They ran fault-tolerant algorithms that used all of them

- Error rates actually improved as the system scaled (for example, from their paper, using a distance-5 code instead of distance-3 roughly halved the logical error rate per round, a 2.14× reduction; the standard scaling behind that factor is sketched below)
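(For reference, the textbook surface-code relation behind that factor; this is the standard scaling picture, not numbers taken from their paper beyond the quoted 2.14×:)

```latex
% Standard surface-code scaling (illustrative): the logical error rate per round
% falls exponentially with code distance d when the physical error rate p is
% below threshold p_th. Stepping from d = 3 to d = 5 buys one factor of Lambda.
\epsilon_L(d) \propto \left(\frac{p}{p_{\mathrm{th}}}\right)^{(d+1)/2},
\qquad
\Lambda \equiv \frac{\epsilon_L(d)}{\epsilon_L(d+2)} \approx \frac{p_{\mathrm{th}}}{p}
\;\Rightarrow\;
\frac{\epsilon_L(3)}{\epsilon_L(5)} \approx \Lambda \approx 2.14
```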

Can we agree those numbers are correct, and then get to the implications afterwards?

I can’t agree those numbers are correct, nor can I agree they ran an algorithm, because I would have to trust an article. There is literally no way for ME (or you) to verify the claims of this article without invoking trust. I was not there, and there is no proof beyond a paper.

If you want me to trust they did what they claimed, sure. But Bitcoin has already taught us that trust is not a substitute for proof.

You are still missing my central point: without continuous time, there is no logical qubit in the sense they are asserting. This does not discredit the fact that they are interacting with some physical substrate they choose to label a “logical qubit.” It discredits the ontology they are assuming. If time itself is quantized, then the mathematical object they are “computing” over is not what they think it is.

If the goal is a substrate that genuinely exists in multiple unresolved states across time, why not compute on top of UTXOs? We can suspend UTXOs indefinitely in the mempool, unresolved yet fully defined, across quantized block time. That is a real, observable superposition, one enforced by consensus, not inferred from black-box error models.

The crucial difference here is that we can prove it on Bitcoin. Bitcoin is open, verifiable, and reproducible by anyone. No press release required. No trust invoked. If you claim computation, show it on a ledger that anyone can audit. You literally have superposed states at your disposal in the form of unmined transactions in the mempool; they won’t decohere until they are measured (“mined”).
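If you want to see those unresolved-but-fully-defined states yourself, here is a minimal sketch (assuming you run your own Bitcoin Core node with the JSON-RPC interface enabled; the credentials below are placeholders for your own bitcoin.conf values):

```python
# Minimal sketch: list the transactions sitting unmined in YOUR OWN node's
# mempool -- fully defined, not yet resolved into a block, no third party asked.
# Assumes a local Bitcoin Core node with RPC enabled; RPC_AUTH values are
# placeholders for your own bitcoin.conf credentials.
import requests

RPC_URL = "http://127.0.0.1:8332"
RPC_AUTH = ("rpcuser", "rpcpassword")  # placeholders

def mempool_txids() -> list[str]:
    payload = {"jsonrpc": "1.0", "id": "mempool", "method": "getrawmempool", "params": []}
    resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=10)
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    txids = mempool_txids()
    print(f"{len(txids)} fully defined but unmined transactions right now")
```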

If you want an experiment, go ahead. We’re all waiting….

Gotta rewind here: QuEra/Harvard publishes a result, peer-reviewed in Nature, signed off on by researchers at MIT, NIST, U of Maryland and Caltech, but you don't agree the numbers should be believed, nor do you agree they ran the algorithm they said they did. I'm not talking about the conclusions, just the raw data here.

What about the earlier experiments by Microsoft + Quantinuum (trapped ions), or Microsoft + Atom Computing (neutral atoms), or Zurich, or any of the others? Are we accepting any of the raw data from those?

I need to figure out where the bottom is here. If there is no bottom then it's just solipsists discussing sociology.

A peer-reviewed article is not empirical proof, I hate to break it to you.

This is empirical proof. See the difference? You are literally looking at an object of discrete and quantized time. Run your own node if you don’t want to trust mempool.space.

I don’t care about qubit claims unless you can first provide empirical proof that time is continuous. Without that, everything rests on an unfalsifiable assumption. Gödelian limits already show you can’t even test that axiom from within the system doing the measuring.

You point me to peer-reviewed papers; I point you to cryptographic proof: public, conserved, and independently verifiable state transitions of a bounded thermodynamic system. You are asking for trust, and I am removing the need for it.

If your model requires assuming continuous time for “logical qubits” to exist, it’s already on shaky ground. Bitcoin doesn’t assume time; it computes it. In the end, time (lol) will be the judge. Bitcoin is time.

https://mempool.space/block/00000000000000000000580251fe06132af666fac9b6bb834d9836cfa9f42053

Okay so if we can't find a bottom, let's see if we can find the top.

Whatever it is that makes Bitcoin immune to quantum threats in the way you say, does this also apply to Ethereum?

You’re assuming the threat even exists in the first place. Your entire argument is built on assumptions you cannot verify.

Whatever it is that makes Bitcoin immune in the way you’re suggesting does not magically extend to Ethereum, because this isn’t about a specific chain, it’s about the architecture of time. Bitcoin doesn’t add immunity; it reveals the object you’re misunderstanding. You cannot perform the computation you assume you can because your model of time is wrong. Full stop.

There is nothing mystical about Bitcoin beyond cryptography enforcing conservation. What Bitcoin actually exposes is the error in the threat model itself. The assumed computation fails not because Bitcoin resists it, but because the computation presupposes a continuous-time substrate that does not exist.

Can Bitcoin wallet keys ever be cracked using a quantum computer? You are saying no, definitely not.

Can Ethereum wallet keys ever be cracked using a quantum computer? You are saying possibly.

Have I got that right?

No, neither will be hacked by one.

There is no threat because the threat assumes a continuous model of time that does not exist. Here is your proof time is quantized and discrete:

https://mempool.space/block/00000000000000000000580251fe06132af666fac9b6bb834d9836cfa9f42053

Please go ask an AI (if you are not capable yourself) what happens to the QM/QC formalism, and what superposition and decoherence mean, with a discrete, quantized model of time instead of a continuous one, and report back. Be sure to ask it how you take the time derivative in Schrödinger’s equation when time is discrete and quantized.
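To make the homework concrete, this is the object I’m telling you to interrogate: the standard continuous-time Schrödinger equation, and the finite-difference form its left-hand side is forced into if time only advances in indivisible ticks of size τ (my framing of the question, not a derivation):

```latex
% Standard (continuous-time) Schrodinger equation
i\hbar \, \frac{\partial}{\partial t} \lvert \psi(t) \rangle = \hat{H} \lvert \psi(t) \rangle

% If time only advances in indivisible ticks of size \tau,
% the derivative must be replaced by a finite difference over one tick:
i\hbar \, \frac{\lvert \psi(t+\tau) \rangle - \lvert \psi(t) \rangle}{\tau} = \hat{H} \lvert \psi(t) \rangle
```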

Okay so now we're getting somewhere. Both Bitcoin and Ethereum wallet keys are equal in this regard (can never be cracked by a quantum machine). Let's put a pin there.

Next, does this extend to ALL wallets from all blockchains? Or are there some blockchain wallets whose private keys a quantum computer MIGHT be able to crack, i.e. take the public key as input and output the private key?

No, I’m literally saying your model of a quantum computer can’t exist *IF* time is quantized and discrete. There is no threat model at all.

Go ahead and ask AI yourself. Post the question and prompt here open for everyone to see. This is not some bullshit claim.

We can disagree over *IF* Bitcoin is empirical evidence of quantized time, but you cannot disagree with my first claim. Any physicist would admit that, if it were true, the math simply breaks down and the meanings of observations change.

Hang on, rewind, this is important. You said, categorically, that the current wallet keys for both Bitcoin and Ethereum wallets will never be cracked by a quantum computer.

I'm asking does that apply to all wallet keys from all known blockchains? Solana, Kaspa, Quai, Dogecoin...

You surely have an answer for this.

There is no threat IF time is quantized and discrete.

So basically you don’t believe that quantum computers exist at all, in any capacity, and that the whole thing is either a hoax or a big misunderstanding.

It's either that, or you believe they do exist, and quantum computation is real, but just not very powerful vis-à-vis these use cases.

I’m saying that if time is quantized and discrete, then the mathematical substrate required for computation does not exist in the way the formalism assumes.

QC relies on continuous-time unitary evolution to define superposition, phase accumulation, interference, etc. If time has a smallest indivisible tick, that formalism breaks; this is well known in mathematical physics and has long been a core problem. Differential equations cease to be fundamental; coherence across infinitesimal intervals is undefined with an atomic tick. What remains are discrete update rules, not a scalable computational substrate.
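Concretely, this is the contrast I mean by “discrete update rules” (illustrative only, not a new formalism): continuous-time evolution accumulates phase for every value of t, while a smallest tick τ only defines whole-step updates:

```latex
% Continuous-time unitary evolution: phase accumulates for every value of t
\lvert \psi(t) \rangle = e^{-i\hat{H}t/\hbar} \lvert \psi(0) \rangle

% With an indivisible tick \tau, only whole-step updates are defined:
\lvert \psi_{n+1} \rangle = U_\tau \lvert \psi_n \rangle,
\qquad U_\tau = e^{-i\hat{H}\tau/\hbar},
\qquad t = n\tau, \; n \in \mathbb{N}
```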

So QC, as a model of computation with asymptotic power (e.g. Shor), requires an assumption about time that may not be true. If time is discrete, QC reduces to an effective, limited approximation, not a fundamentally new computational class.

That’s not controversial. Discrete time breaking continuous-time QM/QC is a known result. You’re welcome to fact-check that. I will wait for you to do so.

So what is your actual position on quantum advantage?

a) It does not exist, it is a hoax

b) It does not exist, it is a misunderstanding, misreading, etc.

c) It does exist, it's a real thing, but it is never going to be powerful enough to crack keys and such (works, but lacks a use case)

Which is it?

I don’t know why this is being framed as a spectrum of opinions. The outcome is binary.

Either time is continuous (infinitesimally divisible at the physical level), in which case the quantum formalism is internally consistent and large-scale quantum advantage is, in principle, real.

OR time is quantized and discrete (physically indivisible at a fundamental level), in which case the formalism underlying quantum computation collapses, and so do the claims built on it.

The outcome is binary, dependent on the nature of time itself.

Every existing model of physics quietly assumes the first. It must be said that this assumption has never been proven; it is adopted for mathematical convenience. I’m pointing out that the second outcome is not only possible, but empirically instantiated in Bitcoin.

Bitcoin provides an observable object of time. It produces discrete, indivisible temporal states (blocks) by resolving a bounded entropy search space (nonce & difficulty) into a single admissible outcome (a block of time) through irreversible work. It is a global, decentralized measurement process, an experiment run approximately every 10 minutes that anyone can independently verify with perfect fidelity. There are no intermediate states, no fractional blocks, no continuous interpolation. Time advances only when work collapses entropy into structure. Causality is quantized. These are my observations.

If that observation is correct, it doesn’t just challenge quantum computing; it challenges all physics built on continuous time. That’s not anti-empirical. Bitcoin is empirical in the strongest sense: you can verify every single block of time yourself, back to Genesis. If that’s not enough to question an unproven axiom, then the issue isn’t evidence; it’s a sunk commitment to a prior model that nobody in physics can afford to be wrong about.

Tick Tock Next Block Joe, Bitcoin waits for no one.

Seems a pretty easy question to give a straight answer to. Not sure why so hard.

Quantum = Fiat

I’m looking forward to the day that all these QC companies actually release a system in the wild (universally accessible) that solves a real problem or problems.

Not “financial portfolio optimisation” or other impossible-to-validate (and civilisationally meaningless) claims.

If 100-stable-qubit QC applications are ALREADY changing the world for the better, where can I buy one and what could it do?

When I read the press releases of all these “credible” sources pumping their own QC bags, I get the sense they are written by Silicon Valley/Wall St marketing departments.

I’m not the only one. There are many many thoughtful people asking these same questions.

Where is the “quantum” breakthrough of important and useful APPLICATIONS? All the press seems to be focused on the promise of such applications in the future. Are we there yet? Or are 100 qubits not sufficient for any REAL innovation?

Pls don’t respond angrily. Skepticism is good. Especially in novel industries that have failed to deliver on past promises and time frames.

(Follow solid/semi-solid state battery development if you want another)

PS

Even the most dumbed-down LLM query offers the same skepticism. So I don’t think I’m being absurd in asking the question.

This they released in 2024.

Now let's see how they met those goals in 2025.

Pretty much right on the roadmap.

It seems pretty crazy to suggest that somewhere between 96 logical qubits (which we've proven) and ~2000 (which we know cracks ECDSA) there is some limit of the universe. If there is such a limit, then there needs to be an equation that states where that limit is and why.