i don't see why in principle advancements in quantum error correction won't eventually allow us to scale to large numbers of logical qubits.

dismissive bitcoiners are relying on huge doses of "i hope it remains a hard problem".

nostr:note1pfg8nfazrv3h8spfm87yrhg4j9mc9gr6n67ldz6508m7u5p73jase9zg45

Discussion

I was about to say, this problem is real

I would say it will be a problem 😅 But I don't understand the whole discussion, after all, there is quantum-secure cryptography. We could use it anytime

If time is quantized and discrete, the entire formalism falls apart, Will. Superposition and decoherence mean completely different things when time is quantized.

What is a block?

k

So is Bitcoin not evidence of quantized time to you? If not, what exactly is a block?

If continuous time were fundamental, why do non-contradiction (no double spend), state validity, and finality require discrete blocks? Why do Bitcoin's fundamentals collapse unless time advances in indivisible steps?

Either discrete time is essential to preserving consistency, or continuous time is doing no real work here. Which is it?

That is literally the axiom you are relying on here.

So if time's quantized, how would it show up in quantum computers? Would it fail after a certain threshold? Alternatively, why does it work at all?

The entire mathematical formalism collapses because it depends on taking derivatives (on breaking time into infinitesimal pieces). You simply cannot do that if time is quantized and discrete. Once that assumption fails, everything built atop the assumption fails as well.

This fundamentally changes the meaning of superposition and decoherence. In a quantized-time model, superposition is no longer a physically coexisting computational substrate; it is a potential state that exists between discrete ticks of time. Decoherence is no longer a gradual dynamical process; it is the unavoidable result of a time tick itself: a measurement. Under those conditions, the current mathematical framework is not incomplete; it is invalid if time is discrete.

That would imply physics is misidentifying what it is observing. What is being called a “logical qubit” is not a stable substrate existing across time, but a description of unresolved potential under an assumed continuous-time model. The substrate itself is not physically real in the sense required for computation; it is inferred.

If you genuinely believe computation can be performed on unresolved states across time, then go do it on Bitcoin.

Bitcoin already gives you everything a “logical qubit” is supposed to provide: a globally defined state space, unresolved states that persist across discrete ticks of time, strict non-contradiction rules, and a clear measurement event that collapses possibilities into a single outcome. The mempool is full of fully specified, mutually exclusive state transitions that remain unresolved across blocks. They are public, conserved, and enforced by consensus.

So if superposition is a real computational substrate, stop theorizing about it in sealed labs and error models. Compute on Bitcoin. Use UTXOs. Use the mempool. Demonstrate interference, phase manipulation, or speedup in a system where the states are observable and the rules are fixed.

Only then does the problem become obvious: nothing “computes” in the way quantum computing claims, because unresolved states across discrete time do not form a substrate, they form potential. Resolution only occurs at measurement, and measurement is irreversible. That’s not a limitation of Bitcoin; it’s a property of time itself. If you don’t understand Bitcoin physically, you don’t understand time at all.

Bitcoin doesn’t just meet the criteria for a “logical qubit”, it exposes why the concept collapses once you demand an open, verifiable system.

Clearest version yet, Jack. 🪬

Keep going 🧡

Thanks Chris! Your support is appreciated.

OK, learning as I go, so correct any mistakes, but I can't find anything that makes quantum computers inherently impossible just because of quantized time. If that were the case, why do QCs exist at all?

Are the real issues decoherence and correlated errors, rather than local errors?

So it's not a physics issue but an engineering one that prevents scaling?

correct

If time is quantized and discrete, this is simply untrue and you know it.

You will not admit it: you've trusted an axiom that Bitcoin openly disproves. Go and verify it for yourself.

you are a crackpot

Answer the question Will.

If time is quantized and discrete, what happens to the mathematical formalism of quantum mechanics and quantum theory?

your premise is incorrect, time is not discrete

Where is your proof?

Gödelian limits already explain why this axiom cannot be settled by measuring Planck time from within the system. Physics cannot falsify its own temporal assumptions internally. Bitcoin sidesteps that limitation by building time as an object. If continuous time were truly fundamental, the system would not work.

Bitcoin gives us something physics cannot: a physically instantiated state machine where time is constructed, not assumed. You cannot subdivide a block temporally without destroying causality, finality, determinism, and non-contradiction. Bitcoin simply does not function under continuous time. If discrete time were not fundamental, Bitcoin would be impossible, yet it runs, globally, verifiably, every day, every ~10 minutes a new block of time is constructed.

Bitcoin is my proof you are wrong. What is a block?

Does it matter if time is quantized? I thought the physics works either way.

Blocks aren't ticks of a universal clock; they are discrete units of informational space (geometry). The mining process follows a Poisson distribution, meaning the "tick" is a random variable. The next block could arrive in 1 second or 1 hour. This variance proves the system is a byproduct of work in continuous space, not a fundamental temporal unit.
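The memoryless arrival described above is easy to check numerically. A minimal sketch (the mean of ~600 s is the well-known Bitcoin target; exact probabilities will vary slightly run to run, so a seed is fixed here):

```python
import random

random.seed(42)

# Block discovery is a Poisson process, so inter-block gaps are
# exponentially distributed with mean ~600 s (10 minutes).
MEAN = 600.0
gaps = [random.expovariate(1 / MEAN) for _ in range(100_000)]

avg = sum(gaps) / len(gaps)
under_60 = sum(g < 60 for g in gaps) / len(gaps)      # gap under a minute
over_3600 = sum(g > 3600 for g in gaps) / len(gaps)   # gap over an hour

print(f"mean gap ≈ {avg:.0f} s")
print(f"P(gap < 60 s) ≈ {under_60:.3f}")    # analytically 1 - e^-0.1 ≈ 0.095
print(f"P(gap > 1 h) ≈ {over_3600:.4f}")    # analytically e^-6 ≈ 0.0025
```

Roughly 1 in 10 blocks lands inside a minute and about 1 in 400 takes over an hour, which is the variance the post points to: the "tick" is a random variable, not a clock.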

The purpose of applying a scientific method is to prove yourself wrong, and in failure, reinforce your hypothesis. I think you know this and just don't care, which is why I can't continue this discussion absent good faith.

To answer the question of "which is it," you suggested yourself that mining is worthless, as it represents proof of "no real work" being done. #Bitcoin

Hopium goes both directions, though. What's missing is an honest assessment and acknowledgement of the engineering practicalities involved in scaling the current state of the art. These will likely get solved, "eventually", as you say, but 1 yr. vs. 10 yrs. vs. 100 yrs. makes a qualitative difference in what current course of action is motivated.

indeed

Just for lay of the land, we are at 96 logical qubits now, with error correction working, and we need around 2,000 logical qubits to crack a wallet key.
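Taking the two figures quoted above at face value (96 logical qubits today, ~2,000 assumed needed; published resource estimates for attacking 256-bit ECC vary widely), the gap can be framed as a back-of-envelope doubling count:

```python
import math

# Hypothetical figures from the discussion above, not authoritative estimates.
current_logical = 96
needed_logical = 2_000

doublings = math.log2(needed_logical / current_logical)
print(f"~{doublings:.1f} doublings of logical-qubit count required")
```

So roughly four to five doublings of logical-qubit capacity separate the current state of the art from the quoted attack threshold, whatever the doubling time turns out to be.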

The main issue, though, is that unexpected breakthroughs are by nature *unexpected*. That includes breakthroughs in hardware, as well as breakthroughs in pure math, which people often forget about. (The most dangerous breakthroughs for bitcoin are actually mathematical; the space of quantum algorithms and classical assistance for quantum algorithms is woefully under-explored.)

So as long as unexpected breakthroughs are a thing, there is no honest assessment of the number of years that would carry weight. All we can say is that the cracking of bitcoin wallet keys in the near term would require a major breakthrough but is certainly within the realm of possibility.

The problem is, does *anyone* have any clue, at all.

I keep pattern matching to nuclear fusion in the 80s, which would mean the current panic is ridiculous. But that assessment itself could be ridiculous; I don't know.

I am suspicious generally of advances in fundamental physics, that field went from breakneck speed in say the 1920s to an enormously expensive waste of time by around 2000. Arguable, of course, but still.

All good points.

As someone else wrote, the biggest risk is a mathematical breakthrough. Since these are inherently unpredictable there is no way to hedge against something that might happen tomorrow or might happen never, other than to just give up.

On the other hand, it is the engineering difficulties in the practical implementations of these designs, much more so than quantum error correction scaling, that provide more fundamental limits on system size.

Issues such as thermal cooling, control circuitry and cabling, manufacturing yield, environmental shielding, and supply chains for components all present massive challenges to building a system of the hundreds of thousands of physical qubits needed to produce error-corrected logical qubits at the scale necessary to put ECC at risk.

Surmounting these requires solving not just physics or math issues, but coordinating and tooling an industrial, financial, and political capacity I am deeply skeptical is actually achievable.

I maintain that the primary motivation to look at post-quantum cryptography now is to mitigate the "harvest now, decrypt later" situation, which is perhaps more important in communication systems like Nostr than in Bitcoin.

I'd say not worrying about the uncertainty is enough motivation to adopt post quantum cryptography. The math checking out is just icing on the cake.

Either way, it's still cake.

It's definitely worth researching but the cost of implementation might be really high.

It’s not like Bitcoin is static software frozen in time. I'd say it will adapt faster than quantum computing can realistically endanger it.

The ossifiers will actively seek to prevent that.

Even when the risk is existential? They usually resist unnecessary change, but this would be survival-level change, right? Once there's evidence, the discussions can start to take place, idk.

We need a Quantum Canary that gives us 4 years to address the problem after it dies.

Yes! Similar to every other topic they don’t know shit about while spouting confident bullshit for the tribal hugs😂

Quantum computing is a psyop.

Is there already a BIP for changing from sha256 to something quantum resistant?

sha256 isn't the concern

I will say that assuming things scale in a linear fashion as systems become more complex is flawed. There’s no guarantee that the decoherence and/or correlated-noise issues will ever be overcome. Maybe they will. So is it true in principle? Perhaps, but that’s not reality right now. They can’t even factor 3-digit numbers right now, and that’s after 20+ years of promises and research.
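For context on the factoring point: the part of Shor's algorithm a quantum computer is supposed to accelerate is order-finding; the rest is classical. A toy sketch for N = 15 (with the order found by brute force, which is exactly the step that doesn't scale classically):

```python
from math import gcd

# Classical skeleton of Shor's algorithm for the textbook case N = 15.
N, a = 15, 7

# Order-finding: smallest r with a^r ≡ 1 (mod N). This brute-force loop
# is the step a quantum computer would replace with phase estimation.
r = 1
while pow(a, r, N) != 1:
    r += 1

# Classical post-processing: r is even, so a^(r/2) ± 1 share factors with N.
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(f"order r = {r}; {N} = {p} * {q}")  # order r = 4; 15 = 3 * 5
```

Everything hard lives in the order-finding loop, which is why demonstrations to date have been limited to tiny numbers like 15 and 21.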