Replying to JOE2o

What forces decoherence and classical behaviour is not an entangled system getting "large" in the sense that once you have x+1 qubits, no matter how you try to cool or arrange them, it will never work because you've hit some kind of quantum physics speed limit.

Macroscopic quantum phenomena have been observed in systems like superconductivity and superfluidity, which shows that collective quantum behaviour can be maintained at a large scale. Basically what we're talking about is a sort of thermodynamic robustness requirement that QEC is designed to meet, not a failure of quantum mechanics at scale. (Comes down to engineering again, and not any sort of natural limit like the mass density something can accrue before it collapses into a black hole.)

The solution is to break things into multiple smaller, independent quantum processing units. Each has a manageable number of qubits, say 50-100 (we've already done 50 logical qubits in all likelihood). These are easier to isolate and cool (and, bonus, subject to localized error correction).

Modules communicate with each other not by some kind of ET-phone-home deal, but by generating entanglement between a communication qubit in module A and one in module B. (And some other ways too.)

There's a lot more to it, and there are other avenues being explored that do away with qubits, in the sense we know them now, altogether. Basically right now there are 50 or 60 different areas in any one of which an unexpected (though not at all implausible) breakthrough would change everything. That's kind of where we are. Like AI pre-AlphaGo.

Modularity doesn’t matter.

Shor needs one giant, non-equilibrium, actively corrected, constantly measured quantum state, not a bunch of small ones stitched together. It's not MapReduce.

Superconductivity and superfluidity are passive, symmetric, equilibrium ground states.

They are not that kind of macroscopic quantum system.

Engineering doesn’t get to invent new physics.

Modular designs can absolutely get you a machine that can use Shor's to break a Bitcoin key. This is chip design 101, combined with quantum mechanics 101.

And this is only one of 60 or 70 separate angles in which an unexpected (but totally plausible) breakthrough can change the game. Do you want to go through all 70? We may not even need qubits at all...

You have to be honest with yourself. You have no idea what will or won't happen. You have no crystal ball. Nobody does.

Google, Cloudflare, Signal, and many others have all moved to post-quantum cryptography. Yet somehow the Bitcoin community's only prevention work at the moment is to pretend it can never happen. Classic head in the sand.

The smart thing is to take it very seriously and move as fast as possible to mitigate it.

Modular designs cannot run Shor on a Bitcoin key.

Shor is not MapReduce.

It requires one single, global, coherent, non-equilibrium quantum state across the entire register.

You have 70 engineering ideas to break the laws of physics? Good luck.

I don't need a crystal ball to know that the laws of the universe win every time.

And corporations grifting on the current thing? Never!

Physics doesn't care about compliance, projections, roadmap promises, laundered metrics, or 70 ideas for a breakthrough in the investor kit. It doesn't care who we give a Nobel prize to, what China does, or how many billions get poured in.

Physics is not subject to fiat.

There is a hard physical ceiling.

If you want to suspend your disbelief and catch the fever, have fun.

But leave Bitcoin and ECC alone.

Honestly.

You're just plain old scientifically wrong, I don't know what to tell you. You have an internet-informed grasp of quantum physics; that's better than most people, so credit where due. But you clearly don't understand things at a deep level here, certainly not at the level of those working on these systems each day, with hands dirty. Which is why the bulk of the scientific community (at least the relevant one) is saying something quite different to what you are saying. And which is why most Bitcoin people who are actively engaged here agree that it, like fusion, is a case of sooner or later.

Look, man.

If I’m wrong, explain — scientifically — how.

Two claims:

1. Shor needs one single, global, non-equilibrium, actively-corrected, constantly-measured quantum state at macroscopic scale.

That object has never been observed — not in nature, not in any lab, not once in 40 years.

2. Forty years of better isolation have already mapped the ceiling with brutal precision.

We’re hugging it, not raising it.

You keep answering with social proof and investor deck talking points. That's not science. That's fiat thinking.

People need sound money and strong walls. Our freedom, privacy and security are under very real digital threat.

I'm not throwing down our best weapons on a rumor from the enemy.

Show me proof that a macroscopic, non-equilibrium, actively controlled, constantly measured, error-corrected universal quantum state is physically possible and I'll eat my shoes on camera tomorrow.

Until then: don’t trust, verify.

Your move.

Let me clarify first: your view is that a quantum computer that can break a Bitcoin key is, no matter how engineered, a physical impossibility, because the laws of physics say it is? As in, this is not an engineering problem but a fundamental limit of physics, akin to faster-than-light travel in the conventional sense? That, with the knowledge we have today, we can conclude that, for now and for all time in the future, such a thing is totally impossible.

This is your view yes?

No more shifting goalposts.

You have a response or you don't.

If that is your position there is nothing to respond to. You would be coming at this with a comic-book level of certainty about your own personal understanding of what the physical limits of the universe are.

If your position is that you agree such a machine may be possible but is, let's say, too difficult to construct in our lifetimes, then that's another thing. Then there's a debate. Then it makes sense to look at how it can or cannot be built.

You've outlined your position as the former, but if you want to clarify that it's in fact the latter then please do.

There is a universe full of evidence that scaling this kind of system beyond a certain very low threshold makes it go classical. There is zero evidence that scaling it beyond that threshold without making it classical is possible. 40 years of QC research only confirms this. More and more heroic isolation only grinds closer to the ceiling and makes our knowledge of it more precise and certain. Failed attempts to falsify knowledge are supposed to make us more and more certain of it.

Believing in things with zero evidence to even suggest that they might exist is irrational.

All I have asked you for is one piece of scientific evidence that breaking this apparent law of nature, the ceiling, is even possible.

You have none, so you pivot. To speculative engineering ideas. To social proofs. To reframing my position as somehow "comic book" unreasonable.

You are advocating for massive, detrimental and dangerous changes to Bitcoin, based on the unsubstantiated dream of research scientists and investors who are profiting wildly from the hype and have zero results.

If you have evidence, put it on the table.

I'll wait.

Which route do you want to go? There are around 60, as I said.

Modular is perfectly legit. Or you like Majorana?

Do you agree 10 physical Majorana qubits per logical qubit is reasonable? 50? Let's say 30.

Do you agree 1,700 logical qubits is enough to run Shor's against a Bitcoin key, with some further optimisations (there have already been many optimisations)?

Do you agree that makes roughly 50,000 physical Majorana qubits? (We'll talk about arrangement later; rough arithmetic sketched below.)
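
Just to put that arithmetic in one place, here's a rough sketch of the estimate. Both inputs are the assumptions we're haggling over, not settled figures:

```python
# Back-of-envelope resource estimate for the Majorana route discussed above.
# Both inputs are assumptions under debate, not published requirements.
PHYSICAL_PER_LOGICAL = 30      # assumed physical Majorana qubits per logical qubit
LOGICAL_FOR_SHOR = 1_700       # assumed logical qubits for an optimised Shor run

total_physical = PHYSICAL_PER_LOGICAL * LOGICAL_FOR_SHOR
print(total_physical)          # 51,000 -- roughly the "50,000" figure above
```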

If you want to dispute any of those points then go ahead.

Or if you want to suggest that what Microsoft is up to is a pure scam, go for it too.

If not then make clear you agree with all that and let's move on to the next part.

Those are engineering ideas about isolation, Joe. You're pivoting.

Just one piece of evidence that the ceiling can be moved.

That's the only starting point.

That is how we define the number.

I'm sure you agree the machines we have now work. I'm sure you agree we've got 24 logical qubits in the bag, proven.

So somewhere between 24 and X is where you believe the limit is.

If you believe it to be over 1,700 you have no Shor's argument anymore. If you believe it to be under 24 you also have no argument anymore. So where is it? 1,000? 500? 1,500?

Pivot pivot pivot.

I'm not going to play The Price Is Right with qubits.

Somewhere in that gap between 24 qubits held for a few microseconds and ~2,000 held for hours to do Shor on one Bitcoin key, the ceiling is lurking.

But how do we know the ceiling is low? I think that is where you are going.

Because we have reproduced in these labs, with cold and quiet and indirection through codes, etc., the coldest, quietest corners of the universe.

But you can't get quieter than silence. You can't get colder than absolute zero. And that's where we find the ceiling.

Because the physics doesn't change. All we can do is remove noise. But you can't isolate the system from itself.

We are asymptotically approaching total isolation. Which means we are asymptotically approaching the ceiling.

And it is nowhere near 2k qubits for hours.

You can't show me evidence of anything that approaches that scale because it doesn't exist.

We will get a really precise picture of where the ceiling is, and that is a cool scientific result. But we will not get a QC that can run shor on a Bitcoin key.

We have enough data to say with confidence that that is physically impossible. More every day.

I am not surprised by all the prize winners and bag holders and dreamers who don't want to admit this openly. People are weak to temptation. But this is not Bitcoin's concern.

Tick tock, next block.

>Somewhere in that gap between 24 qubits held for a few microseconds and ~2,000 held for hours to do Shor on one Bitcoin key, the ceiling is lurking.

Ok, so now we're getting somewhere. Right then: Atom held around 28 logical qubits and ran the Bernstein-Vazirani algorithm with demonstrable error correction, in under a second. That was two years ago, and a lot has happened since.

And for Shor's we need ~2,000 and hours. I'd argue that for Shor's it's more likely around 1,700; we've already optimised it down by half, and there's more optimisation in the tank. Pure math, by the way: by offloading the most complicated steps (modular arithmetic and fraction conversion) to highly efficient classical methods, the quantum part gets streamlined.
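
To be concrete about what "offloading to classical methods" means, here's a minimal sketch of the classical fraction-conversion step, shown for the factoring version of Shor just to keep the example small. It assumes the quantum phase-estimation part has already handed back a measurement y out of Q = 2^n; everything below is ordinary classical post-processing, not anyone's production code:

```python
# Classical post-processing for Shor's algorithm: recover the period r from a
# phase-estimation measurement, then turn it into factors. Pure classical math.
from fractions import Fraction
from math import gcd

def recover_period(y, n_bits, a, N):
    """Continued-fraction step: y / 2**n_bits ≈ s/r, so read r off the denominator."""
    Q = 2 ** n_bits
    if y == 0:
        return None
    r = Fraction(y, Q).limit_denominator(N - 1).denominator
    for candidate in (r, 2 * r, 3 * r):      # tolerate recovering a divisor of r
        if candidate < N and pow(a, candidate, N) == 1:
            return candidate
    return None

def factors_from_period(N, a, r):
    """If r is even and a**(r/2) != -1 mod N, a gcd gives a nontrivial factor."""
    if r % 2 or pow(a, r // 2, N) == N - 1:
        return None
    f = gcd(pow(a, r // 2, N) - 1, N)
    return (f, N // f) if 1 < f < N else None

# Toy check with N = 15, a = 7 (period 4): a measurement of y = 512 out of Q = 2048
# recovers r = 4, and factors_from_period(15, 7, 4) returns (3, 5).
```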

You're saying somewhere between 28 qubits for one second and (if the optimisation holds) 1,700 qubits for hours is some arbitrary limit of the universe.

Atom will ship commercial machines at 48 logical qubits next year. That means them plus Microsoft, with machines at 48 logical qubits capable of running deep, sustained computations for minutes, hours, or potentially days, in 2026. (This is all proven out in their lab; proving me right is just a matter of waiting six months for them to ship. It is right, though.)

So 48, and running for hours. Where does it stop? At 96 and days? If not at 96 and days then at 192 and weeks?

If you can't say where the limit is, or even give a range, based on some physical properties, then you are basing the limit on nothing.

You have to give a number to show you've got some physical basis for your impossibility claim.

You should be able to understand this from first principles and you are still squirming, but here is the equation:

Lindblad master equation

(The equation for self-decoherence.)

Γ_self ≈ γ N²

Even with perfect isolation (γ → 10⁻⁸ s⁻¹),

N² × 10⁻⁸ × 3600 ≪ 1 to stay coherent 1 hour

→ N ≲ 170 qubits max
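
A minimal sketch of that arithmetic, taking the quadratic scaling and the assumed γ at face value:

```python
# Back-of-envelope version of the bound above: require gamma * N**2 * t < 1.
gamma = 1e-8       # assumed "perfect isolation" collective decay rate, in 1/s
t_hold = 3600.0    # target: hold coherence for one hour
n_max = (1.0 / (gamma * t_hold)) ** 0.5
print(round(n_max))  # ~167, i.e. the "N ≲ 170" ceiling claimed above
```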

Lindblad? That's your card? Maybe for a NISQ machine without error correction, from back in the good old days. What Atom is doing (again, one of 60 or so paths being explored) keeps things far away from the Rydberg state. Like, the entire purpose of QEC is to overcome the physical limits of T2 (coherence time). When you constantly measure and correct in a small group of physical qubits, the lifetime of the logical qubit gets exponentially extended, making the Lindblad N² decay irrelevant for the computational unit.
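
For what "exponentially extended" means here: the standard below-threshold picture is that each step up in code distance multiplies the logical error rate down again. A toy sketch with purely illustrative numbers (p, p_th and A are assumptions, not anyone's published figures):

```python
# Textbook below-threshold QEC scaling: p_L ≈ A * (p / p_th) ** ((d + 1) / 2).
# All numbers here are illustrative assumptions, not vendor data.
def logical_error_rate(p, d, p_th=1e-2, A=0.1):
    return A * (p / p_th) ** ((d + 1) / 2)

for d in (3, 5, 7, 9):
    # With p ten times below the assumed threshold, each +2 in distance buys
    # roughly another 10x suppression of the logical error rate.
    print(d, logical_error_rate(p=1e-3, d=d))
```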

I get the feeling this path you're on is all stuff coming out of the NISQ archives or something.

I mean, the fact that Atom is *already* using 1,200+ physical qubits to build 28 logical qubits with a better-than-physical error rate is empirical proof that your Lindblad limit is obsolete in a computation context.

You just made a very substantial false statement.

Lindblad is the gold-standard description of open quantum systems.

It has never been falsified.

Every attempt to scale past a few hundred physical qubits confirms it: N² correlated decoherence wins.

QEC doesn’t repeal Lindblad.

It adds more measurements and makes the N² term worse.

Show me one paper where Lindblad is falsified or QEC removes the N² scaling.

Either show me a paper that disproves Lindblad or admit you are wrong.

No more dodging.

Hello

tl;dr

JOE2O is correct

You are wrong

Show me the paper.

I have concluded you do not know what a logical qubit is.

Logical qubits are an engineering workaround to buy time by avoiding direct measurement.

Price tag: more physical qubits + constant syndrome measurements = **more self-noise**.

By physics, they **lower** the ceiling, not raise it.

Lindblad is fundamental.

Shor on a Bitcoin key needs ~2,000 qubits in one global wavefunction for hours.

Now strip away **every** engineering problem.

Give me the platonic ideal QC:

- absolute zero

- perfect vacuum

- zero cosmic rays

- silent measurement

- no logical qubits needed

Even then, **self-decoherence alone** (Γ ≈ γN²) caps you at ≲170 physical qubits if coherence has to last an hour.

**SELF-decoherence.**

You can’t isolate the system from itself.

That’s not an engineering limit.

That’s the universe saying “no.”

Get it now?

I'm not trying to roast you. The engineering is dazzling. But Lindblad's formula is the relationship between quantum stuff and classical stuff. All the skyscrapers of data we have track to that simple formula.

And this is why I care and why I learned about this: we can't be mangling Bitcoin or scaring people away from ECC freedom tech over something that can't happen. They are way too important.

You are completely misunderstanding physical qubits<>logical qubits.

If you understood what a logical qubit was you would understand that QEC doesn’t “repeal” Lindblad. Also repeal is a legal term, are you a lawyer? That would explain a lot.

Lindblad itself is fine as far as the math and physics go. It's about limits imposed by environmental decoherence. Great. Key point though: if you have a way to pump coherence back into the logical qubit faster than the physical environment can drain it out, then this limit, which is again fine in itself, simply does not come into play.

What Atom and many others have *already* done is empirical proof that the pump works, so to speak. You cannot say "oh, such a pump can never be built", because they exist today and are proven to work. And you cannot say that the resulting logical (yes, logical) qubits can't perform the right kind of computation, because that's also proven. You've been proven out of an argument.

The fact that you’re misunderstanding this as “falsifying” (or, er, “repealing”) the Lindblad limit, as opposed to simply removing the need to worry about hitting it, makes it pretty clear that you don’t understand what a logical qubit actually is.

It’s like you’re saying there is a physical limit to how fast a human being can work an abacus. This is provably true, you can keep that one in your bag. But then you go on to claim that this “abacus limit” in turn limits the complexity of the mathematics that our species can do. Except, hello calculators and computers.

Dunno. It’s like debating OP_RETURN with someone only to find out at some point that they don’t actually understand what a UTXO is. How far can the debate really go?

You have not engaged in any debate whatsoever. I told you something you didn't want to hear. You either have QC stocks in your bag or you are just high on the scifi. Dodges, childish insults and quantum marketing babble do not constitute "debate".

You can't "pump" decoherence. That sounds like it came out of a badly written Startrek. That's just some cringe QC marketing metaphor and not at all accurate.

Logical qubits trade size for time against local noise, and only against local noise. Against self-decoherence (N²), they make it quadratically worse.

It's not magic and it doesn't "pump" anything but quantum shill bags.

Lindblad is the equation for ALL decoherence, including the N² self-interaction term that dominates at scale. Qubits decohere each other. You can't remove that, because that's how they participate in the same wave function. That's why there is a ceiling at ≤170.

That's the whole enchilada. I'm tired of repeating myself.

Are you going to say I work in a food truck now?

I've explained the same thing to you 20 ways and you have had zero valid arguments. I've given you first principles and even the formula, and you just dodge and blather. Now you've resorted to petty insults. This has become far too childish and I'm not going to give it more of my time.

Good luck with your quantum bags and scifi fever dreams.

I'd pack my bags at this point too if I were you.

Let me leave this for anyone who comes across this thread, hoping for a neat conclusion.

You have posted a Lindblad argument that assumes the noise scales quadratically. That quadratic scaling is the basis of your N < 170 qubits.

Atom and Microsoft have demonstrated logical qubits operating at an error rate four times better than the physical error rate. Other teams around the world have demonstrated similar. These results are empirical proof that the noise in their systems is correctable, and therefore does not scale quadratically, and therefore the actual noise profile is not following the eventually catastrophic scaling path that defines your math (the one that ends up bing-bong at 170).

It's one or the other: either Atom and Microsoft and Quantinuum and Harvard and everyone else are lying and it’s all a scam, or your argument has already been proved wrong by experiment.

Both these things cannot be true at once, that is impossible.

Passing reader: which one do you think is true?

Ending with one last dodge.

Pure quantum shitcoin.

Live long and prosper. 🖖

Dodge accusation as a dodge, how very novel. You have no answer of course. You know you can't say they're all scamming.

Smoking that hype isn't good for you, man. When you play at shitcoin bubbles, you're rugging or you're getting rugged.

Summary of the topic you cannot confront:

You (quoting):

“QEC doesn’t repeal Lindblad. It adds more measurements and makes the N² term worse.”

Results of multiple real-world experiments:

“Hi! We provide irrefutable empirical proof that QEC makes the N² term considerably better, across the board!”

Classic case of real-world experiments forcing theorists back to the drawing board.

Drawing board’s over there.

You are falling for weasel words, Joe.

That's how they weaponize your lack of expertise to get you to draw the wrong conclusions and keep you on the sauce.

"Better N²" means they reduced noise (very locally) and plugged a smaller number into the equation. That doesn't change the equation.

Again, we arrive at the max of 170 under the maximally generous assumption that they get those factors to zero.

This means nothing more than "we did better isolation".

BTW, I was going to give you a consolation zap and I couldn't. Set up your wallet, broham.

Nostr has too many features. I’m building a soft fork with no DMs, no zaps, no reactions, no media. Just text. You can zap some random Nostr weirdo on my behalf.

And this "better isolation plus a smaller number" attempt to retroactively add an asterisk to your "QEC makes the N² term worse" from earlier? Pretty sure you know that can't fly.

Under your assumptions, the error rate for the logical qubit has to *always be worse* than the physical qubit, no matter how good the isolation is. But look, it's actually better. Also, the lifetime of the logical qubit would have to always be shorter (for Google's Willow it's roughly 3x longer).

The results prove irrefutably that the N² term is *not* the governing factor in these QEC systems at all.

To argue that the ceiling remains at 170 despite multiple results showing the logical error rate is better (yes better) and the lifetimes longer (yes longer) than the physical is to break your own math.

It's correctable, exponential scaling, proven out by experiment. Not the uncorrectable, quadratic scaling your math depends on, and that you just broke with your asterisk anyway.
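
For anyone still reading, here's the shape of the disagreement in one toy comparison, with purely illustrative numbers on both sides (the rates, the suppression factor and the qubit counts are all assumptions, not measurements):

```python
# Toy comparison of the two scalings being argued over (illustrative numbers only).
# Quadratic-ceiling picture: an effective collective decoherence rate ~ gamma * N**2.
# QEC picture: logical error per cycle falls by a factor "lam" for every +2 in
# surface-code distance d, even though the physical qubit count grows with d.
gamma, lam = 1e-8, 2.0     # assumed collective rate (1/s) and per-step suppression

for d in (3, 7, 11, 15):
    n_physical = 2 * d * d                           # rough surface-code qubits per logical qubit
    quadratic_rate = gamma * n_physical ** 2         # grows as the patch gets bigger
    qec_logical_error = 1e-2 / lam ** ((d - 3) / 2)  # shrinks as the patch gets bigger
    print(d, n_physical, quadratic_rate, qec_logical_error)
```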

There are ideas out there; much hinges on discrete points and causal relationships. CST (causal set theory) is one approach; hypergraphs are interesting and take that idea further, preserving locality. There is a bright young researcher named Jonathan Gorard who has a good interview here.

https://www.youtube.com/watch?v=ZUV9Tla43G0

You are deflecting away from the implications... If anything, CST and hypergraph models strengthen the point: they both quantize time. Once time is discrete, the entire ontology of quantum computation breaks. Discrete time means:

- no ∂ψ/∂t → Schrödinger’s equation fails

- no continuous unitary evolution → Hamiltonians can’t generate gates

- no coherence across ticks → superposition becomes impossible

- no substrate for phase evolution → Shor’s algorithm cannot run

CST and hypergraphs don’t rescue quantum computing; they expose the contradiction it depends on. Yet you stand with confidence that quantum computing (in its current form) is inevitable.

If time is quantized, the mathematical and physical foundations of QC disappear. Bitcoin simply demonstrates this discretization in practice, which is why invoking CST or hypergraphs only reinforces my argument.

Bitcoin is the working instantiation of what CST and hypergraph theories are still trying to formalize. The irony is that Satoshi solved the hard part in 2009 and almost no one has realized it.