Of course, but it’s also possible that this form of computing doesn’t scale exponentially either, fundamentally. It seems much of the “apparent” scaling has really just been traditional computing used to imitate quantum, making it look like qubits are scaling exponentially the way traditional computing does.
It’s very possible that quantum ends up like the dozens of other styles of computing that have been tried, where the only one we’ve found that scales exponentially is digital computing (von Neumann).
I think because we desperately want to apply the lens of digital computing onto quantum, since it’s the one that has become ubiquitous, we forget that there were dozens of other types of computing that were tried and all hit impassable walls. They could never deliver general-purpose compute, and the best they ever achieved were extremely limited uses that digital computing quickly outpaced, simply because of its capacity to scale exponentially.
And when all our major QC progress seems to be us attempting to attach it to traditional computing via “virtual qubits,” yet they still just can’t factor anything with more than 2 or 3 bits’ worth of genuine entropy, that sounds like a “we must show an apparent order-of-magnitude scaling to get our next round of funding, so make it happen” sort of situation to me.
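To put “a few bits of genuine entropy” in perspective: the honest Shor-style factoring demonstrations are typically cited for numbers like 15 and 21, which classical brute force chews through instantly. Here’s a minimal sketch of the arithmetic (the choice of 15 and 21 and the “bits of search space” framing are my own illustration, not a survey of the actual records):

```python
import math

def trial_division(n: int) -> int:
    """Return the smallest factor of n by naive brute force."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return n  # n is prime

# Numbers commonly cited for genuine Shor's-algorithm demonstrations.
for n in (15, 21):
    p = trial_division(n)
    q = n // p
    candidates = math.isqrt(n) - 1  # divisors you'd ever need to try
    print(f"{n} = {p} x {q}  "
          f"(~{candidates} candidates, ~{math.log2(candidates):.1f} bits of search space)")
```

A 2048-bit RSA modulus, by contrast, hides roughly a thousand bits of entropy in its factors. That gap is what the scaling headlines quietly skate over.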
—————
In other words, there’s nothing wrong with preparing; the asymmetric cost of not having “insurance” on this issue is too great not to at least explore all options. But it absolutely is not an inevitability, and the world is FULL of bullshit that needs to be looked at with an insanely skeptical eye. 10x that skepticism when the proposed solution demands that we **preemptively** freeze innocent people’s bitcoin to “save everyone” from it.