Discussion
I read the paper. It only further reinforced my skepticism.
I don't want to take the time to write a detailed critique, but I think the first peer reviewer's opinions are on point:
The QC "calculations" are totally made up to justify the QC actually doing something beyond random noise. There is zero connection to any practical computation.
I can't believe how many people fall for this shit. It reads like a parody of Eric Weinstein's Theory of Everything. However, the commitment to the bit is impressive.
Your position is somewhat unclear.
Are you disputing the raw data or the interpretation of the raw data?
And, more broadly, are you saying quantum computing itself does not exist? Or that quantum advantage does not exist?
Or that both do exist, but there are fundamental limits to, say, how many logical qubits we can get to, or how many gates?
Very few people I've talked to here actually have a clear position on this, so I think it's fair to ask.
> Are you disputing the raw data or the interpretation of the raw data?
As reviewer #1 stated, it's incomprehensible techno-jargon babble (paraphrasing). It's impossible to provide a rational and coherent critique to nonsense. The only possible critique is calling bullshit.
> are you saying quantum computing itself does not exist? Or that quantum advantage does not exist?
They're doing some interesting physics experiments with no connection to any useful computation. It "exists", but it's nonsense. No QC "advancement" has plausibly shown quantum advantage or a path to utility (see the other reviewers, who agree).
> there are fundamental limits to, say, how many logical qubits we can get to, or how many gates?
Even if you buy into the bullshit completely, you still theoretically need many orders of magnitude more qubits to do anything useful. Every added qubit adds more thermal load, noise, errors, and complexity. The capital investment required for a single useful QC would probably be higher than the current AI spend (assuming it's even possible at all, which is highly doubtful, IMO).
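For a rough sense of why every added qubit and gate compounds the problem: if each physical gate fails independently with some probability p, an uncorrected circuit of G gates succeeds with probability roughly (1 − p)^G. The numbers below are illustrative assumptions, not measured figures, but they show why the error-correction overhead (many physical qubits per logical qubit) is treated as unavoidable:

```python
# Back-of-the-envelope: with per-gate error rate p, an uncorrected circuit
# of G gates succeeds with probability about (1 - p)**G.
# p and the gate counts below are illustrative assumptions, not measurements.
p = 1e-3  # assumed per-gate physical error rate

for gates in (10**3, 10**6, 10**9):
    success = (1 - p) ** gates
    print(f"{gates:>13,} gates -> success probability ~ {success:.3g}")
```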
> Very few people I've talked to here actually have a clear position on this, so I think it's fair to ask.
I did a deep dive on QC years ago when they were trying to run Shor's algorithm. They've apparently given up on that since they failed to compute the prime factors of numbers as small as 15 and 35 (which any grade school child can easily do in their head).
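For context on what "running Shor's algorithm on 15" would actually have demonstrated: the only quantum step is finding the multiplicative order r of a random base a modulo N; everything else is classical number theory. Here's a minimal classical sketch of that reduction (the bases 7 and 2 are just convenient textbook choices, not from any particular paper):

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n, found by brute force
    (this is the step a quantum computer is supposed to speed up)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical post-processing of Shor's algorithm:
    given the order r of a mod n, recover nontrivial factors of n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # the random base already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None               # odd order: retry with another base
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None               # trivial square root: retry with another base
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_classical(15, 7))      # -> (3, 5): the order of 7 mod 15 is 4
print(shor_classical(35, 2))      # -> (7, 5): the order of 2 mod 35 is 12
```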
There were also many ex-researchers and ex-QC engineers who effectively spilled the beans that it's total horseshit, some more politely than others.
Here's just one example from a Cambridge PhD:
https://www.youtube.com/watch?v=pDj1QhPOVBo
Now, as far as I can tell, they've moved on to things like OTOC (out-of-time-order correlators) and now OTOC^2, which they can't even explain without sounding like Eric Weinstein. It appears to be just looking at the behaviour of the qubits themselves rather than any actual I/O.
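For what it's worth, an OTOC is a well-defined quantity, just not an input/output computation: it measures how two operators acting on the device stop commuting as the system evolves. A toy 2-qubit illustration (the Hamiltonian and operators below are arbitrary choices for demonstration, not anything from the paper):

```python
import numpy as np
from scipy.linalg import expm

# Toy OTOC:  F(t) = Tr[ W(t) V W(t) V ] / d   with   W(t) = e^{iHt} W e^{-iHt}.
# It tracks how W(t) and V fail to commute under the device's own dynamics --
# a property of the qubits themselves, not a computation on external inputs.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2, dtype=complex)

H = np.kron(X, X) + 0.5 * (np.kron(Z, I) + np.kron(I, Z))  # arbitrary toy Hamiltonian
W = np.kron(Z, I)   # operator on qubit 1
V = np.kron(I, Z)   # operator on qubit 2

for t in (0.0, 0.5, 1.0, 2.0):
    U = expm(-1j * H * t)
    Wt = U.conj().T @ W @ U                     # Heisenberg-evolved W(t)
    F = np.trace(Wt @ V @ Wt @ V).real / 4      # infinite-temperature OTOC
    # F(0) = +1; deviations from 1 at later t reflect W(t) spreading onto qubit 2
    print(f"t = {t:.1f}  OTOC F(t) = {F:+.3f}")
```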
I've previously concluded that QC was BS, and therefore I'm not motivated to try to unravel this new phase of BS that is much further removed from legitimate mathematics and algorithms than the previous work.
It appears to me that they're now working on "Weinstein" algorithms because it makes it much more difficult for people to call BS. People hear the super complicated techno-jargon and just assume it's over their head.
Peter Thiel funded Eric Weinstein's work, so even very smart people can be fooled by this type of scam.
I personally think it's a massive waste of time to "quantum-proof" any software that relies on time-proven and well-reviewed encryption or trapdoor functions before a single QC project can plausibly show that it can run Shor's algorithm successfully on a number as small as 35.
So, my ultimate advice is: when they can prove that they can factor small numbers with Shor's algorithm, then it's somewhat reasonable to have a concern about a future capability to factor much larger integers. Until that happens, it's a waste of time and resources.
To go even further, the underlying GR/SR/QM theories are lacking and unproven, IMO, but that's a different, very complex conversation.