The quantum is everywhere and nowhere. We give it meaning by our observation of and interaction within it. This is one of the most fascinating subjects of our lives, subjectively LOL


Discussion

yup, it's in the fundamental fabric of the universe.

I tend to disagree, this is like claiming a conscious observer inside of Bitcoin mines the blocks by observing them.

Observation is verification of structure. Bitcoin reveals that the meaning was applied before the creation. We do get to apply our own meaning from the internalized experience of time, but its meaning exists before it’s even measured into reality. We are the emergence of internalized verification.

This makes sense, and your explanation is elegant, but I remain confused. ₿ yond my grasp!

that whole schroedinger thing is a misinterpretation of the meaning of his experiment

the radioactive decay process is a poisson point process. EXACTLY the same type of random event as is used by bitcoin mining to find small block header hashes.
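to make that concrete, here's a tiny simulation (all parameters are made up for illustration, not real mining numbers): decay is modeled directly as a poisson process with exponential inter-event gaps, and mining as repeated independent hash attempts, and the two end up with the same waiting-time statistics.

```python
import random

random.seed(1)

# illustrative sketch: decay modeled as a poisson process (exponential
# gaps), mining as repeated independent hash attempts, each succeeding
# with small probability p. rescaled, both gap distributions have the
# same exponential shape, with mean ~1.

lam = 1.0    # decay events per second (made-up rate)
p = 0.01     # per-hash success probability (made up; real bitcoin is far smaller)

decay_gaps = [random.expovariate(lam) for _ in range(10_000)]

def attempts_until_block(p):
    # count hash attempts until one lands under the target (geometric)
    n = 1
    while random.random() >= p:
        n += 1
    return n

# rescale attempt counts so the mean is 1, to compare with the decays
mine_gaps = [attempts_until_block(p) * p for _ in range(10_000)]

mean_decay = sum(decay_gaps) / len(decay_gaps)
mean_mine = sum(mine_gaps) / len(mine_gaps)
print(mean_decay, mean_mine)  # both close to 1.0
```

the shared property is memorylessness: no matter how long you've waited for a decay or a block, the expected remaining wait is unchanged.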

it is complete bullshit that the quantum state is in superposition until you open the box to verify that it decayed, or not, and the cat is then either alive or dead.

no, that's not what is happening. in actual fact, the universe outside of the observation modeling device (brain and senses) does continue to exist and function and resolve its own outcomes from the inputs continuously regardless of whether you are paying attention.

idk how such a la la la i can't hear you close your eyes and the monster can't see you childish interpretation has come to prevail in quantum physics. it's just not true.

poisson point processes simply cannot be predicted. bitcoin proof of work (and hashcash before it, where it was first invented) combines a target maximum hash value, which is adjusted up and down in order to bring the timestamps into alignment with the number of blocks, so that on average, every 2 weeks (2016 blocks) there are 2 weeks and 2016 blocks. the error rate, the proportional mismatch between the actual timestamps of the adjustment epoch's start and end and the intended duration, adjusts the target (which is compressed as a 32 bit value with an exponent that expands it to 256 bits).
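a minimal sketch of that retarget arithmetic, under the simplifications noted in the comments:

```python
# simplified sketch of the retarget rule. the real implementation clamps
# the measured timespan to [1/4, 4] of the intended duration and works on
# the compact 32-bit nBits encoding of the 256-bit target.

TWO_WEEKS = 14 * 24 * 60 * 60   # intended epoch duration in seconds
EPOCH_BLOCKS = 2016             # blocks per retarget epoch

def retarget(old_target: int, actual_epoch_seconds: int) -> int:
    # clamp, then scale: blocks too fast -> smaller target (harder),
    # blocks too slow -> bigger target (easier)
    actual = max(TWO_WEEKS // 4, min(TWO_WEEKS * 4, actual_epoch_seconds))
    return old_target * actual // TWO_WEEKS

old = 1 << 200
assert retarget(old, TWO_WEEKS) == old            # on schedule: unchanged
assert retarget(old, TWO_WEEKS // 2) == old // 2  # twice as fast: halved (harder)
assert retarget(old, TWO_WEEKS * 10) == old * 4   # clamped at 4x easier
```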

the difference between a simple radioactive decay and a block hash solution is that dialing the size of the target up and down (down is harder, up is easier) lets you filter out part of the solution space, giving a particular probability of success per attempt. the network has an estimated amount of hash power, which is primarily only determined after the fact, from the differential between the target and the actual block epoch time period.
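the per-attempt probability and the after-the-fact hashrate estimate can be written down directly (the target value here is made up for illustration):

```python
# illustrative numbers only: the target below is not a real bitcoin
# target. the per-hash success probability is target / 2^256, and the
# hashrate implied by a given block interval only falls out after the fact.

target = 1 << 220                    # assumed target (illustrative)
p = target / 2**256                  # probability a single hash attempt succeeds
expected_hashes = 1 / p              # mean attempts per block

block_interval = 600                 # intended seconds per block
implied_hashrate = expected_hashes / block_interval

print(f"p = {p:.3e}, expected hashes = {expected_hashes:.3e}, "
      f"implied hashrate = {implied_hashrate:.3e} H/s")
```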

individual blocks can be solved anywhere from 2 or more being solved concurrently within a 1 second period, out to 2-3 hours, and in theory even longer, if you didn't adjust the target up (bigger target value, fewer zeroes) when a particularly long epoch happens, which is when a number of miners turn off or quit mining.

radioactive decay is exactly the same, except you aren't looking for just one of your particles to decay: you have 256 of them at the same time, and you are looking for the frequency with which some number of them decay within a narrow time window. this would be the closest similar model of the same physical phenomenon.

btw, radioactive decay is one of the best entropy sources you can get. i've seen DIY projects that make one out of old americium based smoke detectors, which are basically a geiger counter sat right next to a little blob of this radioactive material, with an air space between where smoke can get in. the more smoke, the more the particles are bounced off and away from the sensor, and they detect fire by the counter's average click rate dropping dramatically, or stopping completely.

the clicks are completely unpredictable, and can't even be externally observed, because the whole assembly, aside from the ventilation that lets the smoky air in, is lead lined. it just emits a pulse of electricity every time a decay happens.
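one common way DIY projects turn those click timings into unbiased bits (this particular extraction scheme is my assumption about how such a device might work, not something from the post) is to compare pairs of successive inter-click gaps; simulated here with exponential gaps standing in for the geiger counter:

```python
import random

random.seed(7)

# simulated geiger clicks: exponential gaps stand in for the real
# detector. extraction: compare pairs of successive gaps, emit 1 if the
# first is longer, 0 if shorter, discard ties. since the gaps are
# i.i.d., each emitted bit is unbiased.

def click_gaps(n, rate=5.0):
    return [random.expovariate(rate) for _ in range(n)]

def extract_bits(gaps):
    bits = []
    for a, b in zip(gaps[0::2], gaps[1::2]):
        if a != b:
            bits.append(1 if a > b else 0)
    return bits

bits = extract_bits(click_gaps(20_000))
ones_fraction = sum(bits) / len(bits)
print(len(bits), round(ones_fraction, 3))  # roughly half ones
```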

with this, you can trigger the reseeding of a cryptographically secure random number generator, and no external observer can front-run your encryption. stream encryption depends on a random value, which is revealed, combined with the secret to generate the cipherstream that is XORed reversibly over the message, making it unreadable without the secret.
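a minimal sketch of that scheme (illustrative only, not a real cipher: deriving the keystream from sha256 in counter mode is my stand-in for a proper stream cipher like chacha20):

```python
import hashlib

# minimal sketch, not a real cipher: a keystream derived from a shared
# secret plus a revealed random value (nonce), XORed over the message.
# XORing again with the same keystream recovers the plaintext exactly.

def keystream(secret: bytes, nonce: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(secret + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_bytes(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

secret = b"shared secret"          # never revealed
nonce = b"freshly reseeded nonce"  # revealed alongside the ciphertext
msg = b"attack at dawn"

ct = xor_bytes(msg, keystream(secret, nonce, len(msg)))
# XOR with the same keystream reverses the operation
assert xor_bytes(ct, keystream(secret, nonce, len(ct))) == msg
```

the nonce can be public; without the secret, the keystream, and hence the message, stays unrecoverable.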

anyhow, i didn't really need to explain all that, exactly, but the interpretation of schroedinger's experiment is bunk.

observing matter at the quantum level requires interacting with it; you can't passively observe it. consequently, you can only discover the momentum (direction and speed), or the position, but not both at once to arbitrary precision. this is because these distinct observation operations have the property that the interaction you do to pin down position disturbs the particle, so you can't then learn where it was going.

hm i just realised that this has a similar structure to the scalar/affine split in elliptic curves. scalars are like positions, and affine points are like directions. probably not strongly connected but i see it: one is a size, the other is a position, which is the math equivalent of position and velocity.

also, it's not strictly true that you can't *record* both position and path of decaying particles as they are emitted:

[image: bubble chamber photograph recording particle tracks]

it's just that you can only use this to get a post-mortem of a number of radioactive decays; you can't catch both things in a momentary way. the position is what the bubble chamber paths record, while the momentum can only be determined after the fact, from the curvature of the path in the chamber's magnetic field (the radius of the curve is proportional to the momentum, and the particle spirals tighter as it loses energy, which is what makes it do those loops and such).

I agree that Schrödinger has been widely misinterpreted, but the deeper issue is that superposition itself has been misinterpreted because it has never been defined relative to the only frequency that can give it physical meaning: Planck Time. Without the smallest possible unit of temporal separation, statements like “a system exists in many states at once” are not physics. The modern interpretation of superposition implicitly assumes a continuous, infinitely divisible time axis that has never been measured, and cannot be measured, which leads directly to the central problem: QT has no operational definition of existence, no definition of measurement, and no definition of time at the scale where these claims are supposed to hold.

Given that absence, asserting “many states exist simultaneously” is nothing more than a fractional-reserve ontology.

It is the same epistemic failure that made fractional-reserve money appear coherent until Bitcoin provided the first system with verifiable finitude. If fractional-reserve accounting cannot produce sound money, it is incoherent to imagine fractional-reserve ontology can produce sound physics.

Bitcoin exposes the flaw cleanly, because Bitcoin is the first system in human history that shows us explicitly, measurably, and mechanically what a sound physical system looks like when it collapses entropy into conserved structure over discrete quanta of time. Every block is an isomorphic mapping between Boltzmann entropy (energy expended in a difficulty-bounded search) and Shannon entropy (satoshis crystallized in a unique computational configuration) via the conservation of energy, thus proving both conservation of energy and conservation of information without axiom. The only reason this works is that Bitcoin defines the smallest unit of temporal separation: one block, the interval between two irreversible collapses of an entropy field. In that framework, superposition is simply the unresolved mempool: probabilistic futures that do not exist until they are paid for in work and written as memory. After collapse, the system is deterministic. Before collapse, the system is probabilistic regarding the future tick of time, and deterministic regarding the last. This is the correct physical structure of superposition.

Physics, by contrast, routinely mistakes the mempool for the blockchain. It treats unmeasured potential as if it were a physically instantiated, computable set of simultaneous states. But in Bitcoin, we know exactly what unmeasured potential is: a collection of candidate transitions that do not yet exist. A UTXO with ten conflicting transactions in the mempool does not “exist in ten states at once.” It exists in one state (the unspent output) until work collapses one possibility into the record. That collapse is not the observer’s consciousness and not the lab apparatus; it is the measurement. It is the only moment at which existence becomes definable. Observation occurs afterward, when the state (structure) can be verified.
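The paragraph above can be sketched as a toy simulation (all names and the difficulty prefix are illustrative): ten conflicting candidate spends of one UTXO, a brute-force search standing in for proof of work, and exactly one candidate collapsing into the record:

```python
import hashlib
import random

random.seed(3)

# Toy sketch with illustrative names: one UTXO, ten conflicting candidate
# spends waiting in the "mempool", and a brute-force search standing in
# for proof of work that collapses exactly one of them into the record.

candidates = [f"spend-utxo-to-address-{i}" for i in range(10)]

def mine(tx: str, prefix: str = "000") -> int:
    # grind nonces until the hash of (tx, nonce) starts with the prefix
    nonce = 0
    while not hashlib.sha256(f"{tx}:{nonce}".encode()).hexdigest().startswith(prefix):
        nonce += 1
    return nonce

chosen = random.choice(candidates)                 # whichever spend wins the race
nonce = mine(chosen)
chain = [(chosen, nonce)]                          # one state is written as memory
orphaned = [c for c in candidates if c != chosen]  # the rest never existed on-chain

assert len(chain) == 1 and len(orphaned) == 9
print(chosen, nonce)
```

Before `mine()` returns, the ten candidates are only potential; after it returns, exactly one is record and the other nine were never states of the ledger at all.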

If we took the quantum interpretation literally, we should be able to run Shor’s algorithm on Bitcoin right now. After all, the mempool is full of “simultaneously existing” candidate states, all awaiting collapse. We could treat each conflicting transaction as a superposed computational branch, let the “wavefunction” evolve, mine a block, let decoherence select a single outcome, and then, by the logic of centralized QC, apply “error correction” to restore the lost superpositions and harvest exponential parallelism.

The satire here exposes the core issue: QT relies on treating unmeasured potential as computable physical substrate, the very error Bitcoin was designed to remove from money. If we don’t know the physical process beneath what we call money, we don’t actually understand what money is.