Mike Brock
Unfashionable.

But why is this an objection?

And note, in your quote of his, he's saying it's possible that a quantum event in your brain could lead to different decisions in a branching of the wave function. Which is true! He's explicitly not saying there's a branch for every possible decision you *could* make. He's just not arguing that.

You are misunderstanding. He's saying that when there are macroscopic phenomena bound to quantum measurements, those can lead to different decisions.

For instance, if you say to yourself, I'm going to step to the left if a quantum measurement observes an entangled particle as spin up. And I'm going to step to the right if a quantum measurement observes the particle as spin down.

Assuming you make good on that, you are going to have two branches of the wave function:

1. Where you observed the particle as spin up, and you stepped to the left. And,

2. Where you observed the particle as spin down, and you stepped to the right.
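As a toy numerical sketch (my own, not from the thread), here's what that two-branch state looks like, assuming spin up triggers the step to the left and spin down the step to the right:

```python
import numpy as np

# Basis states for the spin and for the macroscopic "which way you stepped"
# degree of freedom (hypothetical labels, purely for illustration).
up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
left, right = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# The measurement correlates each spin outcome with one action, producing
# the joint state (|up, left> + |down, right>) / sqrt(2).
joint = (np.kron(up, left) + np.kron(down, right)) / np.sqrt(2)

# Born-rule weight of each component is the squared amplitude:
# two branches, each of weight 1/2, and nothing else.
weights = joint ** 2

# The two branches are orthogonal -- they no longer interfere,
# which is what "the wave function has branched" means here.
overlap = np.dot(np.kron(up, left), np.kron(down, right))
print(weights, overlap)
```

Note that every other conceivable action (hopping on one foot, and so on) has zero amplitude here, so there simply is no such branch.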

But this does not happen merely as a result of every radioactive decay. If a potassium isotope decays in your abdomen, and the gamma photon interacts with an atom in a skin cell, then the branching is described as: there is a branch of the wave function in which that skin cell observed the decay, via the absorption of a photon. That doesn't mean you'll literally have branches of the wave function where you hopped on one foot, picked your nose, sat down, did push-ups, ordered Thai food, and every counterfactual permutation. Carroll is NOT saying that.

"So if I’m understanding correctly, everything that could possibly happen is happening simultaneously with every action in the universe. And not only that, but it’s happening in the same physical space!"

You are not understanding correctly. This is not what Carroll or Everett are saying at all. They're saying that when a quantum measurement occurs on a coherent quantum system, causing it to decohere, the universe splits into different worlds. The *size* of those worlds in Hilbert space is proportional to the square of the wave function's amplitude. That's what it says. That's not the same thing as saying that whenever you go to a restaurant, the world splits into one universe per item on the menu, or per combination of items, with a different world where you order each thing. This is applying very classical reasoning to try to understand Everett, and it's leading you astray.

Guilty as charged!

Well, given that I am partial to what Hugh Everett was saying, I don't think there is a wave function collapse. It doesn't happen. The wave function evolves smoothly. What we interpret as wave function collapse is really the branching of the wave function. And all information is conserved.

I think you can dispense with the conceptual understanding altogether, as Everett does, and just say: there's just the wave function, and it evolves according to the Schrödinger equation given a Hamiltonian. Time and its arrow are an emergent property of entropy. And the biggest mystery is just: why was the early universe in such a low-entropy state?
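A minimal sketch of that claim (mine, with an arbitrary made-up Hamiltonian): unitary Schrödinger evolution preserves the state's norm and is reversible, so no information is ever lost; there's nothing resembling a collapse.

```python
import numpy as np

# Arbitrary Hermitian Hamiltonian on a 4-dimensional Hilbert space
# (random, purely for illustration; hbar = 1).
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2

# Time-evolution operator U(t) = exp(-iHt), built from the eigenbasis of H.
evals, evecs = np.linalg.eigh(H)
def U(t):
    return evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

psi0 = np.array([1.0, 0.0, 0.0, 0.0], dtype=complex)

# The norm (total probability) is conserved at all times...
norm_late = np.linalg.norm(U(10.0) @ psi0)

# ...and the evolution is reversible: U(-t) undoes U(t),
# which is the sense in which information is conserved.
recovered = np.allclose(U(-1.0) @ (U(1.0) @ psi0), psi0)
print(norm_late, recovered)
```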

Well, not to sound crass, but: you're wrong because you're wrong. What you're doing here is trying to apply your intuitive conceptual models, which are completely classical, to the quantum domain. You just can't do that. If the quantum domain were made of classical analogues, with Planck-time units acting like discrete ticks in a computer processor, then there wouldn't be a Newtonian limit at all. It's nearly tautological.

The reality is the classical domain is an *emergent* domain from the quantum domain, and your macroscopic intuitions simply do not apply. This is ultimately the biggest problem with trying to get people to conceptually understand quantum physics.

The conceptual problem goes even deeper than this, and is also showing up in your attempts to solve wave-particle duality with a classical model of understanding.

Most physicists don't believe particles even exist in the way people intuitively imagine them. The classic Copenhagen interpretation is the worst offender here: it assumes that particles are fundamental things. But then along came quantum field theory, which provides a far cleaner explanation of what's really going on. What we think of as "particles" are really just excitations in quantum fields, and interactions between particles are really interactions between those fields. This is why high-energy physicists talk about the energies at which particles appear, in GeV (giga-electron-volts). These are the energies at which one quantum field will interact with another field. That interaction is what we call a particle. The fact that some of these interactions are stable, and others are not, is also a well-understood concept described by the Standard Model. It also implicates issues such as the nature of the vacuum and how the stability of hadrons emerges at all.

As an aside, the fact that the Higgs boson shows up at a mass of about 125.25 GeV hints that our vacuum may be a false vacuum, and may in fact decay at some point in the future (see: false vacuum decay).

Yeah, but your starting axiom is wrong. Each unit of Planck time is not a server tick in the universe. That's just false. Literally no serious theoretical physicist thinks it's true. The Planck scale is a *limit* of the domain of applicability of our theoretical models, just like the Newtonian limit marks the domain of applicability where Newtonian mechanics breaks down and gives way to quantum mechanics.

You're applying classical thinking to your thought experiment, which is getting you off on the wrong foot in the first sentence.

I think one would likely need to rewind all the way to Plato's *Allegory of the Cave* to begin to answer that question, then work up to abductive reasoning, and then get deeper into the philosophy of science itself.

But the TL;DR of the response is: we don't have to do that. We can see the externalized effects of these things. The Planck scale is where quantum mechanics and general relativity come into conflict with each other; this is referred to as the domain of quantum gravity. That's what it is. It does not refer to a fixed minimum granularity of time progression in the universe. That's a serious misunderstanding that shows up in popular accounts of quantum physics, and it's just wrong.

Also, after thinking more about what you said at the very end there, I'm very confused about what you're hypothesizing here.

We can experimentally observe coherent quantum systems; that's what quantum entanglement is. These systems are, by their very definition, not collapsing spontaneously every unit of Planck time. Also, nobody believes the wave function evolves in units of Planck time, in a tick-tock way. This is a very big misunderstanding of what the Planck time actually is. The Planck scale is a limit in the same sense the Newtonian limit is: it's the point at which quantum gravity becomes important. Which we notably do not have a working physical theory for, hence all the String Theory and Loop Quantum Gravity stuff you periodically hear about.
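As a sketch of the decoherence point (my own toy example, not anyone's argument in this thread): entangle a qubit with a one-qubit "environment" and trace the environment out, and the interference terms vanish; the leftover state behaves like a classical coin flip.

```python
import numpy as np

zero, one = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# A coherent superposition has off-diagonal (interference) terms.
plus = (zero + one) / np.sqrt(2)
rho_coherent = np.outer(plus, plus)  # every entry is 1/2

# Entangle the qubit with an environment qubit: (|00> + |11>) / sqrt(2).
joint = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Partial trace over the environment (index order: system, env, system', env').
rho_joint = np.outer(joint, joint).reshape(2, 2, 2, 2)
rho_system = np.trace(rho_joint, axis1=1, axis2=3)

# The reduced state is diag(1/2, 1/2): the off-diagonals are gone,
# i.e. the system now behaves like a classical 50/50 mixture.
print(rho_system)
```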

This is just a long way of saying that I'm actually really confused as to what you're actually saying, here.

But what is the basis for postulating that? You're kind of ignoring a century of experimental particle physics when you ask that question. Nothing we have so far observed would lead us to believe it's true. In fact, it's quite the opposite: we see that decohered quantum systems behave completely classically! This is what the Newtonian limit really describes at the end of the day.

FWIW, if you really want the *best* arguments against many-worlds, you should look at David Wallace's thoughts on the matter. Obviously, I'm not convinced by his arguments. But he's also an order of magnitude smarter than me. So it's possible I'm just wrong. And I always acknowledge that's a possibility, implicitly.

Replying to Mike Brock

Bell's inequality *was* about hidden variables. Here's the first paragraph on the subject from Wikipedia:

"Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories given some basic assumptions about the nature of measurement. 'Local' here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields can only occur at speeds no greater than the speed of light. 'Hidden variables' are hypothetical properties possessed by quantum particles, properties that are undetectable but still affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, 'If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local.'[1]"

So you're over-generalizing here. Yes, it's true that non-local hidden variables can't be ruled out. But you're then assuming there's some incredibly complicated non-local physics guiding these interactions that we can't see, physics that appears stochastic to us but is deterministic in some other frame. This is why I originally said hidden-variable models are trying very hard!

Also, that's just not what many-worlds says. Your characterization of world-splitting happening at every Planck-scale interaction is just wrong. It shows a deep misunderstanding not just of Everettian thinking, but of quantum decoherence itself. A human being and their environment is an almost entirely decohered quantum stream. All the many-worlds interpretation says is that the world evolves according to the Schrödinger equation. That's it.

If you take the Schrödinger equation seriously, the other worlds show up for free. Every other interpretation, including Bohmian mechanics, is trying to argue for the deletion of those other worlds.

*quantum system. Not "quantum stream". That was a bad autocorrect.


Well, one of the biggest reasons I think spontaneous collapse is wrong, is hidden in what you just said. It makes *measurement* a special feature of nature, and treats *observers* as purely classical.

This debate has nothing to do with the argument over whether wave functions collapse when a conscious being observes them. Nobody thinks that has anything to do with anything. When physicists say *observer* in quantum mechanics, they just mean anything that detects a quantum particle.