Tuesday, October 22, 2019

What is the quantum measurement problem?

Today, I want to explain just what the problem is with making measurements according to quantum theory.

Quantum mechanics tells us that matter is not made of particles. It is made of elementary constituents that are often called particles, but are really described by wave-functions. A wave-function is a mathematical object which is neither a particle nor a wave, but it can have properties of both.

The curious thing about the wave-function is that it does not itself correspond to something which we can observe. Instead, it is only a tool by help of which we calculate what we do observe. To make such a calculation, quantum theory uses the following postulates.

First, as long as you do not measure the wave-function, it changes according to the Schrödinger equation. The Schrödinger equation is different for different particles. But its most important properties are independent of the particle.
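
(For readers who want to see it: in standard notation, this first postulate says that the wave-function ψ evolves according to

$$ i\hbar\,\frac{\partial}{\partial t}\,\psi(t) = \hat H\,\psi(t)\,, $$

where the Hamilton operator Ĥ is the part that differs from particle to particle.)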

One of the important properties of the Schrödinger equation is that it guarantees that the probabilities computed from the wave-function will always add up to one, as they should. Another important property is that the change in time which one gets from the Schrödinger equation is reversible.
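
(Both properties can be read off the formal solution: for a Hamilton operator that does not itself depend on time,

$$ \psi(t) = U(t)\,\psi(0)\,, \qquad U(t) = e^{-i\hat H t/\hbar}\,. $$

Because U(t) is unitary, the norm of the wave-function, and with it the total probability, stays equal to one; and because U(t)⁻¹ = U(−t), the evolution can be run backwards.)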

But for our purposes the most important property of the Schrödinger equation is that it is linear. This means if you have two solutions to this equation, then any sum of the two solutions, with arbitrary pre-factors, will also be a solution.
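
In formulas: if ψ₁ and ψ₂ both solve the Schrödinger equation, then so does a ψ₁ + b ψ₂ for arbitrary complex numbers a and b, because

$$ \hat H\,(a\,\psi_1 + b\,\psi_2) = a\,\hat H\,\psi_1 + b\,\hat H\,\psi_2\,, $$

and likewise for the time-derivative on the other side of the equation.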

The second postulate of quantum mechanics tells you how you calculate from the wave-function what is the probability of getting a specific measurement outcome. This is called the “Born rule,” named after Max Born who came up with it. The Born rule says that the probability of a measurement is the absolute square of that part of the wave-function which describes a certain measurement outcome. To do this calculation, you also need to know how to describe what you are observing – say, the momentum of a particle. For this, you need further postulates, but these do not need to concern us today.
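
(Concretely: expand the wave-function in the basis of measurement outcomes, ψ = Σ_r α_r |r⟩. The Born rule then says that the probability of finding the outcome r is P(r) = |α_r|² = |⟨r|ψ⟩|².)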

And third, there is the measurement postulate, sometimes called the “update” or “collapse” of the wave-function. This postulate says that after you have made a measurement, the probability of what you have measured suddenly changes to 1. This, I have to emphasize, is a necessary requirement to describe what we observe. I cannot stress this enough because a lot of physicists seem to find it hard to comprehend. If you do not update the wave-function after measurement, then the wave-function does not describe what we observe. We do not, ever, observe a particle that is 50% measured.
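
(In the same notation as above: upon measuring the outcome r, the wave-function is updated as ψ → |r⟩, up to a phase, so that an immediately repeated measurement returns r with probability one.)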

The problem with the quantum measurement is now that the update of the wave-function is incompatible with the Schrödinger equation. The Schrödinger equation, as I already said, is linear. That means if you have two different states of a system, both of which are allowed according to the Schrödinger equation, then the sum of the two states is also an allowed solution. The best known example of this is Schrödinger’s cat, which is a state that is a sum of both dead and alive. Such a sum is what physicists call a superposition.
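
(Schematically, |cat⟩ = (|dead⟩ + |alive⟩)/√2.)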

We do, however, only observe cats that are either dead or alive. This is why we need the measurement postulate. Without it, quantum mechanics would not be compatible with observation.

The measurement problem, I have to emphasize, is not solved by decoherence, even though many physicists seem to believe this to be so. Decoherence is a process that happens if a quantum superposition interacts with its environment. The environment may simply be air or, even in vacuum, you still have the radiation of the cosmic microwave background. There is always some environment. This interaction with the environment eventually destroys the ability of quantum states to display typical quantum behavior, like the ability of particles to create interference patterns. The larger the object, the more quickly its quantum behavior gets destroyed.

Decoherence tells you that if you average over the states of the environment, because you do not know exactly what they do, then you no longer have a quantum superposition. Instead, you have a distribution of probabilities. This is what physicists call a “mixed state”. This does not solve the measurement problem because after measurement, you still have to update the probability of what you have observed to 100%. Decoherence does not tell you to do that.
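
Here is a minimal numerical sketch of this point (my illustration, not from the post itself): one qubit in a superposition couples to a one-qubit "environment" through a CNOT-type interaction, a standard toy model of decoherence. Averaging over the environment leaves a 50/50 mixed state, not one outcome with probability one.

    import numpy as np

    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    # System starts in (|0> + |1>)/sqrt(2); before any interaction its
    # density matrix still has off-diagonal "quantum" terms.
    psi_sys = (ket0 + ket1) / np.sqrt(2)
    print(np.outer(psi_sys, psi_sys))       # [[0.5, 0.5], [0.5, 0.5]]

    # The environment qubit records the system's state (a CNOT):
    # |0>|0> -> |0>|0>,  |1>|0> -> |1>|1>.
    state = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
    rho = np.outer(state, state).reshape(2, 2, 2, 2)   # axes (s, e, s', e')

    # Tracing out (averaging over) the environment kills the off-diagonals:
    rho_sys = np.einsum('iaja->ij', rho)
    print(rho_sys.real)                      # [[0.5, 0.0], [0.0, 0.5]]

    # What remains is a classical 50/50 mixture. Promoting the outcome you
    # actually observed to probability 1 is a further step that decoherence
    # does not supply.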

Why is the measurement postulate problematic? The trouble with the measurement postulate is that the behavior of a large thing, like a detector, should follow from the behavior of the small things that it is made up of. But that is not the case. So that’s the issue. The measurement postulate is incompatible with reductionism. It makes it necessary that the formulation of quantum mechanics explicitly refers to macroscopic objects like detectors, when really what these large things are doing should follow from the theory.

A lot of people seem to think that you can solve this problem by way of re-interpreting the wave-function as merely encoding the knowledge that an observer has about the state of the system. This is what is called a Copenhagen or “neo-Copenhagen” interpretation. (And let me warn you that this is not the same as a Psi-epistemic interpretation, in case you have heard that word.)

Now, if you believe that the wave-function merely describes the knowledge an observer has then you may say, of course it needs to be updated if the observer makes a measurement. Yes, that’s very reasonable. But of course this also refers to macroscopic concepts like observers and their knowledge. And if you want to use such concepts in the postulates of your theory, you are implicitly assuming that the behavior of observers or detectors is incompatible with the behavior of the particles that make up the observers or detectors. This requires that you explain when and how this distinction is to be made and none of the existing neo-Copenhagen approaches explain this.

I already told you in an earlier blogpost why the many worlds interpretation does not solve the measurement problem. To briefly summarize it, it’s because in the many worlds interpretation one also has to use a postulate about what a detector does.

What does it take to actually solve the measurement problem? I will get to this, so stay tuned.

236 comments:

  1. "We do, however, only observe cats that are either dead or alive. This is why we need the measurement postulate. Without it, quantum mechanics would not be compatible with observation."
    But surely we wouldn't ever expect to see a 50% dead cat, or a half-measured particle? Why would we need to add a collapse mechanic to explain that? We're not an omniscient observer outside of quantum mechanics, able to observe the whole wavefunction at once. If I'm in the branch where the cat is dead, I will measure it to be 100% dead; and vice versa. 'The probability becomes one' already assumes wavefunction collapse; it doesn't support it. Bee, I understand your issue with MW, but I think there's a logical step missing here.

    Replies
    1. BenJ,

      No, the assumption "the probability becomes one" does not, of course, assume wave-function collapse. The probability that anything happens is always one. The assumption is that the probability *of what you have measured* becomes one. This requires that you update the wavefunction. Just write it down if you do not understand what I mean.

      Many worlds doesn't help you with that. Without a wave-function update or equivalent postulate, many worlds predicts that you split into a huge number of universes and observe anything with probability one. This is clearly not what we observe.

    2. "The assumption is that the probability *of what you have measured* becomes one. This requires that you update the wavefunction."

      Do you mean 1) the probability that I just measured r, given that I just measured r *and* the state was psi? Or do you mean the probability 2) that I just measured r or 3) will again measure r given that the state is psi?

      I can see that the former 1) must trivially be one, but it has no implications for psi at all.

      I absolutely cannot see why any of the latter 2) and 3) must necessarily be one. It also seems to me that assuming any of them to be one under all circumstances even makes some false predictions.

    3. The state was described by psi. (I'm not a realist.)

    4. Sabine: You probably underestimate the depth of my confusion.

      I think I understood that the state was described by psi. I even presupposed that myself in every probability I defined.

      My problem is that the only probability that clearly *must* be one after measurement, is conditional on the value just obtained, i.e. p(r|r, psi)=1. It thus does not imply a collapse, since it is obviously independent of the only state psi that enters its definition.

      On the other hand, the probabilities that would imply a collapse if they *were* one, cannot always *be* one, since they are conditional on the post-measurement state psi alone, p(r|psi), not on the previously obtained result r.

      If you are not conditioning on the previous result, then what difference does it make, whether it was obtained or not?

      What am I missing?

    5. single_world,

      You are using a neo-Copenhagen interpretation. This is a possible interpretation, but the problem with it is that it is qua assumption in conflict with reductionism.

    6. Sabine: what still puzzles me is your claim that the collapse is necessary to describe what we observe. If this were true, it should be so in all valid interpretations. How can a neo-Copenhagen interpretation avoid that necessary conclusion and still be a possible interpretation? If it is possible, doesn't that show that the collapse isn't strictly necessary after all?

      Aside from that, I don't understand where exactly I use an assumption that conflicts with reductionism or why that even matters. I'm only trying to understand your argument about the necessity of the projection postulate. I'm not trying to show that it is unnecessary.

      Let me try to formulate my understanding again. I would be happy if you would comment on it.

      Let's say we prepare a system in a state psi and schedule a measurement at t = 0. At every time t the probability of obtaining the result r is p(t) = |(r|psi(t))|² (Born rule). Now if at t=0 we actually measure the result r, we know that

      p(0) = |(r|psi(0))|² = 1,

      from which it follows that psi(0) = |r> (modulo phase), no matter what the Schrödinger equation says. Does this argument correctly represent your reasoning behind the necessity of the collapse?

    7. Sabine, that does help, but I still feel like you're flitting between definitions of 'you' here. From a MW starting point, at no point would I conflate 'me on each branch' with an overarching me. And I have no problem with 'every eventuality *is seen* with probability 1' (rather than 'I see every eventuality', which is, as you say, pretty silly).

    8. single_world,

      Neo-Copenhagen does not "avoid" this conclusion, it merely says that the update is not an update of the state of the system under observation, but an update of the observer's information about that system.

      "Now if at t=0 we actually measure the result r, we know that..."

      It's your statement about knowledge that is in conflict with reductionism. What is knowledge? Who or what has knowledge? These are macroscopic terms that should be derivable from the theory.

    9. Ben J,

      "And I have no problem with 'every eventuality *is seen* with probability 1' (rather than 'I see every eventuality', which is, as you say, pretty silly)."

      There is nothing in many worlds that defines what "you" are. If you simply state it is "silly" to think of yourself as the forward evolution of your past self, you are still making an assumption here. This assumption is logically equivalent to the measurement postulate.

    10. I don't feel like I'm running into that problem though. I get that MW sometimes uses 'the electron' to refer to 'the electron as I measure it, that went right' and sometimes as 'the part of the wavefunction representing the electron that goes left and right'...

      ...but I feel like I can talk sensibly about either of those entities and make sensible predictions. Same goes for 'me'.

      Look, I don't know how the probabilities we see in the real world fall out of the multitude of worlds. But I'm still not fully grasping how 'we only ever see one branch' is an argument for any sort of collapse mechanic. What would be different about what we see if we lived in many worlds?

    11. And if that's the subject of a post to come please shut me down :)

    12. Sabine: I'm just trying to figure out why or if

      p(0) = |(r|psi(0))|² = 1

      is a *fact* implied by having measured r at t=0, because that is how I understood your argument. Nothing depends, in my view, on whether this fact is known to anyone or not.

      However, I think this particular argument leading to p(0) = 1 (and the collapse) is incorrect and I don't think the collapse is necessary in general, either.

      The reason why I think that particular argument is wrong, is because it apparently confuses two different meanings of "updating probabilities", namely 1) conditioning on additional data and 2) changing the likelihood of a specific datum.

      The reason why I think the collapse is unnecessary on empirical grounds is because I believe the Ensemble Interpretation is a counter example to this claim. It doesn't use a collapse, and it neither makes any false predictions, nor fails to make any true predictions that would be implied by the collapse postulate.

      I'd like to find out on which of the above points we have a disagreement.

    13. single_world,

      I have said this several times already. This is the last time I repeat it. You are adopting a psi-epistemic position. The problem with this position is not the update. The problem is that it is inconsistent with reductionism.

    14. "Without a wave-function update or equivalent postulate, many worlds predicts that you split into a huge number of universes and observe anything with probability one."

      I can't make sense of the above statement. The observer becomes entangled with the system being measured and the measuring apparatus, and so observes only one definite outcome. No additional assumption needed to get a single definite outcome.

      Please correct me if I'm wrong, but I get the impression that you want to keep the observer separate from the system being measured, and are reluctant to treat the observer him/herself as part of the quantum system. But this step is essential to the MWI.

      I think the basic philosophic insight of MWI is that you can never get an outside view of a quantum system; you can only get an "inside" view, because your brain is part of the extended system, and for your brain to register a measurement it must become entangled with the subsystem being measured.

      As for the Born Rule itself, sure, there are deep philosophical questions as to what probability even means in this situation, but one hint (not a proof) is this: if the observer runs a large number of measurements on identically-prepared systems, then those branches of the wave function in which the outcome proportions are observed to differ significantly from the Born rule have very small amplitudes. Furthermore, those amplitudes go to zero as the number of measurements increases.
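
      A quick numerical illustration of this hint (a sketch, not a proof; the outcome probability p = 0.3 and the 5% tolerance are arbitrary choices): the combined squared amplitude of all branches whose observed frequency deviates from the Born-rule value by more than the tolerance shrinks as the number N of repetitions grows.

          from math import comb

          p, eps = 0.3, 0.05   # per-run outcome probability; allowed deviation
          for N in (10, 100, 1000):
              # The squared amplitude of a branch with k "hits" in N runs is
              # comb(N, k) * p**k * (1 - p)**(N - k); sum the deviant branches.
              w = sum(comb(N, k) * p**k * (1 - p)**(N - k)
                      for k in range(N + 1) if abs(k / N - p) > eps)
              print(N, w)   # total weight of deviant branches shrinks toward zero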

    15. single_world,
      To understand Prof. Sabine's point, it helped me to explicitly think about why your position 'conflicts with reductionism.' Updating the probability to 1 after a measurement has occurred, implies that the presence of an observer has caused the probability to no longer be governed by the Schrodinger equation, but be updated to 1 (this is nonlinear). However, the observer should be able to be completely described by the Schrodinger equation (linear) because of reductionism. This is a contradiction that needs to be resolved. I get the impression that this is extremely obvious to Prof. Sabine, but for people like me, who are so used to regarding the observer as a separate non-quantum entity, it can be hard to identify and challenge our problematic assumptions.

    16. Thomas Payne: Thanks for your help. I want to clarify that it is not my view that the observer induces any nonlinear change. In fact I believe that no such change happens at all.

      I also clearly perceive a contradiction between the collapse postulate and the Schrödinger equation. However, I absolutely cannot follow the argument given for why a collapse should be necessary to describe what we observe. (I assume it is not offered as self-evident, since "a lot of physicists seem to find it hard to comprehend.") As far as I understood the argument, I suspect it might be based on a fallacy. But I can't be sure until I understand exactly what is meant by "updating probabilities."

      If we postpone questions about interpretations of the state and just use it as a tool to calculate observable quantities (an approach that should appeal to instrumentalists), then the easiest solution to the measurement problem would be to simply abandon the projection postulate. That this is at least viable, if not fully satisfactory, is, in my view, convincingly shown by Ballentine in the paper I cited below and also in his textbook.

      If simply losing the projection postulate would create problems with reductionism, it is still not clear to me what those problems are. But it is even less clear how reintroducing an empirically unnecessary postulate which crudely contradicts the Schrödinger equation could be seen as an improvement.

    17. single_world: My understanding is that the collapse (measurement) postulate is necessary because, for a single measurement, you don’t measure a probability less than 1. It does not make sense to say that an observed particle is 50% measured. Once the measurement has taken place, its probability of having occurred is no longer governed by the Schrodinger equation, it is 1, having actually occurred. This “updating of probabilities” is then necessary for calculations. This isn’t a problem if you think that the Schrodinger equation merely represents information the observer has about the system. However, Prof. Sabine would claim that this notion of ‘information’ and ‘observer’ is inconsistent with the fact that these things should, themselves, be governed by QM. I’m not sure I agree with this last idea.

      I think the disagreement between you and Prof. Sabine lies in you not seeing the collapse postulate as necessary. I'm not a professional physicist (although I'm considering going back to school) and am not all that qualified to give my opinion although I apparently enjoy doing so, ha. I'm also a little unclear about the relationship between the collapse postulate and the projection postulate. Enjoying the conversation!

    18. Thomas Payne: Every random process generates a definite outcome, not a probability. The assignment of probabilities to the possible outcomes is precisely what defines the process as "random". The only occasions on which we can, in a sense, "observe" probabilities are when they are approximated by relative frequencies in repeated experiments on identical systems. If those experiments are performed, they produce frequencies which agree with the assumption that the Schrödinger equation is valid during the measurement process; and they often contradict the probabilities calculated from the collapse postulate.

      By the way, I used the terms "projection postulate" and "collapse postulate" as synonyms.

  2. Copy Edit:

    >merely describes the knowledge an observe has
    4th to last paragraph; should be "an observer".

    Only one I found!

  3. Oops, 3rd to last paragraph.

  4. Could you please explain the difference between neo Copenhagen and psi epistemic interpretations?

    Replies
      Psi-epistemic just means that the wave-function is not an element of reality. (Neo-)Copenhagen interpretations assume that it is also fundamental, in the sense that it cannot be derived from an underlying theory. If there were an underlying theory from which the wave-function is emergent, this would also be Psi-epistemic. Matt Leifer explains this very nicely in this paper.

    2. Maybe someone could enlighten me, I find it very difficult to understand the neo-Copenhagen views that the wavefunction encodes an observer's knowledge and yet at the same time there is no underlying theory. What is the encoded knowledge even about if there is nothing underlying?

    3. Auria,

      Yes, that's the key question. According to neo-Copenhagen that's a question you're not supposed to ask. The theory is the final answer, end of story.

  5. I think collapse obviously demands a non-linear function be involved. Is there some reason that cannot be gravity?

    I realize that likely means gravity is not quantum, or if it is, it must evolve by a non-linear (thus non-Schrödinger) equation.

    As far as I know, gravity is the only observable option here; anything else is postulating unobserved fields, properties, particles or behavior outside the Standard Model. And, as far as so-called Dark Matter is concerned, as the previous "Dark Matter Nightmare" post (and others from Sabine) argues, there is a very good chance we don't understand gravity well enough to rule it out as the non-linear actor.

    Replies
    1. Dr. Castaldo,

      One example of this idea is the gravity induced collapse described in Wikipedia as the "Penrose Interpretation". It is considered speculative and unproven at the moment.

    2. Gravity could be the culprit, especially if you follow Roger Penrose: https://en.wikipedia.org/wiki/Penrose_interpretation

      On the other hand, it could work in reverse, where gravity is emergent when the number of quantum states gets macroscopic: https://en.wikipedia.org/wiki/Entropic_gravity

      And yes, we definitely do not understand gravity well enough on the scales where quantum mechanics becomes important. And, as Sabine and others mentioned multiple times, that is not at very high energies or at very tiny distances: it is in the intermediate distances, say nanometers to micrometers, where we have no idea how gravity behaves once the quantum effects become important. We know that single "particles" (i.e. localized quantum fields with not very many states) "feel" the spacetime curvature, see https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.34.1472 and links to it (collectively known as the COW experiments), but we don't know if they generate it, and if they do, how it can be squared with being in a spatial superposition. The good news is that there are experimental efforts in progress (albeit on a budget nowhere near the LHC) to measure gravitational effects of these "cat states". There are also a few theoretical models: https://iopscience.iop.org/article/10.1088/0264-9381/32/16/165022 and https://www.nature.com/articles/nphys3366, so there is hope for progress in the reasonably near future.

  6. Dear Prof Sabine,

    You certainly win with the cutest cat there.

    But the more I think about it, the less I am convinced that the "updating the probabilities" is a problem at all. That is, I merely require different measurement eigenstates to be orthogonal (given for free by other postulates), and I could get that each measurement branch sees a world that is equivalent to the updated one. As much as I dislike quantum Bayesianism, importing a small idea from it allows us to do this.

    You also mentioned something about how (neo-)Copenhagen is different from psi-epistemic. Do you have a particular blog post about that? All these are giving me a headache.

    Replies
    1. B.F.

      Updating probabilities is certainly not an issue computationally. It's a problem, conceptually.

      Regarding neo-Copenhagen vs Psi-epistemic, please see my above reply to Ritam Basu.

    2. Dear Prof Sabine,

      Yes, I see you reference Leifer, another nice person.

      But I know what your conceptual problem of updating probabilities is, because you say it so often.

      The point I am making is that you don't have to update the probabilities; the standard postulates already do that (but you might modify the measurement postulate).

      Concretely, let P(r|psi) be the probability of measuring result r given initial wavefunction psi. This is, by the measurement postulate and the Born rule, equal to |alpha_r|^2, the absolute square of the expansion coefficient.

      What you want is that P(r|r,psi) = 1, as in, the immediate remeasurement will always get the same result. This is the collapse part of the measurement postulate. (For the other readers, the notation is, probability of getting r, given we already measured r before, from the initial state psi.)

      But it is merely for convenience that we update/collapse the wavefunction. You can jolly well have many uncollapsed branches and still get that. Just give me that the detectors have their postulate-given orthogonal eigenstates, and all branches will have their P(r|r,psi) = 1.

      So, I cannot see why you keep insisting that the probabilities must be updated. The Bayesian immediately can see that the two situations P(r,r|psi) and P(r|r,psi) are different.
      ( P(r,r|psi) = P(r|r,psi) P(r|psi) = |alpha_r|^2 )
      I would rather edit the measurement postulate so that the probabilities after a measurement equal what you get if you collapse the wavefunction; in this wording, the collapse is done merely for the convenience of not carrying unwanted branches, i.e. the collapse is not necessary for quantum theory to agree with experiments.

  7. Thanks for another great article. 3 typos:

    The large the object, the more quickly its quantum behavior gets destroyed. --->
    The larger the object, the more quickly its quantum behavior gets destroyed.

    Now, if you believe that the wave-function merely describes the knowledge an observe has then --->
    Now, if you believe that the wave-function merely describes the knowledge an observer has then

    it’s because in the many worlds interpretation one also has to also use a postulate --->
    it’s because in the many worlds interpretation one also has to use a postulate


    2 rewording suggestions:

    Instead, it is only a tool by help of which we calculate what we do observe. --->
    Instead, it is only a tool which helps us calculate what we do observe.

    The problem with the quantum measurement is now that the update of the wave-function is incompatible with the Schrödinger equation --->
    The problem now with the quantum measurement is that the update of the wave-function is incompatible with the Schrödinger equation

    Regards
    JR

    Replies
    1. JR,

      I have fixed the typos, thanks for the attentive reading. I like both your rewording suggestions, but since it's a transcript of the spoken text, I will leave it as it is.

  8. A very nice summary!

    I would comment that

    A) The superposition of states is purely mathematical and not physical. The particle or system is always in a well defined state at any given time of measurement. So, this 'solves' the collapsing to 100% problem (in my opinion, I guess).

    B) The frequency of the wave function represents the rate of 'precession' of the angular momentum of the particle or system along the direction of propagation, for the simple case of a single particle, at least (in my theory...!)
    So, the wave function is not spread out over space and does not collapse.

    As usual, I appreciate you possibly entertaining my thoughts and your blog!

    Replies
    1. Greg Feild wrote:
      >"The frequency of the wave function represents the rate of 'precession' of the angular momentum of the particle or system along the direction of propagation, for the simple case of a single particle, at least (in my theory...!)"

      Scalar particles (like the Higgs) have no angular momentum. But, the wave function still exists.

      Your idea might somehow work for particles with spin (I've thought of this myself for fermions). But for scalar particles, no.

    2. (Great minds think alike.)

      This is why I was forced to conclude that all interaction is mediated by photons (among other reasons).

      Photons transfer, and *conserve*, angular momentum.

      In my model, the 'mass dependence' of an interaction is moved from the "propagator" to the particles:

      (1/m_large) ==> (m_small)

      The precession model also accounts for the double-value nature of spinors (that someone mentioned below).

      It also explains the Born rule (of course!) since the available angular momentum for interaction varies during the 'precession'.

      In summary: angular momentum !

    3. Greg,

      It is not enough to describe interactions: pions exist, the Higgs exists. And you can use QM to describe their motion through space even when they are not interacting.

      And they have no spin.

      Your idea will not work.

      Of course, if you would actually publish your ideas somewhere where we could see them without having to pay for them, one of us might be able to tell you in more detail exactly where they go wrong.

      But perhaps you do not want that.

    4. Pions exist as temporary bound states of spin 1/2 particles.

  9. Seems to me that the Ψ-epistemic interpretation is no issue if we can accept that the measurable physical reality can be at the same time in ontic normal and anti-matter state.

    By this acceptance we could throw the need for fundamental CP-symmetry violation to the trashcan - and several parallel problems too. ;)

    Replies
    1. I have no idea what this is supposed to mean.

    2. Meaning that when measured the antipodal state is defined by decoherence logic and causal continuum with entangled state will be fixed over spacetime, ref. the ontic states and antipodal identification studied by 't Hooft.

      Phrase "at the same time in ontic normal and anti-matter state" simply rules electrons being electrons is only an agreement - antipodal phases between particles is THE observable and in mixed state by nature; structural decoherence conserves antipodality of particles and keep illusion for need of CP-symmetry violation.

      These ideas have been raised when reading the cellular automata surveys of 't Hooft...

  10. "The measurement problem is not solved by decoherence". I think it is but the initial understanding of decoherence is misleading. The name suggests that information is lost in thermal noise.

    In fact, information spreads so the whole macroscopic world is in perfect agreement which way the measurement went. Linearity is preserved by splitting and diminishing the weights of the wave function in Many Worlds. Probability appears to normalise to 100% because all the particles in some branch of the macroscopic world, including you, are entangled with the same measurement outcome. I can't write this in math, but I don't know how to express 2+2=4 from the point of view of the left '2' either. There's no obvious algebra for "how does it feel being this term in a sum" and that may be the cause of confusion.

    Replies
    1. Well, I can write it in math and I tell you decoherence does not solve the measurement problem. Of course if you do not trace out the environment all you have is a complicated entangled state that entirely evolves according to the Schrödinger equation. The problem is, you have to calculate from this what we observe. That's what science is all about. We observe some detector eigenstate with 100% probability. Unless you can write down a theory that gives this result, your theory is wrong.

    2. Here's the math that shows you that decoherence doesn't solve the measurement problem.
      https://arxiv.org/pdf/quant-ph/0112095.pdf

    3. Dear Prof Sabine,

      But we are trying to tell you that decoherence does get detector eigenstate with 100% probability.

      The basic postulates of QM already give every system a set of orthonormal eigenstates. The detector should also have them. Measurement by detector means the detector's state is specified, i.e. not traced over. This then selects the part of the system+detector wavefunction that is "system, in detection eigenstate", such that repeat measurements always get probability = 1.

      No problems arise from having multiple branches, each with relative probabilities 1. No need for collapse to get this behaviour.

    4. You may not be able to get that as a realist result because measurement requires a theory of the observer (conjecture). Meaning a theory of how phenomena appear to things that are somewhere in the universe and made of atoms.

      Every physical theory, however realist, needs a theory of the observer. Newton assumes light of infinitesimal momentum that moves infinitely fast. Maxwell's field equations describe a holographic theory where each observer is surrounded by a surface and phenomena have to come as waves that cross that surface. Einstein's main work was to adjust spacetime to have the correct theory of the observer.

      In quantum mechanics people could not avoid having a theory of the observer, but Copenhagen was an especially rough and unsatisfying one. Many Worlds is a much more elegant theory of the observer and has to be understood as such. Maybe there can be better ones. But the need of a theory of the observer is universal.

    5. B.F,

      "Measurement by detector means the detector's state is specified, i.e. not traced over. This then selects the part of the system+detector wavefunction that is "system, in detection eigenstate","

      No, this does not select an eigenstate. Please have a look at the paper that Florin mentioned above.

    6. I think you're insisting on seeing a derivation of how the wave function of the universe changes to have 100% certainty for the outcome observed. It doesn't. The wave function is rearranged to be a sum of two terms, in each term the whole apparatus is entangled with one outcome, and each term has half the weight. That is the reality according to many worlds. Decoherence helps explain how that reality comes about.

      The above reality is not observable as a whole because the apparatus and the room and you are entangled with one or another outcome. So if the particles of your brain ask the particles of the apparatus which way the measurement went you get a 100% answer. The version of you in the other term gets the other 100% answer. You really need to give up a wholly observable universe and accept a local (meaning only on one branch of the wave function) concept of reality to make progress.

      Yes there's probably real physics to be done explaining how you get 50% probabilities or 33/66% if you prepare the measurement another way. Or there could be an ingenious experiment where a nano-machine makes the measurement, splits, and is kept sufficiently isolated from the environment to interfere with itself (or not). But the main point of contention seems to be accepting the point of view on the phenomena locally in one branch vs. whole universe.

    7. Because our mathematics breaks down at really short scales (and we need to sweep infinities under the rug using renormalization and regularization), it's pretty obvious that the SM is nowhere near a fundamental description of reality. My intuition is that it's just an "effective" approximation just like the Navier Stokes equations are for fluids. They work, but at some point QM kicks in.

      Once we know what QM "means", we will have solved the measurement problem. But I think that means measuring energy scales that are many orders of magnitude away from what we're capable of right now.

  11. Hi Sabine,
    Is your objection: How can the Schrödinger equation, which is deterministic, yield probabilistic outcomes? (This seems to me to be a much clearer way to state it.) Or is it something else that I am completely failing to understand?

    Replies
    1. Peter,

      Of course you can have a deterministic equation for a quantity that has a probabilistic interpretation, we have this all over the place in physics. The point is that this does not work if you assume that the quantity you are talking about is ontic, and if it's epistemic you run into conflict with reductionism in your axioms, unless you accept that quantum mechanics is fundamentally incomplete. Which is really the only consistent conclusion.

    2. I don't know exactly what you mean by "fundamentally incomplete."

      My view is that quantum mechanics is the way the universe works, and if you are trying to find a completion using some kind of local variables (the way that the EPR paper wanted), you're wasting your time. I see no evidence whatsoever, aside from some vague philosophical concerns, that you can ever get any theory that gives better predictions than quantum mechanics.

      Now, if you want to put your energy into something that might actually get results, you could attempt to find better philosophical grounds that underlie the current theories of quantum mechanics. I don't think you'll ever be able to find a simple, intuitive explanation for why quantum mechanics works the way it does, but you might find new interpretations of quantum mechanics that complement our current ones, which I think would be a significant discovery.

    3. Peter,

      "if you are trying to find a completion using some kind of local variables (the way that the EPR paper wanted), you're wasting your time. I see no evidence whatsoever, aside from some vague philosophical concerns"

      I just explained why quantum mechanics is inconsistent. This is not a 'vague philosophical concern'. The reason that this problem is still unsolved is that too many physicists have, like you, the attitude that quantum mechanics is the final word and one should not even think about it.

    4. Hi Sabine,

      I don't think I explained myself well. There is a difference between

      a) coming up with an explanation of why quantum mechanics is a reasonable theory of physics, and

      (b) coming up with a new theory which supersedes quantum mechanics and makes different predictions (possibly different in just predicting more precise probabilities when quantum mechanics says the probability of some event is 1/2).

      I would definitely encourage you to pursue question (a), although I don't think the ultimate answer is going to be simple. I think the chance of achieving (b) is much less than the chance that a successor to the supercollider will see a new, unexpected, particle.

      When you say incomplete, it sounds to me like you think there is a reasonable chance of accomplishing (b).

    5. Peter,

      I have explained why quantum mechanics, the way it is currently used, is inconsistent. I do not think it is incomplete, I know it is, even if you cannot follow. And (a) is, for all I can tell, not a scientific question.

    6. I think that Sabine is saying that the "measurement problem" when analysed carefully makes QM inconsistent. However Sabine used the phrase "fundamentally incomplete" in the first answer. So Peter assumes that is a reference to "EPR Incompleteness". In the last answer Sabine says "I do not think it is incomplete".

      The subtlety here is that if the measurement problem is indeed introducing an inconsistency into QM, then since QM is not a fresh untested theory, it cannot simply be discarded wholesale. Much of the theory must survive and its inconsistency be removed, either by extending the theory or by correcting some of its mathematics (i.e. the Schrödinger wave equation), which might be viewed as making the theory more complete as a scientific theory.

      Although Sabine would argue that the main task now is to make QM consistent (I guess).

    7. Sabine,

      First, you haven't shown that quantum mechanics itself is inconsistent, just that some of our interpretations of it are. I haven't seen your argument that the Bohmian interpretation is inconsistent. (It's horribly unwieldy, impossible to do calculations with, and the fundamental mechanics of it don't have Lorentz symmetry, although this lack of symmetry is completely unobservable. But none of these drawbacks mean that it's not consistent.) So I am not convinced that quantum mechanics itself is inconsistent.

      Second, nobody has ever observed a violation of quantum mechanics. This puts very severe constraints on a theory that agrees with quantum mechanics for all the experiments that have been done so far, but in fact gives different predictions.

      So I think quantum mechanics is correct. I would love it if somebody came up with a better consistent theory of quantum mechanics than the Bohmian interpretation, but the fact that one consistent theory exists means that quantum mechanics is not in itself inconsistent.

    8. "The point is that this does not work if you assume that the quantity you are talking about is ontic, and if it's epistemic you run into conflict with reductionism in your axioms, unless you accept that quantum mechanics is fundamentally incomplete."

      I found this brief summary by Prof. Sabine extremely insightful/helpful. I see why the measurement problem implies that, under a psi-ontic interpretation of quantum mechanics, the theory is, at best, incomplete. Can it be written down exactly why an epistemic interpretation is in conflict with reductionism? Are all definitions of 'observer' and 'information' incompatible with reductionism in the appropriate axioms?

    9. Peter,

      What I explained does not depend on the interpretation. The whole point is that it is a problem that occurs in *any* interpretation. I have not been careful here stating the assumptions, sorry. The theory should be local, which dBB is not. I am working on a paper (or I should be at least) where this is listed in more detail.

      The reason no one has seen a violation of the predictions of quantum mechanics is that no one does the necessary experiment. And the reason they do not do the necessary experiment is that they are convinced, like you, that there is nothing to find. Why is anyone even surprised that there is no progress in the foundations of physics?

    10. Thomas Payne,

      "Are all definitions of 'observer' and 'information' incompatible with reductionism in the appropriate axioms?"

      Depends on exactly what reductionism you are talking about. If you are talking about ontological reductionism (stuff is made of smaller stuff), then macroscopic notions like observer and knowledge and so on should follow from the axioms for the microscopic things, so you have this conflict right away.

      If, on the other hand, you are talking about theory reduction (in the sense that one theory derives from the other), you are implicitly claiming that theory reduction runs into conflict with ontological reductionism at some point. That is incompatible with what we already know about the laws of nature, so this requires explaining.

    11. Quantum mechanics is neither about particles nor waves, but a union of both. Replacing this vague statement with the seemingly more precise claim about wave functions, Sabine actually misrepresents what QM is about. With the emphasis on waves, the particle aspects are lost. The continuous and deterministic evolution dictated by the Schrödinger equation cannot possibly square with the observed graininess and randomness of real quantum processes.

      Having attributed physical reality to wave functions, Sabine feels forced to hypothesize an additional nonlinear measurement process ensuring the uniqueness of measurement results. I haven't understood why a statistical viewpoint is unacceptable for her. It is probably some overwhelming preconception that fundamental physics must be local and deterministic (reductionist?).

    12. Sabine,

      There have been lots of experiments which demonstrate the non-local properties of quantum mechanics, which address questions arising from the foundations of quantum mechanics.

      Nobody has done your experiment because nobody has told the experimentalists what experiment to do, not because they're convinced there's nothing to find. This experiment may be obvious to you, but it's not to the rest of us.

      Write a paper and describe your experiment; and don't complain about physicists not being interested in foundations until after your paper has been out for a while and nobody has paid any attention to it.

    13. Peter,

      Any theory that completes quantum mechanics will be non-local in exactly the same way as quantum mechanics is non-local, so Bell-type experiments do not tell you anything.

      "Write a paper and describe your experiment; and don't complain about physicists not being interested in foundations until after your paper has been out for a while and nobody has paid any attention to it."

      I published that paper in 2011 and no one paid any attention to it. According to your own standards I therefore have permission to complain.

    14. Sabine,

      I completely agree with this statement:

      Any theory that completes quantum mechanics will be non-local in exactly the same way as quantum mechanics is non-local...

      But then I don't understand how you reconcile that view with this criticism of dBB:

      The theory should be local, which dBB is not.

    15. I would not say QM is incomplete, but rather it has yet to be 'interpreted' correctly.

      Unfortunately, most theories and experiments assume the Copenhagen Interpretation, either implicitly or explicitly.

      What we really need are more 'model independent' measurements.

  12. Dear Prof Sabine,
    What if the 'macroscopic concepts like observers and their knowledge' are arbitrary in a Psi-epistemic understanding, similar to reference frames? A wave function can be written down for some arbitrary set of knowledge by a hypothetical observer. This wave function would be consistent with all possible wave functions for accurate sets of knowledge of all possible hypothetical observers. Nothing needs to be said about the nature of these observers. All interactions contributing to decoherence could be thought of as performing measurements.
    I'm a lay-person considering going back to school to study physics, so forgive me if these thoughts betray any profound lack of understanding. I've read your blog for a while now and find it incredibly enjoyable/insightful!

  13. Question for Sabine or anyone else:

    Now, if you believe that the wave-function merely describes the knowledge an observer has then you may say, of course it needs to be updated if the observer makes a measurement. Yes, that’s very reasonable.

    How is it consistent with quantum mechanics that the wave function only represents an observer's knowledge? After a wave function collapse you lose the interference effect from non-observed eigenstates. Successive measurements of the collapsed state will give different physical outcomes than if one measured the uncollapsed state. For example, the double slit experiment gives a different interference pattern if you measure which slit the photon goes through vs. not measuring which slit the photon goes through. It seems to me that saying the wave function only represents observer knowledge is just incorrect and not a matter of interpretation.

    Replies
    1. A quantum wave function in a ψ-epistemic interpretation is not real in any way. It is a solution to a wave equation that gives probability amplitudes that predict outcomes of measurements. In that setting the wave or ψ is not something that has existence. Erwin Schrödinger thought the wave was some sort of density that moved in space, but as Sabine says above we do not really measure the wave, we measure the system and find a particle. This gets a bit subtle, for weak measurements have some elements of actually measuring the wave. However, Bohr made the proposition that a quantum system is only described by a wave, but what is measured and has reality are the observable aspects of a system, often the particle. In this Copenhagen setting there is then a quantum domain of the world and a classical domain. The classical domain is what we actually observe, and a measurement apparatus is a classical system that exhibits a change when it measures a quantum system.

      There are ψ-ontic interpretations that confer a reality to ψ. The many worlds interpretation (MWI) and the rather cumbersome de Broglie-Bohm interpretation are of this nature. In MWI there is no collapse and the observer is what I call quantum frame dragged in Hilbert space along one path in the sum over histories of a path integral.

      I think it is not decidable whether ψ is ontological (real) or not. Maybe I will elaborate on this below later today or in the week. Measurement could be seen as a process whereby a quantum apparatus, as something ultimately composed of many quantum states, encodes quantum states with its quantum numbers. This in general might be seen as a Turing machine that emulates other Turing machines, or a proposition that acts on Gödel numbers (think of quantum states as qubits or quantum numbers), and this leads to undecidable propositions.

  14. Sabine wrote:

    “But for our purposes the most important property of the Schrödinger equation is that it is linear. This means if you have two solutions to this equation, then any sum of the two solutions, with arbitrary pre-factors, will also be a solution.”

    The Schrödinger equation is linear, but there is another requirement that the wave-function should be normalized to 1. If you add two solutions, the normalization of the new solution would be generally wrong.

    Then you claim that the measurement changes the normalization and therefore it is non-linear. It seems inconsistent that in one case you ignore the normalization and in the other you consider it important.

    The point is that I could write a linear unitary operator that would describe how a system in superposition “collapses” into the measured state. This seems to contradict your claim that a quantum measurement must be non-linear.

    Replies
    1. I do not "ignore" prefactors. But, as you certainly know, just to explain what a linear equation is, it is not necessary to say that the state is normalized to one.

      The normalization of the state is only one constraint, regardless of what the dimension of the state space is. This does not help you much.

      "Then you claim that the measurement changes the normalization and therefore it is non-linear."

      No, I have not made any such claim and I do not know where you get this from.

      "I could write a linear unitary operator that would describe how a system in superposition “collapses” into the measured state."

      Interesting. And what does that operator look like?

    2. Sabine wrote:

      “Interesting. And what does that operator look like?”

      It is just a 2 by 2 matrix. There are two options:
      U1=
      [1 1]
      [1 -1] / sqrt(2)

      U2=
      [1 -1]
      [1 1] / sqrt(2)

      Then, if the superposition before measurement is:
      psi = [1 1] / sqrt(2)

      You get:
      U1 psi = [1 0]
      U2 psi = [0 1]

      Whether the measurement process is described by U1 or U2 should be something that we cannot control. It could be decided by some hidden variable. This hidden variable could be a hidden spin that can jump between an up state and a down state, but it can never be in a superposition. The measurement is U1 if the hidden spin is up and U2 if the hidden spin is down.

      From Bell’s inequality we know that the dynamics of this hidden spin must be non-local, so I wouldn’t consider this to be a good physical model. But it is a linear model that demonstrates that wave-function collapse can be a linear process.
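
      A few lines of numpy confirm these matrices (a quick check, my code; the last line anticipates the objection raised in the reply below):

          import numpy as np

          U1 = np.array([[1,  1], [1, -1]]) / np.sqrt(2)
          U2 = np.array([[1, -1], [1,  1]]) / np.sqrt(2)
          psi = np.array([1, 1]) / np.sqrt(2)

          print(U1 @ psi)   # [1, 0]: "collapse" onto the first outcome
          print(U2 @ psi)   # [0, 1]: "collapse" onto the second outcome
          print(np.allclose(U1.T @ U1, np.eye(2)),
                np.allclose(U2.T @ U2, np.eye(2)))   # True True: both unitary

          # But the same U1 applied to a different initial state ends in a
          # superposition again, not in a detector eigenstate:
          print(U1 @ np.array([1.0, 0.0]))   # [0.707..., 0.707...]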

    3. Udi,

      And now please show that arbitrary initial states and up in detector eigenstates.

    4. Sabine wrote:

      “And now please show that arbitrary initial states and up in detector eigenstates.”

      I think you had some typo, I don’t understand what transformation you want me to show.

      The point is that if you give me a single initial state and a single final state, I could write a unitary operator that transforms between them. The only requirement is that they should have the same normalization, because a unitary transformation does not change the normalization.

      If there are multiple initial or final states, it becomes trickier. There is no unitary transformation that would transform 2 different initial states to a single final state. Even a non-linear reversible transformation cannot do that. It would require an irreversible transformation.

      But there is an easy way to “cheat” by adding hidden states. Then for whatever set of initial and final states you give me, I can add hidden states, which are orthogonal. Then I can write again a linear, reversible unitary transformation between these states. I guess that the way to prove that a system is non-linear would be to show that the number of required hidden states is ridiculously big. But even then, the real proof would be to write down a non-linear model that is actually simpler than the linear hidden state model.

      Bell gave us a detailed proof on the limits of local hidden variables explanations of quantum mechanics. I am not aware of any proof that shows that quantum mechanics has to be non-linear in the wave-function.
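
      To make that hidden-state "cheat" concrete (my notation, a sketch): a unitary U can send two different visible initial states psi_1, psi_2 to the same visible final state phi by shuffling the difference into orthogonal hidden records,

          U ( psi_i ⊗ |0>_hidden ) = phi ⊗ |i>_hidden,   i = 1, 2.

      Because the hidden records |1>_hidden, |2>_hidden are orthogonal, no information is lost, and U can be linear and reversible; the apparent many-to-one collapse only arises when the hidden states are ignored. Note this construction requires (psi_1|psi_2) = 0, since a unitary map must preserve inner products.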

    5. Udi,

      "If there are multiple initial or final states, it becomes trickier. There is no unitary transformation that would transform 2 different initial states to a single final state. Even a non-linear reversible transformation cannot do that. It would require a irreversible transformation."

      Indeed, which is what I have been saying. The issue of irreversibility, however, as I have said several times, is easy to solve. It's the non-linearity that's the difficult part.

  15. Neutrinos are particles that seem to defy easy description by a wave function. Which one of the three flavors are we going to choose in our mathematical setup? Perhaps it is too soon to ask that question as we don't really have a good fix on the masses. The detector in Italy has set the upper limit at 0.2 for the moment as reported in popular literature.
    At times the dictum of Born "shut up and calculate" appears premature.

  16. Sabine, as you aptly noted, a particle is always detected with 100% certainty, never 50%.

    However, I think it's worth emphasizing also that the result of a detection is never a particle, but only a new and more compact wavefunction. I'd like to address this briefly, since I think it unconsciously encourages an incorrect understanding of the measurement problem.

    The precision with which a measurement locates a quantum particle is inversely proportional to the energy used in the measurement, in keeping with Planck uncertainty. Thus if an experimenter uses a sufficiently energetic gamma photon, she can locate an electron to within a volume smaller than an atom, or smaller even than a nucleon. But if she uses only optical photons, the best she can do is locate the electron to within a sphere of 380 nanometers, or about 8000 silicon atoms. For an electron heading towards a silicon crystal screen with holes a millimeter apart, that is more than enough resolution to force the electron to behave like a particle and pass through one hole or the other.

    What's easy to forget is this: Within the range of uncertainty left after that optical detection, the electron remains fully quantum. Thus while the electron will no longer exhibit two-hole wave diffraction, it can (and will) still diffract like a wave as it interacts with the silicon crystal lattice surrounding those holes. Oops!

    The point is this: While measurement can dramatically alter the wavefunction of a particle, it never transforms the wavefunction into anything other than another wavefunction. The space and time scales of this transformation can be mind-boggling, such as when an intergalactic photon from the far side of the universe passes through a series of telescopic lenses and is then absorbed by an atom. However, the result is still a wavefunction, not a point. Even the final demise of that intergalactic photon as it is absorbed by an atom simply makes it part of the wavefunction of an atomic-scale system.

    The real reason folks say electrons are point-like, as opposed to protons and neutrons that have well-defined volumes, is that as far as any collider experiment has ever discovered, there is no limit to how far you can shrink the classical containment volume for an electron simply by adding more detection energy. Since the asymptotic limit of this process is a point, electrons are said to be point-like. However, this is in reality a limit that can never be reached.

    -----

    All the points I just made are 100% Richard Feynman compatible, to within at least 380 nanometers resolution :) . What I'm about to say below is just me:

    The maths of both spacetime and quantum field theory are based on assuming that point-like locations and point-like particles are real, and that is simply wrong.

    The reason is algorithmic. It is not possible to create an internally self-consistent (or efficient) formal framework that in its first step assumes the existence of entities that require infinite processing to exist. This is the deeper reason why early quantum field theory kept exploding into infinities, and it is why renormalization fixed that. Renormalization is in one sense just re-scaling everything in a way that keeps the computation process from ever assuming the simultaneous existence of both infinitesimals and larger-scale phenomena.

    The real fix? Make everything holographic. Universe-sized nothingness becomes the default, rather than point-sized infinitesimals. Sub-spacetime entangled webs of information then provide both structure and as-needed resolution. In this model even the statistics of wave measurement become mundane reflections of how every bit of information in the universe is entangled at some level with every other bit. This universally entangled data resource gives experimenters who set up temporarily unobserved (that is, quantum) regions with the ultimate in high-performance pseudorandom number generators when they re-entangle (observe) their experiments.

  17. These sorts of things - superposition, decoherence, measurement, whatever they are - happen not only to (so-called) particles, but to heftier objects:

    "Quantum superposition of molecules beyond 25 kDa"
    https://www.nature.com/articles/s41567-019-0663-9

    So whatever constitutive realism works for particles should work for these molecules.

  18. Sabine, Papageorgiou has a correct point. Unitary evolution applies to everything in the Universe, at all times, including things describable by the nonrelativistic Schrodinger equation as well as relativistic field theory, and probably gravity too, though we don't yet have a working quantum theory of it.

    You fixate on the "observed" quantity ONLY. And the wavefunction of that, projected out of the wavefunction of the "universe" (or rather the subset of that including the "apparatus necessary to generate the observed particle and observe it") really does (asymptotically) collapse after observation. So what you say is (asymptotically) true.

    But you "fixate" on the measurable observable alone. In reality it really is entangled with the apparatus, and you have project it out of the entire entangled wavefunction ... and the apparatus necessarily
    "amplifies" (in the standard sense of electronics [your own nervous system is electronic]) the mreaured result in such a way that the probability of the "unmeasured" branch of the observable asymptotically vanishes.

    It is not a valuable service to physics to claim that the universe is not unitary in its entirety.

    Trying to say that a special "measurement" breaks unitarity of the Universe as a whole is simply wrong. It's wrong to try to get people to believe this. That statement is quite independent of philosophical arguments about "Copenhagen", "many worlds", "psi-epistemic" and similar phrases, and separately independent of EPR/Bell.

    Replies
    1. dtvmcdonald,

      "And the wavefunction of that, projected out of the wavefunction of the "universe" (or rather the subset of that including the "apparatus necessary to generate the observed particle and observe it") really does (asymptotically) collapse after observation."

      I have no idea what that means. The measurement is the projection, qua assumption.

      "Trying to say that a special "measurement" breaks unitarity of the Universe as a whole is simply wrong. Its wrong to try to get people to believe this."

      This has nothing to do with belief. The measurement process isn't linear, hence it's not unitary. We know this observationally; it's not a subject of debate.

    2. Aha! Now I understand you, Sabine. You say: "I have no idea what that means. The measurement is the projection, qua assumption."

      Well, if you say that, there can be no discussion. It means that you are discussing simplified theoretical models rather than real measurements. The Born rule was specifically designed to do just that. It gives the correct answer for reasons that are well known. But real physical measurements are far more complex. And they are unitary, all the way into your brain. It would be good to emphasize this.

      True, it's unfortunate that the simplest reasoning for explaining that uses the mathematical tool of probability fluxes, which most unfortunately is the math behind Bohmism (but does NOT discuss "particles" and thus is not invalidated by the silliness of Bohmism).

      I will attempt more strongly in the future to avoid discussing this on your blog.

    3. The contribution of the Many Worlds interpretation is that it tells you exactly what the Born rule is for. In Many Worlds, QFT is a realist theory where the wave function evolves smoothly to describe all outcomes, each of them with smaller weights. One way the wave function can evolve is by entanglement as follows:

      (↑ + ↓)(environment)

      Evolves to:

      (↑)(environment↑) + (↓)(environment↓)

      Because the observer and any phenomena are completely contained in either the up or the down term, and the other term becomes inaccessible, we need a way to describe how phenomena appear to observers in that situation.

      Many Worlds tells you that the Born rule does exactly that. It applies when there is entanglement that spreads to include all phenomena of interest, and what it does is express how the phenomena appear to observers that are inside one of the entangled terms, as opposed to the complete but unobservable reality with many branches.
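
      In symbols (my notation, with the amplitudes made explicit):

        (α·↑ + β·↓)(environment) → α·(↑)(environment↑) + β·(↓)(environment↓)

      and the Born rule assigns the ↑ branch the probability |α|².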

      Ideally, you'd be able to derive the Born rule from the Schrödinger equation. But even if it's empirical that's a big improvement over other formulations that have some vague concept of what "measurement" is and lead to fuzzy thinking about consciousness etc.

      Personally I don't care if the other worlds are real or not - it may have some ethical implications but I'm not sure. Deriving the Born rule from calculus and the wave function may be an interesting technical project. But having two theories that always apply, the Schrödinger equation and the Born rule for entanglement, is much clearer than saying the universe behaves in two different ways depending on whether you look at it.

  19. Looking back, I think the core issue was exposed in my first class in QM formalism. The Schrödinger formulation does not describe transitions. As Erwin himself lamented: "If we have to go on with these damned quantum jumps, then I'm sorry that I ever got involved."
    It seems to me that what looks like a set of different issues can be sheeted-home to this one issue: The lack of a description as to what happens during a state transition. The fact that the equations are time-reversible could well be taken to be a symptom of their limitations, rather than a profound observation.
    I am old enough and lucky enough to have sat in a lecture by P.A.M. Dirac as an undergrad. To me, his work stands out as a direct challenge to the religion that the Schrödinger equation contains a complete description of reality and breaks relativity. It must be the opposite: by applying special relativity to QM, Dirac predicted spin and antimatter. To me, there is a whole lot more to QM and its relationship to relativity than is there, and we won't get to it while we stay stuck in mythology about cats in two states.

    Replies
    1. When you say "transitions" as "quantum jumps" you probably are discussing real photons. QM has to discuss real photons via perturbative kludges. To be unitary it has to include "states" both with and without photons.
      This can in fact be done and is treated correctly in the first few chapters of relativistic quantum field theory textbooks, but it's messy (as kludges usually are), and especially messy when discussing spontaneous emission. Ideally, one uses full QED. Doing this correctly involves very careful use of boundary conditions to avoid "infrared and ultraviolet catastrophes". See Duncan's book "The Conceptual Framework of Quantum Field Theory".

  20. As there is a general problem with the complete understanding of QM - especially around measurement - it is good to start by focusing on probably its simplest idealization: the Stern-Gerlach experiment.

    The initial hidden state of the magnetic dipole finally has to choose one of two discrete possibilities: spin up or down.
    Magnetic dipoles precess in strong magnetic fields - unless they have parallel or anti-parallel alignment - which suggests that they should choose one of these to minimize energy.
    Such alignment requires a change of, e.g., angular momentum, which by the conservation law needs a carrier for this angular momentum, like a photon.

    Considering a more complete picture, like:
    "unmeasured spin" -> "measured spin" + "photon"
    it becomes a reversible process - it could be unitary.

    This example is one of many arguments that measurement hides an interaction with the environment, for example through a photon. Being neglected in the QM description used, this interaction requires going outside the QM formalism during measurement.
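
    For reference, the standard bookkeeping behind that suggestion (textbook magnetic-dipole formulas, added here for clarity): flipping a spin from anti-parallel to parallel alignment in a field B releases the energy

        ΔE = 2μB = ħω,

    which a photon of frequency ω can carry off, together with one unit ħ of angular momentum.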

  21. De Broglie–Bohm theory offers a tempting perspective.

  22. What if eigenstates with 100% probability just don't exist except as necessary theoretical constructs? How could you tell the difference between a very, very spiky wavefunction and a Dirac delta one?
    Maybe the real universe doesn't have 100% probability eigenstates, just as it doesn't have real straight lines, as those would require point(s) (particles) with infinite momentum. Their non-existence is equally problematic for the human mind.
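
    To make the comparison concrete (a standard textbook observation, added for reference): a normalized Gaussian of width σ,

        ψ_σ(x) = (2πσ²)^(-1/4) exp( -(x - x₀)² / 4σ² ),

    is a physical state for every σ > 0, however spiky, while the σ → 0 limit (the Dirac delta) is not normalizable and so is not a physical state. No finite-resolution measurement can distinguish a sufficiently small σ from that limit.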

    Replies
    1. Ward,

      Yes, that is correct, eigenstates with 100% probability are mathematical abstractions. However, noting this does not solve the problem, as you will notice if you try to write down a process that accomplishes even such a peaking.

  23. Hi Sabine, this is great. As an experimentalist, can we talk about specific measurements? For photons, "state measurement" seems to be associated with photon annihilation... into some atomic excited state.
    Or there are electron spins. I'm less familiar with counting electrons, but I assume it also goes with capture of the electron by some atom. What states can we measure without destroying them in the process?

    The measurement problem looks like a myth to me. I see no problem, and I'm very interested in making and detecting single photons. Time-symmetry-wise, isn't the creation of single photon states the opposite of detection? (similar but very different.)
    George H.

  24. "but I don't know how to express 2+2=4 from the point of view of the left '2' either."

    Maybe this link helps?

    https://www.quantamagazine.org/with-category-theory-mathematics-escapes-from-equality-20191010/



  25. Hi Sabine,

    I've always had trouble with reductionism, as "the behavior of a large thing... should follow from the behavior of the small things that it is made up of", in the sense that, yes, it should follow, but it's not at all straightforward. I don't think one can simply say that since the underlying Schroedinger equation is linear, there must not be non-linearities in the equation governing the macroscopic detector. Or, put the other way round: since the detector is non-linear, there must be a non-linear quantum equation governing its behaviour.

    To illustrate: while the motions of, say, a classical fluid's particles are very well described by Newton's equations of motion, on a larger scale the behaviour of the fluid is described by the Navier-Stokes equation, which is thoroughly non-linear. Further, in the derivation of Navier-Stokes from non-equilibrium statistical mechanics, the non-linearity pops up as a consequence of the statistical description of the system. Why shouldn't the same be the case for quantum mechanics/the Schroedinger equation/the macroscopic detector? I.e., why do you think that the non-linearity is not a consequence of the statistics?

    Replies
    1. longrange,

      You need a theory from which you get back the linear Schroedinger equation, not a linear theory from which you get a non-linear process from which you get (presumably) the Schroedinger equation.

  26. This dilemma between the quantum and macroscopic worlds may center around what we mean by einselected states. Zurek defines these as quantum states for large action S = Nħ, N → ∞, that are stable under perturbations by the environment or reservoir of states. Is there a computable method or algorithm for computing these states? There of course can't be any inference on any particular outcome, unless there are local hidden variables, which would ultimately violate Bell's theorem.

  27. If a complex multistep process occurs under a state of superposition, then only the final result of that complex process will be detectable at the termination of that superposition state.

    The intermediate results will not be visible to the observer when the state of superposition is ongoing.

    For example, in a complication of the dead cat thought experiment, if the cat is killed by an exploding bomb while under a state of superposition, then the heat and blast produced by the explosion are not detectable by the outside observer. Only the final results of the action of the exploding bomb on the cat will be observed when the state of superposition is terminated.

    In a more real-world example, if a fusion reaction, a fission reaction, or a particle decay occurs under a state of superposition, then none of the intermediate products of these reactions would be observed. This means that neither the energy nor the secondary particles produced by the reaction would be detectable while the superposition state is ongoing.

    If this state of affairs is true, how do the energy and the intermediate particles affect the world both while and after the long duration state of superposition is in place?

  28. Sabine,

    I am in general agreement with your points (though no doubt you will fail to convince everyone who believes there is no measurement problem at all).

    But, I would quibble with the following:
    >"Quantum mechanics tells us that matter is not made of particles. It is made of elementary constituents that are often called particles, but are really described by wave-functions. A wave-function a mathematical object which is neither a particle nor a wave, but it can have properties of both."

    It seems to me that, in non-relativistic QM, the electrons really are particles: when you detect one, you can detect it at one particular point, just as you would expect for a particle. It's just that these particles' behavior is somehow controlled in a very strange way by the wave function (and all of us physicists argue over how that works!).

    On the other hand, in QFT, you really do have waves that cannot be isolated down to a point: you see this in the fact that the VEV for the product of phi and phi conjugate does not vanish at space-like separations (of course, the VEV of the commutator does vanish). I.e., create a scalar (say a Higgs) here and you can actually annihilate it there, even though "here" and "there" are spacelike separated.

    When I took QFT from Steve Weinberg, Steve tried to take a particle-based approach: the fields were just auxiliary operators with the particles being primary. I had an interesting discussion with him about this problem. (Tom Banks also discusses this in his Modern Quantum Field Theory: A Concise Introduction: Banks uses this to show why antiparticles must exist in QFT.)

    So, I don't think this is a purely semantic issue, as shown by the contrast between non-relativistic QM and QFT.

    Now, think all this through for fermions in QFT and... well, let's just say that Nature can be very confusing!

    All the best,

    Dave

    Replies
    1. Dave,

      "in non-relativistic QM, the electrons really are particles: when you detect one, you can detect it at one particular point, just as you would expect for a particle."

      There isn't such a thing as a point, and the resolution of position depends on what you measure. Suppose you measure the electron by letting it kick other electrons out of atomic shells (say, in a gas or such); then your location measurement will fundamentally be limited by the wavelength of the emitted light. Not that I am saying anything new here.

    2. Yes, Sabine, but in non-relativistic QM it does make sense to ask for the probability of finding the electron in as small a volume as you wish (you will of course pay a price in high momentum and energy due to the uncertainty principle). Of course, due to superposition, you do indeed often get contributions that add constructively from different, distinct possible discrete positions.

      But in QFT, it is not just quantum effects that "spread out" the position of the entities: in QFT, we are actually quantizing waves (hence the now deprecated terminology of "second quantization"), and those waves are intrinsically delocalized in a way not just due to the uncertainty principle (again: look at the VEV of phi times conjugate phi for spacelike separations). A rather different effect, as Banks discusses in his book and as Weinberg found when he tried to take a purely particle approach to QFT in the class I took from him back in the 1970s.

      By the way, this distinction becomes clear with a vengeance in Bohm-de Broglie mechanics, and practitioners of that subject have had to deal with it ever since Bohm's original work back in the early 1950s. (I'm not boosting Bohmian mechanics, just noting an interesting fact.)

      But the distinction is real, even in orthodox QM/QFT, as Banks discusses.

  29. I don't know that any particle of spacetime can't be part of the wave function collapse - so it's not that there is a particle, but that a particle, out of many unmeasurable particles, can suddenly be measured due to the wave function collapsing at that point, once the activation energy at that particular point has been breached.

  30. When the Psi (Ψ) wave-function collapses, the CPT symmetry is broken
    / the measurement can no longer be used /.
    Many hypotheses were suggested to understand the
    Psi wave-function collapse . . . but one idea was missed:
    "what if the Psi (Ψ) collapse depends on wave-particle duality?"

    Replies
    1. QT busies itself with "particles having wavelike behaviour",
      with the "Wave/Particle Duality", and this dualistic behaviour
      seems to be inconsistent with our ideas about causality.
      There is a causality for every action in this universe.
      Our difficulty in identifying it is simply a reflection of our lack
      of understanding of nature at the very small/very fast scale.
      QM is not a "counter intuitive" theory.
      Solving the "Wave/Particle Duality" puzzle would solve the
      misunderstanding of QM theory
      ===

    2. Israel: "There is a causality for every action in this universe" is an assumption, there is no proof of that. For example, we can describe the decay of radioactive materials statistically using the half-life, but we cannot identify a specific cause of why an atom or particle decays at a particular time. It is random, and for all we know uncaused.

    3. Dr. A.M. Castaldo 11:26 AM, October 23, 2019
      What is the specific cause of the decay of radioactive materials?
      Why does an atom or particle decay at a particular time?

      Maybe because every atom, particle, cell, living creature,
      planet, star, galaxy has its own particular time of existence.
      ===

    4. Israel: Maybe because every atom, particle, cell, living creature, planet, star, galaxy has its own particular time of existence.

      Well, I can't imagine any reason I'd argue with that!

      Good talk.

    5. The wave function is connected with observations by its dualistic structure.
      "Observation - Detection" is nothing more than a moment when the system
      of the wave function is forced to reveal its state via a change.
      What happens during the collapse of the dualistic wave - Ψ - function?
      Then the "template of the probabilistic (Ψ) wave function"
      gets abruptly modified as a quantum particle . . . neutrino? . . . tachyon?
      ======

  31. (Very) small typo, 1st paragraph:

    A wave-function *is* a mathematical object...

    Replies
    1. Andrea,

      Thanks for spotting, I have corrected this.

  32. --------------------Part 1-----------------------------------------
    The Heriot-Watt, Edinburgh experiment shows us that "recording" does take place at the particle level in quantum mechanics, just like what the theory of Quantum Darwinism tells us. Why is this important?

    Starting with qualification, that is, applying qualities or attributes to a person or thing, we have the adjective in Grammar. For example, he is a "good" boy. Here "good" qualifies the person boy; shall we look at it this way: we apply the quality or attribute "good" to the thing or person boy, and the boy takes on that quality or attribute. At first there was the word or language, then there was math. Initially math had no notion of qualification; there was only the notion of quantities and numbers. Then there was the advent of qualification in math with the need to talk about loss or gain in business. So, we had "-"1 or "+"1 where the qualities or attributes "-" or "+" told us how "good" or "bad" our business was doing; or whether the boat was travelling "upstream" or "downstream"; or how "hot" or "cold" the body was. Connecting the dots or hyperlinking, just as we applied the quality "good" to the person boy in language or grammar, we now apply the quality "-" or "+" to the number 1 or to a variable 'a' (-a or +a). By doing this we introduced qualification in math, and anything assigned to the variable 'a' will either take on the quality "-" or "+". What I'm getting at is that we did not arrive at the notion of qualification in math by accident or inadvertently; rather, we were bound to arrive at it because we were responding from our background, which is qualification in language, which in turn is because the human mind abstracts and discriminates based on qualification to ascertain what is harmful or useful for its survival. In short, we are responding from our background.

  33. ------------------------Part 2-------------------------------------
    In grammar we can apply a group of qualities or attributes to a person or thing, for example, "tall, lean, handsome, good" boy. We can give the group "tall, lean, handsome, good" a single name or group name "ladies-man" and substitute the group name for the bunch of qualities as "ladies-man" boy. For a long time in math we could not apply a group of qualities to a variable or number, but then came computer science, where we could do that. Consider a group of 4 qualities or attributes: "numeric, integer, a range of -32768 to +32767, 2 bytes space in memory". This group is applied to a variable 'a' thus:
    (numeric, integer, a range of -32768 to +32767, 2 bytes space in memory) --> a. We can give the group a single name or group name "int", then the statement becomes (int) --> a. Now the variable 'a' takes on all these qualities; thus, we introduced the notion of multiple qualification with the advent of computer science. But these qualities or attributes are sort of hardwired; how nice it would be if we could create our own bunch of qualities, then apply it to a variable, and then that variable would take on a complex set of qualities. Thus were born user-defined datatypes and object-oriented programming. I can take both 'int' and 'char' groups and put them under one umbrella which I may call by any name because it is my own creation; let us call it "alphanum", therefore "alphanum" = (int, char), and I apply it to the variable 'a' thus: (alphanum) --> a. Now 'a' is no longer a simple variable; rather, it is a compound variable, in that one part of it takes on integers and another part of it takes on characters. Such a compound variable is called an object.
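
    A minimal code sketch of that "alphanum" compound variable (my own illustration; the name and parts are just the commenter's example):

        class Alphanum:
            """A compound variable: one part takes on integers, another characters."""
            def __init__(self, i: int, c: str):
                self.i = i  # the 'int' part
                self.c = c  # the 'char' part

        a = Alphanum(42, 'x')  # 'a' is now an object carrying both kinds of quality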

    What I am getting at is that we did not arrive at Object-Oriented Programming, or OOP, by accident or inadvertently; we were bound to arrive at it because we were responding from our background. That is, we ourselves were biologically and psychologically programmed by nature. It is because programming is hardwired into us that we ourselves arrived at computer programming, which means it is not a coincidence. We are only responding from our background!

    Taking this reasoning further, the fundamental program of life is to record, memorize, and then respond from that memory. Life would not have appeared on earth if life itself did not respond according to a background. That is, life would not have been capable of "recording" if recording was not already going on in the inanimate world. And working backwards to the quantum world, I suspected that "recording" must be going on at the quantum level too. And the Heriot-Watt experiment in Edinburgh shows that "recording" does take place at the quantum level.

    Replies
    1. Gokul,

      You said "… life would not [be] capable of 'recording' if recording was not already going on in the inanimate world … working backwards … I suspect that 'recording' must be going on at the quantum level too."

      I like your argument, which I interpret as a fractal symmetry hypothesis: The recording of data by computer systems is made possible by the recording of data in organisms with complex neural systems… which is made possible by the recording of data by living cells via DNA and RNA… which is made possible by the persistence and information-storing composability of carbon-based organics… which is made possible by a solar system with worlds where moderate conditions endure for eons… which is made possible by particles that form stable elements that form suns and planets… which is made possible by the curious existence of ½ spin particles called fermions that are willing to combine but refuse to share space, thus enabling structure… which are made possible by the rules of the Standard Model, and of the infrastructure behind that model and the rest of physics that we still do not fully understand.

      Your description may be more poetic than strictly scientific, but I like the image it evokes. For me that image is itself a fascinating and scientifically quantifiable hypothesis, one I think is worthy of exploration: The universe is a fractal tree of persistence, one in which increasingly complex forms of 'recording' build upon the earlier branches over time… with time itself being made possible by this tree of persistence, since without such persistence there is no way to measure change.

      Thanks for the interesting comment!

    2. We know that the physicist Wojciech Zurek put together the theory of Quantum Darwinism. Now, QD tells us that the interaction of an electron or photon with others of its kind is "recorded" by the surrounding particles. We learn from this article by Philip Ball, https://www.quantamagazine.org/quantum-darwinism-an-idea-to-explain-objective-reality-passes-first-tests-20190722/ that three separate experiments were conducted by three separate teams in different parts of the world that show that QD has passed its first tests. The teams are these:
      1. Chaoyang Lu and Jian-Wei Pan of the University of Science and Technology of China in Hefei
      2. Physicist Fedor Jelezko at Ulm University in Germany, in collaboration with Zurek and others, used a very different system and environment, consisting of a lone nitrogen atom substituting for a carbon atom in the crystal lattice of a diamond — a so-called nitrogen-vacancy defect.
      3. And a team from Italy

      Philip Ball writes about the results of the tests:
      "By controlling and monitoring the spins using lasers and radio-frequency pulses, the researchers could measure how a change in the nitrogen spin is registered by changes in the nuclear spins of the environment. As they reported in a preprint last September, they too observed the characteristic redundancy predicted by QD: The state of the nitrogen spin is 'recorded' as multiple copies in the surroundings, and the information about the spin saturates quickly as more of the environment is considered."

      Please note the word 'recorded' in the above quote. This means that 'recording' is going on at the quantum level.

      Honestly, very honestly, I did not know that a theory called Quantum Darwinism existed when I suspected that 'recording' must be taking place at the quantum level. Only last month, I chanced upon Philip Ball's article in Quanta Magazine. I arrived at the suspicion through a totally different means which I have expressed in this blog post and in other blog posts of Prof. Hossenfelder.

      My reasoning took the particular turn out of understanding J Krishnamurti's talks. Mathematics is not the only means to arrive at inferences, I mean, there might be a more direct perception.

      Record, memorize, and then respond from that memory is, I suspect, the fundamental design pattern of nature.

    3. In my previous reply, I said the fundamental design pattern of nature is to record, memorize, and then respond from that memory, which so effectively plays out from the quantum level to the thinking process in the human mind. But what do I mean by "respond from memory"? If, after recording, the behavior of the particle changes, then it is responding from memory. This is just as the human mind's world view changes after it records a learned bias. Learned biases can be a great impediment to further learning. For example, the hypothetical ether was an impediment to understanding the behavior of light. Here, the hypothetical ether is a learned bias as much as Newtonian absolutes are learned biases. That is why Aldous Huxley said "Einstein broke the Newtonian orthodoxy".

      So, if we can experimentally show that the behavior or nature of matter changes after recording, then we can catch nature in her act.

      By understanding how the mind works, not the brain, we can unravel how nature works at the fundamental level, or perhaps at every level, because the mind is the software program which emulates the hardware program, that is, the programmed physiology and biochemistry. These are programs, whether hardware or software, because they function in a pattern. And working backwards we see that matter itself is a program. The electron is a program. "Energy in a pattern is matter" - J Krishnamurti. And anything that functions in a pattern is a program.



    4. ...And nothing can function in a pattern if there was no record of the pattern; otherwise, how could the pattern repeat? This means that matter is a "record". There is "recording" going on in the double slit experiment when we introduce the detector as an observer. The electron "records" its interaction with the detector or the observer as an influence of the observer.

    5. "What is thinking. Thought is the response of memory. If you did not have memory you cannot think." J Krishanmurti. This memory came into being because the brain registers or records--also from J Krishnamurti. Now, extrapolating, thinking is a software process of the mind. Nature through evolution would not have arrived at this software process if such a process was not already going on at the hardware level or as a hardware process itself. So, working backwards, that is, through regression, we suspect that hardware registering or recording must have been going on at the quantum level-- recorded energy patterns. When nature arrived at the thinking process, it was responding from its background. Nature always responds from its background or fallback on its background.

      Starting from the quantum level, the recording and programming gets more and more complicated or complex until we arrive at the most complicated or complex hardware program known to man, which is the human brain.

      Also, starting from the quantum level, which is steeped in uncertainty, as we proceed towards the human brain, all along it becomes more or more deterministic. That is, there is a movement from uncertainty to certainty and determinism. In this sense, there is increasing entropy of a particular system.

    6. In my previous comment I used the word regression. This is somewhat akin to the use of the word regression in the linear regression of Artificial Intelligence. A conventional program takes data as input, processes the data, and produces data as output. However, in AI, we have lots of output but we don't know the mechanism that produced the output. If only we knew the mechanism or the underlying function, we could then extrapolate or interpolate. So, from what is produced we must get to the mechanism or the underlying function that produced it, which is regressive, or regression. So, generally speaking, given examples or related outputs, the machine learning algorithm deals out the underlying function.
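
      A minimal sketch of that recover-the-function idea (my own toy illustration, a least-squares fit):

          import numpy as np

          # Outputs produced by a hidden mechanism y = 2x + 1, plus noise.
          rng = np.random.default_rng(3)
          x = np.linspace(0, 10, 50)
          y = 2 * x + 1 + rng.normal(0, 0.1, x.size)

          # "Regression": work backwards from the outputs to the underlying function.
          slope, intercept = np.polyfit(x, y, 1)
          print(slope, intercept)  # close to 2 and 1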

      What I am getting at is that, from the emergent, through regression, we get to the mechanism from which the emergent emerged. This mechanism is nature "responding from its background." That is, the thinking process, which is a software process, emerged from a similar hardware process of registering or recording, memorizing, and then responding from that memory. The jump is from hardware processing to software processing. Because hardware recording was going on at the quantum level, eventually, through evolution, the brain ended up recording sensory input as software images.

      Now, speaking of "responding from memory", can we show experimentally that after recording at the quatum level the behavior or nature of electrons, for example, changes from wave nature or particulate nature; some change or some relationship with other electrons emerging as a consequence of recording its interactions with other electrons or photons. Before recording, it was one thing, after recording it is a modified thing, and as a consequence, the modification has a bearing out its behavior or nature.

    7. correction: when moving from uncertainty to certainty or determinism the entropy of a particular system "decreases".

    8. ...entropy not in the sense of disorder, rather in not seeing a pattern or order. When you don't see a pattern or order in something, it doesn't make sense, there is uncertainty, an insensibility. So, uncertainty may not imply disorder, rather it may imply insensibility based on the human biological or psychological program.

      Consider several data; the data together seem scattered, in that there seems to be no relationship among them. This is because of the limitation of the human program to make sense of, or see a relationship among, the data, or a pattern in the data. But the actuality may be that the data are related; so, referentially, according to the human program, because of its incapacity to see a pattern or a relationship, there is disorder or maximum entropy. Then through learning the human program or the mind begins to see a pattern or some relationship among the data, which means there is a turnaround, in that the entropy begins to decrease. And once the human mind comprehends the complete pattern or the complete relationship among the data, the entropy hits a minimum. So, isn't entropy referential? Uncertainty may not mean disorder.

  34. We talk about "particles having wavelike behaviour and particulate behaviour, the famous "Wave/Particle Duality", but this is inconsistent with our ideas about causality. I just do not buy the idea that QM is "counter intuitive" and that we must shut up and calculate. There is a causality for every action in this universe. Our difficulty in identifying it is simply a reflection of our lack of understanding of nature at the very small/very fast, scale. What if our "environment" was wavelike, imparting wavelike behaviour onto the particles existing within the environment? This would explain much. I postulate that time itself is wavelike in the sense that its rate of progress (the rate of change of all actions), varies cyclically over duration. Time is wavelike. Space time is wavelike. This view of time, (the energy field we experience as time, I suspect the Higgs field), answers many conundrums in science and provides the detailed causality which is missing in our understanding of the very small, very fast, world, (QM). I know you won't post this Sabine and that you will dismiss it as "Spam", but may I respectfully suggest before you do dismiss this view, you read the book I sent you in its entirety, "The Binary Universe", A Theory of Time?

  35. Sabine,

    Your reply to Peter Shor is indeed the solution to the measurement problem:

    "The point is that this does not work if you assume that the quantity you are talking about is ontic, and if it's epistemic you run into conflict with reductionism in your axioms, unless you accept that quantum mechanics is fundamentally incomplete. Which is really the only consistent conclusion."

    What I do not understand is what's wrong with the EPR argument, which proves the same thing. Let's look again at EPR's reality criterion:

    "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of reality corresponding to that quantity."

    Let's apply this to an EPR-Bohm setup with detectors fixed on the same axis. We get the following:

    If locality is true (a measurement on A does not disturb B), it logically follows that the state of B before and after A is measured has to be the same. But we know that the state of B after A is measured is a state of definite spin (with a value opposite to the measurement result on A); therefore the state of B even before A was measured was also a state of definite spin. So, we have a solid proof that superposition states reflect our lack of knowledge regarding the true state of the system, which must be described in terms of hidden variables. The measurement reflects a change of description from a statistical one (the wavefunction) to the exact, observed one.
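
    For reference, the state being discussed is the usual spin singlet (standard notation, added here):

        |ψ⟩ = (1/√2) ( |↑⟩_A |↓⟩_B − |↓⟩_A |↑⟩_B ),

    for which a measurement on A along the common axis predicts the outcome at B with certainty, and with the opposite sign.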

    A few words about MWI now. I think I am able to formulate a clear and simple argument against it. I will focus on the inability of MWI to derive Born's rule from the wavefunction. If we ignore circular "derivations" such as Deutsch's, we get the answer that MWI can just introduce Born's rule as a postulate; after all, that's what other interpretations do. I think that for MWI such an approach is inconsistent. Let's see:

    1. The wavefunction of the universe represents a complete description of physics.

    2. That means that the probabilities for experimental results should be calculable from the wavefunction.

    3. We do not know at this time the result of such a calculation.

    4. But MWI supporters insist on postulating what this result should be (Born's rule).

    But this does not make any sense. It's like string theorists postulating that the standard model should come out of their theory and declaring victory, which is absurd. One cannot postulate what, according to the theory, should be derived. So, at this time one cannot grant MWI the status of a valid interpretation. It's just a hypothesis waiting for confirmation or disconfirmation. If MWI is able to get probabilistic predictions out of the wavefunction, we will see whether they match Born's rule or not.

    Replies
    1. It is actually possible to derive Born's rule in MWI in the following sense: it holds in "most" branches of the multiverse. Here "most" does not come a priori with a probabilistic meaning (this would be circular reasoning, since one main goal is to justify the introduction of probabilities); the meaning is that the sum of squared amplitudes of the discarded branches is vanishingly small. Note that discarding some branches is necessary because there are actually branches (with vanishingly small amplitudes...) where Born's rule does not hold.
      As a probabilistic analogy: if you toss a fair coin infinitely many times, there is a point in the corresponding probability space where "heads" is never obtained. This is a point (or "branch", in MWI parlance) where the frequentist interpretation of probabilities breaks down.

      You may also object to the notion of "branches", but this is supposed to be explained by decoherence (and yes, this may still be work in progress).

    2. Of course, you can't get Born's rule out of the MWI. The MWI just gives a quantum state of the universe. The wavefunction is just a (very complicated) function; how can you derive probabilities from a function? And why can't you attach any set of probabilities you want to this function?

      Expecting to derive Born's rule just from the MWI is asking the impossible. Shouldn't it be sufficient to show Born's rule is consistent with the MWI? Better yet would be showing that Born's rule is the only set of probabilities consistent with the MWI and a few additional assumptions (hopefully ones as reasonable as you can find).

    3. Pascal,

      "It is actually possible to derive Born's rule in MWI in the following sense: it holds in "most" branches of the multiverse."

      Can you explain or give a reference to the above statement?

    4. Peter,

      Is there another fundamental entity in MWI besides the quantum state of the universe? If so, what is it? In Bohm's interpretation, for example, there are also particles, and the Born rule is related to the distribution of those particles. In order to "attach" probabilities in MWI you also need "something" in the ontology of the theory. It's not clear to me what this supplementary ontology could be.

      On the other hand it is possible to get probabilities in MWI by counting branches, so it does not seem to be an impossible task. The problem is that the probabilities derived in this way are wrong.

    5. Andrei: as far as I know, the wavefunction of the multiverse does not have a probability distribution that is inherently attached to it, so saying "most branches of the multiverse" isn't well-defined. When you attach a probability distribution to the multiverse, you are implicitly assuming Born's rule (which I don't think is necessarily a bad thing, because I think it's unavoidable).

    6. Peter,

      Any theory should define its ontology (what, according to that theory, exists) and some physical laws that describe how the elements of that ontology evolve. Any experiment should be described in terms of the above concepts. If the theory has particles as fundamental entities and those particles interact according to Newton's laws, you should explain the results of any experiment in terms of particles obeying Newton's laws. You cannot simply "attach" a piece of mathematics to that theory (like, say, the energy levels in an atom) and claim that the theory works. If you cannot explain the hydrogen spectrum in terms of particles obeying Newton's laws, the theory is falsified.

      In MWI the ontology consists of the quantum state, and Schrodinger's equation describes its time evolution. If MWI is correct, one should explain the experimental observations in terms of the quantum state evolving in accordance with Schrodinger's equation. If it cannot be done, MWI is falsified. I have nothing against the idea of adding a new piece of ontology or a new physical law to the theory. But Born's rule is not introduced in this way. Born's rule is not an additional entity, independent of the quantum state, nor a new physical law that acts together with Schrodinger's equation. It's just postulated as an emerging property of the quantum state evolution. And such an approach cannot be accepted. You cannot postulate that QM emerges out of Newtonian mechanics or that the Standard Model emerges out of strings. You need to prove it. Otherwise it's cheating.

    7. Andrei: I'll try to sketch the derivation in a simple situation. Suppose you make an experiment with two possible outcomes ("0" and "1"), with probability ½ for each according to Born's rule. In MWI you obtain 2 branches, each with squared amplitude ½. If you repeat the experiment n times, you obtain 2^n branches, each with squared amplitude 1/(2^n). Fix some epsilon and consider the branches with a deviation of at least epsilon between the proportions of 0's and 1's in the n outcomes. Let's try to bound the sum of squared amplitudes of said branches. Since they all have the same amplitude, this amounts to "counting branches", and we are in the same familiar situation as in probability theory: by the law of large numbers, when n goes to infinity most branches will have deviation less than epsilon.

      This was a simple situation with probability ½ for each outcome, but this argument should work in greater generality: even with different probabilities for each outcome, the conclusion will be that the sum of squared amplitudes of branches with deviation > epsilon from Born's rule will go to 0 as n goes to infinity.
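
      A quick numerical check of that concentration claim (my own sketch; grouping the 2^n branches by how many 1's they contain gives binomial weights):

          import numpy as np
          from scipy.stats import binom

          # Total squared amplitude of branches whose frequency of "1"s
          # deviates from p by more than eps, after n repetitions.
          def deviant_weight(n, p=0.5, eps=0.05):
              k = np.arange(n + 1)
              weights = binom.pmf(k, n, p)       # weight of all branches with k ones
              deviant = np.abs(k / n - p) > eps
              return weights[deviant].sum()

          for n in (10, 100, 1000, 10000):
              print(n, deviant_weight(n))  # tends to 0 as n grows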

    8. Andrei, the wavefunction of the universe does not come with instructions on how to attach probabilities to its components. Suppose I give you a function f(1)=a, f(2)=b, f(3)=a, ... Please tell me what probability distribution I should attach to {a,b,c,d, ...} and why this is uniquely defined by the function.

      Similarly, suppose I give you a wavefunction that is a state of a Hamiltonian (that evolves over time, naturally). How do you get probabilities from it without using Born's rule?

      And I don't see how adding observers (which are in some sense partial projections of the wavefunction, and which is what you need to do for the MWI) makes introducing probabilities any easier.

    9. Peter,

      The "instructions" are clear once the physical meaning of the wavefunction is known. If it gives you the particle density in a certain location you get the probability that a particle exists at that location. If it gives you the magnitude of a field you investigate what are the effects of that field upon particles and, again, you get a probability that a particle might be there and so on. Of course, you use Born's rule, but its use is justified. If the wavefunction determines the position of a particle, as in Bohm,s interpretation, the probability of a particle being at that location is given by Born's rule (assuming a certain initial particle distribution - the so-called quantum equilibrium hypothesis)

      MWI claims that, regardless of the amplitude, all possible results will be instantiated. If n locations are available, a measurement will result in n worlds. There is a straightforward way to get a probability, and that probability is 1/n to find yourself in any of those worlds. But this is different from Born's rule. So, what meaning would you give to the amplitude so that the use of Born's rule in the context of MWI is justified? Are some "worlds" more "real" than others?

    10. "MWI claims that, regardless of the amplitude, all possible results will be instantiated. If n locations are available a measurement will result in n worlds. There is a straightforward way to get a probability, and that probability is 1/n to find yourself in any of those worlds."

      Assuming that the chances to be in any one of the many worlds are equal. Why would they be?

    11. Pascal,

      You did not specify the physical meaning of the amplitude. If your squared amplitudes are different from 1/2, say 1/10 and 9/10, your calculation stays the same, right? You have the same number of branches, 2^n, with different amplitudes this time. So how do you arrive at Born's rule? Would you only count those with a large amplitude? Why?

    12. Martien,

      If you "split" in two copies and each of them will experience an outcome how could the probability be any different than 1/2?

    13. Pascal,

      The derivation you are giving for the Born rule actually assumes the Born rule. The 50/50 initial chance can only be calculated from Born’s rule.

      You also wrote:

      “Note that discarding some branches is necessary because there are actually branches (with vanishingly small amplitudes...) where Born's rule does not hold.”

      There are branches with astonishingly small amplitudes, but there is no need to discard them by hand. Actually you need to be very careful before discarding them, otherwise your calculations might show inconsistencies.

      Also, can you give an example where Born's rule does not hold? I am not aware of any such examples.

    14. > Would you only count those with a large amplitude? Why?
      Andrei: each branch is counted according to its squared amplitude - so branches with a large amplitude have a larger weight, but they are all counted.

      > Also, can you give an example where Born's rule does not hold? I am not
      > aware of any such examples.

      Udi Fuchs: by that I just meant that there are branches where Born's rule does not hold in a frequentist sense. For instance, in my previous example where the 2 outcomes are equiprobable, there is one "freak branch" where the n outcomes are all equal to 0. This is not any deeper than the similar observation in the probabilistic setting: if you throw a fair coin n times, it is possible to obtain no "head" at all in the n outcomes. Still, it is perhaps interesting to point out that a physicist living in the "freak branch" would have a hard time finding out what the rules of quantum mechanics really are.

    15. Pascal,

      "each branch is counted according to its squared amplitude - so branches with a large amplitude have a larger weight, but they are all counted"

      What does it mean for a branch to have a larger weight? What is the physical meaning of that? Does a branch with a larger weight correspond to more worlds than a small weight one?

  36. Physicists - it seems - talk as if there were a canonical catechism (like the Westminster Shorter and Larger Catechisms) which defines the bedrock foundational language ("state-vectors, self-adjoint operators" etc.*) for current and any future quantum mechanics. But one could at least start from the beginning with a different language.

    * Quantum Dynamics without the Wave Function
    https://arxiv.org/abs/quant-ph/0610204

  37. Only through measurement do we know a superposition, but then measurement destroys the superposition and there are definite outcomes. The superposition must be untouched by measurement.

  38. Thinking about quantum interpretations sometimes gives me a headache. I often find solace in Mermin's "shut up and calculate." I think it is fair to say the Copenhagen interpretation, and the various neo-CIs that accompany it, are ψ-epistemic, but not all ψ-epistemic interpretations are CI or neo-CI.

  39. -there are neurosurgeons that carefully open the skull of the patient and operate their brains
    -hey, but wait! a neurosurgeon cannot chop her own brain! THEREFORE the neurosurgeon cannot operate any brain, and obviously no one can chop the brain of this neurosurgeon; or in fact, any neurosurgeon. so, what would be so special about the neurosurgeons' brains? it makes no sense. neurosurgery does not exist at all!
    and so on and so forth

    once you accept this type of logic, multiuniverses, pilot waves, ghosts whatever becomes acceptable. the problem then is just to develop a whole new lexicon just to catalogue all the epistemic, ontological, neo, modern, post-modern, categories and so on and so forth again

    meanwhile, in copenhagen, people just go to work

  40. Pardon my ignorance (I am a physicist, but not an expert on the foundations of quantum physics), but how can we even talk about the measurement problem if we have no idea how the system of interest interacts with the apparatus? Yes, of course the Schroedinger equation cannot explain the collapse, because it does not include the system plus the apparatus in its description. So collapse is, in that sense, our modelling of the interaction between the system and the apparatus, based on observations. Has there been any work on modelling this interaction?

    Replies
    1. Juraj,

      The Schroedinger equation applies to both the prepared state and the detector.

    2. Can you show me a single example where an explicit Hamiltonian has been written down for the apparatus, with an explicit interaction term between the apparatus and the system? I do not remember seeing anything like that during my physics education, especially not for the double slit experiment. I think the measurement problem is about understanding that interaction, and the collapse is a built-in simplification of that interaction based on observations.

    3. Sure, just take the standard model.

      More seriously, people are using toy models for this where the detector is modeled by, you know, a lot of harmonic oscillators because everything is a harmonic oscillator. In any case, you find this if you look at the literature about decoherence and einselection.
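
      For concreteness, a minimal sketch of that kind of toy model (my own illustration; real treatments typically couple the system to a bath of harmonic oscillators, replaced here by environment qubits for brevity):

          import numpy as np

          # One system qubit coupled to N environment qubits. The |0> and |1>
          # branches imprint different states on each environment qubit; the
          # system's coherence (off-diagonal of the reduced density matrix)
          # is suppressed by the overlap of the two environment states.
          rng = np.random.default_rng(1)

          def coherence(n_env):
              thetas = rng.uniform(0.1, 1.0, n_env)
              overlap = np.prod(np.cos(thetas))  # <E0|E1> factorizes over qubits
              return 0.5 * abs(overlap)          # off-diagonal element of rho

          for n in (1, 5, 10, 50):
              print(n, coherence(n))  # decays roughly exponentially with n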

  41. Sabine wrote:

    Decoherence tells you that if you average over the states of the environment, because you do not know exactly what they do, then you no longer have a quantum superposition. Instead, you have a distribution of probabilities. This is what physicists call a “mixed state”. This does not solve the measurement problem because after measurement, you still have to update the probability of what you have observed to 100%. Decoherence does not tell you to do that.


    As you say yourself, what you get after decoherence is a (classical) probability distribution. Why has no one before quantum mechanics been worried about the "measurement problem in probability theory"? Why isn't it a problem to update a classical probability distribution to 1 for "tails" once I look under my hand to uncover which side my coin has landed on? What is the exact problem I run into with a diagonal density matrix that I don't run into with probability theory and statistical mechanics?
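
    (The averaging step quoted above can be made concrete with a minimal sketch of my own: averaging a superposition over unknown environment phases leaves a diagonal, i.e. mixed, density matrix.)

        import numpy as np

        # rho for (|0> + e^{i phi}|1>)/sqrt(2), averaged over random phases phi.
        rng = np.random.default_rng(2)
        n = 100_000
        rho = np.zeros((2, 2), dtype=complex)
        for phi in rng.uniform(0, 2 * np.pi, n):
            psi = np.array([1, np.exp(1j * phi)]) / np.sqrt(2)
            rho += np.outer(psi, psi.conj()) / n

        print(np.round(rho, 3))  # ~[[0.5, 0], [0, 0.5]]: probabilities, no interference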

    Replies
    1. In statistical mechanics you are talking about an ensemble. That's basically a hidden variables theory when it comes to quantum mechanics. This is a consistent solution to the measurement problem, and for all I know it is the only consistent solution.

      If you are talking about probability theory (say Bayesian inference), this is not meant to be a fundamental theory of nature, so no conflict with reductionism.

  42. Conscious minds make observations. I expect that nonconscious objects can exist in a state of all possible conditions indefinitely.
    I imagine the collection of all possible events conforming to known physical laws to be like a vast static mass (at least 10^10^100 possible states, if Sean Carroll is onto something), while conscious minds are like veins of ore embedded in the (highly multi-dimensional) mass, experiencing the passage of time as a prerequisite of the state of being conscious.
    I imagine these veins fanning outward into the past and the future in a multitude of near-identical states, giving rise to the expectation of the probability of one experience following another.
    We can test this multiple-paths interpretation given sufficient patience. I declare right now that I will have 150 years' worth of experiences, even though none of the billions of humans who have preceded me have ever been observed to live 120 years.
    When I reach 150, the improbability of my conscious state will be so unlikely without resorting to multiple-paths that it makes multiple-paths the most likely explanation for my endurance.
    See you then.

  43. POINTS!!! I read lots of stuff about points, where we have this idea of elementary particles as points. In quantum mechanics there is the old idea of a wave-particle duality, where the particle appears as a point. Really, if you think about it, the detection of a particle happens more on a spot. This is, as someone wrote above, more like a sharp Gaussian function that in a mathematical limit is at a point. Points are mathematical abstractions, and there is a vast body of work on point-set topology around this. This is the most consistent approach to the foundation of real numbers and spaces in general, even though there are some funny things that happen related to Gödel's theorem and something really odd called the Banach-Tarski theorem. So a point in a pure definition is really a mathematical abstraction.
    Are elementary particles then points? We often say they are point-like. Measurements of the electric dipole moment of the electron are consistent with zero down to 10^{-29} e·cm, which for e in unit electric charges means the electron appears perfectly spherically symmetric to within 10,000 Planck units of distance. This is starting to put a serious potential test on string theory. Type I string theory includes open strings with endpoints. These, with Chan-Paton factors, attach to D-branes, which space might be a manifestation of. This would mean that on the string scale an electron would have some sort of non-spherical structure with an electric dipole moment. This might be probed in the not too distant future, and if the electron exhibits no such structure then some aspects of string theory are in trouble.
    If the electron were to have absolutely no electric dipole moment then it is a point. This means it is a point when measured in ways that correspond to such a property. Of course, again, a real measurement does not really induce an electron to be a point. It is more of a tiny spot or sphere. Also, some experiments these days can put the spin of an electron "here" and its charge "over there." So this is a bit of added grist for the mill. The real measurement is of a particle-wave "squashed" or collapsed to some small dimension, and work with weak measurements, entanglements in quantum erasures and so forth is trying to open up that squashed wave, spot, etc. into a wider distribution. This is sort of the converse of measuring the electron electric dipole moment.

    Replies
    1. I was thinking about how spinors describe electrons in their non-relativistic and relativistic incarnations the other day, and how spinors are naturally described as double covering the rotation group. But what I was looking for was an intuitive way of describing it. Consider a rotating sphere: there are two invariant tangent spaces, those at the two poles. Fix a tangent vector in one of these tangent planes. It obviously rotates with the sphere as the sphere rotates. Now shrink the sphere to a 'point'. What happens to these two tangent planes? Let's say that they become 'entangled', so that the tangent vector first rotates in the upper plane and then in the lower plane, and then back up again in the upper plane ...

      This gives a natural model of a spinor! It's a double covering that naturally arises from shrinking the sphere.

      I'm not claiming by any means that this is how electrons actually rotate, but I do think it gives a nicer explanation of a spinor than the belt or plate trick.

      What do you think? Does it have any pedagogical value?

    2. The double covering can be seen with the belt trick or the Balinese cup dance.

      https://en.wikipedia.org/wiki/Plate_trick

      https://www.youtube.com/watch?v=JaIR-cWk_-o
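
      The double covering is also easy to check numerically (a standard fact, the code is just a quick verification): a 2π rotation is the identity on an ordinary vector but multiplies a spinor by -1.

        import numpy as np

        sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

        def spinor_rotation(theta):
            # Spin-1/2 rotation about z: U = exp(-i * theta * sigma_z / 2)
            return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sigma_z

        def vector_rotation(theta):
            # Ordinary SO(3) rotation about z
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

        print(vector_rotation(2 * np.pi).round(6))  # identity: vectors come back
        print(spinor_rotation(2 * np.pi).round(6))  # -identity: spinors pick up -1
        print(spinor_rotation(4 * np.pi).round(6))  # +identity: 4*pi closes the loop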

    3. @Mozibur if you are trying to explain spin-1/2 in terms of prior concepts then you are barking up the same wrong tree that many, many people have had lunch under. It is a primitive concept. It is more interesting to attempt to construct spacetime from spinors, instead of the other way around. I have read a hundred papers about relativistic rotators and the like, and every possible attempt to reduce spin-1/2 to something more elementary. It was all very interesting, but led nowhere.

      @LK ah yes, point particles. My very first criticism of string theory was that it was more of the same: idealized geometric configurations embedded in a background space. That sort of works for points, but in all cases leads to insoluble contradictions, and it won't work for strings either. The only way out I can see is to remove the idea of embedding, by making spacetime and matter emerge from the same primitive thing, a new conception of the vacuum, with the constraint that further embeddings are either not allowed, or very highly constrained (e.g. must be conformally invariant configurations).

      -drl

    4. This mentioning of both the spinor and the separation of spin/charge of the electron prompts mention of a little known and underappreciated reductionist method of exploring the properties of particles.

      Condensed matter science is advancing techniques that take advantage of the wave nature of the electron, using wave interference and entanglement methods to isolate the individual properties of the electron, understand them, and possibly use them, in sets or individually, to advantage.

      This waveform separation process can make an electron look and perform differently based on how these spinors are broken apart and put back together. In this way, the electron can be made to look like a very different particle. Either massless or supermassive electron formation is possible. The electron can be made to look and perform like a Weyl fermion, which looks a lot like a neutrino. Building spinor monopoles is now possible.

      Electron entanglement with various flavors of bosons can convert various properties of the electron into bosons that can themselves form long lived condensates.

      One possible application of these electron cracking methods might someday enable the isolation of the weak hypercharge property of the electron into a condensate that could be used to study both the nature and behavior of the Higgs field. Segregating this quantum property into bosons, amplified in the billions by applying these separation methods to countless electrons as feedstock, might someday be possible.

      This ability to form false Higgs field bubbles of varying strengths might be useful to particle physics and cosmology to advance the study of the Higgs field.

      For example, this condensate may be conjured to produce a quasi-false-Higgs field that can be used as a probe to explore how the Cosmic Higgs field that existed just after the big bang would react to adjustable strength false Higgs field bubbles as has been posited to have occurred during Higgs field formation just after the big bang.



  44. "classical" QM is toy theory. Full theory is quantum field theory (QFT). Analysing measurement in QM is like analysing Mercury peryhelium precession in Newton's mechanics.
    From the other side, in QFT, separation of observer apparatus and measured system is completely artificial. There are interacting fields only.

    Replies
    1. This isn't so; QFT is as tied to classical models as ordinary QM. QFT bears the same relation to QM as, say, continuum mechanics does to Newtonian-Galilean mechanics.

      -drl

  45. There is no physics but classical physics.

    Some physicists seem to forget that physics is not some solipsistic endeavor. It is about different physicists, at different times and locations, investigating systems which are external to all of them. Classical physics trivially provides a suitable framework for this. Psi-ontic theories are a failure from the outset in this regard, the measurement problem indeed being one giant "dead end" sign.

    That QM is deeply rooted in classical physics is manifest in the former's mathematical structure (Poisson brackets { , } --> commutators [ , ]; the h --> 0 limit; Ehrenfest's theorem, etc.). There only remains the challenge of finding an appropriate classical ontology whose statistical description is QM, as in arXiv:1804.00509 [quant-ph]

    Replies
    1. Why is classical physics trivial? After all, it took the genius of Aristotle, Galileo, Newton, Maxwell and Einstein to formulate classical physics (I include General Relativity in classical physics). It's definitely not trivial. Perhaps you are confusing triviality with maturity? I would say that classical physics is a mature discipline. QM isn't just mathematical; this is a serious misconception about how physics actually proceeds. It arose from specific experiments and explanations: blackbody radiation wasn't explicable by classical thermodynamics, and so Planck tried a mathematical experiment and introduced the quanta, except he took it as a mathematical trick without ontological status. It took the genius of Einstein to give that mathematical experiment ontological status as quanta, by going back to Newton's idea of light as corpuscles.

      I'd also note that, despite string theory's status as a formidably mathematical theory, it too arose from experiment: the Veneziano amplitude of hadronic physics. Precious few books on string theory really go into this, and that's a pity. People ought to remember that in physics, experiment and theory go together. Actually, I recall Yang writing in one of his papers that experimentalists shouldn't be intimidated by their theoretical colleagues.

    2. Why do we need to solve the measurement problem? Let's take an analogy for the evolution equation of QM, the Schrödinger equation, which is deterministic: smooth curves. Smooth curves are deterministic in the sense that any neighbourhood of any point gives sufficient information to reconstruct the whole curve. Yet we know that there are more curves than just the smooth curves: the piecewise smooth curves. At a non-smooth point there is an abrupt change of direction, that is, it's non-deterministic! Now, we could approximate these curves by smooth curves, but it's intuitively far simpler to understand them as they stand. Perhaps the measurement problem is actually a non-problem?

    3. Mozibur,
      classical physics is obviously not trivial, but it trivially provides the desired framework: You, I and the system we are studying are each represented by some localized energy-momentum distribution, subject to certain constraints. Our perception of the system is encoded in the distribution associated with each of us; the system exists independently of that perception etc.

      And where did you get the idea that smooth curves are entirely defined by any small neighborhood of them (unless they are described by analytic functions)?

    4. I doubt that even Bohr said there was only classical physics. He said there are two domains, one quantum mechanical and the other classical. It is the classical domain from which we make observations. In that sense one could say that only the classical domain could be said to be clearly ontological. The existential status of quantum waves is uncertain.

      As a rule in quantum mechanics, observables are determined by Hermitian operators. The density matrix ρ = |ψ)(ψ| is Hermitian, and it is then an operator that determines probabilities, which are real. So whether quantum states are really ψ-epistemic or ψ-ontic is not that clear. It may be undecidable.

    5. @YK this is certainly not so. There is no classical analog for 1/2-integer spin, for example.

      -drl

    6. This comment has been removed by the author.

    7. I am not a committed Copenhagen interpretationist, but what Bohr did has a certain subtle brilliance to it. The CI is really not that bad, and in some ways it does reflect the world in a phenomenological manner. We do make measurements, and from what we can tell by making a sequence of measurements, wave functions collapse. Experiments with multiple Stern-Gerlach apparatuses demonstrate this.

      As Sabine points out the CI with its quantum-classical dichotomy leaves a gap with respect to how large macroscopic objects are composed of quantum particles such as atoms, electrons and nuclei. We might see this as some loss of reductionism. It makes little sense to say everything is made of atoms, but only when one performs quantum measurements on atoms.

      One can choose a ψ-ontic interpretation, in which the density matrix contains the quantum states (ρ = |ψ)(ψ| is Hermitian and satisfies the postulates for being an observable), which avoids the problem that wave function collapse demolishes quantum information and does not invoke the dichotomy between quantum and classical domains. Chris Fuchs' QBism interpretation is ψ-epistemic and avoids the problem of quantum information loss, which in my opinion makes it in some ways preferable to CI. The big problem is that QBism lends itself to a sort of idealism such that the only existing things are minds. That sort of leaps off into lala land. MWI is ψ-ontic, and it has become a favorite of those who advance a conservation of quantum information. Quantum information may be the only possible conserved quantity we can anchor physics to with quantum gravitation.

      Carroll and Sebens have demonstrated that MWI gives a version of the Born rule if one eliminates various extremely improbable outcomes. If one performs a possible spin measurement n times then the number of distinct binary strings one obtains is N = 2^n. If one were to look at the majority of these, the distribution of up and down would lead to a Bayesian inference on the statistics. However, the two outcomes with all up or all down would lead to an erroneous conclusion. If we think of the Born rule as (O) = sum_n p_n O_n, and if the probabilities are thought of as Bayesian, then we have to eliminate these “outlier” situations. There is then a divergence between quantum Bayesian and quantum frequentist perspectives. Curiously, this shifts MWI somewhat in a more ψ-epistemic direction. A bit to chew on: if one takes the number of quantum measurements → ∞, the above argument with the Born rule leads to issues of ω-consistency and ultimately Gödel's theorem. In physics we often do take infinite limits, such as in statistical mechanics.

      In some ways this is more metaphysics than physics, though I am not entirely opposed to such. I think physics does however demand a minimum of metaphysics. Quantum interpretations are additional postulates, or from a mathematical perspective axioms. These are not themselves provable. So quantum interpretations that can be shown equivalent to the Born rule, itself a quantum postulate, have some “anchor”, as I would call it. With quantum mechanics we also have to make sure we know whether we are talking about foundations or phenomenology. If we are just interested in phenomenology, Bohr's CI is the most direct and simple way of thinking. It is close to the “shut up, calculate and measure” dictum. I am sure that for 90% or more of physics this is where things will be done for the foreseeable future.

    8. "We do make measurements and from what we can tell by making a sequence of measurements that wave functions collapse. Experiments with multiple Stern-Gerlach apparatuses demonstrate this."

      This is not true.  The joint probability for results of consecutive measurements agrees with the collapse postulate only if -- somewhat paradoxically -- the unitary time evolution during the first measurement did not change the state of the system.  But in this case the same result can be obtained without invoking the collapse postulate.  So the collapse isn't necessary to account for observation.

      See: Ballentine, L.E. Found Phys (1990) 20: 1329. https://doi.org/10.1007/BF01883489  

    9. Given an electron that enters the SG device, suppose a measurement along z gives +z. Then suppose I measure along the x direction, whereupon I get either +x or -x. Now suppose I measure again along the z direction. I will have a 50% probability of +z and a 50% probability of -z. The quantum information of the first measurement of the electron no longer exists with that electron after the second measurement.

      I am not entirely sure what you mean by a unitary time evolution of the first measurement. There is no prior information about the spin of the system before the first z measurement. That occurs with the measurement. The first measurement might be compared to a state preparation, where the subsequent measurement reduces the state of the system so that quantum information is lost.

      Wave function collapse is a phenomenology; more fundamentally, the quantum phase of a system in a decoherent or measurement event is transferred to many states. This transfer of the quantum phase to many other quantum states, say as decoherent sets of states with many partial entanglements of this disrupted state with a reservoir of states, means this quantum phase is effectively lost. This is the phenomenon of the collapse, which is similar to coarse-grained thinking in statistical mechanics.
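
      The bookkeeping in the example above is easy to check with projectors; a toy calculation of the textbook kind, assuming ideal measurements:

        import numpy as np

        up_z = np.array([1, 0], dtype=complex)               # |+z>
        up_x = np.array([1, 1], dtype=complex) / np.sqrt(2)  # |+x>

        psi = up_z   # state prepared as +z by the first measurement

        p_plus_x = abs(up_x.conj() @ psi) ** 2  # probability of +x along x
        psi = up_x                              # update: the state is now |+x>

        p_plus_z = abs(up_z.conj() @ psi) ** 2  # probability of +z in the final z measurement

        print(p_plus_x)  # 0.5
        print(p_plus_z)  # 0.5: the original +z information is gone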

    10. Lawrence Crowell,
      In psi(x,t), what meaning could you possibly attach to "x" and "t", other than abstractions of classical measuring rods and clocks? (trying to define a clock quantum mechanically would obviously be circular. General covariance in classical physics avoids such a circularity by treating coordinates as mere scaffolding for the construction of scalars which are associated with measurements).

    11. A quantum state has a representation in space and time. For the expectation of some observable we have the completeness relation ∫dr|r)(r| = 1 [here using ( and ) for angle brackets], so that

      (ψ|O|ψ) = ∫dr∫dr'(ψ|r)(r|O|r')(r'|ψ)

      = ∫dr∫dr'ψ*(r)(r|O|r')ψ(r')

      for (r|O|r') a representation of the operator in configuration variables. In relativistic QM or QFT we can set up these variables in any frame, and even in nonrelativistic QM there is Galilean invariance.
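
      Discretized on a grid, the double integral is just a quadratic form. A sketch with an arbitrary Gaussian packet and the position operator standing in for O:

        import numpy as np

        # Grid version of (psi|O|psi) = ∫dr ∫dr' psi*(r) (r|O|r') psi(r')
        r = np.linspace(-10, 10, 2001)
        dr = r[1] - r[0]

        psi = np.exp(-(r - 1.0) ** 2 / 2)                # Gaussian packet centered at r = 1
        psi = psi / np.sqrt((abs(psi) ** 2).sum() * dr)  # normalize

        # Position operator is diagonal in this basis: (r|O|r') = r delta(r - r'),
        # and the delta function becomes 1/dr on the grid.
        O = np.diag(r) / dr

        expectation = (psi.conj() @ O @ psi) * dr ** 2
        print(expectation.real)  # ~1.0, the center of the packet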

    12. "I am not entirely sure what you mean by a unitary time evolution of the first measurement."

      I mean the unitary time evolution which describes the interaction between system and detector. If the system state is *not* changed during this interaction, then you can, but don't have to, use the collapse postulate to determine the joint probabilities of all measurement results. If it was changed, you can't use the collapse postulate.

      It is easy to see that, e.g., the probability of y conditional on some earlier result x is in general given by

      P(Y=y | X=x, ψ) = |<y|φ_x>|²,

      where |φ_x> is the state that results from the detector interaction acting on the eigenstate |x> of X. This agrees with the collapse only if |φ_x> = |x>, i.e. if the detector interaction during the first measurement did not change the system state. Of course, this is a very special and idealized situation, so the collapse postulate is almost never valid exactly for sequential measurements. For details, and the discussion of some other interesting cases, see the paper cited above.
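
      In toy form (the disturbance angle below is a made-up stand-in for whatever the detector interaction does):

        import numpy as np

        x_state = np.array([1, 0], dtype=complex)  # eigenstate |x> of the first observable
        theta = 0.2                                # assumed detector disturbance
        phi_x = np.array([np.cos(theta), np.sin(theta)], dtype=complex)  # post-interaction state

        y_state = np.array([1, 1], dtype=complex) / np.sqrt(2)  # outcome y of the second measurement

        p_collapse = abs(y_state.conj() @ x_state) ** 2  # collapse postulate: use |x> itself
        p_general = abs(y_state.conj() @ phi_x) ** 2     # general rule: use |phi_x>

        print(p_collapse, p_general)  # 0.5 vs ~0.70: equal only when |phi_x> = |x>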

  46. Prof. Hossenfelder,

    For the past several weeks, I have been writing so many comments on your blog posts. You are a scientist and an author; can you tell me honestly, I will understand, whether what I have been writing passes as science or not? Will it stand scientific scrutiny? Just a yes or a no will suffice, thank you.

  47. Cats are usually such adorable creatures, I would wish we could refer to Schroedinger's Rat experiment instead!

  48. "We do, however, only observe cats that are either dead or alive. This is why we need the measurement postulate. Without it, quantum mechanics would not be compatible with observation."

    My understanding is that quantum effects, like quantum superposition, are very difficult to observe in large systems. I think the current record is 2000 atoms. The moment the system of interest interacts with a detector, the quantum system becomes a very large one including the detector and everything that interacts with the detector, made up of trillions of atoms. As that happens, wavelengths contract in proportion to the number of particles involved (λ = h/mv). I guess you could call that a "wavefunction collapse" of some sort, but this collapse follows from the Schrödinger equation; there is no need to put it in the theory as a postulate. Removing the collapse postulate would not lead to the observation of cats in a dead/alive superposition: a cat is a system too large to have practically observable quantum effects. There is also no need to understand this in terms of mixed states and decoherence.

    As for the need to update the probability, a transition probability only has meaning if the initial and final states are specified, and the probability only changes if we start discussing a different initial or final state. But that is independent of the collapse postulate, isn't it?
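
    For scale, a quick λ = h/(mv) estimate (my mass and speed numbers are purely illustrative):

      h = 6.626e-34   # Planck's constant, J s
      amu = 1.66e-27  # atomic mass unit, kg
      v = 100.0       # illustrative speed, m/s

      # An electron vs. a 2000-atom molecule, assuming ~100 amu per atom.
      for label, m in [("electron", 9.11e-31), ("2000-atom molecule", 2000 * 100 * amu)]:
          print(label, h / (m * v), "m")

    The molecule's wavelength comes out roughly nine orders of magnitude shorter than the electron's at the same speed, which is the contraction I mean.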

  49. Far from being some weird random anomaly in the system that everyone just has to accept, surely the number jumps of quantum mechanics are what is actually driving the system?

    The (law of nature) equations are just relationships that can’t move the system: it’s the number jumps of quantum mechanics that turn a set of equations into a moving system; the number jumps are essential to the system; the number jumps are separate from the equations.

    Shouldn’t we be thinking about the world as a system?

  50. "The measurement problem, I have to emphasize, is not solved by decoherence, even though many physicists seem to believe this to be so. Decoherence is a process that happens if a quantum superposition interacts with its environment. The environment may simply be air or, even in vacuum, you still have the radiation of the cosmic microwave background."

    Or the environment could be whatever measurement device an observer uses to look at the system... Now, should the measurement device together with the system be perfectly isolated from the rest of the world (in practice this never happens, of course), then the device+system "system" would also be described by a Schrödinger equation. Hence, I do not see any conflict with reductionism.

  51. When video bandwidths drop, a common compression artifact is for images to transform into oddly cubist versions of the originals. You can still see a face, sort of, but only as a set of blocks that average the pixels that should have been there.

    This cubist effect stems from algorithms that first fill in the image with a handful of averaged bits, and only later send modifying details. A block that represents average human skin thus retains its overall rich brown color even after further details have been added. However, if the image shows only a clear blue sky, those first few bits may be enough. This kind of 'bit stretching' greatly reduces the number of bits needed to send an image.

    Notably, stretched-bit regions are much more sensitive to changes. Flipping a couple of bits in a full image has no visible effect, but the same changes in a bit-stretched block might transform the sky from blue to red. The more the bits are stretched, the greater this sensitivity becomes.

    So, the real question: Why did Terry just go on a bizarre IT tangent?

    Bear with me! Classical objects, objects that display averaged behaviors such as temperature, are incredibly rich in static and dynamic data. The one-inch cube of tungsten that sits on my desk exchanges phonons and momentum at data rates that would swamp the capacity of an international fiber cable. We call this magnificent dynamic interplay of raw information 'heat', and the best we can do is take its average.

    So, a question: Is it possible to reduce the static and dynamic information of a real-world object down to a handful of bits, thus forcing those bits into the same kind of stretched-out duty as in cubist images? And how would this affect the system?

    The answer is yes, it can be done. And what it does is make the system quantum.

    This in fact is what a quantum system is… period. That is, you cannot make a system quantum without dramatically reducing its data content, and conversely, you cannot dramatically reduce the data content of a system without making some of its features quantum. Techniques for reducing data in a system can be as simple as shrinking its size or subjecting it to extreme cooling.

    However, forcing systems to use stretched bits makes them behave… oddly!

    Stretched bits can for example make systems excruciatingly sensitive to even the tiniest changes. If location and momentum bits are pared down to their absolute minimums, adding any new bits will transform their values dramatically. Also, since the maths of our universe force location and momentum to share even those few bits, more precision in one means a loss of precision in the other. (Ringbauer has some great experimental papers on this topic.)

    Stretched-bit dynamics have another name: quantum jumps. These events look like 'jumps' mainly because the swirling maelstrom of classical data that we embody is so astronomically larger than the minuscule data capacities of stretched-bit systems. Simply touching such systems unavoidably means drowning them in a firehose of new, detail-creating bits.

    So, a question: Why did the electron choose the left hole over the right one?

    We think of the statistical perfection of the Born rule as one of the deep mysteries of physics. However, the main source of its exquisite randomness is no farther away than you, the observer. The electron chose the left hole because you told it to. It's just that you had no more control over where you told the electron to go than a thundercloud has over where it deposits a particular drop of rain.

    Finally, I should note that this stretched-bit interpretation of quantum mechanics is the diametric opposite of the many-worlds interpretation. It is the no other worlds interpretation, in which the only thing that exists beyond the stretched fabric of bit-starved quantum systems is… nothing at all. In 'now' quantum mechanics, we are the ones hiding behind the curtain.

    Replies
    1. Terry Bollinger,

      " Why did the electron chose the left hole over the right one?"

      Because the Lorentz force caused by the electric and magnetic fields originating in the electrons and quarks in the barrier determined it to take that path.

    2. Terry Bollinger and others,
      You use the word “system”, but you haven’t actually got a system until you have a cause for number change. You need MORE than a set of (law of nature) equations incorporating delta symbols if you want to claim that the world is a “system”.

    3. As Terry, Israel, Dave, and others are hinting, the measurement problem at some point opens up into the much deeper question of why there are particles. We talk about measurements with binary outcomes, which suggests counting and Bayesian probabilities. But why do we see only sharp outcomes? Where does quantization come from in the first place?

      The best available theory, QFT, deals with fields. Particles are modelled as very tightly localized excitations of these fields. MWI in my view provides a satisfactory explanation of how classical behaviour with crisp non-interfering outcomes and 100% events emerges as the experience of people and systems within that theory. I'm a lay person and don't have the details but the explanation seems consistent and free of mysticism.

      But there are two problems: QFT feels like an extravagantly redundant theory. A simple particle has to be modelled as a field everywhere in space. All the possible evolutions have to be added up. Then somehow, despite this, state transitions quantize so that the electron field only produces electrons (or electron-photon interactions). Why? There's something suspicious here. It looks like the degrees of freedom of the universe should have a computationally compact form, close to what we call particles, of which the wave function is the Fourier transform (sort of). In that regard the Schrödinger equation is saying something about the waveform of a sound while we're missing the actual algorithm for how the mp3 is encoded (as an analogy).

      There's also entanglement, which is suspiciously non-local and, according to Susskind and Carroll, central to almost any interaction. Any interaction is a multiplication of a superposed state with another to get an entangled sum, a superposition of terms. It sure looks like the real degrees of freedom of the universe are once removed from particles, and entanglement is some sort of re-arrangement of what the degrees of freedom mean. In the unentangled state the DoF describe a particle, and then they somehow describe a relationship between particles. Understanding these deeper concepts may be the route to de-mystifying measurement in particular.

      BTW, Nima Arkani-Hamed has released a graduate course where he sets out his efforts at deriving amplitudes from deeper mathematical entities. I can barely follow what he's saying, let alone the implications, but you might find it interesting.
      https://www.youtube.com/watch?v=Sn0W_mwA7Q0

  52. Terry

    Some things I picked up in my amateur following of Susskind's online course on string theory.
    An entity moving very fast gets compressed with respect to an appropriate observer.
    The dimension of a brane is an observed value and not an inherent value.
    For example, a 3-dimensional brane moving fast in 1D can appear to a suitable observer to be a 2D rather than a 3D brane.
    If some entity moves near speed c in a cyclic motion then its observed measurements can be restricted to -1 when approaching and +1 when receding. Like a quantised Doppler effect, and also like a quantised spin value. This leads me to wonder if spin is more inherently complex than its observed quantised measurements.

    In a nearby thread we mentioned colour dimensions, and I suggested three such dimensions (red, green and blue) to replace the one (fifth) electric charge dimension of the Kaluza-Klein theory. If you think of the colour dimensions as branes, then they may inherently be more complex, with extra dimensions, but still only producing the two colour measurements that are observed (colour charge +1 for red, green and blue, and -1 for antired, antigreen and antiblue).
    A note that there is no observer who can actually know the inherent number of dimensions, except the designer of a toy model.

    Austin Fearnley

    Replies
    1. Austin,

      Interesting comments, thanks!

      Regarding spin, I recall from the most advanced technical book I ever found on photons (which I seem to have misplaced, argh) that spin in photons remains oddly unresolved.

      I like the model you just mentioned in which linear, elliptical, and circular photon polarizations are all just infinitely squashed projections of great circle rotations around a sphere. When you get into the details there remain issues, though, or at least that's what I recall from that book I can't find!

      To be honest, I fall asleep very quickly when I try to read string theory works. Like many fantasy and science fiction stories, I suspect such writings would benefit from more character development to help compensate for excessive levels of undisciplined imagination. Or more to the point: It's not really reductionism when the 'simpler' entities invoked are so astronomically richer and more complex in behavior than the tiny set of fermions and bosons they supposedly 'explain'.

      That said, I deeply agree with your excellent point that based on all known particle data, the absolute simplest mathematical model for electric charge is to demote unit positive charge into a composite sum of three more fundamental anti-rgb color-charge unit vectors, and negative charge similarly into a sum of the three rgb unit vectors. While this 3-space is profoundly anisotropic, it is not broken, and the charge of any particle ever found becomes a simple vector location within it.

    2. Let's say, the photon is one quantum of action; L = h.

      As the photon propagates, this angular momentum vector can 'tumble' any which way!

      The rate of tumbling (the frequency of the photon) determines its energy.

      Greg

  53. Dear Sabine,
    Thanks for this clear exposition of your view on the measurement problem. You state that the Copenhagen interpretation is basically sound, except for the loss of reductionism. Now, I think we should say "loss of NAIVE reductionism". Your strong version of reductionism implies that a microscopic theory cannot appeal to macroscopic things (observers) because the latter should be explained in terms of the former, and not vice versa.
    Let me note however, that there is no internal contradiction in QM. If you take a detector and put it (or its parts) under a microscope, you will always find that it obeys the rule of QM. Detectors, observers, are not different from anything else in the world. For instance the macroscopic properties of condensed-matter can be deduced from the microscopic rules of QM. I am perfectly happy with this weaker version of reductionism, in the same way that I am happy with the notions of space and time that come with relativity, and which differ from their naive classical versions.
    In a world where hbar is finite, strong ("naive") reductionism is probably untenable, because you never have access to a "pure" system to be measured. During a measurement you cannot separate the system S from the apparatus A, at least not completely. QM is simply a set of rules that tells you how to extract the maximum of information about S compatible with the finiteness of hbar.

    Replies
    1. The Copenhagen interpretation has a quantum domain and a classical domain. The system measured is quantum and the measurement apparatus is classical or macroscopic. I say macroscopic because it can have thermal properties and if it acquires information about a system its entropy changes by the Shannon formula.

      A quantum system may be a superposition of states in some basis such as

      |ψ) = sum_n a_n |n).

      Suppose I make a measurement of this and find the outcome m, with a prior probability p_m = a*_m a_m. Now the system is in the state |m), where I have done a renormalizing of the probabilities. What sort of operator might do this? It would be one with all zeros except the (m,m) entry, which is 1/a_m; that is, the projection operator P_m multiplied by 1/a_m, where the projector alone acting on |ψ) gives

      P_m|ψ) = a_m|m).

      Here is an odd thing. This projection operator is such that (P_m)^2 = P_m; it is idempotent. Now suppose I have an inverse operator (P_m)^{-1}, so that P_m(P_m)^{-1} = 1. Idempotency would then imply that

      (P_m)^2(P_m)^{-1} = P_m(P_m)^{-1} = 1, while also (P_m)^2(P_m)^{-1} = P_m.

      This implies that P_m is the unit operator, which is a contradiction. So there is no inverse, and the same argument generalizes to P_m multiplied by 1/a_m, or in polar form P_m/√p_m. No inverse means this is not a unitary operator.

      It is for this reason there is a loss of reductionism. The process is not unitary, at least if we do not take account of the measurement apparatus as a composition of quantum states. If the measurement apparatus is made of several or many moles of atoms and quantum states that is intractable. So in decoherence the measurement or general disruption of a quantum state by the environment is represented by decoherent sets. This is similar to the idea of coarse graining in statistical mechanics. In this perspective there is still unitarity, but where we are not able to know how this evolution occurred.
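
      The non-invertibility above is easy to check numerically; a trivial verification, included only to make the argument concrete:

        import numpy as np

        # Projector onto |m> in a 3-dimensional toy Hilbert space
        P = np.zeros((3, 3))
        P[1, 1] = 1.0

        print(np.allclose(P @ P, P))             # True: idempotent
        print(np.linalg.matrix_rank(P))          # 1: rank deficient, so no inverse exists
        print(np.allclose(P.T @ P, np.eye(3)))   # False: not unitary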

    2. Lawrence, I agree with your maths, but that was not my point.
      My point is that, in a quantum world, the measuring apparatus (detectors, observers) cannot be separated from the thing being observed. This probably precludes any form of naive reductionism à la Sabine. But I agree that one should put these thoughts in a more formal and rigorous manner.

    3. To consider the entire world as quantum mechanical, which seems more consistent, means that a measurement or decoherence event is where quantum states encode other quantum states. The needle state of an apparatus is ultimately a quantum state, or many of them, that emulate another quantum state. This has properties that appear similar to a universal Turing machine emulating other Turing machines or a Gödel numbering. I have mentioned this before and on this page as well.

    4. Thank you Lawrence, and Terry for the thread about Newton's Cradle. I think this captures two insights: that the detector is definitely seen as a quantum system made of the same stuff as the experiment, and that the apparent irreversibility of measurement is an intractable thermodynamic phenomenon.

      Dare I suggest the obvious experiment? Suppose instead of a cat in a box we have a nano-machine made of levers and cogs that are a few hundred atoms big. Our machine "observes" the particle in the sense that all the particles of the machine entangle with the particle and we reason about the function of the machine classically "the cog moves and pushes this" etc. Depending on the observation the machine is built to emit fresh photons A or B with let's say different phase.

      If we keep the machine cold and isolated enough, and we only let photons A and B leave the box, will they interfere? In that way could we have quantum behaviour at either end and classical but unobserved (isolated) in the middle? I assume this has been suggested and is practically hard to do. But do you believe it clarifies matters?

  54. I think physicists should change the lingo a bit. I personally would do away with the word "observer". It has its historical reasons for being there, but now it's a burden, causing confusion and leading to weird psycho-theories with too much vague language to be useful. I would replace it with "interaction", be it part of the planned experiment or unwanted environmental "noise".

    More specifically, I don't think it makes sense to spend much time on the mind of the human interpreting results of experiments. I don't think that a fly sees anything different than a human when it looks at the interference pattern on the screen after a two-slit experiment was performed. Nor does a low-level light-sensitive organism. To those who think there is a difference, my question would be: at what level do you think something else is seen? At the level of the mouse? The fly? The light-sensitive primitive organism? It's like the question: "Do good dogs go to heaven? What about a good Homo erectus?" Unfortunately we can't ask them...

    Doing away with "observer" would clear away much of the intellectual fog.

  55. I think the best interpretation is

    https://www.physicsforums.com/threads/the-thermal-interpretation-of-quantum-physics.967116/

    It is simple and intuitive; however, it does not explain EPR completely, but it is in the right direction.

    Replies
    1. On a quick perusal I found these three (long) papers by Arnold Neumaier at the University of Vienna interesting and well-researched. He argues, for example, that there is only one universe, that points are impossible except as asymptotic limits, and that there is more continuity between quantum and classical than is generally recognized.

      However, after e-searching briefly through all three papers I was disappointed that I never found any definition of the key phrase "the thermal interpretation of quantum physics." Each time he starts to define it, he instead slips into talking about all of its benefits. Blah. That's not a definition, that's just advertising.

      I understand how difficult it can be to pull back from an extended mathematical work enough to figure out what the unifying idea behind it all really is. But that is precisely why it is so important for every author to do just that. Otherwise it becomes an exercise for the readers, for whom the task is likely to be astronomically more difficult.

      I would suggest that Dr Neumaier add a sentence or two about why he calls his idea the 'thermal' interpretation of QM. Or, if he did at some point deep in some paper, move it up to the first sentence of his abstracts. I suspect it would make his ideas significantly more accessible.

    2. @TB he means that macroscopic properties emerge from statistical behavior in a similar way to thermodynamics from statistical mechanics. The Schroedinger equation itself, although referred to as a wave equation, is really more like a diffusion equation such as one encounters in analyzing heat flow.

      -drl

    3. drl, thanks. That was quite helpful and makes the papers more interesting to me.

  56. Prof Sabine,
    In the relational quantum mechanics approach (https://arxiv.org/abs/quant-ph/9609002) 'All systems are assumed to be equivalent, there is no observer-observed distinction, and the theory describes only the information that systems have about each other; nevertheless, the theory is complete.' By claiming that there is no 'observer-independent state of a system,' doesn't this approach get around your criticism of the measurement problem implying that psi-epistemic approaches are incomplete?

  57. I find it a bit curious that the preparation stage of an experiment already projects out states (e.g. "up" and "down" in the case of the Stern-Gerlach experiment), but I have never heard of a "preparation problem of quantum mechanics".
    The whole point, in my opinion, is that if you update the classical boundary conditions, be that a preparation (choice of an experimental setup) or a measurement (state of a detector), you have to update your quantum probabilities to stay consistent.

  58. When I say Quantum mechanics, I mean Quantum mechanics.

  59. This comment has been removed by the author.

  60. Quantum jumps have recently been shown to occur over time; they are not instantaneous. Perhaps decoherence and wave function collapse also occur over time. Resolving this problem may mean measuring the system part of the way through the process. An instantaneous collapse tells us nothing, really, of the process.

  61. I have exhausted all that I explored; there is nothing new to say. Moreover, I will be going offline for a few months, though I may appear occasionally. Thank you for being patient with me.

  62. Most folks have seen a Newton's Cradle. It is a chain of barely touching steel balls, each held by two strings to keep their motions aligned.

    If you drop a ball at one end of a Cradle, its impact transmits a momentum impulse to the other end of the chain, launching the end ball into motion. The end ball then reverses direction and begins the process over. The Cradle is a great demo of momentum transfer, but it is also an excellent example of a time-reversible oscillatory process, since a video of a Cradle in action looks the same whether it is played forward or backward.

    Now, let's change things up a bit by replacing the final ball with a cluster of much smaller balls, each suspended from one string instead of two. What happens?

    The momentum is transferred as before into the final unit, but then everything falls apart, literally. When the momentum impulse hits, the less constrained tiny balls scatter like billiard balls after a hard cue ball break. The original sharp momentum pulse is blurred and diffused over time, making its reconstruction extremely unlikely. Time symmetry is lost, since the process can no longer repeat itself.

    Another way to describe this loss of time symmetry is to say the Cradle is 'observed' by the chaotic transformation that takes place within its more complex end unit. It's an observation in the sense that the end unit keeps an energy 'record' of the initial ball impact, the one that took place at the other end of the chain. Furthermore, since time-reversibility has been lost, that now-singular impact takes on a temporal uniqueness that makes it more like a classical event.

    Much like the chain of balls in Newton's Cradle, a photon also carries momentum across space. And also like Newton's Cradle, a photon remains 'simple' and time-reversible only if both ends of its momentum transmission path remain simple and quantum. While that sounds unlikely, it actually happens all the time. We call it 'transparency', which is a rather amazing bit of physics in its own right.

    Now, let's take another look at the two-hole electron diffraction experiment.

    A photon that bounces off of an electron does not immediately become 'classical', since both the electron and photon are simple and thus capable of time-reversal. However, if the photon next hits a bit of thermal matter that not only absorbs but shreds and purees its momentum across a large ensemble of jiggling atoms, its time-reversal becomes statistically unlikely, and thus the photon becomes part of classical physics. Notice, however — and this is important — that this simplicity is lost at both ends of the photon chain, not just at the absorption or detection end. The original electron emission event also loses its ability to be oscillatory, which means that it too must become classical.

    Stated a bit differently: Chaotic absorption of a photon forces its entire momentum history to become classical. Thus the absorption also makes the entangled emitting electron classical, since the simple oscillatory (quantum) behavior of that electron is disrupted just as much as that of the photon. It is as if the observer 'donates' a bit of their own classical reality to the electron via momentum entanglement, no matter how distant the electron may be.

    Decoherence has it mostly right in asserting that interactions with larger objects degrade quantum behavior. However, the deeper message is that there is a statistical continuum in all forms of quantum observation. Even a dust speck can 'observe' quantum events, provided only that it is complex enough to provide statistically irreversible shredding of one end of a previously simple wave function.

    The bottom line is this: For matters quantum, it's not the consciousness of the observer that matters. It's the thermal chaos in the rhodopsin of her eyes. And yes, there probably should be a song about that… :)

  63. Honestly, all my comments on Prof. Hossenfelder's blog together form a sort of crude draft of a crude paper. And Prof. Hossenfelder and all the other highly qualified physicists are sort of reviewers of my terrible paper. The final review says that the crude draft of a crude paper does not pass as science, that is, it is unscientific. A fair enough and honest estimate; I am indebted to you all. Thereafter, some comments were very interesting, so I could not stop commenting. . .

    Replies
    1. Gokul,
      Your comments portray you as someone who is passionate about science and enjoys engaging with complex topics. I get the impression that you read non peer-reviewed internet articles and spend a lot of time thinking about modern physics. This is fun and rewarding; but it is not enough to actually contribute. Professional physicists spend years (usually full-time, usually at a university) studying mathematics and physics before they get to the place where they can contribute meaningful research. I think you would find this article by Nobel Laureate Gerard 't Hooft titled "How to become a GOOD theoretical physicist" very helpful.

    2. Thomas Payne (heh!), just in case you did not later update your excellent advice with the link (I could not find one), Gerard 't Hooft's article is located at:

      http://www.goodtheorist.science/

      Gokul,

      First I should emphasize that I am emphatically not a physicist, nor would I dream of claiming (or, to be honest, want to claim) to be one. I'm nothing more than a poor, bewildered information specialist who has deeply enjoyed and studied science and physics for my entire life. The physical world is so fun and so delightful!

      However, since I do have some familiarity with issues such as machine intelligence, distributed cognition, and mixed-system analytical efficiency, I must also admit to a secondary interest in modern physics: How group cognition can go badly wrong.

      Modern theoretical particle physics is from this perspective deeply fascinating because of its mind-boggling levels of failure, a mode in which it has persisted stubbornly for almost half a century now. The phrase "self-perpetuating failure" comes to mind, and I mean that as an analytical comment, not an insult or slight. It is as if this field has somehow found a way to ensure with almost complete certainty that brilliant folks can spend their entire lives and intellectual capital working in this area, yet with near certainty never produce any kind of physically meaningful or testable result that might upset existing higher-level funding patterns and strategies. Wow! And this despite massive funding and many unsolved mysteries.

      Machine cognition can afford to be more brutal. If a huge search branch goes senescent (unproductive), you cut your losses and largely toss out the entire branch, on the assumption that its foundations are so flawed that following it will just produce more noise, rather than get you closer to actual solutions.

      Along with Dirac, Feynman/Wheeler (it was their combo that was most powerful; they kept each other from flying in opposite directions), Einstein of course, and a few others, Gerard 't Hooft is one of my favorites for analytical style and willingness to test assumptions. His guide on how to be a theorist makes a lot of sense! But alas, I did break out laughing at one line:

      "Finally, if you are mad enough that you want to solve those tremendously perplexing problems of reconciling gravitational physics with the quantum world, you end up studying general relativity, superstring theory, M-theory, Calabi-Yau compactification and so on. That's presently the top of the sky scraper."

      Oh wow! I have no idea what the real top of the problem-solving skyscraper is, but with the profound exception of GR (a must-know amazing work, GR!), I must disagree with Professor 't Hooft on his list. From my more cognitive analysis viewpoint, I would assert that there is about a 99% probability or higher that devoting yourself to this particular list (minus GR) will not in itself promote you to becoming a great physicist. These are some of the very fractal branches that can ensure that you will spend a lifetime on equations and accomplish exactly zip!

      If you don't buy that assertion, well, look at the stuff that has been done in topics like string theory over the last 50 years, and show me a list of outcomes in terms of new insights into actual, experimental, demonstrable physics results. As I said earlier: In brutal machine logic, badly senescent search branches should be cut and discarded, not cultivated.

      My only poor advice for you, Gokul, and anyone else who might care, is just this:

      As in the early days of physics, I strongly suspect that simplicity still counts. If I had to make one attempt at predicting the future of theoretical physics, it would be only this: The real underpinnings of reality will in time be found to be so simple, both conceptually and in terms of their initial, pre-emergence behaviors, that you can count their parts on just a small number of fingers.

      Maybe even just two.

  64. Your argument against the many-worlds interpretation relies on an incompatibility between the behavior of observers or detectors and the behavior of the particles. If one concedes this is true, it still doesn't require that a sentient creature must be an observer. Hypothetically, one could program an artificial intelligence into a quantum computer, which could then coexist in a superposition of states, each state believing it observed a different outcome to an experiment performed in the computer. This would not be forbidden by the postulates of quantum mechanics. The AI simply wouldn't be an observer in the sense required by QM. Of course, the next question would be how we would know if we were true observers, or non-observing sentient creatures existing in a superposition of states.


COMMENTS ON THIS BLOG ARE PERMANENTLY CLOSED. You can join the discussion on Patreon.
