
Wednesday, October 02, 2019

Has Reductionism Run its Course?

For more than 2000 years, ever since Democritus’ first musings about atoms, reductionism has driven scientific inquiry. The idea is simple enough: Things are made of smaller things, and if you know what the small things do, you learn what the large things do. Simple – and stunningly successful.

After 2000 years of taking things apart into smaller things, we have learned that all matter is made of molecules, and that molecules are made of atoms. Democritus originally coined the word “atom” to refer to indivisible, elementary units of matter. But what we have come to call “atoms”, we now know, are made of even smaller particles. And those smaller particles are yet again made of even smaller particles.

[Image: © Sabine Hossenfelder]
The smallest constituents of matter, for all we currently know, are the 25 particles which physicists collect in the standard model of particle physics. Are these particles made up of yet another set of smaller particles, strings, or other things?

It is certainly possible that the particles of the standard model are not the ultimate constituents of matter. But we presently have no particular reason to think they have a substructure. And this raises the question whether attempting to look even closer into the structure of matter is a promising research direction – right here, right now.

It is a question that every researcher in the foundations of physics will be asking themselves, now that the Large Hadron Collider has confirmed the standard model, but found nothing beyond that.

20 years ago, it seemed clear to me that probing physical processes at ever shorter distances is the most reliable way to better understand how the universe works. And since it takes high energies to resolve short distances, this means that slamming particles together at high energies is the route forward. In other words, if you want to know more, you build bigger particle colliders.

This is also, unsurprisingly, what most particle physicists are convinced of. Going to higher energies, so their story goes, is the most reliable way to search for something fundamentally new. This is, in a nutshell, particle physicists’ major argument in favor of building a new particle collider, one even larger than the presently operating Large Hadron Collider.

But this simple story is too simple.

The idea that reductionism means things are made of smaller things is what philosophers more specifically call “methodological reductionism”. It’s a statement about the properties of stuff. But there is another type of reductionism, “theory reductionism”, which instead refers to the relation between theories. One theory can be “reduced” to another one, if the former can be derived from the latter.

Now, the examples of reductionism that particle physicists like to put forward are the cases where both types of reductionism coincide: Atomic physics explains chemistry. Statistical mechanics explains the laws of thermodynamics. The quark model explains regularities in proton collisions. And so on.

But not all cases of successful theory reduction have also been cases of methodological reduction. Take Maxwell’s unification of the electric and magnetic force. From Maxwell’s theory you can derive a whole bunch of equations, such as the Coulomb law and Faraday’s law, that people used before Maxwell explained where they came from. Electromagnetism is therefore clearly a case of theory reduction, but it did not come with a methodological reduction.
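
To see what such a derivation looks like, take Gauss’s law from Maxwell’s equations. For a static point charge it returns the Coulomb law directly; here is the standard textbook step, in LaTeX notation:

    \nabla \cdot \vec{E} = \frac{\rho}{\epsilon_0}
    \;\Rightarrow\;
    \oint_S \vec{E}\cdot d\vec{A} = \frac{q}{\epsilon_0}
    \;\Rightarrow\;
    E(r)\, 4\pi r^2 = \frac{q}{\epsilon_0}
    \;\Rightarrow\;
    F = q' E(r) = \frac{1}{4\pi\epsilon_0}\frac{q\,q'}{r^2}.

The spherical symmetry of the point charge is what lets the surface integral collapse to E(r) times the area of a sphere.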

Another well-known exception is Einstein’s theory of General Relativity. General Relativity can be used in more situations than Newton’s theory of gravity. But it is not the physics at short distances that reveals the differences between the two theories. Instead, it is the behavior of bodies at high relative speeds and in strong gravitational fields that Newtonian gravity cannot cope with.
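
One can make this quantitative. In the weak-field, slow-motion limit the metric of General Relativity reduces to Newtonian gravity, and the corrections are controlled by the speed and the gravitational potential, not by any distance scale:

    g_{00} \approx -\left(1 + \frac{2\Phi}{c^2}\right),
    \qquad
    \frac{d^2\vec{x}}{dt^2} \approx -\nabla\Phi
    \;+\; \mathcal{O}\!\left(\frac{v^2}{c^2},\,\frac{\Phi}{c^2}\right).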

Another example that belongs on this list is quantum mechanics. Quantum mechanics reproduces classical mechanics in suitable approximations. It is not, however, a theory about small constituents of larger things. Yes, quantum mechanics is often portrayed as a theory for microscopic scales, but, no, this is not correct. Quantum mechanics is really a theory for all scales, large to small. We have observed quantum effects over distances exceeding 100 km and for objects weighing as “much” as a nanogram, composed of more than 10^13 atoms. It’s just that quantum effects on large scales are difficult to create and observe.

Finally, I would like to mention Noether’s theorem, according to which symmetries give rise to conservation laws. This example is different from the previous ones in that Noether’s theorem was not applied to any theory in particular. But it has resulted in a more fundamental understanding of natural law, and therefore I think it deserves a place on the list.
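
As a concrete instance: if a Lagrangian L(q, q̇) does not depend explicitly on time, i.e. the theory has time-translation symmetry, Noether’s theorem delivers energy conservation,

    \frac{\partial L}{\partial t} = 0
    \quad\Rightarrow\quad
    \frac{dE}{dt} = 0,
    \qquad
    E \equiv \dot{q}\,\frac{\partial L}{\partial \dot{q}} - L,

and spatial translation symmetry likewise delivers momentum conservation.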

In summary, history does not support particle physicists’ belief that a deeper understanding of natural law will most likely come from studying shorter distances. On the contrary, I have begun to worry that physicists’ confidence in methodological reductionism stands in the way of progress. That’s because it suggests we ask certain questions instead of others. And those may just be the wrong questions to ask.

If you believe in methodological reductionism, for example, you may ask what dark energy is made of. But maybe dark energy is not made of anything. Instead, dark energy may be an artifact of the difficulty of averaging non-linear equations.
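
The underlying point is a one-liner: averaging does not commute with non-linear operations, so the averaged equations pick up extra terms that can masquerade as new sources. Schematically, for a non-linear function f,

    \langle f(x) \rangle \neq f(\langle x \rangle),
    \qquad\text{e.g.}\quad
    \langle x^2 \rangle = \langle x \rangle^2 + \mathrm{Var}(x) \;\geq\; \langle x \rangle^2.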

It’s similar with dark matter. The methodological reductionist will ask for a microscopic theory and look for a particle that dark matter is made of. Yet, maybe dark matter is really a phenomenon associated with our misunderstanding of space-time at long distances.

Maybe the biggest problem that methodological reductionism causes lies in the area of quantum gravity, that is, our attempt to resolve the inconsistency between quantum theory and general relativity. Pretty much all existing approaches – string theory, loop quantum gravity, causal dynamical triangulation (check out my video for more) – assume that methodological reductionism is the answer. Therefore, they rely on new hypotheses for short-distance physics. But maybe that’s the wrong way to tackle the problem. The root of our problem may instead be that quantum theory itself must be replaced by a more fundamental theory, one that explains how quantization works in the first place.

Approaches based on methodological reductionism – like grand unified forces, supersymmetry, string theory, preon models, or technicolor – have failed for the past 30 years. This does not mean that there is nothing more to find at short distances. But it does strongly suggest that the next step forward will be a case of theory reduction that does not rely on taking things apart into smaller things.

121 comments:

  1. Well I can't agree more. I have always felt string theory, what I now call string-reductionism, to be a bridge too far. The basic question I always want to ask an s-theorist is: do you consider a string to be a mathematical or a physical object? If it is the latter, there is no reason at all to let further reductionist reasoning loose on it. M.

  2. I think there are so many artificial barriers now. Worse than what Galileo had to go through.

  3. Copy Edit:

    >Going to higher energies, so their story, is the most reliable way ...

    "so their story," is probably "so their story goes,"

    >Now, the examples of reductionism that particle physicist like ...

    "physicist" should be "physicists".

    And I agree, I haven't heard the issue put like this, but it's smart. Thanks for the examples.

    Replies
    1. Dr Castaldo,

      Thanks, as always, for the careful reading. I have fixed those blunders.

  4. To me it seems there is a lot more research (in quantum gravity) into replacing Quantum Mechanics, as can be seen from your phrase: "The root of our problem may instead be that quantum theory itself must be replaced by a more fundamental theory". But isn't that an aesthetic evaluation of the kind you complained about in your book? Why do (most?) researchers think the ugly theory of the micro is the one that is fundamentally wrong, instead of the beautiful theory of the macro? Shouldn't we also be trying to find other good theories that result in Lorentz invariance?

  5. Sabine wrote:

    “Approaches based on methodological reductionism – like grand unified forces, supersymmetry, string theory, preon models, or technicolor – have failed for the past 30 years.”

    Grand unified theories (like SU(5)) and supersymmetry fit exactly into your definition of “theory reductionism”.

    GUT is an attempt equivalent to the unification of the electric and magnetic forces in Maxwell’s equations. SU(5) GUT is not an example of a failed methodological reductionism, it is an example of a failed theory reductionism.

    Supersymmetry attempts to extend the Lorentz symmetry in the same way that Lorentz symmetry extended Galilean symmetry. It does not tell us what the electron is made of, it suggests that the electron should have a supersymmetric partner.

    Replies
    1. Udi,

      These theories posit that the symmetry is restored at high energies above which new particles will complete the presently known set. Yes, they are examples of theory reductionism. You seem to have missed the point where I say that particle physicists seem to only recall the cases where both types of reductionism happen to coincide.

    2. PS:

      "Supersymmetry attempts to extent the Lorentz symmetry in the same way that Lorentz symmetry extended Galilean symmetry."

      For all we know today, Lorentz symmetry isn't broken.

  6. Reductionism isn't limited to phenomenological physics. The universe as an underlying structure of basic quantum fields isn't identical to the proposed structure of the phenomena in the microcosm, because the underlying structure of the basic quantum fields creates the observable phenomena (the general concept of QFT).
    It was Parmenides who reasoned that phenomenological reality is created by an underlying reality. It isn't for sure that the Greek word "atom" was originally about matter. There are translations that suggest it was related to Parmenides' underlying structure.

  7. In computer science terminology, reductionism is a search heuristic — that is, a rule-of-thumb that can help find solutions to complex problems. Searching for human settlements by walking downstream is a good example of a search heuristic. There is no guarantee it will work, but it has proven useful often enough to be worth a try.

    For information systems, the need to use a finite number of bits means that the reductionism heuristic can succeed only down to the level of granularity imposed by those bits. Forced attempts to find still smaller parts will fail, but in a fascinating way: They tend to rebound, generating more complexity rather than less.

    A trivial example is that someone might "simplify" modulo 7 addition by using the labels "Sunday" through "Saturday" instead of 1-7. It's not wrong, exactly, and in fact it can be very handy within a narrow domain. But if accepted too blindly, this kind of overspecification limits generality; it creates human noise.
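
    A minimal Python sketch of that trade-off (the helper names are mine, purely illustrative): the day-label version computes exactly the same thing as plain modulo-7 arithmetic, but is welded to one narrow domain and no longer generalizes to, say, modulo 12.

    DAYS = ["Sunday", "Monday", "Tuesday", "Wednesday",
            "Thursday", "Friday", "Saturday"]

    def add_mod7(a: int, b: int) -> int:
        # General version: works for any modulus with a one-character change.
        return (a + b) % 7

    def add_days(day1: str, day2: str) -> str:
        # "Simplified" version: same arithmetic, now overspecified to weekdays.
        return DAYS[(DAYS.index(day1) + DAYS.index(day2)) % 7]

    print(add_mod7(5, 4))                  # -> 2
    print(add_days("Friday", "Thursday"))  # -> "Tuesday"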

    Ironically, the opposite strategy, overgeneralizing, is even riskier. This failure mode tends to hit mathematically inclined programmers the hardest, especially if they do not actively guard against it. An example would be generalizing software to cover an almost infinite number of vibration modes when the constraints of the actual physical problem allow only a narrow range of modes. Because premature overgeneralization ignores key constraints of the problem, it can obscure the critical clues needed to find a more elegant solution.

    Pragmatically, if a promising solution strategy starts out small but over time begins to expand without limit, it is likely an overgeneralization. It means that at some point the "exit ramp" to the real solution was overlooked, and that most of what has happened since has been noise elaboration. Such spaces can be huge fun to explore, I should note. Like Second Life, they enable players to build off of each other's work and create increasingly complex ecosystems, all without fear of contradiction.

    From a game theory and artificial intelligence perspective, the publication trajectory of theoretical physics gives evidence that some kind of exit ramp was missed back in the late 1970s. It happened shortly after completion of one of the most amazing intellectual constructs in human history, the Standard Model of particle physics. It was only then that publications in theoretical physics began to go fractal, generating enormous numbers of papers that are both mutually contradictory and experimentally unverifiable.

    Such branched solution structures are common in human history, but they are more commonly called religions. As with religions, a lack of experimental verification means there is no way to eliminate branches fully, so the direction of growth is guided mostly by the persuasive abilities of charismatic leaders.

    It is oddly easy to uncover the main assumption that led to this overshoot: Planck foam. This experimentally unverifiable concept was itself based on the debatable faith assumption that despite the powerful topological warning message from General Relativity that gravity is not the same as the other forces, it nonetheless must be the same. A more rational search strategy would have been to focus on creating a version of the Standard Model with fewer assumptions and energy levels comparable only to the particles it predicted. Such a compact Standard Model could still have string-like vibrations, but only by limiting them to the very narrow set of vibration-like behaviors seen experimentally in Regge trajectories. It was these very vibrations that through an act of faith were transplanted across 20 orders of magnitude of increasing energy and shrinking size to become the astronomically overgeneralized Planck-scale strings of string theory.

    That is always the danger of letting heuristics become articles of faith: You stop being a science and start becoming a religion.

    Replies
    1. Hi Terry,

      your comment is quite inspirational. I have never seen reductionism in this way before!

    2. Thank you, your kind words are greatly appreciated!

  8. Dear Sabine,

    Thank you for clarifying the distinction between reductionism in material structure and in theory structure. I consider quark confinement to be a manifestation that the limit in material structure reduction has been reached. If this consideration is true, it would mean that QCD should be derivable from an underlying theory.

  9. Do the real "string theorists" of today, meaning the high-end ones at least one generation removed from the originals, still actually believe in string theory without the quotes ... that is, actual literal strings like the "5 types" of heterotic strings, or literal strings tethered to branes? If they do, presumably they must agree that R-parity is seriously broken.

    The alternative to "yes" seems to be that they are using "string theory" in quotes for something like ADS/CFT.

    Is this correct?

  10. "Electromagnetism, is therefore clearly a case of theory reduction, but it did not come with a methodological reduction." You're giving credit to Boltzmann and Gibbs for reducing the bulk properties of matter to a measure of atomic/molecular states, but not to Maxwell for reducing electricity and magnetism to waves? Seems unfair, since in both cases it's quantum fields all the way down.... Particleism!

    Replies
    1. Maxwell did not "reduce electricity and magnetism to waves." That's a serious misunderstanding of what Maxwell did. Maxwell combined numerous previously separate equations into one consistent set and found from this that electromagnetic waves must exist.

  11. This is a succinct, well-written outline of where things stand today. I noticed that you take a "side" and you made the effort to acknowledge other positions.

    By the way, I know that you were a guest on Econtalk. I plan on listening to that discussion this week. I assume that the host, Russ Roberts, read your book in preparation for the discussion. An economist and a physicist meet in a bar. What do they talk about? :-)

  12. Bee,

    are Verlinde's entropic gravity, Xiao-Gang Wen's theory of emergent particles, Volovik's superfluid universe and Sean Carroll's gravitizing of quantum mechanics all examples of anti-reductionism?

    Replies
    1. Wen, Volovik, and Carroll's approaches are clearly methodologically reductionist. Verlinde's is not, on the face of it, but he interprets it this way. That is to say, in his papers he takes the point of view that the modification of gravity that he deals with is emergent from some underlying short-scale structure, but this doesn't actually enter his derivation. So it's somewhat ambiguous. One could say that by motivation it's methodological reductionism, but by mathematics it's theory reductionism.

    2. Interesting. What's the current status of entropic gravity? I know Lubos thinks it's wrong on the grounds of reversibility.

      when you say

      "The root of our problem may instead be that quantum theory itself must be replaced by a more fundamental theory, one that explains how quantization works in the first place."

      do you have any idea what this more fundamental theory is, and if it is discovered, will it explain both QM and GR and the 19 experimental parameters of the SM (or more when including neutrino masses)?

      What would such a theory even look like? Is there any branch of mathematics you think could serve as a guideline?

    3. The current status of entropic gravity is that it's no longer called entropic gravity because people didn't understand that it's reversible.

      As I have said a few times, I think we should solve the measurement problem first. Yes, I have a pretty good idea what that theory looks like. What it has to say about quantum gravity, though, I don't know.

      What branch of mathematics? Well, as I keep repeating, the measurement process is non-linear. It's not that I am postulating this, we know experimentally it's not linear. So anyone who sticks with the mathematics of linear systems only can pack their bags as far as I am concerned.

    4. Sabine wrote:

      “What branch of mathematics? Well, as I keep repeating, the measurement process is non-linear. It's not that I am postulating this, we know experimentally it's not linear. So anyone who sticks with the mathematics of linear systems only can pack their bags as far as I am concerned.”

      A wave-function collapse can be described using a projection operator. It is non-unitary and irreversible, but it is linear. I’m not sure why you are so certain that the measurement process is non-linear.

    5. Udi,

      The collapse is a projection followed by normalization. If the initial state 1 ends up in the detector eigenstate 1 and the initial state 2 ends up in the detector eigenstate 2, then the initial state 1+2, properly normalized, does not end up in the detector eigenstate 1+2. I explained this in my previous post about many worlds.
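
      Spelled out for a single qubit, with M denoting projection followed by normalization, to make the non-linearity explicit: M maps |0⟩ to |0⟩ and |1⟩ to |1⟩, but it maps the normalized superposition to one of the eigenstates, not to their sum,

      M(|0\rangle) = |0\rangle,\quad
      M(|1\rangle) = |1\rangle,\quad
      M\!\left(\tfrac{1}{\sqrt{2}}(|0\rangle + |1\rangle)\right) \in \{|0\rangle,\, |1\rangle\}
      \;\neq\; \tfrac{1}{\sqrt{2}}\left(M(|0\rangle) + M(|1\rangle)\right),

      so M cannot be a linear map.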

    6. Sabine wrote:

      “The collapse is a projection followed by normalization. If the initial state 1 ends up in the detector eigenstate 1 and the initial state 2 ends up in the detector eigenstate 2, then the initial state 1+2, properly normalized, does not end up in the detector eigenstate 1+2. I explained this in my previous post about many worlds.”

      I’m not sure if you are talking about MWI or an interpretation with a collapse.

      If there is collapse, then you need a mechanism for this collapse. Bell taught us that this mechanism would be non-local, but I don’t see why it should be non-linear. For example, with hidden variables, the collapse might seem non-linear in the wave-function, but in the combined space of the wave-function and the hidden variables it might be linear.

      To be clear, being non-local is a much bigger issue than being non-linear. If it has to be non-linear, we will have to live with this complication, I just do not see why it must be non-linear.

      In MWI a measurement is not a special event and there is no collapse. The initial state 1+2 does end up in a detector superposition.

    7. “... it has to be ...” non-local - the much bigger issue.

      I find it amazing that you can accept/see non-locality more easily than non-linearity.
      In MWI I thought that you get out of Bell’s non-locality via counterfactual definiteness somehow.
      I also once found MWI a nice explanation, but only before branching and not for the very measurement problem – see here.

    8. Udi,

      I am sorry, I thought you had a typo in “... it has to be non-linear ...” - my fault - damn it.

    9. Reimond wrote:

      "In MWI a measurement is not a special event and there is no collapse. The initial state 1+2 does end up in a detector superposition"

      It seems to me that MWI does not try to measure anything physical at all. If this consideration is correct, I would not include MWI in what I call physics research!

    10. A former LEP experimentalist wrote:
      “It seems to me that MWI does not try to measure anything physical at all. If this consideration is correct, I would not include MWI in what I call physics research!”

      MWI is not physics research. MWI's only role is to show us that all quantum interpretations do not belong in physics research. There is not a single shred of experimental evidence that requires quantum interpretations.

  13. The root meaning of the word abstract is "to draw away". We are going to use the word in that sense. From the etymology we are moving to the connotation. Firstly, let’s move from an object to its property. If from an object, we draw away color, then color is an abstraction. We are looking at color as something apart from the object, or more appropriately, with reference to our connotation, we are looking at the property as something away from the object. That is why any color is an abstract noun.

    Secondly, we are moving from the property to the object. The potter has the design of a pot in his mind. In that design, the pot has a color, say orange. Now is there a concrete pot? No. The pot only exists as a concept or design in the mind of the potter. That concept or design is "away" from the object pot and in the mind of the potter even if the pot does not exist; therefore, it is abstract. The potter then applies his skill to the lump of clay and realizes the abstraction in his mind into a pot, which now becomes concrete, and then paints it orange. We have now applied the abstraction, color, to the concrete object pot.

    Just as we draw away or abstract a property from an object, we can also realize a property by applying it to a concrete object.

    Sticking to our connotation "to draw away", we start with the cell as an abstraction of structure and functionality--you can also draw away functionality from an object and look at it as separate from the object or "away" from the object. The next level of abstraction is the tissue, one level up is the organ, then the systems, circulatory, respiratory, nervous etc., and finally the whole human body. These are all levels of abstraction. We can go on in this way to higher levels of abstraction like the animal kingdom, plant kingdom, living things, solar system, galaxies, cosmos, so on and so forth, and ask where the increasing level of abstraction ends. This is like starting with the cell as the highest level of abstraction and moving to the next lower level of abstraction, which is organic compounds; followed by molecules; followed by atoms; followed by protons, neutrons and electrons; followed by quarks and leptons; and then asking where reductionism ends.

    Starting with the cell, there are higher levels of abstraction and lower levels of abstraction, but the common thread is abstraction. Proceeding from quarks and leptons and moving to the infinitesimal abstraction, which is energy in pattern, we still land up with the semblance of a pattern. Undo the pattern and all is energy. If you see a pattern, you abstract because there is the pattern as something separate from the whole. This undoing of the pattern at the infinitesimal level and merging into the whole is the ending of abstraction. Since there are only levels of abstraction, then if I stop abstracting at any level there will be a merging into the whole. Then there is the uninterpreted primordial or the undefined: the colorless, the tasteless, the odorless, the indescribable of Lyall Watson's actuality. This implies that I can end reductionism at any level and still land up with the uninterpreted primordial or the undefined. The ending of abstraction is the ending of reductionism.

  14. Thank you, this is a beautiful essay. It provides a basis for an idea that I've voiced several times without any justification except gut feel: that while the 20th century belonged to the particle physicists, the 21st feels like it will belong to the condensed matter folks.

  15. Continued. . . reductionism and abstraction. . .

    "the undoing" is "negation". When you negate the pattern of energy, it is all energy.

  16. In my understanding the prevailing theories, such as quantum mechanics and relativity, have been blocked from reductionism because they were based on “principles” rather than on lower-level physics. In common understanding a principle is not deduced from facts of a lower level but is a basic rule by itself. - In the case of QM, Erwin Schrödinger attempted to avoid this but lost his fight against Werner Heisenberg. In the case of relativity Hendrik Lorentz deduced his approach to RT from lower-level facts (as for instance the known behaviour of fields and the properties of elementary particles), but he in this case lost his fight against Einstein.

    Consequently, a solution could be that both QM and relativity are given a new basis; in the case of relativity such a basis is available in the approach of Lorentz. And in particle physics we should be open to looking at an inner structure of these objects.

    Replies
    1. antooneo,

      your current comment seems quite plausible to me, unlike some of your previous contributions, which were really hard to understand.

    2. Former LEP experimentalist:

      thank you for the feedback. But what did you find hard to understand in my preceding comments: The physical contents or my way to explain them?

      Regarding the question of reductionism: I think that there was a real break in the method of physical theories 100 years ago which is not so difficult to understand (i.e. basing on principles rather than on lower-level physical laws), and that has caused the present situation in which reductionism is interrupted (which Sabine has addressed).

    3. Dear Antooneo

      "But what did you find hard to understand in my preceding comments: The physical contents or my way to explain them?"

      The answer is: Both!

      "Regarding the question of reductionism: I think that there was a real break in the method of physical theories 100 years ago which is not so difficult to understand"

      Maybe you yourself have no problem in understanding this topic. The feedback from your audience should advise you that other people have considerably more problems in understanding these low-level principles. Personally, I have not even heard about this remarkable breakthrough.

      Perhaps you should write an article about these basic principles for a general audience, for example in Scientific American or Quanta Magazine.

  17. I think the finite universe is conscious with free will, which would mean the whole universe is effectively in charge of the parts. Spacetime seems to be unified and in charge of the particles. The particles seem to act like children, low-energy particles being given less freedom than high-energy particles.

    A 2000-atom molecule was recently shown to have quantum effects. Maybe a quantum-coherent molecule serves as a homunculus in brains.

    Maybe extremely high energy particles can become new universes. Maybe particles are the universe's way of very slowly procreating.

    Replies
    1. Kevin,

      there is no evidence at all that the finite (???) universe is conscious and that it might have free will. But for human beings it is quite obvious that they are conscious and have some kind of "free will", whatever this exactly might be.

  18. Funny thing to mention too is that Many Worlds, which you wrote about in a previous blog post, >really< bumps into reductionism in an 'explosive' way. Still I don't buy that idea. Talking about reductionism, it then conflicts with why MW should only be applicable to the microscopic world and not to the macroscopic one. MW then, according to me, becomes a violation of 'gauge invariance' at the distance scale.

  19. Whenever I see "reductionism", my first question:

    How is the theory of biological living cells (which was mentioned in a comment above) "reduced" to the Standard Model of particles?

    Replies
    1. That is a wrong question, or simply it can't be done, because particles have many more degrees of freedom than living cells. In fact there is no valid relation between cells and elementary particles to even quantify this problem.

    2. That answers my question right there: "it can't be done".

    3. A photon or gauge particle with zero mass has two degrees of freedom. These are the two transverse helicity states or polarizations of the electric field, or the analogue for weak QFD or strong QCD. If it absorbs a Goldstone scalar from the Higgs, as the W and Z particles do, it then has 3 degrees of freedom; the additional one is longitudinal. A fermion has its mass, its spin, and its gauge quantum numbers: isospin, charge or color. This means from a physics perspective a cell with around 10^{-10} moles of atoms will have the same number of degrees of freedom as the particles that it is made from.
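
      (Filling in the arithmetic behind that figure: 10^{-10} mol × 6.0×10^{23} mol^{-1} ≈ 6×10^{13} atoms, consistent with the ~10^13-atom, nanogram-scale objects mentioned in the post.)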

      This would be the case if the whole were equal to the sum of the parts. Yet we know from QM and entanglement this is not quite true. In fact the quantum rules are adjusted by an einselection of classical states, and this appears to imply some form of emergence. Another way to see it is that QM is Markovian in a frequentist statistical sense, and biology is subMarkovian or has a measure of pink noise. As such it could be argued a biological organism, even the simplest archaebacterium has more degrees of freedom than the particles which compose it.

    4. Bohr already reasoned that reductionism must fail in the case of living organisms. His argument goes like this:
      "... we must keep in mind ... that the conditions holding for biological and physical researches are not directly comparable, since the necessity of keeping the object of investigation alive imposes a restriction on the former, which finds no counterpart in the latter. Thus, we should doubtless kill an animal if we tried to carry the investigation of its organs so far that we could describe the rôle played by single atoms in vital functions. In every experiment on living organisms, there must remain an uncertainty as regards the physical conditions to which they are subjected, and the idea suggests itself that the minimal freedom we must allow the organism in this respect is just large enough to permit it, so to say, to hide its ultimate secrets from us. On this view, the existence of life must be considered as an elementary fact that cannot be explained, but must be taken as a starting point in biology, in a similar way as the quantum of action, which appears as an irrational element from the point of view of classical mechanical physics, taken together with the existence of the elementary particles, forms the foundation of atomic physics. The asserted impossibility of a physical or chemical explanation of the function peculiar to life would in this sense be analogous to the insufficiency of the mechanical analysis for the understanding of the stability of atoms."

    5. “... a biological organism, even the simplest archaebacterium has more degrees of freedom than the particles which compose it.”
      Better put it this way: The number of possibilities, of possible relations between the particles, explodes combinatorially with the number of particles.
      But the degrees of freedom, the entropy of a combined object, are also further restricted, because the parts are correlated in space – it has a shape, it “... isn’t statistically independent of itself ...” - same holds for a measurement device – that's what I asked Sabine here.
      On top of being almost “classical”, a living entity such as a cell is a pretty organized and low-entropy entity. It has to get rid of entropy all the time – it does so by being in a permanent flow of low-entropy energy in and high-entropy energy out. Life feeds on low entropy – you eat sugar and not a bag of unrelated C, H, O atoms.

      Weak emergence is strong enough.
      Reductionism works fine.
      We just need to also include the measurement process that puts stuff into relation.
      Measurement is the very process of getting entangled first and finally being reduced again – the first is unitary, the reduction is the non-linear step. See also here.
      The very job description of entanglement is to reach and check out the neighborhood. A neighborhood in permanent relation has a shape and is localized in space.

    6. @Marc E:

      "That is a wrong question and or simply it can't be done; because particles have much more degrees-of-freedom then living cells. In fact there is no valid-relation between cells and elementary particles to even quantify this problem."

      Your statement is correct. But for me this is exactly the reason why it is a very good question indeed.

    7. I was thinking of DoF as quantum bits, such as the polarization states of a photon. In standard Statistical mechanics we have microstates that define macrostates, where if the microstate is "shuffled around" the macrostate remains unchanged. Clearly a living system has a small macrostate, because if it is significantly perturbed it ceases to live. This then means the DoFs are coupled to each other in complex ways. We could think of these as constraints, and the DoFs of the living system are then in a sense reduced. In effect the excess are "dumped out."

      Living systems are not closed thermodynamic systems. They are open systems. However, we are thinking here along the lines of closed thermodynamics, which is the standard approach. This makes things difficult, and as far as I know there is no settled idea of just what entropy is for open systems.

      So in one sense you are right. If one just thinks of there being DoFs of particles, these are the same. If one considers the total thermodynamic throughput, in one sense there may be more. Constraints within the system, though in a sense local to that system, mean a reduction.

    8. Another aspect of converting DoFs is that “... without the formation of molecular hydrogen it’s very hard for cosmic gas to cool off. If it can’t cool off it can’t condense to make stars ...”, mentioned in this otherwise somewhat esoteric opinion piece here.

    9. "...of getting entangled first and finally being reduced again"
      Reduced again? What do you mean by ‘again’?

    10. Falker,

      I regard measurement (reduction of the wavefunction/state) as an integral part of the dynamics of our world, that happens all the “time” and everywhere. Thus after a measurement is before a measurement – nature does this again and again.
      The unitary evolution in between measurements just calculates the probabilities that then are realized in a measurement.
      A measurement happens when a small system of QM particles gets in the neighborhood of a bigger system of QM particles. (*)
      A QM experiment like double slit, EPR, ... in the laboratory does just this in a controlled environment. It brings a small QM system, e.g. a single QM particle close to a big system, e.g. the slits and the screen, the “classical” measurement device.


      ------------------
      (*) in a way, the big influences the small, but this is no top-down causation and does not contradict reductionism, since the big also consists of just QM particles. Both systems just get entangled. Both systems are put into relation. This relational aspect is very important and so far did not play a major role in reductionistic thinking.
      Finally, in the reduction, one of the superposed states is selected and the entanglement breaks again – that's the non-linear part.

    11. Reimond,

      Ok, thanks. It’s an important point (‘integral part of the dynamics...’). This might be obvious to physicists(?), but it was not to me.
      I was somehow beginning to wonder whether talking about an isolated quantum system makes sense. Well, yes that makes sense, since the whole problem is about the becoming of the system once placed in a larger context/environment!

      It seems clear also as you said that there is no downward causation here. I have still to become more familiar with the two steps of the process (entanglement and reduction). As I see it now (I might be wrong), entanglement is related to the fact that the system...conforms to the universe. Or we could put it the other way round: the antecedent universe enters into the constitution of the system. This first phase is a phase of reception of the actual world. If nature could speak, the quantum system might get an answer like ‘hello, welcome!’ ;-)
      The second phase seems to be related to how the system becomes completely fused or integrated with the environment. Well, I have to think more about it.
      I entirely agree about the ‘relational aspect’. Btw, I am just playing on a philosophical ground. And I have a little meaning problem with QM.
      (sorry for the late answer, I am slow)

  20. Reductionism in particle physics is associated with high energy. The smallest units of nature are generally thought to be observed as the wavelength of a probe field λ → 0 and the frequency ν → ∞. For this reason the energy diverges when one observes on ever smaller scales. Since the smallest a black hole can be is one that has a Schwarzschild radius equal to its wavelength, these limits are replaced with λ → ℓ_p and ν → 1/ℓ_p.

    Consider a proton. At very low energy it appears as a point-like particle. At the MeV scale it now appears more as a sphere, and at GeV energies and larger quarks and gluons become apparent. Now suppose we consider the proton boosted to frames closer to the speed of light. The gluons in the proton will become ever more apparent. If the proton is moving towards you then some tiny probe field will tell you the gluons are like Feynman's petite partons that multiply as the proton is boosted to higher energy or γ. In this case the proton is more UV appearing. If the proton is moving away it is far more IR and low energy. For a black hole the proton will approach the horizon, become red-shifted to IR, and it will appear at very low energy. If you on the other hand boost yourself to an extremely high accelerated frame near the horizon, the proton can appear UV, and the gluons made very apparent. The black hole is then a sort of running renormalization group machine. Either one remains away from the black hole and quantum fields become entangled with the black hole in an extreme IR setting. Or one boosts to an extreme frame where particles are at near Planck energy, Hawking radiation is extreme and things appear to be in very maximally mixed states.

    So which is it fundamentally? It is either, and whichever way nature appears is entirely up to how the observer chooses to perform these measurements. In fact the Einstein Field equation could be seen as

    UV quantum gravity = IR gauge plus fermionic fields

    and as a correspondence between either reductionism of localized states or a more nonlocal, so-called holism of states in entanglements.

  21. I wouldn’t quite say that reduction itself is a problem, but rather a necessary evil that the human is forced to negotiate. It seems to me that the human cannot make sense of complex reality until it is broken into chunks it can make sense of by means of analogies with previously grasped ideas. Here an education may effectively be referred to as someone’s “bag of analogies”.

    Another thought to consider here is causality itself. If something occurs by means of something else, then it may effectively be said to “emerge” from those dynamics. Thus here reductions exist to be discovered. Does anything exist without causal dynamics from which to exist, and so yield no possible ontological reductions? Maybe, though to me that would imply supernatural dynamics.

  22. Thank you. I agree that we need another Noether or Einstein more than another LHC. What do you make of Nima Arkani-Hamed and others' slogan that "spacetime is doomed"? It sounds like an opening for theory reductionism, but he seems wedded to String Theory. Then again, at some point Nima says String Theory is just an excuse to postulate zillions of different vacua to solve fine tuning.

    Question/Speculation: What do you think of the idea that the universe is strongly observer dependent and any theory we come up with only has to explain (or we can only hope to test) how it reproduces the experience from our very particular point of view? The theory may postulate a detailed mechanism for "what is", but we can only test the part "what it looks like to us". In that sense String Theory and the Multiverse are theories of "what is" with testability problems. Anthropic solutions to the Fine Tuning Problem, and Carroll's view that probabilities are a point-of-view phenomenon, are theories of "what it looks like to us". The latter kind of theories imply a big reduction of scope, we can only really talk about what the universe looks like to us, but these theories may be more robust.

    I'm looking forward to a physical theory that starts with completely abstract quantum computation and explains spacetime, quantization, and crisp macroscopic observables as emergent properties, but so far physicists like Carroll and Susskind talk only vaguely about spacetime as entanglement. Erik Verlinde's paper about entropic gravity is very interesting, but it looks like some synthesis, or theory reduction as you term it, is around the corner.

    Replies
    1. How is it that spacetime is built from quantum entanglements? That is of course an interesting problem. One way of arguing this is with the convex set structure of physics. If we think of all physics as a form of convex sets of states, then there are dualisms of measures p and q that obey 1/p + 1/q = 1. For quantum mechanics this is p = 2, as an L^2 measure theory. It then has a corresponding q = 2 measure system that I think is spacetime physics. A straight probability system has p = 1, sum of probabilities as unity, and the corresponding q → ∞ has no measure or distribution system. This is any deterministic system, think completely localized, that can be a Turing machine, Conway's Game of Life or classical mechanics.

      Thermal equilibrium is not possible with quantum fields in curved spacetime, nor is it likely in quantum gravity. The reason is not too hard to see. Suppose you have a black hole in a thermal background with the same temperature as its horizon, T ~ 1/8M. The black hole has an equiprobability of absorbing or emitting a photon with energy δM. The temperature then adjusts as T - δT ~ 1/8(M + δM) or T + δT ~ 1/8(M - δM) and is shifted away from thermal equality. This will then enhance the probability that the black hole either grows by absorbing more photons or shrinks by emitting them. There is no thermal equilibrium. Quantum gravitation is likely the same, for the effective specific heat of event horizons is negative.
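
      Filling in the one step behind that last remark: with T ∝ 1/M one has M ∝ 1/T, so the specific heat

      C = \frac{dM}{dT} \propto -\frac{1}{T^2} < 0

      is negative. Absorbing energy lowers the temperature and emitting raises it, so either fluctuation is self-reinforcing and there is no stable equilibrium.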

      More generally the entropy of a black hole is S = A/4ℓ_p^2 + quantum corrections, where these corrections are ~ (δS/δh^a)k^a. Here h^a is tangent to an event horizon and k^a is normal. This is the condition for coincidence of a null surface with a quantum extremal surface at one point with null tangent g^a, so that (δS/δh^a)k^a ≥ (δS/δg^a)k^a by subadditivity. However, this surface generally occurs inside the cosmological horizon. This means there is no equilibrium. Equilibrium is only approximated by stretching the horizon out to enormous distance after the spatial surface has inflated. This subadditivity connects the entropy of horizons with the entropy of a set of quantum states or an entanglement of quantum states.

      As for a strong observer dependence with the universe in general, I tend to put those ideas on a back burner. I don't completely discount them, but I would prefer to see a working understanding of quantum gravity and cosmology that is not dependent on observership, say in the strong anthropic principle, and I am less impressed with ideas about the conscious state of observers. Maybe at some point physics will come around to this, but I would prefer to see prior to this at least some effective theory of quantum gravity that is not so tied to observers or mental states.

    2. @Pavlos

      "What do you think of the idea that the universe is strongly observer dependent"

      I don't know if the universe itself is observer dependent, but the observations we make are dependent on the individual observer. Within particle physics, the observations might be confirmed by different experiments, but multiple observations from different people can hardly be combined in order to have a consistent explanation of what happens. The "combined measurement" is quite often ambiguous.

      Human beings are quite adept at applying ambiguous information in order to make further progress. However, ambiguous information can hardly be described in terms of computer programs! (Halting Problem!) Only humans are able to pass the Turing test. Computers will answer this kind of problem with error code 42 ;-)

    3. Sorry, I don't mean "observer dependent" as in humans with consciousness and subjective opinions. More as in abstract principles like relativity. For example in special relativity observers disagree about the ordering of events. Susskind says that observers can disagree about the experience of falling into black holes. Carroll I think would say that an observer on a different branch of decoherence would see a different "roll" on the measurement probability. String theory and Multiverse seem to suggest that we see only a tiny viewpoint of a vast landscape. Hence observer or viewpoint dependent.

      Until recently reductionism worked because physicists could absolutely see the smaller things. The Higgs was barely visible in the statistics. Maybe in the future we have to get used to theories that postulate a huge and inaccessible "what is", perhaps several conflicting formulations of "what is", and all that matters is that they're consistent with a very limited and viewpoint-dependent "what we see". The theory of "what is" gets vastly bigger while our accessible viewpoint gets proportionally smaller.

    4. Pavlos, I understand that you are quite interested in abstract principles. That is fine with me. Mathematicians like to think about abstract principles very much. And they have developed strong formalisms, in order not to fool themselves. No physicist knows up to now, for example, the convergence criteria for perturbative predictions. It would be really interesting to learn more about this topic.

      For me, physics is about explaining and predicting phenomena of nature. It could be that reductionism may describe real-world observations in principle, but I do not believe that these abstract principles will ever explain our real-world observations in practice. I'm not expecting that evidence for these principles will be available in the foreseeable future.

      Thus, a theory of collective phenomena is required from a practical point of view. Reductionism and emergent phenomena do not exclude each other. They are two complementary views on the same subject.

  23. Continued. . . reductionism and abstraction. . .

    How do you abstract? You abstract according to your background. If your background is biology, you abstract as a biologist, or you look at the world through the colored eyes of a biologist. You may wonder what looking has got to do with abstraction. What you pick out while you look is what you abstract. As a biologist you draw away or abstract the cell while looking at the human body, and then study its structure and function as separate or away from the human body. In that your focus or concentration is on the abstract, which is the cell, and the rest of the human body blurs into the background. How you look at the cell as a biologist is different from how you look at it as a biochemist or a quantum physicist. For example, the quantum physicist may only be interested in the quantum phenomena involved in photosynthesis. That is, your background dictates what you pick up or abstract when you look at the cell. Likewise, a body builder may not abstract a cell or focus on the cell at all; instead, he may abstract the muscular structure of the whole human body, or say the limbs alone, and look at them.

    Abstraction is always with reference to a background. Without a background what will you abstract or "draw away"? You will look at a thing as though you are looking at it for the first time. Therefore, this background is the frame of reference, a framework according to which you abstract or even describe. So, what happens when I dismantle, undo or negate this framework? I can no longer abstract; abstraction ends. In my previous post this morning, we said that reductionism ends when abstraction ends at any level. Following from the above reasoning, this implies that when I remove the framework, abstraction ends, and therefore reductionism.

    What is measurement? There can be no measurement without comparison, for example, between a standard ruler and a straight line. And all comparison needs a reference. This reference is the framework or the frame of reference; in our example, it is the ruler. Measurement ends when I undo, negate or remove the framework. Now, measurement needs a frame of reference and so does abstraction. This implies that abstraction is measurement, and the framework is the background. This leads us to the inference that because both abstraction and measurement end when we undo, negate or remove the frame of reference, reductionism ends when measurement ends.

    In this context, we can treat description and interpretation just as we treat abstraction. Description and interpretation are also measurement.

    I have deliberately avoided the words "observer" and "program". "Observer" or "program" and "framework" or "frame of reference" are interchangeable. Observer influence and measurement are also interchangeable. In this context, abstraction, description and interpretation take place when the observer-program is running. So, when the program stops running, reductionism ends because there is no longer abstraction, description or interpretation; there is the uninterpreted primordial or the undefined.

    Replies
    1. "All that is experienced and your own mind are the unique primary reality.
      They cannot be conceptualized according to cause and effect systems of thought." -Longchenpa

      "There is only one truly serious philosophical problem, and that is suicide." -Camus

      "The solution to the problem of life is seen in the vanishing of the problem."
      -Wittgenstein

    2. Dear 19,

      if you want us to believe your claims, you will have to provide really hard evidence for these propositions.

  24. Thrift wrote: How is the theory of biological living cells "reduced" to the Standard Model of particles?

    Could you be more specific? You seem to be implying that organisms (or "life" in general) impose some kind of additional burden on matter.

    Replies
    1. Perhaps the problem is not imposing a burden on matter, but what people think matter (which is everything there is) is.

      "It’s not the physics picture of matter that’s the problem; it’s the ordinary everyday picture of matter."

      Consciousness Isn’t a Mystery. It’s Matter.
      By Galen Strawson
      https://www.nytimes.com/2016/05/16/opinion/consciousness-isnt-a-mystery-its-matter.html

    2. The question I think is whether you can predict the behavior of a macroscopic system based on its microscopic components.

      I don't think that's the case. For example, one cannot predict whether a Turing machine will halt on a given input, and that holds whether or not you know the quantum state of every particle in the machine.
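
      (The standard diagonal argument behind that claim, as a Python sketch; halts() is hypothetical, since the whole point is that no such function can exist:)

      def halts(program_source: str, program_input: str) -> bool:
          """Hypothetical perfect halting predictor -- assumed, not implementable."""
          ...

      def diagonal(source: str) -> None:
          # Do the opposite of whatever halts() predicts for this program run on itself.
          if halts(source, source):
              while True:  # predicted to halt, so loop forever
                  pass
          # predicted to loop forever, so halt immediately

      # Running diagonal on its own source contradicts halts() either way,
      # so a perfect halts() cannot exist.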

    3. I don't know whether there's a "Turing-machine halting problem" in the code of cells, but the code of OpenWorm can be examined. (Can it be reduced to quantum mechanics code?)

      https://github.com/openworm

    4. There is a sort of halting problem in biology. We call it cancer. The cell cycle runs a typical cell from its resting phase through the molecular processes involved with division. There is a whole lot of signaling that goes on in the exterior of the cytosol, with g-factors that bind to tyrosine-phosphorylation-activated pathways. In the interior there are cytokine-modulated processes that signal replicase to duplicate DNA, and others that activate kinetochores to produce tubulin fibers to separate chromosomes, and lots more.

      There are a lot of things that can go wrong. If the gene that expresses the tyrosine growth receptor has a single nucleotide polymorphism (SNP) it can change the shape of this polypeptide so that it autophosphorylates the tyrosine residue to initiate a cell cycle. This will lead to a runaway cell cycle. There are isozymes of the PDK kinase that initiate a large pathway in the cell cycle, and lots of others. Well, there are defense mechanisms, such as the P53 gene that expresses a large protein that can process gene repair or initiate apoptosis, or programmed cell death. So a cell that is running away can be stopped or it can be killed. The P53 gene, or its TP53 polypeptide, is a tumor suppressant. There is also the retinoblastoma gene and others that are meant to stop renegade cells.

      Is this perfect? Of course not. There can be SNPs on the genes that express these, and in fact the aromatic ring compounds in smoke preferentially bind onto the P53 gene and muck it up. There are genes that produce DNA transferases that check genes and activate b-cell activity if there is an errant gene, but that is not perfect. This really comes down to the problem of how a system “knows” or encodes information concerning the halting state of all its molecular pathways. It can do a pretty good job, but it is not perfect. Biology does not solve the halting problem, but it does a pretty good job of getting around it.

    5. You just described a non-linear system. Feedback in coupled biochemical pathways (metabolic and cellular).
      For me it is almost a miracle how evolution was able to generate such complexity.

      In physics now it is only the question of where to insert a tiny non-linear element, aka measurement.

    6. Without spoiling that probabilities add up to 1.

    7. The very non-linearity - also in biochemical pathways - is a nice argument to kill fat Schrödinger’s cats.
      A living cat, as well as all “classical” objects, implies non-linearity, but superposition in QM needs 100% linearity.

      Strikes me as a better argument than Sean Carroll’s here.
      Parallel worlds where e.g. Jesus was not crucified, but killed by a poisoned apple - how preposterous.

      Sure, the “Schrödinger’s Cat Killer” won't become a blockbuster like killing Vampires, but at least low-fat Schrödinger’s cats do exist.

      Delete
  25. I wonder: if we finally find nothing, does that mean that Lawrence Krauss is right?

    ReplyDelete
  26. You said it of Quantum Mechanics, and I suspect it is also true of General Relativity, that they both approximate nature rather than duplicate it. If true, this would be the cause of much of the controversy and difficulty in advancing those physics. When it comes to the smallest particles in QM, ones that can't be observed directly, I sometimes wonder whether they all physically exist, or whether they are purely mathematical representations that calculate the phenomena we can and do observe, causing us to assume all the particles are physical entities.

    ReplyDelete
  27. "Another well-known exception is Einstein’s theory of General Relativity. General Relativity can be used in more situations than Newton’s theory of gravity. But it is not the physics on short distances that reveals the differences between the two theories. Instead, it is the behavior of bodies at high relative speed and strong gravitational fields that Newtonian gravity cannot cope with."

    Actually it is very easy to make a relativistic Newtonian theory of (scalar) gravity that reduces to the usual one in the right limit. The problem is that it gives the wrong value and sign for the precession of Mercury's orbit and does not predict the deflection of light. It was called Nordström's theory.

    -drl

    ReplyDelete
  28. Reductionism is intrinsically limited.

    At the most basic level, reductionism assumes the impossibility of "strong emergence": that any complex system's properties/behavior can be fully explained by, or reduced to, the properties/behavior of its elementary components. This implicitly assumes that the fundamental laws governing a complex system's elementary components are enough to predict/describe the whole system; in other words, that these fundamental laws are "complete".

    But ultimately this is a problem of mathematical logic, and it has been known since Gödel's incompleteness theorems in the 1930s that we cannot assume that any axiomatic system is complete. Even more, Chaitin's extension of Gödel's results implies that complexity is a source of incompleteness, and that incompleteness is pervasive in complex systems.

    Nobel laureate physicist Phil Anderson's "More is different" is just a physicist's reflection on results already given mathematical precision by Chaitin and others.

    Observational results on rotating galaxies strongly suggest that you can consider these rotational speeds a new "natural law" for galaxies - a complex system of stars - that is, a strongly emergent property making "dark matter" superfluous. http://physicsbuzz.physicscentral.com/2016/10/a-natural-law-for-rotating-galaxies.html?m=1
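    The "natural law" in question is the radial acceleration relation reported in the linked article: the observed acceleration is a fixed function of the acceleration computed from the visible baryonic mass alone. A minimal numpy sketch of the published fitting function (the acceleration scale g_dagger ≈ 1.2e-10 m/s² is the fitted value from the paper; the sample points are mine, for illustration):

      import numpy as np

      # Radial acceleration relation (McGaugh, Lelli & Schombert 2016):
      # g_obs = g_bar / (1 - exp(-sqrt(g_bar / g_dagger)))
      g_dagger = 1.2e-10                      # m/s^2, fitted scale

      def g_obs(g_bar):
          return g_bar / (1.0 - np.exp(-np.sqrt(g_bar / g_dagger)))

      g_bar = np.logspace(-12, -8, 5)         # typical galactic range
      print(np.round(g_obs(g_bar) / g_bar, 2))
      # The ratio (the inferred "dark matter" factor) is large at low
      # accelerations and tends to 1 at high ones, with no per-galaxy
      # free parameters.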

    ReplyDelete
    Replies
    1. Emergent phenomena can be very strong and surprising, but they emerge upwards. Chemistry and neurons cause consciousness, and although the phenomenon and the microscopic mechanisms are there to be observed, we don't adequately understand how one gives rise to the other. That's upward emergence. Consciousness doesn't cause telekinesis - or if it did, we would expect to look for and find the microscopic route for any such downward influences, the same way we can analyse how consciousness moves your hand electrochemically.

      The observation that dark matter correlates with visible matter is a good clue, but it cannot be explained away as emergent, for the same reason. Even if complexity is involved, we have to find a mechanism for why individual stars move as they do. It could be mundane matter somehow overlooked, dark matter that just clusters that way, or some revision of Einstein's equations. Researchers have tried to adjust GR for large distances, and it doesn't seem to work. One possibility I haven't heard explored is adjusting GR for complex distributions of masses, hinting that mass is somehow held between the baryons. No, I've no idea how to do that.

      Delete
    2. You really are conflating "weak emergence" and "strong emergence" into one.

      Weak emergence is the only kind assumed by reductionism and mainstream thinking, since it is "upward emergence". "Strong emergence", by contrast, is the possibility of new properties/behaviors in the system as a whole that are not reducible to its elementary components - in other words, new properties (new physics) independent of the physics of its elementary components.

      As was already noted by Phil Anderson, the "new" physics of complex systems is as "fundamental" as the physics of its elementary components, and that is perfectly clear from a logical point of view. But reductionist mindsets are blind to this.

      Delete
    3. Hi Jeremy,

      I completely agree with you. The importance of Kurt Gödel's theorems is largely overlooked, even in the mathematical community. Heading for a TOE is, to a large extent, still a continuation of the failed Hilbert program.

      Delete
    4. Exactly! Many theoretical physicists are still stuck with the Galilean/Newtonian view of physics, modeled on the axiomatic structure of Euclidean geometry; but a lot has been learned about axiomatic systems since then.

      It is just wishful thinking to assume that a few "fundamental" principles are enough (complete) to fully describe Reality, and Gödel's results, as extended by Chaitin and others, clearly contradict that assumption.

      These results are a strong reaffirmation of the empirical root of the scientific method: nothing can replace the constant observation/testing of Reality, since the only way to discover Reality's unlimited source of irreducible properties is by observing it.

      Theoreticians will always tend to give priority to their pet theories (dogmatism) and belittle empirical evidence, but the history of science is there to show that sooner or later empirical facts fly in the face of their always limited and narrow assumptions. There is no TOE; that is just the ultimate dream of dogmatic minds.

      Delete
  29. Continued. . . reductionism and abstraction. . .

    Let us not go as far as the uninterpreted primordial. Keeping it simple, if we consider the electron or photon to be the least interpreted primordial, then the least interpreted primordial may be in superposition. Let us go into this a bit. The detector is the observer in every sense in which we have been using the word, and detection is measurement. We said in my previous post that in the absence of the observer there is no measurement. In the double-slit experiment, too, in the absence of the observer or detector the superposition is untouched by measurement, which is represented by the interference pattern on the screen. Superposition is when measurement is not, and measurement is not when the observer or detector is not. But we have shown that in the absence of the observer or measurement, reductionism ends. That is, superposition begins when reductionism ends.

    ReplyDelete
  30. Empty space, or Spacetime, is often thought of as just the stage on which the particle performers play. But the stage may be more powerful than all the particles combined, and might very much be in charge.

    The obvious point is that Spacetime is bigger than all the particles put together, by a very large margin. Spacetime is thought to have preceded the particles. Spacetime can move faster than the speed of light, unlike particles. Spacetime operates at the highest frequency, Planck time, and the shortest wavelength, Planck length, allowing it to compute faster than any particle.

    If Spacetime is a consciousness with free will, it can transfer information from one side of the universe to the other instantly, by creating two entangled virtual particles at opposite ends of the universe so that, after wave function collapse, information is transferred. Particles can transfer information at most at the speed of light.

    If a bad-boy or good-girl particle captures the attention of Spacetime, Spacetime can use its free will to increase or decrease the speed of time for the particle, thereby speeding up or slowing down wave function collapse. Just slightly increasing or decreasing the speed of time can put the particle in the part of its quantum cycle where collapse happens 0% or 100% of the time, depending on what Spacetime desires.



    ReplyDelete
    Replies
    1. Understanding bad-boy or good-girl bias

      Let's understand the human psychological observer. This observer comes into being through conditioning or programming: if I am conditioned as a Hindu, I believe in reincarnation; alternately, if I am conditioned as a Christian, I believe in resurrection. The conditioning or programming is the background from which I respond or react. The Hindu program or the Christian program, or any other program for that matter, always creates a bias or a prejudice for or against. For example, when the Christian program is running, by virtue of the psychological program the Christian dislikes a Hindu, because he is a non-believer. We know that the program dictates the belief; likewise, when the Hindu program is running, the Hindu has a great affinity for another Hindu. The program clearly influences our looking. There is a bias based on the program, or the background, or the observer. Our culture, religion, nationalism, political ideology, familial inclinations, prejudices and fears, caste, gender, tradition, dogma, etc., condition or program us. Our background is the result of this conditioning or programming, and from this background we respond, react, or look at the world.

      So, as long as there is the observer - that is, as long as the program is running - I am translating, interpreting, or describing everything I am looking at or hearing according to the background, the psychological program. Then there is a prejudice for or against.

      You describe someone as 'bad' or 'good' based on your programming, or the background. Brainwashing and indoctrination are programming or conditioning. They maim you, and the mind becomes incapable of looking or hearing without being judgmental. You cannot see "what is" as "it is"; it colors your vision. We know what havoc indoctrinated youth wreak on this planet earth - for example, suicide bombers. What happens when the observer, which is the background, is suspended, or when the psychological program stops running or stalls? Then you see the fact or the truth.

      Delete
  31. Theoretical neural network, superposition, and entanglement

    Let us say there are 7 dots or hyperlinks. Each dot or hyperlink is a bit of information. These dots or hyperlinks are connected, or hyperlinked, to form a (theoretical) neural network of meaningful information. Recognition is to place what we see or hear against what we know and then find a match. If there is a match, then we recognize what we see or hear; if there is no match, then it is something new. Let us say that these dots, being bits of information, are entangled and therefore in superposition. Recording or registering the neural network in memory is to entangle them into a superposition of states. That is, the neural network exists in memory as a superposition achieved through entanglement of hyperlinks or dots.

    That said, let an external dot or hyperlink come into the field of sensory perception. In an attempt to recognize it, thought places the external dot or hyperlink against the neural network, which is in superposition, to find a match. If the external dot matches any one of the dots in the neural network, there is recognition, which resolves the superposition, in that each entangled dot takes on a true value, or bit, of the information it represents. For example, if the neural network represents the 7 colors of a spectrum, VIBGYOR, with each color corresponding to a dot, then when the external dot matches red, I not only recognize red but also extrapolate and recognize red as belonging to a spectrum, because the recognition of one dot resolves the superposition, whereby each entangled dot takes on a true value, or color, thereby reconstructing the spectrum. The whole neural network of 7 colors is called forth, so now there is a total recall.

    This matching or recognition is an act of measurement. It is measurement that resolves a superposition and causes the wave function to collapse. Then what happened to the other possibilities? Well, you connected the dots (pun) - and wasn't there a total recall? It is connecting or hyperlinking the dots that is new to the act of detection which the human brain performs. And in connecting or hyperlinking the dots, the other possibilities are also realized. It is now evident that the other possibilities are realized not in other worlds but in this world itself - that is, inside the brain-mind.

    ReplyDelete
  32. Continued. . . Theoretical neural network, superposition, and entanglement. . .

    If I entangle two electrons, one of spin "up" and the other of spin "down", into a superposition of these states, separated by a distance greater than 100 km, then when I measure one electron as "up", the other must be "down". Likewise, if I entangle 7 photons, each representing a color in the spectrum VIBGYOR, separated from one another by distances greater than 100 km, then when I measure one as "violet", the others take on the real value, or color, they each represent. That is, I resolve the superposition into its constituent states or colors. Measuring any one of the photons is enough to resolve the superposition.
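    For the two-electron case this is easy to make concrete. A minimal numpy sketch of the singlet state (the encoding is the standard textbook one, chosen here just for illustration):

      import numpy as np

      up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

      # Singlet: |psi> = (|up,down> - |down,up>) / sqrt(2)
      psi = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

      # Measure electron A as "up" (the identity acts on the distant B)
      P_upA = np.kron(np.outer(up, up), np.eye(2))

      prob = np.vdot(psi, P_upA @ psi).real       # probability 0.5
      post = P_upA @ psi
      post /= np.linalg.norm(post)                # collapsed state

      print(prob)   # 0.5
      print(post)   # [0, 1, 0, 0] = |up,down>: B is now definitely "down"

    The same bookkeeping extends to the 7-photon case, with a 7-level color degree of freedom in place of spin; one projective measurement resolves the whole entangled state.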

    ReplyDelete
  33. Continued. . . Theoretical neural network, superposition, and entanglement. . .

    In addition, the brain can use something akin to Quantum Darwinism to keep those dots or hyperlinks that are best suited, or fittest, for problem solving, while the other dots or hyperlinks fade from memory or become extinct. Or, at the next higher level, those neural networks that are best suited, or most adaptive, survive, while other neural networks fade away or become extinct.

    ReplyDelete
  34. Seeing as how this is a philosophical discussion, the discussants might be interested in this entry at the Stanford Encyclopedia of Philosophy, a well-respected (and free!) online resource: https://plato.stanford.edu/entries/scientific-reduction/

    ReplyDelete
    Replies
    1. Eric,

      I agree with you that Gokul's contributions are quite philosophical considerations, and hard to understand from a scientific point of view. But if you can "read between the lines" of what Gokul wants to tell us, they might still be helpful within this discussion.

      Delete
  35. In my view there are many good reasons to think that reductionism is not the complete story.

    Look, for example, at the perturbative expansion. In 0th order, one has the unperturbed case, without any interaction. Applying higher orders, all possible contributions to the process under consideration are symmetrized; it seems to be a kind of averaging over everything that might happen. Looking at QCD alone, you will find an enormous growth in the number of possible contributions to the process under consideration. Nobody has shown so far that the perturbative expansion converges. And in order to describe real observations, you have to add a lot of additional stuff... Supersymmetry is not very helpful with respect to this combinatorial explosion of possible contributions to what might happen in reality.
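    In the simplest toy case one can watch the non-convergence happen directly. A minimal Python sketch of the zero-dimensional "path integral" Z(g) = ∫ dx exp(-x²/2 - g·x⁴)/√(2π), whose perturbative series Σₙ (-g)ⁿ (4n-1)!!/n! is known to diverge for every g > 0 (the toy example and the numbers here are mine, purely for illustration):

      import numpy as np
      from math import factorial

      g = 0.02  # coupling; the series is only asymptotic for ANY g > 0

      # "Exact" answer by brute-force numerical integration
      x = np.linspace(-10.0, 10.0, 200_001)
      dx = x[1] - x[0]
      exact = np.sum(np.exp(-x**2 / 2 - g * x**4)) * dx / np.sqrt(2 * np.pi)

      def odd_dfact(m):
          # (m)!! for odd m, with (-1)!! defined as 1
          r = 1
          while m > 1:
              r *= m
              m -= 2
          return r

      # Perturbative partial sums: sum over n of (-g)^n (4n-1)!! / n!
      partial = 0.0
      for n in range(13):
          partial += (-g) ** n * odd_dfact(4 * n - 1) / factorial(n)
          print(f"order {n:2d}: partial = {partial:+.6f}  exact = {exact:.6f}")

      # The partial sums first close in on the exact value, then blow up:
      # the expansion has zero radius of convergence.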

    Solid state physics is yet another example. Quite surely, superconductivity is a quantum mechanical phenomenon, as are the phenomena occurring in 2-dimensional layers of graphene and other materials. But all these things cannot be deduced from basic quantum mechanical theory alone; solid state physics can only be understood by looking at the collective behavior of the particles involved. One could extend this list quite easily. Surely all of this is an indication that we must step beyond reductionism!

    ReplyDelete
  36. This discussion of philosophy is fun but pointless. What Sabine is talking about is the isolation of matter in a vacuum. What QM and GR have in common is that they both require a re-evaluation of the vacuum. There is really only one way to satisfy the demands of both, which is to make matter and spacetime emerge, as in a group contraction or a phase transition, as aspects of an underlying vacuum in which one has neither empty space nor isolated matter. One would always imply the other. And that indeed would be the end of naive reductionism. BTW I have an example if you're interested.

    (Progress in physics is almost always related to a redefinition of the vacuum.)

    -drl

    ReplyDelete
  37. "The root of our problem may instead be that quantum theory itself must be replaced by a more fundamental theory, one that explains how quantization works in the first place."

    That was indeed David Finkelstein's point of view, and he worked at it for most of his life. I admired his courage.

    -drl

    ReplyDelete
  38. The cause of the crisis in the basis of fundamental knowledge (mathematics, physics, cosmology) is the ontology crisis. We need a new dialectical view of matter in the spirit of Plato: matter is that from which all forms are born. Physicists, poets and musicians should have a single picture of the Universe as an eternal, integral process of generating meanings and structures.
    “We repeat: worldunderstanding is spaceunderstanding. / Повторяем: миропонимание - пространствопонимание.” (Pavel Florensky)

    ReplyDelete
    Replies
    1. Hi Vladimir, I left physics research about 25 years ago, but I am still very interested in natural science, in particular neuroscience. On the other hand, I also have much interest in the humanities, "Geisteswissenschaften" as they are called in Germany. My own personal experience and intuition tell me that we do indeed need a more dialectical view of matter. Some condensed matter physicists are already getting a glimpse of this.

      Delete
    2. Hello my dear friend!
      Yes, today physicists, together with philosophers, must rethink the whole dialectical line from Heraclitus to our times, taking into account the accumulated knowledge and the problems in the philosophical basis of fundamental science. A holistic paradigm should come to the aid of the atomistic paradigm. Here, physicists should recall the good philosophical precepts of Einstein, Wheeler and Hegel.
      A. Einstein: "At the present time, a physicist has to deal with philosophic problems to a much greater extent than physicists of previous generations. Physicists are forced to this by the difficulties of their own science."
      J. Wheeler: "Philosophy is too important to be left to the philosophers."
      G. Hegel: "An educated people without a metaphysics is like a richly decorated temple without a holy of holies."
      By the way, in Russia we affectionately call Georg Wilhelm Friedrich Hegel "Egor Fedorovich".
      In my philosophical dialectics, I rely on the ideas of Nikolai Kuzansky: the "coincidence of opposites", the "coincidence of maximum and minimum".
      I have participated in the FQXi contests several times. Check out an interesting essay by Nobel laureate Brian D. Josephson: "On the Fundamentality of Meaning".

      Delete
  39. Funny comic on just some of this:

    http://www.smbc-comics.com/comic/enemy

    ReplyDelete
    Replies
    1. Quite funny indeed. It reminds me of the story of a drunk person losing a key in the darkness. Quite naturally, he looks for the lost key under the nearest lamppost, regardless of any logical considerations about where the key might have been lost.

      Delete
  40. The Weizmann Institute of Science experiment provides evidence of how the influence of the observer affects not only the outcome but also the resolution, or definition, of the outcome. The scientists at the institute were able to manipulate the influence of the observer by increasing or decreasing the electric current in the detector. Say, at 0% influence, that is, in the absence of the observer, there was a clear-cut interference pattern on the screen. As they increased the influence of the observer in steps, the interference decreased, and the outcome became more and more singular, resolute and definitive. At 100% influence, the outcome was singular and well defined; that is, there was a well-defined single outcome.
    Here we see a movement from superposition to a single outcome, in terms of resolution and definition, as the influence of the observer increases. The observer is a simple detector, and it abstracted a well-defined, singular reality from the superposition at 100% influence. Building on the evidence, we complicate and sophisticate the observer; the observer then constructs a reality by relating two or more things, whose resolution, meaning, description or interpretation become a little more definite; that is, the definition increases a little more. If we go on in this way, making the observer more and more complex and sophisticated, then the resolution, as in the diagram of this blogpost, increases, leading to the observer constructing a highly interrelated and peculiarly defined reality. In this way, let us say, we end up with a very complex and sophisticated observer, like an animal that sees in black and white. What it sees is very well defined and peculiar, made possible by putting together, or relating, huge amounts of information gathered from the simple photons reflected from the object in view. With further complication and sophistication, we arrive at the human being, who can now see reality in colors. If you want a non-living example, consider the color TV, which is the sophisticated form of a black-and-white TV.
    Working backwards, I "reduce" the color TV to a black-and-white TV to a radio... and so on and so forth, until I end up with the detector in the quantum mechanics experiment. As I reduce the complexity and sophistication, the definition becomes less peculiar or specific and more general, and the resolution becomes coarser and coarser. Reality becomes less peculiar, more general, and tends towards something simpler and less complicated, because the ability to form complex relationships among information inputs diminishes with decreasing sophistication. Reality, or "the construct", is simpler now.
    Once we "reduce", or "bring back", the observer in steps from its most complicated and sophisticated build to the detector in the quantum mechanics experiment, we have a very, very simple, single outcome at 100% influence of the observer or detector. And then, if we reduce this influence in steps to 0%, we "reduce", or "bring back", the superposition. Here we are talking of reductionism in terms of reducing the complexity and sophistication of the observer, thereby reducing the peculiarity, resolution or definition of reality and tending it towards generality. Ultimately, reality is reduced to a fuzzy superposition you can make nothing of.
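    This tradeoff can be written down compactly with the standard duality relation V² + D² ≤ 1, where D is the which-path information the detector extracts and V is the fringe visibility. A toy numpy sketch (the coupling values here are illustrative, not the published data):

      import numpy as np

      x = np.linspace(-np.pi, np.pi, 7)          # screen positions
      for D in (0.0, 0.5, 1.0):                  # detector "influence"
          V = np.sqrt(1.0 - D**2)                # fringe visibility
          pattern = 1.0 + V * np.cos(4 * x)      # idealized fringes
          print(f"D={D:.1f}  V={V:.2f}", np.round(pattern, 2))

      # D = 0 gives full-contrast interference; D = 1 gives a flat
      # pattern, i.e. a single well-defined outcome with no interference.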

    ReplyDelete
    Replies
    1. Gokul,

      Are there any publications on these experimental results?

      Delete
  41. Sabine,

    Your mention of preon models at the end of your excellent article reminded me of a very specific example of how trying to interpret everything in terms of smaller and smaller particles can lead to unnecessary noise and confusion.

    The Harari-Shupe-Seiberg preon model, better known as the rishon model, has always fascinated me for its informationally concise way of describing the fermions of a single family. Such conciseness is usually (but not always) an indicator that "something" about a model is insightful in an important way. But alas, this does not in general mean that it will be an obvious indicator of the needed solution, since almost by its nature an unexpected conciseness is akin to a glimpse into a world no one has ever seen before. Such was the case both for the data behind the double helix of DNA, and for the simplicity of the photoelectric effect, for which Einstein won his (only) Nobel prize.

    So, perhaps not too surprisingly, attempts to develop the rishon model of +⅓ charged T particles and 0 charged V particles, from which the other fermions are built, tended to fall apart when pushed hard, and so never moved deeply into the rest of physics.

    Both Prince de Broglie and Paul Dirac provided beautiful examples in their works of one technique for dealing with such situations: Focus on exactly what is in front of you, no matter how simple it seems, and look for hidden assumptions. Reducing rishons to particles never worked because they do not behave like particles even in the papers that first proposed them. Instead, they behave like orthogonal unit vectors in a 3-space.

    Understanding why requires careful attention to how our minds tend to gloss over simple details. A triplet of rishons, such as TVV for the anti-red anti-down quark, nominally has three "equal" T and V particles in it. But if the order of the particles indicates color, what happens to that order if the rishons are in motion? One must postulate that "somehow" they always form linear strings. But in that case, which end is the start of the string? We tend to gloss over this point because we are so accustomed to imposing left-to-right (or right-to-left) order on symbols that we forget it has implications if those symbols are particles. The only particle solution is to designate a third type of rishon, a hidden rishon, as the "starter" rishon for such a string.

    Another problem is that since rishons are symmetric in contributing to fermions, their mutual relationship must be equivalent to an equilateral triangle, rather than a string. A starter particle is still needed for such a triangle, since otherwise flipping TVV over would create VVT and thus violate color conservation.

    The full solution is to move the hidden starter rishon to one side of the center of the triangle. This arrangement at last ensures the conservation of color and the symmetry implied by the first rishon papers, at the expense of transforming the four nominal "particles" into the vertices of a rigid equilateral pyramid.

    There is a simpler way to interpret such a pyramid: it is the corner of a 3-space. The starter becomes the origin, the T particles become the three positive unit vectors, and the V particles become zero-length vectors along those axes. The unit vectors are the anti-color charges of the strong force, so that TVV, VTV, and VVT each have a different anti-color.

    This is of course no longer a rishon or preon model, but something… different. It's an odd sort of space-first model that suggests that if rishon simplicity has any meaning at all, it is that fermions and bosons may well be the simplest levels of particles, with interacting vector spaces taking over below that level.

    I rather like the idea that reality stops being particle-like below the fermions, if only because I find the concept of truly point-like particles to be deeply in conflict with the idea that the universe has only a finite amount of information available to it.

    ReplyDelete
    Replies
    1. Hi Terry,

      "I rather like the idea that reality stops being particle-like below the fermions, if only because I find the concept of truly point-like particles to be deeply in conflict with the idea that the universe has only a finite amount of information available to it."

      Could you please explain your point in a bit more detail? It's quite a natural idea for physicists, given that there are no infinities in the real world, including infinitely small point-like particles. 1-dimensional superstrings, however, do not convince me either way.

      Delete
  42. There seems also to be a further confusion at play. As Sabine pointed out in her original post, reductionism is of different kinds - she calls them "methodological" and "theory" reductionism. I have different names for them, but perhaps the distinction is the same.

    As I see it, "upward emergence" is "downward reduction", nothing more. But then there are different kinds of downward reduction: "constitutive" (in the sense that big things are assemblies/agglomerates of smaller things); "functional" (in the sense that temperature is the macroscopic/aggregated effect of microscopic jiggling); "causal" (in the sense that a tsunami is caused by sub-oceanic tectonic movements); and so on. As I understand it, distinguishing between these "kinds" of reduction is not exactly what Sabine is referring to; I think she means something else.

    For there is another way of thinking about categorising reductions. There is "ontological" reduction: which involves the drawing of mereological and/or causal connections between some observed macroscopic structure/behaviour and the structure/behaviour of its microscopic constituents. And then there is "epistemic" reduction: which involves the drawing of connections between descriptions/interpretations of observed macroscopic effects and descriptions/interpretations of observed/postulated microscopic effects. I think these two classes of reduction correspond (perhaps only roughly) to Sabine's "methodological" and "theory" reduction.

    And there's the rub. Absent miracles, ontological reduction must always be possible; whereas epistemic reduction almost never is. We can readily "reduce" pressure to aggregated microscopic dynamics; even as we cannot so readily "reduce" the wetness of water to the properties of hydrogen and oxygen atoms.
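    The pressure case can even be run in a few lines. A minimal Python sketch of the kinetic-theory reduction, recovering the macroscopic ideal-gas law from nothing but sampled molecular velocities (all the numbers are illustrative):

      import numpy as np

      rng = np.random.default_rng(0)
      N, m, T, V = 100_000, 6.6e-27, 300.0, 1e-3   # count, kg, K, m^3
      k_B = 1.380649e-23                           # J/K

      # Maxwell-Boltzmann: each velocity component ~ Normal(0, sqrt(kT/m))
      v = rng.normal(0.0, np.sqrt(k_B * T / m), size=(N, 3))

      E_kin = 0.5 * m * np.sum(v**2)        # microscopic kinetic energy
      P_micro = (2.0 / 3.0) * E_kin / V     # pressure from the microphysics
      P_macro = N * k_B * T / V             # macroscopic ideal-gas law

      print(P_micro, P_macro)               # agree up to sampling noise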

    And this is where "strong emergence" comes in: it declares as fundamental that which is descriptively/epistemically irreducible (e.g., David Chalmers and consciousness), without regard for the fact that it is causally/ontologically reducible.

    ReplyDelete
    Replies
    1. I disagree that consciousness is causally reducible. Hand-waving about emergence isn't an explanation.

      Delete
    2. All we can observe and understand about our reality is always a subset of the true real world. Please falsify me if I am wrong. Whether consciousness is causally reducible or not is currently not much more than a religious belief. I am quite sure that particle physics won't solve the problem. Neuroscience has a much better chance of making the particular discoveries needed to learn more about the relevant details.

      Delete
  43. Sabine,

    Perhaps the distinction to be drawn here is between ontological (constitutive and/or causal) reduction and epistemic (descriptive) reduction. Absent miracles, the former must always be possible, even if we do not ourselves know how to do it; whereas the latter almost always runs into trouble.

    As instrumentalist descriptions of (observed) macroscopic phenomena, therefore, it is perhaps unsurprising that QM and GR - not unlike 'consciousness' - are not simply reducible.

    As I say to my students, big rainbows are not made of little rainbows.

    ReplyDelete
    Replies
    1. Tibor,

      "As I say to my students, big rainbows are not made of little rainbows."

      Obviously the wave function is not very helpful in this case, either for calculating the probability of a rainbow observation or for proving that big rainbows are not made of little rainbows. On the other hand, the rainbow is made from small "particles" interacting with each other and with the quantum vacuum; seen this way, it is indeed made of "little rainbows". But this view does not really help in understanding the nature of a rainbow anyway.

      Delete
  44. This is a great summary of what are, IMHO, the gaping holes in our standard model. There are some aspects of a newer model of physics that would not be explained by reductionism alone, and your post points those out. Between quantum field theory, GR and Maxwell's equations, we have few starting points to build from, as we did with atomic theory. I have always felt Maxwell's equations were wonderful symmetry tools but are too abstract to glean the fundamentals of these interactions. I can hear somebody saying... "but we've known how it works for almost 200 years", to which I would say: explain why we can't accept that Maxwell's equations show there are no magnetic monopoles? The hunt for these is still a research topic, even though symmetry clearly shows they should not exist; so by our insistence on looking for them we acknowledge that Maxwell's equations are not complete. Even the simple concepts suffer when we don't have a complete understanding of the fundamentals. Also, we struggle with singularities in the near field (in the far field, we have smooth approximations that work well but give us little information on the nature of the near field). QFT, GR and EM all suffer from our lack of knowledge of the near field at a discrete level, leading to some difficult calculations and bad assumptions.

    ReplyDelete
    Replies
    1. @MH,

      I would like to add that the meaning of "information" cannot be understood by reductionist approaches alone. In my view, this is a particular source of the big confusion when physicists and mathematicians talk with philosophers and other branches of science.

      Delete
  45. Sabine,

    With “... dark energy may be an artifact of our difficulty averaging non-linear equations.” do you refer to the Cosmological non-Constant Problem, especially eq. 22 in here or something entirely different?

    ReplyDelete
    Replies
    1. Reimond,

      Something entirely different. Will write about this in more detail in a separate blogpost.

      Delete
  46. Reductionism does come to an end, but like an upside-down parabola. There is a point at which one has to concentrate so much energy to see smaller things that one creates a black hole, which grows its surface area (not its volume) by absorbing matter/energy. So one adds more energy, and the surface area increases; thus, beyond that point, more energy equates to larger things, not smaller things. Lenny Susskind explains this in his book "The Black Hole War" more clearly than I can here. Other resources on the web are available as well. It is sort of like getting to the North Pole: from there, one can only go south.

    ReplyDelete
  47. https://phys.org/news/2019-10-axion-particle-solid-state-crystal.html

    Axion particle spotted in solid-state crystal

    "An axion insulator is a correlated topological phase, predicted to arise from the formation of a charge-density wave in a Weyl semimetal. The accompanying sliding mode in the charge-density-wave phase, the phason, is an axion. It is expected to cause anomalous magneto-electric transport effects. However, this axionic charge density wave has so far eluded experimental detection. In this paper, we report the observation of a large, positive contribution to the magneto-conductance in the sliding mode of the charge-density wave Weyl semimetal (TaSe4)2I for collinear electric and magnetic fields (E||B). The positive contribution to the magneto-conductance originates from the anomalous axionic contribution of the chiral anomaly to the phason current, and is locked to the parallel alignment of E and B. By rotating B, we show that the angular dependence of the magneto-conductance is consistent with the anomalous transport of an axionic charge-density wave."

    Is this the real thing, another particle to be included in the particle zoo of the standard model or just another quasiparticle?

    ReplyDelete
    Replies
    1. Quasiparticle. That doesn't mean it's not real, though, it means it's not fundamental.

      Delete
  48. Quick terminological note (thanks for the article): what you describe at the beginning of the article is more appropriately called ontological reductionism. It's a view about what the world is like. Methodological reductionism is normative, it is the idea that we should assume ontological reductionism for the purpose of enquiry (whether or not it's ultimately true), that we should attempt to explain phenomena by decomposing them into parts.

    ReplyDelete
    Replies
    1. Quentin,

      I originally wrote "ontological reductionism", but then I looked it up on Wikipedia and changed all instances of "ontological" to "methodological". At least according to Wikipedia "methodological" is the correct word.

      Delete
    2. I wondered about this myself. Per the Wiki article "Reductionism":

      "Ontological reductionism is the belief that reality is composed of a minimum number of kinds of entities or substances."

      "Methodological reductionism is the position that the best scientific strategy is to attempt to reduce explanations to the smallest possible entities."

      So the former refers to the (putative) reality and the latter refers to following that belief in one's approach to science. I think the blog post talks about approach, a topic Sabine has addressed many times.

      My question for Sabine is: What is your stance on ontological reductionism? I've gotten the impression in the past you were all in on it. Is that right, and, if so, are you still?

      Delete
    3. Wyrd,

      I am not a realist, so don't like making statements about reality. I don't have a strong problem with ontological reductionism but I don't think it's a scientific question whether it is or isn't correct.

      Delete
  49. Dear Sabine,

    I would really appreciate reading, at some point, a Backreaction blogpost about the foundations of reductionism within the context of Kurt Gödel's theorems, and possibly the relevance of intuitionistic or constructive logic in this context.

    ReplyDelete
    Replies
    1. Why do you think that Gödel's theorems (which ones?) have anything to do with reductionism? And no, his incompleteness theorem doesn't mean that there must be a world with pink unicorns just because we can't prove that there isn't.

      Delete
  50. Replies
    1. In this reviewed book, “... One Century After Hilbert”, let us take Hilbert's Einstein-Hilbert action. Witten's contribution is here. I am always astonished by the naturalness and ease with which the action is stuffed into the path integral, simply assuming that the spacetime metric can be in superposition.
      Smolin's contribution to the review is here. In section 2.8 he asks, “Where does the Planck mass come from?”.
      For him it also seems natural to put the action into the path integral (eq. 31), but he also sees that “Without matter, the gravitational action is invariant under a scaling”.
      Scale invariance also arises in a phase transition in statistical field theory, across a boundary at some threshold.
      Maybe once in a while we should question our assumptions, especially the natural ones.
      Does the Einstein-Hilbert action really belong in the path integral?

      Delete
    2. And here is also a new minutephysics video about Einstein, Friedmann and biases.

      Delete
  51. As Aristotle observed 2500 years ago, the nature of the elementary substance will never be known.

    ReplyDelete

COMMENTS ON THIS BLOG ARE PERMANENTLY CLOSED. You can join the discussion on Patreon.
