Saturday, July 11, 2020

Do we need a Theory of Everything?

I constantly get asked if I could please comment on other people’s theories of everything. That could be Garrett Lisi’s E8 theory, or Eric Weinstein’s geometric unity, or Stephen Wolfram’s idea that the universe is but a big graph, and so on. Good, then: let me tell you what I think about this. But I’m afraid it may not be what you wanted to hear.

Before we start, let me remind you what physicists mean by a “Theory of Everything”. For all we currently know, the universe and everything in it is held together by four fundamental interactions. That’s the electromagnetic force, the strong and the weak nuclear force, and gravity. All other forces that you are familiar with, say, the van der Waals force, or muscle force, or the force that’s pulling you down an infinite sequence of links on Wikipedia, these are all non-fundamental forces that derive from the four fundamental interactions. At least in principle.

Now, three of the fundamental interactions, the electromagnetic and the strong and weak nuclear force, are of the same type. They are collected in what is known as the standard model of particle physics. The three forces in the standard model are described by quantum field theories which means, in a nutshell, that all particles obey the principles of quantum mechanics, like the uncertainty principle, and they can be entangled and so on. Gravity, however, is described by Einstein’s theory of General Relativity and does not know anything about quantum mechanics, so it stands apart from the other three forces. That’s a problem because we know that all the quantum particles in the standard model have a gravitational pull. But we do not know how this works. We just do not have a theory to describe how elementary particles gravitate. For this, we would need a theory for the quantum behavior of gravity, a theory of “quantum gravity,” as it’s called.

We need a theory of quantum gravity because general relativity and the standard model are mathematically incompatible. So far, this is a purely theoretical problem, because with the experiments that we can currently do, we do not need quantum gravity. In all presently possible experiments, we can either measure quantum effects, but then the particle masses are so small that we cannot measure their gravitational pull; or we can observe the gravitational pull of some objects, but then these objects do not display quantum behavior. So, at the moment we do not need quantum gravity to describe any actual observation. However, this will hopefully change in the coming decades. I talked about this in an earlier video.

Besides the missing theory of quantum gravity, there are various other issues that physicists have with the standard model. Most notably it’s that, while the three forces in the standard model are all of the same type, they are also all different in that each of them belongs to a different type of symmetry. Physicists would much rather have all these forces unified to one, which means that they would all come from the same mathematical structure.

In many cases that structure is one big symmetry group. Since we do not observe this symmetry, the idea is that it would manifest itself only at energies so high that we have not yet been able to test them. At the energies that we have tested so far, the symmetry would have to be broken, which gives rise to the standard model. This unification of the forces of the standard model is called a “grand unification” or a “grand unified theory”, GUT for short.

What physicists mean by a theory of everything is then a theory from which all the four fundamental interactions derive. This means it is both a grand unified theory and a theory of quantum gravity.

This sounds like a nice idea, yes. But. There is no reason that nature should actually be described by a theory of everything. While we *do* need a theory of quantum gravity to avoid logical inconsistency in the laws of nature, the forces in the standard model do not have to be unified, and they do not have to be unified with gravity. It would be pretty, yes, but it’s unnecessary. The standard model works just fine without unification.

So this whole idea of a theory of everything is based on an unscientific premise. Some people would like the laws of nature to be pretty in a very specific way. They want them to be simple, they want them to be symmetric, they want them to be natural, and here I have to warn you that “natural” is a technical term. So they have an idea of what they want to be true. Then they stumble over some piece of mathematics that strikes them as particularly pretty and become convinced that it must certainly play a role in the laws of nature. In brief, they invent a theory for what they think the universe *should* be like.

This is simply not a good strategy to develop scientific theories, and no, it is most certainly not standard methodology. Indeed, the opposite is the case. Relying on beauty in theory development has historically worked badly. In physics, breakthroughs in theory-development have come instead from the resolution of mathematical inconsistencies. I have literally written a book about how problematic it is that researchers in the foundations of physics insist on using methods of theory development that we have no reason to think should work, and that as a matter of fact do not work.

The search for grand unification began in the 1970s, and the search for a theory of everything took off in the 1980s. To the extent that the theories which physicists have come up with were falsifiable, they have been falsified. Nature clearly doesn’t give a damn what physicists think is pretty math.

Having said that, what do you think I think about Lisi’s and Weinstein’s and Wolfram’s attempts at a theory of everything? Well, scientific history teaches us that their method of guessing some pretty piece of math and hoping it’s useful for something is extremely unpromising. It is not impossible that it works, but it is almost certainly a waste of time. And I have looked closely enough at Lisi’s and Weinstein’s and Wolfram’s and many other people’s theories of everything to be able to tell you that they have not convincingly solved any actual problem in the existing fundamental theories. And I’m not interested in looking any closer, because I don’t want to waste my time, either.

But I don’t like commenting on individual people’s theories of everything. I don’t like it because it strikes me as deeply unfair. These are mostly researchers working alone or in small groups. They are very dedicated to their pursuit and they work incredibly hard on it. They’re mostly not paid by tax money so it’s really their private thing and who am I to judge them? Also, many of you evidently find it entertaining to have geniuses with their theories of everything around. That’s all fine with me.

I do have a problem, though, if theories that have turned out to be useless nevertheless grow into large, tax-funded research programs that employ thousands of people, as has happened with string theory and supersymmetry and grand unification. That creates a problem because it eats up resources and can entirely stall progress, which is what has happened in the foundations of physics.

People like Lisi and Weinstein and Wolfram at least remind us that the big programs are not the only thing you can do with math. So, odd as it sounds, while I don’t think their specific research avenue is any more promising than string theory, I’m glad they do it anyway. Indeed, physics could use more people like them who have the courage to go their own way, no matter how difficult.

The brief summary is that if you hear something about a newly proposed theory of everything, do not ask whether the math is right. Many of the people who work on this are really smart, they know their math, and it’s probably right. The question that you, and all science journalists who report on such things, should ask is: what reason do we have to think that this particular piece of math has anything to do with reality? “Because it’s pretty” is not a scientific answer. And I have never seen a theory of everything that gave a satisfactory scientific answer to this question.


  1. Thanks for that. My thoughts go a step further, in that it’s pretty obvious string and other studied theories have been proven (in the experimentalist fashion) to not work by the sheer volume of effort expended on showing that they might work. We need to look at more painfully “obviously” wrong theories to save physics.

    1. Hey, Thomas. Liked your comment. But what are some more painfully "obviously wrong" theories we need to look at to save physics?

    2. Or it may be that reality lies outside the domain of science.

    3. Sabine once said that she would be extremely grateful if anyone could tell her what it means for something to exist.

      Brilliant question.

      Superdeterminism can’t answer it any more than quantum mechanics can answer it.

      But quantum mechanics nevertheless gives us the key to answering it.

  2. Even QED in Minkowski spacetime is not yet mathematically well-defined; there is even a $1,000,000 prize for making it so. The fantastic 10 digit agreement between the theoretical predictions of QED and the experimental results is a miracle. It would be good for physicists to acknowledge the unsatisfactory mathematical structure of this prediction. It starts from a mathematically undefined Feynman integral, proceeds by making many very complicated manipulations, and ends up with a formal series that Dyson showed to be divergent! Physicists think of it as an asymptotic expansion, but they have no mathematical proof of this. I often joke that this agreement of theory and experiment is a new proof of the existence of God and that she loves
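
A concrete way to see the point about Dyson's divergent series is the standard textbook toy model (not QED itself): the Stieltjes integral f(x) = ∫₀^∞ e^{-t}/(1+xt) dt has the divergent asymptotic expansion Σ (-1)^n n! xⁿ. A minimal sketch in Python; the value x = 0.1 and the truncation orders are chosen purely for illustration:

```python
import math

def stieltjes_true(x, steps=200000, t_max=60.0):
    """Evaluate f(x) = integral_0^inf e^{-t}/(1+x*t) dt by the trapezoid rule.

    The tail beyond t_max is negligible (e^{-60} ~ 1e-26)."""
    h = t_max / steps
    total = 0.5 * (1.0 + math.exp(-t_max) / (1.0 + x * t_max))
    for i in range(1, steps):
        t = i * h
        total += math.exp(-t) / (1.0 + x * t)
    return total * h

def partial_sum(x, n_terms):
    """Partial sums of the divergent asymptotic series f(x) ~ sum (-1)^n n! x^n."""
    return sum((-1) ** n * math.factorial(n) * x ** n for n in range(n_terms + 1))

x = 0.1
truth = stieltjes_true(x)
errors = {n: abs(partial_sum(x, n) - truth) for n in (1, 5, 10, 30)}
# The error shrinks at first (the series is asymptotically useful)...
# ...then blows up once n! overwhelms x^n (the Dyson-style divergence).
```

The partial sums first approach the integral's value and then explode, which is exactly the "useful but divergent" behavior the comment attributes to the QED perturbation series.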

    1. But, but, it's the most incredibly accurate and precise calculation in the history of the world, ever !

    2. The mass-gap problem is an aspect of quantum chromodynamics (QCD). This is because the gauge boson interacts with itself strongly. Photons of QED do not interact with each other directly and only do so through intermediary electron-positron pairs.

      There are doublets of quarks up-down, strange-charm, beauty-truth, labelled as (u, d), (s, c), (b, t), called flavors. These also have units of charge -⅓e and ⅔e, for e the unit of charge of a proton. These also have two units of QCD charge called color. So a quark of any sort will have two of three colors (b, y, r) for blue, yellow and red. The QCD interaction is "blind" to the type of quark or flavor and its electric charge. So given any of these quarks q, it might have the colors red and blue, say q^{r,b}, and if it interacts with the gauge particle of QCD, called a gluon, that has the color yellow, labelled g^y, then this quark transforms by

      q^{r,b} + g^y → q^{y,b} + y^r.

      The QCD charge or color is swapped between the gluon and the quark. In this way the gluons exchange momentum with the quark and quarks are bound into hadrons.

      There is also the weak interaction, or quantum flavor dynamics (QFD). This exchanges an up quark, with charge ⅔e, with a down quark, with charge -⅓e, for instance. The down quark interacts with the weak interaction gauge boson W^+ and turns into the up quark, which flips the charge. There is also a weak interaction charge called isospin involved, which by the Gell-Mann–Nishijima relationship has a connection to the electric charge. There is also the neutral weak gauge boson Z that flips flavors between doublets, so that d + Z → s, where both down and strange quarks have charge -⅓e. The strong or QCD interaction has an interaction time of about 10^{-23} sec, which is close to the time for quantum electrodynamics or QED. Weak interactions have a longer interaction time, about 10^{-13} sec. However, in the early days it was found that there were these longer weak processes that involved what was then a third quark, which was labelled strange.

      This is a blitz idea of gauge fields. The important thing is that QCD has gluons that carry the color charge and interact with each other. As a result they define a chromo-vacuum that is self-confining, and for the (uud) quarks of the proton this is extremely stable. In some GUTs the proton decays by a transformation between lepton number and quark flavor, but that is another story. We often hear that the Higgs field gives particles masses, and for the quarks this is around 10 MeV or so. Gluons, BTW, are massless. The mass of a proton comes largely not from the Higgs field, but from the fact that QCD is self-binding; a part of this is so-called anti-screening, and the majority of the mass of a proton and all other hadrons is determined this way. The mass-gap problem is to find an analytic process whereby one can show this mass is determined by QCD, or in principle any nonabelian gauge field theory, in this manner.

      So far the problem has been most successfully addressed with Lattice Gauge Field Theory (LGFT), where the gauge potentials are defined on a lattice and they change from edge-link to edge-link to give chromo-electric and magnetic fields. As the lattice is increased in size this is a sort of renormalization of LGFT and the mass-gap is numerically estimated. This has proven to be pretty successful and the masses of basic hadrons computed fairly close to their experimental values.

    3. Gauge theory is a beautiful and abstract mathematical concept that people think 'nature' respects, although it leads to some very ugly physics.

    4. I think that the last term of
      q^{r,b} + g^y → q^{y,b} + y^r.
      should have been g^r not y^r? Ie a red gluon?

      I would have expected q^{r} + g^{y,b'} → q^{y} + g^{r,b'}
      to be one way of expressing this as do not quarks have one of six colours selected from r, b, y, r', b' and y'? And it is the gluons that have colour/anticolour aggregates?
      Where apostrophe means an anticolour.

      In my preon model you could have q^{r,b} which would represent an antiyellow quark as in my model antiyellow is equivalent to red plus blue. Further, in my model, an antiyellow quark actually has three physical colour components q^{r, b, y'}.

      Or are you meaning that all quark-gluon interactions have integer numbers of [r, b, y] on each side?

      Sorry to question you on this if I am wrong.


      Austin Fearnley

      Prof. David Edwards 7:26 AM, July 11, 2020

      How do you know QED in Minkowski spacetime can be mathematically well-defined? It's far more important in Physics that the theory agrees with observation than it be mathematically well-defined. And even arithmetic is not known to be consistent - but maybe Feynman's doodles are.

      And mathematicians know how to play around with divergent series - they're not the end of the world.

      "The fantastic 10 digit agreement"
      It wouldn't impress a mathematician who knows π up to infinite digits. I always wonder why Physicists go on about this agreement. It just represents the precision of the equipment. If it were an agreement up to 1 million digits, the theory still wouldn't explain gravity.

    6. Yes, I made a typographic error and this should be

      q^{r} + g^{y,b'} → q^{y} + g^{r,b'}.

      With preons and rishons and related ideas I am less knowledgeable. The idea is that leptons and quarks are composed of fundamental particles on a smaller scale. There is a caveat I see, in that one has particles in the 1 MeV to 100 GeV mass range composed of particles on a scale down to around 10^{-28} cm, which is equivalent to energy at around 10^{14} GeV. This requires a detailed balance of sorts, of some extreme negative potential energy well with some large masses of these preons, that balances out to the small masses we observe.

      As for your final comment, it is the case that the color index is conserved. There is no creation or destruction of the color charge, at least within QCD ---- GUTs are another matter.

    7. @ Steven Evans,

      1. No one knows whether or not QED in Minkowski spacetime can be mathematically well-defined! That was my point.

      2. Even theoretical physicists would like their theories to be mathematically well-defined! They understood the old quantum theory of 1900-1925 to be a hodgepodge of techniques. QFT currently has a similar messy structure!

      3. By Godel's Theorem the current axioms for arithmetic can only be empirically assumed to be consistent but cannot be proven to be so from more reliable axioms.

    8. "QFT currently has a similar messy structure!"

      And works empirically. That's the point - empirically true beats mathematically well-defined. Why would the physics of the very small necessarily be exactly commensurable with the structures in a brain evolved to handle physics zoomed out by 10 orders of magnitude? No reason to think so.

      "3. By Godel's Theorem..."
      That's what I wrote. Like you say we can consider arithmetic empirically consistent - so Mathematical proof is no better than Physical empirical proof.

    9. Professor,
      Thanks for that.
      it was really good.

      I had to stop laughing,
      before I could reply.

      thanks again.

    10. @ Steven Evans,

      Sometime try to follow such a "computation" and you'll start to understand what I mean by QFT having a messy structure!

    11. "3. By Godel's Theorem the currently axioms for arithmetic can only be empirically assumed to be consistent but cannot be proven to be so from more reliable axioms."

      This sounds like a misunderstanding. Godel's theorem doesn't really address the consistency of axiomatic systems. It merely states that any consistent system (like arithmetic) can be shown to have correct statements whose correctness cannot be proven within the system, i.e., the axioms of the system are necessarily "incomplete."

      Prof. David Edwards 4:03 PM, July 13, 2020

      I'll take your word for it. But the point of the blog post is that messiness doesn't matter as long as the results agree with observation. Maths is not physical reality - obviously it's a handy tool, but there is absolutely no reason to fetishise it.

    13. @ aydemir,

      Gödel proved a number of related results! For example, Gödel's completeness theorem establishes semantic completeness for first-order logic. He also proved the incompleteness of any self-consistent recursive axiomatic system powerful enough to describe the arithmetic of the natural numbers (for example Peano arithmetic). He also showed that the consistency of such a system can be expressed in the system, but not proved in it.

    14. aydemir1 wrote:
      > Godel's theorem doesn't really address the consistency of axiomatic systems. It merely states that any consistent system (like arithmetic) can be shown to have correct statements whose correctness cannot be proven within the system, i.e., the axioms of the system are necessarily "incomplete."

      No, our other friends are correct. Gödel did indeed prove that if the axioms prove their own consistency, then the axioms are in fact inconsistent.

      What probably has you confused is that there are two Incompleteness Theorems. You are referring to number 1; our friends are referring to the second.

      Check out any standard reference.
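
For reference, the two theorems the thread is distinguishing can be stated compactly (standard formulations; the first theorem in the Gödel–Rosser form, which avoids the ω-consistency caveat):

```latex
\text{Let } T \text{ be a consistent, recursively axiomatized theory containing basic arithmetic.}\\[4pt]
\text{(G1)}\quad \exists\, G_T:\;\; T \nvdash G_T \;\text{ and }\; T \nvdash \lnot G_T
  \qquad\text{(incompleteness)}\\
\text{(G2)}\quad T \nvdash \mathrm{Con}(T)
  \qquad\text{(unprovability of consistency)}\\[4pt]
\text{Contrapositive of (G2): if } T \vdash \mathrm{Con}(T), \text{ then } T \text{ is inconsistent.}
```

The first statement is what the earlier comment described; the second, and its contrapositive, is what the replies are invoking.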

    15. @Prof. David Edwards:

      Actually Gentzen proved the consistency of arithmetic by assuming the existence of the first uncountable ordinal. This is consistent with Goedel as he isn't proving the consistency of arithmetic together with the existence of the first uncountable ordinal.

    16. @Greg Feild:

      It's actually the other way around. Gauge theory is called what it is *because* it was discovered by physicists.

      The same notion in mathematics is called a fibre bundle.

      When Yang told Chern in the 70s that he was surprised to find that gauge theory had a natural geometric interpretation in terms of fibre bundles, and how surprised he was that mathematicians had come up with the same idea, Chern told him that the notion of fibre bundles was very natural in mathematics.

      Conceptually speaking, the notion of a bundle is as ubiquitous in mathematics as that of a vector, and is as simple. It's in the details that the devil, so to speak, rears his head. For example, defining the higher versions of vectors, i.e. tensors, is notoriously unintuitive, and working with infinite-dimensional vector spaces is fiendishly intricate. Likewise with bundles.

      Nevertheless, the key idea is straightforward enough.

      Mozibur 9:42 PM, July 14, 2020

      "the consistency of arithmetic by assuming the existence of the first uncountable ordinal. "

      So it depends on the consistency of another system which is not known to be consistent i.e. we don't know if arithmetic is consistent.

    18. @Steven Evans:

      Well, if that were actually possible then we would contradict Goedel's second theorem about proving the consistency of any formal system.

      The only get-out clause is using what is called a relative consistency proof and which I outlined above. This is not nothing. The notion of the smallest infinite ordinal is quite natural - so to speak.

      Mozibur 2:43 PM, July 15, 2020

      Right, so arithmetic is not known to be consistent.

    20. @Steven Evans:

      That's just a childish argument. Like I said, Gentzen has proved arithmetic to be relatively consistent which is as much as anyone can do. In short, with that understood, people simply say that arithmetic has been proven consistent.

      Mozibur 9:18 AM, July 16, 2020

      Arithmetic hasn't been proven consistent, so anybody who says it has is wrong. Gentzen showed arithmetic was consistent if one assumes the consistency of PRA with transfinite induction up to ε0. But the consistency of this system cannot be proven if it's consistent - so it doesn't prove that arithmetic is consistent. Obviously, mathematicians will try to reduce their assumptions as far as possible, but the point for physicists is that any mathematical system they use is only known to be consistent empirically. Mathematicians cannot prove that a Primary School child doing their sums won't one day show correctly that 1=0. Unlikely, but a fact.

  3. "In physics, breakthroughs in theory-development have come instead from the resolution of mathematical inconsistencies". So true!

    In your previous posts you have pointed to several such inconsistencies. I would like to add one to the list, suspecting that it just might be the cause for the stagnation in the foundations of physics on multiple fronts: Inconsistencies in classical electrodynamics (CED).

    At the turn of the 20th century, CED was the only game in town, yet it was merely a proto-theory, mathematically inconsistent and conceptually flawed. Einstein, of course, knew that - after all, if he could consistently write the energy-momentum tensor of charged particles, he would not have declared the r.h.s. of his equation "the wing made of wood"; Dirac, Schwinger, Feynman and Wheeler tried to fix it, and many still do - all rendering CED mathematically consistent but manifestly at odds with experiments, including those traditionally considered in the scope of CED.

    The above fatal flaw (of a proposed physical theory) does not seem to alarm physicists, as QM has taken over classical physics (with its own inconsistencies, e.g. the measurement problem). But ironically, the giant leap undertaken by mathematical physics as a result of the quantum revolution, allows for a fresh new look at CED's pathology. Classical particles in my proposed consistent-CED, cannot be the naive points one usually has in mind, and a universe hosting such particles - a much more bizarre place than expected. So much so, that QM becomes just a natural statistical description of it, and (illusion of-) `dark-matter' - an inevitable consequence.

    Summarizing, to solve current inconsistencies one can take the direct path - adding more fields, complicating the Lagrangian, etc. - or fix prior inconsistencies, which leads to the elimination of multiple current ones. This latter path is academically suicidal for a young scientist to take in today's immediate-reward academic race, and there is not much openness to such heretical ideas, which could undermine entire `industries'.

    1. Yehonatan Knoll wrote:
      >In [her] previous posts [Sabine has] pointed to several such inconsistencies. I would like to add one to the list, suspecting that it just might be the cause for the stagnation in the foundations of physics on multiple fronts: Inconsistencies in classical electrodynamics (CED).
      >Classical particles in my proposed consistent-CED, cannot be the naive points one usually has in mind...

      I think it is worth elaborating a bit for others on the issue you raise.

      In classical EM, the energy stored in the electric field of a point particle is infinite. In fact, if the particle has radius r, the energy stored in the field goes as 1/r.

      For an electron, the value of r at which the electric field accounts for all of the mass of the electron is the "classical radius of the electron," which is slightly under 3 fermis (AKA femtometers).

      We have good reason to believe experimentally that the electron is smaller than this. This means that the effective mass of the electric field exceeds the total mass of the electron.
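
As a sanity check on the numbers above, the classical electron radius — the r at which the Coulomb field energy, up to an O(1) geometric factor, equals the electron's rest energy — can be computed directly. A sketch using CODATA values of the constants:

```python
import math

# SI constants (CODATA values, rounded)
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s

# Radius at which the field energy scale e^2/(4*pi*eps0*r) equals m_e c^2
# (geometric factors of order one dropped, as is conventional):
r_classical = e**2 / (4 * math.pi * eps0 * m_e * c**2)
# ~ 2.82e-15 m, i.e. "slightly under 3 fermis" as stated above.
```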

      The standard solution is to assume that the "bare" electron has a negative "bare" mass. Alas, this is not just a mathematical artifice: it actually does affect the dynamics of the electron.

      Basically, a negative mass means that the force and the acceleration are in opposite directions via F=ma.

      You might expect this to lead to unstable behavior and it does: the electron can then have a mode of motion in which the interaction between the field and the bare electron results in the electron accelerating forever without any continuing external force.
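
The runaway mode can be made concrete with the Abraham–Lorentz equation, m a = F_ext + m τ (da/dt): with zero external force, any nonzero initial acceleration still grows as a(t) = a₀ e^(t/τ). A minimal numerical sketch; the step count and the run length of fifty characteristic times are illustrative choices:

```python
import math

tau = 6.27e-24  # Abraham-Lorentz time for the electron, ~2*r_e/(3c), seconds

def runaway_acceleration(a0, t, steps=100000):
    """Euler-integrate da/dt = a/tau (Abraham-Lorentz with zero external force).

    Any nonzero initial acceleration grows like exp(t/tau): the runaway mode."""
    dt = t / steps
    a = a0
    for _ in range(steps):
        a += (a / tau) * dt
    return a

a0 = 1.0                          # arbitrary units
t = 50 * tau                      # fifty characteristic times
a_num = runaway_acceleration(a0, t)
a_exact = a0 * math.exp(t / tau)  # exponential blow-up with no applied force
```

The exponential growth with no applied force is the pathology; the pre-acceleration solutions mentioned below are the standard way of excising it.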

      The standard reference on all this is Rohrlich's Classical Charged Particles. The favored solution is that an electron starts acceleration before it feels a force in just such a way so as to avoid the runaway solutions.

      That is weird (and of course has never been observed).

      In quantum field theory, the behavior of the energy in the field is softened considerably: it only goes as log(r) instead of 1/r. See here for a readable discussion of the history.

      This slower divergence gives us some "breathing room" and allows physicists to suppose that some minimum length (maximum momentum) cut-off is provided by other interactions -- QCD, the weak interaction, or gravity.

      To the best of my knowledge, no one has ever shown that this actually does occur. Nor does anyone seem to know whether the runaway solutions that occur in the classical theory also occur in QED (I think this latter question may be solvable).

      Lawrence is probably right that the reason so few physicists work on this is that it is not likely to advance one's career in the current hothouse environment -- a difficult question with a low chance of big payoff.

      It does still bother me, though I have no good intuition as to whether an answer will clarify deeper problems.

      To anyone who thinks he has found an answer that can be explained in a brief comment here: you probably do not understand the problem. As the link I provided to the arXiv indicates, some of the most brilliant physicists of the last century struggled with this. Any correct answer probably involves some very hairy calculations.

      All the best,


    2. @PhysicistDave It indeed takes a unique mathematical structure with unexpected properties arXiv:0902.4606 [quant-ph]

    3. I remind everyone that the original definition of a field is derived from the equal and opposite force between two interacting objects.

      When this mathematical abstraction and calculation convenience is considered to be a real result of a single, lone particle, the above mentioned absurdities ensue.

    4. Greg,

      Your comments are consistently of low quality. You clearly know very little about physics but erroneously think you do. I recommend you stop it before I put you on the blacklist, thank you.

    5. @ Yehonatan Knoll,

      Did you ever consider that scale covariance of classical electrodynamics may actually be a manifestation of self-similarity manifest in both Lorentz and gauge symmetries?

      By the same token, did you ever consider that the inconsistencies of QED may be deeply related to the Landau pole problem and the need for ghost fields in gauge theory?

  4. Supposing that we could overcome every limitation and build an accelerator the size of the galaxy, amazing structures might appear, but they needn't resolve to anything we could unify at that scale. So we build an accelerator the size of the Local Group and perhaps reveal more structures. We might begin to suspect that we are only revealing structures in a fractal universe, and that the process could go on indefinitely. We might even ask, when we have results from the Local Group accelerator x 1000, whether we are discovering things that have always been there, or creating things that never existed until we probed at that energy scale?

    1. Even if some beings could build an accelerator that enclosed a galaxy, sending protons along in each direction, they would have to wait tens of thousands of years for the particles to cycle around for a collision. Needless to say, this brings up the issue of just how long it would take to build this thing.
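
The waiting time here is simple arithmetic: ring circumference over the speed of light. A sketch, where the ring diameters are assumptions chosen for illustration:

```python
import math

LIGHT_YEAR_M = 9.4607e15   # meters per light-year
C = 2.99792458e8           # speed of light, m/s
SECONDS_PER_YEAR = 3.1557e7

def lap_time_years(ring_diameter_ly):
    """Years for an ultra-relativistic particle (v ~ c) to make one lap
    of a circular ring with the given diameter in light-years."""
    circumference_m = math.pi * ring_diameter_ly * LIGHT_YEAR_M
    return circumference_m / C / SECONDS_PER_YEAR

# A ring spanning the Milky Way's disk (~100,000 ly across, an assumption):
t_galaxy = lap_time_years(100_000)   # roughly 3e5 years per lap
```

A smaller "galaxy-sized" ring of, say, 10,000 light-years gives on the order of 3×10^4 years, which is the ballpark quoted in the comment.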

  5. To me a Theory of Everything is the same as some Theory of Everything. Do we need some Theory of Everything? We have no choice.


  6. "Of course I don't know what I'm doing. That would be way too expensive. But I do have a theory about it."

    The above summarizes many engineers' discussions with management.

    We engineers owe the source of our theories to the existing physics interactions you postulate, and measure our understanding of them by success or not of our designs.

    Because practical applications seem always to reduce to useful simplifications, I doubt applied physics will ever need a theory of everything.

    Since fundamental physics seems, to me, essentially a quest for knowledge, I think you must continue your struggle as long as it takes.

    Personally, I hope your researchers will recognize when potentially useful plateaus of understanding are reached, and pause their research to develop their insights into a useable form for a world that needs them.

    Something like I believe you are doing, Dr. H.

  7. I think you are denigrating all these lone researchers who are trying to construct theories of everything.

    "Then they stumble over some piece of mathematics that strikes them as particularly pretty and they become convinced that certainly it must play a role for the laws of nature. ... This is simply not a good strategy to develop scientific theories, and no, it is most certainly not standard methodology."

    You could describe Einstein's discovery of the theory of general relativity in these general terms, as well. So this strategy worked at least once. Of course, Einstein had some reasons to believe that differential geometry was the right mathematics for general relativity, but I am sure Lisi, Weinstein, and Wolfram all have reasons for thinking that the math they are working on is the right mathematics for a unified theory. (Probably, they're wrong, but I'm fairly sure they didn't just pick an area of mathematics and say "this is nice math ... it must be related to the correct theory of the universe.")

    1. Newton's gravitational theory didn't satisfy special relativity. So Einstein's original motivation was to construct a special-relativistic theory of gravitation.

    2. Has QM been reconciled with special relativity? The collapse of the wave function exceeds the speed of light, but does this work out mathematically somehow?

      Still wondering if there's a way to say the entangled particle pair sees itself as a single entity but sees *everything else* as entangled. Some "relativity of entanglement": the observer can't tell whether it's part of an entangled pair or the thing it's observing is.

    3. @ Dax,

      One can easily define relativistic quantum theories of "events", "trajectories", etc. See p.75 of "The Mathematical Foundations of Quantum Mechanics" on my webpage.

    4. @Dax: quantum field theory (invented by Dirac) reconciles quantum mechanics with special relativity. I would say that the collapse of the wave function is no more mysterious in QFT than it is in ordinary QM. (And physicists differ as to whether this has been resolved, with Sabine on the "unresolved" side, and many, many physicists on the "resolved" side.)

    5. @Dax:

      Physicists have got as far as unifying special relativity with quantum mechanics, but not general relativity. The former is called QFT, and it was a long and messy business getting there; it's still pretty messy when you actually see the standard model in print.

      In NCG (non-commutative geometry), which tidies up the standard model, there is no notion of point or locality. In a sense everything is entangled.

  8. It is not necessary to have a grand group that unifies everything; better put, grand unification is not a necessary condition for a final theory of gravity, cosmology, and QFTs. There are positive and negative things about the idea. The argument has always been that the universe emerged from some sort of supergravity quantum state, classically what we call a singularity, where all the forces of nature are unified into a single form. This seems to make sense on one hand, but on the other there is a problem. Once you get into groups such as E8×E8, you have 496 group elements, of which 480 are roots and the other 16 are weights that define Hamiltonian-like elements. These roots transform between states, just as the colors on gluons act in the adjoint representation of SU(3). SU(3) has 81 of these, and it is not an entirely easy theory to work with. The net effect of bundling everything into a large gauge group is that you are not making things simpler; you are making things very complicated.

    Wigner had a nice idea about small groups: ultimately, the most simple groups behind all of physics are in fact very small. Or, better put, they are as simple and small as needed, but not too small. The near-horizon condition for a Kerr or Reissner-Nordström black hole for a stationary observer is AdS_2×S^2. The horizon has two spatial dimensions, and the stretched horizon has one of time. This is why in some ways I say it is a sort of transmission line. The extent of time on this stretched horizon is rather short, though. The Planck acceleration is g ≃ 10^{52} m/s^2. Time dilation means time on the horizon is T ≃ t_p ln|gt|, and for t = 10^{73} s, the duration of a solar-mass black hole, T is only 168 Planck time units! The holographic principle would tell us that the symmetries on this stretched horizon are those of S^2×R^1, or equivalently AdS_2×S^1. We have Maldacena to thank for telling us this is equivalent to CFT_1×U(1).
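    A quick check of the logarithm in this estimate (my own addition, treating the argument of the logarithm simply as the quoted 10^{73}):

```python
import math

# Time on the stretched horizon in Planck units: T/t_p = ln|gt|.
# Taking the argument of the logarithm to be ~1e73, as quoted above:
T_over_tp = math.log(1e73)
print(round(T_over_tp))  # 168 Planck time units
```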

    That is nice: we have a very simple group theory. Can this be everything? That depends. One might appeal to Bott periodicity with CFT_2 and CFT_4. Maybe more physically, we could say that with symmetry breaking there is a large degeneracy of possible vacua; equivalently, this simple gauge group breaks into a degeneracy of states, so there is a large set of states for a broken large symmetry. The large symmetry can only exist, in a sense, as what might be called shadow states. Maybe this is what string theory is.

    As a further possible indication of this, the electron electric dipole moment has been found by the ACME Collaboration (October 2018) to be zero to within 10^{-29} e·cm (Nature 562, 355-360). Now if string theory is correct, the stringy aspects of physics would give an electric dipole moment of around P_string = e√(8π)ℓ_p ≃ 8×10^{-33} e·cm. Experimental results are getting pretty close, within a few orders of magnitude. This might be compared to finding that the Earth is a perfect sphere, free of any bump larger than a hundred-millionth the diameter of a nucleus! As a result, type II strings attached to a D-brane, say the D-brane of this universe, should start to show up as a deviation in the electric dipole moment of the electron.
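    A numeric check of the dipole scale quoted here (my own addition; the ACME bound of about 1.1×10^{-29} e·cm is from the cited paper):

```python
import math

l_p_cm = 1.616e-33        # Planck length in cm
acme_bound = 1.1e-29      # ACME (2018) upper bound on |d_e|, in e*cm

# Stringy dipole scale quoted above: P_string = e*sqrt(8*pi)*l_p
P_string = math.sqrt(8 * math.pi) * l_p_cm    # in units of e*cm
print(f"P_string ~ {P_string:.1e} e*cm")      # about 8.1e-33 e*cm
print(f"bound / P_string ~ {acme_bound / P_string:.0f}")
```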

    This is a nexus point in physics. CKM theory also predicts an electric dipole moment, arising from the quarks and the CP violation there. This gets into Peccei-Quinn theory and axions. There are supersymmetric models with CP violation giving P ~ 10^{-26} e·cm that are ruled out, whereas some have it smaller. We may then before long get actual experimental data that either supports or refutes theories which employ large unification groups.

    1. @Lawrence Crowell:

      'The D-brane of the universe'

      I take it you mean a spacetime-filling D-brane? In which case the strings attached to this D-brane are free strings in spacetime?

    2. Yes, a D3-brane corresponding to the spatial surface of the universe.

  9. This raises two issues, the possibility that there is no quantum gravity theory to be had, and that the LHC will never discover anything new. What would physicists do then?

    1. If there is no way of reconciling quantum theory with gravity, then it appears that we live in a universe with inconsistent laws of physics. Is this possible? Maybe. We could be living in a computer simulation which is badly programmed, and which will crash once the first black hole evaporates, some 10^65 years from now. But most physicists (me as well) think this is a cop-out theory and that there is some way to reconcile gravity and quantum mechanics.

      And let me point out that the LHC already has discovered something new ... the Higgs boson.

    2. We might in fact live in a broken world. String theory cannot exist in a de Sitter spacetime or an FLRW spacetime with Λ > 0; supergravity is not consistent in such spacetimes, as Vafa found. In fact, quantum gravitation might not be possible. This either means everything we are doing is completely wrong, not just wrong in its results but wrong in its methods, or, if these concepts are correct, they must be so in ways different from what has been thought.

      We have a sense that gravitation must be quantum mechanical. An expansion of the metric

      g_{ab} = g^0_{ab} + g^½_{ab} + g^1_{ab} + …

      has as its first term a classical background, then a linearized quantum gravity term, and then a quantum gravity term that is nonlinear, or if linear has a matter-field source, and then higher terms. The linearized quantum gravity term is easily arrived at with g_{ab} = η_{ab} + h_{ab}, Minkowski plus a deviation that is linear. We take the transverse-traceless part, and we have a theory of gravitational waves that is easily quantized. The theory is similar to a theory of diphotons that occurs in the Hanbury Brown and Twiss theory of photon bunching. Further, all one must do is look at a basic Penrose diagram of a Schwarzschild black hole. The bottom triangle is a white hole that produces quanta in the two diamond regions, and the top triangle is the black hole that absorbs quanta. The diagram is an idealized gadget, but it does imply that the spacetime configuration has much the same content as the raising and lowering operators a^†, a of quantum mechanics.

      We are then between a rock and a hard place. String theory and M-theory exist on anti-de Sitter spacetimes with Λ < 0. AdS_n spacetime has topology ℝ^{n-1}×S^1, where the circle S^1 is time, so there are closed timelike loops. Techniques often consider a patch with dimensions d < 1/√|Λ| so the patch can be made into a conformal region without CTCs. However, we most clearly do not live in AdS_4.

      Yet the role of AdS may be very relevant. The interior of a Kerr or RN black hole is similar to AdS, and in fact there are closed timelike curves there. This region of spacetime is nonstandard in that a spatial surface does not evolve into a unique spatial surface: the evolution of a spatial surface around a CTC at each local point or region is not a diffeomorphic map. It is also AdS-like. Also, the boundary of AdS_5, or junctions in AdS_5, can be an Einstein spacetime E_4 or dS_4. There are then possible connections to the mathematical representations of string/M-theory. It may just not be directly what we observe.

    3. (continued due to the space limit)

      Aaronson wrote some interesting papers on the role of closed timelike curves in computing. This opens the door for some sort of entanglement or correlation between our observable universe and spacetimes with CTCs. This can include Kerr-Newman black holes or AdS spacetimes. In the case of black holes, we have the interior r_- horizon that in the eternal case is continuous with ℐ^+ and leads to the prospect of Hogarth-Malament spacetimes and loopholes in the Church-Gödel-Turing theorems and theses on computation and incompleteness. However, Hawking radiation breaks this continuity, and an observer approaching r_- may reach it at the moment the black hole evaporates away as Hawking radiation. The observer may be maximally thermalized or decoherently scattered away. So the nature of this interior region, with its timelike singularity and CTCs, is not clear. There is also AdS spacetime and a possible connection to physical states in the observable spacetime. Is there a correlation between spacetimes with CTCs and those without, such as where we are, so that nature is able to quantum compute beyond BQP? Interesting prospects to ponder.

      The world we live in may be broken from a single perspective within this spacetime. If we are able to measure quantum physics involving these exotic entanglements or correlations with CTCs in AdS or black hole interiors, then maybe things are not that broken. However, from the limited perspective of just this world, we may in fact live in a fragmented world, or, as Vafa calls it, the swampland.

  10. I guess it's a pretty fair assumption that a "theory of everything" will not work with the postulated theory objects of the "well-known" standard model of particle physics (SM).
    Documented facts are these:
    The Quark Parton Model (QPM), developed by Richard Feynman in the 1960s, describes nucleons as compositions of basic point-like components that Feynman called partons. These components were then identified with the quarks postulated by Gell-Mann and Zweig a few years earlier. According to the Quark-Parton Model, a deep inelastic scattering (DIS) event is to be understood as an incoherent superposition of elastic lepton-parton scattering processes.
    A fundamental (epistemological) problem is immediately recognizable: all experimental setups, implementations, and interpretations of deep inelastic scattering are extremely theory-laden.
    In summary: a cascade of interaction conjectures, approximations, corrections, and additional theoretical objects subsequently "refined" the theoretical nucleon model.
    Even worse, fundamental contradictions exist at the theoretical basis of the SM which, despite better knowledge, are not corrected. An example:
    The nonexistent spin of quarks and gluons
    The first assumption, following the theoretical specifications of the mid-1960s, was that in the picture of the SM the postulated proton spin is composed 100% of the spin contributions of the quarks. This assumption was not confirmed by the EMC experiments in 1988. On the contrary, much smaller, even zero-compatible contributions were measured (ΔΣ = 0.12 ± 0.17, European Muon Collaboration). The next assumption, that (postulated) gluons contribute to the proton spin, also did not yield the desired result. In the third, current version of the theory, quarks, gluons (and virtual quark-antiquark pairs, if one wishes), and a somehow-composed dynamical-relativistic orbital angular momentum together generate the proton spin.

    On closer inspection, the second readjustment has the "putative advantage" that the result, algorithmically "calculated" in the context of lattice gauge theory and constructs such as "pion clouds", can't be falsified. But this purely theoretical construction obviously does not justify the classification of quarks as fermions. No matter how the asymmetrical ensemble of (more or less unobservable) postulated theoretical objects and interactions is advertised now and in the future, the quarks themselves were never "measured" as spin-½ particles.

    Summary in simple words: it is possible to create a theory-laden ensemble of quarks and "other" theory objects and their postulated interactions, but the quark itself, as an entity, still has no intrinsic spin ½ in this composition. That means that quarks aren't fermions, no matter what the actual theoretical approach may be! This is a basic, purely analytical and logical statement.

    Generally speaking: if one postulates a theoretical entity with an intrinsic value, but one discovers that one needs to add theoretical objects and postulated interactions to get the desired postulated intrinsic value, one has to admit that one's entity has no such physical characteristic as such.

    Furthermore:
    Why, epistemologically speaking, should a postulated complex, multi-object, asymmetric, charge-fragmented, dynamic substructure (the quark-based proton) create a spin value of ½ and an elementary charge of exactly 1·e over dynamic states in the temporal or statistical mean? The comparison with the SM's postulated elementary "leptonic" electron, whose spin ½ and elementary charge (-)1·e are "created" without "dynamic effort" and without substructure, identifies the quark-gluon thesis as, frankly speaking, a math-based fairy tale.

  11. First, Dr Edwards, might you have a searchable name for folks interested in finding out more about the prize you mentioned?

    One of Feynman's (multiple) dark secrets is that he was more of an intuitionist than a mathematician; Dyson was the mathematician behind QED. Feynman's intuitions, however, had an annoying tendency to be correct, even when Feynman could not explain why, even to himself. If there is a Muse for physics, Feynman, Einstein, and Dirac were arguably on the short list of folks she touched in the 1900s.

    Dr Edwards said (with some expansion by me):

    “Physicists think of [the divergent formal series (TB: I think you may mean prior to renormalization?) in QED — the one that is capable of making amazingly accurate predictions, but whose precise definition is possible only by first applying a number of exceedingly complicated manipulations to the abstract and conspicuously hand-wavy Feynman concept of an ‘integral of all possible histories’ — ] as an asymptotic expansion… but they have no mathematical proof of this!”

    While I find the question of how to formalize QED fascinating, I confess to having zero interest in even thinking about pursuing such a prize myself. If I were super rich, however, I would have loved to help fund such a prize! In one of my day jobs I helped define and direct federal funding to curious and insightful minds of all ages to help them explore new science frontiers. The results were often delightful, especially from younger minds not yet caught up in funding politics. A great day job, that!

    Though I can't be a philanthropist, I do know from first-hand experience the deep satisfaction of encouraging innovative thinking in others. And since another of my day jobs was defining and funding research in machine intelligence, there are occasions where lessons from that world could possibly assist physics research.

    Take for example an ancient rule from game theory: If some problem — such as the inability to formally prove some ill-defined set of ideas that still manage to produce extremely effective algorithms — does not resolve after a huge amount of work (decades in this case), then it’s time to backtrack and re-examine how the question was posed. This is because unproductive branches of exploration often mean that the branch setup contained an inadvertent paradox.

    So: If the QED algorithm, an iterative calculation that only allows the result to be approached asymptotically, matches physical reality extremely well; yet the concepts and equations that inspired the algorithm cannot be formally proven; then one should ask: Which of these two entities — both of which have formally precise definitions, but only one of which attempts a closed-form solution in which material objects are modeled as infinitesimal points — is more likely to be the one that accurately reflects how the real universe operates?


    Let me put that more bluntly: One way out of the QED paradox is to postulate that the universe is a finite set of naturally-occurring bits whose number is defined by the total mass-energy of the universe. All of physics, both classical and quantum, then becomes rules operating on those bits (PAVIS). In such a universe it is the QED algorithms, not the QED equations, that are the simpler and more accurate representation of how physics works. The algorithm is good at predicting field behaviors because the fields in the real world also emerge iteratively, and also have finite resolution.

    This is a classical-first universe with finite resolution, no superstrings, no Planck foam, and no quantum gravity. Spacetime is defined entirely by the address bits of matter and energy, and so has no existence in their absence. Quantum mechanics becomes nothing more than the inherent jittery uncertainty that occurs when you run out of resolution bits at the lowest-scale edges of mass and energy: dark wave functions, the bit-starved gaps in the classical Boltzmann fabric.

    1. 1. Millennium Prize Problems;

      2. I meant after renormalization! Renormalization only makes the individual terms finite. (This requires a very difficult proof!)

    2. Terry Bollinger wrote: "It's time to backtrack and re-examine how the question was posed."

      Well said! Centuries were spent trying to prove Euclid's fifth axiom, and only by stepping back was it discovered that the task was not only futile but impossible. Worse, it blocked the view to entirely new fields, other geometries. In physics, the particle concept is such a dead end, as PhysicistDave has unwittingly illustrated with his comment. None of the "particles" that keep popping up in condensed-matter physics are fundamental. But many physicists cannot conceive of a physics without particles, and they insist on wasting more time studying them.

    3. Feynman demonstrated how quantum amplitudes could be calculated below some cut-off in energy, often denoted by Λ (not to be confused with the cosmological constant, also denoted Λ), so that processes which occur above this energy scale can be ignored. Feynman showed this to be the case even as Λ → ∞. It is then sometimes said that the infinities in QED can be removed and the relevant finite part computed. Some have said this amounts to sweeping a problem under the rug. On the other hand, it leads to a scaling principle called the renormalization group (RG) flow, where coupling parameters can be rescaled to higher energy.
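      As a toy illustration of this RG flow (my addition, not part of the comment): running the QED coupling at one loop from the electron mass to the Z mass, keeping only the electron in the loop, already strengthens the coupling noticeably. (The full Standard Model running gives roughly 1/128 at the Z mass.)

```python
import math

alpha_0 = 1 / 137.036        # fine-structure constant at low energy
m_e, M_Z = 0.000511, 91.19   # electron and Z masses in GeV

# One-loop QED running with only the electron loop (a toy model):
# alpha(mu) = alpha_0 / (1 - (2*alpha_0 / (3*pi)) * ln(mu / m_e))
alpha_MZ = alpha_0 / (1 - (2 * alpha_0 / (3 * math.pi)) * math.log(M_Z / m_e))
print(f"1/alpha(M_Z) = {1 / alpha_MZ:.1f}")  # about 134.5
```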

      This can be seen with quantum gravitation in one sense. The vacuum is the zero-particle state for all quantum fields. The universe can be thought of as containing a large number of Planck volumes of around 10^{-99} cm^3. The cosmological constant Λ ≃ 10^{-52} m^{-2} defines the cosmological horizon at L = √(3/Λ) ≃ 10^{26} m, or around 1.3×10^{10} ly. This means there are around 10^{183} of these Planck units of volume in the region we can causally communicate with, called the observable universe. The bounding horizon has around 10^{122} Planck-area "pixels" as well. While the numbers of these Planck volumes (voxels) in the bulk and Planck pixels on the horizon are constants, the expansion of the universe means Planck volumes leave our causal region but new ones emerge. Also, this occurs according to a holographic principle.
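      The voxel and pixel counts quoted above are easy to reproduce (a rough check, my addition; constants approximate):

```python
import math

Lambda = 1.1e-52     # cosmological constant in m^-2
l_p = 1.616e-35      # Planck length in m

L = math.sqrt(3 / Lambda)    # cosmological horizon scale, ~1.7e26 m
n_voxels = (L / l_p) ** 3    # Planck volumes in the bulk, ~1e183
n_pixels = (L / l_p) ** 2    # Planck-area pixels on the horizon, ~1e122
print(f"voxels ~ 1e{math.log10(n_voxels):.0f}, pixels ~ 1e{math.log10(n_pixels):.0f}")
```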

      The holographic principle indicates the QFT content on an event horizon is equivalent to the same quantum information in the bulk. With the above, this means the number of Planck pixels, L^2/ℓ_p^2, is equal to the number of quantum units in the bulk determined by the uncertainty spread of fields δL. This means

      (ℓ_p/L)^2 = (δL/L)^3,

      where the uncertainty spread in the bulk is given by δL^3 = ℓ_p^2 L, where L = √(3/Λ). This is around 10^{-18} m. Using the Heisenberg uncertainty principle, this is a fluctuation of energy of around 100 GeV, which is interestingly around the mass of the Higgs particle! This is one reason I think the gravitational coupling constant is the dimensionless ratio (M_h/m_p) ≃ 10^{-17}. BTW, I ignore the small bits of mass-energy in the universe as particles and fields.

      This scale is such that Planck units below the Planck length are stretched to the Planck length and beyond, while at the same time the expansion of the universe is such that Planck volumes move beyond the horizon. We can think of the extreme-UV vacuum modes above the Planck energy as mixed states, or states not entangled with the rest of the universe. The Planck volumes that move beyond the horizon then become unentangled with the region they left. The accelerated expansion of the universe then maintains a constant entanglement entropy within the O-region of the cosmic horizon. Inflation is a case where the horizon scale is much smaller and there is an RG flow. The holographic connection to the Higgs mass then raises some fascinating questions about the stability of the Higgs field, which, if the Higgs mass were larger, would pop upwards to the Planck mass. There are some unknown connections here.

      This then suggests that the trick Feynman performed is one of the most brilliant and salient pieces of work in 20th-century physics. It ultimately connects with the Planck limit and quantum gravitation.

    4. Terry,

      Bits/ binary digits don’t exist, except as an idea in the mind/brain, and as symbols which mean something from the point of view of human beings. The “binary digits” concept is physically implemented by higher and lower voltages in computers. There are particles, atoms and molecules, but no actual genuine binary digits.

      However, I agree that (what we would represent as) algorithms must exist, in the same sense that laws of nature (that we represent as equations) exist.

    5. Werner wrote:
      >it blocked the view to entirely new fields, other geometries. In physics, the particle concept is such a dead end, as PhysicistDave has unwittingly illustrated with his comment.

      No, I neither "unwittingly" nor "wittingly" illustrated that!

      You have a very nasty tendency to put words in other people's mouths.

      I am as happy as the next guy to treat "particles" as a merely useful approximation to the underlying quantum fields.

      That does not help to solve the basic problems: the measurement problem and Bell's theorem.

      Neither Bell's theorem nor the fact that we have separate "measurement axioms" makes any necessary reference to particles. (The standard tests of Bell's theorem do indeed require separate, individual photons in terms of the detection process, but make no assumptions as to whether these photons are really particles, waves, slithy toves, or whatever.)

      You keep making stupidly snarky comments implying that you know how to do better than all physicists, past and present, but you never actually do it!

      You're just full of hot air -- all talk and no action.


    6. Terry

      You wrote: "Spacetime is defined entirely by the address bits of matter and energy, and so has no existence in their absence."
      I agree, but IMO there is more to the story as follows...

      There are other issues commented on in this thread which are relevant. One is it/bit and the other is Lawrence's: no creation or destruction of colour charge in QCD.

      I completely agree with the indestructibility of colour charge [irrespective of QCD], but what happens to the colour charge [which is an essential ingredient of protons and neutrons] when the spacetime collapses at the end of the Penrose CCC cycle of the universe (or however it will be that the end will occur)? IMO the colour charge will disappear but will not be destroyed. The colour charge will still exist but it will not be within a fermion. My model has the formulae for this recombination but it is simply summarised as 'adding red to antired results in net neutral colour'. The net neutral colour is taken into the photons at the end of the CCC so colour charge is not destroyed but merely recombined. In my model all change is recombination with a strict accounting system whereby all going in to an interaction must come out but in a revised form, nothing is created or destroyed from zero as it were.

      And as you say, without the fermions the bosons have no matter to interact with to make a spacetime. But nothing fundamental has been destroyed and everything can be reconstituted given the right circumstances because it has been a recombination rather than a destruction of fundamental entities.

      It versus bit: I formed my view after following Susskind's online 'theoretical minimum' course on string theory over ten years ago. (Twenty hours of lectures still available online.) It is a matter of perspective that an external observer of a particle which is travelling at near the speed of light, with respect to the observer, has reduced information available. In the limit it reduces to a binary outcome. So this bit is in the brain of an external observer but an internal observer might record much more than one bit. Returning to QCD redness, the red 'bit' could be red = 1 and antired = 0. But so much more than one bit is likely to be available for an internal and hence hypothetical observer. But that extra information is not available to us (unless the speed were to drop).

      I also interpreted the bit as motion towards us (red) or away from us (antired), for example as a particle whirling around us in a circle. And I further abstracted this to a particle travelling through an independent red space in its independent red time +t direction (red) or its -t direction (antired). So I view the binary red/antired as being much more physical than a mathematical/logical bit. And these bits do not need to be created nor destroyed in any particle interaction or in the CCC end-of-universe-cycle event. And so all the bits are available to recombine and begin a new cycle. I admit that these seemingly indestructible bits do appear mathematical, as they are always available for us to perform mathematical operations on.

      I thought I had finished, but the matter of a 2D holographic universe might be raised. The number of dimensions apparent to an observer depends on the relative speeds of those dimensions. This is covered by Susskind's course. So 2D to one observer could be 3D to another observer with a different perspective. Not sure how relevant that is to the holographic universe.

      Austin Fearnley

    7. Terry Bollinger wrote:
      >One of Feynman’s (multiple) dark secrets is that he was more of an intuitionist than a mathematician...

      Not exactly a secret! He bragged about it (I took two years of classes from Feynman, as you may recall).

      I actually discussed this once with Jon Mathews, then chair of the physics department at Caltech, who told me that Feynman was actually a better mathematician than Feynman let on. Jon felt it was part of the persona Feynman constructed for himself.


    8. Lorraine, a binary digit, or bit, is the answer to a yes/no or true/false question, in which 1 stands for true and 0 for false. (Is the voltage within a certain range?) Unless you believe the world is in your head, the world contains the answers to such questions. (The symbols 1 and 0 are human conventions, but the concept is what is important, not the arbitrary symbols.)

      True/false logic may not work in all cases, but it works well in enough cases to be a useful tool by itself, and it can be extended to fuzzy logic. So it seems to be a good starting point for any thinking entity. Animal experiments have shown the concept to be present in monkeys and other creatures.

      By the way, Feynman's diagrams are another example of an algorithm which is an explicit part of physics, and taught as such to thousands of physics students. Denying the existence of algorithms in science and claiming it has only equations is like a blind person denying the existence of colors. Science is never finished (despite what John Horgan has said) so maybe it needs more algorithms than it currently has, but it does have algorithms. (The bit value is 1.)

      This of course is my own point of view, which assumes the world is real outside of my head. I keep hoping though that if I can make it clear enough you will be able to accept it as at least equally as valid as yours.

      As long as I am commenting, I will take the opportunity to thank Dr. Bollinger for his always interesting and often humorous comments. (I had to laugh at one recently, "... my job in the US Federal Government (when there was such a thing)...")

    9. PhysicistDave wrote: "You have a very nasty tendency to put words in other people's mouths."

      No, I did not put any words in your mouth. I just pointed to the conclusion that any physicist (though not dogmatists) could draw.

      > "That does not help to solve the basic problems: the measurement problem and Bell's theorem."

      I've already replied to that: You can't say what (in your view) the measurement problem is, and as the experiments show, Bell's theorem is irrelevant (in this universe).
      What conclusions to draw from the EPR/B experiments is another matter. Perhaps that it is wrong to think of "particles, waves, slithy toves, or whatever" as travelling from the source to the detectors.

      Also non-physicist readers will be able to distinguish substance and polemics. As every reader of this blog can verify, you've never come up with physical arguments against what I had written. At best, you called it irrelevant without explaining why.

      > "all talk and no action. Contemptible."

      Yes! With best wishes for a speedy return to your normal state,


    10. JimV,

      I’m glad you agree that there are no actual binary digits except in theory i.e. in individual people’s minds. “Binary digits” in computers are actually higher and lower voltages, where either the higher or the lower voltage can represent one or true. “Binary digits”/ voltages, individually and groups of them, are used to symbolise information. Computers are processing symbols that only represent information from the point of view of human beings, and symbols that only represent logic from the point of view of human beings. From the “point of view” of a computer/ AI there is only voltage: the computer/ AI can’t know what the symbols are supposed to represent.

    11. @Terry:

      Any really new idea in physics or mathematics is going to be 'hand-wavy': imaginary numbers, the calculus, as well as Feynman's path integral and the like. Rigor tends to follow after they have already proven their utility, which goes for all the examples above.

      There have actually been a number of attempts to make the path integral rigorous, or at least its end results. One method that takes its cue directly from Feynman's argument is the Henstock-Kurzweil integral. I expect progress on this to be quite slow, since mathematicians have invested a great deal of time and energy in the Lebesgue integral, the standard integral in mathematics. Unfortunately this integral is badly behaved in infinite-dimensional spaces, of which path space is an example. It's not only integrals that are badly behaved there; differentiation is also badly behaved. But it seems to me that quite a lot of progress has been made recently in coming to grips with calculus in such spaces, so I expect much more progress will be made in thinking through what a rigorous path integral looks like in the future.

    12. PhysicistDave,

      "The standard tests of Bell's theorem do indeed require separate, individual photons in terms of the detection process, but make no assumptions as to whether these photons are really particles, waves, slithy toves, or whatever."

      So, "individual photons" can be waves, like classical polarized waves after all! This is a nice admission on your part. This is one of those moments when the use of QED is imperative, so:


    13. "“Binary digits” in computers are actually higher and lower voltages, where either the higher or the lower voltage can represent one or true."--LF

      Yes, and again, if in the English language the meanings of "true" and "false" were exchanged, all logical statements could still be expressed, and their meanings understood. As I said, conventions and symbols are arbitrary (as long as all parties agree on them), and it is the underlying concept which is important. Human brains (and other creatures' brains) can hold the concepts of logic and manipulate them. Computers manifestly can do the same. We use "true" and "false" or "1" and "0". Computers (some of them) use voltage ranges. Why is one convention significant and the other not?

      A map is not the territory, but a map which is useful and works well is evidence that the territory exists. Binary digits (true and false assessments) exist. (Assuming the universe is real.) The very question of whether bits exist assumes that answer (true or false) exists!
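      The convention-swap point can be checked mechanically. Here is a minimal sketch in Python (my own illustration, not anyone's claim about actual hardware): relabeling True and False everywhere turns AND into OR (De Morgan duality), yet every truth table remains expressible under either convention.

```python
# Sketch: swapping the labels True/False preserves expressibility.
# Under the swapped convention, a function f is "seen" as its dual:
# its inputs and its output are all relabeled.
def dual(f):
    return lambda *args: not f(*(not a for a in args))

AND = lambda x, y: x and y
OR = lambda x, y: x or y

# De Morgan: the dual of AND behaves exactly like OR (and vice versa),
# so any logic expressible in one convention is expressible in the other.
for x in (False, True):
    for y in (False, True):
        assert dual(AND)(x, y) == OR(x, y)
        assert dual(OR)(x, y) == AND(x, y)
```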

    14. Re JimV 12:31 PM, July 15, 2020:

      4.5 [1] volts or higher in a computer circuit might represent “one” or “true” from the point of view of human beings. Alternatively, 4.5 [1] volts or higher in a computer circuit might represent “zero” or “false” from the point of view of human beings. Whether 4.5 volts represents “one” or “true”, or 4.5 volts represents “zero” or “false”, is decided by human beings.

      JimV is claiming that the computer knows that the 4.5 volts (or some other threshold voltage) represents “one” or “true” (or alternatively “zero” or “false”); and that the computer knows that all other voltages represent “zero” or “false” (or alternatively “one” or “true”).

      Also, from the point of view of human beings, depending on arbitrary codes and conventions designed by human beings, sets of these voltages represent (e.g.) numbers, words and sentences. JimV is claiming that a computer can identify the relevant sets of voltages, and know what the sets of voltages represent.

      1. The number depends on the threshold voltage of the transistor that human beings decide to use.

    15. Andrei wrote to me:
      >[Dave]"The standard tests of Bell's theorem do indeed require separate, individual photons in terms of the detection process, but make no assumptions as to whether these photons are really particles, waves, slithy toves, or whatever."

      >[Andrei] So, "individual photons" can be waves...


      I did not say that.

      I just said that the proof of Bell's theorem does not discuss the issue at all, and therefore you need to make “no assumption” as to what is being transmitted for the purpose of testing Bell's theorem.

      In fact, the experiment that proves that Bell's theorem does not hold of the real world does happen to work with photons, which cannot be understood as classical waves: classical waves are not quantized, as you know as well as I. This follows from the basic property of classical EM: that Maxwell's equations are linear.

      To know that the experiment violates Bell's theorem, you need no such assumption at all.

      To understand in detail how QM predicts the results of the experiment and how the “guts” of the experiment operate, yes, you need to know that photons are involved.

      I.e., our friend Werner can just look at the results of the experiment and see that Bell's theorem is violated. But if he wants to know how we knew in advance that this would happen, how we were able to calculate the results, he needs to understand photons.

      And so do you. You have never explained how you could calculate the results of the Aspect experiment without QM.


      By the way, there are numerous experiments one can do that have nothing to do with physics in which Bell's theorem is true. But the Aspect experiment happens to be a case in which Bell's theorem does not hold, and this cannot happen in classical physics, so it shows that photons are intrinsically quantum.
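      For concreteness, here is the arithmetic behind the CHSH form of Bell's theorem (a sketch with the textbook angle choices, not a model of the actual Aspect apparatus): any local hidden-variable theory bounds the combination S at 2, while the quantum prediction for polarization-entangled photons, E(a, b) = cos 2(a - b), reaches 2*sqrt(2).

```python
import math

# Quantum correlation for polarization-entangled photons measured
# with polarizers at angles a and b (radians): E = cos(2(a - b)).
def E(a, b):
    return math.cos(2 * (a - b))

# Textbook CHSH angle settings that maximize the quantum value.
a1, a2 = 0.0, math.pi / 4
b1, b2 = math.pi / 8, 3 * math.pi / 8

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # 2.828... = 2*sqrt(2); local hidden variables give S <= 2
```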

      Remember: you agreed that if I proved your specific math claim was false, then you agreed that your general claim was false. And you agreed that I did prove your specific math claim was false.

      QED again.

      As to whether photons are waves or particles, wrong question. As physicists know (but you do not), photons are just what we call the quanta of the EM field in QFT. They are what they are: to insist on fitting them into a classical straitjacket just has never worked.

      The closest thing is Bohmian mechanics: as I have told you, I actually have a modification of Bohmian mechanics in which the vacuum looks like SED. But of course the rest does not look like SED at all, because SED has been proven false by experiment.

      (And, no, I am not pushing my model either: I view it as an oddity that I have never bothered to publish. Maybe I never will, since it would only give comfort to the SED crazies.)

      So, finally: QED.

      (For anyone who does not get the joke, QED is the theory of photons – quantum electrodynamics.)

    16. I commented to Terry on 6:28 AM, July 14, 2020 that IMO a red up quark contained rgbRgb colour units. This can be reduced to 000100 if we keep track of place/positions as b does not equal r even though 0 = 0.

      I wrote on 4:48 AM, July 13, 2020 that IMO 'R' (for Red) means something physical way beyond a single digit. It may be a single digit to observers only because of compactification of a huge amount of data made unavailable by a relative speed close to the speed of light.

      If we exchanged 'true' and 'false' here we would convert 000100 into 111011 which would represent an antired antiup quark with electric charge -0.67.

      An electron has electric charge of -1 and zero colour charge. Ignoring other properties, e.g. spin, weak isospin, and mass, we could represent the electron by 111111 which could be reduced to a new binary {1} whereas a positron would be 000000 or {0}. The {0} could represent antimatter of an elementary particle but the 000000 would represent a breakdown of the antimatter property into smaller bits in three independent colour dimensions.

      But 111111 is matter for an electron, while 000100 is matter for an up quark, so there is no clear relationship between binary 0s and antimatter.

      Further, IMO given a huge set of random binary colour units one could use them to make a set of matter-only elementary particles or one could instead make a set of antimatter-only antiparticles. So, looking to see where all the {0} have gone in the universe makes sense for missing antimatter elementary particles, but does not necessarily make sense at a more fundamental level as there we may have a complete set of properties and anti-properties already being used.

      Austin Fearnley

    17. PhysicistDave,

      "The standard tests of Bell's theorem do indeed require separate, individual photons in terms of the detection process, but make no assumptions as to whether these photons are really particles, waves, slithy toves, or whatever."

      "the experiment that proves that Bell's theorem does not hold of the real world does happen to work with photons, which cannot be understood as classical waves"

      Your above statements contradict each other. Both refer to the experimental test of Bell's theorem, not to the theoretical derivation of the theorem. In the first you say that "individual photons" can be anything ("whatever"), in the second that they cannot be classical waves. So, make up your mind!

      "classical waves are not quantized, as you know as well as I. This follows from the basic property of classical EM: that Maxwell's equations are linear."

      What we actually measure is not the EM fields themselves but the influence of those fields on matter (charges). So, in order to know if classical waves could reproduce QM's results one needs a classical model of matter (atoms). You simply assert that such a model is not possible but you presented no proof for that assertion. The evidence we have (the original classical atom where the electron loses too much energy, and the current SED atom, where the electron gains too much energy) makes me believe that a general proof that a middle ground, a classical stable atom, does not exist, cannot be provided. But you are invited to prove me wrong.

      So, until this proof is presented I am justified in claiming that the stability of atoms cannot rule out classical EM. We just do not know. So, I am justified in analyzing Bell’s theorem/tests from the point of view of classical EM without addressing the hypothetical classical atom. In other words, all this talk about atoms and quantized energy is a red herring that helps you avoid my very simple and clear argument, that the state of the source (and as a consequence the hidden variable) and the states of the detectors are not independent variables in classical EM. So, unless you are willing to deny Feynman’s formula, relating the fields to their sources (a formula you said you derived yourself) or deny that electrons and nuclei are charged, or the existence of the Lorentz force, Bell’s theorem in the context of classical EM is doomed.

    18. "JimV is claiming that a computer can identify the relevant sets of voltages, and know what the sets of voltages represent."-LF

      Where did I claim that? I was simply pointing out that the conventions and symbols used to express a concept are arbitrary so computer voltages are not better or worse than "1" and "0".

      Can you identify the thresholds of your own neurons and synapses and what each of their settings represent as constituents of your current thoughts? And how is that relevant to the issue of whether the concept represented by binary digits exists?

      To me that seems a non-sequitur. Do I take your changing the subject to mean that you now agree that binary digits (the answers to true/false questions) exist in the real world?

      (I would be happy to debate computers vs. brains, to try to reach mutual understanding on that, but one far-off-topic discussion per post per pair of commenters is the most I think our host should be asked to put up with--if that--and only if accompanied by compensatory site donations.)

    19. Mozibur,

      Thanks for the excellent quick discussion of integration theory! I had fun looking up Henstock-Kurzweil, and you’ve awakened an old interest of mine (going back to college, really) in the theory behind integrals. I will try to follow up on some of your excellent pointers in my personal study.

      One issue that has been of interest to me lately is how… well, tangled together?… biological issues of cognition can be with physics and maths. For example, if you examine all of the steps closely, the concept of time requires a cognitive version of approximation in its definition, specifically the idea that two adjacent ticks of the same clock are identical, and thus countable. I know that sounds trivially obvious, but that’s the point: Our brains provide us, without asking, a powerful ability and tendency to search for things that are “enough alike” to call them “the same thing”. The sad truth is that no two clock ticks can ever be absolutely identical, if only because the universe itself changes from one cycle to the next. And while this kind of approximation feels trivial to our pre-wired brains, I can assure you, from attempts to replicate it in machines, that it is not.

      The fact that the very first step in creating a “formal” system requires sufficient cognition to both perceive similarities and discard secondary factors is fascinating. Our brains create an illusion of absolute sameness by in effect saying “Hey, don’t worry that two ticks are never really the same… I’ll take care of that for you!”



      My apologies for not getting back to you sooner! Yes, I realize you were already aware of the Glashow cube coordinates and how they fit with your ideas, and I think I could have worded that better.

      One thing that I think has kind of gotten lost in the complexity of modern physics is that even for deeply quantum things, and even in the Standard Model that captures in exquisite detail what we know about particles, simplicity still counts.

      The vector 3-space of quaternions is usually represented by the unit vectors ijk, which are ordered counterclockwise (right-handed) when looking down at the origin o from the all-positive-units region. Along with their origin, these three vectors define a cube (the Glashow cube) with 8 vertices {o,i,j,k,j+k,k+i,i+j,i+j+k}. You are concisely using 1’s to represent the same concepts, and that notation obviously has brevity advantages.

      However, for every such right-handed quaternion vector space oijk and associated Glashow cube there is also a left-handed quaternion space, call it OIJK, with identical structure but clockwise unit vector order. This space and its Glashow cube can be mapped onto the negative unit vectors octant of the right-handed 3-space, but they have different algebraic properties. The left-handed Glashow cube is {O,I,J,K,J+K,K+I,I+J,I+J+K}.
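      One concrete way to see the two handednesses (a minimal sketch of my own, not Terry's construction): in the ordinary right-handed quaternion basis i*j = k, and reversing the cyclic order flips the sign, which is the algebraic content of the left-handed copy.

```python
# Hamilton product on quaternions stored as (w, x, y, z) tuples.
def qmul(p, q):
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)

assert qmul(i, j) == k              # right-handed (counterclockwise) order
assert qmul(j, i) == (0, 0, 0, -1)  # reversed order flips the sign
```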

      So here’s the point about simplicity: These two sets, the left- and right-handed quaternion vector spaces and their corresponding unit-vector Glashow cubes, exactly index all of the defining properties (except mass) of the fermions and anti-fermions, with matter and antimatter versions layered across both cube chiralities. Even the charge-degenerate neutrino and anti-neutrino are distinguished as the charge-free chiral origins o and O.

      So is this just numerology? Such an exact indexing of the gorgeous quaternions into the fermions, without some deeper mathematical connection behind it, just seems… well, unlikely.

      I don’t know for sure how any of the above maps into your binary string notation, but I do want to encourage you to keep looking for simplicity. There’s something remarkably algebraic and low-information going on in the fermions, despite their seeming complexity.

    20. Andrei wrote to me:
      >The evidence we have (the original classical atom where the electron loses too much energy, and the current SED atom, where the electron gains too much energy) makes me believe that a general proof that a middle ground, a classical stable atom, does not exist, cannot be provided. But you are invited to prove me wrong.

      Why would I want to prove you wrong??

      (Well, I already have, but you won't admit it!)

      Look: you and your SED friends have crackpot ideas that have been disproven for decades. But you still will not give up.

      It's not my or anyone else's job to prove that you will continue to fail.

      No one takes you seriously. No one ever will. No one cares.

      And for you guys to continue to chase phantoms is probably for the best: keeps you away from any real science where you might do actual harm.

      No, Andrei, I am content with the situation as it is. All competent scientists know you are wrong, but you can just stay in your little sandbox with your friends and don't bother the big kids. All for the best.


    21. JimV,

      Re JimV 10:20 AM, July 16, 2020:

      I have never “chang[ed] the subject”. The subject has always been your unrealistic belief that computers could be conscious of the meaning behind the man-made symbols being processed. I.e. the subject has always been your unrealistic belief that “a computer can identify the relevant sets of voltages,” that represent numbers, words and sentences “and know what the sets of voltages represent”.

    22. Terry Bollinger: a powerful ability and tendency to search for things that are “enough alike” to call them “the same thing”.

      I think that goes down to the wetware; neurons. A neuron can have 10,000 inputs, both positive (primers) and negative (inhibitors), and can match partial patterns. We don't have to see a whole dog to match a dog, in fact we can recognize a dog from a partial outline of a dog if some key features are shown, and if those are hidden, then other key features will do.

      That makes sense from an evolutionary perspective. No two rocks, trees, or holes in the ground are exactly alike, no two snakes, spiders or edible plants appear identical. It is helpful to still recognize things that are only partly visible; be they desirable or dangerous.

      The same applies to all senses; hearing, vision, touch and smell. We recognize fire, but no two fires look the same. Survival demands matching to generalized models based on close-enough key features.

      And although some neurons are highly specialized, most of them are very messy. They can fire for no apparent reason; they can fail to fire for no apparent reason. They can fire on a partial pattern that is obviously not unique enough; but that might be an evolved response too: Better to mistake a stick for a snake than vice versa.

      I suspect much of our creative thinking and discovery is due to precisely this mechanism; in research we find phenomena that we try to generalize, and despite the noise of individual variation, we do find a mental model that mostly fits our observations.

      Newton's studies led him to realize that, some noise aside, gravity mostly followed an inverse-square law. The example data he had was close enough that he could formalize an idealized model that fit almost everything, with minor variations we might attribute to measurement noise. (And the idea of measurement noise is itself an idealized model of something that happens.)
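      The step from noisy observations to an idealized law can be made concrete with a toy fit (the data here are invented purely for illustration): generate forces following an inverse-square law with a little multiplicative noise, then recover the exponent by least squares on log-log values.

```python
import math
import random

random.seed(0)

# Invented "observations": force ~ r**-2 with small multiplicative noise.
rs = [1.0 + 0.5 * n for n in range(20)]
forces = [r ** -2 * (1 + random.gauss(0, 0.02)) for r in rs]

# Least-squares slope of log(F) versus log(r) recovers the exponent.
xs = [math.log(r) for r in rs]
ys = [math.log(F) for F in forces]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(round(slope, 2))  # close to -2: the idealized law behind the noise
```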

      I think the mind is filled with literally millions of these nested idealized models. Our idealized model of a "car" is composed of hundreds of other idealized models: wheels, levers, pedals, cranks, seats, buckles, etc.

      Noisy pattern matching bubbles all the way up from the individual neuron itself, out of the necessity of finding useful order in the chaotic flood of noisy and mixed sensory data.

    23. Lorraine,

      I have written games of logic, such as the pegboard game "Mastermind", on computers and the computers have always executed the logical processes flawlessly. Whether the computers "know" they are doing so is not part of my claim with respect to bits. I do point out in passing that you yourself are not conscious of what is happening among your neurons and synapses, so you have no logical ground from which to criticize computers for that lack. I have tried to explain the parallels I see between computer processes and human brains elsewhere so I did not intend to re-litigate that in this thread. I am saying that both human brains (sometimes) and computers have the ability to process logical operations (such as AND and OR), and in this thread you seem to be saying that the inputs and results of such operations (bits) do not exist as other than a purely human concept.

      I believe in a real, physical world. In that world, trees fall whether or not any human is conscious of it, and answers to true/false questions exist whether or not any human knows those answers. (Has the tree fallen or not?) This world, or universe, in fact existed for over 13 billion years before any human existed. So for me, bits exist. You have claimed that bits only exist from a human point of view (even refusing to accept the evidence from experiments with other animals). It seems a short step to me from that to saying the entire world is in your head; or perhaps a collective effort among all humans although I don't know how that could work.

      For me, intelligence evolved as an aid to survival and reproduction in a real world. The main process that intelligence uses for this is to construct simulations, or maps, of real processes to predict the future consequences of different options. The existence of useful maps can either mean that the territory they represent exists or that human minds are somehow creating the territory from the maps. I prefer the first explanation as both the simpler and more humble one (using Occam's Razor and Mario's Sharp Rock, respectively).

      The fact that computers can also process such maps and even reach better conclusions than humans have (e.g., AlphaGoZero) fulfills predictions I made many years ago based on my real-world belief, so it tends to confirm it. Consciousness is a different issue, of lesser importance to me. The ultimate consciousness to me would be not only to know something, or to know that you know it, but to know that you know that you know ... it, and I think we all fail somewhere in that endless loop. (Thanks again, Zeno.)

    24. @Terry:

      It's always been appreciated that there is a subjective and an objective dimension of time. But this actually goes for everything. It's what, philosophically speaking, is understood as qualia: the irreducible element of actual experience.

      It's not just human cognition that thinks in terms of the ideal. It seems any kind of cognition must do so. A giraffe doesn't distinguish one tree from another - they're all good enough to munch on. And a lion doesn't distinguish one giraffe from another for the same reason. Of course, strictly speaking one tree is different to another, likewise with giraffes and lions.

      This ability of the mind to think in ideals was first theorised by Plato (this is the standard, Euro-centric account, especially when we recall that it's on Plato's own authority and admission that Philo-Sophia had older roots in Egypt). It is the ideal that is uniform, being a unity and singular, whilst the real world, being multiplicity, is not.

      But actually, are ideals only in the mind? After all, one electron 'recognises' another, even though they are, strictly phenomenally speaking, different. A reflection of that in matter is its uniformity at all scales.

      This is one reason that Plato was prepared to say that the real world has its archetypes in the ideal noumenal realm. Of course it's since Kant - a physicist turned philosopher - that the noumenal was drained of all content and meaning, and matter, in all its multiplicity, given new meaning. This is what he called his Copernican turn in philosophy, i.e. a revolutionary turn. This turn was then followed by Nietzsche (in theology/ethics) and Marx (in history) and then by many others. Thus the most pre-eminent philosophy in our times (if not the most eminent) is essentially materialistic. Personally I think it is to our great loss.

    25. Terry

      I was aware of the Glashow cube connection only because of your comments a year back, but my mind refers to the Rubik Cube rather than the Glashow cube. I feel that I already have a solution to the simplicity you mentioned, to my own satisfaction, but I needed to look online at the Glashow Cube to see what you meant.

      I wish that the Rubik Cube used RGB primary colours with opposite faces being complementary, but it does not, so I will pretend that it does. The Rubik Cube has eight vertices which can represent (w.r.t. colour properties only) the electron (RGB), the positron (rgb), and six quarks on the other vertices, for example Rgb being red quarks and rGB being antired quarks.

      But this picture needs to be completed by using other properties. I have seen the picture online that you referred to with two cubes balancing at a 0 vertex. This adds electric charge, weak isospin and hypercharge as extra dimensions. And as you noted, the neutrinos are drawn at the origin.

      In my model I think spin and weak isospin are the independent, or eigen, extra properties, and I built on those. I made a preon model with four preons which lets one build all the Standard Model particles and bosons, and even leptoquarks. The trouble is one can build almost anything! But the four preons in themselves are not as interesting as the quantum properties such as spin and weak isospin, and the way they may be aggregated into sub-preons.

      Just as Red is a bit (or three bits if given a group property, nodding maybe to Lorraine), spin is also a bit. And in a similar way I think of it as a 4D block of dimensions compactified because of relative speed c. Somehow that compactified universe may be curled around our space to give an intrinsic-only angular spin effect. Weak isospin is a kind of electric charge but needs to be treated independently. If the spin space and our space are curling together, that may explain double cover. Take two coins which touch and rotate one around the other: the rotating coin takes 720 degrees to get back to its starting place.
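      The two-coin observation above is the coin rotation paradox: a coin of radius r rolling without slipping around the outside of a fixed coin of radius R turns through 1 + R/r full revolutions, so two equal coins give 720 degrees. A one-line check (illustrative only; the function name is mine):

```python
def rolling_rotation_degrees(R, r):
    # Rolling without slipping around the OUTSIDE of a fixed coin:
    # the moving coin turns through (1 + R/r) full revolutions.
    return 360.0 * (1 + R / r)

print(rolling_rotation_degrees(1.0, 1.0))  # 720.0 for two equal coins
```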

      One can take the two cubes that you mention and see if one cube is a double cover of the other. In geometric algebra one cube would have a positive signed trivector and the other cube would have a negative signed trivector. One cube represents matter and the other cube represents antimatter.

      What properties are reversed in the two cubes? I am currently working on this. IMO, so far, the trivector is reversed, the signs of time, mass, electric charge, spin and weak isospin are reversed. Where particle time is variable without affecting thermodynamic time. Reversing the sign of mass can explain dark matter and dark energy, reversing the sign of time for an antiparticle can bypass Bell's Inequalities.

      Austin Fearnley


      Long before human beings had a written symbology, written language, or science, they already had consciousness, and they did everything by a process of rough estimation, because socialization and cooperation fundamentally require that we all be able to represent in our brains the general frameworks in which we coincide. This phenomenon is a complete composite of emotionally structured states. The mirror reproduces reality, but only one property: you can pass your hand over the reproduction of a glass in the mirror and it feels totally different from the real one; that "feeling" is integrated into our consciousness.

    27. Terry

      I want to raise a maths versus physics issue with respect to the use of positive and negative trivectors in geometric algebra (GA). I believe that software coders use GA in computer games development, so it may be easier to use GA than quaternions, even if they are equivalent options. Your antiparticle (Glashow) cube has a negative trivector, but any normal, physical, Euclidean 3D space (R^3) can be analysed using either a positive or a negative background sign for a volume of space (a trivector), and the answers outputted by GA analysis are the same. But you must pick one or the other for the analysis. Take your pick and there is no difference in outcomes.

      The indifference of GA's results to the choice of trivector sign could mean that using GA in your antiparticle cube would give results identical to those in the normal matter cube. This implies that if we lived in an antiparticle-dominated world it would look identical to our world. Probably, we cannot know using GA whether our world is particle-dominated or antiparticle-dominated. To sum up, applying either trivector option of GA to our R^3 world gives the same answer, and that answer makes it look like our world is matter rather than antimatter.

      What we really need is a change in perspective as this is a sort of low-tech relativistic issue. We need to analyse the negative trivector space (of antiparticles) from the point of view of an observer in our positive trivector space (of the universe). This is not a normal (AFAIK) use of GA. It would be a mistake, I believe, to assume that mathematical impartiality overrides physical differences in this context.

      Austin Fearnley

  12. Dr. Hossenfelder,

    What does "natural" mean in a technical sense?

    1. i aM,

      I have written about this dozens of times. You can search this blog's archive for "naturalness" using the search function. This may be a start. Or, well, read my book, link is in the sidebar.

    2. @i aM wh:

      In mathematics, naturalness is, technically speaking, quite close to what is meant physically by general covariance, in the way Einstein used it conceptually in coming up with GR. It's a term that's used ubiquitously in category theory.

      In many ways we can say that category theory is the study of naturalness, that is physically speaking, of covariance.

      Given that category theory is still scarily abstract territory for physicists who haven't come across it, I'm dropping in this comment to show that it does have a natural connection to physics through the notion of covariance.

      Unfortunately this is nothing to do with naturalness in physics in the sense Sabine has already written about.

  13. arXiv:0704.0646 (gr-qc)
    [Submitted on 5 Apr 2007 (v1), last revised 8 Oct 2007 (this version, v2)]

    The Mathematical Universe
    Max Tegmark (MIT)

    1. "The Mathematical Universe"

      It's t'other way round, chuck.

      Maths is a shadow cast by the physical universe on the physical brain via coarse perception and evolution of the brain. Maths is a painting of physical reality on the brain.

    2. @ Steven Evans,

      John Wheeler: "It from bit. Otherwise put, every it — every particle, every field of force, even the space-time continuum itself — derives its function, its meaning, its very existence entirely — even if in some contexts indirectly — from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. It from bit symbolizes the idea that every item of the physical world has at bottom — a very deep bottom, in most instances — an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe."

    3. Mathematics accurately shapes its world, which is larger than the real world.
      More or not?

    4. Prof. David Edwards, 7:45 AM, July 12, 2020

      It from Bit is just a speculation. The contents of the mind, including Maths, are clearly not an exact copy of the physical world. How could they be?

    5. inMatrix.ru, 8:30 AM, July 12, 2020

      "Mathematics accurately shapes its world, which is larger than the real world."

      But there's no reason to think that Maths can perfectly describe the real world. Fundamental human mathematical intuitions, like counting and the continuum, are fudges of an evolved brain.

    6. @ Steven Evans,

      The Quantum Revolution is that there is no "Physical World"; only phenomena! Weyl quote:

      "For as long as I do not proceed beyond what is given, or, more exactly, what is given at the moment, there is no need for the substructure of an objective world. Even if I include memory and in principle acknowledge it as valid testimony, if I furthermore accept as data the contents of the consciousness of others on equal terms with my own, thus opening myself to the mystery of intersubjective communication, I would still not have to proceed as we actually do, but might ask instead for the ‘transformations’ which mediate between the images of the several consciousnesses. Such a presentation would fit in with Leibniz’s monadology."

      This is my approach to quantum logic.

    7. We really have no idea what the relationship between the physical world and mathematics is. Speculations about this have been around since Plato proposed a dualism between physical and ideal forms. There have been many metaphysical ideas along these lines for centuries. What we do have is Wigner’s “unreasonable effectiveness of mathematics” with respect to calculating things and the physical world. Mathematics does work remarkably well as the formalism of a theory that is truly effective.

      Tegmark’s “Mathematical Universe Hypothesis” is interesting to think and talk about late in the evening over cigars and scotch, but it is not clear how one can take this seriously. The issue with Gödel’s theorem made Tegmark revise this into what he called the “Computable Universe Hypothesis,” where physical processes calculate things. Seth Lloyd considers the entire universe a sort of quantum computer. There are computational aspects to physics and physical theory, but I have been a bit reticent about considering the universe a computer.

      Parallel to this has been speculation about the nature of mathematics in and of itself. Brouwer proposed that mathematics was nothing more than a game of construction, called intuitionism, which is different from other schools such as Hilbert’s formalism. Is mathematics just a mental game of the human mind, say intuitionism, or is it something that has some existential property that, while not physical, is nonetheless there? We really have no idea. It does not satisfy me to think of mathematics as nothing more than some sort of game, but on the other hand I have no idea what is meant by saying mathematics has an objective existence. Kurt Gödel argued that unprovably true propositions indicate a Platonist or objective basis for mathematics.

      I read a paper by Tegmark on this around 2005 or so. My honest personal reaction was that it left me somewhat underwhelmed. A theory of physics that claims mathematics as the basis for everything implies that mathematics is an empirical subject. Outside of numerical analysis and related work, mathematics is largely not what I would call empirical. To me this means Tegmark’s idea here is really a sort of metaphysics.

      No matter how you look at this we will never be happy. There is no happy solution to this question on the relationship between physics and mathematics. It is as Garrison Keillor put it in his Guy Noir skits, “On the 10th floor of the Atlas building on a dark night in a city that knows how to keep its secrets, one man seeks answers to life’s persistent questions, Guy Noir private eye.”

    8. Prof. David Edwards, 4:46 AM, July 13, 2020

      Dear, oh dear. Is this the kind of claptrap that turns you on? So there was no quark-gluon plasma 13.7 billion years ago; there is just some transformation, mediating between our consciousnesses, that produces an image which makes us think this? Are we plugged into the matrix? Are we in a computer simulation? These are all dead-end considerations. Observation is what has worked thus far, and positing a physical world is the easiest way to think of the situation.

      Your approach to quantum logic doesn't tell us anything - double slit interference or not, entanglement - the results of the experiments remain the same. You are just juggling the theory around a little to no effect. It's just quantum semantics.

    9. Lawrence Crowell, 7:07 AM, July 13, 2020

      One is never going to be able to show that the universe *is* mathematical or an algorithm, because of the finite precision of physical observations. Mathematical intuition in the human brain is just a coarse representation of the physical world. Take the human intuition of the continuum: it's not known whether space-time is actually continuous, and the intuition is fuzzy anyway. Can human intuition distinguish between the Solovay model, in which all sets of reals are Lebesgue measurable, and the usual model of the reals? No. So there are multiple continua that match our immediate intuition.

      The intuition is a fuzzy short-cut as you would expect in a very finite brain evolved to make quick decisions to survive. When we scrutinise our mathematical intuition it helps us do Physics, but this is not "unreasonable effectiveness" but completely unsurprising as the intuitions are structures in the brain resulting from coarse observations of the Physical world.

    10. @ Steven Evans,

      Said by someone who has probably never seriously studied QT!

    11. Prof. David Edwards, 2:43 AM, July 14, 2020

      There's clearly no "probably" about it, but you are welcome to report something known about the physical world based on quantum logic if you so desire... Then people might read your essay.

    12. @ Steven Evans,

      Quantum theory is best understood as a form of perspectivism not physicalism! In fact, Gleason's Theorem has as a corollary that the standard quantum logic cannot be embedded in Boolean Logic. Hence there is no "physical world" in the classical sense! (One can get a substitute which is an inverse system of worlds which is dual to the system of perspectives.)

    13. Mathematics has these pure ideals, such as infinitesimals. Physically we cannot locate a particle at a point, only within a small spot. Yet mathematics is most consistent with point-set topology as the basis for geometry or differential geometry.

      It from bit, or it from qubit, is an interesting idea. Spacetime is possibly built up from large-N entanglements of states. These N-tangles define what we phenomenologically call space or spacetime.

    14. Prof. David Edwards,

      "the standard quantum logic cannot be embedded in Boolean Logic. Hence there is no "physical world" in the classical sense!"

      This conclusion does not follow.

      What is known as "quantum logic" is simply a reflection of the limitations one encounters when performing various measurements. If the position of an electron is measured, say by forcing the particle to pass through a very narrow slit, the particle will interact strongly with the slit (in fact with the electrons and nuclei from the slit's material) and this interaction results in a large spread in momentum. If you want to measure the electron's momentum very precisely you need to use two very large and very distant slits so the uncertainty in position becomes large. Since a slit cannot be large and narrow at the same time the two experiments cannot be performed at the same time. So, we cannot prepare a state with a known position and a known momentum and QM's formalism reflects this experimental reality.

      The above explanation is perfectly compatible with both classical physics (in fact it is based on classical physics) and with classical logic. The jump from "you cannot measure both x and p" to "x and/or p cannot exist" is not justified.

    15. @ Andrei,

      You're stuck in classical theory and the old quantum theory and haven't assimilated the 1925-6 revolution!

    16. Prof. David Edwards,

      "You're stuck in classical theory and the old quantum theory and haven't assimilated the 1925-6 revolution!"

      Give me any example of an experiment and I will provide you with an explanation compatible with classical logic. And my source for information is not "the old quantum theory" but a quite new interpretation (consistent histories). This paper is a good example:

      The New Quantum Logic

      Robert B. Griffiths

    17. Prof. David Edwards, 5:41 AM, July 14, 2020

      "Quantum theory is best understood as a form of perspectivism not physicalism!"

      So you keep claiming, but the empirical evidence does not support perspectivism over, say, superdeterminism or Many-Worlds, so in what sense is it the "best" theory? In no sense.

      "In fact, Gleason's Theorem has as a corollary that the standard quantum logic cannot be embedded in Boolean Logic."

      But Gleason's theorem has its assumptions, too. You're just juggling the theory around a little. And even if quantum logic can't be embedded in Boolean logic in the theory, it doesn't mean something that coarsely looks like quantum logic up to a certain finite precision can't look like Boolean logic up to a certain level of precision en masse. You are confusing a model with reality. You're just like the String Theorists - finding lots of results about a theory which is absolutely not known to be relevant to observations.

      "Hence there is no "physical world" in the classical sense! "

      I don't know why you think a "physical world" requires Boolean logic. We're just saying everything isn't just in our minds, that's all.

      "One can get a substitute which is an inverse system of worlds which is dual to the system of perspectives."

      This all exists only in your head, though. Perspectivism is just your perspective.

    18. @Lawrence Crowell:

      I'd say that Tegmark's is just a modern and narrow incarnation (or reincarnation?) of Pythagoreanism: the world is an aspect of number. Personally, I think that mathematics describes the world so well because it is a science of the necessary, and obviously what physicists look for in chiselling down to the laws of nature are those laws that are as close to being necessary as possible.

      It looks to me like you're confusing formalism and intuitionism as philosophies of mathematics. Quite often it's formalism that is described as a game. As for intuitionism, von Neumann wrote to Carnap, on the day that Gödel published his result, saying:

      "Thus today I am of the opinion that 1. Gödel has shown the unrealizability of Hilbert's program. 2. There is no more reason to reject intuitionism (if one disregards the aesthetic issue, which in practice will also for me be the decisive factor). Therefore I consider the state of the foundational discussion in Königsberg to be outdated, for Gödel's fundamental discoveries have brought the question to a completely different level."

      To see that something concretely mathematical can be said about intuitionism, look up Heyting algebras, which are to intuitionistic logic what Boolean algebras are to classical logic.

      I'd also add that points are definitely not necessary for topology. All we need are open sets.
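
      The contrast above between Heyting and Boolean algebras can be made concrete with a small sketch, in Python for illustration only (the names and setup are mine, not from the thread): the three-element chain 0 < 1/2 < 1 is the smallest Heyting algebra that is not Boolean, and both double negation and excluded middle fail at its middle element.

```python
# A minimal illustrative sketch: the three-element chain {0, 1/2, 1} as a
# Heyting algebra.  Meet is min, join is max, and on a chain the relative
# pseudo-complement (intuitionistic implication) is: (a -> b) = 1 if a <= b, else b.
from fractions import Fraction

ZERO, HALF, ONE = Fraction(0), Fraction(1, 2), Fraction(1)
elements = [ZERO, HALF, ONE]

def meet(a, b): return min(a, b)
def join(a, b): return max(a, b)
def imp(a, b):  return ONE if a <= b else b
def neg(a):     return imp(a, ZERO)   # intuitionistic negation: a -> 0

# Verify the defining adjunction of a Heyting algebra:
#   c <= (a -> b)   if and only if   meet(a, c) <= b
for a in elements:
    for b in elements:
        for c in elements:
            assert (c <= imp(a, b)) == (meet(a, c) <= b)

print(neg(neg(HALF)) == HALF)        # False: double negation is not the identity
print(join(HALF, neg(HALF)) == ONE)  # False: excluded middle fails
```

      In a Boolean algebra both printed checks would be True; their failure here is exactly the structural gap between intuitionistic and classical logic that Heyting algebras capture.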

    19. The idea that the universe is mathematics has been around a while. The exact relationship between mathematics and the physical world is not clear, but many physical quantities add or subtract: one mass plus another mass is a total mass. Geometry, which tells us things about the relationships between objects, similarly satisfies mathematics, in adding distances and, further, the Pythagorean theorem. A weight scale is in a way a sort of equation, with the fulcrum being the equals sign. It takes off from there.

      The impact of Gödel’s theorem is an ongoing subject. Gödel thought the self-referential truth of unprovable propositions meant there was a Platonic realm for mathematics. Most mathematicians do hold to objectivity in mathematics. I would say intuitionism is more popular with physicists. For myself, I do not hold fast to any of these metamathematical concepts.

  14. Sabine, you write "There is no reason that nature should actually be described by a theory of everything."

    1. I haven't quite been able to decide whether that's a spam comment or not, but since it doesn't advertise anything, I eventually decided to publish it.

  15. Paul Dirac was persistently, extremely critical about procedures of renormalization. In 1963, he wrote, "… in the renormalization theory we have a theory that has defied all the attempts of the mathematician to make it sound. I am inclined to suspect that the renormalization theory is something that will not survive in the future,…" He further observed that "One can distinguish between two main procedures for a theoretical physicist. One of them is to work from the experimental basis ... The other procedure is to work from the mathematical basis. One examines and criticizes the existing theory. One tries to pin-point the faults in it and then tries to remove them. The difficulty here is to remove the faults without destroying the very great successes of the existing theory."

      '... in the renormalization theory we have a theory that has defied all attempts of the mathematician to make it sound. I'm inclined to suspect that the renormalization theory is something that will not survive in the future...'

      The mathematical physicist Alain Connes actually traces the idea of renormalization back to the British physicist George Green. He also has a theory of renormalization, the Connes-Kreimer, which makes the notion rigorous. Moreover, they connect it to Galois theory.

      I'm inclined to think that renormalization theory will survive as theory both mathematically and physically.

    2. That should be the Connes-Kreimer theory!

    3. @ Mozibur,

      Connes-Kreimer provides a systematic, rigorous way to do renormalization; the series still does not converge and QED still does not mathematically exist!

    4. @Prof. David Edwards:

      Given that renormalization was discovered/invented to take care of divergences in QED, I don't see how your two sentences fit together.

    5. @ Mozibur,

      It only takes care of the divergences in the individual terms of the perturbation expansion; the infinite series still doesn't converge. Physicists get their predictions by using only a small number of its leading terms. The justification for this would require that the perturbation expansion be an asymptotic expansion, which is only known for the well-defined QFT (phi^4)_2.
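
      The pattern described here, individually finite terms, a divergent total series, yet accurate predictions from the first few terms, is the defining behaviour of an asymptotic expansion. It can be sketched with a classic toy example, Euler's divergent series for the Stieltjes integral (not a QFT; the Python code and names below are illustrative only):

```python
import math

def stieltjes(x, t_max=60.0, n_steps=200_000):
    """Trapezoid-rule value of S(x) = integral_0^inf e^{-t} / (1 + x t) dt (tail truncated)."""
    h = t_max / n_steps
    total = 0.5 * (1.0 + math.exp(-t_max) / (1.0 + x * t_max))
    for i in range(1, n_steps):
        t = i * h
        total += math.exp(-t) / (1.0 + x * t)
    return total * h

def partial_sum(x, n_terms):
    """First n_terms of the divergent asymptotic series sum_n (-1)^n n! x^n."""
    s, term = 0.0, 1.0
    for n in range(n_terms):
        s += term
        term *= -(n + 1) * x  # next term: (-1)^(n+1) (n+1)! x^(n+1)
    return s

x = 0.1
exact = stieltjes(x)
# The truncation error shrinks until n is about 1/x = 10 terms, then grows without bound.
for n_terms in (2, 5, 10, 20, 30):
    print(f"{n_terms:2d} terms: error {abs(partial_sum(x, n_terms) - exact):.1e}")
```

      Every term is finite and the optimally truncated sum is very accurate, yet the series has zero radius of convergence; this is the logical status generally conjectured for the QED perturbation series as well.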

    6. @Prof. David Edwards:

      There are exactly solvable CFTs such as Liouville theory. QFTs suffer from other problems such as the ill-defined path integral, nevertheless they are very successfully used, and mathematicians have come up with rigorous definitions in many special cases. This is no different from the history of physics where mathematicians play catch-up to physics. After all, Newton successfully used calculus in his theory of gravity without waiting to be fully rigorous about calculus.

      QFTs aren't rigorously defined in general, and a lot of work has gone into making them so, like the locally covariant, axiomatic, functorial, and algebraic approaches (and others). It's a work in progress.

      The original point that I was making, and which I'll reiterate, is that, contrary to Dirac's remark, renormalization has a rigorous formulation (and that this isn't well known). Of course the whole perturbative approach has its own problems, with non-perturbative effects like instantons showing up. Basically, these are global effects which become visible in the fibre-bundle approach and can easily be seen not to be visible locally.

  16. While the Standard Model has been extremely successful, it has some unsatisfactory features. It survived a transition from massless neutrinos to massive neutrinos. Leptons have electric charge one, while quarks have 1/3 or 2/3. Quarks have been postulated to be individually unobservable.

    Is there any experimental search for particles with 1/3 charge?

    1. I think if you postulate that "unitary-quarks" (like leptons) have +1 or -1 charge, you are led to the unavoidable conclusion that baryons, including stable baryons like the proton, would have to contain both matter and antimatter "unitary-quarks" in close proximity without the sub-particles ever colliding and annihilating (at least within a time-frame beyond the age of the universe). Although this seems like theoretical nonsense to accept, it may instead be a theoretical prejudice in heavy disguise.

      The claimed experimental evidence for fractionally charged quarks appears, on closer examination, to be very heavily theory-dependent, and a few papers on the physics arXiv, rightly or wrongly, draw attention to this. My interpretation is that 1/3- and 2/3-charged quarks are unobservables added to the pre-existing theoretical framework, like dark matter added to General Relativity, to explain newly observed phenomena (in the case of fractionally charged quarks, the composite structure of baryons).

      Fractionally charged quarks solve the immediately pressing problems simply and neatly (including avoiding matter and antimatter closely co-existing in stable baryons), but at the same time give rise to a difficult residual problem: why the extreme matter-antimatter asymmetry in the universe exists.

    2. Hi cg
      Quarks are unobservable because they do not exist. They are just a convenient way to describe some complexities of the proton. Just as Heisenberg promoted the description of the neutron as a separate particle instead of the previous view that it was a proton plus an electron. Descriptive/mathematical convenience distorts reality.

  17. Hi Sabine,

    a naive and probably stupid question. I understand that the logical path to unification is that gravity is quantum in nature, so everything should fit in the same theoretical bag. That much I think I understand. But what if it is the opposite? I mean, what if gravity is the quantum bag in which everything else fits?

    The reason I ask is because gravity seems transparent to every other force, as it looks to me like it only acts on relative energies (except in a black hole where we do not know).


    1. Hi akidbelle,

      What is needed is some resolution of the inconsistency between quantum theory and non-quantum gravity. This does not necessarily mean gravity has to become a quantum theory, it's just that this is the option that has received the bulk of attention.

    2. Hi Sabine,

      thanks for answering. That much I understand, but perhaps I did not explain properly. Maybe resolving inconsistencies is not ambitious enough. I think the result may be "SM included in QG = QM". That would be a concept of gravity in QM where all particles are part of, or deduced from, QG. Essentially, mass is mass, and then I think it should drive everything else.

      But perhaps that is nuts or too ambitious.

      Best, and thanks again.

  18. I am reminded of Pente (a Go-like game.) There are two ways to win a game of Pente: Either be the first to place five stones in a row on a 19x19 board, or be the first to capture five pairs of the opponent's stones. Are the rules consistent? Yes. Is it possible that some other rule exists in the universe of Pente that 'unifies' both rules into one at some 'deeper' level? Of course not. And so it goes for the fiats that govern a wide range of systems. Unification of the laws that govern their behaviour may be nice (for some), but it is not required. The only necessary condition is that they be consistent. That's it.

  19. Regarding Weinstein’s geometric unity theory/idea, has anyone published anything on where its weaknesses are, or perhaps why it is outright wrong? It is very hard to understand as a layperson without proper guidance.

    1. Weinstein himself doesn't seem to have anything published. Best thing I could find are recordings of lectures and slides in those. It's hard to criticize something if you don't know what it is.

  20. To your list I'd add Cohl Furey and her love for the Octonion.

    1. I've looked at some of Cohl Furey's online videos and they seem very well done. It wouldn't surprise me if the octonions were implicated in physics. After all, the quaternions are implicated in Dirac's equation, although very few people point this out.

  21. The problem with attempting to create a 'theory of everything' is that we are working with incomplete data. The three scientific frameworks, quantum mechanics, relativity, and classical mechanics, are effective theories. Each is good in its own narrow domain, and it shows in the way they don't work well together. They are all far from a fundamental expression of physical reality. The sad truth is that if we had a fundamental theory, we wouldn't need a Unified Field Theory or GUT; by definition, a fundamental theory could only exist within a unified framework. Instead of answering the questions that have been asked so often since the 1980s that they have almost become a belief system of their own, a fundamental theory would furnish us with answers to questions we haven't been asking... because nature, as you say, doesn't give a damn about our current ideas or what we would consider popular. In that respect, physics as currently practiced is broken, and I'm not so sure there are enough people around who really want to fix it... especially when the next particle collider needs to be built.

    1. Hi DJ
      I could not agree with you more: "They are all far from a fundamental expression of physical reality". Quantum gravity would be just a poor extension to a bunch of so-so theories. A fundamental theory will describe the basic actions of reality with basic physics concepts instead of a bunch of contrived math and concepts. Thanks for your thoughts.

  22. Hi Sabine !!!
    I hope your day is going well.
    I hope you, and all you know, are doing well.
    - please feel free to put this in your 'do not publish' file.

    The question you asked was whether we really need a Theory of Everything. I'm wondering too.
    Is there any other animal on this planet, besides human beings, that thinks such things?

    Honestly, I don't know.

    I cannot be in a house, or walk out of a house and lose the scientific mindset.
    Everywhere I look around me, it's amazing... and overwhelming.

    I've heard it said,'you can't see Disneyland in 3 days'.
    If we take that as a metaphor for three life spans on this planet, perhaps we shouldn't expect all the answers in the first one. lol.

    Thinking of you,
    Wishing you well.

    Love Your Work

  23. Please do not submit your own theories of everything to this thread.

    1. Hi Sabine,
      I would like to reach just Physicist Dave and I couldn't find a way to contact him. Once more, it is not my intention to submit my own theory on your blog.

      Please could you contact me via the following e-mail: I just want to send you (if you wish) the link to my work I submitted some days ago in a repository.

  24. Science postulates an objective reality independent of the observer. Mathematics describes this reality and allows predictions but is not reality. The history of science is that the resolution of paradoxes leads to a deeper understanding of reality. As I understand it, we have two excellent and powerful mathematical views of reality, quantum mechanics and relativity, but the problem is that they are incompatible. If both are true but incompatible, this suggests there is a deeper truth not yet discovered. Or perhaps there are hidden assumptions in each model that need to be addressed. Clearly the fundamental forces described by our mathematical models exist and can be utilized. This leads to a deeper question as to why these forces exist at all.

    1. @ Negate OPM,

      Science doesn't need to postulate an objective reality independent of the observer! It only needs reasonable agreement between theoretical predictions and experimental observations.

  25. Hello,
    Do you think theoretical physics should be ahead of experimental physics? If so, is it possible to quantify the gap and how big should it be?
    Take care,

  26. Study the classics!

    "Selected Papers on Quantum Electrodynamics", Julian Schwinger(Editor).

    This monumental collection of thirty-four historical papers on quantum electrodynamics features contributions from the twentieth century's leading physicists: Dyson, Fermi, Feynman, Foley, Heisenberg, Klein, Oppenheimer, Pauli, Weisskopf, and others.

  27. I recommend John Barrow's book Theories of Everything, which is mainly about what theories of everything won't tell us.

    Actually, I recommend all of his books.

  28. To the incompatibility of gravity and the other forces:

    The gravitational force is so different from the other ones that it should be imperative to assume that gravity is something completely different; rather than to continue the attempt to find any similarity. There are still important points to be worked out.

    It seems to be persistently overlooked that general relativity, and special relativity as its basis, contains logical conflicts.

    - Special relativity is incompatible with rotational motion. That was objected by Mach and Lorentz to Einstein. Einstein has conceded this to both, there is a documented letter exchange of Einstein with Lorentz. But Einstein did not take any consequences. The matter is open until today.
    - General relativity is based on the strong equivalence principle. But this was falsified in several cases.

    There are solutions which are presently not much in the view of the physical community:

    - Special relativity in the direction of Lorentz gives the same results as Einstein's version. But it assumes an absolute frame and does not need the Minkowski metric. And it avoids the mentioned problem with rotational motion. There is a lot of literature by well-known physicists who have shown the consistency of this approach with experiments.

    - General relativity, too, can be developed in a way consistent with the direction of Lorentz. Einstein himself started down this path in 1911 by assuming that the variation of the speed of light in a gravitational field causes the known gravitational phenomena, including the relativistic ones. But he abandoned this path after he made a mistake in his calculation. Einstein’s original approach of a variable c rather than curved space-time has meanwhile been confirmed as possible by several authors.

    Solving these issues should be accepted as due homework by the physics community before further attempts at a GUT and similar ideas are started.

    1. "- Special relativity is incompatible with rotational motion. That was objected by Mach and Lorentz to Einstein. Einstein has conceded this to both, there is a documented letter exchange of Einstein with Lorentz. But Einstein did not take any consequences. The matter is open until today.
      - General relativity is based on the strong equivalence principle. But this was falsified in several cases."

      Can you provide some links to articles in serious journals which support your claims?

    2. Phillip Helbig:

      Regarding special relativity and rotational motion: the letter exchange between Lorentz and Einstein about this topic is documented. You can find the letter text, and partial facsimiles of Einstein’s handwriting, in the book “Einstein and the Ether” by Ludwik Kostro. (This book is in any case very interesting for the history of relativity.)

      For the lack of a solution by Einstein there is no literature, as the absence of an action is normally not documented.

      Regarding the strong equivalence principle there are two examples. First: a charged object like an electron radiates when accelerated; an example is bremsstrahlung. But an electron at rest in a gravitational field does not radiate. Both are standard knowledge in physics.
      Second: there is time dilation in a gravitational field; there are very precise experiments on it. But there is no dilation with respect to acceleration; a proof was the muon storage ring at CERN. All these experiments are well documented.

      If you have trouble finding these documents, I shall be glad to help you.

  29. Wondering about a theory of everything: de Broglie found that matter has waves. What if the collapse of the wave function were actually a deeper spectrum, where going from one orbital to another resulted in what we call matter instead of the traditional light that we see in spectral lines? We could call them dark orbitals.

  30. Hi, Negate,
    "Who", "Why", "How", and "Where" certainly are questions in Physics as well as Journalism, Religion, Philosophy, etc. In this discussion, "Everything" seems to be a subset of "What" that deals with fundamental forces.
    To me, the question seems profound, but bounded from the POV of significance.
    Reflection does lead to a multitude of other questions, but do you really think they can be separated by "depth?"
    In this connection, are you certain you could defend a definition of "depth" in a debate?

  31. As a layperson, I found this video particularly clear in communicating both how to consider the subject of "A Theory of Everything," and how to properly convey respect for the talented scientists who work in that area. I appreciate the objectivity.

  32. Prof. David Edwards, thank you for the information on the incentive prize, and for your clarification on just how difficult it is to prove Feynman’s QED. Fascinating!


    Warner, thank you for tolerating my no-particles view!

    However, just to argue against myself (I know myself too well), I will now attempt to contradict my own argument by asserting this: The infinitesimal point model has been one of the most incredibly productive and powerful models in the multi-millennium history of mathematics and physics, and is not something to disregard or dismiss lightly.

    It was not until the advent of quantum mechanics that the impossibility of true points, at least in the physical universe, was recognized. But even after quantum mechanics was added into the mix, almost every analytical and logical situation in mathematics and physics still seemed to scream: “use points, Use Points, USE POINTS!” Einstein used points, de Broglie used points (and waves), Dirac used points (Dirac delta functions, anyone?), Feynman used points, Bell used points (and waves), and Bohm used points (and waves). Just about every early quantum founder, with the intriguing possible exception of Schrödinger, used points. Finally, quantum field theory is built around points. That is perhaps the most surprising example, since quantum uncertainty implicitly entangles infinite energy with every mathematical point used in field theory. This is presumably a factor in why point-based field theories such as QED tend to mosey off into la-la land if not applied with great care.


    Regarding why the universe persistently promotes points, my brother Gary gave me a great analogy yesterday: It’s as if mathematics and physics have together constructed a superb and extremely powerful analytical telescope, but everyone has been peering through the wrong end of it.

    What follows is… well, just an image? Certainly not a theory. An invocation of our visual cortex to help convey the impacts of rearranging interpretation priorities?

    Imagine that instead of infinitesimal point particles joining together to form a uniform cosmos, the exact opposite happened: a timeless, homogeneous entity of proto mass-energy shattered, for lack of a better word, under the emergence and exponential explosion of a new process, a crushing headwind called time. The pressure of this headwind flattened and scattered the fragments out over a complementary creation of space. As the fragments grew smaller, they became more resistant to this headwind of time, replacing simplicity with a complex dance of causal dynamics at finer and smoother scales.

    It is this pressure of causal time, with its relentless shattering of mass-energy into the smallest possible classical units, that is the deeper origin of our need for points. Time obliquely encourages us to accept not only the illusion of perfect points in physics, mathematics, and philosophy, but also the illusion of mathematically perfect differential smoothness when dealing with vast ensembles of fragments.

    However, the push towards points could only go so far before the fragments lacked sufficient mass, and thus enough resolution, to break into still smaller fragments. The quantum equilibrium limit had been reached. Atoms, nuclei, and nucleons are all examples of such equilibrium fragments. Externally they are subject to the flow of time, but internally they stubbornly resist it. At their edges lie complex phenomena such as chemistry, in which a little energy can alter and bind quantum states without destroying them.

    There is a final level of deviousness in this reverse-telescope image of the universe. Like software stored for later use, the rules for creating still smaller classical fragments stay intact after quantum equilibrium is reached. When activated by energy, these rules can instantiate and illuminate, however briefly, the finer details of the stable fragments. This is physics as virtual instantiation software… PAVIS.

    1. Hi Terry,

      thanks for your reply! At last, I've understood the acronym PAVIS, though not the underlying concept. :-)
      I have no idea how it could be mapped to QED.

      > "It is this pressure of causal time [...] that is the deeper origin for our need for points."

      I'm not sure I understand what you mean by causal time and how it came into existence. Is it (asymptotically?) the same as the familiar (ordinary) time? Are you talking about points in space, or in space-time?

      > "quantum uncertainty implicitly entangles infinite energy with every mathematical point used in field theory"

      I have a different view here. Energy is a derived concept, and "uncertainty" applies only to quantities that are not fundamental. The coordinates of primitive events are precise, and fuzziness only enters when patterns of events are interpreted as particles.

    2. Hi Werner,

      > … “At last, I've understood the acronym PAVIS, though not the underlying concept. :-)”

      This may help: Imagine that the Mandelbrot set has a built-in equilibrium level. There is no cost for generating details up to this equilibrium level, but beyond this level any further expansion of details becomes a local-only process that is both temporary and exponentially more costly as you dive deeper.

      The main impact of adding such an equilibrium level would be to make the details of the equilibrium level seem simpler, more “solid”, and more “particle-like” than any other level, since they would literally be both the smallest and simplest components of the overall set. This becomes the classical Mandelbrot set.

      Below this visible Mandelbrot set would be the potential for creating more details. Even though no such details exist yet, the rules still apply to these regions, creating the potential to explore more detailed levels of Mandelbrot reality. These unrealized potentials form the quantum part of the Mandelbrot set.

      PAVIS is just the assertion that the real universe works a lot like an equilibrium Mandelbrot set, only with a more complicated rule set that makes equilibrium inevitable.
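      The "equilibrium Mandelbrot set" analogy above can be sketched in a few lines of code. This is my own illustrative toy, not part of the original comment: the names `EQUILIBRIUM`, `escape_depth`, and `probe_cost` are invented here, and the exponential cost law is just one way to model "temporary and exponentially more costly as you dive deeper."

```python
EQUILIBRIUM = 30  # hypothetical "equilibrium level" (iteration depth)

def escape_depth(c, max_depth=60):
    """Standard Mandelbrot escape-time iteration z -> z*z + c.

    Returns the iteration depth at which |z| first exceeds 2,
    or max_depth if the point never escapes."""
    z = 0j
    for depth in range(max_depth):
        z = z * z + c
        if abs(z) > 2:
            return depth
    return max_depth

def probe_cost(c, max_depth=60):
    """Cost of resolving detail at point c: free up to the equilibrium
    level, then exponentially more expensive the deeper you probe."""
    d = escape_depth(c, max_depth)
    return 0.0 if d <= EQUILIBRIUM else 2.0 ** (d - EQUILIBRIUM)
```

In this toy, points that resolve at or above the equilibrium depth are "classical" and free to render, while probing sub-equilibrium detail (points deep inside the set) incurs a cost that grows exponentially with depth, mimicking the idea that finer-than-equilibrium structure is a temporary, energy-hungry, local-only process.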

      Thus there are no true particles in PAVIS, only rules that generate structure. Saying that physics is only rules sounds a bit radical these days. On the other hand, did the idea of a point-like spinning electron ever really make sense? The mathematical paradoxes of pointiness are largely dodged in a PAVIS universe.

      > … “I have no idea how [PAVIS] could be mapped to QED.”

      Pair-like rules such as the creation-annihilation operators a and a† are already PAVIS rules. Old Newtonian momentum action-reaction is also a PAVIS rule, and a rather important one since it deals with information. In both QED and generalized Feynman diagrams, emission processes are pair-production rules in which half of a virtual pair immediately transforms the emitting entity. That asymmetry also makes such emissions historic in the far field, meaning, amusingly, that Schrödinger's cat was classical from the moment the radioactive particle was emitted.

      > … “ ‘It is this pressure of causal time [...] that is the deeper origin for our need for points.’ … I'm not sure I understand what you mean by causal time and how it came into existence. Is it (asymptotically?) the same as the familiar (ordinary) time? Are you talking about points in space, or in space-time?”

      Causal time is classical time, by which I mean any process that leaves a detectable historical record. Electrons in orbitals are acausal, or quantum, so the timelessness of the proto-universe persists there. Emitted electrons, in contrast, are causal and part of historical time, since they leave (mostly) irreversible changes in their wakes.

      The fracturing process I mentioned still exists, but is much rarer now. We call it wave function collapse, with voluminous Rydberg atoms breaking down into a classical nucleus and electron being a nicely visual example. But most wave functions these days are at mass-volume equilibrium, and thus are quite stable. For example, given that it is the equilibrium-protected wave functions of electrons in condensed matter that define classical volume, chemistry, material properties, and electronic behavior, I am sometimes genuinely baffled why people persist in thinking wave functions are not “real”. It's the other way around: It is the idea of point-like electrons whizzing around “inside” such orbitals that is nothing more than a human abstraction, lacking any physical meaning until more energy is added.

      > … “ ‘quantum uncertainty implicitly entangles infinite energy with every mathematical point used in field theory’ … I have a different view here … fuzziness only enters when patterns of events are interpreted as particles.”

      I will admit that, to be honest, I didn't quite follow this part. You may have had earlier comments on it that I simply missed.

  33. Lawrence Crowell, thanks for an interesting perspective of Feynman’s work! You write well about such mathematical details, and I appreciate your insights.


    Lorraine Ford, yes, there is a whole fascinating discussion about the role of intelligence and cognition when defining concepts even as basic as time, let alone bits. I’ve had fascinating discussions with a philosopher friend of mine, Ronald Green, on just such topics. What I would say is this: When I talk about “bits”, I am talking about a binary-normalized version of a rather specific and conceptually non-trivial variant of statistically unrecoverable asymmetric decoherence in momentum pairs, those pairs being instantiated mainly by phonons in condensed matter, and by photons or momentum in space. Also, instead of saying “bits”, one could just as easily call it “persistence”, which is a property that virtual pairs do not come by easily. Creating two diverging arrows of causal time helps. But in any case, here “mass-energy” equals “bits of spatial resolution”. Entangle the bits locally and you get locality at multiple scales. Run out of bits and you get quantum mechanics.


    Austin Fearnley, that was an interesting read. I agree that color is more important than is generally recognized, but to get the fully generalized (and actually much simpler) version of color, you might first want to combine it with electric charge.

    That is, the simplest data model for the electric and color charges of fermions is to abandon separate color and electric charges and replace them with the six unit vectors of the (dual) Glashow cubes. These vectors correspond exactly to the total charges on each of the three down quarks in the Delta-minus baryon, and on the three anti-down quarks in the anti-Delta-minus baryon. Vector sums of these six Glashow unit vectors cover all possible fermion and anti-fermion charges. Hadrons — mesons and baryons — use quarks with raw Glashow charges that then combine in xyz space, giving them volume and exposing the strong force at close range. Electrons and positrons combine Glashow units inside the charge 3-space, and so have no visible structure or color in xyz. But since the charges of protons and electrons are constructed from the same Glashow charge units, they end up with identical long-distance electric charge magnitudes. (How to construct up quark charges with this data model is left as an exercise to interested readers.)


    PhysicistDave, I am jealous. My degrees are in computer science, so I never heard of Feynman or his works until about a year after he died. Yet I like what he said enough that my twenty-odd personal notebooks into which I’ve scribbled physics nonsense and ramblings for over two decades are all labeled “Studies of the Feynman Lectures”, whether that is apt or not. I especially liked his Volume III.

    Your comment about Feynman being better at math than he let on made me laugh out loud, because, well… it just fits. He remained proudly and steadfastly the Bronx brawler for his entire life, and he loved to sucker-punch folks who in his mind thought too highly of themselves. Hiding his math skills would have been a good way to get in some unexpected punches. I suspect that doing all those equations while sort-of watching strippers was another technique he had for impressing his colleagues. It was the same trick Fermi apparently used, having a slate of pre-fab answers that he could then pretend to come up with instantly and out of the blue.

    Still, Feynman’s overall view of math was… well, different, and inclined towards introspection of simple mysteries. He especially liked the breadth and compactness of Euler’s identity, e^(iπ)+1=0. I have little doubt that he spent many more hours contemplating that little mathematical navel than he ever spent preparing for his Nobel Laureate speech.

    1. Terry

      Yes, totally agree. We have also agreed previously on colour charges connecting QCD to QED. But what about connecting colour charge to the weak force (as this named thread is on the ToE)? Weak isospin is a form of electric charge, so it must be similar to a unit of RGB or rgb in your cube model, where u.c. is negative electrical charge and l.c. is positive charge? My model has weak isospin as a separate and independent entity, though.

      BTW to answer your quiz question, IMO a red up quark contains rgb + Rgb colour charge components, with net colour Red and net electrical charge
      4/6 = (1/6 + 1/6 + 1/6) + (-1/6 + 1/6 + 1/6)

      And what about gluons which contain colour charges?
      A red-antigreen gluon would contain Rgb and RgB, which together have zero electrical charge. More complicated gluons would have more such components. It seems so weird when explained using complex mathematical notation, but it is more straightforward to just add extra colour components into a single gluon and provide extra functions for the gluons.

      Austin Fearnley
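      The colour-letter arithmetic in the comment above can be sanity-checked with a few lines of code. This is my own sketch, not either commenter's code, and it assumes the stated convention that each lowercase colour letter carries electric charge +1/6 and each uppercase (anti-colour) letter carries -1/6; the function name `electric_charge` is invented here.

```python
from fractions import Fraction  # exact rationals avoid float round-off

def electric_charge(units):
    """Total electric charge of colour-letter units like ["rgb", "Rgb"].

    Convention (per the comment above): lowercase letters contribute
    +1/6 each, uppercase letters contribute -1/6 each."""
    total = Fraction(0)
    for unit in units:
        for letter in unit:
            total += Fraction(1, 6) if letter.islower() else Fraction(-1, 6)
    return total

# Red up quark, per the comment: rgb + Rgb -> 4/6 = +2/3
assert electric_charge(["rgb", "Rgb"]) == Fraction(2, 3)

# Red-antigreen gluon: Rgb + RgB -> zero electric charge
assert electric_charge(["Rgb", "RgB"]) == 0
```

Using `Fraction` keeps the sixths exact, so the checks reproduce the quoted sums term for term rather than approximately.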

    2. Terry,

      Are you talking about a Boltzmann brain? Anyway, I think it is misleading to appropriate the word “bits”.

      What I mean by algorithms, as opposed to equations, is: algorithms represent steps taken (i.e. lawful outcomes) in response to information situations; but equations represent lawful outcomes “in response to” information relationships. One needs to separate out the elements: discontinuous steps are different things to continuous relationships; relationships can’t take account of a wider situation, but algorithms can.

      You said: “it is the QED algorithms, not the QED equations, that are the simpler and more accurate representation of how physics works”. So how would you describe “algorithms” as opposed to “equations”?

    3. Lorraine,

      > … “Are you talking about a Boltzmann brain?”

      Absolutely and unequivocally no! That whole idea just makes my brain hurt, and not in a good way… or in a Boltzmann way either… :)

      > … “Anyway, I think it is misleading to appropriate the word ‘bits’.”

      Hear me out on this, and see what you think: The reason I keep using the word bits for something as seemingly unrelated as asymmetrically decohered virtual momentum pairs (just rolls off the tongue, doesn't it?) is because such pairs really do implement a form of classical information. That in turn means that there is always a way to express their contents in log base 2 form, that is, binary or bits.

      I can get a lot more specific than that though: Set up an electron two-hole interference experiment. Calibrate it so that each electron has a 50/50 chance of going through either hole.

      If you do this in the dark so as not to collapse the electron's wave function, the wave remains coherent — uncollapsed — and the universe contains no record of how the electron went through the apparatus.

      Add enough light to detect which hole the electron goes through, and this situation abruptly changes. If you label the left hole 0 and the right hole 1, then for each electron now tested you will add one bit of information, a 0 or 1, to the classical history of the universe, changing it forever and slightly reducing its possible range of future histories…

      By one bit, exactly, for each electron.

      So, not only do I mean that the historical outcomes of quantum experiments can be expressed as bits because they are information in general, I mean that you can set up such experiments so they literally add exactly one bit of new history — of limits on possible future histories — with each experiment done.
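      The "exactly one bit" claim above follows from Shannon's entropy formula: a fair two-outcome record carries log2(2) = 1 bit. Here is a minimal sketch (my own illustration, not the commenter's; the function name `which_path_entropy` is invented):

```python
import math

def which_path_entropy(p_left):
    """Shannon entropy, in bits, of the classical which-path record
    left by one detected electron (two outcomes: left hole, right hole).

    H = -sum(p * log2(p)) over the two outcome probabilities."""
    h = 0.0
    for p in (p_left, 1.0 - p_left):
        if p > 0:
            h -= p * math.log2(p)
    return h

# A 50/50 apparatus writes exactly one bit of history per electron:
assert which_path_entropy(0.5) == 1.0

# A biased apparatus records less than one bit per detection:
assert which_path_entropy(0.9) < 1.0
```

Note the side point the second assertion makes: only the calibrated 50/50 setup adds exactly one bit per electron; any bias toward one hole makes each detection worth less than a full bit.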

      > … “What I mean by algorithms, as opposed to equations, is: algorithms represent steps taken (i.e. lawful outcomes) in response to information situations; but equations represent lawful outcomes ‘in response to’ information relationships. … You said: ‘it is the QED algorithms, not the QED equations, that are the simpler and more accurate representation of how physics works’. So how would you describe ‘algorithms’ as opposed to ‘equations’?”

      We may be more in agreement than it seems? Or not? Let me try:

      In PAVIS, the fundamental equations of the universe are all creation and annihilation of pairs, some of which are transient and some of which are enduring. Creation-annihilation rules in physics are not just “like” PAVIS rules, they are PAVIS rules. If there is a difference, it’s that they create bundles of absolutely conserved properties that are never assumed to be points, just regions.

      What I suspect is going on in QED is that framing its equations in terms of point-like particles is a form of noise. The computational pragmatism of the algorithmic forms of QED necessarily strips a lot of that noise out, causing the algorithms to be a simpler and more direct expression of the real rules behind QED than the particle-based formulations. I also suspect the real rules to be pair-like in form.

      So I am certainly not denying equations or absolute relationships. If anything I am suggesting that physics is nothing but such relationships, sans particles. But I am also saying that none of these rules will mention particles directly, and will instead only suggest how to create smaller and smaller particle-like regions in space. QED currently tries to make points real, and that unavoidably makes it a “noisy” theory. A better start would be to begin with algorithms that work, and then rewrite them using nothing but property creation-annihilation rules. I am not claiming that would be easy, but one way or the other, I don’t think QED will ever be rid of its anomalies until it switches to a more rule-based, particle-free way of describing the algorithms that already work so well.


    4. Lorraine wins this one. In a tenth of the words.

    5. Terry,

      I think there is no such thing as “one bit” or “bits” of information. The world always has at least 2 items of information: 1) the category of information (which seems to embody a lawful relationship) e.g. mass or velocity; and 2) a number or numbers associated with that category. Without an authentic category, a number like “0” or “1” is meaningless: it’s not information.

      There is also the issue of how information (categories and numbers) exists. Re “physics is nothing but such relationships, sans particles. … none of these rules … mention particles directly”: I would have thought that the particle is “missing” because it is the thing from whose point of view a lot of the information (the rules/ relationships/ categories and numbers) exists. Also, on the subject of how and where information exists, I doubt that probability is a category of information that exists from the point of view of particles, atoms or molecules: it might only really exist from the point of view of human beings.

      But is new information ever added to the world? I think that quantum mechanical events add new number information to the world when the number is not a consequence of existing information relationships because a “step” has been taken. If the number is a genuine consequence of existing information relationships, then no new information has been added to the world.

      So I would think that, irrespective of their individual content, equations represent the fact that relationships exist in the world, and algorithms represent the fact that “steps” exist in the world. So I think we agree that “steps” exist. :)

    6. Korean War Photo Documentary said:
      "Lorraine wins this one. In a tenth of the words."

      ROFL, and the round goes to Lorraine! KWPD, I think I resemble that remark!

    7. Lorraine Ford,

      Well, the discussions here have convinced me to start writing a book explaining why naive physicalism fails from the viewpoint of mathematical logic, electronic circuit design, and contemporary physics. Assuming I finish it (!), I plan to credit you and Steve Evans for helpful conversations, unless either of you objects.

      One meta-comment: you and I think that physics as it now exists cannot explain consciousness. Some of our friends here disagree.

      It is not really our job to convince them.

      After all, if they are right, then sooner or later they should indeed be able to use physics to explain consciousness with the same level of detail that, for example, astrophysics can explain how stars shine or biochemists can explain how photosynthesis works.

      No informed person thinks they can do that today.

      A few years ago, I raised this issue with Gerry Schneider, a senior neuroscientist at MIT, without revealing my own view (he actually thought I was a physicalist since he knew I am a physicist): Gerry found it amusing that anyone was silly enough to think that we are anywhere near to a physical explanation of consciousness.

      So, for some years to come, the ball is in their court: they are sure something can be done that certainly cannot be done today or for the foreseeable future.

      The one fear I have long had is that naive physicalism would retard the progress of neuroscience by convincing researchers that a very difficult problem was no problem at all.

      But based on my discussion with Gerry and my reading of the literature, I think that is now less of a risk. Nowadays, it tends to be very old fellows like JimV who cannot see that the “hard problem of consciousness” is indeed hard.

      I am also intrigued as to why some people have such a religious commitment to the idea that they themselves are, in effect, robots. Potentially, that is very dangerous: if consciousness is just some sort of illusion, then why worry at all about other people's feelings?

      I think that was a real danger with people who adored B. F. Skinner: if all that matters is external “behavior,” then nothing is wrong as long as we can force people to “behave” as if they are happy.

      But, again, I think there are few Skinnerians left.

      JimV just admitted that he has no interest in the question of what things are conscious, which I think makes him rather eccentric among human beings: I find that most people wonder whether a zebrafish is conscious, or a ladybug, or an amoeba.

      After all, it matters morally: the reason it is morally wrong to torture a dog is that most of us are pretty sure that dogs have the experience of pain pretty much as we do. It is not morally wrong to “torture” an azalea bush or a laptop computer because most of us are pretty sure that azalea bushes and laptop computers have no experience at all, neither positive nor negative.

      So, morally, JimV really does need to care. In practice, I strongly suspect he does: e.g., that he does not torture dogs but also does not worry about whether he is torturing his azalea bush!

      Anyway, the exchange here is useful to see why people think as they do. But as long as their mistaken beliefs about consciousness do not cause them to torture dogs (or be anxious about causing pain to their laptop computer), we have no obligation to convince them of their errors.

      Time, eventually, will do that.

      All the best,



    8. Dave, Terry, Lorraine, Sabine,


      With that one concept you may have given the basis for an answer to the title question of this blog topic.

      Best thanks to you and all who expressed your thoughts here.


    9. Somewhat OT set of questions for consciousness aficionados: why does it exist in the Animalia Kingdom only? Are there species of animals which are not conscious? When, in the evolution of animals, did consciousness first arise? Is consciousness an example of convergent evolution (or, do Octopoda and Primates have the same sort of consciousness)?

    10. Terry,

      Re Terry Bollinger 9:40 AM, July 17, 2020:

      I can’t think of anything witty to say to that. I presume that you now agree that binary digits can’t exist in the world; binary digits are not information because they lack a category, where a category (e.g. mass, velocity) is essentially a lawful relationship.

    11. Dave,

      Re PhysicistDave 2:24 AM, July 18, 2020:

      Thanks for the vote of confidence.

      I think that this is the state of the human mind/ brain: that you can spell out in detail how computers (or the climate) operate, but nevertheless people like JimV hold on to their beliefs about what is happening. People like JimV don't understand the difference between genuine information (which is what living things process) and symbols of information (which is what computers process).

      But it is worse than that. There are groups like the Future of Life Institute that are (unintentionally) spreading fake news about computers/ AIs; and there are truly absurd articles and discussions in the media about the supposed potential consciousness, intelligence and/or feelings of computers/ AIs.

    12. Lorraine Ford wrote to me:
      >But it is worse than that. There are groups like the Future of Life Institute that are (unintentionally) spreading fake news about computers/ AIs; and there are truly absurd articles and discussions in the media about the supposed potential consciousness, intelligence and/or feelings of computers/ AIs.

      Well, one of the really scary things is the talk about “uploading” consciousness into a machine. No evidence at all that this will work, and of course no one familiar with either neuroscience or computer technology could possibly think it can be done in the foreseeable future.

      But I am afraid some idiots are going to try somehow to do it anyway, not realizing that they are just committing suicide.

      The entrepreneur Martine Rothblatt, founder of Sirius Radio, is actually trying to upload Martine's spouse into a computer. The good news is that this seems not to require the destruction of the real flesh-and-blood spouse!

      But someday, it is going to dawn on someone that you cannot really upload the brain without dissecting the brain to find all the neuronal interconnections: this is not going to be good for the flesh-and-blood brain.

      All the best,


    13. Dave,

      Re Martine Rothblatt’s “head on a shelf”:

      I think there is an idea that information could be independent of substrate. Computer/ “information” science has somehow promoted this type of idea. But the truth is that many different substrates and codes can be used to symbolically represent information, and that these symbols of information only represent information from the point of view of human beings. The symbols of information are not themselves information.

      The conscious and unconscious “information content” of a brain, with its neurons and complex molecules and interactions, can never be uploaded to electrical circuits and transistors because information (as opposed to symbols of information) can never be independent of substrate.

    14. Lorraine Ford wrote to me:
      >The conscious and unconscious “information content” of a brain, with its neurons and complex molecules and interactions, can never be uploaded to electrical circuits and transistors because information (as opposed to symbols of information) can never be independent of substrate.

      Well... here is a thought experiment that has, I assume, occurred to many people independently:

      Suppose you take one neuron out of some guy's brain and replace it by an electronic prosthesis that makes all the correct axonic and dendritic connections the neuron used to make, and processes the incoming signals the same way the neuron did.

      I think we are not too far from being able to do this for some simple neural systems (probably not for higher mammals, though).

      It seems that with just one neuron replaced, and the prosthetic artificial neuron doing what the old neuron used to do, the guy should think just as well as before.

      I assume you see where this is going...

      Now replace a second neuron and then a third... and finally a hundred billion.

      Either there is a breakpoint at which it stops functioning or you end up with the guy now purely electronic but thinking he is still fine.

      And now the real kicker...

      Slow down the clock speed of the electronics – the guy should still think but much more s-l-o-w-l-y.

      Now replace one of the (very, very slow) electronic neurons by a human clerk acting similarly to Searle's “Chinese room” clerk. And then a second neuron... and finally all of them.

      You end up with an insanely large bureaucracy of clerks who now collectively constitute the guy's “mind.”


      So what does this show?

      I don't know.

      The conclusion seems even sillier than Searle's conclusion, and, of course, Searle viewed the conclusion of his argument as a reductio ad absurdum – obviously and absurdly false.

      On the one hand, this could prove that materialism and functionalism are obviously false: i.e., those are the initial assumptions that must be wrong – there was something other than neurons there from the beginning.

      Or, conversely, determined physicalists/materialists/functionalists could argue that this shows that the hundred billion Searlean clerks actually do continue to constitute the collective mind of our poor experimental subject.

      This thought experiment was invented back around 1980 in conversations between myself and a fellow physics grad student, Harry Orbach, at Stanford. It so disturbed Harry that he changed careers as a result, from theoretical physics to neuroscience!

      However, the argument is so obvious to anyone familiar with Searle's argument that I assume it has been invented independently many times (anyone who has seen it elsewhere, let me know).

      Anyway, it has caused me to scratch my head for forty years. The only conclusion I am sure of is that I do not understand consciousness, and almost certainly neither does anyone else.

      All the best,


    15. PhysicistDave: Consciousness doesn't exist in the neurons; it exists in the arrangement and interlinkage of the neurons. It is a pattern of interaction.

      If I lay all the individual parts of a car out on benches, is it a car? No, it is a collection of parts. A bare piston or bolt has no "essence of car" in it. The parts must be assembled into a semi-specific order to be a car.

      The same with the brain. Imagine using some Star Trek transporter technology we can take every individual neuron out of the brain and isolate it, still alive and intact, in its own little jar on a bench. We'll keep the rest of the body alive by artificial means. Is that a person?

      No. It is a collection of parts, a potential person, but not a person, not conscious, incapable of feeling or thinking. There is no "essence of consciousness" in an individual neuron; it is a pattern matching machine with no inputs and no way to produce outputs.

      But when we put it back together, in the original order, we have an arrangement of parts capable of interactions, in endless cycles until something breaks. That is all there is to consciousness; cycles of thought and self-interaction, with memories and senses intruding.

      Just like your replacement notion, there is no reason to think there is anything special about any individual neuron. They are semi-reliable pattern matchers. They are not perfect, they run short of materials, they build up waste that interferes with their operation and must be eliminated (one purpose of sleep, recently discovered), and they can fire or fail to fire for no discernible reason. They can die by the thousands without us noticing.

      The secret is not in the neuron, the secret is in the semi-specific and robust arrangement of them, that permits this continuous cycling of interactions.

      (Robust in the technical sense; unlike a CPU in which a single bad transistor can kill it, the brain works in so many parallel cycles it can tolerate extensive neural dying without completely failing; we see this in Alzheimer's patients, cancer patients, alcoholics and other people with physical brain injuries.)

    16. Dave,

      You’ve got it wrong. In fact, Searle’s Chinese Room argument is that a computer processing symbols can’t decipher what the symbols represent, just like non-English-speaking Chinese clerks processing English language symbols can’t decipher what the symbols represent.

      So assuming that a computer could identify appropriate sets of its own voltages, the computer next needs to decipher what the voltages represent. Something like: higher voltage, higher voltage, lower voltage, higher voltage… might represent the word “tree” in the English language, where the letters of the word have been re-represented as binary digits (i.e. voltages) according to some man-made convention and code. The computer doesn’t know English; doesn’t know the convention or code; doesn’t know that individual higher (or lower) voltages are part of a code; and doesn’t have any lived experience of trees anyway. Also, a computer is not able to spend time or energy identifying and deciphering sets of symbols: this is not the procedure that the computer was set up to do. In other words, Searle is correct: a computer can’t decipher what the symbols represent.

      Re “You end up with an insanely large bureaucracy of clerks who now collectively constitute the guy's ‘mind.’”: No, they don't. The guy's physical brain, and what the guy experiences, are not the same as a whole lot of clerks running around. The information that is processed in the brain is presumably entirely law-of-nature lawful information; a whole lot of clerks running around can only hope to REPRESENT law-of-nature lawful information.

      And this gets back to the point I was making: materials, molecules and cells have unique lawful properties and abilities that can’t be reproduced by other materials, molecules or cells. Your supposition [1] is not valid. Lawful information (as opposed to symbolic representations of information) can never be independent of substrate.

      1. “Suppose you take one neuron out of some guy's brain and replace it by an electronic prosthesis that makes all the correct axonic and dendritic connections the neuron used to make, and processes the incoming signals the same way the neuron did.”

    17. Lorraine, you seem to be saying that since binary digits (answers to true/false questions) require a category (i.e., a question) to be informative, binary digits don't exist.

      If that is logical then so are the following claims: fish need water to exist, therefore fish don't exist; trains need tracks to be useful, so trains do not exist; and so forth.*

      The fact is, true/false questions do exist and therefore so do binary digits (their answers). In saying that binary digits do not exist while admitting that they exist in combination with categories you are making an illogical claim--assuming there is a real world. If the world is something you imagined, then yes, nothing exists. But then why are you arguing with yourself?

      Since you and Dr. Miller are raising the issue of consciousness yet again, the answer to that is just as logical. To conclude it is not a natural consequence of the known laws of physics, and specifically due to the computing and memory abilities of neurons and synapses, raises questions which you have not answered: what are the magic properties, outside of standard physics, by which minds arise, and why can't we find them in any of our experiments; why do mental abilities in flatworms, cats, dogs, chimpanzees, and humans seem to scale with their numbers of neurons; why do feral children (raised in the wild without parents or any human training) lack the capability to learn language and social practices; and why do neural networks in digital computers, given human parental initialization and practice, do so well at mimicking human abilities such as playing Go? (See Dr. Scott Aaronson's recent post at "Shetl-Optimized" for another example.)

      Given that all the above observations are consistent with the simple hypothesis that minds/consciousness arise from the known properties of neurons and synapses, and therefore could be simulated by electronic circuits in computers, the burden is on you to propose a more explanatory hypothesis, which fits all the above evidence plus more.

      Until then your objections are all of the same form as saying that since weather is too complex to be accurately calculated far in advance, there could be some unknown magic involved, hiding in the gaps of our knowledge; that is, when you are not simply assuming what you need to prove, e.g., that patterns of neuron settings in human brains are information and patterns of bits in computers can never be. (Even the ones they generate themselves, e.g., by practicing Go? Nope, not True Scotsmen.)

      All of this has been expressed to you so many times that I am embarrassed to repeat it, albeit for the sake of new onlookers.**

      * (footnote) this method of applying analogies (a form of pattern-matching) is how I have always done logic, which is to say logic is not something innate in me but something learned by example, and greatly facilitated by having been taught a language with which to frame the analogies. This seems to me to be something a computer could learn to do also.

      ** I wonder if either of us would pass a Turing Test based on our dialog.

    18. I just noticed a comment from Dr. Miller which represents some issues, involving me, in a way that I see as inaccurate. This response will probably suffer the same fate.

      "Nowadays, it tends to be very old fellows like JimV who cannot see that the “hard problem of consciousness” is indeed hard."--Dr. Miller

      It is very hard in the sense of "why does a rose smell like a rose" or "why does F=MA" is hard, along with a trillion other things that philosophy cannot solve. It is less hard if instead of hoping for a philosophical breakthrough, one accepts that certain things empirically exist and consciousness is one of them. Then one gets to work looking for empirical correlating factors, such as numbers of neurons, to determine how it works physically, not why it exists.

      As for the age of adherents, I would claim Dr. Turing (much older historically), and many other theoretical computer scientists and not a few neuroscientists on my side of the issue (most younger).* I don't expect such arguments from authority to sway anyone, however, and prefer just to cite the evidence as I see it myself.

      "JimV just admitted that he has no interest in the question of what things are conscious, which I think makes him rather eccentric among human beings ... After all, it matters morally ..."--DM

      Citation needed. Less interest in how consciousness "feels" than in how intelligence works in general is not "no interest in what things are conscious". In fact I think my side of the argument is where evidence of thinking ability in flat worms and dogs and monkeys has been cited. (Mostly in previous posts at this site. "I thought I was out but they are dragging me back.")

      As for reflections on my moral consideration of others, if it pleases people to dismiss my arguments on the grounds that I am a bad person, and that only bad people hold such views (that a computer with the sensory ability and 70+ billion-neuron-processing ability and personal training and practice of a human could possibly think as well as a human), I guess that is their prerogative. As it is mine to feel that such arguments are mirrors which reflect on their side as well.

      * According to Scott Aaronson in “Can Computers Become Conscious?”, May, 2016:

      "I should start by explaining that, in the circles where I hang out—computer scientists, software developers, AI and machine learning researchers, etc.—the default answer to the title question would be “obviously yes.”"

      He goes on to demolish some of the arguments against the proposition, but then admits there are some things about consciousness (as to how it might or might not work in some extreme circumstances) he does not understand, which might require more brain knowledge to understand, but concludes that "At the same time, I also firmly believe that, if anyone thinks that way, the burden is on them to articulate what it is about the brain that could possibly make it relevantly different from a digital computer that passes the Turing test. It’s their job!" So he is not a dualist, but thinks brain biology could possibly contain some mechanism which is not currently present in digital computers, for all he knows. Dr. Aaronson is about 35 years old.

    19. Lorraine Ford wrote to me:
      >You’ve got it wrong. In fact, Searle’s Chinese Room argument is that a computer processing symbols can’t decipher what the symbols represent, just like non-English-speaking Chinese clerks processing English language symbols can’t decipher what the symbols represent.

      Oh, I think we agree as to the point Searle was making.

      Yes, Harry's and my argument is making a different point: indeed, I am not quite sure what point our argument does make, except that the experiment would be interesting (at least if you are not the experimental subject!).

      But I think you can see how we were inspired by Searle's thought experiment, so I did have to credit Searle.

      Lorraine also wrote:
      >The guy’s physical brain, and what the guy experiences is not the same as a whole lot of clerks running around.

      Well, yes, that's the point: the conclusion seems absurd, but each step in the argument seems legit. So, what is going on?

      Lorraine also wrote:
      >[Lorraine] Your supposition [1] is not valid. Lawful information (as opposed to symbolic representations of information) can never be independent of substrate.

      >[Dave] 1. “Suppose you take one neuron out of some guy's brain and replace it by an electronic prosthesis that makes all the correct axonic and dendritic connections the neuron used to make, and processes the incoming signals the same way the neuron did.”

      Well, you know, we are not far away from replacing peripheral neurons with prosthetic artificial neurons that will transmit data back and forth between sensors and actuators and the central nervous system. I expect those to work. Don't you?

      Will it really make any difference at all if we just replace one or ten or a hundred neurons in the CNS with artificial neurons that interface properly with all the natural neurons that remain? I do not know of course, but I would guess that would work.

      But of course the “whole lot of clerks running around” is clearly not a brain.

      So, at which step do you fail to have a mind?

      By the way, this is a neuroscience version of what philosophers call the “Ship of Theseus” paradox.

      I do not see this as a semantic or logical issue but a real empirical question. If you actually did the experiment, which may someday be possible, something would happen.

      I'm just not sure what that would be.

      Maybe at some point your mind blinks out like a light turned off. Maybe your mind gradually dims as you replace one neuron after another like a light that is turned off by a dimmer. Or maybe your mind somehow clings on to the “whole lot of clerks running around” (no, I don't believe that, but Nature does not care what I believe).

      Or, if we want to pretend to be Descartes, maybe your “soul” finds it harder and harder to “hook in” to your brain as you replace more and more flesh-and-blood neurons with silicon.

      Or whatever.

      Again, I just do not know.

      But it does seem to me that any correct theory that really solves the mind-brain problem has to have detailed (and correct) answers to the questions raised by my and my friend's argument.

      All the best,


    20. JimV wrote:
      >The fact is, true/false questions do exist and therefore so do binary digits (their answers). In saying that binary digits do not exist while admitting that they exist in combination with categories you are making an illogical claim--assuming there is a real world. If the world is something you imagined, then yes, nothing exists. But then why are you arguing with yourself?

      Jim, physics (established physics, not Max Tegmark's fantasies) does not know about bits.

      I know you have experience programming computers going back almost as far as my own. (You ever worked on IBM 1620s? Truly bizarre architecture!)

      But, besides programming, I also have experience designing digital circuits and, specifically, designing error-detection-and-correction (EDAC) systems.

      And the whole reason we need EDAC is that bits do not exist.

      “Bits” are an abstraction, an approximation we humans impose on the real world, which is always analog, never truly digital. What engineers call the “digital abstraction” sometimes works fairly well, but it is never really true, which is why we need EDAC.

      My own work in engineering consisted largely of dealing with situations in which the “digital” signal was not so digital, and trying to use various techniques to take the indecisive analog signal and (hopefully) convert it to a signal that is more truly digital. Lots of clever techniques have been worked out since Richard Hamming got the ball rolling seven decades ago, but none are perfect (and none ever will be).
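      The kind of scheme Hamming pioneered can be sketched in a few lines. Here is a toy Hamming(7,4) encoder/decoder (my own illustration, not code from any actual EDAC system): four data bits are protected by three parity bits, and any single flipped bit can be located and corrected.

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit
    if syndrome:
        c[syndrome - 1] ^= 1          # flip it back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[5] ^= 1                      # simulate a single-bit error
assert hamming74_decode(codeword) == data
```

      Note that the code only works because the underlying analog voltages have already been thresholded into nominal 0s and 1s; the correction handles the occasional case where that thresholding came out wrong, which is exactly the point that bits are an imposed abstraction.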

      Reality is not digital; reality is analog.

      JimV also wrote:
      >why do neural networks in digital computers, given human parental initialization and practice, do so well at mimicing human abilities such as playing Go?

      Why do wax figures look so much like real people? Because we humans craft them to!

      A hammer can imitate a fist. A hammer is not a fist.

      A simulation is not the thing simulated. A simulation of an airplane does not fly.

      JimV also wrote:
      >Given that all the above observations are consistent with the simple hypothesis that minds/consciousness arise from the known properties of neurons and synapses, and therefore could be simulated by electronic circuits in computers …

      Your “therefore” is not a “therefore” – you think it obviously follows but it does not. You are assuming what has to be proved: i.e., that it is the “functional” properties of neurons, as viewed in the digital approximation, that cause consciousness.

      Maybe. Maybe not. You are guessing, without evidence.

      You are reasoning just as theists reason to God. Seems good to them, but no logic involved.

      Jim, I think fundamentally what is going on here is that what Lorraine and I and most people who are concerned with the problem of consciousness are talking about is the issue of the “interior experience” that we all have and are all certain that we have regardless of what any external observation may show.

      You, on the other hand, have made very explicit that all you care about is the externally observable behavior of humans, computers, etc.

      You just are not interested in discussing what we are talking about.

      I've seen similar behavior in which atheists announce that they are only interested in matters of this world and therefore God does not exist.

      But of course that says nothing about whether God exists: it is merely an announcement that they are not interested in the topic.

      You are taking the same tack: you just do not wish to address the topic of interior experience.

      Your choice. But then you are not talking about the topic that is generally referred to as the problem of consciousness.


    21. JimV wrote:
      >According to Scott Aaronson in “Can Computers Become Conscious?”, May, 2016:

      >[Scott] "I should start by explaining that, in the circles where I hang out—computer scientists, software developers, AI and machine learning researchers, etc.—the default answer to the title question would be “obviously yes.”"


      Unfortunately, in my experience, such people, with some exceptions, tend to know very little about neuroscience, physics (especially the physics of semiconductor devices), or the discussions that have occurred among scientists and philosophers on this subject for the last century.

      When I have tried to point them to such discussions by very eminent scientists (e.g., Nobel laureates such as Wigner or John Eccles), they tend to contemptuously refuse to read them.

      Comp sci folks are riding high now because the world runs on computers and because they are very highly paid. Unfortunately, some of them have become very arrogant as a result, thinking that they created the digital world we live in, when it was really created by physicists and electrical engineers.

      I suppose that Catholic priests similarly rode pretty high... before the Reformation.

      JimV also wrote:
      >As for reflections on my moral consideration of others, if it pleases people to dismiss my arguments on the grounds that I am a bad person, and that only bad people hold such views (that a computer with the sensory ability and 70+ billion-neuron-processing ability and personal training and practice of a human could possibly think as well as a human), I guess that is their prerogative.

      Nope. What worries me is your eagerness to disown any interest in the interior experience of other human beings: if you took this seriously... well, there are moral consequences.

      In fact, I am pretty sure you are inconsistent on this. But younger folks may be more consistent than guys who think this way in your and my generation. And that could be scary.

      Jim also wrote:
      >"[Dave] JimV just admitted that he has no interest in the question of what things are conscious, which I think makes him rather eccentric among human beings ... After all, it matters morally ..."--DM

      > [Jim] Citation needed.

      Sure: right after that, Jim wrote:
      >Less interest in how consciousness "feels" than in how intelligence works in general is not "no interest in what things are conscious".

      That is what worries me: for over three centuries, back to Descartes, people who discuss the mind-body issue have made clear that what they are concerned about is not “how intelligence works” but indeed “how consciousness 'feels',” i.e., the nature of our “interior experience” and how this connects to physics.

      People who only care about how humans “work” vs. how they “feel” – which includes pain, anguish, etc. – well, we have had way too much of that in the last hundred years. You are probably a very nice person yourself, but encouraging people to focus on how humans “work” and not on how they “feel” seems to me to have led to very, very bad results.

      But, yes, at some level that is irrelevant. You are free to have an interest only in how humans work and not how they feel, though perhaps the two subjects are not as disjoint as you think.

      But then you are not talking about the mind-body problem as it has been discussed for over three hundred years. From Descartes to Wigner and Eccles to Searle to Chalmers and McGinn to some of us here, we are talking about the issue of how those interior feelings are possible in a universe that is, certainly, largely physical.

      You do not have to participate in that discussion. But, if that is your intent, it would be nice for you to own up to it.

      And, yes, I do think that, in moral terms, if you manage to convince most young people to make the same choice, the result will not be good for humanity.

      At least, that is what I see when I look at the last hundred years of human history.


  34. Great book:

    "QED and the Men Who Made It", Silvan S. Schweber

  35. Cool one, Sab! Great method to put all and sundry in their place.

    Hey, that shirt makes you look like you have some facial hair growth, which actually goes with the color of the shirt, and is a quite topical conversation in itself.

    So a 2-for-1 video, very economical!

  36. @ Andrei,

    Heisenberg's 1927 uncertainty relations were derived using only the old quantum theory (1900-1925), after the new quantum theory (1925+) had already been developed, in order to motivate the existence of incompatible observables in the new theory.

  37. @ Andrei,

    "Explanations" are easy to come by! Coherent ones coming systematically from a mathematically well-defined theory are much harder to obtain. And you don't have one, while I do!

    1. Prof. David Edwards,

      In order to show that your argument is not sound I do not need to provide a "mathematically well-defined theory", only to point out that at least one premise is not proven to be true. Please present your argument (or a link to a place where the argument is presented) that "there is no "physical world" in the classical sense" and I will quickly point out the faulty premise.

    2. @ Andrei,

      Also see:

      "Geometry of Quantum Theory", Varadarajan

    3. Prof. David Edwards,

      I've looked into Varadarajan's book (second edition). I could not find an argument that leads to the conclusion that "there is no "physical world" in the classical sense". Please help me locate it!

      However, I did find an assumption that, it seems to me, is at the root of the distinction between classical and quantum logic. On page 7 we read:

      "The point is that only the experimentally verifiable statements are to be regarded as members of the logic of the system. Consequently, as it happens in many questions in atomic physics, it may be impossible to verify experimentally statements which involve the values of two physical quantities of the system - for example measurements of the position and momentum of an electron. One can verify statements about one of them but not, in general, those which involve both of them."

      This is all fine, but where is the discussion about what classical properties can be experimentally verified? Even a classical electron will interact with a narrow slit composed of classical charges, and its momentum will be changed as a result. Sure, in principle you could compensate for this if the positions/momenta of all charged particles in the slit are known. But how would you know them? You cannot measure them because you still don't know the exact state of the instrument, and so on, an infinite regress. So, in fact, if one also restricts the meaningful classical statements to those experimentally verifiable, you would end up with a "classical logic" that is not Boolean.

      So, my conclusion would be that the non-Boolean character of "quantum logic" tells us exactly nothing about the physical world. It's simply an artifact of the arbitrary decision to label statements that are not experimentally verifiable "meaningless".

    4. @ Andrei

      See "Quantum Logic and Probability Theory" in the Stanford Encyclopedia of Philosophy. It should be there! Or read my essay "The Mathematical Foundations of Quantum Mechanics" on my web page.

    5. @ Andrei,

      The non-embeddability of the standard quantum logic in a Boolean logic is an immediate corollary of Gleason's theorem; if it were so embedded, there would be dispersion-free measures on it!

    6. @ Andrei,

      If there were a "physical world" in the classical sense, then there would be a God's eye view of it, and hence an embedding of the quantum logic into a Boolean Logic.

    7. @ Andrei,

      Also see the Kochen–Specker theorem in Wikipedia:

      "It turns out to be impossible to simultaneously embed all the commuting subalgebras of the algebra of these observables in one commutative algebra, assumed to represent the classical structure of the hidden-variables theory, if the Hilbert space dimension is at least three."
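      The content of this kind of no-go result can be illustrated by brute force with the Peres–Mermin square (a standard textbook example, added here purely as illustration): no fixed assignment of ±1 values to its nine observables reproduces the quantum predictions that every row multiplies to +1 while the columns multiply to +1, +1, and -1.

```python
from itertools import product

# Try every noncontextual hidden-variable assignment: a fixed value of
# +1 or -1 for each of the 9 observables in the Peres-Mermin square.
solutions = 0
for v in product([-1, 1], repeat=9):
    g = [v[0:3], v[3:6], v[6:9]]          # 3x3 grid of hypothetical values
    rows_ok = all(r[0] * r[1] * r[2] == 1 for r in g)
    cols = [g[0][j] * g[1][j] * g[2][j] for j in range(3)]
    if rows_ok and cols == [1, 1, -1]:
        solutions += 1
print(solutions)  # 0: no noncontextual assignment exists
```

      The obstruction is a parity argument: multiplying the three row constraints gives the product of all nine values as +1, while multiplying the three column constraints gives -1, a contradiction.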

    8. @ Andrei,

      Also see The Kochen-Specker Theorem in the SEP.

    9. and "Kochen-Specker theorem revisited" on arXiv.

    10. Prof. David Edwards,

      Also see Kochen–Specker theorem in Wikipedia:

      "It turns out to be impossible to simultaneously embed all the commuting subalgebras of the algebra of these observables in one commutative algebra, assumed to represent the classical structure of the hidden-variables theory, if the Hilbert space dimension is at least three."

      From the wiki page we read:

      "the KS theorem only excludes noncontextual hidden-variable theories"

      Classical field theories, such as classical electromagnetism are contextual. This is a direct consequence of the long-range forces that exist between the test particle, like an electron, and the experimental environment. For example the possible trajectories of a classical electron in a two-slit experiment would be different from the possible trajectories of the same electron in a single-slit experiment, because the EM fields associated with the charged particles in the barrier depends on the distribution of those charges which in turn depends on the geometry of the barrier. So, the Kochen–Specker theorem does not apply to classical field theories.

      In this brilliant paper by Tim Maudlin:

      The Labyrinth of Quantum Logic

      We read:

      "Consider the situation. We do one experiment with one slit open. We record a number. We do another experiment with the other slit open. We record another number. Now we wonder what will happen with both slits open. Well, as far as logic of any kind goes, anything could happen! The apparatus could blow up. All of the electrons could be reflected back to the source. The electrons could all turn into rabbits. Logic won’t prevent it! The fallacy, once you point it out, is glaringly obvious. The first experiment shows how many electrons get through slit A and land in R when only slit A is open. And the second how many get through slit B and land in R when only slit B is open. From the point of view of logic, this tells us exactly nothing about what will happen when both slits are open."

      Maudlin's observation applies to all experiments where non-commuting observables are measured. Neither classical logic (as argued by Maudlin) nor classical physics (as argued by me in the context of field theories) predicts that measurements obtained in one experiment should be relevant for different experiments.

      In any experiment there is a God's eye view, but the view changes when the experimental environment changes. In a double slit experiment the classical electron does have a well defined trajectory and it does pass through one slit only. But this trajectory is different if the other slit is open or closed because the fields associated with the second slit make the electron move differently as a result of the Lorentz force.

    11. @ Andrei,

      The classical theory you're describing doesn't satisfy special relativity!

    12. Prof. David Edwards,

      "The classical theory you're describing doesn't satisfy special relativity!"

      I'm speaking about classical electromagnetism (Maxwell + Newton's laws + Lorentz force law) which is a relativistic theory, not about electrostatics which is indeed non-local.

  38. @PhysicistDave and @all,
    Under such intense activity most posts are ignored; therefore I will make a last call.
    If there is anyone interested in my work, please contact me via the following e-mail: to send you the link.

    Unfortunately, I cannot share details (due to Sabine's Blog policy), however it has to do with fundamental physics that involves an unexpected discovery coming from classical mechanics.

  39. @ Peter Shor,

    Worse, it has a strange loop!! Think Hegel instead of Spinoza!

  40. Sabine, you are saying that we need a quantum theory of gravity, not a theory of everything. But presumably, a theory of everything will in particular have to propose a quantum theory of gravity.
    Can you tell us whether the proposed "theories of everything" have made some progress toward a quantum theory of gravity?

    1. We have to rely on what we already know:
      - the known charge carriers (electrons and protons) have the same charge but differ significantly in size and mass
      - electromagnetic fields have their origin on the surface, so the field of a proton, compared to an electron, arrives earlier(!) at the destination
      - due to their smaller mass, electrons accelerate faster than protons
      Asymmetries lead to an offset.

  41. The number of decimal digits of precision that can be recoverably converted to an IEEE 64-bit double float is 15. In your local, friendly C compiler, it's captured as DBL_DIG.

    Therefore, you're absolutely done when you have 15-digit agreement between theory and experiment; anything after that should be regarded as beyond giving a damn.

    1. I hope you realize that you can write programs that keep more decimal digits around than the standard IEEE 64-bit double float. They might be slower, because you can't use the floating-point hardware, but 15 digits is not a hard limit.
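      Both points can be demonstrated directly. A short sketch (in Python rather than C, but DBL_DIG describes the same IEEE 754 binary64 format), showing the hardware format's rounding and then software arithmetic that keeps 50 significant digits:

```python
from decimal import Decimal, getcontext

# IEEE 754 binary64 guarantees round-tripping of 15 decimal digits
# (C's DBL_DIG); beyond that, representation error shows up:
a = 0.1 + 0.2
print(repr(a))                 # 0.30000000000000004 (not exactly 0.3)

# Software arithmetic is not bound by the hardware format:
getcontext().prec = 50         # 50 significant decimal digits
b = Decimal(1) / Decimal(7)
print(b)                       # 0.142857... carried to 50 digits
```

      Arbitrary-precision libraries like this trade the floating-point unit's speed for as many digits as you care to ask for.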

  42. Austin Fearnley, James Arathoon, …, Lawrence Crowell, …
    …you are referring to exactly what?
    Theoretical physicists are used to discussing abstract objects based on mathematics. Whether something math-based makes real sense or not, the “best proof” would be an application based on one's theory. But, to quote Richard Feynman: "It is important to realize that in physics today, we have no knowledge of what energy is."
    Another fact is: the use of »secondary terms« like energy or mass is not only widespread within the framework of (theoretical) basic research; in this field, only secondary terms exist.

    »Preachers and understanders« of »secondary terms« believe in their suggestive radiance. They somehow have a “good feeling” of scientific closeness, for example when they hear about electric charge, photons, mass, electric fields or gravitational fields, talk about them, and insert these terms or quantities into formalisms.
    On closer inspection of what a »secondary term« stands for: quarks are not particles, neither in the phenomenological nor in the quantum-theoretical sense, since they do not appear as isolable particles or states. The physical particles, on the other hand, can be thought of as bound states composed of quarks. No physical objects correspond to the elementary quantities of quantum field theory. The various postulated elementary particles in the standard model (SM) differ in the quantum numbers of dynamic properties such as charge or isospin. Some are massless by postulate, others are not. The electron is postulated as a mass- and charge-point, as the theory requires. This simply means that one takes results-oriented mathematical elements that somehow fit, with or without mass. This arbitrary procedure is possible because, from a mathematical point of view, "anything goes" within the theory.
    However, quantized properties are characterized by inner symmetries and have nothing in common with properties in the usual sense that can be understood as physical qualities inherent in things. The isospin of the nucleons or the "color" of the quarks no longer express any qualities in this sense, but only arbitrarily defined basic states or directions in an abstract space that are related to each other through symmetry transformations. Almost all previously known symbol systems are invoked: sometimes colors (red, blue, green), sometimes letters (u, d, s, c, b, t), sometimes symbolic properties (strange, charm, beauty, ...), and furthermore flavors. The terms 'tohu' and 'wabohu' from the creation story in the Old Testament were even proposed for a structure still below the quarks.

    In sum, the quark masses postulated according to the SM fall far short of the nucleon masses. Gluons are massless.
    Postulated up-quark mass: 2.3 ± 0.7 ± 0.5 MeV/c² (u)
    Postulated down-quark mass: 4.8 ± 0.5 ± 0.3 MeV/c² (d)
    Proton mass (uud): 938.2720813(58) MeV/c², quark mass fraction ~ 0.8 - 1.2% (!!!)
    Neutron mass (udd): 939.5654133(58) MeV/c², quark mass fraction ~ 1.1 - 1.4% (!!!)
    Thus, even heavy ions composed of protons and neutrons (such as lead or gold nuclei) cannot be represented by quarks and gluons alone. This means that, according to the principle of mass-energy equivalence, nucleons and, ultimately, heavy ions consist almost entirely of phenomenologically indeterminate binding energy. Even more complicated is the fact that the ions are accelerated to almost the speed of light before they collide, which means that a considerable amount of external energy is added on top of the binding energy.
    So, what could it phenomenologically mean if one refers, for example, to an accelerated high-energy proton, postulated to be made of postulated quarks, gluons, and virtual sea quarks, that is, overall mostly phenomenologically undefined »binding energy« and “relativistic” energy? Furthermore: compare this accelerated composite particle to a postulated »elementary« one like an accelerated electron.
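    The quoted percentage can be checked with a line of arithmetic; a minimal sketch using the central values of the quark masses above:

```python
# Rough check of the quoted numbers: the bare (current) quark masses sum
# to only about 1% of the nucleon mass; the rest is QCD binding energy.
m_u, m_d = 2.3, 4.8            # MeV/c^2, central values quoted above
m_proton = 938.272             # MeV/c^2
quark_sum = 2 * m_u + m_d      # proton = uud
print(f"{quark_sum:.1f} MeV/c^2, {100 * quark_sum / m_proton:.2f}% of the proton mass")
# prints: 9.4 MeV/c^2, 1.00% of the proton mass
```

    Taking the full quoted uncertainty ranges for the quark masses moves this figure between roughly 0.8% and 1.2%, matching the fraction cited above.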

    1. We do not know what energy is, in the same way we have no essential knowledge of what any category is. This does not mean we are ignorant of energy; what understanding we do have of it comes from the work-energy theorem K = W = ∫F·dr. This works well enough. It is of course based on Newton's second law F = ma, where the left-hand side is a dynamical principle, but on the right we have a mass m, which is kinematical, and a geometric construction in the acceleration. That is a funny thing if you think about it.
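      The theorem can be checked numerically in the simplest case (constant force, starting from rest; the specific values are arbitrary):

```python
# Sanity check of the work-energy theorem K = W for a constant force F
# acting on a mass m that starts from rest.
F, m, t_final = 3.0, 2.0, 4.0          # N, kg, s (arbitrary values)
a = F / m                              # Newton's second law
v = a * t_final                        # velocity after t_final
x = 0.5 * a * t_final**2               # distance covered
work = F * x                           # the integral of F dx, constant F
kinetic = 0.5 * m * v**2               # kinetic energy gained
print(work, kinetic)                   # both 36.0 J
```

      The work done and the kinetic energy gained agree, which is all the "definition" of energy that the theorem supplies.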

      Things such as quantum numbers are operative. Even things such as QCD charges called colors. We can only infer this physics indirectly, since there are no isolated quarks or gluons. At least we know of none so far, and theoretically they are not possible. I would say that we know these things well enough to understand the world FAPP (for all practical purposes). We do not need to have some noumenal understanding or observation of the world to speak with some confidence about the universe in terms of physical quantities.

    2. I am an amateur and my grasp of physics is very incomplete, but I do realise and have written before that energy is something I know little about. It is not something I can write here about without getting into trouble, but I do think some fundamentals are wrong even at the level of an accelerated electron emitting a photon. In my model, weak isospin is conserved. I should not have said 'wrong' but merely not viewed in the most concise way. Like epicycles versus ellipses.

      IMO I am not using red, green and blue merely as labels. I think that red and antired represent vestiges of information about a 4D (compactified) universe that we are barred by relative speed c from interacting with in any other way. Ditto for green and blue. To my mind these dimensions are just as real as our 4D spacetime.

      If R, G and B were just labels then it would not be clear why Red is equivalent to antigreen (g) plus antiblue (b). R, G and B as mere labels do not carry any grouping information. In my previous comments I only gave a simplified picture, but say redness is actually composed of even smaller bits namely Rgb and similarly antigreen is RgB.

      So adding antigreen to antiblue gives RgB + RGb, which is rearranged to give RGB + Rgb, which is a neutral colour plus Red. Aggregating colours in this simple way manufactures a grouping structure, building relationships between colours. However, the colour Red with such relationships is no longer the more fundamental Red = 1 or 0 bit.

      Austin Fearnley

    3. Lawrence Crowell, as you more or less pointed out, in the context of the SM the definition of the mass of a particle relates exclusively to its kinematic effect. Mass as the source of a gravitational field is not taken into account, nor is the gravitational interaction, which cannot be described in the standard model. The kinematic effect of the mass manifests itself in the propagator of the particle.
      What exactly do you mean by category? If you are referring in this context to mathematics, it is obviously not primarily about physics.
      Brigitte Falkenburg brings it right to the point in »Particle Metaphysics: A Critical Account of Subatomic Reality (2007)«: „It must be made transparent step by step what physicists themselves consider to be the empirical basis for current knowledge of particle physics. And it must be transparent what they mean in detail when they talk about subatomic particles and fields. The continued use of these terms in quantum physics gives rise to serious semantic problems. Modern particle physics is indeed the hardest case for incommensurability in Kuhn’s sense.“
      … „Subatomic structure does not really exist per se. It is only exhibited in a scattering experiment of a given energy, that is, due to an interaction. The higher the energy transfer during the interaction, the smaller the measured structures. In addition, according to the laws of quantum field theory, at very high scattering energies new structures arise. Quantum chromodynamics tells us that the higher the scattering energy, the more quark-antiquark pairs and gluons are created inside the nucleon. According to the model of scattering in this domain, this gives rise once again to scaling violations, which have indeed been observed. This sheds new light on Eddington’s old question of whether the experimental method gives rise to discovery or manufacture. Does the interaction at a certain scattering energy reveal the measured structures or does it generate them?“
      …“It is not possible to trace a measured cross-section back to its individual cause. No causal story relates a measured form factor or structure function to its cause“…“With the beams generated in particle accelerators, one can neither look into the atom, nor see subatomic structures, nor observe pointlike structures inside the nucleon. Such talk is metaphorical. The only thing a particle makes visible is the macroscopic structure of the target“…
      …“Niels Bohr’s quantum philosophy…Bohr’s claim was that the classical language is indispensable. This has remained valid up to the present day. At the individual level of clicks in particle detectors and particle tracks on photographs, all measurement results have to be expressed in classical terms. Indeed, the use of the familiar physical quantities of length, time, mass and momentum-energy at a subatomic scale is due to an extrapolation of the language of classical physics to the non-classical domain.“…
      Austin Fearnley, I don’t know your model. You obviously use terms and theoretical objects of the SM-formalism. „Phenomenologically speaking“ - in terms of real-object oriented physics - this is leading nowhere (too).
      Remember: the SM aims at capturing matter formation and interactions through purely abstract mathematical symmetries (keyword: gauge theory). The mathematical approach of the SM, starting from zero-dimensional, massless objects, obviously does not provide any connection to the perceivable physical reality, where mass and extent represent fundamental properties. The SM post-correction using the Higgs mechanism theoretically gives mass to particles, but firstly it "violates" the original approach; secondly, the statement that the Higgs formalism gives the particles their mass is not true at all, because the quark-based proton and neutron receive only about 1% of their respective masses from the postulated Higgs field; and thirdly, the supposed mass-giving terms do not include any mass calculation at all. The mass values here do not follow from a physical equation but must be supplied as free parameters.

  43. bee,

    how do we know quantum mechanics still applies inside a black hole?

  44. As a theory of everything, what I can see as a major error in string theory, is that it assumes that everything comes from 10 dimensions. Some aspects of particles may come from one, two, three, ten, and 25 dimensions and combinations of these.

    1. The bosonic string exists in a spacetime of 26 dimensions. Why this is so is complicated and involves the Virasoro algebra and its connection to the Klein j-invariant function. This is also an automorphism on the Fischer-Griess group, which takes one into the really strange stuff known as monstrous moonshine, studied by Borcherds.

      The 26-dimensional Lorentzian spacetime of the bosonic string is mapped into anti-de Sitter spacetimes tensored with spheres in M-theory. These spacetimes are either 10-dimensional, as AdS_5×S^5, or 11-dimensional, as the dual forms AdS_4×S^7 or AdS_7×S^4. These hold the various string types.

      The real problem is not so much the dimensions of these string theories, but rather negative vacuum energy. The bosonic string has a negative vacuum energy, and its first excited state is a tachyon. This gets mapped into string theory in a spacetime with a negative cosmological constant Λ < 0. We do not live in such a spacetime, and string theory appears to be consistent only in spacetimes with Λ ≤ 0. The observable universe has Λ ≃ 10^{-52} m^{-2}, which corresponds to a Hubble constant of about 70 km/s/Mpc. There is a strange controversy brewing over that value. Since the cosmological constant is very small, though positive, we might think string theory should be only slightly broken. This is not so, not even close: supersymmetry, even in a mildly broken form, is not apparent at all. The problem may be that the inflationary spacetime had Λ ≃ 10^{60}, which would have broken SUSY, SUGRA and string theory violently. This may be one reason there is no signature of string theory or SUSY.

      As I see it, this is an extremely fortunate situation. While many see this as a funeral dirge, with Chopin’s piano sonata playing, it really opens the field up. Low-energy SUSY is comatose, much appears inconsistent, and all is chaos. So why are we unhappy? There may be a number of options here. String/M-theory may still have some bearing on physics and cosmology, but not in the way we think. There may be relationships between de Sitter spacetime, anti-de Sitter spacetime and black holes that need to be uncovered. The apparent failure here really throws the possibilities wide open. So what are we waiting for?
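      [A quick sanity check on the quoted numbers, added in editing and not part of the comment: with the standard relation Λ ≈ 3 Ω_Λ H₀²/c² and the assumed values H₀ ≈ 70 km/s/Mpc and Ω_Λ ≈ 0.7, one indeed lands near 10^{-52} m^{-2}.]

```python
# Back-of-envelope check: does H0 ~ 70 km/s/Mpc give Lambda ~ 1e-52 m^-2?
# Uses Lambda = 3 * Omega_Lambda * H0^2 / c^2 with assumed Omega_Lambda = 0.7.
MPC_M = 3.0857e22            # metres per megaparsec
C = 2.998e8                  # speed of light, m/s
H0 = 70e3 / MPC_M            # Hubble constant converted to 1/s
OMEGA_LAMBDA = 0.7           # assumed dark-energy density fraction

lam = 3 * OMEGA_LAMBDA * H0**2 / C**2
print(f"Lambda = {lam:.2e} m^-2")
```

      The result is about 1.2e-52 m^-2, consistent with the order of magnitude quoted above.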

    2. The story of 'higher dimensions' in physics is quite a bit older than string theory. In fact, string theory takes its cue here from Kaluza-Klein theory, which importantly achieved a unification of gravity and electromagnetism by adding a fifth compact circular dimension. It was important enough that Einstein himself worked on it. Progress stalled because there seemed to be no straightforward mechanism for stabilising the small radius. After all, Einstein gravity teaches us that spacetime is dynamical, and hence the fifth circular dimension also ought to be dynamical - but then what prevents it from blowing up or contracting to a point? The same problem arises in string theory. I'm not sure how it is tackled there, so I'll be interested to hear.

      However, higher or internal dimensions were actually thought up originally by Hertz, well before Kaluza or Klein or even Einstein. In fact, he was driven to this by philosophical considerations. He argued that Newtonian physics wasn't basic because it had two basic concepts - inertia and force. So he dispensed with force by adding new degrees of freedom, which he argued referred to 'hidden' (his word!) dimensions of space. All his forces were constraint forces. Conceptually speaking, this is what Einstein showed for gravity - after all, it's well known that Einstein dispensed with the notion of force in his theory of gravity. And this was well before Einstein; in fact, Einstein had read Hertz, so it's more than likely he was influenced by him in this.

    3. Addendum:

      Another important concept - at least to my mind - is the theory of the microcontinuum, or generalised continua, developed by the Cosserat brothers around 1909. Classical elasticity is a continuum theory with no microstructure; they posited internal degrees of freedom, like micro-rotations and shears. Personally, I think this is conceptually important for how we can think of so-called higher dimensions: they can alternatively be thought of as hidden, small internal dimensions. This makes sense simply because the extensionless point - since Euclid the basic element of continua - makes no physical sense.

  45. It’s interesting to assess quarks using the top-down PAVIS model.

    The quick summary: In PAVIS there are no particles. Instead, under the pressure of information-generating classical time, a proto universe of simple mass-energy breaks down via wave collapse into bundles of absolutely-conserved quantum numbers. Under the pressure of time and self-observation, these bundles keep fragmenting until they are as small and point-like as possible. The fragmentation process ends at (multiple) quantum equilibrium levels, with each level being determined primarily by mass-limited lack of spatial resolution. Nucleons and atoms are the most common examples of equilibrium fragments (or bundles) in our vicinity. Everything prior to the equilibrium level is subject to historical time and causality. However, the interiors of the bundles remain capable of resisting time, and become the unrealized potentials that constitute the quantum domain. The quantum domain retains the same rules for creating smaller fragments as the classical domain, but for them these rules remain dormant until more energy is added.

    Quarks are conspicuously and unequivocally below the PAVIS quantum equilibrium level, and this makes them potentials defined by rules rather than particles. Many physicists in the early days of the Standard Model regarded them in pretty much this fashion. However, their status as particles received a promotion when high-energy accelerators showed that there are distinctly particle-like entities moving around inside of protons and neutrons.

    The excellent irony in the PAVIS interpretation of quarks is this: The situation for quarks as potentials inside of nucleons is no different from the situation of electrons in atomic orbitals. Both are quantum equilibrium bundles, so both lack particles or historic time in their interiors. In PAVIS, the idea that electrons whiz around inside of atoms is no more (or less) real than the idea that quarks whiz around inside protons. In both cases, seeing the entities as particle-like requires adding energy to drive them past quantum equilibrium to create transient smaller entities that have recordable histories.

    There is one critical difference between electrons and quarks, though.

    Electron bundles include all three of the rules needed to generate isotropic potential fields in xyz space, what we see as electromagnetic fields. In contrast, quark bundles possess only one (d) or two (u) of these rules. Quarks consequently are unable to exhibit precise, particle-like locations unless they are still within quantum-time proximity of additional quarks that can “loan them” the missing rules.

    This is more commonly called confinement. The incomplete rule sets of quarks ensure that even when enough energy is provided to make them appear as recordable entities with well-defined locations, they remain below the quantum equilibrium limit overall. Incidentally, this idea of creating particles by “mixing rules”, or rule chemistry, is characteristic of how PAVIS reorganizes data.

    And yes, I did just suggest both that the color force is a dimensional subset of the electric force, and that the triplet of color charges (actually, the simpler Glashow unit vectors that combine color and electric charges) are linked directly to the structure of xyz space.

  46. The paradox is that the pursuit of the ultimate theory has led to the so-called 'post-empirical science', a complete oxymoron.

  48. "....all the vectors describe probabilities." ?????

    1. @ ian aitchison,

      Gleason's Theorem shows that every non-zero element of the Hilbert Space represents a pure probability measure on the Standard Quantum Logic; i.e., such pure states are in 1-1 correspondence with the elements of projective Hilbert Space.
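      A small numerical illustration of that correspondence (a sketch added in editing, not part of the comment): the probability measure a unit vector induces over an orthonormal basis is unchanged when the vector is multiplied by a global phase, so the measure depends only on the ray, i.e. the point of projective Hilbert space.

```python
import cmath
import math

# An arbitrary non-zero vector in C^3 (illustration values)
psi = [1 + 2j, 0.5 - 1j, 3 + 0j]
norm2 = sum(abs(c) ** 2 for c in psi)

# Probability measure it induces on the basis projections |e_i><e_i|
probs = [abs(c) ** 2 / norm2 for c in psi]
assert math.isclose(sum(probs), 1.0)

# Multiplying by a global phase leaves every probability unchanged:
# the measure is a property of the ray, not the vector.
phase = cmath.exp(1j * 0.7)
psi2 = [phase * c for c in psi]
probs2 = [abs(c) ** 2 / norm2 for c in psi2]
assert all(math.isclose(p, q) for p, q in zip(probs, probs2))
```

      Gleason's theorem goes further, of course: for dimension ≥ 3 it says every such measure on the projection lattice arises this way (or from mixtures), not just the ones we wrote down.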

  49. Pubbli,

    It would seem to be a paradox but are you certain your premise is true? Dr H's 7/12/14 blog on the topic doesn't confirm that, to me. This question starts somewhere around the idea that theories predict but they aren't confirmed. When Dr H's blog addresses semantic tap dancing, eg between "confirm" and "assess", I lose sight of the trail you assume.

    Best regards, Bert

  50. The following experiment might possibly show a unified electroweak field at a field strength as low as 10^18 W/cm^2. A number of laser types in the power range 10^11 - 10^13 W/cm^2 were amplified using gold nanoparticles, with estimated power gains varying between 4 and 9 orders of magnitude.

    Accelerated alpha-decay of 232U isotope achieved by exposure of its aqueous solution with gold nanoparticles to laser radiation

    A.V. Simakin, G.A. Shafeev

    Accelerated electroweak activity was witnessed in the accelerated alpha decay of 232U in the presence of the amplified electroweak field.


COMMENTS ON THIS BLOG ARE PERMANENTLY CLOSED. You can join the discussion on Patreon.
