*Lost in Math* was published two years ago, and this week the paperback edition will appear. I want to use the occasion to tell you why I wrote the book and what has happened since.

In *Lost in Math*, I explain why I have become very worried about what is happening in the foundations of physics. What is happening, you ask? Well, nothing. We have not made progress for 40 years. The problems we are trying to solve today are the same problems we were trying to solve half a century ago.

This worries me because if we do not make progress understanding nature on the most fundamental level, then scientific progress will eventually be reduced to working out details of applications of what we already know. This means that overall societal progress depends crucially on progress in the foundations of physics, more so than on any other discipline.

I know that a lot of scientists in other disciplines find that tremendously offensive. But if they object all I have to do is remind them that without breakthroughs in the foundations of physics there would be no transistors, no microchips, no hard disks, no computers, no wifi, no internet. There would be no artificial intelligence, no lasers, no magnetic resonance imaging, no electron microscopes, no digital cameras. Computer science would not exist. Modern medicine would not exist either because the imaging methods and tools for data analysis would never have been invented. In brief, without the work that physicists did 100 years ago, modern civilization as we know it today would not exist.

I find it somewhat perplexing that so few people seem to realize how big of a problem it is that progress in the foundations of physics has stalled. Part of the reason, I think, is that physicists in the foundations themselves have been talking so much rubbish that people have come to believe foundational work is just philosophical speculation and has lost any relevance for technological progress.

Indeed, I am afraid, most of my colleagues now believe that themselves. It’s wrong, needless to say. A better understanding of the theories that we currently use to make all these fancy devices will almost certainly lead to practical applications. Maybe not in 5 or 10 years, but more like in 100 or 500 years. But eventually, it will.

So, my book *Lost in Math* is an examination of what has gone wrong. As the subtitle says, the problem is that physicists rely on unscientific methods to develop new theories. These methods are variations of arguments from mathematical beauty, though many physicists are not aware that this is what they are doing.

This problem has been particularly apparent when it comes to the belief that the Large Hadron Collider (LHC) should see new fundamental particles besides the Higgs boson. The reason so many physicists believed this is that if the LHC had found other new particles, the theories would have been much more beautiful. I explained in my book why this argument is unscientific and why, therefore, we have no reason to think the LHC should see anything new besides the Higgs. And indeed that’s exactly what happened.

Since the publication of my book, it has slowly sunk in with particle physicists that they were indeed wrong and that their methods did not work. They have largely given up using this particular argument from beauty that led to those wrong LHC predictions. That’s good, of course, but it does not really solve the problem, because they have not analyzed how it could happen that they collectively – and we are talking here about thousands of people – believed in something that was obviously unscientific.

So this is where we stand today. The recognition that something is going wrong in the foundations of physics is spreading. But physicists still have not done anything to fix the problem.

How can we even fix the problem? Well, I explain this in my book. The key is to have a look at what has historically worked. Where have breakthroughs come from in the foundations of physics? Historically, a lot of breakthroughs were driven by experimental discoveries. But the simple things have been done, and new experiments now are so costly and take such a long time to build that coincidental discoveries have become incredibly unlikely. You do not just tinker around with a 27-kilometer particle collider.

This means we have to look at the other type of breakthrough, where a theoretical prediction turned out to be correct. Think of Einstein and Dirac and of Higgs and the others who predicted the Higgs boson. What did these correct predictions have in common?

They have in common that they were based on theoretical advances which resolved an inconsistency in the then existing theories. What I mean by inconsistency here is an internal logical disagreement. Therefore, the conclusion I draw from looking at the history of physics is that we should stop trying to make our theories prettier, and instead focus on solving the real problems with these theories.

Some of the inconsistencies in the current theories are the missing quantization of gravity, the measurement problem in quantum mechanics, some aspects of dark energy and dark matter, and some issues with quantum field theories.

I don’t think physicists have really understood what I told them, or maybe they don’t want to understand it. Most of them claim there is no problem, which is patently ridiculous, because everyone who follows popular science news knows that they have been producing loads of nonsense predictions for decades and nothing ever panned out. Clearly, something is going wrong there.

But what I have found very encouraging is the reaction of young physicists to the book, students and postdocs. They don’t want to repeat the mistakes of the past, and they are frequently asking for practical advice. Which I am happy to give, to the extent that I can. The young people give me hope that things will change, eventually, though it might take some time.

*Lost in Math* contains several interviews with key people in the field: Frank Wilczek, Steven Weinberg, Gian Francesco Giudice (who was head of the CERN theory division at the time), Garrett Lisi, George Ellis, and Chad Orzel. So you will not only get to hear my opinion, but also that of others. If you haven’t had a chance to read the hardcover, the paperback edition has just appeared, so check it out!

Hi Sabine,

I have been reading about those 40 years for at least 7 years, so I guess it is now closer to 50.

Also, there are many kinds of beauty, even in math. It seems likely to me that the next paradigm will have its own beauty (my guess is an increase in conceptual simplicity). But in the beginning it may look ugly. For instance, according to Planck, quantizing the oscillator was "desperate". Abandoning hope and facing facts is a very hard exercise.

Best,

J.

akidbelle,

I usually say 40 years because I'd say that the first 10 years of trying arguments from beauty after completing the standard model were okay. It's just when particle physicists refused to learn the lesson from incoming data that said it's not working that the problem began. But, yes, maybe 50 years is more accurate.

And, yes, the issue with arguments from beauty is not so much the arguments per se, but that paradigm shifts tend to also shift notions of beauty, so axiomatizing specific notions of beauty -- as physicists have done -- is putting the cart before the horse.

The measurement problem in quantum mechanics is only due to physicists insisting on physicalism and not being satisfied with perspectivism.

That's wrong. The measurement problem is an inconsistency between quantum mechanics and reductionism. This clearly demonstrates that quantum mechanics is not fundamental.

That's only because you insist that the reduction be to a form of physicalism instead of perspectivism! See my "The Mathematical Foundations of Quantum Mechanics" on my website for a completely coherent exposition of quantum theory not requiring any further reduction.

As an uneducated layman, I can say this out loud: the word "physical" has no meaning in this context.

Reductionism eventually hits a fork in the road: should we postulate an infinite regress, or say that quarks are made of "quark stuff" and that's the end of it?

Sabine, I believe that Quantum Mechanics is a fundamental theory. The problem is our thinking of how QM must marry GR. I am working on something that is not only just out of the box, but has no relation whatsoever with the so-called box.

The only way you can avoid further reduction is to postulate the existence of strong emergence, which is incompatible with evidence. I don't need to read your book to know that you haven't solved the problem but merely ignored it, because it's not a problem you can get rid of with verbal acrobatics.

Golden,

I don't know what makes you think anyone cares what you "believe", especially not if that belief is obviously wrong. If quantum mechanics were fundamental, then it would not be necessary to postulate what happens in a measurement; it should be derivable from the dynamical law.

This comment has been removed by the author.

Yes Sabine, no one should care what I believe. What is important is what it is we measure in the real world. If what I believe is congruent with what we measure, well, then, it may very well be important that someone should care what I believe, because there may very well be an element of truth in that belief. We care about Einstein's beliefs because there is an element of truth that resonates with reality in his beliefs.

On your view that "... it should be derivable from the dynamical law ...", you seem to be advocating for a predetermined world. The laws of physics cannot and will never be able to tell us what will happen in the future, because this would mean we should be able to predict whether or not President Donald Trump will be re-elected: all we would need to do is have full knowledge of all the positions and velocities of all matter in the Universe, submit this information to a supercomputer well programmed with all the existing laws of motion, and simply watch the evolution of this system. The supercomputer should be able to tell us everything about the beginning, evolution and fate of the universe. This simple argument should be enough to let one realize that it is just a fallacy to think that the present reality should allow us to derive, in exactitude, what the next moment should be like. I know from your many videos and posts that you do not believe in free will.

From your YouTube videos, I know you have a dislike of the use of the word "believe". I use this word in place of "know". For example, "I believe Newton's Three Laws of Motion do describe the Universe we live in." I would not say "I know Newton's Three Laws of Motion describe the Universe we live in."

Time to give cold fusion another look? The experimental apparatus is cheap. Just kidding. Shock wave-induced fusion inside bubbles sounds like a better bet.

That's wrong. The measurement problem is an inconsistency between quantum mechanics and reductionism. This clearly demonstrates that quantum mechanics is not fundamental.

Sabine, I think your reply went to the wrong comment.

Sorry! Got lost in my own comment section!

I thought maybe it was your stock "That would be an ecumenical matter" response. I had a mental image of Father Jack.

Paul, I think you are right there. I always felt there was a tremendous rush to close down the very idea of cold fusion - possibly for some political/financial reasons. Culham was a centre of hot fusion research, which obviously didn't want to acknowledge such a possibility. They produced a negative assessment of the idea within a few weeks.

The theory seemed to evolve from one of light nuclear fusion, to one in which nickel or palladium electrodes could take up so much hydrogen that some protons would fuse with the metal nuclei and produce a series of decay products - copper in the case of nickel - with the release of energy in the form of heat.

The simple argument that the electrostatic repulsion between the nucleus and a proton was too great for this to happen at a useful rate was countered by the argument that you didn't have an isolated nucleus, but a solid state problem where the probability was higher (electrons in a metal have a higher effective mass than isolated electrons). I don't know if this was valid.

I have two questions. What was the progress in the decades before those 40 years? To me it seems that this part of physics was always very slow. The second question: why do you say that all the good things (transistors, ...) are due to the foundations of physics and not to the more shut-up-and-calculate subfields?

Progress in the foundations of physics in the first half of the 20th century was stunning: quantum mechanics, special and general relativity, relativistic quantum mechanics and quantum field theory, gauge theories and, ultimately, the standard model of particle physics. And that despite the fact that the number of physicists at the time was about a factor of 100 smaller than today (see numbers and references in my book).

I don't know what "more shut up and calculate subfields" of the foundations of physics are supposed to be.

Thank you. Now I see what you mean by foundations of physics. It includes developing new theories. My guess was that it meant only working on foundational problems within a given theory, such as the measurement problem in quantum mechanics, but not the development of quantum mechanics itself. In any case I personally wouldn't expect new theories to pop up all the time. Also, all the benefits to society you mentioned didn't come from the foundations of physics subfield. They came from applications of already existing theories.

I think that the theory of everything is very simple, but it is like a maze with many dead ends. There are many tricks in the math that don't resemble anything real, so it is hard to tell what is real and what is imaginary while in the maze.


“What did these correct predictions have in common?

They have in common that they were based on theoretical advances which resolved an inconsistency in the then existing theories.”

To quote a physicist I admire, “the universe doesn’t have to be anything”. Get past preconception and intuition and I believe you can see an empirical foundational space-time framework that deserves a look for many logical, rational reasons. Look at how we physically experience and observe space-time, coupled with the experimental evidence. It might be a dead end, or it might help resolve some of those inconsistencies.

The only defense I have saying this is I have a good awareness of my own bias, intuitions, and pre-conceptions that try and sway my observations and rationale. I’m better at weeding those influences out than most. In other words I’m more apt to see what is, not what I want, thought, or expected. Saying more will violate the posting rules, I’ll respect that, but I don’t think the specifics are that great of an insight that others can’t figure it out themselves.

If what highly trained US Navy pilots, from the Nimitz carrier group in 2004, observed with their own eyes, simultaneously detected by multiple fleet radars, is to be believed (and there’s little reason to doubt them), then ‘someone’ has advanced well beyond our current foundational knowledge of physics. Wherever technical civilizations have arisen, throughout the Universe, they will have been confronted with the same laws of physics that our terrestrial science has unveiled, and likely followed a similar pathway of discovery. If one is to believe in the veracity of reports with the caliber of the Nimitz encounters, then we may assume that there exist civilizations that have leapt beyond the threshold of our contemporary foundational physics understanding.

I’ve harped on this before: the possibility that something peculiar may be going on between superconductors and the most universal of all forces – gravity – with a possible tie-in to Dark Energy, and even Dark Matter. Despite condensed matter physics occupying the low energy end of physics experimentation, in contrast with the LHC, I suspect there may be ‘paydirt’ to be found here that will expand our knowledge horizons beyond their current foundations. Indeed, without hopefully coming across as too bombastic, it just might be revolutionary. One problem is that experiments in this area of research have been conducted by a mixed bag of professional researchers and what most sensible people would consider fringe scientists.

As I noted in the comment section of Sabine’s 23 March 2020 post “Are dark energy and dark matter scientific?”, the beginnings of the odd anomalies associated with superconductors were uncovered by a professional research group – Janet Tate, et al. – performing experiments on quartz spheres that were to be deployed in the Gravity Probe B experiment. They found that the mass of Cooper pairs was greater than theory predicted. Building on this discovery, other groups developed theories to explain this discrepancy. One such theory, by Clovis de Matos and Christian Beck, postulated that a coupling was occurring between superconductors and Dark Energy via “graviphotons”. If their model is right it could explain other anomalies detected by another research group led by Martin Tajmar at the Austrian Research Center, who found acceleration signals emanating from rapidly spun-up niobium rings immersed in liquid helium.

Recently, I read a quite professional-sounding paper by an experimenter in the Seattle, Washington area, from 1997, using only a hobbyist grade, inch-diameter YBCO superconductor. It is on the arXiv, but I’m not sure it’s OK to provide a link to it. In that period there were claims by several individuals of weight anomalies detected immediately above YBCO superconductors. The effect only occurred during the superconductor’s transition through its critical temperature. I immediately thought of the same phenomena reported by the Tajmar group and explained by de Matos and Beck as the “absorption” or “emission” of graviphotons, depending on which direction the superconductor was passing through its critical temperature. The niobium ring in the Tajmar experiment didn’t have any external magnetic field applied to it. In the YBCO experiments with ‘positive’ results, a common denominator was the presence of an external magnetic field from a close-by magnet. Perhaps the magnetic field in the YBCO experiments aligned and thus concentrated the postulated graviphotons, compensating for 10-fold fewer Cooper pairs compared with niobium, permitting detection.

These references to superconductors that act as gravity shields come from Podkletnov and others. The paper below is the one that kicked off this idea:

E. Podkletnov, R. Nieminen, "A possibility of gravitational force shielding by bulk YBa2Cu3O7−x superconductor". Physica C 203 (3–4): 441–444 (1992).

This one below is accessible on the web:

E. Podkletnov, "Weak gravitation shielding properties of composite bulk YBa2Cu3O7−x superconductor below 70 K under e.m. field". arXiv:cond-mat/9701074 (1997).

It must be pointed out that attempts by others to reproduce this have failed consistently. As of today the result, and the concept it purports to uphold, remain largely unconfirmed.

The idea is that with bosonization, or a boson field φ = e^{ψψ}, an integral-spin field is built from fermions. The Thirring fermion theory with a potential V(ψ) = gψ^†ψψ^†ψ, quartic in fermions, produces the Sine-Gordon equation and soliton. This is a form of Josephson’s theory of the superconducting diode or gate. We can also think of a graviton as a G ~ ψ^4 field. So, the idea is not entirely crazy. However, the coupling for this graviton is extremely small. The proposal that gravitons could be built this way in a lab is extreme.
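For readers who want the standard result behind this comment: the fermion–boson equivalence being invoked is Coleman's 1975 correspondence between the massive Thirring model and the Sine-Gordon model in 1+1 dimensions. A sketch in textbook form (this is the generic duality, not the graviphoton construction itself):

```latex
% Massive Thirring model (1+1 dimensions), quartic in fermions:
\mathcal{L}_{\mathrm{Th}} = \bar\psi\,(i\gamma^\mu\partial_\mu - m)\,\psi
  - \tfrac{g}{2}\,(\bar\psi\gamma^\mu\psi)^2

% Dual Sine-Gordon model, whose soliton is the Thirring fermion:
\mathcal{L}_{\mathrm{SG}} = \tfrac{1}{2}\,\partial_\mu\varphi\,\partial^\mu\varphi
  + \tfrac{\alpha}{\beta^2}\cos(\beta\varphi)

% Coleman's relation between the couplings:
\frac{\beta^2}{4\pi} = \frac{1}{1 + g/\pi}
```

The coupling relation shows that strong fermion coupling maps to weak boson coupling, which is why the Sine-Gordon soliton can be identified with the Thirring fermion in the first place.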

The idea of anti-gravitation is the fun stuff of science fiction. It has to be pointed out that if one had a gravity shield of some sort it would have to produce some local curvature of spacetime that is fairly large. This curvature would locally, say within the confines of the system, have to cancel out a gravity force and this would be substantial. Think of the Meissner effect with superconductors, and the shielding of a magnetic field results in a significant local change in that magnetic field.

Later this morning I will try to write about UFOs and the prospect we are being visited by alien spacecraft.

The idea that there are aliens in outer space is in some ways ancient. The notion that the heavens held gods or angels and so forth is just an ancient-world version of the idea of space aliens. The more science-like idea of space aliens, largely pioneered by H. G. Wells, is a modern variant of the same idea, where these beings are made physical. The gods or God are more powerful than us mortal humans, and most science fiction has aliens with far more advanced technology that borders on magic.

As Fermi put it, “Where are they?” I would say that UFO claims or putative evidence aside, we as yet have no conclusive evidence for visitations by alien spacecraft, whether robotic or piloted. I think the answer to Fermi’s question has three points. The first is that speeds faster than the speed of light are inaccessible. We live in a spacetime with a certain energy condition that serves as a topological obstruction to closed timelike curves, faster-than-light travel, warp drives and so forth. With special relativity you need infinite energy, which signals some sort of obstruction. The second reason, I suspect, is that planets with highly complex or exuberant life are comparatively rare, and that at any given time few have intelligent life. Earth has had life for 3.8 billion years, complex life for 600 million years, humans for 100,000 years and advanced technology for maybe 2 centuries. ETI is probably very rare and any galaxy may sport at most a few during its existence. Finally, I think it is likely that intelligent life which develops technology ends up mishandling it and snuffing itself out. So even for some ETI this may limit the numbers that are able to colonize a galaxy, and there is so far no galaxy found that exhibits signs of hyper-technology such as Dyson spheres.
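The rarity argument in this comment can be put in toy quantitative form with a Drake-style estimate. A minimal sketch; every input value below is an illustrative assumption of mine, not a figure taken from the comment:

```python
# Toy Drake-style estimate of concurrently detectable civilizations per
# galaxy. Every number here is an illustrative, pessimistic assumption.
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* . f_p . n_e . f_l . f_i . f_c . L (civilizations per galaxy)."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

N = drake(R_star=1.0,   # star formation rate, stars per year
          f_p=0.5,      # fraction of stars with planets
          n_e=0.2,      # habitable planets per planetary system
          f_l=0.1,      # fraction of those where life arises
          f_i=1e-4,     # fraction developing intelligence (assumed rare)
          f_c=0.1,      # fraction becoming detectable
          L=200.0)      # years a civilization stays detectable

print(N)  # far below one detectable civilization per galaxy
```

With these deliberately pessimistic inputs, the expected number of concurrently detectable civilizations per galaxy comes out far below one, consistent with the comment's "at most a few during its existence" intuition.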

The paintings by Zdzisław Beksiński, which you can look up (there are some YouTube presentations), are these haunting fantastic-realist renderings that reflect our condition. They are disturbing, but they represent on a deep level the human condition. They reflect our long-term prospects.

There have also been reports and even photographs from military pilots as well. The problem is these images never show a physical surface. They show up either as lights or, as in the case of the recent videos from aircraft, as dark shapes. They do not appear as any solid object or something with a definite material surface. This to me suggests optical or other effects instead of something with a physical material.

Lawrence Crowell,

I didn’t explicitly state it, but my presumption was that the transitory weight reductions, claimed to be detected by a number of experimenters, were not due to gravity shielding, but arose from a burst of the postulated graviphotons (in the de Matos/Beck theory) coupling to the ‘target masses’ in the various experiments. The idea here is that the graviphotons, by imparting a ‘small’ acceleration moment opposite to the Earth’s field during the brief interval that the superconductor is passing through its critical temperature, masqueraded as gravity shielding. I’ve long been aware that gravity shielding is incompatible with General Relativity, from reading reactions to Podkletnov’s claims by professional scientists.

The early work that you cited, and linked to, by Podkletnov entailed claims of continuous weight reduction above a spinning YBCO superconductor immersed in liquid helium vapors, in the range of 0.3% to about 2%. I was reluctant to bring up his work because of his penchant for secrecy, and his unwillingness to cooperate with other groups like NASA to assist them with recreating his experiments. On top of that, his experimental arrangements were very elaborate and required extreme engineering in several fields. No one, to my knowledge, has ever been able to fully replicate his experiments as laid out in his various papers. NASA did succeed in partially replicating one of his experiments, namely the construction of a sizable YBCO superconductor. But, as I recall, they basically did a static test with the world’s most sensitive gravimeter positioned above this disc and found no attenuation/enhancement of the Earth’s field. I am virtually positive that they didn’t even have the superconductor pass through its critical temperature, but would have to check.

I wasn’t sure if you were referring to the graviphoton of the de Matos/Beck theory in the 5th paragraph of your first posting, but I guess that was your intent, as you referred to an integral spin field built from fermions. You stated that “…the coupling for this graviton is extremely small”. I knew from general reading that conventional graviton emission in atomic transitions is hugely suppressed in nature vis-à-vis photon emission, by a factor of 4.8 × 10^-43 for electrons and 1.6 × 10^-36 for protons. So, if you are absolutely certain that the graviphoton of the de Matos/Beck theory couples to matter at, say, a level comparable to the conventional graviton, then it would definitely invalidate their theory. But then it makes me wonder how theories like theirs can pass peer review if it’s so obvious that there is a major flaw in their reasoning.
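Suppression factors of this kind are straightforward to sanity-check. A back-of-the-envelope sketch (my own arithmetic, not from any of the papers discussed): compare the dimensionless gravitational coupling Gm²/ħc of a particle with the fine-structure constant α. This reproduces the order of magnitude of the numbers quoted above, to within a factor of about two:

```python
# Back-of-the-envelope: gravitational vs electromagnetic coupling strength.
# Constants in SI units (CODATA values, rounded).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.0546e-34    # reduced Planck constant, J s
c = 2.998e8          # speed of light, m/s
alpha = 7.297e-3     # fine-structure constant (~1/137)

m_e = 9.109e-31      # electron mass, kg
m_p = 1.673e-27      # proton mass, kg

def grav_to_em_ratio(m):
    """Dimensionless gravitational coupling G m^2 / (hbar c), divided by alpha."""
    return (G * m**2 / (hbar * c)) / alpha

print(f"electron: {grav_to_em_ratio(m_e):.1e}")  # ~2e-43
print(f"proton:   {grav_to_em_ratio(m_p):.1e}")  # ~8e-37
```

Whether this simple coupling ratio is exactly the right comparison for atomic transition rates is a separate question; the point is only that the order of magnitude of the suppression is as the comment states.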

You stated that videos of UFOs “…do not appear as any solid object or something with definite material surface. This to me suggests optical or other effects instead of something with a physical material.” In the Nimitz incident, Commander David Fravor described the object that his rear-seater, along with a second pair of Navy fliers in a second Hornet, also observed at relatively close quarters, as shaped like a “tic-tac” and solid white in color, with a length of 40 feet. Initially the flyers observed the “tic-tac” hovering close above the ocean, creating a disturbance on the surface. As Commander Fravor began to descend in a wide circle for a closer inspection, the object began to ascend, mirroring Fravor’s Hornet’s motion on the far side of the circle. When Commander Fravor cut across the circle, aiming his aircraft’s nose directly at the object, it accelerated away from them at a stupendous rate. He was informed by radio that the object’s movement was tracked by one of the ships to their combat air patrol (CAP) point, entailing a minimum speed of 4000 mph, as I recall. Such a detailed description doesn’t sound like an optical illusion.

I believe Dr. Crowell has nailed it. The odds of space aliens existing and choosing to travel light-years only to dart around in our skies a few times per century are so low as to be practically non-existent. Whereas both eyes and radar use electromagnetic radiation, with which optical illusions are possible. (And we are also susceptible to practical jokers.) Did the departing object create a sonic boom? I rest my case.

However, as Theodore Sturgeon pointed out a long time ago, and Dr. Krugman more recently, the one thing that might unite a fractious humanity would be an invasion by space aliens. Perhaps that was the point of the story.

I'd like to point out that many people have already supplied very reasonable explanations for the "alien videos" you are referring to.

Aside from that, I doubt any government would declassify and make public evidence for intelligent extraterrestrial life in such a cavalier manner.

@David Schroeder: Podkletnov made enough noise to be noticeable. I am not familiar with the other people and these related theories. In looking at the literature one has to pick and choose.

As I said, a quartic fermion interaction can construct a graviton. We might think of this as either some sort of bound state of four fermions or an entanglement of four fermions. I really prefer the latter. By the same thinking we can think of a graviton as a colorless entanglement between two gauge bosons such as a gluon or a gluon-like particle. Bern et al. have done work on just this. I actually see the two as equivalent. So, we could in principle generate gravitons with a Thirring-like quartic fermion theory. The problem is the coupling is 44 orders of magnitude smaller than the electric charge. Further, the problem with electrons as the fermions is that they are charged, and gravitons in most theories are neutral. This would mean we would have to consider fermions such as the neutrino, which makes the problem far more difficult, with compound interest. Neutrinos interact via the weak interaction, and trying to manipulate them into entanglements or coherent states equivalent to gravitons would be extremely difficult. We would also have a bit of a problem in that the graviton would have the same parity violation as neutrinos. In addition, the graviton would have to have no isospin charge for the weak interaction, or quantum flavordynamics. So, we then might have to consider what are called sterile neutrinos. Now we have heaped mental abuse on top of torture. Constructing a graviton from fermions is an interesting idea, but honestly, I have no idea how it would actually be done.

The Alcubierre warp drive could be used as a sort of anti-gravity system. The compression of space upwards and the expansion of space downwards or leeward would serve as a sort of anti-gravity. This of course violates the Hawking-Penrose energy conditions, the most general being the averaged weak energy condition T^{00} ≥ 0. This tends to get us to the heart of the matter, for the spacetime we observe is topologically M^3×ℝ, where M^3 is a spatial manifold and ℝ is a real number line standing in for time. This could also be ℝ^+, the positive reals starting at t = 0. For spacetimes that violate the Hawking-Penrose conditions, time has the topology S^1, a circle. Time exists in closed timelike curves. It does not appear that we live in that type of spacetime.
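For completeness, the geometry being referred to is Alcubierre's 1994 warp-drive metric; writing it out makes the energy-condition problem concrete. A sketch in standard textbook form, in units with c = 1:

```latex
% Alcubierre warp-bubble metric: a bubble centered at x_s(t), moving with
% velocity v_s(t) = dx_s/dt, drags its flat interior along:
ds^2 = -dt^2 + \bigl(dx - v_s(t)\,f(r_s)\,dt\bigr)^2 + dy^2 + dz^2
% where r_s is the distance from the bubble center, and the shape function
% satisfies f(r_s) \to 1 inside the bubble and f(r_s) \to 0 far away.
% The energy density measured by Eulerian observers in the bubble wall is
% negative, violating the weak energy condition
T_{\mu\nu}\,u^\mu u^\nu \ge 0 \quad \text{for all timelike } u^\mu .
```

The violation is concentrated in the bubble wall, which is why the construction requires exotic matter rather than any arrangement of ordinary stress-energy.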

@David Schroeder and JimV: I saw the videos from the Navy craft. I seem to remember they were black, but if they were featureless and white it's the same thing. I have no idea what this is for sure. It may be some sort of optical effect. Also, it is a stretch to think that some ETs would come here. If they do, it would more than likely be some robotic system. I think the probability is very small. These UFO sightings are very quirky and fleeting, where some object or optical effect appears and then vanishes. I think before I really entertain the idea of ETI visiting Earth I want to see the ship on the ground with the robot or being coming out, say like Klaatu in the classic film *The Day the Earth Stood Still*.

Why don't you start by assuming the hypothetical aliens are not complete morons and are familiar with basic military tactics. They would (1) scout first, (2) determine what their plan is and come up with a variety of ways to achieve it. If the Earth could somehow "unite" to put up some type of meaningful defense, the first thing the aliens would do is form an alliance with a faction of the people, split them off, and then defeat the rest.

The notion that there is a naive group of aliens that will show up with a force just strong enough so that a "united earth" can defeat them makes for really poor sci-fi movies and even weaker realistic scenarios.

One could easily wonder why the Native Americans didn't "unite" to destroy their invaders. The answer is the same -- the invaders were not complete morons and they conducted reconnaissance, infiltrated, undermined, and made alliances to break up any chance of unity. These tactics have been used for thousands of years, so likely an alien civilization would know about them.

@Lawrence Crowell,

For a while I entertained hopes that the theory developed by de Matos and Beck, or some other similar model, constituted a major advancement in fundamental physics even though I understood it at a rather superficial level. My hopes were buoyed further by apparent laboratory confirmation of their model in the guise of anomalies showing up in superconductor experiments by professionals and amateurs alike. I even fancied the notion that our supposed ‘visitors’ in the course of their scientific progress had developed a highly advanced propulsion technology by exploiting the properties of these “graviphotons”, their scientists having paralleled the theoretical discovery route followed by de Matos and Beck.

But, alas, with your deep and extensive knowledge of physics it’s quite evident that the de Matos/Beck model doesn’t have legs to stand on. Still, there’s the matter of superconductor anomalies, beginning with the discovery by Janet Tate et al. of the roughly tenfold higher mass of Cooper pairs than predicted by theory. Whether there is a connection between this mass discrepancy and other investigators’ reports of acceleration signals and weight anomalies in the vicinity of superconductors isn’t so clear with the graviphoton concept down for the count. But, intuitively, I wouldn’t be surprised if there is a connection, assuming experimental error isn’t the cause of the latter anomalies.

The investigator reporting a weight anomaly that I referred to in my first comment was Frederic N. Rounds, whose 1997 paper can be looked up on the arXiv. I was greatly impressed by his careful experimental protocols and analysis. Like myself, he appears to be a non-physicist, with a general knowledge of physics. I also conducted some experiments with hobbyist grade YBCO superconductors, but with a significant difference – I had no permanent magnet in close proximity to my superconductor. As I mentioned earlier, those individuals reporting weight anomalies of target masses always had a strong permanent magnet usually positioned below their YBCO chips. In every case the anomaly appeared only when the superconductor passed through its critical temperature.

Another difference between other amateur experimenters and myself was that I used a solid state accelerometer (ADXL203, 1 milli-g resolution), as the sensing element. Everyone else used sensitive digital scales in several configurations. Rounds had the superconductor and permanent magnet resting directly on the scale. His assembly was elevated high enough so that the magnet, or changes in overall magnetic flux as the superconductor transitioned, would not affect the balance’s electronics. Others used a beam balance with the target mass suspended above the superconductor and a counterweight resting on the scale. Here the length of the beam was expected to provide adequate separation to the balance’s electronics.

I am very keen now to try some more experimentation, but this time with a strong magnet beneath the YBCO chip. I’m particularly interested in trying to replicate Rounds’s very carefully documented setup. But as I still have my ADXL203 accelerometer module handy, encased within an aluminum Budbox, its output connected to a dual op-amp for establishing analog common and a signal gain of 10, I will play with that first and see if anything shows up.

Lawrence Crowell,

I had some additional things to say on (hypothetical) gravitons that I was going to post days ago, but then I lost my satellite internet service this past Tuesday, which only got restored after a long time on the phone with a rep today. Just a week or so earlier I had to cut down two trees, which had leafed out and blocked my signal for a similar stretch of time. Such are the problems with rural living. I'll post again as soon as I collect my thoughts.

I would try using only algebra and trigonometry. Even skip the calculus until one does the infinitesimal calculations on a spreadsheet. No fancy mathematical tricks, and then one might be able to figure out the final theory. Many people won't get lost in the math.

Why do I not experience my daily life as diffeomorphism-invariant?

Sabine,

It is true that some of the problems you cite (dark matter, dark energy, quantum gravity) are true inconsistencies that should one day be resolved (and many people work on them). But I don't quite buy that "quantum measurement" is a pressing problem of modern physics. I know you disagree with this; I've read your earlier post and your comment above where you claim that "the measurement problem is an inconsistency between quantum mechanics and reductionism".

At the very least, the measurement problem belongs to a very different category of inconsistencies. Quantum gravity is a theoretical inconsistency between two established theories. Dark matter is an observational inconsistency between the amount of matter that we can see and the amount we infer from gravitational effects.

But an inconsistency between QM and reductionism? To begin with, QM is a physical theory, while "reductionism" is neither a physical theory nor an observation. It is a philosophical concept which, it's true, has worked pretty well so far. Physicists have always assumed that you can reduce a level of description (say, chemistry) to the laws of a lower level (say, physics). It has worked every time and they have come to believe it is a universal truth.

But why? Reductionism is akin to Newton's absolute space and time. A philosophical assumption that everyone believed until the early 20th century. But after Einstein, nobody would think of going back to it, of insisting that there is an "inconsistency" between modern physics and absolute space. We have realized that absolute space was just a construction of our mind.

So, if QM clashes with reductionism, then too bad for reductionism! Just as SR did away with absolute space-time, QM does away with naive reductionism.

On one point I agree with you. Although a lot of papers have been written on the implications of abandoning absolute space and time, not so much has been written on giving up reductionism. And, just as absolute space-time was not really abandoned but rather turned into its Einsteinian version, naive reductionism should likewise be transfigured into something less naive and compatible with QM.

Opamanfred,

"So, if QM clashes with reductionism, then too bad for reductionism! Just as SR did away with absolute space-time, QM does away with naive reductionism."

That is one possible way of resolving the inconsistency, yes. We write this in our paper and I have also explained this previously. However, we do not have a consistent theory for how reductionism breaks down either. Just saying "do away with it" is not a scientific theory.

Good to see that we mostly agree!

On one thing I object. We do have a perfectly working theory: it's QM. What we lack is a satisfactory philosophical interpretation. The comparison with SR still holds. It is counterintuitive that some events are simultaneous as seen from here but not from somewhere else. But that's what the math says and experiments confirm. However, our brain is hard-wired to believe simultaneity is absolute, and we need some philosophical acrobatics to come to terms with this.

Same is true for QM: mathematically consistent and in agreement with experiments, but still needs a convincing philosophical interpretation.

By the way, QM is perfectly compatible with SOME form of reductionism: e.g., you can reduce chemistry to atomic physics, which is basically applied QM. That's why I said we should abandon NAIVE reductionism.

PS: what paper are you referring to?

This statement that fundamental physics has been stalled for 40 years assumes largely that elementary particles are the main foundation. Particle physics in one definition has been stalled since the standard model and QCD were developed in the 1970s; the model was confirmed in 2012 and no other development has followed. There have been interesting theoretical techniques, primarily supersymmetry comes to mind, and there have been further frontier ideas with strings. However, there are plenty of areas of physics that I think are foundations. The theory of entanglement has been a great area, fractional Hall physics, new forms of phase transitions such as quantum phase transitions at superlow temperatures, and so forth. I think a really interesting prospect is some complete map of different phase transition physics into a single scheme, or periodic table of sorts.

There have been other developments with astronomy and astrophysics and with cosmology. Our understanding of physical cosmology is thousands of times beyond what it was 40 years ago. There are other more “geographical” developments such as the discovery of extrasolar planets and we have probes and robots on or around other planets in the solar system.

I agree that quantum gravitation and the quantum measurement problem, or what I really see as decoherence maps between quantum and classical domains, are big outstanding issues. We might get some information on these; I think colliding black holes should have quantum gravity signatures in gravitational waves. So I don’t think physics has been all that stalled; I think the foundations are largely shifting directions. I studied standard model stuff back in the 1980s and got away from that field then, because doing QFT work appeared to be a complicated dead end. So I did something even less useful called general relativity.

I'd also mention LQG: they have actually made some sense of the Wheeler-DeWitt equation and moreover found solutions.

Did you come across Connes' work on the standard model? His spectral action principle outputs the entire SM including neutrino mixing, all hundred-plus terms or so!

I find that marvellous. I don't know why we weren't told this in our physics course.

Stephen Wolfram has an interesting theory that seems to reproduce and unify GR and QM. He also talks a lot about "the math" problem in physics. It would be interesting to hear you two talk about your perceptions of how to escape the math maze.

Wolfram's "theory" is really only a research direction. A very reasonable one, IMHO, but not an explanation of anything AFAIK. A discussion between Wolfram and our host would be interesting though I fear that Wolfram would only focus on "selling" his own ideas. He does give good talks, I will give him that.

Wolfram's "theory" is a typing monkeys approach to a theory of everything. I doubt the world has enough monkeys to generate Hamlet. I'm sure the world doesn't have enough humans to make anything reasonable out of his algorithmic "theory". Besides, to lean on the analogy even further, it's not clear his framework is even a typewriter ;)

I've checked Wolfram's site and the work is fascinating but it might be more about mathematical "deep structure" than physics. In other words it finds a common thread in mathematical formalism. On the other hand, taking it more seriously, it might be giving us a rule based system for characterizing possible universes-- our own might be an infinitesimal needle buried in an infinite haystack. Apart from all that, I've always been attracted to this sort of thing, this sort of reasoning, and I'm fascinated by his work. It's certainly worth study. Was not aware of this. Now I've got another distraction!

I'm not a physicist, so I'm not qualified to judge the validity of the theory, only that it seems to be one of many that are taking radically new philosophical directions. And I judge this philosophical direction to be interesting and new.

It is in this vein that I suggest our host speak to Mr Wolfram. Even if he's wrong, he makes a lot of similar philosophical points and is always interesting to listen to.

Wolfram's structure is a sort of graph theory. It might have some application, but so far I see no clear connection to physics that we understand and work with. It is more a formalism in search of a theory.

From a programmer's perspective:

The "project" (Wikipedia:Wolfram_Physics_Project) is a collection of language elements ("Wolfram Physics Project Functions") added to "The Wolfram Language and System" that would take a big chunk of time for one to get familiar with, if one is even happy working within a proprietary product in the first place.

That one can reformulate or reproduce quantum mechanics and general relativity in this new "Wolfram" programming terminology may be interesting, but who knows if it is useful to spend time learning and programming with it.

@NickW:

I vaguely knew that Wolfram had come up with some magnum opus on the foundations of physics but I'd dismissed it - I do this with anything that mentions computation in relation to physics. To me, it seems like the flavour of the era rather than something truly fundamental.

Anyway, I just looked up his work now and it seems to be akin to the causal networks (they don't use the term computation!) of Raphael Sorkin. This builds on an old theorem of Malament which showed that the causal structure of GR is enough to reconstruct the spacetime manifold. In this sense, GR is a causal theory rather than a geometric one.

Personally, I think this is an interesting approach to quantum gravity. One thing the community has shown is that they can build a 'growing' universe without violating Lorentz invariance. Thus the block universe perspective of GR is not the only perspective, and it brings back some flavour of presentism into contention. This I find important, as intuitively speaking, the present is one of the most fundamental observations one can make. We do not view the universe as an entire block all at once, to use Spinoza's term, *sub specie aeternitatis* (from the perspective of eternity, that is, outside of all time).

It's also interesting because quantization is a pervasive feature of modern physics. But what is its fundamental meaning? I mean, rather than starting with a classical system and then quantising it, we really should have a recipe for building quantum systems *ab initio*. That to me is the truly fundamental approach. I think Sorkin is doing some brave work by tackling this difficult question.
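The causal-network idea mentioned above can be illustrated with a toy computation (my own sketch, not Sorkin's actual construction; all names are illustrative): Poisson-sprinkle events into a patch of 1+1 Minkowski spacetime and keep only the relation "event i causally precedes event j". Malament-type results say this order, plus a count of points as a volume measure, in principle suffices to recover the geometry.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sprinkle N events uniformly into a unit square of 1+1 Minkowski
# spacetime (coordinates t, x; units with c = 1).
N = 200
t = rng.uniform(0, 1, N)
x = rng.uniform(0, 1, N)

def precedes(i, j):
    """True if event j lies in the future light cone of event i."""
    dt = t[j] - t[i]
    return dt > 0 and abs(x[j] - x[i]) < dt

# The causal matrix: relations[i, j] is True when i precedes j.
# This partial order is all the "geometry" a causal set retains.
relations = np.zeros((N, N), dtype=bool)
for i in range(N):
    for j in range(N):
        if i != j and precedes(i, j):
            relations[i, j] = True
```

Because the relation comes from light cones, it is automatically antisymmetric and transitive, i.e. a genuine partial order, which is the defining structure of a causal set.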

I notice that there is a Kindle version of the German edition of your book but not for the English edition. Will this change any time soon?

Not sure what you mean. The Kindle version of the English edition appeared with the hardcover in 2018. You have to watch out to use a US store, as I don't have a publisher in the UK (or in Australia or in South Africa), in case that answers the question.

Amazon in the US won't sell it to me because they know I'm in the UK. I suspect that even if I bypassed that, my kindle reader wouldn't show it to me.

As I pointed out to Sabine a long time ago, you are right - there is no Kindle version available in the UK!

Sabine, here is my email address - if you contact me, I will send you a copy of the Amazon screen that we (or at least some of us) get in the UK

dave at dbailey dot co dot uk

Sabine,

I don't understand your comment - I can buy all sorts of books from amazon.co.uk - they don't need to be published here!

Norman,

I do not have a publisher in the UK. I know that this is inconvenient and people from the UK have complained to me about this for 2 years, but let me assure you that it is not that I didn't want a publisher in the UK, it's that no publisher in the UK was interested. There is nothing I can do about this.

David,

I know that there is no Kindle version available in the UK. That is because I do not have a publisher in the UK.

David,

You can totally buy my book in the UK. I am sure about that because I have seen my book in stores in the UK with my own eyes. You can also order my book from the UK. But it'll be imported from the USA. Now how the copyrights on Kindle work seems to be a different matter altogether. And really I am not the right person to ask. The publisher is Basic Books and you are most welcome to report your misery to them.

Have you guys ever tried to use amazon.com for kindle books? The one for the US? That works for me whenever I need it, or I use any other country's site. I use the same credentials everywhere and it works, especially for kindle books. Oh, BTW, Apple has it too. My preferred reader at the moment.

Sabine,

I have emailed Basic Books, but so far they have not come back to me.

Christian - I have tried that, but it seems to recognise that I have a UK amazon account, and it links to that. I don't seem to get the offer to buy Sabine's book in Kindle form!

Sabine,

Basic Books has sent me a copy of your book in Kindle format - Yay!

I have written back to find out how to pay them and also to find out why I was unable to purchase your book in the ordinary way, here in the UK. Hopefully that might prod them into correcting whatever sort of bureaucratic glitch has caused this. Anyway, I will keep you updated!

I guess Norman Stevens could obtain a Kindle copy via the same route.

David,

Amazing! Most amazing!

Mathematicians are not magicians. Are physicists really lost in math or are they lost in their own make-believe?

A significant problem is to distinguish knowledge application from science. Knowledge application is readily paid for; pure science is an expensive luxury that is easily dropped by the payer.

At its heart, science is a philosophy that postulates an objective reality that can be measured and described through models. Good models are explanatory and predictive (of things we don't yet know). Better models are simpler. Great models resolve paradoxes (as relativity and quantum mechanics did) and lead to a deeper understanding of reality.

Science requires iconoclasm: a willingness to go against the elders and caretakers of the received wisdom. This is an increasingly expensive proposition, both in terms of funding for the experiment and the ability to earn a livelihood.

18th century scientific explanations were understandable by the average college graduate, and immense wealth generated from applications of this knowledge was the dividend. It is not so clear in the 21st century. Most knowledge workers barely understand the science of their own profession, and usually have no more insight into another field than a lay person. The low-hanging fruit has been harvested. The question is whether the high-hanging fruit is worth the price paid to retrieve it. In a resource-strapped society, this becomes an increasingly difficult argument.

We're not a resource-strapped society. UK GDP in 1700 was 10 billion pounds (using 2013 pounds). In 2018 it was 2 trillion pounds. You can tweak the data as you like but the results will be the same.

It's part of a certain mythical tradition of thought that says science requires 'iconoclasm'. I'd say that this is in the scientific realm what individualism amounts to in the social and political realm.

But I don't think this is quite true. The great physicists were actually conservatives. They were, instead, forced to their new and novel conclusions by exhausting all the usual methods.

Being iconoclastic for iconoclasm's sake, like art for art's sake, is generally a losing proposition.

The problem with resources is not that we are 'resource-strapped', but how they are distributed. In the neo-liberal world order most wealth is held by a tiny demographic, the famous one percent. This, in my mind, is not conducive to scientific research (not to mention the social and moral concerns here). Should Bill Gates be deciding research programmes rather than scientists themselves?

Sabine,

Your critical reflection on present physics is really important and necessary for any progress. But in my view you do not follow your own rules here.

You say about correct predictions:

“What did these correct predictions have in common?

They have in common that they were based on theoretical advances which resolved an inconsistency in the then existing theories. What I mean by inconsistency here is an internal logical disagreement.”

There are inconsistencies in accepted theories and most physicists do not care, you as well in some cases.

Example: Einstein’s SR.

Einstein’s denial of an absolute frame excludes rotational motion. That was criticized at his time by Mach and Lorentz. Einstein *admitted* to both that this is a logical problem. In the case of Lorentz there was a detailed discussion by letter between Lorentz and Einstein. Einstein admitted this logical conflict but stated to Lorentz that his principle (here the strong equivalence principle) was more important to him than this conflict. And this principle is in conflict with a fixed frame. This incompatibility between Einstein’s space-time and rotation remains unresolved to this day. There is even new experimental evidence.

And I explained another point here earlier: The mentioned strong equivalence principle is falsified by several experiments. But GR is based on it.

I think that as long as such elementary errors are maintained, we should not be surprised that open problems like inflation and dark energy cannot be solved.

>"And I explained another point here earlier: The mentioned strong equivalence principle is falsified by several experiments. But GR is based on it.

I think that as long as such elementary errors are maintained, we should not be surprised that open problems like inflation and dark energy cannot be solved."<

I do believe it is inaccurate to speak of "errors being maintained". People in the field are well aware of these facts and limitations. It's pretty obvious that GR and quantum theories are not the end of the story.

So what happens is that people are searching for a deeper understanding while still using those incredibly successful theories. It may be the latter part that appears to you to be "maintaining elementary errors". It is, however, just common sense to use the best available tools until better ones are available.

Shall we in fact continue with those “incredibly successful theories” like Einstein’s relativity?

We know from the unresolved discussion between Einstein and Mach / Lorentz that Einstein’s basic opinion against an absolute frame is not correct. This is too big a point to be ignored. And it is the cause of the metrics of Minkowski and Riemann which make relativity so complicated.

We have, alternatively, the relativity of Lorentz, based on an absolute frame. It provides the same results as Einstein's version for the successful experiments, but the handling is much simpler because it uses Euclidean geometry. And it offers solutions for the mentioned open problems.

A) Antooneo is correct.

B) There are no "people in the field" unless someone has recently done, said, or produced something even remotely sensible of which I am unaware.

C) Search for a "deeper" understanding when people lack even the most basic fundamental understanding is not going to clear up anything.

D) I fear seeming rude or ignorant, but ... Physics!

Best greetings!

Yes, thank you!

I think that it is important to be open to investigating new solutions. My impression is that some fundamental theories in physics are treated like religions. I know of professors of physics who question certain statements of Einstein's, and when they do, nobody is willing to talk to them anymore.

Greetings back!

I think all of our successes have resulted in us painting ourselves into a corner that we can't escape, a false bottom or solution. We may need to start from scratch. Maybe invent a physics where geometry isn't fundamental but is derived. That would get us away from the singularities and infinities that aren't physically real.

The problem with today's physics is a lack of reality. If anyone truly believes that QM and/or GR are “reality”, they are “lost in it”. Unfortunately that stuff seems to be taught as religion and most physicists are too gullible to realize the difference.

ReplyDeleteSabine, about a year ago I bought and read your book "Lost in math".

The phrase "can't see the forest for the trees" comes to mind without searching. Although it's obvious that physical emergent processes cannot be modelled exactly by math alone, there is the lurking possibility that the most reduced physics could be pure mathematical logic, for example in terms of number theory...

Thus, when excited about the sophisticated mathematics of emergent phenomena, one may become blind. Within mathematics alone, it is fruitful to comprehend how discrete logic generates the fabric of continuous differentials, and to find many other methods that have direct interfaces with physics, after all.

So doing physics, I think, is a way of balancing ugliness and beauty.

I often wonder where people get these impressions. Physics is an experimental science and its theories are put to the test. QM has been exhaustively tested. GR has fewer experimental benchmarks, but still a fair number, and the detection of gravitational waves was the latest big qualifier for GR.

I think most physicists recognize that GR may well be a classical approximation, a limiting case of a more complete physics that in some way includes QM. With QM, we have the Wheeler question, "Why the quantum?" Either QM is a sort of physical logic that transcends such a question, in which case it is not a limiting case of some other theory; or there is an affirmative answer to Wheeler's question, and QM is some limiting case, or something that exists as it is for some determined reason.

Physicists go through this education process taking the better part of a decade. We may get bleary eyed in a graduate course in classical electrodynamics, but it is not the same as attending some religious service. Sure, it is taught in a somewhat formulaic way, for students have to be moved forwards to complete prelim or qualifier exams and so forth. Yet at no point are these subjects taught as unquestionable articles of faith.

If anyone is interested, you can "diagram" the laws of physics in the same way that you can "diagram" a sentence. This keeps all the math grounded to known physics for which there is a ton of experimental evidence.

John Wilson - Scirealm - SRQM

My humble opinion is that the general Einsteinian space-time frameworks satisfy classical physics, but not quantum physics. For me, a fundamental and interdependent space-time and energy are necessary; but if the mathematics of classical physics allows playing with T, S, and E independently, perhaps in quantum physics this cannot be done without falling into contradictions.

Dr. Hossenfelder,

I thought your book was outstanding and I enjoyed it very much. However, I will admit that I may have taken the purpose and point of the book a little too far, as I am asking questions about everything now. For example, the standard model: I have read so much about how it is one of the great successes of quantum theory, and it is a success. But there are so many unanswered questions, and it looks like nobody is interested in answering them. I would like to know why there is a 100-fold difference between the proton's mass and its combined quark masses, or how come the top quark does not combine with anything, and, one of my favorites, what is the purpose of mesons? What do two combined quarks with a short lifespan actually do for the standard model and reality? I am sure there are plenty of theories, but what are the answers? How much further could we go if we had more answers?

And, a slight change of topics: a related follow-up question for Lawrence Crowell, as he made a statement that buzzed the retired cop in me. In his 06-06-2020 0905 post on the topic of aliens he made a statement that contained the term “conclusive evidence.” I would ask him to define what “conclusive evidence” is or means. Let me be clear, I am not trying to start an ET - UFO conversation, as this is not the place for that. Rather, I am trying to point out that based on my experience I believe I have a definition different from what I think Lawrence is referring to (my partner and I had a 100% conviction rate on our major cases, including a few murders, and DUI arrests over 25 years). Maybe a slightly relaxed take on “conclusive evidence” in the sciences could start research into an area looking for scientific “conclusive evidence”, because from what I can see there really are a whole lot of unanswered questions.

Thanks Dr. Hossenfelder.

Riffing on Wolfram: imagine a cellular automaton evolving according to simple rules and we have already determined that these rules lead to computationally irreducible evolution. Shrink the cells and let them approach zero dimensions as a limit. In the limit we have a sort of continuous space with a "structure" determining distribution and evolution of whatever might be measurable in that space. How useful would differential laws be within such a space? Could one even formulate any? At best, they are only useful in a limited way, over, perhaps, certain distances and spans of time and with other restrictions, maybe involving velocities and masses. In other cases, starting with reducible rules, we might arrive at spaces that can be perfectly characterized by differential laws. I think the possibilities could be expressed as spaces allowing 1. perfect periodic behavior, 2. quasi-periodic behavior, and 3. a class of spaces that are not periodic on any scale. The last category is not reducible and chaotic. The first category is reducible and all reducible spaces allow perfect periodic behavior. The middle category is irreducible but allows approximation and periodic behavior with restrictions. In the first case, a "final theory" is possible. In the last case, no theory of any kind is possible, no prediction is possible at all-- chaotic. In the middle case, approximations are possible but the theories are never complete and new phenomena always appear as new regimes are explored. No final theory is possible in the "quasi-periodic" space but one can always find better approximations. And within each category, there are infinite variations, of course. So we might wonder what sort of space we inhabit. Probably #2, don't you think? This is quick and dirty. Like I said, it's a riff.
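The riff above can be made concrete with a minimal elementary cellular automaton; Rule 30 is the standard example of apparently irreducible evolution (the code is my own sketch, not taken from Wolfram's project):

```python
import numpy as np

def step(cells, rule=30):
    """One update of an elementary cellular automaton with periodic boundaries."""
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    # Each 3-cell neighborhood (left, center, right) selects one bit
    # of the 8-bit rule number.
    idx = 4 * left + 2 * cells + right
    return (rule >> idx) & 1

cells = np.zeros(64, dtype=int)
cells[32] = 1  # a single seed cell
history = [cells]
for _ in range(32):
    cells = step(cells)
    history.append(cells)

# Render the first few rows of the evolution as text.
for row in history[:8]:
    print("".join(".#"[c] for c in row))
```

The triangle of seemingly random structure growing from the single seed is what "computational irreducibility" refers to: no known shortcut predicts row N without computing all the earlier rows, which is exactly the situation in which differential laws would offer at best restricted approximations.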

I bought my hardcover copy of Lost in Math from Barnes and Noble in the US. It is available also on B&N's Nook e-reader, similar to the Kindle. B&N was recently purchased by a UK bookseller, so maybe Lost in Math is available in the UK on the Nook.


Some things are confused here:

Scientific ideas can be wrong; they are not called unscientific in this case. Actually, the methodology of the present searches for something physically fundamental is based on what worked in the past (e.g. quantize all the interactions). String theorists use the same methodology as their predecessors did, whether Geoffrey Chew or Yang and Mills (see e.g. Castellani on early stringy methodology, arXiv:1802.03850).

And the other confusion is the one between fundamental physics and basic science. Progress in basic science across the disciplines will not stall just because physicists cannot make sense of the measurement problem or of THEORETICAL inconsistencies between otherwise very well behaved theories. The LHC may be a very large machine, but fundamental particle physics is a teeny tiny part of a very large discipline, although with a historically grown perception of being THE most important (or at least most reported-on) part of what physics is.

And really - there is no methodological difference between (your) superfluid dark matter and the string theorist hiding away space dimensions.

I hope to read some new ideas (a research paper?) on how the research you criticize could be refined to initiate progress. We need evolving critical thought as much as we need new results. Anyway, a constant chain of complaints will not lead to anything but polemics.

Sincerely

"To try and fool others is bad, to try and fool oneself is stupid"

(Pendapat Saya)

Any new material or interesting corrections in the paperback version?

Hi David,

No, just a few typos fixed.

For some reason a subthread has turned to ETI. I think within the science-fiction genre Stanislaw Lem probably got some of this right. Intelligent life from elsewhere may be so fundamentally different from us that we might have trouble understanding whether any signal from them is intelligent. The way some ETI might express concepts could be in effect so utterly scrambled relative to our way of expressing things that we might have trouble seeing a signal as anything but noise. In fact, we encrypt messages so they are noisy in appearance.

I think when it comes to conclusive information we have some sense of what this means. It certainly means some signal-to-noise ratio that is greater than one. Of course, the SETI folks may be getting ETI signals that are encrypted and indecipherable. I rather doubt it, but it is not impossible. It is too early to say whether ETI exists or not. If, in the more extreme case, they might be visiting here, I think we need tangible evidence. A piece from an alien spacecraft might be a start, or some interaction with us humans. So far it is no cigar.

Lawrence, I did not mean to take this into an ETI discussion, as that is not what Dr. Hossenfelder's blog is about. Rather, I was looking to show the point that "conclusive evidence" can mean many things, and that maybe the rigors of the scientific method create a standard of "conclusive evidence" that can be impossible to meet. And maybe conclusive evidence does not necessarily mean meeting a complete mathematical standard. One could say that string theory meets a mathematically rigorous conclusive-evidence standard, but it could be argued that it is not always about the math, as string theory does not do so well at meeting a "reality"-based conclusive-evidence standard.

Physics is the science of our universe, and it is such a wonderful place. Maybe physics could really progress and move forward with a little more imagination and a little less mathematical rigor, at least to start looking at other things.

Sabine,

Your view of “fundamental physics” and its role in our lives sounds rather parochial. During these past forty years, despite the lack of any advance in our understanding of “fundamental physics,” science, math, and technology have all shown great advances in all sorts of areas. Our scientific evolution hasn’t come to a standstill; the world still revolves around the Sun, and life goes on. Maybe these past forty years just proved that your “fundamental physics” is not really as necessary as you claim, having been replaced by maybe not so “fundamental” but at least as useful and interesting physics.

aydemir,

I wasn't talking about progress in physics by way of measuring, what, numbers of papers? Though even on that count, progress in physics has slowed down. I was talking, explicitly if I may point out, about societal progress. Societal progress today builds on progress in the foundations of physics a century ago. Societal progress is already slowing, and if we don't get our act together in the foundations of physics, that's not going to improve.

Aydemir,

you are just saying: shut up and calculate. It has been said for more than 80 years; that is the problem.

"overall societal progress depends crucially on progress in the foundations of physics, more so than on any other discipline."

Sabine, I find it hard to believe you actually meant that. Don't you think most of the credit should go to biology and chemistry, at least through increasing crop productivity? Yes, we can count modern electronics as critically dependent on QED (and materials science, and informatics), but apart from that, the contribution of foundational physics has been sparse (GPS, nuclear plants, anything else?), and it has been a long time since those theories, both in terms of years and in terms of a series of newer important discoveries. If a (series of) discovery(ies) such as the Standard Model was negligible in terms of impact on our daily lives, why do you think foundational science will be more crucial to further progress than, say, working on quantum computers, CRISPR, or even the social sciences? I think we can agree we could use some improvement in how we organize our societies before the next revolution in the foundations of physics.

Allow me to look backward: the proceedings of a 1953 International Conference of Theoretical Physics (900 pages) contain zero papers devoted to general relativity--the focus is upon field theory. Richard Price, in his 'Primer of General Relativity,' writes: "...the necessary mathematics of tensors is not part of the background of most physicists" (1982, American Journal of Physics). Thus, as of 1982, tensors were not an integral part of 'most' physicists' background! I mention those two items because it has been forgotten that progress in theoretical physics has always been slow. The present day is hardly an exception. The essay 'Still Lost in Math' states: "the problem is that physicists rely on unscientific methods to develop new theories." If that is the case, then it has always been the case--it did not originate with trends like supersymmetry or superstrings--as that 1953 conference begins with Yukawa's concoction of a non-local field theory (see page 2). From Yang and Mills' 1954 gauge-theory paper we learn: "our motivation was completely divorced from general relativity" (page 184, Dawning of Gauge Theory). Reiterating: progress in theoretical physics has always been slow. That progress in 'foundations' should be even slower should surprise no one. Reading 'Still Lost in Math,' there is: "...some issues with quantum field theories." Whatever those issues are, it is hard to believe that physicists are totally lost, as the 2020 Particle Data Group numbers reveal agreement between experiment and theory down to length scales of 10^(-19) meters! (page 21, section 9, quantum chromodynamics, where uncertainties are spelled out).

Lost in Math or Lost in Philosophy?

Quantum philosophy/quantum epistemology (the commonly agreed/disagreed understanding of QM) is built on many partial interpretations accumulated during the long and complicated journey from Planck to Born (energy packages, wave-particle photon, electron jumping among orbits, particle-wave electron, matrix mechanics non-commutativity, wave equation, uncertainty principle, probability wave interpretation, complementarity, measurement, entanglement, decoherence).

However, the “shut up and calculate” stuff based on the existence of intrinsic spin, the Dirac equation and its extensions -- abelian QED U(1) and non-abelian EW SU(2) and QCD SU(3), covering all mathematical technicalities (plus hyper-charges and iso-spins) -- is not manifesting itself in general understanding/epistemology despite its phenomenal success in the explanation of experiments/quantum ontology (interaction by exchange of particles, matter particles, field particles, virtual particles (in Feynman diagrams), hyper-charges, iso-spins, color charge, internal degrees of freedom, fiber bundles, internal geometrization, holograms, susy, quark mixing, neutrino oscillations, gauge, groups, abelian/non-abelian, non-linearities, renormalization, running constants, isolated-quark non-observability/observability, 2-colored gluons, asymptotic freedom, confinement, charged and neutral weak currents, EM charge of W bosons, hadronization/non-hadronization of heavy quarks, mass (lepton masses, hadron masses), 3 generations, …).

The fact that “quantum” has meant exactly and only “quantum of spin” (angular momentum) all the time since Planck has remained, and remains, mostly unnoticed. Instead one boldly tries to unify the SM with gravity based on personal perceptions of beauty/ugliness.

There is no clear, comprehensive, logical, consistent understanding of the geometry of spinors (Dirac equation) available, beyond the fact that they are just used as a weird tool to work with matter particles/fermions. This situation has persisted for 90 years, since Weyl discovered neutrinos in the Dirac equation. Weyl also invented the gauge field, from which our current understanding of the SM was later developed; however, this part of the story -- the successful one -- ended in the 70s, as you already mentioned. Until the 90s, many thinkable and unthinkable GUTs based on group-theoretical arguments were developed and studied. Later development is almost completely lacking any grounding in, or connection with, physical reality. Surprisingly, the evident non-linearities residing in the electroweak model and quantum chromodynamics, and their relation to the non-linearity in gravity, did not attract significant attention.

We all know that Newton’s absolute space and time are nonsense, but unitarity with the necessary background dependence of QM is not just OK, it is the central dogma to which we resort. Where is the courage and imagination? What about the possibility that a ‘diffeomorphism-invariant’ QM is a kind of “self-dual” view/question that is going to offer us answers to all open problems at once (SM+gravity unification, DM, DE)?

Dark matter is the desperately searched-for ether of our times, but how can one know which of a plenitude of empty-handed experiments is to be recognized as the modern version of Michelson-Morley? What about thinking of physics resulting in MOND? Or about the connection between the SM properties of matter (which constitutes and fills the universe), their relation to gravity, and the properties of the observed universe itself?

Dark energy is the measure of our ignorance of both the macro- and micro-cosmos. How do we even dare to think of explanations when we know only about 5% of the whole mystery? Or is there something missing? Like a somewhat critical view of current cosmology, which sees our cosmos as 13.8 billion years old and expanding in a very complicated way from an initial singularity to its observed size, without any reliable (and model-independent) direct evidence?

Hi Sabine,

Thanks for the post and your scientific outreach.

I have a remark: on multiple occasions, you've repeated that physicists need to think about the measurement problem.

If we read books on statistics or probability theory, they speak of sampling random-variable realizations from probability distributions. The mechanism by which this happens (i.e., the measurement problem) is not discussed, because it is assumed that getting data points is a "learning" process, i.e., a mental process, not a physical mechanism.

As a physicist who works closely with data myself, I've always found this picture satisfying. That is, it resolves the Wigner's-friend paradox (wave functions reflect what we know, not what there is, and collapse further reduces what we know based on what we have learned) and is consistent with the picture from my colleagues in the math department. What are we all missing?

[p.s. in part this is a question because it annoys me to hear the measurement problem is a problem as not all physicists agree. In part, this is also my way of communicating to your audience (unless I am convinced otherwise) that many of us fail to see the measurement problem as a problem]

Steve,

The interpretation you suggest is not reductionist. It posits that the wave function encodes knowledge held by observers and obtained by measurements, but there isn't any such thing as an "observer" or their "knowledge" or a "measurement" in the fundamental theory.

This means you either have to postulate that these terms cannot be derived from the fundamental constituents, which means you advocate strong emergence, for which there is not even a theory. Or you have to accept that the measurement postulate is emergent. In this case, however, it cannot emerge from quantum theory as we know it, meaning the theory is itself incomplete.

Hi Sabine,

thanks for the reply.

You gave me two options: 1) "postulate that these terms cannot be derived from the fundamental constituents" or 2) "accept that the measurement postulate is emergent".

I agree with 1 insofar as they cannot be derived from QM or something deeper within physics. The concept of a probability, the process of sampling random variables from probability distributions, and the learning process (relying on Bayes' theorem) are more fundamental than QM. They follow from the logic of the mathematics of probability theory.

I disagree with you when you say "for which there is not even a theory" because the mathematics underlying probability theory and what it means to sample from distributions are established and hold equally well for QM as they do for classical statistical mechanics.

Your thoughts (and time!) on this are appreciated.

A QMO, a quantum-mechanical object, can act in two different ways: either linearly, in which case it gets projected into its own future/continuation, or nonlinearly, which means its destruction. In the latter case, an (almost) pointwise action of a distributed field becomes possible. The continuation of a QMO can be described with (at least) two non-commuting observables, e.g. position and momentum, resulting in the uncertainty principle.

These three postulates are the axiomatic basis of QM. Inherently, axioms can't be reduced further. QM does not emerge from other theories. The "observer" of different physical theories might see an inconsistency here. But the inconsistency is not in QM itself.

Steve Presse wrote:

>The mechanism by which this happens (i.e., the measurement problem) is not discussed because it is assumed that getting data points is a "learning" process. i.e., it is a mental process not a physical mechanism.

Steve, you appear to be making the physical world depend on (our) mental processes.

You have three options here:

A) Argue that our mental processes are emergent from the physical (quantum) world. But then you are back to explaining it all within quantum mechanics without adding any additional postulates.

B) Make mental phenomena into something that is not simply emergent from the quantum world but something sui generis. Perhaps that is how the world works -- but then we would really like more information about this new natural entity that transcends physics.

C) Waffle -- this is of course the standard approach: admit that measurement is an additional postulate added to QM, but, when pushed, argue that it is somehow emergent from QM, which means it is not a separate postulate.

Of course, there is also option D:

D) QM is incomplete: there is a deeper theory.

As you know, Einstein, Schrödinger, de Broglie, et al. opted for D. Wigner, Penrose, and, again, Schrödinger opted for B.

Frankly, A and C strike me as dishonest -- simple con games that some physicists use to hide their insecurities.

I do not myself know whether B or D is true: I am reluctant to claim that the physical world depends upon consciousness, but then Nature presumably does not care what I think!

It seems to me that you are opting for B, in which case, the obvious questions are: How do you know? And could you give us more details, please?

If you are opting for A or C... well, I am willing to grant that you are not knowingly being dishonest, but I think John Bell was right that, fundamentally, those approaches are dishonest.

Dave

Dear PhysicistDave,

>Steve, you appear to be making the physical world depend on (our) mental processes.

Hmmm... you appear to be making the same mistake made by all those who initially rejected the maximum-entropy derivation of probabilities in statistical mechanics, i.e., "why would I want to maximize my uncertainty? Entropy is a physical quantity that has nothing to do with our mental state." Luckily for all of us, this opinion died out, the power of information theory in statistical physics was recognized, and science moved forward.

Yours is a silly mistake to make in the 21st century. Probabilities mean something, ie., they have subjective interpretations. And to make the mental leap that because probabilities have subjective interpretations we must have a model of a human with consciousness in QM is silly.

Steve Presse wrote to me:

>Yours is a silly mistake to make in the 21st century. Probabilities mean something, ie., they have subjective interpretations. And to make the mental leap that because probabilities have subjective interpretations we must have a model of a human with consciousness in QM is silly.

So, there can be something "subjective" without there being a "subject"?

Okaaaayyyyy.... (as he looks for a safe exit to escape from the crazy guy).

Dear PhysicistDave,

Let's get serious and talk science, please.

> So, there can be something "subjective" without there being a "subject"?

A subject is NOT required and yours is a childish play of words.

Would it make it better if I called them Bayesian probabilities?

We don't all need to be the Right Rev. Bayes now, do we? ;)

Let's not play word games.

I want to help but work with me.

You are setting up a false conundrum when you say that what I originally said depends on an observer's mental state. It absolutely does not.

Where do words like mental state, consciousness, observer, collapse, etc. pop up in a serious text on probability theory?

Ask yourself the question: why not?

[p.s. they are imprecise terms that should disappear from our 21st century vocabulary. That's why.]

Also ask yourself this: why do people not speak of interpreting probabilities in stat mech? It's exactly the SAME issue. Is it because people believe that probabilities in stat mech are used to HIDE finer-grained details that we can uncover, so they are not to be interpreted subjectively? Well, if they do, they are also wrong.

Sabine also set up a false dichotomy by telling us that incorporating the measurement process within physics requires reductionism (seeing the introduction of the observer as being in conflict with reductionism).

But there is no conflict and there is no need for words like "observers".

Let's be serious and let's use the language of mathematics to communicate.

So let's start from scratch:

1) QM is probabilistic.

Like it or not, it just is. Probabilities aren't used to HIDE finer grained details that are left for us to discover (i.e., reductionist hidden variables). They can be and ARE fundamental objects in QM.

2) Probabilities can't be measured like a scalar in thermodynamics or classical mechanics. They must be sampled.

Hence the "measurement problem".

They need to be interpreted and this is true of probabilities everywhere in Physics (including QM and stat mech).

Steve Presse,

I was flippant in my reply late last night (your own post was pretty flippant too – one good flip deserves another!). But I think the issue does deserve a more thoughtful response.

You declare that probability in QM is "subjective," but you deny that this has anything to do with consciousness.

That is, at best, confusing. I am guessing that you are appealing to some sort of Bayesian approach. Traditionally, Bayesian approaches start with some person's "priors" -- i.e., they do start with consciousness.

Furthermore, the word "subjective" does indeed come from the word "subject": in normal English, when we refer to something as being "subjective," we do mean that it somehow relates to the views, feelings, etc. of some actual conscious person.

If you are using these words in some unusual sense, it would be nice if you would explain it.

Indeed, when we physicists teach quantum mechanics, we have not generally used a subjectivist, Bayesian approach: in fact, I have never seen that done. The probabilistic predictions of QM can be and usually are interpreted from a frequentist perspective.

I know that from time to time some people (usually not physicists) have tried to interpret the wave function as simply revealing our own ignorance. But students are supposed to learn that this just does not work. If the particle in the two-slit experiment really went through one slit or the other slit but we just do not know which, it is very difficult to explain the interference pattern: you should just get some linear combination of the probability distributions from each of the two slits separately.

But there is a broader issue here: you treat it as trivially obvious "in the 21st century" that the measurement problem is trivial.
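The two-slit point above can be checked with a toy calculation. The following sketch (my own illustration, using idealized unit-norm plane-wave amplitudes) shows that adding amplitudes and then squaring is not the same as adding the single-slit probabilities, which is why the ignorance interpretation fails:

```python
# Toy double-slit check: interference of amplitudes vs. added probabilities.

import cmath

def amplitude(path_length, wavelength=1.0):
    # Phase accumulated along one idealized path (unit-norm plane wave).
    return cmath.exp(2j * cmath.pi * path_length / wavelength)

# Two paths to the same screen point, differing by half a wavelength.
a1 = amplitude(10.00)
a2 = amplitude(10.50)   # half-wavelength difference -> destructive

p_interference = abs(a1 + a2) ** 2            # what QM predicts
p_ignorance = abs(a1) ** 2 + abs(a2) ** 2     # "went through one slit,
                                              # we just don't know which"

print(p_interference)  # ~0: the amplitudes cancel
print(p_ignorance)     # ~2: no cancellation possible
```

The "ignorance" combination can never produce the dark fringes that the amplitude sum does.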

Yet, for nearly a hundred years the most brilliant minds in physics – Einstein, Schrödinger, Heisenberg, de Broglie, Wigner, Wheeler, Penrose, and, most recently, Steve Weinberg – have struggled with and debated the measurement problem.

But, you are smarter than all of them: all of their work can be dismissed because they were not... Steve Presse.

I am truly impressed by your... er, self-confidence.

A bit of googling shows that you work in cell biology at a university that is not very prestigious, not even in the top hundred in the US News rankings. It is nice to know that geniuses like you can be found even at fifth-rate universities! (Your university accepted my daughter even though my daughter decided not to complete the application process. Pretty funny.)

Pardon the sarcasm, but does it ever occur to you that if you disagree with Einstein, Schrödinger, Heisenberg, de Broglie, Wigner, Wheeler, Penrose, and, most recently, Steve Weinberg, then just maybe possibly you are the one who is wrong?

PhysicistDave,

do a bit more googling and you will find that I am in Physics (not cell biology) in a top-50 department. Nice try at ad hominem.

For the sake of our audience, let's argue science, shall we?

Let's start by addressing your misconceptions/disdain for logic.

>That is, at best, confusing. I am guessing that you are appealing to some sort of Bayesian approach. Traditionally, Bayesian approaches start with some person's "priors" -- i.e., they do start with consciousness.

I laughed out loud when I saw that.

To answer your question: No!

All probability distributions must be defined over some range (the support of the distribution). The prior defines the broadest possible range of the distribution which may have non-zero support when data is supplied.

For technical reasons, not worth delving into here, we can also assign variable weights (a probability) to each region of the support.

That support plus weights is called a "prior".

Consistency of Bayesian methods requires those "prior weights" to provide vanishingly small contributions to your final conclusion (posterior weights) with enough data.
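As a hedged numerical sketch of that claim (my example, not the commenter's), here is a conjugate Beta-Bernoulli update showing that two very different priors lead to nearly the same posterior mean once enough data arrives:

```python
# Two different Beta priors on a coin's bias converge to nearly the same
# posterior mean given enough flips: the prior's contribution vanishes.

import random

random.seed(0)
true_p = 0.7
flips = [1 if random.random() < true_p else 0 for _ in range(2000)]
heads, tails = sum(flips), len(flips) - sum(flips)

# Beta(a, b) prior -> Beta(a + heads, b + tails) posterior for a
# Bernoulli likelihood (conjugate update).
for a, b in [(1, 1), (50, 5)]:   # flat prior vs. a strongly biased one
    post_mean = (a + heads) / (a + b + heads + tails)
    print(f"prior Beta({a},{b}): posterior mean = {post_mean:.3f}")
```

Both posterior means end up close to the empirical frequency, no matter which prior one started from.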

Note the absence of the word "consciousness" in my description, armchair PhysicistDave? ;)

I think you're a bit of a lost/hopeless cause. You like dropping names like Einstein and Schroedinger. But when push comes to shove, your arguments have little substance.

On the other hand, I do hope our broader audience appreciates that the "measurement problem" is not a problem if probabilities are treated as fundamental objects (not ways of hiding finer details) whose properties are teased out by the process of sampling (i.e., "collapse") as is REQUIRED for all probabilistic theories.

Steve Presse wrote to me:

>Let's start by addressing your misconceptions/disdain for [Bayesian] logic.

Well, Steve, I was learning about Bayesian analysis since long before you were born: more likely you are confused than I.

By the way, I also am co-inventor on various patents on applications of information theory, another subject I was studying, quite literally, years before you were born.

Steve also wrote:

>do a bit more googling and you will find that I am in Physics (not cell biology) in a top 50 department.

The work you list on your lab page is not what most physicists would call physics.

And I'd never heard of your department, and your school is 117 on the US News rankings. No one takes your school seriously. As I said, they admitted my daughter (who of course did not go there) even though she decided never to finish the application. Now, that is open admissions!

But no doubt you are an unsung genius even though you are stuck at such a school.

By the way, everyone should know your bachelor's is in chem and your PhD in chemical physics, which from a physicist's viewpoint is more chem than physics.

Steve also wrote:

> I think you're a bit of a lost/hopeless cause. You like dropping names like Einstein and Schroedinger.

...

> On the other hand, I do hope our broader audience appreciates that the "measurement problem" is not a problem.

Oh, I certainly hope you are right. I hope everyone here is smart enough to realize that when a genius like Steve Presse disagrees with a good-for-nothing like Steve Weinberg (I mean -- who cares that Weinberg is co-founder of the Standard Model and has that silly Nobel prize?), then of course they should just assume that the genius Steve Presse is right and that the worthless good-for-nothing Steve Weinberg is wrong!

I mean, just because Weinberg wrote a three-volume text on quantum field theory and a text on quantum mechanics and then there is his text on general relativity...

Did I mention that I took QM from Dick Feynman and QFT from Steve Weinberg? I truly regret that I did not have a chance to take all that from a true genius like Steve Presse, even though his degrees are really in chem, not physics.

And, Steve, why do you leave all those blank lines at the end of your posts just eating up screen space? Okay, blank lines equal blank _______ .

Steve Presse, I salute you! Now I finally understand QM! Go Sun Devils!

(Aside to any other readers: if you really think Steve Presse is more brilliant than Steve Weinberg, take Presse seriously. Otherwise, think about what I said about the two-slit experiment.)

@ PhysicistDave

Your answer about probabilities inspires me to reflect on the difficult dialogue between specialists and non-specialists (in bad English).

In physics at least, the non-specialist, as the outsider who opposes logical arguments or conceptual coherence, meets various attitudes, which can be classified as follows, whether the problem submitted is already known to the specialists or not:

1 - Ignorance of the problem submitted, by choice or by epistemological belief in pragmatism or instrumentalism in science: all that matters is predictive formalism.

2 - Denial of the problem (justified or not)

3 - The argument that the problem is already solved (true or not)

4 - Recognition that the problem exists and is not yet resolved (true or false)

But in discussion essays, as we see in blogs and forums, a recurring problem is a form of argument from authority. Namely, the specialist in a domain does not imagine that he, his peers, and the big names in the domain can or could have lastingly misunderstood a problem, or denied (repressed) it, or falsely solved it. The non-specialist is then told that he must in turn become a specialist - study the basics of the field and the literature - in order to be able to issue an authorized opinion on the question; his belief in the possible relevance of his intervention is put down to naivety - a naivety which indeed often exists.

This attitude is understandable if the specialist thinks that the only possible problems are of a mathematical nature. But if he himself has experience with conceptual problems that are unrecognized, denied, or falsely resolved in his discipline - for example, if he recognizes the existence of the measurement problem in QM - then he has at least one example of a problem that has long been misunderstood, denied, or falsely resolved by his peers and the big names in the field.

Let’s take the example of the subjective conception of probability, which goes back a long way. It remains common among probability specialists and specialists in probabilistic theories, the users of this field. However, the argument which opposes it does not require any specialized competence. If probabilities were only subjective, the fact that all rational and informed observers of a probabilistic theory expect the same probability for a given possible event would be inexplicable. This intersubjective agreement therefore has an objective basis. In addition, an experiment repeated N times, without the need for observers, confirms this value of the probability by providing a frequency which objectively tends toward this value as N tends toward infinity (this tendency of the frequency to approach the probability leads to viewing a probabilistic law as the manifestation of an internal potentiality in the phenomenon studied, which Popper aptly called "propensity"). In general, all theories that use probabilistic laws should include the observer in the theory while having to explain the otherwise inexplicable intersubjective agreement. And above all, they should explain why these laws do not need observers to exist.
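That convergence of frequency to probability can be simulated in a few lines; this sketch (my own illustration) has no observer anywhere in it:

```python
# Observed frequency of an event drifts toward its probability as N grows.

import random

random.seed(42)
p = 0.3  # the event's "propensity"

for n in (10, 1_000, 100_000):
    freq = sum(random.random() < p for _ in range(n)) / n
    print(n, freq)
```

For small N the frequency scatters widely; for large N it settles near p, regardless of who (if anyone) is watching.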

These facts of experience should warn the specialist that other problems may well be of the same kind, namely that expertise is not required to discuss certain problems, and it cannot be decided in advance which ones they are.

And thank you again to Sabine for opening her blog to discussion.

Dear sir,

We received your paper on the photoelectric effect. However, we googled you and found that you worked in a patent office. Furthermore, it is not even one of the better patent offices!

Jean-Paul wrote to me:

>The non-specialist is then told that he must in turn become a specialist - study the basics of the field and the literature - in order to be able to issue an authorized opinion on the question; his belief in the possible relevance of his intervention is put down to naivety - a naivety which indeed often exists.

Yes, that is of course the attitude of specialists and it is correct. I can think of zero cases in which non-specialists have made a serious contribution to theoretical physics in the last hundred years -- the subject is fairly mature, and it takes a lot of work to get up to matters of current research interest.

We have had, in the last few months, several chemists -- Ian Miller, Andrei, and now Steve Presse -- spouting utter nonsense about quantum mechanics and then becoming belligerent and obnoxious when their nonsense is pointed out.

But of course they would not take me seriously if I started announcing that what chemists know about organic chemistry is all wrong and that I will set them aright!

As I keep pointing out, the real issue can be seen clearly if one considers brain surgeons or auto mechanics. Would these goofballs treat a brain surgeon or an auto mechanic with the same contempt? If a brain surgeon tells them they have a tumor that needs to be dealt with, or their auto mechanic tells them their brakes are failing, would they start telling the brain surgeon or auto mechanic that brain surgeons and auto mechanics are idiots in their own fields and that Ian, Andrei, and Steve will now teach them the new, correct theories of brain surgery and auto repair?

I really hope they do that and then find out how much service they get from their surgeon or their auto mechanic!

We physicists, alas, are often more tolerant people than surgeons or auto mechanics -- a mistake we need to correct.

Jean-Paul ,

PhysicistDave said:

"We have had, in the last few months, several chemists -- Ian Miller, Andrei, and now Steve Presse -- spouting utter nonsense about quantum mechanics and then becoming belligerent and obnoxious when their nonsense is pointed out."

I feel no joy in responding to this, but this statement is such a complete distortion of the truth that I need to defend myself.

PhysicistDave repeatedly failed to address my argument:

"1. The polarization of an EM wave depends on the specific way the electron accelerates.

2. The only way an electron can accelerate is Lorentz force.

3. The Lorentz force is given by the electric and magnetic field configuration at the locus of the emission.

4. The electric and magnetic field configuration at the locus of the emission does depend on the position/momenta of distant charges.

5. The detectors are composed of charged particles (electrons and quarks).

Conclusion: From 1-5 it follows that the hidden variable, λ, depends on the detectors’ states. Here, λ is the polarization."

This is the "nonsense" he is speaking about. The ridiculous thing here is that he has admitted he will never address this argument -- well, not unless I pay him. Yet in the meantime he has no problem insulting me.

This is the type of character you are dealing with. Be careful!

Jean-Paul,

PhysicistDave said:

"As I keep pointing out, the real issue can be seen clearly if one considers brain surgeons or auto mechanics. Would these goofballs treat a brain surgeon or an auto mechanic with the same contempt? If a brain surgeon tells them they have a tumor that needs to be dealt with or their auto mechanic tells them their brakes are failing, would they start telling the brain surgeon or auto mechanic that brain surgeons and auto mechanics are idiots in their own fields and that Ian, Andrei, and Steve will now teach them the new, correct theories of brain surgery and auto repairs?"

These assertions are misleading, again, for the following reason. If you ask many brain surgeons for their opinion on your condition, and some of them tell you that you are perfectly OK while others intend to cut out half of your brain, the decision would not be so easy. Regarding my argument, I have received many conflicting opinions, for example:

Feynman says that there is a clear mathematical correlation between the fields originating from a charge and that charge. He presents those equations in his lectures. Dave believes otherwise. Who is the better "brain surgeon" here?

't Hooft says that classical physics is not ruled out by Bell's theorem. Dave believes otherwise. Again, who is the better "brain surgeon" here?

Maybe His Majesty Dave would present a list of His magnificent achievements, starting with His Nobels, achievements that would certainly dwarf the modest accomplishments of the above, lesser-known physicists.

I actually find the argument from authority a very weak one, but since Dave holds it in such high esteem, let's see how well He fares here!

To non-physicists,

It is probably worth explaining why what Presse is saying is nonsense.

Why are quantum physicists so worried about the issue of “measurement”? Because the textbook approach to quantum mechanics points out that all possible intermediate states contribute to a final result in a way that cannot be explained mathematically by just saying that one or another of those states actually occurred but we are ignorant of which one. The reason this fails is the well-known “interference effects” (see any good discussion of the two-slit experiment).

According to the textbooks, you only get a definitive reality to quantum phenomena when you make a “measurement” which chooses among the different quantum possibilities and somehow makes one of them real and gets rid of the effects of the other phantom possibilities: i.e., the states that do not match your measurement can now be forgotten and will not contribute to “interference effects” in the future. A measurement is like a “reset” that clears the deck: of course, pretty soon, new alternative paths open up, and things get spooky again until another measurement occurs.
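[Editorial note: for non-physicists, the "interference" point above can be made concrete with a toy calculation. This is only an illustrative sketch with made-up amplitudes and a 50/50 split; it stands in for the two paths of a two-slit experiment, not for any specific setup.]

```python
import numpy as np

# Toy two-path interference: equal-magnitude amplitudes for "slit 1"
# and "slit 2" with a relative phase phi (all values illustrative).
def quantum_intensity(phi):
    a1 = 1 / np.sqrt(2)                 # amplitude via slit 1
    a2 = np.exp(1j * phi) / np.sqrt(2)  # amplitude via slit 2
    # Quantum rule: add the amplitudes first, THEN square.
    return abs(a1 + a2) ** 2

def classical_intensity():
    # "One slit or the other, we just don't know which":
    # add the probabilities; no phase enters, no interference.
    return (1 / np.sqrt(2)) ** 2 + (1 / np.sqrt(2)) ** 2

print(quantum_intensity(0.0))    # constructive: twice the classical value
print(quantum_intensity(np.pi))  # destructive: (nearly) zero
print(classical_intensity())     # phase-independent
```

The phase-dependent output is the "interference effect" that blocks the ignorance reading; a measurement that singles out one path removes the cross term, which is the "reset" described above.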

Anyone who passed a QM course taught by physicists is supposed to understand what I just explained (alas, experience proves we have passed some people we should not have passed!).

Let me emphasize that I am not saying this is the final word (I do not think it is), but merely that it is what almost all textbooks have taught for many, many decades. It also happens that nearly all physicists agree that this approach does indeed give the right answer when you predict the results of an experiment. In nearly a century, this approach has never failed, not even once -- pretty remarkable when you think about it!

The quantum-measurement problem occurs because QM should also describe everything that goes to make up the measuring apparatus. In which case, the measuring apparatus plus the system being measured can themselves be considered a quantum system, and you then need another measuring apparatus to measure the original measuring apparatus before there is any definitive reality. And so on ad infinitum.

You have a vicious infinite regress. Reality never comes; all of the phantom possibilities keep affecting the universe. Forever. Nothing ever truly happens.

Somehow, you either need a logical break beyond which quantum mechanics is not adequate or you need to get rid of “measurement” as an integral part of the specification of quantum mechanics.

Physicists are generally aware of this and have tried various ways out of the dilemma. E.g., Bohmian mechanics and many-worlds theory both refuse to give a special position to measurement – the idea of “measurement” is not needed to describe those theories. The Copenhagen interpretation just accepts that you cannot describe the whole world with QM, though it is rather vague as to where the quantum world ends and the classical world starts. Mentalist interpretations make a definite choice: the human mind (and presumably non-human minds) is the final stage of the measurement process, and minds are not fully described by quantum theory: this seems to have been Wigner's and perhaps Schrödinger's and Penrose's view; it may also have been held by von Neumann.

But, somehow, anyone interested in the quantum measurement problem needs to deal with the issue.

Unless, of course, you are utterly ignorant of what is meant by “the quantum measurement problem” and worship E. T. Jaynes.

Like some chemists around here.

Think of their attitude as the “Alfred E. Neuman interpretation” of QM: “What, me worry?”

“Where ignorance is bliss, 'tis folly to be wise!”

PhysicistDave wrote: "QM should also describe everything that makes up the measuring apparatus."

Hi Dave,

I'm glad you are coming back to science.

It is remarkable that a physicist calls into question that we have theories describing Geiger counters or photomultipliers. Alas, we need statistical theories for that, and for some people QM is not a statistical theory. They think of the deterministic evolution of the wave function as the essence of quantum theory, and "measurement" as an additional process that is part of the interpretation, not of the theory itself. Lacking a solution of the measurement problem, it is a miracle that physicists have so successfully applied quantum theory.

> "you need another measuring apparatus to measure the original measuring apparatus before there is any definite reality"

This is true only if you think of the wave function as describing an individual system. But it has been known for a long time that it should be thought of as representing an ensemble of identically prepared systems. Hasn't Schrödinger's cat demonstrated that long ago?

The "measurement problem" is the result of misunderstanding the role of the wave function in the quantum formalism. The fact that so many people think of it as a problem points to a deeper problem: in spite of many candidates, we don't have an interpretation of QM that deserves the name "interpretation".

PhysicistDave wrote to me:

Delete"I can think of zero cases in which non-specialists have made a serious contribution to theoretical physics in the last hundred years" for witch as you say it "takes a lot of work to get up to matters of current research interest" for which as you say it "takes a lot of work to get up to matters of current research interest."

The kind of contribution I was talking about is objection and logical and conceptual argument -- which could be called "philosophical" -- not a well-constructed new theory or proposal. Most questions from non-specialists are of that kind.

And what is mathematically smart in physics can sometimes be philosophically stupid, because physics speaks about Nature.

Think, for example, of the role that certain "big names" gave to the subjectivity of the observer in QM, and of all the ensuing delusions in the literature (popular or not).

Among the things to remember at the end of chapter 9 of Lost in Math, Sabine writes: "Contact with philosophy can help physicists identify which questions deserve to be asked, but currently this contact is very limited". This is my small contribution to improving that contact!

@Andrei

On the argument from authority:

When there is no consensus among specialists (big names or not), that means specialization is, at present, not sufficient to decide. Perhaps the problem needs even more specialization, or, more probably, another mode of discussion. I would say "just rationality," but it seems rationality has different "colors" for different persons. The only invariant I know of is respect for classical logic.

On the entanglement of polarized spins:

I am not a specialist, but it is well known that the phenomenon does not depend on distance. So it cannot simply depend on classical fields between the detector and the particle.

Jean-Paul,

Delete"it is well known that phenomena not depend of distance. So it cannot simply depend on classical fields between detector and the particle."

Classical fields have infinite range, so the distance does not seem to me to present a problem. Consider, for example, the simple case of a planet orbiting a star. Its orbit is an ellipse regardless of the distance. True, the force decreases with the square of the distance, but there is no distance at which the planet starts moving randomly. The same reasoning holds in EM.

The above trivial example, about planetary orbits, shows that Bell's independence assumption is most likely false in gravity, electromagnetism or any other field theory of infinite range. It's not uncommon at all for distant systems to display correlations.

Andrei

Delete"Classical fields have infinite range, so the distance does not seem to me to present a problem".

It is also well known that physics would not be possible if all those field effects from very far away sources were not completely negligible.

Jean-Paul,

“It is also well known that physics would not be possible if all those fields effects from very far away sources were not completely negligible”

I am glad you made this argument because this seems to be the main reason a lot of scientists reject superdeterminism. So, let me explain why this argument does not work for Bell tests.

Let’s consider first the case of gravity. Why is it possible to send a rocket to Mars without taking into account the gravitational field of every particle in the universe? After all, they all contribute to the gravitational field at the Solar system.

Here the explanation is that the gradient of the gravitational field of distant sources is very small on the scale of a planetary system. In other words, Earth, Mars, the Sun, and the Moon are all accelerated in approximately the same way by the distant sources. So, if we are interested in the relative positions/velocities of Earth and Mars, we can ignore distant sources.

Exactly the same argument holds for, say, two planets (call them Earth_1 and Mars_1) in a distant planetary system, in a different arm of our galaxy. But here is the trick: if we want to send a probe not from Earth to Mars or from Earth_1 to Mars_1, but from Earth to Mars_1, we cannot ignore the galactic field anymore. The approximation that works locally does not work here. You can see the similarity with a Bell test: we are not interested in the results obtained at A alone or at B alone; we want to look at them in parallel. So a Bell test is not analogous to an Earth-Mars trip, but to an Earth-Mars_1 trip, where the influence of distant sources (all the matter in our galaxy) cannot be neglected.
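[Editorial note: the "small gradient" point can be illustrated with rough numbers. The figures below -- a galactic mass interior to the Sun's orbit, a distance to the galactic centre, and an AU-scale separation -- are order-of-magnitude guesses for illustration only, not measured inputs.]

```python
# Common vs. differential (tidal) acceleration from a distant source.
# All numbers are rough, illustrative orders of magnitude.
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M_gal = 1e41     # kg, rough galactic mass interior to the Sun's orbit
r = 2.5e20       # m, roughly 8 kpc to the galactic centre
d = 1.5e11       # m, roughly 1 AU (an Earth-Mars-scale separation)

# Shared acceleration of the whole Solar System toward the centre:
a_common = G * M_gal / r**2
# Differential acceleration across the separation d (tidal term):
a_tidal = 2 * G * M_gal * d / r**3

print(a_common)             # ~1e-10 m/s^2, felt by everything alike
print(a_tidal)              # ~1e-19 m/s^2, what local dynamics can see
print(a_tidal / a_common)   # ~1e-9: the gradient is tiny on local scales
```

This is the quantitative sense in which a local Earth-Mars trip can ignore the Galaxy, while a comparison spanning the Galaxy cannot.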

to cont:

Jean-Paul,

cont:

Let’s take a look now at the EM interaction. As my argument states, distant neutral objects interact at any distance via EM fields. This is because those objects consist of field sources (charges) -- electrons and nuclei -- that do not share the same positions. However, because there are a lot of charges, and the number of positive charges equals the number of negative charges, the net effect of those fields amounts to nothing but a small movement around the objects’ centers of mass. For macroscopic objects, like chairs and tables and billiard balls, such motion can be ignored. When objects come close, the EM fields become relevant again. This is why Newtonian mechanics with contact forces works.

In the case of a chemical reaction, the same statistical effects mask the distant EM fields. We do not expect a chemical reaction to proceed differently because Jupiter is in a different position, because, statistically, nothing changes. A different position of Jupiter implies different trajectories for individual molecules, but because there are so many of them, coming from all possible directions, the outcome of the experiment is expected to be the same. The same explanation holds for Tim Maudlin’s much-praised medical-test argument, or for the smoking-cancer correlation. We are justified in assuming that the results of these experiments are not correlated with the selection/testing “choices” because the efficacy of a vaccine does not depend on a single protein molecule entering a specific cell from a certain position with a certain velocity. It’s a statistical effect, and, again, the influence of distant sources can be ignored.

In a Bell test, however, we are not interested in a statistical description of the source, but in the polarization of a single photon. Unlike in all the examples above, this polarization depends on the exact trajectory of the electron that emits the photon, and that trajectory in turn is not independent of the distant EM sources. So the statistical independence assumption is not justified for a Bell test, while it is justified in all the other situations.

Conclusion: the fact that Newtonian mechanics/chemistry/biology work without taking into account distant field sources does not imply that those sources can also be ignored in a Bell test. I have therefore presented strong evidence that there is no incompatibility between superdeterminism and science.

@Jean-Paul:

It's interesting to see Popper's notion of propensity turn up in this discussion. I managed to track down Popper's short work on propensity a few months ago, in which he dismisses the usual frequentist explanation of probability in favour of his notion of propensity as a quantitative measure, as you point out, of a potentiality of change.

In fact, when I was speaking to a friend of Chris Isham's, he mentioned that he was working on the notion of a propensiton (which I misremembered as the apeiriton).

It seems everything is to be quantized today.

Andrei wrote to me:

>I feel no joy in responding to this, but this statement is such a complete distortion of the truth that I need to defend myself.

>PhysicistDave repeatedly failed to address my argument:

Andrei, you are not telling the truth. You just think no one will bother to check the fact that you are lying. On this thread, I went into enormous detail, in response to your repeated and insistent requests, to deal with what you yourself claimed was the make-or-break issue for your position. You agreed that on that make-or-break issue, I proved you were wrong.

No matter what I or anyone else says, you will always just keep moving the goalposts trying to keep alive your crackpot claims.

Look: you keep appealing to the SED nonsense. You know that almost no competent physicists take that stuff seriously: it is so kooky that it is difficult to find any competent physicist talking about it at all.

I urge anyone who is tempted to take SED seriously to read Emilio Santos' paper on the arXiv, posted just a couple of months ago. Santos is a supporter of SED, but the paper consists of one admission after another of what a miserable failure it has been. Typical is this:

>6.3 A difficulty with the angular momentum

>The disagreement between the quantum prediction, eq. (77), and the SED prediction, eq. (78), for the rigid rotor is actually general and it puts a problem for any realistic model of the rotation in quantum physics.

or

>In the case of the hydrogen atom the result of the calculation did not provide a stationary solution. In fact the prediction was that the atom is not stable but ionizes spontaneously due to the orbits passing close to the nucleus[19].

Any competent physicist could have told these guys that this SED nonsense would fail, and that is indeed what has happened, now for more than fifty years! This is the utter height of pseudo-science, far, far worse than, for example, string theory.

String theory may not make predictions that can be successfully tested. But SED does make predictions, and they are refuted by experiment. An honest scientist gives up on a theory when it is proven to be wrong. You won't and they won't.

You and they are not honest.

QED.

By the way, Andrei, I actually figured out three decades ago how to have a stochastic zero-point field in a theory that reproduces quantum mechanics. To do this, you have to take a very different approach from the SED crackpottery, of course. I'd been thinking about publishing this someday, but I now see that crackpots like you would just claim my model as vindication for your crackpottery. So, I think I will just keep it to myself, so that I can keep laughing and laughing at the failed attempts of you and your pals.

There needs to be some humor in life, after all.

Werner wrote to me:

>This is true only if you think of the wave function as describing an individual system. But it has been known for a long time that it should be thought of as representing an ensemble of identically prepared systems. Hasn't Schrödinger's cat demonstrated that long ago?

>The "measurement problem" is the result of misunderstanding the role of the wave function in the quantum formalism. The fact that so many people think of it as a problem points to a deeper problem: in spite of many candidates, we don't have an interpretation of QM that deserves the name "interpretation".

Y'know, physicists thought of that a long time ago! But no one has figured out how to pursue that to the level where they can use that idea to write a textbook to teach students how to do calculations in QM that do not boil down to the summary I have given above of textbook QM.

You think you can? Go for it. I'd risk my life that you will fail. But prove me wrong and make a fool of me. I'd actually like you to do that because I really do not like textbook QM.

Werner also wrote:

>It is remarkable that a physicist calls into question that we have theories describing Geiger counters or photomultipliers.

Well, it would be! Do you know of such a physicist? I merely said that in principle QM should describe such devices, and that then, if you take textbook QM seriously, you need a separate measuring device to measure the measuring device, and so on ad infinitum. Don't blame me: I am just pointing out what happens if you take the textbooks seriously.

I know you think you understand this better than all the textbook writers. Good. Write your own book. Nowadays, you can post your book for free on the Web and everyone can read it.

Just do it, Werner! Show up all us physicists for the fools we are.

PhysicistDave,

Delete"Andrei, you are not telling the truth."

Of course I am. Just another typical false accusation from your part.

"You just think no one will bother to check out the fact that you are lying."

Au contraire, my dear Dave! I hope that everybody here will take the time to check and see for themselves that the link you provided does not contain a valid refutation of the argument I presented on this thread. I'm looking forward to the moment when everybody calls your bluff.

"No matter what I or anyone else says, you will always just keep moving the goalposts trying to keep alive your crackpot claims."

The argument presented on this thread predates your answer to a different argument at the link you provided. So, unless you think I have a time-travel device, your accusation of "moving the goalposts" is just as ridiculous as the first one, about me lying.

You should also check a dictionary and learn what "crackpot" means. My argument contains only generally accepted mainstream stuff, like electrons being charged and so on. The reason you are so confused, besides not understanding what a sound argument is and what the burden of proof implies, is that you use words without knowing what they mean.

"you keep appealing to the SED nonsense."

Not in this argument. This is a red herring.

"You know that almost no competent physicists take that stuff seriously"

No, I don't. Did you ask all competent physicists about it? Oh, I get it. There is only one competent physicist, Dave.

"This is the utter height of pseudo-science, far, far worse than, for example, string theory."

You also need to check what "pseudo-science" means. SED is just mainstream classical EM in the presence of an EM field (the ZPF). Sure, the ZPF may not exist, but that cannot make the theory pseudo-science. Your confusion runs much deeper than I thought.

"QED"

Just another abbreviation you don't understand.

"By the way, Andrei, I actually figured out three decades ago how to have a stochastic zero-point field in a theory that reproduces quantum mechanics."

Sure. Let me guess! It's on the third shelf, just between the blueprints for the time-travel machine and warp drive, secured beneath a heavy pile of Nobel medals.

"There needs to be some humor in life, after all."

Yes, this is the only part of your post that makes sense.

Andrei wrote to me:

>Feynman says that there is a clear mathematical correlation between the fields originating from a charge and that charge. He presents those equations in his lectures. Dave believes otherwise. Who is the better "brain surgeon" here?

Andrei, I took QM and Intro to Elementary Particle Physics from Dick Feynman. The Feynman Lectures were the textbooks for our freshman and sophomore physics courses at Caltech. I myself have worked out a novel way to derive the formulae you refer to, different from the method Feynman used.

I know this stuff.

You have admitted in the earlier thread that you lack the math required to understand it.

What you have said about my former professor is nonsense.

You have no idea what you are talking about because, as you admitted, you cannot handle the math.

Andrei also wrote:

>'t Hooft says that classical physics is not ruled out by Bell's theorem. Dave believes otherwise. Again, Who is the better "brain surgeon" here?

I do not believe 't Hooft has ever said, as you have said, that classical EM violates Bell's theorem, which is what you and I are arguing about, now has he?

I think you also misstated what you intended to say. You said, “'t Hooft says that classical physics is not ruled out by Bell's theorem.” Quite true. Classical physics indeed does not ever violate Bell's theorem, as I keep saying. It is QM that violates the Bell inequality. So, you are not telling the truth in saying, “Dave believes otherwise.”

But in this case, I will give you the benefit of the doubt and assume that you simply did not read carefully what you yourself wrote. You are missing a “not” or something.

This is, though, a general example of your carelessness. The Bell experiment cannot even be carried out in classical physics: for example, the Bell experiment relies on individual pairs of photons. But photons do not exist in classical physics, just continuous waves. And the Bell experiment is simply irrelevant to continuous waves. But you understand so little about the Bell experiment that you cannot grasp this trivial fact, can you, even though I keep pointing it out?

It's like asking whether bok choy confirms or rules out the Pythagorean theorem. Anyone who asks that question has a screw loose.

This should worry you and worry anyone reading what you post. Your response to this point has simply been that polarization does exist in classical physics. Yes, but that is not enough. The Bell experiment relies on two individual photons whose spin is entangled in a particular way. You cannot have individual photons in classical physics. Therefore, the Bell experiment cannot even be performed in classical physics.

Therefore, one will never find a violation of the Bell inequality in classical physics, simply because you can never test it at all!

QED

Andrei, do you have any capacity to be embarrassed?

You are a very, very unusual person.

PhysicistDave,

“Andrei, I took QM and Intro to Elementary Particle Physics from Dick Feynman.”

OK, tell me if you agree or not with the formula 21.1 here:

https://www.feynmanlectures.caltech.edu/II_21.html

“You have no idea what you are talking about because, as you admitted, you cannot handle the math.”

This is irrelevant. It’s not necessary that I understand how a formula is derived in order to use it for an argument.

“I do not believe 't Hooft has ever said, as you have said, that classical EM violates Bell's theorem, which is what you and I are arguing about, now has he?”

I did not say that “classical EM violates Bell's theorem”, but that classical EM is not ruled out by Bell's theorem, which is a completely different kind of animal. After all this time you still misrepresent my claim.

“I think you also misstated what you intended to say. You said, “'t Hooft says that classical physics is not ruled out by Bell's theorem.” Quite true. Classical physics indeed does not ever violate Bell's theorem, as I keep saying.”

The sentence “classical physics is not ruled out by Bell's theorem” is equivalent to saying that “there is no evidence that classical physics cannot violate Bell’s theorem”, which is completely different from the sentence “classical physics indeed does not ever violate Bell's theorem”. As I’ve said earlier, your confusion is deep.

“The Bell experiment cannot even be carried out in classical physics: for example, the Bell experiment relies on individual pairs of photons.”

This claim is unjustified. We do not need “individual pairs of photons”, we need “something” that is able to make a detector register an event. “Individual photons” is a mathematical model that can do that, but there is no proof that a different mathematical model cannot work.

“This should worry you and worry anyone reading what you post. Your response to this point has simply been that polarization does exist in classical physics. Yes, but that is not enough. The Bell experiment relies on two individual photons whose spin is entangled in a particular way. You cannot have individual photons in classical physics.”

As pointed out above, we do not need “individual photons”. What we need is “something” making the detector register an event. And here is a trivial refutation of your point: Bell tests have been performed with massive particles, and even with Bose-Einstein condensates. So, clearly, there are many possible implementations of the experiment besides photons. But in any case, what we actually see is a detector registering an event. You need to be open-minded about the cause of that event; otherwise you are just assuming what you want to prove.

“Andre, do you have any capacity to be embarrassed?

You are a very, very unusual person.”

I have no reason to be embarrassed. You cannot even repeat correctly the claim I am making (classical EM is not ruled out by Bell's theorem) and instead invent different claims, like “classical EM violates Bell's theorem”. Hence, you cannot even begin to refute my argument.

Steve,

Talk of "measurement" in quantum mechanics is some kind of rationalization to gloss over the fact that it has never clearly been said what it is about, except for mystifying statements that it is about objects that are neither waves nor particles, but a combination of both. These "objects" have undefined or uncertain properties until they are "measured". Some people (Sabine among them) think of the wave function as describing an individual system. They feel forced to believe that measurement leads to a "collapse of the wave function", and they think of it as a real physical process. But this is due to a misunderstanding of the role of the wave function in the formalism. It is but one piece of a bigger statistical apparatus, of which the Born rule is another important component. A single neutron will decay even if there's no Geiger counter to detect it.

Werner

Hi Werner,

Please see my responses below.

> talk of "measurement" in quantum mechanics is some kind of rationalization to gloss over the fact that it has never clearly been said what it is about, except for mystifying statements that it is about objects that are neither waves nor particles

Correct. The founders of QM were not trained probabilists. They used confusing language (like "collapse" instead of "sampling"). Our arsenal in the 21st century is different, and we should change our language today. Changing the language has deep implications: once we update our language to that of current mathematics, we can speak like adults. My responses to all your comments are an attempt at doing so.

> These "objects" have undefined or uncertain properties until they are "measured".

This is true of every theoretical framework, unless the quantity of interest in the theory is distributed like a delta function -- i.e., a classical, point-like, deterministic particle.

>Some people (Sabine among them) think of the wave function as describing an individual system.

Then you have a problem with the Wigner's friend paradox. Or shall I call it Werner's friend? ;)

>They feel forced to believe that measurement leads to a "collapse of the wave function", and they think of it as a real physical process.

I don't. You CANNOT MEASURE an entire probability distribution on its full support. You can only SAMPLE from it. So any theory relying on probabilities must involve a collapse (though I hate to use that word; let's call it sampling).

>But this is due to a misunderstanding of the role of the wave function in the formalism.

No. It is due to the fact that you don't like probabilistic theories and what it means to sample probabilities.

I'm not trying to be argumentative. But at some point, enough is enough.

Hope this helps.

Steve Presse wrote: "The founders of QM were not trained probabilists."

To this day, probabilists quibble even among themselves about the true meaning of probability ... :-)

"you don't like probabilistic theories"

Where did you get that idea from? I find probability theory indispensable. But there is a difference between electrons and oranges. Physics is not just an application of sampling theory.

"Hope this helps."

Yes, thank you very much. I don't see any need for further elaboration.

The reason QM measurement is not quite the same as sampling is that the wave function after a measurement can be expanded in another basis, which has its own set of eigenstates and probabilities. This makes QM measurement different from classical probability.

Hi Lawrence,

>The reason QM measurement is not quite the same as sampling is the wave function after a measurement can be expanded in another basis.

Not really. You can measure a classical variable and know very little about its Fourier conjugate. This is still encountered in classical problems.

The difference between classical and quantum probabilities is the fact that probability amplitudes are the fundamental object of QM.

So in QM, when we want to calculate the probability of ending in a final state f starting from state i, we compute the amplitude ⟨f|i⟩. Then we MUST square (take the modulus squared of) this quantity. If an intermediate state were visited, we would insert a full set of states, sum, THEN square. The resulting quantity would be the probability.

By contrast, with classical probabilities, you multiply by transition probabilities. No squaring is required.

However, once you've arrived at a probability (whether classically or in QM), that's it. Both probabilities must be sampled. So we have the SAME "measurement problem".
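The sum-then-square versus square-then-add distinction can be sketched in a few lines of Python (the amplitudes are illustrative numbers of my choosing, not anything computed in the thread):

```python
import numpy as np

# Two hypothetical paths from i to f with equal weight and opposite phase
# (made-up values chosen to show maximal interference).
a1 = (1 / np.sqrt(2)) * np.exp(1j * 0.0)
a2 = (1 / np.sqrt(2)) * np.exp(1j * np.pi)

# Quantum rule: sum the amplitudes over intermediate states, THEN square.
p_quantum = abs(a1 + a2) ** 2

# Classical rule: square each path first (path probabilities), then add.
# No interference term survives.
p_classical = abs(a1) ** 2 + abs(a2) ** 2

print(p_quantum)    # ≈ 0.0: complete destructive interference
print(p_classical)  # 1.0: probabilities just add
```

The interference term that distinguishes the two rules lives entirely in the cross term of |a1 + a2|²; once a probability is in hand, as noted above, both cases are sampled the same way.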

p.s. the QM wave function that we eventually square could have been from a Young two-slit experiment for all we care (I am saying this for those hung up on the two-slit experiment). None of what I say changes.

p.s.2. my interaction with physicistDave above serves as a rude reminder to every real scientist of the trouble of engaging with internet creeps whose ratio of available time per unit wisdom is a diverging quantity ;) This is why I stay away from commenting on blogs, generally. Except when COVID gives me a modicum of spare time ;)

Steve Presse,

"The difference between classical and quantum probabilities is the fact that probability amplitudes are the fundamental object of QM."

In my opinion, the problem with this approach is that it implies non-locality. In an EPR experiment, for example, before any measurement is performed, the probability of getting spin-up at detector A or B is 0.5. After A measures its particle and gets, say, spin-down, the probability at B instantly becomes 1 for spin-up. Now, if that probability only describes our limited information, no non-local process needs to be involved; but if the probability is seen as fundamental, a change in probability becomes a change of the system B, instantly determined by a measurement at A.
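The "limited information" reading of this probability update can be illustrated with a purely classical simulation of anticorrelated pairs (a hypothetical sketch; whether quantum probabilities actually admit this reading is exactly what is in dispute in this thread):

```python
import random

random.seed(0)
N = 100_000
pairs = []
for _ in range(N):
    # Each pair leaves the source perfectly anticorrelated
    # (parallel detector orientations, as in the EPR example above).
    a = random.choice(["up", "down"])
    b = "down" if a == "up" else "up"
    pairs.append((a, b))

# Before conditioning on A, the marginal probability of "up" at B is 0.5.
p_b_up = sum(b == "up" for _, b in pairs) / N

# After A reads "down", the probability of "up" at B jumps to 1 --
# here plainly a change in our information, not a physical change at B.
b_given = [b for a, b in pairs if a == "down"]
p_b_up_given = sum(b == "up" for b in b_given) / len(b_given)

print(round(p_b_up, 2))   # ≈ 0.5
print(p_b_up_given)       # 1.0
```

In this classical toy model the jump from 0.5 to 1 is pure conditioning, with no influence propagating anywhere; the argument above is about whether the quantum case can be read the same way.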

Andrei,

There are not two separate "systems" A and B, only one "situation" which yields more complicated observations!

Stating that the primary difference between classical probability and quantum probability lies with the amplitudes is the same as what I wrote. Quantum probabilities are the modulus squared of amplitudes. Further, if you make a measurement and have probability unity for some outcome, a unitary transformation to another basis expresses that outcome as a sum over a spectrum of amplitudes in the new basis.

Hi Andrei,

the difference between classical and QM probabilities is NOT non-locality, because causality implies locality for both.

Bell's inequalities tell us either QM is non-local OR we must abandon realism. Of course, if we accept that probabilities do not represent a finer underlying reality, then we have already abandoned realism.

Hi Andrei,

small addendum: I completely agree with you when you say "Now, if that probability only describes our limited information, no non-local process needs to be involved, but if the probability is seen as fundamental, a change in probability becomes a change of the system B, instantly determined by a measurement at A."

What I don't agree with is that what I said about probability amplitudes implies non-locality. It does not.

We are both in agreement that mathematical recipes for computing probabilities are different in QM and in classical. That's fundamentally the difference.

I think we are also both in agreement that probabilities reflect states of knowledge.

Steve Presse,

“Bell's inequalities tell us either QM is non-local OR we must abandon realism. Of course, if we accept that probabilities do not represent a finer underlying reality, then we have already abandoned realism.”

I strongly disagree with this. Let me explain.

My previous argument about the EPR experiment (where detector orientations are fixed, so that a spin-up at A implies a probability of 1 for spin-down at B) leaves us with two options:

1. Deterministic local realism (Bertlmann's socks)

2. Non-locality (which can be either realistic/deterministic – Bohm, or non-realistic – Copenhagen style collapse)

There are some other logical options, like solipsism, brains in vats, etc., but those are not usually considered science.

You may notice that there is no such option as local non-realism. Local non-realism is not on the table anymore. Denying realism cannot possibly give you back locality, on the contrary, it forces you to accept non-locality.

The mistake is to forget about EPR and go directly to Bell’s theorem. Bell’s theorem is not a denial of EPR; it is a refinement of EPR. Bell’s theorem is based on two assumptions that are most relevant here:

1. Locality

2. Statistical independence between the measurement settings and the hypothetical hidden variable.

As a result, locality (either realistic or not) remains an option. The other option is a restricted class of deterministic realism where the statistical independence assumption is denied, the so-called superdeterminism.

In conclusion, after applying the EPR and Bell “filters”, in the correct order, we have two surviving options:

1. Non-locality (either realistic or not)

2. Superdeterminism.

"I think we are also both in agreement that probabilities reflect states of knowledge."

Agreed.

Prof. David Edwards,

"There are not two separate "systems" A and B, only one "situation" which yields more complicated observations!"

I do not understand how renaming two distant detectors (A and B) as "one situation" is going to impact my argument. Instantly sending a signal from Earth to Mars would be considered proof of non-locality, yet you could also describe it as "one situation".

@Andrei

>You may notice that there is no such option as local non-realism. Local non-realism is not on the table anymore. Denying realism cannot possibly give you back locality, on the contrary, it forces you to accept non-locality.

Indeed.

Copenhagen strictly requires non-local signaling. If the measurement itself prepares the state, then entangled particles could not act accordingly without non-local signaling. A pair of entangled electrons will always give us either ↑↓ or ↓↑ if we provide parallel orientation for both Stern-Gerlach devices. The perfect anticorrelation shows that some realism is still involved. But, following the Copenhagen interpretation: "That's fine, and now let us forget about it, don't ask further".

In conclusion, after applying the EPR and Bell “filters”, in the correct order, we have two surviving options:

1. Non-locality (either realistic or not)

2. Superdeterminism.

So, if we want to avoid (superluminal) signaling, then we have to go for a deterministic solution in one way or the other.

You have no argument then! The quantum algorithm correctly predicts the results of the experiments; your problem is with a semi-classical attempt to describe "underlying" processes.

Prof. David Edwards,

"You have no argument then! The quantum algorithm correctly predicts the results of the experiments; your problem is with a semi-classical attempt to describe "underlying" processes."

The argument does not depend on any description of the underlying physics, semi-classical or not. The argument is based strictly on the experimental results as displayed by the detectors or printed on paper. In order to observe EPR correlations one need not assume anything. My argument proves that the only way to account for those results is to accept non-locality or superdeterminism.

In no way is the argument in conflict with the correct predictions of QM. The question is what those predictions tell us.

But, if you think that the argument is wrong you need to point out the premise that you think fails or explain why the conclusion does not follow from the premises or provide a counterexample (an interpretation of QM that is explicitly local and non-realistic).

Sixte,

"So, if we want to avoid (superluminal) signaling, then we have to go for a deterministic solution in one way or the other."

I disagree with the word "signaling". QM does not allow for such a thing, this is a proven fact. The question is if there is some non-local phenomenon taking place even if it cannot be used to transmit a signal. Otherwise we are in agreement.

Prof. David Edwards (6:41 AM, June 16, 2020),

What if superdeterminism is shown to be empirically true, for example? Will you be extending your perspectives to include new experimental results? Yours sounds like an ostrich approach to me. Let's ignore the dilemma and cover ourselves by saying there are no perfect perspectives. Isn't there an assumption anyway connected with Gleason's Theorem?

Hi Andrei,

I certainly appreciate the thoughtfulness of your replies.

But I completely disagree when you say non-realist local theories are eliminated by Bell's inequalities.

Let me explain:

1) The reason locality is not eliminated is that we can argue (and we both agree) that if I know the spin's orientation wrt a particular filter setting at Alice, then I instantaneously know the orientation of the other entangled spin at Bob. It is *as if* the information had been communicated superluminally but, of course, it is not. That's because probabilities encode states of knowledge. Obviously.

2) Non-realism is clearly not eliminated even under the assumption of locality.

Why? QM clearly violates Bell's inequalities, which a product spin state (i.e., the classical Bertlmann's socks example) would satisfy. QM has stronger correlations than the classical product state of colorful socks would allow. That's because the true quantum state has *not yet been realized by sampling* until measurement. Hence non-realism. But again, I emphasize the importance of accepting that probabilities in QM do not hide finer realist details (such as realizations of states that would eliminate the interference terms).
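The "stronger correlations" claim can be checked numerically with the standard CHSH combination, using the textbook singlet correlation and angles (this is a generic sketch of the well-known calculation, not anything derived in this thread):

```python
import numpy as np

def E(a, b):
    """Singlet-state spin correlation for analyzer angles a and b (radians)."""
    return -np.cos(a - b)

# Standard CHSH settings: the textbook choice maximizing the violation.
a1, a2 = 0.0, np.pi / 2            # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

# Any local-realist (socks-like) model obeys |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))       # 2.828... = 2*sqrt(2), Tsirelson's bound
print(abs(S) > 2)   # True: exceeds the local-realist bound
```

A product (socks) state can at best saturate |S| = 2; the quantum singlet reaches 2√2, which is the precise sense in which its correlations are stronger.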

I will read your response with enthusiasm but may not get a chance to thoughtfully respond as I have looming deadlines (!)...

Steve Presse,

“I completely disagree when you say non-realist local theories are eliminated by Bell's inequalities.”

I disagree with the above statement as well :), I’ve never made such a claim.

In 1935 Einstein, Podolsky and Rosen published a paper with the title:

"Can Quantum-Mechanical Description of Physical Reality be Considered Complete?"

Their argument runs as follows:

At two distant locations (A and B) you can measure the momentum or position of an entangled pair. QM predicts that:

P1: If you measure the position at A you can predict with certainty (probability 1) the position at B.

Let’s exclude non-locality:

P2: the position at B is not determined by the measurement at A.

From P1 and P2 it follows that there was something at B that determined the result of the B measurement. EPR named that “something” an “element of reality”. So:

P3: There is an element of reality at B that determines the measurement result at B.

You may observe that there is no other logical option available (unless you think that it’s by pure luck that we manage to always predict with certainty the position at B, which is rather absurd).

Now let me depart from the original argument and make the following inferences:

From P1 and P3 it follows:

P4: there was an element of reality at A that determined the measurement at A.

This is because once we’ve established that the measurement at B was fixed (P3) it’s impossible that the measurement at A could have been different, right?

OK, so P3 and P4 lead to:

C: The position of both particles, at A and B were determined before any measurement took place (deterministic realism).

“2) Non-realism is clearly not eliminated even under the assumption of locality.”

It certainly is. What options do we have here? We have (1) non-locality and (2) local-deterministic realism. You may check that the option “local non-realism” does not exist anymore. It’s dead and buried, not by Bell’s theorem (we are not there yet) but by the original EPR argument.

Now what about the widespread false claim that Bell’s theorem rules out realism? Let’s look at the title of Bell’s paper:

"On the Einstein Podolsky Rosen paradox"

In the abstract we find:

“THE paradox of Einstein, Podolsky and Rosen was advanced as an argument that quantum mechanics could not be a complete theory but should be supplemented by additional variables. These additional variables were to restore to the theory causality and locality. In this note that idea will be formulated mathematically and shown to be incompatible with the statistical predictions of quantum mechanics.”

So, Bell’s theorem’s focus is local-realism alone. The theorem intends to show that local-realism does not work, so the only surviving option is non-locality. Again, I need to stress the fact that local-non-realism is already excluded by EPR, so Bell cannot possibly revive it. It can only select from the remaining options, non-locality and local-deterministic realism.

Bell’s theorem, however, requires a new assumption (statistical independence between measurement settings and the hidden variables). So, in the end we can still have locality, but in the form of superdeterminism.

“I will read your response with enthusiasm but may not get a chance to thoughtfully respond as I have looming deadlines (!)...”

Please take your time, no hurry here. The debate has been going on for 85 years. It can wait a few more hours/days to conclude :).

@Steve

>That's because the true quantum state has *not yet been realized by sampling* until measurement. Hence non-realism.

You just repeated the Copenhagen Interpretation. For CI, the entangled electrons are a pair in one measurement, taken at two different/distant places. The two particles A and B form a joint state in the tensor-product space |A> ⊗ |B>. They are now regarded as a single, combined object, and the measurement always happens on that combined object. Therefore, the perfect anticorrelations ↑↓ or ↓↑ can be found and reproduced over and over again. However, we then have to accept some (potentially) superluminal exchange of spin orientation between the particles. If we don't want to "buy" that, we have to come up with some deterministic solution; therefore, a hidden variable has to be involved.

So, non-realism is not really an option. Too much obfuscation :) And Bell's own interpretation of his inequalities didn't help either...

This thread is a thicket of difficulties. Folks, the reason QM deviates from the Bertlmann’s socks objection is that while the socks picture accounts for outcomes at ↑↓ settings, it does not address what happens when the observer makes ↑→ measurements. This is where the Bell inequalities enter in.

Bell’s theorem tells us that we can’t have locality with observed outcomes and a classical concept of reality at the same time in a measurement. A classical concept of measurement just means there is some objective existence of a state prior to the observation. People most often prefer to let go of locality, so QM is nonlocal, but the Frauchiger-Renner result illustrates a situation where reality breaks down as well.

QM is just weird.

Lawrence Crowell,

“Folks, the reason QM deviates from the Bertlmann’s socks objection is that while QM predicts outcomes for ↑↓ states it does not address what happens when the observer makes ↑→ measurements. This is where the Bell inequalities enter in.”

This may be so, but, as the argument presented here shows:

http://backreaction.blogspot.com/2020/06/physicists-still-lost-in-math.html?showComment=1592378422563#c1744771086781784026

we either accept an improved version of Bertlmann’s socks or we need to accept non-locality, absolute reference frames and all that. There is just no other way.

“Bell’s theorem tells us that we can’t have locality with observed outcomes and a classical concept of reality at the same time in a measurement.”

Not true. Bell’s theorem only has the above implication if the statistical independence (SI) between the measurement settings and the hidden variables holds. In field theories with long range forces, such as classical EM, there is no reason to believe SI holds. So, Bell’s theorem does not say much about the “classical concept of reality”. It only rules out theories without long-range forces, like Newtonian mechanics with contact forces only (billiard balls).

Discussion: Imagine a Bell test performed with a diatomic molecule that splits into atoms, like Hg2, and the spin of each atom is measured by a Stern-Gerlach device. Let’s see what classical physics predicts for such a case.

So, classically, our Hg2 molecule will be represented by two magnets in close contact. The detectors will consist of about 10^27 tiny charged magnets (electrons and nuclei). When the external EM force overcomes the force keeping our two Hg “magnets” together they will depart towards those detectors and will come out in either the “up” or “down” channel. What statistics should we expect? I admit I do not know. Will Bell’s inequalities be violated or not? How do you know?

The key fact here is that, classically, the Hg2 “magnets” will never split if an external force is not applied. But if an external force is applied, that force needs to be taken into account when applying the momentum conservation principle. So, one cannot claim that in all situations the spin on X, Y and Z must be anticorrelated. They are anticorrelated when the detectors are symmetric (both have the same orientation) but may not be anticorrelated when they have different orientations.
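For comparison, here is what the simplest local deterministic spin model predicts. This is Bell's own toy hidden-variable example, not the full Hg2 setup sketched above: each pair carries a shared random direction, and each detector's outcome is a deterministic function of it. The model reproduces perfect anticorrelation at parallel settings but only a weaker, linear correlation at intermediate angles:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000

def hv_correlation(theta):
    # Shared hidden variable: a random direction carried by each pair.
    lam = rng.uniform(0.0, 2.0 * np.pi, N)
    A = np.sign(np.cos(lam))            # deterministic outcome, detector at 0
    B = -np.sign(np.cos(lam - theta))   # deterministic outcome, detector at theta
    return float(np.mean(A * B))

print(hv_correlation(0.0))        # -1.0: perfect anticorrelation, same as QM
print(hv_correlation(np.pi / 4))  # ≈ -0.5: linear law -(1 - 2*theta/pi)
print(-np.cos(np.pi / 4))         # ≈ -0.707: the stronger QM singlet prediction
```

This illustrates the point at issue: agreement at symmetric (parallel) detector orientations does not pin down what a classical model predicts at asymmetric ones, and this simple model in fact falls short of the quantum correlation there.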

“QM is just weird.”

It may not be more weird than classical field theories are.

Lawrence Crowell,

“the Frauchiger-Renner result illustrates a situation where reality breaks down”

Not so. All Wigner’s-friend-type experiments share the same fundamental flaw: they assume that it is possible, in principle, to isolate a system. In reality, it is not possible.

Any system consists of charged particles, or at least requires the presence of charged particles (photons require a source). All charged particles in the universe interact by EM fields. Placing a charge in a “box” cannot eliminate this interaction. The electric and magnetic fields of that charge will interact with the external world and the charge will be subjected to the action of external electric and magnetic fields. The same is true for gravitational fields.

Lawrence,

>This is where the Bell inequalities enter in.

The Bell-type correlation measurement can be reformulated as

a paired measurement with an (arbitrary) chosen zero and a selected parameter with value α so that the difference can be formed:

(1) Δ(0,α) = m(α) - m(0) . Let's now take another measurement:

(2) Δ(α,α+ε) whereby ε << α (possibly lim ε -> 0)

Now, we want to reuse the result (the Δ obtained in (1)) in a monotonic way, that is, we claim a monotonic extrapolation of (1):

(3) εΔ(0,α) ≈ αΔ(α,α+ε)

This will not work well, we should better use:

(4) ε²Δ(0,α) ≈ α²Δ(α,α+ε)

instead. But then, for lim ε -> 0 , d(Δ(α,α+ε))/dε = 0 can be derived. This holds for any particular α . Therefore we conclude that:

Δ(0,α) is constant (contradiction). Obviously, Δ(0,α) can't be regarded as a measurement.

What follows? That the Bell type correlation measurement is not a measurement. "We can't measure a potentially given hidden parameter". That's it. You can't go further than that (you can always speculate though...)

I see superdeterminism as a sort of hidden variable theory. However, there is no decidable algorithm for computing the geodesic set for the hidden variable. This is the invariant set theory of this hidden variable. If a computable dynamics existed, it would be equivalent to the existence of a global algorithm for all points on the complement of a fractal, such as occurs with a Cantor set. This is mathematically not possible. This has the implication that superdeterminism is a formalism for a nonlocal hidden variable.

There are no escapes from the trap of uncertainty in QM, no more than there is an exit door from a black hole. In fact the two may be related. The solution to the measurement problem may simply be that no such solution exists.

Quick follow-on. The FR argument illustrates how witnesses of an observation can have conflicting reports. This means that, in a general sense, with witnesses being in effect Wigner watching his friend, there is no concrete meaning to reality in QM, and this includes measured results.

Lawrence,

>This has the implication that superdeterminism is a formalism for a nonlocal hidden variable.

Exactly. In this regard, it is not less "nonlocal" than the de Broglie-Bohm theory. But I avoid the (problematic) word "nonlocal", because some people could get afraid of it. I prefer "spatially distributed support".

>The solution to the measurement problem may simply be that no such solution exists.

In the photonic case, the quantum effect is directly related to the limit of the speed of light. A photon moving along the x axis can't have support along that axis due to Einstein's length contraction √(1-v²/c²) where v = c (with v along the x axis). So, the photon's support has to be delivered "from the sides". Highly counterintuitive.

Lawrence Crowell,

"I see superdeterminism as a sort of hidden variable theory."

Yes, it is a deterministic HVT where the statistical independence assumption (SI) does not hold.

"This has the implication that superdeterminism is a formalism for a nonlocal hidden variable."

Not true. If SI cannot be shown to hold for a specific theory, Bell's argument is unsound and its conclusion does not follow. Sure, it is possible for the conclusion of an unsound argument to be true, but we have no reason to believe it is true, so we can safely ignore it.

I am not sure what the point of the fractal analogy is.

"The FR argument illustrates how witnesses of an observation can have conflicting reports. This means in a general sense, witnesses being in a sense Wigner watching his friend, there is no concrete meaning to a reality to QM, ans this includes measured results."

Just like in the case of Bell's theorem, the conclusion of the FR argument only follows if the argument is sound. But, as I pointed out, the argument is not sound. Hence, the conclusion does not follow. Again, it might be the case that the conclusion is true, but we have no reason to believe it is true, so we can ignore it.

Lawrence Crowell,

Let me be more precise about the FR argument. From the FR paper we read:

"The other observer, agent W, has no direct access to the outcome z observed by his friend F"

It is not possible to isolate a lab in such a way so that someone outside cannot observe the outcome of a measurement. The only limit, for both the inside and outside observers is the uncertainty principle. So, their perspectives will always be the same.

I am rather constrained on time today, so I may not respond in full. Sixte's comments are in line with this, and the discussion of measurement is similar to the sort of undecidability of all possible outcomes on hidden variables. This is why I say superdeterminism is about nonlocal hidden variables. Conversely, I disagree with Andrei. Attempts to work around the epistemological barrier or horizon of QM have a long history of failure.

I would just like to kindly remind you that the transistor and the remaining early solid-state devices had no dependence on any breakthrough in physics. Those were parallel paths of scientific and engineering development that merged only later. The first transistor (a bipolar junction device, as we call it today) was created by pure accident -- the guys were trying to build a JFET based on some patents dating back to the 1920s. They utterly failed at the latter goal, but were smart enough to explain the unexpected success of the resulting failure. And then came the Nobel prize. This "not exactly scientific" approach that worked was even more apparent in the vacuum tube era. So, please do not hijack these inventions; that wasn't "physics". Physicists helped optimize the devices and started designing their own, but the timeline was a bit different.

If you want to define electromagnetism as "not physics" then you are badly abusing common terminology.

One could say the electric light bulb was not the result of physics but the work of Edison. However, the generation of photons is due to blackbody theory and ultimately quantum mechanics. Lilienfeld proposed a FET type of device, and it was Bardeen, Brattain and Shockley who employed P and N junctions with dopants in silicon. This results in an effective potential between these different silicon types. This is very much the result of quantum physics.

The people who made the groundbreaking discoveries, especially those in the early years of the vacuum tube era, did not have a thorough understanding of the modern theory of electromagnetism, which certainly is physics.

In other words, if the theory leads the experiment and makes testable predictions, it certainly is science. If the experiment leads and the theory is lagging, the statement "we have no idea what would happen, let's go and figure it out" is science as well. But if you are an extremely well-educated experimenter who aims at building X and ends up with a completely different Y, you are simply wrong, no matter how useful Y might turn out to be. And this is precisely the story behind the transistor. Early vacuum tubes didn't even have that; it was pure random hacking. *Then* came the guys who understood electromagnetism and quantum mechanics well and created true marvels, but they were not the inventors. Hence, no credit for physics here.

So the transistor was invented by physicists, but those physicists were only inventors and not physicists, so it's not physics. Alrighty. This is so silly it's not even worth arguing about.

Both engineer-inventors and physicist-theorists throughout history were all hackers. The latter just hacked some math together to get something that worked. One big happy family of hackers.

I'm an old computer guy. Originally "hacker" didn't mean criminal, just someone with unusually deep knowledge of a system who wasn't afraid to go in and change things. Basically, good people to know if you had a problem to solve.

Bee, you really seem not to understand the historical context. Shamelessly borging the inventions of others, sometimes made in direct conflict with the physics as known or (mis)understood back then, or -- more often -- in no relation to that physics, is a very distinctive trait of physicists. This might indeed be a good tool for luring prey during the Department of Physics open days, but it would help if you did not believe your own propaganda.

If theologians had invented the transistor, it would not have been an argument in favour of theology.

And so it is not in favour of physics. Those guys were simply lucky (and damn smart), despite the nascent solid-state physics that had misled them for a moment.

You are similarly abducting Computer Science, which is mostly a branch of mathematics and in its foundations does not depend much on any particular physical context imposed by Nature, the same way the Pythagorean theorem does not. It has nothing to do with physics, although many of the founding fathers were indeed physicists.

I can see nothing silly about this.

Philip: no, this is not true. Many ground-breaking feats of engineering were first correctly designed conceptually and then built, exactly the way Sabine suggests. The Nobel-winning tunnel diode would be one example. But this all has been happening since the later part of the 20th century, well past the initial exploration epoch.

Lawrence: this is exactly what I mean, the incandescent light bulb was no fruit of physics, just an invention. Physicists explained it *later*, because they had no other option. An exemplary postdiction. P and N doping was not a novelty back then; crystal diodes had been used in quantity during WW2. The holy grail was an amplifying device, a solid-state analogue of the thermionic triode. Bardeen, Brattain and Shockley were trying to build *a wrong device*. And it was germanium, not silicon, BTW.

@piotr:

To say that the light bulb and the transistor are not fruits of physics is rather like saying that the novel is not the fruit of writing.

It's just silly.

You might want to note that Bardeen won the Nobel prize for physics, not just once, but twice.

I'd like to point out here that Shockley was a scientific racist - he was into the pseudo-sciences of eugenics and race/intelligence.

In recent days, there has been more evidence for a ‘fifth force’ found in radioactive decay measurements, now in helium. This glitch in theoretical prediction has been around for a number of years and it is now getting worse. Is this the kind of inconsistency in theory that particle physicists should look into, either to verify it as true or to disprove it? Or should physicists get back to wallowing in the beauty of their equations?

There is this article link posted below. I remember reading about this with respect to Be-8 --> 2 He-4. This talks a lot about the decay of He-4, which I always thought of as absolutely stable, or as stable as baryons are. So I am a bit unsure about that.

It is of course interesting if there should be some new gauge force, or, if nothing else, something new about the gauge fields we know. There are also gaps in our understanding of neutrinos.

https://physicsworld.com/a/more-evidence-for-a-fifth-force-found-in-radioactive-decay-measurements/

"Think of Einstein and Dirac and of Higgs and the others who predicted the Higgs boson. What did these correct predictions have in common?

They have in common that they were based on theoretical advances which resolved an inconsistency in the then existing theories."

Maybe the focus on unreachable geniuses like Einstein, Dirac, or Feynman is also part of the problem. It risks neglecting the role of less illustrious hard workers like Max Planck, Arnold Sommerfeld, or Lise Meitner. The reason I write this is the early theoretical work of Max Planck on thermodynamics. (And because those were people who promoted others, as Einstein later did with Satyendranath Bose.)

Here is a specific example that I have in mind: Assume that the work of Lienhard Pagel on dynamic information would turn out to provide an important perspective on why quantum mechanics and general relativity are hard to unify. The idea would be that whenever a quantum mechanical description emerges on a higher level (like Helium-4 being a boson composed of many fermionic particles), the Planck constant for that emergent system would still have exactly the same numerical value. Assume that it would be possible to prove this as a mathematical theorem (for the "non-general-relativity" regime). Since the units of the Planck constant are the product of energy and time, and also the product of momentum and distance, this might not fit into the regime of general relativity, where notions of time and energy become "less constant".

My point is not so much this specific example, but more the fact that it doesn't seem to be included in your suggestion to "resolve an inconsistency". It would more be the opposite, namely to exhibit an inconsistency more clearly. In a certain sense, Max Planck's contribution to quantum mechanics falls into the same category.

Have just ordered your book in paperback now that I can, and look forward to getting it! I've been waiting; I very rarely allow myself to buy hardbacks.

Reading such arguments in 2020 is quite depressing. Working out details of what we already know? It seems we are back to squalid-state physics and the Anderson-Weinberg debate once again?

Your arguments about the "foundational" origins of what you call societal progress are quite misleading and do not take into account the fact that as foundational physics moves towards higher and higher energies (and smaller and smaller scales), the chances of discovering something relevant to our scale thin out dramatically.

Past foundational breakthroughs had measurable consequences at energy scales which are relevant for us. This helped theorists but also allowed for concrete technological applications (from transistors to nuclear energy and GPS) which changed our society.

However, a big part of present-day difficulties in high-energy physics is due exactly to the lack of new experimental data at higher energy scales. Emergence and universality strongly suggest that different short-ranged microscopic behavior will lead to the same large-scale behavior, so that we have little hope of extracting any information about vastly more microscopic degrees of freedom from the scales we can probe. This is an unprecedented situation, which has hampered foundational progress in the last 40 years or so, but which also casts serious doubt over any impact advances towards higher-energy theories may have. A few years ago I was asked by several people how the Higgs boson discovery would impact our lives, and of course my answer was zero, nil, nothing.

More concrete applications may eventually pop out of a better understanding of quantum mechanics, even if a corpus of no-go theorems and similar results/constraints makes me doubt that the technological impact of advances in this area could be truly revolutionary (but here I may easily be wrong, I am no seer at all).

I can understand the intellectual appeal of foundational physics, which is surely a goal worth pursuing for the sake of pure knowledge. But selling it as a modern source of societal advance is a travesty. Big breakthroughs, on the other hand, could be realised in many other areas such as neuroscience or high-temperature superconductivity. Claiming that understanding these problems is merely derivative just shows that the lessons of Feynman, Anderson, Kadanoff, Wilson and many others have been sadly forgotten.

Sabine,

Thanks again for pointing me at your publisher, Basic Books, who gave me an EPUB format version of your book, which I converted into MOBI format to read on my Kindle.

I established with BB that the only way I can pay them for it is to wait until the Kindle book appears in the UK (if it does) and then purchase it!

I am only part-way through your book, but I want to let you know how good it is! You also write extremely clearly, and I would guess you wrote the English version yourself, as well as (presumably) the German version.

If you do another version, I would suggest you add a glossary.

It is interesting to be reminded that the whole concept of beauty in physics came from religious ideas - it is easy to forget that.

As I read the chapter on beauty, one thought (well, more than one really!) kept niggling within me. The problem with searching for beauty is that you can't expect considerations of beauty to work if you are not probing the bottom layer of physics. I learned this as a teenager when I read about PV=nRT, a really elegant formula that is easy to apply.

I thought I had learned something fundamental and precise, only to be disappointed to learn that the equation is only approximate, basically because there is another layer underneath: the quantum theory as applied to molecules. Lots of physics formulae have the same characteristic: Ohm's law, the laws of kinetics in chemistry, etc.
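The PV=nRT point can even be made quantitative. A small sketch (my own illustration; the van der Waals constants below are standard textbook values for CO2) showing how the ideal gas law drifts away from the corrected equation once the "layer underneath" starts to matter:

```python
# Compare the ideal gas law with the van der Waals equation, which adds
# two corrections from molecular physics: excluded volume (b) and
# intermolecular attraction (a). Constants are textbook values for CO2.
R = 8.314          # J / (mol K), gas constant
a = 0.364          # J m^3 / mol^2, attraction correction for CO2
b = 4.27e-5        # m^3 / mol, excluded-volume correction for CO2

def p_ideal(n, V, T):
    """Ideal gas pressure in pascal."""
    return n * R * T / V

def p_vdw(n, V, T):
    """Van der Waals pressure in pascal."""
    return n * R * T / (V - n * b) - a * n**2 / V**2

# One mole of CO2 compressed into one litre at room temperature:
n, V, T = 1.0, 1e-3, 298.0
print(p_ideal(n, V, T))   # ~2.48e6 Pa
print(p_vdw(n, V, T))     # noticeably lower: the corrections matter here
```

At low densities the two formulas agree closely, which is exactly why the elegant version works so well in practice.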

Is it possible that all that is wrong with the concept of beauty as applied to high energy physics is that physicists are not yet probing the lowest layer?

It's not only physicists that can get lost in maths, mathematicians can get lost in it too!

Some time ago I got excited about Connes' non-commutative geometry. It sounded like an excellent idea. If Einstein can be said to have geometrised gravity, then geometrising quantum theory may turn out to yield new insights. And of course, since quantum mechanics is famously non-commutative, we ought to have a non-commutative geometry.

I should add here that I'm actually inclined to think that Einstein didn't geometrise gravity. I read a recent article that provided evidence that he saw the geodesic equation in GR as the unification of two physical concepts, gravity and inertia, and not geometrically as a geodesic. Presumably the geometric perspective was promoted by more mathematically inclined physicists.

Anyway, once I had got my hands on Connes' book, I found to my dismay that I hardly understood a word, and this was supposed to be a popular exposition! His work is ferociously mathematical.

I've also read an opinion on the FQXi site that the average physicist/mathematician may expect to understand what Connes is doing in 300 years' time, after the novelties of his work have been digested by the two communities.

This is some time to wait.

Given that his spectral action principle plus a small geometric input reproduces the entire standard model action, which has over a hundred terms, his work, unlike string theory, links directly with experimentally validated equations. That makes it, to my mind, much closer to the classical work on the mathematical approach to physics than to the more free-wheeling speculations of post-modern physics.

Personally, I see his work as akin to Maxwell's in integrating the standard model into a single powerful formalism.

Do you have any ideas on how to cut down the three hundred years to something within a more manageable time span? I mean, are there sociological barriers to overcome here? And what are your own views, if any, on Connes' work?

@ Mozibur,

Wheeler gave up on his geometrodynamics program in the 70s because he couldn't geometricize Fermi fields. One can now do that using Connes' noncommutative geometry. See my lecture notes "Einstein's Dream" on my webpage for a description of this revised program.

@ Mozibur,

There are no sociological barriers to overcome; one 'merely' needs to know a great deal of modern mathematics. Connes' work is so great that it deserves a place on Sagan's Disk!

"The average physicist/mathematician may expect to understand what Connes is doing in 300 years' time after the novelties of his work have been digested by the two communities."

That's right. It takes so much longer to read and understand a mathematical idea than to come up with it.

1. Everything is reduced to physics. So they could claim credit for everything. Getting the basics right doesn't mean you get credit for everything. To think that the person who invented the piano or the violin can place himself above Chopin or Vivaldi is just silly.

2.

"So the transistor was invented by physicists, but those physicists were only inventors and not physicists, so it's not physics. Alrighty. This is so silly it's not even worth arguing about."

What if the inventors of all useful things had been carpenters? Again, this won't matter for the fruits of future research. It's not obvious that solving the problems you talk about will mean you will patent something as useful as the transistor on the side.

Sitting on the shoulders of past inventors doesn’t mean anything. There are plenty of companies who did amazing things but are bankrupt now. They have to pass the market test you know. And unlike in academia, you have to put your money where your mouth is. Talk is cheap.

I am personally skeptical that the next century of foundational physics research will bring anything comparable to past inventions, even if we made our best possible effort. I think it's obvious we can thank Newton, Maxwell, Feynman, Dirac, Bohr, Einstein, von Neumann etc., who made the scientific tools available to create economic benefit, but we can hit diminishing returns in physics.

3. Most importantly, fundamental papers are "nice", but they are just sitting on arXiv until you can turn them into some form of more practical benefit.

And obviously this labor might take a long time to bear fruit. Whether we can escape our Sun and survive in the galaxy in the far future probably rests on the shoulders of physicists. But this is probably centuries if not millennia from now.

4. The Soviet Union had plenty of smart people, physicists etc., yet it still suffered from famine and was way behind Western countries, just like NK is today. You really think they didn't have smart enough physicists? I bet they did. In fact, the SU boasted of hosting the smartest scientists around. But there is more to society than just scientific breakthroughs. My point is that societal progress is much more than just having the science right; it depends on practical innovators, incentives, politics, management, not to mention all the real social problems and issues. The same goes for companies, by the way.

You can discuss this thing ad infinitum. To me, it's not prima facie obvious that the "market price" of, say, some useful formula is actually that massive compared to some more practical innovation that may even be derivative of it. I'm happy to have you and Robin Hanson discuss this. He can formulate the economic argument about it on a formal (and mathematical) level.

5. And lastly: we are so amazingly rich that we can easily have smart people working on these problems without losing anything. Work away. I'm sure you can have your toys too; plenty of GDP to spare.

@Prof Edwards:

I had a quick glance at your 'paper'. It looks entertaining. Have you had it published in a peer-reviewed journal yet?

LQG is a continuation of Wheeler's geometrodynamics. And last time I looked, that programme was alive and kicking.

When Maxwell came up with his equations, they were mostly ignored. It took a couple of decades for the physics community to even begin to pick up on his work. Whereas now, of course, we teach it to undergraduates. Moreover, it takes a much simpler form when phrased in the language of differential forms, which is generally taught at graduate level.

Personally, I think it's a real pity that the standard physics curriculum doesn't teach this much earlier. It's simply a calculus on a manifold as opposed to Cartesian space.

After all, calculus was the preserve of a small group of serious mathematicians in Newton's time, but now it is taught all over the world to school-kids.

The sociological conditions I had in mind are the ones referred to in Smolin's book, The Trouble with Physics. In fact, he took the trouble to write up his thoughts and sent them to a respected journal. The paper was rejected because the editors told him that his findings were already well-known in the sociological community. And when he objected, they pointed him to a number of published articles.

He didn't mention economic or political considerations - which is what I had in mind. Neoliberalism in a word. I recall reading an article in New Scientist when I was at high school when this 'disciplinary' system was first coming into Britain and the article was giving voice to scientists complaining how this would ruin science.

@Steve Evans:

I don't understand you. That seems like a really superficial way to understand the relationship between creativity and pedagogy.

@ari:

'Everything is reduced to physics'. I don't think the best physicists think like that. It's a simple-minded joke that keeps cropping up that's not really even worth a laugh.

@ Mozibur (10:28 AM, June 22, 2020):

I was being sarky. It's not going to take anybody 300 years to learn some new algebra and geometry. NCG is already used in theoretical physics.

@Steven Evans:

I can't say I'm much interested in your sarkiness. Keep it to yourself.

Nevertheless, the author of the FQXi article was onto something. Voevodsky is on record as saying something similar. NCG is not just a little bit of high-school algebra; like I've already said, it is ferociously complex. The timescale may be a little exaggerated, but it's worth recalling that it took twenty years before Maxwell's theory began to be picked up by the physics mainstream. NCG, being significantly more sophisticated, is going to take quite a bit more time.

Dear Sabine,

I don't believe the "progress of society" depends much on foundations of physics anymore. I'm sure it used to. Perhaps better understanding QM will help with superconductivity, but I'm skeptical. Can you elaborate on that? How much uptick in GDP or some other measure can we expect? Describe something more concrete or measurable. Even relativity mostly helps us keep more accurate GPS.

Besides, how much do you have on the line on that? 100kE? Your house?

Don't get me wrong. Maybe progress on these very foundations will bring us something amazing, but it's well possible we have hit diminishing returns.

Care to debate Robin Hanson on that? He thinks research is overrated. I bet he would love betting on the question, given that he works on prediction markets and thinks people have too many overconfident beliefs because of bad incentives.

Sabine,

I have only got as far as Chapter 7 so far - yours is not a book to be rushed because it is full of insights!

A notable example was your observation that people didn't cling to the Earth-centred view of the cosmos just for religious reasons - naturalness played its part too. If we orbited the Sun, there would be parallax effects visible in the sky, unless the stars were a ridiculously long way away relative to the planets.
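The numbers behind that parallax argument are striking. A quick sketch (my own illustration): a star at distance d shows an annual parallax half-angle of about arctan(1 AU / d), and even the nearest star comes out far below what pre-telescope astronomers could resolve.

```python
# Annual parallax half-angle for a star, given Earth's orbital radius.
import math

AU = 1.496e11                       # metres, Earth-Sun distance
ARCSEC = math.pi / (180 * 3600)     # one arcsecond in radians

def parallax_arcsec(d_m):
    """Annual parallax half-angle, in arcseconds, for a star d_m metres away."""
    return math.atan(AU / d_m) / ARCSEC

d_proxima = 4.0e16   # ~4.2 light years in metres, roughly the nearest star
print(parallax_arcsec(d_proxima))   # ~0.77 arcsec, far below naked-eye resolution
```

So the heliocentrists had to posit stellar distances that seemed absurd at the time, which is exactly the "naturalness" objection.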

I particularly liked your section on QM, because this was something I actually used for real as part of my PhD, and like most students, I wanted to get a mental image of what the maths represented.

I was a bit puzzled by your description of the xenon experiment, because you said that when two xenon atoms were able to collide, they would emit an ion - which was detected. Surely the gas had to be pretty cold to show quantum effects - so the atomic collisions would be gentle.

It is amazing to me that QM has never been replaced by a deeper theory, despite all the theoretical advances in other areas. This makes me feel that QM is fundamental - so I think you are right to want to focus your research in that area rather than on yet another scattering experiment.

Thx for writing the book and raising these important topics.

About non-beauty being super-important... well, think about the transcendentals (pi, e). "Ugly", unsymmetrical, and yet absolutely essential to current human math and science.

The "ugly" and un-beautiful have been there in plain sight.

Dear Sabine,

Have you suffered exclusion by professional colleagues, ostracism from academia, or retaliation from institutions for sharing your views in Lost in Math? These are the professional price paid for a departure from groupthink; the professional and emotional costs are more relevant than emergent philosophy and belong in the second edition of your book.