Friday, June 28, 2019

Quantum Supremacy: What is it and what does it mean?

Rumors are that later this year we will see Google’s first demonstration of “quantum supremacy”. This is when a quantum computer outperforms a conventional computer. It’s about time that we talk about what this means.

Before we get to quantum supremacy, I have to tell you what a quantum computer is. All conventional computers work with quantum mechanics because their components rely on quantum behavior, like electron bands. But the operations that a conventional computer performs are not quantum.

Conventional computers store and handle information in the form of bits that can take on two values, say 0 and 1, or up and down. A quantum computer, on the other hand, stores information in the form of quantum-bits or q-bits that can take on any combination of 0 and 1. Operations on a quantum computer can then entangle the q-bits, which allows a quantum computer to solve certain problems much faster than a conventional computer.

Calculating the properties of molecules or materials, for example, is one of those problems that quantum computers can help with. In principle, properties like conductivity or rigidity, or even color, can be calculated from the atomic build-up of a material. We know the equations. But we cannot solve these equations with conventional computers. It would just take too long.

To give you an idea of how much more a quantum computer can do, think about this: One can simulate a quantum computer on a conventional computer just by numerically solving the equations of quantum mechanics. If you do that, then the computational burden on the conventional computer increases exponentially with the number of q-bits that you try to simulate. You can do 2 or 4 q-bits on a personal computer. But already with 50 q-bits you need a cluster of supercomputers. Anything beyond 50 or so q-bits cannot presently be calculated, at least not in any reasonable amount of time.
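To put numbers on that exponential growth, here is a back-of-the-envelope sketch in Python. It assumes the simplest kind of simulation, which stores the full state vector in double precision; the 16 bytes per amplitude and the cutoff points are assumptions of this sketch, not properties of any particular simulator.

```python
# A full state-vector simulation must store 2**n complex amplitudes
# for n q-bits. In double precision each amplitude takes 16 bytes.
def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (4, 30, 50):
    print(f"{n:2d} q-bits: {state_vector_bytes(n):,} bytes")
# 4 q-bits fit in 256 bytes, 30 q-bits already need 16 GiB,
# and 50 q-bits need about 18 petabytes -- supercomputer territory.
```

Each additional q-bit doubles the memory (and roughly the runtime), which is why the classical simulation wall sits somewhere around 50 q-bits.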

So what is quantum supremacy? Quantum supremacy is the event in which a quantum computer outperforms the best conventional computers on a specific task. It needs to be a specific task because quantum computers are really special-purpose machines whose powers help with particular calculations.

However, to come back to the earlier example, if you want to know what a molecule does, you need millions of q-bits and we are far away from that. So how then do you test quantum supremacy? You let a quantum computer do what it does best, that is being a quantum computer.

This is an idea proposed by Scott Aaronson. If you set up a quantum computer in a suitable way, it will produce probability distributions of measurable variables. You can try and simulate those measurement outcomes on a conventional computer but this would take a very long time. So by letting a conventional computer compete with a quantum computer on this task, you can demonstrate that the quantum computer does something a classical computer just is not able to do.
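To make the classical side of that competition concrete, here is a toy state-vector simulator. This is only an illustration: a real supremacy test uses deep random circuits, whereas this sketch just applies a Hadamard gate to every q-bit, and the function names are my own.

```python
import random

# Toy classical simulation of a quantum computer. This is exactly the
# kind of computation whose cost blows up exponentially with n.
H = [[2**-0.5, 2**-0.5], [2**-0.5, -2**-0.5]]  # Hadamard gate

def apply_gate(state, gate, target):
    """Apply a 2x2 single-q-bit gate to the `target` q-bit."""
    new = state[:]
    for i in range(len(state)):
        if (i >> target) & 1 == 0:      # i has the target bit 0,
            j = i | (1 << target)       # j is its partner with bit 1
            a, b = state[i], state[j]
            new[i] = gate[0][0] * a + gate[0][1] * b
            new[j] = gate[1][0] * a + gate[1][1] * b
    return new

n = 3                                   # number of q-bits
state = [0j] * 2**n
state[0] = 1 + 0j                       # start in |000>
for q in range(n):
    state = apply_gate(state, H, q)

# Measurement outcomes follow this probability distribution;
# a supremacy test compares samples like these against the real device.
probs = [abs(amp)**2 for amp in state]
sample = random.choices(range(2**n), weights=probs, k=10)
```

The inner loop touches all 2^n amplitudes for every gate, which is where the exponential cost of the classical simulation comes from.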

Exactly at which point someone will declare quantum supremacy is a little ambiguous because you can always argue that maybe one could have used better conventional computers or a better algorithm. But for practical purposes this really doesn’t matter all that much. The point is that it will show quantum computers really do things that are difficult to calculate with a conventional computer.

But what does that mean? Quantum supremacy sounds very impressive until you realize that most molecules have quantum processes that also exceed the computational capacities of present-day supercomputers. That is, after all, the reason we want quantum computers. And the generation of random variables that can be used to check quantum supremacy is not good for actually calculating anything useful. So that makes it sound as if the existing quantum computers are really just new toys for scientists.

What would it take to calculate anything useful with a quantum computer? Estimates about this vary between half a million and a billion q-bits, depending on just exactly what you think is “useful” and how optimistic you are that algorithms for quantum computers will improve. So let us say, realistically it would take a few million q-bits.

When will we get to see a quantum computer with a few million q-bits? No one knows. The problem is that the presently dominant approaches are unlikely to scale. These approaches are superconducting q-bits and ion traps. In neither case does anyone have any idea how to get beyond a few hundred. This is both an engineering problem and a cost problem.

And this is why, in recent years, there has been a lot of talk in the community about NISQ computers, short for “noisy intermediate-scale quantum” computers. This is really a term invented to make investors believe that quantum computing will have practical applications in the next decades or so. The trouble with NISQs is that while it is plausible that they will soon be practically feasible, no one knows how to calculate anything useful with them.

As you have probably noticed, I am not very optimistic that quantum computers will have practical applications any time soon. In fact, I am presently quite worried that quantum computing will go the same way as nuclear fusion, that it will remain forever promising but never quite work.

Nevertheless, quantum supremacy is without doubt going to be an exciting scientific milestone.

Update June 29: Video now with German subtitles. To see those, click CC in the YouTube toolbar and choose the language under the settings/gear icon.

Wednesday, June 26, 2019

Win a free copy of "Lost in Maths" in French

My book “Lost in Math: How Beauty Leads Physics Astray” was recently translated into French. Today is your chance to win a free copy of the French translation! The first three people who submit a comment to this blogpost with a brief explanation of why they are interested in reading the book will be the lucky winners.

The only entry requirement is that you must be willing to send me a mailing address. Comments submitted by email or left on other platforms do not count because I cannot compare time-stamps.

Update: The books are gone.

Monday, June 24, 2019

30 years from now, what will a next larger particle collider have taught us?

The year is 2049. CERN’s mega-project, the Future Circular Collider (FCC), has been in operation for 6 years. The following is the transcript of an interview with CERN’s director, Johanna Michilini (JM), conducted by David Grump (DG).

DG: “Prof Michilini, you have guided CERN through the first years of the FCC. How has your experience been?”

JM: “It has been most exciting. Getting to know a new machine always takes time, but after the first two years we have had stable performance and collected data according to schedule. The experiments have since seen various upgrades, such as replacing the thin gap chambers and micromegas with quantum fiber arrays that have better counting rates and have also installed… Are you feeling okay?”

DG: “Sorry, I may have briefly fallen asleep. What did you find?”

JM: “We have measured the self-coupling of a particle called the Higgs-boson and it came out to be 1.2 plus minus 0.3 times the expected value which is the most amazing confirmation that the universe works as we thought in the 1960s and you better be in awe of our big brains.”

DG: “I am flat on the floor. One of the major motivations to invest into your institution was to learn how the universe was created. So what can you tell us about this today?”

JM: “The Higgs gives mass to all fundamental particles that have mass and so it plays a role in the process of creation of the universe.”

DG: “Yes, and how was the universe created?”

JM: “The Higgs is a tiny thing but it’s the greatest particle of all. We have built a big thing to study the tiny thing. We have checked that the tiny thing does what we thought it does and found that’s what it does. You always have to check things in science.”

DG: “Yes, and how was the universe created?”

JM: “You already said that.”

DG: “Well isn’t it correct that you wanted to learn how the universe was created?”

JM: “That may have been what we said, but what we actually meant is that we will learn something about how nuclear matter was created in the early universe. And the Higgs plays a role in that, so we have learned something about that.”

DG: “I see. Well, that is somewhat disappointing.”

JM: “If you need $20 billion, you sometimes forget to mention a few details.”

DG: “Happens to the best of us. All right, then. What else did you measure?”

JM: “Ooh, we measured many many things. For example we improved the precision by which we know how quarks and gluons are distributed inside protons.”

DG: “What can we do with that knowledge?”

JM: “We can use that knowledge to calculate more precisely what happens in particle colliders.”

DG: “Oh-kay. And what have you learned about dark matter?”

JM: “We have ruled out 22 of infinitely many hypothetical particles that could make up dark matter.”

DG: “And what’s with the remaining infinitely many hypothetical particles?”

JM: “We are currently working on plans for the next larger collider that would allow us to rule out some more of them because you just have to look, you know.”

DG: “Prof Michilini, we thank you for this conversation.”

Thursday, June 20, 2019

Away Note

I'll be in the Netherlands for a few days to attend a workshop on "Probabilities in Cosmology". Back next week. Wish you a good Summer Solstice!

Wednesday, June 19, 2019

No, a next larger particle collider will not tell us anything about the creation of the universe

LHC magnets. Image: CERN.
A few days ago, Scientific American ran a piece by a CERN physicist and a philosopher about particle physicists’ plans to spend $20 billion on a next larger particle collider, the Future Circular Collider (FCC). To make their case, the authors have dug up a quote from 1977 and ignored the 40 years after this, which is a truly excellent illustration of all that’s wrong with particle physics at the moment.

I currently don’t have time to go through this in detail, but let me pick the most egregious mistake. It’s right in the opening paragraph where the authors claim that a next larger collider would tell us something about the creation of the universe:
“[P]article physics strives to push a diverse range of experimental approaches from which we may glean new answers to fundamental questions regarding the creation of the universe and the nature of the mysterious and elusive dark matter.

Such an endeavor requires a post-LHC particle collider with an energy capability significantly greater than that of previous colliders.”

We previously encountered this sales-pitch in CERN’s marketing video for the FCC, which claimed that the collider would probe the beginning of the universe.

But neither the LHC nor the FCC will tell us anything about the “beginning” or “creation” of the universe.

What these colliders can do is create nuclear matter at high density by slamming heavy atomic nuclei into each other. Such matter probably also existed in the early universe. However, even collisions of large nuclei create merely tiny blobs of such nuclear matter, and these blobs fall apart almost immediately. In case you prefer numbers over words, they last about 10⁻²³ seconds.

This situation is nothing like the soup of plasma in the expanding space of the early universe. It is therefore highly questionable already that these experiments can tell us much about what happened back then.

Even optimistically, the nuclear matter that the FCC can produce has a density about 70 orders of magnitude below the density at the beginning of the universe.

And even if you are willing to ignore the tiny blobs and their immediate decay and the 70 orders of magnitude, then the experiments still tell us nothing about the creation of this matter, and certainly not about the creation of the universe.

The argument that large colliders can teach us anything about the beginning, origin, or creation of the universe is manifestly false. The authors of this article either knew this and decided to lie to their readers, or they didn’t know it, in which case they have begun to believe their own institution’s marketing. I’m not sure which is worse.

And as I have said many times before, there is no reason to think a next larger collider would find evidence of dark matter particles. Somewhat ironically, the authors spend the rest of their article arguing against theoretical arguments, but of course the appeal to dark matter is a bona-fide theoretical argument.

In any case, it pains me to see not only that particle physicists are still engaging in false marketing, but that Scientific American plays along with it.

How about sticking with the truth? The truth is that a next larger collider costs a shitload of money and will most likely not teach us much. If progress in the foundations of physics is what you want, this is not the way forward.

Tuesday, June 18, 2019

Brace for the oncoming deluge of dark matter detectors that won’t detect anything

Imagine an unknown disease spreads, causing temporary blindness. Most patients recover after a few weeks, but some never regain eyesight. Scientists rush to identify the cause. They guess the pathogen’s shape and, based on this, develop test strips and antigens. If one guess doesn’t work, they’ll move on to the next.

Doesn’t quite sound right? Of course it does not. Trying to identify pathogens by guesswork is sheer insanity. The number of possible shapes is infinite. The guesses will almost certainly be wrong. No funding agency would pour money into this.

Except they do. Not for pathogen identification, but for dark matter searches.

In the past decades, the searches for the most popular dark matter particles have failed. Neither WIMPs nor axions have shown up in any detector, of which there have been dozens. Physicists have finally understood this is not a promising method. Unfortunately, they have not come up with anything better.

Instead, their strategy is now to fund any proposed experiment that could plausibly be said to maybe detect something that could potentially be a hypothetical dark matter particle. And since there are infinitely many such hypothetical particles, we are now well on the way to building infinitely many detectors. DNA, carbon nanotubes, diamonds, old rocks, atomic clocks, superfluid helium, qubits, Aharonov-Bohm, cold atom gases, you name it. Let us call it the equal opportunity approach to dark matter search.

As it should be, everyone benefits from the equal opportunity approach. Theorists invent new particles (papers will be written). Experimentalists use those invented particles as motivation to propose experiments (more papers will be written). With a little luck they get funding and do the experiment (even more papers). Eventually, experiments conclude they didn’t find anything (papers, papers, papers!).

In the end we will have a lot of papers and still won’t know what dark matter is. And this, we will be told, is how science is supposed to work.

Let me be clear that I am not strongly opposed to such medium scale experiments, because they typically cost “merely” a few million dollars. A few millions here and there don’t put overall progress at risk. Not like, say, building a next larger collider would.

So why not live and let live, you may say. Let these physicists have some fun with their invented particles and their experiments that don’t find them. What’s wrong with that?

What’s wrong with that (besides the fact that a million dollars is still a million dollars) is that it will almost certainly lead nowhere. I don’t want to wait another 40 years for physicists to realize that falsifiability alone is not sufficient to make a hypothesis promising.

My disease analogy, like any analogy, has its shortcomings of course. You cannot draw blood from a galaxy and put it under a microscope. But metaphorically speaking, that’s what physicists should do. We have patients out there: All those galaxies and clusters which are behaving in funny ways. Study those until you have good reason to think you know what the pathogen is. Then, build your detector.

Not all types of dark matter particles do an equally good job of explaining structure formation and the behavior of galaxies and all the other data we have. And particle dark matter is not the only explanation for the observations. Right now, the community makes no systematic effort to identify the best model to fit the existing data. And, needless to say, that data could be better, both in terms of sky coverage and resolution.

The equal opportunity approach relies on guessing a highly specific explanation and then setting out to test it. This way, null-results are a near certainty. A more promising method is to start with highly non-specific explanations and zero in on the details.

The failures of the past decades demonstrate that physicists must think more carefully before commissioning experiments to search for hypothetical particles. They still haven’t learned the lesson.

Sunday, June 16, 2019

Book review: “Einstein’s Unfinished Revolution” by Lee Smolin

Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum
By Lee Smolin
Penguin Press (April 9, 2019)

Popular science books cover a spectrum from exposition to speculation. Some writers, like Chad Orzel or Anil Ananthaswamy, stay safely on the side of established science. Others, like Philip Ball in his recent book, keep their opinions to the closing chapter. I would place Max Tegmark’s “Mathematical Universe” and Lee Smolin’s “Trouble With Physics” somewhere in the middle. Then, on the extreme end of speculation, we have authors like Roger Penrose and David Deutsch who use books to put forward ideas in the first place. “Einstein’s Unfinished Revolution” lies on the speculative end of this spectrum.

Lee is very upfront about the purpose of his writing. He is dissatisfied with the current formulation of quantum mechanics. It sacrifices realism, and he thinks this is too much to give up. In the past decades, he has therefore developed his own approach to quantum mechanics, the “ensemble interpretation”. His new book lays out how this ensemble interpretation works and what its benefits are.

Before getting to this, Lee introduces the features of quantum theories (superpositions, entanglement, uncertainty, measurement postulate, etc) and discusses the advantages and disadvantages of the major interpretations of quantum mechanics (Copenhagen, many worlds, pilot wave, collapse models). He deserves applause for also mentioning the Montevideo interpretation and superdeterminism, though clearly he doesn’t like either. I have found his evaluation of these approaches overall balanced and fair.

In the later chapters, Lee comes to his own ideas about quantum mechanics and how these tie together with his other work on quantum gravity. I have not been able to follow all his arguments here, especially not on the matter of non-locality.

Unfortunately, Lee doesn’t discuss his ensemble interpretation half as critically as the other approaches. From reading his book you may get away with the impression he has solved all problems. Let me therefore briefly mention the most obvious shortcomings of his approach. (a) To quantify the similarity of two systems you need to define a resolution. (b) This will violate Lorentz-invariance, which means it’s hard to make compatible with standard model physics. (c) You better not ask about virtual particles. (d) If a system gets its laws from precedents, where do the first laws come from? Lee tells me that these issues have been discussed in the papers he lists on his website.

Like all of Lee’s previous books, this one is well-written and engaging, and if you liked Lee’s earlier books you will probably like this one too. The book has the occasional paragraph that I think will be over many readers’ heads, but most of it should be understandable with little or no prior knowledge. I have found this book particularly valuable for spelling out the author’s philosophical stance. You may not agree with Lee, but at least you know where he is coming from.

This book is recommendable for anyone who is dissatisfied with the current formulation of quantum mechanics, or who wants to understand why others are dissatisfied with it. It also serves well as a quick introduction to current research in the foundations of quantum mechanics.

[Disclaimer: free review copy.]

Thursday, June 13, 2019

Physicists are out to unlock the muon’s secret

Fermilab g-2 experiment.
[Image Glukicov/Wikipedia]
Physicists count 25 elementary particles that, for all we presently know, cannot be divided any further. They collect these particles and their interactions in what is called the Standard Model of particle physics.

But the matter around us is made of merely three particles: up and down quarks (which combine into protons and neutrons, which in turn combine into atomic nuclei) and electrons (which surround atomic nuclei). These three particles are held together by a number of exchange particles, notably the photon and gluons.

What’s with the other particles? They are unstable and decay quickly. We only know of them because they are produced when other particles bang into each other at high energies, something that happens in particle colliders and when cosmic rays hit Earth’s atmosphere. By studying these collisions, physicists have found out that the electron has two bigger brothers: The muon (μ) and the tau (τ).

The muon and the tau are pretty much the same as the electron, except that they are heavier. Of these two, the muon has been studied more closely because it lives longer – about 2 × 10⁻⁶ seconds.

The muon turns out to be... a little odd.

Physicists have known for a while, for example, that cosmic rays produce more muons than expected. This deviation from the predictions of the standard model is not hugely significant, but it has stubbornly persisted. It has remained unclear, though, whether the blame is on the muons, or the blame is on the way the calculations treat atomic nuclei.

Next, the muon (like the electron and tau) has a partner neutrino, called the muon-neutrino. The muon neutrino also has some anomalies associated with it. No one currently knows whether those are real or measurement errors.

The Large Hadron Collider has seen a number of slight deviations from the predictions of the standard model which go under the name lepton universality anomalies. They basically tell you that the muon isn’t behaving like the electron, which (all other things being equal) it really should. These deviations may just be random noise and vanish with better data. Or maybe they are the real thing.

And then there is the gyromagnetic moment of the muon, usually denoted just g. This quantity measures how muons spin if you put them into a magnetic field. This value should be 2 plus quantum corrections, and the quantum corrections (the g-2) you can calculate very precisely with the standard model. Well, you can if you have spent some years learning how to do that, because these are hard calculations indeed. Thing is, though, the result of the calculation doesn’t agree with the measurement.
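For a sense of scale, the leading quantum correction is Schwinger's famous 1948 result a = α/2π, where α is the fine-structure constant. A quick sketch (this is only the first of thousands of terms in the full standard-model prediction, and the decimal places shown are rounded):

```python
import math

# Leading-order quantum correction to g (Schwinger, 1948): a = alpha/(2*pi).
# The full standard-model prediction adds many further terms, including
# the hadronic contributions that are hardest to compute.
alpha = 1 / 137.035999        # fine-structure constant
a_leading = alpha / (2 * math.pi)
g_approx = 2 * (1 + a_leading)

print(f"a = {a_leading:.6f}")   # ~0.001161
print(f"g = {g_approx:.5f}")    # ~2.00232
```

The measured and predicted values of g agree to many more digits than this; the g-2 anomaly lives in the decimal places far beyond the leading term.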

This is the so-called muon g-2 anomaly, which we have known about since the 1960s when the first experiments ran into tension with the theoretical prediction. Since then, both the experimental precision as well as the calculations have improved, but the disagreement has not vanished.

The most recent experimental data comes from an experiment at Brookhaven National Lab, whose final results were published in 2006 and placed the disagreement at 3.7σ. That’s interesting for sure, but nothing that particle physicists get overly excited about.

A new experiment is now following up on the 2006 result: The muon g-2 experiment at Fermilab. The collaboration projects that (assuming the mean value remains the same) their better data could increase the significance to 7σ, hence surpassing the discovery standard in particle physics (which is somewhat arbitrarily set to 5σ).
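The arithmetic behind that projection is simple: the significance is the deviation divided by the combined theoretical and experimental uncertainty, so shrinking the experimental error bar raises the sigma count even if the central value never moves. The numbers below are illustrative stand-ins of roughly the published magnitudes, not the collaboration's official figures:

```python
import math

# Significance = deviation / combined (theory + experiment) uncertainty.
# Illustrative numbers roughly matching the Brookhaven-era values:
delta = 27.4e-10       # observed deviation in the muon's a = (g-2)/2
sigma_th = 4.3e-10     # theory uncertainty
sigma_exp = 6.3e-10    # Brookhaven experimental uncertainty

def significance(delta, sigma_th, sigma_exp):
    # Uncertainties are independent, so they add in quadrature.
    return delta / math.sqrt(sigma_th**2 + sigma_exp**2)

print(significance(delta, sigma_th, sigma_exp))      # ~3.6 sigma
# Shrink the experimental error by a factor of ~4, keep the mean fixed:
print(significance(delta, sigma_th, sigma_exp / 4))  # ~6 sigma
```

Note that once the experimental error is small, the theory uncertainty dominates the denominator, which is why the theory side is being recomputed in parallel with the new measurements.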

For this experiment, physicists first produce muons by firing protons at a target (some kind of solid). This produces a lot of pions (composites of a quark and an antiquark) which decay by emitting muons. The muons are then collected in a ring equipped with magnets in which they circle until they decay. When the muons decay, they produce two neutrinos (which escape) and a positron that is caught in a detector. From the direction and energy of the positron, one can then infer the magnetic moment of the muon.

The Fermilab g-2 experiment, which reuses parts of the hardware from the earlier Brookhaven experiment, is already running and collecting data. In a recent paper, Alexander Keshavarzi, on behalf of the collaboration, reports that they successfully completed the first physics run last year. He writes that we can expect a publication of the results from the first run in late 2019. After some troubleshooting (something about an underperforming kicker system), the collaboration is now in the second run.

Another experiment to measure more precisely the muon g-2 is underway in Japan, at the J-PARC muon facility. This collaboration too is well on the way.

While we don’t know exactly when the first data from these experiments will become available, it is clear already that the muon g-2 will be much talked about in the coming years. At present, it is our best clue for physics beyond the standard model. So, stay tuned.

Wednesday, June 12, 2019

Guest Post: A conversation with Lee Smolin about his new book "Einstein’s Unfinished Revolution"

[Tam Hunt sent me another lengthy interview, this time with Lee Smolin. Smolin is a faculty member at the Perimeter Institute for Theoretical Physics in Canada and adjunct professor at the University of Waterloo. He is one of the founders of loop quantum gravity. In the past decades, Smolin’s interests have drifted to the role of time in the laws of nature and the foundations of quantum mechanics.]

TH: You make some engaging and bold claims in your new book, Einstein’s Unfinished Revolution, continuing a line of argument that you’ve been making over the course of the last couple of decades and a number of books. In your latest book, you argue essentially that we need to start from scratch in the foundations of physics, and this means coming up with new first principles as our starting point for re-building. Why do you think we need to start from first principles and then build a new system? What has brought us to this crisis point?

LS: The claim that there is a crisis, which I first made in my book, Life of the Cosmos (1997), comes from the fact that it has been decades since a new theoretical hypothesis was put forward that was later confirmed by experiment. In particle physics, the last such advance was the standard model in the early 1970s; in cosmology, inflation in the early 1980s. Nor has there been a completely successful approach to quantum gravity or the problem of completing quantum mechanics.

I propose finding new fundamental principles that go deeper than the principles of general relativity and quantum mechanics. In some recent papers and the book, I make specific proposals for new principles.

TH: You have done substantial work yourself in quantum gravity (loop quantum gravity, in particular) and quantum theory (suggesting your own interpretation called the “real ensemble interpretation”), and yet in this new book you seem to be suggesting that you and everyone else in foundations of physics needs to return to the starting point and rebuild. Are you in a way repudiating your own work or simply acknowledging that no one, including you, has been able to come up with a compelling approach to quantum gravity or other outstanding foundations of physics problems?

LS: There are a handful of approaches to quantum gravity that I would call partly successful. These each achieve a number of successes, which suggest that they could plausibly be at least part of the story of how nature reconciles quantum physics with space, time and gravity. It is possible, for example, that these partly successful approaches model different regimes or phases of quantum gravity phenomena. These partly successful approaches include loop quantum gravity, string theory, causal dynamical triangulations, causal sets, asymptotic safety. But I do not believe that any approach to date, including these, is fully successful. Each has stumbling blocks that after many years remain unsolved.

TH: You part ways with a number of other physicists in recent years who have railed against philosophy and philosophers of physics as being largely unhelpful for actual physics. You argue instead that philosophers have a lot to contribute to the foundations of physics problems that are your focus. Have you found philosophy helpful in pursuing your physics for most of your career or is this a more recent finding in your own work? Which philosophers, in particular, do you think can be helpful in this area of physics?

LS: I would first of all suggest we revive the old idea of a natural philosopher, which is a working scientist who is inspired and guided by the tradition of philosophy. An education and immersion in the philosophical tradition gives them access to the storehouse of ideas, positions and arguments that have been developed over the centuries to address the deepest questions, such as the nature of space and time.

Physicists who are natural philosophers have the advantage of being able to situate their work, and its successes and failures, within the long tradition of thought about the basic questions.

Most of the key figures who transformed physics through its history have been natural philosophers: Galileo, Newton, Leibniz, Descartes, Maxwell, Mach, Einstein, Bohr, Heisenberg, etc. In more recent years, David Finkelstein is an excellent example of a theoretical physicist who made important advances, such as being the first to untangle the geometry of a black hole, and recognize the concept of an event horizon, who was strongly influenced by the philosophical tradition. Like a number of us, he identified as a follower of Leibniz, who introduced the concepts of relational space and time.

The abstract of Finkelstein’s key 1958 paper on what were soon to be called black holes explicitly mentions the principle of sufficient reason, which is the central principle of Leibniz’s philosophy. None of the important developments of general relativity in the 1960s and 1970s, such as those by Penrose, Hawking, Newman, Bondi, etc., would have been possible without that groundbreaking paper by Finkelstein.

I asked Finkelstein once why it was important to know philosophy to do physics, and he replied, “If you want to win the long jump, it helps to back up and get a running start.”

In other fields, we can recognize people like Richard Dawkins, Daniel Dennett, Lynn Margulis, Steve Gould, Carl Sagan, etc. as natural philosophers. They write books that argue the central issues in evolutionary theory, with the hope of changing each other’s minds. But we the lay public are able to read over their shoulders, and so have front row seats to the debates.

There are also working now a number of excellent philosophers of physics, who contribute in important ways to the progress of physics. One example of these is a group, centred originally at Oxford, of philosophers who have been doing the leading work on attempting to make sense of the Many Worlds formulation of quantum mechanics. This work involves extremely subtle issues such as the meaning of probability. These thinkers include Simon Saunders, David Wallace, Wayne Myrvold; and there are equally good philosophers who are skeptical of this work, such as David Albert and Tim Maudlin.

It used to be the case, half a century ago, that philosophers, such as Hilary Putnam, who opined about physics felt qualified to do so with a bare knowledge of the principles of special relativity and single-particle quantum mechanics. In that atmosphere my teacher Abner Shimony, who had two Ph.D.s – one in physics and one in philosophy – stood out, as did a few others who could talk in detail about quantum field theory and renormalization, such as Paul Feyerabend. Now the professional standard among philosophers of physics requires a mastery of Ph.D.-level physics, as well as the ability to write and argue with the rigour that philosophy demands. Indeed, a number of the people I just mentioned have Ph.D.s in physics.

TH: One of your suggested hypotheses, the next step you take after stating your first principles, is an acknowledgment that time is fundamental, real and irreversible, effectively goring one of the sacred cows of modern physics. You made your case for this approach in your book Time Reborn and I'm curious if you've seen a softening over the last few years in terms of physicists and philosophers beginning to be more open to the idea that the passage of time is truly fundamental? Also, why wouldn't this hypothesis be instead a first principle, if time is indeed fundamental?

LS: In my experience, there have always been physicists and philosophers open to these ideas, even if there is no consensus among those who have carefully thought the issues through.

When I thought carefully about how to state a candidate set of basic principles, it became clear that it was useful to separate principles from hypotheses about nature. Principles such as sufficient reason and the identity of the indiscernible can be realized in formulations of physics in which time is either fundamental or secondary and emergent. Hence those principles are prior to the choice of a fundamental or emergent time. So I think it clarifies the logic of the situation to call the latter choice a hypothesis rather than a principle.

TH: How does viewing time as irreversible and fundamental mesh with your principle of background independence? Doesn’t a preferred spacetime foliation, which would provide an irreversible and fundamental time, provide a background?

LS: Background independence is an aspect of the two principles of Leibniz I just referred to: 1) sufficient reason (PSR) and 2) the identity of the indiscernible (PII). Hence it is deeper than the choice of whether time is fundamental or emergent. Indeed, there are theories which rest on both hypotheses about time (fundamental or emergent). Julian Barbour, for example, is a relationalist who develops background-independent theories in which time is emergent. I am also a relationalist, but I make background-independent models of physics in which time and its passage are fundamental.

Viewing time as fundamental and irreversible doesn’t necessarily imply a preferred foliation; by the latter you mean a foliation of a pre-existing spacetime, specified kinematically in advance of the dynamical evolution. In our energetic causal set models there does arise a notion of the present, but this is determined dynamically by the evolution of the model and so is consistent with what we mean by background independence.

The point is that the solutions to background-independent theories can have preferred frames, so long as they are generated by solving the dynamics. This is, for example, the case with cosmological solutions to general relativity.

TH: You and many other physicists have focused for many years on finding a theory of quantum gravity, effectively unifying quantum mechanics and general relativity. In describing your preferred approach to achieving a theory of quantum gravity worthy of the name you describe why you think quantum mechanics is incomplete and why general relativity is in some key ways likely wrong. Let’s look first at quantum mechanics, which you describe as “wrong” and “incomplete.” Why is the Copenhagen (still perhaps the most popular version of quantum theory) school of quantum mechanics wrong and incomplete?

LS: Copenhagen is incomplete because it is based on an arbitrarily chosen division of the world into a classical realm and a quantum realm. This reflects our practice as experimenters, and corresponds to nothing in nature. This means it is an operational approach, which conflicts with the expectation that physics should offer a complete description of individual phenomena, with no reference to our existence, knowledge or measurements.

TH: Your objections just stated (what’s known generally as the “measurement problem”) seem to me, even as an obvious non-expert in this area, to be fairly apparent and accurate objections to Copenhagen. If that’s the case, why is Copenhagen still with us today? Why was it ever considered a serious theory?

LS: I don’t think there are many proponents of the Copenhagen view among people working in quantum foundations, or who have otherwise thought about the issues carefully. I don’t think there are many enthusiastic followers of Bohr left alive.

Meanwhile, what most physicists who are not specialists in quantum foundations practice and teach is a very pragmatic, operational set of rules, which suffices because it closely parallels the practice of actual experimenters. They can get on with the physics without having to take a stand on realism.

What Bohr had in mind was a much more radical rejection of realism and its replacement by a view of the world in which nature and us co-create phenomena. My sense is that most living physicists haven’t read Bohr’s actual writings. There are of course some exceptions, like Chris Fuchs’s QBism, which is, to the extent that I understand it, an even more radical view. Even if I disagree, I very much admire Chris for the clarity of his thinking and his insistence on taking his view to its logical conclusions. But, in the end, for a realist like me who sees the necessity of completing quantum mechanics by the discovery of new physics, the intellectual contortions of anti-realists are, however elegant, no help for my projects.

TH: Could this be a good example of why philosophical training could actually be helpful for physicists?

LS: I would agree, in some cases it could be helpful for some physicists to study philosophy, especially if they are interested in discovering deeper foundational laws. But I would never say anyone should study philosophy, because it can be very challenging reading, and if someone is not inclined to think “philosophically” they are unlikely to get much from the effort. But I would say that if someone is receptive to the care and depth of the writing, it can open doors to new ideas and to a highly critical style of thinking, which could greatly aid someone’s research.

The point I would like to make here is rather different. As I discussed in my earlier books, there are different periods in the development of science during which different kinds of problems present themselves. These require different strategies, different educations and perhaps even different styles of research to move forward.

There are pragmatic periods where the laws needed to understand a wide range of phenomena are in place and the opportunities of greatly advancing our understanding of diverse physical phenomena dominate. These kinds of periods require a more pragmatic approach, which ignores whatever foundational issues may be present (and indeed, there are always foundational issues lurking in the background), and focuses on developing better tools to work out the implications of the laws as they stand.

Then there are (to follow Kuhn) revolutionary periods in science, when the foundations are in question and the priority is to discover and express new laws.

The kinds of people and the kinds of education needed to succeed are different in these two kinds of periods. Pragmatic times require pragmatic scientists, and philosophy is unlikely to be important. But foundational periods require foundational people, many of whom will, as in past foundational periods, find inspiration from philosophy. Of course, what I just said is an oversimplification. At all times, science needs a diverse mix of research styles. We always need pragmatic people who are very good at the technical side of science. And we always need at least a few foundational thinkers. But the optimal balance is different in different periods.

The early part of the 20th Century, through around 1930, was a foundational period. That was followed by a pragmatic period during which the foundational issues were ignored and many applications of quantum mechanics were developed.

Since the late 1970s, physics has been again in a foundational period, facing deep questions in elementary particle physics, cosmology, quantum foundations and quantum gravity. The pragmatic methods which got us to that point no longer suffice; during such a period we need more foundational thinkers and we need to pay more attention to them.

TH: Turning to general relativity, you also don’t mince your words and you describe the notion of reversible time, thought to be at the core of general relativity, as “wrong.” What does general relativity look like with irreversible and fundamental time?

LS: We posed exactly this question: can we invent an extension of general relativity in which time evolution is asymmetric under a transformation that reverses a measure of time? We found two ways to do this.

TH: You touched on consciousness as a physical phenomenon and a necessary ingredient in our physics in your book, Time Reborn (as have many other physicists over the last century, of course). You spend less time on consciousness in your new book — stating “Let us tiptoe past the hard question of consciousness to simpler questions” — but I’m curious if you’ve considered including as a first principle the notion that consciousness is a fundamental aspect of nature (or not) in your ruminations on these deep topics?

LS: I am thinking slowly about the problems of qualia and consciousness, in the rough direction set out in the epilogue of Time Reborn. But I haven’t yet come to conclusions worth publishing. An early draft of Einstein’s Unfinished Revolution had an epilogue entirely devoted to these questions, but I decided it was premature to publish; it also would have distracted attention from the central themes of that book.

TH: David Bohm, one of the physicists you discuss with respect to alternative versions of quantum theory, delved deeply into philosophy and spirituality in relation to his work in physics, as you discuss briefly in your new book. Do you find Bohm’s more philosophical notions such as the Implicate Order (the metaphysical ground of being in which the “explicate” manifest world that we know in our normal every day life is enfolded, and thus “implicate”) helpful for physics?

LS: I am afraid I’ve not understood what Bohm was aiming for in his book on the implicate order, or his dialogues with Krishnamurti, but it is also true that I haven’t tried very hard. I think one can admire greatly the practical and psychological knowledge of Buddhism and related traditions, while remaining skeptical of their more metaphysical teachings.

TH: Bohm’s Implicate Order has much in common with physical notions such as the (nonluminiferous) ether, which has been revived in today’s physics by some heavyweights such as Nobel Prize winner Frank Wilczek (The Lightness of Being: Mass, Ether, and the Unification of Forces) as another term for the set of space-filling fields that underlie our reality. Do you take the idea of reviving some notion of the ether as a physical/metaphysical background at all seriously in your work?

LS: The important part of the idea of the ether was that it is a smooth, fundamental, physical substance, which had the property that vibrations and stresses within it reproduced the phenomena described by Maxwell’s field theory of electromagnetism. It was also important that there was a preferred frame of reference associated with being at rest with respect to this substance.

We no longer believe any part of this. The picture we now have is that any such substance is made of a large collection of atoms. Therefore the properties of any substance are emergent and derivative. I don’t think Frank Wilczek disagrees with this, I suspect he is just being metaphorical.

TH: He doesn’t seem to be metaphorical, writing in a 1999 article:“Quite undeservedly, the ether has acquired a bad name. There is a myth, repeated in many popular presentations and textbooks, that Albert Einstein swept it into the dustbin of history. The real story is more complicated and interesting. I argue here that the truth is more nearly the opposite: Einstein first purified, and then enthroned, the ether concept. As the 20th century has progressed, its role in fundamental physics has only expanded. At present, renamed and thinly disguised, it dominates the accepted laws of physics. And yet, there is serious reason to suspect it may not be the last word.” In his 2008 book mentioned above, he reframes the set of accepted physical fields as “the Grid” (which is “the primary world-stuff”) or ether. Sounds like you don’t find this re-framing very compelling?

LS: What is true is that quantum field theory (QFT) treats all propagating particles and fields as excitations of a (usually unique) vacuum state. This is analogized to the ether, but in my opinion it’s a bad analogy. One big difference is that the vacuum of a QFT is invariant under all the symmetries of nature, whereas the ether breaks many of them by defining a preferred state of rest.

TH: You consider Bohm’s alternative quantum theory in some depth, and say that “it makes complete sense,” but after further discussion you consider it inadequate because it is generally considered to be incompatible with special relativity, among other problems.

LS: This is not the main reason I don’t think pilot wave theory describes nature.

Pilot wave theory is based on two equations. The first, the Schrödinger equation (the same as in ordinary QM), propagates the wave-function; the second, the guidance equation, guides the “particles.” The first can be made compatible with special relativity, while the second cannot. But when one adds an assumption about probabilities, the averages of the guided particles follow the waves and so agree with both ordinary QM and special relativity. In this way you can say that pilot wave theory is “weakly compatible” with special relativity, in the sense that, while there is a preferred sense of rest, it can’t be measured.
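For reference, in the standard non-relativistic formulation for a single particle of mass \(m\) in a potential \(V\) (my notation, not Smolin’s), the two equations read:

```latex
% Schrödinger equation: propagates the wave-function \psi
i\hbar\,\frac{\partial \psi}{\partial t}
  = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi + V\psi

% Guidance equation: moves the particle's actual position Q
\frac{dQ}{dt}
  = \frac{\hbar}{m}\,\mathrm{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\bigg|_{Q}
```

The “assumption about probabilities” is the quantum-equilibrium hypothesis: the particle positions are distributed according to \(|\psi|^2\), which is what makes the statistical predictions agree with ordinary QM.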

TH: If one considers time to be fundamental and irreversible, isn’t there a relativistic version of Bohmian mechanics readily available by adopting some version of Lorentzian or neo-Lorentzian relativity (which are background-dependent)?

LS: Maybe — you are describing research to be done.

TH: Last, how optimistic are you that your view, that today’s physics needs some really fundamental re-thinking, will catch on with the majority of today’s physicists in the next decade or so?

LS: I’m not, but I wouldn’t expect any such call for a reconsideration of the basic principles to be popular until it has results which make it hard to avoid thinking about.

Monday, June 10, 2019

Sometimes giving up is the smart thing to do.

A few years ago I signed up for a 10k race. It had an entry fee, it was a scenic route, and I had qualified for the first group. I was in my best shape. The weather forecast was brilliant.

Two days before the race I got a bad cold. But that wouldn’t deter me. Oh, no, not me. I’m not a quitter. I downed a handful of pills and went nevertheless. I started with a fever, a bad cough, and a banging head.

It didn’t go well. After half a kilometer I developed a chest pain. After one kilometer it really hurt. After two kilometers I was sure I’d die. Next thing I recall is someone handing me a bottle of water after the finish line.

Needless to say, my time wasn’t the best.

But the real problem began afterward. My cold refused to clear up properly. Instead I developed a series of respiratory infections. That chest pain stayed with me for several months. When the winter came, each little virus the kids brought home knocked me down.

I eventually went to see a doctor. She sent me to have a chest X-ray taken on the suspicion of tuberculosis. When the X-ray didn’t reveal anything, she put me on a two-week regimen of antibiotics.

The antibiotics finally cleared up whatever lingering infection I had been carrying. It took another month until I felt like myself again.

But this isn’t a story about the misery of aging runners. It’s a story about endurance sport of a different type: academia.

In academia we write Perseverance with a capital P. From day one, we are taught that pain is normal, that everyone hurts, and that self-motivation is the highest of virtues. In academia, we are all over-achievers.

This summer, as every summer for the past two decades, I receive notes about who is leaving. Leaving because they didn’t get funding, because they didn’t get another position, or because they’re just no longer willing to sacrifice their life for so little in return.

And this summer, as every summer for the past two decades, I find myself among the ones who made it into the next round, find myself sitting here, wondering if I’m worthy and if I’m in the right place doing the right thing at the right time. Because, let us be honest. We all know that success in academia has one or two elements of luck. Or maybe three. We all know it’s not always fair.

I’m writing this for the ones who have left and the ones who are about to leave. Because I have come within an inch of leaving half a dozen times and I have heard the nasty, nagging voice in the back of my head. “Quitter,” it says and laughs, “Quitter.”

Don’t listen. Of the people I know who left academia, few have regrets. And the few with regrets found ways to continue some research along with their new profession. The loss isn’t yours. The loss is academia’s. I understand your decision and I think you chose wisely. Just because everyone you know is in a race to nowhere doesn’t mean going with them makes sense. Sometimes, giving up is the smart thing to do.

A year after my miserable 10k experience, I signed up for a half-marathon. A few kilometers into the race, I tore a muscle.

I don’t get a runner’s high, but running increases my pain tolerance to unhealthy levels. After a few kilometers, you could probably stab me in the back and I wouldn’t notice. I could well have finished that race. But I quit.

Saturday, June 08, 2019

Book Review: “Beyond Weird” by Philip Ball

Beyond Weird: Why Everything You Thought You Knew about Quantum Physics Is Different
By Philip Ball
University of Chicago Press (October 18, 2018)

I avoid popular science articles about quantum mechanics. It’s not that I am not interested, it’s that I don’t understand them. Give me a Hamiltonian, a tensor-product expansion, and some unitary operators, and I can deal with that. But give me stories about separating a cat from its grin, the many worlds of Wigner’s friend, or suicides in which you both die and do not die, and I admit defeat on paragraph two.

Ball is guilty of some of that. I got lost halfway through his explanation of how a machine outputs plush cats and dogs when Alice and Bob put in quantum coins, and still haven’t figured out why the seer’s daughter wanted to be wed to a man evidently more stupid than she.

But then, clearly, I am not the book’s intended audience, so let me instead tell you something more helpful.

Ball knows what he writes about, that’s obvious from page one. For all I can tell the science in his book is flawless. It is also engagingly told, with some history but not too much, with some reference to current research, but not too much, with some philosophical discourse but not too much. Altogether, it is a well-balanced mix that should be understandable for everyone, even those without prior knowledge of the topic. And I entirely agree with Ball that calling quantum mechanics “weird” or “strange” isn’t helpful.

In “Beyond Weird,” Ball does a great job sorting out the most common confusions about quantum mechanics, such as that it is about discretization (it is not), that it defies the speed of light limit (it does not), or that it tells you something about consciousness (huh?). Ball even dispels the myths that Einstein hated quantum mechanics (he did not) and that Feynman dubbed the Copenhagen interpretation “Shut up and calculate” (he did not; also, there isn’t really such a thing as the Copenhagen interpretation), and, best of all, clears out the idea that many worlds solves the measurement problem (it does not).

In Ball’s book, you will learn just what quantum mechanics is (uncertainty, entanglement, superpositions, (de)coherence, measurement, non-locality, contextuality, etc), what the major interpretations of quantum mechanics are (Copenhagen, QBism, Many Worlds, Collapse models, Pilot Waves), and what the currently discussed issues are (epistemic vs ontic, quantum computing, the role of information).

As someone who still likes to read printed books, let me also mention that Ball’s is just a pretty book. It’s a high quality print in a generously spaced and well-readable font, the chapters are short, and the figures are lovely, hand-drawn illustrations. I much enjoyed reading it.

It is also remarkable that “Beyond Weird” has little overlap with two other recent books on quantum mechanics which I reviewed: Chad Orzel’s “Breakfast With Einstein” and Anil Ananthaswamy’s “Through Two Doors At Once.” While Ball focuses on the theory and its interpretation, Orzel’s book is about applications of quantum mechanics, and Ananthaswamy’s is about experimental milestones in the development and understanding of the theory. The three books together make an awesome combination.

And luckily the subtitle of Philip Ball’s book turned out to be wrong. I would have been disturbed indeed had everything I thought I knew about quantum physics been different.

[Disclaimer: Free review copy.]

Related: Check out my list of 10 Essentials of Quantum Mechanics.

Wednesday, June 05, 2019

If we spend money on a larger particle collider, we risk that progress in physics stalls.

[Image: CERN]
Particle physicists have a problem. For 40 years they have been talking about new particles that never appeared. The Large Hadron Collider was supposed to finally reveal them. It didn’t. This $10 billion machine has found the Higgs-boson, thereby completing the standard model of particle physics, but no other fundamentally new particles.

With this, the Large Hadron Collider (LHC) has demonstrated that arguments used by particle physicists for the existence of new particles beyond those in the standard model were wrong. With these arguments now falsified, there is no reason to think that a next larger particle collider will do anything besides measuring the parameters of the standard model to higher precision. And with the cost of a next larger collider estimated at $20 billion or so, that’s a tough sell.

Particle physicists have meanwhile largely given up spinning stories about discovering dark matter or recreating the origin of the universe, because it is clear to everyone now that this is marketing one cannot trust. Instead, they have a new tactic which works like this.

First, they will refuse to admit anything went wrong in the past. They predicted all these particles, none of which was seen, but now they won’t mention it. They hyped the LHC for two decades, but now they act like it didn’t happen. The people who previously made wrong predictions cannot be bothered to comment. Except for those like Gordon Kane and Howard Baer, who simply make new predictions and hope you have forgotten they ever said anything else.

Second, in case they cannot get away with outright denial, they will try to convince you it is somehow interesting they were wrong. Indeed, it is interesting – if you are a sociologist. A sociologist would be thrilled to see such an amazing example of groupthink, leading a community of thousands of intelligent people to believe that relying on beauty is a good method to make predictions. But as far as physics is concerned, there’s nothing to learn here, except that beauty isn’t a scientific criterion, which is hardly a groundbreaking insight.

Third, they will sure as hell not touch the question whether there might be better ways to invest the money, because that can only work to their disadvantage. So they will tell you vague tales about the need to explore nature, but not ever discuss whether other methods to explore nature would advance science more.

But the fact is that building a large particle collider presently has a high cost for little expected benefit. This money would be better invested in less costly experiments with higher discovery potential, such as astrophysical searches for dark matter (I am not talking about direct detection experiments), table-top searches for quantum gravity, 21cm astronomy, gravitational wave interferometers, or high-precision but low-energy measurements, just to mention a few.

And that is only considering the foundations of physics, leaving aside the overarching question of societal benefit. $20 billion that go into a particle collider are $20 billion that do not go into nuclear fusion, drug development, climate science, or data infrastructure, all of which can be reasonably expected to have a larger return on investment. At the very least it is a question one should discuss.

Add to this that the cost of a larger particle collider could go down dramatically in the next 20-30 years with future technological advances, such as wake-field acceleration or high-temperature superconductors. In the current situation, with colliders so extremely costly, it makes more economic sense to wait and see whether one of these technologies reaches maturity. Who wants to spend some billions digging a 100km tunnel when that tunnel may no longer be necessary by the time the collider could be in operation?

Anyone who talks about building a larger particle collider, but does not mention the issues named above, demonstrates that they neither care about progress in physics nor about social responsibility. They do not want a sincere discussion. Instead, they are presenting a one-sided view. They are merely lobbying.

If you encounter any such person, I recommend you ask them the following: Why were all these predictions wrong and what have particle physicists learned from it? Why is a larger particle collider a good way to invest such large amounts of money in the foundations of physics now? What is the benefit of such an investment for society?

And do not take as response arguments about benefiting collaborations, scientific infrastructure, or education, because such arguments can be made in favor of any large investment into science. Such generic arguments do not explain why a particle collider in particular is the thing to do. I have a handy list with responses to further nonsense arguments here.

A prediction. If you give particle physicists money for a next larger collider this is what will happen: This money will be used to hire more people who will tell you that particle physics is great. They will continue to invent new particles according to some new fad, and then claim they learned something when their expensive machine falsifies these inventions. In 40 years, we will still not know what dark matter is made of or how to quantize gravity. We will still not have a working fusion reactor, will still not have quantum computers, and will still have group-think in science. Particle physicists will then begin to argue they need a larger collider. Rinse and repeat.

Of course it is possible that a larger collider will find something new. The only way to find out with certainty is to build it and look. But the same “Just Look” argument can be made about any experiment that explores new frontiers. Point is: Particle physicists have so far failed to come up with any reason why going to higher energies is currently a promising route forward. The conservative expectation therefore is that the next larger collider would be much like the LHC, but for twice the price and without the Higgs.

Particle physics is a large and very influential community. Do not fall for their advertisements. Ask the hard questions.

Monday, June 03, 2019

The multiverse hypothesis: Are there other universes besides our own?

You are one of almost eight billion people on this planet. This planet is one of some hundred billion planets in this galaxy. This galaxy is one of some hundred billion galaxies in the universe. Is our universe the only one? Or are there other universes?

In the past decades, the idea that our universe is only one of many has become popular among physicists. If there are several universes, their collection is called the “multiverse”, and physicists have a few theories for this that I want to briefly tell you about.

1. Eternal Inflation.

We do not know how our universe was created and maybe we will never know. But according to a presently popular theory, called “inflation”, our universe was created from a quantum fluctuation of a field called the “inflaton”. In this case, there would be infinitely many such fluctuations giving rise to infinitely many universes. This process of universe-creation never stops, which is why it is called eternal inflation. 

These other universes may contain the same matter as ours, but in different arrangements, or they may contain different types of matter. They may have the same laws of nature, or entirely different laws. Really, pretty much anything goes, as long as you have space, time, and matter.

2. The String Theory Landscape

The string theory landscape came out of the realization that string theory does not, as originally hoped, uniquely predict the laws of nature we observe. Instead, the theory allows for many different laws of nature that would give rise to universes different from our own. The idea that all of them exist goes together well with eternal inflation, and so the two theories are often lumped together.

3. Many Worlds

Many Worlds is an interpretation of quantum mechanics. In quantum mechanics, we can make predictions only for probabilities. We can say, for example, a particle goes left or right, each with 50% probability. But then, when we measure it, we find it either left or right. And then we know where it is with 100% probability. So what happened to the other option?
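The 50/50 example corresponds to a state with equal amplitudes for “left” and “right”; the measurement probabilities are the squared magnitudes of those amplitudes (the Born rule). Here is a toy sketch of that rule — the example and names are mine, not part of any interpretation:

```python
import random

# Toy Born-rule sketch (my example): equal amplitudes 1/sqrt(2) for
# "left" and "right"; squaring them gives the 50/50 probabilities.
amplitudes = {"left": 2 ** -0.5, "right": 2 ** -0.5}
probs = {k: abs(a) ** 2 for k, a in amplitudes.items()}  # each 0.5

def measure():
    """One measurement: a single definite outcome, drawn with Born weights."""
    return random.choices(list(probs), weights=list(probs.values()))[0]

outcome = measure()  # after the measurement, this one outcome is certain
```

The quantum formalism itself only delivers the dictionary `probs`; what becomes of the outcome that was *not* drawn is exactly the question the interpretations disagree about.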

The most common attitude you find among physicists is: Who cares? We are here and that’s what we have measured, now let’s move on.

The many worlds interpretation, however, postulates that all possible outcomes of an experiment exist, each in a separate universe. It’s just that we happen to live in only one of those universes, and never see the other ones.

4. The Simulation Hypothesis

Video games are getting better by the day, and it’s easy to imagine that maybe one day they will be so good we can no longer tell apart the virtual world and the real world.

This brings up the question of whether we already live in a virtual world, one that is programmed by some being more intelligent than us and technologically ahead of us. If that is so, there is no reason to think that our universe is the only simulation that is going on. There may be many other universe simulations, programmed by superintelligent beings. This, too, is a variant of the multiverse.

5. The Mathematical Universe

Finally, let me briefly mention the idea, popularized by Max Tegmark, that all of mathematics exists, and that we merely observe a very small part of it. It is this small part of mathematics that we call our universe.

Are these theories science? Or are they fiction? Let me know what you think.

Does God exist? Science does not have an answer.

I know that some of you have been wondering what has happened to me that I go on about the existence of gods, but if you make it to the end of this blogpost, I am sure it will all make sense!

Before we can talk about whether God exists, I have to be clear about what kind of god I am talking about. I am talking about the old-fashioned personal god, the one who listens to prayers, who tells you how to be a good person, who sorts the good from the bad in the afterlife, and so on.

Some variants of this god are in actual conflict with evidence. Say, if you believe that evolution does not happen, or that praying cures cancer, and so on. If you want to defend such beliefs, you are in the wrong channel, goodbye. I will assume that you are here because, like me, you want to understand what we can learn from nature, so ignoring evidence is not an option.

What we have then is a god who is consistent with all our observations, but who himself does not result in any additional observable consequences. If you want to explain observations, then the scientific theories of the day are the best you can do. Adding god on top does not make the theories any more useful. By useful I mean concretely that a theory allows you to calculate patterns in data in a way that is quantifiably simpler than just collecting the data.

Example: The standard model of particle physics. It allows you to calculate what happens in particle collisions at the Large Hadron Collider. Now you can say, I take the standard model plus the hypothesis that it was made by god. But adding god does not simplify the calculations. So, god is superfluous.

The scientific approach is then to prefer the standard model without god. This, of course, is nothing other than Occam’s razor: you make a theory as simple as possible. Without this requirement, science becomes dysfunctional, because you would be allowed to add all kinds of unnecessary clutter.

Now, as we discussed previously, scientists say something “exists” if it is an element of a theory that is useful to explain observations. The Higgs-boson exists in this very sense. So do black holes and gravitational waves.

On the other hand, if something is not useful to explain observations, as is the case with god, science does not say it does not exist. Instead, it says nothing about whether it exists or not. It cannot say anything, because science is about what’s observable.

Personally, I am not sure what sense it makes to postulate the existence of something that has no observable consequences. But it is certainly something you can believe if you wish. It’s just that science cannot say anything about it.

So, as some of you have pointed out correctly, God could be said to exist in a different way than, say, elementary particles. Some have suggested calling it “immaterial existence”. But I find this misleading because space and time are also immaterial, yet they do exist in the scientific sense.

Some have suggested calling it “non-physical existence”, but this gives the impression that it has something to do with physics in particular, which is also misleading.

What it really is, is a non-scientific type of existence. Or, let us call it what it is: a religious existence. God exists in the religious way. An element of a hypothesis that does not result in observable consequences exists in the religious way.

So, here is my next homework assignment: Does the multiverse exist?

I think most of you will understand now what I am getting at. If you are not sure just what the multiverse is, I have another blogpost/video coming up in a few hours that briefly summarizes what this is all about. So, stay tuned.