
Thursday, June 13, 2019

Physicists are out to unlock the muon’s secret

Fermilab g-2 experiment.
[Image: Glukicov/Wikipedia]
Physicists count 25 elementary particles that, for all we presently know, cannot be divided any further. They collect these particles and their interactions in what is called the Standard Model of particle physics.

But the matter around us is made of merely three particles: up and down quarks (which combine into protons and neutrons, which in turn combine into atomic nuclei) and electrons (which surround the atomic nuclei). These three particles are held together by a number of exchange particles, notably the photon and gluons.

What’s with the other particles? They are unstable and decay quickly. We only know of them because they are produced when other particles bang into each other at high energies, something that happens in particle colliders and when cosmic rays hit Earth’s atmosphere. By studying these collisions, physicists have found out that the electron has two bigger brothers: The muon (μ) and the tau (τ).

The muon and the tau are pretty much the same as the electron, except that they are heavier. Of these two, the muon has been studied more closely because it lives longer – about 2 × 10⁻⁶ seconds.

The muon turns out to be... a little odd.

Physicists have known for a while, for example, that cosmic rays produce more muons than expected. This deviation from the predictions of the standard model is not hugely significant, but it has stubbornly persisted. It has remained unclear, though, whether the blame lies with the muons or with the way the calculations treat atomic nuclei.

Next, the muon (like the electron and tau) has a partner neutrino, called the muon-neutrino. The muon neutrino also has some anomalies associated with it. No one currently knows whether those are real or measurement errors.

The Large Hadron Collider has seen a number of slight deviations from the predictions of the standard model which go under the name of lepton universality anomalies. They basically tell you that the muon isn’t behaving like the electron, which (all other things being equal) it really should. These deviations may just be random noise and vanish with better data. Or maybe they are the real thing.

And then there is the gyromagnetic moment of the muon, usually denoted just g. This quantity determines how the muon’s spin precesses if you put it into a magnetic field. This value should be 2 plus quantum corrections, and the quantum corrections (the g-2) you can calculate very precisely with the standard model. Well, you can if you have spent some years learning how to do that because these are hard calculations indeed. Thing is though, the result of the calculation doesn’t agree with the measurement.
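
To put some numbers on this: what is actually calculated and measured is the anomalous part a = (g-2)/2, whose leading contribution is Schwinger’s one-loop QED correction α/2π. As a rough illustration (a back-of-the-envelope in Python, with the measured value rounded from the Brookhaven result and everything else approximate, not official figures), this shows how far the leading term already gets you, and how tiny the higher-order piece is that all the fuss is about:

    # Back-of-the-envelope for the muon's anomalous magnetic moment a = (g-2)/2.
    # Illustrative numbers only; the "measured" value is rounded from the Brookhaven result.
    import math

    alpha = 1 / 137.035999               # fine-structure constant
    a_schwinger = alpha / (2 * math.pi)  # leading (one-loop) QED term, alpha/2pi
    a_measured = 0.00116592089           # muon anomaly, roughly the Brookhaven value

    print(f"g at leading order: {2 * (1 + a_schwinger):.7f}")
    print(f"a at leading order: {a_schwinger:.6e}")
    print(f"a measured (~BNL):  {a_measured:.6e}")
    print(f"carried by higher orders: {a_measured - a_schwinger:.1e}")

The disagreement with the standard model sits in the last few digits of that already tiny higher-order piece.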

This is the so-called muon g-2 anomaly, which we have known about since the 1960s when the first experiments ran into tension with the theoretical prediction. Since then, both the experimental precision as well as the calculations have improved, but the disagreement has not vanished.

The most recent experimental data comes from the Brookhaven National Lab E821 experiment, whose final results were published in 2006 and placed the disagreement at 3.7σ. That’s interesting for sure, but nothing that particle physicists get overly excited about.

A new experiment is now following up on the 2006 result: the Muon g-2 experiment at Fermilab. The collaboration projects that (assuming the mean value remains the same) their better data could increase the significance to 7σ, thereby surpassing the discovery standard in particle physics (which is somewhat arbitrarily set to 5σ).
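
To see why better data alone can move the significance so much, note that the quoted σ is just the central discrepancy divided by the combined uncertainty of experiment and theory. The numbers in the sketch below are purely illustrative (roughly in the ballpark of the published figures, not official values), but they show how a fixed central value becomes a much stronger tension once the experimental error shrinks by the factor of roughly four that Fermilab is aiming for, and more still if the theory error improves too:

    # How the significance of a fixed discrepancy grows as uncertainties shrink.
    # All numbers are illustrative only, roughly in the ballpark of published figures.
    import math

    delta = 28e-10       # assumed central discrepancy between measurement and SM prediction
    sigma_exp = 6.3e-10  # assumed experimental uncertainty (Brookhaven-era ballpark)
    sigma_th = 4.3e-10   # assumed theory uncertainty

    def significance(delta, s_exp, s_th):
        """Standard deviations, combining uncertainties in quadrature."""
        return delta / math.hypot(s_exp, s_th)

    print(f"before: {significance(delta, sigma_exp, sigma_th):.1f} sigma")
    # Shrink only the experimental error by a factor of four:
    print(f"after:  {significance(delta, sigma_exp / 4, sigma_th):.1f} sigma")
    # A smaller theory error would push this higher still, toward the projected 7 sigma.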

For this experiment, physicists first produce muons by firing protons at a target (some kind of solid). This produces a lot of pions (quark-antiquark composites) which decay by emitting muons. The muons are then collected in a ring equipped with magnets, in which they circle until they decay. When the muons decay, they produce two neutrinos (which escape) and a positron that is caught in a detector. From the direction and energy of the positron, one can then infer the magnetic moment of the muon.
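
The key relation behind the last step: at the “magic” muon momentum the storage ring is tuned to, the spin precesses relative to the momentum at (approximately) the anomalous frequency ω_a = a·eB/m_μ, so measuring that frequency together with the magnetic field gives a = (g-2)/2 directly. Here is a rough numerical sketch with approximate values for the field and constants; it is not the collaboration’s actual analysis, which works with measured frequency ratios:

    # Rough sketch of the relation between anomalous precession frequency and a = (g-2)/2.
    # At the "magic" momentum, approximately: omega_a = a * e * B / m_mu.
    # Values below are approximate and for illustration only.
    import math

    e = 1.602176634e-19  # elementary charge, C
    m_mu = 1.8835e-28    # muon mass, kg (approximate)
    B = 1.45             # storage-ring magnetic field, T (approximate)
    a_mu = 1.16592e-3    # anomalous magnetic moment, roughly the measured value

    omega_a = a_mu * e * B / m_mu  # anomalous precession frequency, rad/s
    print(f"f_a    = {omega_a / (2 * math.pi) / 1e3:.0f} kHz")         # ~229 kHz
    print(f"period = {2 * math.pi / omega_a * 1e6:.1f} microseconds")  # the famous ~4.4 us wiggle

    # Inverting: measure omega_a and B, recover the anomaly.
    print(f"a from omega_a and B = {omega_a * m_mu / (e * B):.5e}")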

The Fermilab g-2 experiment, which reuses parts of the hardware from the earlier Brookhaven experiment, is already running and collecting data. In a recent paper, Alexander Keshavarzi, on behalf of the collaboration, reports that they successfully completed the first physics run last year. He writes that we can expect a publication of the results from the first run in late 2019. After some troubleshooting (something about an underperforming kicker system), the collaboration is now in its second run.

Another experiment to measure the muon g-2 more precisely is underway in Japan, at the J-PARC muon facility. This collaboration, too, is well on its way.

While we don’t know exactly when the first data from these experiments will become available, it is clear already that the muon g-2 will be much talked about in the coming years. At present, it is our best clue for physics beyond the standard model. So, stay tuned.

18 comments:

  1. On the same subject, at Résonaances:

    http://resonaances.blogspot.com/2018/06/alpha-and-g-minus-two.html

  2. “The muon and the tau are pretty much the same as the electron, except that they are heavier.” ...

    Are they "pretty much the same" apart from the SM classification? A few thoughts...

    The tauon (τ-lepton) is the heaviest of the three charged leptons of the standard model. Its mass is ~1777 MeV/c², its lifetime ~(2.906 ± 0.010) × 10⁻¹³ s.

    If one looks at the possible decays of the tauon (http://pdg.lbl.gov/2015/listings/rpp2015-list-tau.pdf), fundamental questions arise. The electron does not decay at all, and the likewise negatively charged muon "converts" mainly into an electron. There are definitely no "transformations" to mesons or baryons starting from the electron or the muon. The tauon "decays" about 17.8% of the time directly to an electron and neutrinos, about 17.4% to a muon, but mainly to various hadrons, for example with 25% to a π⁻, a π⁰ and the postulated tau-neutrino. Using the tauon as an example, the inconsistency of the theory and the associated "SM cataloging euphoria" can be read off very well. Here, "much and nothing" is presented on currently (as of 2015) 89 pages. The muon takes up 15 pages (http://pdg.lbl.gov/2015/listings/rpp2015-list-muon.pdf) and the electron, in direct comparison, only 6 (http://pdg.lbl.gov/2015/listings/rpp2015-list-electron.pdf). In this context, it is also more than surprising that "theoreticians" announce detailed thoughts about the g-factor of the tauon, when no (meaningful) measurements of the magnetic moment of the tauon are available.

    Replies
    1. The electron cannot decay since there are no particles into which it could decay while preserving charge and lepton number. Particles can only decay into lighter particles, which explains why the tau can decay into hadrons and anti-hadrons while the muon and electron can't (the lightest hadron, the pion, is somewhat heavier than the muon). No mystery here.

    2. These decay rates are different because the masses of the particles are different, which makes some decays more likely (or possible to begin with). As I said, they're pretty much the same, except the muon and tau are heavier.

  3. "This value should be 2 plus quantum corrections, and the quantum corrections (the g-2) you can calculate very precisely with the standard model. Well, you can if you have spent some years learning how to do that because these are hard calculations indeed. Thing is though, the result of the calculation doesn’t agree with the measurement."

    I remember when I first heard of this, during an undergraduate lecture. The professor remarked that theory and experiment agreed to 15 decimal places or whatever. I asked if there was a discrepancy after that or if the experimental accuracy wasn't any better. He replied that the calculations couldn't be done any better.

    This was for the electron, for which there is no disagreement at all between theory and experiment.

  4. After the muon had been discovered, I. I. Rabi quipped "Who ordered that?"

  5. Hi Sabine,

    it turns out that there is another "maybe" anomaly (2.5 sigma), described on Resonaances blog (latest post from June 2018). This time on the electron.

    Best,
    J.

  6. Sabine:

    That's awesome. Both of your links lead to the same page: the "collaboration is in the second run" link and the Japan J-PARC link right after it.

    Hm, I wonder how much these experiments cost? Probably much less than the LHC and capable of producing new physics...

    Replies
    1. I have fixed the J-PARC link. Thanks for noticing!

  7. My suspicion is this is tied up with the matter vs antimatter asymmetry issue. A deviation in g-2 may point to some problem with how the muon acquires mass, and this may mean the Higgs theory is incomplete. I would also bet this influences how the running coupling constants converge near the string/Planck scale or Hagedorn temperature.

    Andrei Sakharov proposed three conditions necessary for baryogenesis and the generation of matter over antimatter. These are:

    The universe is not in equilibrium

    There are parity and CP violations

    There are baryon number violating interactions.

    With the first, quantum field theory in spacetime is not at equilibrium. A black hole of mass m in a thermal background with the same temperature as its Hawking temperature, T = 1/(8πm), absorbs or emits a unit of energy δm and so becomes cooler or hotter accordingly. The new temperature T ∓ δT = 1/(8π(m ± δm)) is different from the background temperature. This statistically favors an evolution of the black hole diverging from the background temperature. The generation of particles from spacetime is not an equilibrium condition with QFT in curved spacetime. The second of these is experimentally established with the violation of parity in weak interactions. The third is an open question. Quarks and anti-quarks are produced in pairs, and the same occurs with lepton and anti-lepton pairs.

    The sphaleron theory models how baryons may be converted to anti-leptons and anti-baryons to leptons. Here B and L may not be conserved, but rather B - L. This field is an instanton in a barrier of the low-energy configuration, with minima on either side for a state of 3 quark types or baryon numbers and 3 lepton numbers. This covers the 3 doublets of quarks and leptons. The sphaleron has a mass-energy of around 10 TeV. A hot gas of sphalerons converts baryons and anti-baryons into anti-leptons and leptons in an equilibrium configuration. This would then ideally hold for all energy scales, and there would be a scale invariance to this equilibrium. However, as it can be argued, scale invariance is broken, and this leaves the world in a configuration with an excess of baryons and leptons.

    The question that also needs to be asked is, "where do these scalar fields come from?" We have the Higgs field, which is a pair of doublets of scalars; there is the inflaton that is thought to exist; and with supergravity in general there are scalar-tensor fields with the scalar dilaton and axion fields. What do all these scalar fields mean? The sphaleron, and ultimately the Higgs field plus weak interactions, emerge from a topological index with gravity. There may then be some general theory for how these strange phase-changing scalar fields that form condensates etc. emerge from quantum gravitation.

    As Sabine says, these calculations are extremely difficult and tedious beyond imagination. Even plain QED calculations out to a few orders get very hairy. If there is some massive field at high energy, or at high order in the perturbation series of radiative corrections, then we might expect some deviation from what is currently thought.

    This of course gets to the issue of the FCC again. My main question with that machine is whether we need to do 100 TeV physics by brute force, scaling up current technology. Is there another way to do this that can reduce the scale of such a machine?

  8. Two thoughts. First, we clearly have enough new data from the LHC that is puzzling, and therefore we don't need a bigger collider. Second, it's always frustrating to hear people claim the standard model is so successful with these "anomalies" going back to the 1960s still hanging out there. I sincerely hope the new neutrino experiments find some answers; along with the new muon experiments, they seem like the only sources of new physics going forward. I remember when I learned about the GZK limit problem while visiting the cosmic ray detector called the "fly's eye" 30 years ago. The GZK limit shows that protons above a certain energy threshold should not make it to Earth because they would interact with the CMBR and break down to lower-energy constituents. The fly's eye detector has seen several particles that are much higher in energy than the GZK limit would suggest. I was astonished that these anomalies aren't considered a contradiction of the standard model.

    Replies
    1. If data from the LHC were puzzling, that would call for the FCC. The LHC data (I am less certain about the ALICE detector data), instead of being puzzling, is almost boring in how predictable it is.

      Cosmic rays that violate the GZK limit are thought to be due to extremely high-energy neutrinos that scatter off particles close enough to us that the CMB radiation does not appreciably attenuate the daughter products at energies above the GZK limit.

    2. We just need more data to reach 5 sigma, not higher-energy data at a $30B price tag. The assumption that higher energy is going to solve our problems has been touted again and again, with the LHC billed as the end-all (probing the early matter era of the BB where the Higgs field comes into play), and there really are no new clues. If we're stymied by an asymmetry in the muon data, higher energy does nothing to resolve this, because it merely suggests there is some new particle we don't know about that will solve all our problems (which the LHC should have enough resolution to see given our current CDM model). When our models seriously need re-evaluation because so many expected results are missing, new machines aren't the answer.

      I had read that the GZK limit violators could be heavy nuclei as well; it's hard to know, especially when testing for neutrino scattering, given that it's hard enough to measure interactions with favorable targets like the Xenon detector.

  9. "At present, it is our best clue for physics beyond the standard model. So, stay tuned."

    In what way? What will (or could) the new discovery be?

    Replies
    1. Sorry, I couldn't connect; do you know of any paper? Even so, how can a sterile neutrino change the SM in any fundamental way?

  10. qsa, I believe the neutrino sector is the most promising candidate for physics beyond the Standard Model. Note that the finite neutrino masses and neutrino oscillations are the only examples of experimentally confirmed violations of the Standard Model, which originally assumed that neutrinos were massless. And there exists a natural way to explain the low neutrino masses:

    https://en.wikipedia.org/wiki/Seesaw_mechanism

  11. This past Wednesday, April 7th, the Muon g-2 experiment announced that the muon's magnetic moment anomaly has grown to about 4.2 sigma, as reported by Natalie Wolchover in her Quanta Magazine article of the same date, titled "Last Hope, Experiment Finds Evidence for Unknown Particle". Natalie provides a link to a paper that proposes that "vector-like leptons" might be behind this anomaly. Muddying the water, however, is another team's calculation of g-2 that is more in agreement with the latest g-2 results from Fermilab.


COMMENTS ON THIS BLOG ARE PERMANENTLY CLOSED. You can join the discussion on Patreon.
