Saturday, July 04, 2020

What is Quantum Metrology?

Metrology is one of the most undervalued areas of science. And no, I am not just mispronouncing meteorology, I actually mean metrology. Think “meter” not “meteor”. Metrology is the science of measurement. Meteorology is about clouds and things like that. And the study of meteors, in case you wonder, is called meteoritics.



Metrology matters because you can’t do science without measuring things. In metrology, scientists deal with problems such as how to define conventions for units, how to do this most accurately, how to most reliably reproduce measurements, and so on. Metrology sounds boring, but it is super-important for moving from basic research to commercial application.

Just consider you are trying to build a house. If you cannot measure distances and angles, it does not matter how good your mathematics is, that house is not going to come out right. And the smaller the object that you want to build, the more precisely you must be able to measure. It’s as simple as that. You can’t reliably produce something if you don’t know what you are doing.

But if you start dealing with very small things, then quantum mechanics will become important. Yes, quantum mechanics is in principle a theory that applies to objects of all sizes. But in practice its effects are negligibly tiny for large things. However, from the size of molecules downwards, quantum effects are essential to understand what is going on. So what then is quantum metrology? Quantum metrology uses quantum effects to make more precise measurements.

It may sound somewhat weird that quantum mechanics can help you to measure things more precisely. Because we all know that quantum mechanics is… uncertain, right? So how do these two things fit together, quantum uncertainty and more precise measurements? Well, quantum uncertainty is not something that applies to each individual measurement. It only sets a limit to the total amount of information you can obtain about a system.

For example, there is nothing in quantum mechanics that prevents you from measuring the momentum of an electron precisely. But if you do that, you cannot also measure its position precisely. That’s what the uncertainty principle tells you. So, you have to decide what you want to measure, but the uncertainty principle is not an obstacle to measuring precisely per se.
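
To put a rough number on this trade-off, here is a minimal sketch in Python of Heisenberg’s relation, Δx·Δp ≥ ħ/2. The function name and the example numbers are mine, chosen purely for illustration:

    # Heisenberg's uncertainty relation: delta_x * delta_p >= hbar / 2.
    # Illustrative numbers only.

    HBAR = 1.054571817e-34  # reduced Planck constant, in J*s

    def min_momentum_uncertainty(delta_x):
        """Smallest momentum spread (kg*m/s) allowed for a given position spread (m)."""
        return HBAR / (2.0 * delta_x)

    # Pin down a particle's position to 1 nanometer...
    print(min_momentum_uncertainty(1e-9))   # ~5.3e-26 kg*m/s
    # ...or to 1 picometer: the minimum momentum spread is a thousand times larger.
    print(min_momentum_uncertainty(1e-12))  # ~5.3e-23 kg*m/s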

Now, the magic that allows you to measure things more precisely with quantum effects is the same that gives quantum computers an edge over ordinary computers. It’s that quantum particles can be correlated in ways that non-quantum particles can’t. This distinctly quantum type of correlation is called entanglement. There are many different ways to entangle particles, so entanglement lets you encode a lot of information with few particles. In a quantum computer, you want to use this to perform a lot of operations quickly. For quantum metrology, more information in a small space means a higher sensitivity of your measurement.
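
For readers who like to see this in numbers: a general, possibly entangled, state of n two-state quantum particles (qubits, in quantum-computing language) takes 2^n amplitudes to write down, while n classical bits take just n values. The little Python sketch below is my own illustration of that counting, nothing more:

    # Counting parameters: n classical bits vs. a general n-qubit state.
    # This is just bookkeeping, not a simulation of any quantum system.

    def classical_parameters(n_bits):
        return n_bits          # one value per bit

    def quantum_amplitudes(n_qubits):
        return 2 ** n_qubits   # one complex amplitude per basis state

    for n in (2, 10, 30):
        print(n, classical_parameters(n), quantum_amplitudes(n))
    # 2 -> 4, 10 -> 1024, 30 -> roughly a billion amplitudes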

Quantum computers exist already, but the ones which exist are far from being useful. That’s because you need a large number of entangled particles, as many as a million, to not only make calculations, but to make calculations that are actually faster than you could do with a conventional computer. I explained the issue with quantum computers in an earlier video.

But in contrast to quantum computers, quantum metrology does not require large numbers of entangled particles.

A simple example of how quantum behavior can aid measurement comes from medicine. Positron emission tomography, or PET for short, is an imaging method that relies on, yes, entangled particles. For PET, one uses a short-lived radioactive substance, called a “tracer”, that is injected into whatever body part you want to investigate. A typical substance used for this is carbon-11, which has a half-life of about 20 minutes.
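
As a quick aside on what a 20-minute half-life means in practice, here is a one-line decay estimate (simple exponential decay; the 20-minute figure is the one quoted above, the rest is my own illustration):

    # Fraction of the tracer still left after a given time, for a 20-minute half-life.

    def fraction_remaining(minutes, half_life_minutes=20.0):
        return 0.5 ** (minutes / half_life_minutes)

    print(fraction_remaining(20))   # 0.5   -> half the tracer left after one half-life
    print(fraction_remaining(60))   # 0.125 -> only an eighth left after an hour

Which is why the tracer has to be made close to where it is used, and the scan has to happen fairly quickly.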

The radioactive substance undergoes beta decay and emits a positron. The positron annihilates with one of the electrons in the neighborhood of the decay site, which creates, here it comes, an entangled pair of photons. They fly off not in one particular direction, but in two opposite directions. So, if you detect two photons that belong together, you can calculate where they were emitted. And from this you can reconstruct the distribution of the radioactive substance, which “traces” the tissue of interest.
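
The geometry behind this is simple enough to sketch. Because the two photons fly off back to back, the annihilation must lie on the straight line between the two detectors that fired, and with timing information the difference in arrival times tells you where on that line it happened. The snippet below is my own toy illustration of that time-of-flight idea, not a real PET reconstruction algorithm:

    # Locate the annihilation point on the line between two detectors from the
    # difference in photon arrival times (toy time-of-flight picture).

    C = 2.998e8  # speed of light, in m/s

    def offset_from_midpoint(t1, t2):
        """Offset (m) of the annihilation from the midpoint of the detector pair.
        Positive means it happened closer to detector 1, whose photon arrived first."""
        return C * (t2 - t1) / 2.0

    # A 200-picosecond timing difference corresponds to about 3 centimeters:
    print(offset_from_midpoint(0.0, 200e-12))  # ~0.03 m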

Positron emission tomography has been used since the 1950s and it’s a simple example of how quantum effects can aid measurements. But the general theoretical basis of quantum metrology was only laid in the 1980s. And then for a long time not much happened, because it’s really hard to control quantum effects without getting screwed up by noise. In that, quantum metrology faced the same problem as quantum computing.

But in the past two decades, physicists have made rapid progress in designing and controlling quantum states, and, with that, quantum metrology has become one of the most promising avenues to new technology.

In 2009, for example, entangled photons were used to improve the resolution of an imaging method called optical coherence tomography. The way this works is that you create a pair of entangled photons and let them travel in two different directions. One of the photons enters a sample that you want to study, the other does not. Then you recombine the photons, which tells you where the one photon scattered in the sample, and from this you can reconstruct how the sample is made up.
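
The bookkeeping that turns such an interference measurement into an image is, at its core, a delay-to-depth conversion: a scatterer at depth z adds a round-trip delay of about 2z/c, so the delay at which the two paths match tells you how deep the scatterer sits. The snippet below is a simplified sketch of that conversion (the function name and numbers are mine, and it ignores most real-world details), not the 2009 experiment itself:

    # Convert a measured round-trip delay into a scatterer depth (simplified picture).

    C = 2.998e8  # speed of light in vacuum, in m/s

    def depth_from_delay(delay_seconds, group_index=1.0):
        """Depth of the scatterer for a given round-trip delay; group_index accounts
        (crudely) for light travelling slower inside the sample."""
        return C * delay_seconds / (2.0 * group_index)

    print(depth_from_delay(1e-12))  # 1 picosecond of delay ~ 0.15 mm of depth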

You can do that with normal light, but the quantum correlations let you measure more precisely. And it’s not only about the precision. These quantum measurements require only tiny numbers of particles, so they are minimally disruptive and therefore particularly well suited to the study of biological systems, for example, the eye, for which you don’t exactly want to use a laser beam.

Another example of quantum metrology is the precise measurement of magnetic fields. You can measure a magnetic field by taking a cloud of atoms, splitting it in two, letting one part go through the magnetic field, and then recombining the atoms. The magnetic field will shift the phases of the atoms that passed through it – because particles are also waves – and you can measure how much the phases were shifted, which tells you what the magnetic field was. Aaaand, if you entangle those atoms, you can improve the sensitivity to the magnetic field. This is called quantum-enhanced magnetometry.
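
How much entanglement can help here has a standard textbook answer: with N independent atoms, the phase uncertainty shrinks like 1/√N (the so-called standard quantum limit), while suitably entangled atoms can in principle reach 1/N (the Heisenberg limit). The sketch below just evaluates these two scalings; it is not a simulation of any particular experiment:

    # Phase uncertainty scaling with the number of atoms: standard quantum limit
    # (independent atoms) versus Heisenberg limit (maximally entangled atoms).

    import math

    def standard_quantum_limit(n_atoms):
        return 1.0 / math.sqrt(n_atoms)

    def heisenberg_limit(n_atoms):
        return 1.0 / n_atoms

    for n in (100, 10_000, 1_000_000):
        print(n, standard_quantum_limit(n), heisenberg_limit(n))
    # With a million atoms, entanglement could in principle buy another factor of 1000.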

Quantum metrology has also been used to improve the sensitivity of the LIGO gravitational wave interferometer. LIGO uses laser beams to measure periodic distortions of space and time. Laser light itself is already remarkable, but one can improve on it by bringing the laser light into a particular quantum state, called a “squeezed state,” that is less sensitive to noise and therefore allows more precise measurements.
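
What “squeezed” means can be said in one formula: the two quadratures of the light obey an uncertainty relation, and squeezing pushes noise out of the quadrature you measure into the one you don’t. The snippet below shows the standard textbook relation, in units where the unsqueezed vacuum noise is 0.5 per quadrature; it is an illustration, not LIGO’s actual noise budget:

    # Quadrature noise of a squeezed state: squeezing by r reduces one quadrature
    # by exp(-r) and inflates the other by exp(+r); their product stays at 0.25.

    import math

    def quadrature_noise(r):
        return 0.5 * math.exp(-r), 0.5 * math.exp(r)

    dx1, dx2 = quadrature_noise(1.0)
    print(dx1, dx2, dx1 * dx2)  # ~0.18, ~1.36, 0.25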

Now, clearly these are not technologies you will have a switch for on your phone any time soon. But they are technologies with practical uses and they are technologies that we already know do really work. I don’t usually give investment advice, but if I was rich, I would put my money into quantum metrology, not into quantum computing.

37 comments:

  1. Thanks for the very important topic raised by your blog post!

    The fundamental basis of the meter in physical interactions is the central theme here in the first place.

    The statistical benefits of quantum computing depend on the metrology of quanta with enough diversity, and those are applications of emergent physics.

    Nowadays much money is thrown at accelerating quantum computing, but just like you, Sabine, I would prefer to finance fundamental research in metrology first.

  2. I always thought that PET only relies on good old conservation of momentum. Two emitted photons must have the same net momentum as the annihilated positron and electron, which is very small since there's only thermal motion. So the photons have to move along almost the same line in opposite directions.

    Replies
    1. Cyberax, nice comment. However, in medical terms, only one photon is necessary for the 'scan'; the other is disregarded. Now, in scientific terms, we may be interested in where the other photon goes, but not in the medical mindset.

    2. I was also puzzled by that. In PET scanning the ring detectors simply measure the time difference between the arrival of the emitted photons, and from that calculate the position of the annihilation event which produced them. They may be entangled, but as I understand it, that fact is not used in PET scanning. It's just exploiting the fact that they are emitted in exactly opposite directions. And I assume that is a result of conservation of linear momentum, rather than some consequence of entanglement. Is that incorrect?

    3. They are entangled because momentum is conserved. Not sure what the problem is here. We seem to agree they are entangled, no?

    4. Fwiw, I found this as a motivation in this review paper and thought it's a good example because many people will have heard of it. (Which cannot be said of quantum optical coherence tomography I guess.)

    5. I guess the disagreement is about the importance of entanglement. PET scanners can work just fine in a hypothetical universe without quantum entanglement.

    6. The photons collected in PET are in a mixed state of maximally anti-correlated classical trajectories. Yes, the state is in principle entangled initially, but it will very quickly decohere as it propagates through the body.

      PS. entanglement is not a sufficient condition for universal QC

    7. Sorry for a late reply. My friend, I feel the emotion, and I can't avoid being the messenger. A 'PET' scan has nothing to do with generating entangled photons. It has nothing to do with splitting atoms, or any other Crazy Shit. If you're basically astute in the scientific sense, just Google the mechanism for a PET scan.

      Very basically, it's a question of unleashing some very fast-decaying radioisotopes anywhere (the human body, the sidewalk in front of your house). Now, the interesting part of all this is the Gamma Camera. Truly a Wonder of modern technology. (Actually, very simple, lol.) I personally don't have the time to explain it to you in detail, I wish I did. Best wishes and good luck.

  3. This is probably a very interesting article, but I'm just sad that comments are closed on your 2016 article about Free Will, because I had a really witty (but not serious..) thing I wanted to say.

    Replies
    1. I'd be interested to hear it anyway.

    2. Sorry, Tim, I tried to reply. It didn't work out so well. You have something to say, say it now. Life is short. Sabine will understand.

    3. Thanks for your interest! This refers to the article of Sunday, January 10, 2016 "Free will is dead, let’s bury it." It just occurred to me that I should probably reserve judgment on the supposed ill-effects from non-belief in Free Will until I had heard some potentially contradictory evidence from a Professor of Mediocrity in Marketing.

    4. Tim, it was nice, for a moment, to understand your witticism. Thank you. I can't believe you've been holding it in since 2016, lol. Some of my research has to do with belief structures and how they make our reality 'real'. Nice to hear that I got the joke. Best wishes.

    5. You are right not to believe that I've been holding it in since 2016! It's always nice to have a joke appreciated, but it's flattering to have it appreciated by someone who (like most of the commenters here) is much cleverer and more knowledgeable than me. All the best to you too :)


  4. I had an idea for how to build a gravitational wave detector that could work, maybe. Make two coils with tens of kilometers of optical fiber and place them at 90 degrees from each other in the same place, together forming a cross; then send a modulated optical signal through both; at the end, the two signals are tuned and synchronized to eliminate errors and observe variations. The idea is to obtain an increase for the same wave for each turn in one coil, contrasted with a contraction for each turn of the other coil; if a coil has a perimeter of 10 meters and a total of 10 km of fiber, 2,000 increments are obtained.

  5. LIGO is measuring gravitational radiation. A part of metrology is calibration, and if there were some source of random noise in the detector it would have been found then. In fact, a source of random noise was recently detected by LIGO: the quantum uncertainty fluctuation in the position of a mirror was documented.

  6. Undervalued? I dare say that lovely Rita, meter maid is invaluable.

  7. I think it's that the same signal is detected at two different sites that corroborates it.

    Having said that, I'm amazed at the level of detail they are able to extract about the event - sizes of objects, distance and direction - from what sounds like just a blip.

  8. 05-JUL-2020

    On the planet Irony, the brilliant international fund manager, Dr. Enibas Redlefnessoh, casually made a statement on her financial blog, Bankreaction, that led to a massive paradigm shift in that world's scientific programme:

    "I don’t usually give theoretical physics advice, but if I was really brilliant in that way, I would put my resources into better resolution and elimination of the inconsistencies in our present standard models of Nature. Long-term there's no better investment."

    The societal backreaction was swift and stunning. And they all lived happily ever after.

    Cheers,
    mj horn

  9. Sabine said:

    "I don’t usually give investment advice, but if I was rich, I would put my money into quantum metrology, not into quantum computing."

    One of my day jobs for the US federal government (this was back when we had one) was to find and assess emerging science and technology products from the private sector, since big-government contracting is not good at this. As chief scientist (which mostly meant “translator in both directions”) for this tiny but effective effort, I had the delightful fun of working every day with some of the brightest and most innovative science and IT minds on the planet.

    This brag has a reason: I think Sabine is giving excellent investment advice.

    Here’s why:

    Quantum metrology, under which I lump quantum encryption, is a strong emerging tech area that focuses on well-tested and well-understood aspects of entanglement. Since entanglement had not been incorporated into marketable systems until quite recently, it provides a strong opportunity for new product exploration, including entirely new and unforeseen applications. Quantum encryption was easy to predict, but image enhancement was not, and that is often how new areas evolve. The bottom line is that quantum metrology is an area worth tracking, though which products actually succeed is always the trickier issue.

    Quantum computing (QC) is a radically different story. QC involves two distinct and separable concepts for speeding up computation: parallel multistate processing (superposition), and superluminal but ahistorical state updating (entanglement).

    The parallel multistate processing (superposition) part is the easier one to explain, so I’ll address it first. My message for this concept is not good: Of all the ways one could conceive of to implement efficient parallel multistate processing, using quantum wave functions to do it is literally the worst strategy imaginable.

    Parallel multistate processing is described in QC using words like “superpositions” and “qubits”, which make it sound uniquely quantum. However, as long as no entanglement is involved, this is simply not true. For any given wave phenomenon, the cheapest, fastest, and most reliable way to parallel process orthogonal states will always be to use the highest-energy waves available — which means going in the opposite direction, away from the quantum world.

    Here’s an example: Define two arbitrary repeating waveforms as inputs to your device. By Fourier transform, these two input streams have orthogonal sinusoidal basis states, the number and precision (bandwidth) of which will depend on how long and reliably the waves repeat. In everyday life we call these distinct orthogonal states radio frequencies, and their superposition is the full set of waves hitting your radio antenna.

    Now imagine your radio picking up two stations on the same frequency. By default your radio adds such signals, but it could just as easily invert one of them first, causing it to be subtracted. Generalize this inversion to affect all signals coming in to your antenna, and you now have parallel multistate subtraction. Design and combine more such RF logic, add looping and storage, and you have a high-power state-parallel RF computer. Do this at optical frequencies instead of RF and the whole package could be made remarkably compact.

    I just looked this idea up on Google, and to my surprise did not get any obvious hits. Even optical logic appears to be “pulse biased”, shying away from frequency domain logical operators. That is… interesting.

    ---

    Regarding the second component of QC, entanglement computing, so far my reviews of the early papers keep pointing to the same conclusion: Ahistoric isolation, which is an absolute prerequisite for enabling entanglement computation, and which was properly addressed by Feynman, seems to have been overlooked or even misunderstood in most post-Feynman papers. That will be the subject of a later separate comment.

  10. Moreover, receiving the same signal at more than one site also helps in eliminating local noise sources.

  11. Update: I’m getting moderately terrified by this little QC-roots review I'm doing.

    So much has been invested in this area that I keep thinking that the roots of QC are surely rock solid, and that I’m just being an idiot (which would not surprise my family, who know me best). Yet when I look at some of the seemingly standard definitions for components such as qubit registers, they just do not work, for the simple reason that in at least some of these definitions the genuinely quantum qubits are isolated by speed-of-light classical space. Unless the final state fully entangles all of those bit locations at once to create a single unified wave function, such individuality of the quantum regions in xyz space simply violates the preconditions for achieving entanglement computing. You can even dissect a single self-entangled quantum function into multiple nodes within xyz space if you are clever about it, but if you start treating them too much like classical bits, entanglement (coherence) is simply lost.

    It's not that you can’t do interesting computation tricks with quantum! E.g. simply sending a photon (or many photons) through a convex lens is a real-world example of a Fourier transform made possible by “instantaneous” self-entanglement of the photon. It’s just that when I read through some of the early QC literature, I keep ending up a bit terrified by the seeming sloppiness in how entanglement really works. There doesn’t seem to be a strong, consistent awareness of the physical necessity of maintaining a single quantum space region across all data components of the computation. But I keep hoping I’m just misreading the deeper intent.

    It's also certainly not that Shor et al are not aware of such issues, since they discuss in great detail critical issues such as reversibility. But reversibility alone doth not entanglement make. You've got to be careful in the physical design of those bits, or you just end up with a grotesquely inefficient way of doing fully classical multistate parallel computing. As I noted earlier, if that is all you are trying to do, you are better off using high-power non-quantum systems… and even then the net computation speed benefits are debatable, since such parallelism pushes all calculation more into the pipeline-style frequency domain.

    Sorry, I'm just venting. This review of QC basics is bothering me in a way that some of the silliness in the assumptions underlying now-disproven string theory never bothered me. That may be because QC is much closer to my home turf, so I had to think that it too got “lost in the math” the way string theory did.

    I should also note that Feynman's founding article on quantum computation causes me no such concerns when I read it. While that paper has no fancy algorithms, what it does have is a rock-solid understanding of how the physical setup of a single wave function would be needed to enable Bell inequality emulation via superluminal state updating. Feynman knew that superluminal Bell stuff could never leak out into the classical universe, and his firm awareness of it shows in the structure of his paper. But admittedly, he also didn't go very far into the details, so he never had to deal with anything more than how to emulate a single quantum wave function.

    Which may in the end be all that is ever possible: emulation of one wave.

    (And yet… there are those photons that can do instant internal updates across millions of lightyears of Einstein lens space to calculate their final highest-probability Fourier transform location! Is it possible it’s just the designs of quantum computation devices that are messed up? Should we, for example, be using high-power lasers, which are also very stable quantum wave functions, to access instantaneous Fourier transforms, as opposed to low-energy atoms that will never have either the broad self-entanglement or the robustness possessed by a single photon traveling through room-temperature lenses?)

  12. By the way, I hope Peter Shor comments. The problem I'm worried about is not his algorithm, but the fundamental concepts of register and qubit design that were proposed to enable such algorithms. And I still may be totally wrong, but the stuff really does scare me. It would not be the first time that computer hardware and software folks, who are accustomed to casually creating their own realities with their own rules, did not get the deeper principles of biology or physics quite right.

    Replies
    1. A quantum computer works by constructive and destructive interference of a quantum wave, where the “crests” correspond to constructive interference. Shor’s algorithm is a sort of Fourier transform that outputs those peaks as a spectrum, which is the result of the computation. It can be argued that in many ways this is what we do with QM very often. A quantum computer is then not that different from quantum metrology, but for it to be of great utility you have to work with many qubits.

      That quantum computers work with many qubits is where the rub lies, for decoherence is far more probable. To invoke the GRW interpretation, if an EPR pair has a probability for spontaneous decoherence, this probability grows with the number of qubits, or as a bipartite entanglement is extended to an N-tanglement. Factoring a larger number requires a larger number of qubits, and decoherence becomes a problem. The largest number factored into primes is in the 100-thousand range. A classical computer can do this easily. With 20 qubits you could factor a number up to 1048576, if the coherence can be maintained long enough.

      Robust quantum error correction codes (QECC) are necessary. A QECC with a Hamming distance of one is reasonably good. There have been some demonstrations of this. So far fidelity is not sufficient for a large number of qubits or large N-tangles. One might compare this to a sort of quantum metrology of mesoscale systems, or quantization on the large.

      Quantum processors might sneak into standard computers. Before long we may have quantum processing that detects eavesdropping and some elementary quantum computing. It will be some time before quantum computing enters the mainstream. Computers will largely be von Neumann classical processors, where over time there will be ancillary processors that are neural nets, maybe spintronics, and possibly quantum processors. Whether the quantum processor evolves into a larger role depends on the future of QECC development.

    2. Hello Lawrence.

      You referred to the GRW interpretation of decoherence. After that I read up on GRW and the Penrose alternative method in Wiki. It gave me an idea maybe relating gravity to quantum spin. We have exchanged comments before on FQXi essays about entanglements being the basis of the spacetime metric, and my focus on Rasch and Penrose's CCC, but this is a new idea. Unfortunately, I will not be allowed to speculate more here, but I wanted to thank you for that information.

      Austin Fearnley

    3. Lawrence, thank you for the nice summary of QC.

      GRW is an oddity due to its incorporation of collapse noise, which violates energy conservation. While the roots of spontaneous emission are indeed mysterious, my own inclination is to invoke the intrinsic deep entanglement of wave functions with the nearby (and also distant) classical environments in which they reside. Since you cannot create an experimentally accessible wave function without simultaneously entangling its energy and momentum with the classical environment from which it first emerges, I don’t think it’s that much of a leap to assume that the thermal chaos of those same entangled classical environments can contribute a little nudge to how and when an excited wave function finally decays, a random thermal spike that occasionally gets large enough to trigger collapse. That in turn would mean that spontaneous wave collapse is nothing more than a rather subtle expression, via entanglement, of everyday thermal noise. In contrast, invoking spontaneous inexplicable wave collapse feels to me like confusing the ease with which Humpty Dumpty falls off a wall with the assumption that Humpty Dumpty just likes to explode spontaneously.

      For QC it’s not random decoherence that worries me. I’m just not convinced that even the simplest case of combining two classically isolated qubits (e.g. two atoms) can be maintained from start to finish of a calculation without somewhere violating the ahistorical requirement. Both QM and SR are very picky about allowing anything that uses the vast and free-wheeling superluminal state resets that are allowed within wave functions to “land” in the classical, speed-of-light, causal universe.

      The flip side is this: If cold atomic qubit designs can express superluminal speedups in their final classical outcomes, I do not easily see why the same speedups should not also be possible by using lasers and room-temperature optical components. A laser pulse is after all just an ensemble of very similar photons, each of which must maintain coherence and self-entanglement (you can only find it once) across the entire apparatus. With lasers and photons you would need to replace qubit registers with arrays of optical elements that encode the initial problem, e.g. by using potentials and polarizations, onto passing subsets of each photon wave front. With photons there would at least be no ambiguity about whether the wave remains ahistorical until the final result.

      I should hasten to note that I’m still not convinced that either of these strategies — qubit registers or photon wave encoder arrays — can make superluminal entanglement processing available in a non-trivial way. But I would firmly hypothesize that if qubit register algorithms can access entangled resets in ways that result in true superluminal speedups of algorithms — speedups that go beyond what is possible using fully classical multistate parallelism, by which I mean unentangled superposition — then it will always also be possible (and likely simpler) to achieve the same superluminal speed benefits by using lasers, photons, and room-temperature optical encoders and readers.

    4. I do not like the idea Penrose and others have that quantum information is destroyed. However, if that information is taken up as spacetime information, and further if spacetime is really built from entanglement, then no information is really lost. This is one reason the GRW, Penrose and Montevideo interpretations are potentially more than just quantum interpretations. These might be "proto-theories" in their own right.

      BTW, your name is familiar and I seem to remember communicating with you.

    5. Lawrence

      We exchanged comments two years ago on essays at FQXi. My essay was on my amateur and naive preon model, and I wrote about Penrose's CCC too, in particular that my background of using the Rasch model to make psychometric metrics made me accept the idea of collapse and re-birth of the universe's metric. I suspect that part is a dubious aspect of CCC to others. Further, my ideas on preon content of particles left me open to the idea of the universe as a particle.

      My very new (but why did I not think about this before yesterday when reading about GRW!) doubt about the CCC is that it is 'more of the same' on each cycle. When a fermion interacts it changes spin sign. An emitted sequence of photons also alternates in spin sign. For a universe-as-a-particle, the change of spin sign for an external observer should be reflected in some large-scale and complete reversal of 'something' internally in the universe at a measurement/collapse of wavefunction?

      I have read elsewhere, somewhere unmentionable, that the universe can maybe be represented by S^3. That is a double cover of R^3, and changing the cover changes the spin. I have also seen the two covers represented by the inner and outer surfaces of a hollow sphere, that is, one convex and one concave surface. Matter on a convex sheet (unfortunately using the rubber sheet analogy) appears to attract, while matter on a firm concave sheet appears to have negative mass. (Jamie Farnes has used negative mass in computer simulations of dark matter and dark energy.) So one conclusion is that the universe alternates from convex to concave at each cycle. This associates internal mass sign with externally observed spin sign.

      Austin Fearnley

  13. This has always puzzled me. If I measure the position extremely accurately, then the momentum gets a very wide spread of possible values. If I then measure the momentum, it could collapse onto any of them. But they have wildly different corresponding energies. Where did that energy come from?

    Also, that momentum's corresponding speed is equally spread out and could be anything up to the speed of light? Have I not just had a chance to jump from a speed of ~0 to near c just by taking a couple of measurements?

    Or instead I first measure the momentum very precisely, which spreads out the position probability, possibly light-years wide. Can I then turn around and measure the position, and sometimes find the particle at Alpha Centauri instantly?

  14. Sabine, my dear.

    At this moment I profess
    no Knowledge
    on this planet.

    - And I'm too tired
    to argue with
    anyone who does.

    Hope your day is good.

    Wishing well,

    Love Your Work

  15. A method for measuring a length at an unknown distance:

    . measure the viewing angle of a known length at the same distance. Let's say the known length is 1 meter, for example as a marking on a spacecraft of witch we would like to know the full length.
    . now measure the viewing angle of the whole spacecraft.
    . calculate the true length of the spacecraft from the viewing angle of the known length and the viewing angle of the wanted length.

    By the way, this measuring method proves that moving objects always and ever retain their "length at rest".

    Replies
    1. This isn't new. The method was already used by Archimedes... (By the way, there's a typo. You wrote witch for which).

  16. Is it possible to measure the strength of a magnetic field by the amount of vacuum birefringence that this magnetic field produces in matter? Has this measurement ever been performed? If so, what is the method used?

  17. In my opinion, Zitterbewegung does not exist. It has never been directly measured or even observed. So why have so many papers been written based on this prediction of the Dirac equation that is assumed to be real? Also, independent vibrations produced by the interaction of the vacuum with the electron would interfere with the workings of the Higgs field, which also vibrates the electron in another but related way. In fact, Sir Roger Penrose makes a case that the Zitterbewegung predicted by Dirac is actually the Higgs field vibration mechanism.

    https://arxiv.org/abs/1603.07156
    Gerald E. Marsh

    “In the zig-zag concept of the electron, e_L and e_R continually convert themselves into the other due to their interaction with the Higgs at the chiral oscillation frequency ν_ch, which is the same as the zitterbewegung frequency. This means the mass m in Eq. (2.1), which serves as a coupling constant between these two equations, is being interpreted as a field, to quote Penrose, “essentially the Higgs field”; an illustration of this idea is shown in Fig. 2.3”.


COMMENTS ON THIS BLOG ARE PERMANENTLY CLOSED. You can join the discussion on Patreon.
