Earlier this year, there was an interesting program at the KITP on "Bits, Branes and Black Holes." Unfortunately I couldn't be there for reasons that are presently happily taking apart the new IKEA catalogue. However, many audios and videos are online, and meanwhile there are also some papers on the arXiv picking up the discussions from the program.

One of the most interesting developments is a revival of the idea that black hole evolution might just not be unitary. Recall, if one takes Hawking's semi-classical calculation of black hole evaporation, one has a hard time explaining how information that falls into a black hole can come out again. (And if you don't recall, read this.) There is the option to just accept that information doesn't come back out. However, this would be in conflict with unitarity, one of the sacred principles of quantum mechanics. But nothing really is sacred to a theoretical physicist with a headache, so why not do without unitarity? Well, there is an argument dating back to the early 80s by Banks, Susskind and Peskin that this would go along with a violation of energy conservation.

Each time this argument came up I recall somebody objecting. Personally I am not very convinced that's the right way to go, so I was never motivated enough to look into this option. But interestingly, Bill Unruh has now offered a concrete counter-example showing that it is possible to have decoherence without violating energy conservation (which you can find on the arXiv here), which seems to have gone some way towards convincing people that it is possible. It seems to me quite likely at this point that non-unitary black hole evaporation might increase in popularity again in the coming years, so this is a good time to tell you about neutral Kaons. Stay with me for some paragraphs and the link will become clear.

Black hole evaporation seems non-unitary when taking Hawking's calculation all the way to the end stage because the outcome is always thermal radiation no matter what one started with - it's a mixed state. One could have started for example with a pure state that collapsed to a black hole. Unitary evolution will never give you a mixed state from a pure state.
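To make the pure-versus-mixed distinction concrete, here is a small numerical sketch (a generic two-level toy system, nothing specific to black holes): unitary evolution leaves the purity Tr ρ² of a density matrix unchanged, so a pure state (purity 1) can never become a mixed state (purity < 1), while a map that damps the phase information can.

```python
import numpy as np

# A pure state |psi> = (|0> + |1>)/sqrt(2) and its density matrix
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def purity(r):
    return np.trace(r @ r).real

# Any unitary U leaves the purity unchanged: Tr((U rho U†)²) = Tr(rho²)
theta = 0.7  # arbitrary rotation angle
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rho_unitary = U @ rho @ U.conj().T

# A non-unitary "decoherence" map that damps the off-diagonal elements
# (the phase information) does lower the purity:
rho_decohered = rho.copy()
rho_decohered[0, 1] *= 0.5
rho_decohered[1, 0] *= 0.5

print(purity(rho))            # 1.0 (pure)
print(purity(rho_unitary))    # 1.0 (still pure)
print(purity(rho_decohered))  # < 1 (mixed)
```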

But what if we took it seriously that black hole evaporation is not unitary? It would mean that if you take into account gravity, it might be possible to observe decoherence in quantum systems when there shouldn't be any according to normal quantum mechanics. Everything moves through space-time and, in principle, that space-time should undergo quantum fluctuations. So it's not a nice and smooth background, but it is what has become known as "space-time foam" - a dynamic, constantly changing background, a background in which Planck scale black holes might be produced and decay all the time.

This idea calls for a phenomenological model, a bottom-up approach that modifies quantum mechanics in such a way as to take into account this decoherence induced by the background. In fact, a model for this was proposed already in the early 80s by Ellis et al in their paper "Search for Violations of Quantum Mechanics." It is relatively straightforward to reformulate quantum mechanics in terms of density matrices and allow for a non-unitary additional term for the Hamiltonian. As usual for phenomenological models, this modification comes with free parameters that quantify the deviations. For quantum gravitational effects, you should expect the parameters to be a number of order one times the necessary powers of the Planck mass. (If that doesn't make sense, watch this video explaining natural units.)
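As a rough sketch of what such a modified evolution looks like, here is the simplest dephasing term of that general type - a single damping rate gamma rather than the actual three-parameter model of Ellis et al, and gamma is left as a free parameter that quantum gravity would be expected to fix in terms of the Planck mass:

```python
import numpy as np

# Modified density-matrix evolution: d(rho)/dt = -i[H, rho] + D(rho),
# where D is a non-unitary term damping the off-diagonal (coherence)
# elements at a rate gamma. Simple Euler integration for illustration.
def evolve(rho, H, gamma, dt, steps):
    for _ in range(steps):
        drho = -1j * (H @ rho - rho @ H)   # ordinary unitary part
        damping = np.zeros_like(rho)        # decoherence part: damp
        damping[0, 1] = -gamma * rho[0, 1]  # only the off-diagonals
        damping[1, 0] = -gamma * rho[1, 0]
        rho = rho + (drho + damping) * dt
    return rho

H = np.diag([0.0, 1.0])                     # two-level Hamiltonian
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho0 = np.outer(psi, psi.conj()).astype(complex)

rho_t = evolve(rho0, H, gamma=0.1, dt=0.01, steps=1000)
print(abs(rho_t[0, 1]))  # the coherence has decayed from its initial 0.5
```

Note that the trace of the density matrix (total probability) stays exactly 1; only the phase information is lost.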

This brings us to the question of how to look for such effects.

A decisive feature of quantum mechanics is the oscillation between eigenstates, which is observable if the state in which a particle is produced is a superposition of these eigenstates. Decoherence is the loss of phase information, so the oscillation is sensitive to decoherence. Neutrino oscillations are an example of an oscillation between two Hamiltonian eigenstates. However, neutrinos are difficult to observe - it takes a lot of patience to collect enough data because they interact so weakly. In addition, at the typical energies at which we can produce them, the oscillation wavelength is of the order of a kilometer to some hundred kilometers - not really very lab friendly.
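A quick back-of-envelope check of that oscillation length, using the standard two-flavor formula L = 4πE ħc/Δm² and a representative atmospheric mass splitting (the numbers here are illustrative, not a precision statement):

```python
import math

hbar_c = 0.1973e-15   # GeV * m
E = 1.0               # neutrino energy in GeV (typical accelerator scale)
dm2 = 2.5e-3 * 1e-18  # mass-squared splitting: 2.5e-3 eV^2 in GeV^2

# Oscillation length in meters
L = 4 * math.pi * E * hbar_c / dm2
print(L / 1000)  # of the order of hundreds of kilometers
```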

Enter the neutral Kaons. The Kaons are hadrons; they are composites of quarks. The two neutral Kaons have the quark content of strange and anti-down, and down and anti-strange. Thus, even though they are neutral, they are not their own anti-particles. Instead, each is the anti-particle to the other. These Kaons are, however, not eigenstates of the Hamiltonian. Naively, one would expect the CP eigenstates, which can be constructed from them, to be the eigenstates of the Hamiltonian. Alas, the CP eigenstates are not Hamiltonian eigenstates either, because the weak interaction breaks CP invariance.

The way you can show this is to construct the CP eigenstates to the eigenvalues +1 and -1 and note that the state with eigenvalue +1 can decay into two pions, which is the preferred decay channel. The one with eigenvalue -1 needs (at least) three pions. Since the three pion final state has much less phase space available, the three pion decay is less likely, which means that the CP -1 state lives longer.
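Explicitly, with the common phase convention CP|K⁰⟩ = |K̄⁰⟩ (conventions in the literature differ by a sign), the two CP eigenstates are the symmetric and antisymmetric combinations:

```latex
% CP eigenstates of the neutral Kaon system, phase convention CP|K^0> = |Kbar^0>:
K_1 = \frac{1}{\sqrt{2}}\left( K^0 + \bar{K}^0 \right), \qquad CP \, K_1 = + K_1 ,
\\
K_2 = \frac{1}{\sqrt{2}}\left( K^0 - \bar{K}^0 \right), \qquad CP \, K_2 = - K_2 .
```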

Experiment shows indeed that there is a long lived and a short lived Kaon state. These measured particles are the mass eigenstates of the Hamiltonian. But if you wait for the short lived states to have pretty much all decayed, you can show that the long lived one *still* can do a two pion decay. In other words, the CP eigenstates are not identical to the mass eigenstates, and the CP +1 state mixes back in.

This indirect proof of CP violation in the weak interaction got Cronin and Fitch the Nobel Prize in 1980.
The same process can be used to find signs of decoherence. That's because the additional, decoherence-inducing term in the Hamiltonian enters the prediction of the observables, e.g. the ratio of the decay rates in the two pion channel. The relevant property of the neutral Kaons that enters here is the difference in the decay widths, which happens to be really small, of the order of 10^{-14} GeV, times the CP violating parameter ε^{2}, which is about 10^{-6}, and we know these are values that can be measured with presently available technology.

This has to be compared to the expectation for the size of the effect if it was a quantum gravitational effect, which would be of the order *M*^{2}/*m*_{Pl}, where *M* is the mass of the Kaons (about 500 MeV) and *m*_{Pl} is the Planck mass. If you put in the numbers, you'll find that they are of about the same order of magnitude. There's some fine print here that I omitted (most importantly, there are three parameters, so you need several different observables), but roughly you can see that it doesn't take a big step forward in measurement precision to be sensitive to this correction. In fact, presently running experiments are now on the edge of being sensitive to this potential quantum gravitational effect, see e.g. this recent update.
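Putting in the numbers from the two preceding paragraphs (a rough order-of-magnitude sketch; the Kaon mass and the Planck mass are the only inputs, all in GeV):

```python
# Measurable quantity: decay-width difference times epsilon^2
delta_gamma = 1e-14    # K_L - K_S decay-width difference, ~1e-14 GeV
epsilon_sq = 1e-6      # CP violating parameter epsilon^2, ~1e-6
measurable = delta_gamma * epsilon_sq   # ~1e-20 GeV

# Naive quantum gravity estimate: M^2 / m_Planck
M = 0.5                # neutral Kaon mass, ~500 MeV
m_planck = 1.22e19     # Planck mass in GeV
qg_estimate = M**2 / m_planck           # ~2e-20 GeV

print(measurable, qg_estimate)  # both come out around 1e-20 GeV
```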

To come back to the opening paragraphs, the model that is being used here has the somewhat unappealing feature that it does not automatically conserve energy. It is commonly assumed that energy is statistically conserved; for example, Ellis et al write "[A]t our level of sophistication the conservation of energy or angular momentum must be put in by hand as a statistical constraint." Mavromatos et al have worked out a string-theory inspired model, the D-particle foam model, in which energy should be conserved if the recoil is taken into account, but the effective model has the same property that individual collisions may violate energy conservation. It will be interesting to see whether these models receive an increased amount of attention now.

I like this example of neutral Kaon oscillations because it demonstrates so clearly that quantum gravitational effects are not necessarily too small to be detected in experiments, and it is likely we'll hear more about this in the near future.