- Dangerous implications of a minimum length in quantum gravity
By: Cosimo Bambi, Katherine Freese
Abstract: The existence of a minimum length and a generalization of the Heisenberg uncertainty principle seem to be two fundamental ingredients required in any consistent theory of quantum gravity. In this letter we show that they naturally predict dangerous processes which somehow must be suppressed. For example, long-lived virtual super-Planck mass black holes may lead to rapid proton decay. Possible solutions of this puzzle are briefly discussed.
Since I've been working on quantum field theories with a minimal length and a generalized uncertainty principle for a while (for a brief intro, see here), reading the paper was pretty much mandatory. It is an interesting examination, but I don't agree with the conclusions drawn in the paper. Here, in a nutshell, is what I took away from it:
Heisenberg's usual uncertainty principle relates the measurement uncertainties in position and momentum space to each other. Within this context, it is possible to localize particles arbitrarily well in position space, but only at the expense of losing more and more information about their momentum. To resolve the smallest distances you need to probe your sample with very small wavelengths, i.e. with large energies. In standard quantum mechanics, you can in principle measure arbitrarily precisely, if only you can probe your sample with particles of high enough energies. This is why we build larger and larger particle colliders, and accelerate particles to higher and higher energies.
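To put a formula to it: the usual Heisenberg relation only bounds the product of the two uncertainties, so the position resolution can be made as small as you like by increasing the momentum spread of the probe,

$$\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2} \qquad \Rightarrow \qquad \Delta x \;\gtrsim\; \frac{\hbar}{2\,\Delta p}\;.$$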
However, General Relativity tells us that all energy gravitates. If you use a probe with a very high energy density, the spacetime around it will curve. But all particles have to move in that curved spacetime. Thus, when you bring the probing particle near the sample whose position you want to measure, the background becomes dynamical, and the sample will wiggle. This gravitationally induced motion causes an additional uncertainty.
This effect is strictly speaking always present, but since the gravitational interaction is so weak compared to the other interactions, one can neglect this additional contribution in all experiments we have ever done. One can however expect the effect to become relevant when enough energy is clumped into a region small enough that gravitational perturbations are no longer negligible. This will typically happen somewhere at the Planck scale, and leads to a so-called 'generalized uncertainty' in which the position uncertainty has a minimum at the Planck length that one cannot get below, no matter what. If you go to even higher energies, the distortions only become worse. Thus, the generalized uncertainty typically increases with higher energies.
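One frequently used form of such a generalized uncertainty (keep in mind that, as discussed below, the exact form varies between arguments) simply adds a gravitational term that grows with the momentum,

$$\Delta x \;\gtrsim\; \frac{\hbar}{\Delta p} + \alpha\, l_{\rm p}^2\, \frac{\Delta p}{\hbar}\;,$$

where $l_{\rm p}$ is the Planck length and $\alpha$ a parameter of order one. The right-hand side has a minimum of order $\sqrt{\alpha}\, l_{\rm p}$ at $\Delta p \sim \hbar/(\sqrt{\alpha}\, l_{\rm p})$: no matter how energetic the probe, the resolution never gets better than about a Planck length, and beyond that point it only degrades.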
This kind of thought experiment can be made somewhat more rigorous; for a very nice introduction see e.g. section II A of
- Quantum gravity and minimum length
By: Luis J. Garay
- On Gravity and the Uncertainty Principle
By: Ronald J. Adler, David I. Santiago
(Both papers are very readable.) One should keep in mind though that, plausible as the general sentiment is, these arguments are not 'derivations', since we don't yet have an observationally confirmed theory of quantum gravity to derive from. They are thought experiments - no more, no less - meant to investigate certain general aspects one can expect of quantum gravity. Different arguments lead to slightly different versions of generalized uncertainty principles, which indicates the limitations of such considerations - so far, there is no unique version of the generalized uncertainty principle. The Bambi and Freese paper uses another argument, which says that if you increase the energy in a spacetime region to measure ever more precisely, you will eventually just form a black hole and not learn anything more precise than this. That's essentially what Giddings and Thomas dubbed so aptly "The End of Short Distance Physics" (hep-ph/0106219).
The consequences of such a generalized uncertainty principle have been examined extensively since the early nineties, most notably by Achim Kempf, see e.g. "On Quantum Field Theory with Nonzero Minimal Uncertainties in Positions and Momenta" (hep-th/9602085), who considers even more general versions of the uncertainty relations.
Virtual Black Holes
Another ingredient of the paper are virtual black holes. In particle physics, virtual particles are not 'really' produced in scattering experiments; they only come to life in intermediate exchanges. Black holes can, at least in theory, be 'really' produced in particle collisions if a high enough energy is concentrated in a small enough region of space. Black holes with masses close to the Planck mass evaporate extremely fast, and in this process they can violate certain conservation laws, e.g. baryon number. This is because the black hole doesn't care what it was formed of; it just evaporates democratically into all particles of the Standard Model.
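For orientation: the standard semi-classical formulas give a Hawking temperature inversely proportional to the mass, and an evaporation time that grows with the third power of the mass,

$$T_{\rm H} = \frac{\hbar c^3}{8\pi G M k_{\rm B}}\;, \qquad \tau \;\sim\; \left(\frac{M}{m_{\rm p}}\right)^3 t_{\rm p}\;,$$

so a hole with a mass close to the Planck mass $m_{\rm p}$ decays within roughly a Planck time $t_{\rm p}$.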
If black holes can be produced in particle collisions, one would expect them to also appear in virtual exchanges, where they could mediate baryon or flavor violating processes, e.g. proton decay. Since the black holes are rather heavy and short-lived, these processes are usually very suppressed though. This was for example examined in
- Proton Decay, Black Holes, and Large Extra Dimensions
By: Fred C. Adams, Gordon L. Kane, Manasse Mbonye, Malcolm J. Perry
They used the virtual black holes to set constraints on the size of extra dimensions. In the presence of 'large' (meaning, larger than the Planck scale) extra dimensions, one expects the production of black holes to no longer be suppressed by the usual Planck scale but by the new, lowered one. These processes are then significantly enhanced. This problem doesn't occur only for virtual black holes but essentially for all higher order operators, which are normally suppressed because the Planck scale is so far off. Either way, the Bambi and Freese paper isn't concerned with extra dimensions.
If I understand it correctly, Bambi and Freese essentially argue in the paper that the generalized uncertainty, which has a minimum position uncertainty, should also result in a minimum time uncertainty (see e.g. "Quantum fluctuations of space-time" by Michael Maziashvili, hep-ph/0605146). One can construct thought experiments similar to the one mentioned above by attempting to build more and more precise clocks to measure exactly when a particle will e.g. decay. Again, General Relativity puts a limit on these efforts. (By starting from the uncertainty principle instead of from the commutation relations, one elegantly circumvents the question of what a time operator is actually supposed to be; see e.g. John Baez on The Time-Energy Uncertainty Relation.) This argument usually employs some Lorentz symmetry (you bounce a particle back and forth over some distance to get a time measure), so one thing that springs to mind is whether one still has this Lorentz symmetry.
Either way, the point they are making in the paper is that with the generalized uncertainty, the lifetime of virtual states with masses above the Planck scale should not, as usual, become shorter and shorter, because it can never drop below the Planck time (see the schematic relation after the quote below). Compared to the usual scenario, these virtual contributions should then become more important. In the paper they provide some general estimates of what would arise from such a scenario, e.g. for proton decay, and find
"that super–Planck mass virtual black holes predict naturally dangerous processes, clearly inconsistent with the observed universe. In particular, rapid baryon number violating processes may lead to predictions of proton decay lifetimes that are ruled out by experiment."
Well, if you've followed this blog for a while (e.g. this post on 'test theories' or this recent progress report), you probably know what I am going to say. This is all well and fine, but estimates based on dimensional arguments can't replace a consistent model that includes a generalized uncertainty relation. The uncertainty relation is derived from the commutation relations of position and momentum. If you 'generalize' it, you need to modify these commutation relations. This is actually the better starting point, and also the one most commonly used, and this modified commutator in quantum mechanics has its equivalent in quantum field theory.
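The connection works through the general Robertson bound, which holds for any pair of observables,

$$\Delta A \, \Delta B \;\geq\; \frac{1}{2}\,\left|\,\langle\, [\hat A, \hat B]\, \rangle\,\right|\;,$$

so a modified commutator between position and momentum translates directly into a generalized uncertainty relation.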
Such a modification, however, has a couple of consequences. One is that you can't have a minimal length without also having a modification of special relativity. Another is that in some cases one also has a modified dispersion relation (though not necessarily so, since the modification can factor out). The most important one in this context is that one has a modification of the measure in momentum space. Essentially, high momentum states are less populated. This modification of the measure in momentum space is not optional; it is a consequence of the generalized uncertainty principle. This was probably first pointed out in the above mentioned paper by Kempf. You can find some details on the relations between these ingredients in my paper hep-th/0510245.
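To make this concrete: in the simplest case that Kempf studied, the commutator receives a quadratic correction, and the scalar product in momentum space has to be modified accordingly for the position operator to remain symmetric,

$$[\hat x, \hat p] = i\hbar\,\left(1+\beta \hat p^2\right)\;, \qquad \langle \phi | \psi \rangle = \int \frac{dp}{1+\beta p^2}\;\phi^*(p)\,\psi(p)\;,$$

with $\beta \sim 1/(m_{\rm p} c)^2$. Via the Robertson bound, the first relation gives $\Delta x \geq (\hbar/2)(1/\Delta p + \beta\,\Delta p)$, i.e. a minimal length $\hbar\sqrt{\beta}$, while the factor $1/(1+\beta p^2)$ in the measure is what dampens the trans-Planckian region in loop integrals.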
I do not see how the estimates in the Bambi-Freese paper take into account that within a scenario with a generalized uncertainty principle the measure in momentum space is modified, which naturally suppresses virtual particles with masses above the Planck mass. As one would expect from a theory with a minimal length, this essentially provides a natural regulator, and all virtual contributions above the Planck scale are strongly suppressed.
Relatedly, some years ago I calculated loop contributions in an extra-dimensional scenario with Kaluza-Klein excitations, using a model with a generalized uncertainty (see hep-ph/0405127). These excitations can become arbitrarily massive, which usually poses a problem, much like the massive virtual black holes considered by Bambi and Freese. In the scenario with the minimal length, these very massive virtual contributions are suppressed, and the result is naturally regularized (since the scenario has extra dimensions, the usual renormalization schemes don't work).
Besides this, I admittedly have a general problem with putting black holes with masses considerably above the Planck scale into Feynman loops. One should keep in mind that the quantum aspects of black holes become important only if the curvature at the horizon is in the Planckian regime, which is the case for Planck mass holes. If one increases the mass (and they are talking about a thousand times the Planck mass in the paper), the curvature drops and the black hole very quickly becomes classical. This is also the reason why the semi-classical treatment of Hawking evaporation is extremely good up to the very late stages of the decay.
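One can put a number to this: for a Schwarzschild black hole, the Kretschmann curvature scalar evaluated at the horizon $r = 2GM/c^2$ falls with the fourth power of the mass,

$$K\big|_{\rm horizon} = \frac{3}{4}\,\frac{c^8}{G^4 M^4} \;\sim\; \frac{1}{l_{\rm p}^4}\left(\frac{m_{\rm p}}{M}\right)^4\;,$$

so for a hole of a thousand Planck masses the horizon curvature is already twelve orders of magnitude below the Planckian value.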
I wrote an email to one of the authors about these concerns, and I am curious to hear what they think. I will keep you updated on that.
So I agree with the conclusion in the paper that "this problem must somehow be addressed". But unlike what they suggest, I don't think it needs a new symmetry; what it needs in the first place is a consistent quantum field theory that incorporates a minimal length. Since the minimal length acts as a regulator at the Planck scale, I expect it to suppress these processes.
As it seems, Lubos Motl has already commented on the same paper. Since his writing is, as usual, entertaining, let me briefly summarize his criticism.
Lubos begins by doubting the authors' authority - "First of all, you can see that the authors are complete outsiders in quantum gravity." - an excellent starting point for a scientific argument. His next point, about the allegedly confused notation in the Bambi-Freese paper, probably means that he didn't even bother to check the first few references, which could have clarified his problem. Lubos then proceeds to his speciality, content-free ad hominem attacks:
"In the literature, most of the talk about the "minimum length" in quantum gravity is a vague sloppy babbling of incompetent people who don't have enough imagination to ever understand how Nature actually solves these things - think about profoundly and permanently confused authors such as Ng, Amelino-Camelia, Hossenfelder, and dozens of others."
Which he then attempts to explain with
"In reality, something new is indeed going on at the Planck scale, but to assume that 1.) it must be possible to talk about distances even in this regime and 2.) all the distances must always be strictly greater the Planck scale is a double naivite."
Sadly, this shows that all the effort I made explaining my model to him - on his blog, on this blog, and by email - was completely wasted. Otherwise he would at least have noticed by now that the minimal length in my model is a wavelength, and that besides this the model has very little in common with Ng's and Amelino-Camelia's work. But this unfortunate limitation of his mental capability is explained by the following sentence:
"If you actually look at any consistent realization of quantum gravity - and we have quite many setups to do so, including AdS/CFT, Matrix theory, perturbative string theory, and more informal descriptions of quantum gravity involving effective field theory - you will see that the "generalized uncertainty principle" in the strict Bambi-Freese sense is certainly wrong."
Which we should probably translate into "my head is full of string theory, and nothing else will fit in" - the poor guy. This problem is further clarified by his statement
"Even if you are a bizarre, speculative alternative physicist who thinks that the reality is described by an entirely different theory of quantum gravity than the theory we still call "string theory", you must agree that string theory provides us with one or many realizations (depending how you count) of a physical system that satisfies all the consistency criteria expected from a theory of quantum gravity."
Well, as a scientist I don't have to agree on a description of reality as long as there is no experimental evidence whatsoever. I can't help but wonder which of us is bizarre and speculative. Lubos continues his argumentation in the usual manner ("I think that every competent person would agree with me..." etc.). The only surprising thing is that neither the name Smolin nor Woit appears in his writing. So maybe this is an improvement, and it leads me to hope that the air in Pilsen might clear his mind.
The only reason why I am telling you this (and breaking my own no-link policy) is that despite his disgusting way of writing, as far as I can tell the conclusions he draws seem to be compatible with mine: "very large black holes "in loops" cannot cause any very large violation of the baryon and lepton numbers."
In most cases, a first-order approximation is used, with some integer for the lowest power of energy over Planck mass that appears, and an expansion parameter of order one.
At least that is what most people in the field believe today. Just to be sure, I want to add that nobody has ever seen a real black hole evaporate.
The paper was originally titled 'Self-consistency in Theories with a Minimal Length'. However, one of the referees didn't like the title, so it was published as 'Note on Theories with a Minimal Length', and the essential word 'self-consistency' was dropped.