[Image source: Flickr]
Martin Bojowald is one of the originators of Loop Quantum Cosmology (LQC), a model for the universe that makes use of the quantization techniques of Loop Quantum Gravity (LQG). This description of cosmology takes into account effects of quantum gravity, and it has become very popular during the last decade because it allows one to make contact with observation. The best known finding in LQC is that the Big Bang singularity of classical general relativity is replaced by a bounce that takes place when the curvature becomes strong (reaches the Planckian regime). This in turn has consequences, for example, for the spectrum of primordial gravitational waves (which we still hope will at some point emerge out of the foreground dust).
Now rumors have reached me from various sources that Martin has lost faith in Loop Quantum Cosmology as a viable description of our universe, and indeed he recently put a paper out on the arXiv detailing the problem that he sees.
Information loss, made worse by quantum gravity

Loop Quantum Cosmology, to be clear, was never claimed to be strictly speaking derived from Loop Quantum Gravity, though I have frequently noticed that the similarity of the names leads to confusion in the popular science literature. LQC deals with a symmetry-reduced version of LQG, but this symmetry reduction is done before the quantization. In practice this means that in LQC one first simplifies the universe by assuming it is homogeneous and isotropic, and then quantizes the remaining degrees of freedom. Whether or not this treatment leads to the same result that one would get by taking the fully quantized theory and looking for a solution that reproduces the right symmetries is controversial, and to my knowledge this question has never been satisfactorily settled.
Be that as it may, from my perspective and from that of most people working on the topic, LQC is a phenomenological model that is potentially testable and thus interesting in its own right, regardless of its connection to LQG.
It has become apparent during the last few years, however, that if one takes into account perturbations around the homogeneous and isotropic background in LQC, then one finds something peculiar: the space-time around the bounce loses its time-coordinate; it becomes Euclidean and is thus just space without time. We discussed this earlier here.
Now, the time-coordinate in the space-time that we normally deal with plays a very important role: it allows us to set an initial condition at one moment in time and then use the equations of motion to predict what will happen at later times. This so-called “forward evolution” is a very typical procedure for differential equations in physics, so typical that we often do not think about it very much. The relevant point to emphasize is that to determine what happens at some point in space-time, one does not have to set a condition on a space-time boundary around that point, which would require knowing what happens at some moments into the future; it is sufficient to know what happened at some moment in the past.
This important property, which allows us to set initial conditions in the past to predict the future, is not something you get for free in any space-time background. Space-times with this property are called “globally hyperbolic”. (Anti-de Sitter space is probably the best-known example of a space-time that is not globally hyperbolic, hence the relevance of the boundary in that case.)
In his new paper Martin now points out that if space-time has regions that are Euclidean then the initial value problem becomes problematic. It is then in fact no longer possible to predict the future from a past initial condition. For the case of the Big Bang singularity being replaced by a Euclidean regime, this does not matter so much because we would just set initial conditions after this regime has passed and move on from there. But not so with black holes.
In LQC, the singularity inside black holes is then also replaced by a Euclidean regime. This regime only forms in the late stages of collapse and will eventually vanish after the black hole has evaporated. But the presence of an intermediate Euclidean region has the consequence that the outcome of the evaporation process depends on the boundary conditions surrounding that region. With the intermediate Euclidean region, one can no longer predict the outcome of black hole evaporation from the initial conditions of the matter that formed the black hole.
In his paper Martin writes that this makes the black hole information loss considerably worse. The normal black hole information loss problem is that the process of black hole evaporation seems to be irreversible and thus in particular not unitary. The final state of the evaporation is always thermal radiation, regardless of what formed the black hole. Now with the Euclidean region the final state of the black hole evaporation depends on some boundary condition that is not even in principle predictable. We have thus gone from not unitary to not deterministic!
Martin likens this case to that of a naked singularity, a singular region that (in contrast to the normal black hole singularity, which is hidden behind the horizon) is in full causal contact with the rest of space-time. A singularity is where everything ends, but it is also where anything can start. The initial value problem in a space-time with a naked singularity is, Martin argues, just as ill-defined as that in a space-time region with a Euclidean core.
I find this property of black holes in LQC not as worrisome as Martin does. The comparison to a naked singularity is not a good one, because the defining property of a singularity is that one cannot continue through it. One can, however, continue through the Euclidean region; it’s just that one needs additional constraints to know how. In fact, I can see that what Martin thinks is a bug might be a feature for somebody else, for after all, time-evolution in quantum mechanics already seems to be non-deterministic.
But even leaving aside this admittedly far-fetched relation, the situation that additional information is necessary on some boundary to the future is not unlike that of the mysterious “stretched horizon” in black hole complementarity. Said stretched horizon somehow stores and later releases the information of what fell through it. If the LQC black hole is supposed to solve the black hole information problem, then the same must be happening on the boundary of the Euclidean region. And, yes, that is a teleological constraint. I do not see what theory could possibly lead to it, but I don’t see that it is impossible either.
In summary, I find this development more interesting than troublesome. In contrast to non-unitarity, having a Euclidean core is uncomfortable and certainly unintuitive, but not necessarily inconsistent. I am very curious to see what the community will make of this -- and I am sure we will hear more about it soon.