Image Source: Flickr.
Martin Bojowald is one of the originators of Loop Quantum Cosmology (LQC), a model for the universe that makes use of the quantization techniques of Loop Quantum Gravity (LQG). This description of cosmology takes into account effects of quantum gravity, and it has become very popular during the last decade because it makes contact with observation.
The best known finding in LQC is that the Big Bang singularity of classical general relativity is replaced by a bounce that takes place when the curvature becomes strong (reaches the Planckian regime). This in turn has consequences, for example, for the spectrum of primordial gravitational waves (which we still hope will at some point emerge out of the foreground dust).

Now rumors have reached me from various sources that Martin has lost faith in Loop Quantum Cosmology as a viable description of our universe, and indeed he recently put out a paper on the arXiv detailing the problem that he sees.
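For readers who want to see the formula behind the bounce, the standard effective Friedmann equation of LQC (a textbook result, quoted here from memory rather than from Martin's paper) reads

\[ H^2 = \frac{8\pi G}{3}\,\rho \left(1 - \frac{\rho}{\rho_c}\right), \]

where \(\rho_c\) is a critical density of roughly Planckian size. The Hubble rate \(H\) vanishes when \(\rho = \rho_c\): instead of running into a singularity, the contraction halts and reverses. That is the bounce.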
Information loss, made worse by quantum gravity

Loop Quantum Cosmology, to be clear, was never claimed to be strictly speaking derived from Loop Quantum Gravity, though I have frequently noticed that the similarity of the names leads to confusion in the popular science literature. LQC deals with a symmetry-reduced version of LQG, but this symmetry reduction is done before the quantization. In practice this means that in LQC one first simplifies the universe by assuming it is homogeneous and isotropic, and then quantizes the remaining degrees of freedom. Whether or not this treatment leads to the same result that one would get by taking the fully quantized theory and looking for a solution that reproduces the right symmetries is controversial, and to my knowledge this question has never been satisfactorily settled.
Martin Bojowald
arXiv:1409.3157
Be that as it may, from my perspective and from that of most people working on the topic, LQC is a phenomenological model that is potentially testable and thus interesting in its own right, regardless of its connection to LQG.
It has become apparent during the last years, however, that if one takes into account perturbations around the homogeneous and isotropic background in LQC, then one finds something peculiar: the space-time around the bounce loses its time-coordinate; it becomes Euclidean and is thus just space, without time. We discussed this earlier here.
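In the effective ("deformed algebra") treatment this signature change can be made explicit. To my understanding, the equation for perturbations picks up a density-dependent factor in front of the spatial derivatives, schematically (again a standard effective-theory expression, not something specific to Martin's paper)

\[ \partial_\eta^2 v - \Omega\, \nabla^2 v + \dots = 0, \qquad \Omega = 1 - \frac{2\rho}{\rho_c}. \]

For \(\rho > \rho_c/2\) the factor \(\Omega\) becomes negative, and the equation changes character from hyperbolic (wave-like, Lorentzian) to elliptic (Euclidean).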
Now, the time-coordinate in the space-time that we normally deal with plays a very important role: it allows us to set an initial condition at one moment in time and then use the equations of motion to predict what will happen at later times. This so-called “forward evolution” is such a typical procedure for differential equations in physics that we often do not think about it very much. The point I have to emphasize is that to determine what happens at some point in space-time, one does not have to set an initial condition on a space-time boundary around that point, which would necessitate knowing what happens at some moments in the future; it is sufficient to know what happened at some moment in the past.
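To make “forward evolution” concrete, here is a minimal sketch: the one-dimensional wave equation, a hyperbolic equation like the equations of motion on a Lorentzian background, evolved forward from initial data alone. All names and parameter values are illustrative choices of mine, not anything from the paper.

```python
# Forward evolution of the 1D wave equation u_tt = c^2 u_xx:
# given u and u_t at one moment of time, the field at all later
# times follows step by step from the past alone.
import numpy as np

c = 1.0                        # wave speed
nx, nt = 200, 400              # spatial grid points, time steps
dx = 1.0 / (nx - 1)
dt = 0.5 * dx / c              # CFL condition dt <= dx/c for stability
x = np.linspace(0.0, 1.0, nx)

# Initial data at t = 0: a Gaussian bump, initially at rest
# (u_curr = u_prev is a first-order approximation of u_t = 0).
u_prev = np.exp(-200.0 * (x - 0.5) ** 2)   # u(x, 0)
u_curr = u_prev.copy()                     # u(x, dt)

r2 = (c * dt / dx) ** 2
for _ in range(nt):
    u_next = np.empty_like(u_curr)
    # Leapfrog update: u_next depends only on the two previous
    # time levels, i.e. only on data from the past.
    u_next[1:-1] = (2.0 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]))
    u_next[0] = u_next[-1] = 0.0           # fixed (Dirichlet) ends
    u_prev, u_curr = u_curr, u_next

print("max |u| after evolving forward:", float(np.abs(u_curr).max()))
```

The point is the structure of the update: every new time slice is computed from earlier slices only; no knowledge of the future is required.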
This important property, which allows us to set initial conditions in the past to predict the future, is not something you get for free in any space-time background. Space-times that obey this property are called “globally hyperbolic”. (Anti-de Sitter space is probably the best-known example of a space-time that is not globally hyperbolic, hence the relevance of the boundary in this case.)
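To see concretely why a Euclidean, i.e. elliptic, regime spoils the initial value problem, recall Hadamard's classic textbook example (included here for illustration; it is not from Martin's paper): the Laplace equation with Cauchy data

\[ \partial_t^2 u + \partial_x^2 u = 0, \qquad u(x,0) = 0, \quad \partial_t u(x,0) = \tfrac{1}{k}\sin(kx), \]

has the solution \(u(x,t) = \sin(kx)\sinh(kt)/k^2\). For \(k \to \infty\) the initial data become arbitrarily small, yet at any later time the solution blows up like \(e^{kt}/k^2\). Data in the past simply do not control the solution; for elliptic equations the well-posed problem instead requires data on a closed boundary.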
In his new paper, Martin now points out that if space-time has regions that are Euclidean, then the initial value problem becomes problematic: it is in fact no longer possible to predict the future from a past initial condition. For the case of the Big Bang singularity being replaced by a Euclidean regime this does not matter so much, because we would just set initial conditions after this regime has passed and move on from there. But not so with black holes.
In LQC, the singularity inside black holes is then also replaced by a Euclidean regime. This regime only forms in the late stages of collapse and eventually vanishes after the black hole has evaporated. But the presence of an intermediate Euclidean region has the consequence that the outcome of the evaporation process depends on the boundary conditions surrounding the Euclidean region. One can then no longer predict, from the initial conditions of the matter that formed the black hole, what the outcome of black hole evaporation will be.
In his paper Martin writes that this makes the black hole information loss problem considerably worse. The normal black hole information loss problem is that the process of black hole evaporation seems to be irreversible, and thus in particular not unitary: the final state of the evaporation is always thermal radiation, regardless of what formed the black hole. Now, with the Euclidean region, the final state of the black hole evaporation depends on some boundary condition that is not even in principle predictable. We have thus gone from not unitary to not deterministic!
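To spell out the distinction, note that unitary evolution is in particular invertible:

\[ \rho_{\rm out} = U \rho_{\rm in} U^\dagger, \quad U^\dagger U = \mathbb{1} \;\;\Rightarrow\;\; \rho_{\rm in} = U^\dagger \rho_{\rm out} U. \]

Hawking evaporation maps every initial state to (approximately) the same thermal state, so no such \(U\) can exist; that is the usual information loss. With an unpredictable boundary condition on the Euclidean region, there is not even a fixed map from \(\rho_{\rm in}\) to \(\rho_{\rm out}\) any more. (This is my paraphrase of the standard argument, for orientation.)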
Martin likens this case to that of a naked singularity, a singular region that (in contrast to the normal black hole singularity which is hidden by the horizon) is in full causal contact with space-time. A singularity is where everything ends, but it is also where anything can start. The initial value problem in a space-time with a naked singularity is similarly ill-defined as that in a space-time region with a Euclidean core, Martin argues.
I find this property of black holes in LQC not as worrisome as Martin does. The comparison to a naked singularity is not a good one, because the defining property of a singularity is that one cannot continue through it. One can however continue through the Euclidean region; it’s just that one needs additional constraints to know how. In fact, I can see that what Martin thinks is a bug might be a feature for somebody else, for after all we know that time-evolution in quantum mechanics does seem to be non-deterministic.
But even leaving aside this admittedly far-fetched relation, the situation that additional information is necessary on some boundary to the future is not unlike that of the mysterious “stretched horizon” in black hole complementarity. Said stretched horizon somehow stores and later releases the information of what fell through it. If the LQC black hole is supposed to solve the black hole information problem, then the same must be happening on the boundary of the Euclidean region. And, yes, that is a teleological constraint. I do not see what theory could possibly lead to it, but I don’t see that it is impossible either.
In summary, I find this development more interesting than troublesome. In contrast to non-unitarity, having a Euclidean core is uncomfortable and certainly unintuitive, but not necessarily inconsistent. I am very curious to see what the community will make of this -- and I am sure we will hear more about it in the near future.
Good thoughts, and I agree with your assessment. Nice to see a clear paper with fresh ideas pointing at least toward possible solutions.
I left a comment on Antionio Ricardo Martines fb link to the Myths of the tower of Babylon:
L. Edgar Otto: And on the 11th day the God said "let there be quantum mechanics", and the unity of the earthlings decohered just before they reached the heavens in their understanding, and the gods uplifted them likewise a multiverse in His own image, saying: "it can get better than this."
I added a note for a novel, and it seems this fixed-or-moving issue opens the novel, which cannot escape cosmology as part of the poetry after all, now that I read this essay.
Interesting, although familiar from a totally different context for me.
In TGD, space-time surfaces are surfaces in a certain higher-dimensional space. The metric and also the gauge potentials are induced, and space-time regions with Euclidean signature emerge as a basic prediction. They emerge in strong enough gravitational fields, when the imbedding space coordinates vary very fast as functions of the time coordinate, which can be taken to be the Minkowski time of the imbedding space M^4xCP_2.

Euclidean regions can be assigned to very small wormhole contacts connecting space-time sheets with Minkowskian signature of the induced metric. Their existence means a definite departure from general relativity, in which space-times correspond to small deformations of Minkowski space. Wormhole contacts serve as basic building bricks of elementary particles and can be identified as counterparts of the "4-D lines" of generalized Feynman diagrams, realized in terms of space-time topology.

Euclidean regions are natural counterparts of black holes and perhaps of all macroscopic objects in the many-sheeted space-time of TGD.

The non-determinism mentioned above is also a basic aspect of the variational principle determining the dynamics of space-time surfaces. There is a huge vacuum degeneracy resembling U(1) gauge invariance but realized only for vacuum extremals, identifiable as a 4-D analog of spin glass degeneracy. Also the Euclidean vacuum regions are non-deterministic: their M^4 projection is a random light-like curve, and the light-likeness condition gives nothing but Virasoro conditions classically. This is a signature of an underlying conformal symmetry.
I find it a bit strange to say that LQC "makes the BH information loss problem worse". The standard situation we have is semiclassical gravity with quantum fields, and there we find non-unitarity. Everyone expects that to restore unitarity, gravity must be made quantum mechanical. But in this calculation, Martin has started with a (symmetry-reduced) quantum theory and already taken the semi-classical limit of the gravity sector, in order to be able to discuss things like the metric signature. I'm not that surprised that the semiclassical limit of a quantum theory of spacetime can lead to non-Lorentzian spacetimes. But supposedly, before this semiclassical limit was taken, there was a perfectly healthy (unitary) Hamiltonian acting on a perfectly good Hilbert space, so nothing to worry about, right? The problems just arise from the semiclassical limit.
Leo:
How do you know it's unitary? And also, what good would that knowledge be if you don't know the operator - that's only another way of saying it's not deterministic. Best,
B.
"homogeneous and isotropic background A rigorously derived axiomatic system fails to model observation. A founding postulate is empirically defective. Massless boson photons detect no vacuum refraction, dispersion, dissipation, dichroism, or gyrotropy. Postulate photon vacuum symmetries are exactly true for fermionic matter (quarks, hadrons). Parity violations, symmetry breakings, chiral anomalies, baryogenesis, Chern-Simons repair of Einstein-Hilbert action indicate a trace chiral anisotropic background acting only upon matter.
Spacetime chiral torsion adds to spacetime achiral curvature. Test as geometric Eötvös, calorimetry, and molecular rotation temperature experiments in existing apparatus. Unending elegant parameterizations versus one heterodox measurement: Look.
Matti,
Could these many sheets be seen as superimposed on each other? In your model it is not clear that, if a Mersenne prime describes a particle, there are more than two observable resonances.
I call this the football problem: bosons as American footballs with two naked or near-naked singularities at the ends, and European footballs as the uncertainty of a soccer ball due to differences in radii relative to other frames.
The particle that falls into a black hole: is it a fermion or a boson, or something else in between, if it could be seen and the Feynman diagram rotated 90 degrees? Can a single quark exist and so fall in?
The problem is not made worse, even limiting ourselves to 4-space and motion. But it is certainly and necessarily made harder. A Minkowski signature can do many things, as five- (or ten-)fold group theory, but we need more.
What does this mean for vacuum pressure, say in nuclear bombs as a region; why are they localized?
Unitary may have several layers of meaning, but it should be clear which set of concepts applies to the physical. Lorentz holds on some levels and not on others, from different views. If we limited it to narrow group notation we should find at least M^4 x CP_2, conceptually.
Gravitation has to be more complicated than this... what is some local preferred initial condition and what is intrinsically superdetermined is a higher level of unity of the physics. Non-determinism is at best not necessarily physical or real, but rather non-superdeterminism.
Uncle Al... we do such an experiment with our minds for a chiral path to some imagined singularity, as if bouncing from it or lost into it, focused at a nearby heart of a star.
Afterlife, or bouncing back again and again - a hint of a metaphysical question our simple ideas of Hilbert space cannot answer, yet. But Matti, it does not begin to explain consciousness in isolation.
Thanks to all the gang here, whose worthwhile takes on relevant theories I have seen over and over again.
If we do not expect planets to spiral into each other, then something more general holds back the condition where Lorentz may hold or orbits fail in the potential and kinetic conditions; it is a matter of sorting out these "wormhole paths" and "chiral traces".
I've a very weak understanding of the point, but nevertheless I'll try to comment.
According to the consideration I did here ( https://www.dropbox.com/s/ejkj84bsmr7xmna/EN_Singularity.pdf?dl=0 ), space-time becomes flat when r_s becomes smaller than r_n, but in such a case the matter is outside the singularity even if the density is 1/2 of the Planck density...
Sabine: Good point! If there is no unitary Hamiltonian, then it's sick as a theory. I assumed that if it was possible to formulate the theory from a real action, that it would be equivalent to a theory with a unitary Hamiltonian (the spin foam representation) ... but I don't actually know if that's true.
No singularity exists in Nature. Particles are the densest part of the universe, but they can't be singular due to the geometrical relation between the constants G, c, and h_bar.
nemo, the link to Favaon's letter is an impressive argument and reasoning.
A possible simple counter-argument could be that if the black hole exists on the electron-scale level as n finite units on the microscale, we cannot distinguish the inside and outside, for a continuous line (much like abstract empty space, aether-like) may pass through it without touching the points; so what seems discrete can, with better definition, connect the inside and outside from our view, or it may not physically matter. But the duality of halves and doubles is part of the sensible arithmetic.
The crucial thing in this for me is really the question of whether or not one can draw conclusions based on the effective picture even in regimes where the semi-classical approximation is expected to break down - which is what this paper does. I think the arguments on this front will be very interesting and the ones to really look out for.
@ L. Edgar Otto
"Uncle Al... we do such an experiment with our minds for a chiral path to some imagined singularity as if bouncing from it or lost into it focused at a nearby heart of a star."
1) H. K. Moffatt, "Six lectures on general fluid dynamics and two on hydromagnetic dynamo theory," in R. Balian & J-L Peube (eds), Fluid Dynamics (Gordon and Breach, 1977), http://www.igf.fuw.edu.pl/KB/HKM/PDF/HKM_027_s.pdf (slow to download), https://googledrive.com/host/0B2UrHNG0HK6fazJMYl84WmVmUm8/PDFs/Moffatt_1977_GordonandBreach_Slogfdatohdt_149.pdf , pp. 175-6, the chiral case: "a lack of reflexional symmetry"
2) Dunkelbumser
Uncle Al
Very nice Moffatt links.
Classical electromagnetism is something it would historically pay us to know and to apply better to any alternative views, as is our intuitive approach to space (of which Einstein did not go off the deep end to follow this idea of a wider general unity).
The Lagrangian complements the Laplacian, it seems - but we do need to stay close to the abstract mathematical methods so we can depend on them.
Reflection itself, more generalized, could be the center of chirality by which all other processes are described, even fluidly. The problem we debate here is the nature (unnecessary to establish thermodynamics as a field by the first and second laws) of that third law, in which some "lack of x" defines where these concepts are intuitively and physically broken (that is, the third law as if a 5-fold phi spiral with a central singularity neither bounced off nor reached, in principle, yet). So general volume involves these aspects of broken golden ratios and so on of the unitary constants.
For example... we do not need 8 presumably octonion signed vector objects (whether locally or globally broken for physics) to describe the balanced tensor ideas of Eddington's view, but five composite ones, of which Dirac has twelve... and so on... so that simple geometry can be subject to this simple algebra and perhaps find closer or exact measurements.
2) Dunkelbumser? I do not understand.
It takes a quantum leap of intuition, of sorts, to get some of my friends who try to study, say, game theory to leap from bored to tears to great excitement on the subject, realizing that intuitively they sensed this all along but were not good at math as measured by school tests.
Perhaps the neutrino is in a sense a black hole, despite its simplicity, as to what forces such particles recognize and what electrons could so turn into - but this is as simplistic as most of our raw intuitions. Two spirals spinning down one drain with two openings in the flow do not always move in opposite directions.
As Martin rightly points out, the problems in LQG are conceptual. Without a cogent fundamental physical principle as a cornerstone, LQG is bound to have internal contradictions which will lead to a catastrophic collapse of the theory. The Relativity theories and Quantum theory are founded on fundamental physical principles, which ought to be reconciled if any meaningful quantum theory of gravity is to be realized. This is the approach taken in The Nexus model of Quantum Gravity. Here is a preprint which gives solutions that are falsifiable and non-divergent. https://www.academia.edu/8604226/The_Schwarzschild_Solution_to_the_Nexus_Graviton_Field
Is there a link to an explicit definition of ‘information’ as it is used in this context - as a conserved physical property?
Thank you.
Don:
No, when it comes to black hole information, "information loss" means "non-unitary time evolution". As I explained in this old post, the issue is actually non-reversibility. (A non-reversible time-evolution is also non-unitary. The opposite is not necessarily true.) Best,
B.
Alice,
Black holes go away. Just like decay.*
*When you don't stop watching them.
Best,
Bob