Saturday, March 10, 2007

Echo: DDR as a Test Model

This is a copy of my post at the blog to our discussion group 'Quantum Gravity in the Lab?!' about our meeting on March 6th. The discussion was led by Michele Arzano, and this week's topic was 'Planck-scale departures from relativistic symmetries: test theories and status of predictions'.


Michele Arzano arrived as a postdoc at PI around the same time as I did, in fall 2006. Before I start explaining what he told us in the discussion this week, let me point out that he's been DJing for a while, and had the idea to organize PI's first 'Decoherence Dance' that will take place on Saturday in the old PI building. In case you're around, drop in... Michele has quite an impressive number of works on Hopf algebras, black hole thermodynamics, and modified dispersion relations. Though he prefers to call these 'deformed dispersion relations' (DDR).

He started by explaining the general idea of confronting a (not further specified) 'candidate theory of quantum gravity' (QG) with the real world by use of a test model. Such a test model, he explained, would generally have a 'main feature' and possible 'additional features'. I think of these as parameters which might not necessarily all vanish, but could do so under certain circumstances. An example that he gave later might be a DDR that could, but doesn't have to, come along with a violation of conservation laws, and the resulting threshold modifications.

Michele then briefly summarized evidence for DDRs from QG.

First, dispersion relations get modified in theories with non-commutative space-times. Such could arise in certain string scenarios, where Lorentz invariance is broken (for a review see e.g. Review of the Phenomenology of Noncommutative Geometry), or the non-commutativity arises through a modification of the Lie algebra of the standard flat-space symmetries to a κ-Minkowski Lie algebra (for references see e.g. Hopf-algebra description of noncommutative-spacetime symmetries).

Second, evidence for DDRs has been found in approaches from LQG (see e.g. Loop quantum gravity and light propagation, and Quantum symmetry, the cosmological constant and Planck scale phenomenology).

He then defined the test model by generally parameterizing an expansion of the DDR in powers of energy over the Planck energy Ep. The coefficient of the first non-vanishing term is the most important one; schematically, the parameterization is of the form

E² = p² + m² + η p² (E/Ep)^n + ...

Since the expansion parameter E/Ep is typically less than 10^-16, it is of utmost importance whether the first power n is 1 or >1.

This deformation is what he referred to as the 'main feature'. Since this is not a theory but only a single equation, it might come with additional features like a modification of energy-momentum conservation, or an energy-dependent speed of light. Generally, he said, it is an open question whether the dynamics can be described by an effective field theory. The above expansion includes DSR approaches as well as an explicit breaking of Lorentz invariance with a preferred frame.
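
As a sanity check on the 'energy-dependent speed of light' feature, here is a minimal numerical sketch in Python. It assumes a common parameterization of the deformed relation for a massless particle, E² = p²(1 + η(E/Ep)^n) with c = 1 - my reconstruction for illustration, not necessarily the exact form Michele used - and compares the group velocity v = dE/dp with the first-order result v ≈ 1 + η(n+1)/2 (E/Ep)^n:

```python
import numpy as np

# Toy deformed dispersion relation for a massless particle (units c = 1):
#   E^2 = p^2 * (1 + eta * (E/Ep)^n)
# This specific form is an assumption for illustration.

EP = 1.22e19  # Planck energy in GeV

def p_of_E(E, eta=1.0, n=1, Ep=EP):
    """Momentum as a function of energy, solved from the deformed relation."""
    return E / np.sqrt(1.0 + eta * (E / Ep)**n)

def group_velocity(E, **kw):
    """Group velocity v = dE/dp = 1 / (dp/dE), via central finite difference."""
    h = 1e-6 * E
    dpdE = (p_of_E(E + h, **kw) - p_of_E(E - h, **kw)) / (2.0 * h)
    return 1.0 / dpdE

# Pick an (unrealistically) high energy so the effect survives double precision:
E = EP / 100.0
v_num = group_velocity(E, eta=1.0, n=1)
v_ana = 1.0 + 0.5 * (1 + 1) * (E / EP)  # first-order expansion for n = 1
print(v_num, v_ana)  # both close to 1.01
```

For a realistic GeV photon the correction is of order 10^-19, far below double precision - which is precisely why one needs the effect to accumulate over cosmological distances, as in the γ-ray burst argument.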

[At this point Michele had already talked one hour instead of half an hour.]

He summarized two predictions that arise from this approach.


  1. In the general case of a DDR the speed of a photon depends on its energy. This means that a signal composed of different frequencies shows an unusual dispersion. Roughly speaking, higher-energy photons are faster than one would think they are. The problem is that this difference in the time of flight is hard to detect, since the ratio of the photon's energy over the Planck energy is tiny for typical photons that we observe. However, a difference in time of flight can add up, given that the signal composed of different frequencies travels over a long distance.

    If one inserts the typical scales, it turns out that γ-ray bursts provide a source that would make such an effect - tiny as it is - observable with the GLAST satellite. The bursts have a high-energy contribution that can reach up to 1 GeV, and a typical distance of a Gpc. In the case of n=1, the accumulated difference in time of flight between the higher and lower energies becomes comparable to the typical duration of the burst itself (of the order of milliseconds), and thus potentially detectable. (I mentioned that the energies inserted in the equation were taken in a specific rest frame, that of the cosmic microwave background.)

  2. In case energy-momentum conservation is modified, one obtains a modification of thresholds for particle production. This effect has been used to explain the yet-to-be-confirmed absence of the GZK cutoff for cosmic rays. To briefly recall the issue: cosmic rays are commonly believed to be created from incoming high-energy protons that are not produced in nearby sources. If the protons move fast enough relative to the cosmic microwave background, however, they will eventually scatter on the background radiation and produce pions (pions being the lightest mesons). If the threshold for this reaction is crossed, the typical travel distance (mean free path) of the protons drops considerably, and they cannot reach us anymore. One thus expects a sharp cutoff in the spectrum that should occur for proton energies around 10^19 eV (in the Earth's rest frame; in the center-of-mass frame this is roughly a GeV).

    Whether the threshold is raised or lowered depends on the sign of η. I forgot whether positive or negative would raise the threshold as necessary, sorry. (As Joy pointed out, the experimental situation on the GZK cutoff is far from clear, and there are many issues that have to be taken into account. I mentioned that the energies inserted in the equation were taken in a specific rest frame, that of the cosmic microwave background. Yes, I know, I insist on that point.)
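
The unmodified threshold from point 2 can be checked in a few lines. This sketch assumes standard special-relativistic kinematics (no deformation) and a head-on collision of the proton with a typical CMB photon of energy ε ≈ 6×10^-4 eV; at threshold the squared center-of-mass energy s ≈ m_p² + 4 E ε must reach (m_p + m_π)²:

```python
# Standard (undeformed) threshold for p + gamma_CMB -> p + pi0, head-on
# collision with a typical CMB photon. All energies in eV.
M_P     = 938.272e6   # proton mass
M_PI    = 134.977e6   # neutral pion mass
EPS_CMB = 6.3e-4      # typical CMB photon energy, ~2.7 kT at T = 2.725 K

def gzk_threshold(eps=EPS_CMB):
    """Proton energy at which s = m_p^2 + 4*E*eps reaches (m_p + m_pi)^2."""
    return ((M_P + M_PI)**2 - M_P**2) / (4.0 * eps)

print(f"E_th ~ {gzk_threshold():.1e} eV")        # of order 1e20 eV
print(f"sqrt(s) = {(M_P + M_PI) / 1e9:.2f} GeV")  # roughly a GeV, as noted above
```

This head-on, typical-photon estimate gives ~10^20 eV; averaging over collision angles and the high-energy tail of the CMB spectrum brings the effective cutoff down to the quoted ~10^19 eV. A deformation of the dispersion relation or of energy-momentum conservation shifts this threshold, which is the claimed observable.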
[Somewhere around here the discussion exceeded 90 minutes. I think I must be the worst discussion leader that the world has ever seen.]

Michele didn't have the time to elaborate on the question whether or not n>1 effects would be observable as well.
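
Whether n>1 could possibly be observable can at least be guessed from the same back-of-the-envelope numbers as in the first prediction. Assuming the accumulated delay scales as Δt ~ η (E/Ep)^n D/c, with order-one factors and redshift effects dropped:

```python
# Rough accumulated time-of-flight delay, Delta_t ~ eta * (E/Ep)^n * D / c.
# Order-one factors and cosmological redshift effects are dropped.
C      = 2.998e8     # speed of light, m/s
GPC    = 3.086e25    # 1 Gpc in meters
EP_GEV = 1.22e19     # Planck energy in GeV

def delay(E_gev, n=1, D=GPC, eta=1.0):
    """Delay of a photon of energy E_gev (GeV) over distance D (m), in seconds."""
    return eta * (E_gev / EP_GEV)**n * D / C

dt1 = delay(1.0, n=1)  # 1 GeV photon from a Gpc-distant burst, n = 1
dt2 = delay(1.0, n=2)  # same photon, n = 2
print(dt1, dt2)  # milliseconds vs. roughly 19 orders of magnitude less
```

For n=1 the delay comes out at the order of milliseconds, comparable to the burst substructure; for n=2 it is suppressed by another factor of E/Ep and hopelessly out of reach for time-of-flight measurements, so presumably any test of n>1 would need a different observable.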

In my opinion, another test for the test model is whether or not it can be put into a consistent framework. It is unfortunate that this test model still consists of only a couple of equations and cannot be understood as a theory. I think the requirement of having the main and/or additional features arise from a theory would significantly improve the reliability of the predictions, and help to clarify ambiguities of the model. Overall, I find the approach interesting, though the direct connections to quantum gravity seem weak, and are more motivations than actual derivations. It is hard for me to judge whether the discussed features must necessarily arise from certain approaches, or are just a general possibility that one can't (and doesn't want to) exclude. In this regard, I hope that Lee will enlighten us next week.

Suggested literature:

22 comments:

Uncle Al said...

Noncommutative geometries overall violate Lorentz Invariance with mechanisms active in chiral mass distributions' observables' divergences. Do QG DDRs demand the Equivalence Principle? If not, what sources EP violations?

Arguing with equations is inferior to experimental validation or falsification. Even theorists can make empirical coffee given budget, apparatus, consumables, and lab practice.

Moshe said...

Possibly naive question regarding Lorentz violation. Suppose we have breaking of Lorentz invariance at the Planck scale, doesn't really matter how. The low energy effective field theory will now have many more allowed couplings, the coefficients of which are determined by dimensional analysis, up to order one effects.

So, irrelevant couplings will give Planck suppressed effects, which is presumably what the observational signatures are all about. How about relevant and marginal couplings? It is not that hard to write a few... and they will tend to dominate the physics at low energies. Seems to me then that by their absence we already know Lorentz invariance to be accurate to fantastically high energies, way above the Planck scale. Am I wrong?

QUASAR9 said...

Hi Bee, thanks for these insights into the discussion (and the five links directly relevant at the end)
Sometimes it is a good idea to be 'flexible' and allow a talk to go on for 90 minutes.

"Overall seen I find the approach interesting, though direct connections to quantum gravity seem to be weak, and more motivations than actual derivations. It is hard for me to judge on whether the discussed features must necessarily arise from certain approaches, or are just a general possibility that one can't (and doesn't want to) exclude. In this regard, I hope that Lee will enlighten us next week."

Bee, Gravity 'clearly' has a range
on Earth mass & distance from the Earth's core...
but the distance to the Moon is not constant, and said to be increasing - because ...?

I'll leave long distance Light, Photons, high energy scales and any QG relation for a later date

Plato said...

Escape velocity of the photon? I'd like to contribute but it has to be here, and not there in QGL. :( Not too smart sometimes.)

A black hole is an object so massive that even light cannot escape from it. This requires the idea of a gravitational mass for a photon, which then allows the calculation of an escape energy for an object of that mass. When the escape energy is equal to the photon energy, the implication is that the object is a "black hole".

I mean it's relevant to know what values would allow "direct perception of the blackhole" so you would have to have this measure?

http://bp2.blogger.com/_cldxKGOzgeM/RfBxtIxAoTI/AAAAAAAAARk/YKld1fv3Frw/s200/blog+compton_scatter.gif

Now to know what the value is, the interaction had to be of value as well, so you would define its colour and the energy?

Glast historical data of the "cosmological event" are based on calorimetric measures.

This is self evident determination of the geometrical structure of that event. Just as one may say, what value this process in the Quark Gluon plasma?

Plato said...

Sorry Bee image does not allow it to be shown, so I will try this.

It is an animation "borrowed" that I thought suitable.

Bee said...

Hi Uncle,

I do not know any reason or evidence why QG would imply a violation of the equivalence principle. Already in standard General Relativity, a particle's spin and/or gauge charge does, in principle, influence the motion of the particle, though this influence is usually negligible. This is not a violation of the equivalence principle, but a consequence of the fact that every contribution to the particle's field strengths is also a source for the gravitational field. The equivalence principle is most often stated for point particles; in its most trivial form it says inertial mass equals gravitational mass. The more useful formulation is that effects of gravity in curved space are locally identical to acceleration in flat space. It might very well be that chirality influences (in principle) the motion of particles, but I doubt this effect is observable for molecules.

Arguing with equations is inferior to experimental validation or falsification.

Without the equations there is nothing to validate or falsify, but just a bunch of measurements. The whole purpose of science is to understand the underlying laws. There is no doubt that mathematics has turned out to be a very useful tool, and the demand for consistency of a theory is, in the absence of experimental facts, the most powerful criterion I can think of.

Best,

B.

Bee said...

Hi Moshe,

yes, you are absolutely right. This is the reason why I prefer a scenario that does not break Lorentz invariance, and insist on that point. In the low energy effective limit all such modifications are tightly constrained, and have been thoroughly investigated, see e.g.

hep-ph/9809521

or search the arXiv for author Kostelecky. These constraints however are currently not really applicable to the above described scenario since the low energy effective qft is not available (some argue there might not be such a limit, but I don't feel qualified to judge on this).

Best,

B.

Moshe said...

Thanks Bee, if this is true then any theory or scenario that predicts Lorentz violation at the Planck scale is already falsified... I'll take a look at the paper you mention, I am curious what the tightest constraints are, I'd be surprised if one needs to do any new experiments to dramatically constrain that scenario.

I already encountered the argument of not reducing to low energy EFT...since our world is certainly well described by one such theory, seems to me again that any such scenario is already trivially falsified, using just table-top experiments (since such theory probably does not contain tables...).

Bee said...

Hi Quasar,

because ...?

Because angular momentum is conserved.

I was fine with the discussion - it was very interesting. But I'm not sure how the others thought of it.

Best,

B.

Bee said...

Hi Moshe,

well, the parameters would be tightly constrained. Whether or not the above scenario is affected by these constraints depends of course on the details of the so far not existent theory. I find it very possible that our universe has indeed a preferred frame, e.g. that in rest with the CMB, which breaks exact Lorentz symmetry.

(However, despite the fact that I find it possible, I guess it became clear that I am not a friend of a breaking of Lorentz invariance. If you check my papers you'll find that I neither think there is any modification of the GZK cutoff, nor an energy dependence in the speed of light as predicted by the DDR models. I can not exclude the possibility though that the kappa-Minkowski approach will eventually turn out to somehow avoid all the already present constraints.)

Best,

B.

Arun said...

??Ask at the beginning of the seminar if anyone has a hard stop??

Plato said...

Stefan:Strangelets have been thought of as possible culprits for RHIC disaster scenarios (besides the ubiquitous black holes ;-), and as responsible for potential cosmic ray particles beyond the GZK cutoff.

While pointing to the Fly Eye and Oh my God Particle were quite revealing?

Since the first observation, by the University of Utah's Fly's Eye 2, at least fifteen similar events have been recorded, confirming the phenomenon. The source of such high energy particles remains a mystery, especially since interactions with blue-shifted cosmic microwave background radiation limit the distance that these particles can travel before losing energy (the Greisen-Zatsepin-Kuzmin limit).

M said...

Is the case n=1 already excluded (at least morally) by the observation of synchrotron radiation from the Crab nebula? See astro-ph/0212190

Christine Dantas said...

Hi Bee,

In the paper by Tsvi Piran [astro-ph/0407462], the author presents some cautionary notes in using GRBs as probes to QG (see section 6). One of them is that high (~100+ MeV) and low (~100 keV) energy photons are probably not emitted simultaneously.

I'm not up-to-date with GRB literature (I was aware of this issue some time ago). Does anybody know whether this phenomenon has been better understood? I guess it should, if we would like to use GRBs reliably for probing QG.

Thanks,
Christine

Lumo said...

I think it's all irrational. There could be in principle violations of Lorentz invariance although there exists absolutely no reason to think that it should be the case. But why the hell do you think that these things have anything to do with quantum gravity? What is gravitational about Lorentz violation?

Gravity is, on the contrary, the result of imposing the local Lorentz symmetry in spacetime.

It's been explained many times that the statements that quantum gravity is related to these deformations and violations of the Lorentz symmetry is based on the obvious fact that virtually all the demonstrably wrong theories of quantum gravity - especially the discrete approaches - manifestly violate Lorentz symmetry.

Some of the proponents of these cheap theories don't ever want to accept falsification so they invent that the violation is not bad and it could be just a "deformation" etc.

There exists no rational argument whatsoever that would be connecting the "deformations" of the 4D Lorentz algebra with any theory *called* quantum gravity, and certainly not with any theory that actually describes gravity. All these connections are bogus. They're sociological construct and deliberately spread confusion - through the media - and every intelligent person who has thought about these things for at least 5 minutes must know that it's all silly.

Moreover it is very obvious that the violation of the Lorentz symmetry in the discrete models is not small in any parametric way, and that the deformed relativity with generic coefficients is already incompatible with experiments.

Why do you keep on spreading all this fog about this multi-layered pile of nonsense?

Bee said...

Hi Christine,

yes, one has to be careful. A very thorough investigation of the question under which circumstances experimental possibilities allow one to measure a modification of the time of flight for gamma-ray photons has been given in

On the Problem of Detecting Quantum-Gravity Based Photon Dispersion in Gamma-Ray Bursts

From the abstract: we argue that short bursts with narrow pulse structures at high energies will offer the least ambiguous tests for energy-dependent dispersion effects.

Best,

B.

Uncle Al said...

Lorentz Invariance parity violation can be large and observationally consistent if confined to resolved opposite parity mass distributions. They do not naturally occur. Resolved twistane is the most parity divergent single molecule in glass, CHI=0.72 of CHI=1 maximum. Isolated molecules are unworkable test masses.

Gravitation is empirically blind to all observables. An object is an array of gravitationally anonymous identical unbonded unit masses (atoms). A periodic single crystal locates mass in space. The entire object must be calculated atom by atom.

Lubos' argument is valid if the Equivalence Principle is true. Affine, teleparallel, and noncommutative gravitations predictively indistinguishable from General Relativity ignore the EP. Only the disjoint non-overlap is experimentally interesting. Nobody has examined a large amplitude case.

C, P, and T were separately and pairwise conserved in particle physics. Beauty of theory demanded it! Yang and Lee ended that. A two day calorimetry experiment could end the Equivalence Principle wholly within validated existing theory. How can you not look?

If it fails, so what? It's fast and cheap. Have an undergrad do it. If it succeeds the kid gets smooched by Oprah (and Carl XVI Gustaf).

QUASAR9 said...

Hi Bee, have you read this gravity test from Stanford Uni
Gravity attracts Kasevich's interest
from "Steering Atoms Toward Better Navigation, Physicists Test Newton And Einstein Along The Way." Science Daily 23 Feb 2007.

Aaron said...

I know this may be hopelessly naive, but doesn't General Relativity already imply something like deformed special relativity?

It seems fairly clear that the Fourier Expansion of the space-time metric gives a metric on the momentum space.

Bee said...

Hi Uncle,

Gravitation is empirically blind to all observables.

Gravitation is sensitive to everything that carries energy, momentum, angular momentum, that also includes field strength tensors of (possibly non-abelian) gauge charges and spin.

A two day calorimetry experiment could end the Equivalence Principle wholly within validated existing theory. How can you not look?

If it fails, so what? It's fast and cheap.


Great, then how about you just do it?

Hi Aaron,

yes, in a certain regard you are of course right. If you read the introductions of my papers (e.g. hep-th/0603032, hep-th/0702016) you will find that I start from a classical viewpoint to argue for an energy dependence of the metric. However, for one, a Fourier analysis relies on the expansion of the field in an orthonormal set, which isn't unique. But more importantly: how could you ever get a UV regulator in momentum space from a purely classical approach? Yes, one can motivate that additional effects arise based on classical considerations, but the properties of momentum space that give rise to DSR can imo not be caused purely by classical effects. One could of course attempt to include quantum effects by averaging over trans-Planckian scattering effects, say, excess graviton exchange or such - but in the absence of the full theory, how would we do that? Therefore, the reasoning in my papers is: already from classical arguments one can expect a modification; the precise form including quantum effects can so far not be derived, but we expect it to respect the Planck length as a minimal length.

Hi Lubos,

There could be in principle violations of Lorentz invariance although there exists absolutely no reason to think that it should be the case. But why the hell do you think that these things have anything to do with quantum gravity?

I don't think that and I never said so. I wrote: I find it very possible that our universe has indeed a preferred frame, e.g. that in rest with the CMB, which breaks exact Lorentz symmetry. and direct connections to quantum gravity seem to be weak. One reason for the whole discussion group is to pin down in how far the proposed effects actually have some connection to quantum gravity.

Why do you keep on spreading all this fog about this multi-layered pile of nonsense?

Trying to get out of the fog.

Best,

B.

Aaron said...

The different bases (both in 4-D and quantum) correspond to different momentum expansions: linear, angular, etc...

It would be interesting to compute the Fourier Transform of the Schwarzschild solution on a light cone coincident with an observer.

Hehehe, kind of a little joke, every particle is its own little black hole.

Aaron said...

I was thinking about supernovas last night and thought of what quantum gravity should predict: the quantization of the mass spectrum of black holes.

It should be an ionization spectrum turned upside down, so that Hawking Radiation occurs in steps down to some minimum threshold, after which the black hole evaporates (mind you releasing a lot of energy).

The question is then, is the lowest available black hole energy lower or greater than the highest degenerate quark plasma energy?

The process I was thinking is analogous to the precipitation of iron and its fusion products in the core of a star (because of turbulence the products don't form at the centre, rather they form in localized high pressure regions and then fall inward after formation).

So black holes wouldn't just appear in the middle of a degenerate quark star, rather quarks would get squeezed together in localized high pressure regions until they form a quantum black hole. These quantum black holes would then precipitate out.

But would they aggregate, forming a larger black hole, or is there another unknown force that must be overcome, yielding newer states of matter?