Friday, February 05, 2016

Much Ado around Nothing: The Cosmological non-Constant Problem

Tl;dr: Researchers put forward a theoretical argument that new physics must appear at energies much lower than commonly thought, barely beyond the reach of the LHC.

The cosmological constant is the worst-ever prediction of quantum field theory, infamously off by 120 orders of magnitude. And as if that weren't embarrassing enough, it gives rise to not one but three problems: Why is the measured cosmological constant 1) not huge, 2) not zero, and 3) why did it start to dominate the universe's expansion only recently, rather than a billion years earlier? With that, you'd think physicists have their hands full getting their zeroes arranged correctly. But Niayesh Afshordi and Elliot Nelson just added to our worries.
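Where the "120 orders of magnitude" comes from can be seen with a back-of-envelope sketch, here in Python. This is the standard textbook estimate (naive Planck-scale cutoff versus the observed dark-energy scale of a few meV), not a number taken from the paper:

```python
import math

# Naive QFT estimate: vacuum energy density ~ (Planck energy)^4,
# assuming the quantum fluctuations are cut off at the Planck scale.
planck_energy_gev = 1.22e19          # Planck energy in GeV
rho_qft = planck_energy_gev ** 4     # in GeV^4

# Observed value: the measured dark-energy density corresponds to
# an energy scale of roughly 2 meV.
dark_energy_scale_gev = 2.4e-12      # ~2.4 meV in GeV
rho_obs = dark_energy_scale_gev ** 4 # in GeV^4

orders = math.log10(rho_qft / rho_obs)
print(round(orders))                 # 123 -- "about 120 orders of magnitude"
```

Only the order of magnitude matters here; with a SUSY-breaking cutoff instead of the Planck scale the mismatch shrinks, but remains enormous.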

In a paper that took third place in this year's Buchalter Cosmology Prize, Afshordi and Nelson pointed out that the cosmological constant, if it arises from the vacuum energy of matter fields, should be subject to quantum fluctuations. And these fluctuations around the average remain large even if you have managed to make the constant itself small.
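The statistical point here can be made with a toy illustration (emphatically not the authors' calculation): fine-tuning can cancel the average vacuum energy, but it does nothing to the size of the fluctuations about that average. Model the vacuum energy as a large random contribution plus a tunable counterterm:

```python
import random
import statistics

random.seed(0)
scale = 1e6                                   # arbitrary "large" energy scale
samples = [random.gauss(0, scale) for _ in range(10_000)]

# Fine-tuning: add a counterterm that cancels the mean exactly...
counterterm = -statistics.mean(samples)
tuned = [s + counterterm for s in samples]

print(abs(statistics.mean(tuned)) < 1e-3)     # True: the average is tuned away
print(statistics.stdev(tuned) > 0.9 * scale)  # True: fluctuations stay huge
```

Shifting a distribution leaves its variance untouched, which is the one-line version of why tuning the constant to be small does not tame the fluctuations.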

The cosmological constant, thus, is not actually constant. And since matter curves space-time, the matter fluctuations lead to space-time fluctuations – which can screw with our cosmological models. Afshordi and Nelson dubbed it the “Cosmological non-Constant Problem.”

But there is more to their argument than just adding to our problems, because Afshordi and Nelson quantified what it takes to avoid a conflict with observation. They calculate the effect of stress-energy fluctuations on the space-time background, and then analyze the consequences for the gravitational interaction. They introduce as a free parameter an energy scale up to which the fluctuations persist, and then contrast the resulting corrections with observations, such as the CMB power spectrum or the peculiar velocities of galaxy clusters. From these measurements they derive bounds on the scale at which the fluctuations must cease, and thus where some new physics must come into play.

They find that the scale beyond which we should already have seen the effect of the vacuum fluctuations is about 35 TeV. If their argument is right, this means something must happen either to matter or to gravity before reaching this energy scale; the option the authors advocate in their paper is that physics becomes strongly coupled below this scale (thus invalidating the extrapolation to larger energies, removing the problem).

Unfortunately, the LHC will not be able to reach all the way up to 35 TeV. But a next larger collider – and we all hope there will be one! – almost certainly would be able to test the full range. As Niayesh put it: “It’s not a problem yet” – but it will be a problem if there is no new physics before getting all the way up to 35 TeV.

I find this an interesting new twist on the cosmological constant problem(s). Something about this argument irks me, but I can’t quite put a finger on it. If I have an insight, you’ll hear from me again. Just generally I would caution you to not take the exact numerical value too seriously because in this kind of estimate there are usually various places where factors of order one might come in.

In summary, if Afshordi and Nelson are right, we’ve been missing something really essential about gravity.


Aida Ahmadzadegan said...

Afshordi and Nelson in the summary?

Sabine Hossenfelder said...

Ah, thanks, I fixed that :)

Phillip Helbig said...

"The cosmological constant is the worst-ever prediction of quantum field theory, infamously off by 120 orders of magnitude."

I've read this and heard this so many times that three things spring to my mind:

1. Repeating something often doesn't make it true.

2. Many people just quote it from other people, without really thinking about it.

3. Why not call it the quantum-field-theory problem, rather than the cosmological-constant problem? Any other theory that bad would be thrown out right away. (Note for experts: QFT does not really predict this any more than QED predicts an infinite electron mass. If the observed electron mass is not the same as the one one has to renormalize in QED, why should the QFT value for lambda correspond to the observed value? Why does anyone even think that it should?)

4. There can be other sources of the cosmological constant besides vacuum fluctuations.

OK, four things. Among the things that spring to my mind are.... But no fanatic devotion to the Pope.

My universal advice: read this paper by Bianchi and Rovelli (the latter sometimes comments here) and come back. If you agree with their paper, repent of your ways. If not, publish a refutation.

Popular-science writers should write about papers like this, instead of hyping dark matter killing the dinosaurs or whatever.

Probably my love of Baroque music caused me to Google "Bianchi and Corelli" when searching for the URL above. This brings up The G-string Murders in Google Books. Really. It's not directly about string theory, at least not as we know it, Jim. It appears to be a hard-boiled police procedural. Have fun!

Sabine Hossenfelder said...


I know the Rovelli paper, I read it, and I agree with almost all of it. But this isn't the place to discuss it - it would get me greatly off-topic. I just wanted to point out the new argument put forward by Niayesh and Elliot.

The difference from fixing the electron mass is one of finetuning, so in the end it comes down to a naturalness argument. And I have long been on record as not believing in naturalness - before abandoning it became fashionable - so I hope this explains my opinion on the matter.

Phillip Helbig said...

OK. Yes, this is not the place to discuss it. Yes, the whole topic is a bit tricky and details matter. I was just a bit disappointed about reading the "120 orders of magnitude" argument yet again. However, are the 120 orders of magnitude really relevant for discussing this new paper?

By the way, sorry for the "quantum fluctuations" with my comments popping in and out of existence. While one can't edit comments here, at least one can delete them and replace them with a better version.

Why not a post expounding on Bianchi and Corelli?

While I'm at it, let me plug Carlo's most recent popular book, Seven Brief Lessons on Physics. A very enjoyable read.

Sabine Hossenfelder said...


I reviewed Carlo's book here. Regarding the 120, it's a standard argument and it's at its core correct, and it's there to explain to the reader why it's an interesting topic and they should read the rest of the post and follow me on facebook and send me a donation ;)

More generally, the problem is the following. This blog has a very diverse audience. It's basically impossible to satisfy everybody, especially not in less than 1000 words (which is what I normally try to aim at). I normally start out very basic, add technical details until it starts hurting, and then I sum up for those I've lost. I conclude that I will probably never succeed in writing a complaint-free article, unless you count the empty set of words.

The whole naturalness argument plays a major role in my hopefully forthcoming book, so in the future you will likely get to hear more about this than you want :o)



Niayesh Afshordi said...

Thanks Sabine and Everyone else for your interest.

Indeed, one way to understand our result is to ask how much of the cosmological constant problem CANNOT be solved by fine-tuning. So, even if you're ok with "technically unnatural" theories, you still expect to see fluctuations in the vacuum energy-momentum tensor, and may wonder why you don't see their gravitational effect.

Uncle Al said...

"barely beyond the reach of the LHC" Increasing proton decay half-life, Super-K versus "inevitable" SUSY. "off by 120 orders of magnitude" Baryogenesis is off by [(hadrons - antihadrons)/photons] = 6.1×10^(-10). Is that better?

"there is no new physics before...35 TeV" The US Congress killed the 40 TeV Superconducting Supercollider. Waxahachie, Texas is still available.

"if Afshordi and Nelson are right, we’ve been missing something really essential about gravity" Maybe two things - but the second, from chiral anisotropic vacuum stresses selective to hadrons, is geometrically testable on a bench top. Look.

Matthew Rapaport said...

"missing something really essential about gravity" I would not be at all surprised. Cosmological explanations have become steadily more complex as we have probed deeper since the discovery of the CMB. I see no reason why this should not continue!

vladimirkalitvianski said...

I wonder how one can apply QFT to the whole universe? QFTs are about particular quasi-particles in compound systems of limited size, like lasers. Two different lasers have different occupation numbers of their photons. The cluster decomposition principle means exactly that - quasi-particles in different systems do not correlate. If you apply QFT to the whole universe, then you have to take into account all occupation numbers of quasi-particles. The universe is stuffed with them. Which vacuum fluctuations may influence the whole thing if it is not empty and the occupation numbers are huge?

Mitchell said...

This paper seems to belong to a "genre" that also includes "holographic dark energy" and "running vacuum energy"...

Lucy M said...

Mitchell says "This paper seems to belong to a "genre" that also includes "holographic dark energy" and "running vacuum energy"..."

Could you possibly expand on that a little? It's just that I don't have enough skills/knowledge to satisfyingly grok the paper, or I need more time in study of it. I think I'm strong enough that I can assemble a reasonable high level assessment, of say, robustness. But again, I'm not completely sure.

For that reason it would help me to hear your reasons, because assuming the insinuation that comes across for the words that you choose, is what you intend - i.e. that the work is pseudo, poorly founded or otherwise suspicious...I honestly completely missed anything like that.

On the contrary I thought their work and approaches exemplified good solid scientific values and stricture. I'm not that well read, but for example I've never seen before the same approach that they applied in order to obtain the bounds, i.e. as Sabine exampled.

I think the problem with a large number of papers at the moment, is the absence of a major effort, involving original or distinctive thinking, to translate the ideas into a hard value to their colleagues in physics, of a kind that is not merely providing an alternative to something already doing the job in that space, but a whole new space (abstract)

So either I need to abandon all sense in which I have, or fool myself that I do, a way to get grip of this sort of thing. Or you're full of shit.

David Lambert said...

It's true the LHC will not be able to reach all the way up to 35 TeV. But cosmic ray interactions with matter can. Any hope of learning something from these, or from an energetic-enough neutrino in IceCube?

Lucy M said...

Blogger Niayesh Afshordi said..."Indeed, one way to understand our result is to ask how much of the cosmological constant problem CANNOT be solved by fine-tuning. So, even if you're ok with "technically unnatural" theories, you still expect to see fluctuations in the vacuum energy-momentum tensor, and may wonder why you don't see it's gravitational effect."

You seem to converge in your paper to a cut off for gravity itself, above the level of individual galaxies.

That also seems to be the resolution of your quoted comments presented (by you) as a riddle.

Did I get it wrong, or right?

Mitchell said...

Lucy M: there's a standard way of talking about the vacuum energy of quantum fields, which is that it's everywhere, it's the same everywhere, and it makes a uniform local contribution to the curvature of space. But in the sort of paper I am talking about, this simple picture is made more complicated, and on the basis of some heuristic idea or calculation, rather than from something deep and principled.

Holographic dark energy proposes that the vacuum energy is bounded in a way inspired by the holographic principle, but not actually derived from a specific holographic theory. Running vacuum energy says that the vacuum energy changes as the universe expands, but apparently only on the basis of a particular approximation scheme. And this paper says (I think) that vacuum energy fluctuations make a potentially destabilizing contribution to the metric, never before noticed by theorists; but the new calculation depends on quite a few technical assumptions.

None of the above is a particularly informed critique, there might even be misrepresentations. I am just spelling out how I get the superficial impression, in these works, of bold claims with a brittle basis.

For a critical analysis of the "CnC" argument in this paper, I would want to reconstruct the reasoning with an eye on sections 8.3 through 8.5, where various framing assumptions are motivated and defended. In particular, the alleged new effects are due to the Feynman diagram in figure 2B. I suspect that in a conventional approach, those contributions are fully accounted for in a renormalization. But here, they are not; instead, they grow so strong that the authors think a whole new physical regime has to appear at a relatively low energy, in order to explain why these alleged effects aren't cosmologically visible.

In other words, they've done a calculation in a new way, predicted an effect they can't see in nature, and postulated unseen new physics in order to minimize their unseen new effect. It does sound like a self-created problem, the result of going about their calculation in the wrong way.

Babak said...

I think I might be missing the Shakespearean allusion in the title of the piece. It seems weightier.

Sabine Hossenfelder said...

Ado around Nothing -> Fluctuations around the mean vacuum energy

Lucy M said...

hi Mitchell - thank you for commenting back. A good showing :)