Saturday, November 30, 2019

Dark energy might not exist after all

Last week I told you what dark energy is and why astrophysicists believe it exists. This week I want to tell you about a recent paper that claims dark energy does not exist.


To briefly remind you: dark energy is what speeds up the expansion of the universe. In contrast to all other types of matter and energy, dark energy does not dilute as the universe expands. This means that eventually everything else becomes more dilute than dark energy and, therefore, it’s the dark energy that determines the ultimate fate of our universe. If dark energy is real, the universe will expand faster and faster for all eternity. If there is no dark energy, the expansion will slow down instead, and it might even reverse, in which case the universe will collapse back to a point.
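
If you want to see this dilution argument in numbers, here is a minimal sketch in Python (the density values are just today's approximate best-fit fractions, inserted for illustration):

```python
# Sketch: how energy densities scale with the scale factor a of the universe.
# Matter dilutes as a^-3, radiation as a^-4, while dark energy (a cosmological
# constant) stays constant -- so dark energy eventually dominates.
def densities(a, rho_m0=0.3, rho_r0=1e-4, rho_de0=0.7):
    """Energy densities (in units of today's critical density) at scale factor a."""
    return {
        "matter": rho_m0 * a**-3,
        "radiation": rho_r0 * a**-4,
        "dark_energy": rho_de0,  # does not dilute
    }

for a in (0.5, 1.0, 2.0, 10.0):
    d = densities(a)
    print(a, max(d, key=d.get))  # which component dominates at this scale factor
```

In the past (a < 1) matter dominated; from roughly today onward the constant dark-energy term wins, which is why it decides the long-run fate of the universe.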

I don’t know about you, but I would like to know what is going to happen with our universe.

So what do we know about dark energy? The most important evidence we have for the existence of dark energy comes from supernova redshifts. Saul Perlmutter and Adam Riess won a Nobel Prize for this observation in 2011 (shared with Brian Schmidt). It’s this Nobel-prize-winning discovery which the new paper calls into question.

Supernovae give us information about dark energy because some of them are very regular: the so-called type Ia supernovae. Astrophysicists understand quite well how these supernovae happen, which allows them to calculate how much light these blasts emit as a function of time. So they know what was emitted. But the farther away a supernova is, the dimmer it appears. If you observe one of these supernovae, you can therefore infer its distance from its brightness.

At the same time, you can also determine the color of the light. Now, and this is the important point, the light from the supernova will stretch if space expands while the light travels from the supernova to us. This means that the wavelengths we observe here on Earth are longer than they were at emission or, to put it differently, the light arrives here with a frequency that is shifted to the red. This redshift of the light therefore tells us something about the expansion of the universe.
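
The bookkeeping here is simple enough to write down explicitly (standard definitions, nothing specific to the new paper):

```python
# Sketch: redshift z from observed vs emitted wavelength, and its relation
# to the expansion: 1 + z is the factor by which space has stretched since emission.
def redshift(lambda_observed, lambda_emitted):
    """Redshift z = lambda_obs / lambda_emit - 1."""
    return lambda_observed / lambda_emitted - 1.0

def stretch_factor(z):
    """Ratio of the scale factor now to the scale factor at emission."""
    return 1.0 + z

# Light emitted at 400 nm and observed at 600 nm has z = 0.5,
# i.e. space expanded by a factor 1.5 while the light was travelling.
z = redshift(600.0, 400.0)
```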

Now, the farther away a supernova is, the longer its light takes to reach us, and the longer ago the supernova must have happened. This means that if you measure supernovae at different distances, you are looking at events that happened at different times, so you can reconstruct how the expansion of space has changed with time.

And this is, in a nutshell, what Perlmutter and Riess did. They used the distances inferred from the brightness and the redshifts of type Ia supernovae, and found that the only way to explain both types of measurements is that the expansion of the universe is getting faster. And this means that dark energy must exist.
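
To get a feeling for what such a fit compares, here is a rough sketch that assumes a flat universe and a Hubble constant of 70 km/s/Mpc (my illustrative numbers, not those of the original analyses). With dark energy, the predicted luminosity distance at a given redshift is larger, so the supernovae appear dimmer:

```python
import math

C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (assumed value)

def luminosity_distance(z, omega_m, omega_lambda, steps=1000):
    """Luminosity distance in Mpc for a flat universe (omega_m + omega_lambda = 1),
    from a simple trapezoidal integration of the Friedmann equation."""
    def E(zp):  # dimensionless expansion rate H(z)/H0
        return math.sqrt(omega_m * (1 + zp) ** 3 + omega_lambda)
    dz = z / steps
    integral = sum((1.0 / E(i * dz) + 1.0 / E((i + 1) * dz)) / 2 * dz
                   for i in range(steps))
    return (1 + z) * (C_KM_S / H0) * integral

# With dark energy, a supernova at given z is farther away -- hence dimmer --
# than in a matter-only universe, which is what the fits picked up:
d_accel = luminosity_distance(0.5, omega_m=0.3, omega_lambda=0.7)
d_decel = luminosity_distance(0.5, omega_m=1.0, omega_lambda=0.0)
```

The gap between the two models at z = 0.5 corresponds to a few tenths of a magnitude in brightness, which gives a sense of how delicate the supernova measurements are.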

Now, Perlmutter and Riess did their analysis 20 years ago with a fairly small sample of about 110 supernovae. Meanwhile, we have data for more than 1000 supernovae. For the new paper, the researchers used 740 supernovae from the JLA catalogue. But they also explain that if one just uses the data from this catalogue as they are, one gets a wrong result. The reason is that the data have already been “corrected.”

This correction is made because the story that I just told you about the redshift is more complicated than I made it sound. That’s because the frequency of light from a distant source can also shift just because our galaxy moves relative to the source. More generally, both our galaxy and the source move relative to the average restframe of stuff in the universe. And it is this latter frame that one wants to make a statement about when it comes to the expansion of the universe.
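
For illustration, the standard non-relativistic version of this correction looks as follows (a sketch with textbook numbers; sign conventions vary, and the paper's actual pipeline uses a full flow model):

```python
C_KM_S = 299792.458   # speed of light, km/s
V_SUN_CMB = 370.0     # our speed relative to the CMB rest frame, km/s (approximate)

def cmb_frame_redshift(z_heliocentric, cos_angle_to_apex):
    """Remove the Doppler shift from our own motion.

    cos_angle_to_apex is the cosine of the angle between the source and the
    direction we are moving in. Toward the apex our motion blueshifts the
    incoming light, so the corrected (CMB-frame) redshift is larger than
    the observed one.
    """
    z_pec = -V_SUN_CMB * cos_angle_to_apex / C_KM_S  # non-relativistic Doppler
    return (1 + z_heliocentric) / (1 + z_pec) - 1

# A z = 0.02 supernova observed toward the apex of our motion:
z_cmb = cmb_frame_redshift(0.02, cos_angle_to_apex=1.0)
```

At low redshift this correction is a percent-level effect, which is exactly why the low-z supernovae are the sensitive ones here.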

How do you even make such a correction? Well, you need information about the motion of our galaxy from observations other than supernovae. You can get it by relying on regularities in the emission of light from galaxies and galaxy clusters. This allows astrophysicists to create a map of the velocities of galaxies around us, called the “bulk flow.”

But the details don’t matter all that much. To understand this new paper you only need to know that the authors had to undo this correction to recover the original data. And *then they fitted the original data, rather than using data that were, basically, assumed to converge to the cosmological average.

What they found is that the best fit to the data is one in which the redshift of the supernovae is not the same in all directions, but depends on the direction. This direction is aligned with the direction in which we move through the cosmic microwave background. And – most importantly – once this directional dependence is taken into account, you no longer need an accelerated expansion to explain the observations.

If what they say is correct, then it is unnecessary to postulate dark energy which means that the expansion of the universe might not speed up after all.

Why didn’t Perlmutter and Riess come to this conclusion? They could not, because the supernovae that they looked at were skewed in direction: the ones with low redshift were in the direction of the CMB dipole, and the high-redshift ones away from it. With a skewed sample like this, you can’t tell if the effect you see is the same in all directions.*

What about the other evidence for dark energy? Well, the other evidence is not evidence for dark energy in particular, but for a certain combination of parameters in the concordance model of cosmology. These parameters include, among other things, the amount of dark matter, the amount of normal matter, and the Hubble rate.

There are, for example, the data from baryon acoustic oscillations and from the cosmic microwave background, which are currently best fit by the presence of dark energy. But if the new paper is correct, then the current best-fit parameters for those other measurements no longer agree with those of the supernova measurements. This does not mean that the new paper is wrong. It means that one has to re-analyze the complete set of data to find out which combination of parameters fits best overall.

This paper, I have to emphasize, has been peer reviewed, is published in a high-quality journal, and the analysis meets the current scientific standard of the field. It is not a result that can be easily dismissed and it deserves to be taken very seriously, especially because it calls into question a Nobel Prize-winning discovery. The analysis will of course have to be checked by other groups, and I am sure we will hear about this again, so stay tuned.



* Corrected this paragraph which originally said that all their supernovae were in the same direction of the sky.

273 comments:

  1. Wow! I didn’t expect this.

  2. Very exciting stuff! Can't wait to hear more!

  3. I wouldn't put too much stock in the peer review process or the Nobel prize.

    Still, I am happy to hear the universe is no longer accelerating.

  4. How many parameters are in the fit to the CMB spectrum?

    I always read claims that the observed variation in the CMB is 'proof' for Inflation.

    Thanks,
    Greg

    Replies
    1. Greg,

Could you please ask Google before you dump your question here; I have better things to do than to do this for you. Wikipedia lists the parameters of the concordance model.

      The CMB doesn't prove inflation, to begin with because you cannot prove theories. The spectral index is frequently put forward as evidence in favor of inflation because its approximate value was a prediction of certain models of inflation (true) but it's inconclusive because the same prediction can be obtained by other, non-inflationary, models, while some inflationary models give other predictions.

    2. I apologize.

      It was my clumsy new way of trying to make a point. :(

      Excellent answer, though!



    3. But you explain it so much better than Google. That's why we follow you.
Also thanks for the correction * my apologies to Riess et al for thinking they only measured in one direction. This low redshift in one direction and high in the other does encourage my daft ideas.

  5. It's not the first time that Subir Sarkar has tried to "kill" the accelerated expansion of the universe.
    Three years ago, using the same dataset (if I'm not mistaken), he published a paper in Scientific Reports with the title "Marginal evidence for cosmic acceleration from Type Ia supernovae": https://www.nature.com/articles/srep35596
    Actually, if you look at Figure 2 of that paper, a universe with no accelerated expansion was inconsistent at 3 sigma or more...
    Anyway, that paper was criticized for having a flawed analysis, for example in https://iopscience.iop.org/article/10.3847/2041-8213/833/2/L30
    Let's see if this new claim withstands the test of the community.

    Replies
    1. The criticism of our previous paper was made by Rubin and Hayden, who added 12 parameters to our 10-parameter model to allow for possible sample- & redshift- dependence of the ("stretch" & "colour") parameters used to model the light curves of the supernovae in the JLA catalogue. By doing so they raised the significance of q_0 being -ve (i.e. acceleration) from 2.8 to 3.7 sigma (Section 2.3 + Fig.3 of their paper). Interestingly, as we discuss in the Appendix of our new paper, if we carry out their 22-parameter fit, the significance of the dipole we find in q_0 *increases* from 3.9 to 4.7 sigma! However such proliferation of parameters is not justified by the Bayesian Information Criterion. It is also fundamentally against the principles of blind hypothesis testing to add such parameters a posteriori - especially since the JLA authors (who included Perlmutter & Riess) had not done so. Moreover one might ask why the absolute magnitude of Type Ia supernovae should not also be sample- and redshift- dependent (if their lightcurve parameters are) ... of course if we were to allow this it would totally undermine their use as 'standard candles' and no evidence for acceleration could then be claimed! I leave it to you to judge who made the "flawed analysis".

    2. I have only given your paper an initial overview reading, which I always do with papers before really studying them. It is my understanding that this focuses on low z effects, usually z < 0.1, which is where the dipole shift of solar and galactic motion relative to the Hubble frame exists. However, dark energy and ΛCDM are measured with very high z data. The most extreme is from the Sachs-Wolfe effect. The FLRW metric term exp(t sqrt{Λ/3}) stretches gravitational wells to result in adjusted warm and cold spots on the CMB. These data, which are at z ≥ 10 for gravitational wells and for light around z ≈ 1100, correlate well with ΛCDM. It is then hard for me to understand, at least just given my first reading of this paper, that you can state, "In summary, the model-independent evidence for acceleration of the Hubble expansion rate from the largest public catalogue of SNe Ia is only 1.4σ. This is in contrast to the claim (Scolnic et al. 2018) that acceleration is established by SNe Ia at >6σ in the framework of the ΛCDM model."

      Dark energy in ΛCDM is largely inferred from very high z luminosities, such as SNe Ia data. The dipole moment in the CMB due to galactic motion and the orbit of the sun (200 km/s) is small and used to infer where the Hubble frame exists. There may indeed be small deviations here, but it is difficult for me to see how that can change the data relative to ΛCDM that significantly.

    3. Lawrence Crowell, note that on high-z the contribution of the cosmological constant to the redshift in relation to the CMB should be very small, because the DM+Baryonic density should be much higher. Only at small z the effect of dark energy gets significant.

    4. You have it flipped around. In fact the Sachs-Wolfe effect with the CMB is one of the benchmarks for ΛCDM and dark energy. The further out we observe the z captures the fact that our galaxy has been for the last 6 billion years frame dragged by the gravitation of dark energy.

    5. If someone is acting in good faith, I see no reason to hold past mis-analysis/mistakes against them. The paper reported above should stand on its own.

    6. This comment has been removed by the author.

    7. Lawrence Crowell, 6 billion years yields z=0.5, so it is still recent.

    8. This question is primarily for Subir Sarkar and Mohamed, two of the four authors of the paper which this blogpost is about.

      How far back did you go, to the original sources of data on the 740 SNe 1a that you used in your analyses?

      For example, did you obtain the original CBAT citations, and their subsequent publications of new data on each, download the data (RA+Dec, light curves, spectra, and so on), and derive "the distance modulus and apparent magnitude" of each from scratch?

      If not, how did you control for biases which may have been introduced between the original observational data and inclusion in the SDSS-II/SNLS3 JLA catalogue?

    9. @Daniel de França MTd2: Around z = 0.5 is where the SNe are found whose redshifts show the accelerated expansion. Around 8 billion years ago is where the accelerated expansion becomes evident.

    10. More questions on Colin+ (2019), shorthand C19. One set per comment.

      In both Riess+ (1998), shorthand R98, and Perlmutter+ (1999), shorthand P99, there is quite a lot of material on possible systematics, biases, cross-checks, etc. Some examples (not a complete list): K-correction, extinction (three separate sources), gravitational lensing, Malmquist bias, and Local Void.

      In Appendix B you say: "The JLA covariance matrix includes uncertainties from, for example the light-curve template fitting process, calibration uncertainties, and dust extinction in the Galaxy, together with the expected dispersion resulting from peculiar velocities (which mainly affects low redshift SNe) and lensing (which mainly affects high redshift SNe Ia) and the propagated uncertainties from the flow model from which the SN by SN peculiar velocity corrections are performed."

      To what extent did you check that these - each, individually - are free from "distance to CMB dipole" systematics?

    11. Pet peeve re C19.

      The colors in Figs 1 and 2, for lowZ and SDSS data points, are swapped; why?

      In Fig1 the HST points are purple (or violet, or ...) not black. Why?

      There seem to be just two HST points in Fig 1, but at least eight HST inverted triangles in Fig 2. No text like "4 big blue dots show clusters of many individual SNe Ia". Why?

      Maybe you authors considered it the job of some lowly copy editor to find this and fix it?

      In my experience, sloppiness like this is sometimes/often accompanied by more serious shortcomings, errors, omissions etc. Ones which reviewers may not pick up (or authors chose to over-ride or over-look).

      This has increased my motivation to dig deeper into C19.

    12. More questions on C19.

      I may be mis-reading Section 3 ("Cosmological analysis"), but it seems that you ran the analysis on all 740 JLA SNe 1a; did you?

      Did you consider doing this (or similar) analysis on each of the four datasets (lowZ, SDSS, SNLS, HST) separately?

      As the HST (apparently) and SNLS datasets have very limited footprints (two and four, respectively), did you consider doing an analysis, like that in Section 3, independently on each of the six subsets? Or at least the four SNLS ones (likely too few HST datapoints for such an analysis).

      Did you consider doing an analysis of the SALT2 scheme, explicitly looking for any "distance to the CMB dipole" biases?

    13. I just looked into it:

      There are 9 HST SNe included in the JLA compilation:

      Their right ascensions are, in degrees:

      223.53334809 125.78064202 125.68834008 126.0096569 125.86351175
      223.47289322 125.60554703 125.77072739 125.75903376

      and their declinations are

      -54.39845011 54.82204426 54.82876129 54.83518258 54.83086884
      -54.43854068 54.74609646 54.73999359 54.86053051

      As you can see, 2 of them are clustered together very closely while the other 7 are clustered together closely.

      I'll admit that at some level this was sloppiness. But I don't think it affects our conclusions. Thanks for pointing this out.

    14. The coordinates posted before were the longitudes and latitudes in the Galactic coordinate system, not right ascensions and declinations. Sorry.

      Yes, we ran the analysis on all 740 JLA SNe.

      We have done these tests, as well as the monopole-in-acceleration-only fits with each sample separately, and find mild anisotropies at high redshifts, conditional on whether peculiar velocity corrections are included or not at low redshifts etc. Please email me if you'd like me to summarize these tests for you. However, what we find is in agreement with previous results such as Bernal et al that we cite, and, due to splitting up the sample into smaller samples, not statistically conclusive in any sense.

      And no, we have not analyzed the SALT 2 scheme for anything related to the CMB dipole. This is beyond the scope of this work. But could be relevant.

    15. Thanks for the clarifications, Mohamed. Some comments:

      "The coordinates posted before were the longitudes and latitudes in the Galactic coordinate system, not right ascensions and declinations. Sorry." I have no problems with that. One minor "sloppiness": I assume you mean J2000 coordinates, but I think it would be far better to actually say so.

      "Please email me if you'd like me to summarize these tests for you." Thanks, but no thanks. I would only repost here, with your permission. I want every reader of this blog to see. However, I recognize that without something like MathJax enabled here, it may be very difficult to hold a good technical discussion.

      "and due to splitting up the sample into smaller samples, not statistically conclusive in any sense." I'd likely have been surprised if you had responded in any other way! I feel that detailed tests, on individual subsamples, are important to do, no matter how well the agreement with previous results is (or seems). Likely we won't have to wait too long, given the pace of reported SNe, in CBAT for example.

      "And no, we have not analyzed the SALT 2 scheme for anything related to the CMB dipole. This is beyond the scope of this work. But could be relevant." I feel it's a very important i to be dotted (or t to be crossed), given the potential ramifications of what C19 reports.

    16. My turn for sorry. Galactic coordinates do not need any J2000 or similar (those are for ecliptic coordinates). And the old galactic coordinate system is so old that specifying which you are using was important only decades ago (say, the 1960s).

      I'd better refrain from posting further for today at least (I'm coming down with something, likely a cold; brain is clearly functioning poorly).

  6. I just opened the paper by Colin, Mohayaee, Rameez, Sarkar, but my wake up coffee has not kicked in enough to get me reading it just yet. There are a few things that I have pondered with this. The dipole red vs blue shift of the CMB has been known since the late 1970s, which I would presume Perlmutter, Riess and Schmidt took into account with their interpretation of redshift data on type Ia supernovae. This is, for those not aware, due to the small motion of the Milky Way in the local group and the motion of the sun, some 300 km/s as I recall, relative to a galactic frame and so forth. The other considerations below have to do with supersymmetry and inflationary cosmology.

    Supersymmetry is in an unbroken phase for energy E = 0 or H = ½(QQ^† + Q^†Q) annulled on the vacuum. Inflationary cosmology would highly break SUSY with its large vacuum energy. The current state of the universe would break SUSY softly, as one might model with how Zeeman splitting by a small magnetic field lightly breaks atomic degeneracy. However, the LHC finds no evidence of lightly broken SUSY. We might then ponder the consequences of there being no vacuum energy, in which case Λ = 0 and if SUSY exists we should then expect supersymmetric partners with equal masses. This we do not observe. So if there is really no dark energy, or equivalently zero vacuum energy, then this might be evidence of a hard ruling out of SUSY. With a small Λ we might still get a highly broken SUSY, just as the phase of water can be liquid at very low temperatures. Inflationary cosmology might then be compared to a topological order, and a small Λ with SUSY still highly broken might then be analogous to a supersymmetric protected topological system. With this, going to Λ → 0 seems to be something similar to quantum criticality. Things to think about for sure.

    Suppose there is an intrinsic dipole to the accelerated expansion of the observable universe, though Λ is not zero. If we consider the model with physical vacuum bubbles emerging in an inflationary spacetime, this then has a boundary at a finite distance. Even if this bubble region “pops off” as a closed manifold or even a Euclidean space R^3 evolving in time, there might be signatures of this anisotropy inherent in what is observed. I would though expect that to be some feature of the CMB that is not removed by accounting for stellar motion in this galaxy and galaxy motion in the local group.

    The third consideration is that this might go away with further work. It could just as easily be that these analyses of data have some problem that will not fit with further work.

    Replies
    1. "Things to think about for sure."

      Thinks for 15 seconds. Stops thinking.

      What does this convoluted mess of an argument have to do with the paper Sabine mentions or astrophysics?

      -drl

  7. The reference to the new paper is actually above the video rather than below (hands pointing down) the video!

    Replies
    1. Roy,

      Yes, thanks for pointing that out. I sometimes forget to remove the sentences that don't make sense in the written text.

  8. "What is with the other evidence for dark energy?"
    Perhaps "How about the other evidence for dark energy", or "What is the other evidence for dark energy".

  9. Apologies ahead of time if this is a stupid question, (I’m a Geologist not a physicist).
    So if this new paper is correct, does this mean the cosmological constant is no longer required?

    Replies
    1. It's not a stupid question, and, yes, that's what it means.

    2. Yes, IF the paper is shown to be correct and IF one explains why ALL the other indications of a positive cosmological constant are wrong.

  10. If dark energy does not do what dark energy is supposed to do, maybe it exists nevertheless because it does something else. I think that only if dark energy does nothing does dark energy not exist.

  11. Roughly, ΛCDM predicts/calculates dark energy density = 69%, dark matter density = 26%, and baryon density = 5% (relative proportions of the total mass-energy density).

    So if dark energy were to be eliminated, what would happen to the other components of the total mass-energy?

  12. The "cosmological" average frame is problematic, especially if it is revealed that the universe is closed after all. Hence it may be better to be careful and remember to consider average frames as always local.

  13. Why not simply skip the Nobel prize for physics every now and then?

  14. Pardon my pissed-off attitude; but with 110 samples, pick any theoretical 50% dividing line for some parameter, and there's about a 3.5% chance that you'll see a skew of 45:65 or worse without violating the 50%.

    Which is apparently what happened here; an unexpected dividing line (by direction) for the original 110 sample dataset.

    What bullshit peer review ends up awarding a Nobel Prize for analyzing a 110 point data set and determining 2/3 of the entire Universe is Dark Energy?

    Real question: Am I missing something here?

    Replies
    1. Thank you Dr. Castaldo. I wish the Nobel committee had shown more restraint. We would not have to put up with an epistemologically useless toy model of the Universe then.

    2. Yes, Dr. A.M. Castaldo, you are missing something. Not necessarily wrt a single P&R paper (I make no comment either way), but more broadly.

      As Bee points out, BAO and CMB data are also consistent with models which include dark energy. Ditto with gravitational lens analyses. And maybe (soon?) with GWR observations and analyses. And more.

      True, the "error bars" are large, so consistency may still hold (barely?). More interestingly, as Bee also points out, "if the new paper is correct, then the current best-fit parameters for those other measurements no longer agree with those of the supernovae measurements."

      Wouldn't that be cool? ;-)

    3. JeanTate: My question does pertain to a single paper, based on 110 sample points. I don't see how anyone could have achieved a result, using 110 data points, with enough significance to warrant a Nobel Prize and a wholesale revision of our view of the Universe (making 2/3 of it heretofore undetectable "Dark Energy").

      Especially if the source of the bias was as simple as an underlying (and unaddressed) correlation to the direction of measurement.

      If all of our science is done to that sloppy a level of precision, then none of it is worth a shit.

      Astronomy is not my field, but I know some damn statistics. How did this paper make it past peer review? Why in the world would it be accepted by fundamental particle physicists? By what VooDoo magical thinking was this a five-sigma result? I doubt it could reach 2 sigma!

      My question is indeed on one paper from one experiment that became inordinately famous and has quite possibly tainted astronomy and physics for forty years with a flawed analysis of 110 data points with what should have been an obvious non-uniform clustering issue for anybody trained and skilled in the science of astronomy.

      So my serious question is: What am I missing about this paper? How did this gain enough traction, when it was published, to win a Nobel and have thousands of physicists suddenly agree that the Universe was composed of 68% mysterious "dark energy"?

      It sounds crazy. If that is the standard of "proof" in astronomy, the field is probably riddled with lies.

    4. Thanks for the clarification, Dr. A.M. Castaldo.

      I have no insight into the deliberations of the Nobel committee, but I think a big factor in their decision was the extent to which the P&R finding (dark energy), per their first paper, was subsequently shown to be consistent with other results. Including independent analyses also based on 1a SNe.

      Perhaps a contrast may help: cold fusion. No Nobel because the initial paper(s)' conclusions did not withstand independent scrutiny.

      Or: context is important, not just a single paper.

      Oh, and yes indeed astronomy is a field riddled with myths! :O But not lies, as that would imply deliberate falsehoods; sloppy thinking, bad use of tools ("statistics"), failure to follow-up, and much more are certainly rather too common.

    5. Dr. A.M. Castaldo: you wrote "My question does pertain to a single paper, based on 110 sample points."
      and: "My question is indeed on one paper from one experiment that became inordinately famous"
      and (bold in original): "So my serious question is: What am I missing about this paper?"

      May I ask, have you read "this paper"?

      Fair warning, this is a trick question.

    6. Dr Castaldo,

      Your instincts here are correct, but it is not so much astrophysics that is at fault as it is the modern cosmological model, through the lens of which all astrophysical observations are distorted (by interpretation) to achieve "concordance".

      So, what essentially happened is, an observational discrepancy between the redshift and luminosity distances was presented as evidence of a universal acceleration which shifted the focus from the actual observations to the woo-woo, Nobel Prize-worthy, interpretation. The basic analytical error would have probably been detected sooner, had not the interpretation superseded the evidence, in the public and scientific imagination.

      Better late than never though, this welcome paper from the reality-based branch of the scientific community. It's going to be interesting to see how the theoretical community defends itself against this assault on LCDM. A safe bet would be that dark energy will survive as a zombie idea for many years to come, even if this new analysis holds and strengthens. It is unlikely the expanding universe story will be as broadly and quickly refuted as it was accepted.

    7. JeanTate: May I ask, have you read "this paper"?

      I have not. I would expect reading it to be of marginal efficacy; Astronomy is not my field. I don't have the background to understand the assumptions, equations, notation and terms of art and other context that could answer my question. I'd need an expert guide.

      But what I do have is statistical "instinct", as bud rap put it above.

      To be fair, and to speak against my instinct: there are a few routes by which a 110-point dataset could produce a high-sigma result; e.g. 110 coin flips in a row coming up heads is about a 12-sigma result. That is way over 5 sigma, so we would take it as evidence that one of our assumptions is wrong: maybe the coin is not fair, or the flipping process and environment are more precise than we imagined they could be, and are not the random influence we assumed. Basically, our assumptions do not capture the model of what is going on.

      Nevertheless, I think we can infer from the later paper that the authors of the first paper did not explore their assumptions when getting an unusual result, and particularly their assumption that the redshifts would be uniform regardless of the observational direction.

      And my additional statistical instinct is simply that 110 points (as reported by Dr. Hossenfelder) is typically not enough data to report a five sigma result, which is the standard for reporting something new in particle physics.

      My example is based on coin flips and the binomial distribution. If we pick what should be a theoretical median value for some parameter, we should see 50% of observations fall below it, and 50% above it. A coin flip: 55:55 for a 110-point data set. But a skew of 45:65 has about a 3.5% chance; about a 1.8-sigma result.
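
      That 3.5% figure can be checked exactly with a short script (my own sketch; the only input taken from the discussion is the 110-supernova sample size):

```python
import math

# Exact binomial check: chance of a 45:65 split (or a larger deficit on the
# same side) in 110 fair coin flips, plus the normal-approximation sigma level.
n, k = 110, 45
p_tail = sum(math.comb(n, i) for i in range(k + 1)) / 2 ** n  # P(X <= 45)

mean, sd = n / 2, math.sqrt(n) / 2   # mean 55, sd ~ 5.24
z = (mean - (k + 0.5)) / sd          # continuity-corrected, roughly 1.8 sigma
```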

      So my real question is, did they find a five sigma result with 110 points? If they did not, why did particle physicists accept "Dark Energy" as part of the canon? If they did find a five sigma result, what did they do to stress test their assumptions? On that line, how could professional astronomers overlook the fact (later discovered by the new paper) that redshifts were directionally dependent? I should think astronomers would be rather keen on testing for and discovering directional dependencies, especially relative to movements by the moon, earth, sun, galaxy, etc.

      It seems to me a reasonable expectation that all avenues would be exhausted before we conclude there is 3x more energy in the universe than we ever imagined, and a NEW form of energy ("Dark Energy"), and we need to rewrite physics to accommodate it.

      But apparently I'm wrong, it took decades before anybody bothered to look at that (or perhaps bothered to listen to those that did.)

      Where are the skeptical scientists? Shackled in a dungeon somewhere, those party poopers, until they learn their lesson: Don't get in the way of your colleagues making sensational press; the public loves mysteries and big $$$ depend on it.

      Delete
    8. Dr. A.M. Castaldo,

      I thank you for your frank remarks.

      Here are some comments:
      - the 2011 Nobel Physics prize was given to three scientists, not two (Schmidt was the third)
      - Perlmutter and Riess were lead authors of the two "landmark" papers the Nobel Committee cited; their work was independent
      - since 1998/9, many papers were published, on Ia SNe, on "dark energy", etc., including in the period between 1999 and 2011
      - over these ~20 years, there have been several published challenges to "dark energy" conclusions drawn from astronomical observations (not only Ia SNe)
      - YMMV but few of these have withstood subsequent scrutiny
      - it is much too early to say how the paper cited in this blog post (Colin+ 2019) will withstand the scrutiny it will surely attract
      - astrophysics is not particle physics
      - without reading either the Perlmutter+ (1999) or the Riess+ (1998) papers, how do you know what assumptions they did, or did not, make?
      - without reading, at minimum, Colin+ (2019) - which I infer you did not read - how do you know what others have (or have not) looked at re a directional bias re redshifts?

      On McGaugh's personal opinion: Did you check anything he wrote? If not, how did you arrive at the conclusion that it is an accurate (or appropriate) summary (other than that it may match your personal biases)?

      Historical note: Halton Arp was, by all accounts, the perfect gentleman. He was also a very good observational astronomer. He wrote at least one book ("Seeing Red", IIRC) which is kinda like McGaugh's piece, only much longer. AFAIK, to his dying day, he felt hard done by the astronomy community, and continued to believe in one form or other of his "discordant redshifts" idea. Surely you'd be the first to acknowledge that a good historical account would need to carefully examine Arp's claims about bias, including a need to find contemporaneous accounts by other astronomers, right? Why not apply the same skepticism to McGaugh?

      Delete
    9. Dr. Castaldo, if you think this field is a mess (which it appears to be to an extent), you must check out nutrition “science”—an appalling melange of horribleness with no parallel.

      Delete
    10. I don't see how anyone could have achieved a result, using 110 data points, with enough significance to warrant a Nobel Prize and a wholesale revision of our view of the Universe (making 2/3 of it heretofore undetectable "Dark Energy").

      My question is indeed on one paper from one experiment that became inordinately famous and has quite possibly tainted astronomy and physics for forty years with a flawed analysis of 110 data points with what should have been an obvious non-uniform clustering issue for anybody trained and skilled in the science of astronomy.

      So my serious question is: What am I missing about this paper? How did this gain enough traction, when it was published, to win a Nobel and have thousands of physicists suddenly agree that the Universe was composed of 68% mysterious "dark energy"?


      Your comment indicates that you don't know the history of cosmology in the last 30 years. It was not one paper. The community was not immediately convinced. At the same time, the current cosmological model is a much better fit to the observations than that of 30 years ago. Many people had come up with the current cosmological parameters before the supernova data were even known. And so on and so forth.

      Delete
  15. Nice piece, Bee. I've appreciated your recent blogs a lot. It's interesting to me that cosmologists in this sub-field talk frequently and uncontroversially about an "average restframe" of the universe, sometimes described as the "cosmic frame," and yet make no reflections on how this notion undermines the foundations of relativity theory and its background independence. This is obviously a mine field but I'd love to see a future blog entry from you on this specific topic.

    ReplyDelete
    Replies
    1. Isn't this somehow related to finding the 'middle point' (whatever that means) on the surface of a globe?

      Delete
    2. Read the comment by PhysicistDave below as well. There is a difference between the global and local symmetries of spacetime. The global symmetry of all spacetimes is the Poincaré group, i.e. the Lorentz group plus translations. The local symmetry of spacetime is the Lorentz group plus whatever additional symmetry the spacetime may possess. The Schwarzschild solution, for instance, has an additional spherical symmetry and a coordinate axis centered at the black hole. We do not say that this violates relativity. The same holds for cosmology, and the Hubble frame is no more of a problem than the spherically centered coordinates of a black hole.

      Delete
    3. The three possible homogeneous static universes - flat Lorentz, Einstein, de Sitter - do not have preferred frames. The non-static models as considered by Robertson, Lemaitre etc. also are spatially isotropic. So something like a preferred frame inferred from say the CMB anisotropy is certainly in a different category than the simple cosmological models.

      Of great interest is that the preferred frame has a very simple explanation from the point of view of the conformal group. A four-vectorial parameter enters into the transformations. Assuming this is timelike, you can go to its rest frame and achieve rest with respect to the cosmic frame. This can be made completely consistent with local Lorentz invariance. So in response to the O.P., it is possible to have local Lorentz invariance and a cosmic frame. (A side effect of the conformal argument is a simple, kinematical explanation of the cosmic redshift.)

      -drl

      Delete
    4. The FLRW and de Sitter spacetimes have much the same metric

      ds^2 = dt^2 - a(t)^2[dr^2 + r^2 dΩ^2]

      where for FLRW a(t)^2 = exp(2t√(Λ/3))/(1 - kr^2) and for de Sitter a(t)^2 = cosh^2(t√(Λ/3)). Then clearly for k = 0 and t >> 0 the de Sitter metric converges to the FLRW one. For comoving points in FLRW with k = 0 the redshift factor is z = a(t0)/a(t) - 1, and we may Taylor expand a(t) = 1 + t√(Λ/3) + ½(Λ/3)t^2 + ..., so to first order in time we have

      z = a(t0)/a(t) - 1 ≈ (1 + t0√(Λ/3))/(1 + t√(Λ/3)) - 1 ≈ (t0 - t)√(Λ/3)

      and the distance in this linear approximation is d = c(t0 - t), so the Hubble parameter is H ≈ √(Λ/3). We now have the Hubble law. The same result can be found with the de Sitter spacetime.

      Really the only thing that defines a special frame in the FLRW and de Sitter spacetimes is whether the metric is in diagonal form. The same holds for the Schwarzschild spacetime. It is not hard to see that if we apply a Lorentz boost in the (x, y, z) coordinates, the Hubble law will be deformed in the direction of the boost.
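
      A quick numerical check of the linear approximation above (illustrative units of my own choosing, with H standing in for √(Λ/3)):

```python
import math

H = 1.0    # stands in for sqrt(Λ/3), arbitrary units
t0 = 10.0  # "today", chosen large so tanh(H*t0) ≈ 1

def a(t):
    # de Sitter scale factor
    return math.cosh(H * t)

for dt in (0.001, 0.01, 0.1):
    z_exact = a(t0) / a(t0 - dt) - 1   # exact z = a(t0)/a(t) - 1
    z_linear = H * dt                  # Hubble law: z ≈ H (t0 - t)
    print(f"lookback {dt}: exact z = {z_exact:.6f}, linear z = {z_linear:.6f}")
```

      The two agree to first order in the lookback time, as the Taylor expansion says they should.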

      Delete
  16. Tam,

    Relativity says the laws of nature are frame-invariant, not that the existing distribution of matter has to be frame-invariant. In fact, a moment's thought should make clear that if there is any matter (or radiation) present at all, it can't be frame-invariant.

    That intuition can be made precise: you can show, fairly easily, that any distribution of matter or radiation that looks the same in all frames would have to have infinite density.

    I'm not saying your question is foolish: the same issue bothered me some years ago.

    But, when you get the distinction between frame-invariant laws of nature vs. an (impossible) frame-invariant distribution of matter or radiation, you should be able to see that it is not an issue.

    By the way, the one exception is "dark energy" due to the cosmological constant. But that is a special case that takes advantage of the fact that the metric is (Lorentz) frame-invariant. And the cosmological constant does not relate to real matter or radiation in any normal sense. (How do I know no real matter or radiation acts like the cosmological constant? Well... I and others have tried! If anyone thinks they can make that work, by all means do tell. Precise quantitative math, please, not "word salad.")

    All the best,

    Dave

    ReplyDelete
    Replies
    1. Relativity says the laws of nature are frame-invariant, not that the existing distribution of matter has to be frame-invariant. In fact, a moment's thought should make clear that if there is any matter (or radiation) present at all, it can't be frame-invariant.

      True enough, but homogeneity and isotropy (which implies frame invariance) are assumptions underlying the FLRW solutions which form the basis of the standard model of cosmology. Given that the cosmos does not appear to be homogeneous and isotropic, it would seem reasonable to question whether the standard model accurately represents the observed cosmos, especially since it requires the existence of invisible stuff (dark matter and dark energy) simply to reconcile itself with observations.

      Delete
    2. bud rap wrote:
      >True enough, but homogeneity and isotropy (which implies frame invariance) are assumptions underlying the FLRW solutions which form the basis of the standard model of cosmology. Given that the cosmos does not appear to be homogeneous and isotropic, it would seem reasonable to question whether the standard model accurately represents the observed cosmos,

      The FLRW solutions are -- obviously -- not intended to be exactly true of the real world.

      Like almost everything in science, they are meant to be decent approximations to the real world.

      There is no perfect harmonic oscillator in the real world, no perfect elliptical orbit in the real world, no perfectly spherical planet in the real world.

      But often those are very good approximations.

      We are physicists, not theologians: we aspire to a better understanding of the physical world, not to divine perfection.

      If isotropy and homogeneity are good approximations to the large scale structure of the universe, then FLRW solutions should also be good approximations. If you have evidence to the contrary, write it up and make yourself famous.

      In any case, Tam was simply mistaken, confusing the symmetries of the laws with the symmetries of the solutions. He made an elementary error.

      By the way, you of course also made an error when you said: "homogeneity and isotropy (which implies frame invariance) are assumptions underlying the FLRW solutions which form the basis of the standard model of cosmology." Nope -- the FLRW solutions are homogeneous and isotropic in space (in an appropriate global reference frame) but are not generally frame invariant (i.e., except for the trivial solutions that have no matter and no radiation).

      This is trivially obvious from the fact that in (non-trivial) FLRW solutions there is a local rest frame for the local energy density.

      For a long time you have been ranting and raving about physicists overly relying on math, but your ignorance of math causes you to keep making errors. Pretty much constantly.

      "Mit der Dummheit kämpfen Götter selbst vergebens" (Schiller).

      Delete
    3. Dave,

      For a long time you have been ranting and raving about physicists overly relying on math, but your ignorance of math causes you to keep making errors. Pretty much constantly.

      Maybe, but then I'm not championing a model that bears absolutely no resemblance, in its particulars, to the observed cosmos. It would appear, in other words, that your willful ignorance of physical reality allows you to cling to a model that, as a structural description of the observed cosmos, is an abject failure.

      To that charge you seem only able to sputter some lazy, ad hominem remarks, in hopes, I suppose, of inciting me to engage in a similarly childish display. Unfortunately Dave, I've seen all the "let's change the subject" provocations your reality-challenged community can muster. It won't work, so let's move on.

      If isotropy and homogeneity are good approximations to the large scale structure of the universe, then FLRW solutions should also be good approximations. If you have evidence to the contrary, write it up and make yourself famous.

      Unfortunately you have the situation exactly backwards. There is no evidence for your belief that isotropy and homogeneity are characteristics of the "large scale structure" (a deliberately imprecise specification that has expanded ever outward to accommodate ever expanding inhomogeneity and anisotropy observations). They have always been a foundational assumption of your model.

      In questioning that assumption I don't have to present evidence that it is not correct. On the contrary, it is incumbent on true believers like yourself, to prove that the assumption is in fact, valid. Be my guest, write it up and become famous.

      As to the FLRW solutions being good approximations of the cosmos, they provide, in fact, a manifestly bad approximation, and result in a scientifically indefensible account of the cosmological structure. The cosmos we observe gets along quite well without dark energy, dark matter, substantival spacetime, inflation, the big bang event and its inexplicable original condition, all of which comprise necessary structural elements of your reality-challenged standard model, but none of which make an appearance in empirical reality.

      I'm not as ignorant of math Dave, as you are of the proper role of empiricism in science. And BTW, I'm completely uninterested in the imaginary struggles of your imaginary gods; I presume them to reside in your imaginary universe.

      Delete
    4. bud rap wrote to me:
      >As to the FLRW solutions being good approximations of the cosmos, they provide, in fact, a manifestly bad approximation, and result in a scientifically indefensible account of the cosmological structure. The cosmos we observe gets along quite well without dark energy, dark matter, substantival spacetime, inflation, the big bang event and its inexplicable original condition, all of which comprise necessary structural elements of your reality-challenged standard model, but none of which make an appearance in empirical reality.

      Oh, well. No one conversant with the relevant theories and empirical evidence seems to agree with you. And what on earth is "substantival spacetime" (no, I know I should not ask!).

      I see that you attack the idea of the "big bang event."

      bud, young fella, would you mind telling us how old you think the earth and the universe are and why you think that? I really think that would help us understand where you are coming from.

      Perhaps the Holy Spirit has indeed vouchsafed the truth to you and hidden it from everyone who has devoted his or her life to this subject. In which case, all of us scientists will just have to accept that we are eternally mired in "invincible ignorance."

      I suppose we will have to leave it at that.

      You keep attacking all of us scientists and pretty much all of the best-tested facts of modern cosmology, but you never give us any details of how you know we are all wrong.

      Yes, yes, I know: you think the burden of proof lies on us. Except, you see, we cannot very well tutor you on modern cosmology here in Sabine's comment section.

      In fact, there are textbooks on this stuff, and if you want to learn it, you should start there instead of badgering us for not trying to teach it all to you in Sabine's comment section. If you would actually focus in on one specific fact or equation or derivation in those textbooks, we could help you. But, so far at least, you have not done that.

      Just "word salad" about "substanive spacetime" and such.

      I know: you think it is unfair that we claim you need to spend years to understand stuff that you are quite sure is false.

      Just attribute it to our "invincible ignorance."

      There is one thing though: I have used my knowledge of math and physics to build lots of complicated stuff that actually works.

      And you have not.

      Which is... suggestive.

      Delete
  17. Very interesting, thanks for that update. I know that personal biases count for nothing in science, and should moreover generally be suppressed, but I would love to see dark energy evaporate, as I have always found it an ugly 'fudge factor' in GR.

    ReplyDelete
  18. Again, can't avoid thoughts of tired light speculations.

    Analyzing from the most basic foundations of physics, we can make an analogy with light in a medium. Light need not be massive for its energy to decrease.

    https://arxiv.org/pdf/1603.07224

    The mass-wave model might be worth deeper analysis.

    If dark energy were thrown away, there would obviously be more dynamical actors at work than basic expansion...

    ReplyDelete
    Replies
    1. Eusa, if I read this - and papers which cite it - correctly, it has ~nothing to do with "tired light". But hey, why not take these results and apply them to the Virgo cluster, say. It could be interesting, if only to see what results there are in a medium (~tens of Mpc in size) that is almost entirely a harder vacuum than any attainable in labs here on Earth.

      In short, why not go beyond speculation, and do some work on its implications and consequences?

      Delete
    2. Thanks for your interest.

      https://arxiv.org/pdf/1803.10069.pdf
      https://arxiv.org/pdf/1811.09456.pdf
      https://arxiv.org/pdf/1905.09218.pdf

      Here are the newer papers on the mass-polariton theory of light. It is possible to interpret them such that the quantum energy of a photon could decrease with the causal length of propagation when the photon's momentum becomes entangled with matter particles.

      Delete
    3. Eusa, perhaps I missed it - if so please point it out - but like the first paper you cited, none of these mention the propagation of light in even the IPM (interplanetary medium), much less the ISM (interstellar) or IGM (intergalactic).

      To repeat, why have you not gone beyond speculation, and done some work on the implications and consequences of the Mass Polariton theory of light wrt astrophysics and cosmology?

      Delete
    4. We have observational evidence of tired light within our own Milky Way galaxy. There are observations showing that electromagnetic radiation slows down as it traverses the "vacuum". The evidence comes from the radio astronomy of pulsars: the arrival time of the leading edge of the pulse varies with wavelength.

      http://astronomy.swin.edu.au/cms/astro/cosmos/p/Pulsar+Dispersion+Measure
      http://www.jb.man.ac.uk/distance/frontiers/pulsars/section4.html
      http://adsabs.harvard.edu/full/1969AJ.....74..849D
      https://en.wikipedia.org/wiki/Pulsar

      The implication is that the Index of Refraction of the interstellar medium is greater than unity - not a vacuum.
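
      For concreteness, the standard dispersion-delay relation those pages describe can be sketched in a few lines of Python (the DM value below is illustrative, roughly that of the Crab pulsar):

```python
# Delay (relative to infinite frequency) of a pulse traversing an ionized
# medium, using the standard dispersion constant ~4.149 ms GHz^2 per pc cm^-3.
K_MS = 4.149

def dispersion_delay_ms(dm_pc_cm3, freq_ghz):
    """Arrival delay in milliseconds for a given dispersion measure."""
    return K_MS * dm_pc_cm3 / freq_ghz**2

dm = 57.0  # illustrative dispersion measure, pc cm^-3
for f_ghz in (0.4, 1.4, 3.0):
    print(f"{f_ghz} GHz: {dispersion_delay_ms(dm, f_ghz):.1f} ms")
```

      The 1/f^2 scaling is exactly what pulsar timing shows: the delay measures the free-electron column along the line of sight.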

      Delete
    5. JeanTate, I've worked with many theoretical excitations and ideas, but it's complicated, and achieving any consistent mathematics of tired light that matches observations is still a work in progress. The main problem is that the phenomenon is in all cases only a partial explanation, if a solution at all. The mass-polariton idea is new, and the interpretation of energy loss as tired light is under study...

      Delete
    6. Here is an image of "tired light":
      https://en.wikipedia.org/wiki/Cherenkov_radiation#/media/File:Advanced_Test_Reactor.jpg

      Roy Lofquist: the propagation of radio waves through a plasma is well-understood. Indeed, depending on how the universe evolves, the day will come when the CMB (redshifted by then into the radio) will be undetectable, as it will be absorbed by the IPM (first) then the ISM ... just as radio astronomers cannot do observations of the universe at low frequency.

      AFAIK, this has nothing to do with the usual "tired light" crackpot ideas you can find on the internet.

      Eusa: good for you! When do you expect to be able to publish something?

      Delete
    7. Jean Tate: I'm not quite sure what you are trying to say here. I did not mention either plasma or the Cosmic Microwave Background. As for crackpot ideas, that's kinda cruel to call Fritz Zwicky a crackpot.

      Delete
    8. Roy Lofquist,

      I do not know how much you know about usage of the term "tired light". However, using that term to describe the pulsar dispersion measure is just the sort of thing beloved of crackpots. Perhaps you did not read, or misunderstood, the Swinburne webpage you provided a link to?

      Zwicky's tired light is not in any way a crackpot idea. However, he certainly recognized that it has very few observational legs to stand on.

      Delete
    9. Jean Tate,

      I first became aware of the term "tired light" sometime in the 1950s. That's about the time I became interested in cosmology upon reading Gamow's "One, Two, Three, Infinity". I set out to be a physicist or an astronomer, got all the math etc. including General Relativity (it's a real bitch using a slide rule), then discovered computers. Computers turned out to be both more interesting and far more lucrative than academia so that was my path, but I remained interested in cosmology.

      "Zwicky's tired light is not in any way a crackpot idea. However, he certainly recognized that it has very few observational legs to stand on."

      Actually it had no observational legs to stand on other than the fact that the light from distant objects appeared reddened. But then again, neither did the notion that the universe was expanding. The great debate about the nature of the universe was whether Sir Fred Hoyle's steady state theory was correct, as opposed to the expanding universe of Fr. Georges Lemaître. The debate was decided in favor of an expanding universe on theoretical grounds - that the interstellar medium was a true vacuum and thus, according to Einstein, there was no mechanism to retard the progress of light.

      Of course that universe, the universe of the 1930s, was a universe long ago and far away. Its size had recently greatly increased with the discovery that the spiral nebulae were actually galaxies composed of individual stars. All those dimmer lights that we now recognize as galaxies had been thought to be gaseous nebulae within the Milky Way. There were far fewer known subatomic particles; the neutron was discovered only in 1932. Electromagnetism was strictly local, its effects confined to the close proximity of stars and their planets. There were no interstellar or intergalactic magnetic fields with their entwined electric currents, no solar wind, no Higgs field, no cosmic rays, no zero point energy, no x-ray stars, no neutron stars, no black holes, no pulsars, no dark matter, no dark energy, no hula hoops. Pretty much a plain vanilla type of thing.

      Ain't progress wonderful. The pretty, nay elegant, Einstein-Lemaître-Gamow universe was disrupted by the inflation theory introduced by Guth et al. Advances in observations, and their extension beyond the visible-light portion of the spectrum, revealed all kinds of unexpected surprises. These have been explained using advanced techniques in ad hockery - arbitrary parameters, new (undetectable) entities - to the point that modern cosmology looks like a collaboration between M.C. Escher and Rube Goldberg.

      And there is still no observational evidence for expansion other than the redshift, which could potentially be caused by other mechanisms. One of those mechanisms, tired light, is in evidence all around us - in the color of the sky, a rainbow, a prism, a lens. The speed of light is not a constant; it varies depending on the medium it traverses. The amount of reduction in its speed depends on the wavelength. We can measure the effect and assign a number to it, called the Index of Refraction, which is unity for an absolute vacuum.

      It is extremely difficult to measure the speed of light. The first really accurate number came from observing the occultations of the moons of Jupiter. Any practical measurement requires an occultation or sudden onset (e.g., a pulsar) of the source. The distance and energy must be great enough for the dispersion to be measurable within the precision of our instruments. Pulsars fit these requirements. The observations show that the photons travel at different speeds, necessarily less than c, depending on their wavelength. The amount of dispersion depends on the distance traveled. The close correlation between the dispersion and the parallax-determined distances to the pulsars indicates that the effect is homogeneous and isotropic. The Index of Refraction of the medium is greater than unity.

      Delete
  19. Great article! I wonder why you mention Perlmutter and Riess four times and the authors of the new paper not once. Nobel bias, or do you think the previous understanding will stand?

    ReplyDelete
    Replies
    1. Because I have heard people say "Perlmutter and Riess" often enough to know how to pronounce their names ;) More seriously, the paper has 4 authors, these appear clearly visible in the video multiple times, and I have added a reference, so I considered it to be unnecessary to read them off. I don't know what information viewers would have gained from me reading them off.

      Delete
  20. Saul Perlmutter: "Science isn't a matter of trying to prove something – it is a matter of trying to figure out how you are wrong and trying to find your mistakes." (July 2013, The Guardian).

    ReplyDelete
    Replies
    1. Gary,

      Feynman told our QM class (I'm paraphrasing of course, but I think this is a fair summary):

      >"You guys are gonna make mistakes because you're human beings. The important thing is to catch your mistakes as soon as possible, fix them, and move on."

      I've always thought the 2011 "super-luminal neutrino" mistake by the OPERA group was a good example of this: they did not over-hype their apparent result, they asked the community to help find any errors, and they corrected it publicly when they found the source of the error.

      Einstein, by the way, made (and published) several errors before getting to the final version of General Relativity, ranging from initially calculating the wrong value for the bending of light by the sun (off by a factor of 2) to the matter we have discussed here of setting the Ricci tensor proportional to the stress-energy tensor (which is mathematically inconsistent).

      In the process of trying to understand General Relativity, I myself managed to make the same mistakes, of which I am perversely proud: I may not be as smart as Einstein, but at least I can replicate his mistakes!

      Whoever turns out to be right on this issue of the accelerating expansion of the universe (or lack thereof), let's remember that the open airing of possible mistakes is ultimately the true strength of science. Scientists as individuals are far from infallible, but when the scientific community functions as it should, it corrects the inevitable errors of individual scientists.

      Delete
    2. Thanks for the perspective ! It is insightful to read Feynman: "Incidentally, I have done this problem four times--making a mistake every time--but I have at last, got it right."(page 55, Tips On Physics).

      Delete
  21. As the third author of this paper, I would love to see these results tested and reproduced by other groups, and more importantly, the hypothesis tested on a bigger dataset. Adam Riess, in a recent press interview (https://physicsworld.com/a/dark-energy-debate-reignited-by-controversial-analysis-of-supernovae-data/), claimed to have tested this on a larger sample of 1300 SNe and ruled the dipole out. This sample of 1300 SNe is not public. An intermediate sample (Pantheon) came with a lot of problems and discrepancies, highlighted in: https://arxiv.org/abs/1905.00221v1

    I present a full picture of the history of SNe and what I think is happening, at :

    https://4gravitons.com/2019/11/15/guest-post-on-the-real-inhomogeneous-universe-and-the-weirdness-of-dark-energy/

    ReplyDelete
  22. "How did this gain enough traction, when it was published, to win a Nobel and have thousands of physicists suddenly agree that the Universe was composed of 68% mysterious "dark energy"?"

    Just because it was the missing piece of the puzzle:
    https://tritonstation.com/2019/01/28/a-personal-recollection-of-how-we-learned-to-stop-worrying-and-love-the-lambda/

    ReplyDelete
    Replies
    1. Yves: Thanks for the link from an expert. That paints a depressing picture, for the last 30-40 years, of cosmology as a dogmatic religion filled with confirmation bias due to absolute certainties. The refusal to consider alternatives, a celebrity scientist pressuring researchers to produce results consistent with his own preconception, Nobel prizes awarded or not depending on how well they fit the current dogma, regardless of the statistical significance or sloppiness of the methodology. The presumption the CMB was uniform because the first experiment to look for it was of such low resolution it blurred into uniformity. Yikes.

      Reading Dr. Hossenfelder, I am sensing much of the same dogmatism in fundamental physics. The same celebrity opinion privilege, the refusal to confront mistakes and bad predictions, and thus the inability to learn from them and make any progress.

      It's really a sickening story. It seems both fields have become mathematical religions instead of science, more intent on never admitting a mistake than they are in finding the truth by confronting their mistakes.

      What is it with humans and our penchant for cults of personality? Cultism is rampant in politics, religion, business and nearly every human endeavor. Science is supposed to overcome cultism by judging experiments and results with objectivity. Instead of being the last refuge from cultism, the one human endeavor immune to dogmatism and belief-without-evidence, it has become another breeding ground for it.

      Cults in the sciences might literally be the death of us all.

      Delete
    2. James Peebles wrote (1993): "But since this subject, dark mass, is still being explored, it is well to bear in mind the alternative that we are not using the right physics." (page 47, Principles of Physical Cosmology). Steven Weinberg wrote (2008): "The discovery of dark energy is of great importance, both in interpreting other observations and as a challenge to fundamental theory." (page 56, Cosmology).

      Delete
    3. Yves, Dr. A.M. Castaldo, Gary Alan: Stacy McGaugh is a thoughtful astronomer. But his "personal recollection" is just that. For a better perspective, shouldn't we turn to a historian of science?

      And one-liners, even by august scientists, are surely no substitute for level-headed analyses?

      Dr. A.M. Castaldo wrote: "for the last 30-40 years, of cosmology as a dogmatic religion filled with confirmation bias due to absolute certainties." I'm puzzled, how did you draw that conclusion from McGaugh's personal recollection?

      FYI, there are many, many papers - published in relevant peer-reviewed journals - presenting observations and analyses that are the antithesis of dogmatism or confirmation bias.

      Delete
    4. For me, the "alarm bells" started ringing over dark energy when people started claiming that the modern expansionary term was the return of Einstein's old Lambda, apparently in order to try to fabricate some sort of air of historical precedent and legitimacy for the new thing.

      But Einstein's Lambda and the modern Lambda are two very different scientific animals:
      Einstein's was motivated purely by theory and aesthetics - the idea that //obviously// the universe had to be static, and Lambda was therefore a theoretically derived value required to exactly compensate for the otherwise-expected gravitational collapse.
      The modern Lambda has no theoretically derived value or motivation, it's something we made up to explain the difference between theory and experimental observation.

      One was all theory and no observation, the other is all observation and no theory. When people started writing articles saying that "Einstein wasn't so wrong after all!", I felt that this was either profoundly ignorant or deliberately disingenuous, and I suspected the latter.

      Delete
    5. JeanTate: I'm puzzled, how did you draw that conclusion from McGaugh's personal recollection?

      Right under the LOTR poster "I was there, Gandalf..." is the paragraph: (verbatim, including misspelling of "fervor"); speaking about the 1980's, 30-40 years ago.

      It is hard to overstate the ferver with which the SCDM paradigm was believed. Inflation required that the mass density be exactly one; Ωm < 1 was inconceivable. For an Einstein-de Sitter universe to be old enough to contain the oldest stars, the Hubble constant had to be the lower of the two (50 or 100) commonly discussed at that time. That meant that H0 > 50 was Right Out. We didn’t even discuss Λ. Λ was Unmentionable. Unclean.

      If that is a remotely accurate account, it smacks of religion, not science. It is another example of Dr. Hossenfelder's thesis in Lost In Math: Prior to that we have the quote:

      Inflation gave a mechanism that drove the mass density to exactly one (elegant!), and CDM gave us hope for enough mass to get to that value. Together, they gave us the Standard CDM (SCDM) paradigm with Ωm = 1.000 and H0 = 50 km/s/Mpc.

      In other words, I think it is fair to infer it was the elegance of the models that convinced them of their truth and made it dogmatic, not the scientific merit or observational concordance. Anything else was derided, disdained, and not suitable for polite conversation. They did not want to even discuss it.

      As for a science historian, I doubt I'd get the unfiltered flavor of what it was like to live through this period, and why shouldn't I consider Stacy McGaugh's writing an historical account? Clearly it was intended as such.

      Delete
    6. Dr. AMC,

      I took GR from Kip Thorne back in the '75-'76 academic year. Kip did not much like the cosmological constant -- after all, all of the existing evidence then was that it was consistent with zero. But there was no single cosmological model that was foisted on us.

      And I was at Stanford when Guth worked out his stuff on inflation -- I went to one of the first talks he gave on it. The idea was certainly provocative and seemed to answer some questions, such as the horizon problem, that were mysteries.

      But I do think McGaugh may be exaggerating a bit. There is a big difference between "Nobody had a better idea than inflation" and "Everyone had to agree that inflation had to be true."

      Personally, whenever I've talked with laymen about the subject for the last forty years, I've said that inflation seems to be the leading idea among theoretical cosmologists of the early universe and -- who knows? -- it might even turn out to be true.

      Delete
    7. As much as I respect Stacy's work in his own field (which is not cosmology), I think that even he would admit that the post you are referring to is more of an op-ed than a genuine historical assessment.

      Delete
  23. We don't know how dark matter is distributed in our own galaxy. This uncertainty is part of the problem.

    ReplyDelete
    Replies
    1. Talking about something fictive as if it were real and well-known is misleading.

      "Dark matter" is a concept, not a fact.

      Delete
    2. Dark matter is a hypothesis that explains observations very well.

      Delete
    3. The dark matter hypothesis should at least be plausible to some degree. But to explain the flatness of the rotation curves, these assumed particles need a spatial distribution following a 1/r^2 law. Does anyone have an idea why the distribution should be this way?

      The only particles that naturally have such a distribution are neutrinos and photons, because they are continuously created inside a galaxy and spread out in all directions. But they carry little mass (in motion). This could point to a new understanding of gravity, i.e. of the role of mass in it. But this direction is presently not under investigation.
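      A minimal numerical sketch of the claim in this thread that a density profile falling as 1/r^2 produces flat rotation curves. All values and units below are arbitrary and chosen purely for illustration; this is not taken from any of the papers under discussion.

      ```python
      # Sketch: a halo with density rho(r) = rho0*(r0/r)^2 has enclosed mass
      # M(<r) = 4*pi*rho0*r0^2*r, so v(r) = sqrt(G*M(<r)/r) is independent of r.
      # Units are arbitrary (G = rho0 = r0 = 1), for illustration only.
      import math

      G, rho0, r0 = 1.0, 1.0, 1.0

      def mass_enclosed(r, n=10000):
          # M(<r) = integral_0^r 4*pi*r'^2 * rho(r') dr', midpoint rule
          dr = r / n
          return sum(4*math.pi*((i+0.5)*dr)**2 * rho0*(r0/((i+0.5)*dr))**2 * dr
                     for i in range(n))

      def v_circ(r):
          # circular velocity from the enclosed mass
          return math.sqrt(G * mass_enclosed(r) / r)

      # the rotation curve comes out flat: the same value at every radius
      print([round(v_circ(r), 3) for r in (5.0, 10.0, 20.0)])
      ```

      Analytically each value equals sqrt(4*pi*G*rho0)*r0, which is why the curve is flat; any steeper or shallower density law would make v(r) fall or rise with radius.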

      Delete
    4. @Sabine Hossenfelder: "Dark matter is a hypothesis that explains observations very well."
      -------
      Dark matter may indeed be a good “hypothesis” that explains certain observations, but the hypothesis is woefully lacking when it comes to explaining what dark matter actually is.

      Sabine, in combining the mystery of dark matter with the mystery of dark energy, can you name a LARGER PROBLEM in physics than that of not being able to directly detect, or measure, or even apprehend what approximately 95% of the universe is made of?

      I mean, isn’t it mind-blowing to think that it’s possible that the entire endeavor of our physical sciences can only access and address a mere 5% of that which we call “reality”?

      Delete

    5. I think that if space is part of the dynamics of the phenomenon to be described, then it cannot be used as a fixed frame of reference. The other problem I see is that space can grow by stretching or by addition, and the two have different consequences.

      Delete
    6. Hi Sabine!
      "... very well." is a bit subjective: If we're allowed to invent arbitrary unknown substances, that obey new laws of physics and are invisible, with an undefined number of free parameters that can be assigned any values we like (quantity, hot/cold/warm ...), in order to bring our predictions based on general relativity back into line with observation ...

      ... then yes, the hypothesis //certainly should// end up agreeing with observations!

      But the agreement shouldn't necessarily be considered to be particularly //significant//, because if they couldn't get the thing to curve-fit with the existing parameter-set, they could have kept inventing new parameters until it did (a la MOND).

      Delete
  24. S.H.: "Now, and this is the important point, this light from the supernova will stretch if space expands while the light travels from the supernova to us"

    Nobody has experimental proof of a "space expansion". So it's a pure hypothesis.

    There are a lot of alternative explanations:

    - shrinking of matter instead of growing of space
    - an increasing "time speed"
    - some other physical mechanism for the redshift of old, far-away emitted light, e.g. something like a Stokes shift
    - some as yet unknown mechanism of interaction between light and matter and/or fields in space
    ...

    "Space expansion" is as crazy as all the other possibilities. And crazier than some of them.

    So the concept of "space expansion" is a pure hypothesis and may mislead attempts at explanation - as "dark matter" may do.

    ReplyDelete
    Replies
    1. weristdas,

      What you say is incorrect and demonstrates that you have no knowledge of how research in this area works and that you are not familiar with the mathematics. Shrinking matter will not have the same effect and "time speed" isn't a thing. Please stop spreading misinformation.

      Delete
    2. You deny that "space expansion" is a pure, unprovable hypothesis?

      And you call the proposal of alternative hypotheses "spreading misinformation"?

      Gosh!

      Delete

    3. Dark matter is just a name; but it names very real and very localized effects, which are more real than the visible matter itself. For me, the key is in the supermassive black holes.

      Delete
    4. weristdas,

      Physicists mean a very concrete mathematical hypothesis when they speak of the expansion of the universe, and that hypothesis serves to explain observations very well.

      That you do not understand what they mean does not make it wrong, it means that you do not understand what they mean. You would be well advised to try to understand it before you make dismissive remarks.

      Delete
    5. So you are talking in a secret language within your community. But you overlook that the media, the politicians, and the public take it in a literal sense.

      So the fault is yours - not mine.

      And you ignore the influence of pictures on thinking, which would only be harmless if you were right about "space expansion".

      You pretend to be talking about math - but in fact you are not. So you are fooling yourself.

      Delete
    6. Luis, 6:18 AM, December 03, 2019

      That's nonsense, which shows that you have no idea of mind and psyche.

      There are no "just names". These "just names" are pictures in your mind that shape and guide your thinking.

      And the picture of "dark matter" leads in the wrong direction.

      Delete
    7. @weristdas: I have heard many times before from people who lack depth of knowledge that certain scientific results or theories are a matter of politics. This is particularly the case with climate change. When it comes to the foundations of physics and cosmology politicians could largely not give one rat's ass about these matters. In fact some here in USA oppose these things for not being "biblically correct."

      Scientists are not like magicians or shamans with some box of secrets. It is of course the case that understanding these things in some depth requires a measure of study and work. This is not about science as some revelation system, where science provides some simple set of statements that make cosmic truth evident to everyone. You have to actually work at it.

      Delete
    8. weristdas,

      All theories in physics are mathematical. You cannot understand these theories without understanding the mathematics. It should therefore be obvious that every verbal description is at best an analogy.

      There is nothing "secret" about our language. The internet is full of free online resources that will teach you the basics of General Relativity. What you are really complaining about is that this would take you effort. Sorry, but there is no way around it. You can either continue to document your frankly obvious lack of comprehension or you can go and make this effort. If you do not, please stop bothering us with your nonsense opinions, thank you.

      Delete
    9. @Lawrence Crowell: Well, you see, it used to be enough for politicians to justify something by relating it to Bible or to its vague interpretation. It is no longer so. Now politicians more and more appeal to "science has proved it" to push their agenda. "Scientism", when broad audience is listening to science writers and science journalists rather than to science (which is anyway impossible not just for the broad audience but for working scientists as well if the matter is out of their working domain), is and will be a political tool. No surprise that this discredits science.

      > When it comes to the foundations of physics and cosmology politicians could largely not give one rat's ass about these matters.

      Well, I would disagree. "Big" and "interesting" matters are literally breathtaking. I would say that they can stimulate feelings close to religious: all this astrophysical/cosmological stuff with stars-galaxies-light-years-big-bangs. And then there's this super question "and what was before the Big Bang?" and the answer is something like "the time probably didn't exist as we know it" -- just WOW! Mindblowing. :D
      And such is easy to exploit.

      Delete
    10. Weristdas, do not talk to me about the mind. A good idea is three or four neurons in a relationship, and a bad idea is any other neuronal relationship; luckily for fools and the intelligent alike, nature allows the existence of both. But the outer reality is only one. There is an orbital movement of the stars in a galaxy, and of galaxies in a cluster, that is not explained by the mass we see; that unknown thing is called dark matter. What is it? That's the problem.

      Delete
    11. Recently a congressman from TN called for the abolition of higher education because it is a hotbed for liberalism. In case you have not noticed, things are getting pretty loopy these days.

      Delete
    12. Lawrence Crowell: He's right, understanding things is so liberal.

      In the immortal words of Ned Flanders (Simpsons) leading a religious protest against science: There are things we don't want to know! Important things!

      Delete
  25. Sabine, you wrote "Now, Perlmutter and Riess did their analysis 20 years ago and they used a fairly small sample of about 110 supernovae."

    I just checked Perlmutter+ (1999) and Riess+ (1998), and they - independently - report using 42 and 16+34 SNe 1a, respectively.

    Where did you get the 110 from?

    ReplyDelete
    Replies
    1. Correspondence with one of the authors. I didn't check; it seemed reasonable.

      Delete
    2. I am trying to find a table of sn1 events. It is certain that in the last two decades far more than just 110 have been observed.

      Delete
    3. Thanks. I'm a bit surprised.

      Delete
    4. LC: "I am trying to find a table of sn1 events. It is certain that in the last two decades far more than just 110 have been observed."

      It is true that a great many more than 110 SNe 1a have been observed in the last two decades. Ditto in the period 1998/9 to 2011 (when the Nobel was awarded).

      CBAT produces lists of supernovae (you need an account to get the details):
      http://www.cbat.eps.harvard.edu/lists/Supernovae.html

      Colin+ (2019) - the paper this blogpost is about - references and discusses two SNe 1a catalogs, the "Joint Light-curve Analysis (JLA; Betoule et al. 2014)" and the "Pantheon catalogue (Scolnic et al. 2018)".

      Delete
    5. The Open Supernova Catalogue, https://sne.space/, has about 55,000 entries, although many recent entries are still un-classified. There are 12,000 entries since the first of this year, and 36,000 since 1/1/2010. About 12,500 entries in the total catalogue are classified type 1a, if my 2 mins with excel are correct. There are 8600 records with spectra and 43,000 with light curves. The Large Synoptic Survey Telescope, first light anticipated 2020, is supposed to discover 3M supernovae by 2030. Data apparently will not be the problem.

      Delete
    6. Bob Smith wrote: "Data apparently will not be the problem."

      It might be.

      The LSST will have no spectrometer (per current plans anyway). So which observatory (or -ies) which do have spectrometers (or spectrographs) will obtain estimates of the redshifts? Will they be able to handle ~90% (say) of the 3 million in that decade?

      Delete
    7. I think this pretty much puts an end to the objection over the 110 SNe 1a events of Perlmutter, Riess and Schmidt.

      Delete
  26. I'm not sure when or how I reached the conclusion that "dark energy" smacked of measurement error but I'll admit vindication is sweeeeeet.

    I will of course still be cursed with explaining how dark matter looks real-ish and is a different thing... We really should find the people who name these things and give them a serious talking to about namespace and the importance of avoiding collisions /sigh

    For the pretty people who can read and understand this stuff I have a question: They claim to have uncorrected the data by subtracting out the bulk-flow corrections. Are we sure that this subtraction was done using the same values as the original correction? How/why? I don't want to look too sternly at a paper that confirms my weird biases, but why didn't they find and use the raw data to start with?

    Suddenly I have strong fears that people are publishing papers without including the raw data... let us all now weep for the state of science as a whole that this practice is not considered a firing offense QQ QQ QQ

    ReplyDelete
    Replies
    1. Jonathan Starr: as the saying goes, great minds think alike ... or perhaps it's fools seldom differ? ;-)

      Within the last hour, I finished reading Colin+ (2019) in some detail, and as a result asked the two authors of that paper who have made themselves known here a question that is quite similar to yours ("why didn't they find and use the raw data to start with?"). Though my comment won't show up for some time yet (Sabine is, I hope, sleeping).

      Delete
  27. In discussions of dark energy one frequently sees this wording as from PhysicistDave:

    "By the way, the one exception is "dark energy" due to the cosmological constant."

    I thought the cosmological constant was basically just a fudge factor added by Einstein, then assumed to be zero for many years, then became non-zero in 1998. I see the usage above so often that I'm wondering is "CC" now a shorthand for a specific theory? How can you say a phenomenon is "due to" a fudge factor?

    ReplyDelete
    Replies
    1. Bob,

      No, the CC is not a "fudge factor." It is a perfectly reasonable part of the Einstein field equations. Since measurements could not show that it was non-zero for a long time, lots of physicists rather hoped that it was exactly zero (on the basis of the sort of "argument from beauty" that Sabine has harshly criticized).

      Bob asked:
      >I see the usage above so often that I'm wondering is "CC" now a shorthand for a specific theory?

      No, when we say that some phenomenon is due to the CC, that is just shorthand for saying that a non-zero CC term in the field equations implies that this phenomenon exists.

      As to what the "CC" term really represents physically... well, as Newton said, "Hypotheses non fingo." What really are the physical entities represented by E and B fields in Maxwell's equations? Or the Higgs field that provides mass to ordinary particles?

      Back in the nineteenth century, a number of physicists, including Maxwell, devoted a lot of effort to trying to find some comfy underlying classical-mechanical kind of way of representing the E and B fields as (very strange!) stresses in some sort of "luminiferous ether."

      It never really worked. And, most physicists today would say the attempts were misguided (although there are some people who still have hope!): the real things that are represented by E and B in Maxwell's equations just are whatever in the real world behaves in a way described by those equations.

      Same thing for gluon fields, the CC, the Higgs field, etc.: we have mathematical theories that predict how such things behave, and those predictions seem to track rather well with observations. What more do you want?

      A tiger is an animal that looks, sounds, and behaves like a tiger. What more do you want?

      To be sure, a deeper theory may someday be found in which something else causes the CC or the Higgs or gluon fields or tigers (e.g., for tigers, the theory of evolution). But a chain of explanation has to stop somewhere: and then the thing just is whatever behaves as our equations predict.

      I think this discussion may explain why some of our less scientifically literate friends around here are so angry at those of us who use math to get results from GR. They want more than the mathematical predictions and confirming observations: they want to know the real physical essence behind the math and the observations.

      To which I can only say, "I am a physicist, not a theologian. Hypotheses non fingo."

      Delete
    2. Actually, it did end up working.

      Newton himself was a proponent of some medium of transmission for the gravitational force, the gravitational ether, but he could not come up with a reasonable working hypothesis as to how it worked. Similarly with electromagnetism, where the medium of transmission was called the luminiferous ether.

      In hindsight we can see that the difficulty with framing the ether, of either kind, was to think of it mechanically. It turns out that the gravitational ether is spacetime itself, and the luminiferous ether is the EM field pervading it.

      Einstein himself said in 1920:

      "We may say that according to the general theory of relativity space is endowed with physical qualities; in this sense, therefore, there exists an Aether. According to the general theory of relativity space without Aether is unthinkable; for in such space there not only would be no propagation of light, but also no possibility of existence for standards of space and time (measuring-rods and clocks), nor therefore any space-time intervals in the physical sense. But this Aether may not be thought of as endowed with the quality characteristic of ponderable media, as consisting of parts which may be tracked through time. The idea of motion may not be applied to it."

      I think this idea of the medium of transmission was the major reason why Einstein felt QM was incomplete. He couldn't countenance the prediction in the EPR thought experiment that entanglement effects were instantaneous because that supposed no field of transmission (I'm supposing that a speed of transmission can't literally be infinite). According to Abraham Pais, it wasn't the fact that QM was indeterminate in some fashion that Einstein objected to, but this dismissal of locality. Of course when Einstein framed his thought-experiment it couldn't be carried out. It can now, and QM confirms the non-locality.

      Delete
    3. PhysicistDave,

      But don't you agree that Einstein's 1917 addition of the CC in order to satisfy certain intuitions (Sabine's Beauty?) about the static nature of the cosmos is the very definition of a fudge factor? Einstein sounds relieved when he writes to Weyl in 1923, "If there is no quasi-static world after all, then away with the cosmological term!" (Pais, 1982)

      Delete
    4. No, Mozibur. The nineteenth-century attempts did not work.

      You can take a quote of Einstein's out of context all you like.

      But the nineteenth-century attempts expected to find a rest frame for the luminiferous ether.

      Never happened.

      Delete
    5. Bob Smith asked me:
      >But don't you agree that Einstein's 1917 addition of the CC in order to satisfy certain intuitions (Sabine's Beauty?) about the static nature of the cosmos is the very definition of a fudge factor?

      No, not at all.

      "Fudge factor" is usually used in STEM subjects to refer to some alteration in a formula that we know is wrong, often because it disagrees with the basic laws, or at least something we add to correct for our ignorance of the actual physics of the system. You add the "fudge factor" to agree with the experimental data even though you really do not know what is going on.

      Eventually, you should be able to clean up the "fudge factor" if you come to really understand the physics of the system.

      That is not the case with the CC. It was always a possible term in the field equations and works perfectly nicely mathematically and physically. It is easy to understand (well, easier than the rest of GR!).

      As Einstein said in the paper in which he introduced the CC: "However, the system of equations [the Einstein field equations] allows a readily suggested extension which is compatible with the relativity postulate..."

      It is not clear, by the way, if he ever made the supposed "greatest blunder" comment. The source of the story was Gamow, who was a bit of a prankster. There is also a suspicion that Einstein might have been referring to a real mistake: his failure to see that the CC did not actually stabilize a static universe (a slight perturbation would destabilize it, as Friedmann pointed out).

      By the way, that paper also quotes from a letter by Einstein to de Sitter at about the same time:
      “The general theory of relativity allows the addition of the term λ gμν in the field equations. One day, our actual knowledge of the composition of the fixed-star sky, the apparent motions of fixed stars, and the position of spectral lines as a function of distance, will probably have come far enough for us to be able to decide empirically the question of whether or not λ vanishes. Conviction is a good mainspring, but a bad judge!"

      Despite his mixed feeling about the CC, I think that, in that letter to de Sitter, Einstein got it exactly right! Whether or not the CC vanishes is simply an empirical matter. We have the good luck to live in that future he projected when that question will soon be finally resolved.

      In case anyone is in doubt, yes, I do rather like the CC aesthetically. But, of course, my aesthetic appreciation is irrelevant to whether or not it actually vanishes empirically.

      Delete
    6. @PhysicistDave:

      Ahh... science. We're told it's about argument, discussion and debate. But instead you're simply indulging in assertion.
      Your 'no' isn't going to work given the history I've explained and justified above.

      How do you explain the quote I have given by Einstein? I think I'll take Einstein's intuition over yours any day...

      Delete
    7. The signal-to-noise ratio is becoming rather low here. All commenters should read all cosmology papers from 1917 to 1940 (there aren't that many), then read review articles on the last 100 years of the cosmological constant, then come back and ask intelligent questions.

      If the people you are criticizing have really screwed up as badly as you claim, why haven't you published a paper pointing it out? Blogs are good for discussion and spreading the word, but not for debate about nitty-gritty details.

      Delete
    8. Mozibur wrote to me:
      >Ahh... science. We're told its about argument, discussion and debate. But instead you're simply indulging in assertion.

      And exactly who "told" you that, Mozibur? Whoever it was lied.

      What you said would be true of philosophy or theology -- argue, discuss, debate, forever. "Word salad" without end.

      No, science is not like that: science is about evidence. Science is not about openness and tolerance to any kind of new idea whatsoever, if that new idea disagrees with the evidence or if the new idea is so imprecise that there is no hope of ever comparing it to evidence.

      On the contrary, science is about eviscerating, demolishing, destroying theories that do not agree with the evidence.

      Think of science not as a glorious meadow of contrasting, blooming theories that are never cut down.

      Think of science rather as the graveyard of theories that were killed by evidence.

      Frankly, I think that is the real reason that most human beings are not interested in engaging in the actual practice of science: at its best, science consists of the ruthless, brutal, unfeeling destruction of theories that failed to agree with the evidence. Most human beings do not like having their cherished beliefs demolished, and they tend to think that people who do so are not very nice people.

      For the record: scientists, like anyone else, can be gentle, kind people or nasty, unpleasant people. But, to be a serious scientist, you cannot be gentle towards ideas which are inconsistent with the evidence. And most people just do not like that.

      Now, speaking of your quote from Einstein, you were replying to my point that:
      "Back in the nineteenth century, a number of physicists, including Maxwell, devoted a lot of effort to trying to find some comfy underlying classical-mechanical kind of way of representing the E and B fields as (very strange!) stresses in some sort of 'luminiferous ether'."

      You replied with a twentieth-century quote from Einstein that, yes, used the word "aether" but then went on to say, "But this Aether may not be thought of as endowed with the quality characteristic of ponderable media, as consisting of parts which may be tracked through time. The idea of motion may not be applied to it."

      I.e., he was contrasting his own idea of "aether" with the nineteenth-century idea, to which I alluded, which did indeed consist of "parts which may be tracked through time." Einstein was rebutting that view.

      Early in the twentieth century, E. T. Whittaker wrote a lengthy book which went into some detail on these bizarre nineteenth-century attempts to explain electromagnetism by a classical-mechanical model. For example, he says of Maxwell's own attempts:
      "When it is desired that two wheels should revolve in the same sense, an idle wheel is inserted between them so as to be in gear with both. The model of the electromagnetic field to which Maxwell arrived by the introduction of this device greatly resembles that proposed by Bernoulli in 1736. He supposed a layer of particles, acting as idle wheels, to be interposed between each vortex and the next..."

      This sounds quite bizarre to most of us today, of course, which was my point.

      Here is a brief summary of this strange Victorian obsession with trying to find a mechanical explanation of electromagnetism.

      Some of the non-scientists here seem to have a similar obsession with knowing what the cosmological constant really is or with rejecting General Relativity because it is a mathematical theory.

      They are free to speculate as they would like. I, and most physicists, doubt that such speculations will prove fruitful, but let us see what precise results they can get and test them against evidence.

      Delete
  28. Thanks for that amazing update, I will be very "tuned" to finding out if other groups confirm the data in that paper.

    ReplyDelete
  29. This comment has been removed by the author.

    ReplyDelete
  30. This comment has been removed by the author.

    ReplyDelete
    Replies
    1. The universe can be flat without dark energy. The FLRW metric for Λ = 0 is rather simple, in fact Minkowski-like. The biggest difference is that the redshift of distant objects would be markedly less. The CMB is far more redshifted, and the CMB radiation bears signatures of distant gravitational fields stretched out by the gravitational frame dragging of dark energy. This is one reason the CMB surface of last scattering is 46 billion light years away and not just 13.8 billion light years away. The frame dragging of coordinates by dark-energy gravitation has a marked influence on objects that are more distant and have higher z = v/c. A galaxy with z = 8 is being frame dragged at that multiple of the speed of light. The CMB is at z = 1100. This is an effect of general relativity, and there is no local violation of the causal limitation of light speed. By the same measure an observer can fall into a black hole, cross the horizon, and be frame dragged with v > c relative to the outside.
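      The "46 billion light years" figure quoted above can be checked with a short numerical sketch: the comoving distance to the last-scattering surface in a flat FLRW model, D_C = (c/H0) ∫ dz/E(z). The parameter values below are approximate Planck-like numbers chosen for illustration, not taken from the comment or from any paper discussed here.

      ```python
      # Sketch: comoving distance to the CMB (z ~ 1100) in flat LambdaCDM.
      # Illustrative parameter values only (roughly Planck-like).
      import math

      c = 299792.458                       # speed of light, km/s
      H0 = 67.4                            # Hubble constant, km/s/Mpc
      Om, OL, Orad = 0.315, 0.685, 9.2e-5  # matter, Lambda, radiation fractions

      def E(z):
          # dimensionless Hubble rate H(z)/H0 for a flat universe
          return math.sqrt(Orad*(1+z)**4 + Om*(1+z)**3 + OL)

      # D_C = (c/H0) * integral_0^z dz'/E(z'), trapezoidal rule
      z_cmb, n = 1100.0, 100000
      dz = z_cmb / n
      f = [1.0/E(i*dz) for i in range(n+1)]
      integral = dz*(0.5*f[0] + sum(f[1:-1]) + 0.5*f[-1])

      D_mpc = (c/H0)*integral              # comoving distance in Mpc
      D_gly = D_mpc*3.2616e6/1e9           # Mpc -> billions of light years
      print(round(D_gly, 1))               # roughly 45-46, the oft-quoted ~46 Gly
      ```

      With Λ = 0 (and Om rescaled to keep the model flat) the same integral comes out substantially smaller, which is the point of the comparison above.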

      Delete
    2. This comment has been removed by the author.

      Delete
    3. Contrary to what Gene Kranz said about Apollo 13, when it comes to theory failure is always a possible option. However, with respect to this there is no failure of general relativity.

      Delete
    4. “Is there any possibility that general relativity fails?” None whatsoever.

      Delete
  31. This comment has been removed by the author.

    ReplyDelete
  32. Hi Sabine,
    Thank you very much for this enlightening post and discussion. I have a question regarding this paper: assuming that some still undisclosed mechanism could stretch photon wavelengths as a result of quantum interactions, so that the light from distant sources such as standard candles would, in general, reach us with higher redshifts than expected according to current particle theories (also assuming that this mechanism could describe other phenomena at least as accurately, etc.), would the anisotropy of cosmic acceleration claimed in this paper be more, less or equally compatible/incompatible with such a mechanism (as an alternative to dark energy)?
    Many thanks in advance!

    Replies
    1. There are several things to consider re this idea:
      - redshift is achromatic, so whatever the mechanism is, it must also be achromatic
      - whatever the mechanism is, it cannot produce too much distortion; distant objects do not appear "fuzzy" (this is a pretty strong constraint)
      - no one can say whether "the anisotropy of cosmic acceleration claimed in this paper be more, less or equally compatible/incompatible with such a mechanism (as an alternative to dark energy)" without knowing at least some details of this "undisclosed mechanism".

      Hope this helps.

    2. Thanks for your kind reply, JeanTate. I started to think about this cosmological side effect only very recently, so sure it helps. Of course, it is hard to assess whether this possible anisotropy can support or challenge such a mechanism without more details. It differs from tired light mechanisms based on scattering processes like Compton or Raman, which were ruled out long ago. The redshift caused by it would be linear with distance if the universe follows the cosmological principle and would not involve blurring in my opinion (but I could be wrong). It would be achromatic in that the shift in wavelength would be independent of photon wavelength and would not require a static universe, so it can be an additional effect to time dilation. And might or might not account for the accelerating expansion of the universe (or a significant part thereof) depending on its large-scale size, which remains to be quantified. So, I’d better sit down, calculate, and wait to see whether this anisotropy gets confirmed or refuted before pondering its impact.

      Best

  33. Too bad Paul Feyerabend is not around today. I think he would be enjoying this.

    Paul Feyerabend Interview (1993)

  34. I wouldn't mind if it turns out that the Nobel prize committee made a so-called "mistake". That's what science is, there are no forever-truths in science.

  35. Great, balanced article, Sabine, thanks so much. A great example of the "follow and weigh the evidence" approach that seems to be eroding in some areas. You are doing a great public service helping educated non-experts follow the very technical and sometimes muddled research going on in modern physics.

  36. Interestingly enough, looking back, one recalls that Hubble's 1929 paper announcing the discovery of the linear relationship between distance and redshift, what is now called Hubble's Law, was based on individual distance estimates for some 24 relatively bright galaxies (see pages 82-92, Observational Basis for Hubble's Law, in Principles of Physical Cosmology, Peebles).

    Replies
    1. Here is that 1929 Hubble paper:
      https://www.pnas.org/content/15/3/168

      Smoot+ (1992), "Structure in the COBE Differential Microwave Radiometer First-Year Maps", may seem largely irrelevant to the topic of this blogpost.
      https://ui.adsabs.harvard.edu/abs/1992ApJ...396L...1S/abstract

      Ditto Freedman+ (2001), "Final Results from the Hubble Space Telescope Key Project to Measure the Hubble Constant".
      https://ned.ipac.caltech.edu/level5/Sept01/Freedman/Freedman_contents.html

      Ditto Planck Consortium (2014) "Planck 2013 results. XVI. Cosmological parameters"
      https://www.aanda.org/articles/aa/full_html/2014/11/aa21591-13/aa21591-13.html

      What these - together with Riess+ (1998), Perlmutter+ (1999), and Colin+ (2019) - provide is a snapshot of the various statistical techniques used in cosmology (astronomy, etc). And how they have changed in the last ~90 years.

      Note that Smoot was one of two Nobel Prize winners in 2006, the landmark paper in this case is the one I cited. Neither Freedman nor anyone on the HST Key Project or in the Planck Consortium has won a Nobel, yet, for the work reported in these papers. And Hubble didn't either (you know why, don't you?).

      Clearly, Hubble did not use anything like a Bayesian Information Criterion! But which of the other papers I've cited did (hint: Colin+ 2019 did; others?).

      Did any of these papers use a σ>5.0 criterion (as used by LHC experimentalists, say)?

      And how important is it that a "correct" statistical method is used, to establish the Hubble relationship (redshift vs distance)? The value of H0? Estimates of the values of various cosmological parameters? What role does objective, independent verification (or validation) play?
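Since the Bayesian Information Criterion comes up here: a minimal, self-contained illustration of how it penalizes extra parameters. The dataset and both models below are made up purely for this sketch; under the usual Gaussian-error assumption, BIC = n·ln(RSS/n) + k·ln(n), and the model with the lower value is preferred:

```python
# Toy BIC comparison: constant vs. straight-line model on synthetic data
# that contains a genuine linear trend. Entirely illustrative.
import math

xs = list(range(20))
ys = [2.0 + 0.5 * x + 0.1 * math.sin(x) for x in xs]  # deterministic pseudo-noise
n = len(xs)

# Model 1: constant (k = 1 fitted parameter)
mean_y = sum(ys) / n
rss_const = sum((y - mean_y) ** 2 for y in ys)

# Model 2: straight line by ordinary least squares (k = 2 fitted parameters)
mean_x = sum(xs) / n
sxx = sum((x - mean_x) ** 2 for x in xs)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
slope = sxy / sxx
intercept = mean_y - slope * mean_x
rss_line = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))

def bic(rss, k):
    """Gaussian-error BIC up to an additive constant: n*ln(RSS/n) + k*ln(n)."""
    return n * math.log(rss / n) + k * math.log(n)

print("BIC constant:", bic(rss_const, 1))
print("BIC linear:  ", bic(rss_line, 2))  # lower BIC despite the extra parameter
```

The penalty term k·ln(n) is what lets BIC reject over-fitted models even when their raw residuals are smaller.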

  37. "But the details don’t matter all that much. To understand this new paper you only need to know that the authors had to go and reverse this correction to get the original data. And then they fitted the original data rather than using data that were, basically, assumed to converge to the cosmological average."

    Forgive the outsider's perspective here (I'm no physicist), but assuming the above paper is correct, should we be worried that any other discoveries based on this data could be suspect/incorrect?

  38. An analysis of the comments on this blogpost, so far (85 comments, see caveats below):
    - two of the four paper* authors have, between them, written three comments
    - two others have written comments which are directly about the paper*
    - there are two comments about an earlier paper (Nielsen+ 2016), one by one of its authors
    - no one, including Sabine, has said that they have read the two papers which were cited by the Nobel Committee (re the 2011 Physics Prize, see below)
    - there are quite a few comments about these two papers, including some which state explicitly that the commenter has not read either.

    For me this is depressing.

    I am a big fan of critically examining the work of astronomers, astrophysicists, and cosmologists.

    But I am also a big fan of actually reading that work; in particular, the published papers.

    I'll go further: in what is likely a near majority of the comments, I see the kind of herd behavior/mentality that some commenters decry, ranging from (paraphrases) "the concordance model must be wrong because of my pet idea", to "I always knew DE was nonsense", to "yeah, and dark matter doesn't exist either", and more.

    Here is what the Nobel Committee had to say about the 2011 Physics Prize, note that it was awarded to Perlmutter and Riess ... and Schmidt:
    https://www.nobelprize.org/uploads/2018/06/advanced-physicsprize2011.pdf

    What they refer to as "the two breakthrough papers" are:
    A.G. Riess et al., “Observational evidence from supernovae for an accelerating universe and a cosmological constant”, Astron. J., 116, 1009-1038, (1998)
    S. Perlmutter et al., “Measurement of Ω and Λ from 42 high-redshift supernovae”, Astrophys. J., 517, 565-586, (1999)

    Caveats:
    - my numbers are somewhat subjective, in particular some comments are ambiguous in key aspects
    - I have no way to determine whether commenters who claim to be the paper* authors are; I accept their statements at face value

    *the paper is Colin+ (2019)

    Replies
    1. My own reading of the paper by Riess et al. (1998) proves fascinating (as it is both lucid and detailed). Read: "Evidence of systematic problems also lurked in supernova photometry so that merely increasing the sample would not be adequate." (page 4) and "To investigate the consequence of sample selection effects, we used a Monte Carlo simulation." (page 22). It is probably the norm that research papers outside of one's own specialty are difficult to fully comprehend and the topic here is hardly different in that respect. The 2011 Nobel Lecture by Adam Riess is helpful to view. Yet, Nobel Prize winner James Peebles says with all honesty: “It is astounding to think that nature operates by rules that we can discover, by trial and error, of course.” (Princeton Alumni Weekly, Dec 2019, page 36).

    2. JeanTate: Okay, I looked at the nobelprize PDF. As I thought, I don't have the depth of knowledge about this field to critique it, so reading it was a waste of time.

      I am a frequent peer-reviewer in my own field. I am careful and err on the side of not approving BS. I check their math, I check their assumptions, I check their logic.

      Outside of my field, I rely on expert peer-review. And I assumed there must be an even higher standard of scrutiny than just 3 randomly chosen peers, because I know for a fact that sometimes authors get lucky and get three peer-reviewers that aren't thorough or just all overlook a subtle mistake.

      I am relying on the peer review of the new paper, and on another expert I am confident is not a sensationalist, Dr. Hossenfelder, who has interpreted their results.

      The researchers have found an obvious flaw in the original paper which invalidates the extraordinary, universe-changing conclusion. That earlier conclusion alone should have been a red flag that perhaps they missed something and the scrutiny should be increased; or perhaps the Nobel should be delayed until more SN could be observed, but neither of those happened.

      That is my complaint. If we can't trust peer-review to filter out 98% of the idiocy, then all of science becomes competing opinions based on flawed data, mis-analysis, and pet belief systems.

      The most plausible explanation I currently see is that the original paper is based on too little data and an apparently fundamentally flawed analysis that ignored one of three parameters: where the SN are found.

      Which means peer-review failed spectacularly. That is what pisses me off, that the state of science in this field (and apparently in particle physics) is so infested with non-science drivel and anti-science authoritarian repressions that errors like this could stand for 20 years.

      That is what is depressing.

    3. The evidence for Dark Energy doesn't rely on a single paper (or pair of papers); it will take more than a single paper to refute it. It is hardly 'invalidated' yet.

    4. I said "And I assumed there must be an even higher standard of scrutiny ..."

      I meant that For a Nobel Prize I assumed there must be an even higher standard of scrutiny.

    5. Scott: If the original Nobel winning paper had been correctly analyzed, and found no indication of Dark Energy, and had not won a Nobel, what would have been the outcome of that alternate history?

      Would we even be talking about Dark Energy if it was completely unsupported by SN observations? Which is the apparent case now with the flaw in that paper exposed and a more comprehensive analysis undertaken.

      We might be talking about some anomalies, we might be talking about vacuum energy, but we wouldn't be talking about Dark Energy and an accelerating expansion of the universe.

    6. Dark Energy was a requirement of observations years before the SN observations, so yes, we would be talking about it.

    7. Jean Tate,

      Well, for the record I read (and re-read) the Riess paper contemporaneously with its release. At the time, what I found most striking was the grandiose claim prominent in the Introduction (emphasis added):

      This paper reports observations of 10 new high-redshift type Ia supernovae (SNe Ia) and the values of the cosmological parameters derived from them. Together with the four high-redshift supernovae previously reported by our High-Z Supernova Search Team (Schmidt et al. 1998; Garnavich et al. 1998) and two others (Riess et al. 1998a), the sample of 16 is now large enough to yield interesting cosmological results of high statistical significance. Confidence in these results depends not on increasing the sample size but on improving our understanding of systematic uncertainties.

      So the authors assert, without any attempted justification, that on the basis of a mere 16 data points they can make sweeping generalizations about the nature of the cosmos. As if that weren't breathtaking enough, it turns out that those aren't even 16 robust data points:

      The available set of high-redshift SNe includes nine well-observed SNe Ia, six sparsely observed SNe Ia, and SN 1997ck (z = 0.97) whose light curve was well observed but lacks spectroscopic classification and color measurements.


      Dr Castaldo had the statistical argument correct with regard to the erroneous 110 data point assumption. This pompous claim that the sample size is adequate for the derivation of cosmological implications of "universal" significance is ludicrous on its face. It is yet another example of the self-delusional proclivities of the mathematicist mind-set.

      And that's just for starters. While there is an extensive and necessary discussion of the data acquisition methodology and an extensive discussion of the theoretical implications, there is a remarkable paucity of discussion of just what it is that has been observed. It is possible to tease out, with close reading, that the central observation is of a discrepancy between the redshift and luminosity distances. Nowhere in the paper, though, is extensive consideration given to the range of possible explanations for this discrepancy. There is no evidence, for instance, of the reliability of the theoretical expectations being given any consideration.

      Instead the data is simply crammed into the existing model's framework, with all its free parameters (including the available, but unused at the time, cosmological constant). The result, we are triumphantly told, is that the "universe" is undergoing a previously unpredicted, accelerated expansion.

      This is, of course, just model fitting. The model manifestly failed to predict the claimed observations, but is constructed such that it can simply be adjusted to fit any unexpected new data. Now that the SN Ia data are being called into question, the model can be reconfigured again to fit, albeit with some residual grumbling from those who've made a career touting Dark Energy.

      The upshot of all this is that the standard model of cosmology is a scientifically useless mathematical toy. It has nothing meaningful to say about the nature of the cosmos we actually observe.

      It's long past time to scrap ΛCDM, its invisible free parameters, and its empirically baseless foundational assumptions, and start over. There is a vastly greater abundance of observational data today, than was available a century ago when the basic structure of ΛCDM was concocted.

      The institutional impediments to such a wholly justified scientific undertaking are, unfortunately, quite substantial. The guild-like structure of the modern theoretical physics community is simply not amenable to the occasional, but absolutely necessary, upheavals that attend scientific progress. This institutional inertia is destroying the good name of science.

    8. Dr. A.M. Castaldo wrote "If the original Nobel winning paper had been correctly analyzed, and found no indication of Dark Energy, and had not won a Nobel, what would have been the outcome of that alternate history?"

      First, the Nobel Committee cited two "landmark" papers (not one). They report results of two independent sets of observations and analyses.

      Second, while alternative history is (or can be) certainly fun, it isn't exactly, um, objective is it?

      Whatever ... whatever its name might be, something like Dark Energy can be seen in the various analyses of the CMB. Note: there are two global, independent missions which have reported consistent values of the "six parameter" ΛCDM model; in addition there are well over 40 other, independent projects/missions/etc which have reported consistent estimated values of the same six parameters (not all "just" the CMB).

      So yes, as others have already noted, we very likely would still be talking about something like Dark Energy (cosmological constant, Λ, whatever), though its name may be different.

    9. JeanTate,

      I myself have not read the papers simply because, as a theorist, I am pretty sure I lack the expertise to judge the details of the debate.

      In particular, I am pretty sure that you are better than I at that, and so I am following your comments.

      I am trying to restrain myself to simply replying to those who post comments showing a lack of understanding of how the CC fits into GR, which is something I do know about.

      My uninformed "gut feeling," by the way, is that the new paper will not pan out just because most new papers that go against a well-established scientific consensus do not pan out. But of course if I were rigid about that, I would have rejected most ground-breaking discoveries.

      So, I'll wait for the evaluations of those who know more than I.

      Thanks for all your informative comments.

      All the best,

      Dave

    10. JeanTate: Second, while alternative history is (or can be) certainly fun, it isn't exactly, um, objective is it?

      Consideration of an alternative history is one of the major ways we learn anything new; we say "what would have happened if I'd done X instead of Y?"

      And fine, you're right, the Nobel Committee considered two papers and apparently got them both wrong, which isn't surprising if they don't scrutinize them and accept wild claims on the basis of as little as ten data points.

      The Nobel Committee can award prizes after 30 years of waiting; if the Nobel Committee had waited for enough real data and thorough analysis to have statistical rigor, science wouldn't have wasted 20 years on an accelerating expansion of the universe.

      Talking about a CC is fine, but if the new Paper debunking Dark Energy has legs, then that talk has to reconcile with a lack of Dark Energy in the supernova data.

      That in turn would change the approach toward it, in ways that cannot produce accelerating expansion.

    11. I've been reading through the comments and have to say I'm somewhat confused. Is this new paper asserting that accelerated expansion is not supported by the supernovae data, but is not questioning the expansion of the Universe at just a linear or constant rate?

    12. I am a big fan of critically examining the work of astronomers, astrophysicists, and cosmologists.

      But I am also a big fan of actually reading that work; in particular, the published papers.

      I'll go further: in what is likely a near majority of the comments, I see the kind of herd behavior/mentality that some commenters decry, ranging from (paraphrases) "the concordance model must be wrong because of my pet idea", to "I always knew DE was nonsense", to "yeah, and dark matter doesn't exist either", and more.


      My thoughts exactly. See my comment above (written after yours I'm quoting here) about the dropping signal-to-noise ratio here. :-|

    13. I myself have not read the papers simply because, as a theorist, I am pretty sure I lack the expertise to judge the details of the debate.

      In particular, I am pretty sure that you are better than I at that, and so I am following your comments."


      Sabine is not an observational astronomer.

    14. Dr. A.M. Castaldo wrote: "And fine, you're right, the Nobel Committee considered two papers and apparently got them both wrong, which isn't surprising if they don't scrutinize them and accept wild claims on the basis of as little as ten data points."

      Perhaps you missed this, in the Physics Nobel 2011 "Scientific Background" PDF:
      https://www.nobelprize.org/uploads/2018/06/advanced-physicsprize2011.pdf

      "Although not as evident at the time of the discovery, later studies of SNe beyond z = 1 [29], from the time when the Universe was much denser and ΩM-dominated, indicate that at that early epoch, gravity did slow down the expansion as predicted by cosmological models. Repulsion only set in when the Universe was about half its present age."

      And "Figure 3. A summary figure from Review of Particle Properties, http://rpp.lbl.gov, showing the combination of supernova observations (SNe), the microwave background (CMB) and the spatial correlation between galaxies ("Baryon Acoustic Oscillations", BAO)." I think the link is dead; one that works:
      http://pdg.lbl.gov/2011/reviews/rpp2011-rev-cosmological-parameters.pdf

      Look for Figure 21.1: "Confidence level contours of 68.3%, 95.4% and 99.7% in the ΩΛ–Ωm plane from the CMB, BAOs and the Union SNe Ia set, as well as their combination (assuming w = −1). [Courtesy of Kowalski et al. [25]]" In fact, this whole 2011 Particle Data Group PDF is well worth reading (as you might guess, there are quite a few PDG PDFs; this one is directly relevant to your comments).
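The "about half its present age" statement in the quoted passage can be checked with the Friedmann equation: in flat ΛCDM, deceleration flips to acceleration where Ωm(1+z)³ = 2ΩΛ. A minimal sketch, assuming illustrative parameters (Ωm = 0.3, ΩΛ = 0.7, H0 = 70 km/s/Mpc) rather than the fitted values from the papers:

```python
import math

H0 = 70.0                        # km/s/Mpc (assumed)
OM, OL = 0.3, 0.7                # assumed flat-LCDM densities
GYR_PER_INV_H0 = 977.8 / H0      # 1/H0 expressed in Gyr

def age_gyr(a_max, steps=100_000):
    """t(a_max) = (1/H0) * integral_0^a_max da / sqrt(OM/a + OL*a^2)."""
    da = a_max / steps
    total = 0.0
    for i in range(1, steps + 1):    # integrand -> 0 as a -> 0, so start at i = 1
        a = i * da
        total += da / math.sqrt(OM / a + OL * a * a)
    return total * GYR_PER_INV_H0

# Acceleration sets in where OM*(1+z)^3 = 2*OL:
z_acc = (2.0 * OL / OM) ** (1.0 / 3.0) - 1.0
ratio = age_gyr(1.0 / (1.0 + z_acc)) / age_gyr(1.0)
print(f"acceleration begins at z ~ {z_acc:.2f}, at ~{ratio:.0%} of the present age")
```

With these numbers the onset comes out near z ≈ 0.67, when the universe was roughly half its present age, matching the Nobel Committee's statement.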

    15. @Phillip Helbig: thanks.

      I think PhysicistDave was referring to me, not Sabine (whose focus is, as you note, not primarily observational astronomy).

    16. Thanks PhysicistDave.

      "I myself have not read the papers simply because, as a theorist, I am pretty sure I lack the expertise to judge the details of the debate.

      In particular, I am pretty sure that you are better than I at that, and so I am following your comments.
      "

      I'll let you in on a "secret" (as someone earlier said) about observational astronomy (well, at least extra-galactic astronomy; and of course it's not in any way a secret): paper (and result) chains.

      These days - several decades old though YMMV - it usually begins with a survey or three. SDSS is a good model: a camera, a telescope, a mountain observatory. Which took years to build and get right ("A Grand and Bold Thing" by Ann Finkbeiner (2010), is an excellent book on what it took).

      Swarms of scientists - astrophysicists principally, but also physicists, cosmologists, and more - then write thousands (or more) of papers based on the public datasets (SDSS goes above and beyond to help anyone, scientist or not, access the data, extract subsets, etc, etc).

      Often these present new discoveries; often they confirm and extend known ones; sometimes they contradict well-established relationships (I am co-author on just such a paper).

      C19 (which I have incorrectly called CP19 in some comments :() is a bit different. It is further removed from the primary observations than makes me comfortable; specifically, it relies on a compilation of data that is at least one degree removed (actually two such, JLA and Pantheon, or three if you include the CMB). Further, as has been made abundantly clear (in C19 itself and in many comments here), C19 examines SNe Ia only, in relation to estimates of ΩΛ (principally, but not exclusively).

      So an obvious place to start scrutiny of C19, especially as it seems to have pretty far-reaching ramifications/consequences, is the extent to which the analyses presented go right back to the primary observations, vs relying on sources further "up the chain" (albeit with some, IMHO, mild/limited checking).

      Another easy piece of scrutiny: internal inconsistencies. At its simplest this is just glorified copy editing, but you'd likely be surprised at how many of these internal inconsistencies make it into the actual journal paper (no surprise that they're pretty common in arXiv versions). Very often these are dismissed as having ~no effect on the results presented, but for mold-breaking papers it pays to be extra scrupulous (I am credited, in the Acknowledgements section of an early Galaxy Zoo paper, for my "assiduous copy editing"; what I found was actually a pretty gross error in the draft text, which apparently none of the paper's authors had noticed at the time).

      One more (for now): statistics. This is not so easy if you do not have at least some familiarity with the tools and techniques used, especially as these have gone in and out of favor over time. But if - like Dr. A.M. Castaldo - you have considerable experience outside observational extragalactic astronomy, you may be able to quickly spot weaknesses (but unless you are unafraid of making huge boo-boos, it would pay to spend time trying to understand what, exactly, the techniques are being applied to and what they are aimed at/claim is their scope).

      tl;dr: read C19, even skim it. I think you'll be surprised at how much of it your GR/CC/etc expertise may be relevant! :)

  39. Thanks Sabine for the absolutely excellent article. If the results of this new paper hold up then, it seems to me, that at least two types of `dark matter' will be needed after getting rid of dark energy. The reason for this is because the background radiation shows that the observable universe is very nearly flat and ordinary matter and ordinary dark matter together make up about thirty percent of the energy budget necessary to make it `flat'. With dark energy gone something else must make up the remaining seventy percent to keep it `flat'. Offhand, the only candidate I can think of is some new kind of `dark matter' that has no rest mass and therefore always travels at the speed of light. That way it would be evenly spread around like the cosmic background radiation is -- although it might be more properly called a very dense dark radiation instead of a new `dark matter'.

  40. I personally don't believe in either dark matter or dark energy. Dark matter was invented to explain motion in a galaxy, with effectively two saucers of dark matter on the top and bottom of the galaxy. How did that dark matter get there and only there for each galaxy?

    Replies
    1. Tom,

      I have no idea what makes you think that galaxies have "effectively two saucers of dark matter on the top and bottom of the galaxy". This is most definitely not the case.

  41. This comment has been removed by the author.

  42. Truly fascinating discussion with comments that stretch my limited understanding of the math involved.
    Given this pointer toward reanalysis of dark energy, it makes me wonder whether the ways we calculate the expected velocity of planets and the shape of space affecting light to posit the existence of dark matter are completely correct. The evidence for dark matter rests primarily on observations of massive objects that extend their influence over large distances. How do we calculate the general relativity effects of diffuse objects spread over light years?

  43. What is the reason for ignoring Nobel Prize winner Brian Schmidt in the article? He was the leader of the High Z Supernova Search, and Adam Riess certainly would not have done this work if Schmidt had not started the project. I think that an apology is in order.

  44. For me as a layman, the concept of so-called dark energy / cosmological constant always seemed illogical and absurd. A universe that expands ever faster and faster is against common sense, and I'm sure that in some decades we're gonna laugh about this like we grin about phlogiston or epicycles today...

    Replies
    1. So, Uli Thomsen, if you find DE etc to be "illogical and absurd", and "against common sense", what must you think of wave/particle duality? Bell and "local realism"? Just about any aspect of QM (of the physically realized form)?

  45. Lawrence

    Does this mean that the DE issue will likely be resolved in say the next 10 to 20 years? That is assuming more and more data continue to be gathered?

    Austin Fearnley

    Replies
    1. It depends upon what you mean by the DE issue. That DE exists is almost beyond question. The issue is what is DE. It is some form of the quantum vacuum, but it is strange in that it is so very small. Somehow DE is a feature of the way that spacetime is "stitched together" by quantum entanglements of states, most likely quantum gravitation states. This is where the real questions lie IMO.

    2. Lawrence

      Thank you for your explanation. I did mean the issue: "exist or not", as in the blog title. (The previous blog was about "what is DE?".) There have been a lot of new posts about the existence or not, since your previous post, but it is not clear to me as a layman. If I read you correctly z=10 really defines DE (despite DE only becoming evident at z=5?). z=5 (8 billion years ago/away) is where accelerated expansion becomes evident (= begins?). z=0.5 is local and subject to local interference or local conditions.

      Doesn't one need all three z values to get a fuller picture? z=10 on its own, without more recent data, could have been merely a second period of inflation with acceleration stopping soon afterwards? And similarly, z=0.5 on its own could be unreliable as it is more like a snapshot which makes it harder to disentangle the complementary variables of global position and acceleration?

      While you mentioned causes of DE, I hope I am allowed to mention negative mass as a possible cause of both DM and DE [Farnes, 2018] as I cannot find it mentioned in the previous blog. The "runaway motion" associated with negative mass might have seemed ludicrous in the 1960s but it could be just what we need for DE.

      "Spacetime ... stitched together by quantum entanglements of states". This always reminds me of the use of paired Rasch analysis to make psychometric metrics. The metrics so made are most reliable where the data are most concentrated and least reliable where the data are sparse. So by analogy you get reliable metrics in galactic clusters and less reliable metrics in deep space? Can DM be used to make the metrics? I suppose 'yes' if they are gravitational quantum states.

      Austin Fearnley

  46. This comment has been removed by the author.

  47. Hi Sabine,

    Have you seen this guest post from 4 gravitons by Mohamed Rameez? Here's a piece of it:

    On the 2011 Nobel Prize in Physics:

    "The Fitting Problem in Cosmology" was written in 1987. In the context of this work and the significant theoretical difficulties involved in inferring fundamental physics from the real Universe, any claims of having measured a cosmological constant from directionally skewed, sparse samples of intrinsically scattered observations should have been taken with a grain of salt. By honouring this claim with a Nobel Prize, the Swedish Academy may have induced runaway prestige bias in favour of some of the least principled analyses in science, strengthening the confirmation bias that seems prevalent in cosmology.

    This has resulted in the generation of a large body of misleading literature, while normalizing the practice of ‘massaging’ scientific data. In her recent video about gravitational waves, Sabine Hossenfelder says “We should not hand out Nobel Prizes if we don’t know how the predictions were fitted to the data”. What about when the data was fitted (in 1998-1999) using a method that has been discredited in 1989 to a toy model that has been cautioned against in 1987, leading to a ‘discovery’ of profound significance to fundamental physics?

    Replies
    1. As have I.

      After I've commented further - mostly questions - on Colin+ (2019) (CP19 for short), upthread, I'll see if I can add a comment there. Top priority: make Mohamed (his handle here) aware that there are questions about CP19 here, questions he might like to drop by to address.

    2. "It is frustrating to think that physicists have lost the north and not find anyone who thinks the same. Apparently, the few who have not yet lost it were hidden in this blog, what a pleasant surprise!"

      Someone once said that the internet would be great, because it would provide everyone with essentially unlimited information to make an informed decision on any topic.

      Thirty years later, what has happened---especially in politics---is that there is so much information that people can spend their whole lives encountering no information except that which confirms their prejudices.

    3. This comment has been removed by the author.

  48. Getting rid of dark energy, or something similar to it, comes with a price: it creates an age problem between the oldest stars and the age of the cosmos. If it is assumed that there is a new type of dark radiation that makes up seventy percent of the energy budget necessary to keep the observable cosmos nearly flat, then the age of the cosmos is about 7 billion years (assuming a Hubble constant of about 70). If it is assumed that there is a new type of dark matter that non-gravitationally repels itself, so that it is evenly spread throughout space, and makes up the aforementioned seventy percent of the energy budget, then the age of the cosmos is about 9.3 billion years (again assuming a Hubble constant of about 70). But the oldest stars have ages around 13 billion years, and no star can be older than the cosmos, so either of those options creates an age problem. It could be postulated that there is a new type of dark matter that gravitationally repels itself, but its effect on the evolution of the cosmos should be very similar to that of dark energy. The IOPscience article "Implications of Symmetry and Pressure in Friedman Cosmology" examines that sort of thinking. The fact that there are stars around 13 billion years old is itself good evidence for the existence of dark energy or something similar to it.

    https://iopscience.iop.org/article/10.3847/1538-4357/ab32da/pdf

    https://en.wikipedia.org/wiki/List_of_oldest_stars
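
    The ages quoted in this comment are easy to cross-check numerically. Here is a short sketch (my own illustration, not part of the comment): for a spatially flat FRW universe the age is t0 = ∫ da / (a·H(a)) from a = 0 to 1, with H(a) = H0·sqrt(Σ Ωi·a^(−3(1+wi))), assuming a Hubble constant of 70 km/s/Mpc.

```python
# Age of a flat FRW universe for a mix of (Omega, w) components.
# Assumes H0 = 70 km/s/Mpc, as in the comment above.
import math

H0 = 70.0 / 977.79  # H0 in 1/Gyr (1 km/s/Mpc is about 1/977.79 Gyr^-1)

def age_gyr(components, n=100_000):
    """t0 = integral_0^1 da / (a H(a)), midpoint rule.

    `components` is a list of (Omega_i, w_i) pairs; the integrand
    vanishes as a -> 0 for these components, so the midpoint rule
    handles the lower limit safely.
    """
    def integrand(a):
        E2 = sum(om * a ** (-3.0 * (1.0 + w)) for om, w in components)
        return 1.0 / (a * H0 * math.sqrt(E2))
    da = 1.0 / n
    return sum(integrand((k + 0.5) * da) for k in range(n)) * da

# Evenly spread repulsive dark matter behaves like ordinary matter (w = 0):
print(age_gyr([(1.0, 0.0)]))              # ~9.3 Gyr
# 30% matter + 70% dark radiation (w = 1/3):
print(age_gyr([(0.3, 0.0), (0.7, 1/3)]))  # ~7.4 Gyr ("about 7")
# 30% matter + 70% cosmological constant (w = -1):
print(age_gyr([(0.3, 0.0), (0.7, -1.0)])) # ~13.5 Gyr, old enough for the stars
```

The matter-only and dark-radiation cases indeed come out well below the ~13-billion-year stellar ages, while the cosmological-constant case does not, which is the point of the comment.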

    Replies
    1. Louis Wilbur: The fact that there are stars around 13 billion years old is itself good evidence for the existence of dark energy or something similar to it.

      And what is it evidence for if the SN data shows no sign of Dark Energy?

      Probably a MOND-like problem, similar in that our models of gravity predict things exquisitely well on the local scale but not very well on the galactic scale.

      Perhaps the same phenomenon, or a similar phenomenon, causes an issue in the measurement of a star's age.

      That doesn't have to include an accelerating expansion of the universe, and doesn't have to include dark energy at all.

      After all, as Dr. Hossenfelder indicates, dark matter explains a lot, but we still don't know what it is, or even if any new "matter" is involved; as her earlier posts have explained, it may just be a lack of detail in our galactic models. Or it may be some form of MOND. Technically, I'd say it should be called "Dark Gravity", because there is apparently another source of gravitational force that we have not apprehended, be it from matter or something else (like poor modeling, or an actually different behavior of gravity like MOND, or by the limits of quantum gravity).

      I'm not making a proposal or pet theory here, but pointing out that as long as our theory of gravity is inconsistent with our observations, and appears to be distance-sensitive, then it is premature to claim we absolutely know how old very distant stars are.

      Redshift can also be caused by gravitational fields; we may be observing some aspect of gravitation instead.

    2. What does a theory of gravity have to do with determining a star's age? Nothing. That's one reason that dark energy won out as an explanation -- there are multiple lines of evidence for it, each independent of the others.

    3. Dr. Castaldo, if the new paper is correct then it necessarily follows that LCDM is wrong, but it does not follow that General Relativity is wrong. LCDM is one general relativistic model, but there can be lots of other general relativistic models that are compatible with the newer data. For example, there are general relativistic cosmological models that have a coasting period, such as the Lemaitre model. In fact, Sabine's post got me to start calculating how a cosmos obeying the cosmological principle will evolve if there is dark radiation, making up seventy percent of the energy budget, and that dark radiation itself has a negative pressure. However, the new paper that this blog post is about may not itself be right; there is already pushback to the idea that LCDM is wrong, see https://arxiv.org/abs/1912.02191.

  49. @Jesus

    "It is frustrating to think that physicists have lost the north and not find anyone who thinks the same"

    The corruption is in every field. The problem is in us. Most of the time we walk around running simple shell scripts. We do it because cognition costs calories and our genome didn't develop in a time of food abundance. On occasion one or two of us will grok a problem deeply enough to implement a minor improvement in current best practices. As the number of layers of abstraction increases, the type of brain that can hold its own at that bleeding edge gets rarer. But the colony remains and better practices diffuse, slowly at first, then all at once in a flash, and the colony changes. Let us hope that is what is happening here and now.

    It takes a while to understand S88 batch, but when you get it you wonder why no one told you directly that it's just two sets of Russian dolls, one for time and one for space...

    I believe in the $40 ISO standard cup of tea. For where would we be without ISO.

    I'm trying to cheer up Jesus on a world wide computer network while physicists argue in the background... we live in the future lol

  50. Nobody apparently noticed the difference between the log-likelihood method (known to produce large biases needing corrections in SN analyses) and the traditional chi-square method: see Table 1, which gives a larger isotropic acceleration!

    Replies
    1. I noticed.

      But I'm holding back writing comments here until I am confident that I will likely have something significant to write (barring typo errors). Maybe in an interval while I drink some [strike]chi[/strike] chai, which, strangely, is branded Sigma. ;-)

    2. The constrained chi-square method is so flawed that it was already pointed out as wrong back in 1989:

      https://books.google.dk/books?id=HsDxCAAAQBAJ&pg=PA511&lpg=PA511&dq=Gull+S.,+1989,+Maximum+Entropy+and+Bayesian+Methods,+511&source=bl&ots=XT4hAuQUlq&sig=ACfU3U0iw5sMbSRWGjyEwzfU4FyE-a21Sg&hl=en&sa=X&ved=2ahUKEwiR26_Y2OTlAhUILFAKHedGA84Q6AEwBXoECAcQAQ#v=onepage&q=Gull%20S.%2C%201989%2C%20Maximum%20Entropy%20and%20Bayesian%20Methods%2C%20511&f=false

      Also:

      https://arxiv.org/abs/1102.3237

      There are master's and PhD theses dedicated to how wrong it is:

      https://arxiv.org/abs/1508.07850

      https://arxiv.org/abs/1503.03844

      It is a highly arbitrary way to treat the intrinsic scatter of a population, especially when other sources of intrinsic scatter are being tuned by hand and 3-sigma outliers are being rejected. Our initial draft did not contain the constrained chi-square result. It was added only at the request of the referee.
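
      To make the distinction concrete, here is a toy sketch (my own illustration with invented numbers; it is not either paper's actual pipeline): fit the intrinsic scatter of a Gaussian sample two ways, once by forcing chi²/dof = 1 and once by maximizing a likelihood that keeps the ln σ² normalization term.

```python
# Toy comparison: "constrained chi-square" vs maximum likelihood for
# estimating an intrinsic scatter sigma_int on top of a known measurement
# error. All numbers here are invented for illustration.
import math
import random

random.seed(1)
SIGMA_MEAS, SIGMA_INT_TRUE = 0.10, 0.15  # assumed toy values
data = [random.gauss(0.0, math.hypot(SIGMA_MEAS, SIGMA_INT_TRUE))
        for _ in range(500)]
mean = sum(data) / len(data)
resid2 = sum((x - mean) ** 2 for x in data)

# (1) Constrained chi-square: inflate sigma_int until chi2/dof = 1, i.e.
#     sigma_tot^2 = resid2 / (N - 1); the scatter is imposed, not fitted.
s_tot2 = resid2 / (len(data) - 1)
sigma_int_cc = math.sqrt(max(s_tot2 - SIGMA_MEAS ** 2, 0.0))

# (2) Maximum likelihood: minimize -2 ln L = resid2/s^2 + N*ln(s^2) over a
#     grid of sigma_int values; this keeps the ln(sigma^2) normalization
#     term that the constrained chi-square drops.
sigma_int_mle, best_nll = 0.0, float("inf")
for k in range(1, 400):
    s_int = k / 1000.0
    s2 = SIGMA_MEAS ** 2 + s_int ** 2
    nll = resid2 / s2 + len(data) * math.log(s2)
    if nll < best_nll:
        sigma_int_mle, best_nll = s_int, nll

print(sigma_int_cc, sigma_int_mle)  # both land near the true 0.15
```

      In this symmetric toy case the two estimates nearly coincide; the criticism in the references above concerns realistic fits, where sigma_int is tuned jointly with cosmological parameters and outlier cuts, and forcing chi²/dof = 1 has no such guarantee and can bias the result.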

    3. From CP19: "Acknowledgements. We thank the JLA collaboration for making all their data public"

      I couldn't find a JLA webpage, but the authors of Betoule+ (2014) include: Perlmutter, Riess, Ellis, Fabbro, Filippenko, Goobar, Hogan, Hook, Jha, and Pain.

      Maybe the R98 and P99 teams (many of them anyway) are keen to have their "breakthrough" Nobel-Prize cited papers trashed (as some who have posted comments here clearly wish)?

    4. @JeanTate

      I've tried to go through this thread and answer as many questions as I can. Pardon me if I've missed any. Please feel free to post them on 4gravitons and I shall definitely answer.

      This is the closest there is to a JLA web page:
      http://supernovae.in2p3.fr/sdss_snls_jla/ReadMe.html

      Before JLA, the Union datasets came with the arbitrary treatment of the intrinsic scatter pre-added to the uncertainty budget, making independent statistically rigorous analysis impossible.

      After JLA, the Pantheon dataset has issues which I highlight in the 4gravitons post and 1905.00221.

      I do not wish to trash anyone's achievements. But given that the fitting problem in cosmology (Ellis and Stoeger) was written in 1987, I find the claims of the 1998-99 papers outlandish.

      I'd be glad to be proven wrong by future data. Even gladder to be proven right.

    5. Thanks Mohamed. If you search on "Mohamed", likely a capability you'll find in most browsers, you'll see that my substantive questions re CP19 are far upthread.

      re: "I do not wish to trash anyone's achievements." and "I'd be glad to be proven wrong by future data. Even gladder to be proven right.":

      There are quite a few comments here much in line with these sentiments (if I may call them that); for example (nowhere near complete) by Philipp Helbig, PhysicistDave, Gary Alan, and Scott. However, my subjective impression is that there are far more comments which view the tension (shall I say) as anything but collegial. A point I was making, indirectly, is that the objective behavior of the JLA consortium (which includes many of the authors of R98 and P99), per your Acknowledgement, is that we may infer that at least many of those authors (including all the principals?) would also be glad to be "proven wrong", and even gladder to be "proven right"! ;-)

  51. "Thus the cosmic acceleration deduced from supernovae may be an artefact of our being non-Copernican observers, rather than evidence for a dominant component of “dark energy” in the Universe."

    Even if true, how are all the other observations explained which indicate that there is a significant positive cosmological constant? It is relatively easy to find something which could have gone wrong with any test. The convincing argument for the cosmological constant, and the reason the current standard model is called the concordance model, is that many independent lines of evidence agree.

    Replies
    1. Just two basic questions from a non-expert. Are all these independent lines of evidence directly or indirectly based on the cosmological redshift (CMB, SN, or otherwise), or is there any other line of evidence? And do all of them interpret the redshift mostly as recessional velocity away from us? Many thanks!

  52. "There is for example the data from baryon acoustic oscillations and from the cosmic microwave background which are currently best fit by the presence of dark energy. But if the new paper is correct, then the current best-fit parameters for those other measurements no longer agree with those of the supernovae measurements."

    It makes sense to combine data only if there is no systematic error in one or more of the data sets.

    Yes, the CMB gives a combination of various parameters. But we can measure the Hubble constant independently of other cosmological tests, ditto for the matter density, so, even without using the supernova data at all, there is still a strong signal for dark energy.

    I'm not saying that the new paper is wrong, but rather that extraordinary claims require extraordinary evidence. In particular, one has to explain why all the other tests are wrong.

    Replies
    1. Phillip Helbig: YES, extraordinary claims require extraordinary evidence! Which is one reason to reject a paper claiming the universe contains 3x as much energy as heretofore expected, based on 10 data points, or 48, or 110. Especially in the face of the later paper, with over 1000 supporting data points.

      Which one is making the more extraordinary claim? The first paper 20 years ago, or the recent paper, which finds flaws in the first and thus no evidence of its extraordinary conclusion of accelerating expansion?

      Refuting an extraordinary claim is not in itself an extraordinary claim, it is only a claim that there is nothing extraordinary after all!

      If I write a matrix factorization and claim it is by a factor of 10 the fastest in the world (an extraordinary claim), but then you examine my code and prove I have a bug and my factorization doesn't work, then you are not making an extraordinary claim by refuting my extraordinary claim.

      The extraordinary claims of the first paper should have been motivation to pound every possible objection into sand. Motivation for the researchers, then the peer reviewers, and then by the Nobel Committee before awarding an extraordinary prize.

      All of them failed, because the result was so pretty they all fell victim to confirmation bias and proof by celebrity once they won the Nobel.

      Extraordinary claims require extraordinary evidence, and they clearly did not have it, twenty years ago.

      Refuting an extraordinary claim does not require extraordinary evidence; and also science does not get any better with age, having been part of canon for 20 years doesn't make it any more untouchable or deserving of respect.

      If anything, age allows new methods, data and analysis, unavailable to previous researchers, that can prove the flaws in that old science.

    2. "Which one is making the more extraordinary claim? The first paper 20 years ago, or the recent paper, which finds flaws in the first and thus no evidence of its extraordinary conclusion of accelerating expansion?"

      The case for dark energy never rested on a single paper from 20 years ago. What we have is 1000 papers from 20 years ago vs. one paper today.

    3. Scott: I wonder how many of those thousand cited (and therefore relied upon) the papers shown to be flawed.

      Besides, the question isn't whether the cosmological constant is non-zero, or whether there is vacuum energy. I believe the question is whether the expansion is accelerating and we are headed to a Big Rip, and whether 2/3 of all the energy in the universe is "dark energy".

      Might we be headed to a Big Crunch instead? Is the universe cyclic, which would also cast doubt on Inflation? (Which is already in doubt).

      By accepting these papers, have we indeed ruled out the true nature of the Universe and supported false narratives?

      It's like the old joke by Josh Billings:

      It Ain’t What You Don’t Know That Gets You Into Trouble. It’s What You Know for Sure That Just Ain’t So.

    4. Dr. A.M. Castaldo wrote "I wonder how many of those thousand cited (and therefore relied upon) the papers shown to be flawed."

      At a high level this is easy to answer; at a detailed level one would need to read a good sample (probably >100) of those papers.

      Some papers will not mention SNe Ia at all (except in the Intro, where it's pretty much mandatory), much less discuss them.

      Some will include references to some SNe Ia papers in the discussion, but will not include any SNe Ia data in the analyses.

      Some will explicitly say they have done joint analyses which include one or more SNe Ia datasets and/or papers.

      The results/findings presented will thus be independent of SNe Ia in all but the last kind of paper ("joint analysis").

      And to repeat: CP19's scope is SNe Ia, specifically those in the JLA and Pantheon datasets (at the time of writing); CP19 makes no comments on estimates of cosmology model parameter values derived from analyses of other datasets (such as CMB and BAO).

  53. Maybe the solution is simpler and the local redshift measurements are wrong.

    I note that the first author of the paper linked to above probably knows more about cosmology than everyone here combined. :-)

    Replies
    1. Very instructive paper anyway. Thanks a lot for the link, Phillip.

  54. Not obviously directly relevant to this blogpost (though "Accepted by ApJ"), also not as clean a read as CP19 (despite my pet peeves), Zhou&Li (2019) "Model-independent Estimations for the Cosmic Curvature from the Latest Strong Gravitational Lensing Systems" (https://arxiv.org/abs/1912.01828) suggests an approach or two that might bear closely on the central claims of CP19.

  55. This comment has been removed by the author.

  56. Worth noting that a preliminary result was published in 2016 by an overlapping group of authors. J.T. Nielsen, A. Guffanti and S. Sarkar, "Marginal evidence for cosmic acceleration from Type Ia supernovae" 6 Scientific Reports 35596 (October 21, 2016) (open access).

  57. Another 2015 paper points out an independent systematic issue with the dark energy analysis that led to the 2011 Nobel Prize awards, which tends to show that the amount of dark energy is, at a minimum, overestimated. It flows from the fact that there are two sub-types of Type Ia supernova, which are hard to distinguish in the visible light spectrum, but are clear at other wavelengths. Peter A. Milne, Ryan J. Foley, Peter J. Brown, Gautham Narayan, "The Changing Fractions of Type Ia Supernova NUV–Optical Subclasses with Redshift," The Astrophysical Journal, 2015, 803 (1): 20. DOI: 10.1088/0004-637X/803/1/20

