The Cosmic Microwave Background (CMB) is radiation we receive today from a time when the universe was about 300,000 years young. At that time, radiation decoupled from matter, and since then the photons have been able to travel almost undisturbed. The CMB shows the temperature, or equivalently the typical inverse wavelength, of the microwaves that we receive on Earth from these early times.
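To put the connection between temperature and wavelength into numbers, here is a tiny back-of-the-envelope check in Python (rounded textbook constants, nothing fitted to actual CMB data) of where a blackbody at the CMB's mean temperature actually peaks:

```python
# Back-of-the-envelope check: where does a ~2.7 K blackbody peak?
# (Rounded textbook constants, nothing fitted to actual CMB data.)
T_cmb = 2.725                 # mean CMB temperature in Kelvin
wien_b = 2.898e-3             # Wien displacement constant in m*K
lambda_peak = wien_b / T_cmb  # peak wavelength of the blackbody spectrum
nu_peak = 5.879e10 * T_cmb    # peak frequency in Hz (per-frequency form of Wien's law)

print(f"peak wavelength: {lambda_peak * 1e3:.2f} mm")  # about 1.06 mm
print(f"peak frequency:  {nu_peak / 1e9:.0f} GHz")     # about 160 GHz
```

A peak around a millimetre, or some 160 GHz, is why these photons fall squarely into the microwave band.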
The mean temperature of the CMB is approximately 2.7 Kelvin, and the spectrum is a blackbody to truly amazing accuracy. What we will be concerned with here, however, is not the mean temperature but tiny fluctuations around this temperature. These carry a lot of information about the conditions in the early universe, which can help us understand the origin of the structures that we see today and the processes that were important in the early universe. These fluctuations are of the order of micro-Kelvin, and have been measured by NASA's WMAP mission. You have probably all seen their skymap of the temperature fluctuations:
We have discussed the features and usefulness of the CMB temperature fluctuations a few times already, see e.g. my earlier posts The CMB Power Spectrum and Anomalous Alignments in the CMB.
One way to extract information from the data is to look at correlation functions. These come in integer orders like the two-point function, the three-point function, the four-point function etc. There also is a one-point function but ‒ assuming a homogeneous probability distribution ‒ you already know it: it's just the expectation value. In our case, it would be the mean temperature. The two-point function tells you something about the correlation length in the distribution.
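To make this hierarchy of correlation functions a bit more concrete, here is a minimal sketch in plain Python/numpy, assuming a toy one-dimensional field of temperature fluctuations (the array `delta_T`, its size and amplitude are made-up placeholders, not WMAP data, and the real analysis of course lives on the sphere). It estimates the one-point function (the mean) and the two-point function as a function of pixel separation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for measured temperature fluctuations delta_T = T - <T>,
# in micro-Kelvin, on a one-dimensional ring of pixels (the real data
# lives on the sphere; size and amplitude here are made up).
n_pix = 4096
delta_T = rng.normal(0.0, 30.0, n_pix)

# One-point function: just the expectation value (close to zero for delta_T,
# since the mean temperature has already been subtracted).
one_point = delta_T.mean()

# Two-point function C(d) = <delta_T(x) delta_T(x+d)>, estimated by averaging
# the product over all pixel pairs separated by d (periodic boundary).
def two_point(field, d):
    return np.mean(field * np.roll(field, d))

separations = np.arange(50)
C = np.array([two_point(delta_T, d) for d in separations])

print("one-point (mean):           ", one_point)
print("two-point at d=0 (variance):", C[0])
```

For the real sky maps the same logic is applied in harmonic space, which is where the familiar power spectrum comes from.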
The relevant quantity we are concerned with here is the three-point function. (Confusingly enough, its harmonic-space counterpart is also known as the bispectrum.) To compute it, you roughly take three different points of your distribution, multiply the values of the function at these points (here the temperature fluctuations), and integrate over combinations of three points. Even from this rough description you can notice two things. First, it involves several ugly integrals that are hard to compute, especially with loads of data. Second, multiplying small numbers makes even smaller numbers, so the result is at risk of dropping below the uncertainties of your measurements. Therefore this observable is hard to come by, yet it's what one wants to extract from the data because it contains information beyond the simplest (single-field, slow-roll) inflation scenario. This simplest scenario predicts the temperature fluctuations to be, to very good precision, Gaussian distributed. If they were exactly Gaussian, the three-point function would vanish, and all higher-order correlations would follow from the two-point function. A non-vanishing three-point function would thus be, here it comes, an indication of non-Gaussianity in the temperature fluctuations, and an indication of new physics.
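For a flavour of why a non-vanishing three-point function signals non-Gaussianity, here is another toy sketch along the same lines (again plain numpy on a one-dimensional ring, not the harmonic-space bispectrum estimators actually used on the CMB data; the smoothing length and the quadratic coefficient are made up). It averages the product of the field at three equally spaced points, once for a Gaussian field and once for a field with a small quadratic distortion.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix = 100_000

# Toy Gaussian field with a correlation length of a few pixels,
# made by smoothing white noise (all numbers here are made up).
white = rng.normal(0.0, 1.0, n_pix)
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
kernel /= kernel.sum()
g = np.convolve(white, kernel, mode="same")
g /= g.std()

# Non-Gaussian version: add a small quadratic piece, loosely mimicking
# the 'local' type of non-Gaussianity (the coefficient is arbitrary).
f_nl = 0.5
ng = g + f_nl * (g**2 - 1.0)  # subtract <g^2> = 1 to keep the mean near zero

# Three-point function for an 'equilateral' configuration on the ring:
# <T(x) T(x+d) T(x+2d)>, averaged over all positions x.
def three_point(field, d):
    return np.mean(field * np.roll(field, d) * np.roll(field, 2 * d))

d = 2
print("Gaussian field:    ", three_point(g, d))   # small, consistent with zero
print("non-Gaussian field:", three_point(ng, d))  # clearly non-zero
```

The Gaussian case averages to something small, consistent with zero within the sampling noise, while the quadratically distorted field gives a clearly non-zero value; that is the kind of signal the triangle-by-triangle analyses are hunting for.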
There had indeed previously been rumors that non-Gaussianities had been found in an analysis of the CMB data, see e.g. this post on Non-Gaussian CMB over at Resonaances. At that time I heard half a dozen talks on the topic, yet was reasonably sure the "signal" would vanish back into noise, as indeed it did. To our present knowledge, the data, with the uncertainty we have, is still compatible with a Gaussian spectrum. (One has to be somewhat careful when reading about these bounds since there are several different ones. That's because the full three-point function is pretty much impossible to calculate. What people have done instead is to take samples of specific threesomes of points, e.g. those forming equilateral triangles, or obtuse-angled ones, so there are different bounds depending on the triangles chosen.)
However, the important thing to note is that the uncertainty in these observations will go down in the near future: with the WMAP 8-year mission results one expects a 20% improvement on the bounds, while Planck could improve them by a factor of 4. Now if there were an indication for non-Gaussianity, this would be very exciting. The question then is of course: what is the physics behind it? What I guess is going to happen is that everybody and their model will predict non-Gaussianities. I wouldn't be surprised if it suddenly turned out to be a signature of cosmic strings, evidence for the multiverse, and also a prediction of Loop Quantum Cosmology. It will certainly take a while to sort these things out. In any case, I am sure it is a topic you will hear more about in the coming years.
Can't we all get along? Voigt profile! Everybody stays employed by carefully avoiding any exclusionary conclusions.
http://www.chemistry.adelaide.edu.au/external/soc-rel/content/lineshap.htm
If Voigt is nekulturny, the old standby is an upgraded tortoise and hare race refereed by Zeno: a Yukawa potential.
Shouldn't there be some Feynman-diagram-type approximation for calculating the three-point function from two-point functions and some "vertex" whose strength should be deducible from the data, instead of this equilateral triangle, etc., type stuff?
Aren't the dark flow or the CMB polarization patterns such "non-Gaussianities"? I'm not sure whether we aren't talking about the same things, just under different names.
http://groups.physics.umn.edu/cosmology/PPPDT/images/CMB_Pol_sm.gif
The first one-year all-sky survey from Planck is in, Bee.
Click here to see it. Check out the NW and SSE quadrants, Bee, and dig that crazy detail. The messy stuff in the middle is our galaxy.
Has anyone implemented three point Monte Carlo sampling yet?
I've been reading the paper by Bennett et al. on hunting for anomalies in the 7-year WMAP data over the past few days, in my spare to/from-work hours.
I'm probably going to spend the rest of the week @ 3 hours a day digesting it, because there are serious lessons to be learned there.
What I've discerned is the following:
1) The WMAP people _LOVE_ spherical harmonics. No surprise there, but I've never seen spherical harmonics used so frequently. Ever.
2) Every one of the supposed detections of non-Gaussianity and other such anomalous objects has amounted to either an expected low-probability feature of the Gaussian distribution (example: the cold spots), or was an artifact of the chosen weighting function, or some other variant of what were termed 'a posteriori' choices of data, e.g. cuts of the sky, restriction of multipole ranges, etc.
Zephir - neither of your suggestions survived scrutiny. Sorry.
By analogy, when Wayne Hu spoke of such correlations, it was with the understanding that we would look at the Cosmos much differently than we always have.
This shift in perspective is much like helioseismology, looking at correlations in relation to the three-body problem in terms of the Lagrangian?
If you remove Gaussian coordinates (a non-Euclidean description of the geometry?) from the perspective, THEN, you are settling for the three-point function?
I'm confused.
Best,
Hi Steven,
Thanks, I had seen it, but not a paper with the results. Best,
B.
Hi Arun,
If not Gaussian, the 3-point function really does contain more information than the 2-point function. You can't compute the former from the latter. Best,
B.
Aaron,
Good question, I can't recall exactly how they do the calculation. It's the part of the talk where I typically zone out, I'm afraid. Best,
B.
Hi Zephir,
No, it's not the same thing, though they are of course entangled with each other, both being features of the same spectrum. The dark flow or the earlier discussed 'axis of evil' are types of anisotropies (special directions). A non-Gaussian distribution can be isotropic. Best,
B.
Testing gaussianity, homogeneity and isotropy with the cosmic microwave background
It is always easy for the layman to see the desire for one part of an investigation to be correlated with another, so if one was to expect the normal Gaussian distribution, then what, experimentally "in theory," would be its counterpart in the new physics?
Not what you assume will be exploited by different areas of research to toe the new line of theoretical definition and prospect.
Reductionism, allowing new interpretations?
Best,
Hi Bee,
Thanks for your explanation regarding the current explorations of the cosmic fossil. When it's all said and done, what comes across for me is that despite much effort the sensitivity of the instrumentation used to gather the data is still not what we would like it to be. Another thing that confuses me is that even if the distribution turns out to be non-Gaussian, it still appears to be close to Gaussian, so shouldn't the questions centre around why this should be? In many ways it's like looking for a needle in a haystack without even knowing what the needle looks like or even if it is there.
It would appear to me it might be useful and more convincing if some theory was first arrived at which predicted what we should be looking for, instead of just mathematically filtering the data to look for some unspecified anomaly. This reminds me of Roger Penrose's prediction in respect to the CMB data, which although not panning out at least gave us what to look for in advance, rather than searching for patterns which may or may not be significant.
Best,
Phil
This multi-colour all-sky image of the microwave sky has been synthesized using data spanning the full frequency range of Planck, which covers the electromagnetic spectrum from 30 to 857 GHz.
As with Fermi, the idea is to increase our capabilities of vision (our window on the universe) as our technologies are developed? Theoretically, new experiments will either add to that perspective or will not.
Any presumptions of a failing theoretical process not suited to experimental testing will either have to wait until the experiment is produced, or until the results of the experiment are successful?
Fool's notion to accept defeat even before one is started:)
Best,
Here's a nicer image.
Plato, in my link I gave a much larger version.
ReplyDeleteNow boys, let's not start a foolish debate about whose is bigger.
Let's debate which is cooler. I submit this Chromoscope of the Planck Galaxy/CMB 1-yr Survey, in which you can continuously adjust (in an interactive way) the wavelength view between gamma rays and radio waves. Tres awesome?
And of course, credit must be given where credit is due, that is to Phil Plait's article at Bad Astronomy from July, here, who turned me on to the Planck photo/chromoscope, and who is after all the biggest of us all.
Wish to compete? Provide a Chromoscope for WMAP, and for extra credit: COBE.
Hi Steven,
Yes, looking back your link is a "Nicer image." I should have followed up. Sorry mate.
Also, thank you for the Bad Astronomy link. That was very helpful.
Best,
Hi Phil,
ReplyDelete"Another thing that confuses me is as it appears even if the distribution turns out to be non Gaussian it still is close to being so and such shouldn’t the questions centre around why this should be"
The thing is that it is easy to get a Gaussian spectrum, you can do that with the simplest models around, but you need to work harder to get it to be non-Gaussian. Thus, non-Gaussianity really tells you you've missed something about the physics and must look deeper. You don't really expect it to be large because these effects are generically small. Maybe think about the temperature fluctuations in the first place: you don't expect them to be large either. The spectrum is to good precision a blackbody spectrum, and then you have tiny deviations that teach you more about the physics that's going on. Best,
B.
Hi Bee,
I think I see your point. That is, just as the orbit of Mercury or the bending of starlight around the sun indicated that gravity was not totally explained by an inverse-square relationship. This of course ended in having time included as a dimension, which was not describable with a mathematical description that only accounts for the action within three degrees of freedom. Still, it would be nice if a theory were proposed that predicted what these differences are, which could then be reinforced or denied as the measurements are refined.
Best,
Phil
Hi Phil,
There are already theories explaining what the differences are and, as I indicated towards the end of my post, I guess that soon we'll have more 'explanations' than we'd wish for. It will be hard to distinguish between the different possibilities. Best,
B.
Hi Bee,
“I guess that soon we'll have more 'explanations' than we'd wish for. It will be hard to distinguish between the different possibilities”
And yet that is what the observation of the phenomena is supposed to accomplish. That would indicate to me there is something wrong with the objectives of science if the only criterion for having a theory be considered worthy is that it serves as an explanation of phenomena. Personally I think it has drifted away from other things that should be considered, beginning with QM being allowed not to be assigned a complete explanation, simply by having what came before be deemed classical. The hope is of course that a new theory would clarify things, yet I can't shake the feeling this is not possible until the fundamental questions are better understood. That is why, for instance, I like your box problem paradox, as it forces such things to be re-examined again.
”"The supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience"
-Albert Einstein
Best,
Phil
Hi Phil,
ReplyDelete"That would indicate to me there is something wrong with the objectives of science"
I said it's hard, not impossible, and not that nobody will do it. It will simply take more time, and we won't know for a while. I was just asking readers to be cautious about claims that will be made, because many models might fit the first observations. The observations we're dealing with today are difficult; they take a long time to make and require plenty of high-tech and data analysis. Times have changed since Einstein. Think of neutrino oscillations as an example. There was a theory, yet there were other theories (e.g. decay) that some might have found more plausible. It took a long time, not just one measurement, to be able to rule out the decay model (I believe it wasn't till 2004 that this was indeed the case). Of course there's more to a convincing theory than being able to fit data, but the data is what I believe will point us in the right direction. Without that, we're just poking around in the dark. Best,
B.
Hi Bee,
As they say, you're the doctor, and thus I will take you at your word that the data, as it's refined, should serve to point the way. Also thanks for the analogy, as I do remember all the goings-on in respect to the solar neutrinos, with some saying at first it indicated that what we thought about stellar processes was all wrong. Actually I can remember a time when they thought neutrinos had no rest mass and therefore travelled at c, which was finally observed not to be the case.
Well, anyway, it's been some time since Penzias and Wilson first thought their detection of the CMB was the result of some bird droppings in their antenna, and we are still dealing with the repercussions of what it has to tell us.
Best,
Phil