Friday, September 26, 2014

Black holes declared non-existent again.

That's me. 

The news of the day is that Laura Mersini-Houghton has supposedly shown that black holes don’t exist. The headlines refer to these two papers: arXiv:1406.1525 and arXiv:1409.1837.

The first is an analytical estimate, the second a numerical study of the same idea. Before I tell you what these papers are about, a disclaimer: I know Laura; we have met at various conferences, and I’ve found her to be very pleasant company. I read her new paper a while ago and was hoping I wouldn’t have to comment on this, but my inbox is full of messages from people asking me what this is all about. So what can I do?

In their papers, Laura Mersini-Houghton and her collaborator Harald Pfeiffer have taken into account the backreaction from the emitted Hawking radiation on the collapsing mass which is normally neglected. They claim to have shown that the mass loss is so large that black holes never form to begin with.

To make sense of this, note that black hole radiation is produced by the dynamics of the background and not by the presence of a horizon. The horizon is the reason the final state lacks information, but the particle creation itself does not necessitate a horizon. The radiation starts before horizon formation, which means that the mass that is left to form the black hole is actually less than the mass that initially collapsed.

Physicists have studied this problem back and forth for decades, and the majority view is that this mass loss from the radiation does not prevent horizon formation. This shouldn’t be much of a surprise because the temperature of the radiation is tiny, and it’s even tinier before horizon formation. You can look, e.g., at arXiv:0906.1768 and references [3-16] therein to get an impression of this discussion. Note though that this paper also mentions that it has been claimed before, every now and then, that the backreaction prevents horizon formation, so it’s not like everyone agrees. Then again, this could be said about pretty much every topic.

Now what one does to estimate the backreaction is to first come up with a time-dependent emission rate. This is already problematic because the usual Hawking radiation is only the late-time radiation and is time-independent. What is clear however is that the temperature before horizon formation is considerably smaller than the Hawking temperature, and it drops very quickly the farther away the mass is from horizon formation. Incidentally, this drop was the topic of my master’s thesis. Since it’s not thermal equilibrium one actually shouldn’t speak of a temperature. In fact the energy spectrum isn’t quite thermal, but since we’re only concerned with the overall energy the spectral distribution doesn’t matter here.

The next problem is that you have to model some collapsing matter and take into account the backreaction during collapse. Quite often people use a collapsing shell for this (as I did in my master’s thesis). Shells however are pathological because if they are infinitely thin they must have an infinite energy density and are by themselves already quantum gravitational objects. If the shell isn’t infinitely thin, then the width isn’t constant during collapse. So either way, it’s a mess, and you are best off doing it numerically.

What you do next is take that approximate temperature, which now depends on some proper time in which the collapse proceeds. This temperature gives, via the Stefan-Boltzmann law, a rate for the mass loss with time. You integrate the mass loss over time and subtract the integral from the initial mass. Or at least that’s what I would have done. It is not what Mersini-Houghton and Pfeiffer have done though. What they seem to have done is the following.
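For what it’s worth, here is a minimal sketch of that procedure in Python, with my own simplifying assumptions (not those of the papers): I use the late-time Hawking temperature as a generous upper bound on the pre-horizon temperature, and a free-fall time of order r_s/c as the collapse duration.

```python
# Toy estimate of the mass radiated away during collapse.
# Assumptions (mine, for illustration only): the late-time Hawking
# temperature bounds the pre-horizon temperature from above, and the
# collapse lasts about one light-crossing time of the Schwarzschild radius.
import math

G, c, hbar, k_B, sigma = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23, 5.670e-8

def hawking_temperature(M):
    """Late-time Hawking temperature in Kelvin for mass M in kg."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def mass_loss_during_collapse(M):
    """Upper bound on the mass (kg) radiated during a collapse time ~ r_s/c."""
    r_s = 2 * G * M / c**2                             # Schwarzschild radius
    T = hawking_temperature(M)
    luminosity = sigma * 4 * math.pi * r_s**2 * T**4   # Stefan-Boltzmann law
    t_collapse = r_s / c                               # free-fall timescale
    return luminosity * t_collapse / c**2              # E = m c^2

M_sun = 1.989e30
print(mass_loss_during_collapse(M_sun) / M_sun)  # utterly negligible fraction
```

For a solar mass, even this overestimate gives a fractional mass loss many dozens of orders of magnitude below anything that could plausibly prevent horizon formation, which illustrates why the majority view is what it is.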

Hawking radiation has a negative energy component. Normally negative energies are actually anti-particles with positive energies, but not so in black hole evaporation. The negative energy particles though only exist inside the horizon. Now in Laura’s paper, the negative energy particles exist inside the collapsing matter, but outside the horizon. Next, she doesn’t integrate the mass loss over time and subtract this from the initial mass, but integrates the negative energies over the inside of the mass and subtracts this integral from the initial mass. At least that is my reading of Equation IV.10 in 1406.1525 and equation 11e in 1409.1837, respectively. Note that there is no time integration in these expressions, which puzzles me.

The main problem I have with this calculation is that the temperature that enters the mass-loss rate, as far as I can tell, is that of a black hole and not that of some matter which might be far from horizon crossing. In fact it looks to me like the total mass that is lost increases with increasing radius, which I think it shouldn’t. The more dispersed the mass, the smaller the gravitational tidal force, and the smaller the effect of particle production in curved backgrounds should be. So much for the analytical estimate. In the numerical study I am not sure what is being done because I can’t find the relevant equation, which is the dependence of the luminosity on the mass and radius.

In summary, the recent papers by Mersini-Houghton and Pfeiffer contribute to a discussion that is decades old, and it is good to see the topic being taken up by the numerical power of today. I am skeptical that their treatment of the negative energy flux is consistent with the expected emission rate during collapse. Their results are surprising and in contradiction with many previously found results. It is thus too early to claim that it has been shown that black holes don’t exist.

Monday, September 22, 2014

Discoveries that weren’t

This morning’s news, as anticipated, is that the new Planck data renders the BICEP2 measurement of relic gravitational waves inconclusive. It might still be there, the signal, but at least judging by the presently existing data analysis one can’t really say whether it is or isn’t.

Discoveries that vanish into dust, or rather “background” as the technical term has it, are of course nothing new in physics. In 1984, for example, the top quark was “discovered” with a mass of about 40 GeV:

Physicists may have found 'top' quark
By Robert C. Cowen, Staff writer of The Christian Science Monitor
MAY 9, 1984

“Particle physicists appear to be poised for a breakthrough in their quest for the underlying structure of matter. Puzzling phenomena have appeared at energies where present theory predicted there was little left to uncover. This indicates that researchers may have come across an unsuspected, and possibly rich, field in which to make new discoveries. Also, a team at the European Center for Nuclear Research (CERN) at Geneva may have found the long-sought 'top' quark. Protons, neutrons, and related particles are believed to be made up of combinations of more basic entities called quarks.”
This signal turned out to be a statistical fluctuation. The top quark wasn’t really discovered until 1995 with a mass of about 175 GeV. Tommaso tells the story.

The Higgs too was already discovered in 1984, at the Crystal Ball Experiment at DESY with a mass of about 9 GeV. It even made it into the NY Times:
PHYSICISTS REPORT MYSTERY PARTICLE
By WALTER SULLIVAN
Published: August 2, 1984

“A new subatomic particle whose properties apparently do not fit into any current theory has been discovered by an international team of 78 physicists at DESY, a research center near Hamburg, West Germany. The group has named the particle zeta […] As described yesterday to a conference at Stanford, the zeta particle has some, but not all, of the properties postulated for an important particle, called the Higgs particle, whose existence has yet to be confirmed.”
Also in 1984, Supersymmetry came and went in the UA1 experiment [ht JoAnne]:
“Experimental observation of events with large missing transverse energy accompanied by a jet or a photon (S) in p \bar p collisions at \sqrt{s} = 540 GeV
UA1 Collaboration

We report the observation of five events in which a missing transverse energy larger than 40 GeV is associated with a narrow hadronic jet and of two similar events with a neutral electromagnetic cluster (either one or more closely spaced photons). We cannot find an explanation for such events in terms of backgrounds or within the expectations of the Standard Model.”
And the year 1996 saw a quark substructure come and go. The New York Times reported:
Tiniest Nuclear Building Block May Not Be the Quark
By MALCOLM W. BROWNE
Published: February 8, 1996

“Scientists at Fermilab's huge particle accelerator 30 miles west of Chicago reported yesterday that the quark, long thought to be the simplest building block of nuclear matter, may turn out to contain still smaller building blocks and an internal structure.”
Then there is the ominous pentaquark that comes and goes, the anisotropic universe [ht Ben], the lefthanded universe [ht Ethan], and the infamous OPERA anomaly that was a loose cable - and these are only the best known ones. The BICEP2 story is remarkable primarily because the initial media reports, based on the collaboration’s own press releases, so vastly overstated the confidence of the results.

The evidence for relic gravitational waves is a discussion that will certainly be continued for at least a decade or so. My prediction is that in the end, after loads of data analysis, they will find the signal just where they expected it. And that is really the main difference between the BICEP announcement and the superluminal OPERA neutrinos: In the case of the gravitational waves everybody thought the signal should be there. In the case of the superluminal neutrinos everybody thought it should not be there.

The OPERA collaboration was heavily criticized for making such a big announcement out of a result that was most likely wrong, and one can debate whether or not they did the right thing. But at least they amply warned everybody that the result was likely wrong.

Saturday, September 13, 2014

Is there a smallest length?

Good ideas start with a question. Great ideas start with a question that comes back to you. One such question that has haunted scientists and philosophers for thousands of years is whether there is a smallest unit of length, a shortest distance below which we cannot resolve structures. Can we look closer and always closer into space, time, and matter? Or is there a limit, and if so, what is the limit?

I picture our distant ancestors sitting in their cave watching the world in amazement, wondering what the stones, the trees and they themselves are made of – and starving to death. Luckily, those smart enough to hunt down the occasional bear eventually gave rise to human civilization sheltered enough from the harshness of life to let the survivors get back to watching and wondering what we are made of. Science and philosophy in earnest are only a few thousand years old, but the question whether there is a smallest unit has always been a driving force in our studies of the natural world.

The ancient Greeks invented atomism, the idea that there is an ultimate and smallest element of matter that everything is made of. Zeno’s famous paradoxes sought to shed light on the possibility of infinite divisibility. The question came back with the advent of quantum mechanics, with Heisenberg’s uncertainty principle that fundamentally limits the precision with which we can measure. It became only more pressing with the divergences in quantum field theory that are due to the inclusion of infinitely short distances.

It was in fact Heisenberg who first suggested that divergences in quantum field theory might be cured by the existence of a fundamentally minimal length, and he introduced it by making position operators non-commuting among themselves. Just as the non-commutativity of momentum and position operators leads to an uncertainty principle, so does the non-commutativity of position operators limit how well distances can be measured.

Heisenberg’s main worry, which the minimal length was supposed to deal with, was the non-renormalizability of Fermi’s theory of beta-decay. This theory however turned out to be only an approximation to the renormalizable electro-weak interaction, so he had to worry no more. Heisenberg’s idea was forgotten for some decades, then picked up again and eventually grew into the area of non-commutative geometries. Meanwhile, the problem of quantizing gravity appeared on stage and with it, again, non-renormalizability.

In the mid 1960s Mead reinvestigated Heisenberg’s microscope, the argument that led to the uncertainty principle, with (unquantized) gravity taken into account. He showed that gravity amplifies the uncertainty so that it becomes impossible to measure distances below the Planck length, about 10^-33 cm. Mead’s argument was forgotten, then rediscovered in the 1990s by string theorists who had noticed that using strings to prevent divergences by avoiding point-interactions also implies a finite resolution, if in a technically somewhat different way than Mead’s.
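The scale Mead arrived at is easy to reproduce. The snippet below (a sketch of the dimensional analysis, not of Mead’s actual argument) just evaluates the combination of constants that has units of length, l_P = sqrt(ħG/c³):

```python
# The Planck length: the unique combination of hbar, G, and c with
# dimensions of length. Below this scale, Mead argued, gravity spoils
# any attempt to resolve distances.
import math

G    = 6.674e-11   # Newton's constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s

l_planck = math.sqrt(hbar * G / c**3)
print(l_planck)    # about 1.6e-35 m, i.e. the 10^-33 cm quoted above
```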

Since then the idea that the Planck length may be a fundamental length beyond which there is nothing new to find, ever, appeared in other approaches towards quantum gravity, such as Loop Quantum Gravity or Asymptotically Safe Gravity. It has also been studied as an effective theory by modifying quantum field theory to include a minimal length from scratch, and often runs under the name “generalized uncertainty”.

One of the main difficulties with these theories is that a minimal length, if interpreted as the length of a ruler, is not invariant under Lorentz-transformations due to length contraction. This problem is easy to overcome in momentum space, where it is a maximal energy that has to be made Lorentz-invariant, because momentum space is not translationally invariant. In position space one either has to break Lorentz-invariance or deform it and give up locality, which has observable consequences, and not always desired ones. Personally, I think it is a mistake to interpret the minimal length as the length of a ruler (a component of a Lorentz-vector), and it should instead be interpreted as a Lorentz-invariant scalar to begin with, but opinions on that matter differ.
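To see the contraction problem in numbers, here is a toy illustration of my own, purely for intuition: a “ruler” of one Planck length, watched from a frame boosted to 99% of the speed of light, contracts below the supposed minimum.

```python
# Why a minimal *ruler* length clashes with special relativity: a rod of
# Planck length l_P, seen from a frame moving at speed v relative to it,
# Lorentz-contracts to l_P/gamma -- shorter than the supposed minimum.
import math

l_P = 1.616e-35                      # Planck length in meters

def contracted_length(l, v_over_c):
    """Length of a rod of rest length l seen from a frame boosted by v/c."""
    gamma = 1 / math.sqrt(1 - v_over_c**2)
    return l / gamma

print(contracted_length(l_P, 0.99))  # already below l_P at 99% of c
```

This is why, as discussed above, the minimal length is easier to implement Lorentz-invariantly as a scalar, or in momentum space, than as the component of a Lorentz-vector.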

The science and history of the minimal length has now been covered in a recent book by Amit Hagar:

Amit is a philosopher but he certainly knows his math and physics. Indeed, I suspect the book would be quite hard to understand for a reader without at least some background knowledge in math and physics. Amit has made a considerable effort to address the topic of a fundamental length from as many perspectives as possible, and he covers a lot of scientific history and philosophical considerations that I had not previously been aware of. The book is also noteworthy for including a chapter on quantum gravity phenomenology.

My only complaint about the book is its title because the question of discrete vs continuous is not the same as the question of finite vs infinite resolution. One can have a continuous structure and yet be unable to resolve it beyond some limit (this is the case when the limit makes itself noticeable as a blur rather than a discretization). On the other hand, one can have a discrete structure that does not prevent arbitrarily sharp resolution (which can happen when localization on a single base-point of the discrete structure is possible).

(Amit’s book is admittedly quite pricey, so let me add that he said, should sales numbers reach 500, Cambridge University Press will put a considerably less expensive paperback version on offer. So tell your library to get a copy and let’s hope we’ll make it to 500 so it becomes affordable for more of the interested readers.)

Every once in a while I think that there maybe is no fundamentally smallest unit of length, that all these arguments for its existence are wrong. I like to think that we can look infinitely close into structures and will never find a final theory, turtles upon turtles, or that structures are ultimately self-similar and repeat. Alas, it is hard to make sense of the romantic idea of universes in universes in universes mathematically, not that I didn’t try, and so the minimal length keeps coming back to me.

Many if not most endeavors to find observational evidence for quantum gravity today look for manifestations of a minimal length in one way or the other, such as modifications of the dispersion relation, modifications of the commutation-relations, or Bekenstein’s tabletop search for quantum gravity. The properties of these theories are today a very active research area. We’ve come a long way, but we’re still out to answer the same questions that people asked themselves thousands of years ago.


This post first appeared on Starts With a Bang with the title "The Smallest Possible Scale in the Universe" on August 12, 2014.

Thursday, September 11, 2014

Experimental Search for Quantum Gravity – What is new?

Last week I was at SISSA in Trieste for the 2014 conference on “Experimental Search for Quantum Gravity”. I missed the first two days because of child care problems (Kindergarten closed during holiday season, the babysitter ill, the husband has to work), but Stefano Liberati did a great job with the summary talk the last day, so here is a community update.

The briefest of brief summaries is that we still have no experimental evidence for quantum gravity, but then you already knew this. During the last decade, the search for experimental evidence for quantum gravity has focused mostly on deviations from Lorentz-invariance and strong quantum gravity in the early universe that might have left imprints on the cosmological observables we measure today. The focus on these two topics is still present, but we now have some more variety which I think is a good development.

There is still lots of talk about gamma ray bursts and the constraints on deformations of Lorentz-invariance that can be derived from them. One has to distinguish these constraints on deformations from constraints on violations of Lorentz-invariance. In the latter case one has a preferred frame, in the former case not. Violations of Lorentz-invariance are very strongly constrained already. But to derive these constraints one makes use of an effective field theory approach, that is, one assumes that whatever quantum gravity at high energies (close to the Planck scale) looks like, at small energies it must be describable by the quantum field theories of the standard model plus some additional, small terms.

Deformations of Lorentz-symmetry are said to not have an effective field theory limit and thus these constraints cannot be applied. I cautiously say “are said not to have” such a limit because I have never heard a good argument why such a limit shouldn’t exist. For all I can tell it doesn’t exist just because nobody working on this wants it to exist. In any case, without this limit one cannot use the constraints on the additional interaction terms and has to look for other ways to test the model.

This is typically done by constraining the dispersion relation for free particles, which obtains small correction terms. These corrections to the dispersion relation affect the speed of massless particles, which now is energy-dependent. The effects of the deformation become larger with long travel times and large energies, which is why highly energetic gamma ray bursts are so interesting. The deformation would make itself noticeable by either speeding up or slowing down the highly energetic photons, depending on the sign of a parameter.
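To get a feeling for the orders of magnitude, here is a back-of-the-envelope estimate with toy numbers of my own choosing (a 10 GeV photon from a source a Gigaparsec away, a first-order Planck-scale modification, and a dimensionless coefficient of one):

```python
# Arrival-time delay for a photon whose speed is modified at first order
# in E/E_Planck. All numbers are illustrative choices, not from any
# particular analysis.
E_photon_GeV = 10.0         # a high-energy gamma ray burst photon
E_planck_GeV = 1.22e19      # Planck energy
distance_m   = 3.086e25     # 1 Gigaparsec in meters
c            = 2.998e8      # speed of light, m/s

travel_time = distance_m / c                          # ~1e17 seconds
delay = (E_photon_GeV / E_planck_GeV) * travel_time   # first-order effect
print(delay)                                          # of order 0.1 s
```

A delay of a tenth of a second relative to the intrinsic burst timing is the kind of signal these analyses look for; the real constraints of course involve much more careful source modeling and cosmology.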

Current constraints put the limits roughly at the Planck scale if the modification is either to slow down or to speed up the photons. Putting constraints on the case where the deformation is stochastic (sometimes speeding up, sometimes slowing down) is more difficult and so far there haven’t been any good constraints on this. Jonathan Granot briefly flashed by some constraints on the stochastic case, but said he can’t spill the details yet, some collaboration issue. He and collaborators do however have a paper coming out within the next months that I expect will push the stochastic case up to the Planck scale as well.

On the other hand we heard a talk by Giacomo Rosati who argues that to derive these bounds one uses the normal expansion of the Friedmann-Robertson-Walker metric, but that the propagation of particles in this background should be affected by the deformed theory as well, which weakens the constraints somewhat. Well, I can see the rationale behind the argument, but after 15 years the space-time picture that belongs to deformed Lorentz-invariance is still unclear, so this might or might not be the case. There were some other theory talks that try to get this space-time picture sorted out but they didn’t make a connection to phenomenology.

Jakub Mielczarek was at the meeting talking about the moment of silence in the early universe and how to connect this to phenomenology. In this model for the early universe space-time makes a phase-transition from a Euclidean regime to the present Lorentzian regime, and in principle one should be able to calculate the spectral index from this model, as well as other cosmological signatures. Alas, it’s not a simple calculation and progress is slow since there aren’t many people working on it.

Another possible observable from this phase-transition may be leftover defects in the space-time structure. Needless to say, I like that very much because I was talking about my model for space-time defects that basically is a parameterization of this possibility in general (slides here). It would be great if one could connect these parameters to some model about the underlying space-time structure.

The main message of my talk is that if you want to preserve Lorentz-invariance, as my model does, then you shouldn’t look at high energies because that’s not a Lorentz-invariant statement to begin with. You should look instead at wave-functions sweeping over large world-volumes. This typically means low energies and large distances, which is not a regime that presently gets a lot of attention when it comes to quantum gravity phenomenology. I certainly hope this will change within the next years because it seems promising to me. Well, more promising than the gamma ray bursts anyway.

We also heard Joao Magueijo in his no-bullshit style explaining that modified dispersion relations in the early universe can reproduce most achievements of inflation, notably the spectral index including the tilt and solving the horizon problem. This becomes possible because an energy-dependence in the speed of light together with redshift during expansion turns the energy-dependence into a time-dependence. If you haven’t read his book “Faster Than the Speed of Light”, I assure you you won’t regret it.

The idea of dimensional reduction is still popular but experimental consequences, if any, come through derived concepts such as a modified dispersion relation or early universe dynamics, again.

There was of course some discussion of the BICEP claim that they’ve found evidence for relic gravitational waves. Everybody who cared to express an opinion seemed to agree with me that this isn’t the purported evidence for quantum gravity that the press made out of it, even if the measurement was uncontroversial and statistically significant.

As we discussed in this earlier post, to begin with this doesn’t test the quantum gravity at high energies but only the perturbative quantization of gravity, which for most of my colleagues isn’t really quantum gravity. It’s the high energy limit that we do not know how to deal with. And even to claim that it is evidence for perturbative quantization requires several additional assumptions that may just not be fulfilled, for example that there are no non-standard matter couplings and that space-time and the metric on it exist to begin with. This may just not be the case in a scenario with a phase-transition or with emergent gravity. I hope that next time the media picks up the topic they care to talk to somebody who actually works on quantum gravity phenomenology.

Then there was a member from the Planck collaboration whose name I forgot, who tried to say something about their analysis of the foreground effects from the galactic dust that BICEP might not have accurately accounted for. Unfortunately, their paper isn’t finished and he wasn’t really allowed to say all that much. So all I can tell you is that Planck is pretty much done with their analysis and the results are with the BICEP collaboration which I suppose is presently redoing their data fitting. Planck should have a paper out by the end of the month we’ve been told. I am guessing it will primarily say there’s lots of uncertainty and we can’t really tell whether the signal is there or isn’t, but look out for the paper.

There was also at the conference some discussion about the possibility to test quantum gravitational effects in massive quantum systems, as suggested for example by Igor Pikovski et al. This is a topic we previously discussed here, and I still think it is extremely implausible. The Pikovski et al paper is neither the first nor the last to have proposed this type of test, but it is arguably the one that got the most attention because they managed to get published in Nature Physics. These experiments are supposed to test basically the same deformation that the gamma ray bursts also test, just on the level of commutation relations in quantum mechanics rather than in the dispersion relation (the former leads to the latter, the opposite is not necessarily so).

The problem is that in this type of theory nobody really knows how to get from the one-particle case to the many-particle case, which is known as the ‘soccer-ball problem’. If one naively just adds the energies of particles, one finds that the corrections blow up when one approaches the Planck mass, which is about 10^-5 grams. That doesn’t make a lot of sense - to begin with because we wouldn’t reproduce classical mechanics, but also because quantum gravitational effects shouldn’t scale with the energy but with the energy density. This means that the effects should get smaller for systems composed of many particles. In this case then, you cannot get good constraints on quantum gravitational effects in the proposed experiments. That doesn’t mean one shouldn’t do the experiment. This is new parameter space in quantum mechanics and one never knows what interesting things one might find there. I’m just saying don’t expect any quantum gravity there.
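Here is the scaling argument in toy numbers (the figures are illustrative and mine, not from any talk):

```python
# The 'soccer-ball problem' in numbers: if Planck-scale corrections
# naively scaled with the *total* energy of a composite system, a small
# dust grain would already sit above the Planck mass. If instead they
# scale with the energy per particle, they remain tiny.
M_planck_kg = 2.18e-8    # Planck mass, about 1e-5 grams
m_grain_kg  = 1e-7       # a small dust grain (illustrative)
N_atoms     = 2e18       # rough number of atoms in such a grain

# Naive scaling with total energy: the dimensionless correction exceeds 1,
# i.e. the grain would be deep in the "quantum gravity" regime. Absurd.
naive_correction = m_grain_kg / M_planck_kg

# Scaling with energy per constituent instead: negligibly small.
per_particle_correction = (m_grain_kg / N_atoms) / M_planck_kg

print(naive_correction, per_particle_correction)
```

The second number is what makes me say you shouldn’t expect quantum gravity in these tabletop experiments: once the corrections are suppressed by the particle number, the proposed measurements are nowhere near sensitive enough.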

Also at the conference was Jonathan Miller, who I had been in contact with earlier about his paper in which he and his coauthor estimate whether the effect of gravitational bremsstrahlung on neutrino propagation is detectable (we discussed this here). It is an interesting proposal that I spent quite some time thinking about because they don’t make giant leaps of faith about the scaling of quantum gravitational effects. In this paper, it is plainly perturbatively quantized gravity.

However, after some thinking about this I came to the conclusion that while the cross-section that they estimate may be at the right order of magnitude for some cases (I am not too optimistic about the exact case that they discuss in the paper), the total probability for this to happen is still tiny. That is because unlike the case of cross-sections measured at the LHC, for neutrinos scattering off a black hole one doesn’t have a high luminosity to bring up the chance of ever observing this. When I estimated the flux, the probability turned out to be too small to be observable by at least 30 orders of magnitude, i.e., what you typically expect for quantum gravity. Anyway, I had some interesting exchange with Jonathan who, needless to say, isn’t entirely convinced by my argument. So it’s not a settled story, and I’ll let you know what comes out of this.

Finally, I should mention that Carlo Rovelli and Francesca Vidotto talked about their Planck stars and the possible phenomenology that these could lead to. We previously discussed their idea here. They are arguing basically that quantum gravitational effects can be such that a black hole (with an apparent horizon, not an event horizon) does not slowly evaporate until it reaches the Planck mass, but suddenly explodes at a mass still much higher than the Planck mass, thereby releasing its information. If that was possible, it would sneak around all the issues with firewalls and remnants and so on. It might also have observable consequences, for these explosions might be detectable. However, this idea is still very much in its infancy, and several people in the audience raised concerns similar to mine, namely whether this can work without violating locality and/or causality in the semi-classical limit. In any case, I am sure that we will hear more about this in the near future.

All together I am relieved that the obsession with gamma ray bursts seems to be fading, though much of this fading is probably due to both Giovanni Amelino-Camelia and Lee Smolin not being present at this meeting ;)

This was the first time I visited SISSA since they moved to their new building, which is no longer located directly at the coast. It is however very nicely situated on a steep hill, surrounded by hiking paths through the forest. The new SISSA building used to be a hospital, like the buildings that house Nordita in Stockholm. I’ve been told my office at Nordita is in what used to be the tuberculosis sector, and if I’m stuck with a computation I can’t help but wonder how many people died at the exact spot my desk stands now. As to SISSA, I hope that the conference was on what was formerly the pregnancy ward, and that the meeting, in spirit, may give birth to novel ideas how to test quantum gravity.

Monday, September 08, 2014

Science changed my life – and yours too.

Can you name a book that made you rethink? A song that helped you through bad times? A movie that gave you a new perspective, new hope, an idea that changed your life or that of people around you? And was it worth the price of the book, the download fee, the movie ticket? If you think of the impact it has had, does it come as a number in your currency of choice?

Those of us working in basic research today are increasingly forced to justify our work by its social impact, its value for the society we live in. That is a fair demand, because scientists paid by tax money should keep in mind who they are working for. But the impact that the funding agencies are after is expected to come in the form of applications, something that your neighbor will eventually be able to spend money on, to keep the economic wheels turning and the gears running.

It might take centuries for today’s basic research to result in technological applications, and predicting them is more difficult than doing the research itself. The whole point of doing basic research is that its impact is unpredictable. And so this pressure to justify what we are doing is often addressed by fantastic extrapolations of today’s research, potential gadgets that might come out of it, new materials, new technologies, new services. These justifications that we come up with ourselves are normally focused on material value, something that seems tangible to your national funding agency and your member of parliament who wants to be reelected.

But basic research has a long tail, and a soft one, that despite its softness has considerable impact that is often neglected. At our recent workshop for science writers, Raymond Laflamme gave us two great lectures on quantum information technology, the theory and the applications. Normally if somebody starts talking about qubits and gates, my brain switches off instantly, but amazingly enough listening to Laflamme made it sound almost comprehensible.

Here is the slide that he used to motivate the relevance of basic research (full pdf here):


Note how the arrows in the circle gradually get smaller. A good illustration for the high-risk, high-impact argument. Most of what we work on in basic research will never lead anywhere, but that which does changes our societies, rewards and refuels our curiosity, then initiates a new round in the circle.

Missing in this figure though is a direct link from understanding to social impact.



New scientific insights have historically had a major impact on the vision the thinkers of the day had for the ideal society and how it was supposed to work, and they still have. Knowledge about the workings of the universe has eroded the rationale behind monarchy, strong hierarchies in general, and the influence of the church, and given rise to other forms of organization that we may call enlightened today, but that will seem archaic a thousand years from now.

The variational principle, made popular in Leibniz’s conclusion that we live in the “best of all possible worlds”, a world that must be “optimal” in some sense, has been hugely influential and eventually spun off the belief in self-organization, in the existence of an “invisible hand” that will lead societies to an optimal state, and that we had better not try to outsmart. This belief is still widespread among today’s liberals, even though it obviously raises the question of whether what an unthinking universe optimizes is what humans want.

The ongoing exploration of nature on large and small scales has fundamentally altered the way in which we perceive ourselves as special, now that we know our solar system is but one among billions, many of which contain planets similar to our own. And the multiverse in all its multiple realizations is maybe the ultimate reduction of humanity to an accident, whereby it remains to be seen just how lucky this accident is.

That insights coming from fundamental research affect our societies long before, and in many ways besides, applications come along is today documented vividly by the Singularity believers, who talk about the coming of artificial intelligence surpassing our own intelligence like Christians talk about the rapture. Unless you live in Silicon Valley it's a fringe phenomenon, but it is vivid proof of just how much ideas affect us.

Other recent developments that have been influential way beyond the scientific niches where they originated are chaos, instability, tipping points, complexity and systemic risk. And it seems to me that the awareness that uncertainty is an integral part of scientific knowledge is slowly spreading.

The connection between understanding and social impact is one you are part of every time you read a popular science piece and update your views about the world, the planet we inhabit, our place on it, and its place in the vastness of the universe. It doesn’t seem to mean all that much, all these little people with their little blogs and their little discussions, but multiply it by some hundred millions. How we think about our being part of nature affects how we organize our living together and our societies.

Downloading a Wikipedia entry of 300 kb through your home wireless: 0.01 Euro. Knowing that the universe expands and will forever continue to expand: Priceless.