
Tuesday, December 30, 2014

A new proposal for a fifth force experiment

Milla Jovovich in “The Fifth Element”
I still find it amazing that all I see around me is made up of only a few dozen particles and four interactions. At least for all we know. But maybe this isn’t all there is? Physicists have been speculating for a while now that our universe needs a fifth force to maintain the observed expansion rate, but this has turned out to be very difficult to test. A new paper by Burrage, Copeland and Hinds from the UK now proposes a test based on measuring the gravitational attraction felt by single atoms.
    Probing Dark Energy with Atom Interferometry
    Clare Burrage, Edmund J. Copeland, E. A. Hinds
    arXiv:1408.1409

Dark energy is often portrayed as mysterious stuff that fills the universe and pushes it apart, but stuff and forces aren’t separate things. Stuff can be a force carrier that communicates an interaction between other particles. In its simplest form, dark energy is an unspecified smooth, inert, and unchanging constant, the “cosmological constant”. But for many theorists such a constant is unsatisfactory because its origin is left unexplained. A more satisfactory explanation would be a dark-energy field that fills the universe and has the desired effect of accelerating the expansion by modifying the gravitational interaction over long, super-galactic distances.

The problem with using fields to modify the gravitational interaction at long distances, and to thus explain the observations, is that one quickly runs into problems at shorter distances. The same field that needs to be present between galaxies to push them apart should not be present within galaxies, or within solar systems, because there we would have noticed it already.

About a decade ago, Khoury and Weltman pointed out that a dark energy field would not affect gravity at short distances if it is suppressed by the density of matter (arXiv:astro-ph/0309411). The higher the density of matter, the smaller the value of the dark energy field, and the less it affects the gravitational attraction. Such a field would thus be very weak within our galaxies, and only make itself noticeable between galaxies where the matter density is very low. They called this type of dark energy field the “chameleon field” because it hides itself by blending into the background.

The very same property that makes the chameleon field such an appealing explanation for dark energy is also what makes it so hard to test. Fifth-force experiments in the laboratory measure the gravitational interaction between test bodies, and so far they have reproduced standard gravity with ever increasing precision. These experiments are however not sensitive to the chameleon field, at least not in the parameter range in which it might explain dark energy. That is because the existing fifth-force experiments measure the gravitational force between two macroscopic probes, for example two metallic plates, and the high density of the probes themselves suppresses the very field one is trying to measure.

In their new paper, Burrage et al. show that one does not run into this problem if one uses a different setting. To begin with, they say the experiment should be done in a vacuum chamber, so as to make the background density as small as possible and the value of the chameleon field as high as possible. The authors then show that the value of the field inside the chamber depends on the size of the chamber and the quality of the vacuum, and that the field increases towards the middle of the chamber.

They calculate the force between a very small, for example atomic, sample and a larger sample, and show that the atom is too small to cause a large suppression of the chameleon field. The gravitational attraction between two atoms is too feeble to be measurable, so one still needs one macroscopic body. But when one looks at the numbers, replacing one macroscopic probe with a microscopic one would be enough to make the experiment sensitive enough to find out whether dark energy is a chameleon field, or at least part of it.

One way to realize such an experiment would be by using atom interferometry which has previously been demonstrated to be sensitive to the gravitational force. In these experiments, an atom beam is split in two, one half of it is subjected to some field, and then the beams are combined again. From the resulting interference pattern one can extract the force that acted on the beams. A similar setting could be used to test the chameleon field.
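To get a feeling for the numbers, here is a minimal sensitivity sketch. The phase shift formula Δφ = k_eff a T² is the textbook result for a Mach-Zehnder-type light-pulse atom interferometer; all specific numbers below are illustrative assumptions of mine, not values from the Burrage, Copeland and Hinds paper.

```python
# A minimal sensitivity sketch for an atom-interferometer fifth-force test.
# Uses the textbook phase shift dphi = k_eff * a * T^2 for a light-pulse
# interferometer. All numbers are illustrative assumptions.

import math

wavelength = 780e-9             # m, rubidium D2 line
k = 2 * math.pi / wavelength    # single-photon wavevector, 1/m
k_eff = 2 * k                   # effective wavevector of a two-photon transition
T = 0.1                         # s, free-evolution time between pulses

g = 9.81                        # m/s^2, Earth's gravity
a_fifth = 1e-9 * g              # assumed tiny extra acceleration from a chameleon-type force

dphi = k_eff * a_fifth * T**2
print(f"interferometer phase shift: {dphi:.2e} rad")
# ~1.6e-3 rad; milliradian-level phase resolution is routine in such experiments,
# so accelerations at the 1e-9 g level are in principle resolvable.
```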

Holger Müller from the University of California at Berkeley, an experimentalist who works on atom interferometry, thinks it is possible to do the experiment. “It’s amazing to see how an experiment that is very realistic with current technology is able to probe dark energy. The technology should even allow surpassing the sensitivity expected by Burrage et al.,” he said.

I find this a very interesting paper, and also a hopeful one. It shows that while sending satellites into orbit and building multi-billion dollar colliders are promising ways to search for new physics, they are not the only ways. New physics can also hide in high precision measurements in your university lab, just ask the theorists. Who knows, there might be a chameleon hidden in your vacuum chamber.

This post first appeared on Starts with a Bang as "The Chameleon in the Vacuum Chamber".

Monday, December 29, 2014

The 2014 non-news: Where do these highly energetic cosmic rays come from?

As the year 2014 is nearing its end, lists with the most-read stories are making the rounds. Everything is in there, from dinosaurs to miracle cures, disease scares, Schadenfreude, suicide, the relic gravitational wave signal that wasn't, space-traffic accidents, all the way to a comet landing.

For high energy physicists, though, this was another year of non-news, not counting the occasional new baryon that I have a hard time getting excited about. No susy, no dark matter detection, no quantum gravity, nothing beyond the standard model whatsoever.

My non-news of the year that probably passed you by is that the origin of highly energetic cosmic rays descended back into mystery. If you recall, in 2007, the Pierre Auger Collaboration announced that they had found a correlation between the directions from which they saw the highly energetic particles coming and the positions of galaxies with supermassive black holes, more generally referred to as active galactic nuclei. (Yes, I've been writing this blog for that long!)

This correlation came with some fine print, because highly energetic particles will eventually, after sufficiently long travel, scatter off one of the sparsely distributed photons of the cosmic microwave background. So one would not expect a correlation with these active galactic nuclei beyond a certain distance, and that seemed to be exactly what they saw. They didn't have a lot of data at this point, so the statistical significance wasn't very high. However, many people thought this correlation would become stronger with more data, and the collaboration probably thought so too, otherwise they wouldn't have published it.

But it didn't turn out this way. The correlation didn't become stronger. Instead, by now it's pretty much entirely gone. In October, Katia Moskvitch at Nature News summed it up:

"Working with three-and-a-half years of data gleaned from 27 rays, Auger researchers reported that the rays seemed to preferentially come from points in the sky occupied by supermassive black holes in nearby galaxies. The implication was that the particles were being accelerated to their ultra-high energies by some mechanism associated with the giant black holes. The announcement generated a media frenzy, with reporters claiming that the mystery of the origin of cosmic rays had been solved at last.

But it had not. As the years went on and as the data accumulated, the correlations got weaker and weaker. Eventually, the researchers had to admit that they could not unambiguously identify any sources. Maybe those random intergalactic fields were muddying the results after all. Auger “should have been more careful” before publishing the 2007 paper, says Avi Loeb, an astrophysicist at Harvard University in Cambridge, Massachusetts."

So we're back to speculating about the origin of the ultra-high-energy cosmic rays. It's a puzzle that I've scratched my head over for quite some while - more scratching is due.

Wednesday, December 24, 2014

Merry Christmas :)

I have a post about "The rising star of science" over at Starts with a Bang. It collects some of my thoughts on science and religion, fear and wonder. I will not repost it here next month, so if you're interested, check it out over there. According to Medium it's a 6-minute read. You can get a 3-minute summary in my recent video:


We wish you all happy holidays :)


From left to right: Inga the elephant, Lara the noisy one, me, Gloria the nosy one, and Bo the moose. Stefan is fine and says hi too, he isn't in the photo because his wife couldn't find the setting for the self-timer.

Tuesday, December 23, 2014

Book review: "The Edge of the Sky" by Roberto Trotta

The Edge of the Sky: All You Need to Know about the All-There-Is
Roberto Trotta
Basic Books (October 9, 2014)

It's two days before Christmas and you need a last-minute gift for that third-degree uncle, heretofore completely unknown to you, who just announced a drop-in for the holidays? I know just the right thing for you: "The Edge of the Sky" by Roberto Trotta, which I found as a free review copy in my mailbox one morning.

According to the back flap, Roberto Trotta is a lecturer in astrophysics at Imperial College. He has very blue eyes and very white teeth, but I have more twitter followers, so I win. Roberto set out to explain modern cosmology with only the thousand most used words of the English language. Unfortunately, neither "cosmology" nor "thousand" belongs to these words, and certainly not "heretofore", which might or might not mean what I think it means.

The result is a nice little booklet telling a story about "big-seers" (telescopes) and "star-crowds" (galaxies) and the "early push" (inflation), with a couple of drawings for illustration. It's pretty and kinda artsy, which probably isn't a word at all. The book is also as useless as that prize-winning designer chair in which one can't sit, but better than the chair because it's very slim and will not take up much space, or money. It's just the right thing to give to your uncle who will probably not read it, and so he'll never find out that you think he's too dumb to know the word "particle". It is, in summary, the perfect re-gift, so go and stuff it into somebody's under-shoe-clothes - how am I doing?

Saturday, December 20, 2014

Has Loop Quantum Gravity been proved wrong?

[Image: logo of the site Loop Insight. The insight to take away is that you have to carefully look for those infinities.]
[Fast track to wisdom: Probably not. But then.]

The Unruh effect is the predicted, but so far not observed, particle production seen by an accelerated observer in flat space. It is a result obtained using quantum field theory and does not include gravity; the particles are thermally distributed with a temperature that is proportional to the acceleration. The origin of the particle production is that the notion of particles, like the passage of time, is observer-dependent, so what is Bob’s vacuum might be Alice’s thermal bath.
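Quantitatively, the Unruh temperature is (this is the standard result; the numerical value is added here for orientation):

```latex
T_{\rm Unruh} \;=\; \frac{\hbar\, a}{2\pi\, c\, k_B}
\;\approx\; 4\times 10^{-20}\,{\rm K} \;\times\; \frac{a}{9.8\,{\rm m/s^2}}\,,
```

so even at an acceleration of 10^20 g the thermal bath has a temperature of only a few Kelvin, which is why the effect has so far escaped observation.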

The Unruh effect can be related to the Hawking effect, that is, the particle production in the gravitational field of a black hole, by use of the equivalence principle. Neither of the two effects has anything to do with quantum gravity. In these calculations, space-time is treated as a fixed background field that has no quantum properties.

Loop Quantum Gravity (LQG) is an approach to quantum gravity that relies on a new identification of space-time degrees of freedom, which can then be quantized without running into the same problems as one does when quantizing perturbations of the metric. Or at least that’s the idea. The quantization prescription depends on two parameters: one is a length scale normally assumed to be of the order of the Planck length, and the other is a parameter that everybody wishes wasn’t there and which will not be relevant in the following. The point is that LQG is basically a modification of the quantization procedure that depends on the Planck length.

In a recent paper, Hossain and Sardar from India now claim that using the loop quantization method does not reproduce the Unruh effect.

If this were correct, it would be really bad news for LQG. So of course I had to read the paper, and I am here to report back to you.

The Unruh effect has not been measured yet, but experiments have been underway for a while to measure the non-gravitational analog of the Hawking effect. Since the Hawking effect is a consequence of certain transformations in quantum field theory that also apply to other systems, it can be studied in the laboratory. There is some ongoing controversy about whether or not it has already been measured, but in my opinion it’s really just a matter of time until they’ve pinned down the experimental uncertainties and will confirm this. It would be theoretically difficult to claim that the Unruh effect does not exist when the Hawking effect does. So, if what they claim in the paper is true, then Loop Quantum Gravity, or its quantization method respectively, would be pretty much ruled out, or at least in deep trouble.

What they do in the paper is apply the two quantization methods to quantum fields on a fixed background. As is usual in this calculation, the background remains classical. Then they calculate the particle flux that an accelerated observer would see. For this they have to define some operators as limiting cases because these do not exist in the same way for the loop quantization method. They find in the end that while the normal quantization leads to the expected thermal spectrum, the result for the loop quantization method is just zero.

I kinda want to believe it, because then at least something would be happening in quantum gravity! But I see a big problem with this computation. To understand it, you first have to know that the result with the normal quantization method isn’t actually a nice thermal distribution, it is infinity. This infinity can be isolated by a suitable mathematical procedure, in which case one finds that it is a delta function in momentum space evaluated at zero. Once identified, it can be factored out, and the prefactor of the delta function is the thermal spectrum that you’ve been looking for. One can trace back the physical origin of this infinity and find that it comes, roughly speaking, from having computed the flux for an infinite volume.
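Schematically, and in units ℏ = c = k_B = 1, the standard result has the form (textbook expression, written out here for orientation):

```latex
\langle N_\Omega \rangle \;=\; \frac{1}{e^{2\pi\Omega/a} - 1}\;\delta(0)\,,
```

where the δ(0), a momentum-space delta function evaluated at zero argument, is the infinite-volume factor, and the prefactor is a thermal spectrum at the Unruh temperature T = a/2π. Regularization is supposed to isolate exactly this factor; dividing it out of a result that was finite to begin with gives zero.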

These types of infinities appear in quantum field theory all over the place, and they can be dealt with by a procedure called regularization, that is, the introduction of a parameter, the “regulator”, whose purpose is to capture the divergences so that they can be cleanly discarded. The important thing about regularization is that you have to identify the divergences first before you can get rid of them. If you try to divide an infinite factor out of a result that wasn’t divergent, all you get is zero.

What the authors do in the paper is take a regularization method commonly used for the Unruh effect in the normal quantization, and apply this regularization also to the other quantization. Now, the loop quantization in some sense already has a regulator: the finite length scale that, when the quantization is applied to space-time, results in a smallest unit of area and volume. If this length scale is first set to zero and then the regulator is removed, one gets the normal Unruh effect. If one first removes the regulator, the result is apparently zero. (Or so they claim in the paper. I didn’t really check all their approximations of special functions and so on.)

My suspicion therefore is that the result would have been finite to begin with and that the additional regularization is overkill. The result is zero, basically, because they’ve divided out one infinity too many.

The paper however is very confusingly written, and at least I don’t see at first sight what’s wrong with their calculation. I’ve now consulted three people who work on related things, and none of them saw an obvious mistake. I myself don’t care enough about Loop Quantum Gravity to spend more time on this than I already have. The reason I am telling you about this is that there has been absolutely no reaction to this paper. You’d think that if colleagues went about and allegedly proved wrong the theory you’re working on, they’d be shouted down in no time! But everybody in loop quantum gravity just seems to have ignored this.

So if you’re working on loop quantum gravity, I would appreciate a pointer to a calculation of the Unruh effect that either confirms this result or proves it wrong. And the rest of you I suggest spread word that loop quantum gravity has been proved wrong, because then I’m sure we will get a clarification of this very very quickly ;)

Saturday, December 13, 2014

The remote Maxwell Demon

During the summer, I wrote a paper that I dumped in an arXiv category called cond-mat.stat-mech, and then managed to entirely forget about. So, somewhat belatedly, here is a summary.

Pretty much the only recollection I have of my stat mech lectures is that every single one of them was inevitably accompanied by the ever-same divided box with two sides labeled A and B. Let me draw this for you:


Maxwell’s demon in its original version sits in this box. The demon’s story is a thought experiment meant to highlight the following paradox with the 2nd law of thermodynamics.

Imagine the above box is filled with a gas, and the gas is at a low temperature on side A and at a higher temperature on side B. The second law of thermodynamics says that if you open a window in the dividing wall, the temperatures will come to an average equilibrium value, and in this process entropy is maximized. Temperature is basically average kinetic energy, so the average speed of the gas atoms approaches the same value everywhere, just because this is the most likely thing to happen.

The system can only do work on the way to equilibrium, but no longer once it’s arrived there. Once you’ve reached this state of maximum entropy, nothing happens any more, except for fluctuations. Unless you have a Maxwell demon...

Maxwell’s demon sits at the dividing wall between A and B when both sides are at the same temperature. He opens the window every time a fast atom comes from the left or a slow atom comes from the right, otherwise he keeps it closed. This has the effect of sorting fast and slow atoms so that, after some while, more fast atoms are on the right side than on the left side. This means the temperatures are not in equilibrium anymore and entropy has decreased. The demon thus has violated the second law of thermodynamics!

Well, of course he hasn’t, but it took a century for physicists to pin down the exact reason why. In brief, it’s that the demon must be able to obtain, store, and use information. And he can only do that if he either starts at a low entropy that then increases, or brings along an infinite reservoir of low entropy. The total entropy never decreases, and the second law is alive and well.
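For illustration, here is a toy simulation of the sorting described above (a minimal sketch of mine, not from any paper): the demon lets fast atoms pass from A to B and slow atoms from B to A, and the mean kinetic energies of the two sides drift apart.

```python
# Toy Maxwell demon: sort fast atoms to side B, slow atoms to side A.
# A minimal illustration of the entropy-decreasing sorting described above.

import random

random.seed(1)
N = 10_000
# speeds drawn from the same distribution on both sides initially
A = [random.expovariate(1.0) for _ in range(N)]
B = [random.expovariate(1.0) for _ in range(N)]
v_cut = 1.0  # the demon's threshold between "slow" and "fast"

def take(side, i):
    """Remove and return element i in O(1), ignoring order."""
    side[i], side[-1] = side[-1], side[i]
    return side.pop()

def mean_E(side):
    """Mean kinetic energy per atom (unit mass)."""
    return sum(v * v for v in side) / (2 * len(side))

print("before:", mean_E(A), mean_E(B))  # both sides start at the same "temperature"

for _ in range(200_000):
    if random.random() < 0.5 and A:
        i = random.randrange(len(A))
        if A[i] > v_cut:            # fast atom arrives from side A: open the window
            B.append(take(A, i))
    elif B:
        i = random.randrange(len(B))
        if B[i] < v_cut:            # slow atom arrives from side B: open the window
            A.append(take(B, i))

print("after: ", mean_E(A), mean_E(B))  # side B ends up markedly hotter than side A
```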

It has only been during recent years that some versions of Maxwell’s demon have been experimentally realized in the laboratory. These demons essentially use information to drive a system out of equilibrium, which can then, in principle, do work.

It occurred to me that this must mean it should be possible to replace the transfer of energy from a sender to a receiver with a transfer of information, and that this information transfer could take place with a much smaller energy than what the receiver gets out of the information. In essence, this would mean one can down-convert energy during transmission.

The reason this is possible is that the relevant energy here is not the total energy – a system in thermal equilibrium has lots of energy. The relevant energy that we want at the receiving end is free energy – energy that can be used to do work. The signal does not need to contain the energy itself, it only needs to contain the information that allows one to drive the system out of equilibrium.
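In formulas, with U the internal energy, S the entropy, and F the Helmholtz free energy, the standard bounds read (textbook thermodynamics plus the well-known Szilard-engine bound, added here for orientation):

```latex
F \;=\; U - TS\,,\qquad W_{\rm extractable} \;\le\; -\Delta F\,,\qquad
W_{\rm per\ bit\ of\ information} \;\le\; k_B T \ln 2\,.
```

A system sitting at the minimum of F yields no work; information that singles out a particular fluctuation lowers the conditional entropy, raises the free energy relative to equilibrium, and thereby makes work extraction possible.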

In my paper, I have constructed a concrete example of how this could work. The full process must include remote measuring, extraction of information from the measurement, sending of the signal, and finally making use of the signal to actually extract energy. The devil, or in this case the demon, is in the details. It took me a while to come up with a system simple enough that one could in the end compute the energy conversion and also show that the whole thing, remote demon included, obeys the Carnot limit on the efficiency of heat engines.

In the classical example of Maxwell’s demon, the necessary information is the velocity of the particles approaching the dividing wall, but I chose a simpler system with discrete energy levels, just because the probability distributions are then easier to deal with. The energy extraction that my demon works with is a variant of the stimulated emission that is also used in lasers.

The atoms in a laser are being “pumped” into an out-of-equilibrium state, which has the property that as you inject light (i.e., energy) with the right frequency, you get out more light of the same frequency than you sent in. This does not work if the system is in equilibrium though; then it is always more likely that the injected signal is absorbed than that it stimulates a net emission.

However, a system in equilibrium always has fluctuations. The atoms have some probability to be in an excited state, a state in which they could be stimulated to emit light. If you just knew which atoms were in the excited state, then you could target them specifically, and end up with twice the energy that you sent in.
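For a two-level atom with energy gap ΔE in thermal equilibrium at temperature T, the probability to find it excited is the usual Boltzmann weight (standard statistical mechanics, spelled out here for orientation):

```latex
p_{\rm exc} \;=\; \frac{e^{-\Delta E/k_B T}}{1 + e^{-\Delta E/k_B T}} \;<\; \frac{1}{2}\,,
```

Since p_exc is always below one half, blindly injected light is on balance absorbed. But an atom known to be excited returns the injected photon plus the stimulated one, so a targeted signal of energy ℏω comes back as 2ℏω.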

So that’s what my remote demon does: It measures the fluctuations away from equilibrium in some atomic system and targets these to extract energy. The main point is that the energy sent to the system can be much smaller than the extracted energy. It is, in essence, a wireless battery recharger. Except that the energies in question are, in my example, so tiny that it’s practically entirely useless.

I’ve never worked on anything in statistical mechanics before. Apparently I don’t even have a blog label to tag it! This was a fun project and I learned a lot. I even made a drawing to accompany it.


Saturday, December 06, 2014

10 things you didn’t know about the Anthropic Principle

“The anthropic principle – the idea that our universe has the properties it does because we are here to say so and that if it were any different, we wouldn’t be around commenting on it – infuriates many physicists, including [Marc Davis from UC Berkeley]. It smacks of defeatism, as if we were acknowledging that we could not explain the universe from first principles. It also appears unscientific. For how do you verify the multiverse? Moreover, the anthropic principle is a tautology. “I think this explanation is ridiculous. Anthropic principle… bah,” said Davis. “I’m hoping they are wrong [about the multiverse] and that there is a better explanation.””
~Anil Ananthaswamy, in “The Edge of Physics”
Are we really so special?
Starting in the mid-70s, the anthropic principle has been employed in physics as an explanation for the values of parameters in our theories, but in 2014 I still come across ill-informed statements like the one above in Anil Ananthaswamy’s (otherwise very recommendable) book “The Edge of Physics”. I’m no fan of the anthropic principle because I don’t think it will lead to big insights. But it’s neither useless nor a tautology, nor does it amount to acknowledging that the universe can’t be explained from first principles.

Below are the most important facts about the anthropic principle, where I am referring to the definition from Ananthaswamy’s quote: “Our universe has the properties it does because if it were any different we wouldn’t be here to comment on it.”
  1. The anthropic principle doesn’t necessarily have something to do with the multiverse.

    The anthropic principle is correct regardless of whether there is a multiverse or not, and regardless of what the underlying explanation for the values of parameters in our theories is, if there is one. The reason it is often brought up by multiverse proponents is that they claim the anthropic principle is the only explanation, and that there is no other selection principle for the parameters that we observe. One then needs to show though that the values of the parameters we observe are indeed the only ones (or at least very probable ones) if one requires that life is possible. This however is highly controversial, see 2.

  2. The anthropic principle cannot explain the values of all parameters in our theories.

    The typical claim that the anthropic principle explains the values of parameters in the multiverse goes like this: If parameter x were just a little larger or smaller, we wouldn’t exist. The problem with this argument is that small variations in one out of two dozen parameters do not sample the bulk of possible combinations. You’d really have to consider independent modifications of all parameters to be able to conclude there is only one combination supportive of life. This however is not a presently feasible calculation.

    Though we cannot presently scan the whole parameter space to find out which combinations might be supportive of life, we can do a little better than checking just one and try at least a few. This has been done, and thus we know that the claim that there is really only one combination of parameters that will create a universe hospitable to life stands on very shaky ground.

    In their 2006 paper “A Universe Without Weak Interactions”, published in PRD, Harnik, Kribs, and Perez put forward a universe that seems capable of creating life and yet is entirely different from our own [arXiv:hep-ph/0604027]. Don Page argues that the universe would be more hospitable for life if the cosmological constant were smaller than the observed value [arXiv:1101.2444], and recently it was claimed that life might have been possible already in the early universe [arXiv:1312.0613]. All these arguments show that a chemistry complex enough to support life can arise under circumstances that, while still special, are not anything like the ones we experience today.

  3. Even so, the anthropic principle might still explain some parameters.

    The anthropic principle might however still work for some parameters if their effect is almost independent of what the other parameters do. That is, even if one cannot use the anthropic principle to explain all values of parameters, because one knows there are other combinations allowing for the preconditions of life, some of these parameters might need to have the same value in all those cases. The cosmological constant is often claimed to be of this type.

  4. The anthropic principle is trivial but that doesn’t mean it’s obvious.

    Mathematical theorems, lemmas, and corollaries are results of derivations following from assumptions and definitions. They essentially are the assumptions, just expressed differently. They are always true and sometimes trivial. But often, they are surprising and far from obvious, though that is inevitably a subjective statement. Complaining that something is trivial is like saying “It’s just sound waves” and referring to everything from engine noise to Mozart.

  5. The anthropic principle isn’t useless.

    While the anthropic principle might strike you as somewhat silly and trivially true, it can be useful for example to rule out values of certain parameters. The most prominent example is probably the cosmological constant which, if it was too large, wouldn’t allow the formation of structures large enough to support life. This is not an empty conclusion. It’s like when I see you drive to work by car every morning and conclude you must be old enough to have a driver’s license. (You might just be stubbornly disobeying laws, but the universe can’t do that.) The anthropic principle is in its core function a consistency constraint on the parameters in our theories. One could derive from it predictions on the possible combinations of parameters, but since we have already measured them these are now merely post-dictions.

    Fred Hoyle's prediction of properties of the carbon nucleus that make possible the synthesis of carbon in stellar interiors (properties that were later discovered as predicted) is often quoted as a successful application of the anthropic principle, because Hoyle is said to have exploited the fact that carbon is central to life on Earth. Some historians have questioned whether this was indeed Hoyle's reasoning, but the mere fact that it could have been shows that anthropic reasoning can be a useful extrapolation of observation, in this case the abundance of carbon on our planet.

  6. The anthropic principle does not imply a causal relation.

    Though “because” suggests it, there is no causation in the anthropic principle. An everyday example of “because” not implying an actual cause: I know you’re sick because you’ve got a cough and a runny nose. This doesn’t mean the runny nose caused you to be sick. Instead, it was probably some virus. Alas, you can carry a virus without showing symptoms, so it’s not like the virus is the actual “cause” of my knowing. Likewise, that there is somebody here to observe the universe did not cause a life-friendly universe into existence. (And the reverse, that a life-friendly universe caused our existence, doesn’t work either, because it’s not like the life-friendly universe sat somewhere out there and then decided to come into existence to produce some humans.)

  7. The applications of the anthropic principle in physics have actually nothing to do with life.

    As Lee Smolin likes to point out, the mention of “life” in the anthropic principle is entirely superfluous verbal baggage (my words, not his). Physicists don’t usually have a lot of business with the science of self-aware conscious beings. They talk about the formation of large-scale structures or atoms that are preconditions for biochemistry, but you shouldn’t even expect physicists to discuss large molecules. Talking about “life” is arguably catchier, but that’s really all there is to it.

  8. The anthropic principle is not a tautology in the rhetorical sense.

    It does not use different words to say the same thing: A universe might be hospitable to life and yet life might not feel like coming to the party, or none of that life might ever ask a why-question. In other words, getting the parameters right is a necessary but not a sufficient condition for the evolution of intelligent life. The rhetorically tautological version would be “Since you are here asking why the universe is hospitable to life, life must have evolved in that universe that now asks why the universe is hospitable to life.” Which you can easily identify as rhetorical tautology because now it sounds entirely stupid.

  9. It’s not a new or unique application.

    Anthropic-type arguments, based on the observation that there exists somebody in this universe capable of making an observation, are not only used to explain free parameters in our theories. They sometimes appear as “physical” requirements. For example: we assume there are no negative energies because otherwise the vacuum would be unstable and we wouldn’t be here to worry about it. And requirements like locality, separation of scales, and well-defined initial value problems are essentially based on the observation that otherwise we wouldn’t be able to do any science, if there were anybody to do anything at all. Logically, these requirements are the same as anthropic arguments, they just aren’t referred to as such.

  10. Other variants of the anthropic principle have questionable scientific value.

    The anthropic principle becomes speculative, not to say unscientific, once you try to go beyond the definition that I referred to here. If one does not understand that a consistency constraint does not imply a causal relation, one comes to the strange conclusion that humans caused the universe into existence. And if one does not accept that the anthropic principle is just a requirement that a viable theory has to fulfil, one is stuck with the question why the parameter values are what they are. Here is where the multiverse comes back, for you can then argue that we are forced to believe in the “existence” of universes with all possible combinations. Or you can go off the deep end and argue that our universe was designed for the existence of life.

    Personally I feel the urge to wash my hands after having been in touch with these kinds of arguments. I prefer my principles trivially true.


This post previously appeared October 21st 2014 on Starts with a Bang.

Saturday, November 29, 2014

Negative Mass in General Relativity?

[Image Source: Ginva.com]
Science News ran a piece the other week about a paper that appeared in PRD, titled “Negative mass bubbles in de Sitter spacetime”. The Science News article is behind a paywall, but don’t worry, I’ll tell you everything you need to know.

The arXiv version of the paper is here. Since I’m quoted in the Science News piece saying something to the effect that I have my reservations but think it’s a promising direction of study, I have gotten a lot of questions about negative masses in General Relativity lately. So here is a clarification.

First one has to be careful about what one means by mass. There are three types of masses: inertial mass, passive gravitational mass, and active gravitational mass. In General Relativity these masses, or their generalizations in terms of tensors respectively, are normally assumed to be identical.

The equality of inertial and passive gravitational mass is basically the equivalence principle. The active gravitational mass is what causes space-time to bend; the passive gravitational mass is what couples to the space-time and determines the motion of particles in that background. The active and passive gravitational masses are identical in almost all theories I know. (The Schrödinger-Newton approach is the only exception that comes to mind.) I doubt it is consistent to have them not be equal, but I am not aware of a proof of this. (I tried in the Schrödinger-Newton case, but it’s not as trivial as it looks at first sight.)

In General Relativity one further has to distinguish between local quantities like energy density and pressure and so on, which are functions of the coordinates, and global quantities that describe the space-time at large. The total mass or energy in some asymptotic limit is essentially an integral over the local quantities, and there are several slightly different ways to define it.

The positive mass theorem, in contrast to what its name suggests, does not state that one cannot have particles with negative masses. It states instead, roughly, that if your local matter is normal matter and obeys certain plausible assumptions, then the total energy and mass are also positive. You thus cannot have stars with negative masses, regardless of how you bend your space-time. This isn’t as trivial a statement as it sounds because the gravitational interaction contributes to the definition of these integrated quantities. In any case, the positive mass theorem holds in space that is asymptotically flat.

Now what they point out in the new paper is that for all we know we don’t live in asymptotically flat space; we live in asymptotic de Sitter space, because observational evidence speaks for a positive cosmological constant. In this case the positive mass theorem doesn’t apply. They then go on to construct a negative mass solution in asymptotic de Sitter space. I didn’t check the calculation in detail, part of it is numerical, but it all sounds plausible to me.

However, it is somewhat misleading to call the solution that they find a negative mass solution. The cosmological constant makes a contribution to the effective mass term in what you can plausibly interpret as the gravitational potential. Taking both together, the effective mass in the potential is positive in the region where this solution applies. The local mass (density) is also positive by assumption. (You see this most easily by looking at Fig. 1 in the paper.)

Selling this as a negative mass solution is like one of these ads that say you’ll save $10 if you spend at least $100; in the end your expenses are always positive. The negative mass in their solution corresponds to the supposed savings that you make. You never really get to see them. What really matters are the total expenses. And these are always positive. There are thus no negative mass particles in this scenario whatsoever. Further, the cosmological constant is necessary for these solutions to exist, so you cannot employ them to replace the cosmological constant.

It also must be added that showing the existence of a certain solution to Einstein’s field equations is one thing; showing that it has a reasonable chance to actually be realized in Nature is an entirely different thing. For this you have to come up with a mechanism to create the solutions and you also have to show that they are stable. Neither point is addressed in the paper.

Advertisement break: If you want to know how one really introduces negative masses into GR, read this.

In the Science News article Andrew Grant quotes one of the authors as saying:
“Paranjape wants to look into the possibility that the very early universe contained a plasma of particles with both positive and negative mass. It would be a very strange cosmic soup, he says, because positive mass gravitationally attracts everything and negative mass repels everything.”
This is wrong. Gravitation is a spin-2 interaction, and it is straightforward to see that this means that like charges attract and unlike charges repel. The charge of gravity is the mass. This does not mean that negative gravitational mass repels everything. Negative gravitational mass repels positive mass but attracts negative mass. If this weren’t so, then you’d run into the above-mentioned inconsistencies. The reason this isn’t so in the case considered in the paper is that they don’t have negative masses to begin with. They have certain solutions that basically have a gravitational attraction which is smaller than expected.
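To make the sign structure explicit, here is the Newtonian limit written out (my addition; this only tracks the sign of the product of gravitational charges and ignores the separate question of how a negative inertial mass would respond):

```latex
V(r) \;=\; -\,\frac{G\, m_1 m_2}{r}\,,\qquad
\begin{aligned}
m_1 m_2 > 0 &\;\Rightarrow\; V < 0 \quad \text{(attractive)},\\
m_1 m_2 < 0 &\;\Rightarrow\; V > 0 \quad \text{(repulsive)}.
\end{aligned}
```

Two positive masses attract, a positive and a negative mass repel, and two negative masses attract again: the even-spin pattern. For the odd-spin case of electromagnetism the overall sign flips, and like charges repel instead.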

In summary, I think it’s an interesting work, but so far it’s an entirely theoretical construct and its relevance for the description of cosmological dynamics is entirely unclear. There are no negative mass particles in this paper in any sensible interpretation of this term.

Saturday, November 22, 2014

Gender disparity? Yes, please.

[Image Source: Papercards]

Last month, a group of Australian researchers from the life sciences published a paper that breaks down the duration of talks at a 2013 conference by gender. They found that while the overall attendance and the number of presentations were almost equally shared between men and women, the women spoke on average for shorter periods of time. The main reason for this was that the women applied for shorter talks to begin with. You can find a brief summary on the Nature website.

The twitter community of women in science was all over this, encouraging women to make the same requests as men, asserting that women “underpromote” themselves by not taking up enough of their colleagues’ time.



Other studies have previously found that while women on average speak as much as men during the day, they tend to speak less in groups, especially so if the group is predominantly male. So the findings from the conference aren’t very surprising.

Now a lot of what goes around on twitter isn’t really meant seriously, see the smiley in Katie Hinde’s tweet. I remarked one could also interpret the numbers to show that men talk too much and overpromote themselves. I was joking of course to make a point, but after dwelling on this for a while I didn’t find it that funny anymore.

Women are frequently told that to be successful they should do the same as men do. I don’t know how often I have seen advice explaining how women are allegedly belittling themselves by talking, well, like a woman. We are supposed to be assertive and take credit for our achievements. Pull your shoulders back, don’t cross your legs, don’t flip your hair. We’re not supposed to end every sentence as if it was a question. We’re not supposed to start every interjection with an apology. We’re not supposed to be emotional and personal, and so on. Yes, all of these are typically “female” habits. We are told, in essence, there’s something wrong with being what we are.

Here is, for example, a list of public speaking tips: Don’t speak about yourself, don’t speak in a high pitch, don’t speak too fast because “Talking fast is natural with two of your best friends and a bottle of Mumm, but audiences (especially we slower listening men) can’t take it all in”. Aha. Also, don’t flirt and don’t wear jewelry because the slow men might notice you’re a woman.

Sorry, I got sick at point five and couldn’t continue – must have been the Mumm. Too bad if your anatomy doesn’t support the low pitches. If you believe this guy that is, but listen to me for a moment, I swear I’ll try not to flirt. If your voice sounds unpleasant when you’re giving a talk, it’s not your voice, it’s the microphone and the equalizer, probably set for male voices. And do we really need a man to tell us that if we’re speaking about our research at a conference we shouldn’t talk about our recent hiking trip instead?

There are many reasons why women are underrepresented in some professions and overrepresented in others. Some of it is probably biological, some of it is cultural. If you are raising or have raised a child, it is abundantly obvious that our little ones are subjected to gender stereotypes starting at a very young age. Part of it is the clothing and the toys, but more importantly it’s simply that they observe the status quo: Childcare is still a predominantly female business, and I have yet to see a woman on a garbage truck.

Humans are incredibly social animals. It would be surprising if the prevailing stereotypes did not affect us at all. That’s why I am supportive of all initiatives that encourage children to develop their talents regardless of whether these talents are deemed suitable for their gender, race, or social background. Because these stereotypes are thousands of years old and have become hurdles to our self-development. By and large, I see more encouragement for girls than for boys to follow their passion regardless of what society thinks, and I also see that women have more backup fighting unrealistic body images, which is what this previous post was about. Ironically, I was criticized on twitter for saying that boys don’t need to have a superhero body to be real men, because that supposedly wasn’t fair to the girls.

I am not supportive of hard quotas that aim at prefixed male-female ratios. There is no scientific support for these ratios, and moreover I have repeatedly witnessed that these quotas come with a big backlash, creating a stigma of “she is just here because”, whether or not that is true.

Thus, at the present level, women are likely still underrepresented compared to where we would be if we managed to ignore the social pressure to follow ancient stereotypes. And so I think that we would benefit from more women among the scientists, especially in math-heavy disciplines. Firstly because we are unnecessarily missing out on talent. But also because diversity is beneficial for the successful generation and realization of ideas. The relevant diversity is in the way we think and argue. Again, this is probably partly biological and partly cultural, but whatever the reason, a diversity of thought should be encouraged, and this diversity is almost certainly correlated with demographic diversity.

That’s why I disapprove of so-called advice that women should talk and walk and act like men. Because that’s exactly the opposite of what we need. Science stands to benefit from women being different from men. Gender equality doesn’t mean the genders should be identical, it means they should have the same opportunities. So women are more likely to volunteer to organize social events? Wtf is wrong with that?

So please go flip your hair if you feel like it, wear your favorite shirt, put on all the jewelry you like, and generally be yourself. Don’t let anybody tell you to be something you are not. If you need the long slot for your talk go ahead. If you’re confident you can get across your message in 15 minutes, even better, because we all talk too much anyway.


About the video: I mysteriously managed to produce a video in High Definition! Now you can see all my pimples. My husband made a good cameraman. My anonymous friend again helped clean up the audio file. Enjoy :)

Wednesday, November 19, 2014

Frequently Asked Questions

[Image source: Stickypictures.]

My mom is a, now retired, high school teacher. As a teenager I thought this was a great job and wanted to become a teacher myself. To practice, I made money giving homework help, but I quickly discovered that I hated it for a simple reason: I don’t like to repeat myself. I really don’t like to repeat myself.

But if I thought spending two years repeating how to take square roots - to the same boy - was getting me as close to spontaneous brain implosion as I ever wanted to get, it still didn’t quite prepare me for the joys of parenthood. Only the twins would introduce me to the pleasure of hearing Jingle Bells for 5 hours in a row, and of re-reading the story about Clara and her Binky until the book mysteriously vanished, not to be seen again unless somebody bothers to clean behind the shoe rack. “I told you twice not to wash the hair dryer,” clearly wasn’t my most didactic moment. But my daughter just laughed when the fuse blew and the lights went off. Thanks for asking, we got a new dryer.

And so I often feel like I write this blog as an exercise in patience. Nobody of course bothers to search the blog archives where I have explained everything. Sometimes twice! But today I will try to be inspired by Ethan who seems to have the patience of an angel, if a blue one, and basically answers the same questions all over and over and over again. So here are answers to the questions I get most often. Once and forever I hope...
  1. Is string theory testable?

    The all-time favorite. Yes, it is. There is really no doubt about it. The problem is that it is testable in principle, but at least so far nobody knows how to test it in practice. The energy (densities) necessary for this are just too high. Some models that are inspired by string theory, notably string cosmology, are testable with existing experiments. That it is testable in principle is a very important point, because some variants of the multiverse aren’t even testable in principle, and then it is indeed highly questionable whether it is still science. Not so though for string theory. And let me be clear that I mean here string theory as the candidate theory of everything including gravity. Testing string theory as a means to explain certain strongly coupled condensed matter systems is an entirely different thing.

  2. Do black holes exist?

    Yes. We have ample evidence that supermassive black holes exist in the centers of many galaxies and that solar-mass black holes are found throughout galaxies. The existence of black holes is today a generally accepted fact in the physics community. That black holes exist means concretely that we have observational evidence for objects dense enough to be a black hole that do not have a hard surface, so they cannot be very dim stars. One can exclude this possibility because matter hitting the surface of a star would emit radiation, whereas the same would not happen when the matter falls through the black hole horizon. This horizon does not have to be an eternal horizon. It is consistent with observation, and indeed generally believed, that the black hole horizon can eventually vanish, though this will not happen until hundreds of billions of years into the future. The defining property of the black hole is the horizon, not the singularity at its center, which is generally believed to not exist but for which we have no evidence one way or the other.

  3. Why quantize gravity?

    There is no known way to consistently couple the non-quantized theory of general relativity to the quantum field theories of the standard model. This only works in limiting cases. The most plausible way to resolve this tension is to quantize gravity too. It is in principle also possible that instead there is a way to couple quantum and classical theories that has so far been missed, or that the underlying theory is in some sense neither classical nor quantum, but this option is not favored by most researchers in the field today. Either way, the inconsistency in our existing theories is a very strong indication that the theories we have are incomplete. Research in quantum gravity basically searches for the completion of the existing theories. In the end this might or might not imply actually quantizing gravity, but Nature somehow knows how to combine general relativity with quantum field theory, and we don’t.

  4. Why is it so hard to quantize gravity?

    It isn’t. Gravity can be quantized pretty much the same way as the other interactions. It’s just that the theory one arrives at this way cannot be a fundamental theory, because it breaks down at high energies. It is thus not the theory that we are looking for. Roughly speaking, the reason this happens is that the gravitational equivalent of a particle’s charge is the particle’s energy. For the other known interactions the charge and the energy are distinct things. Not so for gravity. (A schematic version of this power counting follows after this list.)

  5. Is quantum gravity testable?

    Again, yes, it is definitely testable in principle; it’s just that the energy density necessary for strong quantum gravitational effects is too high for us to produce. Personally I am convinced that quantum gravity is also testable in practice, because indirect evidence can show up at much lower energy densities, but so far we do not have experimental evidence. There is a very active research area called quantum gravity phenomenology dedicated to finding the missing experimental evidence. You can check these two review papers to get an impression of what we are presently looking for.
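To make the answer to question 4 slightly more concrete, here is the standard power-counting sketch (schematic, with numerical coefficients suppressed): because Newton’s constant is dimensionful, each loop order contributes another power of the energy over the Planck mass.

```latex
\mathcal{A}(E) \;\sim\; \frac{E^2}{M_{\rm Pl}^2}
\left[\,1 \;+\; c_1\,\frac{E^2}{M_{\rm Pl}^2} \;+\; c_2\,\frac{E^4}{M_{\rm Pl}^4} \;+\; \dots\right],
\qquad M_{\rm Pl}^2 \;=\; \frac{\hbar c}{G}\,.
```

Absorbing the divergences order by order requires a new free coefficient c_i at each order, which is the non-renormalizability referred to above, and the series visibly fails once E approaches the Planck energy.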

Wednesday, November 12, 2014

The underappreciated value of boring truths

My primary reaction to any new idea on the arXiv is the conviction that it’s almost certainly wrong, and if I can’t figure out quickly why it’s wrong, I’ll ignore it because it’s most likely a waste of time. In other words, I exemplify the stereotypical reaction of scientists which Arthur Clarke summed up so nicely in his three stages of acceptance:
  1. “It’s crazy — don’t waste my time.”
  2. “It’s possible, but it’s not worth doing.”
  3. “I always said it was a good idea.”

Maybe I’m getting old and bold rather than wise and nice, but when it comes to quantum gravity phenomenology, craziness seems to thrive particularly well. My mother asked me the other day what I tell a journalist who wants a comment on somebody else’s work which I think is nonsense. I told her I normally say “It’s very implausible.” No, I’m not nice enough to bite my tongue if somebody asks for an opinion. And so, let me tell you that most of what gets published under the name of quantum gravity phenomenology is, well, very implausible.

But quantum gravity phenomenology is just an extreme example of a general tension that you find in theoretical physics. Imagine you ranked all unconfirmed theories on two scales, one the spectrum from exciting to boring, the other the spectrum from very implausible to likely correct. Then put a dot for each theory in a plane with these two scales as axes. You’d see that the two measures are strongly correlated: The nonsense is exciting, and the truth is boring, and most of what scientists work on falls on a diagonal from exciting nonsense to boring truths.


If you’d break this down by research area you’d also find that the more boring the truth, the more people work on nonsense. Wouldn’t you too? And that’s why there is so much exciting nonsense in quantum gravity phenomenology - because the truth is boring indeed.

Conservative wisdom says that quantum gravitational effects are tiny unless space-time curvature is very strong, which only happens in the early universe and inside black holes. This expectation comes from treating quantum gravity as an effective field theory and quantizing it perturbatively, i.e., when the fluctuations of space-time are small. The theory quantized this way does not make sense as a fundamental theory of gravity because it breaks down at high energies, but it should be fine for calculations in weak gravitational fields.

Most of the exciting ideas in quantum gravity phenomenology assume that this effective limit does not hold for one reason or another. The most conservative way to be non-conservative is to allow the violation of certain symmetries that are left over from a fundamental theory of quantum gravity which does not ultimately respect them. Violations of Lorentz invariance, CPT invariance, space-time homogeneity, or unitarity are such cases that can be accommodated within the effective field theory framework, and they have received much attention as possible signatures of quantum gravity.

Other, more exotic proposals implicitly assume that the effective limit does not apply for unexplained reasons. It is known that effective field theories can fail under certain circumstances, but I can’t see how any of these cases plays a role in the weak-field limit of gravity. Then again, strong curvature is one of the reasons for failure, and we do not understand what the curvature of space-time is microscopically. So sometimes, when I feel generous, I promote “implausible” to “far-fetched”.

John Donoghue is one of the few heroically pushing through calculations in the true-but-boring corner of quantum gravity phenomenology. In a recent paper, he and his coauthors calculated the quantum contributions to the bending of light in general relativity from 1-loop effects in perturbatively quantized gravity. From their result they define a semi-classical gravitational potential and derive the quantum corrections to Einstein’s classical test of General Relativity by light deflection.

They find a correction term that is suppressed by a factor ℏG/b² relative to the classical result, where b is the impact parameter and G is Newton’s constant. This is the typical result you’d expect on dimensional grounds. It’s a loop correction, so it must have an extra G in it, and it must have an inverse power of the impact parameter so that it gets smaller with distance; thus ℏG/b² is a first guess. Of course you don’t get tenure for guessing, and the actual calculation is quite nasty, see the paper for details.

In the paper the authors write “we conclude that the quantum effect is even tinier than the current precision in the measurement of light deflection”, which is an understatement if I have ever seen one. If you are generous and put in a black hole of mass M and a photon that just about manages to avoid being swallowed, the quantum effect is smaller by a factor (m_p/M)² than the classical term, where m_p is the Planck mass. For a solar-mass black hole this is a suppression by roughly 76 orders of magnitude. (Though on such a close approach the approximation with a small deflection doesn’t make sense any more.) If you have a Planck-mass black hole, the correction term is of order one, which is again what you’d expect.
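A quick back-of-the-envelope version of this estimate in numbers (my sketch, not taken from the paper):

```python
# Back-of-the-envelope: the quantum correction to light bending is suppressed
# relative to the classical result by hbar*G/(c^3 * b^2), which for an impact
# parameter b of order the gravitational radius G*M/c^2 becomes (m_Planck/M)^2.

hbar  = 1.055e-34   # J s
G     = 6.674e-11   # m^3 / (kg s^2)
c     = 2.998e8     # m/s
M_sun = 1.989e30    # kg

m_planck = (hbar * c / G) ** 0.5    # ~2.18e-8 kg

suppression = (m_planck / M_sun) ** 2
print(f"(m_p/M)^2 for a solar mass: {suppression:.1e}")   # ~1.2e-76
# For M = m_planck the factor is 1, i.e. the correction is of order one.
```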

Yes, that is a very plausible result indeed. I would be happy to tell this to any journalist, but unfortunately news items seem to be almost exclusively picked from the ever-increasing selection of exciting nonsense.

I will admit that it is hard to communicate the relevance of rather technical calculations that don’t lead to stunning results, but please bear with me while I try. The reason this work is so important is that we have to face the bitter truth to find out whether that’s really all there is, or whether we indeed have reason to expect the truth isn’t as bitter as it said on the wrapping. You have to deal with a theory and its nasty details to figure out where it defies your expectations and where your guesses go wrong. And so we will have to deal with effective quantum gravity to understand its limits. I always said it was a good idea. Even better that somebody else did the calculation, so I can continue thinking about the exciting nonsense.

Bonus: True love.


Tuesday, November 11, 2014

And the winners are...

The pile of money whose value you have been guessing came out to be 68.22 Euro and 0.5 Deutsche Mark, the latter of which I didn't count. Hoping that I didn't miss anybody's guess, this means the three winning entries are:
  • Rbot: 72
  • Rami Kraft: 62
  • droid33: 58.20
Congratulations to the winners! Please send an email to hossi[at]nordita.org with your postal address and I will put the books on their way.

Saturday, November 08, 2014

Make a guess, win a book.

The twins' piggy banks are full, so I've slaughtered them. Put in your guess of how much they've swallowed and you can win a (new) copy of Chad Orzel's book "How to Teach Quantum Physics to Your Dog". (No, I'm not getting paid for this, I have a copy I don't need and hope it will make somebody happy.) You can put in your guess until Monday, midnight, East Coast Time. I will only take into account guesses posted in the comments – do not send me an email. I am looking for the amount in Euros and Cents, not the number of coins. The winners will be announced Tuesday morning. Good luck!

Wednesday, November 05, 2014

The paradigm shift you didn’t notice

Inertia creeps.

Today, for the first time in human history a scientist has written this sentence – or so would be my summary of most science headlines I read these days. Not only do the media buy rotten fish, they actually try to resell them. The irony is though that the developments which really change the way we think and live happen so gradually you wouldn’t ever learn about them in these screaming headlines.

HIV infection for example still hasn’t been cured, but decades of hard work turned it from a fatal disease into a treatable one. You read about this in long-winded essays in the back pages where nobody looks, but not on the cover page and not in your news feed. The real change didn’t come about through this one baby who smiles on the photo and who was allegedly cured, as the boldface said, but through the hundreds of trials and papers and conferences in the background.

These slow changes also happen in physics. Quantum measurement, for example, is now widely understood as a decoherence process rather than a sudden collapse. This doesn’t break the ground but slowly moves it. It’s an interpretational shift that has spread through the community. Similarly, it is now generally accepted that most infinities in quantum field theory do not signal a breakdown of the theory but can be dealt with by suitable calculational methods.

For me the most remarkable shift that has taken place in physics in the last decades is the technical development and, with it, acceptance of renormalization group flow and effective field theories. If this sounds over your head, bear with me for I’m not going into the details, I just want to tell you why it matters.

You have certainly heard that some quantum field theories are sick and don’t make sense – they are said to be non-renormalizable. In such a theory the previously mentioned infinities cannot be removed, or they can only be removed at the expense of introducing infinitely many free parameters, which makes the theory useless. Half a century ago a theory with this disease was declared dead and went where theories go to die, into the history aisle.

Then it became increasingly clear that such non-renormalizable theories can be low-energy approximations to other theories that are healthy and renormalizable. The infinities are artifacts of the approximation and appear if one applies the approximation outside its regime of validity.

These approximations at low energies are said to be “effective” theories, and they typically contain particles or degrees of freedom that are not fundamental but instead “emergent”, which is to say they are good descriptions as long as you don’t probe them at too high energies. The theory that is good also at high energies is said to be the “UV completion” of the effective theory. (If you ever want to fake a physics PhD, just say “in the IR” instead of “at low energies” and “in the UV” instead of “at high energies”.)

A typical example for an effective theory is the nuclear force between neutrons and protons. These are not fundamental particles – we know that they are made of quarks and gluons. But for nuclear physics, at energies too small to test the quark substructure, one can treat the neutrons and protons as particles in their own right. The interaction between them is then effectively mediated by a pion, a particle that is itself composed of a quark and an antiquark.
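
You can even read off the range of this effective force from the pion mass. Here is a minimal numerical sketch of the textbook Yukawa estimate, using the charged pion mass:

    # The pion-mediated force has a Yukawa potential, V(r) ~ exp(-r/r0)/r,
    # whose range r0 is set by the pion mass: r0 = hbar*c / (m_pi * c^2).
    hbar_c = 197.327  # hbar*c in MeV*fm
    m_pi = 139.6      # charged pion mass in MeV

    print("range ~ %.2f fm" % (hbar_c / m_pi))  # ~1.4 fm, about the size of a nucleus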

Fermi’s theory of beta-decay is a historically very important case because it brought out the origin of non-renormalizability. We know today that the weak interaction is mediated by massive gauge bosons, the W’s and the Z. But at energies so low that one cannot probe the production and subsequent decay of these gauge bosons, the weak interaction can be effectively described without them. When a neutron undergoes beta decay, it turns into a proton and emits an electron and an electron anti-neutrino. If you do not take into account that this happens because one of the quark constituents emits a W-boson, then you are left with a four-fermion interaction with a coupling constant that depends on the mass of the W-boson. This theory is not renormalizable. Its UV completion is the standard model.

Upper image: One of the neutron's quark constituents interacts via a gauge boson with an electron. Bottom image: If you neglect the quark substructure and the boson exchange, you get a four-fermion interaction with a coupling that depends on the mass of the boson and which is non-renormalizable.
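
To see how the four-fermion coupling inherits the W mass, here is a minimal numerical sketch, assuming the textbook matching relation G_F/√2 = g²/(8M_W²) and approximate measured values:

    import math

    # Integrating out the W boson leaves a four-fermion coupling G_F that
    # depends on the W mass via the matching G_F/sqrt(2) = g^2/(8*M_W^2).
    g = 0.65     # weak gauge coupling, approximate
    M_W = 80.4   # W boson mass in GeV

    G_F = math.sqrt(2) * g**2 / (8 * M_W**2)
    print("G_F ~ %.2e GeV^-2" % G_F)  # ~1.2e-5, close to the measured 1.166e-5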


So now we live and work with the awareness that any quantum field theory is only one point in a space of theories that can morph into each other, and the expression of the theory changes with the energy scale at which we probe the physics. A non-renormalizable theory is perfectly fine in its regime of validity. And thus today these theories are not declared dead any longer, they are declared incomplete. A theory might have other shortcomings than being non-renormalizable, for example because it contains dimensionless constants much larger or much smaller than one. Such a theory is called unnatural. In this case too you would now not simply discard the theory but look for its UV completion.

It is often said that physicists do not know how to quantize gravity. This isn’t true though. Gravity can be quantized just like the other interactions; the result is known as “perturbatively quantized gravity”. The problem is that the theory one gets this way is non-renormalizable, which is why it isn’t referred to as quantum gravity proper. The theory of quantum gravity that we do not know is the UV-completion of this non-renormalizable perturbative quantization. (It cannot be non-renormalizable in the same way as Fermi’s theory, with a heavy mediator that was integrated out, because gravity is a long-range interaction: we know that gravitons, if they have masses at all, have tiny masses.)

But our improved understanding of how quantum field theories at different energies belong together has done more than increase our acceptance of theories with problems. The effective field theory framework is the tool that binds together, at least theoretically, the different disciplines in physics and in the sciences. No longer are elementary particle physics and nuclear physics and atomic physics and molecular physics different, disconnected layers of reality. Even though we cannot (yet) derive most of the relations between the models used in these disciplines, we know that they are connected through the effective field theory framework. And at high energies many physicists believe it all goes back to just one “theory of everything”. Don’t expect a big headline announcing its appearance though. The ground moves slowly.

Friday, October 31, 2014

String theory – it’s a girl thing

My first international physics conference was in Turkey. It was memorable not only because smoking was still allowed on the plane. The conference was attended by many of the local students, and almost all of them were women.

I went out one evening with the Turkish students, a group of ten with only one man, who sucked away on his waterpipe while one of the women read my future from tea leaves (she read that I was going to fly through the air in the near future). I asked the guy how come there were so few male students in this group. It’s because theoretical physics isn’t manly, it’s not considered a guy thing in Turkey, he said. Real men work outdoors or with heavy machinery, they drive, they swing tools, they hunt bears, they do men’s stuff. They don’t wipe blackboards or spend their day in the library.

I’m not sure how much of his explanation was sarcasm, but I find it odd indeed that theoretical physics is so male-dominated when it’s mostly scribbling on paper, trying to coordinate collaborations and meetings, and staring out of the window waiting for an insight. It seems mostly a historical accident that the majority of physicists today are male.

From the desk in my home office I have a view onto our downstairs neighbor’s garden. Every couple of weeks a man trims her trees and bushes. He has a key to the gate and normally comes when she is away. He uses the smoking break to tan his tattoos in her recliner and to scratch his chest hair. Then he pees on the roses. The most disturbing thing about his behavior though isn’t the peeing, it’s that he knows I’m watching. He has to cut the bushes from the outside too, facing the house, so he can see me scribbling away at my desk. He’ll stand there on his ladder and swing the chainsaw to greet me. He’s a real man, oh yeah.

After I finished high school, I went to the employment center which offered a skill- and interest-questionnaire, based on which one then was recommended a profession. I came out as landscape architect. It made sense – when asked, I said I would like to do something creative that allows me to spend time outdoors and that wouldn’t require many interpersonal skills. I also really like trees.

Then I went and studied math, because what the questionnaire didn’t take into account is that I get bored incredibly quickly. I wanted a job that wouldn’t run out of novelty any time soon. Math and theoretical physics sounded just right. I never spent much time thinking about gender stereotypes, it just wasn’t something I regarded as relevant. Yes, I knew the numbers, but I honestly didn’t care. Every once in a while I would realize how oddly my voice stood out, look around, and realize I was the only woman in the room, or one of a few. I still find it an unnatural and slightly creepy situation. But no, I never thought about gender stereotypes.

Now I’m a mother of two daughters, and I realized the other day I’ve gone pink-blind. Before I had children, I’d look at little girls thinking I’d never dress my daughters all in pink. But, needless to say, most of the twins’ wardrobe today is pink, because it’s either racing cars and soccer players on blue, or flowers and butterflies on pink. Unless you want to spend a ridiculous amount of money on designer clothes your kids will wear maybe once.

The internet is full of upset about girls’ toys that discourage an interest in engineering, unrealistic female body images, the objectification of women in ads and video games, the lack of strong female characters in books and movies. The internet is full of sites encouraging women to accept their bodies, the bodies of mothers with the floppy bellies and the stretch marks, the bodies of real women with the big breasts and the small breasts and the freckles and the pimples – every inch of you is perfect from the bottom to the top. It’s full of Emma Watson and HeForShe. It’s full of high-pitched voices.

But it isn’t only women who are confronted with stereotypical gender roles and social pressure. Somebody, I think, must stand up and tell the boys it’s totally okay to become a string theorist, even though they don’t get to swing a chainsaw – let that somebody be me. Science is neither a boy thing nor a girl thing.

So this one is for the boys. Be what you want to be, rise like a phoenix, and witness me discovering the awesomeness of multiband compression. Happy Halloween :)

Monday, October 27, 2014

Einstein’s greatest legacy- How demons and angels advanced science

Einstein’s greatest legacy is not General Relativity, it’s not quantum entanglement, and it’s not slices of his brain either. It’s a word: Gedankenexperiment – German for “thought experiment”.

Einstein, like no other physicist before or after him, demonstrated how the power of human thought alone, used skillfully, can make up for the lack of real experiments. He showed we little humans have the power to deduce the equations that govern the natural world by logical deduction. Thought experiments are common in theoretical physics today. Physicists use them to examine the consequences of a theory beyond what is measurable with existing technology, but still within the realm of what is in principle measurable. A thought experiment pushes a theory to its limit and can thereby reveal inconsistencies or novel effects. The rules of the game are that a) only what is measurable is relevant, and b) you do not fool yourself. This isn’t as easy as it sounds.

The famous Einstein-Podolsky-Rosen experiment was such an exploration of the consequences of a theory, in this case quantum mechanics. In a seminal paper from 1935 the three physicists showed that the standard Copenhagen interpretation of quantum mechanics has a peculiar consequence: It allows for the existence of “entangled” particles.

Entangled particles have measurable properties, for example spin, that are correlated between the two particles even though the value for each single particle is not determined as long as the particles have not been measured. You can know, for example, that if one particle has spin up the other one has spin down, or vice versa, but not know which is which. The consequence is that if one of these particles is measured, the state of the other one changes – instantaneously. The moment you measure one particle to have spin up, the other one must have spin down, even though it did not, according to the Copenhagen interpretation, previously have any specific spin value.

Einstein believed this ‘spooky’ action at a distance to be nonsense, and decades of discussion followed. John Stewart Bell later quantified exactly how entangled particles are more strongly correlated than classical particles could ever be. According to Bell’s theorem, quantum entanglement can violate an inequality that bounds classical correlations.
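
For the curious, here is a minimal sketch of the CHSH form of Bell’s inequality: local hidden variables bound |S| ≤ 2, while the quantum singlet correlation E(a,b) = -cos(a-b) reaches 2√2:

    import math

    # Correlation of spin measurements on a singlet state at analyzer angles a, b.
    def E(a, b):
        return -math.cos(a - b)

    # Angle choices (in radians) that maximize the quantum violation.
    a1, a2 = 0.0, math.pi / 2
    b1, b2 = math.pi / 4, 3 * math.pi / 4

    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(abs(S))  # ~2.83 = 2*sqrt(2), above the classical bound of 2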

When I was a student, tests of Bell’s theorem were still thought experiments. Today they are real experiments, and we know beyond doubt that quantum entanglement exists. It lies at the basis of quantum information and quantum computation, and chances are the technologies of the coming generations will build upon Einstein, Podolsky and Rosen’s thought experiment.

Another famous thought experiment is Einstein’s elevator being pulled up by an angel. Einstein argued that inside the elevator one cannot tell, by any possible measurement, whether the elevator is at rest in a gravitational field or is being pulled up with constant acceleration. This principle of equivalence means that locally (in the elevator) the effects of gravitation are the same as those of acceleration in the absence of gravity. Converted into mathematical equations, it becomes the basis of General Relativity.

Einstein also liked to imagine chasing after photons and he seems to have spent a lot of time thinking about trains and mirrors and so on, but let us look at some other physicists’ thoughts.

Before Einstein and the advent of quantum mechanics, Laplace imagined an omniscient being able to measure the positions and velocities of all particles in the universe. He concluded, correctly, that based on Newtonian mechanics this being, named “Laplace’s demon”, would be able to predict the future perfectly for all times. Laplace did not know back then of Heisenberg’s uncertainty principle, and neither did he know of chaos, both of which spoil predictability. However, his thoughts on determinism were hugely influential and led to the idea of a clockwork universe, and to our understanding of science as a prediction tool in general.
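
To see how chaos alone defeats the demon, even in a perfectly deterministic world, here is a minimal sketch with the logistic map, a standard toy model of chaos:

    # Two trajectories of the (fully deterministic) chaotic logistic map,
    # starting a tiny distance apart: the difference gets amplified until
    # the trajectories are completely uncorrelated.
    x, y = 0.4, 0.4 + 1e-12
    for step in range(50):
        x = 4 * x * (1 - x)
        y = 4 * y * (1 - y)
    print(abs(x - y))  # typically of order one: prediction has failed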

Laplace’s is not the only famous demon in physics. Maxwell also imagined a demon, one that was able to sort the particles of a gas into compartments depending on the particles’ velocities. The task of Maxwell’s demon was to open and close a door connecting two boxes that contain gas which initially has the same temperature on both sides. Every time a fast particle approaches from the right, the demon lets it through to the left. Every time a slow particle arrives from the right, the demon closes the door and keeps it on the right. This way, the average energy of the particles, and thus the temperature in the left box, increases, and the entropy of the whole system decreases. Maxwell’s demon thus seemed to violate the second law of thermodynamics!

Maxwell’s demon gave headaches to physicists for many decades, until it was finally understood that the demon itself must increase its entropy or use energy while it measures, stores, and eventually erases information. It was not until a few years ago that a Maxwell’s demon was in fact realized in the laboratory.
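
The modern form of this resolution is Landauer’s principle: erasing one bit of the demon’s memory dissipates at least kT ln 2 of energy. As a quick sketch of the numbers at room temperature:

    import math

    # Landauer's bound: minimum energy dissipated when erasing one bit of
    # information at temperature T -- what saves the second law from the demon.
    k_B = 1.380649e-23  # Boltzmann constant in J/K
    T = 300.0           # room temperature in K

    print("%.2e J per erased bit" % (k_B * T * math.log(2)))  # ~2.9e-21 J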

A thought experiment that still gives headaches to theoretical physicists today is the black hole information loss paradox. If you combine general relativity and quantum field theory, each of which is an extremely well-established theory, then you find that black holes evaporate. You also find, however, that this process is not reversible; it destroys information for good. This cannot happen in quantum field theory, and thus we face a logical inconsistency when combining the two theories. This cannot be how nature works, so we must be making a mistake. But which? There are many proposed solutions to the black hole information loss problem. Most of my colleagues believe that we need a quantum theory of gravity to resolve it, and that the inconsistency comes about by using general relativity in a regime where it should no longer be used. The thought experiments designed to resolve the problem typically use an imagined pair of observers, Bob and Alice, one of whom is unfortunate enough to have to jump into the black hole while the other one remains outside.

One of the presently most popular solution attempts is black hole complementarity. Proposed in 1993 by Susskind and Thorlacius, black hole complementarity rests on the Gedankenexperiment’s main rules: that only what can be measured matters, and that you should not fool yourself. One can avoid information loss in black holes by copying the information, letting one copy fall into the black hole and the other escape. One copy remains with Bob, one goes with Alice. Copying quantum information, however, is itself inconsistent with quantum theory. Susskind and Thorlacius pointed out that this disagreement would not be measurable by either Bob or Alice, and thus no inconsistency could ever arise.

Black hole complementarity was proposed before the AdS/CFT duality was conjectured, and its popularity took off when it was found that the non-locally doubled presence of information seemed to fit nicely with the duality that arose in string theory.

Recently though, it has become clear that this solution has its own problems, because it seems to violate the equivalence principle. The observer who crosses the horizon should not be able to notice anything unusual there. It should be like sitting in that elevator being pulled up by an angel. Alas, black hole complementarity seems to imply the presence of a “firewall” that would roast the unsuspecting observer in his elevator. Is this for real, or are we making a mistake again? Since the solution to this problem holds the promise of understanding the quantum nature of space and time, much effort has focused on solving it.

Yes, Einstein’s legacy of thought experiments weighs heavily on theoretical physicists today – maybe too heavily, for sometimes we forget that Einstein’s thoughts were based on real experiments. He had the Michelson-Morley experiment, which disproved the aether, he had the perihelion precession of Mercury, he had the measurements of Planck’s radiation law. Thought alone only gets one so far. In the end, it is still data that decides whether a thought can become reality or must remain fantasy.

[Cartoon: Abstruse Goose, Missed Calling]



This post first appeared on "Starts with a Bang".

Tuesday, October 21, 2014

We talk too much.

Image Source: Loom Love.

If I had one word to explain human culture at the dawn of the 21st century, it would be “viral”. Everybody, it seems, is either afraid of or trying to make something go viral. And as a mother of two toddlers in Kindergarten, I am of course well qualified to comment on the issue of spreading diseases, like pinkeye, lice, goat memes, black hole firewalls, and other social infections.

Today’s disease is called rainbow loom. It spreads via wrist bands that you are supposed to crochet together from rubber rings. Our daughters are too young to crochet, but that doesn’t prevent them from dragging around piles of tiny rubber bands which they put on their fingers, toes, clothes, toys, bed posts, door knobs and pretty much everything else. I spend a significant amount of my waking hours picking up these rubber bands. The other day I found some in the cereal box. Sooner or later, we’ll accidentally eat one.

But most of the infections the kids bring home are words and ideas. Recently, they have started calling me “little fart” or “old witch” and, leaving aside the possibility that this is my husband’s vocabulary when I am away, they probably trade these expressions at Kindergarten. I’ll give you two witches for one fart, deal? Lara, amusingly enough, sometimes confuses the words “ass” and “man” – “Arsch” and “Mensch” in German, which her toddler’s lisp renders as “Arch” and “Mench”. You’re not supposed to laugh, you’re supposed to correct them. It’s “Arsch,” Lara, “SCH, not CH, Arsch.”

Man, as Aristotle put it, is a zoon politikon: she lives in communities, she is social, she shares, she spreads ideas and viruses. He does too. I pass through Frankfurt international airport on average once per week. Research shows that the more often you are exposed to a topic, the more important you think it is, regardless of what the source is. It’s the repeated exposure that does it. Once you have a word in your head marked as relevant, your brain keeps pushing it around and hands it back to you to look for further information. Have I said Ebola yet?

Yes, words and ideas, news and memes, go viral, spread, mutate, and affect the way we think. And the more connected we are, the more we share, the more we become alike. We see the same things and talk about the same things. Because if you don’t talk about what everybody else talks about, would you even listen to yourself?

Not so surprisingly then, it has become fashionable to declare the end of individualism in science too, pointing towards larger and larger collaborations, growing co-author networks, the need to share, and the success of sharing. According to this NYT headline, the “ERA OF BIG SCIENCE DIMINISHES ROLE OF LONELY GENIUS”. We can read there:
“Born out of the complexity of modern technology, the era of the vast, big-budget research team came into its own with its scientific achievements of 1984.”
Yes, that’s right, this headline dates back 30 years.

The lonely genius of course has always been a myth. Science is, and has always been, a community enterprise. We’re standing on the shoulders of giants. Most of them are dead, ok, but we’re still standing, standing on these dead people’s shoulders, and we’re still talking and talking and talking. We’re all talking way too much. It’s hard not to have this impression after attending 5 conferences more or less in a row.

Collaboration is very en vogue today, or “trending” as we now say. Nature recently had an article about the measurement of the gravitational constant, G. Not a topic I care deeply about, but the article has an interesting quote:
“Until now, scientists measuring G have competed; everyone necessarily believes in their own value,” says Stephan Schlamminger, an experimental physicist at NIST. “A lot of these people have pretty big egos, so it may be difficult,” he says. “I think when people agree which experiment to do, everyone wants their idea put forward. But in the end it will be a compromise, and we are all adults so we can probably agree.”
Working together could even be a stress reliever, says Jens Gundlach, an experimental physicist at the University of Washington in Seattle. Getting a result that differs from the literature is very uncomfortable, he says. “You think day and night, ‘Did I do everything right?’”
And here I was thinking that worrying day and night about whether you did everything right is the essence of science. But apparently that’s too much stress. It’s clearly better we all work together to make this stressful thinking somebody else’s problem. Can you have a look at my notes and find that missing sign?

The Chinese, as you have almost certainly read, are about to overtake the world, and in that effort they are now reforming their science research system. Nature magazine informs us that the idea of this reform is “to encourage scientists to collaborate on fewer, large problems, rather than to churn out marginal advances in disparate projects that can be used to seek multiple grants. “Teamwork is the key word,” says Mu-Ming Poo, director of the CAS Institute of Neuroscience in Shanghai.” Essentially, it seems, they’re giving out salary increases for scientists to think the same as their colleagues.

I’m a miserable cook. My mode of operation is taking whatever is in the fridge, throwing it into a pan with loads of butter, making sure it’s really dead, and then pouring salt over it. (So you don’t notice the rubber bands.) Yes, I’m a miserable cook. But I know one thing about cooking: if you cook it for too long or stir too much, all you get is mush. It’s the same with ideas. We’re better off with various individual approaches than one collaborative one. Too much systemic risk in putting all your eggs in the same journal.

The kids, they also bring home sand-bathed gummy bears that I am supposed to wash, their friend’s socks, and stacks of millimeter paper glued together because GLUE! Apparently some store donated cubic meters of this paper to the Kindergarten because nobody buys it anymore. I recall having to draw my error bars on this paper, always trying not to use an eraser because the grid would rub away with the pencil. Those were the days.

We speak about ideas going viral, but we never speak about what happens afterwards: we become immune. The first time I heard about the Stückelberg mechanism I thought it was the greatest thing ever. Now it’s on the ever-growing list of oh-yeah-this-thing. I’ve always liked the myth of the lonely genius. I have a new office mate. She is very quiet.