Saturday, December 13, 2014

The remote Maxwell Demon

During the summer, I wrote a paper that I dumped into an arXiv category called cond-mat.stat-mech, and then managed to entirely forget about. So, somewhat belatedly, here is a summary.

Pretty much the only recollection I have of my stat mech lectures is that every single one of them was inevitably accompanied by the same divided box with two sides labeled A and B. Let me draw this for you:


Maxwell’s demon in its original version sits in this box. The demon’s story is a thought experiment meant to highlight the following paradox with the 2nd law of thermodynamics.

Imagine the above box is filled with a gas, and the gas is at a low temperature on side A and at a higher temperature on side B. The second law of thermodynamics says that if you open a window in the dividing wall, the temperatures will approach a common equilibrium value, and in this process entropy is maximized. Temperature is basically average kinetic energy, so the average speed of the gas atoms approaches the same value everywhere, just because this is the most likely thing to happen.

The system can only do work on the way to equilibrium, but no longer once it’s arrived there. Once you’ve reached this state of maximum entropy, nothing happens any more, except for fluctuations. Unless you have a Maxwell demon...

Maxwell’s demon sits at the dividing wall between A and B when both sides are at the same temperature. He opens the window every time a fast atom comes from the left or a slow atom comes from the right; otherwise he keeps it closed. This has the effect of sorting fast and slow atoms so that, after a while, more fast atoms are on the right side than on the left. This means the temperatures are no longer in equilibrium and entropy has decreased. The demon has thus violated the second law of thermodynamics!
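The demon's sorting can be sketched in a few lines of Python. This is a toy model of my own, not anything from the paper: both sides start with speeds drawn from the same distribution, the demon lets fast atoms through to B and slow atoms through to A, and the average speeds (our stand-in for temperature) separate.

```python
# Toy Maxwell demon: sort atoms by speed across a dividing wall.
import random

random.seed(1)
N = 10000
# Both sides start with the same (exponential) speed distribution, i.e. equal temperature.
A = [random.expovariate(1.0) for _ in range(N)]
B = [random.expovariate(1.0) for _ in range(N)]
threshold = 1.0  # the demon's cut between "fast" and "slow"

new_A, new_B = [], []
for v in A:
    (new_B if v > threshold else new_A).append(v)  # fast atoms pass from A to B
for v in B:
    (new_A if v < threshold else new_B).append(v)  # slow atoms pass from B to A

mean = lambda xs: sum(xs) / len(xs)
print(mean(new_A), mean(new_B))  # side B now has the higher average speed
```

No atom's speed ever changes; the demon only uses information about which atom is where, yet the two sides end up at different "temperatures."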

Well, of course he hasn’t, but it took a century for physicists to pin down the exact reason why. In brief, it’s that the demon must be able to obtain, store, and use information. And he can only do that if he either starts at a low entropy that then increases, or brings along an infinite reservoir of low entropy. The total entropy never decreases, and the second law is alive and well.
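One can put a number on the cost of handling information. By Landauer's principle (the standard textbook statement behind this resolution, not something specific to my paper), erasing one bit of the demon's memory dissipates at least kT ln 2 of energy:

```python
# Landauer bound: minimum energy dissipated per erased bit at temperature T.
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K
T = 300.0           # room temperature in K
E_min = k_B * T * math.log(2)
print(E_min)  # roughly 3e-21 J per bit -- tiny, but never zero
```

Summed over all the bits the demon needs to keep sorting atoms, this is exactly what restores the entropy balance.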

It has only been in recent years that some versions of Maxwell’s demon have been experimentally realized in the laboratory. These demons essentially use information to drive a system out of equilibrium, which can then, in principle, do work.

It occurred to me that this must mean it should be possible to replace the transfer of energy from a sender to a receiver with a transfer of information, and this information transfer could take place with a much smaller energy than what the receiver gets out of the information. In essence, this would mean one can down-convert energy during transmission.

The reason this is possible is that the relevant energy here is not the total energy – a system in thermal equilibrium has lots of energy. The relevant energy that we want at the receiving end is free energy – energy that can be used to do work. The signal does not need to contain the energy itself, it only needs to contain the information that allows one to drive the system out of equilibrium.

In my paper, I have constructed a concrete example for how this could work. The full process must include remote measuring, extraction of information from the measurement, sending of the signal, and finally making use of the signal to actually extract energy. The devil, or in this case the demon, is in the details. It took me a while to come up with a system simple enough that one could in the end compute the energy conversion and also show that the whole thing, remote demon included, obeys the Carnot limit on the efficiency of heat engines.

In the classical example of Maxwell’s demon, the necessary information is the velocity of the particles approaching the dividing wall, but I chose a simpler system with discrete energy levels, just because the probability distributions are then easier to deal with. The energy extraction that my demon performs is a variant of the stimulated emission that is also used in lasers.

The atoms in a laser are “pumped” into an out-of-equilibrium state, which has the property that as you inject light (i.e., energy) with the right frequency, you get out more light of the same frequency than you sent in. This does not work if the system is in equilibrium though; it is then always more likely that the injected signal is absorbed than that it stimulates a net emission.

However, a system in equilibrium always has fluctuations. The atoms have some probability to be in an excited state, a state in which they could be stimulated to emit light. If you just knew which atoms were in the excited state, then you could target them specifically, and end up with twice the energy that you sent in.
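For a sense of scale, the equilibrium probability of catching a two-level atom in its excited state follows from the Boltzmann distribution. The 1 eV level splitting below is an illustrative assumption of mine, not a number from the paper:

```python
# Equilibrium occupation of the excited state of a two-level atom.
import math

k_B = 1.380649e-23           # Boltzmann constant in J/K
T = 300.0                    # room temperature in K
dE = 1.0 * 1.602176634e-19   # assumed level splitting of 1 eV, converted to J

# Boltzmann weight of the excited state relative to both states:
boltzmann = math.exp(-dE / (k_B * T))
p_excited = boltzmann / (1 + boltzmann)
print(p_excited)  # vanishingly small: absorption dominates in equilibrium
```

This is why the demon's information matters: blindly injected light overwhelmingly hits ground-state atoms and gets absorbed, but a signal targeted at the rare excited atoms gets amplified.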

So that’s what my remote demon does: It measures fluctuations away from equilibrium in some atomic system and targets these to extract energy. The main point is that the energy sent to the system can be much smaller than the extracted energy. It is, in essence, a wireless battery recharger. Except that the energies in question are, in my example, so tiny that it’s practically entirely useless.

I’ve never worked on anything in statistical mechanics before. Apparently I don’t even have a blog label to tag it! This was a fun project and I learned a lot. I even made a drawing to accompany it.


Saturday, December 06, 2014

10 things you didn’t know about the Anthropic Principle

“The anthropic principle – the idea that our universe has the properties it does because we are here to say so and that if it were any different, we wouldn’t be around commenting on it – infuriates many physicists, including [Marc Davis from UC Berkeley]. It smacks of defeatism, as if we were acknowledging that we could not explain the universe from first principles. It also appears unscientific. For how do you verify the multiverse? Moreover, the anthropic principle is a tautology. “I think this explanation is ridiculous. Anthropic principle… bah,” said Davis. “I’m hoping they are wrong [about the multiverse] and that there is a better explanation.””
~Anil Ananthaswamy, in “The Edge of Physics”
Are we really so special?
Starting in the mid-70s, the anthropic principle has been employed in physics as an explanation for the values of parameters in our theories, but in 2014 I still come across ill-informed statements like the one above in Anil Ananthaswamy’s (otherwise very recommendable) book “The Edge of Physics”. I’m no fan of the anthropic principle because I don’t think it will lead to big insights. But it’s neither useless nor a tautology, nor does it concede that the universe can’t be explained from first principles.

Below are the most important facts about the anthropic principle, where I am referring to the definition from Ananthaswamy’s quote: “Our universe has the properties it does because if it were any different we wouldn’t be here to comment on it.”
  1. The anthropic principle doesn’t necessarily have something to do with the multiverse.

    The anthropic principle is correct regardless of whether there is a multiverse or not, and regardless of what the underlying explanation for the values of parameters in our theories is, if there is one. The reason it is often brought up by multiverse proponents is that they claim the anthropic principle is the only explanation, and there is no other selection principle for the parameters that we observe. One then needs to show, though, that the values of the parameters we observe are indeed the only ones (or at least very probable ones) if one requires that life is possible. This is however highly controversial, see 2.

  2. The anthropic principle cannot explain the values of all parameters in our theories.

    The typical claim that the anthropic principle explains the values of parameters in the multiverse goes like this: If parameter x were just a little larger or smaller, we wouldn’t exist. The problem with this argument is that small variations of one out of two dozen parameters do not explore the bulk of possible combinations. You’d really have to consider independent modifications of all parameters to be able to conclude there is only one combination supportive of life. This however is not a presently feasible calculation.

    Though we cannot presently scan the whole parameter space to find out which combinations might be supportive of life, we can do a little better than one and try at least a few. This has been done, and thus we know that the claim that there is really only one combination of parameters that will create a universe hospitable to life stands on very shaky ground.

    In their 2006 paper “A Universe Without Weak Interactions”, published in PRD, Harnik, Kribs, and Perez put forward a universe that seems capable of creating life and yet is entirely different from our own [arXiv:hep-ph/0604027]. Don Page argues that the universe would be more hospitable for life if the cosmological constant were smaller than the observed value [arXiv:1101.2444], and recently it was claimed that life might have been possible already in the early universe [arXiv:1312.0613]. All these arguments show that a chemistry complex enough to support life can arise under circumstances that, while still special, are not anything like the ones we experience today.

  3. Even so, the anthropic principle might still explain some parameters.

    The anthropic principle might however still work for some parameters if their effect is almost independent of what the other parameters do. That is, even if one cannot use the anthropic principle to explain all values of parameters because one knows there are other combinations allowing for the preconditions of life, some of these parameters might need to have the same value in all cases. The cosmological constant is often claimed to be of this type.

  4. The anthropic principle is trivial but that doesn’t mean it’s obvious.

    Mathematical theorems, lemmas, and corollaries are results of derivations following from assumptions and definitions. They essentially are the assumptions, just expressed differently. They are always true and sometimes trivial. But often, they are surprising and far from obvious, though that is inevitably a subjective statement. Complaining that something is trivial is like saying “It’s just sound waves” and referring to everything from engine noise to Mozart.

  5. The anthropic principle isn’t useless.

    While the anthropic principle might strike you as somewhat silly and trivially true, it can be useful, for example, to rule out values of certain parameters. The most prominent example is probably the cosmological constant which, if it were too large, wouldn’t allow the formation of structures large enough to support life. This is not an empty conclusion. It’s like when I see you drive to work by car every morning and conclude you must be old enough to have a driver’s license. (You might just be stubbornly disobeying the law, but the universe can’t do that.) The anthropic principle is at its core a consistency constraint on the parameters in our theories. One could derive from it predictions on the possible combinations of parameters, but since we have already measured them, these are now merely postdictions.

    Fred Hoyle's prediction of properties of the carbon nucleus that make possible the synthesis of carbon in stellar interiors — properties that were later discovered as predicted — is often quoted as a successful application of the anthropic principle because Hoyle is said to have exploited the fact that carbon is central to life on Earth. Some historians have questioned whether this was indeed Hoyle's reasoning, but the mere fact that it could have been shows that anthropic reasoning can be a useful extrapolation of observation, in this case the abundance of carbon on our planet.

  6. The anthropic principle does not imply a causal relation.

    Though “because” suggests it, there is no causation in the anthropic principle. An everyday example for “because” not implying an actual cause: I know you’re sick because you’ve got a cough and a runny nose. This doesn’t mean the runny nose caused you to be sick. Instead, it was probably some virus. Alas, you can carry a virus without showing symptoms, so it’s not like the virus is the actual “cause” of my knowing. Likewise, that there is somebody here to observe the universe did not cause a life-friendly universe to come into existence. (And the reverse, that a life-friendly universe caused our existence, doesn’t work because it’s not like the life-friendly universe sat somewhere out there and then decided to come into existence to produce some humans.)

  7. The applications of the anthropic principle in physics have actually nothing to do with life.

    As Lee Smolin likes to point out, the mention of “life” in the anthropic principle is entirely superfluous verbal baggage (my words, not his). Physicists don’t usually have a lot of business with the science of self-aware conscious beings. They talk about the formation of large-scale structures or atoms that are preconditions for biochemistry, but you shouldn’t even expect physicists to discuss large molecules. Talking about “life” is arguably catchier, but that’s really all there is to it.

  8. The anthropic principle is not a tautology in the rhetorical sense.

    It does not use different words to say the same thing: A universe might be hospitable to life and yet life might not feel like coming to the party, or none of that life might ever ask a why-question. In other words, getting the parameters right is a necessary but not a sufficient condition for the evolution of intelligent life. The rhetorically tautological version would be “Since you are here asking why the universe is hospitable to life, life must have evolved in that universe that now asks why the universe is hospitable to life.” Which you can easily identify as rhetorical tautology because now it sounds entirely stupid.

  9. It’s not a new or unique application.

    Anthropic-type arguments, based on the observation that there exists somebody in this universe capable of making an observation, are not only used to explain free parameters in our theories. They sometimes appear as “physical” requirements. For example: we assume there are no negative energies because otherwise the vacuum would be unstable and we wouldn’t be here to worry about it. And requirements like locality, separation of scales, and well-defined initial value problems are essentially based on the observation that otherwise we wouldn’t be able to do any science, if there were anybody to do anything at all. Logically, these requirements are the same as anthropic arguments; they just aren’t referred to as such.

  10. Other variants of the anthropic principle have questionable scientific value.

    The anthropic principle becomes speculative, not to say unscientific, once you try to go beyond the definition that I referred to here. If one does not understand that a consistency constraint does not imply a causal relation, one comes to the strange conclusion that humans caused the universe into existence. And if one does not accept that the anthropic principle is just a requirement that a viable theory has to fulfil, one is then stuck with the question why the parameter values are what they are. Here is where the multiverse comes back, for you can then argue that we are forced to believe in the “existence” of universes with all possible combinations. Or you can go off the deep end and argue that our universe was designed for the existence of life.

    Personally I feel the urge to wash my hands after having been in touch with these kinds of arguments. I prefer my principles trivially true.


This post previously appeared October 21st 2014 on Starts with a Bang.

Saturday, November 29, 2014

Negative Mass in General Relativity?

[Image Source: Ginva.com]
Science News ran a piece the other week about a paper that has appeared in PRD, titled “Negative mass bubbles in de Sitter spacetime”. The Science News article is behind a paywall, but don’t worry, I’ll tell you everything you need to know.

The arxiv version of the paper is here. Since I’m quoted in the Science News piece saying something to the extent that I have my reservations but think it’s a promising direction of study, I have gotten a lot of questions about negative masses in General Relativity lately. So here a clarification.

First one has to be careful about what one means by mass. There are three types of masses: inertial mass, passive gravitational mass, and active gravitational mass. In General Relativity these masses, or their generalizations in terms of tensors respectively, are normally assumed to be identical.

The equality of inertial and passive gravitational mass is basically the equivalence principle. The active gravitational mass is what causes space-time to bend; the passive gravitational mass is what couples to the space-time and determines the motion of particles in that background. The active and passive gravitational masses are identical in almost all theories I know. (The Schrödinger-Newton approach is the only exception that comes to mind). I doubt it is consistent to have them not be equal, but I am not aware of a proof for this. (I tried in the Schrödinger-Newton case, but it’s not as trivial as it looks at first sight.)

In General Relativity one further has to distinguish between the local quantities like energy-density and pressure and so on that are functions of the coordinates, and global quantities that describe the space-time at large. The total mass or energy in some asymptotic limit is essentially an integral over the local quantities, and there are several slightly different ways to define it.

The positive mass theorem, in contrast to what its name suggests, does not state that one cannot have particles with negative masses. It states instead, roughly, that if your local matter is normal matter and obeys certain plausible assumptions, then the total energy and mass are also positive. You thus cannot have stars with negative masses, regardless of how you bend your space-time. This isn’t as trivial a statement as it sounds because the gravitational interaction contributes to the definition of these integrated quantities. In any case, the positive mass theorem holds in space that is asymptotically flat.

Now, what they point out in the new paper is that for all we know we don’t live in asymptotically flat space; we live in asymptotic de Sitter space, because observational evidence speaks for a positive cosmological constant. In this case the positive mass theorem doesn’t apply. They then go on to construct a negative mass solution in asymptotic de Sitter space. I didn’t check the calculation in detail, part of it is numerical, but it all sounds plausible to me.

However, it is somewhat misleading to call the solution that they find a negative mass solution. The cosmological constant makes a contribution to the effective mass term in what you can plausibly interpret as the gravitational potential. Taken together, the effective mass in the potential is positive in the region where this solution applies. The local mass (density) is also positive by assumption. (You see this most easily by looking at Fig. 1 in the paper.)

Selling this as a negative mass solution is like one of these ads that say you’ll save $10 if you spend at least $100 – in the end your expenses are always positive. The negative mass in their solution corresponds to the supposed savings that you make. You never really get to see them. What really matters are the total expenses. And these are always positive. There are thus no negative mass particles in this scenario whatsoever. Further, the cosmological constant is necessary for these solutions to exist, so you cannot employ them to replace the cosmological constant.
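To see the general point in numbers, here is a back-of-the-envelope sketch based on the standard Schwarzschild-de Sitter potential, Phi(r) = -GM/r - Lambda c² r²/6. Writing it as Phi(r) = -G M_eff(r)/r gives an effective mass M_eff(r) = M + Lambda c² r³/(6G), so even a negative mass parameter M can yield a positive effective mass at large enough radius. The numbers for M and r are illustrative assumptions of mine, not values from the paper:

```python
# Effective mass in a Schwarzschild-de Sitter potential (illustrative numbers).
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
Lam = 1.1e-52   # observed cosmological constant, m^-2
M = -1.0e30     # assumed negative mass parameter, kg (about half a solar mass)

def M_eff(r):
    # Contribution of M plus the cosmological-constant term at radius r (meters).
    return M + Lam * c**2 * r**3 / (6 * G)

print(M_eff(1.0e21) > 0)  # at ~30 kpc the Lambda term dominates and M_eff is positive
```

This is just the "total expenses" point of the ad analogy: the negative M is swamped by the cosmological constant's contribution in the region of interest.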

It also must be added that showing the existence of a certain solution to Einstein’s field equations is one thing; showing that it has a reasonable chance to actually be realized in Nature is an entirely different thing. For this you have to come up with a mechanism to create such solutions and you also have to show that they are stable. Neither point is addressed in the paper.

Advertisement break: If you want to know how one really introduces negative masses into GR, read this.

In the Science News article Andrew Grant quotes one of the authors as saying:
“Paranjape wants to look into the possibility that the very early universe contained a plasma of particles with both positive and negative mass. It would be a very strange cosmic soup, he says, because positive mass gravitationally attracts everything and negative mass repels everything.”
This is wrong. Gravitation is a spin-2 interaction. It is straightforward to see that this means that like charges attract and unlike charges repel. The charge of gravity is the mass. This does not mean that negative gravitational mass repels everything. Negative gravitational mass repels positive mass but attracts negative mass. If this wasn’t so, then you’d run into the above mentioned inconsistencies. The reason this isn’t so in the case considered in the paper is that they don’t have negative masses to begin with. They have certain solutions that basically have a gravitational attraction which is smaller than expected.

In summary, I think it’s an interesting work, but so far it’s an entirely theoretical construct and its relevance for the description of cosmological dynamics is entirely unclear. There are no negative mass particles in this paper in any sensible interpretation of this term.

Saturday, November 22, 2014

Gender disparity? Yes, please.

[Image Source: Papercards]

Last month, a group of Australian researchers from the life sciences published a paper that breaks down the duration of talks at a 2013 conference by gender. They found that while overall attendance and the number of presentations were almost equally shared between men and women, the women on average spoke for shorter periods of time. The main reason for this was that the women applied for shorter talks to begin with. You find a brief summary on the Nature website.

The twitter community of women in science was all over this, encouraging women to make the same requests as men, asserting that women “underpromote” themselves by not taking up enough of their colleagues’ time.



Other studies have previously found that while women on the average speak as much as men during the day, they tend to speak less in groups, especially so if the group is predominantly male. So the findings from the conference aren’t very surprising.

Now a lot of what goes around on twitter isn’t really meant seriously, see the smiley in Katie Hinde’s tweet. I remarked one could also interpret the numbers to show that men talk too much and overpromote themselves. I was joking of course to make a point, but after dwelling on this for a while I didn’t find it that funny anymore.

Women are frequently told that to be successful they should do the same as men do. I don’t know how often I have seen advice explaining how women are allegedly belittling themselves by talking, well, like a woman. We are supposed to be assertive and take credit for our achievements. Pull your shoulders back, don’t cross your legs, don’t flip your hair. We’re not supposed to end every sentence as if it was a question. We’re not supposed to start every interjection with an apology. We’re not supposed to be emotional and personal, and so on. Yes, all of these are typically “female” habits. We are told, in essence, there’s something wrong with being what we are.

Here is for example a list with public speaking tips: Don’t speak about yourself, don’t speak in a high pitch, don’t speak too fast because “Talking fast is natural with two of your best friends and a bottle of Mumm, but audiences (especially we slower listening men) can’t take it all in”. Aha. Also, don’t flirt and don’t wear jewelry because the slow men might notice you’re a woman.

Sorry, I got sick at point five and couldn’t continue – must have been the Mumm. Too bad if your anatomy doesn’t support the low pitches. If you believe this guy that is, but listen to me for a moment, I swear I’ll try not to flirt. If your voice sounds unpleasant when you’re giving a talk, it’s not your voice, it’s the microphone and the equalizer, probably set for male voices. And do we really need a man to tell us that if we’re speaking about our research at a conference we shouldn’t talk about our recent hiking trip instead?

There are many reasons why women are underrepresented in some professions and overrepresented in others. Some of it is probably biological, some of it is cultural. If you are raising or have raised a child, it is abundantly obvious that our little ones are subjected to gender stereotypes starting at a very young age. Part of it is the clothing and the toys, but more importantly it’s simply that they observe the status quo: Childcare is still predominantly female business, and I have yet to see a woman on a garbage truck.

Humans are incredibly social animals. It would be surprising if the prevailing stereotypes did not affect us at all. That’s why I am supportive of all initiatives that encourage children to develop their talents regardless of whether these talents are deemed suitable for their gender, race, or social background. Because these stereotypes are thousands of years old and have become hurdles to our self-development. By and large, I see more encouragement for girls than I see for boys to follow their passion regardless of what society thinks, and I also see that women have more backup fighting unrealistic body images, which is what this previous post was about. Ironically, I was criticized on twitter for saying that boys don’t need to have a superhero body to be real men because that supposedly wasn’t fair to the girls.

I am not supportive of hard quotas that aim at prefixed male-female ratios. There is no scientific support for these ratios, and moreover I witnessed repeatedly that these quotas have a big backlash, creating a stigma that “She is just here because” whether or not that is true.

Thus, at the present level, women are likely still underrepresented compared to where we would be if we managed to ignore social pressure to follow ancient stereotypes. And so I think that we would benefit from more women among the scientists, especially in math-heavy disciplines. Firstly, because we are unnecessarily missing out on talent. But also because diversity is beneficial for the successful generation and realization of ideas. The relevant diversity is in the way we think and argue. Again, this is probably partly biological and partly cultural, but whatever the reason, a diversity of thought should be encouraged, and this diversity is almost certainly correlated with demographic diversity.

That’s why I disapprove of so-called advice that women should talk and walk and act like men. Because that’s exactly the opposite from what we need. Science stands to benefit from women being different from men. Gender equality doesn’t mean genders should be equal, it means they should have the same opportunities. So women are more likely to volunteer organizing social events? Wtf is wrong with that?

So please go flip your hair if you feel like it, wear your favorite shirt, put on all the jewelry you like, and generally be yourself. Don’t let anybody tell you to be something you are not. If you need the long slot for your talk go ahead. If you’re confident you can get across your message in 15 minutes, even better, because we all talk too much anyway.


About the video: I mysteriously managed to produce a video in High Definition! Now you can see all my pimples. My husband made a good camera man. My anonymous friend again helped cleaning up the audio file. Enjoy :)

Wednesday, November 19, 2014

Frequently Asked Questions

[Image source: Stickypictures.]

My mom is a now-retired high school teacher. As a teenager I thought this was a great job and wanted to become a teacher myself. To practice, I made money giving homework help, but discovered quickly that I hated it, for a simple reason: I don’t like to repeat myself. I really don’t like to repeat myself.

But if I thought spending two years repeating how to take square roots - to the same boy - was getting me as close to spontaneous brain implosion as I ever wanted to get, it still didn’t quite prepare me for the joys of parenthood. Only the twins would introduce me to the pleasure of hearing Jingle Bells for 5 hours in a row, and of re-reading the story about Clara and her Binky until the book mysteriously vanished, not to be seen again unless somebody bothers to clean behind the shoe rack. “I told you twice not to wash the hair dryer,” clearly wasn’t my most didactic moment. But my daughter just laughed when the fuse blew and the lights went off. Thanks for asking, we got a new dryer.

And so I often feel like I write this blog as an exercise in patience. Nobody, of course, bothers to search the blog archives where I have explained everything. Sometimes twice! But today I will try to be inspired by Ethan, who seems to have the patience of an angel, if a blue one, and basically answers the same questions over and over again. So here are answers to the questions I get most often. Once and for all, I hope...
  1. Is string theory testable?

    The all-time favorite. Yes, it is. There is really no doubt about it. The problem is that it is testable in principle, but at least so far nobody knows how to test it in practice. The energy (densities) necessary for this are just too high. Some models that are inspired by string theory, notably string cosmology, are testable with existing experiments. That it is testable in principle is a very important point, because some variants of the multiverse aren’t even testable in principle, and then it is indeed highly questionable whether it is still science. Not so though for string theory. And let me be clear that I mean here string theory as the candidate theory of everything including gravity. Testing string theory as a means to explain certain strongly coupled condensed matter systems is an entirely different thing.

  2. Do black holes exist?

    Yes. We have ample evidence that supermassive black holes exist in the centers of many galaxies and that solar-mass black holes are found throughout galaxies. The existence of black holes is today a generally accepted fact in the physics community. That black holes exist means, concretely, that we have observational evidence for objects dense enough to be a black hole that do not have a hard surface, so they cannot be very dim stars. One can exclude this possibility because matter hitting the surface of a star would emit radiation, whereas the same would not happen when matter falls through the black hole horizon. This horizon does not have to be an eternal horizon. It is consistent with observation, and indeed generally believed, that the black hole horizon can eventually vanish, though this will not happen until hundreds of billions of years into the future. The defining property of the black hole is the horizon, not the singularity at its center, which is generally believed to not exist but for which we have no evidence one way or the other.

  3. Why quantize gravity?

    There is no known way to consistently couple the non-quantized theory of general relativity to the quantum field theories of the standard model. This only works in limiting cases. The most plausible way to resolve this tension is to quantize gravity too. It is in principle also possible that instead there is a way to couple quantum and classical theories that has so far been missed, or that the underlying theory is in some sense neither classical nor quantum, but this option is not favored by most researchers in the field today. Either way, the inconsistency in our existing theories is a very strong indication that the theories we have are incomplete. Research in quantum gravity basically searches for the completion of the existing theories. In the end this might or might not imply actually quantizing gravity, but Nature somehow knows how to combine general relativity with quantum field theory, and we don’t.

  4. Why is it so hard to quantize gravity?

    It isn’t. Gravity can be quantized pretty much the same way as the other interactions. It’s just that the theory one arrives at this way cannot be a fundamental theory because it breaks down at high energies. It is thus not the theory that we are looking for. Roughly speaking the reason this happens is that the gravitational equivalent of a particle’s charge is the particle’s energy. For the other known interactions the charge and the energy are distinct things. Not so for gravity.

  5. Is quantum gravity testable?

    Again, yes it is definitely testable in principle, it’s just that the energy density necessary for strong quantum gravitational effects is too high for us to produce. Personally I am convinced that quantum gravity is also testable in practice, because indirect evidence can prevail at much lower energy densities, but so far we do not have experimental evidence. There is a very active research area called quantum gravity phenomenology dedicated to finding the missing experimental evidence. You can check these two review papers to get an impression of what we are presently looking for.

Wednesday, November 12, 2014

The underappreciated value of boring truths

My primary reaction to any new idea on the arXiv is conviction that it’s almost certainly wrong, and if I can’t figure out quickly why it’s wrong, I’ll ignore it because it’s most likely a waste of time. In other words, I exemplify the stereotypical reaction of scientists which Arthur Clarke summed up so nicely in his three stages of acceptance:
  1. “It’s crazy — don’t waste my time.”
  2. “It’s possible, but it’s not worth doing.”
  3. “I always said it was a good idea.”

Maybe I’m getting old and bold rather than wise and nice, but when it comes to quantum gravity phenomenology, craziness seems to thrive particularly well. My mother asked me the other day what I tell a journalist who wants a comment on somebody else’s work which I think is nonsense. I told her I normally say “It’s very implausible.” No, I’m not nice enough to bite my tongue if somebody asks for an opinion. And so, let me tell you that most of what gets published under the name of quantum gravity phenomenology is, well, very implausible.

But quantum gravity phenomenology is just an extreme example of a general tension that you find in theoretical physics. Imagine you ranked all unconfirmed theories on two scales, one the spectrum from exciting to boring, the other the spectrum from very implausible to likely correct. Then put a dot for each theory in a plane with these two scales as axes. You’d see that the two measures are strongly correlated: The nonsense is exciting, and the truth is boring, and most of what scientists work on falls on a diagonal from exciting nonsense to boring truths.


If you’d break this down by research area you’d also find that the more boring the truth, the more people work on nonsense. Wouldn’t you too? And that’s why there is so much exciting nonsense in quantum gravity phenomenology - because the truth is boring indeed.

Conservative wisdom says that quantum gravitational effects are tiny unless space-time curvature is very strong, which only happens in the early universe and inside black holes. This expectation comes from treating quantum gravity as an effective field theory and quantizing it perturbatively, i.e., when the fluctuations of space-time are small. The theory quantized this way does not make sense as a fundamental theory of gravity because it breaks down at high energies, but it should be fine for calculations in weak gravitational fields.

Most of the exciting ideas in quantum gravity phenomenology assume that this effective limit does not hold for one reason or the other. The most conservative way to be non-conservative is to allow the violation of certain symmetries that are leftover from a fundamental theory of quantum gravity which does not ultimately respect them. Violations of Lorentz-invariance, CPT invariance, space-time homogeneity, or unitarity are such cases that can be accommodated within the effective field theory framework, and that have received much attention as possible signatures of quantum gravity.

Other more exotic proposals implicitly assume that the effective limit does not apply for unexplained reasons. It is known that effective field theories can fail under certain circumstances, but I can’t see how any of these cases plays a role in the weak-field limit of gravity. Then again, strong curvature is one of the reasons for failure, and we do not understand what the curvature of space-time is microscopically. So sometimes, when I feel generous, I promote “implausible” to “far-fetched”.

John Donoghue is one of the few heroically pushing through calculations in the true-but-boring corner of quantum gravity phenomenology. In a recent paper, he and his coauthors calculated the quantum contributions to the bending of light in general relativity from 1-loop effects in perturbatively quantized gravity. From their result they define a semi-classical gravitational potential and derive the quantum corrections to Einstein’s classical test of General Relativity by light deflection.

They find a correction term that is suppressed by a factor ℏG/b² relative to the classical result, where b is the impact parameter and G is Newton’s constant. This is the typical result you’d expect for dimensional reasons. It’s a loop correction, it must have an extra G in it, it must have an inverse power of the impact parameter so it gets smaller with distance, thus G/b² is a first guess. Of course you don’t get tenure for guessing, and the actual calculation is quite nasty, see the paper for details.

In the paper the authors write “we conclude that the quantum effect is even tinier than the current precision in the measurement of light deflection”, which is an understatement if I have ever seen one. If you are generous and put in a black hole of mass M and a photon that just about manages to avoid being swallowed, the quantum effect is smaller by a factor (mp/M)2 than the classical term, where mp is the Planck mass. For a solar mass black hole this is about 70 orders of magnitude suppression. (Though on such a close approach the approximation with a small deflection doesn’t make sense any more.) If you have a Planck-mass black hole, the correction term is of order one – again that’s what you’d expect.
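To get a feeling for how tiny this is, here is a back-of-the-envelope check of the (mp/M)² suppression. This is only an order-of-magnitude sketch, using rounded SI values for the Planck mass and the solar mass; the precise prefactor depends on the full calculation:

```python
# Order-of-magnitude estimate of the (m_p/M)^2 suppression of the
# quantum correction for a solar-mass black hole (rounded SI values).
m_planck = 2.18e-8    # Planck mass in kg
M_sun = 1.99e30       # solar mass in kg

suppression = (m_planck / M_sun) ** 2
print(suppression)    # dozens of orders of magnitude below one
```

Run it and you get a number around 10⁻⁷⁶, which is why no conceivable light-deflection measurement will ever see this term.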

Yes, that is a very plausible result indeed. I would be happy to tell this to any journalist, but unfortunately news items seem to be almost exclusively picked from the ever increasing selection of exciting nonsense.

I will admit that it is hard to communicate the relevance of rather technical calculations that don’t lead to stunning results, but please bear with me while I try. The reason this work is so important is that we have to face the bitter truth to find out whether that’s really all that there is or whether we indeed have reason to expect the truth isn’t as bitter as it said on the wrapping. You have to deal with a theory and its nasty details to figure out where it defies your expectations and where your guesses go wrong. And so, we will have to deal with effective quantum gravity to understand its limits. I always said it was a good idea. Even better that somebody else did the calculation so I can continue thinking about the exciting nonsense.

Bonus: True love.


Tuesday, November 11, 2014

And the winners are...

The pile of money whose value you have been guessing came out to be 68.22 Euro and 0.5 Deutsche Mark, the latter of which I didn't count. Hoping that I didn't miss anybody's guess, this means the three winning entries are:
  • Rbot: 72
  • Rami Kraft: 62
  • droid33: 58.20
Congratulations to the winners! Please send an email to hossi[at]nordita.org with your postal address and I will send the books on their way.

Saturday, November 08, 2014

Make a guess, win a book.

The twins' piggy banks are full, so I've slaughtered them. Put in your guess of how much they've swallowed and you can win a (new) copy of Chad Orzel's book "How to Teach Quantum Physics to Your Dog". (No, I'm not getting paid for this, I have a copy I don't need and hope it will make somebody happy.) You can put in your guess until Monday, midnight, East Coast Time. I will only take into account guesses posted in the comments - do not send me an email. I am looking for the amount in Cent or Euro, not the number of coins. The winners will be announced Tuesday morning. Good luck!

Wednesday, November 05, 2014

The paradigm shift you didn’t notice

Inertia creeps.

Today, for the first time in human history a scientist has written this sentence – or so would be my summary of most science headlines I read these days. Not only do the media buy rotten fish, they actually try to resell them. The irony is though that the developments which really change the way we think and live happen so gradually you wouldn’t ever learn about them in these screaming headlines.

HIV infection for example still hasn’t been cured, but decades of hard work turned it from a fatal disease into a treatable one. You read about this in longwinded essays in the back pages where nobody looks, but not on the cover page and not in your news feed. The real change didn’t come about by this one baby who smiles on the photo and who was allegedly cured, as the boldface said, but by the hundreds of trials and papers and conferences in the background.

These slow changes also happen in physics. Quantum measurement is a decoherence process rather than collapse. This doesn’t break the ground but slowly moves it. It’s an interpretational shift that has spread through the community. Similarly, it is now generally accepted that most infinities in quantum field theory do not signal a breakdown of the theory but can be dealt with by suitable calculational methods.

For me the most remarkable shift that has taken place in physics in the last decades is the technical development and, with it, acceptance of renormalization group flow and effective field theories. If this sounds over your head, bear with me for I’m not going into the details, I just want to tell you why it matters.

You have certainly heard that some quantum field theories are sick and don’t make sense – they are said to be non-renormalizable. In such a theory the previously mentioned infinities cannot be removed, or they can only be removed at the expense of introducing infinitely many free parameters, which makes the theory useless. Half a century ago a theory with this disease was declared dead and went where theories go to die, into the history aisle.

Then it became increasingly clear that such non-renormalizable theories can be low-energy approximations to other theories that are healthy and renormalizable. The infinities are artifacts of the approximation and appear if one applies the approximation outside its regime of validity.

These approximations at low energies are said to be “effective” theories and they typically contain particles or degrees of freedom that are not fundamental, but instead “emergent”, which is to say they are good descriptions as long as you don’t probe them with too high energy. The theory that is good also at high energies is said to be the “UV completion” of the effective theory. (If you ever want to fake a physics PhD just say “in the IR” instead of “at low energy” and “UV” instead of “high energy”.)

A typical example for an effective theory is the nuclear force between neutrons and protons. These are not fundamental particles – we know that they are made of quarks and gluons. But for nuclear physics, at energies too small to test the quark substructure, one can treat the neutrons and protons as particles in their own right. The interaction between them is then effectively mediated by a pion, a particle that is itself composed of a quark and an antiquark.

Fermi’s theory of beta-decay is a historically very important case because it brought out the origin of non-renormalizability. We know today that the weak interaction is mediated by massive gauge-bosons, the W’s and the Z. But at energies so low that one cannot probe the production and subsequent decay of these gauge bosons, the weak interaction can be effectively described without them. When a neutron undergoes beta decay, it turns into a proton and emits an electron and electron-anti-neutrino. If you do not take into account that this happens because one of the quark constituents emits a W-boson, then you are left with a four-fermion interaction with a coupling constant that depends on the mass of the W-boson. This theory is not renormalizable. Its UV completion is the standard model.

Upper image: One of the neutron's quark constituents interacts via a gauge boson with an
electron. Bottom image: If you neglect the quark substructure and the boson-exchange, you get a four-fermion interaction with a coupling that depends on the mass of the boson and which is non-renormalizable.
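How the four-fermion coupling inherits the mass of the W-boson can be checked numerically. At tree level the matching relation is G_F/√2 = g²/(8 M_W²); the values below are rounded, so this is a consistency sketch, not a precision fit:

```python
import math

# Tree-level relation between Fermi's constant and the W mass:
# the four-fermion coupling of the effective theory is fixed by the
# mass of the boson that was "integrated out". Rounded values.
g = 0.65          # weak gauge coupling (approximate)
M_W = 80.4        # W boson mass in GeV
G_F = 1.166e-5    # measured Fermi constant in GeV^-2

G_F_tree = math.sqrt(2) * g**2 / (8 * M_W**2)
print(G_F_tree)   # close to the measured value of G_F
```

The tree-level number comes out within a percent or so of the measured Fermi constant, which is exactly the point: the low-energy effective coupling knows about the heavy boson only through its mass.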


So now we live and work with the awareness that any quantum field theory is only one in a space of theories that can morph into each other, and the expression of the theory changes with the energy scale at which we probe the physics. A non-renormalizable theory is perfectly fine in its regime of validity. And thus today these theories are not declared dead any longer, they are declared incomplete. A theory might have other shortcomings than being non-renormalizable, for example because it contains dimensionless constants much larger than (or smaller than) one. Such a theory is called unnatural. In this case too you would now not simply discard the theory but look for its UV completion.

It is often said that physicists do not know how to quantize gravity. This isn’t true though. Gravity can be quantized just like the other interactions; the result is known as “perturbatively quantized gravity”. The problem is that the theory one gets this way is non-renormalizable, which is why it isn’t referred to as quantum gravity proper. The theory of quantum gravity that we do not know is the UV-completion of this non-renormalizable perturbative quantization. (Its UV completion cannot work in the same way as that of Fermi’s theory, because gravity is a long-range interaction: we know that gravitons, if they have masses at all, have tiny masses.)

But our improved understanding of how quantum field theories at different energies belong together has done more than increase our acceptance of theories with problems. The effective field theory framework is the tool that binds together, at least theoretically, the different disciplines in physics and in the sciences. No longer are elementary particle physics and nuclear physics and atomic physics and molecular physics different, disconnected layers of reality. Even though we cannot (yet) derive most of the relations between the models used in these disciplines, we know that they are connected through the effective field theory framework. And at high energies many physicists believe it all goes back to just one “theory of everything”. Don’t expect a big headline announcing its appearance though. The ground moves slowly.

Friday, October 31, 2014

String theory – it’s a girl thing

My first international physics conference was in Turkey. It was memorable not only because smoking was still allowed on the plane. The conference was attended by many of the local students, and almost all of them were women.

I went out one evening with the Turkish students, a group of ten with only one man who sucked away on his waterpipe while one of the women read my future from tea leaves (she read that I was going to fly through the air in the near future). I asked the guy how come there are so few male students in this group. It’s because theoretical physics isn’t manly, it’s not considered a guy thing in Turkey, he said. Real men work outdoors or with heavy machinery, they drive, they swing tools, they hunt bears, they do men’s stuff. They don’t wipe blackboards or spend their day in the library.

I’m not sure how much of his explanation was sarcasm, but I find it odd indeed that theoretical physics is so man-dominated when it’s mostly scribbling on paper, trying to coordinate collaborations and meetings, and staring out of the window waiting for an insight. It seems mostly a historical accident that the majority of physicists today are male.

From the desk in my home office I have a view onto our downstairs neighbor’s garden. Every couple of weeks a man trims her trees and bushes. He has a key to the gate and normally comes when she is away. He uses the smoking break to tan his tattoos in her recliner and to scratch his chest hair. Then he pees on the roses. The most disturbing thing about his behavior though isn’t the peeing, it’s that he knows I’m watching. He has to cut the bushes from the outside too, facing the house, so he can see me scribbling away at my desk. He’ll stand there on his ladder and swing the chainsaw to greet me. He’s a real man, oh yeah.

After I finished high school, I went to the employment center which offered a skill- and interest-questionnaire, based on which one then was recommended a profession. I came out as landscape architect. It made sense – when asked, I said I would like to do something creative that allows me to spend time outdoors and that wouldn’t require many interpersonal skills. I also really like trees.

Then I went and studied math because what the questionnaire didn’t take into account is that I get bored incredibly quickly. I wanted a job that wouldn’t run out of novelty any time soon. Math and theoretical physics sounded just right. I never spent much time thinking about gender stereotypes, it just wasn’t something I regarded as relevant. Yes, I knew the numbers, but I honestly didn’t care. Every once in a while I would notice how oddly my voice stood out, look around and realize I was the only woman in the room, or one of a few. I still find it an unnatural and slightly creepy situation. But no, I never thought about gender stereotypes.

Now I’m a mother of two daughters and I realized the other day I’ve gone pink-blind. Before I had children, I’d look at little girls thinking I’d never dress my daughters all in pink. But, needless to say, most of the twins’ wardrobe today is pink because it’s either racing cars and soccer players on blue, or flowers and butterflies on pink. Unless you want to spend a ridiculous amount of money on designer clothes your kids will wear maybe once.

The internet is full of upset about girls’ toys that discourage an interest in engineering, unrealistic female body images, the objectification of women in ads and video games, and the lack of strong female characters in books and movies. The internet is full of sites encouraging women to accept their bodies, the bodies of mothers with the floppy bellies and the stretch marks, the bodies of real women with the big breasts and the small breasts and the freckles and the pimples – every inch of you is perfect from the bottom to the top. It’s full of Emma Watson and He for She. It’s full of high-pitched voices.

But it isn’t only women who are confronted with stereotypical gender roles and social pressure. Somebody I think must stand up and tell the boys it’s totally okay to become a string theorist, even though they don’t get to swing a chainsaw - let that somebody be me. Science is neither a boy thing nor a girl thing.

So this one is for the boys. Be what you want to be, rise like a phoenix, and witness me discovering the awesomeness of multiband compression. Happy Halloween :)

Monday, October 27, 2014

Einstein’s greatest legacy – How demons and angels advanced science

Einstein’s greatest legacy is not General Relativity, it’s not quantum entanglement, and it’s not slices of his brain either. It’s a word: Gedankenexperiment – German for “thought experiment”.

Einstein, like no other physicist before or after him, demonstrated how the power of human thought alone, used skillfully, can make up for the lack of real experiments. He showed we little humans have the power to deduce equations that govern the natural world by logical conclusion. Thought experiments are common in theoretical physics today. Physicists use them to examine the consequences of a theory beyond what is measurable with existing technology, but still within the realm of what is in principle measurable. A thought experiment pushes a theory to its limit and thereby can reveal inconsistencies or novel effects. The rules of the game are that a) only what is measurable is relevant and b) do not fool yourself. This isn’t as easy as it sounds.

The famous Einstein-Podolsky-Rosen experiment was such an exploration of the consequences of a theory, in this case quantum mechanics. In a seminal paper from 1935 the three physicists showed that the standard Copenhagen interpretation of quantum mechanics has a peculiar consequence: It allows for the existence of “entangled” particles.

Entangled particles have measurable properties, for example spin, that are correlated between two particles even though the value for each single particle is not determined as long as the particles have not been measured. You can know for example that if one particle has spin up the other one has spin down or vice versa, but not know which is which. The consequence is that if one of these particles is measured, the state of the other one changes – instantaneously. The moment you measure one particle having spin up, the other one must have spin down, even though it did, according to the Copenhagen interpretation, not previously have any specific spin value.

Einstein believed this ‘spooky’ action at a distance to be nonsense and decades of discussion followed. John Stewart Bell later quantified exactly how entangled particles are more strongly correlated than classical particles could ever be. According to Bell’s theorem, quantum entanglement can violate an inequality that bounds classical correlations.
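The size of the violation is easy to compute in the CHSH version of Bell’s inequality. The snippet below is a minimal sketch that assumes the singlet-state correlation E(a,b) = −cos(a−b) and the standard optimal measurement angles; the quantum value exceeds the classical bound of 2:

```python
import math

# CHSH combination of correlations: classically |S| <= 2, but the
# quantum singlet-state correlation E(a, b) = -cos(a - b) reaches
# |S| = 2*sqrt(2) at the angles below (Tsirelson's bound).
def E(a, b):
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2          # Alice's two measurement angles
b, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement angles

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2.828... = 2*sqrt(2) > 2
```

No classical assignment of pre-existing spin values can push |S| above 2, which is exactly what experiments testing Bell’s theorem check.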

When I was a student, tests of Bell’s theorem were still thought experiments. Today they are real experiments, and we know beyond doubt that quantum entanglement exists. It is at the basis of quantum information, quantum computation, and chances are all technologies of the coming generations will build upon Einstein, Podolsky and Rosen’s thought experiment.

Another famous thought experiment is Einstein’s elevator being pulled up by an angel. Einstein argued that inside the elevator one cannot tell, by any possible measurement, whether the elevator is at rest in a gravitational field or is being pulled up with constant acceleration. This principle of equivalence means that locally (in the elevator) the effects of gravitation are the same as those of acceleration in the absence of gravity. Converted into mathematical equations, it becomes the basis of General Relativity.

Einstein also liked to imagine chasing after photons and he seems to have spent a lot of time thinking about trains and mirrors and so on, but let us look at some other physicists’ thoughts.

Before Einstein and the advent of quantum mechanics, Laplace imagined an omniscient being able to measure the positions and velocities of all particles in the universe. He concluded, correctly, that based on Newtonian mechanics this being, named “Laplace’s demon”, would be able to predict the future perfectly for all times. Laplace did not know back then of Heisenberg’s uncertainty principle and neither did he know of chaos, both of which spoil predictability. However, his thoughts on determinism were hugely influential and led to the idea of a clockwork universe, and to our understanding of science as a prediction tool in general.
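How chaos spoils the demon’s predictions can be seen in a toy example – not Laplace’s own setting, but the standard logistic map, which I use here purely for illustration. Two initial states that differ by one part in ten billion become macroscopically different after a few dozen steps:

```python
# Sensitive dependence on initial conditions in the chaotic logistic
# map x -> 4x(1-x): the tiny initial difference roughly doubles each
# step, so perfect prediction requires impossibly perfect measurement.
x, y = 0.2, 0.2 + 1e-10   # two states the demon cannot tell apart
max_gap = 0.0
for step in range(60):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if step >= 40:        # by now the error has grown to saturation
        max_gap = max(max_gap, abs(x - y))
print(max_gap)            # of order one, despite the 1e-10 start
```

Since both trajectories stay in the interval [0, 1], an order-one gap means the demon’s forecast is no better than a guess.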

Laplace’s is not the only famous demon in physics. Maxwell also imagined a demon, one that was able to sort the particles of a gas into compartments depending on the particles’ velocities. The task of Maxwell’s demon was to open and close a door connecting two boxes that contain gas which initially has the same temperature on both sides. Every time a fast particle approaches from the right, the demon lets it through to the left. Every time a slow particle arrives from the right, the demon keeps the door closed so the particle stays on the right. This way, the average energy of the particles and thus the temperature in the left box increases, and the entropy of the whole system decreases. Maxwell’s demon thus seemed to violate the second law of thermodynamics!
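The demon’s sorting is easy to mimic in a toy Monte Carlo simulation. This is only a sketch: the speed distribution and the threshold between “fast” and “slow” are my own arbitrary choices, and for good measure this demon also lets slow particles pass from left to right, which speeds up the sorting:

```python
import random

random.seed(1)

# Two boxes of "particles", speeds drawn from the same distribution,
# so both boxes start at the same temperature.
left = [random.expovariate(1.0) for _ in range(5000)]
right = [random.expovariate(1.0) for _ in range(5000)]
threshold = 1.0   # the demon's cut between "fast" and "slow" (toy choice)

for _ in range(20000):
    # a particle approaches the door from a random side
    if random.random() < 0.5 and right:
        i = random.randrange(len(right))
        if right[i] > threshold:          # fast, from the right: open
            left.append(right.pop(i))
    elif left:
        i = random.randrange(len(left))
        if left[i] <= threshold:          # slow, from the left: open
            right.append(left.pop(i))

def mean_ke(box):
    # kinetic energy goes with v^2, so this tracks the temperature
    return sum(v * v for v in box) / len(box)

print(mean_ke(left), mean_ke(right))      # left box ends up hotter
```

After enough door openings the left box is noticeably hotter than the right one, so a temperature difference has appeared out of equilibrium – the apparent violation of the second law that the demon’s bookkeeping of information has to pay for.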

Maxwell’s demon gave headaches to physicists for many decades until it was finally understood that the demon itself must increase its entropy or use energy while it measures, stores, and eventually erases information. It was only a few years ago that Maxwell’s demon was in fact realized in the laboratory.

A thought experiment that still gives headaches to theoretical physicists today is the black hole information loss paradox. If you combine general relativity and quantum field theory, each of which is an extremely well established theory, then you find that black holes evaporate. You also find however that this process is not reversible; it destroys information for good. This however cannot happen in quantum field theory and thus we face a logical inconsistency when combining the two theories. This cannot be how nature works, so we must be making a mistake. But which? There are many proposed solutions to the black hole information loss problem. Most of my colleagues believe that we need a quantum theory of gravity to resolve this problem and that the inconsistency comes about by using general relativity in a regime where it should no longer be used. The thought experiments designed to resolve the problem typically use an imagined pair of observers, Bob and Alice, one of whom has the misfortune of having to jump into the black hole while the other one remains outside.

One of the presently most popular solution attempts is black hole complementarity. Proposed in 1993 by Susskind and Thorlacius, black hole complementarity rests on the Gedankenexperiment’s main rules: What matters is only what can be measured, and you should not fool yourself. One can avoid information loss in black holes by copying information and letting one copy fall into the black hole while the other goes out. One copy remains with Bob, one goes with Alice. Copying quantum information however is itself inconsistent with quantum theory. Susskind and Thorlacius pointed out that these disagreements could not be measured by either Bob or Alice, and thus no inconsistency could ever arise.

Black hole complementarity was proposed before the AdS/CFT duality was conjectured, and its popularity sparked when it was found that the non-locally doubled presence of information seemed to fit nicely with the duality that arose in string theory.

As of recently though, it has become clear that this solution has its own problems because it seems to violate the equivalence principle. The observer who crosses the horizon should not be able to notice anything unusual there. It should be like sitting in that elevator being pulled by an angel. Alas, black hole complementarity seems to imply the presence of a “firewall” that would roast the unsuspecting observer in his elevator. Is this for real or are we making a mistake again? Since the solution to this problem holds the promise of understanding the quantum nature of space and time, much effort has focused on solving it.

Yes, Einstein’s legacy of thought experiments weighs heavily on theoretical physicists today – maybe too heavily, for sometimes we forget that Einstein’s thoughts were based on real experiments. He had the Michelson-Morley experiment that disproved the aether, he had the perihelion precession of Mercury, he had the measurements of Planck’s radiation law. Thought alone only gets one so far. In the end, it is still data that decides whether a thought can become reality or remain fantasy.

[Cartoon: Abstruse Goose, Missed Calling]



This post first appeared on "Starts with a Bang".

Tuesday, October 21, 2014

We talk too much.

Image Source: Loom Love.

If I had one word to explain human culture at the dawn of the 21st century it would be “viral”. Everybody, it seems, is either afraid of or trying to make something go viral. And as a mother of two toddlers in Kindergarten, I am of course well qualified to comment on the issue of spreading diseases, like pinkeye, lice, goat memes, black hole firewalls, and other social infections.

Today’s disease is called rainbow loom. It spreads via wrist bands that you are supposed to crochet together from rubber rings. Our daughters are too young to crochet, but that doesn’t prevent them from dragging around piles of tiny rubber bands which they put on their fingers, toes, clothes, toys, bed posts, door knobs and pretty much everything else. I spend a significant amount of my waking hours picking up these rubber bands. The other day I found some in the cereal box. Sooner or later, we’ll accidentally eat one.

But most of the infections the kids bring home are words and ideas. As of recently, they call me “little fart” or “old witch” and, leaving aside the possibility that this is my husband’s vocabulary when I am away, they probably trade these expressions at Kindergarten. I’ll give you two witches for one fart, deal? Lara, amusingly enough, sometimes confuses the words “ass” and “men” – “Arch” and “Mench” in German with her toddler’s lisp. You’re not supposed to laugh, you’re supposed to correct them. It’s “Arsch,” Lara, “SCH, not CH, Arsch.”

Man, as Aristotle put it, is a zoon politikon, she lives in communities, she is social, she shares, she spreads ideas and viruses. He does too. I pass through Frankfurt international airport on average once per week. Research shows that the more often you are exposed to a topic the more important you think it is, regardless of what the source is. It’s the repeated exposure that does it. Once you have a word in your head marked as relevant, your brain keeps pushing it around and hands it back to you to look for further information. Have I said Ebola yet?

Yes, words and ideas, news and memes, go viral, spread, mutate and affect the way we think. And the more connected we are, the more we share, the more we become alike. We see the same things and talk about the same things. Because if you don’t talk about what everybody else talks about would you even listen to yourself?

Not so surprisingly then, it has become fashionable to declare the end of individualism also in science, pointing towards larger and larger collaborations, and increasing co-author networks, the need to share, and the success of sharing. According to this NYT headline, the “ERA OF BIG SCIENCE DIMINISHES ROLE OF LONELY GENIUS”. We can read there
“Born out of the complexity of modern technology, the era of the vast, big-budget research team came into its own with its scientific achievements of 1984.”
Yes, that’s right, this headline dates back 30 years.

The lonely genius of course has always been a myth. Science is and has always been a community enterprise. We’re standing on the shoulders of giants. Most of them are dead, ok, but we’re still standing, standing on these dead people’s shoulders and we’re still talking and talking and talking. We’re all talking way too much. It’s hard not to have this impression after attending 5 conferences more or less in a row.

Collaboration is very en vogue today, or “trending” as we now say. Nature recently had an article about the measurement of the gravitational constant, G. Not a topic I care deeply about, but the article has an interesting quote:
“Until now, scientists measuring G have competed; everyone necessarily believes in their own value,” says Stephan Schlamminger, an experimental physicist at NIST. “A lot of these people have pretty big egos, so it may be difficult,” he says. “I think when people agree which experiment to do, everyone wants their idea put forward. But in the end it will be a compromise, and we are all adults so we can probably agree.”
Working together could even be a stress reliever, says Jens Gundlach, an experimental physicist at the University of Washington in Seattle. Getting a result that differs from the literature is very uncomfortable, he says. “You think day and night, ‘Did I do everything right?’”
And here I was thinking that worrying day and night about whether you did everything right is the essence of science. But apparently that’s too much stress. It’s clearly better we all work together to make this stressful thinking somebody else’s problem. Can you have a look at my notes and find that missing sign?

The Chinese, as you have almost certainly read, are about to overtake the world, and in that effort they now reform their science research system. Nature magazine informs us that the idea of this reform is “to encourage scientists to collaborate on fewer, large problems, rather than to churn out marginal advances in disparate projects that can be used to seek multiple grants. “Teamwork is the key word,” says Mu-Ming Poo, director of the CAS Institute of Neuroscience in Shanghai.” Essentially, it seems, they’re giving out salary increases for scientists to think the same as their colleagues.

I’m a miserable cook. My mode of operation is taking whatever is in the fridge, throwing it into a pan with loads of butter, making sure it’s really dead, and then pouring salt over it. (So you don’t notice the rubber bands.) Yes, I’m a miserable cook. But I know one thing about cooking: if you cook it for too long or stir too much, all you get is mush. It’s the same with ideas. We’re better off with various individual approaches than one collaborative one. Too much systemic risk in putting all your eggs in the same journal.

The kids, they also bring home sand-bathed gummy bears that I am supposed to wash, their friend’s socks, and stacks of millimeter paper glued together because GLUE! Apparently some store donated cubic meters of this paper to the Kindergarten because nobody buys it anymore. I recall having to draw my error bars on this paper, always trying not to use an eraser because the grid would rub away with the pencil. Those were the days.

We speak about ideas going viral, but we never speak about what happens after this. We get immune. The first time I heard about the Stückelberg mechanism I thought it was the greatest thing ever. Now it’s on the daily increasing list of oh-yeah-this-thing. I’ve always liked the myth of the lonely genius. I have a new office mate. She is very quiet.

Wednesday, October 15, 2014

Siri's Song [music video]

After the iOS 8 update you can now use your iPhone entirely hands-free if the phone is plugged in and you speak the magic words "Hey Siri." I know this because last weekend my phone was on the charger next to my microphone as I was working on one of my pathetic vocal recordings, when suddenly Siri offered the following wisdom:
    "Our love is like two long shadows kissing without hope of reality."

I cursed, stopped the recording, and hit playback. And there was Siri's love confession over my carefully crafted drum-bass loop. It was painfully obvious that whoever processed these vocals knew, in contrast to me, what he or she was doing. They're professionally filtered, compressed and flawlessly de-essed. In short, they sound awesome, even after re-recording.

I then had a little conversation with my phone, inquiring what this shadow business was all about. Siri stubbornly refused to repeat her lyrical deepity, but had some other weird insights to offer.

Enjoy :)


PS: No, my lyrics do of course not contain the words "Hey Siri". I'm not sure what caught her attention, but I recommend you don't sing to your phone.

Monday, October 13, 2014

Does Loop Quantum Cosmology make the black hole information loss problem worse rather than better?

Image Source: Flickr.

Martin Bojowald is one of the originators of Loop Quantum Cosmology (LQC), a model for the universe that makes use of the quantization techniques of Loop Quantum Gravity (LQG). This description of cosmology takes into account effects of quantum gravity and has become very popular during the last decade because it allows making contact with observation.

The best-known finding in LQC is that the Big Bang singularity, which one has in classical general relativity, is replaced by a bounce that takes place when the curvature becomes strong (reaches the Planckian regime). This in turn has consequences, for example, for the spectrum of primordial gravitational waves (which we still hope will at some point emerge out of the foreground dust).

Now rumors reached me from various sources that Martin lost faith that Loop Quantum Cosmology is a viable description of our universe, and indeed he recently put a paper out on the arxiv detailing the problem that he sees.
Information loss, made worse by quantum gravity
Martin Bojowald
arXiv:1409.3157
Loop Quantum Cosmology, to be clear, was never claimed to be strictly speaking derived from Loop Quantum Gravity, though I have frequently noticed that the similarity of the names leads to confusion in the popular science literature. LQC deals with a symmetry-reduced version of LQG, but this symmetry reduction is done before the quantization. In practice this means that in LQC one first simplifies the universe by assuming it is homogeneous and isotropic, and then quantizes the remaining degrees of freedom. Whether or not this treatment leads to the same result that one would get by taking the fully quantized theory and looking for a solution that reproduces the right symmetries is controversial, and to my knowledge this question has never been satisfactorily settled.

Be that as it may, from my perspective and from that of most people working on the topic, LQC is a phenomenological model that is potentially testable and thus interesting in its own right, regardless of its connection to LQG.

It has become apparent in recent years, however, that if one takes into account perturbations around the homogeneous and isotropic background in LQC, then one finds something peculiar: the space-time around the bounce loses its time-coordinate, it becomes Euclidean and is thus just space without time. We discussed this earlier here.

Now the time-coordinate in the space-time that we normally deal with plays a very important role, which is that it allows us to set an initial condition at one moment in time, and then use the equations of motion to predict what will happen at later times. This so-called “forward evolution” is such a typical procedure for differential equations in physics that we often do not think about it very much. So let me emphasize the relevant point: to determine what happens at some point in space-time, one does not have to set conditions on a space-time boundary surrounding that point, which would necessitate knowing what happens at some moments in the future; it is sufficient to know what happened at some moment in the past.

This important property that allows us to set initial conditions in the past to predict the future is not something you get for free in any space-time background. Space-times that obey this property are called “globally hyperbolic”. (Anti-de Sitter space is probably the best-known example of a space-time that is not globally hyperbolic, hence the relevance of the boundary in this case.)
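The underlying mathematics is the textbook distinction between hyperbolic and elliptic equations, and a toy sketch (my own illustration, nothing LQC-specific) makes it concrete: a hyperbolic-type problem can be marched forward from initial data alone, while an elliptic ("Euclidean") one, like the Laplace equation, is only determined once you prescribe data on the entire boundary of the region, including the "future" end:

```python
import math

# Toy illustration of why "Euclidean" (elliptic) regions break forward
# evolution. A hyperbolic-type problem (here just an oscillator ODE)
# is solved by marching forward from initial data alone; the elliptic
# analogue (1D Laplace equation) needs data at BOTH ends of the
# interval -- i.e. on the full boundary -- before the interior is fixed.

def evolve_forward(u0, v0, dt=1e-3, steps=1000):
    """March u'' = -u forward from initial data u(0)=u0, u'(0)=v0 alone."""
    u, v = u0, v0
    for _ in range(steps):
        u, v = u + dt * v, v - dt * u  # explicit Euler step
    return u  # no knowledge of the future was needed

def solve_elliptic(left, right, n=51, sweeps=5000):
    """Solve u'' = 0 on a grid: requires boundary data at BOTH ends."""
    u = [0.0] * n
    u[0], u[-1] = left, right  # drop 'right' and the problem is underdetermined
    for _ in range(sweeps):
        for i in range(1, n - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1])  # Gauss-Seidel relaxation
    return u  # converges to linear interpolation between the boundary values

print(evolve_forward(1.0, 0.0))      # close to cos(1)
print(solve_elliptic(0.0, 1.0)[25])  # midpoint, 0.5 for this boundary data
```

Dropping the right-hand boundary value in the elliptic case leaves the interior completely undetermined -- that, in essence, is what a Euclidean region does to forward evolution.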

In his new paper Martin now points out that if space-time has regions that are Euclidean then the initial value problem becomes problematic. It is then in fact no longer possible to predict the future from a past initial condition. For the case of the Big Bang singularity being replaced by a Euclidean regime, this does not matter so much because we would just set initial conditions after this regime has passed and move on from there. But not so with black holes.

The singularity inside black holes is then in LQC also replaced by a Euclidean regime. This regime only forms in the late stages of collapse and will eventually vanish after the black hole has evaporated. But the intermediate Euclidean region has the consequence that the outcome of the evaporation process depends on the boundary conditions surrounding the Euclidean region. With the intermediate Euclidean region, one can no longer predict from the initial conditions of the matter that formed the black hole what the outcome of black hole evaporation will be.

In his paper Martin writes that this makes the black hole information loss considerably worse. The normal black hole information loss problem is that the process of black hole evaporation seems to be irreversible and thus in particular not unitary. The final state of the evaporation is always thermal radiation, regardless of what formed the black hole. Now with the Euclidean region the final state of the black hole evaporation depends on some boundary condition that is not even in principle predictable. We have thus gone from not unitary to not deterministic!

Martin likens this case to that of a naked singularity, a singular region that (in contrast to the normal black hole singularity which is hidden by the horizon) is in full causal contact with space-time. A singularity is where everything ends, but it is also where anything can start. The initial value problem in a space-time with a naked singularity is similarly ill-defined as that in a space-time region with a Euclidean core, Martin argues.

I find this property of black holes in LQC not as worrisome as Martin does. The comparison to a naked singularity is not a good one because the defining property of a singularity is that one cannot continue through it. One can however continue through the Euclidean region, it’s just that one needs additional constraints to know how. In fact I can see that what Martin thinks is a bug might be a feature for somebody else, for, after all, time-evolution in quantum mechanics indeed seems to be non-deterministic.

But even leaving aside this admittedly far-fetched relation, the situation that additional information is necessary on some boundary to the future is not unlike that of the mysterious “stretched horizon” in black hole complementarity. Said stretched horizon somehow stores and later releases the information of what fell through it. If the LQC black hole is supposed to solve the black hole information problem, then the same must be happening on the boundary of the Euclidean region. And, yes, that is a teleological constraint. I do not see what theory could possibly lead to it, but I don’t see that it is impossible either.

In summary, I find this development more interesting than troublesome. In contrast to non-unitarity, having a Euclidean core is uncomfortable and certainly unintuitive, but not necessarily inconsistent. I am very curious to see what the community will make of this -- and I am sure we will hear more about it in the near future.

Wednesday, October 08, 2014

I can't forget [music video]

Update about my songwriting efforts:



I had some help with the audio mix, but the friendly savior of my high frequency mush prefers to go unnamed. Thanks to him though, you can now put the thing on your stereo and it will sound reasonably normal. I like to think that I have made some progress with the vocal recording and processing. I am not happy with the percussion in that piece, have to work on that. If you go through my last few videos you can basically hear which tutorial I read at which point. So far I believe I am making progress, but you be my judge!

As to the video, I spent some money on an inexpensive video camera, and it has made my video recording dramatically easier because it has an auto zoom. As a result, the new video is more dynamic than the previous ones, and it looks considerably better to me. It would have been even better had I not been wearing a blue shirt on the blue-screen day; some neurons failed me there.

I still haven't found a good way to deal with the problem that the video tends to go out of synch with the audio after exporting it. In fact I noticed that the severity of the problem depends on the player with which you watch the result which I find particularly odd. And after uploading the thing to youtube the audio again shifts oh-so-slightly. In the end, no matter what I do, it never quite fits.

And since I was asked a few times, yes I do have a soundcloud account, under the name "Funny Mommy". You can find all the tracks there. It's just that I am not in the mood to play the social network game on yet another platform, so I have a total of three followers or so, all of which are probably spam-bots. That's why I use YouTube. I am totally open to suggestions for other artist names :) And yeah, I am also on Ello, as @hossi, not that it seems to be good for anything.

Friday, October 03, 2014

Is the next supercollider a good investment?

The relevance of basic research is difficult to communicate to politicians who only care about their next term and who don’t want to invest in what might take decades to pay off. But it is even more difficult to decide which research is best to invest in, and how much it is worth, in numbers.

Whether a next supercollider is worth the billions of Euro that it will eat up is a very involved question. I find it partly annoying, partly disturbing, that many of my physics colleagues regard the answer as obvious. Clearly we need a new supercollider! To measure the details of this, and the decay channels of that, to get a cleaner signal of something and a better precision for whatever. And I am sure they will come up with an argument for why Susy, our invisible friend, is still just around the corner.

To me this superficial argumentation is just another way of demonstrating they don’t care about communicating the relevance of their research. Of course they want a next collider - they make their living writing papers about that.

The most common argument that I hear in favor of the next collider is that much more money is wasted on the war in Afghanistan (if you ask an American) or rebuilding the Greek economy (if you ask a German), and I am sure similar remarks are uttered worldwide. The logic here seems to be that a lot of money is wasted anyway, so what does it matter to spend some billions on a collider. Maybe this sounds convincing if you have a PhD in high energy physics, but I don’t know who else is supposed to buy this.

The next argument I keep hearing is that the worldwide web was invented at CERN, which also hosts the LHC right now. If anything, this argument is even more stupid than the war-also-wastes-money argument. Yes, Tim Berners-Lee happened to work at CERN when he developed hypertext. The environment was certainly conducive to his invention, but the standard model of particle physics had otherwise very little to do with it. You could equally well argue we should build leaning towers to advance research on general relativity.

I just finished reading John Moffat’s book “Cracking the Particle Code of the Universe”. I can’t post the review here until it has appeared in print due to copyright issues, sorry, but by and large it’s a good book. No, he doesn’t use it to advertise his own theories. He mentions them of course, but most of the book is more generally dedicated to the history, achievements, and shortcomings of the standard model.

His argument for the relevance of particle colliders amounts to the following paragraph:
“As Guido Altarelli mused after my talk at CERN in 2008, can governments be persuaded to spend ever greater sums of money, amounting to many billions of dollars, on ever larger and higher energy accelerators than the LHC if they suspect that the new machines will also come up with nothing new beyond the Higgs boson? Of course, to put this in perspective, one should realize that the $9 billion spent on an accelerator would not run a contemporary war such as the Afghanistan war for more than five weeks. Rather than killing people, building and operating these large machines has practical and beneficial spinoffs for technology and for training scientists. Thus, even if the accelerators continued to find no new particles, they might still produce significant benefits for society. The Worldwide Web, after all, was invented at CERN.”

~ John Moffat, Cracking the Particle Code of the Universe, p. 78
Well, running a war also has practical and beneficial spinoffs for technology and training scientists. Sorry John, but that was disappointing. To be fair, the whole book itself makes a pretty good case for why understanding the laws of nature is important business. But what war doesn’t do for your country and what investing in basic research does is building a base for sustainable progress. Without new discoveries and fundamentally new insights, applied science must eventually run dry.

There is no doubt in my mind that society invests its billions well if it invests in theoretical physics. Whether that investment should go into particle colliders though is a different question. I don’t have a good answer to that, and I don’t see that the question is seriously being discussed. Is it a worthy cause?

Last year, Fermilab’s Symmetry Magazine ran a video contest on the topic “Why particle physics matters”. Ironically most of the answers have nothing to do with particle physics in particular: “could bring about a revolution,” “a wonderful model of successful international collaboration,” “explore the frontiers and boundaries of our universe,” “engages and sharpens the mind”, “captures the imagination of bright minds”. You could use literally the same arguments for cosmology, quantum information or high precision measurements. Indeed, I personally find the high precision frontier presently more promising than ramping up energy and luminosity.

I am happy of course if China goes ahead and builds the next supercollider. After all it’s not my taxes, and still better than spending money on diamond necklaces that your 16 year old can show off on facebook. I can’t quite shake the impression though that this plan is more the result of wanting to appear competitive than of a careful deliberation about return on investment.

Friday, September 26, 2014

Black holes declared non-existent again.

That's me. 

The news of the day is that Laura Mersini-Houghton has allegedly shown that black holes don’t exist. The headlines refer to these two papers: arXiv:1406.1525 and arXiv:1409.1837.

The first is an analytical estimate, the second a numerical study of the same idea. Before I tell you what these papers are about, a disclaimer: I know Laura; we have met at various conferences, and I’ve found her to be very pleasant company. I read her new paper some while ago and was hoping I wouldn’t have to comment on this, but my inbox is full of people asking me what this is all about. So what can I do?

In their papers, Laura Mersini-Houghton and her collaborator Harald Pfeiffer have taken into account the backreaction from the emitted Hawking radiation on the collapsing mass which is normally neglected. They claim to have shown that the mass loss is so large that black holes never form to begin with.

To make sense of this, note that black hole radiation is produced by the dynamics of the background and not by the presence of a horizon. The horizon is the reason the final state is missing information, but the particle creation itself does not necessitate a horizon. The radiation starts before horizon formation, which means that the mass that is left to form the black hole is actually less than the mass that initially collapsed.

Physicists have studied this problem back and forth for decades, and the majority view is that this mass loss from the radiation does not prevent horizon formation. This shouldn’t be much of a surprise because the temperature of the radiation is tiny, and it’s even tinier before horizon formation. You can look, e.g., at this paper 0906.1768 and references [3-16] therein to get an impression of this discussion. Note though that this paper also mentions that it has been claimed every now and then that the backreaction prevents horizon formation, so it’s not like everyone agrees. Then again, this could be said about pretty much every topic.
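To get a feeling for "tiny", here is a back-of-envelope number (my own, not from any of the papers): the late-time Hawking temperature for a solar-mass black hole.

```python
import math

# Back-of-envelope check (my numbers, not from the papers discussed here):
# the late-time Hawking temperature T = hbar*c^3 / (8*pi*G*M*k_B)
# for a solar-mass black hole, to show just how tiny it is.
hbar = 1.054571817e-34   # J*s
c    = 2.99792458e8      # m/s
G    = 6.67430e-11       # m^3 kg^-1 s^-2
k_B  = 1.380649e-23      # J/K

def hawking_temperature(M):
    """Late-time Hawking temperature (Kelvin) of a black hole of mass M (kg)."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

M_sun = 1.989e30  # kg
print(f"{hawking_temperature(M_sun):.2e} K")  # about 6e-8 K, far below the 2.7 K CMB
```

Some sixty nanokelvin for a stellar-mass hole, and during collapse the radiation is colder still.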

Now what one does to estimate the backreaction is to first come up with a time-dependent emission rate. This is already problematic because the normal Hawking radiation is only the late-time radiation and time-independent. What is clear however is that the temperature before horizon formation is considerably smaller than the Hawking temperature, and it drops very quickly the farther the mass is from horizon formation. Incidentally, this drop was the topic of my master’s thesis. Since it’s not thermal equilibrium one actually shouldn’t speak of a temperature. In fact the energy spectrum isn’t quite thermal, but since we’re only concerned with the overall energy, the spectral distribution doesn’t matter here.

Next problem is that you will have to model some collapsing matter and take into account the backreaction during collapse. Quite often people use a collapsing shell for this (as I did in my master’s thesis). Shells however are pathological because if they are infinitely thin they must have an infinite energy-density and are by themselves already quantum gravitational objects. If the shell isn’t infinitely thin, then the width isn’t constant during collapse. So either way, it’s a mess and you best do it numerically.

What you do next is take that approximate temperature, which now depends on some proper time in which the collapse proceeds. This temperature gives, via the Stefan-Boltzmann law, a rate for the mass loss with time. You integrate the mass loss over time and subtract the integral from the initial mass. Or at least that’s what I would have done. It is not what Mersini-Houghton and Pfeiffer have done though. What they seem to have done is the following.
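As an aside, the bookkeeping I just described -- an approximate temperature, a Stefan-Boltzmann luminosity, a mass loss integrated over time -- can be sketched in a few lines. Everything here is a toy of my own making: I plug in the late-time Hawking temperature and horizon area, which during collapse overestimates the temperature and hence the loss.

```python
import math

# Toy sketch of the bookkeeping described above (my illustration, not the
# papers' calculation): mass-loss rate dM/dt = sigma * A * T^4 / c^2 with
# the late-time Hawking temperature T(M) and horizon area A(M). During
# collapse the actual temperature is much lower, so this OVERestimates
# the integrated mass loss.
hbar  = 1.054571817e-34   # J*s
c     = 2.99792458e8      # m/s
G     = 6.67430e-11       # m^3 kg^-1 s^-2
k_B   = 1.380649e-23      # J/K
sigma = 5.670374419e-8    # Stefan-Boltzmann constant, W m^-2 K^-4

def mass_loss_rate(M):
    """dM/dt in kg/s for a black hole of mass M (kg), Stefan-Boltzmann toy."""
    T   = hbar * c**3 / (8 * math.pi * G * M * k_B)  # Hawking temperature
    r_s = 2 * G * M / c**2                           # Schwarzschild radius
    A   = 4 * math.pi * r_s**2                       # horizon area
    return sigma * A * T**4 / c**2

M = 1.989e30        # one solar mass, kg
t_hubble = 4.35e17  # ~age of the universe, s
# The rate is effectively constant on this timescale, so a simple product
# stands in for the proper time integration:
print(f"rate: {mass_loss_rate(M):.1e} kg/s, "
      f"fractional loss over a Hubble time: {mass_loss_rate(M) * t_hubble / M:.1e}")
```

The fractional mass loss comes out somewhere around 1e-58, utterly negligible, which is the intuition behind the majority view that the pre-horizon radiation cannot prevent horizon formation. With that toy picture in mind, back to what the papers actually do.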

Hawking radiation has a negative energy-component. Normally negative energies are actually anti-particles with positive energies, but not so in the black hole evaporation. The negative energy particles though only exist inside the horizon. Now in Laura’s paper, the negative energy particles exist inside the collapsing matter, but outside the horizon. Next, she doesn’t integrate the mass loss over time and subtracts this from the initial mass, but integrates the negative energies over the inside of the mass and subtracts this integral from the initial mass. At least that is my reading of Equation IV.10 in 1406.1525, and equation 11e in 1409.1837 respectively. Note that there is no time-integration in these expressions which puzzles me.

The main problem I have with this calculation is that the temperature that enters the mass-loss rate, for all I can see, is that of a black hole and not that of some matter which might be far from horizon crossing. In fact it looks to me like the total mass that is lost increases with increasing radius, which I think it shouldn’t: the more dispersed the mass, the smaller the gravitational tidal force, and the smaller the effect of particle production in curved backgrounds should be. So much for the analytical estimate. In the numerical study I am not sure what is being done, because I can’t find the relevant equation, which is the dependence of the luminosity on the mass and radius.

In summary, the recent papers by Mersini-Houghton and Pfeiffer contribute to a discussion that is decades old, and it is good to see the topic being taken up with the numerical power of today. I am skeptical that their treatment of the negative energy flux is consistent with the expected emission rate during collapse. Their results are surprising and in contradiction with many previously found results. It is thus too early to claim that it has been shown black holes don’t exist.