|I'm the half-face 2nd from the right.|
It was an “invitation only” conference, but it wasn’t difficult to get on the list because, for reasons mysterious to me, the conference system listed me as a “chair” of the meeting, whatever that might mean. I assure you I had nothing to do with either the chairs or the organization, other than pointing out that some organization might be beneficial. So please don’t blame me that there was no open registration.
So much for academia. Now let me say something about the science.
Two talks from the conference were particularly memorable because they stood in stark contrast to each other. One was by Lisa Randall, the other by Glenn Starkman. Both spoke about dark matter, but that’s about where the similarities end.
Randall talked about her recent work on “Partially Interacting Dark Matter” (slides here). Her research, in collaboration with JiJi Fan, Andrey Katz, and Matthew Reece, is based on a slightly more complicated version of existing particle dark matter models. They consider some so-far undetected particles which are not in the Standard Model. But in contrast to most of the presently used models, these additional particles are not, as commonly assumed, unable to interact with each other. Instead, the particles exert forces among themselves, which has several consequences.
First, it allows them to explain why dark matter hasn’t been seen in direct detection experiments: the stuff just isn’t as simple as assumed in the estimates of detection rates. Second, it means that dark matter has friction and thus at least partly forms rotating disks during galaxy formation, the “dark disks.” If they are right, our galaxy too should have a dark disk, and our solar system would traverse it periodically, every 35 million years or so. If you trust Nature News, which maybe you shouldn’t, then this can explain the extinction of the dinosaurs. Right or wrong, it’s a story catchy enough to spread like measles, and equally spotty. Third, and most important, the partially interacting dark matter introduces additional parameters which you can use to fit unexplained data like the 130 GeV Fermi gamma ray line.
In other words, I wasn’t very convinced that partially interacting dark matter is anything more than something you can publish papers about.
Now to Starkman (slides here).
He set out by alluding to Odysseus’ odyssey, which led to strange and distant countries that Starkman likened to the proposed dark matter particles like WIMPs and axions. In the end though, Starkman pointed out, Odysseus returned to where he started from. Maybe, he suggested, things have gotten strange enough and it is time to return to where we started from: the Standard Model.
His talk was about “macro dark matter,” a dark matter candidate that has received little if any attention. I had only become aware of it shortly before, through a paper by Starkman together with David Jacobs and Amanda Weltman. Unlike the commonly considered particle dark matter, macro dark matter isn’t composed of single particles, but of macroscopically heavy chunks of matter with masses that are a priori anywhere between a gram and the mass of the Sun.
It is often said that observations indicate dark matter must be made of weakly interacting particles, but that is only true if the matter is thinly dispersed into light, individual particles. What we really know isn’t that the particles are weakly interacting but that they are rarely interacting; you never measure a cross-section without a flux. Dark matter could be rarely interacting because it is weakly interacting. That’s the standard assumption. Or it could be rarely interacting because it is clumped together into tiny, dense blobs that are unlikely to meet each other. That’s macro dark matter.
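The point that one never measures a cross-section without a flux can be made concrete with a toy estimate. What an experiment sees is the product of number density, cross-section, and speed; the densities, masses, and cross-sections below are my own assumed ballpark values, not numbers from the talk:

```python
import math

# Two ways to be "rarely interacting": a weakly interacting particle
# (tiny cross-section, large number density) versus a macro (large
# geometric cross-section, tiny number density). The measured rate is
# always n * sigma * v, never sigma alone.

RHO_DM = 7e-25   # assumed local dark matter density, g/cm^3 (~0.4 GeV/cm^3)
V = 2.5e7        # assumed typical galactic speed, cm/s (~250 km/s)

def geometric_sigma(mass_g, density_g_cm3):
    """Geometric cross-section of a sphere of given mass and density."""
    r = (3.0 * mass_g / (4.0 * math.pi * density_g_cm3)) ** (1.0 / 3.0)
    return math.pi * r * r

def rate_per_target(mass_g, sigma_cm2):
    """Interactions per second seen by one target: n * sigma * v."""
    n = RHO_DM / mass_g   # number density of dark matter objects, 1/cm^3
    return n * sigma_cm2 * V

# WIMP-like particle: ~100 GeV (~1.8e-22 g), weak-scale cross-section
wimp_rate = rate_per_target(1.8e-22, 1e-40)

# Macro: one gram of matter at nuclear density (~2.3e14 g/cm^3)
macro_sigma = geometric_sigma(1.0, 2.3e14)
macro_rate = rate_per_target(1.0, macro_sigma)

print(f"macro cross-section: {macro_sigma:.1e} cm^2")
print(f"WIMP rate per target:  {wimp_rate:.1e} /s")
print(f"macro rate per target: {macro_rate:.1e} /s")
```

Despite the macro’s cross-section being some thirty orders of magnitude above the weak scale, both per-target rates come out vanishingly small, because the macros are correspondingly rare.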
But what is macro dark matter made of? It might for example be a type of nuclear matter that hasn’t been discovered so far, blobs of quarks and gluons that were created in the early universe and lingered around ever since. These blobs would be incredibly dense; at this density the Great Pyramid of Giza would fit inside a raindrop!
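The raindrop claim survives a quick sanity check, using rough, commonly quoted figures for the pyramid’s mass and for nuclear density (both are my assumptions for the arithmetic, not numbers from the paper):

```python
import math

# Compress the Great Pyramid to nuclear density and see how big
# the resulting blob would be.

PYRAMID_MASS_G = 6e12        # ~6 million tonnes, a commonly quoted estimate
NUCLEAR_DENSITY = 2.3e14     # g/cm^3, typical nuclear matter density

volume_cm3 = PYRAMID_MASS_G / NUCLEAR_DENSITY
radius_mm = 10.0 * (3.0 * volume_cm3 / (4.0 * math.pi)) ** (1.0 / 3.0)

print(f"compressed volume: {volume_cm3 * 1000:.0f} mm^3")
print(f"equivalent drop radius: {radius_mm:.1f} mm")
```

The result is a sphere of radius just under 2 mm, which is indeed the size of a large raindrop.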
If you think nuclear matter is last-century physics, think again. The phases and properties of nuclear matter are still badly understood and certainly can’t be calculated from first principles even today.
Physicists were stunned, for example, when the quark-gluon plasma turned out to have a lower viscosity than anybody expected. They still argue about the equations governing the behavior of matter in neutron stars. Nobody has any idea how to calculate lifetimes of unstable isotopes. I recently talked to a nuclear physicist who told me that the state of the art for composites is 20 nucleons. Twenty. This brings you just about up to neon in the periodic table. And that is, needless to say, using an effective model, not quarks and gluons. The Standard Model interactions are well understood at LHC energies or higher, yes. But once quarks start binding together, physicists are back to comparing models with data, rather than making calculations in the full theory.
So matter of nuclear density containing some of the heavier quarks is a possibility. But Starkman and his collaborators prefer not to make specific assumptions and keep their search as model-independent as possible. They were simply looking for constraints on this type of dark matter, which are summarized in the figure below.
|Constraints on macro dark matter. Fig 3 from arxiv:1410.2236|
On the vertical axis you have the cross-section, on the horizontal axis the mass of the macros. The grey and green diagonal lines are useful references marking atomic and nuclear density. In general the macro could be made up of a mixture, and so they wanted to keep the density a variable to be constrained by experiment. The shaded regions are excluded by various experiments.
To arrive at the experimental constraints one takes into account two properties of the macros that can be inferred from existing data. The one is the total amount of dark matter which we know from a number of observations, for example gravitational lensing and the CMB power spectrum. This means if we look at a particular mass of the macro, we know how many of them there must be. The other property is the macros’ average velocity which can be estimated from the mass and the strength of the gravitational potential that the particles move in. From the mass and the density one gets the size, and together with the velocity one can then estimate how often these things hit each other – or us.
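The counting described above can be sketched in a few lines. The local density, speed, and the purely geometric treatment of the Earth below are my assumed ballpark inputs, not Starkman’s actual numbers:

```python
import math

# How often does a macro of a given mass pass through the Earth?
# The local dark matter density fixes the macro number density for a
# given mass; multiplying by the typical speed and the Earth's
# cross-sectional area gives the passage rate.

RHO_DM = 7e-25           # assumed local dark matter density, g/cm^3
V = 2.5e7                # assumed typical speed, cm/s (~250 km/s)
R_EARTH = 6.37e8         # Earth's radius, cm
SECONDS_PER_YEAR = 3.15e7

def earth_hits_per_year(macro_mass_g):
    n = RHO_DM / macro_mass_g        # macros per cm^3
    area = math.pi * R_EARTH ** 2    # Earth's cross-sectional area
    return n * V * area * SECONDS_PER_YEAR

print(f"{earth_hits_per_year(1e9):.1f} passages/year for a 10^9 g macro")
```

For a macro mass of 10^9 g this comes out at of order one passage through the Earth per year, and the rate scales inversely with the mass.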
The grey-shaded left upper region is excluded because the stuff would interact too much, causing it to clump too efficiently, which runs into conflict with the observed large scale structures.
The red regions are excluded by gravitational lensing data. These would be the macros that are so heavy they’d result in frequent strong gravitational lensing, which hasn’t been observed. Such constraints are also the reason why neutron stars, brown dwarfs, and black holes have long been excluded as possible explanations for dark matter. There are two types of lensing constraints from two different lensing methods, and right now there is a gap between them, but it will probably close in the near future.
The yellow shaded region excludes macros of small mass, which is possible because these would be hitting Earth quite frequently. A macro with a mass of 10^9 g, for example, would pass through Earth about once per year, the lighter ones more frequently. Searches for such particles are similar to searches for magnetic monopoles. One makes use of natural particle detectors, such as the mineral mica, which forms neatly ordered sediment layers in which a passing heavy particle would leave a mark. No such marks have been found, which rules out the lighter macros.
What about that open region in the middle? Could macros hide there? Starkman and his collaborators have some pretty cool ideas how to look for macros in that regime, and that’s what my New Scientist piece with Naomi is about. (Want me to keep interesting stories on this blog? Please use the donate button that’s in your face in the top right corner, thank you.)
Macro dark matter of course leaves many open questions. As long as we don’t really know what it’s made of, we have no way of knowing whether it can form in sufficient amounts or is stable enough. But its big advantage is that it doesn’t necessarily require us to conjure up new particles.
Do I like this idea? Holy shit, no, I hate it! Like almost all particle physicists, I prefer my interactions safely in the perturbative regime where I can calculate cross-sections the way I learned it in kindergarten. I fled from the place where I did my PhD because everybody there was doing nuclear physics and I wanted nothing of it. I wanted elementary particles, grand unification, and fundamental truths. I would be deeply disappointed if dark matter wasn’t a hint of physics beyond the Standard Model, but instead drifted into the realm of lattice QCD.
But then I got thinking. If everybody feels this way, we might be missing the solution to an 80-year-old puzzle because we focus on answers that we like and answers that are simple, not answers that are in our face yet complicated. Yes, macros are the most conservative and in a sense most depressing dark matter model around. But at least they didn’t kill the dinosaurs.