The idea underlying their calculation is that we live in a multiverse in which universes with all possible combinations of the constants of nature exist. Over this multiverse, you postulate a probability distribution. You must further take into account that some combinations of natural constants will not allow for life to exist. This results in a new probability distribution that quantifies, not how likely a universe is to exist, but how likely we are to observe a particular combination of constants. From this you can then calculate the probability of finding certain masses for the postulated particles.
As I just explained in a recent post, this is a new variant of arguments from naturalness. A certain combination of parameters is more “natural” the more often it appears in the multiverse. As Baer et al write in their paper:
“The landscape, if it is to be predictive, is predictive in the statistical sense: the more prevalent solutions are statistically more likely. This gives the connection between landscape statistics and naturalness: vacua with natural observables are expected to be far more common than vacua with unnatural observables.”

Problem is, the landscape is just not predictive. It is predictive in the statistical sense only after you have invented a probability distribution. But since you cannot derive that distribution from first principles, you really postulate your results in the form of the distribution.
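To illustrate the circularity, here is a minimal numerical sketch (my own toy model, not the calculation from the Baer et al paper): scan a single parameter, weight an assumed landscape prior by a toy anthropic factor, and read off the "most likely" value. Two different, equally unjustified priors yield two different "predictions."

```python
# Toy "landscape statistics" (illustration only, not the paper's calculation):
# the extracted "prediction" simply tracks the prior you put in.
import numpy as np

m = np.linspace(0.1, 100.0, 1000)  # scan over some parameter, arbitrary units

def most_likely(prior):
    weight = np.exp(-1.0 / m)            # toy "allows life" selection factor
    posterior = prior * weight           # landscape prior times selection
    posterior /= np.trapz(posterior, m)  # normalize
    return m[np.argmax(posterior)]       # "most likely observed" value

print(most_likely(m ** 2))   # prior favoring large m -> predicts m near 100
print(most_likely(1.0 / m))  # prior favoring small m -> predicts m near 1
```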
Baer et al take their probability distribution from the literature, specifically from a 2004 paper by Michael Douglas. The Douglas paper has no journal reference and is on the arXiv in version 4 with the note “we identify a serious error in the original argument, and attempt to address it.”
So what do the particle physicists find? They find that the mass of the Higgs boson is most likely what we have observed. They find that most likely we have not yet seen supersymmetric particles at the LHC. They also find that, so far, we have not seen any dark matter particles.
I must admit that this fits remarkably well with observations. I would have been more impressed, though, had they made those predictions prior to the measurement.
They also offer some actual predictions, namely that the next LHC run is unlikely to see any new fundamental particles, but that upgrading the LHC to higher energies should help to see them. (This upgrade is called HE-LHC and is distinct from the FCC proposal.) They also think that the next round of dark matter experiments should see something.
Ten years ago, Howard Baer worried that when the LHC turned on, it would produce so many supersymmetric particles that this would screw up the detector calibration.
Hi Sabine,
Can they, based on the natural constants, compute the likelihood of living in a universe where it is possible to rule out string theory?
Best,
J.
That's the funniest thing I've heard in at least a week. Thank you.
Let's say there are infinite possibilities; then the observer abstracts based on its program. But what does it abstract? Reality. This reality is constrained, limited by the program, and therefore reduces the percentage of probability. In this way, if we go on complicating the abstraction by making the program more and more complex, the degree of uncertainty decreases and we approach a high level of certainty. The certainty increases and probability decreases when the complexity of the program applied to the uninterpreted primordial increases. When we apply a program to the uninterpreted primordial as an act of measurement, there is interpretation, there is abstraction. Then do we need a multiverse? The program is the observer, the interpretation, the abstraction; and the uninterpreted primordial is the whole, i.e., infinite possibilities.
Did the string theory landscape even predict the mass of the Higgs boson? AFAIK, this was only predicted by asymptotic safety (by Wetterich and Shaposhnikov).
Yes, that's also my state of knowledge.
No, no, right after the measured value of the Higgs was discussed, string theory was totally able to predict it:
http://www.math.columbia.edu/~woit/wordpress/?p=4262
Quote:
"n a remarkable coincidence, after more than 25 years of unsuccessfully trying to extract a definite experimental prediction from string theory, Kane and collaborators were able to achieve the holy grail of the subject (a prediction of the one unknown parameter in the SM, the Higgs mass) just a week before the CERN announcement. "
(Of course, they also predicted gluinos, but let's just ignore that...)
In my previous comment, rather than use the word "complicated" with reference to the abstraction, let's say that as the complexity of the program increases, the abstraction, or the interpretation, becomes more and more "well defined". As the definition improves, the certainty increases and the uncertainty greatly decreases. Now, what defines or improves the definition? What is it that defines in the first place? The program, the complexity of the program.
So do we need parallel universes or is it just infinite possibilities, the whole?
Deletebee,
ReplyDelete"This upgrade is called HE-LHC and is distinct from the FCC proposal."
I'm unclear as to your position on HE-LHC.
IMO, we should upgrade to the HE-LHC, and if it finds something, build the 100 TeV machine; if it doesn't find anything, perhaps HEP should wait until wakefield technology is ready.
Obviously, with this paper, plenty of string theorists remain die-hard true believers who want a 100 TeV collider, but are there any who have taken your criticism to heart and now work on some other non-string QG theory?
"IMO, we should upgrade to HE-LHC, and if it find something, build the 100TEV, if it doesn't find anything, perhaps HEP should wait until wakefield technology is ready."
Unfortunately, upgrading to the HE-LHC would not help much, since the HE-LHC would still need 16 T magnets for a mere doubling of the LHC's present beam energy.
In addition, HE-LHC would probably need a new dedicated high-energy injector, a superconducting SPS... so a new 7+ km-long accelerator.
On top of that, one couldn't build the HE-LHC in parallel with operation of the LHC; it would need at least 8-10 years with no beams at all... which means 3-4 cycles of PhD students lost, and with them the academic positions that go with it... a real disaster for the field.
About wakefield technology, the best case is AWAKE... which can actually be run only because it profits from the existence of a HEP accelerator, the SPS... :-)
" Howard Baer and collaborators predict masses of new particles using the string theory landscape."
What about the hierarchy problem that SUSY was supposed to solve, requiring LHC-accessible SUSY partners?
This solution is replaced by the new version of naturalness. If it is natural in the new way, that solves the problem by definition.
Delete"The idea underlying their calculation is that we live in a multiverse in which universes with all possible combinations of the constants of nature exist."
Sister Bee,
Please explain conservation of physical quantities in a multiverse scenario.
I am thinking that at some nexus the universe is not able to decide between two alternative paths forward and, rather than disappoint anybody, chooses both by dividing itself into two alternative universes that vary from each other only by this one undecidable detail. Further, it sounds like these moments of indecision are not rare but rather commonplace so the universes would stack up pretty quickly. Seems like a strange way for nature to conduct itself and likely I have misunderstood something. Be that as it may, I am curious about the principle of conservation of energy and its possible violation. Does each newly minted universe amble on its way with the same endowment of energy (information etc.)? How is that possible?
Sabine, may I ask what your opinion is of Ethan Siegel and his writings? He seems to currently be promoting the bigger, better LHC, although I mostly think of him as an astrophysics writer.
If I understand you correctly, you're saying these theorists are making a circular argument.
ReplyDeleteI see what you did there ;) They are not string theorists though. This is what is called "particle physics phenomenology". But string theory serves as an excuse for all kinds of speculations. People use it to make their ideas sound credible.
And is it also the case that all these new particles predicted by the paper will appear at *exactly* the range that the proposed $20 billion collider will probe?
This is the meta-prediction I am making for all particle physics papers which will come out over the next few years. :)
As I said, they talk about the HE-LHC, not the FCC. The energy of the former is about 27 TeV, the latter 100 TeV. But if they don't see anything at 27 TeV, I'm sure they'll find a reason why it ought to be at 100 TeV.
DeleteI'm not a physicist but the bald circularity of the type of argument that begins with an a priori probability distribution is something that I often see in my field (ecology) too. I applaud Sabine for returning to that point relentlessly.
ReplyDeleteHi, Sabine.
Talking about dark matter:
Please, can you tell me whether it is mathematically consistent to postulate that dark matter could be a field whose particles don't interact at all with our "ordinary" matter (neither by the electroweak force nor by the strong nuclear force), and that it disturbs "ordinary" matter only indirectly, by deforming space-time with its mass (in a GR way)?
In other words, could there exist (transparent) fields unable to couple (in any way) to the known (accessible) ones? Could there exist something like a separate set of fields forming a "dark" Lagrangian ("dark" to us in the sense that this L does not interact with our SM one)? This independent set of fields would then manifest phenomenologically only through the deformation its mass produces in the space-time that the two (or more) independent sets of fields share.
Moreover, could an explanation of this type also be behind dark energy in some way?
Best regards,
Samuel.
1) yes
2) maybe
3) please avoid off-topic comments. I don't have time to answer random questions.
I'm very sorry for the off-topic :( May I just ask, please, whether you (or anybody) know of some paper that goes in the direction I mentioned?
DeleteBy the way, the "yes" is for the dark matter issue and the "maybe" for the possible "dark energy" relation, isn't it?
Regards.
P.S.: No more off-topic I promise.
I think in general the landscape as it pertains to this world is questionable. The string or M-theory landscape is really something that pertains to anti-de Sitter spacetimes. We exist in an approximately de Sitter spacetime. We may be in a sort of vacuum pocket of low energy in an eternally inflating spacetime similar to a de Sitter spacetime with a large vacuum energy or positive cosmological constant Λ ~ b/l_p^2 for b ~ 0.001 or so. With that vacuum energy supersymmetry would be strongly broken. Eternal inflation is also a situation of near self-similarity of fields with no energy dependency on interactions. This is a signature of a phase change, and the end of this inflation in this observable pocket broke that symmetry.
The real physics is more similar to the swampland, where string theory does not play a direct role in the way most theorists think. It may play a role with the putative phase transition mentioned above, but strings may not have much to do with standard low energy physics. They might play a role with quantum hair or fuzz on black hole horizons.
Actually, reading articles like this makes you wonder if anyone is willing to reflect these days. It's also pertinent, because arguments presented against a new collider seem to boil down to two (that are plausible to me):
a) better ways to spend that money and
b) giving people a new machine may prolong an unhealthy way of working in the field (as far as theorycrafting is concerned). Basically, it may reinforce erroneous ways where a shock could be healthy.
Until now, I was actually of a mind to say that building the FCC (or HE-LHC or similar) and NOT finding anything would be the bigger shock (albeit with a hefty price tag). In the light of documented revisions of the past by leading people in the field and some form of collective amnesia, I'm now not so sure anymore.
A stray thought on a): given that we do not know how money not spent on the FCC would be spent, aren't we relying on an unknown probability distribution here too? (Fortunately, it is not a priori unknowable, but from a pragmatist point of view, I'm not sure that makes a real difference.) Point is: assume the FCC has value V. Assume it is not built and the money is redirected to different ventures with likelihoods p_i and values v_i. Also assume that there's a bias toward spending money unwisely the more there is of it, as far as governments are concerned (ten years of a booming economy, and national debt has increased the world over...). Who's to say that the sum of p_i * v_i over all i is actually larger than V? Hope springs eternal, of course ;)
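(To make that concrete with entirely made-up numbers: suppose V = 10 and the redirected money goes to three ventures with p = 0.5, 0.3, 0.2 and v = 4, 12, 20. Then the expected value is 0.5*4 + 0.3*12 + 0.2*20 = 9.6 < 10, so not building would be the worse bet; nudge the probabilities toward the high-value ventures and the inequality flips. The comparison hinges entirely on a distribution nobody knows.)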
"You basically say that society is most likely not able to spent money for something usefull and hence it is better if society hands-over the money to the FCC people"
??? Again the same bs?
Society is not expected to hand over the money to the FCC people. Society has, for decades, relied on the decisions of groups of experts, often from fields not related to particle physics research. These panels of experts then deliver their findings, conclusions, and recommendations to the highest research bodies at the national and/or continental level.
The recipients of the multi-year funding programs then use this money in a parsimonious way, doing their best to reach the promised performances, within time and budget.
As far as accelerators are concerned, i.e. devices based on known physics, the design performance and specs have often been met, and even exceeded... too many examples could be given here.
So, please... stop your nonsensical, demeaning, evidence-deaf, patronizing rants. Thank you in advance.