There are loads of data for sure, and nuclear physicists are giddy with joy because the LHC has delivered a wealth of new information about the structure of protons and heavy ions. But the good old proton has never been the media’s darling. And the fancy new things that many particle physicists expected – the supersymmetric particles, dark matter, extra dimensions, black holes, and so on – have shunned CERN.
It’s a PR disaster that particle physics won’t be able to shake off easily. Before the LHC’s launch in 2008, many theorists expressed confidence that the collider would produce new particles besides the Higgs boson. That hasn’t happened. And the public isn’t remotely as dumb as many academics wish. They’ll remember the next time we come asking for money.
The big proclamations came almost exclusively from theoretical physicists; CERN didn’t promise anything they didn’t deliver. That is an important distinction, but I am afraid in the public perception the subtler differences won’t matter. It’s “physicists said.” And what physicists said was wrong. Like hair, trust is hard to split. And like hair, trust is easier to lose than to grow.
What the particle physicists got wrong was an argument based on a mathematical criterion called “naturalness”. If the laws of nature were “natural” according to this definition, then the LHC should have seen something besides the Higgs. The data analysis isn’t yet completed, but at this point it seems unlikely something more than statistical anomalies will show up.
I must have sat through hundreds of seminars in which naturalness arguments were repeated. Let me just flash you a representative slide from a 2007 talk by Michelangelo L. Mangano (full pdf here), so you get the idea. The punchline is at the very top: “new particles must appear” in an energy range of about a TeV (i.e., accessible at the LHC) “to avoid finetuning.”
I don’t mean to pick on Mangano in particular; his slides are just the first example that Google brought up. This was the argument why the LHC should see something new: To avoid finetuning and to preserve naturalness.
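For readers who want numbers: the fine-tuning complaint is, at bottom, arithmetic. Quantum corrections to the squared Higgs mass grow with the square of the cutoff scale Λ, so keeping the observed 125 GeV Higgs requires cancellations of roughly one part in (Λ/m_H)². A minimal sketch (order-one prefactors are dropped, so the numbers are illustrative only):

```python
# Back-of-the-envelope fine-tuning estimate: corrections to the squared
# Higgs mass scale like the cutoff squared, so the required cancellation
# grows as (cutoff / m_H)^2.  Order-one prefactors are ignored.

M_HIGGS_GEV = 125.0  # observed Higgs mass in GeV

def tuning(cutoff_gev):
    """Rough degree of cancellation needed to keep m_H at its observed value."""
    return (cutoff_gev / M_HIGGS_GEV) ** 2

for cutoff, label in [(1e3, "1 TeV"), (1e4, "10 TeV"), (1.22e19, "Planck scale")]:
    print(f"cutoff {label:>12}: tuning ~ 1 part in {tuning(cutoff):.1e}")
```

The jump from a tolerable cancellation at a TeV to one part in ~10^34 at the Planck scale is the entire content of the “new particles must appear at about a TeV” argument.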
I explained many times previously why the conclusions based on naturalness were not predictions, but merely pleas for the laws of nature to be pretty. Luckily I no longer have to repeat these warnings, because the data agree that naturalness isn’t a good argument.
The LHC hasn’t seen anything new besides the Higgs. This means the laws of nature aren’t “natural” in the way that particle physicists would have wanted them to be. The consequence is not only that there are no new particles at the LHC. The consequence is also that we have no reason to think there will be new particles at the next higher energies – not until you go up a full 15 orders of magnitude, far beyond what even futuristic technologies may reach.
So what now? What if there are no more new particles? What if we’ve caught them all and that’s it, game over? What will happen to particle physics or, more to the point, to particle physicists?
In an essay some months ago, Adam Falkowski expressed it this way:
“[P]article physics is currently experiencing the most serious crisis in its storied history. The feeling in the field is at best one of confusion and at worst depression.”

At present, the best reason to build another particle collider, one with energies above the LHC’s, is to measure the properties of the Higgs boson, specifically its self-interaction. But it’s difficult to spin a sexy story around such a technical detail. My guess is that particle physicists will try to make it sound important by arguing the measurement would probe whether our vacuum is stable. Because, depending on the exact value of a constant, the vacuum may or may not eventually decay in a catastrophic event that rips apart everything in the universe.*
Such a vacuum decay, however, wouldn’t take place until long after all stars have burned out and the universe has become inhospitable to life anyway. And seeing that most people don’t care what might happen to our planet in a hundred years, they probably won’t care much what might happen to our universe in 10^100 billion years.
Personally I don’t think we need a specific reason to build a larger particle collider. A particle collider is essentially a large microscope. It doesn’t use light, it uses fast particles, and it doesn’t probe a target plate, it probes other particles, but the idea is the same: It lets us look at matter very closely. A larger collider would let us look closer than we have so far, and that’s the most obvious way to learn more about the structure of matter.
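To put a number on the microscope analogy: the length scale a probe of energy E can resolve is roughly ħc/E. A rough sketch (the probe energies are round numbers, and for the LHC this glosses over the fact that only a fraction of each proton’s energy goes into any single parton collision):

```python
# Length scale a probe of energy E can resolve: roughly hbar*c / E.
# Numbers are order-of-magnitude only.

HBAR_C_EV_M = 1.97327e-7  # hbar*c in eV·m

def resolvable_scale_m(energy_ev):
    """Approximate smallest length scale resolvable by a probe of this energy."""
    return HBAR_C_EV_M / energy_ev

probes = {
    "visible light (~2.5 eV)":       2.5,
    "electron microscope (~100 keV)": 1e5,
    "LHC collision (~13 TeV)":        13e12,
}
for name, energy in probes.items():
    print(f"{name:32s} -> ~{resolvable_scale_m(energy):.1e} m")
```

The thirteen or so orders of magnitude between visible light and LHC collisions is what “looking at matter very closely” cashes out to.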
Compared to astrophysical processes which might reach similar energies, particle colliders have the advantage that they operate in a reasonably clean and well-controlled environment. Not to mention nearby, as opposed to some billion light-years away.
That we have no particular reason to expect the next larger collider will produce so-far unknown particles is in my opinion entirely tangential. If we stop here, the history of particle physics will be that of a protagonist who left town and, after the last street sign, sat down and died, the end. Some protagonist.
But I have been told by several people who speak to politicians more frequently than I that the “just do it” argument doesn’t fly. To justify substantial investments, I am told, an experiment needs a clear goal and at least a promise of breakthrough discoveries.
Knowing this, it’s not hard to extrapolate what particle physicists will do next. We merely have to look at what they’ve done in the past.
The first step is to backpedal from their earlier claims. This has already happened. Originally we were told that if supersymmetric particles are there, we would see them right away.
“Discovering gluinos and squarks in the expected mass range […] seems straightforward, since the rates are large and the signals are easy to separate from Standard Model backgrounds.” – Frank Paige (1998)

Now they claim no one ever said it would be easy. By 2012, it was “Natural SUSY is difficult to see at LHC” and “‘Natural supersymmetry’ may be hard to find.”
“The Large Hadron Collider will either make a spectacular discovery or rule out supersymmetry entirely.” Michael Dine (2007)
Step two is arguing that the presently largest collider will just barely fail to see the new particles but that the next larger collider will be up to the task.
One of the presently most popular proposals for the next collider is the International Linear Collider (ILC), which would be a lepton collider. Lepton colliders have the benefit of doing away with structure functions and fragmentation functions that you need when you collide composite particles like the proton.
In a 2016 essay for Scientific American Howard Baer, Vernon D. Barger, and Jenny List kicked off the lobbying campaign:
“Recent theoretical research suggests that Higgsinos might actually be showing up at the LHC—scientists just cannot find them in the mess of particles generated by the LHC's proton-antiproton collisions […] Theory predicts that the ILC should create abundant Higgsinos, sleptons (partners of leptons) and other superpartners. If it does, the ILC would confirm supersymmetry.”

The “recent theoretical research” they are referring to happens to be that of the authors themselves, vividly demonstrating that the quality standard of this field is currently so miserable that particle physicists can come up with predictions for anything they want. The phrase “theory predicts” has become entirely meaningless.
The website of the ILC itself is also charming. There we can read:
“A linear collider would be best suited for producing the lighter superpartners… Designed with great accuracy and precision, the ILC becomes the perfect machine to conduct the search for dark matter particles with unprecedented precision; we have good reasons to anticipate other exciting discoveries along the way.”They don’t tell you what those “good reasons” are because there are none. At least not so far. This brings us to step three.
Step three is the fabrication of reasons why the next larger collider should see something. The leading proposal is presently that of Michael Douglas, who is advocating a different version of naturalness, that is naturalness in theory space. And the theory space he is referring to is, drums please, the string theory landscape.
Naturalness, of course, has always been a criterion in theory-space, which is exactly why I keep saying it’s nonsense: You need a probability distribution to define it and since we only ever observe one point in this theory space, we have no way to ever get empirical evidence about this distribution. So far, however, the theory space was that of quantum field theory.
When it comes to the landscape at least the problem of finding a probability distribution is known (called “the measure problem”), but it’s still unsolvable because we never observe laws of nature other than our own. “Solving” the problem comes down to guessing a probability distribution and then drowning your guess in lots of math. Let us see what predictions Douglas arrives at:
[Slide from Michael Douglas. PDF here. Emphasis mine.]
Supersymmetry might be just barely out of reach of the LHC, but a somewhat larger collider would find it. Who’d have thought.
You see what is happening here. Conjecturing a multiverse of any type (string landscape or eternal inflation or what have you) is useless. It doesn’t explain anything and you can’t calculate anything with it. But once you add a probability distribution on that multiverse, you can make calculations. Those calculations are math you can publish. And those publications you can later refer to in proposals read by people who can’t decipher the math. Mission accomplished.
The reason this cycle of empty predictions continues is that everyone involved only stands to benefit. From the particle physicists who write the papers to those who review the papers to those who cite the papers, everyone wants more funding for particle physics, so everyone plays along.
I too would like to see a next larger particle collider, but not if it takes lies to trick taxpayers into giving us money. More is at stake here than the employment of some thousand particle physicists. If we tolerate fabricated arguments in the scientific literature just because the conclusions suit us, we demonstrate how easy it is for scientists to cheat.
Fact is, we presently have no evidence – neither experimental nor theoretical evidence – that a next larger collider would find new particles. The absolutely last thing particle physicists need right now is to weaken their standards even more and appeal to multiversal math magic that can explain everything and anything. But that seems to be exactly where we are headed.
* I know that’s not correct. I merely said that’s likely how the story will be spun.
Like what you read? My upcoming book “Lost in Math” is now available for preorder. Follow me on twitter for updates.
I agree with you more than you'd think, given my connection to the LHC. However, I do think you overstated one point.
At the end of 2018, the LHC will have recorded a mere 3% of the intended research program. That means that there is 30x more data to come. I think you'd need to see the results of all of the data before you say that the LHC was a bust. It may be. But your claim is hasty.
Mind you, I'm not claiming that the other 97% will result in a discovery. But I'm still going to dig through it to find out.
Thank you Dr. H. You are on point as always... of course my personal interest lies in what would make fine-tuning unsurprising, but that is not a scientific argument.
Thanks Bee for your blog post highlighting a crisis in the theoretical physics sector.
The current state of theoretical physics is characterised by a growing number of theorists who are advocating the defilement of scientific principles just to preserve their pet theories and huge egos. Those who still work on sound scientific principles now risk being labelled crackpots and unimaginative.
The astroparticle sector is now dominated by speculative dark matter particle theories which have been ruled out by a series of observations. It comes as no surprise these days when every anomaly observed in astrophysics is attributed to DM before a thorough scientific analysis using off-the-shelf physics is done. This shows how some theorists are so unashamedly desperate to preserve their pet theories at the expense of established scientific principles and procedures.
Yes, you are right. Sorry, I should have said that more clearly. They might still find something and if so, that will resolve the situation and I can stop complaining. It just pains me to have to see this shift of "predictions" again.
Hubris. Though it does not help that so many projects seemingly need to be 'sold' via TV and general media, a trend since the advent of cable TV.
Personally, I would prefer to see projects like eRHIC funded. Lower cost and clearly still some interesting physics to be had.
That only 3% of the LHC's experimental program is complete, as Don Lincoln indicated, is very encouraging for the prospect of new physics showing up in the relatively near future, as more and more data is crunched.
Not really - this is not linear, not by a long shot in the case of the LHC. There is very little chance new physics will be uncovered in the remaining data.
some years ago I made a "prediction" here in a comment,
Imagine they find that Higgs, but nothing else :=(
Of course that was just some black humor, but
now we have that "CERNobyl" .
"new particles must appear...to avoid finetuning and to preserve naturalness" Neither theory nor observation address common mode failure of both. Baryogenesis, Milgrom acceleration, and proton-antiproton magnetic moment divergence (DOI:10.1038/nature24048) are ~ppb vacuum chiral anisotropy.
Opposite shoes non-identically embed in chiral anisotropic vacuum background (a trace left foot). Observe the cryogenic rotational spectrum of an extreme geometric chiral-divergent enantiomer pair of prolate top polar molecules. Physics heals if enantiomers’ rotational spectra are not exactly identical and/or not exactly superposed, or with 3:1 broadened lines.
"an experiment needs a clear goal and at least a promise of breakthrough discoveries" Observe 3:1 R:S 2-cyano-D_3-trishomocubane self-calibrated by nitrogen hyperfine splitting re DOI:10.1515/zna-1986-1107. One hour in commercial equipment. Look.
A quick but weaker enantiomer probe is norborn-2-ene-(E)-5,6-dicarbonitrile from cyclopentadiene plus fumaronitrile.
"Conjecturing a multiverse of any type (string landscape or eternal inflation or what have you) is useless." There might be a multiverse in which the alternate universes are organized according to Wolfram's cosmological automaton (predicting Milgrom's MOND approximately). Google "witten milgrom" and "mond fundamental philosophy science".
Instead of just hastening to the next larger particle collider, wouldn’t it be more efficient for physics to promote experiments like those of Brukner, Everitt, Paternostro or Vedral, to get on firm ground again?
As you wrote here “In so doing, we would promote quantum gravity from mathematics to physics.”
Before tackling yet another problem, “the measure problem”, maybe we first need to settle the good old “measurement problem”. Maybe we have already reached the last rung of reductionism - no more particles. We have to put the pieces of the puzzle we have found so far with QM and GR together in just another way.
And maybe the belief in determinism is the obstacle to solving this puzzle.
You make good points. I have avoided the question which physics investment would yield the biggest bang for the buck because I don't know how to even begin addressing it.
"And the public isn’t remotely as dumb as many academics wish." That clearly does not apply to the American people. Our Secretary of Energy is none other than the religious moron Rick Perry, who was recently schooled on Fermilab's g-2 experiment. "God did it, why spend all this money?" and "What's a moo-on, anyway?" were his most cogent remarks.
For those who believe there is a "measurement problem" Klaas Landsman offers a solution.
Well. We are close to the nightmare scenario... The scenario no one wanted to believe 2 decades ago!
Option 1. SUSY or something in the reach of the LHC is found after the high-luminosity stage.
Option 2. We see only the Higgs particle (as we see), and no more.
Asymptotic safety plus gravity up to the Planck or GUT scale is possible. Shaposhnikov's model, with the only addition of 3 heavy right-handed neutrinos, elucidated such a scenario long ago AND, more importantly, almost predicted the Higgs mass without SUSY. That is quite a thing if you see it in perspective. Anyway, SUSY or something like it should exist. And now, despite the lack of evidence for SUSY (it is likely hidden, or not something like the ugly MSSM I have disliked since my undergraduate times), it must exist because... Even black holes like Kerr black holes have hidden (exotic) supersymmetries. So, the problem is not SUSY as a whole, but what or where the particles are, and, more importantly, why...
Dead theories can resurrect; atoms were rediscovered after centuries of ostracism in a box. Maybe SUSY is at that point. I must confess that I like reductionism, but not too much reductionism. Fermions do exist, so we need SUSY (or something like the ideas of Peter Freund, R.I.P., in which SUSY can be "derived" from the only-bosonic 26d theory; it is weird).
I might say... We need something better to rule out naturalness as a guide for HEP in the next decades. Naturalness is not dead yet... Not totally. To give up or delete some parameter space for dark matter is hard... Anyway, I have a question for you. I have never understood one thing relative to sparticles... SUSY is broken at low energy (apparently), at least at the energy scales we observe. If SUSY is not close to the electroweak scale, can we STILL cancel the quadratic loop corrections to the Higgs mass without fine-tuning the coupling constants? I think the answer is NO. So, personally, this is quite an issue for SUSY at the moment... I will say nothing about the divergences of the vacuum energy (that is another big problem for SUSY, lightly solved recently but with too many disadvantages to my eyes - i.e., the existence of dS vacua in SUSY or string backgrounds), but it is also related (SUSY essentially could fix lambda to zero, but lambda is not zero, so it seems to crunch the idea of SUSY in the Universe, at least in a standard or minimal way - that is good and bad).
this is a great column!
as to the question you pose "What will happen to particle physics or, more to the point, to particle physicists?" the answer might (should) be: they'll start doing theoretical physics, rather than indulging in metaphysics. As a soft matter theoretical physicist, I have been frustrated by theoretical physicists who claim to be doing 'foundational' physics research as if other theoretical physics research is less fundamental. Not only does this smack of the same sort of arrogance that 'pure' mathematicians often display towards 'applied' mathematicians, but it fails to recognize that emergence is as valid a research paradigm in theoretical physics as is reductionism. As P.W. Anderson put it:
"The main fallacy [of] the reductionist hypothesis [is that it] does not by any means imply a “constructionist” one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more the elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the rest of science. The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviors requires research which I think is as fundamental in its nature as any other…[and will show] how the whole becomes not only more than the sum of but very different from the sum of the parts. With increasing complication… we expect to encounter fascinating and very fundamental questions in fitting together less complicated pieces into a more complicated system and understanding the basically new types of behavior that can result… Quantitative differences become qualitative ones."
Moreover, new theoretical physicists might then be better able to find employment in academia rather than having to become 'quants' on Wall Street. This will not only benefit the sciences (biology as well as physics) but could be of benefit to the economy.
Re the fact that the LHC will deliver 30x more data.
Statistical errors typically only decrease as the square root of the luminosity, so the error bars will only shrink by a factor of 5-6 before the LHC is decommissioned in the late 2030s. It seems to me that if a clean discovery is ahead, it should already manifest itself in some tantalizing hints. But perhaps somebody more knowledgeable in statistics and particle experiments could tell us how large a fraction of the data set is needed to rule out a five-sigma discovery, given that only a bunch of two-sigma bumps have been seen.
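For concreteness, the square-root scaling this comment describes is easy to check (statistics only; systematic errors, which do not shrink this way, are ignored):

```python
# Statistical-only projection: uncertainties shrink like 1/sqrt(L), so the
# significance of a genuine signal grows like sqrt(L).  Systematics ignored.
from math import sqrt

def significance_scaling(sigma_now, lumi_ratio):
    """Project a current excess forward under a luminosity increase."""
    return sigma_now * sqrt(lumi_ratio)

LUMI_RATIO = 30  # ~30x more data expected over the full LHC programme

print(f"error bars shrink by sqrt(30) ~ {sqrt(LUMI_RATIO):.1f}")
print(f"a real 2-sigma excess today -> ~{significance_scaling(2.0, LUMI_RATIO):.0f} sigma with the full data set")
```

So a two-sigma bump that is a real signal would be projected to grow to roughly eleven sigma, which is why the absence of anything beyond scattered two-sigma bumps is already informative.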
Rick Perry is not a scientist and he is not well schooled in particle physics, but he visited Fermilab and was attentive, engaged and generally supportive. That's a reasonable way for him to be.
Yes, it would be better to have a passionate particle physicist supporter at the top. But DOE secretary has many things on his plate and g-2 is obviously something that he should delegate.
Only a short reply to Paul Hayes, since “the measurement problem” is off topic in this post.
In Landsman’s proposal in his book, which also refers to the Spehner-Haake model and the Flea on Schrödinger's Cat, I find a crisp principle lacking, so I would agree with the obituary with respect to this proposal, and thus the problem is not yet solved.
Sabine, do you or anyone else know anything more about the HE-LHC, not to be confused with the HL-LHC, with a 28 TeV collision energy due to magnets doubled to 14 T, reusing the same tunnel? Specifically, is it actually going to be made, or will a 100 TeV collider be built with a new tunnel near CERN? I heard China is cancelling it?
Any thoughts about discoveries at 28 TeV?
I was always brought up on the concept that if you make a prediction and nature refuses to comply, either the equipment malfunctioned or the theory was wrong. Yes, the theory could be sort of right but needs an unknown correction for something overlooked, but it seems to me that supersymmetry is just not complying. Surely that should encourage theoreticians. Something has gone wrong somewhere, which means there is more work worth doing. The problem, of course, is to imagine where current theory has gone wrong. It may be very deep.
My 13 year-old son asked this morning why, specifically, the string theory landscape is such a poor idea. I asked him: "Suppose you have a sack and you reach in and pull out a white billiard ball. What can you deduce about the contents of the sack?". He immediately replied "Nothing at all. You don't even know if there are more balls in the sack."
He's pretty sharp, that kid.
This would seem to be an ideal time to take a pause to seriously explore advanced accelerator technologies. With no urgent target in sight at higher energies, stop to see if higher energies might be reached in less hideously expensive ways. Sometimes the indirect path is best. Also, it might be easier to sell research on wakefields or whatever as gee-whiz advanced technology rather than trying to push another scale-up of the conventional approach.
Oh, Sabine, you complete me. I am a super big fan of particle physics and colliders, but this hits the nail on the head in so many ways. Truth-telling is the only way to break through this multiverse magical mathematical mindf@uck.
Downgrade the naturalness models, but do not discard them entirely. Invoking naturalness in model-building has some practical benefits that are independent of primordial principles and may be intuitively appreciated as "beauty". A multi-faceted physics program at a collider such as the LHC is likely to produce a measurable positive financial return on the investment under typical scenarios (https://arxiv.org/pdf/1603.00886.pdf). From the economics viewpoint, the LHC becomes cost-efficient through a combination of fundamental research and, as importantly, large-scale training of professionals for high-tech industry and academia. The LHC greatly expands knowledge in the subatomic domain by testing a variety of theoretical models that may eventually point to a deep fundamental principle. Naturalness-based models are as good for organizing the empirical knowledge and training of young researchers as the alternative models that should also be pursued and tested.
2019 News Flash: Sabine Hossenfelder, quantum conspiracy theorist and naturalness denier, was found unconscious and thought to be poisoned by some form of weaponized dark matter two days before she was to address the Committee for Research on Artisanal Particles, on the funding of the International Trans-Siberian Linear Accelerator. Russia has denied any involvement.
Your son has a bright future ahead :) I used a similar example elsewhere though I pulled a red marble ;) The situation with the landscape is slightly worse in that we don't even know there is a sack to pull from! Say hi to your kid,
Neo, I just heard about the HE-LHC from the latest issue of the CERN Courier.
"HL-LHC is an approved extension of the LHC programme that aims to achieve a total integrated luminosity of 3 ab–1 by the second half of the 2030s (for reference, the LHC has amassed around 0.1 ab–1 so far). HE-LHC, by contrast, is one of CERN’s possible options for the future beyond the LHC; its target collision energy of 27 TeV, twice the LHC energy, would be made possible using the 16 T dipole magnets under development in the context of the Future Circular Collider study. "
So not approved yet. But seems to me an excellent idea to get the most out of the investment already made.
Yes, people who make "firm predictions" which don't pan out should be fired because, as you say, it casts a bad light on all of science.
But you are throwing the baby out with the bathwater. As the late, great Robert Pirsig wrote, the television scientist (e.g. in a B-movie) who says "Our experiment is a failure; we didn't find what we expected" is suffering mainly from a bad script writer. If we are sure what we will find, there is no need to do the experiment. The whole point of science is finding the unexpected. No, it's not always there, but when it is, it is interesting.
The justification of the LHC or any other accelerator is not to confirm predictions. More important is to rule out predictions and thus falsify theories (no, Popper was not completely wrong). Most important is to see what happens.
In astronomy, many general-purpose instruments (the Palomar 200-inch telescope, the VLA) didn't do what they were ostensibly built to do. But they are some of the best investments in science ever. As general-purpose instruments (which particle accelerators are as well), they were able to contribute tremendously to our understanding of the universe.
Characterizing the LHC as a one-trick pony sows the seeds of its own downfall.
I’m not a physicist but am interested in physics, and as a UK taxpayer have done my tiny bit to fund the LHC and CERN. Personally I agree with the sentiment that we should just fund such experiments – there are many worse uses of public funds, and we may discover an ‘unknown unknown’ by accident. But as is said here, this argument isn’t likely to fly with the general public.
To build a large collider, in this case in the US, distinguish between ‘how to convince decision maker’ and ‘how to justify to bean counters’ (cf. standard distinction in philosophy of science between contexts of discovery and justification)
Bean counter argument – technology developed as a by-product of constructing the collider (such as the WWW at CERN, doubtless many clever advances in cunning widgets required to make the detectors work etc)
Decision maker argument – ‘we should build the biggest collider ever, it’ll be called the Trump Collider, it’ll be the best, whoever owns this collider will be the winner, but no one else has the cojones to make a decision this big, do you?... PS we will need to build a hotel on the site too for all those visitors on expense accounts, we need to think about who to give this hotel concession to…’ it’ll be approved before you leave the room.
Half a century of string theory - half a century of stasis...!
I will never understand how such an ill-conceived Wolkenkuckucksheim ("cloud-cuckoo-land" - a German term for something built out of nothing but wishful thinking) could ever be so successful! Really: branes, 10^500 landscapes, multiple invisible dimensions - come on...!!!
Now that the LHC seems to prove there is no SUSY, no mini black holes, no higher dimensions, we should be honest enough to admit that a lot of our proud, tricky and sophisticated theories have gone haywire and left nothing but a big hangover.
Alas - since the personal careers of some thousand influential physicists depend on string theory et al., there will most likely be no confession, no 'mea culpa', no (re)search for other explanations... - no: they're going to pretend that 'SuSy & the higher dimensions' lurk just around the next corner - if we only finance a still bigger accelerator: high hopes instead of solid research!
What Sabine said about the next time "we" come for money is very important. All of basic and applied physical science is dependent upon public and private funding. None of these institutions are profit-making ventures.
So the public always has a right to know which lab(s), telescope(s), supercomputer(s) and the like are meant, not just vague statements like "several other...". So if I am posting to my blog and hit that sort of expression, I go to the writer and ask for specifics before I post. I do not care if it is Kip Thorne, Don Lincoln, Brian Cox, Sean Carroll, anyone at NASA, ESO, ESA, D.O.E.. Generally they will tell me so that my work as a communicator can be as complete as possible.
"we have no reason to think there will be new particles at the next higher energies – not until you go up a full 15 orders of magnitude, far beyond what even futuristic technologies may reach."
I have heard this before. The last time I heard it it was formulated as:
"to get answers that LHC can't provide the next accelerator will need a diameter on the order of the orbit of Neptune. If we can only use strong electrostatic fields"
Idk if 'diameter = orbit of Neptune' is the same as '15 orders of magnitude more energy than the LHC', but it seems like the kind of guess that can only be off by a mere 15 orders of magnitude, right? I think "strong electrostatic fields" means the kind you get from normal everyday superconducting electromagnets. As an engineer I feel it is a safe bet we won't get a stronger field type, but of course that is 'out of my department'.
What I really want to know is where this much larger energy number comes from. If it was just the standard bigger accelerator = better, I wouldn't care (pfff, physicists, they always want more money), but now that I've heard two people try to bound the energy (granted, the bounds are crazy big), I want to know how/why you are generating that bound. Is this something to do with working back from the Planck length or vacuum energy?
Or is this like those people trying to bound the energy cost of an Alcubierre drive's warp bubble where they started at the mass-energy equivalent of Jupiter then someone else came along and changed some assumptions and got it down to the mass-energy equivalent of the moon and I concluded that some mathematicians talk out of their... never mind it's not polite
Jon Starr, a silly engineer.
For the record, I think you are performing yeoman's service on this blog, and I fully support the effort to inject some real legibility into the particle physics field. Please keep up the good work!
I also support building a nice big linear collider in Japan and converting Neptune into a giant collider, but one of those projects seems out of the scope of this millennium /sigh
Yes, the "15 orders of magnitude" figure comes from the Planck length. The associated energy scale, the Planck energy, is around 10^28 eV; the LHC is around 10^13 eV, hence the extra 15 orders of magnitude.
This is the kind of energy scale where we're shoving so much energy into such a small space that gravitational effects should become about as significant as the other interactions and we start producing things like Planck-scale black holes; if quantum gravity effects are going to show up anywhere, this is when they would definitely show up.
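The "15 orders of magnitude" figure is easy to check back-of-envelope. A minimal sketch, using rounded values for the Planck energy and the LHC collision energy (both approximations, not precision figures):

```python
import math

# Rounded values from the comment above (approximations, not exact):
E_LHC_eV = 1.3e13       # LHC collision energy, ~13 TeV
E_PLANCK_eV = 1.22e28   # Planck energy, ~1.22 x 10^19 GeV

# How many powers of ten separate the two scales?
orders = math.log10(E_PLANCK_eV / E_LHC_eV)
print(f"Planck/LHC energy ratio: ~10^{round(orders)}")  # ~10^15
```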
We need fewer physicists analyzing collider data and more of them thinking of new theories that don't involve multiverses and strings, and new ways to probe our universe!
Human technologies don't stay relevant forever. Every new piece of tech has an initial surge of utility until additional investments pay diminishing returns. Eventually cars stopped getting faster, airplanes couldn't go higher, and transistors couldn't get smaller. I have a suspicion that we have reached "peak collider". Soon they will be old technology. The new last century. Buggy whips and sails.
The Chinese will probably still build a large accelerator (CEPC or other), because they have more money than they know what to do with, and they feel an urgency to be at least equivalent to The West in every aspect.
But for everybody else - the answer is probably to look up at the skies.
We know with certainty astrophysics has deep unresolved mysteries. Most notably dark matter, which is more likely than not to have a solution related to beyond-the-standard-model particle physics. And phenomena such as supernovae, black holes and gamma-ray bursts offer a natural source of extremely high energy processes.
Fortunately, the cost of launching large payloads to orbit is coming down drastically thanks to SpaceX, especially if you consider that the timeline for design and fabrication of new space-based instruments coincides with SpaceX's BFR timeline (BFR is SpaceX's next-generation rocket, with projected payload of 150 tons to low earth orbit or 150 tons anywhere with LEO refueling).
Therefore, the particle physics community should focus on space as the source for new data in the 2025-2030 timeframe. And when we find interesting hints in that data, we can design the right Earth-based instruments to pull at those threads.
As was pointed out, the LHC has delivered a tiny amount of the luminosity it will provide over its lifetime. The experiments are busy doing what they should be doing - exploring the new energy range. I work on an LHC experiment; the atmosphere is extremely positive, we know we've barely started. It's premature to start drawing general conclusions regarding the absence of new physics at the TeV scale.
You're right to point out that CERN has delivered what it has promised. The discovery of the Higgs has rightly led to a Nobel prize - what has been achieved is of fundamental significance. Yes there has been a lot of background noise concerning SUSY, extra dimensions, black holes etc. Most of this has been generated by theorists although experimentalists are also culpable albeit to a lesser degree. However, when the community goes back to the funding bodies to request support for future projects they will rightly point out that the LHC was sold to them as a Higgs discovery machine which is what it achieved. This argument will likely carry more weight than you imply.
I don't know how representative your picture of the theoretical community is. Theorists that I speak to freely concede that naturalness/SUSY etc. may well have been misguided as a guiding principle; they are not shifting the goal posts by arguing that a higher energy collider is needed. There are certainly many who do shift the goal posts, but it's unclear to me whether they are simply a loud minority.
In any event, the community is now preparing the update to the European Particle Physics Strategy. It will be extremely interesting to see how this process pans out. There is a broad consensus for a e+e- collider. It would be irresponsible in my opinion not to make precision measurements of the Higgs; one doesn't need a raft of speculative models to make this case. Regarding higher energy hadron-hadron machines, there is no consensus one way or another that I can yet discern. The treatment of such an option in the strategy will be interesting. I suspect that there will be a lot of submissions in this area and I strongly suggest that you make your own. The strategy preparation is far from being a closed process. It is being discussed right now throughout the community.
My own opinion is that a future hadron-hadron collider would be a mistake if the non-neutrino sector of the Standard Model remains unfalsified. It would simply be an expensive way to test the naturalness paradigm. That is not to say that higher energies are uninteresting. I was a bit baffled by your comment that there is no reason to expect new physics up to the Planck scale. The history of science has shown that whenever we go to shorter distances, new (and often unpredicted) phenomena are observed. We must continue to explore. Given the costs involved, I would, however, prefer that we invest more heavily in new accelerating technologies and propose a new collider when we are in a position to take a far greater leap than LHC->FCC.
There is a large body of non-collider possibilities with which the field can progress in the absence of a FCC. For example, I'm involved in a proposed experiment to search for neutrons converting to antineutrons. There are models predicting this but my motivation is that we can improve sensitivity to the oscillation probability by three orders of magnitude and that baryon number violation has been neglected of late (proton decay doesn't count as it also needs lepton number :) ). There are many more examples of "cheap" experiments offering a potentially huge return.
There seems to be schadenfreude in many of the comments regarding the current state of particle physics. I think this is misguided. Particle physics has a lot to offer inside and outside of the collider world for the next few decades and most likely beyond.
Keep up the good work!
I'll be honest. I am not a particle physicist. I did study nuclear physics as an undergrad and solid state physics in grad school. I participated in particle physics experiments at BNL and neutron backscatter experiments at ORNL (investigating high temperature superconductors). When I consider the landscape of experiments that could be funded across medicine, energy transfer and storage, propulsion, weapons, transportation, communication, and so on I end up with a bit of an empty feeling about particle physics. Consider if we took the price tag of CERN and had instead used it to fund immunotherapy research with the stipulation that all patents live in the public domain. Do you think the world would now be better off or worse off as a planet?
If the list of all possible things we could investigate were laid out in a line with the most important first, I suspect that we would run out of funding long before we got to a new collider. Imagine you are in Perry's position. A physicist and an engineer come to you, the physicist says he wants funding to investigate the potentiality of vacuum collapse and the engineer says he wants to test a new internal combustion engine that will reduce fuel consumption by 20% and weight by 50%. As the head of the Department of Energy which one should be your priority?
You're assuming that reducing hydrocarbon consumption is one of Perry's goals?
Is it me, or is the entire idea of scientists making the assumption that "there just can't be fine tuning" rather tainting science in itself? I mean, that assumption has led to so many wacky theories with really no way to ultimately test most of them. And the assumption merely stems from a scientist's personal philosophy or deist/atheist views, not from anything observable.
We can't study what came before or what exists outside of our universe...so why are so many working so hard to try and "prove" there is no God or superior universe-creating aliens or that this is not a computer program on some computer, and letting that desire define their entire "scientific" career?
To me, it's exactly the same as some scientist trying to prove the existence of God. It can't be done and it isn't science to even try. Science is the study of this universe, not whatever else there is. And this universe so far tells us it is "fine tuned" to some extent. Whether that is by chance or by design, we can't know.
Bill, you said:
>> "And the public isn’t remotely as dumb as many academics wish." That clearly does not apply to the American people. Our Secretary of Energy is none other than the religious moron Rick Perry, who was recently schooled on Fermilab's g-2 experiment. "God did it, why spend all this money?" and "What's a moo-on, anyway?" were his most cogent remarks. <<
I'm not a particle physicist, but I am one of the American people, I am interested in the subject, and I value Dr. Hossenfelder's posts as some of the most accessible writing for us non-experts. I also tend to agree with her very reasonable opinions on the subject. In particular, her view that a new high energy collider may well be worth the investment, but must be justified more as an exploration into the unknown instead of sure verification of theoretical predictions makes a great deal of sense to me, a humble taxpayer.
As such, I find Secretary Perry's questions both reasonable and entirely relevant. Indeed, I'd be very upset if the Secretary of Energy was not asking "Why spend all this money?" It is not an easy question to answer, but if the response is, "Shut up, we know better than you, you religious moron" then I'm afraid you'll find it very hard to generate significant support for these priorities. Including from people like me, who are generally predisposed to support scientific research.
Likewise, I am happy to see our Secretary of Energy asking about muons. It indicates engagement and maybe, hopefully, even a willingness to listen. I do not expect the Secretary of Energy to know scientific details about muons, but I do want him to be willing to find experts, talk, and do his best to understand the issues - technical, financial, and political - at an appropriate level.
In short, I found your post about the American people disturbing. I understand it may be more about politics generally than SecEnergy specifically, but remember that in politics, as in most things in life, the more people you can bring together, the more likely you'll reach your goals. Please remember that it's us, the people who see value in fundamental physics research, who are asking them to do us the favor of providing their resources for our priorities. I suspect a bit more respect would be helpful when we make this request. Respect will be returned, while scorn ... well, that is also likely to be returned.
Can you or someone else comment on and summarize the results presented at the 53rd Rencontres de Moriond - EW 2018?
Has SUSY, or any hint of new physics, been discovered?
What was the size (in fb-1) of the data set analyzed, and what are the bounds on gluinos and squarks?
"Nice dig about people not caring what happens to the planet in 100 years. It's also possible that they don't put a lot of credence in what scientists say about what the planet will be like in 100 years."
The latter implies the former. Anyone who could pass a secondary-school general science class and who looks at the data can tell that what scientists are saying about the planet is reasonable. Those who can't/don't have no basis to doubt the scientists, unless they just don't care.
I've been in science research for just over 4 decades. It's not enough to ask if a project contributes to science...it is at what cost and at the expenditure of what intellectual capital? CERN was worthwhile...son of CERN not so much.
My expectation is that physics won't be advanced as an academic discipline but as a commercial one. Whatever the 'secrets' of existence are, they will be revealed via advances in AI and not higher energy particle accelerators. Apple, Google, etc. are where the 'new physics' is happening and will be in coming decades. AI will have practical nitty-gritty use, and theoretical physics will be sifting through the applications to find out what 'reality' is all about. Most likely, AI will be able to answer direct questions about 'existence'.
None of this is far off. 20 years? If not, then 40 or 75. Inevitable.
CERN is fun. Apollo was fun. However, humans are about to make an existential shift into a higher plane of learning via AI.
You seem to suggest that with humanity having so many high priority, practical needs that are still unmet, we might be better served if the large sums spent on particle physics research were instead spent elsewhere. I think there are several problems with that way of looking at cost vs. benefit of particle research.
First, every important unsolved problem almost certainly receives research funding at some level already. The question is how much.
Second, there is no reason to believe that money taken from particle physics would flow to other areas of need. Back when the SSC was under construction, some prominent physicists thought SSC costs were starving other areas of funds, and lobbied to have it cancelled. But once Clinton cancelled the SSC the savings went to reducing the budget deficit, not to other physics research programs. The idea that total funding is fixed, so that reducing research in one area (particle physics) will free up funds for research in an unrelated area (immunology), doesn't work.
(And weapons vs. particle research? Seriously? There is no shortage of money spent on weaponry; why not shift some of those funds?)
Third, governments decided to fund CERN because they thought its research was worth doing on its own merits, not because they thought it was better to spend money on CERN than on immunology research. Looking at funding that way, it's easier to see why the idea of dividing a fixed-size pot of research money doesn't match what governments do.
Fourth, making great research progress in immunology (or whatever) isn't simply a matter of making lots of money available. You need to have enough highly qualified researchers and research lab space to take advantage of it, or else you'll end up wasting a lot of money or having it sit unused. That requires lots of new PhD's and postdoc positions first to train lots of researchers in your targeted areas--you can't hire a zoologist into an immunologist position--and you must build shiny new government laboratories for them. It presupposes there are lots of people interested in becoming immunologists (or whatever) who are both talented and prepared to do the grueling work of earning their PhD. It presupposes funding will continue at the greatly increased levels indefinitely, so all these new researchers can have a full career rather than being "surplussed" once certain objectives are achieved.
Fifth, comparing the benefits of fundamental physics research with the benefits of medical or alternative energy research isn't appropriate. They have different goals. The goal of fundamental physics research is understanding the foundation of the physical world at its deepest level--it is closer to art, literature, philosophy, music, even entertainment in general; it is not at all like engineering, transportation, and healthcare, all of which are obviously important in their own right. As some people put it, there are things that help us live, and other things that make life worth living. Why do people spend so much money on music, travel, movies, etc.? Certainly not because it is a practical necessity for life in most cases!
Finally, your (rhetorical?) question about research into vacuum collapse vs. engineering a more efficient internal combustion engine: I don't think Rick Perry should be involved in funding the engineer at all. That kind of project should be pursued by private enterprise, not the government. If the engineer has a credible story he/she shouldn't have trouble getting initial funding from existing companies, venture capitalists, "angel" investors or other private funding sources. Government funding should focus on potentially important or very interesting research that is too risky or unprofitable for private enterprise to justify. However, I don't think Rick Perry should be involved with metastable vacua research either--that isn't expensive and can be evaluated/funded elsewhere.
"Idk if 'diameter = orbit of Neptune' is the same as '15 orders of magnitude more energy then LHC'"
No. Collision energy is proportional to the radius/diameter of the accelerator (with fixed field strength of the magnets). So it's just 9 orders of magnitude. Bee is obviously more pessimistic than your source.
"As an engineer I feel that it is a safe bet we won't get a stronger field"
At the LHC, they're playing it safe with 8T field strength. There is, so far, no 'natural law' excluding materials with much higher critical field strengths, say, 60T or some such. But field energy per volume goes with the field strength squared! A 60T magnet 'quenching', i.e. losing superconductivity, will practically explode, and destroy parts of the bigger machine it is in. I presume you are familiar with the concept of 'playing it safe'.
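For what it's worth, the "9 orders of magnitude" estimate checks out with rough numbers. A quick sketch, assuming the LHC's ~27 km circumference and Neptune's orbital radius of roughly 30 AU (both values are my assumptions, not from the thread):

```python
import math

# Collision energy scales with ring radius at fixed magnetic field,
# so compare the two radii directly. Rough values (assumptions):
lhc_radius_m = 27e3 / (2 * math.pi)   # LHC circumference ~27 km
neptune_orbit_m = 4.5e12              # Neptune's orbital radius, ~30 AU

# Powers of ten between the two ring sizes (hence collision energies):
orders = math.log10(neptune_orbit_m / lhc_radius_m)
print(f"radius (hence energy) ratio: ~10^{round(orders)}")  # ~10^9
```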
You are certainly more optimistic about the prospects for AI uncovering the "secrets of existence" than I am! AI algorithms can be very good at analyzing large amounts of data and finding patterns, but my understanding is that it can be very difficult to then work backward from the results to get a human-friendly, logical/coherent chain of reasoning for why those results are correct and sensible. Without that, one must simply trust that the results are true, perhaps subject to some consistency checks with new data samples or other means.
If an AI were ever able to determine any "secrets of existence" then it's hard to see how those results could be translated into a kind of general theory that is intelligible to humans. For that matter, it's hard to see how a general theory could emerge from humans combing through data and looking for patterns either: it seems that observations and data analysis could never be sufficient in themselves to arrive at a theory like quantum mechanics or general relativity; creative speculation and application of existing mathematical tools in possibly-new or unexpected ways (for example) are also needed, and these aren't implied by the data. If all you end up with is a bunch of rules, patterns and conditions with no general principles or statements that tie them together in a general way, would you feel like your AI has explained any secrets of existence in an intellectually satisfying way, or has indicated where new experiments or observations can open up further discoveries?
Of course, one should never say "never" -- it's logically possible that machines will someday be constructed to do what you say. But I can't help but wonder if the amount of effort required to do so may make it an intractable project, or at least an intolerably expensive and time-consuming one.
"Is it me, or is the entire idea of scientists making the assumption that 'there just can't be fine tuning' rather tainting science in itself?"
People used to think that the internet would provide so much information that people would be better informed than before it existed. In many cases, though, what happens is that there is so much information that people spend all their time reading stuff which confirms their prejudices: conspiracy theories, fake news, etc. Let's face it, something as absurd as PizzaGate wouldn't have happened in the 1970s.
We shouldn't let this happen in science. Don't read just the blogs of people you agree with.
Not all scientists, probably not even most, say that "there just can't be fine tuning". Most don't even think about it. Among those that do, there are basically two camps. One seems to be Sabine's opinion: if we don't know the underlying probability distribution, then we can't say anything, so any observed value is OK. The other does see evidence for fine-tuning though the cause of it is a different question.
Personally, I think that a) there is fine-tuning and b) the weak anthropic principle in the multiverse is by far the best explanation. For a good overview, read a book which I reviewed recently.
Could it be that mathematics does not represent our Universe, merely our perception of the Universe?
In your article you mention that potential vacuum decay won't occur until the distant future if it is determined that we do in fact live in a metastable universe.
This is not necessarily true. Vacuum decay can occur at any time, and may already be propagating through the universe at this very moment, but we would not know until the event was upon us due to the principle of locality (if this principle even holds in such an event).
Physics predicts that if we do live in a metastable universe, it would theoretically 'last' longer than the current age of the universe before this catastrophic event occurs naturally, although this does not take into account possibly unobserved high-energy events (natural or artificial) which could trigger destabilization.
"When looking a hundred years into the future, "reasonable" covers a pretty large territory, too large to spend trillions now."
Translation: you don't care what happens 100 years from now, because caring might cost you some money (i.e., reduce your lifestyle in some way - money itself has no value unless it is spent; most measures would act to conserve resources, e.g. fossil fuels and forests and fresh water and oxygen, not spend them; and money that is spent out of your pocket goes into someone else's pocket, which is what drives the economy and provides jobs) and some unforeseen miracle might alter the current obvious trend. This is why you don't care, not a demonstration of caring. To repeat, there is no dichotomy, as you implied in your first comment, between those who don't care what happens in 100 years and those who choose not to agree with the (huge) scientific (and economic) consensus.
I suppose another way of saying that is that you care as much or more about your current lifestyle than you care about what happens in 100 years, so your net care is zero or negative but your individual care (for the future) might not be zero. I can accept that semantic difference. Dr. Hossenfelder and I (neither one of us a climate researcher, and in my case not a scientist) are talking about the net care.
(Whether or not that lack of net care is justified is another issue. I have a strong opinion on that, but will not violate the site's commenting rules to pursue that off-topic issue.)
wereatheist: At very very high energies, circular colliders are no longer practical because the energy lost to synchrotron radiation becomes too large (it increases as energy^4). So having stronger magnets doesn't help. Instead you have to use a linear collider which doesn't suffer from synchrotron radiation.
This would of course have to be very long (possibly that's where the diameter of Neptune's orbit statement comes from), but the biggest problem isn't building a very long collider, it's finding the energy to run it. Most high-energy collisions are uninteresting and due to low-energy processes - only a very small fraction tell you anything about new physics, and that fraction falls as 1/energy^2. So the total energy needed to run a collider goes as the collision energy cubed...
I've thought a bit about how such colliders might look - if you're interested, have a look at my paper: https://arxiv.org/abs/1704.04469. Even with some tricks I came up with to reduce the energy use, a GUT-scale collider ("only" 11 orders of magnitude above the LHC) would use all the energy the Sun produces in 400 years. A Planck-scale collider would be another 12 orders of magnitude above that!
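The cubic scaling in that argument can be made explicit with a toy function (a sketch of the naive scaling law only; the actual estimates in the paper include further effects and energy-saving tricks):

```python
def relative_run_energy(energy_ratio):
    """Run energy relative to a reference collider, under naive E^3 scaling.

    The useful-event fraction falls as 1/E^2, so you need ~E^2 times more
    collisions for the same physics reach, and each collision costs ~E more
    energy, giving ~E^3 overall.
    """
    collisions_needed = energy_ratio ** 2   # compensate the 1/E^2 useful fraction
    cost_per_collision = energy_ratio       # each collision carries more energy
    return collisions_needed * cost_per_collision

# Going up 3 orders of magnitude in collision energy costs ~9 orders in run energy:
print(relative_run_energy(1e3))  # 1e9
```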
Thanks for the reference, that looks like a fun paper!
On funding, perhaps there is encouragement to be found in the US Space Launch System, which continues to be well funded though it has no identified missions or payloads. Point is, it keeps money flowing into Alabama, Florida, Texas, and Utah. Which is, on the evidence, sufficient to keep it going.
Thanks! I had great fun writing it!
BTW I've since realised that part of it (about what happens in the collision region) isn't quite right. I think I know how to fix it, I just haven't had time to write it up. But we're getting quite far off topic...
Well, naturalness arguments aside: I have long held a (possibly mistaken) prejudice that it is worth chasing up to about 2 TeV at least. This is based on a classical soliton calculation which indicates that as the Yukawa coupling is increased, the Higgs field buckles and is forced to zero in the vicinity of the fermion. The upshot is that the fermion mass maxes out at about 2 TeV, regardless of how high you crank the coupling. Ergo, if there are any more fermions to be found, they are less than 2 TeV. Assuming a lot of things of course... as always we should interrogate the world.
Excellent column Sabine. You are right about the sociology; part of the problem is that once an organization like CERN has thousands (or even just hundreds) of staff, the leadership have a moral and sometimes legal responsibility to keep the ball rolling. I hear your point that the propaganda mainly comes from theorists, but when it comes to proposal time, these papers are bound to be prominently cited.
On the other hand, there are some cases where decades of failed predictions that discoveries would be just over the (ever-receding) experimental horizon eventually panned out, for instance of fluctuations in the microwave background radiation, and gravitational waves. But the failure to find beyond-standard-model physics may be a record-holder in terms of duration, and certainly in terms of integrated effort/money expended. (OK neutrino mass, but my first encounter with the SM was in Mandl & Shaw's 1984 textbook, which already assumed neutrino mass was an option).
Your "not translatable German term" is just the German translation of Νεφελοκοκκυγία, from Aristophanes' play The Birds (414 BC). In English it's "Cloud-cuckoo land" and the meaning is just as you describe.
Thanks for another fine article.
I also enjoy the comments from your readers. Very informative. I just wish that the snarky political comments were omitted.
The point which you make, that some long outstanding observation was eventually successful, is one that people frequently bring up, so I want to comment on this.
In almost all cases that have been mentioned to me the phenomenon they were looking for was predicted by an already confirmed theory and/or the non-observation would have left behind an actual inconsistency. That was the case for gravitational waves (non-observation would have been in conflict with GR), the Higgs or something like it (non-observation would have screwed up unitarity), neutrinos (non-observation would have brought back the energy-conservation issue).
The case you mentioned with the CMB was one of measurement precision - they already knew the CMB was there.
One peculiar case is that of proton decay, which was never measured, but the measurements turned out to be useful to confirm neutrino oscillations. Though we already knew something was at odds with the solar neutrinos, so one can debate whether or not this was a new discovery.
The situation we have now is entirely different: These are predictions which have no basis. These theories are not necessary for consistency. They have been invented because the standard model isn't pretty enough. There is no theoretical reason whatsoever to believe there are any more particles than the ones we have already measured (not for the next 15 orders of magnitude in energy).
Yes, we know that something is at odds with GR because of dark matter. But there is no particular reason to think it's this or that type of particle (or a particle to begin with). GR alone doesn't tell you that. Indeed, that it's easy enough to move predictions for direct detection out of experimental bounds tells you already that theoretical consistency alone isn't enough to say anything about what dark matter is.
Very interesting article and comments. I always thought the argument of naturalness was to avoid just-so stories and deus ex machina explanations. But we have examples in astrophysics and cosmology where "degrees" of naturalness allowed incorrect ideas to persist beyond rational defense. So a warning sign that we are reaching limits to a given approach is an idea beauty contest. With lots of self citations.
It's more than high time that we go to CERN and turn the main breaker switch to the 'OFF' position. It's all fun and games until the funds run out.
Secondly: there can only be so many particles that make up the universe. The universe is there; it's using mechanics and particles to create the flavour of chaos we are using to make up what is loosely defined as 'reality'. It's got to be a finite number of particles. There can't be a substrate of particles that make up quarks, and then even smaller particles that make up those particles.
At some point we have to run out of physics to describe the universe. At some point somebody's going to say 'that's it, that's what it takes to make this universe, there's nothing more that can be found.'
I like string theory as the next frontier that explains how the universe works. But in the words of the late, great, inimitable Richard Feynman: the only thing string theory produces is excuses. Somebody needs to come up with something better.
I notice the particles presented in the Standard Model have gravity glued next to them, like it's a piece of paper that we needed to write the last bit of the test on. There's going to be more to it. There has to be more to it.
All my beloved sarcasm notwithstanding: we still don't understand gravity, we still have no idea what dark matter / dark energy is, and somehow that translates into an inability to justify the funding for a next-generation machine. I very much have the sense of the director of the patent bureau who, near the end of the 19th century, wanted to close the bureau because 'almost everything that can be invented has already been invented'.
Thirdly: I live in the age where vulture capitalism tries to take everything away from the most amount of people to give as much as possible to a very select few people who have no sane use for the insane amounts of money they already have. I pay taxes for these assholes. So, I want the next generation machine to be built. It can't be all that much money. And if it is a lot of money, I still want it to be built so that I at least have the idea that it's going to be used by people who might find something we hadn't thought we'd be finding.
Hop to it. Think of something smart to do. Or if you can't, find someone who can. Money cannot be an object. Go do it. Tell them I sent you.
Dave: is your calculation public?
Anyway, it sounds like it is only intended to apply to fermions that get their mass from a Yukawa coupling to the Standard Model Higgs. A new particle could couple to a new Higgs with a vev at a different scale, or its mass could come from compositeness or from other effects.
Thanks for an excellent article, one someone with a mere BSEE could follow. I'd add that in its pursuit of funding, particle physics faces a problem that climatologists don't:
1. As you mention, any possible risk to our existence has to be unimaginably far into the future, not a mere ten or twenty years out. Politicians have trouble thinking beyond their next election.
2. What particle physics is discovering is simply how the universe works at the smallest levels. No one is arguing that human behavior is altering basic cosmological constants that could push the entire universe into chaos.
Those are both opportunities that climatologists and their kin are exploiting to get funding for their projects. Unfortunately, their behavior is destroying the credibility of science. When their short-term claims of global disaster fall flat, as indeed they already have, everyone in science is hurt. And this is coming alongside the discovery that dietary advice given since the Seventies has been bogus.
The danger lies in particular with getting the support of those with a healthy sense of skepticism based on life experience. I am sensible enough to know that when something precise like "global warming" morphs into something meaningless like "climate change", a scam is taking place. And when every possible disadvantage of that "climate change" is hyped but none of the potential benefits gets mentioned, I feel like I am in the presence of an elaborate, money-seeking deception.
Also, take note that many people judge subjects they little understand by what they do understand. Claims about what will happen to our weather (climate science) or about new particles (particle physics) that don't prove true damage that long-term credibility. That is one of the main points of this article: don't overpromise. If you do, you'll be in trouble when you under-deliver.
Perhaps there are a couple of misunderstandings here? I'm not sure why you're bringing climate science into the discussion, but since you did, it's worth pointing out some problems with your claims. Being skeptical is healthy, but basing skepticism on misunderstandings or unreliable information isn't.
First, it's not clear what you're referring to when you say
Those are both opportunities that climatologists and their kin are exploiting to get funding for their projects. Unfortunately, their behavior is destroying the credibility of science. When their short-term claims of global disaster fall flat, as indeed they already have, everyone in science is hurt.
What time frame are you referring to when you say "short term" and what groups of climate scientists are predicting short-term global disaster? The projections I see are all decades away at least, but if that is short term to you then I don't see how you can say they are wrong. There certainly is no indication that global warming is slowing.
But maybe you're not distinguishing climate from weather? You suggest as much when you write: "Claims about what will happen to our weather (climate science)..." Weather is short term, and there's no "global weather." Climate is a long-term average; short-term weather tells you very little about it. Meteorologists are the ones concerned with weather.
Second, it's true that global warming can have some benefits, but they pale in comparison to the downside that would accompany those benefits: mass starvation, mass migration, mass extinctions, many more regional wars, economic catastrophe, etc. (Maybe you think that's "alarmist" but what objective evidence is there to back that up?) Why should anyone play up the benefits when they're relatively trivial?
Finally, there seems to be a misunderstanding behind your claim, I am sensible enough to know that when something precise like "global warming" morphs into something meaningless like "climate change" a scam is taking place. For example, see https://www.skepticalscience.com/climate-change-global-warming.htm. I'll quote from that:
The argument "they changed the name" suggests that the term 'global warming' was previously the norm, and the widespread use of the term 'climate change' is now. However, this is simply untrue. For example [...] The journal 'Climatic Change' was created in 1977 (and is still published today). The IPCC was formed in 1988, and of course the 'CC' is 'climate change', not 'global warming'. There are many, many other examples of the use of the term 'climate change' many decades ago. There is nothing new whatsoever about the usage of the term.
Those who perpetuate the "they changed the name" myth generally suggest two reasons for the supposed terminology change. Either because (i) the planet supposedly stopped warming, and thus the term 'global warming' is no longer accurate, or (ii) the term 'climate change' is more frightening.
The first premise is demonstrably wrong, as the first figure above shows the planet is still warming, and is still accumulating heat. Quite simply, global warming has not stopped.
The second premise is also wrong, as demonstrated by perhaps the only individual to actually advocate changing the term from 'global warming' to 'climate change', Republican political strategist Frank Luntz in a controversial memo advising conservative politicians on communicating about the environment: [...]
Hi! I'm curious whether you think naturalness is an especially poor appeal in QFT, since it's being used to fix a perturbative expansion? Or to put it another way: even if the full theory had some nice aesthetic property, I don't see why we would expect it to be manifest in what is fundamentally an approximation!
The work of many minds may be seen as an implementation of Artificial Evolution (AE) for searching theory space. By observing their progress (or lack thereof) you can learn what fundamental limits exist in the space without having to actually know what lies there in detail.
Whenever and wherever we encounter a deep problem that has remained in place for a long period of time without decisive resolution, we can draw one of several conclusions. Either (1) the process itself is unsound, e.g. the underlying random walk has constraints that prevent it from straying beyond certain well-defined barriers - usually a form of group-think or confirmation bias that exhibits itself as strict adherence to certain normative, linguistic or editorial standards; (2) the area being explored is a dead end and no solution actually exists in it - which is not mutually exclusive of (1); or (3) the solution is too complex to be humanly comprehensible and learnable, and is therefore of no practical use or benefit (except to machines), with no way to pass it down to later generations by virtue of its being unlearnable. The more time that passes, and the more people involved in the process without resolution, the more firmly established the conclusion becomes.
As a corollary to the AE premise: each time you brainstorm a potential hot-spot or collision point, all you need to do is a search, and you crowd-source the answers or find a void. For instance, here are some key phrases that reveal the most interesting results and recent activity: "gravitational decoherence", "quantum equivalence principle", "gravitational entanglement" (a lot of experiments are in the works for each of these), "hybrid classical quantum dynamics", "graviton searches" (with recent results from 2018-2019), "dilaton dark energy", "right-handed neutrino dark matter", etc.
Much of this process, in fact, can be automated. But if you can brainstorm, you can do this right now, by hand.
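As a minimal sketch of what that automation might look like: the arXiv API is a real, public endpoint for exact-phrase literature searches, and the snippet below just builds query URLs for the phrases listed above. The choice of phrases, the `max_results` parameter, and the idea of using hit counts as a "hot-spot" probe are all assumptions of this sketch, not anyone's published method.

```python
from urllib.parse import quote

# Key phrases from the comment above, used here as illustrative search terms.
PHRASES = [
    "gravitational decoherence",
    "quantum equivalence principle",
    "gravitational entanglement",
    "hybrid classical quantum dynamics",
    "graviton searches",
    "dilaton dark energy",
]

def arxiv_query_url(phrase, max_results=5):
    """Build an arXiv API URL for an exact-phrase search across all fields."""
    q = quote(f'all:"{phrase}"')  # URL-encode the quoted phrase
    return (f"http://export.arxiv.org/api/query?"
            f"search_query={q}&max_results={max_results}")

# Fetching each URL (e.g. with urllib.request) returns an Atom feed whose
# entry count hints at how active the topic is; printing the URLs suffices here.
for p in PHRASES:
    print(arxiv_query_url(p))
```

Each URL can then be fetched and the returned Atom feed parsed for entry counts, which is one crude way to tell a crowded topic from a void.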
Thanks for dissecting these false promises and restoring a little bit of honesty to this weird discussion about future colliders.