## Saturday, February 23, 2019

### Gian-Francesco Giudice On Future High-Energy Colliders

[Image: Gian-Francesco Giudice. Source: Wikipedia]
Gian-Francesco Giudice is a particle physicist and currently Head of the Theoretical Physics Department at CERN. He is one of the people I interviewed for my book. This week, Giudice had a new paper on the arXiv titled “On Future High-Energy Colliders.” It appeared in the category “History and Philosophy of Physics” but contains little history and no philosophy. It is really an opinion piece.

The article begins with Giudice stating that “the most remarkable result [of the LHC measurements] was the discovery of a completely new type of force.” By this he means that the interaction with the Higgs-boson amounts to a force, and therefore the discovery of the Higgs can be interpreted as the discovery of a new force.

That Higgs-boson exchange gives rise to a force is technically correct, but this terminology creates a risk of misunderstanding, so please allow me to clarify. In common terminology, the standard model describes three fundamental forces (stemming from its three gauge symmetries): the electromagnetic force, the strong nuclear force, and the weak nuclear force. The LHC results have not required physicists to rethink this. The force associated with the Higgs-boson is not normally counted among the fundamental forces.

One can debate whether or not this is a new type of force. Higgs-like phenomena have been observed in condensed-matter physics for a long time. In any case, rebranding the Higgs as a force doesn’t change the fact that it was predicted in the 1960s and was the last missing piece in the standard model.

Giudice then lists reasons why particle physicists want to further explore high energy regimes. Let me go through these quickly to explain why they are bad motivations for a next larger collider (for more details see also my earlier post about good and bad problems):
• “the pattern of quark and lepton masses and mixings”

There is no reason to think a larger particle collider will tell us anything new about this. There isn’t even a reason to think those patterns have any deeper explanation.
• “the dynamics generating neutrino masses”

The neutrino masses are either of Majorana-type, which you test for with other experiments (searches for neutrinoless double-beta decay), or they are of Dirac-type, in which case there is no reason to think the (so-far missing) right-handed neutrinos have masses in the range accessible to the next larger collider.
• “Higgs naturalness”

Arguments from naturalness were the reason why so many physicists believed the LHC should have seen fundamentally new particles besides the Higgs already (see here for references). Those predictions were all wrong. It’s about time that particle physicists learn from their mistakes.
• “the origin of symmetry breaking dynamics”

I am not sure what this refers to. If you know, please leave a note in the comments.
•  “the stability of the Higgs potential”

A next larger collider would tell us more about the Higgs potential. But the question of whether the potential is stable cannot be answered by this collider, because the answer also depends on what happens at even higher energies.
• “unification of forces, quantum gravity”

Expected to become relevant at energies far exceeding that of the next larger collider.
• “cosmological constant”

Relevant at long distances and not something that high-energy colliders test.
• “the nature and origin of dark matter, dark energy, cosmic baryon asymmetry, inflation”

Giudice then goes on to argue that “the non-discovery of expected results can be as effective as the discovery of unexpected results in igniting momentous paradigm changes.” In support of this he refers to the Michelson-Morley experiment.

The Michelson-Morley experiment, however, is an unfortunate example to enlist in favor of a larger collider. To begin with, it is somewhat disputed among historians how relevant the Michelson-Morley experiment really was for Einstein’s formulation of Special Relativity, since you can derive his theory from Maxwell’s equations. More interesting for the case of building a larger collider, though, is to look at what happened after the null-result of Michelson and Morley.

What happened is that for some decades experimentalists built larger and larger interferometers looking for the aether, without finding any evidence for it. These experiments eventually grew too large, and this line of research was discontinued. Then the Second World War interfered, and for a while scientific exploration stalled.

In the 1950s, due to rapid technological improvements, interferometers could be dramatically shrunk back in size and the search for the aether continued with smaller devices. Indeed, Michelson-Morley-like experiments are still made today. But the best constraints on deviations from Einstein’s theory now come from entirely different observations, notably from particles traveling over cosmologically long distances. The aether, needless to say, hasn’t been found.

There are two lessons to take away from this: (a) When experiments became too large and costly they paused until technological progress improved the return on investment. (b) Advances in entirely different research directions enabled better tests.

Back to high energy particle physics. There hasn't been much progress in collider technology for decades. For this reason physicists still try to increase collision energies by digging longer tunnels. The cost of a next larger collider now exceeds $10 billion. We have no reason to think that this collider will do anything besides measure details of the standard model to higher precision. This line of research should be discontinued until it becomes more cost-efficient again.

Giudice ends his essay by arguing that particle colliders are somehow exceptionally great experiments and therefore must be continued. He writes: "No other instrument or research programme can replace high-energy colliders in the search for the fundamental laws governing the universe."

But look at the facts: The best constraints on grand unified theories come from searches for proton decay. Such searches entail closely monitoring large tanks of water. These are not high-energy experiments; you could maybe call them "large volume experiments". Likewise, the tightest constraints on physics at high energies currently come from the ACME measurement of the electron's electric dipole moment. This is a high-precision measurement at low energies. And our currently best shot at finding evidence for quantum gravity comes from massive quantum oscillators. Again, that is not high energy physics.

Building larger colliders is not the only way forward in the foundations of physics. Particle physicists only seem to be able to think of reasons for a next larger particle collider, not of reasons against it. That is not a good way to evaluate the potential of such a large financial investment.

#### 47 comments:

1. I think "the origin of symmetry breaking dynamics" means dynamical electroweak symmetry breaking models, i.e. technicolour, composite Higgs models etc.

1. Hi Francis,

Thanks. Yes, that's possible. So, probing alternatives to the SM Higgs mechanism, basically. Makes sense.

2.
We are told that the 27-kilometer-long Seoul Subway Line 9 was built at a cost of USD 40 million per kilometer (2010 value). I'm not sure that the cost of tunneling is the major cost of the next generation of accelerators.

Of course, to minimize land acquisition costs and to have a favorable geology, the accelerator may have to be built in a place where physicists may not want to live long-term.

1. Arun,

I think that's right. But a tunnel is not an accelerator. There is stuff that has to go into the tunnel which needs to be paid for.

3. I don't have answers for this, but one can ask questions like the following: Suppose we spent 20 billion on a proton decay monitor; what bounds could we put on the lifetime of the proton? Suppose we spent 20 billion on neutrino physics experiments; how well could we measure the masses of the neutrinos, and their other unknown properties? Suppose we spent 20 billion on a cosmic ray detector; what would we observe? Which of these is most likely to tell us more about fundamental physics?

Shouldn't we at least ask these questions before we spend 20 billion on the next-generation collider?

4. Dear Dr. Hossenfelder
The longer this goes on, the more I get the impression the situation is completely messed up. The particle physics community obviously knows that the next bigger collider is a shot in the dark. I read a few of the proposals to the European Particle Physics Council, and some of them are either very cautious about building a new collider or just plain against it until we know more. That Dr. Giudice is in favor is no wonder; it is his job. But arguing that we should build the FCC because finding nothing would be the moment to change paradigms is beyond me. By this standard, we could easily argue that the LHC results are close enough to a null result (just the Higgs and nothing else) that we can change paradigms now.

I always cringe when ROI comes up. Is there an ROI for the LHC? ROI is in fact a number that is often hard to calculate precisely, but one can at least get a pretty good approximation. Not so in the case of the LHC, or any other fundamental research that does not end up in something people want to buy. Or did anybody sell a Higgs lately?

Or let's turn it around: For all those who expected only the Higgs from the LHC, the ROI is positive; for all those who expected more and did not get it, it is negative. Who is right? Whose ROI is right? Is the insight that we need another, bigger collider part of the ROI?

(Just as an explanation: ROI = Profit / Invested Capital, and since it is impossible to know the "Profit" of the LHC, I just assume that the profit is Achieved Results minus Expected Results.)

If they really think that a null result would be a good result, the ROI would always be positive. This is fundamentally wrong.

ROI isn't a good argument, because we can never agree on what the "profit" is. Especially not if a null result counts as "profit".

The discussion should not be about the money and who gets it, because that is where politics gets involved, and that is very unpredictable.

It looks like the European Particle Physics Community can now decide to change the paradigm, or carry on for another 30 years until we find out that we need yet another, even bigger collider... and so on. For the sake of saving CERN from a very probable huge embarrassment in 30 years (an even bigger one than the LHC), it might be more prudent to change paradigms now. But alas: with organizations like CERN, as cool as I think it is, the older they get, the more they want to keep what they have and the less they are pioneers, and nobody who is in charge now will be around when the big hammer falls. Therefore the FCC will be built, if only because CERN has no better idea.

1. Christian,

People have indeed tried to do this, see e.g. here. In principle, I think, it's a good approach. The question, though, is what to compare it to. E.g., as the abstract notes, while it's a positive ROI, it's not a large one, and one thus has to ask whether one could have invested the money better, which raises the question: better in what category?

My argument has mostly focused on comparing investments in the foundations of physics (noting that other experiments have a larger expected benefit but are less costly), but of course one could more generally ask whether that's something we should be investing in at the present time to begin with, or whether mankind has bigger problems than quantizing gravity.

Giudice is a politician who is trying to protect an industry. His arguments are rhetorical.

6. Another good analysis.
If LHC funding money could be diverted to more appropriate research, I wonder who would benefit?
They should speak up, out of self-interest at least.

7. Bee,

what's your position on the HE-LHC, reusing the same 27 km LHC tunnel and simply upgrading the magnets from 8 tesla to 16 tesla, for an increase from 14 TeV to 28 TeV?

8. A small correction: Although KATRIN is designed to measure the mass of the electron neutrino by looking at the endpoint of the tritium beta-decay spectrum, I'm not aware of any way for that experiment to determine whether the neutrino is Majorana or Dirac. The only experiments I know of with that capability are neutrinoless double-beta decay experiments like MJD, GERDA, CUORE, EXO, KamLAND-Zen, SNO+, etc. A suite of next-generation experiments is in the later stages of planning: nEXO, LEGEND, CUPID, NEXT. These experiments rely on accumulating approximately a ton of enriched nuclear isotopes and will cost roughly $250 million. They are technological wonders and have rather impressive discovery probabilities (depending on the ordering of neutrino masses and the neutrino mass scale). The discovery of neutrinoless double-beta decay would at once demonstrate lepton-number violation, providing the observation of one of the key ingredients of the simplest explanation for the matter vs. antimatter asymmetry of the universe (leptogenesis); establish the Majorana nature of the neutrino, which is key to understanding why neutrinos are so light; and finally determine the mass scale of neutrinos.

1. Hi Paul,

Sorry about that :/ I meant to look up the experiment's name but then forgot. I keep mixing up all those terrible acronyms. I fixed that sentence. Thanks for pointing out!

9. I'm reminded of the kind of people who get themselves elected to local boards like school committees. They all see it as their job to get more money for the schools, as opposed to serving the interests of the residents and taxpayers of the town. Their working principle is "More for Us." Before these people (the particle physicists) cash their next checks, they should have to write a ten-page paper on opportunity costs.

10. In basic research, it seems astronomy & planetary science has been far more productive per $ spent than HEP: LIGO, exoplanet discoveries, unmanned space probes, etc.

I don't have $ estimates handy, but excluding the expensive space probes, I believe all other astronomical facilities have been far less expensive than a CERN-class collider.

-- TomH

1. "In basic research, it seems astronomy & planetary science has been far more productive per $ spent than HEP: LIGO"

Wait a sec!
We are told here day in, day out by the blogger that the LHC has done nothing more than discover the last piece of the SM, a 40-year-old prediction... and now all of a sudden an experiment, LIGO, which has simply confirmed what we already knew since 1926, the existence of GWs, is a much better investment???... more productive in which sense?... that it gives one event every 3-4 months?... randomly, instead of 640 million collisions per second... or whatever the collision rate at the LHC is???
The comments on this blog are more in the category "Cern haters" than "HEP".
Cheers.

2. Roberto,

The Hulse-Taylor observation dates to 1974. That was the first time one could reasonably say we knew that gravitational waves exist, not 1926, as you wrote.

You may have confused the observation with the prediction. Not the same thing.

If you want to argue that we should have settled on an indirect detection of gravitational waves, I expect you will also go on to argue that we should have settled on an indirect detection of the Higgs, which we have had basically since the discovery that particles have mass to begin with. According to your logic, we should not have built the LHC. I am sure your colleagues will love to discuss your awesome argument.

Leaving aside that you seem to be ill-informed about the history of general relativity, gravitational waves are messengers that carry information from faraway places to us. They can be used to study the cosmos. The same cannot be said about measuring the Higgs-boson.

I am sure that you know all that. And since you know all that, the only reason I can think of that would make you write the above comment is that you hope some reader may accidentally believe what you wrote.

Everyone else:

Roberto Kersevan is a technician at CERN and he is here to demonstrate how particle physicists argue.

3. Roberto,
To be fair to LIGO the technology is still in a relatively early stage of development. Even so it has the potential to do a far more than just confirm the existence of Gravitational Waves as you claim. Funding of just over $20million has been agreed to upgrade the experiment so that(according to the LIGO website) "Advanced LIGO Plus can expand LIGO's horizons enough to capture this many events (11) each week, and it will enable powerful new probes of extreme nuclear matter as well as Albert Einstein's general theory of relativity." https://www.ligo.caltech.edu/news/ligo20190214 4. @sabine "The Hulse-Taylor observation dates to 1974. That was the first time one could reasonably say we knew that gravitational waves exist, not 1926, as you wrote. " You are right, Sabine, my apologies... I was wrong when I wrote "since 1926"... my memory failed me (I'm a simple "technician, as you've written... more on that below): "Gravitational waves are disturbances in the curvature (fabric) of spacetime, generated by accelerated masses, that propagate as waves outward from their source at the speed of light. They were proposed by Henri Poincaré in 1905[1] and subsequently predicted in 1916[2][3] by Albert Einstein on the basis of his general theory of relativity.[4][5]" ... and on wikipedia you'll find the proper references. It was 1916, for Einstein's predictions, after the initial postulate by Poincare'. Tough the life of us, simple "technicians". :-) " I expect you will also go on to argue that we should have settled on an indirect detection of the Higgs, which we have had basically since the discovery that particles have mass to begin with. According to your logic, I we should not have built the LHC. I am sure your colleagues will love to discuss your awesome argument." ??? Ahahahah... lol and rotfl... nice try Sabine! Here I was simply mocking you... because this has been EXACTLY your reasoning for the past months!... 
LHC/CERN is useless because nothing has come out of it other than the Higgs, which is a 40-year-old prediction! You are UNBELIEVABLE, Sabine! I'll save this one on my hard disk... too good to be true. :-)

-----------

"Everyone else: Roberto Kersevan is a technician at CERN and he is here to demonstrate how particle physicists argue."

Well... actually... if by "technician" you mean that I deal mainly with technical matters, then... yes!... that's it... in fact, over the past 30+ years I've been involved with the design, construction, operation, refurbishing, upgrade, and related R&D of... let me count... 1, 2,... 7 accelerators, one experimental fusion reactor, and, as a consultant or external expert, at least 4 more accelerators at 7 different labs. By the way, all of these, except ITER, which is still under construction, have done and do very well, often exceeding their design parameters.

How about you, Sabine? What is it that you have calculated or predicted theoretically that has been checked by experiments and validated? Tell me/us one thing!

Your smearing of anybody whom you perceive or have catalogued as an "opponent" is shameful, Sabine! Really. And I'm not talking about myself, because I have broad shoulders, literally, and do not care about such attacks... look at this blog entry of yours. You use a paper catalogued as "history and philosophy"... which has not even ONE scientific formula!... to smear one more theoretical physicist, gratuitously. And you leave other posters on your blog to smear his name further without any comment on your part!... I really pity you.

5. Roberto,

The history of the prediction of gravitational waves is considerably more complicated than that. I am sure that some Googling will lead you to the light.

"because this has been EXACTLY your reasoning for the past months!... LHC/CERN is useless because"

This is nonsense. First, I never said a single word about the use of CERN. Second, I never said that the LHC was or is useless.
Stop fabricating things I didn't say. You made a fool of yourself and once again fail to notice. Ad hominem attacks will help neither you nor your colleagues.

6. "Ad hominem attacks will help neither you nor your colleagues."

Yeah!... right!... no ad hominem!

"Roberto Kersevan is a technician at CERN and he is here to demonstrate how particle physicists argue."

You are shameless.

7. Roberto,

That I inform the reader about your identity is not an ad hominem attack, because I have not used this information to argue that something you said is incorrect or should be disregarded. You, on the other hand, have repeatedly tried to draw upon my biographical information to convince the reader not to listen to me. That is an ad hominem attack. Look it up if you don't believe me.

You have also further accused me of "smearing" someone (not sure whom) and now complain that I must be "shameless" because I point out that you are not capable of leading an argument.

8. "Roberto, That I inform the reader about your identity is not an ad hominem attack"

OK, fine... noted. For the readers: Sabine is a plumber who specializes in PVC conduits. She's here to show us how plumbers who specialize in PVC conduits argue.

9. @sabine hossenfelder

"Leaving aside that you seem to be ill-informed about the history of general relativity,... I am sure that you know all that. And since you know all that, the only reason I can think of that would make you write the above comment is that you hope some reader may accidentally believe what you wrote."

Sure!... 1916 is not a good year to cite when talking about early predictions of the existence of GWs! Sure...

"Fact Sheet: NSF and the Laser Interferometer Gravitational-Wave Observatory. In 1916, Albert Einstein published the paper that predicted gravitational waves..."

https://www.ligo.caltech.edu/system/media_files/binaries/300/original/ligo-fact-sheet.pdf

Ehi, Sabine! Technicians beat plumbers anytime! :-)

10. @RGT

Exactly... 11 events in 3 years...
and 11x52 per year after the upgrade, IF the upgrade reaches the promised specs. We'll see... of course I wish them the best.

So, RGT: as per the document you've cited... they will be able to test "extreme nuclear matter" and AE's GRT... what about the famous "origins of the universe" that matter so much to our blogger/plumber? This is "far more"?

For the next generation of GW detectors, we're in the ballpark of 1+ billion each...

https://knaw.nl/shared/resources/adviezen/bestanden/KNAWAgendaEinsteinTelescope.pdf

...and one needs at least 3 of them in order to get some decent directionality at the highest sensitivity... so we go into the multi-billion range. Another billion more is estimated for creating a "HEP-type" network (as per the document)... strange... they take HEP as a model???? How crazy can they be????

Anyway, how come 3-4 billion is so cheap and 4-5 times more is incredibly expensive? Strange, uh!... must be some sort of log scale she's using to define the thresholds. Plumbers... you never know! :-)

11. Roberto,

I am not a plumber. This isn't even an ad-hominem attack, it's simply false information.

An experiment that's 5 times more expensive than another experiment is 5 times more expensive. I was thinking you could have figured that out. I never said it is "incredibly expensive" - that's another quote that you have simply fabricated. I said a next larger collider is more expensive. Combine that with the lack of discovery potential and it makes a bad investment.

Since you seem to have difficulties googling "History of Gravitational Waves", I can recommend this paper, from which you may learn that Einstein's 1916 paper far from settled the matter.

12. @sabine

"I am not a plumber. This isn't even an ad-hominem attack, it's simply false information."

??? Your sense of humour is zero, Sabine!... Of course you are not a plumber, but I am not a technician either... so... correct your statement and I'll correct mine. You are also humour-impaired!
"I can recommend this paper, from which you may learn that Einstein's 1916 paper far from settled the matter."

I *NEVER* said it settled the matter... it simply postulated the existence of GWs, 80 years before their detection at LIGO... exactly as Higgs et al. postulated the existence of the Higgs boson... and yet... LHC/CERN/HEP theorists have done nothing good or new for the past 40+ years, right? You simply cannot, or do not want to, see that the ridiculous arguments you use against the FCC (or the LHC, for that matter) can be used against any other big-scale science project... including your own pet/favorite projects. The "cost issue" is plainly ridiculous... as other posters have also clearly explained on your blog... Germany alone spends a lot more on useless policy measures (one among many: "incentivization" of useless photovoltaics... 10.5 billion Euro/y for 20 years!)... and yet you keep on pushing that button.

"Since you seem to have difficulties googling for "History of Gravitational Waves", I can recommend this paper,"

Funny... this just shows how detached you are from reality... the arXiv paper you link, which I already knew... says in the abstract exactly what I said... "their prediction by Einstein in 1916". Q.E.D. :-)

Cheers, and have a nice day.

13. Roberto,

I am sincerely sorry in case I mistakenly referred to you as a technician, which was my interpretation of your LinkedIn page. It was not my intention to make any incorrect statement. Please let me know how you want to be referred to and I will use that in the future. (I cannot edit comments.)

Regarding Einstein, I am afraid you may have to actually read the paper to understand what I am saying.

"You are also humour-impaired!"

Have you considered that maybe you just aren't witty?

11. Hi Sabine,

Thanks for mentioning ACME and EDM.
A less prolific blogger than yourself, but a graduate student with Gabrielse, Daniel Ang, has an insider (but not entirely authoritative) perspective on EDM and table-top experiments. He has written many blog posts in 2017-2018, including the following:

https://www.danielang.net/2017/07/09/guide-to-the-acme-edm-experiment-why-cp-violation-might-explain-everything-about-the-universe/

He asserts that "if we don't find an electron EDM, then it becomes less likely that experiments like the LHC will find anything, because physics tends to show up at similar energy scales," and follows this with the footnote: "Or so we think. This is the principle of naturalness, which underpins all of theoretical physics."

If the theory and physical principles behind EDM experiments are widely accepted, these experiments are very inexpensive compared to any future accelerator. Shouldn't this funding be expanded? An effort to rule out the electron EDM seems like lower-hanging fruit, naturalness or unnaturalness aside. I believe you would have many justifiable disagreements with his footnote.

* I have no financial or other self-interest in making these statements. I am also not opposed to large, internationally supported financial projects that are decided based on justifiable progress, but the self-interest of participants, or "because this is how things have always been done", doesn't show scientific integrity.

1. RK,

I don't understand this reference to naturalness. That high-energy contributions are small but measurable at low energies isn't an argument from naturalness, it's effective field theory. Also, to say that the "principle of naturalness" (which??) "underpins all of theoretical physics" is crazy talk.
Note that the high-energy contributions that are relevant here are real contributions, in contrast to the high-energy "contributions" that physicists are worried about for the naturalness of the Higgs mass or the cosmological constant, which are by definition unobservable (hence I say it's a pseudo-problem).

I can only guess that what this comment refers to is that not all high-energy contributions play a role for specific low-energy observables, so despite the EDM constraints there may be things happening at the LHC (and I am sure someone has a model for that). I think this is correct, but it's also the reason why I think we need better theoretical predictions to get us out of this kind of argument, where no constraint really leads to any kind of progress because, whatever we measure, someone can come up with another model.

2. Hi Sabine (& RK),

I just came across this comment - I'm the person who wrote the blog post above. I think my original choice of words was a little sloppy and is confusing. I'm an experimentalist, not a theorist, and I hope I am not misunderstanding something.

What I meant is that, as I understand it, naturalness-motivated assumptions are essential to be able to compare low-energy experiments like ACME with other experiments, including the LHC. For example, as somewhat pointed out already above, to say that EDM experiments can probe physics at tens of TeV assumes (among other things) a "natural" value of the CP-violating phase (usually ~1). If this assumption is true, then if EDM experiments probing tens of TeV give a null result, it also becomes less likely for the LHC to discover a new particle at several TeV. (Of course, I'm not taking into account hypothetical particles which do not cause a non-zero EDM, which I believe is less common, although they exist, as you said.) Would you say that that is a reasonable statement?

3. Hi dga,

Thanks for commenting!
I believe when I read this, I thought you were referring to technical naturalness, but it seems you just mean the parameters are of order 1. It looks to me like the experiment will make new discoveries at the LHC less likely regardless of what the phase is; it's just that if the phase is small, it will not make much of a difference. So, well, as we discussed already above, it's really a two-parameter situation, about which you make a statement under the assumption that one of the parameters has a value ~1.

But of course it is always the case, regardless of what you test, that the energy is not the only variable. You will always have other couplings that, if you make them arbitrarily small, will decrease the probability of seeing something. That is as long as you stay within the context of a specific model (or class of models). If you have infinitely many models, then constraining one of them will not help you much. So, yes, it is a reasonable statement, except that I object that naturalness underpins all of theoretical physics.

12. Hi Sabine,

I don't think it is correct that EDMs provide the strongest constraints. I love EDMs, and EDM experiments definitely probe high scales, but the limits on kaon oscillations probe even higher scales. This is also a low-energy experiment of course, but still...

A rough guide for EDM limits is:

d_e ~ (m_e / Lambda^2) sin(phi) (alpha_em/pi)^n

where n is the number of loops necessary to create an EDM in the particular BSM model, sin(phi) some CP-violating phase, and Lambda the scale of BSM physics. If n=0, which can happen in e.g. leptoquark models, then the scale is thousands of TeV, but if n=1 (SUSY) or n=2 (2HDM) the limits go to 100 TeV, or a couple of TeV, all assuming sin(phi) ~ O(1). I guess that would be a naturalness argument, so you might dislike that. Still, the bounds are impressive, but kaon oscillations can probe even higher scales (10^5 TeV), see e.g. 1812.10913.
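To make the quoted scales concrete, the scaling formula above can be inverted for the reach Lambda. Here is a rough back-of-the-envelope sketch (an editorial illustration, not part of the original comment; it assumes sin(phi) = 1 and the 2018 ACME bound |d_e| < 1.1e-29 e·cm):

```python
# Rough BSM reach of an electron-EDM limit, following the scaling
#   d_e ~ (m_e / Lambda^2) * sin(phi) * (alpha_em / pi)^n
# Assumptions (not from the original comment): sin(phi) = 1 and the
# 2018 ACME bound |d_e| < 1.1e-29 e*cm.
import math

HBARC_GEV_CM = 1.97327e-14   # hbar*c in GeV*cm, converts 1/GeV to cm
M_E_GEV = 0.000511           # electron mass in GeV
ALPHA_EM = 1 / 137.036       # fine-structure constant
D_E_BOUND = 1.1e-29          # ACME 2018 limit on |d_e|, in e*cm

def bsm_scale_tev(d_e_bound, n_loops, sin_phi=1.0):
    """Scale Lambda (in TeV) probed by an EDM bound at the given loop order."""
    lam2_gev2 = (M_E_GEV * sin_phi * (ALPHA_EM / math.pi) ** n_loops
                 * HBARC_GEV_CM / d_e_bound)
    return math.sqrt(lam2_gev2) / 1000.0  # GeV -> TeV

# n=0: tree level (leptoquarks), n=1: one loop (SUSY), n=2: two loops (2HDM)
for n in (0, 1, 2):
    print(f"n={n}: Lambda ~ {bsm_scale_tev(D_E_BOUND, n):.1f} TeV")
```

Under these assumptions, the sketch gives roughly 10^3 TeV at tree level, tens of TeV at one loop, and a couple of TeV at two loops, matching the ballpark figures quoted above; a smaller sin(phi) lowers all three scales accordingly.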
Again, all such limits depend on naturalness arguments for dimensionless couplings. If you don't want to do that, then it's very hard to assign a BSM scale to low-energy experiments.

Jordy

1. Hi Jordy,

Thanks for the reference. I haven't looked at the kaon constraints for a while; I wasn't aware they are probing such high scales. It seems rather unnecessary to actually assume sin(phi) ~ 1; why not think of it as a two-parameter constraint? Best, B.

2. Hi,

What Jordy means is that for small sin(phi) the bound on Lambda will be very weak... so your statement that EDMs can give you a bound on very high scales is incorrect unless you assume naturalness (for the value of sin(phi)). Similarly, in other indirect measurements, if you are allowed to assume arbitrarily small values for some couplings (e.g. flavour-violating couplings in the case of kaon decays), you cannot come to any conclusions about the scale of physics probed.

3. AKG,

Right, but the constraints on any high-energy contribution will depend on the coupling in front of the respective term. I don't see how that's specific to this particular constraint.

4. Of course that is true. It's just that the statement that low-energy experiments are more constraining than collider searches depends on a comparison between the two. When people say that EDMs or kaons or whatever probe scales of many TeV, well beyond direct limits, this assumes natural values of dimensionless couplings. If you don't assume this, then you cannot really compare the different experiments. I agree that one should ideally see EDMs as a two-parameter constraint (or even more in most models); that also implies that one cannot compare them to LHC limits, which typically do not depend on sin(phi). So then the statement that the best limits arise from EDMs is not really appropriate. Not sure how one should phrase it instead. This is somewhat pedantic, and I fully agree that low-energy limits are strong, but they do come with fine print.

13.
Quantum oscillators, eh? I'd like to hear more. Thanks for writing.

14. The part of my work for the US Department of Defense that I most enjoyed was helping define, acquire, and promote small-to-medium-size research efforts in robotics and artificial intelligence. I found that young researchers with open-ended, minimally encumbered basic-research grants can be incredibly innovative in bringing new perspectives to existing data sets, often with results that lead to unexpectedly powerful outcomes. While such research has led to many practical applications, many would argue that the nature of intelligence is also a deeply fundamental question. Is existing physics data truly so weak and incomplete that there is no point in funding brilliant young researchers to reexamine that data solely to look for radically new ways to interpret and understand it?

1. You write as if there is a shortage of ideas in theoretical physics. But there are plenty of ideas out there. Every month on the HEP part of the arXiv I see new ideas posted; and on the HEP part of viXra too, for that matter. There might be as many as thirty distinct avenues of investigation which I regard as having some claim on my own attention. That kind of fundamental theorizing takes as its "data" to be explained all the basic qualitative and quantitative facts of particle physics and cosmology, and perhaps a few anomalies whose significance is not yet agreed upon. If what you're looking for is reinterpretation of the voluminous big data produced by the giant experimental collaborations, Matt Strassler recently blogged about that.

2. Mitchell, no disagreements at all! I love the way physics theory has been percolating lately, and I'm actually optimistic that physics is on the verge of a major, 1920s-level inflection point, likely within the next ten years. My point instead was that current high-level physics funding is based more on habit than strategy.
Hotter-Is-Better was a fantastically effective research heuristic for half a century, and it was pivotal in achieving one of the supreme accomplishments of physics, the 1970s-vintage Standard Model. But from a hard-nosed analysis of results per euro or dollar spent, Hotter-Is-Better seemed to lose all of its potency after that. Other issues also arose around that time, including in particular the capture of NSF funding by an exceptionally narrow, unusually self-defensive, and experimentally indifferent school of theoretical thought. Even so, after a decade or so of floundering in terms of generating experimentally meaningful new insights, funding at the highest levels should have better reflected the failure of Hotter-Is-Better and allocated more funding towards diversity of analysis.

But apart from the costly final verification of the Higgs component of the Standard Model, why did Hotter-Is-Better seem to fail after 1980? Here's one thought: Maybe the universe is simply gentler than we thought, and the Standard Model was already pushing the limits of information available in that direction. Similarly, new interpretations of black holes that keep information in touch with the outer universe, such as in 't Hooft's recent papers, at least enable arguments that GR is also gentler than we thought, with event-horizon interpretations making singularities unnecessary. Perhaps the final step in physics is not to make things hotter, but to look for similar softening limits in quantum theory, and then show how all three areas work together in some remarkable and still unknown way.

3. Addendum: For completeness, by "gentler" quantum theory I mean observer-dependent QM. This is the idea (my own, apologies, but any related paper refs would be very welcome) that quantum behavior can exist only in the presence of an observer.
Observer-dependent QM is gentler because it makes energy-free space truly empty, with field theory applying only when passing energy or matter makes QFT relevant. Do this right and you get exactly the same field theory observed experimentally. (That assertion is a tautology if you think it through; computer modelers understand this principle well.) However, in observer-dependent QM the vacuum catastrophe problem disappears, since empty regions of space end up with nearly zero energy density due to their lack of significant internal quantum behavior. Philosophically, observer-dependent QM is pretty much the diametric opposite of MWI.

The resulting Gentle Trio is: (1) Standard Model Finality: The particles of the Standard Model are already our best representation of reality; we just don't "get" the model well yet. Part of "getting" it will be recognizing that particles and space are deeply interdependent duals of each other, rather than independent entities. (2) Dark Mirror General Relativity: The extremes of general relativity lead to particles entering reversibly into momentum space, rather than into singularities. (3) Observer-Dependent QM: Like quantum observables, quantum mechanics itself only comes into existence when it is observed -- that is, when it is made relevant by the presence of matter or energy.

In the Gentle Trio, the popular concepts of Planck foam, quantum gravity, and Planck-scale strings would cease to exist. More precisely, such concepts would exist only locally and only through the application of levels of energy likely not available anywhere in the real universe. They would be comparable to patterns of a Mandelbrot dive that is too deep to reach computationally.

15. The Higgs field is a sort of force. The quartic scalar potential is analogous to the [A, A]² term in the gauge field Lagrangian. Hitchin proposed a bundle formed from the Higgs field φ and a holomorphic bundle E from a gauge field, the pair (E, φ).
The bundle can be the SU(2) for weak interactions, or the U(2) = SU(2)×U(1) bundle for the electroweak theory, plus this additional Higgs part with the condition φ∧φ = 0 on the 1-forms formed from the scalar field. The Higgs doublets here are then SU(2) and are a complement to the nonabelian weak symmetry. I read Giudice's piece a couple of days ago. To be honest, his arguments left me as uncertain as your arguments against the next collider.

16. Technically, ad feminam. The classism implicit in considering "technician" a pejorative term seems to be a conditio sine qua non for academia.

17. I am a retired control systems engineer who worked on heavy-ion accelerators and a laser system for fusion-energy research. I browsed your book a few times, each pass bringing me more understanding, and I have just finished actually reading it. I understand your essential points, your supporting arguments, and your reason for betting with yourself. Each of the several engineers I know who has read your book has pretty much agreed with my comments, and I speak for us all in thanking you for betting on yourself and giving us the invaluable insights your book offers to your engineering partners.

18. Apologies for posting several times. I think I posted earlier as Unknown. I would like to correct the idea that the historical Michelson-Morley experiments produced clear "null" results. The effect seen in these experiments was merely below that predicted by the stationary-ether hypothesis. Dayton Miller wrote in 1933 about the 1887 experiment: "The brief series of observations was sufficient to show clearly that the effect did not have the anticipated magnitude. However, and this fact must be emphasized, the indicated effect was not zero." Only modern experiments produced null results with great accuracy.
Italian physicist Maurizio Consoli has authored a number of papers, among them "From classical to modern ether-drift experiments: the narrow window for a preferred frame" (2004), "The classical ether-drift experiments: A modern re-interpretation" (2013), and "The classical Michelson-Morley experiments: A new solution to an old problem" (2018), that make the case that the anomalous signals seen in classical M-M experiments were not entirely random, as one would expect if they were due to artifacts, but systematic. In Consoli's view, the effect size seen in historic M-M experiments is proportional to ε, with n = 1 + ε being the refractive index of the medium. Experiments performed in air (ε = 0.000293) showed a greater effect than those performed in helium (ε = 0.000036), while modern vacuum experiments show no effect (ε = 0). Consoli's analysis of classic M-M data derives estimates for the motion of the Earth relative to the ether that are consistent with observations of our motion relative to the CMB, roughly 370 km/s. Needless to say, this suggests that physics made a wrong turn long before supersymmetry, and has been barking up the wrong tree for over a century now. Lorentz's view of relativity would be the correct one, not Einstein's.

After Consoli published his first paper on the subject, New Scientist devoted a story titled "Catching the cosmic wind" to this subject in its April 2nd, 2005 issue. It reported that the Optical Metrology group at Humboldt was likely to test these claims by performing an M-M experiment in a gaseous medium: "It is not a straightforward experiment to perform, though. Experimenters have managed to produce a laser frequency stable enough to carry out experiments for hundreds of days only by cooling the cavities to close to absolute zero. If a gas is introduced at these temperatures it will freeze: it's going to take quite some ingenuity to overcome the problem.
Nevertheless, a group of physicists at Humboldt University are considering taking on the challenge. 'There is a good chance we will do the experiment,' says Achim Peters, one of the group. It's going to be a much-watched piece of lab work. 'If someone does do it, I will be very interested in the result,' says Holger Müller of Stanford University, California, who was involved in laser-cavity experiments at Humboldt before moving to the US. Müller admits that a positive result would have profound implications for physics. For a start, it would mean that one of Einstein's contemporaries, Hendrik Lorentz, has been denied proper recognition. Lorentz, not Einstein, would have to be credited with the definitive theory of relativity." Fourteen years later, this crucial experiment has not been performed. New Scientist cited an estimated cost of $200,000 in 2005.

Given the dead end that modern physics has found itself in, radical ideas like this one need to be on the table, especially when they make clear, testable predictions. Instead of investing billions into a new collider, we should fund many small and much cheaper experiments that revisit the foundations of physics with modern technology, especially the parts that are generally assumed to be long settled and beyond debate.
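For what it's worth, the ε-scaling claim in the comment above is easy to quantify. A minimal sketch (the linear-in-ε proportionality is Consoli's claim as relayed by the commenter, not established physics; the ε values are those quoted above):

```python
# Relative ether-drift signal sizes under Consoli's claimed scaling,
# where the observable effect is proportional to epsilon = n - 1
# (n being the refractive index of the medium in the interferometer).
# Note: the linear-in-epsilon scaling is the claim under discussion,
# not established physics.
EPSILON = {"air": 0.000293, "helium": 0.000036, "vacuum": 0.0}

# Under the claim, an air-filled interferometer should see a signal
# roughly 8x larger than a helium-filled one, and a vacuum cavity
# (the modern setups) should see no signal at all.
ratio = EPSILON["air"] / EPSILON["helium"]
print(f"air/helium signal ratio: {ratio:.1f}")
print(f"vacuum signal: {EPSILON['vacuum']}")
```

This is why, on the commenter's account, a gas-filled repetition of the experiment would be the decisive test: it is the only configuration where the claimed signal would not vanish.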

19. Why does he think the Higgs is a new force? Isn't the Higgs field part of the electroweak theory? Is he talking about the unified electroweak force?

The wakefield accelerator would make giant colliders obsolete. Why spend billions of euros when you can make a table-top LHC? The race would be on for the smallest, highest-energy, lowest-cost accelerator.