Saturday, June 11, 2011

Extra Dimensions at the LHC: Status Update

The Planck scale is the scale at which quantum gravitational effects are expected to become important. An extrapolation of the strength of gravity gives a value of 10^16 TeV, which is far out of reach for collider experiments. In the late 90s, however, Arkani-Hamed, Dimopoulos and Dvali pointed out that this extrapolation does not hold if our spacetime has additional spacelike dimensions with certain properties. If that were the case, the true Planck scale could actually be at a TeV, an idea that is appealing because it does away with the question of why the Planck scale is so large, or why gravity is so weak, to begin with. The answer would be: well, it isn't, it only appears so. Our naive extrapolation fails because space-time isn't four-dimensional. (For more details, read my earlier post.)
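For orientation: the lowering works because the four-dimensional Planck mass we measure is a composite of the true higher-dimensional scale M_D and the volume of the n extra dimensions, roughly M_Pl^2 ~ M_D^(n+2) R^n up to geometry factors of order one (conventions differ between papers, eg reduced versus non-reduced Planck mass). A little script, purely as an order-of-magnitude sketch under that simplified relation, shows how large the extra dimensions would have to be for M_D = 1 TeV:

```python
M_PL = 1.22e19     # GeV, the usual four-dimensional Planck mass
HBARC = 1.973e-16  # GeV*m, conversion factor from 1/GeV to meters

def extra_dim_radius(n, m_d_gev=1000.0):
    """Radius R of n equal-sized extra dimensions required so that
    M_Pl^2 ~ M_D^(n+2) * R^n, ignoring order-one geometry factors."""
    r_natural = (M_PL**2 / m_d_gev**(n + 2)) ** (1.0 / n)  # R in 1/GeV
    return r_natural * HBARC                               # R in meters

for n in (1, 2, 6):
    print(n, extra_dim_radius(n))
```

For n = 1 the radius comes out at solar-system size, which is why that case is ruled out from the start; for n = 2 one gets roughly a millimeter, at the edge of tabletop gravity tests; for n = 6 it shrinks to tens of femtometers.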

These (and other) extra-dimensional models with a lowered Planck scale were very popular at the beginning of the last decade and caused an extraordinarily high paper production, which reflects not only the number of theoretical particle physicists but also their desperation to put their skills to work. The most thoroughly analyzed consequences of such models are the modification of Standard Model cross-sections through virtual graviton exchange and the production of black holes at the LHC. The latter possibility in particular received a lot of attention in the media, due to some folks who accused physicists of planning the end of the world just to increase their citation count. (For more details, read these earlier posts.)

In any case, the LHC is running now, data is coming in and models are being sorted out, so what's the status?

In arXiv:1101.4919, Franceschini et al have summarized the constraints from the LHC's CMS and ATLAS experiments on virtual graviton exchange. For the calculation of the contributions from virtual gravitons, one needs to introduce a cut-off Λ of dimension energy that, next to the lowered Planck scale, becomes another parameter of the result. The constraints are then shown as contour plots in a two-parameter space, one parameter being the 'true' fundamental Planck scale, here denoted MD, and the other being the mentioned cut-off, or its ratio to MD respectively. One would expect the cut-off to be in the range of the lowered Planck scale, though it might be off by a factor of 2π or so, so the ratio should be of order one. The figure below (Fig. 6 from arXiv:1101.4919) shows the bounds for the case of 4 additional spacelike dimensions:

The continuous line is the constraint from CMS data (after 36/pb of integrated luminosity. Don't know what that means? Read this), and the dashed line is the constraint from ATLAS. The shaded region is excluded. As you can see, a big part of the parameter space for values in the popular TeV range is by now excluded.

Now what about the black holes? A black hole with a mass a few times the lowered Planck mass would already be well described by Hawking's calculation for particle emission, usually called Hawking radiation. It would have a temperature (or average energy of primary emitted particles) of some hundred GeV. Just statistically, a big fraction of the emitted particles carry color charge and are not directly detected; instead they form color strings that subsequently decay into a shower of hadrons, ie color-neutral particles (pions, protons, etc). This process is called hadronization, and each such shower is called a jet. Depending on how many jets you get, the event is a di-jet, tri-jet or multi-jet event. The black hole's Hawking radiation would typically produce a lot of particles and thus contribute to the multi-jets. One expects some multi-jet events already from usual Standard Model processes ("the background"), but the production of black holes should significantly increase their number. The figure below (from this paper by the CMS collaboration) shows an actual multi-jet event at the LHC:


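To get a feeling for the "some hundred GeV" temperature quoted above, here is a small estimate in one common convention for M_D (the one used eg by Dimopoulos and Landsberg): the higher-dimensional Schwarzschild horizon radius is r_H = (1/(√π M_D)) [(M_BH/M_D)(8Γ((n+3)/2)/(n+2))]^(1/(n+1)) and the temperature T_H = (n+1)/(4π r_H). The exact numbers shift with the convention chosen for M_D, so treat this as an estimate only:

```python
import math

def horizon_radius(m_bh, m_d, n):
    """Horizon radius (in 1/GeV) of a higher-dimensional Schwarzschild
    black hole with n extra dimensions, Dimopoulos-Landsberg convention."""
    k = 8 * math.gamma((n + 3) / 2) / (n + 2)
    return (1 / (math.sqrt(math.pi) * m_d)) * ((m_bh / m_d) * k) ** (1 / (n + 1))

def hawking_temperature(m_bh, m_d, n):
    """Hawking temperature (in GeV) of the same black hole."""
    return (n + 1) / (4 * math.pi * horizon_radius(m_bh, m_d, n))

# a 5 TeV black hole for M_D = 1 TeV, varying the number of extra dimensions
for n in (2, 4, 6):
    print(n, hawking_temperature(5000.0, 1000.0, n))
```

For this example one finds temperatures of a few hundred GeV, rising with the number of extra dimensions, consistent with the statement above.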
In the paper arXiv:1012.3375 [hep-ex], the CMS collaboration summarized constraints on the minimum mass of black holes in models with extra dimensions. For this, they analyzed the number of multi-jet events in their data. The figure below (Fig 2 from arXiv:1012.3375) contrasts the predictions of the Standard Model with those of models with black hole production, for events with multiplicity N larger than 3 (that includes jets, but also photons, electrons and muons, which don't hadronize).

On the vertical axis is the number of multi-jet events per bin of 100 GeV, on the horizontal axis the total transverse energy of the event (if you don't know what that means, think of it as just the total energy). The solid blue line is the Standard Model prediction; the shaded area depicts the uncertainty. The various dotted and dashed lines are the predictions for the number of such events for different values of the minimal black hole mass, usually assumed to be in the range of the lowered Planck scale. These lines are created by use of event generators, ie numerical simulations. From this and similar data, the CMS collaboration is able to conclude that they haven't seen any black holes with minimum masses up to 4.5 TeV. CMS has an update on these constraints here, where they've pushed the limits up to 5 TeV, though not with an amazingly high confidence level.

Some comments are in order though for the latter analysis. It relies on the production of multi-jets by black holes. This is a reliable prediction only for black holes produced with masses at least a few times above the lowered Planck scale. The reason is that a black hole of Planck mass is a quantum gravitational object and is not correctly described by Hawking's semi-classical calculation. How to correctly describe it, nobody really knows. For the sake of numerics it is typically assumed that a black hole of Planck mass makes a final decay into a few particles. But that's got nothing to do with theory; it is literally just a subroutine in a code that randomly chooses some particles and their momenta such that all conservation laws are fulfilled. (The codes are publicly available, look it up if you don't believe it.)

That procedure wouldn't be a problem if it was just some pragmatic measure to deal with the situation, with no impact on the prediction. Unfortunately, almost all black holes that would be produced at the LHC would be produced in exactly this quantum gravitational regime. The reason is simply that the LHC is a hadron collider, and the energy of each proton is distributed among its constituents (called partons). As a result, the vast majority of the black holes produced have masses as low as possible, ie close to the new Planck scale.
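This threshold peaking is easy to see in a toy Monte Carlo. The sketch below samples parton momentum fractions from a hypothetical, steeply falling flux f(x) ∝ 1/x² with an arbitrary low-x cutoff (real parton distribution functions fall even faster at large x, which only strengthens the effect), and keeps the pairs whose invariant mass exceeds an assumed minimum black hole mass of 3 TeV at √s = 7 TeV:

```python
import numpy as np

rng = np.random.default_rng(0)
sqrt_s = 7000.0   # GeV, center-of-mass energy of the pp collision
m_min  = 3000.0   # GeV, assumed minimum black hole mass
a      = 0.05     # arbitrary low-x cutoff of the toy flux

def sample_x(n):
    # inverse-transform sampling of the toy parton flux f(x) ~ 1/x^2 on [a, 1]
    u = rng.random(n)
    return 1.0 / (1.0 / a - u * (1.0 / a - 1.0))

x1, x2 = sample_x(10**6), sample_x(10**6)
m = sqrt_s * np.sqrt(x1 * x2)   # invariant mass of each parton pair
m = m[m >= m_min]               # keep only pairs above the production threshold

# black hole mass spectrum in 1 TeV bins above threshold
counts, _ = np.histogram(m, bins=[3000, 4000, 5000, 6000, 7000])
print(counts / counts.sum())    # the lowest bin dominates
```

Most of the produced black holes land in the lowest mass bin just above the threshold, which is exactly the regime where the semi-classical description is least trustworthy.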

What that means is that it is actually far from clear what the CMS constraints on excess of multi-jets mean for the production of black holes. A similar argument was recently made by Seong Chan Park in Critical comment on the recent microscopic black hole search at the LHC, arXiv:1104.5129.

Summary: It clearly doesn't look good for models with a lowered Planck scale. While it is in many cases not possible to falsify a model, but merely to implausify it, large extra dimensions are becoming less plausible by the day. Nevertheless, one should exert scientific caution and not jump to conclusions. The relevance of the CMS constraints on multi-jets depends partly on assumptions about the black holes' final decay that are not theoretically justified.

Question for the experts: Why do the curves in Fig 2 of the CMS paper seem to have a bump around the minimum black hole mass even though N > Nmin?

39 comments:

Plato said...

Wow Bee...a lot to digest. This is what I have been after. As a layman it will definitely take time, as the focus is directly on the collision process and what is coming out of it. Jet description has in my mind some correlation in the cosmological expression, as to see its larger comparative views as what is projected toward earth, like a lighthouse effect. The intensity of the beam and particle information.

Savas Dimopoulos

Here’s an analogy to understand this: imagine that our universe is a two-dimensional pool table, which you look down on from the third spatial dimension. When the billiard balls collide on the table, they scatter into new trajectories across the surface. But we also hear the click of sound as they impact: that’s collision energy being radiated into a third dimension above and beyond the surface. In this picture, the billiard balls are like protons and neutrons, and the sound wave behaves like the graviton.


It was simple to understand that, given an amount of energy for the collision process, there has to be an accountability for the energy valuations?

Why space station experiments should be very interesting given work directed to the LHC. Why Fermi, through calorimeter info from events, is important.

Best,

Plato said...

Hi Bee,

I do not think it a conceptual mistake to think of sound in the way it was expressed in the previous comment.

I tried to look for comparative views on this.

Intuition and Logic in Mathematics by Henri Poincaré

On the other hand, look at Professor Klein: he is studying one of the most abstract questions of the theory of functions to determine whether on a given Riemann surface there always exists a function admitting of given singularities. What does the celebrated German geometer do? He replaces his Riemann surface by a metallic surface whose electric conductivity varies according to certain laws. He connects two of its points with the two poles of a battery. The current, says he, must pass, and the distribution of this current on the surface will define a function whose singularities will be precisely those called for by the enunciation.


Again we are looking for locations as to find expression, and relevance to QGP expressions? How information is transmitted through Jet expressions?

Best,

Plato said...

Of course you might be in disagreement with the following but for me Poincare and Felix Klein seem to bounce off each other.

Felix Klein on intuition

It is my opinion that in teaching it is not only admissible, but absolutely necessary, to be less abstract at the start, to have constant regard to the applications, and to refer to the refinements only gradually as the student becomes able to understand them. This is, of course, nothing but a universal pedagogical principle to be observed in all mathematical instruction ....

I am led to these remarks by the consciousness of growing danger in Germany of a separation between abstract mathematical science and its scientific and technical applications. Such separation can only be deplored, for it would necessarily be followed by shallowness on the side of the applied sciences, and by isolation on the part of pure mathematics ....


The simplicity comes from the simplest way in which to explain how one might see dimensional value as a conceptual method of appreciation.

Uncle Al said...

Given Euclid's Fifth Postulate, the Earth's surface is impossible. Euclid is incomplete. No perturbative treatment or reparameterization creates undistorted continuous flat maps of the Earth. Rigorously derived axiomatic systems must be externally assaulted.

...their desperation to put their skills to work. Talk to Keuffel & Esser about the 1972 HP-35. Reality deficit disorder elicits deformed decisions leading to economic cloudy days. Physics postulates strong interactions and deep symmetries (BRST invariance, Calabi-Yau; U(1)xSU(2)xSU(3) or S((U2)xU(3))...). Chiral weak interactions are designated symmetry breakings. A fermion universe is chiral all the way down! Strong interactions smear fine structure. Increasingly fantastical theory patches weak postulates, forever.

Physics never tested vacuum symmetry with massed fermion geometric parity divergence. Massless boson photons are not definitive (arxiv:0912.5057, 0905.1929, 0706.2031; 1106.1068).

http://www.mazepath.com/uncleal/erotor1.jpg
Two geometric parity Eotvos experiments.

Output is outside Noether's theorems for input of an absolutely discontinuous symmetry. The worst it can do is succeed. "This is not the solution we are seeking." Reality is not a peer vote.

Massive failures of quantum gravitation and SUSY; terrible success of MOND. dark matter indeterminacy; LHC sterility... Occam's razor.

Bee said...

Hi Plato,

Yes, there's of course also real graviton emission, corresponding to the sound waves leaving the table. But if there's energy loss, it's difficult to pin down exactly what was not detected, and you'd need to make a case it was a graviton in particular. Now the graviton is the only spin-2 particle, but that's a property difficult to extract from the data. Best,

B.

Robert L. Oldershaw said...

1. Is there any physical evidence for what you refer to as a "graviton"?

And I mean ANY experimental evidence whatsoever.

2. Why are we so obsessed with the HEP string of failures and false-positives? Real experimental advances are ongoing in astrophysics and atomic physics, but all we hear about is "bumps", and "Higgs" and "WIMPs" and "extra dimensions" and assorted delusions like "sparticles".

The LHC results are interesting and important, but why not turn down the hype and special pleading by at least an order of magnitude.

Sorry to be such a curmudgeon, but one tires of the endless Platonic circus.

Eric said...

Bee, this is interesting and all very technical but I question the frame here in which a lowered Planck scale is being questioned. It reminds me a little of two nerds driving in a car. They are approaching a dead end with a house situated at the end of it. The passenger is looking down at their GPS device and tells the driver to just keep driving because the GPS says to go 10 more miles before turning. The driver dutifully obeys and plows right into the house.

The first thing that comes to mind (and I hate that I'm starting to sound like a broken record on this) is that the temperature of the universe indicated by the CMB is now about 3 degrees above absolute zero. This CMB temperature (should?) be linked to what is usually linked to non-local action. Just as an electrically conductive material becomes superconductive at a sufficiently low temperature, the same thing should happen with the universe if there were a medium in it acting similar to an electrical conductor.

It may be that these large extra spatial dimensions everyone is always talking about are not additional to the three that we already have. They would just be conditional on the local thermal environment one exists in. In empty space, or at the periphery of galaxies, the CMB temperature would be the controlling factor. So in this situation any hidden variable pathways created by matter on the periphery of the galaxy would be quite stable and allow low-inertia movement of material along these pathways.

Eric said...

I should add that this isn't rigorous and I'm not saying this is how the world actually works. And since math is required to prove these things, and math isn't my forte, serious work needs to be done to validate it. But it sure seems to me like it is low-hanging fruit that the nerds in physics seem to ignore. Sort of seems like they are paying too much attention to what the GPS device is telling them and continually running into houses.

Exl Blogger said...

Am I reading that chart wrong, or is the universe actually three dimensional, x-y-t?

muon said...

Hi Bee, nice post.

concerning your question about the broad peak / bump in the theoretical curves: N_min is not the precise number of jets/leptons/photons produced by the black hole - it is a requirement set by the experimenter to suppress backgrounds. Once you have a black hole event, you can assume that all jets/leptons/photons/etc were produced in the decay of the black hole, and that S_T roughly corresponds to the mass of the black hole, since it is heavy and will not have much kinetic energy.

So, if there were a signal in the CMS data, we could infer an approximate value for MBH_min and perhaps also the number of extra dimensions.

Does this answer your question?

regards,
Michael

Plato said...
This comment has been removed by the author.
Plato said...

Hi Bee,


Michael said:So, if there were a signal in the CMS data, we could infer an approximate value for MBH_min and perhaps also the number of extra dimensions.

Not everyone finds it pleasing to take the leap to the non-euclidean description of the world.

Uncle Al why do you think Riemann soooooo.... pleasing too, Einstein?

I quote Michael because from my layman perspective he relates extra-dimensions to MBH, and any "particulate expression" as a measure of the dimensional significance of the MBH? Do you agree?

Sounds in analogy was an expression to help understand gravitational significance of how one might look at the universe in a "three body problem" kind of way?:)

It's strength, and it's weaknesses.

Best,

Plato said...

Here's a question then.

If the definition of MBH "is relativistic," then what correlation may be assumed if the QGP resulting data is met. Do MBH qualify?

If yes, then what "conditions" allow the QGP to be considered? Are they?

It's sort of like looking down the Rabbit Hole?:) Although, the MBH do quickly dissipate?

Best,

Bee said...

Hi Robert,

1) Finding evidence is what this is all about.

2) Don't know. I'm not a psychologist.

Best,

B.

Bee said...

Hi Eric,

I'm afraid I don't actually understand what you mean. In case you mean that the size of the extra-dimensions might be position-dependent, I think that's been looked into in a few papers. Don't think you learn very much from it though, it just makes matters more complicated. Best,

B.

Bee said...

Hi Exl Blogger,

Sorry, I don't know what chart you're talking about? Best,

B.

Bee said...

Hi Michael,

Thanks. No, now I'm even more confused. I thought the parameter N_min is a parameter that goes into the MC simulation and I thought it must have something to do with the number of particles in the final decay, which is usually a parameter that has to be set in these codes. (In CHARYBDIS, it's called NBODY.)

My confusion is this: from some simulations I've done myself I would have expected that most of the black holes have masses close to the minimum mass, and thus the final decay is the one carrying the bulk of the energy. That would mean there's indeed a bump around the minimum mass, as you see in these curves. However, if you drop the final decay, most of the bump should vanish. Now if you only look at events with N>N_min and the final decay is an N_min event, why is there a bump? Okay, the N_min is the number of primary particles, not secondary, but at these energies, the multiplicity shouldn't get much larger for the secondary particles? Or does it? Best,

B.

Phil Warnell said...

Hi Bee,

Thanks for the explanatory update and comments. It seems that for many quantum gravity researchers the news hasn't been good for a while. That is, invariance doesn't seem to vary, dimensions resist being expanded and black holes won't come out to play;-) That is, the paradigm shift that many are looking for stubbornly refuses to happen.

Perhaps this has a physical explanation, that being with so many watching and so often observed the Quantum Zeno effect is responsible for thwarting things;-)

Then again there are some who might claim it has to do with the intent of the observers and as scientists are by profession supposed to be doubters this having reality to reflect this ;-)

“Rather than being an interpreter, the scientist who embraces a new paradigm is like the man wearing inverting lenses.”

-Thomas Kuhn

Best,

Phil

muon said...

Hi Bee,

I'm rather sure that N_min is a cut that the experimenters place on the observed multiplicity N (the sum of the numbers of jets, leptons and photons). Please note, though, that the cut is N >= N_min, not N > N_min, according to the caption in Table 1 and second-to-last paragraph on page 7. So the case that you have in mind, N = N_min, is included in the distributions.

You know more about CHARYBDIS than I do, and a lot more about the phenomenology of black hole decays. I did not realize that the decay proceeds in stages, for example. From the point of view of the mass distribution of the created black holes, however, one should include all objects and not just the number corresponding to the last stage (MBH_min), right?

Why does one need to specify NBODY in CHARYBDIS? Is it impossible to simulate a distribution for N? Is this distribution incalculable?

I would not expect the observed number of objects, N, to match NBODY closely, for several reasons. First, a fraction of the generated partons (quarks, gluons, leptons, photons, neutrinos) will go down the beam pipe and be unobserved. Second, they may overlap and be reconstructed as one (a photon in a jet, for example). Third, neutrinos cannot be measured individually. And finally, the number of jets in a multi-jet event depends on the algorithm used and the kinematic and separation criteria applied. So for sure, N_MIN has only a loose correspondence with NBODY, at the level of a fully-simulated event.

regards,
Michael

Bee said...

Hi Michael,

Sorry, I meant N >= N_min, though in some cases it's also N>N_min. Take the figure I've used in this post and the uppermost dotted line for M_D=1.5TeV, M_BH^min=3TeV, n =6. In the table, N_min for that case =3, ie if N_min is the number of particles in the final decay, and the hole makes only the final decay, these events should be missing. So why is there a bump in the curve?

Yes, one should include all emitted particles. But see, what I'm saying is that in the pp collision you produce BHs according to some mass spectrum, not at one particular mass. That mass spectrum is strongly peaked at the threshold, and these BHs make only the final decay. Thus, if you integrate over the spectrum to get the decay products, the main contribution comes from the final decay and has a (primary) multiplicity that's put in by hand. At least that's what I thought, but that doesn't match with the CMS curve, so where am I wrong?

Best,

B.

Plato said...

The Devil and the Deep Blue Sea?:)Logically?

That one might graduate to an NBody problem shows a graduation of a kind? Kind of like a "geometrically enhanced view?" Like a jump from euclidean to non-euclidean?

Maybe Navier would be happy to know "the vortex" can be applied in "two cases" not just one:)

Viscosity relativistically, may still play a part? At what levels and energy output would limitations be applied to particle decay from MBH in consider of current levels LHC energy outputs.

I thought Robert would be happy about such a claims of 3body, but obviously he cannot see things in space that way?:)

Best,

Robert L. Oldershaw said...

I humbly make two requests.

1. When scientists discuss purely speculative hypotheses or entities, i.e., those for which there is not yet a shred of empirical evidence, would they please not speak of these speculations as if they were well-tested facts? This appeal is directed to the entire scientific community.

2. I beseech the Great God of Hockey to smile upon the Boston Bruins tomorrow night. I want a game 7 and one more chance to defy the odds in Vancouver.

Bee said...

Zephir: If you want to voice an opinion, I am sure you are able to do it without calling other people imbecile. It seems a little... inappropriate. Best,

B.

Zephir said...

Isn't it a bit suspicious that physicists often recognize some phenomena only after years of grants spent on them? We discussed it in the case of extra dimensions and dark matter strings in connection with dark matter structures, but there are many other examples. What prohibits physicists from considering the CMBR noise as gravitational waves, extra dimensions as dispersive effects, atomic nuclei as micro black holes, the missing antimatter as dark matter, the vacuum as a particle environment with foamy density fluctuations?

The answer is: absolutely nothing - but their research would end prematurely, as R. Wilson (a former president of the APS) recognized correctly many years ago. He even proposed the same solution which you're using too.

The problem is, physicists love their jobs more than their results. If every research project ends with the finding of a solution, what prohibits them from avoiding the solution?

Zephir said...

The whole discussion is about the lack of evidence for extra dimensions in the LHC collisions. But what actually prohibits the physicists from considering the jet suppression as evidence of such extra dimensions? The jet suppression was observed many years ago already.

The whole problem is, it can be explained with dispersion of quarks inside of a quark-gluon condensate, too. But isn't such dispersion exactly the way in which extra dimensions should manifest?

Zephir said...

For example, the string theorists are seeking extra dimensions too, via violation of the inverse square law of the gravity force. If they meet with an electrostatic force during their experiment, they will eliminate it so it doesn't interfere with their measurements. If they find a Casimir force, they eliminate it too, for the same reason - because they just want to measure "only the gravitational effect".

Sorry - but you cannot detect the violation of the gravity force while neglecting and compensating all phenomena which are violating it. This is what is called schizophrenia.

Zephir said...

What all these misinterpretations have in common:

1) Contemporary physics is driven by formally thinking people (articles without formal models aren't even allowed in many peer-reviewed journals). But math is a strictly schematic language; it doesn't allow a dual perspective on a problem's solution, which would make such a solution fuzzy.

2) Physicists are motivated by finding new phenomena, instead of reconciling the old ones. They're supported by tabloid journalism in this. In this way, they cannot distinguish the old phenomena from new concepts even in cases where such understanding is quite trivial.

3) Physicists don't do research with their own money, like Faraday or Tesla. They're part of an industry which only seeks the perspective of continuation of their jobs, salaries and conference travel. Actual results aren't so important for such an industry; the continuation of research is.

Bee said...

Zephir: Enough now. This is the first and last warning.

Eric said...

"I'm afraid I don't actually understand what you mean. In case you mean that the size of the extra-dimensions might be position-dependent, I think that's been looked into in a few papers. Don't think you learn very much from it though, it just makes matters more complicated. "

Bee, did you ever hear the story about the guy (gal) who lost one of his (her) contact lenses outside at night. Another person wants to help and says "You sure are lucky that it fell out here under the street lamp."

The other person replies "Oh, it fell out over there in the
dark. But it is easier to search over here where the light is good".

Bee said...

Hi Eric,

I've heard this story like ten million times as a comment to other people's seminars. Still, I have no clue what you're trying to say. Look, I don't mean to be annoying, I'm just saying if you have an idea, write it up, do the maths, publish it. Having ideas isn't the difficult part. It's bringing them in a useful form that is. And my blog btw isn't the place to publish them. Best,

B.

Eric said...

Bee,
There is no need to get snotty. I have never once in all my time seen you give credit to a commenter here on a technical matter that actually helped you. I know for a fact that I have helped you on occasion but have never received acknowledgment from you. In fact, usually I can tell when I've helped by your silence. Stealing credit is what it's called. You take ideas from commenters here and translate them into papers that are presentable. There is nothing wrong with that. But you and I both know you took directly the visualization I talked about with the double slit experiment.

All I would have liked to see is a note in that paper that conversations on your blog were helpful in formulating your ideas. I'm not interested in writing papers and I don't get money from physics, so it is an avocation, not a vocation. So screw you and your lack of acknowledgment of the help you get here. I guess for a narcissistic person like you it is too much to thank the "little people".

Bee said...

Hi Eric,

"There is no need to get snotty... for a narcissistic person like you..."

It's called projection.

I have noticed you seem to believe that your comments have something to do with my recent paper. I thought I would do you a favor not correcting you, but see now that it was a mistake. What I wrote in the paper has nothing, I repeat, absolutely nothing, to do with what anybody commented on this blog. The first version of this paper is about 5 years old, I had previously submitted it (it was rejected), I have shared and discussed it with several people over the years, it is easy to prove that you had nothing to do with it. People who've helped me with the paper are in the acknowledgments.

Best,

B.

Giotis said...

These scenarios were never popular among string theorists as far as I know. Even the famous GKP paper, which reproduced the basic features of the RS model within String theory, became popular not because of the large hierarchy it introduced but because it stabilized the complex structure moduli via flux compactifications.

Bee said...

Hi Giotis,

Well, they were not popular as a topic to work on. They were considered not serious enough and left to the phenomenologists, mostly particle physicists in my impression. Lisa Randall's book tells the story. They were, and still are, however very popular among string theorists as something to point to when asked about experimental evidence. In the last some years that has somewhat shifted towards AdS/CFT. Best,

B.

Eric said...

Well Bee, you certainly ask for any abuse you get when you state that ideas are a dime a dozen and imply that execution of the idea is the only thing that is important. They are both very important, and good ideas can be quite rare. When you say a thing like that, and you have said it twice now that I remember, I lose total respect for you. It is basically the same as saying "what I do is important and rare, what you do is a dime a dozen". That is why I've come to think you are a narcissist.

Giotis said...

Really? The solution to the hierarchy problem was not considered a serious enough topic? My impression was that such models were not too popular mainly because they didn't require low energy SUSY to solve the hierarchy problem, and String phenomenology has been built largely upon this assumption.

Zephir said...

Eric: Which visualisation do you mean, exactly? I presume it should be possible to trace the original source. In general, if you don't want to get some ideas stolen, don't exploit the traffic of highly visited foreign blogs for their presentation...;-)

Every frontier has a bit of a hypertrophied ego (me included). Bee is very talented and diligent (... and she knows it quite well..;-)) I don't like her occasional censorship - on the other hand she's a patient sheep compared to Motl and/or similar autistic individuals. So the positive attitude prevails for me. After all, I wouldn't communicate with people who are unpleasant to me.

Eric said...

Zephir, I was obviously wrong about her influences on the paper she wrote. I'll leave it at that. Everyone can be pushed to the edge under the right circumstances and she pushed my buttons in a way that brought me there.

I have never said that she wasn't very talented, and she also has a right to have that self-awareness. But there are very many different kinds of talent that are not always so easily communicated. I feel I am talented in a certain kind of way. Some of those talents are complementary to Bee's. In other words, the talent I have comes from a different direction and can fill in gaps in which she is missing.

I think it is sad that she can't see that. I just think her ego for whatever reason refuses to let her see that.

Zephir said...

Eric: IMO there's nothing personal about it. Mainstream physicists are currently living from writing publications filled with various combinations of equations, instead of explanations of things. Information and ideas which cannot be rewritten into rigor immediately are useless for them. They cannot handle them, they cannot publish them. Such ideas simply don't exist for mainstream physics, in a similar way as you cannot see a gas: only more or less deterministic fluctuations (density gradients) of that gas. Of course such a view is biased, but you cannot expect people who spent their whole productive life learning rigor to start using another approach. The inertia of their thinking is of the same physical nature as the inertia of massive objects.