Friday, June 08, 2012

Also against measure

The German Physical Society (Deutsche Physikalische Gesellschaft, DPG) has a new president, Prof. Dr. Johanna Stachel from the University of Heidelberg, member of the ALICE collaboration. In her inaugural speech, she addressed the issue of the spreading measures for scientific success, which we discussed previously in my post "Against measure." Here is my (rough) translation of the respective part of her speech:
[I] want to address a point that I am personally concerned about. Scientific discoveries and breakthroughs are made by individuals. For this they need freedom, an atmosphere of congeniality, and also luck. On the other side stands a rapidly increasing number of programs that want to be measured by success, that want to measure success. A catalog of actions to measure quality is rolling towards us (and over us?). We all know them: performance-oriented allocation of grants, agreements on goals, quality assessors...


Whom do they serve? Sure, by these one can increase quantitative indicators of quality, like the number of publications or grants, much like the milk output of a cow; we see this already. But is this the atmosphere that supports a scientist in entering new terrain? There, the outcome is unclear, full of setbacks.


In a recently published book about Bell Labs (by Jon Gertner), one can read: "in innovation, as in hitting home runs in baseball, you have to be willing to strike out a lot to be successful."
The full speech (in German) is printed in the June 2012 issue of the membership magazine of the DPG (and is not open access). The president of the European Research Council recently expressed a similar sentiment.

7 comments:

Uncle Al said...

Reconfigure grant-funding into researcher loans. The grant funding agency later decides whether output justifies debt cancellation. What a jolly administrative despotism awaits!

Research funding and publication are process fluids enabling managerial boons. Drain and refill at recommended intervals using brand name jugs. Nobody ever got fired for consensus.

stefan said...

BTW, the text of the speech is available on the DPG website.

Phil Warnell said...

Hi Bee,

I like these sentiments as expressed by Prof. Dr. Johanna Stachel, especially the reference to baseball. On the other hand, I think scientific research can also be compared to golf, where a good game comes from making each shot as carefully as one can through an entire round, and where the birdies, the eagles, and best of all a hole in one can make up for several errant swings. However, the greatest difficulty in science isn't so much judging the quality of the round once played, but distinguishing the pros from the duffers beforehand, which I find Dr. Stachel hasn't attempted to address here.


“The difference between a good mechanic and a bad one, like the difference between a good mathematician and a bad one, is precisely this ability to select the good facts from the bad ones on the basis of quality. He has to care! This is an ability about which normal traditional scientific method has nothing to say. It's long past time to take a closer look at this qualitative preselection of facts which has seemed so scrupulously ignored by those who make so much of these facts after they are "observed." I think that it will be found that a formal acknowledgment of the role of Quality in the scientific process doesn't destroy the empirical vision at all. It expands it, strengthens it and brings it far closer to actual scientific practice.”

-Robert M. Pirsig- Zen and the Art of Motorcycle Maintenance

Best,

Phil

Kay zum Felde said...

Measuring theoretical articles, and I mean new ones, is difficult, since in most cases you cannot say whether they are right or wrong, even if you're conservative. They need to be confirmed by experiment, and if they describe the experiment properly, then you can say they succeed and have quality.

Take care Kay

Uncle Al said...

Contemporary physics is grotesquely incapable of excluding “wrong.” arxiv:1205.5998, 1204.0484, 1203.5008, 1203.4052, 1202.5560, 1202.3319, 1201.4374, 1201.4147, 1112.5793, 1112.4714, 1112.3050, 1110.6577, 1110.6571, 1110.5866, 1110.3540, 1110.2170, 1110.0882, 1110.0783, 1110.0424, 1110.0245, 1110.0243, etc.

MSSM adds 120 hugely curve-fit parameters to the standard model (soft supersymmetry breaking is diagonal in flavor space; new CP-violating phases vanish). MSSM is empirical crap. Where is the quality? Two geometric Eötvös experiments operating in ECKS spacetime. Observation repairs empirically defective postulates, which can then be derived into real-world predictive theory.

Bee said...

Hi Phil,

Yes, you're right. I should have said, in that part of her speech she was referring to basic research specifically. I agree with you, golf and baseball both have their place. As always in life, it's all about balance. I think she's concerned, much like me, that we're developing into a golf nation. Best,

B.

Bee said...

Hi Kay,

Yes, that's right. The problem is what to do in the time you're still waiting for experiment to catch up. That really is the problematic phase, and especially in theoretical physics it can be quite long, decades. It also brings up the question of how long you want to wait. So we're facing the problem of how to judge the promise of somebody's research before we actually know whether it moves us forward. And at this point scientists usually rely on the judgement of the community; after all, that's the only judgement you have. The question, however, is how you do that. And imo counting citations is not a good way, for it's an indicator of popularity rather than promise. As I've argued on other occasions, any centralized quantitative measure will inevitably mess with scientists' best intentions to do good research, and give them incentives to optimize their scores instead. So we should use measures for scientific success sparingly and carefully, because their very use limits their usefulness. Best,

B.