
Thursday, July 31, 2008

Doctor of Philosophy

Tomorrow it will be exactly five years since I finished my PhD. Being on a scholarship, the deal was they'd pay until the end of the month in which I finished, but no longer than three years in total. So obviously, I scheduled my defense for the first day of the last month. Rather unnecessarily, actually - I was finished months earlier, but why on earth would I have cut myself off from funding prematurely?

Just yesterday, I came across a post from Jamie's Weblog, Things I learnt during and about my PhD. Written by a computer scientist who didn't finish his PhD, I find it somewhat overly cynical, but sadly enough there's a lot of truth in it as well. So, since I guess some of our readers are struggling through these dark times while others are wondering whether to get into them in the first place, I want to add some thoughts from my perspective.

What is it good for?

A PhD is first and foremost a degree that certifies you are qualified to conduct research. It's a necessary step if you want to stay in academia. It isn't impossible, but it is extremely unlikely that you will actually work on something groundbreaking. What you are supposed to do is to show yourself, your supervisor, and the rest of the world that you can study a topic in depth, acquire the necessary technical skills, formulate new questions, answer them, document them, and finish your examination.

You will likely not have a very good time. I don't want to discourage you, but I don't know anybody who was particularly happy during his PhD. As far as I am concerned, it was possibly the worst time of my life. You will be mostly on your own, probably underpaid, overworked, and stuck on stuff you're not interested in, while being subject to increasing pressure to get done.

You will ask yourself repeatedly what it is good for. You will need a large capacity for self-motivation. If you are not one of those people who cannot live without doing physics, a PhD is not for you. If you are stuck in the details of your thesis that nobody, including you, is or ever will be interested in, you will have to remind yourself endlessly: It isn't about changing the world or winning the Nobel Prize, it is a degree that certifies you are qualified to conduct research. Period. Read that reference. Make that figure. Finish that paragraph.

The Topic

It is not that the topic of your PhD thesis is completely irrelevant. If you apply for a new job, people will look at the title. But there is the general sense that if it's rather dull, that's not your fault, so you'll be excused. The number of people who will actually read your thesis is tiny. As far as my thesis is concerned, there's exactly one person I am sure read it. That's my mother. She corrected all the typos and grammatical bugs.

What is far more important than the topic of the thesis is how you pursue it. The important information will be in the letters accompanying your first application. And those letters will either say you haven't managed to make one single step on your own, and were neither able to find the library nor to understand the wise words of your supervisor. Or they will say you're destined to be a future leader. Translation: you haven't bothered your supervisor with questions he actually had to think about. (Like, where is the library?)

You can increase the visibility of your thesis by putting it on the arxiv. You can decrease it by writing in any language other than English. My thesis is in German, but I put an introduction and a summary in English on the arxiv, which by now has 42 citations - that is mostly because it contains a large collection of references and an overview of the field that at the time was pretty much up to date. Also, if you are working in a group, there is the chance that your thesis will be handed over to the next students as an introduction to the topic. I know of various examples of that.

Your Supervisor

I want to quote a particularly amusing paragraph from Jamie's post:
Supervisors: a curious species, rarely sighted in their expected habitat

Supervisors are strange creatures. Some are like ghosts, appearing occasionally for a fleeting moment, and you’re more likely to meet them at a conference than at the University. Others are always around but they’re too busy running around like demented hamsters on a wheel – all motion and no progress. They’re disorganised. All of them will, at some point, forget what your project is about – and some will even forget who you are.

I made an interesting discovery half way through my PhD: the number of good/useful/interesting/brilliant things that your supervisor will say to you is not proportional to the amount of contact you have with them – it’s constant. Yep, that right. You can have weekly meetings with your supervisor but you’ll only get three good suggestions a year out of them [...]

Supervisors also participate in a little-known game which can catch out the naïve student: Hunt the Supervisor. This involves the PhD student attempting to locate their supervisor during the agreed meeting slot. And, no, they are definitely not going to be in their office. You’ll be lucky if they’re in the right country.

This description is, in my impression, very accurate. I am actually relieved to hear it isn't much different in computer science than in physics.

There is something to learn from that: When you pick your supervisor, pay attention to his reputation for taking care of students and to his reliability, *not* to his professional reputation and his likability. The most obvious thing to do is to choose somebody who doesn't already have more students than he can cope with, who isn't traveling 11 months of the year, and who isn't glued to his BlackBerry. People who write books, speak at every conference, and are known for their public appearances typically aren't recommendable either. The problem is, these are the people you are most likely to have heard of. (And also the ones whose letters are likely to have the highest impact...)

You will find out these things rather easily: ask previous students. They will tell you more than you want to know. In some cases that information has already been collected somewhere and you can look it up. Nowadays probably online; in my days we had folders for every prof that were being passed around. What you found there was pretty accurate.

What helps?

What helps if you're stuck in the dark times are friends, your office mates who are stuck in similar situations, and online group therapy. If things get really tough I recommend you get professional advice. Most universities offer a counseling service, anonymous and free of charge, where they will help you to move on. Whatever problem you have, you are very likely not the first to have it.

One possibility to relieve the loneliness is to choose a topic that is close to something another student is working on. However, this is recommendable only if you will not be finishing around the same time. The reason is that you will for some years be referred to as "Soandso's student who worked on maggot growth" (or whatever) - "Which one?"

Besides this, if you are stuck with a supervisor who is essentially useless, do not focus on that person. Just accept that he isn't going to help you and look for help elsewhere. This is easier today thanks to online connectivity. If you are smart and not lazy, you will have no problem finding somebody who is willing to work with you. Typically these will be postdocs or young profs who do not have many postdocs and students of their own, and are looking for people to work out the details of some projects they have in mind. Both for my master's thesis as well as for my PhD, I was unofficially supervised by postdocs while the actual supervisors were constantly absent or kept forgetting my name.

In any case, you should preferably look for somebody who will not be searching for a new job and moving elsewhere in the near future. That very often limits the usefulness of postdocs.

Bottomline

Five years after my PhD, I am still amazed I ever managed to finish what looked like a complete mess only six months before my defense. If you are currently working on your PhD, I wish you good luck - and don't lose sight of your dreams.

Tuesday, July 29, 2008

Dealing with Uncertainty

The New York Times has an interesting article about communicating science to the public.

Though the article is mostly about the presentation of climate change and health issues, the points raised are more generally applicable. Kimberly Thompson, an associate professor of risk analysis and decision science at Harvard, is quoted as saying:
“Words that we as scientists use to express uncertainty routinely get dropped out to make stories have more punch and be stronger,” she said, adding that those words are important to include because “they convey meaning to readers not only in the story at hand, but more generally about science being less precise than is typically conveyed.”

but she also points out that “scientists themselves sometimes fail to carefully discriminate between what is well understood and what remains uncertain,” indicating that the communication problem is two-sided, and that small inaccuracies can have large backlashes because “the flow of scientific findings from laboratory to journal to news report is fraught with 'reinforcing loops' that can amplify small distortions.”

I agree with these assessments. As I have previously said (eg in my post Fact or Fiction?), uncertainties are part of science. Especially if reports are about very recent research, uncertainties can be high. They need to be documented accordingly, even if that lowers the entertainment value. There is no place in science for inaccuracies. I understand the need to make popular writing more accessible, but on no account should it be outright wrong. Leaving out the details clarifying under which circumstances which conclusion applies with what certainty can kick a statement from being vague to being wrong. And that doesn't even take into account the distortion of media reports in echoes of the original articles on the internet.

As David Malone wrote very aptly in the New Scientist, Aug 2007:
“We are faced with all kinds of questions to which we would like unequivocal answers […] There is a huge pressure on scientists to provide concrete answers […] But the temptation to frame these debates in terms of certainty is fraught with danger. Certainty is an unforgiving taskmaster. […] If we are honest and say the scientists’ conclusions aren’t certain, we may find this being used as justification for doing nothing, or even to allow wiggle room for the supernatural to creep back in again. If we pretend we’re certain when we are not, we risk being unmasked as liars.”

In a similar spirit, the author of the NYT article, Andrew Revkin, writes on his blog in a posting about Media Mania and Front Page Thoughts:
“[O]ne danger in this kind of coverage — not accounting for the full range of uncertainty or understanding in dealing with very important environmental questions — is that it ends up providing ammunition to critics charging the media with an alarmist bias.”

I am afraid 'alarming' is exactly the reason for such headlines — it's a way to get attention. Revkin ends his article with an optimistic quote by Morris Ward, the editor of yaleclimatemediaforum.org (a forum on climate change and the media, with a focus on improving media coverage):

“Ward [...] says that it will be up to the public to choose to be better informed on momentous issues that do not fit the normal template for news or clash with their ingrained worldviews. 'At some point,' he said, 'the public at large has to step up to the plate in terms of scientific and policy literacy, in terms of commitment to education and strong and effective political leadership, and in terms of their own general self-improvement.'”

Monday, July 28, 2008

Book Review: “Distracted” by Maggie Jackson


Distracted: The Erosion of Attention and the Coming Dark Age
By Maggie Jackson (Prometheus, 2008)

(order at Amazon.com)


Focus

I was distracted by an irrelevant search result, followed a path of links I can't recall, and ended up on Maggie Jackson's website, where her new book “Distracted” is advertised:

“Distracted is a gripping exposé of this hyper-mobile, cyber-centric, attention-deficient life. Day by day, we are eroding our capacity for deep attention— the building block of intimacy, wisdom and cultural progress. The implications for a healthy society are stark.”

If you follow this blog you will hear the resonance with some of my writings, on Googlearchy, Communication, Information Overload and the danger of making irreversible mistakes.

So I ordered the book - how could I have resisted something about a coming dark age? (For my version, see here.) When the book arrived a month ago I didn't even open the box, but put it on a huge pile of papers waiting to be read, where it sat until my flight back to Canada. Thinking of a friend left to Niagara Riesling, I added a bottle of German Chardonnay to my suitcase, whereupon it exceeded the allowed weight. I therefore decided to promote one more book to hand-baggage, and “Distracted” successfully distracted me from my seat-neighbor's cough during a transatlantic flight.

This already brings up the first question: What is distraction? As for me, I would have said I am often distracted - to my husband's chagrin - because I am focused, just not on what is coughing right next to me. However, what Jackson is actually writing about is not so much distraction as attention, and how essential paying attention is to our ability to shape our future. Using the words of the philosopher William James, she defines it:
“Attention is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. It implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.”

Judgement

With this starting point Jackson sets out to argue vividly that

“The way we live is eroding our capacity for deep, sustained, perceptive attention - the building block of intimacy, wisdom, and cultural progress.”

In her book, Maggie Jackson describes her pursuit of the topic, which must have taken several years. She covers a wide range of aspects and describes meetings with researchers in many fields, from neuroscience and psychology to computer science; she talked to artists, and to people who shared their experiences. The research findings are very well referenced, understandably summarized, and woven into the narrative of her travels. The book reads very smoothly, much like a piece of science journalism, just extended to book length.

Some of the topics Jackson covers were not quite what I expected, such as eg the importance of the fork in Western civilization, or Edison's attempts to communicate with the dead. She actually addresses a much wider range than the blurb lets one expect: Jackson also writes about the clash of surveillance with trust, the loss of permanence in mobility and fast food, and the eeriness of computers that simulate or react to human emotion. The parts I found most interesting, though, were those covering the influence of multitasking, lacking self-discipline, and an inability to maintain focus on the education of the next generation.

The book is divided into three parts:

PART I. LENGTHENING SHADOWS: EXPLORING OUR LANDSCAPE OF DISTRACTION
PART II. DEEPENING TWILIGHT: PURSUING THE NARROWING PATH
PART III. DARK TIMES... OR RENAISSANCE OF ATTENTION?

and the first and second parts each have chapters titled “Focus”, “Judgement”, and “Awareness”, which, as she explains in the last part of the book, are the three levels of our attention system:

“Do we yearn for such voracious virtual connectivity that others become optional and conversation fades into a lost art? For efficiency's sake, do we split focus so finely that we thrust ourselves in a culture of lost threads? [..] Smitten with the virtual, split-split, and nomadic, we are corroding the three pillars of our attention: focus (orienting), judgement (executive function), and awareness (alerting). The costs are steep: we begin to lose trust, depth, and collection in our relations and our thought. Without a flourishing array of attentional skills, our world flattens and thins. And most alarmingly, we begin to lose our ability to collectively face the challenges of our time. Can a society without deep focus preserve and learn from its past? Does a culture of distraction evolve to meet the needs of its future?”

However, the assignment of these titles to the chapters seems to me rather constructed; at least, I fail to see why keeping memories of the deceased on Facebook falls under 'Focus', and global nomads fall under 'Awareness'. Either way, the line of thought in Jackson's book is well presented and easy to follow. She struggles hard not to come off as a Luddite and is semi-successful with it. In the final part, she comes damned close to advertising Buddhism and recommending we all meditate 30 minutes each day as a cure for attention deficit, but then she just finishes with saying we have to choose between “creating a culture of attention, recover the ability to pause, focus, connect, judge and enter deeply into a relationship or an idea, or we can slip into numb days of easy diffusion and detachment.”

Awareness

One weird thing about this book is that it put me in a position of constantly reflecting on my own attention, asking myself whether I was following or getting distracted.

Something else that I could not avoid noticing is that Maggie Jackson's book is thoroughly *American* (notice the emphasis). Not only do most of the people she has talked to either live in New York, come from New York, or are in some other way connected to New York, she doesn't even once attempt to ask what's going on outside the USA. I wonder, for example, whether the Japanese and Chinese have encountered similar problems in the education of the next generation.

Further, many of the points she argues to be connected to the loss of attention are very USA-specific, such as the omnipresence of and fondness for fast food, and the idea that freedom is the same as mobility. All of the research results she quotes seem to have been based on samples of US citizens. For me this lowers the significance of her arguments considerably. The part of me living in North America wants to constantly nod and say YES, YES, but the part born in Europe says “Who cares, so they finally get what they deserve.”

And honestly, that's what you have to ask yourself: if you have indoctrinated generations of your citizens with the idea that the pursuit of immediate individual advantage will lead your society towards happiness, guided by some “Invisible Hand” converting micro-interests into a desirable macro-behavior, how can you now expect them to realize they are making a huge and irreversible mistake?

Unfortunately, Jackson's book ends up being nothing more than a well-meant appeal that does not really offer any insights or solutions. I hope it will raise some awareness of the issues she is writing about, but I can't help but think she didn't address the main problem. These warnings are not new, but have been around for at least several decades, as she mentions in various places herself, and we are now beginning to notice the impact in an erosion of attention, and in an inability of our societies to maintain focus on long-term goals. How likely is this problem to be resolved by asking people to meditate and pay more attention to their friends?

Bottomline

If this were an Amazon review, I'd give four out of five stars. The book is a very recommendable read, well written, and covers a lot of ground, but is short on conclusions.

Saturday, July 26, 2008

Blogging Heads: Woit and Hossenfelder

Some weeks ago, David from bloggingheads.tv asked whether I'd be interested in contributing to their program by chatting with Peter Woit. Over the last years, I have found that Peter and I share some interests when it comes to the problems of the current academic system, the role of the blogosphere, and the influence of the media on scientific discussions. So I agreed to do this 'diavlog', and we had indeed a very interesting exchange that I enjoyed very much. You find most of what Peter and I talked about summarized in last week's post We have only ourselves to judge on each other.

As you know, I was traveling the last weeks, so for the recording I had to overcome some technical hurdles. I ended up sitting in my mother's study in the attic where a 1.5m Ethernet cable confined my mobility considerably, no power outlet was nearby, the video quality suffered from the bad lighting, and my back suffered from crouching on a chair I'm sure every orthopedist would disapprove of. Somewhere around min 8:40 there appears a blot below my nose that looks quite funny. Anyway, here it is:



See the full video here, and Peter's related post here.

Blog reactions: Secret Blogging Seminar: Not Even Blogging; John Hawks: Organizing the "idea marketplace"; Uncertain Principles: Not Even Backreaction

Friday, July 25, 2008

Ghosts in Transit

Ghosts around me. Caught in between, not really gone, not yet arrived. An airport terminal - hard to say which city or even which country. A duty free shop, a Starbucks, groups of Japanese guys sitting on the floor around pillars with power outlets, typing on their laptops, maybe blogging how the German is sitting there with her laptop, maybe blogging.

Ghosts caught in transit, trying to stay in touch with their loved ones, talking to their cellphones, checking their emails. Lost souls, staring at the CNN news, dragging their children around, munching pizza, sipping coffee, queuing at the restrooms. Another Japanese circling around, searching walls for an outlet to plug in the lifeline. A guy next to me, smoker’s smell, attempts to cough out his lung, every breath an instant message of rotten tissue. A bit more ghostlike is what I’d wish, and look for a different place. Make a detour through duty free and cover myself with perfumes that carry names like Miracle, Pleasure, Innocence.

Somebody opens an emergency exit door, the alarm shrieks. The ghosts sit stoically, continue to watch CNN, the oil price, the banking crisis, an advertisement for a frequent flyer program. A recording repeats endlessly the terminal will be checked for a possible fire. The alarm blends into the background noise, last calls, paging passengers with unpronounceable names, security advice: do not leave your baggage unattended at any time.

Eventually, the shrieking stops, a child starts crying. The lost souls hold on to imitations of permanence, paper cups, iPods, credit cards, a BlackBerry they sleep with in exchangeable hotel beds. I wonder about the ghosts’ stories, is he married, does he cheat on his wife, is she happy? Will the little boy clinging to her leg in ten years try to completely ignore her? Is he going to a funeral, is he well prepared for the job interview, does she have cancer? How many lost souls are stuck in airports, trampled upon, disapproved of by security because they leak out of tightly sealed plastic bags?

A ghost in transit is what I am too, no longer in the past and not yet in the future, caught between where I’ve been and where I want to go - if only I knew where that is. I have the urge to rebook and fly to elsewhere, some country with a long unprotected border, vanish and settle down in ghoststate. Run a hotel maybe where burned-out businesspeople can recover. Be a painter maybe, sell paintings of flowers and fish to dentists. Be an interior designer maybe, a writer, a teacher, maybe I should use the restroom before boarding.

Ten hours later, another airport, another city. Miracle, Pleasure and Innocence have faded away. Thank you for flying with us, make sure to take all your personal belongings. The smoker’s cough still among us. The BlackBerry welcomes me to Canada, emails have piled up: a traffic jam on the 401, a broken water pipe, a conference participant cancelled, a late notice, a Facebook message, and another forgotten referee report. If Toronto is your final destination we wish you a pleasant stay.

How global can a soul be before it gets lost?

Thursday, July 24, 2008

Liquid Helium

This month has seen the centenary of the first liquefaction of helium, the lightest noble gas:

On July 10, 1908, a complicated apparatus working in the laboratory of Heike Kamerlingh Onnes in Leiden, Holland, managed to produce 60 ml of liquid helium, at a temperature of 4.2 Kelvin, or −269°C.

Heike Kamerlingh Onnes (left) and Johannes Diderik van der Waals in 1908 in the Leiden physics laboratory, in front of the apparatus used later to condense helium. (Source: Museum Boerhaave, Leiden)
Kamerlingh Onnes had been experimenting with cold gases for quite some time before, as he was trying to check the theories of his fellow countryman Johannes Diderik van der Waals on the equation of state of real gases. He had been scooped in the liquefaction of hydrogen (at 20.3 K) in 1898 by James Dewar (who, in the process, had invented the Dewar flask).

But as it turned out, the liquefaction of helium required a multi-step strategy and a big laboratory, and this was Kamerlingh Onnes' business: Using first liquid air, then liquid hydrogen, helium could finally be cooled enough, via the Joule-Thomson effect, to condense into the liquid state. The physics laboratory in Leiden had become the "coldest place on Earth", and immediately turned into the international centre for low-temperature physics.

Three years later, in 1911, Onnes found that mercury lost its electrical resistivity when cooled to the temperature of liquid helium - this was the discovery of superconductivity. In 1913, Kamerlingh Onnes was awarded the Nobel Prize in Physics, "for his investigations on the properties of matter at low temperatures which led, inter alia, to the production of liquid helium".

Paul Ehrenfest, Hendrik Lorentz, Niels Bohr, and Heike Kamerlingh Onnes (from left to right) in 1919 in front of the helium liquefactor in the Leiden physics laboratory. (Source: Instituut-Lorentz for Theoretical Physics)



I read about the story of the liquefaction of helium in the July issue of the PhysikJournal (the German "version" of Physics Today - PDF file available with free registration). Moreover, the Museum Boerhaave in Leiden shows a special exhibition to commemorate the event, "Jacht op het absolute nulpunt", but the website seems to be in Dutch only. However, the curator of the exhibition, Dirk van Delft, describes the story in a nice article in the March 2008 issue of Physics Today, "Little Cup of Helium, Big Science", where he makes the point that the Kamerlingh Onnes Laboratory in Leiden marked the beginning of "Big Science" in physics (PDF file available here and here).

One hundred years later, there is a twist to the story I wasn't aware of at all: Helium is now so widely used in science and industry that there may be a serious shortage ahead! [1]

Helium Demand ...


The following graph, plotting data provided by the US Geological Survey, shows how helium is used today in the US:


Helium Usage. Data from US Geological Survey; click to enlarge. (XLS/PDF file)


The biggest chunk of helium is used for technical applications, which include pressurizing and purging, welding cover gas, controlled atmospheres, and leak detection. The second-largest share already goes to cryogenics, such as the cooling of superconducting magnets for magnetic resonance imaging (MRI, formerly known as nuclear magnetic resonance, NMR) machines in medicine, and of superconducting cavities and magnets for high-energy particle accelerators. Only then follow applications that include lifting, as in balloons or blimps.

The LHC, for example, needs 120 metric tons of liquid helium to cool down the accelerator to a mere 2.17 Kelvin, when helium becomes a superfluid and an ideal thermal conductor (90 tons are being used in the magnets and the rest in the pipes and refrigerator - see p. 33 of LHC the guide), and 40 more tons to cool down the magnets of the large detectors to 4.5 Kelvin, so that the coils are superconducting [2]. But even this huge amount of helium is just about 5% of the annual US consumption of helium for cryogenics!
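As a rough cross-check of these figures, here is a minimal back-of-envelope sketch in Python. It simply takes the 120 and 40 tons and the "about 5%" statement above at face value; the variable names are mine, and the implied US total is just the arithmetic consequence, not an official number.

    # Back-of-envelope check of the numbers above (a sketch, not official figures).
    lhc_accelerator_helium = 120.0   # metric tons of liquid helium for the accelerator
    lhc_detector_helium = 40.0       # metric tons for the detector magnets
    lhc_total = lhc_accelerator_helium + lhc_detector_helium

    share_of_us_cryogenic_use = 0.05  # "just about 5%" of annual US cryogenic consumption
    implied_us_cryogenic_use = lhc_total / share_of_us_cryogenic_use

    print(f"LHC helium inventory: {lhc_total:.0f} t")                              # 160 t
    print(f"Implied annual US cryogenic use: ~{implied_us_cryogenic_use:.0f} t")   # ~3200 t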

...and Helium Supply


Helium is the second-most abundant element in the Universe, but on Earth, it is rare: The atmosphere cannot hold back the light noble gas atoms - ionized helium is transported along magnetic field lines into the upper atmosphere, where its thermal velocity exceeds the escape velocity of 11.2 km/s [3].

Thus, the constant helium content of about 5 parts per million (ppm) in the atmosphere is maintained only because helium is constantly being produced anew in radioactive decay: for each uranium, thorium or radon nucleus undergoing alpha decay in the Earth's crust, a new helium atom emerges. This helium gas accumulates in gas fields within the Earth, often together with natural gas. That's where helium can be extracted.

The following figure compares the annual helium production in the US from the exploitation of gas fields with consumption and exports, and with the total World production (data according to the US Geological Survey, who is to blame for the anomaly that the US production can exceed world production):


Annual Helium Production. Data from US Geological Survey; click to enlarge.
(XLS/PDF file)


US helium consumption and exports clearly exceed production, which is possible because the US helium stock is being drawn down. World helium production is still rising at the moment, but easily exploitable reservoirs will become rare some time in the future, as they already are in the US.

Fortunately for future particle accelerators, and all other applications of helium in science and technology, helium can also be recovered from the atmosphere, albeit at a higher cost:

The Meissner-Ochsenfeld effect: A superconductor is hovering above a magnet (Source: Wikipedia)
When Walther Meissner succeeded in producing liquid helium in Berlin in 1925 [4], he could not rely on helium supplied from American gas fields because of embargoes in the wake of World War I - helium to fill balloons and Zeppelins was considered of high strategic value. Instead, he cooperated with a company that later commercially sold equipment to liquefy helium. And he could distill enough liquid helium to discover, together with his postdoc Robert Ochsenfeld, that superconductors expel magnetic fields – the Meissner-Ochsenfeld effect.








[1] For the pending helium shortage, see for example
  • The coming helium shortage, by Laura Deakin: "It’s surprising how many scientists and nonscientists alike are oblivious of the pending helium shortage. But it is a fact—we will run out of helium. [...] The question is when, not if, this will happen." (Chemical Innovation 31 No. 6, June 2001, 43–44)
  • Helium shortage hampers research and industry, by Karen H. Kaplan: "If new sources of helium aren't developed, the world's supply of the gas will dwindle and prices will soar." (Physics Today, June 2007, page 31)
  • Helium Supplies Endangered, Threatening Science And Technology: "In America, helium is running out of gas." (ScienceDaily, January 5, 2008)

[2] For the cooling of the LHC, see for example
  • Let the cooling begin at the LHC, by Hamish Johnston: "Tens of thousands of tonnes of equipment must be cooled to near absolute zero before the Large Hadron Collider can detect its first exotic particle. The head of CERN's cryogenics group, Laurent Tavian, tells Hamish Johnston how this will be done." (Physics World, November 7, 2007)
  • Messer to provide helium for LHC project, by Rob Cockerill: "Over the course of the next few years, industrial gas specialist [...] is to provide a 160.000 kg supply of helium to the European Organisation for Nuclear Research (CERN) for the operation of the world’s largest particle accelerator." (gasworld.com, January 23, 2008)
  • Cern lab goes 'colder than space', by Paul Rincon: "A vast physics experiment built in a tunnel below the French-Swiss border is fast becoming one of the coolest places in the Universe." (BBC News, July 18, 2008)
  • Cooldown status - the current state of the cooldown of the LHC, from CERN.

[3] See for example page 250 and 251 of Noble Gas Chemistry by Minoru Ozima, Frank A. Podosek, Cambridge University Press, 2002.

[4] Verflüssigung des Heliums in der Physikalisch-Technischen Reichsanstalt, by Walther Meissner, Naturwissenschaften 13 No 32 (1925) 695-696.

Wednesday, July 23, 2008

Is there life after CERN?

Is there life after CERN? Will a black hole swallow the Earth? So ran the title of PM magazine's July issue, which my husband bought and kindly showed to me upon my arrival in Germany. He means well, I should add; my blood pressure is often too low, especially after long-distance flights, and in such a condition I'm not good for anything.

PM is a popular German magazine that reports in a usually entertaining way on science and engineering. I never much read it because to my taste there's always been too much engineering in it, but it makes for a nice read on the beach or so. The PM article about black holes at the LHC is unfortunately a) in German and b) not available online, but you can look at the two-page illustration here and read the first paragraphs here. You get the flavor I presume, we've all seen numerous articles of that sort during the last months. For an extensive discussion of the key points, see our previous posts on Micro Black Holes, Black Holes at the LHC - again, Black Holes at the LHC - What can happen, and Black Holes at the LHC - the CERN Safety report.

Regarding the PM article, we could not refrain from writing a letter to the editor that referred specifically to the last part of the article, in which they introduce the allegedly ingenious idea of how to supply the world's demand for energy with black hole relics. I commented on this previously in my post 'Micro Black Day'. The idea is roughly that in case the LHC produces black holes, and these happen to not evaporate completely but leave behind stable remnants (relics), one could use these relics to convert arbitrary matter into radiation energy. The picture, I guess, is to collect the relics, shovel in garbage upon which they gain mass, and let them radiate back down to the relic mass, thereby emitting clean energy with an average temperature of some hundred GeV.

Here is a rough translation of our letter that comments on this scenario (a short numerical cross-check of the estimates follows after the letter):
    Letter to "Will a Black Hole swallow Earth?" PM 07 / 2008

    As physicists who have worked for several years on the possible production of black holes at the LHC, we were disappointed by your article. Instead of discussing the interesting aspects of the science involved, like Hawking radiation or collider physics in general, you produce a sensational article about a constructed doomsday scenario.

    Especially inappropriate is your mention of "Scenario No 3: Free energy in abundance," referring to a patent filed by Prof. Horst Stöcker. To begin with, it is completely wrong that "researchers find more and more hints that black holes do not completely evaporate". There are no hints whatsoever, and nothing about that has changed recently. This possibility simply cannot be ruled out.

    Worse than this inaccuracy, however, is that your estimate about the "harvesting" of black hole radiation ignores the fact that one has to run a particle collider the size of the LHC to produce these black holes in the first place. The power consumption of CERN to run the LHC is about 240 Megawatt, about one fifth of a nuclear power plant. Even with the very optimistic estimate that the LHC produces about one black hole per second, each black hole would then take about 70 kWh to produce - this is the mass equivalent of about 10^18 protons, or 10^15 times the mass of the black hole itself. The black hole therefore has to convert about one billion billion protons into radiation in order to generate a net energy gain. In addition, one has to take into account that even if these black hole relics can be produced, they have a very small cross-section and - similarly to neutrinos - will pass through all kinds of matter almost without interaction and will generally escape into outer space. Unless, that is, they carried electric charge, which could happen for purely stochastic reasons for about 2/3 of these black hole relics. However, if one "feeds" the black hole relics and lets them evaporate down to the relic mass again, the end product will in one out of three cases be neutral again and escape. Therefore, it is practically impossible to even approximately reach the break-even point of 10^18 protons to be converted into energy: one would have to constantly reproduce the black holes.

    And with that we haven't even touched the question of how long it would take to get a "truck with only ten tons of normal matter" to "cover the energy supply of the whole earth" into such a black hole with a cross-section of 10^-32 cm^2 - as mentioned, this cross-section is extremely small: the radius of the black hole, about 10^-18 m, stands in relation to the width of a needle with a diameter of about 1 mm as the width of the needle to the average distance of the earth from the sun.

    As far as we know, Prof. Stöcker's patent application was declined. The idea you are advertising here is complete scientific nonsense.
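For readers who want to retrace the orders of magnitude quoted in the letter, here is a minimal back-of-envelope sketch in Python. It assumes, as in the letter, a power draw of about 240 MW, a very optimistic production rate of one black hole per second, and a black hole mass of roughly a TeV, i.e. about 1000 proton masses; the variable names are of course my own.

    # A rough cross-check (sketch) of the estimates in the letter above.
    power_watt = 240e6          # ~240 MW to run the LHC
    bh_per_second = 1.0         # optimistic production rate: one black hole per second
    proton_mass_kg = 1.67e-27
    speed_of_light = 3.0e8      # m/s
    bh_mass_in_protons = 1.0e3  # ~1 TeV black hole, roughly 1000 proton masses

    energy_per_bh = power_watt / bh_per_second           # joules invested per black hole
    kwh_per_bh = energy_per_bh / 3.6e6                    # ~67 kWh, i.e. "about 70 kWh"
    mass_to_convert = energy_per_bh / speed_of_light**2   # kg of matter to radiate away to break even
    protons_to_convert = mass_to_convert / proton_mass_kg

    print(f"{kwh_per_bh:.0f} kWh per black hole")
    print(f"{protons_to_convert:.1e} protons to break even")                            # ~1.6e18
    print(f"{protons_to_convert / bh_mass_in_protons:.1e} times the relic's own mass")  # ~1.6e15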


To restore the scientific credibility (well, I'm not completely heartless) of Prof. Horst Stöcker and Prof. Marcus Bleicher who grin from a photo in that magazine, let me point you towards a very nice article by them that was on the arxiv today
    Exclusion of black hole disaster scenarios at the LHC
    arXiv:0807.3349 [hep-ph]


    The upcoming high energy experiments at the LHC are one of the most outstanding efforts for a better understanding of nature. It is associated with great hopes in the physics community. But there is also some fear in the public, that the conjectured production of mini black holes might lead to a dangerous chain reaction. In this paper we summarize the most straightforward proofs that are necessary to rule out such doomsday scenarios.
in which you find plenty of 'diskussions'.

Monday, July 21, 2008

Openness in Science

In a recent blog post on The Future of Science, Michael Nielsen (co-organizer of our upcoming conference on Science in the 21st Century) puts forward an interesting hypothesis. The reason, he speculates, that scientists are so slow in adopting Web 2.0 tools is a "reluctance to share knowledge that could be useful to others". Since Michael's very recommendable essay is likely too long for some of you guys with an attention span around the lifetime of a charged kaon, here is my summary: What is missing in the scientific community, he says, is a culture of openness.

I agree with him on this - without supporting the cultural change towards greater openness, science will fall behind other areas of our lives that have been moving on. It is quite ironic that science, which lives from creating and discussing ideas, suffers from an inhibition against spreading and sharing these ideas.

I can see the following reasons for this:

  • P1: Especially in physics, there is the prevailing myth of the lonely genius who sits under a tree and waits for the apple to drop on his head. In most instances, however, this picture is very incomplete. Even the genius needs a community to work out his ideas, to discuss, and to ask - not to mention that ideas are not born in a vacuum but are based on knowledge drawn from that same community.


  • P2: Offering spontaneous opinions, eg. on blogs or in a forum, brings a risk of being wrong. Making mistakes is human, but the atmosphere in scientific discussions is too often unforgiving and malicious instead of supportive and constructive. In many instances, there is also a confusion of intelligence with knowledge. Not knowing something doesn't equal stupidity, but as you can tell, for example, from the comments on this blog, many people seem to think so. It is rather unsurprising that under such conditions especially those scientists who strive to surround themselves with an aura of omniscience are reluctant to potentially embarrass themselves.


  • P3: As long as scientists have to justify their existence by producing papers, and live in constant fear of being scooped, their ideas are not meant to be shared. It is of advantage for them to hear about others' insights, but not to offer their own as long as these are not published and every citation goes on record. Scientists simply don't get paid for having ideas, but for working them out with their name-stamp on them. This is sensible in some regards, but has obvious disadvantages when it comes to sharing these ideas, especially in casual though public environments.

Ways to deal with it

  • S1: The importance of the community and its support is very underappreciated. We have way too much emphasis on competition instead of collaboration, and the collaborative advantage is badly managed. I think this problem will solve itself since it seems to be a general sociological trend that more people realize the advantage of well-organized collaboration, and that knowledge sharing with the appropriate management can catalyze progress.

    This does not mean the times of the lonely genius are over. Collaboration is a way to efficiently use ideas and to discover the potential in already existing knowledge, but it still needs human creativity to actually produce novelty. I am emphasizing this because I am very skeptical about the enthusiasm caused by wiki-like collaborative efforts (see e.g. Wikinomics). It is one thing to use existing resources - here, human knowledge - most efficiently, but something completely different to add something new. In business one shouldn't neglect the importance of the latter, and in science one shouldn't neglect the importance of the former.


  • S2: The problem of potential embarrassment is rather simple to solve, in that one realizes an online discussion, though written and public, isn't a scientific publication, and that dwelling upon somebody's inaccuracies isn't constructive. Like the first point, this is also a barrier that I believe will vanish by itself since our communication culture is shifting towards less formality, also in the sciences. I only have to look at my inbox. The state of mind needed to quote others' inaccuracies and to make fun of them is going to bore people into the afterlife in a couple of years.

    If you've been around in the blogosphere for a while, either blogging or commenting, you will probably also have noticed that this hesitation is a threshold effect. If you've made a stupid remark once you will realize it doesn't kill you, and the memory of online discussions is very short.


  • S3: There are ways to deal with that. For example, I have been wondering for a while why not make it possible for papers to be based on an idea of somebody who ultimately wasn't involved in working out the details (please spare me any comments about seers or craftsmen). So that person who eg offered an idea online would get credit for initiating the process but not for presenting the results. There are already attempts in this direction of allowing more detailed information about researchers' contributions. Philip Campbell from Nature recently wrote, in an article titled Escape from the Impact Factor, a contribution to Ethics in Science and Environmental Politics:

    "I am intrigued by the possibility of greater granularity within the literature. This is already happening in relation to author's contributions. In Nature and Nature research journals (following the lead taken by medical journals, to their great credit), we encourage authors to include brief summaries of which author contributed what to the work. More and more author collaborations are taking up this option. This will be spurred on and will no doubt become more formal if funding agencies begin to explicitly track such information."

    Similarly, I would prefer funding driven by interest in somebody's ideas, instead of him or her getting funding and subsequently creating interest through the ability to offer positions (see this previous post to clarify my skepticism about directing research interests through financial incentives). What I mean is: consider putting out ideas for projects, and distributing funding according to how much interest a topic receives from qualified candidates.

    Michael further mentioned the necessity to support people who open up science, which is probably the most straightforward way. It requires, however, that even scientists realize times have changed, and that continuing to organize research like we did a century ago is a disadvantage and an obstacle to progress.

Regarding the third point, one problem, especially with regard to blogs, is the lack of a reliable time-stamp for posts. With that I do not mean the date at the bottom - this date can be changed arbitrarily, and the post can also be edited a posteriori. This way, the author has no basis on which to potentially show "I suggested this earlier here". (Not so with comments, btw; at least here on Blogger, neither the time-stamp nor the content of a comment can be altered.) It would be easy enough to change this, for example by allowing a post to be "locked" in a certain version with the current date and content that can't be modified without reverting it to an unlocked state.
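Just to illustrate what such a "lock" could amount to (a hypothetical sketch, not an existing feature of Blogger or any other platform): one could publish a cryptographic fingerprint of the post's content together with the date; anyone can later recompute the fingerprint and verify that the text wasn't altered afterwards. The function name and details below are my own illustration.

    # Hypothetical sketch: make a post version tamper-evident by publishing a
    # hash of its content together with a timestamp (e.g. in a comment or at a
    # third-party archive).
    import hashlib
    from datetime import datetime, timezone

    def lock_post(title: str, body: str) -> str:
        """Return a time-stamped fingerprint of the given post content."""
        timestamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        digest = hashlib.sha256(f"{timestamp}\n{title}\n{body}".encode("utf-8")).hexdigest()
        return f"{timestamp} sha256:{digest}"

    # Publishing this string somewhere that cannot be edited lets the author later
    # show that exactly this version of the post existed at that time.
    print(lock_post("Openness in Science", "Draft of my argument about idea credit..."))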

See also: Science and the Web 2.0

Sunday, July 20, 2008

Catastrophe Conference

Over the last few days, the Future of Humanity Institute in Oxford hosted a conference on catastrophic risks. On their program you find, peacefully aligned, talks about how "financial losses associated to catastrophes can be mitigated by insurance" (Peter Taylor), "ecological catastrophes that cause widespread local and global extinctions of species" (Professor Christopher Wills), climate change (Dr Dave Frame), "Hazards from Comets and Asteroids" (William Napier), and "Catastrophic Nuclear Terrorism" (William Potter and Gary Ackerman), along with physicists reporting on the end of the world. There is Prof Fred Adams from the University of Michigan, who talks about

"The Long Term Future of our Dying Universe"

From the abstract:

"After accounting for the demise of the galaxy, we consider the evaporation of expelled degenerate objects (planets, white dwarfs, and neutron stars). The evolution and eventual sublimation of these objects is dictated by the decay of their constituent nucleons. After white dwarfs and neutron stars have disappeared, the black holes are the brightest astrophysical objects, slowly losing their mass as they emit Hawking radiation. After the largest black holes have evaporated, the universe slowly slides into darkness."

And Dr Michelangelo Mangano from CERN speaks on the "Expected and unexpected in the exploration of the fundamental laws of nature". From the abstract: "The discussions over the possible outcomes of new high-energy experiments will be used as a case study to address this topic, covering both the scientific and sociological aspects of the issue." Mangano is an author of the previously mentioned CERN safety report.

Saturday, July 19, 2008

Nothingness

The recent issue of Discover magazine has an article by Tim Folger

It is nicely written, scientifically accurate though vague, but its narrative is absolutely pointless - it just ends somewhere and I kept looking for the next page.

The article adds more evidence that headlines are in most cases nonsensical and rarely have anything to do with the content of the article. I've been told repeatedly that headlines are picked by the editors, not the writers, and I find this increasingly annoying. The article by Folger talks a bit about Casimir energy and dark energy. Then it features the idea of 'extracting' energy from the vacuum, which could destroy the universe in a chain reaction in which our vacuum decays into an energetically more favorable state: “If some clever engineer were ever to extract energy from the vacuum, it could set off a chain reaction that would spread at the speed of light and destroy the universe.” At least that wasn't the headline. Finally there are some paragraphs about the LHC and the Higgs, and a rather unmotivated mention of extra dimensions and M-theory. How the vacuum is supposed to “illuminate” the Theory of Everything remains remarkably unclear throughout the whole article.

John Baez is quoted in various places and is a voice of reason, esp. with regard to the destruction of the universe. About the possibility that the LHC might find the Higgs and only the Higgs, he says
“Well, it would be exciting, but only in the same sense as if you lose your keys and then you find them again. Someone would certainly win a Nobel Prize for it, but after the initial excitement, particle physicists would become grumpy because it would just mean that what we thought was true is true, and all the things we don’t understand we still don’t understand, and there is still no new evidence.”

Sean Carroll is quoted as saying “we really have to think deeply about what our theories are.”

Amen.

Friday, July 18, 2008

This and That

In the absence of quality blogging time, here are some unordered bits of information:

Wednesday, July 16, 2008

Interna

We have good news! As you might recall, after finishing his PhD Stefan started a position as an editor for Springer - the scientific publisher with the little horse head: "Springer" is the German word for the knight in chess. (The same scientific publisher whose books show up annoyingly often in Google searches without providing so much as a free abstract.)

Anyway, after two years Stefan's contract recently became permanent! ("Entfristet", as that's called in modern German.)

At the same time, the office in Darmstadt where he was located has been moved to Heidelberg (where I recently gave a seminar at the ITP). Commuting from Frankfurt to Heidelberg has turned out to be quite painful, so we have been looking for a new apartment over the last weeks. In the meantime, we have found a very nice place in the Heidelberg area and are now packing boxes, have to decide what to do with the furniture, and I am trying to convince my beloved husband to throw out some cubic meters of clothes, books, and paper printouts he'll never use again in his lifetime.

You might be relieved to hear that the availability of a high-speed internet connection was a major criterion for picking the new apartment. We are told it will be set up as early as next week.

Monday, July 14, 2008

The German Academy of Sciences

The UK has its Royal Society, France the Institut de France with its Académie des sciences, Italy the Accademia Nazionale dei Lincei, and the US the National Academies with the National Academy of Sciences, which bring together committees of experts in all areas of scientific and technological endeavor [...] to address critical national issues and give advice to the federal government and the public.

So far, Germany has been missing a comparable institution to produce evidence-based statements as a basis for discussions and political decisions. But as of today, the German Academy of Sciences Leopoldina is Germany's first National Academy of Sciences - it was officially appointed as such in a ceremony with Germany's Federal President Horst Köhler, who also took over the patronage of the Leopoldina as the National Academy.

The Leopoldina was founded in January 1652 by four physicians in the Free Imperial City of Schweinfurt to explore Nature to the Benefit of the Human Being, and is named after Emperor Leopold I (1640–1705), who was well-known for his interest in the arts and sciences of his time.

Let's hope that the new Academy will establish a fruitful two-way exchange between the inhabitants of the Ivory Tower and the public, or its elected representatives, respectively.

Sunday, July 13, 2008

We have only ourselves to judge on each other

Last week, I was talking to a string theorist. We were talking about black holes at the LHC, but I am admittedly somewhat tired of the topic at present. So, since I will be on the market again this fall, I asked what the job situation presently looks like for string theorists. Not good, he said. Having gotten used to the always optimistic US spirit, this came as a surprise to me. People are losing interest, he said, there hasn't been enough progress. He is now looking into loop quantum gravity & Co.

Gee, I thought. Peter is right. There is change knocking on the front door. Should we let it in?

I am watching these developments with interest, but also with some concern. As a continuation of my earlier post on the Marketplace of Ideas, I want to elaborate somewhat on the differences between scientific research and the marketplace in the economy.


The Economic System

With regard to our economy, the free market has proved to be an enormously useful tool, as long as its freedom is guaranteed by appropriate institutions. It works so efficiently because the mechanism that optimizes the distribution of goods and money is strikingly simple. It is the individual who chooses where to invest and whether an investment is worth the money or not, which then provides feedback to those producing a good or offering a service. One has in this case a large group of potential buyers who judge offers by accepting them or not accepting them. This leads the goods to obtain some value, a price that develops out of the interplay of supply and demand.

Buyers don't always act rationally and/or are not well informed, so there is some error in this process. People, eg, might not want to spend arbitrarily much time comparing offers, or they are influenced by fashion trends. Decision science is a fairly recent interdisciplinary field that investigates how people make decisions under which circumstances. Still, the fact remains that the marketplace works quite well. (Though some of its quirks are less than pleasant - eg a large part of the sudden increase in the oil price is due to speculation.) However, one important factor influencing people's decisions is advertising, which can substantially skew situations (and is thus subject to regulation).

If we are selling products on the marketplace we have sellers, we have buyers, and as long as the system is set up properly this should lead to an optimal investment of time, money and resources - and eventually free up capital for further progress. One could say that a product will tend towards some natural value that optimally balances supply and demand.

This mode of operating for profit has been so efficient and has globally led to so much progress that it is of little surprise some countries preach it like a religion - an 'invisible hand' guiding us towards wealth and happiness. Ideally, the mechanism of the free market converts our individual 'micro-interests' into 'macro-interests' that are to the benefit of everybody. (A lot can be said about the circumstances under which this works or doesn't work, but that is not the aim of this post.)


The Academic System

Systems of individuals who pursue interests and develop strategies that lead to specific dynamics and trends can be found in many situations. The blogosphere is an example. Here we have the bloggers, who are trying to increase the number of visitors, comments, and in-links. One of the results is multiple echoes of topics that attract interest. Useful strategies are being fast, being brief, and being provocative. I leave it to you to decide whether the outcome is desirable.

The academic system is also constituted of individuals who pursue their own interests, and the result should be that the single researcher's strategies ideally lead to progress. As in all other cases, this requires however that the system is set up appropriately and that the individual strategies that develop indeed lead to a desired trend.

For this, one has to be aware of several crucial differences between the academic system and the marketplace of goods:


  • First, I dare say that within academia the relevant factor that scientists aim for is not money but attention, primarily - though not entirely - the attention of peers. Those who want to become rich wouldn't stay within academia to begin with. It is however the case today that financial support is closely linked to attention.


  • Second, the most important assumption behind there being something like a 'Marketplace of Ideas' is that an idea has a natural value that can be identified in some simple way. It is here that the analogy fails dramatically. If a company produces a candy bar, people will like it or not, and you will find out very fast. In scientific research, the value of an idea (or a research program) is eventually whether it leads to progress. Progress might not necessarily be an application but simply growth of knowledge and understanding. Developing an idea and judging its value is a complicated and time-consuming process, and the judgement itself is in fact one of the main tasks of scientists. It can however take decades until a research program is fully developed and until it is possible to figure out whether an idea is of high value. In academia, people still discuss today things that were written hundreds of years ago.

    The timescale needed to figure out the value of an idea depends very much on the field and also on the project. Theoretical physics in particular is a very abstract field; working out an idea takes at least 5-10 years, and it can take decades for it to be tested.


  • Third, between the proposal of an idea and the appearance of an external indicator of its value, there is no way to judge the promise of a research direction other than peer review. And this is the most important difference between our economy and scientific research: judging the value of a research program requires an education in the field and expert knowledge. Scientists are thus buyers and sellers alike. They provide the supply and create the demand. Until an idea is developed sufficiently to be experimentally tested, we have only ourselves to judge each other.



The Fall of the Ivory Tower

Especially because of the third point above, the academic system is extremely fragile and prone to substantial distortion when under external pressure that influences researchers' interests. It is for this reason that the ivory tower was traditionally meant to protect scientists. 'Ivory tower' is now a term used quite cynically to describe the detached academic who doesn't know what people do in real life. But the intention of this detachment was to make scientists independent of financial, political, and social influence. This independence is crucial, and it is not easy to obtain. Rational and objective judgement is essential to science and is not simple, neither on the personal nor on the community level. The only way that researchers' micro-interests can be as objective and well-informed as possible, and that these interests can lead to a desired outcome, is to ensure researchers are able to follow their own judgement of ideas, unaffected by fashion, funding, and the media.

This, however, is presently not the case. Most prominently, there are three factors that influence researchers:

  • Financial pressure: Nothing works without money, and typically researchers are funded to work on certain projects, either through grants or because they are hired into specific groups. Projects are funded if they are considered interesting, often based on previous success. Tenure depends on grants obtained. The possibility to hire people depends on grants. The reputation of the place depends on the people and thus on the grants. Now we have a situation where people go where money goes, money goes where people go, and interest goes where money and people go - attracting more money and people (a toy sketch of this feedback loop follows below the list). A perfect setting to produce bubbles of nothing, based on people telling each other and funding agencies how great their research program is (also called: positive feedback). Money is an extraordinarily powerful tool to direct interests, and one has to be extremely careful in using it.


  • Peer pressure: Scientists strive for the attention and appreciation of peers, which opens the door to social problems that one has to be aware of and that potentially need to be counteracted by providing incentives or ensuring appropriate management. Especially when sub-fields specialize, there is a tendency to misjudge the shortcomings of one's own research field and to dismiss criticism from out-group members (a well-documented phenomenon in sociology). Also, scientists might hesitate to work on topics that could damage their reputation, might prefer topics their peers consider interesting, thereby creating fashion trends, and there is the risk that a lot of time is invested in improving social connections at the expense of time invested in research itself.


  • Public attention: Scientists are part of the society they live in and are influenced by what is discussed in public. It is also not surprising that researchers like to work on topics that attract attention and are appreciated by the public, which is not necessarily a bad thing. However, this delegates quite some influence to the mass media and journalists. In a recent post I mentioned survey results according to which 46% of all scientists consider media contacts to be beneficial to their career. It is a small step from there for researchers to decide that working on topics of interest to the media is beneficial for their career.

The above mentioned points lead scientists to develop strategies that improve their chances of surviving within the academic system. The results of such strategies are especially pronounced if competitive pressure is high and selection works very fast, which in turn creates a system populated by scientists who did well under the present circumstances and thus see no reason to change it. In areas where paying attention to such matters simply is not necessary because sufficient positions exist, these influences might be small or negligible.
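Here is the toy sketch of the money-follows-people / people-follow-money loop promised above. It is entirely my own, purely illustrative, and deliberately crude: two topics of equal scientific merit, where topic A starts with a 1% head start in attention, funding follows the current share of people, and a small fraction of researchers re-decides each round in favour of the better-funded topic.

    # Toy model of the funding feedback loop (illustrative only, all numbers invented).
    def feedback(share_a=0.51, mobility=0.1, rounds=40):
        # share_a: fraction of researchers working on topic A
        for _ in range(rounds):
            funding_a = share_a  # money goes where people go
            # people go where money goes: a fraction 'mobility' re-decides
            # each round and simply picks the better-funded topic
            share_a = (1 - mobility) * share_a + mobility * (1.0 if funding_a > 0.5 else 0.0)
        return share_a

    print(f"after 40 rounds, topic A holds {feedback():.0%} of researchers")
    # prints: after 40 rounds, topic A holds 99% of researchers

Start the same model at 49% instead of 51% and it runs to the opposite extreme: the outcome reflects the initial noise, not the merit of the ideas - which is exactly the bubble problem described above.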

Unfortunately, theoretical physics in particular presently suffers from large competitive pressure that leads people to pay attention to these factors. I am very afraid this significantly affects our ability to judge each other, because we are investing too much time in worrying about our careers, and because following our own interests might be in conflict with what is strategically a wise decision. Time pressure and a lack of future options produce a tendency to dismiss or simply ignore new approaches. This is not a problem that originates with funding agencies, who in my impression seem to be well aware of the need for 'transformative research' - it is a problem that originates among the researchers, who are afraid of wasting time or money on approaches that will not produce presentable outcomes for several years. The 5-10 years of unsolicited support that a research program needs to be fully developed are often not available.


Some suggestions

Some suggestions follow directly from the above:

  • Don't tie researchers to topics, but let them choose freely. Scientists should be hired and promoted based only on their ability and creativity, period. No other factors should play any role, and then let them do what they want. This holds true also for younger researchers. Cooperation with senior researchers happens naturally because it is beneficial for both sides, but it should not be enforced.


  • Support researchers for an appropriate time period. In research, the weight has shifted strongly towards short-term positions. From 1973 to 2005 the share of postdoc positions in academic research in the USA grew from 13% to 27% (numbers: NSF). People who are on short-term contracts and under pressure to produce output (papers/patents) will hesitate to start projects that do not fit within that timeframe. This totally stalls progress on topics that take longer to be worked out. Have some faith. If a researcher has done well, just fund him and let him do what he wants. In some fields of science (like theoretical physics), writing a proposal that plans ahead several years and then having to stick to it is in many cases a nonsensical procedure that constrains scientists' freedom.


  • Don't fall for shortcuts. Under time pressure, people tend to look for easy criteria by which to judge other people or their projects - the citation index, the number of papers, prominent coauthors, or grants obtained. In all instances, scientists should judge others' work for themselves. This requires that scientists have enough time for this judgement. Time is essential, but too scarce in today's research atmosphere. For a thought on how to cut back on researchers' duties, see the previous post on Research and Teaching.


  • Counteract specialization: In the present system it is almost impossible for a researcher to change fields without risking severe drawbacks for his/her career. One of the reasons is that researchers are hired for specific tasks, and nobody who hasn't previously worked in the field would be hired for them. Another problem is that grants typically require prior publications in a field to document expertise. It was realized decades ago that progress very often comes from interdisciplinary exchange. It is quite ironic that lots of funding goes into such new inter-disciplines while the possibility for researchers to simply move between fields (or even sub-fields!) is hindered. Solution again: have faith in people who have proven they do good research, and just let them follow their interests.


  • Reward internal communication and criticism: Science lives on criticism and dialogue. Unfortunately, there are presently virtually no incentives for researchers to invest time in obtaining and communicating an overview of their own or others' research fields. It is basically a waste of time, since the way to obtain a position is to make oneself an expert in some niche where one is the only person. Give credit to people and/or institutions who fulfil this task, and create an atmosphere in which the relevance of criticism is acknowledged and its importance appreciated.


  • Don't reward advertisement: Advertising one's own product, in this case an idea or research program, is a tactic widely used to sell goods. The aim is simply to influence buyers. Such a procedure is completely inappropriate for scientific ideas, which have to be critically assessed, in papers as well as in proposals. Nobody should be punished for openly presenting and discussing the drawbacks of their own research program, eg because he or she might appear not enthusiastic or optimistic enough. Neither talks nor papers should overhype promises or omit problems because these are 'well known'. The latter also causes a significant problem when it comes to communication with science journalists.




No Panaceas

I recently had an interesting exchange with Garrett Lisi, who said:
    "It's just that the [academic] system seems locked in a poor state right now, so it seems easy to think of steps to make it better."

I am afraid that the more people realize the present system doesn't work well, the more likely they are to think it's easy to improve and to fall for panaceas - cheap solutions offering miracle cures. However, putting emphasis on easy criteria other than the previously used ones is not going to solve the problem; it just moves the problem elsewhere. There are no panaceas.

Suppose we put an emphasis on the 'independent' researcher who works on an 'alternative' approach, who has plenty of single-authored papers and doesn't like mainstream topics. Suppose we set up the system to preferentially reward this type of person. What would we create then, if we leave the selective pressure behind the system in place? People who strive to fulfill these new criteria. To me it seems pretty obvious that this only cures the symptoms, not the disease: one can also have too many 'independent' people and too much emphasis on 'alternative', and cooperation is, as far as I am concerned, a beneficial trait. The same goes for over-emphasizing predictions or phenomenology. Just look at the arxiv and see what the outcome is. Money goes into phenomenology. People go where money goes. More money goes where people go. Interest goes where people and money go. Two decades from now, somebody will come and say: Hey, there is trouble in physics. They are working on all these cheap little pheno-models - how is anything ever supposed to come out of this?

Therefore, the suggestions above are aimed at ensuring the system can self-optimize its outcome, by lowering the disturbing influences that deflect researchers' work from their actual interests.

In the long run, the only way I see to ensure progress is to scientifically investigate the situation and to incorporate the solutions offered as soon as possible.


Bottomline

If change knocks on the front door, ask what it wants.


Post-Scriptum

Re-reading what I wrote, I was reminded of a paragraph from Lee's first book:
“All there is of Nature is what is around us. All there is of Being is relations among real, sensible things. All we have of natural law is a world that has made itself. All we may expect of human law is what we can negotiate among ourselves, and what we take as our responsibility. All we may gain of knowledge must be drawn from what we can see with our own eyes and what others tell us they have seen with their eyes. All we may expect of justice is compassion. All we may look up to as judges are each other. All that is possible of utopia is what we make with our own hands. Pray let it be enough.”

~Lee Smolin, The Life of the Cosmos (Epilogue)



Meta

My FQXi proposal to investigate some of the aspects I discussed here was declined, the main reason being that it seems to "overlap with studies of the sociology of science, which is a well-developed field, but one in which the principal investigator appears to have no direct professional experience or training". That is entirely correct. I am just a physicist who has spent too much time thinking about how the academic system sucks, wondering why nobody in it seems to listen to what the sociologists say, and why said sociologists don't come up with practical advice (interdisciplinary research, anybody?). To those of you waiting, this also means there will be no financial support for grad students to participate in our upcoming conference. I am genuinely sorry about this.


This post is part IV of our series on Science and Democracy. See also: Part I, Part II, Part III.



Friday, July 11, 2008

Scientists and the Mass Media

The recent issue of Science has an article about


which summarizes the results of a survey of 1354 researchers in the United States, Japan, Germany, the United Kingdom, and France. The results show that media contacts of scientists in these top R&D countries are more frequent and smooth than previously thought.

In all five countries, the majority of scientists who had contact with the media in the past three years rated the impact of those contacts on their careers positively: 46% of the respondents perceived a “mostly positive” impact, whereas only 3% found the impact to be “mostly negative”. 57% of the respondents said they were “mostly pleased” about their “latest appearance in the media,” and only 6% were “mostly dissatisfied”.

I find it interesting but also slightly worrisome that such a large fraction of scientists considers media contacts to be beneficial for their career. It would be good to know whether the perceived benefits are actual benefits.

Thursday, July 10, 2008

German Citizenship Test

As previously mentioned, Germany has introduced a citizenship test for immigrants. Spiegel Online now has the full version with 33 questions:

I got 32 right; I failed on the colors of the state flag of North Rhine-Westphalia. Outcome:
    "Well done! You would have no problem getting a German passport."
I'm relieved to hear it. Particularly nice is this question:
    Which of the following do Germans traditionally do at Easter?

    • Leave pumpkins in front of the door
    • Decorate a fir tree
    • Paint eggs
    • Let off fireworks

Hint: if you follow this blog, you know the answer.

They left out, however, some of the really important questions; here are my suggestions:
    1. Besides being a citizen of Frankfurt, what is a 'Frankfurter'?

    • A: A bakery
    • B: A sausage
    • C: A flat tire
    • D: A drink mixed of beer and lemonade

    2. If a German says he will meet you at three-quarter eight (dreiviertel Acht), what does he mean?

    • A: 8:45
    • B: 8:15
    • C: 7:45
    • D: Any time between three quarter to and after eight, ie 7:15 - 8:45

    3. What did the crowd chant on Nov. 9th '89 at the Brandenburg Gate?

    • A: Lasst uns rein - Let us in
    • B: Lasst uns raus - Let us out
    • C: Wir sind der Staat - We are the state
    • D: Wir sind das Volk - We are the people

    4. If a Bavarian tells you to "Grüss Gott" - "Say hello to God", he means

    • A: Thank you
    • B: Hello
    • C: Piss off
    • D: I died and went to heaven



(Answers: 1 - B, 2 - I don't know!, 3 - D, 4 - B)

Yes, he can have a dream

But a dream it will likely remain. German newspapers have been going back and forth over the last few days about whether Barack Obama, while visiting Germany, should be allowed to speak in front of the Brandenburg Gate in Berlin on July 24th (SPEECH AT BRANDENBURG GATE? - German Politicians Are in an Obama Tizzy). Berlin's mayor Klaus Wowereit says yes, Chancellor Angela Merkel says no. Today the topic made it into the NYT, which titles

Prospect of Obama at Brandenburg Gate Divides German Politicians

You didn't ask for my opinion, but in this case I'm with Merkel, though she's in the wrong party (can you imagine I'd vote for a party that has a C for Christian in its name?). Obama isn't an elected representative of any nation, at least not yet. It is pretty clear he wants to speak in front of a historically loaded location like the Brandenburg Gate to help his campaign in his own country, and I think Germany should try to stay out of other countries' election campaigns.


Aside: I can presently neither receive nor send any emails. The problem seems to have started about 12 hours ago, and it probably won't be resolved before somebody gets up on the east coast. So please don't expect me to reply to any emails I didn't get. In case it's really urgent, recall an invention named 'the phone'.

Update 4pm: Email works again.

Tuesday, July 08, 2008

Research and Teaching

A main topic in today's issue of the Frankfurter Rundschau (one of the major German newspapers) is the quality of teaching at universities. Page 3 features an interview with Thomas Metschke, who founded the page www.meinprof.de where students can rate their profs' teaching skills (it's all in German, I'm afraid).

Maybe I was particularly unlucky with my lecturers, but the classes I had to take were at best useless, at worst demotivating, if not debilitating. The quality improved considerably in the cases where a prof was spending other people's travel grants to go on vacation and his class was held by one of his students instead. I learn well from books, so for me it wasn't a huge problem, but one still had to show up and sit through all these hours while somebody mumbled at the blackboard or got confused by his own notes.

Either way, ever since then I've been wondering why people who apparently have neither the skill nor any desire to teach are forced to do so, while people who would like to teach in the first place have a hard time getting a professorship. The idea that those who teach about research should be researchers themselves has a long history in Germany, and I think this is a necessary requirement. But do they have to be active in research while teaching? In practice, it seems to me that teaching is often seen as a time-consuming duty while research is the 'real' thing - it is what brings colleagues' appreciation, possibly invitations to cool places or journalists' requests. On the other hand, people whose interest clearly is in teaching, and who have experience in research but possibly are not in the top group, have a hard time getting a position where they would teach.

Does that make any sense? It raises for me the following two questions:

A) Why aren't there more pure research positions at universities? Face it, there are people in the academic system who can be great researchers but are a complete failure with students. There are people who are complete hermits or just totally nerdy, and who efficiently radiate an aura of get-out-of-my-office. None of which necessarily makes them bad researchers, but where is the place for them?

B) Would it be possible to diversify research jobs through additional training in secondary disciplines, like e.g. teaching, public outreach, or group management? These are all skills besides the actual research activities that researchers today are expected to acquire just somehow. The advantage would be that such skills could be better documented, tasks could be assigned better, and if you were hiring somebody you'd know better what you are getting.

In fact, I think this is where the trend is going. Researchers today are expected to be all-rounders, to do everything with less and less time, under increasingly high pressure. It just doesn't work well. Universities and institutes, as well as private companies, offer an increasing number of training seminars to improve such secondary skills. I think this is a natural development towards a division of labor that the academic system will have to incorporate at some point in order to stay functional.