Wednesday, June 27, 2012

Nature = Mathematics?

On the weekend I had to spend some time at the airport. Something about airports always brings me back to the question of what reality actually is. In this case it was a long hallway down the terminal that sparked the thought; it got me thinking about perspective drawing.

I taught myself perspective drawing in 5th grade, which I recall because my friends asked me to explain how to do it, upon which I went to the library to learn it properly. I was surprised to read how late in the history of painting it was that artists got perspective right. Upon closer introspection, though, I guess I didn’t actually learn it from watching the three-dimensional world carefully, but from carefully studying images and photos that were already two-dimensional.

Example of perspective drawing
Pietro Perugino, about 1481
Source: Wikipedia Commons
There are some early examples of perspective in drawing; it was, for example, widely known that objects in the distance appear smaller. But it wasn’t until the 15th century that the geometrical methods were properly developed and widely used. I don’t think it’s a coincidence that this was shortly before the scientific revolution dramatically changed the way people understood the world.
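Out of curiosity, the geometrical rule the Renaissance painters formalized can be sketched in a few lines of code: a point at depth z is drawn scaled by d/z, where d is the distance from the eye to the picture plane. All names and numbers here are just for illustration:

```python
# One-point perspective projection: a point at depth z is scaled by d/z.
def project(x, y, z, d=1.0):
    """Project a 3D point onto a picture plane at distance d from the eye."""
    return (d * x / z, d * y / z)

# Two posts of equal height (1 unit), one twice as far away:
near = project(0.0, 1.0, 2.0)
far = project(0.0, 1.0, 4.0)
print(near[1], far[1])  # the distant post is drawn half as tall: 0.5 vs 0.25
```

This is all there is to why parallel hallway walls converge to a vanishing point: equal heights shrink in proportion to their distance.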

A painting is, in a very simple sense, a model of the world, and understanding perspective drawing must have made people realize that there is a mathematical basis of the world waiting to be recognized. If you make an accurate drawing of, say, what you see out of your window, and if you have sufficient details about what the mountains look like and where the river is, you might be able to “predict” from your drawing that there must be a tree standing over there.

I used to think of our theories as being maps, essentially, from mathematics to reality. I wrote about this earlier and will just reproduce here the accompanying diagram.


There is the world of mathematics, the eternal platonic ideal, and we take a part of it and identify it with the real world. The mathematical part we can call “the model” and “the theory” is the mechanism of identification with the real world, essentially how you compute observables and connect them to data. (I am aware that’s not how other people might be using these words, but arguing about words is pointless.)

This picture of the way we describe the world, however, raises the question of whether there is a distinction between these two areas: the question whether mathematics is as real as the computer screen you are looking at, a question that is some thousand years old, minus the computer. Note that to ask that question, I don’t have to tell you what “real” means. I am just asking if there is a difference between a mathematical object and something that you can throw at me. Max Tegmark famously does not believe there is a distinction. Most people I know believe there is.

However, it occurred to me that the mapping the image suggests is actually not what we do when we build a model or apply a theory. What we always do, instead, is map one system of the real world to some other system, the idea being that the other system is easier to understand or to use.

Think of the painting: the painting is not a mathematical object. It’s an abstraction, all right, one that can make use of mathematical tools, but it’s not in and of itself a platonic idea. The same is true for all other models that we use. A computer simulation is not a mathematical object; it is a re-building, usually also a simplification, of another part of nature that we want to compare it to. And a calculation that you do in your head is not platonic either; it’s some firing of neurons and a lot of chemical reactions, and so on. And it is, again, essentially some simulation, approximation, extrapolation of another part of nature that you want to compare it to, to the end of making a prediction, because you want to know if you got it right.

So where does that leave mathematics then? Mathematics is a tool that we use to improve on our models, it’s a technique that we force our thoughts through because it has proven to be incredibly useful. Nevertheless, the point I am trying to make is that this usefulness doesn’t mean a model actually extracts some mathematical “substance” from reality.

You may now wonder why this matters. The reason it matters to me is that, for reasons I elaborated on in this earlier post, I think the occurrence of the multiverse in its various forms is an unavoidable consequence of relying exclusively on mathematical consistency. The multiverse tells us that mathematics is not sufficient. What is, I don’t know.

The question is of course whether we can conceive of any type of model, and a theory to map it, that is not mathematical. One thing that came to my mind here is analog gravity, basically the idea to study some types of gravitational phenomena with condensed-matter or fluid analogies (thus the name), an idea that has caught on in recent years. I am not terribly excited about it because I don’t really see what we learn from it about quantum gravity. But the point is that it’s an example where you have a model (the “analogue”) that is mapped to the system you want to describe (spacetime), and the model in this case is not a mathematical structure.

Or in other words, if it should turn out that nature cannot be described by mathematics alone, this type of model could still be used.

So much for my latest thoughts on the question whether, at some point in the history of science, we will have to find a way to go beyond mathematics to make progress, and what that could possibly mean.

Friday, June 22, 2012

Catching photons

The Germans have a great history of telling tales, most of which are supposed to teach some type of lesson. One series of such tales is about the citizens of Schilda, the "Schildbürger," who in each story excel in stupidity. One of the best known stories is the construction of a new city hall. Unfortunately, the Schildbürger forget the windows, and then try to carry light into the building with buckets. 


I forgot which lesson one was supposed to learn from that, maybe that the photon number is not a conserved quantity, more likely though that you'd better not forget the windows if you build a house. I recall however that I kept bugging my poor grandmother with that story over and over again because it wasn't really clear to me exactly why one can't catch light in a box. Surely you could just put mirrors on the inside and visible light would bounce around till you let it out again?

Well, leaving aside that it's easier to bring light into a dark room by flipping a switch, the problem is that mirrors are imperfect; they don't actually always bounce back photons. The typical mirror in your bathroom, glass with aluminium coating, only reflects about 90% of the incident light in the visible spectrum, so this wouldn't help the citizens of Schilda very much. (This isn't obvious when you look into a mirror, but if you hold two mirrors opposite each other, you might notice the reflections getting weaker, and probably taking on a little blue/green tint, which comes from the glass not being perfectly transparent.)
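A rough back-of-the-envelope sketch of how quickly those buckets would go dark, assuming 90% reflectivity; the bucket size in the comment is my assumption, not from any measurement:

```python
# With reflectivity R = 0.9, each bounce keeps only 90% of the intensity.
# Count how many bounces until less than 1% of the light is left.
R = 0.9
intensity = 1.0
bounces = 0
while intensity > 0.01:
    intensity *= R
    bounces += 1
print(bounces)  # 44 bounces
# In an assumed bucket ~0.3 m deep, 44 bounces take roughly
# 44 * 0.3 m / (3e8 m/s) ~ 44 nanoseconds. Not much of a light supply.
```

So even with mirrored buckets, the Schildbürger's light would be gone in well under a microsecond.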

The Schildbürger were on my mind when I came across this paper by a group of French physicists around Serge Haroche, who is known for a series of quantum optics experiments. They developed "diamond-machined copper mirrors coated with superconducting niobium."

If you shape the mirrors suitably and arrange them opposite to each other, you can use them to capture photons. And the mind-boggling number that you should take away from here is that the typical decay time of light bouncing back and forth between these mirrors is 0.129 seconds, which corresponds to about 39,000 km of light travel between mirrors that are only about 3 cm apart (a quality factor of more than 10^10).
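If you want to check these numbers yourself, here is a quick sketch; the photon frequency in the last step is my assumption (the post doesn't state it), so treat that part as illustrative only:

```python
import math

# Numbers from the post: a photon surviving 0.129 s between mirrors 3 cm apart.
c = 299_792_458   # speed of light, m/s
tau = 0.129       # photon decay time, s
L = 0.03          # mirror separation, m

path_km = c * tau / 1000   # total distance travelled before the photon is lost
bounces = c * tau / L      # number of cavity traversals in that time
print(round(path_km))      # ~38,700 km, i.e. "about 39,000 km"
print(f"{bounces:.1e}")    # ~1.3e9 traversals

# With an assumed photon frequency of ~51 GHz (my assumption, not in the post),
# the quality factor Q = 2*pi*f*tau:
Q = 2 * math.pi * 51e9 * tau
print(f"{Q:.1e}")          # ~4e10, consistent with "more than 10^10"
```

The striking point is the bounce count: each mirror has to reflect the photon about a billion times without absorbing it.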

So they would have to run a little, the citizens of Schilda, but they might finally be able to carry light into the city hall. They would also have to cool their buckets to 0.8 K and it would only work in the infrared, but at least it's a start.

The two mirrors of the photon box at ENS
Photo Credits: Photothèque/LKB/Michel BRUNE
Image Source

Tuesday, June 19, 2012

Did the OPERA affair harm or benefit science?

Last year around this time I was working on a draft that re-investigated an old question: whether superluminal information exchange is compatible with special relativity, causality and locality. Just when I had finished the draft and sent it to a few colleagues, I read the news about the alleged superluminal neutrinos in the OPERA experiment.

Suddenly, the arxiv was flooded with papers on superluminal propagation. My draft didn't have anything to say about neutrinos, but the last thing I wanted was for it to drown in a flood of papers I was convinced would rapidly become irrelevant. So I sat on the draft, but closely watched the appearance of papers on the arxiv, and the fate of the "OPERA anomaly." Luckily, none of the papers that appeared bore any resemblance to mine.

Now that the anomaly vanished into a loose cable that will probably become a running gag in the history of science, I want to return to a question that we already discussed two years ago: Did the attention of the press on what turned out to be a mistake benefit or harm the public perception of science?

A Nature Editorial from two months ago, titled "No Shame," boldly declared everything that happened went perfectly alright:
OPERA's handling of the incident, at least publicly, was a model for how scientists should behave. Ereditato and Autiero acted responsibly when speaking publicly by sticking close to their data and avoiding over-interpretation. They shared their work with their competitors, and did their best to quickly address outside criticism. In the end, it was OPERA's internal checks that found the loose cable. When the error was discovered, physicists on the team wasted no time in publicly announcing the problem, along with others they had exposed during their review.
This elaboration however misses the point that "sharing work" doesn't exactly require holding a press conference. It would have been perfectly possible for the collaboration to share their results and troubleshoot without making such a big fuss about it. The press would probably have heard of it anyway, but the collaboration could have calmly explained to them that they were working on it.

The GEO600 collaboration, for example, when faced with their "mystery noise" did not make a secret of it either. They had information on their website and in conference proceedings, and in fact a lot of people knew about it. There were a few reports in the media, but not even NewScientist managed to create a sensation with a collaboration member who just declared that everybody expected the noise to vanish in a rather mundane explanation. Which was exactly what happened.

A lot of my colleagues think that any attention physics receives in the press is good. I don't think so. I understand that it's certainly an ego-boost if you read in the news about a topic on which you are an insider, and suddenly friends and relatives want to hear your opinion. Ah, I'm so knowledgeable, so cool, so up-to-date. But there are downsides. In my earlier post I listed three points that one should take into account:

  • First problem is that while it might draw interest in the short run, it erodes trust, as well as interest, in the long run. Science depends on accuracy more than any other field. The more often people read claims that something may have been discovered, but then it wasn't, the less attention they'll pay the next time. Quantum gravity in cosmic rays! No wait, nothing to find there. Quantum gravity in gravitational wave interferometers! No wait, nothing to find there. Quantum gravity at the LHC. Sorry, nothing there either. This erosion of trust is exactly why I have spent so much time on this blog deflating headlines.
  • Second problem is one of principle. If rumors or measurement errors are considered a useful tool to draw attention, and attention is a good thing, why not make up a few? 
  • Third problem is that these rumors tend to circle around a few presently particularly popular topics or institutions, and if they dominate the news the vast majority of topics remains uncovered. This, I think, is clearly a disadvantage to education in general and also to the way researchers perceive the relevance of their work.
After having watched the OPERA anomaly come and go, I want to add a fourth point:
  • Fourth problem: The more public attention a topic receives, the more likely scientists in the field are to jump on the bandwagon and spend time coming up with contributions, essentially wasting their time and, with it, taxpayers' money.
It is not my intention to blame anybody for anything. It is always easier to point the finger after the facts are on the table. I also do not think people who made mistakes should necessarily resign over them, or should be forced to go. In many instances it seems better to me to keep people who have learned their lesson. If anything, the collaboration should maybe rethink their decision-making procedures. Clearly, they must have thought the mistake was not on their territory, something they needed outside help with. And they reported it before they had even run all the checks they had at their disposal. That seems odd to me.

No, the reason for the post is that I think the above points should be taken into account in similar situations because it's not at all clear more attention is always better. Also, I am interested in your opinion.

Related: I saw coincidentally that Giovanni Amelino-Camelia has a paper on physics.hist-ph that discusses the relevance of the OPERA affair for the philosophy of science. I haven't read it, but if you're interested in the details, it might be worth a look.

Monday, June 18, 2012

Free Book Giveaway “How to Teach Relativity to Your Dog”

I received a 2nd free copy of Chad Orzel's delightful book "How to Teach Relativity to Your Dog" (which I reviewed here). I really don't have much use for the second copy, so I am giving it away for free! The book will go to the first reader who lets me know in the comments to this post that they are interested. (Please do not send me an email.) There is only one condition: You need to have a postal address in either Sweden or Germany, because I'm not in the mood to pay more in postage than the book is worth.

Update: The book is gone.


Thursday, June 14, 2012

Anton Zeilinger ventures into art

The "Documenta" is a major event in the German art scene. It takes place every 5 years and draws hundreds of thousands of visitors. The well-known quantum physicist Anton Zeilinger will have an exhibition at this year's event, which has just opened its doors. The German newspaper "Die Zeit" spoke to Zeilinger about it. If you don't speak German, let me translate some paragraphs for you:
Anton Zeilinger... attracts mystics like light attracts moths, "quantum healers" or "quantum doctors" refer to him. "I am sorry," says Zeilinger, "there's nothing I can do about this." At the Documenta he will try to defend his research against such interpretations.
He says about the relation between art and science:
Scientists and artists have a lot in common, says Zeilinger: "Intuition and creativity are their most important tools; it is all about new approaches to the study of reality." But there is a point where [scientists and artists] differ: science demands testability. And it lays claim to truth. "We say things about the world that are simply right." But if contradictions occur, scientists sometimes just throw out their view of the world. That was the case with quantum theory, which entirely changed physics a hundred years ago.
I'll leave that for you to comment on...

I'll be away from my desk for a while, so don't worry if you don't hear much from me for the next week or so. And if you like our blog and/or extraterrestrial planets, you can vote for us in the 3 Quarks Daily contest here.

Tuesday, June 12, 2012

The loneliness of making sense

Connect the dots.
Click to enlarge. Source.
Jonah Lehrer in his book “Imagine” introduced me to an interesting study by Beeman and Kounios. They showed, in brief, using fMRI and EEG measurements of brain activity, that we possess two different modes of problem solving: one dominated by the left half of the brain, and one dominated by the right half.

The left-brain dominated problem solving is an analytical step-by-step procedure. It goes through existing knowledge and applies known methods to a problem. Candidates solving a problem by this method usually have a feeling of making progress, of getting closer to a solution.

The right-brain dominated problem solving relies on pattern recognition and associations. It often kicks in after the left-brain method has failed. Candidates solving a problem by this method have no feeling of making progress till they suddenly come up with a solution, often accompanied by an “aha moment.” (Another study showed that brain activity indicates a solution has been found before people become consciously aware of it. The “aha” is produced just above your right ear.)

The problems that were used in this study, verbal puzzles and trick questions and so on, are highly artificial. In real life, most problems require a mixture of both approaches, though some rely more heavily on one or the other. If you are for example adding up prices while shopping, that’s a very straightforward left-brain problem. Figuring out how to fit the twin stroller and two baby seats plus two adults into a Renault Twingo (clearly a misnomer) would take forever if you indeed went through all possible options. Now visualize the space, or lack thereof, take off the stroller’s wheels and, aha, the trunk will close. I was very proud of my right brain.

But I am usually addressing problems exactly the way described in Lehrer’s book: First, search through existing knowledge and see if a method is known to solve the problem. If that doesn’t work, I have an intermediate step in which I try to come up with knowledge about where the solution can be found. If that fails too, I’ll try to match the problem to other problems that I know, simplify it, look at limiting cases, rewrite it, iterate, take off the wheels, and so on and so forth.

By and large my pattern-searching mechanism seems to be somewhat overactive. It frequently spits out more associations than I’d want, resulting in what psychologists call divergent thinking. That, I’m afraid, is very noticeable if you talk to me, as I have the habit of changing topic in the middle of a sentence, making several loops and detours before coming back, if I come back at all. Needless to say, this makes perfect sense to me. In my experience (watch out, anecdotal evidence) most women have no problem following me. Most men get glassy eyes and either interrupt me, or patiently wait till I’ve made my loops and detours. I know at least one exception to this and, yes, I’m talking about you. So you should have no problem following the connections I’m about to draw from Lehrer’s book to some other books I’ve read recently.

Michael Nielsen in his book “Reinventing Discovery” preaches that scientists should not only share their knowledge, but also share ideas that are still under construction. In essence, his point is that our present knowledge is badly connected and has lots of unused potential. You might have exactly the piece of knowledge that I am missing, but how will you know if I don’t tell you what I’m looking for? There are some prominent examples where this crowd-sourcing for knowledge matches has been successful; the Polymath project is often named.

Reading this, introspection reveals that I rarely, if ever, blog about research I am working on. It’s not so much that I don’t want to, but that I can’t. I talk of course to my colleagues about what I am working on, people I have known and who have known me for a while. But they usually can’t make much sense of what I’m telling them. Heck, even my husband usually has no clue what I’m trying to say - till he has a finished paper in his hand, that is. I mostly talk to them just for the merit of talking, and they know pretty well that their role is primarily to listen. I know this procedure from both sides; it’s quite common and clearly serves a purpose. But that purpose isn't sharing, it's improving the pattern seeking by bouncing loose connections off other people’s furrowed foreheads.

Most often the problem I’m plaguing myself with is not finding the answer to a specific question, but finding a useful way to ask a question to begin with. And the way it feels to me, that’s mostly a right-brain task, a pattern-seeking, sense-making effort; a searching through the bits and pieces from papers and seminars, a matching and mixing, a blending and crossing. Once you have a concrete question, you can get out the toolbox and nail it to the wall, left-brain dominated.

Science needs both finding a question and finding an answer to it; one can debate to what extent each matters. But these two types of problems don’t communicate the same way. In fact, Sunstein in his book “Infotopia” points out that for crowd-sourcing to work well, it is very important to have a well-posed question, the solution to which, if found, everybody will be able to agree on.

So I am thinking, there are problems we are plaguing ourselves with that we just can’t talk about. They are lonely problems.

Another connection I want to draw is to Michael Chorost’s book “World Wide Mind,” because that piece of information from Lehrer’s book made me realize just why I was so skeptical of the brain-to-brain communication method that is Chorost’s vision for the future.

Chorost suggests in his book recording each person’s pattern of neuronal activity for certain impressions, sights, smells, views, words, emotions and so on, which he calls “cliques.” An implant in your brain will pick up your neural activity, decode it into cliques, and transmit them to somebody else’s implant, which triggers the same cliques in that person’s brain, where they might have a different neuronal representation. That is, the cliques are essentially the basic units of brain-to-brain communication.
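Just to make the scheme concrete, here is a toy sketch of such a clique translation; the patterns, names and vocabulary are entirely made up:

```python
# Toy version of Chorost's "clique" scheme: each brain has its own neural
# encoding of a shared clique vocabulary, and implants translate
# sender activity -> clique -> receiver activity. Patterns are fictitious.
alice_encoding = {"coffee-smell": (1, 0, 1), "dog": (0, 1, 1)}
bob_encoding   = {"coffee-smell": (1, 1, 0), "dog": (0, 0, 1)}

def transmit(activity, sender, receiver):
    """Decode the sender's pattern into a clique, re-encode it for the receiver."""
    clique = next(k for k, v in sender.items() if v == activity)
    return receiver[clique]

# Alice's pattern for "coffee-smell" arrives as Bob's own pattern for it:
print(transmit((1, 0, 1), alice_encoding, bob_encoding))  # (1, 1, 0)
```

Note that only entries of the shared vocabulary can be transmitted; anything outside the agreed clique dictionary has no translation, which is exactly the limitation discussed next.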

But what you cannot communicate this way is your brain’s attempt to find patterns in all the cliques. Neither can you, by this method, ever try to find patterns in other people’s cliques. Or, in the words that I used in my earlier post on “collective intelligence,” these are not examples of type-2 collective intelligence, the type in which the intelligence of the collective is not due to a shared and well-connected pool of knowledge, but to shared processes acting on that knowledge.

Finally, let us revisit an argument from Mark Pagel that we discussed recently. Pagel believes, in a nutshell, that we need fewer and fewer inventors because we are constantly improving the way we share ideas. The better we share the ideas we have, the fewer people we need to produce them. But what do we do with the part of the idea seeking that’s unshareable, at least for now? The better we share ideas, the fewer similar ideas we need, but that leaves open the question of how many people are needed to produce one shareable idea. And, distinguishing the two types of problem solving that we use, sharing doesn’t cut down the amount of necessary pattern seeking per idea. Sharing can improve the repository you search through, but taking into account that the problems are getting more involved too, it is far from clear that we will need fewer people per idea.

If you’ve been following along all the way till here, thank you for your patience. If not, good to see you again in the last paragraph; either way, let me come to the conclusion now. As I argued above, improvements in sharing and connecting ideas don’t work equally well for all types of thought processes. This bears the risk of smothering the lonely, unshareable, sense-making right-brain efforts. Much like in a forest with two types of trees where a fertilizer benefits only one type, the shade of the larger trees can cut off sunlight to the smaller ones. So I hope your lonely thoughts receive sufficient sunlight, and may you have many aha moments right above your ear.

Sunday, June 10, 2012

New haircut!

You might or might not have noticed that some time during the last year I cut off my hair, having gotten tired of the babies pulling on it. A year or so later I finally went to get a proper hairdo. The first time since, uhm, 1989? Or so.

I don't like strangers with scissors near my ears. This haircut was particularly traumatic because the woman took off my glasses, washed my hair, and when I looked into the mirror I saw - my brother! I mean, when I was a kid people frequently mistook me for my younger brother, but I guess I didn't believe them that we look "soo similar" till yesterday.

In any case, he has a beard now, and is a foot or so taller than me. So here's my haircut.

 

I've then also finally updated my profile pictures to short hair. Well, at least some of them. Google+'s ingenious software had the following to say about my new profile picture: "Are you sure people will recognize you in this photo? It doesn't seem to have a face in it. Upload a different photo. Dismiss." Okay, dismiss. So I don't seem to have a face. Or rather, my brother doesn't have a face.

Luckily I find the photo looks sufficiently like the one in the blog header to justify not updating the latter. If the haircut doesn't look like much of a haircut, well, that's why I usually don't bother paying money for one and cut my hair myself. My hair is fundamentally messy, and the exact way it's cut doesn't make much of a difference.

Either way, I'm posting this photo so you'll recognize me should we meet! I'm always interested in getting to know some of our readers, so should our paths cross some day, don't hesitate to say hello.

Friday, June 08, 2012

Also against measure

The German Physical Society (Deutsche Physikalische Gesellschaft, DPG) has a new president, Prof. Dr. Johanna Stachel from the University of Heidelberg, a member of the ALICE collaboration. In her inaugural speech, she addressed the spread of measures for scientific success, which we discussed previously in my post "Against measure." Here is my (rough) translation of the relevant part of her speech:
[I] want to address a point that I am personally concerned about. Scientific discoveries and breakthroughs are made by individuals. For this they need freedom, an atmosphere of congeniality, also luck. On the opposite side are - a rapidly increasing number of - programs that want to be measured by success, that want to measure success. A catalog of actions to measure quality is rolling towards us (and over us?). We all know performance-oriented assignment of grants, agreements on goals, judges for quality...


Whom do they serve? Sure, by this one can increase quantitative indicators for quality, like the number of publications or grants, much like the milk output of a cow, we see this already. But is this the atmosphere that supports a scientist to enter new terrain? The result is unclear, full of set-backs.


In a recently published book about Bell Labs (by Jon Gertner), one can read: "in innovation as in hitting home runs in baseball you have to be willing to strike out a lot to be successful."
The full speech (in German) is printed in the June 2012 issue of the membership magazine of the DPG (and is not open access). The president of the European Research Council recently expressed a similar sentiment.

Tuesday, June 05, 2012

Scientific Publishing - Investing in our Future

During the last years, I have developed the distinct feeling that my opinion on the promises of open access differs from what the majority of bloggers preach: that open access is an end to be met by all means, preferably over the dead bodies of established publishers who are ripping us off - us, the scientists, as well as us, the public. Scientific publishers, so the story goes, are making huge profits by enforcing high subscription fees for access to research results that were primarily tax funded to begin with. They have no right to deny public access to their journals.

I have splattered my opinion on this around in the comments, here and elsewhere, but thought it would be good to collect them. I am hoping for a fruitful discussion.

I'm all for open access - in principle. I have no library access at home, and it sucks if a paper isn't on the arxiv. I am not, however, in favor of making open access mandatory as a means to enforce change, not at this point. I am concerned that the drawbacks for research would be larger than the advantages.

I'm all for open access in principle - but in practice many efforts I have seen, for example the recent call for an Elsevier boycott, do not address important questions. Michael Nielsen, in his (very recommendable) book "Reinventing Discovery" also supports a top-down approach in which funding agencies require open access to research results published under their grants. I do not support this because I find it to be a well-meant, but short-sighted procedure.

In a recent essay, H. Frederick Dylla, Executive Director and CEO of the AIP, emphasized the importance of creative destruction for progress. To improve on knowledge discovery and dissemination, we have to allow new technologies to supersede old ones, even if that means bankruptcy for some. The example Dylla calls upon is electricity putting people in the candle and oil-lamp industries out of work.

I agree, but I would like to put the emphasis on the adjective "creative" before the destruction. Making open access mandatory in a top-down approach now is like outlawing candles and oil-lamps before households have electricity. Yes, we have open access journals already, but we're nowhere close to their being able to deliver and replace all the services we presently enjoy, in all the fields in which we enjoy them, in all the quality with which we enjoy them, if we enjoy them. And boycotting established publishers just punishes those who have served our community for centuries, and have served us well.

There seem to be many who believe that mandatory open access will have only benefits for all, except publishers who will be taught a lesson. The evil publishers will be forced to reduce fees, and to shrink profits to a reasonable level. In the end, we will have a system that provides the same service equally good or better, just at lower cost. A no-brainer, so better get a brain.

It is difficult to tell what would happen in fact. A recent survey among libraries found that, if a universal open-access mandate were introduced with an embargo period of six months, this would lead 10 per cent of libraries to cancel all their subscriptions to scientific journals, and about half of them to at least cancel some. (PDF here, see also THE for a summary.) This report was commissioned by the Publishers Association, which should make us cautious about the finding, but is no reason to dismiss it outright. Note that the numbers extracted from the replies probably underestimate the actual effect, because once the option exists, pressure to cancel subscriptions will mount.

Another study by the PEER Project found that openly accessible self-archiving by researchers or universities would not have such a drastic effect. I suspect the difference between the studies is one of expectation what the archiving would look like. The study by the Publishers Association asked specifically "If the (majority of) content of research journals was freely available within 6 months of publication, would you continue to subscribe?" which suggests that what is freely available is access to the published paper on the journal homepage itself. Self-archiving on the other hand seems to me to suggest alternative online deposits, which are of very limited use for reasons of archiving, searching, filtering, tagging, referencing and so on.

Thus, the findings of the Publishers Association, that libraries would dramatically cut back on their journal subscriptions should open access become mandatory, are plausibly correct. This in turn would lead some publishers to go bankrupt or at least dramatically drop their subscription fees. And that is, after all, what many in the open access movement are hoping for.

So, having seen that this is where we might be going, let us ask what the risks are. For that, let us get back to the claim that publishers are making undue profits, and have a look at how the system is presently organized.


Research and development, and knowledge discovery in general, is essential to innovation. It is however a process that runs on a very long time scale, too long for it to work well in a purely capitalistic system: There isn't enough tangible outcome in the short run. Thus, literally all developed nations fund academic research publicly. Private funding exists, but it is more the exception than the norm.

Now this mostly publicly funded research needs tools to structure, filter and archive the produced knowledge. For that we have historically used commercial publishers, who work in competition with each other, much like in a free market, but serve almost exclusively the research communities. (Many publishers have popular science offerings too, but the bulk of the money comes from journal subscriptions.)

Are these subscriptions overpriced? Are publishers making too much profit? That is not an easy question to answer. Basically, this claim means there is something wrong with the competition, that something is not working in the market. Maybe it is the case that publishers make too much profit, I don't know. But I think what also plays a role in this perception of excessive profits is a misunderstanding of the function of scientific publishing. The publishers themselves are trying to optimize profit, all right, after all that's how capitalism works. But the receiving end, the scientists or, in practice, the libraries, are not optimizing profit. Or at least they shouldn't be. Their purpose is to do the long-term thinking; they have to think about how people tomorrow will be able to access and understand the knowledge generated today and yesterday. What do we need for that? What serves this goal?

Now look: There is an obvious tension here as financial pressure rises. Libraries come under pressure to cut subscriptions. They will cut first where there is the least resistance. That means presently unpopular and small fields. How do we know we will not regret this in a decade or two? We don't. It's a risk for knowledge discovery. Is it worth the risk? We don't know. That's bad enough already, but now ask what the publishers do. They bundle the non-popular content with the popular content in unpopular package deals. Why? Because otherwise they'd have to cut the content that's become non-popular, and then it would be gone for good. Now the libraries complain because the publishers carry on publishing and selling research the libraries don't want to buy any more.

Wait - that sounds like publishers are the ones holding up the torch, being concerned about the future of knowledge discovery while the scientists are the ones trying to optimize their profit. Odd, no? You shouldn't believe this any more than you should believe that publishers are evil bloodsuckers.

And now, in this situation already under tension, the open access movement wants to increase pressure on the publishers, believing that we'll end up with the same service and quality at lower cost. No way, I say. What's more likely to happen is that some smaller publishers go bankrupt or are bought by larger ones. And the larger ones will start throwing out what's least profitable, catering to what seems to be the demand of the day. It's not their responsibility to do the long-term thinking. You can't blame a for-profit organization for wanting to make profit. Except, come to think of it, that's what the people signing the Elsevier boycott seem to be doing.

I personally am particularly skeptical about the quality of archiving that relatively inexpensive repositories can provide, which is why I like my papers to be published in print. I simply don't trust the presently available digital archiving systems, neither software nor hardware. Look, I've grown up, basically, among Roman ruins. It wasn't all that uncommon for the farmers in our neighborhood to find relics of Roman dishware or jewelry (right next to the WWII bombs, that is). I'm thinking in thousands of years when I say I want knowledge to be preserved. Empires come and go. Make sure the knowledge stays. Which open access journal do you trust?

Another relevant point is aggregation and searchability: The more information we have, the more important it becomes that content is suitably filtered, tagged, searchable, classified and maintained, even after the original editing is done. All these services cost money and are unlikely to work well in a low-cost, scattered self-archiving landscape. Are you sure the services we presently have will continue or improve? Capitalistic considerations, as we have already noted, do not serve innovation well in the long run. So why do we apply this argument to scientific publishing now?

The problem, in my eyes, is how we think about scientific publishing. For many researchers it has become primarily a cost factor, one that is presently hard to circumvent. But for me, it's an investment. It's an investment in our future, like research itself. And that investment, long-term like the investment in research, isn't one that should be decided on monetary arguments alone.

This role of scientific publishing is however obscured by the present funding practice, in which journals are a cost factor for libraries rather than, well, an investment by our societies. But if published scientific research is widely considered a service that should benefit the public, then the publishing too should be funded like a public service. So let publishers elaborate on their contribution to knowledge preservation and have them write proposals for funding. Let library committees judge the merit and promise of these proposals and distribute grants. This would move the emphasis from destruction to creation, and it would put into place criteria that benefit research rather than primarily reduce cost. And it would almost certainly move us towards open access.

We should come to see scientific publishers as our allies rather than our enemies. After all, that's what they've been for centuries.



Appendix:

Scientific publishing serves many purposes: There is peer review, editing in its various forms, filtering and classifying, sorting and referencing, archiving, maintaining and distributing. Each of these services takes time and effort. They all have a monetary value. Here are some numbers, quoted from this document:

"
  • Peer review has real costs and there are no economies of scale. Average cost $250 per manuscript for salary and fees only, excludes overheads - infrastructure, systems etc. (heavily affected by rejection rates)
  • Excluding peer review, average production cost ranges from $170 to over $400 per article (again excluding all overheads)
  • Annual publisher platform maintenance costs ranges from $170k to $400k (excludes set up and development costs typically costing hundreds of thousands of dollars)"
The cost per paper is not negligible, and somebody has to pay it. Even the arXiv, one of the best known, most widely used and accepted repositories of papers, has been struggling to run sustainably, and only just about manages to. (For more on the arXiv budget, read this.) And it is a repository only, albeit with some moderation. No peer review, no editing, no print, no press, no whistles and no bells.
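To get a feel for what the quoted figures imply, here is a back-of-envelope calculation. The journal volume of 1,000 published articles per year is a hypothetical assumption for illustration; the per-article and platform figures are the ranges quoted above.

```python
# Rough annual cost estimate for a hypothetical journal publishing
# 1000 articles per year, using the ranges quoted in the appendix.
# All figures exclude overheads, as the source document notes.

articles_per_year = 1000                 # hypothetical assumption
peer_review_per_article = 250            # average salary/fees per manuscript
production_per_article = (170, 400)      # per-article production cost range
platform_per_year = (170_000, 400_000)   # annual platform maintenance range

low = articles_per_year * (peer_review_per_article + production_per_article[0]) \
      + platform_per_year[0]
high = articles_per_year * (peer_review_per_article + production_per_article[1]) \
       + platform_per_year[1]

print(f"Estimated annual cost: ${low:,} to ${high:,}")
# → Estimated annual cost: $590,000 to $1,050,000
```

Even at the low end, that is more than half a million dollars a year for a single mid-sized journal, before any overheads, which gives a sense of why "just self-archive everything" is not obviously cheaper once the same services are included.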

Disclaimer:

My husband works for a large scientific publisher, Springer. He does not however work in journal publishing, but for a scientific database in print and software. He also hasn't read what I just wrote. Now time to hit the publish button.

Sunday, June 03, 2012

Interna

Stefan brought home a cold from work. My immune system wasn't very impressed, but the girls went down with a fever. During the night they woke up every other hour, crying because their noses were clogged. After a few such nights, we're all cranky. I'm thinking the English word "cranky" might have the same root as the German word "krank," which means "sick." Luckily, the fever is gone now and we're on the way to recovery.

Lara and Gloria have enriched their vocabulary with the word "baby." A baby might be anything from small children to dolls, plush toys, and pets. Gloria has developed an amusing body language in which she'll slap her palms to the sides of her head when she's uncertain what to do next. The other day, we were at the playground and a girl, somewhat older than ours, insisted on greeting Gloria with a series of hellos. When no reply came, the girl hugged Gloria and kissed her on the cheek, which sent our blondie running away slapping her head.


We're still trying to teach them to eat with a spoon. They've understood that the spoon goes into the food first and then into the mouth, but the finer details of spoon orientation and aiming are still somewhat rough.

The girls are enjoying the summer, which means time out in the green with many things to see. For me it means recurring child-themed conversations with strangers. A common topic is the complaint of women my mother's age that their own offspring haven't yet shown any intention of producing grandchildren. I find it a very awkward conversation to have with a stranger; I don't even know their children, so what can I possibly say? I usually settle on some vaguely sympathetic sounds and nodding.

Besides this, everything is moving forward and onward, slowly but persistently. The workshop I'm organizing is taking shape, with the schedule about to be made next week. I'm preparing two talks for the Marcel Grossmann meeting next month. I'm thinking of writing a larger grant application for the ERC later this year, which will need some preparation time. To accompany my efforts in songwriting, I bought a book on music theory, "Harmony and Voice Leading" (don't freak out over the price tag, I bought it used for $15), from which I learned words like "diminished 7th" and which gave completely new meaning to "scalar motion." I'm struggling with the dissonances, though; they somehow refuse to make sense. I'm thinking I'll have to actually get some classical sheet music.