- Transparency in experimental methodology, observation, and collection of data.
- Public availability and reusability of scientific data.
- Public accessibility and transparency of scientific communication.
- Using web-based tools to facilitate scientific collaboration.
These seem to be rather experimentally focussed, so let me add some words from the perspective of a theorist. Since I just finished reading Surowiecki's "The Wisdom of Crowds" (see review), I now feel better equipped to get across something I already said in my post We Are Einstein, so let me quote myself and then explain:
"[A]n environment with a very high interaction rate thermalizes quickly, and can be very destructive in the early stage of an idea's development. A highly connected community means we’ll have to watch out very carefully for sociological phenomena that might affect objectivity, and work towards premature consensus. We will have to watch out for fads that grow out of proportion, and we will have to find a way to protect the young ideas that “you have to ram down people's throats,” in Atkin's words, until people are ready to swallow them. There is no reason to assume scientists are immune to sociological effects."With the wisdom I gathered out of Surowiecki's book, the point I was trying to make is that sharing too much information and being too tightly connected will actually lead to a dumb rather than a smart community.
Yes, that is right. What I am saying is that all the sharing and openness can actually harm progress. In fact, I think we already share way too much information too prematurely. The reason is that scientists too are only human. If we hear some colleagues talk who are genuinely excited about a topic, chances are we'll get interested. If we bounce an idea at an early stage off a lot of people, it will lose its edges because we'll try to make it fit. If we hear something repeatedly, we are likely to think it's of some relevance. If we know the opinions of other people, in particular people with a higher social status or more experience, we'll try to fit in. That's what humans do. That's why crowds make dumb decisions. That's how groupthink starts, that's where herding comes from, that's how hypes and bubbles are created. As Surowiecki points out, independence during the opinion-making process is essential for an outcome that reflects all the wisdom present in the crowd.
Of course nothing of that applies to you, the superior and entirely rational scientist, because you are different. Funny though that study after study shows scientists are just like all other people.
Dan writes in his post that he wants the incentive structure to be changed such that it supports openness. By that he means "Work. Finish. Publish. Release." Again, this seems specific to experiment (a theory is released when it's published). I do of course agree on the goal, but not on the means. I am generally suspicious about any "incentives" that are supposed to push scientists into doing something they wouldn't voluntarily do. We do have such incentives today. And they are counterproductive. I don't want them to be replaced with other incentives that somebody cooked up on his blog and that will likely turn out to be equally counterproductive, though for other reasons. That's why I say the only thing we have to rely on is our own judgement, and what we should be doing is to avoid any distortion of the opinion-making process. And for that, we should be paying attention to what advice our colleagues from psychology and sociology have to offer.
Sometimes when I hear Science 2.0 fans fantasize about the brave new world they want to create, one in which every scientist throws his thoughts into a vast global pool of knowledge and thousands of colleagues contribute and advise, I get really scared. For all we can tell from current knowledge, the result will be a combination of streamlining and self-supporting fads. What scientists really need is more time and more freedom to play with their ideas without pressure to fit in, to publish, to make up their minds.
Thus, my bottom line is always the same: You can dream up any 2.0 utopia you want. But in reality it will be populated with imperfect, irrational humans. If you don't take into account well-studied sociological and psychological effects, your utopia will be a dystopia. Science can be too open.