Children aren’t saints. We’re born mistrusting people who look different from us, and we treat those who look like us better. Toddlers already show this “in-group bias,” research says. Though I have to admit that, as a physicist, I am generally not impressed by what psychologists consider statistically significant, and I acknowledge it is hard to distinguish nature from nurture. But that a preference for people of similar appearance should be a result of evolution isn’t so surprising. We are more supportive of those we share genes with, family above all, and looks are a giveaway.
As we grow up, we should become aware that our bias is both unnecessary and unfair, and take measures to prevent it from being institutionalized. But since we are born extra suspicious of anybody not from our own clan, it takes conscious educational effort to act against the preference we give to people “like us.” Racist thoughts do not go away by themselves, though one can work to address them – or at least I hope so. But it starts with recognizing that one is biased to begin with. And that’s why this photo bothers me. Denying a problem rarely helps to solve it.
By the same romantic reasoning I often read that infants are all little scientists, and that it’s only our terrible school education that kills curiosity and prevents adults from still thinking scientifically. That is wrong too. Yes, we are born curious, and as children we learn a lot by trial and error. Ask my daughter, who recently learned to make rainbows with the water sprinkler, mostly without soaking herself. But our brains didn’t develop to serve science; they developed to serve ourselves in the first place.
My daughters, for example, haven’t yet learned to question authority. What mommy says is true, period. When the girls were beginning to walk I told them to never, ever touch the stove when I’m in the kitchen, because it’s hot and it hurts and don’t, just don’t. They took this so seriously that for years they were afraid to come anywhere near the stove at any time. Yes, good for them. But if I had told them rainbows are made by garden fairies they’d have believed that too. And to be honest, the stove isn’t hot all that often in our household. Still today much of my daughters’ reasoning begins with “mommy says.” Sooner or later they will move beyond M-theory, or so I hope, but trust in authorities is a cognitive bias that remains with us through adulthood. I have it. You have it. It doesn’t go away by denying it.
Let me be clear that human cognitive biases aren’t generally a bad thing. Most of them developed because they are, or at least have been, of advantage to us. We are, for example, more likely to put forward opinions that we believe will be well received by others. This “social desirability bias” is a side effect of our need to fit into a group for survival. You don’t tell the tribal chief his tent stinks if a dozen fellows with spears are standing at his back. How smart of you. But while opportunism might benefit our survival, it rarely benefits knowledge discovery.
It is because of our cognitive shortcomings that scientists have put into place many checks and methods designed to prevent us from lying to ourselves. Experimental groups, for example, go to great lengths to prevent bias in data analysis. If your experimental data are questionnaire replies, then that’s that, but in physics the data aren’t normally so self-revealing. They have to be processed suitably and analyzed with numerical tools to arrive at useful results. Data have to be binned, cuts have to be made, background has to be subtracted.
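To make these steps concrete, here is a minimal toy sketch in Python with NumPy. All numbers and the flat-background model are invented for illustration; this is not any actual experiment’s pipeline, just the bare sequence of cut, binning, and background subtraction:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "events": a flat background plus a small Gaussian bump near x = 25
# (all numbers invented for illustration).
background = rng.uniform(0.0, 100.0, size=10_000)
signal = rng.normal(25.0, 1.5, size=150)
events = np.concatenate([background, signal])

# Cut: keep only events in the region of interest.
selected = events[(events > 10) & (events < 40)]

# Bin: histogram the surviving events in 1-unit-wide bins.
bin_edges = np.arange(10.0, 41.0, 1.0)
counts, _ = np.histogram(selected, bins=bin_edges)

# Background subtraction: estimate the flat background per bin from
# sidebands away from the expected bump, then subtract it everywhere.
left_edges = bin_edges[:-1]
sideband = (left_edges < 20) | (left_edges >= 30)
bkg_per_bin = counts[sideband].mean()
residual = counts - bkg_per_bin

print(f"Largest excess after background subtraction: {residual.max():.0f} events")
```

The point is only that each of these choices – the cut window, the bin width, the sideband definition – is a decision the analyst makes, and each is a place where bias can creep in.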
There are usually many different ways to process the data, and the more ways you try, the more likely you are to find one that delivers an interesting result, just by coincidence. It is pretty much impossible to correct for having tried different methods, because one doesn’t know how strongly these methods are correlated. So, to prevent themselves from inadvertently running multiple searches for a signal that isn’t there, many experimental collaborations agree on a method for data analysis before the data is in, and then proceed according to plan.
(Of course if the data are made public this won’t prevent other people from reanalyzing the same numbers over and over again. And every once in a while they’ll find some signal whose statistical significance they overestimate because they’re not accounting, can’t account, for all the failed trials. Thus all the CMB anomalies.)
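For a rough illustration of how these unaccounted-for trials inflate significance, here is another small toy simulation – again with invented numbers, a flat background, and a deliberately crude bin-wise z-score, nothing resembling a real reanalysis. The data are pure noise, yet the best result out of a few dozen after-the-fact analysis variants typically looks noticeably more “significant” than the single pre-registered analysis:

```python
import numpy as np

rng = np.random.default_rng(7)

# One fixed dataset of pure, flat background: 5000 event positions, no signal anywhere.
events = rng.uniform(0.0, 100.0, size=5000)

def max_bin_excess(events, lo, hi, bin_width):
    """Largest bin-wise z-score over a histogram of the selected events,
    assuming a flat background (a deliberately crude significance estimate)."""
    selected = events[(events >= lo) & (events < hi)]
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, _ = np.histogram(selected, bins=edges)
    expected = len(selected) / (len(edges) - 1)
    return ((counts - expected) / np.sqrt(expected)).max()

# A single, pre-registered analysis: one cut window, one binning, decided in advance.
single = max_bin_excess(events, 0, 100, 2.0)

# Many after-the-fact analysis variants on the *same* data:
# different cut windows and different bin widths.
variants = [max_bin_excess(events, lo, lo + span, width)
            for lo in range(0, 60, 10)
            for span in (20, 40)
            for width in (0.5, 1.0, 2.0, 5.0)]

print(f"pre-registered analysis:  {single:.1f} sigma")
print(f"best of {len(variants)} variants: {max(variants):.1f} sigma")
```

None of these variants is individually wrong; the trouble is that quoting only the best one, without accounting for all the others that were tried, overstates the evidence.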
In science as in everyday life, though, the major problems are the biases we do not account for. Confirmation bias is probably the most prevalent one. If you search the literature for support of your argument, there it is. If you try to avoid that person who asked a nasty question during your seminar, there it is. If you just know you’re right, there it is.
Even though it often isn’t explicitly taught to students, everyone who has succeeded in making a career in research has learned to work against their own confirmation bias. Failing to list contradicting evidence or the shortcomings of one’s own ideas is the easiest way to spot a pseudoscientist. A scientist’s best friend is their inner voice saying: “You are wrong. You are wrong, wrong, W.R.O.N.G.” Try to prove yourself wrong. Then try it again. Try to find someone willing to tell you why you are wrong. Listen. Learn. Look for literature that explains why you are wrong. Then go back to your idea. That’s the way science operates. It’s not the way humans normally operate.
(And in case you want to go meta on me, the title of this post is of course also wrong. We are scientists in some regards but not in others. We like to construct new theories, but we don’t like being proved wrong.)

But there are other cognitive and social biases that affect science which are not as well known and accounted for as confirmation bias. “Motivated cognition” (aka “wishful thinking”) is one of them. It makes you believe positive outcomes are more likely than they really are. Do you recall them saying the LHC would find evidence for physics beyond the standard model? Oh, they are still saying it will?
Then there is the “sunk cost fallacy”: the more time and effort you’ve spent on SUSY, the less likely you are to call it quits, even though the odds look worse and worse. I had a case of that when I refused to sign up for the Scandinavian Airlines (SAS) frequent flyer program after I realized that I’d be a gold member by now had I done this six years ago.
I already mentioned the social desirability bias that discourages us from speaking unwelcome truths, but there are other social biases that you can see in action in science.
The “false consensus effect” is one of them. We tend to overestimate how much and how many other people agree with us. Certainly nobody can disagree that string theory is the correct theory of quantum gravity. Right. Or, as Joseph Lykken and Maria Spiropulu put it:
“It is not an exaggeration to say that most of the world’s particle physicists believe that supersymmetry must be true.” (Their emphasis.)

The “halo effect” is the reason we pay more attention to literally every piece of crap a Nobel Prize winner utters. The above-mentioned “in-group bias” is what makes us think researchers in our own field are more intelligent than others. It’s the way people end up studying psychology because they were too stupid for physics. The “shared information bias” is the one that makes us discuss the same “known problems” over and over and over again while failing to pay attention to new information held by only a few people.
One of the most problematic distortions in science is that we consider a fact more likely the more often we have heard of it, a tendency known as the “attentional bias” or the “mere exposure effect.” Oh, and then there is the mother of all biases, the “bias blind spot,” the insistence that we certainly are not biased.
We have always had cognitive biases, of course. Science has progressed regardless, so why should we start paying attention now? (Btw, that’s called the “status quo bias.”) We should pay attention now because shortcomings in argumentation become more relevant the more we rely on logical reasoning detached from experimental guidance. This is a problem that affects some areas of theoretical physics more than any other field of science.
The more prevalent problem, though, is the social biases whose effects become more pronounced the larger the groups are, the more tightly they are connected, and the more information is shared. This is why these biases are so much more relevant today than they were a century ago, or even two decades ago.
You can see these problems in pretty much all areas of science. Everybody seems to be thinking and talking about the same things. We’re not able to leave behind research directions that turn out to be fruitless, we’re bad at integrating new information, and we don’t criticize our colleagues’ ideas because we are afraid of becoming “socially undesirable” for mentioning the tent’s stink. We disregard ideas off the mainstream because they come from people “not like us.” And we insist our behavior is good scientific conduct, based purely on our unbiased judgement, because we cannot possibly be influenced by social and psychological effects, no matter how well established those effects are.
These are behaviors we have developed not because they are stupid, but because they are, or have been, beneficial in some situations. In others, though, they become a hurdle to progress. We weren’t born to be objective and rational. Being a good scientist requires constant self-monitoring and learning about the ways we fool ourselves. Denying the problem doesn’t solve it.
What I really wanted to say is that I’ve finally signed up for the SAS frequent flyer program.