Thursday, August 09, 2012

Book review: “Thinking, Fast and Slow” by Daniel Kahneman

Thinking, Fast and Slow
By Daniel Kahneman
Farrar, Straus and Giroux (October 25, 2011)

I am always on the lookout for ways to improve my scientific thinking. That’s why I have an interest in the areas of sociology concerned with group decision making and how it influences the individual. And this is also why I have an interest in cognitive biases: intuitive judgments that we make without even noticing, judgments that serve us just fine most of the time but can be scientifically fallacious. Daniel Kahneman’s book “Thinking, Fast and Slow” is an excellent introduction to the topic.

Kahneman, winner of the Nobel Memorial Prize in Economics in 2002, focuses mostly on his own work, but that covers a lot of ground. He starts by distinguishing between two different modes in which we make decisions: a fast, intuitive one and a slow, more deliberate one. Then he explains how fast intuitions lead us astray in certain circumstances.

The human brain does not make very accurate statistical computations without deliberate effort. But often we don’t make such an effort. Instead, we use shortcuts. We substitute questions, extrapolate from available memories, and try to construct plausible and coherent stories. We tend to underestimate uncertainty, are influenced by the way questions are framed, and our intuition is skewed by irrelevant details.

Kahneman quotes and summarizes a large number of studies, in most cases with sample questions. He offers explanations for the results where available, and also points out the limits of present understanding. In the later parts of the book he elaborates on the relevance of these findings about the way humans make decisions for economics. While I had previously come across a large part of the studies that he summarizes in the early chapters, their relation to economics had not been very clear to me, and I found this part enlightening. I now understand the problems I’ve had trying to tell economists that humans do have inconsistent preferences.

The book introduces a lot of terminology, and at the end of each chapter the reader finds a few examples of how to use it in everyday situations. “He likes the project, so he thinks its costs are low and its benefits are high. Nice example of the affect heuristic.” “We are making an additional investment because we do not want to admit failure. This is an instance of the sunk-cost fallacy.” Initially, I found these examples somewhat awkward. But awkward or not, they serve the purpose of putting the terminology in context very well.

The book is well written, reads smoothly, is well organized, and thoroughly referenced. As a bonus, the appendix contains reprints of Kahneman’s two most influential papers, which contain somewhat more detail than the summaries in the text. He weaves in the story of his own research projects and how they came into being, which I found a little tiresome after he elaborated on the third dramatic insight he had about his own cognitive biases. Or maybe I’m just jealous because a Nobel Prize-winning insight in theoretical physics isn’t going to come about that way.

I have found this book very useful in my effort to understand myself and the world around me. I have only two complaints. One is that, despite all the talk about the relevance of proper statistics, Kahneman does not mention the statistical significance of any of the results he discusses. Now, this is all research that started two or three decades ago, so I have little doubt that the effects he talks about are by now well established, and, hey, he got a Nobel Prize after all. Yet if it weren’t for that, I’d have to consider the possibility that some of these effects will vanish as statistical artifacts. Second, he does not at any point actually explain the basics of probability theory and Bayesian inference to the reader, though he uses them repeatedly. This, unfortunately, limits the usefulness of the book dramatically if you don’t already know how to compute probabilities. It is particularly bad when he gives a terribly vague explanation of correlation. Really, the book would have been so much better with at least an appendix containing some of the relevant definitions and equations.
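Since the book leaves the Bayesian machinery implicit, here is a minimal sketch of the kind of computation it relies on, applied to the standard cab problem that Kahneman and Tversky made famous (the code itself is my illustration, not from the book):

```python
def posterior_blue(prior_blue=0.15, witness_accuracy=0.80):
    """Probability a cab was Blue given a witness says 'Blue'.

    In the standard problem, 15% of cabs are Blue, 85% Green,
    and the witness identifies colors correctly 80% of the time.
    """
    prior_green = 1.0 - prior_blue
    # P(says Blue | Blue) * P(Blue): witness is right about a Blue cab
    true_id = witness_accuracy * prior_blue
    # P(says Blue | Green) * P(Green): witness mistakes a Green cab for Blue
    false_id = (1.0 - witness_accuracy) * prior_green
    return true_id / (true_id + false_id)

print(round(posterior_blue(), 2))  # prints 0.41
```

Despite the witness being 80% reliable, the low base rate of Blue cabs drags the posterior down to about 41%, which is exactly the kind of base-rate neglect that intuition gets wrong.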

That having been said, if you know a little about statistics you will probably find, as I did, that you’ve learned to avoid at least some of the cognitive biases that deal with explicit ratios and percentages, and with the different ways these questions can be framed. I’ve also found that when it comes to risks and losses, my tolerance apparently does not agree with that of the majority of participants in the studies he quotes. I’m not sure why that is. Either way, whether or not you are subject to any specific bias that Kahneman writes about, the frequency with which they appear makes them relevant to understanding the way human society works, and they also offer a way to improve our decision making.

In summary, it’s a well-written and thoroughly useful book for everybody with an interest in human decision making and its shortcomings. I’d give it four out of five stars.

Below are some passages that I marked because they gave me something to think about. They will give you a flavor of what the book is about.

“A reliable way of making people believe in falsehoods is frequent repetition because familiarity is not easily distinguished from truth.”

“[T]he confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness.”

“The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.”

“It is useful to remember […] that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is cost-less is wrong.”

“A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”

“I have always believed that scientific research is another domain where a form of optimism is essential to success: I have yet to meet a successful scientist who lacks the ability to exaggerate the importance of what he or she is doing, and I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes, the fate of most researchers.”

“The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.”

“Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals.”

“When it comes to rare probabilities, our mind is not designed to get things quite right. For the residents of a planet that may be exposed to events no one has yet experienced, this is not good news.”

“We tend to make decisions as problems arise, even when we are specifically instructed to consider them jointly. We have neither the inclination nor the mental resources to enforce consistency on our preferences, and our preferences are not magically set to be coherent, as they are in the rational-agent model.”

“The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects. I have often observed young scientists struggling to salvage a doomed project when they would be better advised to drop it and start a new one.”

“Although Humans are not irrational, they often need help to make more accurate judgments and better decisions, and in some cases policies and institutions can provide that help.”

21 comments:

Arun said...

The personal sunk cost fallacy likely has a different dynamic than the organizational sunk cost fallacy.

Phil Warnell said...

Hi Bee,

A nice review of a book that attempts to have us further understand the decision making process; will put this on my list of what to read.

Best,

Phil

Giotis said...

Bee to tell you the truth I think you are overanalyzing things.

You can't guide your brain; be spontaneous.

But then again I don't know maybe this is a German thing:-)

Uncle Al said...

IQs in a committee add like ohms in parallel resistors. The second worst possible decision after a defective consensus is to do nothing, the worst is to do it harder. Those are the managerial defaults, hence the US social services agenda - welfare, law enforcement, finance.

http://www.mazepath.com/uncleal/aha.jpg

Plato Hagel said...

Hi Bee,

The human brain does not make very accurate statistical computations without deliberate effort. But often we don’t make such an effort. Instead, we use shortcuts. We substitute questions, extrapolate from available memories, and try to construct plausible and coherent stories. We tend to underestimate uncertainty, are influenced by the way questions are framed, and our intuition is skewed by irrelevant details.

I think a lot has been missed by these statements. You do call the work statistical.

The intuitive framework has to recognize that you have already worked the angles and that such intuition is gathered from all that has been worked. This contradicts what you are saying. I am not saying it is right just that I have seen this perspective in development with regard to scientists as they push through the wall that has separated them from moving on. This then details a whole set of new parameters in which the thinking mind can move forward with proposals.

I stopped at that point and will move forward from there.

Best,

Plato Hagel said...

.....

Of course it made me think of Follow Up to the Economic Manhattan Project

Maybe this is the organizational effect Arun is speaking about?

I never quite could get the economy either, until I understood the idea of fractals as a gesture of the underlying pattern of all of the economy in expression. Of course that is my point of view. I might have called it the algorithm before.

The idea here is that all things are expressions of the underlying pattern, and you might call the end result psychology or sociology of thinking and life as a result.

It seems that the accumulated reference of mind as a place in its evolution is to see that all the statistical information is already parametrized by the judgements which you give them personally?

Best,

Plato Hagel said...

Lastly.....


You might find this interesting. A very simple assumption.

Best,

Zephir said...

The fast and slow modes of human thinking can be compared to the spreading of gravitational and light waves in vacuum, and/or longitudinal and transverse waves at the water surface. Intuitive thinking may not be more misleading than strictly deterministic thinking if it follows a sufficiently high number of facts at the same moment. In this context, reading the articles “The era of expert failure” by Arnold Kling, “Why experts are usually wrong” by David H. Freeman, and “Why the experts missed the crash” by Phill Tetlock may be useful.

Bee said...

Hi Arun,

"The personal sunk cost fallacy likely has a different dynamic than the organizational sunk cost fallacy."

It is probably the case that organizations have different ways to come to decisions because there are usually more people involved. Is that what you mean?

Best,

B.

Bee said...

Hi Giotis,

Well, the whole point of Kahneman's book is that it is possible to guide your brain - to some extent. Basically, his argument is that normally everything works just fine spontaneously. But there are instances where, for one reason or the other, the spontaneous "insight" is wrong. He compares this to optical illusions, for example the Müller-Lyer illusion that you probably know. You cannot actually avoid optical illusions, they're too deeply ingrained in the way our brains work. But what you can do is recognize them and recall that they're illusions. As you probably just did. So if somebody goes and asks you which line segment is longer, you'll be able to say: I've seen this before, they don't look like it but they're the same length. A similar approach can work for other cognitive illusions, for example framing effects or anchors and so on. You can't really avoid them, but you can learn to recognize them and be careful with your conclusions.

But, yeah, I'm overanalyzing things, you're not the first to say that. I don't think it's a specifically German thing though. It's just my way of acknowledging that the world doesn't actually make much sense to me. Best,

B.

Bee said...

Hi Plato,

You say that the intuitions are a statistical assessment of past memories. That's true in a sloppy way, in that the brain tries to come to some conclusion from the memories. But what I'm saying is that this conclusion is not the one you would get if you did the statistics correctly.

Let me give you a simple example. In your life you've met three people from, say, Austria, just to pick a random example. They were all grumpy and unfriendly. You're extrapolating that all Austrians are grumpy and unfriendly. What you are not computing is the probability that your sample is a statistical fluctuation, and the smaller the sample, the more likely that is. To make matters worse, it's a well-established effect that people try to construct simple and coherent explanations, so the first Austrian you ever met probably shaped your whole expectations about Austria and Austrians. Not proper statistics.
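That fluctuation probability is easy to make concrete with a quick simulation sketch (the 30% grumpiness rate is just an assumption for illustration):

```python
import random

def chance_all_grumpy(sample_size=3, grumpy_fraction=0.3, trials=100_000):
    """Estimate how often a small random sample is uniformly grumpy
    even though most of the population is not."""
    random.seed(1)  # fixed seed so the estimate is reproducible
    hits = sum(
        all(random.random() < grumpy_fraction for _ in range(sample_size))
        for _ in range(trials)
    )
    return hits / trials

# With 30% grumpy people, meeting three grumpy ones in a row happens
# about 2.7% of the time (0.3^3): rare, but common enough to mislead
# anyone who generalizes from a sample of three.
print(chance_all_grumpy())
```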

There are many other examples in Kahneman's book. For example, if you have a completely random series, your brain will overestimate the correlations and construct causal explanations even where none are present. He spends a whole chapter explaining that people who trade stocks believe they know how the market works even after you tell them that their gains and losses are, to excellent precision, statistically distributed and uncorrelated, and that it doesn't matter at all what they "know"; they could just as well make random picks.

This part of the book about intuitions is very interesting because he elaborates on the conditions that need to be met for "expert intuition" to develop. Basically, the environment needs to allow the "experts" to learn from experience, which means that correlations are strong enough to be identifiable without computer help, that there is feedback telling the expert what's right and what's wrong, and that the incentives are towards doing the right thing. (Especially the latter isn't always the case. Kahneman explains, for example, that no CEO would keep their job if they were honest about how little they can actually "predict" about the future. They're expected to know, whether or not that's true.) Best,

B.

Bee said...

Hi Phil,

Well, let me know what you think when you're done reading. And don't let yourself be biased by my review ;o) Best,

B.

Arun said...

Hi Bee,
Sorry, typing on the iPad is slow, and I tend to be telegraphic. A person may want to persevere in something for a variety of personal reasons. In an organization, however, admission of failure means heads must roll, and so admissions of failure are rare. This results in throwing good money after bad.

The purpose of reading your reviews IS to be biased by them. :)

Best,
Arun

Plato Hagel said...

Bee writes to Goitis:Well, the whole point of Kahneman's book is that it is possible to guide your brain - to some extent.

Ultimately this is the setting for which your conclusions guide your perspective, yet, it is when we look back, one can choose too, "guide their brain?"

Bee:He spends a whole chapter explaining that people who trade stocks believe they know how the market works even after you tell them that their gains and losses are to excellent precision statistically distributed and uncorrelated, that it doesn't matter at all what they "know", they could as well do random picks.

If you did not pick it up, Benoit was able to reduce the economy too, and used an inductive deductive facility in regard to what is self evident. But I would point out what you might have interpreted as illusory in terms of the graph he sees on the board was instrumental to his penetrating the pattern in the economy.

Just raising the name of Nassim Nicholas Taleb and the idea of the Black Swan in relation to the basis of the economy Benoit raises deeper questions and does garner a look for me. I don't know what to expect; it is opening up the door to understanding more about such eruditions, but they are with regard to the economy.

Best,

Plato Hagel said...

Hi Bee,

Taleb was collaborating with Benoit Mandelbrot on a general theory of risk management Collaborations

A simple assumption about heads and tails, leads to bell curves and such?

Taleb, N. N. (2008) Edge article: Real Life is Not a Casino

So you are looking at both sides of the coin.

More on the Black Swan here.

I do not know what to expect yet:)

Best,

Plato Hagel said...

Lastly,

While these writings are disparate pieces, do they indeed come together under this book review post? As a scientist and mathematics person, are you not intrigued by "the pattern?" I was shocked... yet it made sense.

Now Nassim adds dimension to the subject. "He calls for cancellation of the Nobel Memorial Prize in Economics, saying that the damage from economic theories can be devastating".

Game theory if you know how it works it is used in all types of negotiation.

The Black Swan

Best,

Bee said...

Hi Plato,

Yes, Kahneman relates his elaborations to Taleb's black swans too. Basically, his point is that we're not good in judging the risk of events we have no experience with. This is what this quotation alludes to

“When it comes to rare probabilities, our mind is not designed to get things quite right. For the residents of a planet that may be exposed to events no one has yet experienced, this is not good news.”

Best,

B.

Bee said...

Hi Arun,

I don't see why admitting failure is more difficult for organizations than for individuals. If there are several people involved, one can always try to blame somebody else or "the system". A person can only blame herself. Best,

B.

Arun said...

Hi Bee,

In organizations, failure is always a blame game. The options for an individual to decide to change are many more. Boredom is one. Having learned something is another. etc.

-Arun

AEspinel said...

“The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.”

This regrettable sentence suggests a designer. Apparently the reviewer, and the author, overlooked this comment and, by the way, the concepts of evolution. A better sentence could be something like “the brain structure of humans and other vertebrates (it would be very useful to specify the taxonomic scope of the assertion) gives priority to ‘bad’ news…” (a definition of ‘bad’ is also needed).

Phillip Helbig said...

"This regrettable sentence suggests a designer."

No, it doesn't. A mechanism can certainly evolve. "Mechanism" doesn't imply a designer any more than "structure" does.