Thinking, Fast and Slow – Daniel Kahneman


“Sustaining doubt is harder work than sliding into certainty.” – Daniel Kahneman


On the surface, Daniel Kahneman’s book ‘Thinking, Fast and Slow’ is an examination of a two-component model of human behaviour – thinking fast (intuition) and thinking slow (analysis). At a deeper level, it is the most comprehensive examination you will ever read of how the human mind screws things up. The compendium of brain botching is chapter-and-verse substantiation for embracing our failure to make assessments with Vulcan-like logic.

Kahneman argues that the human mind has evolved these two cerebral systems to get the best of both worlds. The first is a rapid-response cognitive engine, used both when time is of the essence and for everyday matters, so as not to expend valuable mental energy on routine tasks. The second is a more computationally intensive engine for solving more complex problems. The mind can apply either of these calculators to the problems it faces. But that raises some interesting questions. First, how does the brain know when to use one or the other? Second, what happens if the two systems disagree with one another?

  • “The bat-and-ball problem [‘A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?’] is our first encounter with an observation that will be a recurrent theme of this book: many people are overconfident, prone to place too much faith in their intuition…Subjective confidence in a judgement is not a reasoned evaluation of the probability that this judgement is correct. Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”
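
To see why System 1’s answer fails, it helps to run the arithmetic. A minimal sketch (Python, numbers straight from the quoted puzzle):

```python
# Bat-and-ball: bat + ball = $1.10, and the bat costs $1.00 more than the ball.
# The intuitive (System 1) answer is 10 cents; solving the algebra says otherwise.

total = 1.10
difference = 1.00

# ball + (ball + difference) = total  =>  ball = (total - difference) / 2
ball = (total - difference) / 2
bat = ball + difference

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
print(f"check: total = ${bat + ball:.2f}")      # check: total = $1.10

# The intuitive answer fails the check: a 10-cent ball makes the bat $1.10,
# for a total of $1.20, not $1.10.
```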

He explores an unmatched compendium of cognitive failure syndromes, some of which I have addressed before myself, but many of which are quite novel…

  • Confirmation Bias – While the scientific method constantly seeks to disprove, confirmation bias selectively filters for reinforcing data…rose-tinted glasses.
  • Law of Small Numbers – “The law of small numbers is a manifestation of a general bias that favors certainty over doubt.”
  • Illusion of Validity – “The global evidence of our previous failure should have shaken our confidence in our judgements of the [officer training] candidates, but it did not. It should have caused us to moderate our predictions, but it did not. We knew as a general fact that our predictions were little better than random guesses, but we continued to feel and act as if each of our specific predictions was valid…the confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgement.” (also echoed in the Nisbett and Borgida experiment).
  • Illusion of Skill – (corollary to the Illusion of Validity) “The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry. Facts that challenge such basic assumptions – and thereby threaten people’s livelihoods and self-esteem – are simply not absorbed. The mind does not digest them…Those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident.”
  • Planning Fallacy – (a more specific instantiation of the Illusion of Validity applied to planning) “The term ‘planning fallacy’ [describes] plans and forecasts that (a) are unrealistically close to best-case scenarios, and (b) could be improved by consulting the statistics of similar cases.” Probably most familiar in everyday life as the rule of thumb to take the builder’s best estimate…and double it (for time and cost).
  • Diminishing Utility of Wealth – “Explains risk aversion – the common preference that people generally show for a sure thing over a favourable gamble of equal or slightly higher expected value.” Looks at not just the overall scale of wealth and the size of the risk/opportunity relative to it, but also the preceding state of wealth (ie. was the person richer or poorer earlier on). All of these contextual factors have a huge impact on people’s risk propensities (see the worked example after this list).
  • Denominator Neglect – “If your attention is drawn to the winning marbles, you do not assess the number of non-winning marbles with the same care. Vivid imagery contributes to denominator neglect.” (used extensively by casinos; see the urn sketch after this list)
  • Disposition Effect – “If the problem is framed as a choice between giving yourself pleasure or causing yourself pain, you will [choose the former]…Finance research has documented a massive preference for selling winners rather than losers” (in the hopes that the losers will recover). Kahneman goes on to explain that this effect is itself an instance of the ‘Narrow Framing’ fallacy and has a more specific example known as the…
  • Sunk Cost Fallacy – “The decision to invest additional resources in a losing account, when better investments are available.”
  • Taboo Trade-Off – The aversion to “accepting any increase in risk”, the most classic example being parents who go to extreme lengths to avoid risks to their children when the resources could be more effectively applied to other areas that would actually better protect them.
  • Duration Neglect – “The duration of a [painful] procedure had no effect whatsoever on the ratings of total pain.”
  • Peak-end Rule – “The global retrospective rating was well predicted by the average of the level of pain reported at the worst moment of the experience and at its end.” (see the sketch after this list) This is why ‘moments of truth’ are so critical to customer satisfaction (ie. cheery smiles and free candies at reception end up meaning nothing if you can’t deliver useful help when the customer is in more dire need of it).
  • Affect Heuristic – “where judgements and decisions are guided by feelings of liking or disliking, with little deliberation or reasoning.”
  • Affective Forecasting – “People who get married expect that it will make them happier, or hope that making the tie permanent will maintain the present state of bliss…on their wedding day, the bride and groom know that the rate of divorce is high and that the incidence of marital disappointment is even higher, but they do not believe that these statistics apply to them.”
  • Focusing Illusion – “Nothing in life is as important as you think it is when you are thinking about it.” An extension of ‘Affective Forecasting’ and something Dan Gilbert examined in depth…“Daniel Gilbert and his colleagues provocatively claim that people generally anticipate more regret than they will actually experience, because they underestimate the efficacy of the psychological defences they will deploy – which they label the ‘psychological immune system.’ Their recommendation is that you should not put too much weight on regret; even if you have some, it will hurt less than you think.”
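
As promised under ‘Diminishing Utility of Wealth’, here is a worked example of risk aversion. This is my own sketch, not from the book: it assumes logarithmic utility (a standard stand-in for diminishing marginal utility) and illustrative stakes – a sure $800 versus an 85% chance of $1,000, a gamble with the higher expected value of $850.

```python
import math

def expected_log_utility(wealth: float, outcomes: list[tuple[float, float]]) -> float:
    """Expected log-utility of final wealth over (probability, gain) outcomes."""
    return sum(p * math.log(wealth + gain) for p, gain in outcomes)

sure_thing = [(1.0, 800)]            # a guaranteed $800
gamble = [(0.85, 1000), (0.15, 0)]   # 85% chance of $1,000; expected value $850

for wealth in (500, 10_000):
    u_sure = expected_log_utility(wealth, sure_thing)
    u_gamble = expected_log_utility(wealth, gamble)
    choice = "sure thing" if u_sure > u_gamble else "gamble"
    print(f"wealth ${wealth}: prefers the {choice} "
          f"(u_sure={u_sure:.4f}, u_gamble={u_gamble:.4f})")

# wealth $500: the sure $800 wins despite its lower expected value (risk aversion);
# wealth $10,000: the very same gamble wins – the preceding scale of wealth
# changes risk propensity, just as Kahneman describes.
```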
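
The urn sketch promised under ‘Denominator Neglect’: the counts below are my own illustrative numbers, in the spirit of the marbles quote. Many people prefer the urn with more winning marbles even though its probability of winning is lower, because “8 winners” is more vivid than “1 winner”.

```python
# Two urns of marbles; draw one at random and win if it is a winning marble.
urns = {
    "A": (1, 10),    # 1 winning marble out of 10
    "B": (8, 100),   # 8 winning marbles out of 100
}

for name, (winners, total) in urns.items():
    print(f"urn {name}: {winners}/{total} = {winners / total:.0%} chance of winning")

# urn A: 1/10 = 10% chance of winning
# urn B: 8/100 = 8% chance of winning  <- the vivid but worse choice
```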
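
And the peak-end rule itself reduces to a one-line formula: remembered pain is roughly the average of the worst moment and the last moment, with duration dropping out entirely (which also captures ‘Duration Neglect’ above). A minimal sketch with made-up pain readings:

```python
def remembered_pain(readings: list[float]) -> float:
    """Peak-end rule: retrospective rating ~ mean of the worst and final moments."""
    return (max(readings) + readings[-1]) / 2

short_procedure = [2, 4, 8, 7]           # ends near its peak
long_procedure = [2, 4, 8, 7, 5, 3, 1]   # same peak, but tapers off gently

print(remembered_pain(short_procedure))  # 7.5 -> remembered as worse
print(remembered_pain(long_procedure))   # 4.5 -> remembered as milder, despite lasting longer
```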

Finally, as I would expect from any thorough treatment of risk analysis, Kahneman’s work also includes some analysis apropos to the Upside/Downside model of Leadership/Management. In particular, his analysis of people’s attitudes to risk gets quite sophisticated, including the landmark ‘Boston Matrix’ for Prospect Theory dubbed the ‘Fourfold Pattern’. It maps (a) probability and (b) gains/losses across its two dimensions and identifies ‘Risk Averse’ (Managers) and ‘Risk Seeking’ (Leaders) quadrants.
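
As a rough schematic of the Fourfold Pattern – my own rendering, not Kahneman’s table verbatim – the four cells look like this:

```python
# Risk attitude as a function of probability and gain/loss framing (Prospect Theory).
fourfold_pattern = {
    ("high probability", "gains"):  ("risk averse",  "take the sure thing; fear of disappointment"),
    ("high probability", "losses"): ("risk seeking", "gamble to dodge a sure loss"),
    ("low probability",  "gains"):  ("risk seeking", "buy lottery tickets; hope of a large gain"),
    ("low probability",  "losses"): ("risk averse",  "buy insurance; fear of a large loss"),
}

for (probability, frame), (attitude, example) in fourfold_pattern.items():
    print(f"{probability} / {frame}: {attitude} ({example})")
```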

Required reading for anyone serious about understanding how people really understand and respond to risk.
