[Book 16] Thinking, Fast and Slow by Daniel Kahneman

Thinking, Fast and Slow is a seminal study of the mind’s inner workings, encompassing judgment and decision making. Kahneman builds the book around two modes of thought: System 1 – fast, intuitive, and emotional – and System 2 – slow, deliberate, and logical. Within this framework, he explains the numerous cognitive biases that lead to errors in judgment, often when System 1 unwittingly takes a well-intentioned but incorrect shortcut. Better than any other book I’ve encountered, it codifies the connections between finance, psychology, and statistics, and the inherent challenges at their intersection, that drive my interest in financial services.

You should read this book if you…

  • want to understand several sources of flaws in your own and others’ thinking
  • are interested in learning how probability and statistics can improve your worldview
  • often find your emotions overriding logic and want to better understand why

Additional Information

Year Published: 2011
Book Ranking (from 1-10): 10 – Superb – Changed the way I live my life
Ease of Read (from 1-5): 4 – Moderately challenging

Key Highlights

  1. The essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution
  2. Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed
  3. In addition to making your message simple, try to make it memorable. Put your ideas in verse if you can; they will be more likely to be taken as truth
  4. A simple rule for meetings: before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position. This procedure makes good use of the value of the diversity of knowledge and opinion in the group. The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them
  5. Amos liked the idea of an adjust-and-anchor heuristic as a strategy for estimating uncertain quantities: start from an anchoring number, assess whether it is too high or too low, and gradually adjust your estimate by mentally “moving” from the anchor. The adjustment typically ends prematurely, because people stop when they are no longer certain that they should move farther
  6. The sum of two people’s self-assessed contributions to a shared task greatly exceeds 100%. The explanation is a simple availability bias: both spouses remember their own individual efforts and contributions much more clearly than those of the other, and the difference in availability leads to a difference in judged frequency
  7. “The emotional tail wags the rational dog.” The affect heuristic simplifies our lives by creating a world that is much tidier than reality. Good technologies have few costs in the imaginary world we inhabit, bad technologies have no benefits, and all decisions are easy. In the real world, of course, we often face painful tradeoffs between benefits and costs
  8. The amount of concern is not adequately sensitive to the probability of harm; you are imagining the numerator – the tragic story you saw on the news – and not thinking about the denominator. Sunstein has coined the phrase “probability neglect” to describe the pattern
  9. People who are taught surprising statistical facts about human behavior may be impressed to the point of telling their friends about what they have heard, but this does not mean that their understanding of the world has really changed. The test of learning psychology is whether your understanding of situations you encounter has changed, not whether you have learned a new fact. There is a deep gap between our thinking about statistics and our thinking about individual cases
  10. The goal of venture capitalists is to call the extreme cases correctly, even at the cost of overestimating the prospects of many other ventures
  11. Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident
  12. “The Hedgehog and the Fox.” Hedgehogs “know one big thing” and have a theory about the world; they account for particular events within a coherent framework and are confident in their forecasts. Foxes, by contrast, are complex thinkers. They don’t believe that one big thing drives the march of history. Instead the foxes recognize that reality emerges from the interactions of many different agents and forces, including blind luck, often producing large and unpredictable outcomes
  13. Formulas that assign equal weights to all the predictors are often superior, because they are not affected by accidents of sampling. The surprising success of equal-weighting schemes has an important practical implication: it is possible to develop useful algorithms without any prior statistical research (the first sketch after this list simulates this claim)
  14. Amos and I coined the term planning fallacy to describe plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting the statistics of similar cases
  15. Premortem. When the organization has almost come to an important decision but has not formally committed itself, state: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.” The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier
  16. The four problems highlight the weakness of Bernoulli’s model. His theory is too simple and lacks a moving part. The missing variable is the reference point, the earlier state relative to which gains and losses are evaluated. In Bernoulli’s theory you need to know only the state of wealth to determine its utility, but in prospect theory you also need to know the reference state. Prospect theory is therefore more complex than utility theory
  17. The long-term success of a relationship depends far more on avoiding the negative than on seeking the positive. Gottman estimated that a stable relationship requires that good interactions outnumber bad interactions by at least 5 to 1
  18. When you pay attention to a threat, you worry— and the decision weights reflect how much you worry. Because of the possibility effect, the worry is not proportional to the probability of the threat. Reducing or mitigating the risk is not adequate; to eliminate the worry the probability must be brought down to zero
  19. Twenty-five managers were each asked to consider a risky option in which, with equal probabilities, they could lose a large amount of the capital they controlled or earn double that amount. None of the executives was willing to take such a dangerous gamble. Their CEO, asked the same question, answered: “I would like all of them to accept their risks.” In the context of that conversation, it was natural for the CEO to adopt a broad frame that encompassed all 25 bets. Like Sam facing 100 coin tosses, he could count on statistical aggregation to mitigate the overall risk (the second sketch after this list works through the arithmetic)
  20. Peak-end rule: The global retrospective rating was well predicted by the average of the level of pain reported at the worst moment of the experience and at its end. Duration neglect: the duration of the procedure had no effect whatsoever on the ratings of total pain.
  21. The photographer does not view the scene as a moment to be savored but as a future memory to be designed
  22. This is the essence of the focusing illusion, which can be described in a single sentence: Nothing in life is as important as you think it is when you are thinking about it
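
The equal-weighting claim in highlight 13 is easy to test numerically. Here is a minimal sketch, mine rather than the book’s, comparing least-squares weights fitted on a small sample against uniform weights; it assumes standardized predictors whose signs are known, and the sample sizes and noise level are illustrative.

```python
# Equal weights vs. fitted regression weights (the "improper linear
# models" idea Kahneman attributes to Robyn Dawes). All numbers here
# are illustrative assumptions, not from the book.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, k, trials = 30, 1000, 5, 500
fitted_r, equal_r = [], []

for _ in range(trials):
    w_true = rng.uniform(0.2, 1.0, k)            # every predictor helps, sign known
    X = rng.normal(size=(n_train + n_test, k))   # standardized predictors
    y = X @ w_true + rng.normal(scale=2.0, size=n_train + n_test)

    # "Optimal" weights estimated from only 30 training cases
    beta, *_ = np.linalg.lstsq(X[:n_train], y[:n_train], rcond=None)
    fitted_r.append(np.corrcoef(X[n_train:] @ beta, y[n_train:])[0, 1])
    # Improper model: simply add the standardized predictors up
    equal_r.append(np.corrcoef(X[n_train:].sum(axis=1), y[n_train:])[0, 1])

print(f"mean out-of-sample r, fitted weights: {np.mean(fitted_r):.3f}")
print(f"mean out-of-sample r, equal weights:  {np.mean(equal_r):.3f}")
```

On a training sample this small, the fitted weights chase accidents of sampling, so the crude equal-weight sum typically matches or beats them out of sample; give the regression thousands of cases and the ordering reverses.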
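
The broad frame in highlight 19 is likewise just arithmetic. A minimal sketch, assuming each manager stakes one unit that is either lost or pays out two units with equal probability (my reading of “lose a large amount ... or earn double that amount”):

```python
# Narrow vs. broad framing of the CEO's 25 independent bets.
# The unit stake and the -1/+2 payoff are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
trials, n_bets = 100_000, 25

# Each bet: -1 (lose the stake) or +2 (earn double it), equally likely.
outcomes = rng.choice([-1.0, 2.0], size=(trials, n_bets))
totals = outcomes.sum(axis=1)

print("chance a single bet loses:      0.500")
print(f"chance the 25-bet total is < 0: {np.mean(totals < 0):.3f}")
print(f"mean total across the 25 bets:  {totals.mean():.2f} stakes")
```

Framed one bet at a time, every manager faces a coin flip on a painful loss; framed as a portfolio, the chance of an overall loss falls to roughly 5% while the expected gain is large, which is exactly why the CEO wanted all 25 risks accepted.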
