10 July 2014

Thinking, Fast and Slow -- the review

Daniel Kahneman deserved to win the Nobel Prize in Economics for his contributions to "behavioral economics." He and collaborator Amos Tversky (who died before he could receive the award) used psychological insights to explain "irrational" behavior that economists had pre-emptively (and unrealistically) dismissed. Their most important contributions concern "cognitive bias" (we focus on some -- not all -- costs and benefits) and "prospect theory" (we put more weight on potential losses than potential gains).

I read Kahneman's 2011 book over several months because it was long (499 pages) and thoroughly repetitive.

My top-line recommendation is that you read this insightful book, but I suggest you take a chapter per day (or week) to allow yourself time to digest -- and experience -- the ideas. (Alternatively, print this review and read one note per day! :)

Here are some notes on Kahneman's ideas:
  1. Kahneman suggests that we process decisions by instinct (System 1 thinking, or "guts") or after consideration (System 2 thinking, or "brains"). The important point is that each system is right for some situations but not others. Order food that "feels right" but don't buy a car that way. A car (or job or house) decision involves many factors that will interact and develop over years. We cannot predict all these factors, but we can give them appropriate weights with care.
  2. Salespeople appeal to your guts when they want you to trust them. You should rely on brains to evaluate their promises. 
  3. We make better gut decisions when we're happy but worse ones when we're sad or angry.
  4. Kahneman says we often fail to look beyond "what we see is all there is" when considering a situation. This leads to misdirected gut responses. (Nassim Taleb's Fooled by Randomness addresses this bias.) People over-estimate the risk of violent death because the media loves exotic, bloody stories.
  5. When judging candidates on looks, people favor "competence" (strong, trustworthy) over "likability" -- and many voters do choose candidates based on looks.
  6. People believe that a beneficial technology's risk is lower and that a low-risk technology brings more benefits. This may explain why most people don't care about the risks of driving cars (far more dangerous than flying in airplanes) or using cell phones. It also suggests that policy changes (e.g., higher prices for water) will be more acceptable when they are small and reversible. After the sky does not fall, the "low risk" strategy can be expanded.
  7. The measuring stick of risk (relative to what?) affects people's perceptions of risk.
  8. Bayesian reasoning: (1) anchor your judgement on the probability of an outcome (given a plausible set of repetitions), then (2) question the accuracy of your belief in that probability as outcomes appear. Put differently, take a stand and reconsider it as new data arrive.
  9. People will pay more for a "full set of perfect dishes" than for the same set with extra damaged dishes -- violating the "free disposal" assumption of economic theory (we can always dump excess). This bias explains why a house with freshly painted, empty rooms will sell for more than one with fresh paint but old furniture.
  10. Stereotyping is bad from a social perspective, but we should not ignore the information included in group statistics. Looking from the other direction, people are far TOO willing to assume the group behaves as one individual. (When I was traveling, I learned "not to judge a person by their country nor a country by one person.")
  11. Passing the buck: People "feel relieved of responsibility when they know others have heard the same request for help." This fact explains the importance of putting one person in charge, asking that person for a decision, and setting a deadline to evaluate the decision's impact.
  12. "Regression to the mean" happens when average performance replaces a "hot streak." It's not caused by burnout; it's caused by statistics. (Try to get a "hot streak" in coin flips.)
  13. "Leaders who have been lucky are not punished for taking too much risk... they are credited with flair and foresight" [p204]. Two of three mutual funds underperform the market in any given year, but lucky managers (and their investors) cling to their "illusion of skill."
  14. Successful stock traders find undervalued companies, not good companies whose shares may already be overpriced.
  15. Philip Tetlock interviewed 284 people "who made their living commenting or offering advice on economic and political trends." Their predictions could have been beaten by dart-throwing monkeys -- even within their specializations. They offered excuses to explain their "bad luck" (see Note 8).
  16. "Errors of prediction are inevitable because the world is unpredictable" [p220].
  17. Algorithms are statistically superior to experts when it comes to diagnosing medical, psychological, criminal, financial and other events in "uncertain, unpredictable" domains. See my paper on real estate markets [pdf].
  18. Simpler statistics are often better. Forget multivariate regressions. Use simple weights. For example: Marital stability = f (frequency of lovemaking - frequency of quarrels).
  19. "Back-of-envelope is often better than an optimally weighted formula and certainly better than expert judgement" [p226].
  20. Good (trustworthy) intuition comes from having enough time to understand the regularities in a "predictable environment," e.g., sports competition. "Intuition cannot be trusted in the absence of stable regularities in the environment" [p241].
  21. The "planning fallacy" leads us to believe the best-case prediction even though events rarely follow the best-case path. Use less optimistic weights -- and read this book.
  22. Overoptimism explains lawsuits, wars, scientific research and small business startups. Leaders tend to be overoptimistic, for better or worse. (Aside: I think men are more optimistic than women, which is why they discover more and die more often.)
  23. Want to plan ahead? "Imagine it's one year in the future and the outcome of the plan was a complete disaster. Write a debrief on that disaster." This is useful because there are more ways to fail than succeed.
  24. Our attitudes towards wealth are affected by our reference point. Start poor, and it's all up; start rich, and you may be disappointed. (If you own a house, decide whether your reference point is the purchase price or its "value" during the bubble.) You're much happier going from $100 to $200 than from $800 to $900.
  25. The asymmetry of losses/gains in prospect theory explains why it's harder for one side to "give up" the exact same amount as the other side gains. This explains the durability of institutions -- for better or worse -- and why they rarely change without (1) outside pressure of bigger losses or (2) huge gains to compensate for losses. It also explains why it's hard for invaders to win.
  26. Economists often fail to account for reference points, and they dislike them for "messing up" their models. Economists whose models ignore context may misunderstand behavior.
  27. We give priority to bad news, which is why winning $100 does not compensate for losing $100. Hence, "long-term success in a relationship depends on avoiding the negative more than seeking the positive" [p302].
  28. People think it's fairer to fire a $9/hr worker and hire a $7/hr worker than reduce the wages of the $9/hr worker. That may not be a good way to go.
  29. "The sunk cost fallacy keeps people too long in poor jobs, unhappy marriages and unpromising research projects" [p345].
  30. "The precautionary principle is costly, and when interpreted strictly it can be paralyzing." It would have prevented "airplanes, air conditioning, antibiotics, automobiles..."
  31. Framing and anchoring affect our perspectives. The American preference for miles per gallon (instead of liters per 100km) means they cannot accurately compare fuel efficiency among cars. This is not an accident as far as US car companies are concerned. (Another non-accident is raising fuel economy standards instead of gas taxes.)
  32. People may choose a vacation according to what they PLAN to remember rather than what they will experience. That may be because we remember high and low points but forget their duration.
  33. "The easiest way to increase happiness is to control use of your time. Can you find more time to do the things you enjoy doing?" (I have the freedom to write this review, but it gets tedious after 3 hours...)
  34. "Experienced happiness and life satisfaction are largely determined by the genetics of temperament," but "the importance that people attached to income at age 18 anticipated their satisfaction with their income as adults" [pp400-401]. I am fortunate, I think, to have started life with low expectations. That makes it easier for me to make 1/3 the money in Amsterdam that I would in Riyadh because it's definitely better to be "poor" in Amsterdam.
  35. That said, "the goals people set for themselves are so important to what they do and how they feel that... we cannot hold a concept of well-being that ignores what people want" [p402].
  36. "Adaptation to a situation means thinking less and less about it" [p405].
  37. [Paraphrased from p412]: Our research has not shown that people are irrational. It has clarified the shape of their rationality, which creates a dilemma: should we protect people against their mistakes or limit their freedom to make them? Seen from the other side, we may think it easier to protect people from the quirks of "guts" and laziness of "brains." (Hence my support for a ban on advertising.)
  38. "Brains" may help us rationalize "guts" but they can also stop foolish impulses -- when we acknowledge the limits to our reason and the information we rely on.
  39. "Gut" feelings can guide us well if we can tell the difference between clear and complicated circumstances.
  40. "An organization is a factory that manufactures judgements and decisions" [p417]. It's important, therefore, to balance between its "gut" and "brain" functions.
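Note 8's procedure (anchor on a base rate, then update as outcomes arrive) can be sketched with Bayes' rule. Here's a minimal sketch in Python; the likelihoods (a "skilled" fund manager beats the market 60% of years, a merely "lucky" one 50%) are illustrative assumptions, not figures from the book:

```python
def posterior(prior_skill, outcomes, p_beat_if_skilled=0.6, p_beat_if_lucky=0.5):
    """Update P(manager is skilled) after each observed year via Bayes' rule.

    The likelihood values are illustrative assumptions, not from the book.
    """
    p = prior_skill
    for beat_market in outcomes:
        like_skilled = p_beat_if_skilled if beat_market else 1 - p_beat_if_skilled
        like_lucky = p_beat_if_lucky if beat_market else 1 - p_beat_if_lucky
        # Bayes' rule: posterior = prior * likelihood / total evidence
        p = p * like_skilled / (p * like_skilled + (1 - p) * like_lucky)
    return p

# Anchor on a 50/50 prior, then update on ten straight market-beating years.
print(posterior(0.5, [True] * 10))
```

Even ten consecutive wins only move the posterior to about 0.86 under these assumptions -- which connects to Notes 13 and 15: streaks are weaker evidence of skill than our guts suggest.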
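Note 12's coin-flip challenge is easy to run yourself. A short simulation (plain Python, fixed seed so the demo is repeatable) shows that a fair coin routinely produces "hot streaks" even though every flip is independent:

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 0
    prev = None
    for f in flips:
        run = run + 1 if f == prev else 1
        prev = f
        best = max(best, run)
    return best

random.seed(0)  # fixed seed so the demo is repeatable
season = [random.random() < 0.5 for _ in range(100)]  # 100 fair "shots"
print(longest_streak(season))
```

In 100 fair flips the longest streak is typically around 6 or 7 -- long enough to look like a "hot hand" to System 1, even though it's pure statistics.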
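Note 31's framing point can be made concrete: MPG is a reciprocal scale, so equal-looking MPG gains save very different amounts of fuel, while liters per 100 km measures fuel use directly. A quick check (standard unit conversions; the 10,000-mile figure is just an illustrative driving distance):

```python
KM_PER_MILE = 1.609344              # standard conversion
LITERS_PER_US_GALLON = 3.785411784  # standard conversion

def gallons_per_10k_miles(mpg):
    """Fuel actually burned over 10,000 miles (an illustrative distance)."""
    return 10_000 / mpg

def liters_per_100km(mpg):
    """Convert miles per gallon to the European liters-per-100km scale."""
    km_per_gallon = mpg * KM_PER_MILE
    return 100 * LITERS_PER_US_GALLON / km_per_gallon

# An upgrade from 10 to 20 MPG saves twice as much fuel as one from 20 to 40 MPG:
print(gallons_per_10k_miles(10) - gallons_per_10k_miles(20))  # 500.0
print(gallons_per_10k_miles(20) - gallons_per_10k_miles(40))  # 250.0
```

On the MPG scale both upgrades look like "doubling efficiency," but the fuel (and money) saved is very different -- the framing Kahneman warns about.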
Bottom Line: I give this book FOUR STARS. Skip psychology and read it to understand yourself and others.

2 comments:

  1. I also thought this was an excellent book.

    When you understand the difference between System 1 "thinking" and System 2 thinking, it is amazing how much System 1 thinking you can find around you. Advertising, media, and politics especially use System 1 "thinking".

    It may just be my confirmation bias, but I find it difficult to find cogent arguments on TV or in newspapers.

    This makes sense, since most media have a very short window of opportunity to get your attention, and emotional or gut issues are easier to exploit.

    Bottom Line: You can benefit from knowing the difference between fast thinking and slow thinking.

  2. Good post, though I thought that the book could be summarized in 50% of the number of pages; but that probably was an economic decision of the publisher (more pages allow for a higher price despite hardly any extra costs). Thanks for this summary. I put the book away due to the repetition, which seems to be a problem with many recent (American?) 'management' books.

