Thinking, Fast and Slow

Hardcover edition

Author: Daniel Kahneman
Country: United States
Language: English
Subject: Psychology
Genre: Non-fiction
Publisher: Farrar, Straus and Giroux
Publication date: 2011
Media type: Print (hardcover, paperback)
Pages: 499
ISBN: 978-0374275631
OCLC: 706020998

Thinking, Fast and Slow is a best-selling[1] 2011 book by Daniel Kahneman, winner of the Nobel Memorial Prize in Economic Sciences, which summarizes research that he conducted over decades, often in collaboration with Amos Tversky.[2][3] It covers all three phases of his career: his early work on cognitive biases, his work on prospect theory, and his later work on happiness.

The book's central thesis is a dichotomy between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical. The book delineates cognitive biases associated with each type of thinking, starting with Kahneman's own research on loss aversion. From framing choices to people's tendency to substitute an easy-to-answer question for one that is harder, the book highlights several decades of academic research to suggest that people place too much confidence in human judgement.

Prospect theory

Kahneman developed prospect theory, the basis for his Nobel prize, to account for experimental errors he noticed in Daniel Bernoulli's traditional utility theory. Utility theory makes logical assumptions of economic rationality that do not reflect people's actual choices, and does not take behavioral biases into account.

One example is that people are loss-averse: they are more likely to act to avert a loss than to achieve a gain. Another example is that the value people place on a change in probability (e.g. of winning something) depends on the reference point: people appear to place greater value on a change from 0% to 10% (going from impossibility to possibility) than on a change from, say, 45% to 55%, and place the greatest value of all on a change from 90% to 100% (going from possibility to certainty). This is despite all three changes representing the same ten-percentage-point increase in the probability of winning. Consistent with loss aversion, the order of the first and third of these is reversed when the event is presented as losing rather than winning something: there, the greatest value is placed on reducing the probability of a loss to zero.
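The possibility and certainty effects above can be summarized with a probability-weighting function, written here as w(p) for illustration (the prospect theory literature commonly denotes it π or w). Under expected utility all three ten-point changes would carry equal weight; prospect theory instead implies:

```latex
% Expected utility: equal probability increments are weighted equally.
% Prospect theory: the decision weight w(p) is nonlinear in p, so that
\[
w(0.10) - w(0.00) \;>\; w(0.55) - w(0.45) \quad \text{(possibility effect)}
\]
\[
w(1.00) - w(0.90) \;>\; w(0.55) - w(0.45) \quad \text{(certainty effect)}
\]
```

For losses the pattern is mirrored: the change weighted most heavily is the one that eliminates the probability of a loss entirely, from 0.10 to 0.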

In 2012, the American Economic Association's Journal of Economic Literature published a review of Thinking, Fast and Slow; a thorough discussion of Kahneman's take on prospect theory, as well as an analysis of the four fundamental factors on which it rests, appears on pages 7–9 of that review.[4]

Two systems

In the book's first section, Kahneman describes two different ways the brain forms thoughts:

System 1: fast, automatic, frequent, emotional, stereotypic, unconscious.
System 2: slow, effortful, infrequent, logical, calculating, conscious.

Kahneman covers a number of experiments which purport to highlight the differences between these two thought processes, and how they can arrive at different results even given the same inputs. Terms and concepts include coherence, attention, laziness, association, jumping to conclusions, and the formation of judgments. The distinction between System 1 and System 2 bears on how, and how well, humans reason when making decisions, with implications for fields such as market research.[5]

Heuristics and biases

The second section offers explanations for why humans struggle to think statistically. It begins by documenting a variety of situations in which we either arrive at binary decisions or fail to associate reasonable probabilities with outcomes. Kahneman explains this phenomenon using the theory of heuristics. Kahneman and Tversky originally covered this topic in their landmark 1974 article "Judgment under Uncertainty: Heuristics and Biases".[6]

Kahneman uses heuristics to assert that System 1 thinking involves associating new information with existing patterns, or thoughts, rather than creating new patterns for each new experience. For example, a child who has only seen shapes with straight edges would, when first viewing a circle, perceive it as an octagon rather than a triangle, matching it to the closest familiar straight-edged shape. In a legal metaphor, a judge limited to heuristic thinking would only be able to think of similar historical cases when presented with a new dispute, rather than seeing the unique aspects of that case. In addition to offering an explanation for the statistical problem, the theory also offers an explanation for human biases.

Anchoring

The “anchoring effect” names our tendency to be influenced by irrelevant numbers: experimental subjects shown higher numbers gave higher responses, and those shown lower numbers gave lower ones.[2]

Anchoring is relevant when navigating a negotiation or considering a price. In an illustrative example, people asked whether Gandhi was more than 114 years old when he died gave much higher estimates of his age at death than people whose anchoring question referred to death at 35 years old. Such examples show that our behavior is influenced, far more than we know or intend, by the environment of the moment.

Availability

The availability heuristic is a mental shortcut that occurs when people make judgments about the probability of events by how easy it is to think of examples.[7] The availability heuristic operates on the notion that, "if you can think of it, it must be important." The availability of consequences associated with an action is positively related to perceptions of the magnitude of the consequences of that action. In other words, the easier it is to recall the consequences of something, the greater we perceive these consequences to be. Sometimes this heuristic is beneficial, but the ease with which events come to mind is usually not an accurate reflection of their actual probabilities in real life.[8]

Substitution

System 1 is prone to substituting a difficult question with a simpler one. In what Kahneman calls his and Tversky's “best-known and most controversial” experiment, “the Linda problem,” subjects were told about an imaginary Linda: young, single, outspoken, and very bright, who, as a student, was deeply concerned with discrimination and social justice. Subjects were then asked whether it was more probable that Linda is a bank teller or that she is a bank teller and an active feminist. The overwhelming response was that “feminist bank teller” was more likely than “bank teller,” violating the laws of probability. (Every feminist bank teller is a bank teller.) In this case System 1 substituted the easier question, “Is Linda a feminist?”, dropping the occupation qualifier. An alternative view is that the subjects added an unstated cultural implicature to the effect that the other answer implied an exclusive or (xor), i.e. that Linda was not a feminist.[2]
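The law of probability violated here is the conjunction rule: a conjunction can never be more probable than either of its conjuncts. Writing A for “Linda is a bank teller” and B for “Linda is an active feminist”:

```latex
\[
P(A \wedge B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A),
\]
% since P(B | A) <= 1, "feminist bank teller" can be at most as
% probable as "bank teller", never more probable.
```

Judging the conjunction as more probable than one of its parts is therefore inconsistent with any assignment of probabilities, whatever one believes about Linda.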

Optimism and loss aversion

Kahneman writes of a "pervasive optimistic bias", which “may well be the most significant of the cognitive biases.” This bias generates the illusion of control: the sense that we have substantial control over our lives. The bias may be usefully adaptive: optimists are more psychologically resilient and have stronger immune systems than their more reality-based counterparts. Optimists are also commonly, but wrongly, believed to live longer on average, a belief contradicted by the findings of the Longevity Project. Optimism protects against loss aversion: people's tendency to fear losses more than they value gains.[2]

A natural experiment reveals the prevalence of one kind of unwarranted optimism. The planning fallacy is the tendency to overestimate benefits and underestimate costs, impelling people to take on risky projects. In 2002, American kitchen remodels were expected to cost $18,658 on average, but actually cost $38,769.[2]

To explain overconfidence, Kahneman introduces the concept he labels What You See Is All There Is (WYSIATI). This theory states that when the mind makes decisions, it deals primarily with Known Knowns, phenomena it has already observed. It rarely considers Known Unknowns, phenomena that it knows to be relevant but about which it has no information. Finally it appears oblivious to the possibility of Unknown Unknowns, unknown phenomena of unknown relevance.

He explains that humans fail to take into account complexity and that their understanding of the world consists of a small and necessarily un-representative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will mirror a past event.

Framing

Framing is the context in which choices are presented. In one experiment, some subjects were asked whether they would opt for surgery if told that the “survival” rate is 90 percent, while others were told that the mortality rate is 10 percent. The first framing increased acceptance, even though the two descriptions are equivalent.[9]

Sunk-cost

Main article: Sunk cost fallacy

Rather than consider the odds that an incremental investment would produce a positive return, people tend to "throw good money after bad" and continue investing in projects with poor prospects that have already consumed significant resources. In part this is to avoid feelings of regret.[9]

Choices

In this section Kahneman returns to economics and expands his seminal work on Prospect Theory. He discusses the tendency for problems to be addressed in isolation and how, when other reference points are considered, the choice of that reference point (called a frame) has a disproportionate impact on the outcome. This section also offers advice on how some of the shortcomings of System 1 thinking can be avoided.

Rationality and happiness

Evolution teaches that traits persist and develop because they increase fitness. One possible hypothesis is that our conceptual biases are adaptive, as are our rational faculties. Kahneman offers happiness as one quality that our thinking process nurtures. Kahneman first took up this question in the 1990s. At the time most happiness research relied on polls about life satisfaction.

Two selves

Kahneman proposed an alternative measure that assessed pleasure or pain sampled from moment to moment, and then summed over time. Kahneman called this “experienced” well-being and attached it to a separate "self". He distinguished this from the “remembered” well-being that the polls had attempted to measure. He found that these two measures of happiness diverged. His major discovery was that the remembering self does not care about the duration of a pleasant or unpleasant experience. Rather, it retrospectively rates an experience by its peak (or valley) and by the way it ends. Further, in studies of medical procedures such as colonoscopies, the remembering self dominated the patient's ultimate conclusion about the experience.

“Odd as it may seem,” Kahneman writes, “I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”[3][10]

Awards and honors

The book was named among the 2011 Los Angeles Times Book Prize winners and finalists,[11] The Economist's books of the year for 2011,[12] and the best nonfiction of 2011.[13]


References

  1. "The New York Times Best Seller List – December 25, 2011" (PDF). www.hawes.com. Retrieved 2014-08-17.
  2. Holt, Jim (27 November 2011). "Two Brains Running". The New York Times. p. 16.
  3. Kahneman, Daniel (25 October 2011). Thinking, Fast and Slow. Macmillan. ISBN 978-1-4299-6935-2. Retrieved 8 April 2012.
  4. Shleifer, Andrei (2012). "Psychologists at the Gate: A Review of Daniel Kahneman's Thinking, Fast and Slow" (PDF). Journal of Economic Literature.
  5. "System 1 VS System 2".
  6. Tversky, Amos; Kahneman, Daniel (1974). "Judgment under Uncertainty: Heuristics and Biases" (PDF). Science.
  7. Tversky, Amos; Kahneman, Daniel (September 1973). "Availability: A heuristic for judging frequency and probability". Cognitive Psychology 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9.
  8. "Availability: A Heuristic for Judging Frequency and Probability" (PDF).
  9. Lowenstein, Roger (October 27, 2011). "A Nobel laureate's new book cautions us not to trust our gut". Bloomberg Business Week.
  10. Brain-scanning experiments by Rafael Malach showed that when subjects are absorbed in an experience, such as watching a movie, the parts of the brain associated with self-consciousness are not merely quiet, they're actually shut down ("inhibited") by the rest of the brain. The self seems simply to disappear. If the self is not participating in the experience, how does the remembering self get its data?
  11. "2011 Los Angeles Times Book Prize Winners & Finalists". Los Angeles Times.
  12. "Books of the Year 2011". The Economist.
  13. "The Best Nonfiction of 2011".


This article is issued from Wikipedia, from the version of Friday, February 12, 2016. The text is available under the Creative Commons Attribution/Share-Alike license, but additional terms may apply for the media files.