Confirmation bias
From Wikipedia, the free encyclopedia
In psychology and cognitive science, confirmation bias is the tendency to search for or interpret new information in a way that confirms one's preconceptions, and to avoid information and interpretations that contradict prior beliefs. It is a type of cognitive bias and represents an error of inductive inference, or a form of selection bias toward confirmation of the hypothesis under study or disconfirmation of an alternative hypothesis.
Confirmation bias is an area of interest in the teaching of critical thinking, as the skill is misused when rigorous critical scrutiny is applied only to evidence that challenges a preconceived idea, but not to evidence that supports it.[1]
Naming
The effect is also known as belief bias, belief preservation, belief overkill, hypothesis locking, polarization effect, the Tolstoy syndrome, selective thinking and myside bias.
Overview
Among the first to investigate this phenomenon was Peter Cathcart Wason (1960), whose subjects were presented with three numbers (a triple):
2 4 6
and told that the triple conforms to a particular rule. They were then asked to discover the rule by generating their own triples and using the feedback they received from the experimenter. Every time a subject generated a triple, the experimenter indicated whether it conformed to the rule ("right") or not ("wrong"). The subjects were told that once they were sure of the correctness of their hypothesized rule, they should announce it.
While the actual rule was simply “any ascending sequence”, the subjects had a great deal of difficulty inducing it, often announcing rules far more complex than the correct one. More interestingly, the subjects seemed to test only “positive” examples, that is, triples they believed would conform to their rule and thus confirm their hypothesis. What they did not do was attempt to falsify their hypotheses by testing triples they believed would not conform to their rule. Wason referred to this phenomenon as the confirmation bias, whereby subjects systematically seek evidence to confirm rather than to deny their hypotheses.
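The logic of the task can be illustrated with a short simulation (an illustrative sketch, not part of Wason's experiment; the hypothesis "numbers increasing by two" is simply a typical example): a subject who tests only triples generated from their own, too-narrow rule receives nothing but confirming feedback, and so never learns that the true rule is broader.

```python
# Illustrative sketch of the 2-4-6 task (assumed example hypothesis,
# not Wason's original materials).

def true_rule(triple):
    """Wason's actual rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def subject_hypothesis(triple):
    """A typical over-specific hypothesis: numbers increasing by two."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# "Positive" tests: triples the subject expects to fit their own rule.
positive_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Potentially falsifying tests: triples the subject expects NOT to fit.
negative_tests = [(1, 2, 3), (2, 4, 8), (6, 4, 2)]

for triple in positive_tests + negative_tests:
    expected = "fits" if subject_hypothesis(triple) else "does not fit"
    feedback = "right" if true_rule(triple) else "wrong"
    print(triple, "- subject expects:", expected, "- experimenter says:", feedback)
```

In the sketch, every positive test is answered "right", so the mistaken hypothesis is never challenged; only triples such as (1, 2, 3) or (2, 4, 8), which the subject expects to fail but which the experimenter also calls "right", would reveal that the rule is broader than "increasing by two".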
The confirmation bias was Wason’s original explanation for the systematic errors made by subjects in the Wason selection task. In essence, the subjects chose to examine only cards that could confirm the given rule rather than disconfirm it. Confirmation bias has also been offered as an explanation for why people believe and maintain pseudoscientific ideas.
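The same asymmetry can be sketched for the selection task itself (an illustrative example, assuming the familiar "if a card has a vowel on one side, it has an even number on the other" version of the rule): only the vowel card and the odd-number card can falsify the rule, yet subjects typically turn over the vowel and even-number cards, which can at best confirm it.

```python
# Illustrative sketch of the Wason selection task, using the familiar
# "if vowel then even number" rule as an assumed example.
cards = ["E", "K", "4", "7"]  # visible faces of the four cards

def can_falsify(card):
    """A card is worth turning only if its hidden face could break the rule."""
    if card.isalpha():
        return card in "AEIOU"      # a vowel might hide an odd number
    return int(card) % 2 == 1       # an odd number might hide a vowel

logical_choice = [c for c in cards if can_falsify(c)]   # ['E', '7']
typical_choice = ["E", "4"]  # confirmation-oriented picks commonly reported

print("Cards that could disconfirm the rule:", logical_choice)
print("Cards most subjects choose:          ", typical_choice)
```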
Political bias study
In January 2006, Drew Westen and a team from Emory University announced, at the annual Society for Personality and Social Psychology conference in Palm Springs, California, the results of a study[2] showing the brain activity associated with confirmation bias. Their results suggest that this form of bias is unconscious and driven by emotion.
The study was carried out during the pre-electoral period of the 2004 presidential election on 30 men, half of whom described themselves as strong Republicans and half as strong Democrats. During a functional magnetic resonance imaging (fMRI) scan, the subjects were asked to assess contradictory statements by both George W. Bush and John Kerry. The scans showed that the part of the brain associated with reasoning, the dorsolateral prefrontal cortex, was not involved when the subjects assessed the statements. Instead, the most active regions of the brain were those involved in processing emotions (orbitofrontal cortex), conflict resolution (anterior cingulate cortex) and making judgments about moral accountability (posterior cingulate cortex).[3]
Dr. Westen summarised the work:
“None of the circuits involved in conscious reasoning were particularly engaged. Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones.... Everyone from executives and judges to scientists and politicians may reason to emotionally biased judgments when they have a vested interest in how to interpret 'the facts'.”[4]
Evans experiment
In a series of experiments by Evans et al., subjects were presented with deductive arguments (in each of which a series of premises and a conclusion are given) and asked to indicate whether each conclusion necessarily follows from the premises given. In other words, the subjects were asked to evaluate logical validity. The subjects, however, exhibited confirmation bias when they rejected valid arguments with unbelievable conclusions and endorsed invalid arguments with believable conclusions. It seems that instead of following the directions and assessing logical validity, the subjects based their assessments on personal beliefs.[5]
It has been argued that, as in the case of the matching bias, using more realistic content in syllogisms can facilitate more normative performance, whereas more abstract, artificial content has a biasing effect on performance.
Reasons for effect
There are prosaic reasons why beliefs persevere despite contrary evidence: embarrassment over having to withdraw a publicly declared belief, for example, or simple stubbornness or hope. Superstition, religion, or ideology can also lead a believer to give greater weight to articles of faith than to facts.[citation needed]
One explanation may lie in the workings of the human sensory system. Human brains and senses are organised to facilitate rapid evaluation of social situations and of others' states of mind. There is an evolutionary benefit in quickly estimating significance and relevance rather than waiting for an exact answer. Studies have shown that this behaviour is evident in the choosing of friends and partners,[6] and even of houses,[7] though it is largely subconscious. Although it can be a very fast process,[8] the initial impression has a lasting effect, a byproduct of the brain's tendency to fill in the gaps of what it perceives and of an unwillingness on the part of a believer to admit that their thinking was erroneous.
Polarization effect
Polarization occurs when mixed or neutral evidence is used to bolster an already established and clearly biased point of view. As a result, people on both sides can move farther apart, or polarize, when they are presented with the same mixed evidence.
In 1979, Lord, Ross and Lepper conducted an experiment to explore what would happen if they presented subjects holding divergent opinions with the same body of mixed evidence. They hypothesized that each opposing group would use the same pieces of evidence to further support its opinion. The subjects were 24 proponents and 24 opponents of the death penalty. They were given an article about the effectiveness of capital punishment and asked to evaluate it. The subjects were then given detailed research descriptions of the study they had just read, this time including the procedure, results, prominent criticisms, and the results shown in a table or graph. They were then asked to evaluate the study, stating how well it was conducted and how convincing the evidence was overall.
The results were consistent with the hypothesis. Students found that studies which supported their pre-existing view were superior to those which contradicted it, in a number of detailed and specific ways. In fact, the studies all described the same experimental procedure, with only the purported result changed.[9]
Overall, there was a visible increase in attitude polarization. Initial analysis showed that proponents and opponents reported shifting their attitudes slightly in the direction of the first study they read, but once subjects read the more detailed study, they returned to their original belief regardless of the evidence provided, pointing to the details that supported their viewpoint and disregarding anything contrary.
This is not to say that the subjects deliberately viewed the evidence in a biased manner; rather, because they already held such strong opinions about capital punishment, their reading of the evidence was colored by their point of view. Looking at the same piece of evidence, an opponent and a proponent would each argue that it supported their own cause, pushing their contrary opinions even further into opposing corners.
Polarization can occur in conjunction with other assimilation biases such as illusory correlation, selective exposure or primacy effects. The normative model for this bias is the neutral evidence principle. Notably, a formulated belief can prevail even if the evidence used in its initial formation is entirely negated.[10]
Tolstoy syndrome
Confirmation bias has also been called the Tolstoy syndrome, after a quotation from Count Leo Tolstoy (1828-1910):
“ | "I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life".[11] | ” |
A related Tolstoy quote is:
“ | "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him." | ” |
Myside bias
The term "myside bias" was coined by David Perkins, myside referring to "my" side of the issue under consideration. An important consequence of the myside bias is that many incorrect beliefs are slow to change and often become stronger even when evidence is presented which should weaken the belief. Generally, such irrational belief persistence results from according too much weight to evidence that accords with one's belief, and too little weight to evidence that does not. It can also result from the failure to search impartially for information.
Jonathan Baron describes many instances in which myside bias affects our lives. For example, poor students suffer from irrational belief persistence when they fail to criticize their own ideas and remain rigid in their mistaken beliefs; they exhibit myside bias because they do not look for, or tend to ignore, evidence against their mistaken claims. Baron also cites certain forms of psychopathology as examples of myside bias. Delusional patients, for instance, may persistently and wrongly believe that a cough or sneeze means they are dying, even when doctors insist that they are healthy.
A. T. Beck describes the role of this type of bias in depressive patients. He argues that depressive patients maintain their depressive state because they fail to recognize information that might make them happier, focusing only on evidence that their lives are unfulfilling. According to Beck, an important step in the cognitive treatment of these individuals is to overcome this bias and to search for and recognize information about their lives more impartially.
See also
- Expectancy effect
- List of cognitive biases
- Informational listening
- Effective listening
- Active listening
- Wason selection task
- Robert Jervis
- Hostile media effect
- Experimenter's regress
- George Orwell's "Doublethink"
- Forer effect, the personal validation fallacy
- Fortune-telling
- Retroactive clairvoyance, or Postdiction
References
- ^ Tim van Gelder, "Heads I win, tails you lose": A Foray Into the Psychology of Philosophy
- ^ Westen, Drew; Kilts, C.; Blagov, P.; Harenski, K.; Hamann, S. (2006). "The neural basis of motivated reasoning: An fMRI study of emotional constraints on political judgment during the U.S. Presidential election of 2004". Journal of Cognitive Neuroscience.
- ^ Shermer, Michael (July 2006). The Political Brain. Scientific American. Retrieved on September 3, 2006.
- ^ Emory University Health Sciences Center (2006-01-31). Emory Study Lights Up The Political Brain. Science Daily. Retrieved on September 3, 2006.
- ^ Evans, J. St. B. T., Barston, J.L., & Pollard, P. (1983). On the conflict between logic and belief in syllogistic reasoning. Memory and Cognition, 11, 295-306.
- ^ [1]
- ^ websites
- ^ [2]
- ^ summary here
- ^ Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37, 2098-2109.
- ^ Glossary. Ask Dr. Stoll. Retrieved on April 13, 2006.
Further reading
- Wason, P.C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129-140.
- Wason, P.C. (1966). Reasoning. In B. M. Foss (Ed.), New horizons in psychology I, 135-151. Harmondsworth, UK: Penguin.
- Wason, P.C. (1968). Reasoning about a rule. Quarterly Journal of Experimental Psychology, 20, 273-281.
- Mynatt, C.R., Doherty, M.E., & Tweney, R.D. (1977). Confirmation bias in a simulated research environment: an experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29, 85-95.
- Griggs, R.A. & Cox, J.R. (1982). The elusive thematic materials effect in the Wason selection task. British Journal of Psychology, 73, 407-420.
- Nickerson, R.S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175-220.
- Fugelsang, J., Stein, C., Green, A., & Dunbar, K. (2004). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory. Canadian Journal of Experimental Psychology, 58, 132-141.
- Cohen, L.J. (1981). Can human irrationality be experimentally demonstrated? The Behavioral and Brain Sciences, 4, 317-370.
- Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32, 880-892.
- Schumann, D. W. (Ed.) (1989). Causal reasoning and belief perseverance. Proceedings of the Society for Consumer Psychology (pp. 115-120). Knoxville, TN: University of Tennessee.
- Tutin, J. (1983). Belief perseverance: A replication and extension of social judgment findings. Educational Resources Information Center, ED240409.
- Bell, R. (1992). Impure Science. New York: John Wiley & Sons.
- Helman, H. (1998). Great Feuds in Science. New York: John Wiley & Sons.
- Kohn, A. (1986). False Prophets. New York: Basil Blackwell.
- Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: Use of differential decision criteria for preferred and non-preferred conclusions. Journal of Personality and Social Psychology, 63, 568-584.
- Edwards K. & Smith E. E. (1996). A disconfirmation bias in the evaluation of arguments. Journal of Personality and Social Psychology, 71, 5-24.
- Baron, Jonathan. (1988, 1994, 2000). Thinking and Deciding. Cambridge University Press.
- Beck, A.T. (1976). Cognitive therapy and the emotional disorders. New York: International Universities Press.