
Confirmation bias

Confirmation bias, also called myside bias, is the tendency to search for, interpret, or remember information in a way that confirms one's beliefs or hypotheses.[Note 1][1] It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or recall information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).

A series of experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. In certain situations, this tendency can bias people's conclusions. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another explanation is that people show confirmation bias because they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way.

Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political and organizational contexts.

External links

  • Skeptic's Dictionary: confirmation bias by Robert T. Carroll
  • Teaching about confirmation bias, class handout and instructor's notes by K. H. Grobman
  • Confirmation bias at You Are Not So Smart
  • Confirmation bias learning object, interactive number triples exercise by Rod McFarland for Simon Fraser University
  • Brief summary of the 1979 Stanford assimilation bias study by Keith Rollag, Babson College

Further reading

  • Stanovich, Keith (2009), What Intelligence Tests Miss: The Psychology of Rational Thought, New Haven (CT): Yale University Press
  • Westen, Drew (2007), The Political Brain: The Role of Emotion in Deciding the Fate of the Nation, PublicAffairs
  • Keohane, Joe (11 July 2010), "How facts backfire: Researchers discover a surprising threat to democracy: our brains", Boston Globe (NY Times)

References

  • Baron, Jonathan (2000), Thinking and Deciding (3rd ed.), New York: Cambridge University Press
  • Fine, Cordelia (2006), A Mind of Its Own: How Your Brain Distorts and Deceives, Cambridge, UK: Icon Books
  • Friedrich, James (1993), "Primary error detection and minimization (PEDMIN) strategies in social cognition: a reinterpretation of confirmation bias phenomena", Psychological Review (American Psychological Association) 100 (2): 298–319
  • Hergovich, Andreas; Schott, Reinhard; Burger, Christoph (2010), "Biased Evaluation of Abstracts Depending on Topic and Conclusion: Further Evidence of a Confirmation Bias Within Scientific Psychology", Current Psychology 29 (3): 188–209
  • Horrobin, David F. (1990), "The philosophical basis of peer review and the suppression of innovation", Journal of the American Medical Association 263 (10): 1438–1441
  • Kida, Thomas E. (2006), Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking, Amherst, New York: Prometheus Books
  • Koehler, Jonathan J. (1993), "The influence of prior beliefs on scientific judgments of evidence quality", Organizational Behavior and Human Decision Processes 56: 28–55
  • Lewicka, Maria (1998), "Confirmation Bias: Cognitive Error or Adaptive Strategy of Action Control?", in Kofta, Mirosław; Weary, Gifford; Sedek, Grzegorz, Personal Control in Action: Cognitive and Motivational Mechanisms, Springer, pp. 233–255
  • Maccoun, Robert J. (1998), "Biases in the interpretation and use of research results", Annual Review of Psychology 49: 259–287
  • Mahoney, Michael J. (1977), "Publication prejudices: an experimental study of confirmatory bias in the peer review system", Cognitive Therapy and Research 1 (2): 161–175
  • Oswald, Margit E.; Grosjean, Stefan (2004), "Confirmation Bias", in Pohl, Rüdiger F., Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, Hove, UK: Psychology Press, pp. 79–96
  • Poletiek, Fenna (2001), Hypothesis-Testing Behaviour, Hove, UK: Psychology Press
  • Risen, Jane; Gilovich, Thomas (2007), "Informal Logical Fallacies", in Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F., Critical Thinking in Psychology, Cambridge University Press, pp. 110–130
  • Vyse, Stuart A. (1997), Believing in Magic: The Psychology of Superstition, New York: Oxford University Press


  1. ^ a b Plous 1993, p. 233
  2. ^ Nickerson, Raymond S. (June 1998). "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises". Review of General Psychology 2 (2): 175–220.  
  3. ^ Darley, John M.; Gross, Paget H. (2000), "A Hypothesis-Confirming Bias in Labelling Effects", in Stangor, Charles, Stereotypes and prejudice: essential readings, Psychology Press, p. 212,  
  4. ^ Risen & Gilovich 2007
  5. ^ a b c Zweig, Jason (November 19, 2009), "How to Ignore the Yes-Man in Your Head", Wall Street Journal (Dow Jones & Company), retrieved 2010-06-13 
  6. ^ Nickerson 1998, pp. 177–178
  7. ^ a b c d Kunda 1999, pp. 112–115
  8. ^ a b Baron 2000, pp. 162–164
  9. ^ Kida 2006, pp. 162–165
  10. ^ Devine, Patricia G.; Hirt, Edward R.; Gehrke, Elizabeth M. (1990), "Diagnostic and confirmation strategies in trait hypothesis testing", Journal of Personality and Social Psychology (American Psychological Association) 58 (6): 952–963,  
  11. ^ Trope, Yaacov; Bassok, Miriam (1982), "Confirmatory and diagnosing strategies in social information gathering", Journal of Personality and Social Psychology (American Psychological Association) 43 (1): 22–34,  
  12. ^ a b c Klayman, Joshua; Ha, Young-Won (1987), "Confirmation, Disconfirmation and Information in Hypothesis Testing", Psychological Review (American Psychological Association) 94 (2): 211–228,  
  13. ^ a b c Oswald & Grosjean 2004, pp. 82–83
  14. ^ Kunda, Ziva; Fong, G.T.; Sanitioso, R.; Reber, E. (1993), "Directional questions direct self-conceptions", Journal of Experimental Social Psychology (Society of Experimental Social Psychology) 29: 62–63, via Fine 2006, pp. 63–65
  15. ^ a b Shafir, E. (1993), "Choosing versus rejecting: why some options are both better and worse than others", Memory and Cognition 21 (4): 546–556,   via Fine 2006, pp. 63–65
  16. ^ Snyder, Mark; Swann, Jr., William B. (1978), "Hypothesis-Testing Processes in Social Interaction", Journal of Personality and Social Psychology (American Psychological Association) 36 (11): 1202–1212,   via Poletiek 2001, p. 131
  17. ^ a b Kunda 1999, pp. 117–118
  18. ^ a b Albarracin, D.; Mitchell, A.L. (2004). "The Role of Defensive Confidence in Preference for Proattitudinal Information: How Believing That One Is Strong Can Sometimes Be a Defensive Weakness". Personality and Social Psychology Bulletin 30 (12): 1565–1584.  
  19. ^ Fischer, P.; Fischer, Julia K.; Aydin, Nilüfer; Frey, Dieter (2010). "Physically Attractive Social Information Sources Lead to Increased Selective Exposure to Information". Basic and Applied Social Psychology 32 (4): 340–347.  
  20. ^ a b c Stanovich, K. E.; West, R. F.; Toplak, M. E. (2013). "Myside Bias, Rational Thinking, and Intelligence". Current Directions in Psychological Science 22 (4): 259–264.  
  21. ^ a b Mynatt, Clifford R.; Doherty, Michael E.; Tweney, Ryan D. (1978), "Consequences of confirmation and disconfirmation in a simulated research environment", Quarterly Journal of Experimental Psychology 30 (3): 395–406,  
  22. ^ Kida 2006, p. 157
  23. ^ a b c d e f Lord, Charles G.; Ross, Lee; Lepper, Mark R. (1979), "Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence", Journal of Personality and Social Psychology (American Psychological Association) 37 (11): 2098–2109,  
  24. ^ a b Baron 2000, pp. 201–202
  25. ^ Vyse 1997, p. 122
  26. ^ a b c d Taber, Charles S.; Lodge, Milton (July 2006), "Motivated Skepticism in the Evaluation of Political Beliefs", American Journal of Political Science (Midwest Political Science Association) 50 (3): 755–769,  
  27. ^ a b c Westen, Drew; Blagov, Pavel S.; Harenski, Keith; Kilts, Clint; Hamann, Stephan (2006), "Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election", Journal of Cognitive Neuroscience (Massachusetts Institute of Technology) 18 (11): 1947–1958,  
  28. ^ Gadenne, V.; Oswald, M. (1986), "Entstehung und Veränderung von Bestätigungstendenzen beim Testen von Hypothesen [Formation and alteration of confirmatory tendencies during the testing of hypotheses]", Zeitschrift für experimentelle und angewandte Psychologie 33: 360–374  via Oswald & Grosjean 2004, p. 89
  29. ^ Hastie, Reid; Park, Bernadette (2005), "The Relationship Between Memory and Judgment Depends on Whether the Judgment Task is Memory-Based or On-Line", in Hamilton, David L., Social cognition: key readings, New York: Psychology Press, p. 394,  
  30. ^ a b c Oswald & Grosjean 2004, pp. 88–89
  31. ^ Stangor, Charles; McMillan, David (1992), "Memory for expectancy-congruent and expectancy-incongruent information: A review of the social and social developmental literatures", Psychological Bulletin (American Psychological Association) 111 (1): 42–61,  
  32. ^ a b Snyder, M.; Cantor, N. (1979), "Testing hypotheses about other people: the use of historical knowledge", Journal of Experimental Social Psychology 15 (4): 330–342,   via Goldacre 2008, p. 231
  33. ^ Kunda 1999, pp. 225–232
  34. ^ Sanitioso, Rasyid; Kunda, Ziva; Fong, G.T. (1990), "Motivated recruitment of autobiographical memories", Journal of Personality and Social Psychology (American Psychological Association) 59 (2): 229–241,  
  35. ^ a b c Levine, L.; Prohaska, V.; Burgess, S.L.; Rice, J.A.; Laulhere, T.M. (2001). "Remembering past emotions: The role of current appraisals.". Cognition and Emotion 15: 393–417.  
  36. ^ a b Safer, M.A.; Bonanno, G.A.; Field, N. (2001). ""It was never that bad": Biased recall of grief and long-term adjustment to the death of a spouse". Memory 9 (3): 195–203.  
  37. ^ a b Russell, Dan; Jones, Warren H. (1980), "When superstition fails: Reactions to disconfirmation of paranormal beliefs", Personality and Social Psychology Bulletin (Society for Personality and Social Psychology) 6 (1): 83–88,   via Vyse 1997, p. 121
  38. ^ a b c Kuhn, Deanna; Lao, Joseph (March 1996), "Effects of Evidence on Attitudes: Is Polarization the Norm?", Psychological Science (American Psychological Society) 7 (2): 115–120,  
  39. ^ Baron 2000, p. 201
  40. ^ Miller, A.G.; McHoskey, J.W.; Bane, C.M.; Dowd, T.G. (1993), "The attitude polarization phenomenon: Role of response measure, attitude extremity, and behavioral consequences of reported attitude change", Journal of Personality and Social Psychology 64 (4): 561–574,  
  41. ^ "backfire effect".  
  42. ^ Silverman, Craig (2011-06-17). "The Backfire Effect". Columbia Journalism Review. Retrieved 2012-05-01. "When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger."
  43. ^ Nyhan, Brendan; Reifler, Jason (2010). "When Corrections Fail: The Persistence of Political Misperceptions". Political Behavior 32 (2): 303–330.  
  44. ^ a b c d Ross, Lee; Anderson, Craig A. (1982), "Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments", in Kahneman, Daniel; Slovic, Paul; Tversky, Amos, Judgment under uncertainty: Heuristics and biases, Cambridge University Press, pp. 129–152,  
  45. ^ a b c d Nickerson 1998, p. 187
  46. ^ Kunda 1999, p. 99
  47. ^ Ross, Lee; Lepper, Mark R.; Hubbard, Michael (1975), "Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm", Journal of Personality and Social Psychology (American Psychological Association) 32 (5): 880–892,   via Kunda 1999, p. 99
  48. ^ a b c Anderson, Craig A.; Lepper, Mark R.; Ross, Lee (1980), "Perseverance of Social Theories: The Role of Explanation in the Persistence of Discredited Information", Journal of Personality and Social Psychology (American Psychological Association) 39 (6): 1037–1049,  
  49. ^ a b c d e Baron 2000, pp. 197–200
  50. ^ a b c Fine 2006, pp. 66–70
  51. ^ a b Plous 1993, pp. 164–166
  52. ^ Redelmeier, D. A.; Tversky, Amos (1996), "On the belief that arthritis pain is related to the weather", Proceedings of the National Academy of Sciences 93 (7): 2895–2896, via Kunda 1999, p. 127
  53. ^ a b c Kunda 1999, pp. 127–130
  54. ^ Plous 1993, pp. 162–164
  55. ^ Adapted from Fiedler, Klaus (2004), "Illusory correlation", in Pohl, Rüdiger F., Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, Hove, UK: Psychology Press, p. 103
  56. ^ Stanovich, K. E.; West, R. F.; Toplak, M. E. (5 August 2013). "Myside Bias, Rational Thinking, and Intelligence". Current Directions in Psychological Science 22 (4): 259–264.  
  57. ^ Baron, Jonathan (1995). "Myside bias in thinking about abortion.". Thinking & Reasoning: 221–235. 
  58. ^ a b c d Wolfe, Christopher; Anne Britt (2008). "The locus of the myside bias in written argumentation". Thinking & Reasoning 14: 1–27.  
  59. ^ Mason, Lucia; Scirica, Fabio (October 2006). "Prediction of students' argumentation skills about controversial topics by epistemological understanding". Learning and Instruction 16 (5): 492–509.  
  60. ^ Weinstock, Michael (December 2009). "Relative expertise in an everyday reasoning task: Epistemic understanding, problem representation, and reasoning competence". Learning and Individual Differences 19 (4): 423–434.  
  61. ^ Weinstock, Michael; Neuman, Yair; Tabak, Iris (January 2004). "Missing the point or missing the norms? Epistemological norms as predictors of students' ability to identify fallacious arguments".  
  62. ^ a b Baron 2000, pp. 195–196
  63. ^ Thucydides 4.108.4
  64. ^ Alighieri, Dante. Paradiso canto XIII: 118–120. Trans. Allen Mandelbaum
  65. ^ a b Bacon, Francis (1620). Novum Organum. reprinted in Burtt, E.A., ed. (1939), The English philosophers from Bacon to Mill, New York: Random House, p. 36  via Nickerson 1998, p. 176
  66. ^ Tolstoy, Leo. What is Art? p. 124 (1899). In The Kingdom of God Is Within You (1893), he similarly declared, "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him" (ch. 3). Translated from the Russian by Constance Garnett, New York, 1894. Project Gutenberg edition released November 2002. Retrieved 2009-08-24.
  67. ^ Gale, Maggie; Ball, Linden J. (2002), "Does Positivity Bias Explain Patterns of Performance on Wason's 2-4-6 task?", in Gray, Wayne D.; Schunn, Christian D., Proceedings of the Twenty-Fourth Annual Conference of the Cognitive Science Society, Routledge, p. 340,  
  68. ^ a b Wason, Peter C. (1960), "On the failure to eliminate hypotheses in a conceptual task", Quarterly Journal of Experimental Psychology (Psychology Press) 12 (3): 129–140,  
  69. ^ Nickerson 1998, p. 179
  70. ^ Lewicka 1998, p. 238
  71. ^ Oswald & Grosjean 2004, pp. 79–96
  72. ^ Wason, Peter C. (1968), "Reasoning about a rule", Quarterly Journal of Experimental Psychology (Psychology Press) 20 (3): 273–281
  73. ^ a b c Sutherland, Stuart (2007), Irrationality (2nd ed.), London: Pinter and Martin, pp. 95–103,  
  74. ^ Barkow, Jerome H.; Cosmides, Leda; Tooby, John (1995), The adapted mind: evolutionary psychology and the generation of culture, Oxford University Press US, pp. 181–184,  
  75. ^ Oswald & Grosjean 2004, pp. 81–82, 86–87
  76. ^ Lewicka 1998, p. 239
  77. ^ Tweney, Ryan D.; Doherty, Michael E.; Worner, Winifred J.; Pliske, Daniel B.; Mynatt, Clifford R.; Gross, Kimberly A.; Arkkelin, Daniel L. (1980), "Strategies of rule discovery in an inference task", The Quarterly Journal of Experimental Psychology (Psychology Press) 32 (1): 109–123,   (Experiment IV)
  78. ^ Oswald & Grosjean 2004, pp. 86–89
  79. ^ a b Hergovich, Schott & Burger 2010
  80. ^ Maccoun 1998
  81. ^ Friedrich 1993, p. 298
  82. ^ Kunda 1999, p. 94
  83. ^ Nickerson 1998, pp. 198–199
  84. ^ Nickerson 1998, p. 200
  85. ^ a b c Nickerson 1998, p. 197
  86. ^ Baron 2000, p. 206
  87. ^ Matlin, Margaret W. (2004), "Pollyanna Principle", in Pohl, Rüdiger F., Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, Hove: Psychology Press, pp. 255–272,  
  88. ^ Dawson, Erica; Gilovich, Thomas; Regan, Dennis T. (October 2002), "Motivated Reasoning and Performance on the Wason Selection Task", Personality and Social Psychology Bulletin (Society for Personality and Social Psychology) 28 (10): 1379–1387,  
  89. ^ Ditto, Peter H.; Lopez, David F. (1992), "Motivated skepticism: use of differential decision criteria for preferred and nonpreferred conclusions", Journal of personality and social psychology (American Psychological Association) 63 (4): 568–584,  
  90. ^ Nickerson 1998, p. 198
  91. ^ Oswald & Grosjean 2004, pp. 91–93
  92. ^ Friedrich 1993, pp. 299, 316–317
  93. ^ Trope, Y.; Liberman, A. (1996), "Social hypothesis testing: cognitive and motivational mechanisms", in Higgins, E. Tory; Kruglanski, Arie W., Social Psychology: Handbook of basic principles, New York: Guilford Press,   via Oswald & Grosjean 2004, pp. 91–93
  94. ^ a b Dardenne, Benoit; Leyens, Jacques-Philippe (1995), "Confirmation Bias as a Social Skill", Personality and Social Psychology Bulletin (Society for Personality and Social Psychology) 21 (11): 1229–1239,  
  95. ^ Shanteau, James (2003). Sandra L. Schneider, ed. Emerging perspectives on judgment and decision research. Cambridge [u.a.]: Cambridge University Press. p. 445.  
  96. ^ Haidt, Jonathan (2012). The Righteous Mind : Why Good People are Divided by Politics and Religion. New York: Pantheon Books. pp. 1473–4 (e–book edition).  
  97. ^ Fiske, Susan T.; Gilbert, Daniel T.; Lindzey, Gardner, eds. (2010). The Handbook of Social Psychology (5th ed.). Hoboken, N.J.: Wiley. p. 811.
  98. ^ Pompian, Michael M. (2006), Behavioral finance and wealth management: how to build optimal portfolios that account for investor biases, John Wiley and Sons, pp. 187–190,  
  99. ^ Hilton, Denis J. (2001), "The psychology of financial decision-making: Applications to trading, dealing, and investment analysis", Journal of Behavioral Finance (Institute of Behavioral Finance) 2 (1): 37–39,  
  100. ^ Krueger, David; Mann, John David (2009), The Secret Language of Money: How to Make Smarter Financial Decisions and Live a Richer Life, McGraw Hill Professional, pp. 112–113,  
  101. ^ a b Nickerson 1998, p. 192
  102. ^ Goldacre 2008, p. 233
  103. ^  
  104. ^ Atwood, Kimball (2004), "Naturopathy, Pseudoscience, and Medicine: Myths and Fallacies vs Truth", Medscape General Medicine 6 (1): 33 
  105. ^ Neenan, Michael; Dryden, Windy (2004), Cognitive therapy: 100 key points and techniques, Psychology Press, p. ix,  
  106. ^ Blackburn, Ivy-Marie; Davidson, Kate M. (1995), Cognitive therapy for depression & anxiety: a practitioner's guide (2 ed.), Wiley-Blackwell, p. 19,  
  107. ^ Harvey, Allison G.; Watkins, Edward; Mansell, Warren (2004), Cognitive behavioural processes across psychological disorders: a transdiagnostic approach to research and treatment, Oxford University Press, pp. 172–173, 176,  
  108. ^ Nickerson 1998, pp. 191–193
  109. ^ Myers, D.G.; Lamm, H. (1976), "The group polarization phenomenon", Psychological Bulletin 83 (4): 602–627,   via Nickerson 1998, pp. 193–194
  110. ^ Halpern, Diane F. (1987), Critical thinking across the curriculum: a brief edition of thought and knowledge, Lawrence Erlbaum Associates, p. 194,  
  111. ^ Roach, Kent (2010), "Wrongful Convictions: Adversarial and Inquisitorial Themes", North Carolina Journal of International Law and Commercial Regulation 35,  
  112. ^ Baron 2000, pp. 191,195
  113. ^ Kida 2006, p. 155
  114. ^ Tetlock, Philip E. (2005), Expert Political Judgment: How Good Is It? How Can We Know?, Princeton, N.J.: Princeton University Press, pp. 125–128,  
  115. ^ "David Camm Blog: Investigation under fire". WDRB. October 10, 2013. 
  116. ^ Kircher, Travis. "David Camm blogsite: opening statements". WDRB. Retrieved January 3, 2014. 
  117. ^ "David Camm v. State of Indiana". Court of Appeals of Indiana. 2011-11-15. 
  118. ^ Boyd, Gordon (September 10, 2013). "Camm trial 9/10: Defense finds inconsistencies but can't touch Boney's past". WBRC. 
  119. ^ Zambroski, James. "Witness Says Prosecutor In First Camm Trial Blew Up When She Couldn't Link Camm's DNA To Boney's Shirt".
  120. ^ Eisenmenger, Sarah (September 9, 2013). "Convicted Killer Charles Boney says David Camm was the shooter". wave3. Retrieved January 5, 2014. 
  121. ^ Eisenmenger, Sarah (Sep 9, 2013). "Convicted Killer Charles Boney says David Camm was the shooter". wave3. 
  122. ^ Adams, Harold J. (2011-02-18). "David Camm's attorney's appeal ruling, seek prosecutor's removal". Courier Journal, page B1. 
  123. ^ David Camm verdict: NOT GUILTY, WDRB TV, October 24, 2013
  124. ^ a b Smith, Jonathan C. (2009), Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit, John Wiley and Sons, pp. 149–151,  
  125. ^ Randi, James (1991), James Randi: psychic investigator, Boxtree, pp. 58–62,  
  126. ^ a b Nickerson 1998, p. 190
  127. ^ a b Nickerson 1998, pp. 192–194
  128. ^ a b Koehler 1993
  129. ^ a b c Mahoney 1977
  130. ^ Proctor, Robert W.; Capaldi, E. John (2006), Why science matters: understanding the methods of psychological research, Wiley-Blackwell, p. 68,  
  131. ^ Sternberg, Robert J. (2007), "Critical Thinking in Psychology: It really is critical", in Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F., Critical Thinking in Psychology, Cambridge University Press, p. 292,  
  132. ^ a b Shadish, William R. (2007), "Critical Thinking in Quasi-Experimentation", in Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F., Critical Thinking in Psychology, Cambridge University Press, p. 49,  
  133. ^ Jüni, P.; Altman, D. G.; Egger, M. (2001). "Systematic reviews in health care: Assessing the quality of controlled clinical trials". BMJ (Clinical research ed.) 323 (7303): 42–46.  
  134. ^ Shermer, Michael (July 2006), "The Political Brain", Scientific American,  
  135. ^ Emerson, G. B.; Warme, W. J.; Wolf, F. M.; Heckman, J. D.; Brand, R. A.; Leopold, S. S. (2010). "Testing for the Presence of Positive-Outcome Bias in Peer Review: A Randomized Controlled Trial". Archives of Internal Medicine 170 (21): 1934–1939.  
  136. ^ Horrobin 1990
  137. ^ a b Swann, William B.; Pelham, Brett W.; Krull, Douglas S. (1989), "Agreeable Fancy or Disagreeable Truth? Reconciling Self-Enhancement and Self-Verification", Journal of Personality and Social Psychology (American Psychological Association) 57 (5): 782–791,  
  138. ^ a b Swann, William B.; Read, Stephen J. (1981), "Self-Verification Processes: How We Sustain Our Self-Conceptions", Journal of Experimental Social Psychology (Academic Press) 17 (4): 351–372,  
  139. ^ Story, Amber L. (1998), "Self-Esteem and Memory for Favorable and Unfavorable Personality Feedback", Personality and Social Psychology Bulletin (Society for Personality and Social Psychology) 24 (1): 51–64,  
  140. ^ White, Michael J.; Brockett, Daniel R.; Overstreet, Belinda G. (1993), "Confirmatory Bias in Evaluating Personality Test Information: Am I Really That Kind of Person?", Journal of Counseling Psychology (American Psychological Association) 40 (1): 120–126,  
  141. ^ Swann, William B.; Read, Stephen J. (1981), "Acquiring Self-Knowledge: The Search for Feedback That Fits", Journal of Personality and Social Psychology (American Psychological Association) 41 (6): 1119–1128,  
  142. ^ Shrauger, J. Sidney; Lund, Adrian K. (1975), "Self-evaluation and reactions to evaluations from others", Journal of Personality (Duke University Press) 43 (1): 94–108,  


Notes

  1. ^ David Perkins, a professor at the Harvard Graduate School of Education, coined the term "myside bias" in reference to a preference for "my" side of an issue. (Baron 2000, p. 195)
  2. ^ Text in cited article: Tuchman (1984) described a form of confirmation bias at work in the process of justifying policies to which a government has committed itself: “Once a policy has been adopted and implemented, all subsequent activity becomes an effort to justify it” (p. 245). Discussing the policy that drew the United States into war in Vietnam and kept the U.S. military engaged for 16 years despite abundant evidence that it was a lost cause from the beginning, Tuchman argued that once a policy has been adopted and implemented by a government, all of that government's subsequent activity becomes focused on justifying it.
    Wooden-headedness, the source of self-deception, is a factor that plays a remarkably large role in government. It consists in assessing a situation in terms of preconceived fixed notions while ignoring or rejecting any contrary signs. It is acting according to wish while not allowing oneself to be deflected by the facts. It is epitomized in a historian’s statement about Philip II of Spain, the surpassing wooden-head of all sovereigns: “no experience of the failure of his policy could shake his belief in its essential excellence”. (p. 7)
    Folly, she argued, is a form of self-deception characterized by “insistence on a rooted notion regardless of contrary evidence” (p.209)
  3. ^ "Assimilation bias" is another term used for biased interpretation of evidence. (Risen & Gilovich 2007, p. 113)
  4. ^ Wason also used the term "verification bias". (Poletiek 2001, p. 73)


In self-image

Social psychologists have identified two tendencies in the way people seek or interpret information about themselves. Self-verification is the drive to reinforce the existing self-image, and self-enhancement is the drive to seek positive feedback. Both are served by confirmation biases.[137] In experiments where people are given feedback that conflicts with their self-image, they are less likely to attend to it or remember it than when given self-verifying feedback.[138][139][140] They reduce the impact of such information by interpreting it as unreliable.[138][141][142] Similar experiments have found a preference for positive feedback, and for the people who give it, over negative feedback.[137]

In science

A distinguishing feature of scientific thinking is the search for falsifying as well as confirming evidence.[127] However, many times in the history of science, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data.[127] The assessment of the quality of scientific studies seems to be particularly vulnerable to confirmation bias: it has been found several times that scientists rate studies reporting findings consistent with their prior beliefs more favorably than studies reporting inconsistent findings.[79][128][129] However, assuming that the research question is relevant, the experimental design adequate, and the data clearly and comprehensively described, the results should be of importance to the scientific community and should not be viewed prejudicially, regardless of whether they conform to current theoretical predictions.[129]

In the context of scientific research, confirmation biases can sustain theories or research programs in the face of inadequate or even contradictory evidence;[73][130] the field of parapsychology has been particularly affected.[131]

An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. To combat this tendency, scientific training teaches ways to prevent bias.[132] For example, the experimental design of randomized controlled trials (coupled with their systematic review) aims to minimize sources of bias.[132][133] The social process of peer review is thought to mitigate the effect of individual scientists' biases,[134] even though peer review itself may be susceptible to such biases.[129][135] Confirmation bias may thus be especially harmful to objective evaluations of nonconforming results, since biased individuals may regard opposing evidence as weak in principle and give little serious thought to revising their beliefs.[128] Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.[136]

In the paranormal

One factor in the appeal of alleged psychic readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives.[124] By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client.[124] Investigator James Randi compared the transcript of a reading to the client's report of what the psychic had said, and found that the client showed a strong selective recall of the "hits".[125]

As a striking illustration of confirmation bias in the real world, Nickerson mentions numerological pyramidology: the practice of finding meaning in the proportions of the Egyptian pyramids.[126] There are many different length measurements that can be made of, for example, the Great Pyramid of Giza, and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.[126]

In politics and law

Mock trials allow researchers to examine confirmation biases in a realistic setting.

Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to.[108] Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with mock trials.[109][110] Both inquisitorial and adversarial criminal justice systems are affected by confirmation bias.[111]

Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.[112] On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists Stuart Sutherland and Thomas Kida have each argued that US Admiral Husband E. Kimmel showed confirmation bias when playing down the first signs of the Japanese attack on Pearl Harbor.[73][113]

A two-decade study of political pundits by Philip E. Tetlock found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias—specifically, their inability to make use of new information that contradicted their existing theories.[114]

In the 2013 murder trial of David Camm, the defense argued that Camm was charged with the murders of his wife and two children solely because of confirmation bias within the investigation.[115] Camm was arrested three days after the murders on the basis of faulty evidence. Despite the discovery that almost every piece of evidence on the probable cause affidavit was inaccurate or unreliable, the charges against him were not dropped.[116][117] A sweatshirt found at the crime scene was later discovered to contain the DNA of a convicted felon, his prison nickname, and his department of corrections number.[118] Investigators looked for Camm's DNA on the sweatshirt but failed to investigate the other evidence found on it, and the foreign DNA was not run through CODIS until five years after the crime.[119][120] When the second suspect was identified, prosecutors charged the two men as co-conspirators despite finding no evidence linking them.[121][122] Camm was acquitted of the murders.[123]

Cognitive therapy was developed by Aaron T. Beck in the early 1960s and has become a popular approach.[105] According to Beck, biased information processing is a factor in depression.[106] His approach teaches people to treat evidence impartially, rather than selectively reinforcing negative outlooks.[62] Phobias and hypochondria have also been shown to involve confirmation bias for threatening information.[107]

Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the arrival of scientific medicine.[101] If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course.[101] Biased assimilation is a factor in the modern appeal of alternative medicine, whose proponents are swayed by positive anecdotal evidence but treat scientific evidence hyper-critically.[102][103][104]

In physical and mental health

Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.[5][98] In studies of political stock markets, investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit.[99] To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument".[100] In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.[5]

In finance


Psychologists Jennifer Lerner and Philip Tetlock distinguish two different kinds of thinking process. Exploratory thought neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while confirmatory thought seeks to justify a specific point of view. Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification.[95] Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they don't already know.[96] Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.[97]

Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.[91] Using ideas from evolutionary psychology, James Friedrich suggests that people do not primarily aim at truth in testing hypotheses, but try to avoid the most costly errors. For example, employers might ask one-sided questions in job interviews because they are focused on weeding out unsuitable candidates.[92] Yaacov Trope and Akiva Liberman's refinement of this theory assumes that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate or remember evidence of their honesty in a biased way.[93]

When someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more empathic.[94] This suggests that when talking to someone who seems to be an introvert, it is a sign of better social skills to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly self-monitoring students, who are more sensitive to their environment and to social norms, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.[94]
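Trope and Liberman's error-cost comparison can be sketched numerically. The costs and probabilities below are invented for illustration, not taken from their work:

```python
# Hypothetical costs of the two errors when judging a friend's honesty
# (the numbers are illustrative assumptions, not from Trope and Liberman).
cost_false_rejection = 10.0   # treating an honest friend with suspicion
cost_false_acceptance = 2.0   # continuing to trust a dishonest friend

def expected_cost_of_distrust(p_honest):
    # Distrusting errs exactly when the friend is in fact honest.
    return p_honest * cost_false_rejection

def expected_cost_of_trust(p_honest):
    # Trusting errs exactly when the friend is in fact dishonest.
    return (1 - p_honest) * cost_false_acceptance

# With asymmetric costs, trusting remains the cheaper policy even when
# the evidence for honesty is fairly weak.
for p in (0.3, 0.5, 0.8):
    print(p, expected_cost_of_trust(p) < expected_cost_of_distrust(p))
```

With these illustrative numbers the break-even point is p = 1/6, so continuing to trust (and seeking trust-confirming evidence) minimizes expected cost whenever the probability of honesty exceeds about 0.17.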

Motivational explanations involve an effect of desire on belief, sometimes called "wishful thinking".[85][86] It is known that people prefer pleasant thoughts over unpleasant ones in a number of ways: this is called the "Pollyanna principle".[87] Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true.[85] According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. In other words, they ask, "Can I believe this?" for some suggestions and, "Must I believe this?" for others.[88][89] Although consistency is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information.[85] Social psychologist Ziva Kunda combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.[90]

Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called heuristics, that they use.[81] For example, people may judge the reliability of evidence by using the availability heuristic—i.e., how readily a particular idea comes to mind.[82] It is also possible that people can only focus on one thought at a time, so find it difficult to test alternative hypotheses in parallel.[83] Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.[12][84]

Confirmation bias is often described as a result of automatic, unintentional strategies rather than deliberate deception.[13][79] According to Robert Maccoun, most biased evidence processing occurs through a combination of both "cold" (cognitive) and "hot" (motivated) mechanisms.[80]


In light of this and other critiques, the focus of research moved away from confirmation versus falsification to examine whether people test hypotheses in an informative way, or an uninformative but positive way. The search for "true" confirmation bias led psychologists to look at a wider range of effects in how people process information.[78]

Within the universe of all possible triples, those that fit the true rule are shown schematically as a circle. The hypothesized rule is a smaller circle enclosed within it.
If the true rule (T) encompasses the current hypothesis (H), then positive tests (examining cases that fit H to see whether they fit T) will not show that the hypothesis is false.

Two overlapping circles represent the true rule and the hypothesized rule. Any observation falling in the non-overlapping parts of the circles shows that the two rules are not exactly the same. In other words, those observations falsify the hypothesis.
If the true rule (T) overlaps the current hypothesis (H), then either a negative test or a positive test can potentially falsify H.

The triples fitting the hypothesis are represented as a circle within the universe of all triples. The true rule is a smaller circle within this.
When the working hypothesis (H) includes the true rule (T) then positive tests are the only way to falsify H.

A 1987 paper by Joshua Klayman and Young-Won Ha argued that the Wason experiments had not actually demonstrated a bias towards confirmation. Instead, Klayman and Ha interpreted the results in terms of a tendency to make tests that are consistent with the working hypothesis.[75] They called this the "positive test strategy".[7] This strategy is an example of a heuristic: a reasoning shortcut that is imperfect but easy to compute.[1] Klayman and Ha used Bayesian probability and information theory as their standard of hypothesis-testing, rather than the falsificationism used by Wason. According to these ideas, each answer to a question yields a different amount of information, which depends on the person's prior beliefs. Thus a scientific test of a hypothesis is one that is expected to produce the most information. Since the information content depends on initial probabilities, a positive test can either be highly informative or uninformative. Klayman and Ha argued that when people think about realistic problems, they are looking for a specific answer with a small initial probability. In this case, positive tests are usually more informative than negative tests.[12] However, in Wason's rule discovery task the answer—three numbers in ascending order—is very broad, so positive tests are unlikely to yield informative answers. Klayman and Ha supported their analysis by citing an experiment that used the labels "DAX" and "MED" in place of "fits the rule" and "doesn't fit the rule". This avoided implying that the aim was to find a low-probability rule. Participants had much more success with this version of the experiment.[76][77]
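Klayman and Ha's point about the geometry of H and T can be checked by enumeration. This is a sketch: the number range and the specific hypothesis are illustrative. When H ("each number is two greater than its predecessor") lies strictly inside T ("any ascending sequence"), no positive test can falsify H, while many negative tests can:

```python
from itertools import product

def true_rule(t):     # T: any strictly ascending triple
    return t[0] < t[1] < t[2]

def hypothesis(t):    # H: each number is two greater than its predecessor
    return t[1] == t[0] + 2 and t[2] == t[1] + 2

# Enumerate all triples over a small illustrative range.
triples = list(product(range(1, 16), repeat=3))

positive_tests = [t for t in triples if hypothesis(t)]      # triples fitting H
negative_tests = [t for t in triples if not hypothesis(t)]  # triples violating H

# A test falsifies H when the experimenter's answer contradicts H's
# prediction: a positive test falsifies H only if it does NOT fit T;
# a negative test falsifies H only if it DOES fit T.
pos_falsify = [t for t in positive_tests if not true_rule(t)]
neg_falsify = [t for t in negative_tests if true_rule(t)]

print(len(pos_falsify))  # 0   -- H is inside T, so positive tests never falsify
print(len(neg_falsify))  # 444 -- every ascending triple outside H would falsify
```

This mirrors the Wason task: a participant who offers only triples fitting their hypothesis can never discover that the true rule is broader.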

Klayman and Ha's critique

Wason accepted falsificationism, according to which a scientific test of a hypothesis is a serious attempt to falsify it. He interpreted his results as showing a preference for confirmation over falsification, hence the term "confirmation bias".[Note 4][71] Wason also used confirmation bias to explain the results of his selection task experiment.[72] In this task, participants are given partial information about a set of objects, and have to specify what further information they would need to tell whether or not a conditional rule ("If A, then B") applies. It has been found repeatedly that people perform badly on various forms of this test, in most cases ignoring information that could potentially refute the rule.[73][74]

While the actual rule was simply "any ascending sequence", the participants had a great deal of difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last".[68] The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor", they would offer a triple that fit this rule, such as (11,13,15) rather than a triple that violates it, such as (11,12,19).[70]

The term "confirmation bias" was coined by English psychologist Peter Wason.[67] For an experiment published in 1960, he challenged participants to identify a rule applying to triples of numbers. At the outset, they were told that (2,4,6) fits the rule. Participants could generate their own triples and the experimenter told them whether or not each triple conformed to the rule.[68][69]

Wason's research on hypothesis-testing

I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.[66]
", Tolstoy wrote, What Is Art? In his essay "[65]Bacon said that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like".
The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects[.][65]

Before psychological research on confirmation bias, the phenomenon had been observed anecdotally by writers, including the Greek historian Thucydides. The English philosopher Francis Bacon, in Novum Organum, wrote:

Informal observation


Overall, the results revealed that the balanced research instructions significantly increased the likelihood that participants would include opposing information in their arguments. The data also revealed that personal belief is not a source of myside bias, and that participants who believed that good arguments are based on facts were more likely to exhibit myside bias than participants who disagreed with this statement. This evidence is consistent with the claims proposed in Baron's article that people's opinions about what makes good thinking can influence how arguments are generated.[58]

A study by Christopher Wolfe and Anne Britt also investigated how participants' views of "what makes a good argument?" can be a source of myside bias that influences how people generate their own arguments.[58] The study examined individual differences in argumentation schema and asked participants to write essays. The participants were randomly assigned to write essays either for or against their preferred side of an argument, and were given either balanced or unrestricted research instructions. The balanced research instructions directed participants to create an argument that included both pros and cons; the unrestricted research instructions imposed no constraints on how to create the argument.[58]

One study found individual differences in myside bias. It investigated individual differences that are acquired through learning in a cultural context and are mutable, and found important individual differences in argumentation. Studies have suggested that individual differences such as deductive reasoning ability, the ability to overcome belief bias, epistemological understanding, and thinking disposition are significant predictors of reasoning and of generating arguments, counterarguments, and rebuttals.[59][60][61]

Myside bias was once believed to be associated with greater intelligence; however, studies have shown that myside bias is more strongly influenced by the ability to think rationally than by level of intelligence.[56] Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument. Studies have characterized myside bias as an absence of "active open-mindedness", meaning the active search for reasons why an initial idea may be wrong.[57] Typically, myside bias is operationalized in empirical studies as the quantity of evidence used in support of one's own side in comparison to the opposite side.[58]

Individual differences

In the above fictional example, arthritic symptoms are more likely on days with no rain. However, people are likely to focus on the relatively large number of days which have both rain and symptoms. By concentrating on one cell of the table rather than all four, people can misperceive the relationship, in this case associating rain with arthritic symptoms.[55]
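The cell counts of the fictional arthritis/weather table can be turned into conditional rates directly, showing what attention to the single rain-and-symptoms cell hides:

```python
# Cell counts from the fictional arthritis/weather table.
rain_symptoms, dry_symptoms = 14, 6
rain_no_symptoms, dry_no_symptoms = 7, 2

# Rate of symptom days among rainy days vs. among dry days.
p_symptoms_given_rain = rain_symptoms / (rain_symptoms + rain_no_symptoms)
p_symptoms_given_dry = dry_symptoms / (dry_symptoms + dry_no_symptoms)

print(round(p_symptoms_given_rain, 2))  # 0.67
print(round(p_symptoms_given_dry, 2))   # 0.75 -- higher on days without rain
```

Only by comparing all four cells does the direction of the (weak) association become visible; the large 14 in the rain-and-symptoms cell is misleading on its own.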

Days            Rain   No rain
Arthritis         14         6
No arthritis       7         2

This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior.[53] In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of positive-positive cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain and/or good weather).[54] This parallels the reliance on positive tests in hypothesis testing.[53] It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.[53]

Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.[52]

Illusory correlation is the tendency to see non-existent correlations in a set of data.[50] This tendency was first demonstrated in a series of experiments in the late 1960s.[51] In one experiment, participants read a set of psychiatric case studies, including responses to the Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact the case studies were fictional and, in one version of the experiment, had been constructed so that the homosexual men were less likely to report this imagery.[50] In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.[50][51]

Illusory association between events

Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide.[49] After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.[45]

One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them.[49] In fact, the colors appeared in a pre-arranged order. The first thirty draws favored one urn and the next thirty favored the other.[45] The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty.[49]

Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order.[49] This irrational primacy effect is independent of the primacy effect in memory in which the earlier items in a series leave a stronger memory trace.[49] Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.[45]

Preference for early information

In another study, participants read job performance ratings of two firefighters, along with their responses to a risk aversion test.[44] These fictional data were arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told he did worse than a risk-averse colleague.[48] Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive.[48] When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained.[44] Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.[48]

A common finding is that at least some of the initial belief remains even after a full debrief.[46] In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.[47]

Confirmation biases can be used to explain why some beliefs persist when the initial evidence for them is removed.[45] This belief perseverance effect has been shown by a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[44]

"[B]eliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases."

—Lee Ross and Craig Anderson[44]

Persistence of discredited beliefs

The "backfire effect" is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly.[41][42] The phrase was first coined by Brendan Nyhan and Jason Reifler.[43]

Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of gun control and affirmative action.[26] They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read, from a list prepared by the experimenters. For example, they could read the National Rifle Association's and the Brady Anti-Handgun Coalition's arguments on gun control. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect.[26]

A less abstract study was the Stanford biased interpretation experiment in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.[23] In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.[26][38][40] Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, only happening in a small minority of cases. They found that it was prompted not only by considering mixed evidence, but by merely thinking about the topic.[38]

A collection of eight different handguns resting on the ground
Strong opinions on an issue such as gun ownership can bias how someone interprets new evidence.

When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".[38] The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60% black and 40% red balls; the other, 40% black and 60% red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw—whether they initially thought the basket with 60% black balls or the one with 60% red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.[39]
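The neutrality of the alternating sequence can be verified with a Bayesian update over the two baskets described above (a minimal sketch):

```python
# Basket A: 60% black, 40% red.  Basket B: 40% black, 60% red.
# Each entry maps a ball color to its probability under (A, B).
likelihood = {"black": (0.6, 0.4), "red": (0.4, 0.6)}

def update(p_a, ball):
    """Bayes' rule: posterior probability of basket A after one draw."""
    la, lb = likelihood[ball]
    return (p_a * la) / (p_a * la + (1 - p_a) * lb)

p_a = 0.5                       # equal prior for the two baskets
draws = ["black", "red"] * 30   # alternating colors, 60 draws in all

for ball in draws:
    p_a = update(p_a, ball)

print(round(p_a, 6))  # 0.5 -- the alternating sequence favors neither basket
```

A single black draw does raise the probability of the mostly-black basket (to 0.6), but each red draw that follows cancels it, so a normative reasoner ends the alternating sequence exactly where they started; the participants' steadily growing confidence cannot be justified by the evidence.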

Polarization of opinion

Related effects

One study showed how selective memory can maintain belief in extrasensory perception (ESP).[37] Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.[37]

Myside bias has been shown to influence the accuracy of memory recall.[36] In an experiment, widows and widowers rated the intensity of their grief six months and five years after the deaths of their spouses. Participants reported more intense grief at six months than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief they recalled was highly correlated with their current level of grief. Individuals appear to use their current emotional states to infer how they must have felt when experiencing past events.[35] Emotional memories are thus reconstructed in light of current emotional states.

Changes in emotional states can also influence memory recall.[35][36] Participants rated how they felt when they first learned that O.J. Simpson had been acquitted of murder charges.[35] They described their emotional reactions and confidence regarding the verdict one week, two months, and one year after the trial. Results indicated that participants' assessments of Simpson's guilt changed over time. The more participants' opinion of the verdict had changed, the less stable were their memories of their initial emotional reactions. When participants recalled their initial emotional reactions two months and a year later, past appraisals closely resembled current appraisals of emotion. People demonstrate sizable myside bias when discussing their opinions on controversial topics.[20] Memory recall and the construction of experiences are revised in relation to corresponding emotional states.

In one study, participants read a profile of a woman which described a mix of introverted and extroverted behaviors.[32] They later had to recall examples of her introversion and extroversion. One group was told this was to assess the woman for a job as a librarian, while a second group were told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" groups recalling more extroverted behavior.[32] A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.[30][33] In one of these, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated, study, they were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.[34]

Even if people gather and interpret evidence in a neutral manner, they may still remember it selectively to reinforce their expectations. This effect is called "selective recall", "confirmatory memory" or "access-biased memory".[29] Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match.[30] Some alternative approaches say that surprising information stands out and so is memorable.[30] Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.[31]

Biased memory

Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.[28]

Biased interpretation persists regardless of intelligence level. In one experiment, participants took the SAT (a college admissions test used in the United States) as a measure of intelligence. They then read information about safety concerns for vehicles, with the experimenters manipulating the national origin of the car. American participants rated on a six-point scale whether the car should be banned, where one indicated "definitely yes" and six indicated "definitely no". Participants evaluated both a dangerous German car on American streets and a dangerous American car on German streets, and believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. The rate at which participants would ban a car did not differ across intelligence levels.[20]

Another study of biased interpretation occurred during the 2004 U.S. presidential election and involved participants who reported strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether or not each individual's statements were inconsistent.[27]:1948 There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.[27]:1951

In this experiment, the participants made their judgments while in a magnetic resonance imaging (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the cognitive dissonance induced by reading about their favored candidate's irrational or hypocritical behavior.[27]

An MRI scanner allowed researchers to examine how the human brain deals with unwelcome information.

A team at Stanford University conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it.[23][24] Each participant read descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing.[23] In fact, the studies were fictional. Half the participants were told that one kind of study supported the deterrent effect and the other undermined it, while for the other participants the conclusions were swapped.[23][24]

The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.[23][25] Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time", while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented".[23] The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.[26]

Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.

"Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons."

Biased interpretation

Another experiment gave participants a complex rule-discovery task that involved moving objects simulated by a computer.[21] Objects on the computer screen followed specific laws, which the participants had to figure out by "firing" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the participants figured out the rules of the system. They typically attempted to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing objective evidence that refuted their working hypotheses, they frequently continued doing the same tests. Some of the participants were taught proper hypothesis-testing, but these instructions had almost no effect.[21]

Personality traits influence and interact with biased search processes.[18] Individuals vary in their ability to defend their attitudes from external attacks, in relation to selective exposure. Selective exposure occurs when individuals search for information that is consistent, rather than inconsistent, with their personal beliefs.[19] An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs.[18] People with high confidence levels readily seek out information that contradicts their personal position in order to form an argument, whereas individuals with low confidence levels avoid contradictory information and prefer information that supports their position. People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions.[20] Heightened confidence levels thus decrease preference for information that merely supports one's personal beliefs.

Similar studies have demonstrated how people engage in a biased search for information, but also that this phenomenon may be limited by a preference for genuine diagnostic tests. In an initial experiment, participants rated another person on the introversion–extroversion personality dimension on the basis of an interview. They chose the interview questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, almost all the questions presumed extroversion, such as, "What would you do to liven up a dull party?" These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them.[16] A later version of the experiment gave the participants less presumptive questions to choose from, such as, "Do you shy away from social interactions?"[17] Participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests. This pattern, of a main preference for diagnostic tests and a weaker preference for positive tests, has been replicated in other studies.[17]

Even a small change in a question's wording can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case.[15] Participants read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take him or her away for long periods of time. When asked, "Which parent should have custody of the child?" the majority of participants chose Parent B, looking mainly for positive attributes. However, when asked, "Which parent should be denied custody of the child?" they looked for negative attributes and the majority answered that Parent B should be denied custody, implying that Parent A should have custody.[15]

The preference for positive tests in itself is not a bias, since positive tests can be highly informative.[12] However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true.[13] In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior.[7] Thus any search for evidence in favor of a hypothesis is likely to succeed.[13] One illustration of this is the way the phrasing of a question can significantly change the answer.[7] For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you unhappy with your social life?"[14]

Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis.[6][7] Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their hypothesis.[8] They look for the consequences that they would expect if their hypothesis were true, rather than what would happen if it were false.[8] For example, someone using yes/no questions to find a number he or she suspects to be the number 3 might ask, "Is it an odd number?" People prefer this type of question, called a "positive test", even when a negative test such as "Is it an even number?" would yield exactly the same information.[9] However, this does not mean that people seek tests that guarantee a positive answer. In studies where subjects could select either such pseudo-tests or genuinely diagnostic ones, they favored the genuinely diagnostic.[10][11]
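The equivalence of the positive and negative test in the number-guessing example can be made concrete. The following sketch (illustrative only, not drawn from the cited studies) shows that for a suspected number between 1 and 10, the positive test "Is it an odd number?" and the negative test "Is it an even number?" split the hypothesis space into the same two subsets, so either question yields exactly the same information:

```python
# Hypothesis space: the suspected number is somewhere in 1..10.
candidates = set(range(1, 11))

def partition(space, question):
    """Split a hypothesis space by a yes/no question."""
    yes = {n for n in space if question(n)}
    return yes, space - yes

odd_yes, odd_no = partition(candidates, lambda n: n % 2 == 1)    # positive test
even_yes, even_no = partition(candidates, lambda n: n % 2 == 0)  # negative test

# The two questions induce identical partitions; only the yes/no labels swap.
# Whichever answer comes back, the remaining candidate set is the same.
assert (odd_yes, odd_no) == (even_no, even_yes)
print("both questions are equally informative")
```

The preference for the "odd" phrasing over the "even" phrasing is therefore not justified by the evidence either question would return, which is what makes it a positive-test bias rather than a rational strategy.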

Confirmation bias has been described as an internal "yes man", echoing back a person's beliefs like Charles Dickens' character Uriah Heep.[5]

Biased search for information

Confirmation biases are effects in information processing. They differ from the behavioral confirmation effect, also called "self-fulfilling prophecy", in which behavior, influenced by expectations, causes those expectations to come true.[3] Some psychologists use "confirmation bias" to refer to the tendency to avoid rejecting beliefs, while searching for evidence, interpreting it, or recalling it from memory. Other psychologists restrict the term to selective collection of evidence.[4][Note 3]



  • 1 Types
    • 1.1 Biased search for information
    • 1.2 Biased interpretation
    • 1.3 Biased memory
  • 2 Related effects
    • 2.1 Polarization of opinion
    • 2.2 Persistence of discredited beliefs
    • 2.3 Preference for early information
    • 2.4 Illusory association between events
  • 3 Individual differences
  • 4 History
    • 4.1 Informal observation
    • 4.2 Wason's research on hypothesis-testing
    • 4.3 Klayman and Ha's critique
  • 5 Explanations
  • 6 Consequences
    • 6.1 In finance
    • 6.2 In physical and mental health
    • 6.3 In politics and law
    • 6.4 In the paranormal
    • 6.5 In science
    • 6.6 In self-image
  • 7 See also
  • 8 Notes
  • 9 References
  • 10 Sources
  • 11 Further reading
  • 12 External links

