“The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it.” – Francis Bacon
Have you ever shown someone evidence to support a claim you are making, only to have that person refuse to listen to anything you say or admit that he or she might be wrong?
Often this happens even when you present facts, statistics, and evidence from every relevant domain to show that what you are saying is true; the other person sticks doggedly to his or her false beliefs, citing little bits of “evidence” that are unreliable or untrue.
Why does this happen?
The culprit is confirmation bias: a type of cognitive bias that leads us to discount evidence that undermines our beliefs while actively searching for anything that supports them; it is especially pronounced for beliefs that are emotionally charged. In effect, we selectively remember and accumulate information to justify whatever conclusion we want to reach, rather than the conclusion that is most accurate.
“Over time, by never seeking the antithetical, through accumulating subscriptions to magazines, stacks of books and hours of television, you can become so confident in your world-view no one could dissuade you.” – David McRaney
We favor information that agrees with our beliefs: a 2009 Ohio State study found that people spend 36% more time reading an essay when it aligns with their own opinions.
For example, someone who believes the moon landings were fabricated can ignore all the literature and evidence that contradicts that belief, and absorb only the “evidence” that supports it.
More prosaically, this is what makes conversations between liberals and conservatives, atheists and the religious, utterly pointless: Each side thinks the other one’s “evidence” is meaningless.
“Be careful. People like to be told what they already know. Remember that. They get uncomfortable when you tell them new things. New things…well, new things aren’t what they expect. They like to know that, say, a dog will bite a man. That is what dogs do. They don’t want to know that man bites a dog, because the world is not supposed to happen like that. In short, what people think they want is news, but what they really crave is olds…Not news but olds, telling people that what they think they already know is true.” – Terry Pratchett
In a 1979 study conducted by Mark Snyder and Nancy Cantor at the University of Minnesota, participants were given readings about a normal week in the life of a fictional character named “Jane”.
Throughout the week, Jane did things that showed she could be extroverted in some situations and introverted in others. After a few days, the participants were asked to return. Snyder and Cantor divided them into groups and asked each group to help decide whether Jane would be suited for a particular job.
The first group was asked if she would make a good librarian, the other if she would make a good real-estate agent. The librarian group remembered her as an introvert; the real-estate group remembered her as an extrovert. When each group was then asked whether she would be good at the other profession, people stuck with their original assessment, saying she wasn’t suited for the other job.
These findings suggest that even your memories fall prey to confirmation bias: you recall the things that support your beliefs and forget the things that debunk them.
“Thanks to Google, we can instantly seek out support for the most bizarre idea imaginable. If our initial search fails to turn up the results we want, we don’t give it a second thought, rather we just try out a different query and search again.” – Justin Owings
Why is confirmation bias dangerous?
If you hold incorrect ideas or beliefs, you can almost always find some information, valid or not, that supports them, which only strengthens those ideas and beliefs. Once you have a personal viewpoint, or schema, about how things “should” be, you actively search for and accept only information that confirms it. Information that contradicts your belief system can even be distorted to fit it.
How can you combat this? In psychology, we scrutinize data by examining how it was collected, who collected it, and who it was collected from, and by thinking about how each of these factors could have affected the outcomes of a study or compromised its findings. We can assess a study’s reliability and check whether it is flawed in any way. In daily life, we can try to be more open to considering the opinions of people whose perspectives differ from our own.
In science, you move closer to the truth by seeking evidence to the contrary. Perhaps the same method should inform your opinions as well.
Devine, P. G., Hirt, E. R., & Gehrke, E. M. (1990). Diagnostic and confirmation strategies in trait hypothesis testing. Journal of Personality and Social Psychology, 58(6), 952–963.
Fischer, P., Fischer, J. K., Aydin, N., & Frey, D. (2010). Physically attractive social information sources lead to increased selective exposure to information. Basic and Applied Social Psychology, 32(4), 340–347.
Hastie, R., & Park, B. (2005). The relationship between memory and judgment depends on whether the judgment task is memory-based or on-line. In D. L. Hamilton (Ed.), Social cognition: Key readings (p. 394). New York: Psychology Press.
Knobloch-Westerwick, S., & Meng, J. (2009, June). Looking the other way: Selective exposure to attitude-consistent and counterattitudinal political information. Communication Research, 36(3), 426–448.
Nickerson, R. S. (1998, June). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
Russell, D., & Jones, W. H. (1980). When superstition fails: Reactions to disconfirmation of paranormal beliefs. Personality and Social Psychology Bulletin, 6(1), 83–88.