The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.
The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
Let’s say you’re having an argument with a friend about, oh, universal healthcare or who the best quarterback in the NFL is.
You present your friend with a set of facts that you assume will clinch your argument. And yet, even though the facts you present clearly contradict your friend's position, you discover that presenting them does nothing to correct his or her false or unsubstantiated belief.
In fact, your friend is even more emboldened in his or her belief after being exposed to corrective information.
What’s up with that?
Dartmouth researchers have studied the so-called "backfire effect," defined as the phenomenon in which "corrections actually increase misperceptions among the group in question."
The problem here may be the way your friend receives your facts. Since your friend knows you and your opinions well, he or she does not view you as an “omniscient” source of information.
When it comes to public policy issues, the study found that people typically receive corrective information within "objective" news reports pitting two sides of an argument against each other, which is significantly more ambiguous than receiving the correct answer from an omniscient source.
In such cases, citizens are likely to resist or reject arguments and evidence contradicting their opinions. So when we read a news story that presents both sides of an issue, we simply pick the side we happen to agree with and it reinforces our viewpoint. (And a bunch of research supports this finding.)
But what about those individuals who don’t simply resist challenges to their views, but who actually come to hold their original opinion even more strongly?
The Dartmouth study describes the “backfire effect” as a possible result of the process by which people counterargue preference-incongruent information and bolster their preexisting views.
If people counter unwelcome information vigorously enough, they may end up with “more attitudinally congruent information in mind than before the debate,” which in turn leads them to report opinions that are more extreme than they otherwise would have had.
This study goes a long way toward explaining the state of rational discourse in the country right now. So what can be done? How can you have a more effective discussion with your friend about Obamacare or Peyton Manning?
That is the million-dollar question, and one I will tackle shortly!