Amazing, isn’t it? It does a much better job explaining the Backfire Effect than I could. The idea is simple but counterintuitive: giving people facts that contradict something they believe actually makes them believe it even more fervently. This is why it’s often impossible to win an argument online. Even when you show your friend’s racist uncle crime statistics from the FBI, his racist view of crime will remain. (Is that just me?) Make no mistake: everyone does this, not only racists. In fact, it’s important to know that we all do it, and that it doesn’t make us bad people.
This insight comes from a 2010 Dartmouth paper in which two researchers asked people about their belief that Iraq had weapons of mass destruction (WMDs) before America’s invasion in 2003. The participants were then presented with a made-up report containing facts that contradicted that belief. Most did not change what they thought, even after seeing the evidence to the contrary.
One hypothesized reason for this is the same reason we spend so much more time thinking about criticism than praise. We all dwell on information we disagree with, because it doesn’t fit into our worldview. So we mull it over, looking for its weak points. The very act of spending more time thinking about it creates more neural connections in our brains, strengthening those links. That, together with the fact that memories change every time we recall them, means we end up with very strong opinions that facts alone cannot change.
Okay, so this all makes sense, right? I’ve given you a sound logical argument and a scientific study to back it up. Intuitively it makes sense, and it does a good job of explaining a weird phenomenon. We like it when the world makes sense.
What if I told you it wasn’t real?
Two recent studies have called the Backfire Effect into question. The researchers went into the experiments expecting to find strong evidence that the effect from the original study was real, and designed their studies accordingly.
The second study included far more people than the original (8,100 instead of 130) and covered more politically charged topics. Testing in the middle of election season, using claims made by both candidates, should produce the greatest effect, they figured. And yet the second study found no compelling evidence of one. In fact, it showed that people will change their minds when presented with evidence, even about topics they hold dear.
Now for my favorite part of the story. The second study’s researchers reached out to the authors of the original study, and instead of attacking the new work, the original authors said, in effect: huh, maybe we did our study badly.
Think about that for a moment.
The authors of the original study about the Backfire Effect changed their minds when presented with evidence that the Backfire Effect might not exist. I love that.
A third study was done jointly by the authors of the first and second studies. It was designed to once again try to show that there really was a Backfire Effect; good scientists should always try to prove their own beliefs wrong instead of looking for evidence that supports them. And again, study three showed no evidence of the effect.
So why have we all found ourselves arguing online with our friend’s racist uncle, who refuses to believe all our hyperlinked facts? That probably has more to do with how you’re saying it than with what you’re saying. In all of the studies, the participants were required to read the information they disagreed with.
Your friend’s racist uncle will probably never read your FBI statistics if your post calls him racist and stupid. You’re never going to win an argument if your opponent is always on the defensive. So how do you win a debate online? Stay civil, pick your battles, and listen to the other side. Do what you can to make the new information interesting and non-threatening. There is hope for civil internet discourse yet!