Get to know the backfire effect

In the coming years, you’re likely to slam up against a number of cognitive biases, because they are what entrench social attitudes and beliefs. Humans like to be comfortable and secure in their interpretation of the world, and they will go through incredible mental gymnastics to avoid information that might disrupt their beliefs and attitudes. None of us is exempt from cognitive biases: they shape our interactions every day, and it pays to be aware of them in ourselves while also thinking about how they affect society.

One bias came up a lot during the election, and it will likely play an even bigger role in years to come: the backfire effect (if you want to read more about the science behind politics and the backfire effect, here’s an excellent study). Here’s how it works: Person A firmly believes that X is true, and that belief underpins the way Person A thinks about a variety of related issues. In fact, X is not true, which Person B tries to point out, illustrating the point with facts Y and Z. In a logical world, Person A looks at these facts, realises that they’re operating on outdated or incorrect information, and comes to understand the reality of Y and Z.

That is not how it actually plays out, though. Instead, Person A continues to believe X, and in fact believes it even more strongly after being confronted with factual information that challenges it. This applies not just to specific claims, but to broader social attitudes.

For example: I’m on the left. If someone told me ‘Jeff Sessions once voted to force people to register their Muslim babies’ and I was feeling credulous, I might be inclined to believe it. The responsible version of me would say ‘oh, that’s interesting, where did you hear that?’ That version of me might also go hunting for the specific legislation being referenced to learn more about what actually happened. But because I’m on the left, I’m predisposed to believe that someone on the right whom I dislike did something I consider pretty vile. Being primed with that information means that if someone says ‘did you know Jeff Sessions once saved a litter of kittens from a sinking ship?’ I am predisposed to disbelieve it, because it goes against my personal understanding of Jeff Sessions, Satan Incarnate. Even when presented with facts, like documentation in the form of newspaper articles and testimony from credible sources, I would be wary and dismissive (‘that’s not a reputable paper’, ‘maybe the boat wasn’t actually sinking’, ‘maybe Jeff Sessions paid them to say that’).

Sometimes the backfire effect is brought up as evidence that it’s effectively impossible to win an argument: no matter how many facts you present, your opponent will continue to believe whatever they already think, even when they are wrong, and there is nothing you can do to fix that. People also tend to fixate on the first thing they hear, which is why fake news and distortions of reality are so dangerous: they are designed to appeal to preexisting beliefs, and they give a false claim a foothold that facts can’t later dislodge.

It can certainly feel this way, especially for those of us who go to a lot of trouble to develop an arsenal of facts for deployment in arguments with people who say things that are wrong, ill-informed, inaccurate, offensive, or any combination thereof. I don’t dismiss the value of facts, or the value of verifying facts when they are presented to you to learn more about their context, because left and right alike lie, cheat, and fudge information to advance a desired viewpoint.

But it does help to know about the backfire effect, and to consider it in discussions with people. Sometimes there are crafty ways to work around it, such as appealing with a personal story, which can be quite persuasive even when that story contradicts someone’s social attitudes. Rather than listing off statistics about access to bathrooms, for example, it can help to personalise the issue with a story about a woman you actually know who was denied access to a bathroom, or to draw on a humorous anecdote about the issue. If that feels like gross pandering and it makes you uneasy, I can understand why, but sometimes it goes a lot further than anything else does. A concrete, real-world example can illustrate a point better than statistics can.

Waiting can also do the trick. The only thing worse than being proved wrong is being proved wrong in front of people, especially if they are of higher status and you are trying to impress them. Rather than jumping into an argument midstream with a recitation of facts, when sentiments are running high and people are being watched, it sometimes pays to take someone aside later for a brief conversation. That doesn’t necessarily mean you’re coddling their feelings; it means you want to have a serious discussion about the issue at hand, because it’s important to you to actually change this person’s mind, not just react in the moment.

Simplifying the issue can also be helpful, though frustrating: sometimes the more nuance and complexity involved, the more likely people are to just give up. It’s too difficult. It’s better to retreat to what you know. Distilling an argument to a key point, on the other hand, can help you create a wedge to break through their attitudes, especially if you can turn it to their own self-interest. In a discussion about Obamacare, for example, don’t rattle off facts; turn instead to a personal story about someone similar to the person you’re talking to, or use that person directly as an example. Make it about what is at stake for them and where their beliefs are actually wrong: they may benefit from an ACA provision without being aware of it, for example, and would lose that protection in the event of a repeal.

Knowing that people resist facts that disprove their beliefs can help you communicate with them better, and that, ultimately, is the most important goal.

Image: Fact, Pat Castaldo, Flickr