Wednesday, July 1, 2015

The Backfire Effect and Confirmation Bias

Confirmation biases can be used to explain why some beliefs persist when the initial evidence for them is removed. This belief perseverance effect has been shown by a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.

A common finding is that at least some of the initial belief remains even after a full debrief. In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.

In another study, participants read job performance ratings of two firefighters, along with their responses to a risk aversion test. This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told that the risk-taker performed worse than a risk-averse colleague. Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive. When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained. Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.

Source: Wikipedia, "Confirmation Bias," last modified June 26, 2015, accessed July 1, 2015.


We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views. As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases. [...]

The backfire effects that we found seem to provide further support for the growing literature showing that citizens engage in “motivated reasoning.” While our experiments focused on assessing the effectiveness of corrections, the results show that direct factual contradictions can actually strengthen ideologically grounded factual beliefs – an empirical finding with important theoretical implications. Previous research on motivated reasoning has largely focused on the evaluation and usage of factual evidence in constructing opinions and evaluating arguments (e.g. Taber and Lodge 2006). By contrast, our research – the first to directly measure the effectiveness of corrections in a realistic context – suggests that it would be valuable to directly study the cognitive and affective processes that take place when subjects are confronted with discordant factual information.


Source: Brendan Nyhan and Jason Reifler, "When Corrections Fail: The Persistence of Political Misperceptions," Political Behavior 32 (2010): 303–330.


It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. [...] If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger. [...]

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.


Source: Joe Keohane, "How Facts Backfire," Boston Globe website, July 11, 2010, accessed July 1, 2015.
