Past research has found that not only do facts fail to sway minds, but they can sometimes produce what’s known as a “backfire effect,” leaving people even more stubborn and sure of their preexisting beliefs.
But there’s new evidence on this question that’s a bit more hopeful. It finds backfiring is rarer than originally thought — and that fact-checks can make an impression on even the most ardent of Trump supporters. [...]
The study, conducted in the fall of 2005, split 130 participants into groups who read different versions of a news article about President George W. Bush defending his rationale for engaging in the Iraq War. One version merely summarized Bush’s rationale — “There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks.” Another version of the article offered a correction that, no, there was not any evidence Saddam Hussein was stockpiling weapons of mass destruction.
The results were stunning: Staunch conservatives who saw the correction became more likely to believe Hussein had weapons of mass destruction. (In another experiment, the study found a backfire on a question about tax cuts. On other questions, like on stem cell research, there was no backfire.) [...]
In both experiments, the researchers couldn’t find any instance of backfire. Instead, they found that corrections did what they were intended to do: nudge people toward the truth. Trump supporters were more resistant to the nudge, but they were nudged all the same.
But here’s the kicker: The corrections didn’t change their feelings about Trump (when participants in the correction conditions were compared with controls).