It probably shouldn’t come as a surprise to anyone versed in psychology, but more and more research is supporting the idea that political falsehoods are effective even if they are later exposed as false. Whether you’re a Democrat or a Republican, the emotional effect of a compelling narrative or juicy smear seems to remain even after it’s decisively debunked. We all seem to form knee-jerk attitudes initially because of certain claims, but we don’t base those attitudes on the continued veracity of the claims: the attitude stands on its own, with or without the survival of the supporting claims.
But in some cases, it’s even more bizarre than that. As political scientists Brendan Nyhan and Jason Reifler discovered, conservatives are especially prone to a sort of backlash effect: being given evidence that a claim is false seems to make them more likely to believe it’s true:
In a paper approaching publication, Nyhan, a PhD student at Duke University, and Reifler, at Georgia State University, suggest that Republicans might be especially prone to the backfire effect because conservatives may have more rigid views than liberals: Upon hearing a refutation, conservatives might “argue back” against the refutation in their minds, thereby strengthening their belief in the misinformation. Nyhan and Reifler did not see the same “backfire effect” when liberals were given misinformation and a refutation about the Bush administration’s stance on stem cell research.
Kevin Drum thinks that this effect may have something to do with the disdain for experts and media sources in general that many conservatives have carefully cultivated and celebrated, and there may be something to that. Drum also notes that the source of the refutation didn’t seem to help either: conservatives seem more likely to believe a politically convenient falsehood even if it’s Fox News that’s trying to correct the misinformation.
Liberals will no doubt take this research as yet more evidence that their counterparts are indeed stubborn science-haters who prefer ideology to reality (conservatives may, ironically, respond by denying the science behind this study). But before we go whole-hog down that route, I can think of one major explanation for the results that Drum, for obvious partisan reasons, might have missed.
Simply put, this research might not be evidence of conservative pigheadedness: it could just as easily be taken as evidence of legitimate conservative cockiness in the face of consistently lousy critics. That is, it could be that, in the actual real-world experience of most conservatives over the past few decades, prominent “refutations” of ideologically pro-conservative claims really have turned out to be wrong a lot of the time. Perhaps so much so that encountering strong objections to such claims is itself a good statistical predictor of their veracity.
This isn’t necessarily a rational reaction on a case-by-case basis; it doesn’t have to be. Like any Pavlovian mechanism, what matters is simply its general effectiveness as an association over time and experience. A knee-jerk “backfire effect” response may not make conservatives look very good in a controlled experiment in which the claim is already known to be wrong. But it might be a reaction that’s served conservatives pretty well in everyday political life.
Thus, what may be at work here is simply a difference in actual historical experience. Refutations of claims that liberals like may simply have turned out to be valid more often than the refutations of claims conservatives like. And because each group has had different experiences, they’ve developed different knee-jerk mechanisms for how they process a refutation of a politically convenient claim.
Of course, this explanation would require you to basically accept that, in practice, conservative claims really are right more often than liberal ones. Or, at least, that critics of core conservative claims tend to be a lot sloppier and less trustworthy than critics of liberal claims. As someone who leans toward the liberal side of things myself, my own knee-jerk reaction is to find such possibilities absurd: how could our “reality-based community” be less reliable than… than… them?!
The problem, of course, is that I’m obviously too biased to subjectively sum up such a broad and comprehensive balance sheet of overall trustworthiness. Nor can I think of any immediate way to test a partisan bias in “accuracy” empirically.
But I do know that it’s at least a possible explanation for the highly partisan nature of the “backfire effect” that the researchers observed, and one that I can’t, as a good social scientist, immediately discount just because I happen to get all worked up about McCain’s latest campaign ads.
And it is an intriguing thought in any case: that the individually irrational behavior of a certain group towards criticism could itself be evidence that their ideological red meat is generally more accurate in the face of criticism.