quote:
That is, of course, the essence of starting to argue on the internet as a naive teenager. You still think people are open to reason and that, even if they hold a different opinion, they are at least willing to have factual errors corrected and might, on that basis, be inclined to nuance or revise their positions somewhat. Nothing could be further from the truth, of course. A while back there was a nice little study about this:
quote:
Elections are a time for smearing, and the Mail's desperate story about Nick Clegg and the Nazis is my favourite so far. Generally the truth comes out, in time. But how much damage can smears do?
A new experiment published this month in the journal Political Behavior sets out to examine the impact of corrections, and what the researchers found was far more disturbing than they expected: far from changing people's minds, if you are deeply entrenched in your views, a correction will only reinforce them.
The first experiment used articles claiming that Iraq had weapons of mass destruction immediately before the US invasion. 130 participants were asked to read a mock news article, attributed to Associated Press, reporting on a Bush campaign stop in Pennsylvania during October 2004. The article describes Bush's appearance as a "rousing, no-retreat defense" of the Iraq war and quotes a line from a genuine Bush speech from that year, suggesting that Saddam Hussein really did have WMD, which he could have passed to terrorists, and so on. "There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks, and in the world after September the 11th," said Bush: "that was a risk we could not afford to take."
The 130 participants were then randomly assigned to one of two conditions. For half of them, the article stopped there. For the other half, the article continues, and includes a correction: it discusses the release of the Duelfer Report, which documented the lack of Iraqi WMD stockpiles or an active production program immediately prior to the US invasion.
After reading the article, subjects were asked to state whether they agreed with the following statement: "Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived." Their responses were measured on a five-point scale ranging from "strongly disagree" to "strongly agree".
As you would expect, those who self-identified as conservatives were more likely to agree with the statement. Separately, meanwhile, more knowledgeable participants (independently of political persuasion) were less likely to agree. But then the researchers looked at the effect of whether you were also given the correct information at the end of the article, and this is where things get interesting. They had expected that the correction would become less effective in more conservative participants, and this was true, up to a point: so for very liberal participants, the correction worked as expected, making them more likely to disagree with the statement that Iraq had WMD when compared with those who were very liberal but received no correction. For those who described themselves as left of center, or centrist, the correction had no effect either way.
But for people who placed themselves ideologically to the right of center, the correction wasn't just ineffective, it actively backfired: conservatives who received a correction telling them that Iraq did not have WMD were more likely to believe that Iraq had WMD than people who were given no correction at all. Where you might have expected people simply to dismiss a correction that was incongruous with their pre-existing view, or regard it as having no credibility, it seems that in fact, such information actively reinforced their false beliefs.
Maybe the cognitive effort of mounting a defense against the incongruous new facts entrenches you even further. Maybe you feel marginalised and motivated to dig in your heels. Who knows. But these experiments were then repeated, in various permutations, on the issue of tax cuts (or rather, the idea that tax cuts had increased national productivity so much that tax revenue increased overall) and stem cell research. All the studies found exactly the same thing: if the original dodgy fact fits with your prejudices, a correction only reinforces them even more. If your goal is to move opinion, then this depressing finding suggests that smears work, and what's more, corrections don't challenge them much: for people who already agree with you, a correction only makes them agree even more.