Friday, May 21, 2010

Living in Denial


Living in denial: Unleashing a lie


In November 2006, the conservative columnist Piers Akerman published a scathing attack on climate science in Australia's Daily Telegraph. Akerman contended that warnings about warming were deliberately exaggerated. To back his claim, he quoted John Houghton, a former chair of the Intergovernmental Panel on Climate Change, saying: "Unless we announce disasters no one will listen."

Early the next year, Lee Morrison, a conservative Canadian journalist, reused the quote in an opinion piece in the Calgary Herald. That summer, a scholar at the Acton Institute for the Study of Religion and Liberty in Grand Rapids, Michigan, repeated it in a journal article.

So began the quote's gradual rise to prominence. It has now appeared in at least three books, well over 100 blog posts and around 24,000 web pages, and has become a rallying cry for climate deniers. Yet Houghton never said or wrote those words. The 1994 book of his that is usually cited as their source contains no such phrase; the first person to publish them appears to have been Akerman.

How did a fabrication spread so widely? It's something that happens disturbingly often, even with preposterous or discredited claims. According to Cass Sunstein, a legal scholar at Harvard University, the answer lies with the frailties of human psychology. Once released into the wild, erroneous statements follow predictable routes into acceptance or obscurity, driven by well-known psychological processes.

First of all, a falsehood has to have at least a shred of believability. In 2008, bloggers claimed that Barack Obama was the secret love child of Malcolm X; the story did not get much traction.

Falsehoods that sound plausible, on the other hand, can seep unquestioned into consciousness. This happens in part because we use mental short cuts to help us make sense of the world, and also because we seldom bother to check the veracity of what we are told. Here, for instance, is a rumour I just made up: England footballer Ashley Cole owns a fur-lined Ferrari. This is both silly and, as far as I know, untrue. But Cole does have a well-known fondness for bling, which people may take into account when evaluating statements about him. It has been shown that untruths that fit with such mental short cuts are more likely to be remembered as correct, even when there is no evidence they are true.

This may be what drove the spread of the Houghton quote. The writers who recycled it were already hostile to climate science, so the idea that a prominent scientist had been deliberately alarmist probably seemed reasonable to them.

Any falsehood can acquire currency in this way, as long as there are enough people inclined to believe it. Science is especially vulnerable as most people cannot evaluate its claims for themselves - and that can mean anything goes.


Once receptive individuals start circulating a falsehood, it is a candidate for widespread dissemination. To understand how some untruths go on to gain general acceptance, we need to consider how social groups shape our judgements.

Mass delusion

Imagine a group of parents who are individually weighing up the evidence for and against vaccination. Let's say that one couple, perhaps already suspicious of mainstream medicine, encounter a rumour that vaccines cause autism, and decide not to vaccinate. The next couple now have new information to consider. As well as the scientific evidence, there is the knowledge that two friends are worried enough not to vaccinate. This might swing them against vaccination too... and so on for each subsequent parent. At some point, expert advice and reports of scientific studies arguing for vaccination come to be outweighed by the mass of parents who say vaccination is unsafe.

This is an informational cascade, a phenomenon first described in 1992 by economists including David Hirshleifer, now at the University of California, Irvine. Cascades can drive the popularity of everything from YouTube videos to medical procedures. They also mean that falsehoods can come to be believed simply because others believe them.
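The mechanics are easy to see in a toy simulation. The sketch below is purely illustrative and is not taken from the article or from Hirshleifer's own model: it assumes vaccination is in fact safe, gives each parent a private signal that is correct 70 per cent of the time, and has each parent weigh earlier parents' choices alongside that signal. The function names and parameters are invented for this example.

```python
import random

def simulate_cascade(n_parents=25, p_correct=0.7, rng=None):
    """Toy informational-cascade simulation (illustrative only).

    True state: vaccination is safe, encoded as 1. Each parent receives
    a private signal that is correct with probability p_correct, then
    weighs the choices already made by earlier parents together with
    that signal. Once one option leads by two, the private signal can
    no longer tip the balance and every later parent copies the crowd,
    which is an informational cascade.
    Returns the list of choices (1 = vaccinate, 0 = don't vaccinate).
    """
    rng = rng or random.Random()
    choices = []
    for _ in range(n_parents):
        signal = 1 if rng.random() < p_correct else 0
        safe = choices.count(1) + (signal == 1)    # evidence for vaccinating
        unsafe = choices.count(0) + (signal == 0)  # evidence against
        choices.append(1 if safe > unsafe else 0 if unsafe > safe else signal)
    return choices

# How often does the group end up copying the wrong belief, even though
# 70 per cent of private signals are correct? (Proxy: the last parent
# chooses not to vaccinate.)
rng = random.Random(42)
runs = 10_000
wrong = sum(simulate_cascade(rng=rng)[-1] == 0 for _ in range(runs))
print(f"wrong cascade in {wrong / runs:.1%} of runs")
```

Running many trials shows that a handful of unlucky early signals can lock the whole group into the wrong choice even though most individual signals point the right way, which is exactly the dynamic described above.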

The process is amplified by the "echo chamber" of the internet, which has made it easier than ever to encounter and spread falsehoods. It also makes it easier to start them. Propagators are often aware of what they are doing, according to Sunstein. Some act out of self-interest, such as the desire for money or fame. Others are defending an ideology or faith. Some are simply malicious.

The mainstream media often participates in the cascade. John Kerry's 2004 US presidential bid was derailed by a group of Vietnam veterans called the Swift Boat Veterans for Truth, who disputed his war record. Though their allegations were largely unfounded, dozens of media outlets repeated them.

To the casual listener or reader it seemed that pundits everywhere were questioning Kerry's war record. In situations like that, a phenomenon that psychologists refer to as the "illusion of truth" can kick in. "Hearing something 10 times does not mean there are 10 different pieces of information," says Hirshleifer. "But the more you hear something the more likely you are to believe it is true." And so it is with denial: if everybody appears to be saying that climate science is corrupt, or that the MMR vaccine causes autism, it takes on the appearance of fact.

Is there any way to combat the corrosive spread of untruth? The obvious strategy is simply to set the record straight - yet that often fails, and can even be self-defeating.

Political scientists Brendan Nyhan of the University of Michigan, Ann Arbor, and Jason Reifler of Georgia State University, Atlanta, have studied this phenomenon. In one experiment they had students read news stories that included a quote stating, incorrectly, that George W. Bush had banned all stem cell research. Some stories also included a correction. As expected, students who read the second version were less likely to come away with the belief that Bush had banned stem cell research - but only if they were already sympathetic to Bush. Liberal students were impervious to the correction.

This is an example of confirmation bias, the natural tendency to seek out and believe evidence that fits with our preconceived ideas while ignoring or dismissing the rest.

Self-defeating correction

Another experiment involved a story implying that weapons of mass destruction had been found in Iraq. Some versions of the story included material stating, correctly, that this was not true. Once again, the correction did not get through to those inclined to believe the misconception. In fact, the correction actually hardened some people's belief that WMDs were present. The cause of this "backfire effect" is not clear, but it could explain why attempts to tackle denial can end up entrenching it.

This does not mean that corrections are never worthwhile. Since Akerman was exposed in February by the UK newspaper The Independent, at least one vocal critic of climate science has said he will stop using the quote. Aside from that, however, it is business as usual. Several websites have repeated the quote since.

And Akerman? After the exposé, he published a genuine quote from Houghton, edited to appear alarmist, in an attempt to show he had been right all along. And so the rumour-mongering goes on.

Read more: Special report: Living in denial

Jim Giles is a correspondent in New Scientist's San Francisco bureau. He posts at twitter.com/jimgiles
