UQx DENIAL101x 6.2.2.1 Worldview backfire effect


In 1975, psychologist Daniel Batson from the
University of Kansas ran a bold experiment. To a group of young Christians, Batson presented
evidence that Jesus Christ did *not* rise from the dead. Now the evidence wasn’t real; it was created for the experiment. Batson wanted to see how the Christians would react. The results were surprising. The left bar
here shows the level of religious belief *before* receiving the evidence. The right bar shows
religious belief *after* the evidence. After people saw evidence that ran counter to their
religious belief, their faith actually got *stronger*. Now why am I talking about a 1970s experiment
examining the belief system of young Christians? This study raises a key question: how can
people update their beliefs in the opposite direction to the evidence? How could they
come to have more faith in their religious belief than before they received evidence
to the contrary? This type of response is called the worldview
backfire effect. Evidence can backfire if it threatens someone’s worldview. And it
doesn’t just apply to religious faith. Let’s look at some
other examples. One recent study by Brendan Nyhan and his
colleagues tested people’s intent to vaccinate their children. This graph shows the level
of intent to vaccinate for people who were least favourable towards vaccination. The
experimenters showed a range of different messages about the importance of vaccination
to people who denied the benefits of vaccination. The bar on the left here shows
the intent to vaccinate amongst a control group, who weren’t shown any messages. A second group read a message explaining the
risks of preventable diseases. As you can see, there was no difference from the control group. A third group read dramatic narratives about
diseases. Again, there was no increase in an intent to vaccinate. A fourth group were shown images of diseases,
with no significant effect either. And a fifth group read a debunking of the
myth claiming that vaccinations led to autism. Debunking the autism myth actually *lowered*
their intent to vaccinate. None of the messages presented to people predisposed against vaccination increased their intent to vaccinate their children. In other
words, for people whose worldview predisposed them to oppose vaccination, no message was
successful in changing their minds. In another experiment, researchers presented
evidence that there were no weapons of mass destruction in Iraq. What effect did this
have on participants’ *belief* that there were weapons of mass destruction in Iraq? This graph shows the change in belief, after
receiving the evidence that no such weapons existed. A negative change in belief, below
the dotted line, means a decrease in belief in
weapons of mass destruction. The researchers found that American conservatives became *more
likely* to believe that there *were* weapons of mass destruction in Iraq. Another example
of the worldview backfire effect. We also see the backfire effect with climate
change. News stories about the health impacts of climate change backfired among political
conservatives. They became less likely to support policies to mitigate climate change.
What all this research tells us is that worldview influences how people respond to *new evidence* about climate change. In other words, it affects how people *update* their beliefs. We see this with belief in weapons of mass destruction, with beliefs about climate change, and with religious belief. So let’s go back to that 1970s experiment.
When Christians were given evidence challenging their faith, their faith got stronger. Forty years later, Professor of Psychology Alan Jern explored what might be happening psychologically to cause such a backfire effect. His research suggested that religious believers *expect* their faith to be challenged. If someone *expects* challenges to their beliefs, they don’t treat those challenges as valid evidence, and the challenges can even strengthen their belief. This implies that *distrust towards the evidence* is driving the negative reaction.
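To make this mechanism concrete, here is a minimal sketch of a toy Bayesian model in this spirit. The Python code and all of its probabilities are illustrative assumptions of our own, not figures from Jern’s research; it simply shows how an observer who distrusts the source of counter-evidence, and *expects* a true belief to be challenged, can end up *more* confident after seeing that counter-evidence.

```python
# Toy model: H = "my belief is true"; T = "the source of the
# counter-evidence is trustworthy". All numbers are illustrative.
p_H = 0.9  # strong prior belief
p_T = 0.3  # low prior trust in the source of the challenge

# P(counter-evidence is presented | H, T). A believer who *expects*
# challenges treats an attack from an untrusted source as likely
# precisely when the belief is true.
likelihood = {
    (True, True): 0.1,    # trustworthy source rarely attacks a true belief
    (False, True): 0.9,   # trustworthy source reports against a false belief
    (True, False): 0.9,   # untrusted source is expected to attack a true belief
    (False, False): 0.5,  # untrusted source attacks a false belief less often
}

def posterior_belief(p_h: float, p_t: float) -> float:
    """P(H is true | counter-evidence seen), marginalising over trust T."""
    def p_evidence_given(h: bool) -> float:
        return likelihood[(h, True)] * p_t + likelihood[(h, False)] * (1 - p_t)
    numerator = p_evidence_given(True) * p_h
    return numerator / (numerator + p_evidence_given(False) * (1 - p_h))

print(f"belief before the challenge: {p_H:.3f}")                         # 0.900
print(f"belief after the challenge:  {posterior_belief(p_H, p_T):.3f}")  # 0.905
```

Under these assumed numbers, belief rises from 0.900 to roughly 0.905 after the challenge: the evidence backfires precisely because distrust of its source is built into the likelihoods.

Nicholas Smith and Anthony Leiserowitz from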
Yale University ran an experiment that saw this distrust in action. They asked participants
for the first words that came to mind when thinking about global warming. Among the people
who reject climate change, they saw a range of responses, from blaming global warming on
natural causes to expressing doubt about the science. But the most common response, by
far, involved conspiracy theories – the notion that global warming was a hoax. There is a significant problem with presenting
scientific evidence to people who think that the science is a hoax. If they think the scientific
consensus position is the result of a conspiracy, then any additional evidence supporting the
consensus will just be seen as *more proof* of the conspiracy. So a growing body of research, across a range
of issues, shows that evidence that threatens someone’s worldview can actually backfire
and strengthen people’s beliefs. We see this with religious beliefs. Debunking myths about vaccination can actually
*reduce* intent to vaccinate. Presenting evidence of *no* weapons of mass
destruction in Iraq caused conservatives to believe *more* in weapons of mass destruction
in Iraq. And presenting information about climate change
caused *lower* concern about climate change among conservatives. Is there anything we can do about the worldview
backfire effect? Researchers have explored various options. One study by David Hardisty and his colleagues
tested whether it might be more effective to communicate in a way that doesn’t threaten
worldview. They used the example of having to pay more for a product in order to help
the environment. When the extra cost was framed as a tax,
conservatives were less likely to support the price increase. However, if the price increase was framed
as an *offset* rather than a tax, there was equal support for the measure among conservatives
and liberals. Using language that wasn’t threatening to conservatives neutralised the
biasing influence of ideology. A second approach suggests that rather than
try to convince people about the realities of climate change, communication efforts should
focus on mitigation efforts that lead to a better society. Researchers from the University of Queensland
presented three different reasons for climate action. The first reason was to avoid environmental
and health risks. The green bar shown in this graph is a measure of intent to act environmentally,
among people who denied global warming. The second reason talked about how climate
action would be good for the economy and stimulate more scientific development. This led to a
higher intent to act environmentally. The third reason emphasised how climate action
would help make people more caring and friendly towards each other. Among people denying climate
change, the most effective statement in increasing their intention to act environmentally was
this third ‘warmth’ statement. So stepping back and looking at these different
lines of research, what does it all mean? Any response to climate science denial needs
to recognise what’s driving the denial – in many cases, the dominant driver is worldview.
Engaging with people who deny climate science can result in counterproductive backfire effects or, at best, a small positive effect when messages are framed in a way that avoids threatening
their worldview. Meanwhile, misconceptions originating from
climate denial continue to confuse the public. While the proportion of people denying climate
science is small, they can influence the large, undecided majority. And misconceptions about
climate change can erode public support for climate action. Given the stakes, it’s important to take
a scientific approach to science denial. The science outlines specific characteristics
of denial that allow us to distinguish between genuine scepticism and denial. If it *is*
denial promoting misconceptions about climate change, then we need to look at how to protect
the public’s right to be informed. How do we do that? The evidence tells us that reducing the influence
of denial won’t happen through engagement with the minority who deny climate science.
Instead, it requires communication with the large, undecided majority. For this group,
who are more open to evidence, we need to explain two things – the science of climate
change, and how that science can be distorted.
