
Myth Debunking
A Tool for Education & Science Communication

Ullrich Ecker
Cognitive Science Laboratories, The University of Western Australia

John Cook
Global Change Institute, The University of Queensland
Web: http://www.skepticalscience.com
Email: john@skepticalscience.com
Phone: +61 7 3365 3553

Stephan Lewandowsky
Cognitive Science Laboratories, The University of Western Australia
Copyright
This presentation is licensed under the Creative
Commons Attribution-NonCommercial-ShareAlike 3.0
Unported License. To view a copy of this license, visit
http://creativecommons.org/licenses/by-nc-sa/3.0/.
You are free to copy, distribute, transmit, and adapt the work, under the following conditions:
Attribution — You must attribute the work as specified in the following (but not in any way that suggests that
the original authors endorse you or your use of the work):
• Cite as Ecker, Cook, & Lewandowsky (2012). Myth Debunking—A Tool for Education & Science Communication. Available at www.cogsciwa.com.
• Retain the footer (including university logos) when incorporating individual slides into your presentation/publication.
Noncommercial — You may not use this work for commercial purposes.
Share Alike — If you alter, transform, or build upon this work, you may distribute the resulting work only under
the same or similar license to this one.
The Authors
Dr Ullrich Ecker is an Australian Postdoctoral Fellow, funded by the Australian Research Council, at the University of
Western Australia’s School of Psychology. He was named Outstanding Young Investigator of 2011 at the University of Western
Australia, and was a finalist in the 2012 Western Australia Early Career Scientist of the Year Awards. His research looks
at memory updating and the cognitive processing of misinformation.
E-mail: ullrich.ecker@uwa.edu.au
Web: www.cogsciwa.com
John Cook is a Climate Communication Fellow for the Global Change Institute at the University of Queensland. He also
runs skepticalscience.com, a website that makes climate science accessible to the general public, for which he received
the Australian Museum’s 2011 Eureka Prize for Advancement of Climate Change Knowledge. He is also an Adjunct
Lecturer and PhD student at the University of Western Australia. His research looks at the influence of worldview on
how we process scientific information.
E-mail: j.cook3@uq.edu.au
Web: www.skepticalscience.com
Dr Stephan Lewandowsky is a cognitive scientist and Winthrop Professor at the University of Western Australia’s School of
Psychology. He is the recipient of a Discovery Outstanding Researcher Award from the Australian Research Council
(2011). His research addresses memory updating, the distinction between skepticism and denial, and the role of
uncertainty in the climate system.
E-mail: stephan.lewandowsky@uwa.edu.au
Web: www.cogsciwa.com
A Review: Lewandowsky et al. (2012) in
Psychological Science in the Public Interest
For an authoritative review of the literature on misinformation, please see:
Lewandowsky, Ecker, Seifert, Schwarz, & Cook (2012). Misinformation and its correction: Continued
influence and successful debiasing. Psychological Science in the Public Interest, 13, 106-131.
Abstract: The widespread prevalence and persistence of misinformation in contemporary societies, such as the false
belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example,
the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children,
have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on
research and public-information campaigns aimed at rectifying the situation.
We first examine the mechanisms by which such misinformation is disseminated in society, both inadvertently and
purposely. Misinformation can originate from rumors but also from works of fiction, governments and politicians,
and vested interests. Moreover, changes in the media landscape, including the arrival of the Internet, have
fundamentally influenced the ways in which information is communicated and misinformation is spread.
We next move to misinformation at the level of the individual, and review the cognitive factors that often render
misinformation resistant to correction. We consider how people assess the truth of statements and what makes
people believe certain things but not others. We look at people’s memory for misinformation and answer the
questions of why retractions of misinformation are so ineffective in memory updating and why efforts to retract
misinformation can even backfire and, ironically, increase misbelief. Though ideology and personal worldviews can
be major obstacles for debiasing, there nonetheless are a number of effective techniques for reducing the impact
of misinformation, and we pay special attention to these factors that aid in debiasing.
We conclude by providing specific recommendations for the debunking of misinformation. These
recommendations pertain to the ways in which corrections should be designed, structured, and applied in order to
maximize their impact. Grounded in cognitive psychological theory, these recommendations may help
practitioners—including journalists, health professionals, educators, and science communicators—design effective
misinformation retractions, educational tools, and public-information campaigns.
Misinformation: Why is it important?
• Decisions are based on information
• Misinformation leads to bad decisions
• Simple retractions of misinformation are
ineffective
Ineffective Retractions
• Example 1
– USA, 2003: Many tentative media reports of WMD
discoveries during Iraq invasion
– All retracted
– Nonetheless, a substantial proportion of the populace continued
to believe that WMDs had been found (Kull et al., 2003)
Ineffective Retractions
• Example 2
– UK, 1998: Claim that a certain vaccine causes autism
– Retracted
– Nonetheless, there was a substantial and long-lasting drop in
vaccination rates, leading to hospitalization and even
death of many children (Owens, 2002; Poland & Jacobsen, 2011)
…maybe people missed the retraction?
• Lab research shows that people continue to
rely on misinformation even
– when they understand the retraction
– when they correctly remember the retraction
– when they have no reason to believe one version of events over the other
(Ecker et al., 2010, 2011a, 2011b; Johnson & Seifert, 1994)
Example
(Ecker et al., 2011b; Johnson & Seifert, 1994; Wilkes & Leatherbarrow, 1988)
• People first read:
– “…Negligent storage of gas cylinders caused a fire…”
• Then the retraction:
– “…Police have confirmed there was no evidence of gas
cylinders or negligence causing the fire…”
Example
(Ecker et al., 2011b)
• Verbatim responses of a person to later
questions:
– What was the content of the police message?
• “That there were no gas cylinders involved.”
– What caused the explosions?
• “The gas cylinders.”
Continued Influence after a Retraction
[Figure: mean number of references to a piece of (mis)information (scale 0-5) by condition: No-Retraction, Retraction, Retraction+Alternative; Ecker et al., 2011a]
Why?
• Why do people find it difficult to “unlearn”,
i.e., to “remove”, misinformation?
Retractions Create a Gap
Removing a myth leaves a gap, and people hate gaps!
“Removed” misinformation is actually still in memory, and people use it even when they know it is wrong!
Fill the Gap
…Unless the gap can be filled with plausible alternative information: replace the myth with an alternative narrative.
Example
(Ecker et al., 2011b; Johnson & Seifert, 1994)
• People first read:
– “…Negligent storage of gas cylinders caused a fire…”
• Then a retraction including a causal
alternative:
– “…Police have confirmed there was no evidence of gas
cylinders or negligence causing the fire…”
– “…but arson materials have been found…”
Continued Influence after a Retraction
[Figure repeated: when a causal alternative is provided (Retraction+Alternative), references to the misinformation drop markedly; Ecker et al., 2011a]
Element 1 of an Effective Debunking
Create a gap with a retraction → Fill the gap with a factual alternative
The Curiosity Gap
• If you retract a myth in someone’s head, the
gap is obvious.
• But sometimes people are unaware of their
knowledge gaps.
“Fight sticky ideas with stickier ideas.” (cf. Heath & Heath, 2007)
The Curiosity Gap: Generate curiosity by pointing out knowledge gaps; then fill those gaps.
Element 1 of an Effective Debunking
Create a gap with a retraction → Fill the gap with a factual alternative
Point out a gap to generate curiosity → Fill the gap with answers to questions
Retractions can backfire
• The last thing you want is to make things
worse!
• A few principles should guide retractions
The Familiarity Backfire Effect
Debunking a myth can reinforce the myth in people’s minds (Schwarz et al., 2007).
The Familiarity Backfire Effect
• When people read
– “The side effects of the vaccine are worse than the flu.”
• Followed by
– “That’s a myth, the worst side effect would be a sore arm.”
• What they may remember after a while is:
– “The vaccine has side effects.”
• This is because repeating the myth when retracting it
makes it more familiar.
• People remember and believe familiar things.
The Familiarity Backfire Effect
• Hence, start with the facts!
– “The vaccine is safe! The worst side effect would
be a sore arm.”
• Then address the myth
– “There are myths that the side effects are worse
than the flu, but they are just that: myths.”
Element 2 of an Effective Debunking
Avoid the Familiarity Backfire Effect → Don’t repeat the myths; give the facts
If the retraction needs to repeat the myth, give an explicit warning before presenting the myth. This puts people into a more careful processing mode and reduces misinformation effects (Ecker et al., 2010).
“I didn’t have time
to write a short letter,
so I wrote a long one
instead.”
MARK TWAIN
Simplicity
(Chater & Vitanyi, 2003; Lombrozo, 2006, 2007)
• People have limited mental capacity
• People hate complicated explanations
– Complex explanations = A lot of thinking
– Thinking is hard
• People like simple explanations
– Does not mean “simplistic”
The Overkill Backfire Effect
[Schematic: one simple MYTH set against a dozen FACTs]
A simple myth is more cognitively attractive than an over-complicated correction.
The Overkill Backfire Effect
• Focus on the most important arguments.
• Make things easy to understand.
• Use simple language.
The Overkill Backfire Effect
[Schematic: the MYTH countered by a few strong FACTs]
Element 3 of an Effective Debunking
Avoid the Overkill Backfire Effect → Focus on the best arguments; keep it simple
“Make things as
simple as possible
(but not simpler).”
ALBERT EINSTEIN
Getting the message across
• Sometimes language is not the best way to
communicate facts
The Use of Graphs
(Nyhan & Reifler, 2011)
Groups of scientists from several major
institutions — NASA's Goddard Institute for
Space Studies, the National Oceanic and
Atmospheric Administration's National
Climatic Data Center, the Japanese
Meteorological Agency and the Met Office
Hadley Centre in the United Kingdom — tally
data collected by temperature monitoring
stations spread around the world. All four
records show peaks and valleys that vary in
virtual sync with each other. They each show
an increase in average global surface
temperatures of approximately 0.5 degrees
Celsius over the last three decades. Data from
each source also indicate that the last decade
is the warmest since 1940.
Percent Agreement that “Global temperatures have decreased”
(among people strongly identifying with the Republican Party; Nyhan & Reifler, 2011)
[Figure: percent agreement (0-40%) by condition: Control, Text, Graph; agreement is lowest when the correction is presented as a graph]
The Use of Graphs
• The same evidence can lead to a much bigger
reduction of misbeliefs if presented graphically
instead of (only) verbally (Nyhan & Reifler, 2011)
• Information that runs counter to one’s beliefs
may be more readily accepted if it is presented
graphically (Lewandowsky, 2011)
– Quantification reduces ambiguity
– This reduces counterarguing
Graph Design
(Beware of misleading graphs!)
• Make sure the graph is designed properly (see the sketch below):
– Clearly labeled axes
– Appropriate, preferably non-truncated scales
– Use all available/relevant data (no omitted data points)
– No distortion (3D effects, etc.)
– Include an index of data variability (e.g., error bars) where possible
– Provide a source (reference)
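A minimal sketch of these principles in code (assuming Python with matplotlib; the data are invented placeholders, not real measurements):

```python
import matplotlib.pyplot as plt
import numpy as np

# Invented placeholder data: annual means with standard errors
years = np.arange(2000, 2011)
means = 0.1 + 0.025 * (years - 2000)   # a gentle upward trend
sems = np.full(years.shape, 0.04)      # per-year variability

fig, ax = plt.subplots()

# Index of data variability: +/- 1 standard error bars
ax.errorbar(years, means, yerr=sems, fmt="o-", capsize=3)

# Clearly labeled axes, with units
ax.set_xlabel("Year")
ax.set_ylabel("Temperature anomaly (°C)")

# Non-truncated scale: include zero so the trend is not visually exaggerated
ax.set_ylim(bottom=0.0)

# Provide a source (reference) for the plotted data
fig.text(0.01, 0.01, "Source: placeholder data for illustration", fontsize=8)

plt.show()
```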
Element 4 of an Effective Debunking
Avoid communication barriers → Use graphs (or other means) as appropriate
The Worldview Backfire Effect
• If people have strong attitudes
– They will be less skeptical about attitude-congruent misinformation and its source (Lewandowsky et al., 2005)
– Attempts to correct a dearly held misbelief can cause stronger misbelief
– Beliefs that are central to one’s identity will be defended
The Worldview Backfire Effect
(Prasad et al., 2009; cf. also Nyhan & Reifler, 2010)
• Republicans who believed Saddam Hussein
was linked to 9/11 were shown evidence that
there was no such link
• Only 2% of people changed their mind
• Most showed “Attitude bolstering”
– People bring supporting facts to mind
– They ignore or counter-argue contrary facts
The Worldview Backfire Effect
• Avoid aggressive, inflammatory language
• Messages can be framed to affirm worldview
– Conservatives are more likely to accept climate science when policy changes are framed as business opportunities for nuclear power (cf. Kahan, 2010)
• Strong believers are difficult to persuade even with hard evidence
– Efforts are best targeted towards the undecided majority
Element 5 of an Effective Debunking
Avoid the Worldview Backfire Effect → Target the majority; align the correction to the recipient
Example of a Debunking
“…the Earth quit warming and now we may be in a cooling cycle.”
(U.S. Republican Congressman D. Rohrabacher)
Core Fact: Earth has been building up heat for 4 decades, at a rate of 2 Hiroshima bombs per second.
Explicit Warning: There are many myths on global warming, which focus only on a small piece of the puzzle while ignoring the full picture.
Myth: Periods of surface cooling show that global warming has stopped.
Retraction: That is incorrect.
Alternative Explanation: Surface temperature jumps up and down because heat sloshes around between the ocean and atmosphere. Nevertheless, Earth continues to build up heat.
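A back-of-envelope check of the core fact’s headline rate (a sketch; the Hiroshima yield of roughly 15 kt of TNT and Earth’s surface area of about 5.1 × 10^14 m² are assumed reference values, not taken from the slides):

```python
# Convert "2 Hiroshima bombs per second" into a planetary energy imbalance.
TNT_J_PER_KT = 4.184e12            # joules per kilotonne of TNT (standard conversion)
HIROSHIMA_J = 15 * TNT_J_PER_KT    # ~15 kt yield (assumed), roughly 6.3e13 J
EARTH_AREA_M2 = 5.1e14             # Earth's total surface area (assumed)

total_watts = 2 * HIROSHIMA_J      # 2 bombs per second = joules per second = watts
flux = total_watts / EARTH_AREA_M2 # spread over the whole surface

print(f"{total_watts:.2e} W total, {flux:.2f} W/m^2 averaged over Earth's surface")
# Prints ~1.26e+14 W total and ~0.25 W/m^2, the same order of magnitude as
# published estimates of Earth's recent energy imbalance (cf. Church et al., 2011)
```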
Misinformation as a Teaching Tool
• In education settings, standard teaching practices often leave
students’ pre-existing misconceptions intact.
• In this case, misinformation and its refutation can be useful as
a teaching tool (cf. Bedford, 2010; Kowalski & Taylor, 2009; Osborne, 2010)
– The refutation of misinformation can provide students who have no (or
very limited) knowledge of a topic with an entry point to build up
knowledge
– The refutation of misinformation can be used to develop students’
understanding of empirical science and foster skeptical appraisal of
evidence and conclusions
Compared to a standard teaching format, highlighting and refuting misconceptions in the classroom can lead to a better understanding of the subject.
[Figure: improvement in test scores (%), Refutation format vs. Standard Format; Kowalski & Taylor, 2009]
Misinformation as a Teaching Tool
• Caveat: Using myth refutation as a teaching tool requires
– sufficient time to make sure the refutation can be thorough
– open-mindedness and engagement on the part of the students
• Otherwise there is a risk of a ‘familiarity backfire’ effect
• The suggestions pertaining to the design of
effective retractions should still be followed
Summary of Recommendations
• Provide alternative explanation when retracting a myth
• Focus on facts and evidence
• Avoid repeating the myths (unless thorough refutation
is possible; e.g. in classroom)
• Focus on the best arguments
• Use simple language
• Use informative, well-designed graphs
• Avoid aggression; align corrections with recipients’
worldviews as much as possible
• Target non-radical majority
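As a minimal sketch, the recommendations above can be composed into a fact-first correction template (a hypothetical Python helper; the field names follow the example debunking earlier, and the ordering follows the five elements in this deck):

```python
from dataclasses import dataclass

@dataclass
class Debunking:
    core_fact: str    # lead with the fact, not the myth
    warning: str      # explicit warning before repeating the myth (Ecker et al., 2010)
    myth: str         # stated once, and only if a thorough refutation follows
    retraction: str   # the explicit correction
    alternative: str  # plausible alternative explanation that fills the gap

    def render(self) -> str:
        # Fact-first ordering avoids the Familiarity Backfire Effect;
        # the alternative explanation fills the gap the retraction creates.
        return "\n\n".join([
            self.core_fact,
            self.warning,
            f"Myth: {self.myth}",
            self.retraction,
            self.alternative,
        ])

# Usage, with the global-warming example from this deck:
print(Debunking(
    core_fact="Earth has been building up heat for 4 decades.",
    warning="There are many myths on global warming that ignore the full picture.",
    myth="Periods of surface cooling show that global warming has stopped.",
    retraction="That is incorrect.",
    alternative="Heat sloshes between ocean and atmosphere, yet Earth keeps building up heat.",
).render())
```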
For a review, see Lewandowsky et al. (2012)
article in Psychological Science in the Public
Interest, available at:
http://psi.sagepub.com/content/13/3.toc
For a brief summary, see Cook &
Lewandowsky’s (2011) Debunking Handbook,
available at:
http://sks.to/debunk
For more literature &
enquiries, please visit
www.cogsciwa.com
References
Bedford, D. (2010). Agnotology as a teaching tool: Learning climate science by studying misinformation. Journal of Geography, 109, 159-165.
Chater, N., & Vitanyi, P. (2003). Simplicity: a unifying principle in cognitive science. Trends in Cognitive Sciences, 7, 19-22.
Church, J. A., et al. (2011). Revisiting the Earth's sea-level and energy budgets from 1961 to 2008. Geophysical Research Letters, 38, L18601.
Ecker, U. K. H., Lewandowsky, S., & Tang, D. T. W. (2010). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38, 1087-1100.
Ecker, U. K. H., Lewandowsky, S., & Apai, J. (2011a). Terrorists brought down the plane!—No, actually it was a technical fault: Processing corrections of emotive information.
Quarterly Journal of Experimental Psychology, 64, 283-310.
Ecker, U. K. H., Lewandowsky, S., Swire, B., & Chang, D. (2011b). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction.
Psychonomic Bulletin & Review, 18, 570-578.
Heath, C., & Heath, D. (2007). Made to stick: Why some ideas survive and others die. New York: Random House.
Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When misinformation in memory affects later inferences. Journal of Experimental Psychology:
Learning, Memory and Cognition, 20, 1420-1436.
Kahan, D. M. (2010). Fixing the communications failure. Nature, 463, 296-297.
Kowalski, P., & Taylor, A. K. (2009). The effect of refuting misconceptions in the introductory psychology class. Teaching of Psychology, 36, 153-159.
Kull, S., Ramsay, C., & Lewis, E. (2003). Misperceptions, the media, and the Iraq war. Political Science Quarterly, 118, 569-598.
Lewandowsky, S. (2011). Popular consensus: Climate change is set to continue. Psychological Science, 22, 460-463.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological
Science in the Public Interest, 13, 106-131.
Lewandowsky, S., Stritzke, W. G. K., Oberauer, K., & Morales, M. (2005). Memory for fact, fiction and misinformation: The Iraq War 2003. Psychological Science, 16, 190-195.
Lombrozo, T. (2006). The structure and function of explanations. Trends in Cognitive Sciences, 10, 464-470.
Lombrozo, T. (2007). Simplicity and probability in causal explanation. Cognitive Psychology, 55, 232-257.
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303-330.
Nyhan, B., & Reifler, J. (2011). Opening the political mind? The effects of self-affirmation and graphical information on factual misperceptions. Unpublished manuscript, Dartmouth
College.
Owens, S. R. (2002). Injection of confidence: The recent controversy in the UK has led to falling MMR vaccination rates. European Molecular Biology Organization Reports, 3, 406-409.
Poland, G. A., & Jacobsen, R. M. (2011). The age-old struggle against the antivaccinationists. The New England Journal of Medicine, 364, 97-99.
Prasad, M., Perrin, A. J., Bezila, K., Hoffman, S. G., Kindleberger, K., Manturuk, K., et al. (2009). “There must be a reason”: Osama, Saddam, and inferred justification. Sociological
Inquiry, 79, 142-162.
Schwarz, N., Sanna, L. J., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information
campaigns. Advances in Experimental Social Psychology, 39, 127-161.
Wilkes, A. L., & Leatherbarrow, M. (1988). Editing episodic memory following the identification of error. Quarterly Journal of Experimental Psychology: Human Experimental
Psychology, 40, 361-387.