
Misinformation and Disinformation: Using behavioural science to tackle the spread of misinformation (OECD, 2023)

MISINFORMATION AND DISINFORMATION
An international effort using behavioural science to tackle the spread of misinformation
Public Governance Policy Paper
TABLE OF CONTENTS
Key policy messages
Executive Summary
1. Why? Drivers and Objectives
    Introduction
    Behavioural Science for information ecosystems
    Behavioural experimentation for sustainable policy solutions
    Global responses to global challenges
2. How? A case study from Canada
    Background and objectives
    Approach and study design
    Findings
    Findings summary
    Limitations
3. Future considerations following first case study
    Key takeaways from the case study
    Next steps in behavioural research for misinformation
4. So what? Implications for policy makers
References
ACKNOWLEDGEMENTS
This document was developed by Chiara Varazzani, Lead Behavioural Scientist (OECD), Michaela
Sullivan-Paul (OECD), Lauryn Conway, Andrea Colasanti, and Nicholas Diamond from the Canadian
Privy Council Office’s Impact and Innovation Unit (IIU).
The OECD Secretariat is grateful to have partnered with the Canadian Privy Council Office’s Impact
and Innovation Unit (IIU) in preparing this document. The OECD Secretariat is especially grateful for
the many contributions of Brian Pereira and Alyssa Whalen from the Canadian Privy Council Office’s
Impact and Innovation Unit (IIU), and Mariam Chammat from the French Behavioural Insights Unit,
Direction interministérielle de la transformation publique (DITP).
The OECD Secretariat would also like to thank the many colleagues who provided peer review
comments and feedback to this work, which include: Jan Pfänder (DITP), Sacha Altay (Reuters
Institute for the Study of Journalism, Oxford University), Julio Bacio Terracino (OECD), Frédéric
Boehm (OECD), Jeanne Bolleé (OECD), Monica Brezzi (OECD), Craig Matasick (OECD), David
Nguyen (OECD), Mariana Prats (OECD), and Andrea Uhrhammer (OECD), as well as design and
editing support by Claire Karle (OECD).
The OECD Secretariat would also like to thank the members of the OECD global network of
Behavioural Insights experts in government who provided detailed and timely feedback and
comments during the consultation stages. The OECD is grateful to all public officials from Australia,
Austria, Belgium, Brazil, Canada, Chile, Colombia, Czech Republic, Denmark, Estonia, Finland,
France, Germany, Greece, Hungary, Iceland, India, Indonesia, Ireland, Italy, Japan, Lebanon, Malaysia,
Netherlands, New Zealand, Norway, Portugal, Romania, Singapore, Slovak Republic, South Africa,
Spain, Sweden, Switzerland, Turkey, United Arab Emirates, United Kingdom, and United States for
their participation in the activities of the OECD’s Behavioural Insights Expert Network Meetings and
their valuable contributions to this document.
This work was completed under the leadership of Marco Daglio, Head of the Observatory for Public
Sector Innovation, OECD, Carlos Santiso, Head of the Open and Innovative Governance Division,
OECD, and the overall steering of Elsa Pilichowski, Director of the Public Governance Directorate,
OECD, and Rodney Ghali, Assistant Secretary to the Cabinet, Privy Council Office of Canada.
KEY POLICY
MESSAGES
• Global policy challenges such as misinformation overwhelmingly rely on individual behaviours and social factors to drive group-level action and change, both on and offline.
• Understanding how cognitive, emotional, and social factors influence the way individuals navigate information ecosystems, and conversely, how information ecosystems influence individual-level behaviours, can allow governments to better design and deliver policies, programs and communications.
• The OECD convened a first-of-its-kind international partnership on behavioural science and misinformation between the Canadian and French governments to develop and disseminate behaviourally-informed and evidence-based solutions that can guide government responses to misinformation.
• The study tested 1,872 Canadians’ intentions to share false COVID-related headlines online with two behavioural interventions: an accuracy evaluation prompt and a digital media literacy prompt.
• The data generated by this partnership show that the digital media literacy tips reduced intentions to share fake news online by 21% compared to the control group, having the greatest impact on online users.
• These insights can enable policy makers to enact measures that defend and empower online users against environments designed to exploit certain natural but maladaptive tendencies, and place control back into the hands of online users.
• Relying solely on traditional top-down approaches that aim to regulate content is insufficient to limit the immediate dangers of misinformation.
• Innovative policy-making tools such as behavioural science can help provide immediate and long-term solutions to misinformation and should be considered as part of a holistic and comprehensive strategy to offset its threats.
• Governments should conduct rigorous policy experiments in collaboration with other countries, like the one presented here, before enacting policy that affects a larger population, to address the cross-border nature of misinformation.
EXECUTIVE
SUMMARY
Mis- and disinformation can have profound
effects on the ability of democratically-elected
governments to serve and deliver to the public
by disrupting policy implementation and hindering trust in
institutions.
Driven by a joint objective to better understand and
reduce the spread of misinformation with insights
and tools from behavioural science, the OECD, in
partnership with behavioural science experts from
Impact Canada (IIU) and from the French Direction
interministérielle de la transformation publique (DITP),
launched a first-of-its-kind international collaboration.
This exercise in knowledge sharing of best practices
between governments and academics initiated a study
conducted in Canada using a randomised controlled trial
– embedded within the longitudinal COVID-19 Snapshot
Monitoring Study (COSMO Canada).
This study tested the impact of two behaviourally-informed interventions on intentions to share true
and false news headlines about COVID-19 on social
media: an attention accuracy prompt and a set of digital
media literacy tips. While both behaviourally-informed
interventions were found to be effective, the digital
literacy tips were significantly more effective than the
accuracy evaluation prompt at improving the quality of
news sharing, reducing intentions to share false headlines
about COVID-19 by 21%.
Overall, results from this experiment suggest that user-end, scalable behavioural interventions can reduce
sharing of false news headlines in online settings. These
results offer compelling avenues for empowering
individuals to make active choices about information
quality while preserving citizen autonomy as a priority.
These tools can provide pre-emptive and complementary
approaches that can be deployed alongside system-level approaches that regulate, set standards, or
otherwise address false information online. The results
of this study are an important step in bolstering our
understanding of the impact of mis- and disinformation
and testing feasible, effective, and collaborative
solutions.
The key insights generated from the results of this study
can be summarised as follows:
• Behavioural interventions are effective, scalable tools for tackling the spread of misinformation and they can enhance system-level policies aimed at empowering users.
• A comprehensive policy response to mis- and disinformation should include an expanded understanding of human behaviour.
• International and collaborative experimentation is vital for tackling global policy challenges with a sustainable response to the spread of mis- and disinformation.
The paper concludes by outlining additional opportunities
for research and urging governments to increase efforts
to conduct policy experimentation in collaboration with
other countries before enacting policy.
1. WHY? DRIVERS AND OBJECTIVES
INTRODUCTION
Advancements in digital technologies and
information ecosystems – which here, refers
to all relevant regulation, actors, digital
platforms, and information itself – have fundamentally
reshaped the way information is shared and consumed.
On the one hand, this interconnected network of
information has facilitated greater civic participation
in political, economic and social life while also
enhancing governments’ capacity to communicate and
respond to the evolving needs of the public they serve
(OECD, 2014[1]). On the other hand, instantaneous
and continuous exposure to conflicting and rapidly
changing information further complicates an already
overwhelming landscape of information and knowledge
exchange (OECD, 2021c[2]). Although this increase
in accessibility facilitates an unprecedented conduit
to knowledge and opportunities, it also provides the
optimal environment for circulating false and harmful
information at a rapidly accelerated pace. The rise in the
circulation and exposure of inaccurate information has
trickled into all aspects of public life, including health,
education, market activity, financial literacy, and political
and social affairs (Greifeneder et al., 2021[3]).
Inaccurate information is often presented in common
formats such as junk science, fake news, and gossip
tabloids, among others, but is best understood as either
misinformation, which is the unintentional spread of
inaccurate information, or disinformation, which is the
intentional spread of inaccurate information for political,
financial, or otherwise self-serving ends (Carrasco-Farré, 2022[4]; OECD, 2021c[2]).
The threats posed by the spread of mis- and
disinformation have profound effects on the ability
of democratically-elected governments to serve their
public, disrupting policy implementation and hindering
trust in institutions. Exposure to false or misleading
statements can cast doubt on official and factual
information, and can erode the integrity and credibility of
democratic institutions and their ability to enhance public
welfare through policy measures (Brezzi et al., 2020[5];
Colomina et al., 2021[55]; Lesher et al., 2022[6]). Fears
relating to the increased circulation of false information
were abundant when reports broke of the Cambridge
Analytica Scandal in 2018, and its influence in political
affairs such as the 2016 United States Presidential
Election and the 2016 UK referendum to withdraw
from the European Union (BBC, 2018[7]; Bovet & Makse,
2019[8]; Marshall & Drieschova, 2018[9]). Although
we are unable to accurately assess the full impact of
misinformation, risk perceptions associated with the
spread of misinformation among online users remain
high. A recent survey conducted across 142 countries found that 58.5% of regular Internet users reported concerns over the spread of misinformation, with the highest concern concentrated in liberal democratic governments (Knuutila et al., 2022[10]). Despite growing worries over the manipulation of digital technologies for interfering in political and social affairs, the implications of mis- and disinformation on the dynamics between government and the public became most apparent during the COVID-19 pandemic, when accessibility to accurate, timely, and reliable information became crucial in all efforts to control the spread of the virus (Posetti & Bontcheva, 2020[11]).

BEHAVIOURAL SCIENCE FOR INFORMATION ECOSYSTEMS

The rapid advancements in online informational platforms have created a sophisticated environment of triggers, cues, and choices that have broadened the scope of influence that digital technologies have on individual-level behaviour. Unlike traditional informational mediums, digital hubs engage users in active behaviours and decision-making, presenting opportunities for even small digital modifications to amount to large-scale changes in collective behaviours (Lorenz-Spreen et al., 2020[13]). Elevated by the rise of digital environments designed to compete for user engagement and attention, there has been an alarming increase in the use of proprietary, non-transparent, and often unregulated measures, such as sophisticated AI algorithms, that are designed to extract and exploit user patterns and behaviours (Lorenz-Spreen et al., 2020[13]; Matasick et al., 2020[14]). These sophisticated information spaces, which are increasingly used as primary sources for news information, can intentionally or unintentionally destabilize a shared sense of truth and undermine trust in public institutions and authoritative sources of information about all aspects of public life.
Behavioural science is a multidisciplinary approach to
the study of human behaviour and decision-making,
combining findings and methods from cognitive science,
decision science, social science, as well as psychology,
anthropology, and economics, to integrate a human-centred perspective towards today’s policy goals. Within
the realm of misinformation, behavioural science draws
on knowledge of human behaviour and current, as well
as past contextual factors, to understand how individuals
consume and interact with information, what attracts
them to certain information sources or mediums, and
why some may be more easily targeted by false or
unsubstantiated claims than others (Van der Linden et
al., 2021[15]).
As a discipline, behavioural science provides an empirical
framework that supports researchers and practitioners
in performing deeper analyses that explore the social,
emotional, and cognitive conditions – such as trust
bonds or openness to information – that influence how
individuals interact with the information around them
(for examples, see, Roozenbeek et al., 2020[16]; Van
der Linden et al., 2020[17]; Zimmermann & Kohring,
2020[18]). In doing so, behavioural science can also allow
researchers, policy makers, and Big Tech to disentangle
information ecosystems specifically – including their
design, functions, and objectives – to better understand
how to carefully manage online platforms to optimise
their potential to serve, rather than exploit, users.
Understanding how complex cognitive, emotional, and
social processes influence the way individuals navigate
information ecosystems, and conversely, how information
ecosystems influence individual-level behaviours, can
allow governments, communicators, and behavioural
science practitioners to better understand the landscape
in which they design and deliver policies, programs and
communications.
Cognitive and contextual factors have a particularly
strong influence over how people exchange and consume
information. Cognitive overload, for example, can
cause individuals to reject true information because of
negative emotions, such as stress, anxiety, confusion,
fear, or fatigue, that are felt when they are subjected
to high volumes of information (Sweller, 1988[19]). The
most recent Digital News Report, published by Reuters
Institute at the University of Oxford, finds an increase in
news fatigue, not only for COVID-related news, but also
around a variety of topics, instigating readers’ selective
avoidance to reduce their consumption of specific
information (Newman et al., 2022[20]). Conversely,
confirmation bias and anchoring bias can influence the
way individuals are prone to accepting misinformation.
Confirmation bias describes the tendency to reject
information that does not confirm or support
predetermined beliefs or ideas. This bias is worsened
by the presence of anchoring bias, which occurs when
individuals anchor their beliefs and opinions to outdated
or irrelevant information rather than updating their
perspectives according to recent or new information
(Furnham & Boo, 2011[21]).
Behavioural science can also provide frameworks for
analysing the underlying systems of choice and decision
making to establish how these relate to individuals’
information-seeking and sharing behaviours. For example,
understanding the core elements that influence how
individuals form trust and what social and contextual
factors can strengthen and weaken that trust can
provide useful insights into how individuals decide
where to seek out information, their willingness to
accept or reject that information, and can even help
predict their susceptibility to believing and sharing
misinformation (Agley & Xiao, 2021[12]; Ognyanova et al.,
2020[22]).
Broadening our collective understanding on the
determinants of trust also provides insights into the ways
social, emotional, and cognitive factors influence the
dynamics between government institutions and the public
they serve.
For instance, individuals’ willingness to comply with
COVID-containment measures in Europe was found
to be associated with their trust towards policy makers
prior to the pandemic (Brezzi et al., 2021[23]). Through
this, and similar insights, behavioural science, when
embedded throughout the policy cycle, can provide
practical and empirically-tested advice to help inform
problem-identification, as well as contribute to the
design and implementation of sustainable and targeted
policy solutions that serve the needs of the people they
are enacted to benefit.
In addition to helping identify factors that are common
to individuals everywhere, behavioural science also
provides a scientific basis for better understanding the
factors that make individuals unique. Collecting data
on the diversity in people’s preferences, perspectives,
concerns, and motivations – as well as the factors that
contribute to variations in these – can allow policy makers
to propose human-centred solutions that reflect the
unique composition of their audience. Classic socio-economic indicators, such as sex, income, or education level, can be enhanced with relevant data about the diverse preferences and needs of the public and, ideally, inform decisions that are targeted and tailored to the preferences of those they aim to serve. As such, failure to employ a behavioural lens can be detrimental to the uptake and overall success of such policy solutions.

BEHAVIOURAL EXPERIMENTATION FOR SUSTAINABLE POLICY SOLUTIONS

During the pandemic, governments began exploring opportunities to leverage behavioural science and experimentation in their COVID-response measures (OECD, 2020[24]). Frameworks,
toolkits, and additional resources aimed at providing
governments with guidance for combatting the spread
of COVID-19 frequently pointed to the advantages
of applying behavioural functions to reinforce health
and safety and to better understand how citizens were
experiencing a novel situation without a clear precedent.
COVID-19 containment measures rely overwhelmingly
on individual behaviours and people’s willingness to
increase practices such as handwashing, mask-wearing,
and social distancing. As such, initiatives employing
a behavioural lens to understand the barriers and
enablers that influence individuals’ willingness to engage
in such practices became imperative to the design
and implementation of far-reaching and typically all-encompassing measures (European Centre for Disease
Prevention and Control, 2021[25]; Office of Evaluation
Sciences, 2021[26]; WHO, 2020[27]).
Subsequent reports for more targeted applications of
behavioural science emerged with recommendations
for addressing misinformation relating to COVID-19
specifically, including the World Health Organisation
(WHO) behavioural science survey tool which was
used to inform the design of the featured case study
(WHO, 2020[27]) (for additional examples see, OECD,
2020[24]; Office of the U.S. Surgeon General, 2021[28]).
Governments and BI teams began conducting their own
research to understand the effects of behaviourally-informed policy within their jurisdictions and to scale
successful interventions as part of their own COVID-19
response. This included gathering insights on how
citizens were coping with the effects of the pandemic,
trends in attitudes and behaviours relating to the
COVID-19 virus, and whether response measures and
communications were serving the public as intended.
For example, the US Office of Evaluation Sciences
launched a five-year project, encompassing eight
randomised controlled trials, to test the effects of seven
different behaviourally-informed communications for
increasing vaccination uptake among Americans, which
found behavioural interventions to be effective
for boosting vaccination rates (Office of Evaluation
Sciences, 2021[26]). Similarly, the Irish Department of
Health used empirical research on the use of cognitive
levers such as salience, accessibility, and informational
provisions to inform public communications on
handwashing in public and workplaces (Murphy,
2020[29]). They used randomised controlled trials to test
the effects of behaviourally-informed methods, such as
goal-framing, on Ireland’s contact-tracing application
COVID Tracker, finding benefits for increasing uptake,
trust, and participation for contact-tracing apps
(Julienne et al., 2020[30]).
Contrary to beliefs that research and rigorous evaluation
are always time- and cost-intensive, experimentation
designed with a behavioural lens can provide valuable
evidence-based insights even when time and budget are
limited. These examples, including the featured case
study, represent only some of the many successful
behaviourally-informed tests conducted during a time of
crisis where time and resources were scarce.
These examples reinforce the observable trend towards
mainstreaming behavioural science in the policy cycle,
while also underscoring the importance of using
quantitative and qualitative research methods to provide
key insights at the earliest and most critical stages of the
policy cycle. Holistic problem-diagnostics, coupled with
empirical data about the unique preferences and needs
of a population can be used to better anticipate policy
outcomes and avoid unintended, negative or backfire
effects by testing and/or comparing iterations of policy
interventions on a subset of the population of interest
in a controlled environment. Rigorous testing by BI
practitioners is fundamental to understanding key policy
areas like climate change, social inequality, policy reform,
and mis- and disinformation, as well as individuals’
attitudes and acceptance of specific policy interventions
such as taxation, regulation, or social support proposals.
GLOBAL
RESPONSES
TO GLOBAL
CHALLENGES
The insights produced by behaviourally-informed
experimentation become even more powerful
when tested and replicated in different
environments and across diverse samples. Comparative
analyses between different countries or along different
demographic segmentations can provide significant
insights on the influence of economic, social, political,
and cultural preferences on shaping and shifting human
behaviour. These insights can in turn help explain why
certain policies are successful in some contexts and not in
others, while also revealing additional social or economic
trends that can help inform current and future policy
objectives. Although these analyses are relevant for
local, regional, and national levels, their insights provide a
unique benefit for the global community concerned with
challenges that demand cross-border co-operation and
co-ordination.
The global nature of the COVID-19 pandemic demands
transnational responsiveness. This is a common
characteristic of today’s global challenges, such as climate
change, and misinformation, which are united by their
boundless influence and implications. Solidarity among
and across governments is the first step in co-ordinating
efforts to tackle global policy issues. The OECD continues
to foster global dialogue on key challenges such as mis- and disinformation by convening governments with
shared objectives of generating relevant and timely
research that can protect the integrity of democratic
institutions and their ability to maintain and enhance
the well-being of the public. Examples of the OECD’s
contribution in this regard include publications such as
the OECD Report on Public Communication: The Global Context and the Way Forward (2021[2]) and Governance
responses to disinformation: How open government
principles can inform policy options (Matasick et al.,
2020[14]). The impact of this work is enhanced through
co-ordinated efforts across governments to achieve cross-border innovation and experimentation, and contribute to
global knowledge on policy challenges and their solutions
(OECD, n.d.[31], 2021a[32], 2021b[33], 2022[34]).
2. HOW? A CASE STUDY FROM CANADA
BACKGROUND
AND OBJECTIVES
In 2017, the Government of Canada established
Impact Canada, a whole of government framework for
scaling up and mainstreaming outcomes-based policy/
program methods, such as the application of insights
and methodologies from the behavioural sciences. This
portfolio of work seeks to bridge the gap between policy
development and effective implementation. With a
Centre of Expertise housed in the Privy Council Office (a
central agency in Canada’s federal public service), Impact
Canada comprises a multidisciplinary, specialised
team with extensive experience in the development and
execution of novel policy and program methods, and is
home to Canada’s Behavioural Science Team.
In March 2020, Impact Canada launched a program of
applied research – grounded in behavioural science – to
support the Government’s COVID-19 response efforts
in accurately and effectively promoting behaviours
recommended by Canadian public health experts. This
included a nationwide longitudinal study (COVID-19
Snapshot Monitoring Study - or ‘COSMO’ for short) which
was adapted from a suite of resources released by the
WHO (Privy Council Office of Canada, 2020[35]; WHO,
2020[27]). Between April 2020 and 2021, sixteen waves
of data from roughly the same cohort of 2,000 Canadians
were collected through COSMO. Over time, COSMO has
amassed a rich and robust dataset reflecting Canadians’
diverse pandemic experiences.
In continuing to monitor and analyse the pandemic
through a behavioural lens, a series of troubling signals
emerged from the data related to misinformation. For
example, between April 2020 and 2021, Impact Canada
identified pervasive knowledge gaps and high levels of
belief in verifiably false information about COVID-19.
Importantly, greater belief in misinformation was
consistently associated with important health-related
behaviours, such as intentions to get vaccinated (and
boosted) against COVID-19; a finding later corroborated
in the broader academic literature (for example, Loomba
et al., 2021[36]). These findings – alongside data emerging
from the international community – alerted the team to
the threat of misinformation for COVID-19 response
efforts, and prompted the development of a new scope of
work focused on better understanding and mitigating the
spread of misinformation in Canada.
To tackle this, the OECD convened a first-of-its-kind
partnership between Impact Canada, the French
Behavioural Insights Team at the DITP, and academic
experts. At its core, the partnership aims to develop
and share behaviourally-informed best practices that
can guide government response to this pressing policy
challenge. To this end, Impact Canada led an inaugural,
collaborative project amongst the partners: a randomised
controlled trial embedded within COSMO Canada aimed
at better understanding and improving the quality of
information shared in the online social media ecosystem.
APPROACH AND
STUDY DESIGN
Existing research suggests that changing behaviours
relating to sharing misinformation (i.e., the act of
being exposed to misinformation and choosing
to share it on social media) can significantly impact the
quality of information circulating online (Pennycook &
Rand, 2022[37]). This study was designed as a first step
towards building knowledge on how Canadians engage
with COVID-related information online and make
choices about what types of information to share or
amplify within their social networks.
The academic literature offers a few theoretical accounts
of why individuals share misinformation, acknowledging
that drivers of susceptibility and spread are complex and
multifaceted. One such account, explored in detail in a
recent Nature publication by Pennycook and colleagues
(2021[38]), hypothesised that most people do care about
sharing accurate information, and can discern true from
false content; however, they are often distracted from
thinking about the accuracy of information in online
settings. Other features of the user experience when
scrolling rapidly through high volumes of mixed-content
likely interfere with analytical thinking processes and pull
attention away from the accuracy of information when
making decisions about what to share in one’s networks
(Zimmerman & Kohring, 2020[18]; Pennycook et al.,
2020[39]). Accordingly, the authors provide evidence that
a brief intervention designed to cue users to think about
accuracy, prior to engaging with social media content, can
reduce the sharing of false content online (Pennycook et
al., 2021[38]). They have since replicated this finding across
a number of settings and using a variety of different
types of news stimuli (including COVID-related news)
(Pennycook and Rand, 2022[37]).
Leveraging this research, Impact Canada and its partners
sought to replicate Pennycook and colleagues’ (2021[38])
work in a Canadian context and contrast it with a
digital media literacy tips intervention (building upon approaches tested by Epstein et al., 2021[40] and Guess et al., 2020[41]). Cross-national replication and extension
of experiments is particularly important here, as belief in
misinformation and sharing of misinformation has been
found to vary substantially across countries (Arechar et
al., 2022[42]). The objectives of this study were threefold:
(1) establish a baseline understanding of Canadians’
willingness to share COVID-19 information online, (2)
better understand the individual- and environmental-level factors contributing to sharing decisions, and (3) evaluate the effectiveness of two behaviourally-informed interventions aimed at cueing attention to information accuracy and thereby reducing the spread of
verifiably false or debunked information.
Figure 1. Delineative representation of the four experimental groups

Conditions (randomised; full sample n = 1872):
•	Accuracy Judgment Control (n = 282)
•	Sharing Intention Control (n = 525)
•	Treatment 1: Accuracy Evaluation (n = 532)
•	Treatment 2: Media Literacy Tips (n = 533)

Stimuli: all participants were shown the same 14 COVID-19 headlines (7 true, 7 false; randomised order).

Outcome variables:
•	Accuracy judgment: “To the best of your knowledge, is the claim in the above headline accurate?” (4-point Likert: Very Inaccurate - Very Accurate)
•	Sharing intention: “If you were to see the above article online (for example, through Facebook or Twitter), how likely would you be to share it?” (6-point Likert: Extremely Unlikely - Extremely Likely)

Source: Diamond, N.B., Pereira, B., Colasanti, A., Chammat, M., Varazzani, C., & Conway, L. (2022, May[43]). Understanding and countering misinformation in Canada with insights from the behavioural sciences [Conference]. Behavioural Science and Policy Association Annual Conference 2022.
To achieve these objectives, the IIU designed and
embedded an online randomised controlled trial within
Wave 15 of the COSMO Canada survey (fielded August
12-14, 2021). The study included a broadly nationally
representative sample of 1872 Canadians. Participants
were randomly assigned to one of four groups: (1)
Accuracy Judgment control; (2) Sharing Intentions
control; (3) Accuracy evaluation treatment; and (4)
Digital Media Literacy treatment (see Figure 1).
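The four-arm design described above can be sketched in code. The following is an illustrative Python mock-up only; the arm names, task labels, and weights are shorthand for the reported design, not the study’s actual survey code:

```python
import random

# Illustrative mock-up of the four-arm randomised design (hypothetical names).
ARMS = {
    "accuracy_judgment_control": ["rate_accuracy_14_headlines"],
    "sharing_intentions_control": ["rate_sharing_14_headlines"],
    "accuracy_evaluation_treatment": [
        "rate_accuracy_neutral_headline",   # one-time accuracy prompt
        "rate_sharing_14_headlines",
    ],
    "digital_media_literacy_treatment": [
        "read_media_literacy_tips",
        "attention_check",
        "rate_sharing_14_headlines",
    ],
}

def assign(rng: random.Random) -> str:
    """Assign one participant to an arm, in roughly the reported
    proportions (282 / 525 / 532 / 533 of n = 1872)."""
    return rng.choices(list(ARMS), weights=[282, 525, 532, 533], k=1)[0]

rng = random.Random(0)
arm = assign(rng)
tasks = ARMS[arm]
# Every arm ends with a rating task; both treatments end with the same
# sharing-intentions task as the sharing intentions control.
assert tasks[-1].startswith("rate_")
```

Note that both treatments share their final outcome task with the sharing intentions control group, which is what makes the between-arm comparison of sharing intentions possible.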
Accuracy Control
The accuracy control group served as a baseline measure
of Canadians’ ability to discern between true and false
COVID-19 news headlines. Participants (n=282) were
presented with 14 real COVID-related news headlines
(half true and half false, in randomised order) as they
would appear on social media. Headlines were drawn
from previously published work by Pennycook and
colleagues, and false headlines were deemed false by
third-party fact-checking websites. Participants were
then asked, ‘To the best of your knowledge, is this claim
in the above headline accurate?’ and given a 4-point
scale ranging from ‘not at all accurate’ to ‘very accurate’,
following Pennycook et al. (2021[38]; Studies 3-5).
Sharing Intentions Control
The sharing intentions control group served as a
baseline measure of Canadians’ intentions to share true
and false headlines online. Participants (n=525) were
presented with the same 14 headlines and asked, “How
likely would you be to share this news headline on social
media?”. Participants responded using a 6-point scale
ranging from ‘extremely unlikely’ to ‘extremely likely’.
Accuracy Evaluation Prompt
Participants in the accuracy evaluation prompt group
(n=532) were presented with an accuracy evaluation
prompt containing a neutral, non-COVID-19 related
headline, “Scientists discover the ‘most massive neutron star ever detected’”, and asked to rate its accuracy (using the
same 4-point scale provided to the accuracy control group).
Following this prompt, the participants were presented
with the same 14 headlines and asked how likely they
would share them on social media (using the same 6-point
scale as the sharing intentions control group).
Digital Media Literacy Tips
Participants in this group (n = 533) were first provided
with a list of digital media literacy tips, used in prior
academic research (Guess et al., 2020[41]):
•	“Investigate the source. [Ensure that the story is written by a source that you trust with a reputation for accuracy. If the story comes from an unfamiliar organization, check their “About” section to learn more]”;
•	“Check the evidence. [Check the author’s sources to confirm that they are accurate. Lack of evidence or reliance on unnamed experts may indicate a false news story.]”;
•	“Look at other reports. [If no other news source is reporting the same story, it may indicate that the story is false. If the story is reported by multiple sources you trust, it’s more likely to be true.]”;
•	“Be skeptical of headlines. [False news stories often have catchy headlines in all caps with exclamation points. If shocking claims in the headline sound unbelievable, they probably are.]”;
•	“Watch for unusual formatting. [Many false news sites have misspellings or awkward layouts. Read carefully if you see these signs.]”.
Participants then completed a multiple-choice attention check question to confirm they had read the tips. Participants were then presented with the same
14 headlines and asked how likely they would share
them on social media (using the same 6-point scale
as the sharing intentions control group and accuracy
evaluation prompt group). In both treatments 1 and
2, the intervention (accuracy evaluation prompt or
digital media literacy tips) occurred one time, before
beginning the sharing intentions rating task.
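As a toy illustration of the outcome construct analysed below (the gap between sharing intentions for true versus false headlines), assuming hypothetical 6-point ratings:

```python
# Toy illustration (hypothetical data) of "sharing discernment": the gap
# between mean sharing intentions for true versus false headlines on the
# 6-point scale used in the study.
true_shares = [3, 2, 4, 2, 3, 3, 2]    # ratings for the 7 true headlines
false_shares = [2, 1, 2, 2, 1, 2, 1]   # ratings for the 7 false headlines

def mean(xs):
    return sum(xs) / len(xs)

discernment = mean(true_shares) - mean(false_shares)
# A larger positive gap means sharing tracks veracity more closely; the
# interventions aim to widen it by lowering false-headline sharing.
print(round(discernment, 2))  # 1.14
```

Both interventions below are evaluated in terms of how they change this gap relative to the sharing intentions control group.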
FINDINGS

FINDING 1
Some individuals may share news that they do not believe

First, the results of this experiment showed that both accuracy ratings and sharing intentions were higher for true versus false headlines (see Figure 2). That is, overall, respondents were more likely to rate true headlines as accurate compared to false headlines, and indicated higher intentions to share true headlines, as expected. However, the difference in sharing intentions for true versus false headlines was four times smaller (in effect size) than the difference in accuracy judgments.

These findings replicate the disconnect between accuracy judgments and sharing intentions first identified by Pennycook et al. (2021[38]). Given that sharing intentions appear to be at least partly disconnected from knowledge of what is true and false, these results indicate that some individuals may share news that they do not believe.

Figure 2. Accuracy ratings (A) and sharing intentions (B) for true vs. false headlines
[Figure: panel A shows perceived accuracy (4-point scale) and panel B shows intention to share (6-point scale) for false vs. true news; both measures are significantly higher for true news (***), with a much larger gap for accuracy than for sharing.]

***p < .001, **p < .01, *p < .05; Holm correction for multiple comparisons; error bars are 95% confidence intervals. For accuracy perceptions, participants were asked ‘To the best of your knowledge, is this claim in the above headline accurate?’, and responded on a 4-point scale ranging from ‘not at all accurate’ to ‘very accurate’. For sharing intentions, participants were asked ‘If you were to see the above article online (for example, through Facebook or Twitter), how likely would you be to share it?’, and responded on a 6-point scale ranging from ‘extremely unlikely’ to ‘extremely likely’. Here, and in below analyses, participants who responded ‘extremely unlikely’ to every headline (both true and false) were omitted from analysis, consistent with prior literature. Scales were derived from Pennycook et al. (2021[38]; Studies 3-5).

Source: Diamond, N.B., Pereira, B., Colasanti, A., Chammat, M., Varazzani, C., & Conway, L. (2022, May[43]). Understanding and countering misinformation in Canada with insights from the behavioural sciences [Conference]. Behavioural Science and Policy Association Annual Conference 2022.

FINDING 2
Shifting attention to accuracy significantly increased the quality of information shared

Secondly, replicating Pennycook et al. (2021[38]), the experiment demonstrated that providing an accuracy evaluation prompt (i.e., asking respondents to evaluate the accuracy of an unrelated, neutral news headline) improved the overall quality of news shared relative to the control condition (see Figure 3). In other words, the accuracy evaluation prompt increased the gap in sharing intentions for true versus false headlines by eliciting a small reduction in sharing intentions for false headlines. The effect of this intervention was very subtle, but consistent with Pennycook et al. (2021[38]), who also demonstrated that alternative prompts (e.g. asking participants to rate the humorousness of a single headline prior to engaging with social media content) did not affect sharing intentions. Ultimately, this finding replicates and extends what has previously been demonstrated in the literature to a Canadian context, and provides additional evidence that accuracy evaluation prompts may be a promising tool for improving the quality of information shared online.

FINDING 3
Digital media literacy tips had the greatest impact on reducing intentions to share false headlines online

Similar to the accuracy evaluation prompt, providing a list of digital media literacy tips also improved the overall quality of news shared. Although both interventions were found to be effective relative to the sharing intentions control group, digital media literacy tips were much more effective than the accuracy evaluation prompt in reducing participants’ intentions to share false headlines (see Figure 3).

Figure 3. Sharing intentions for false vs. true headlines across the three sharing conditions
[Figure: boxplots of intention to share (6-point scale) for false vs. true news in the Sharing Intention Control, Treatment 1: Accuracy Evaluation, and Treatment 2: Media Literacy Tips conditions; the media literacy tips condition shows a 21% reduction (d = .41) in intentions to share false news (***).]

***p < .001, **p < .01, *p < .05; Holm correction for multiple comparisons; along with boxplots, diamonds with white fill depict group means, error bars depict 95% confidence intervals, and small coloured dots depict individual participants.

Source: Diamond, N.B., Pereira, B., Colasanti, A., Chammat, M., Varazzani, C., & Conway, L. (2022, May[43]). Understanding and countering misinformation in Canada with insights from the behavioural sciences [Conference]. Behavioural Science and Policy Association Annual Conference 2022.

The digital media literacy tips intervention reduced false headline sharing 3.5x more than the accuracy evaluation prompt. Relative to the control group, digital media literacy tips resulted in a 21% decrease in stated intentions to share false headlines. Given that digital media literacy tips had such a strong effect on false news sharing intentions, more research is needed to investigate which aspects of digital media literacy, and which kinds of information delivery, are most effective.

FINDING 4
Individual differences in trust and information consumption shape sharing of, and belief in, misinformation about COVID-19
As some individuals may believe and/or share
misinformation more than others, several key questions
remained: who believes and spreads misinformation
in the first place, and why? Do our interventions work
similarly for different sub-groups of people? To answer
these questions, we leveraged the richness of the
COSMO Canada survey data, which included a variety of validated measures of respondents’ attitudes, beliefs, and cognitive factors, as well as data on trust and information
consumption. We hypothesised that differences in
the sources respondents trust for COVID-related
information, and the way they seek out and react to that
information, would be associated with differences in
susceptibility to misinformation – both in terms of belief
and in sharing.
First, a multivariate machine learning approach
(‘k-means clustering’) was used to identify underlying
trust and information consumption profiles. This analysis
is a common method for identifying sub-groups or
‘clusters’ of individuals from a larger sample, where
individuals within a cluster have similar patterns of
responses, and individuals in different clusters have
different patterns of responses.
The following variables were used in this analysis:
ratings of trust in a variety of sources for COVID-related
information (federal government health agencies,
healthcare workers, public news channels, social media,
family and friends), trust in scientists, conspiratorial
thinking (from the Conspiracy Mentality Questionnaire;
Bruder et al., 2013[44]), psychological reactance
(measuring one’s aversion to perceived threats to their
freedom; Hong & Faedda, 1996[45]), and openness to
evidence (from the actively open-minded thinking about
evidence scale; Pennycook et al., 2020[39]).1
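The clustering step can be sketched roughly as follows. This is a minimal illustration using synthetic data and a plain implementation of k-means; the variable names mirror the measures listed above, but the study’s actual analysis pipeline is not reproduced here:

```python
import numpy as np

# Synthetic stand-in for the survey data: one row per respondent, one
# column per measure used in the clustering (names mirror the text).
rng = np.random.default_rng(7)
variables = [
    "trust_gov_health", "trust_healthcare", "trust_public_news",
    "trust_social_media", "trust_family_friends", "trust_scientists",
    "conspiracy_mentality", "psychological_reactance", "openness_to_evidence",
]
X = rng.normal(size=(300, len(variables)))

# Standardise each variable so 0 is the sample mean (as in Figure 4's note).
Z = (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans(Z, k=3, iters=100, seed=0):
    """Plain Lloyd's-algorithm k-means: assign points to the nearest
    centroid, then move each centroid to the mean of its cluster."""
    rng = np.random.default_rng(seed)
    centroids = Z[rng.choice(len(Z), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(Z[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new = np.array([Z[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

labels, centroids = kmeans(Z, k=3)
assert set(labels.tolist()) <= {0, 1, 2}
```

Each respondent receives exactly one cluster label, and a cluster’s centroid (its per-variable mean of standardised scores) is what a profile plot like Figure 4 visualises.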
The analyses identified three clusters of respondents
based on their trust and information consumption
profiles. They are visualised in Figure 4.
The section below briefly describes each cluster,
including how many respondents each cluster comprises
(as a percentage of the sample). Differences in select
demographic and socioeconomic factors across
clusters were also explored, and those that emerged as
statistically significant are reported below.
1. Conspiratorial thinking measures the tendency to engage in conspiratorial ideation. Psychological reactance measures how individuals react when they experience a threat to or loss of their freedoms. The actively open-minded thinking scale (AOTE) measures to what extent respondents think their beliefs/opinions ought to change according to evidence.
Figure 4. Clustering participants based on trust, information consumption, and cognitive traits
Note: The scale on this graph (ranging from -1.5 to 1) reflects standardised scores for each of the included variables. This means that ‘0’ represents the
average score for each variable across the whole sample (i.e., prior to clustering). This graph is designed to highlight differences across clusters, separately
for each variable – differences in average trust across variables (e.g., greater average trust in government health agencies versus social media) are not
visible here.
Source: Diamond, N.B, Pereira, B., Colasanti, A., Chammat, M., Varazzani, C., & Conway, L. (2022, May[43]). Understanding and countering misinformation
in Canada with insights from the behavioural sciences [Conference]. Behavioural Science and Policy Association Annual Conference 2022.
Three clusters
•	Non-Trusting (22.5% of the sample). Respondents falling in this cluster are characterised by low self-reported trust in all sources (with lowest trust in government health agencies and highest trust in social media, relative to other respondents). They exhibit relatively high conspiratorial thinking and psychological reactance, and low openness to evidence.
•	High (Social Media) Trusting (34.6% of the sample). Respondents falling in this cluster exhibit high trust in all sources, especially (by comparison to the other clusters) in social media, family, and friends. Their lowest trust ratings were for government health agencies and scientists. They exhibit a medium-high degree of conspiratorial thinking, high psychological reactance, and low openness to evidence.
• Institution-Trusting (42.9% of the sample).
Respondents falling in this cluster exhibit high
trust in institutional/authoritative sources
of information (government health agencies,
healthcare workers, public news sources and
scientists) and low trust in social news sources
(social media, family and friends). They exhibit
low conspiratorial thinking and psychological
reactance, and high openness to evidence.
On average, institution-trusting respondents
are significantly older, more educated (higher
proportion of university graduates), and have
higher income than respondents in the other
two clusters.
Having identified three profiles of respondents based on their trust and information consumption profiles, we investigated whether these profiles were associated with differences in belief in, and sharing of, COVID-19 news headlines (see Figure 5). Indeed, as shown in Figure 5, the three clusters of respondents differed significantly in their belief of false news headlines about COVID-19, as reflected in their accuracy ratings (this analysis was restricted to respondents in the accuracy control group). Specifically, respondents in the Non-Trusting and High (Social Media) Trusting clusters reported significantly higher belief in false news headlines than Institution-Trusting respondents, despite no difference in belief in true news headlines. As shown in Figure 5, there was evidence of a similar pattern for sharing intentions, as well: Non-Trusting and High (Social Media) Trusting respondents reported significantly higher sharing intentions for false news, compared to Institution-Trusting respondents.
These results suggest that the information people
consume, and the way they react to it, can in part be
predicted by their beliefs about the world.
As a final step, the team examined whether the effects of
the two interventions – accuracy evaluation prompt and
digital media literacy tips – differed significantly across
the identified clusters. This was not the case. This
null finding suggests that the effect of the interventions
on false news sharing intentions did not significantly
differ across sub-populations, regardless of differences
in cognitive factors, information consumption, trust
and beliefs. Future research is needed to replicate and extend this finding in larger and more diverse samples.
Figure 5. Group assignment predicts degree of misinformation belief and sharing intentions
Note: Both belief and sharing intentions models revealed significant interactions between segment and headline veracity. ***p < .001, **p < .01, *p < .05; Holm correction for multiple comparisons.
Source: Diamond, N.B, Pereira, B., Colasanti, A., Chammat, M., Varazzani, C., & Conway, L. (2022, May[43]). Understanding and countering
misinformation in Canada with insights from the behavioural sciences [Conference]. Behavioural Science and Policy Association Annual Conference
2022.
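Two statistical tools recur in the figure notes above: Cohen’s d (the effect-size metric behind “d = .41”) and the Holm correction for multiple comparisons. A minimal sketch of both, with illustrative data rather than the study’s:

```python
import math

def cohens_d(a, b):
    """Standardised mean difference using a pooled standard deviation."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    pooled = math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                       / (len(a) + len(b) - 2))
    return (ma - mb) / pooled

def holm(pvals, alpha=0.05):
    """Holm's step-down procedure: test p-values from smallest to largest
    against increasingly lenient thresholds alpha/m, alpha/(m-1), ..."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    rejected = [False] * len(pvals)
    for rank, i in enumerate(order):
        if pvals[i] <= alpha / (len(pvals) - rank):
            rejected[i] = True
        else:
            break  # stop at the first non-significant test
    return rejected

# Illustrative: control vs. treated sharing intentions for false headlines.
control = [2.0, 2.5, 2.1, 2.4, 2.2, 2.6]
treated = [1.6, 1.9, 1.7, 2.0, 1.8, 1.5]
d = cohens_d(control, treated)          # positive d = lower sharing under treatment
print(holm([0.001, 0.02, 0.06]))        # [True, True, False]
```

Holm’s correction controls the family-wise error rate across the multiple cluster and condition comparisons without being as conservative as a plain Bonferroni adjustment.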
FINDINGS SUMMARY

1.	Some individuals may share news that they do not believe (see Figure 2).
2.	Shifting attention to accuracy significantly increased the quality of information shared.
3.	Digital media literacy tips had the greatest impact on online users, reducing intentions to share fake news by 21% (see Figure 3).
4.	Individual differences in trust and information consumption shape sharing of, and belief in, misinformation about COVID-19 (see Figures 4 and 5).
LIMITATIONS

While this study replicates and extends published findings in the academic literature, there are some key study limitations to bear in mind.
The experiment was conducted online in the context of
the COSMO Canada survey, and therefore results have
limited external validity. Testing these interventions
using a simulated social media environment, or, better
yet, directly on social media platforms where individuals
interact with information in day-to-day life, would
provide better estimates of real world impacts, both at
the level of individuals and on the aggregate quality of
information circulating online. Pennycook et al. (2021[38])
demonstrated the real-world efficacy of the accuracy
evaluation prompt in a real-world social media field
experiment, and make the case that even small effects
on individuals’ sharing behaviour could amount to
larger improvements to the social media ecosystem via
compounding network effects.
There is, however, little solid evidence at present about the macro-scale consequences of deploying these interventions at scale. Research in this area is rapidly advancing, and new computational modelling work highlights the importance of multi-pronged interventions for meaningfully tackling the problem of viral online misinformation at scale (Bak-Coleman et al., 2022[46]).
Relatedly, the study only analysed sharing intentions
and not actual sharing behaviour. Previous work has
demonstrated concordance between surveyed sharing
intentions and actual social media sharing behaviour
‘in the wild’ (Pennycook et al., 2021[38]), but we can only
speculate about the degree to which sharing intentions
in our sample generalise to real-world behaviour. In
general, self-reported data, though common, is limited
in the sense that there are often discrepancies between
individuals’ stated intentions and their actions.
Finally, the present study measured responses to only
14 real COVID-19 news headlines. Though these
were drawn from pre-validated stimuli from previous
publications, it is unclear whether and how these
particular headlines (necessarily reflecting sentiments/
events in the world at a particular moment in time)
generalise to the broader, contemporary set of true
and false COVID-19-related news circulating on social
media. Future work is also needed to determine if the
results may be applicable to other domains, topics, and
to what extent misinformation susceptibility is shared
across content domains (e.g., COVID-19 vs. climate
change) within individuals.
3. FUTURE CONSIDERATIONS FOLLOWING FIRST CASE STUDY

KEY TAKEAWAYS FROM THE CASE STUDY
Given the complex and unprecedented nature of today’s digital world, it would be irresponsible to believe that there is a single solution for mitigating all risks associated with misinformation. An issue as complex as misinformation demands a response that is equally multifaceted and agile. With this in mind, the following section offers insights from the case study.
The results gathered in this international collaboration suggest that intentions to share false (i.e., independently debunked) news headlines about COVID-19 online can be reduced with solutions that leverage user-end decision-making. The results show that briefly cueing
users’ attention to the accuracy of the content they
encounter on social media can have a subtle but positive
effect on the quality of information they choose to share.
Explicitly communicating media literacy tips to users,
by contrast, has an even greater impact on their sharing
decisions, significantly reducing intentions to share false
news specifically.
Broadly, the outcomes of the study suggest that
employing behaviourally-informed modifications to
users’ online journey can have a positive effect on their
ability to self-monitor their content consumption and the
spread of misinformation, while also preserving freedom
of choice. The results of the segmentation analysis
provide compelling support for expanding the scope of
variables used by policy makers to better understand the
cognitive, emotional, and social drivers that shape people’s
relationship with information. These findings highlight
the utility of employing complementary approaches for
rigorous policy and other data-driven initiatives that aim
to understand variability across individuals’ information
consumption behaviours. A richer understanding of complex and multifactorial social issues – like the spread of misinformation – is required to develop lasting solutions that generalise beyond stand-alone interventions and meet the changing demands of a diverse population.

These results can inform a wide spectrum of whole-of-society approaches relating to various policy areas.
Although replication and further testing is required to
understand the effects of these interventions in other
contexts, the results signal the potential for behavioural
applications that extend beyond prompting related cues
on digital platforms associated with misinformation.
Through behavioural approaches, data science tools can offer insights into different sub-populations, which provides new avenues to initiate and tailor measures and mechanisms that are better informed, situated, and accepted. In this sense, similar applications of behavioural science can help inform government communications and policy-making strategies and ensure that key information reaches its intended target audience. The results also open possibilities for employing behavioural research beyond misinformation as an isolated policy challenge, exploring how misinformation converges with other policy challenges such as climate change, gender equality, war and conflict, and the health and safety of citizens.

NEXT STEPS IN BEHAVIOURAL RESEARCH FOR MISINFORMATION

Advancing knowledge on the use of behavioural science to understand and address mis- and disinformation

Although the case study provides robust evidence in support of behaviourally-informed interventions for tackling misinformation, it provides limited insights into why some individuals are more susceptible to believing or sharing misinformation than others, and how misinformation circulates among individuals and across communities. While academic literature on this topic is rapidly advancing, the application of such research in the public sphere remains limited. This is partly due to the limited capacity of the current literature to speak to the real-world effects of lab-tested experiments. Research methodologies can be easily outpaced by the rapidly changing architecture of the digital world, making it difficult for relevant stakeholders to anticipate the full impact and implications of misinformation and of any solutions aimed at reducing its related harms.
Our knowledge remains limited with regards to the
effects that cognitive, social, attitudinal, and emotional
processes – such as trust in governments, in institutions
and in each other – have on the way individuals navigate
informational ecosystems and relatedly, how information
ecosystems influence individual and collective behaviour.
Defining the issue of misinformation remains challenging for all relevant actors, especially since consensus on definitions relating to misinformation, such as “scientific truth”, remains contentious (Krause et al., 2022[48]).
What we do know is that factors such as trust are
multidimensional and differ considerably across public
institutions, levels of government, socio-economic
dimensions such as sex, age, education and household
income, and across different policy areas and challenges,
which often shift and evolve over time (Brezzi et al.,
2021[23]).
As such, additional efforts to apply a behavioural perspective to policy challenges are required to better understand and identify the interventions needed to preserve the integrity and quality of the information that circulates both offline and online. These efforts should strive to build upon classic socio-economic classifications of populations and include measures of cognitive, emotional, and social factors that can contribute to a more comprehensive and accurate overview of the dynamics of the digital world and the associated risks and solutions for misinformation.
Most importantly, advancements in related research
should be executed with a global objective that matches
the expansive nature of today’s informational landscape.
This can include international collaboration to
produce preliminary research, develop cross-border
comparative analyses in policy testing and evaluation,
or offer open-access to data on completed or ongoing
projects applying behavioural science, including their
methodology, analyses, and outcomes.
Follow-up testing is required to understand the lasting effects of the interventions. Performing follow-up analyses is crucial for designing sustainable policy that will have durable effects. Studying how participants respond to the accuracy evaluation prompts and digital media literacy tips when presented repeatedly over time can reveal how stable these interventions are in the long run, or whether other interventions are better at producing more effective results. Literature in this domain, relating to inoculation approaches for instance, tends to focus more heavily on the short-term effects of interventions and typically invests less effort in designing studies that assess long-term effects (or lack thereof). As such, conducting follow-up analyses that explore the persisting impact of interventions is necessary for better identifying the lifespan of a given policy measure.

Replicating and adapting empirical results

Beyond upholding scientifically rigorous practices, independent replication is vital for establishing the validity of one’s findings and their generalisability to a larger and/or different population (Arechar et al., 2022[42]). In the context of public policy, replication contributes to a greater understanding of the micro and macro trends in the economic, social, and political factors that guide individuals’ preferences and beliefs, while serving to validate or advance existing literature and research on related topics.
Intervention testing should be replicated and adapted
by governments to different contexts. For instance,
governments may choose to test the effects of both
interventions across different demographic groups or
adjust the digital literacy tips to present different cues
for discerning false news from factual sources. Modifying
already tested interventions will generate additional data
about their effectiveness in different environments. It will
also subject existing research to constructive scrutiny,
which serves to improve the data and research often used
to inform current and future policy outcomes.
Finally, it is important to emphasise the value of adapting results in a real-world environment for understanding and documenting the ways in which self-reported intentions translate to real sharing behaviour. The discrepancy between individuals’ intentions and actions is frequently observed in the behavioural science literature; this is commonly referred to as the intention-action gap. This gap can help explain why some studies that generate effective results in artificial environments fail to scale once implemented in the real world. Although previous studies testing accuracy nudges were able to find similar results in field testing, testing accuracy prompts and digital literacy tips in digital spaces where users may encounter and share misinformation will be the best way to understand their impact for reducing the spread of misinformation online. These results should be used to inform governments’ policy decisions – including upstream educational efforts – as well as the design of the user experience curated and determined by tech companies.
4. SO WHAT? IMPLICATIONS FOR POLICY MAKERS
The following section outlines key implications and considerations for policy makers and other relevant actors concerned with the spread of mis- and disinformation:
1. A comprehensive policy response to mis- and disinformation should include user-centred approaches that are informed by an expanded understanding of human behaviour. Many of the current measures for combatting the spread of misinformation employ top-down solutions, whereby restrictions on the type of information circulated online are imposed vis-à-vis content quality regulation (Haciyakupoglu et al., 2018[50]). Proactive approaches that seek to ‘inoculate’ individuals against mis- and disinformation offer a user-centred approach that is complementary to existing measures and can act as an additional safeguard for when false information circumvents content regulation and accurate information fails to reach its audience (Van der Linden et al., 2021[15]). Policy makers and regulators should consider the impact and potential of implementing user-end solutions that empower individuals to be critical thinkers, and encourage online spaces to adopt such measures where applicable. These solutions should be designed to enhance individuals’ ability to manage their own consumption of mis- and disinformation and preserve their freedom of choice while navigating the online world.

2. Behavioural interventions provide effective, feasible, and scalable methods for combatting the spread of misinformation. The results of the experiment suggest that behaviourally-informed modifications can be effective interventions to help mitigate the spread of misinformation online. Online platforms and technologies drastically outperform traditional policy processes in terms of responsiveness and adaptability, severely disadvantaging policy makers seeking to implement timely and relevant policy to mitigate the spread of mis- and disinformation (Lorenz-Spreen et al., 2020[13]). Behavioural interventions are cost- and time-effective tools that can provide intermediary solutions while comprehensive regulatory measures await implementation, as well as long-term solutions that boost regulatory approaches to digital regulation and user protections when/if enacted.
3. Behaviourally-informed research provides insights into the underlying conditions that influence information consumption and exchange.
Designing effective and targeted solutions
to misinformation requires a comprehensive
assessment of the landscape in which it circulates.
Exploring the dynamics of cognitive, emotional
and social factors can offer new opportunities to
understand the contextual conditions that influence
individuals’ behaviours and preferences (Brezzi et al.,
2021[23]). Armed with this knowledge, policy makers
and regulators can enhance current and future policy
decisions as well as produce innovative solutions
that encourage citizens to be empowered and
knowledgeable online users.
4. There are no one-size-fits-all solutions for addressing mis- and disinformation. Although the study’s results suggest that both interventions were effective across all three clustered groups, despite the groups differing along socio-demographic dimensions, they also signal the potential for using behavioural science to offer bespoke solutions that align with the distinct and diversified abilities and interests of a given population. This study is one of the first steps
to improving our understanding of the underlying
conditions that should be considered when designing
and implementing policy aimed at all of society.
Behavioural science can expand the basis of variables
policy makers refer to when designing policy by
providing a richer understanding of their population’s
priorities, experiences and expectations. Equipped
with knowledge on who spreads misinformation and
why, decision makers can make informed judgements
on how to strategically implement policy according
to the specific preferences of a chosen population
(Terracino et al., 2022[51]).
5. International and collaborative experimentation
to understand what works, for whom, and in
which context is crucial for tackling global policy
challenges. International organisations such as the
United Nations, the World Bank, and the World
Health Organisation align with the OECD in a call
for an increase in co-ordinated international efforts
to tackle today’s global challenges (United Nations,
2022[52]; World Bank Group, 2022[53]; WHO,
2022[54]). Cross-border experimentation is key for
revealing the necessary conditions for successful
policy in any given context. By replicating experiments
in different jurisdictions, environments and among
diverse populations, policy makers can better
understand to what extent the unique cultural, social,
political, and economic dimensions of their population
shape policy outcomes (see Arechar et al. 2022[42]
for examples of comparative analyses for reducing
misinformation online).
6. Partnerships between governmental and non-governmental actors are vital for an effective and sustainable response to the spread of mis- and disinformation. A collective approach that
is inclusive of government, experts, academics
and other non-governmental actors is necessary
for a co-ordinated and immediate response to
this and other global policy challenges. Strategic
partnerships such as the one formed for the
provided case study create opportunities to foster
knowledge-sharing and exchange best practices for
mitigating the risks of misinformation. Engaging
a diverse set of stakeholders in decision-making
processes can enhance collective knowledge on
the threats posed by the spread of harmful and
inaccurate information as well as the immediate
and long-term solutions (Terracino et
al., 2022[51]).
REFERENCES
[12] Agley, J., & Xiao, Y. (2021). Misinformation about
COVID-19: Evidence for differential latent profiles and
a strong association with trust in science. BMC Public
Health, 21(1), 89. https://doi.org/10.1186/s12889-020-10103-x
[42] Arechar, A. A., Allen, J. N. L., Berinsky, A., Cole, R.,
Epstein, Z., Garimella, K., Gully, A., Lu, J. G., Ross, R. M.,
Stagnaro, M., Zhang, J., Pennycook, G., & Rand, D.
(2022). Understanding and Reducing Online Misinformation Across 16 Countries on Six Continents. PsyArXiv. https://doi.org/10.31234/osf.io/a9frz
[4] Carrasco-Farré, C. (2022). The fingerprints of
misinformation: How deceptive content differs from
reliable sources in terms of cognitive effort and appeal
to emotions. Humanities and Social Sciences Communications, 9(1), 1–18. https://doi.org/10.1057/s41599-022-01174-9
[46] Bak-Coleman, J. B., Kennedy, I., Wack, M., et al. (2022). Combining interventions to reduce the spread of viral misinformation. Nature Human Behaviour. https://doi.org/10.1038/s41562-022-01388-6
[55] Colomina, C., Sánchez Margalef, H., & Youngs, R.
(2021). The impact of disinformation on democratic
processes and human rights in the world [Study].
European Union. https://www.europarl.europa.eu/
RegData/etudes/STUD/2021/653635/EXPO_
STU(2021)653635_EN.pdf
[7] BBC. (2018). “Cambridge Analytica planted fake
news.” BBC News. https://www.bbc.com/news/av/
world-43472347
[8] Bovet, A., & Makse, H. A. (2019). Influence of fake
news in Twitter during the 2016 US presidential
election. Nature Communications, 10(1), 7. https://doi.
org/10.1038/s41467-018-07761-2
[23] Brezzi, M., González, S., Nguyen, D., & Prats, M.
(2021). An updated OECD framework on drivers of
trust in public institutions to meet current and future
challenges (Working Paper No. 48; OECD Working
Papers on Public Governance, p. 61). OECD Publishing.
https://doi.org/10.1787/b6c5478c-en
[5] Brezzi, M., González, S., & Prats, M. (2020). All you
need is trust: Informing the role of government in the
COVID-19 context. The OECD Statistics Newsletter
Issue No 73. https://www.oecd.org/gov/all-you-need-is-trust-statistics-newsletter-12-2020.pdf
[44] Bruder, M., Haffke, P., Neave, N., Nouripanah, N., & Imhoff, R. (2013). Measuring Individual Differences in Generic Beliefs in Conspiracy Theories Across Cultures: Conspiracy Mentality Questionnaire. Frontiers in Psychology, 4, 225. https://doi.org/10.3389/fpsyg.2013.00225
[43] Diamond, N.B, Pereira, B., Colasanti, A., Chammat,
M., Varazzani, C., & Conway, L. (2022, May). Understanding and countering misinformation in Canada with
insights from the behavioural sciences [Conference].
Behavioural Science and Policy Association Annual
Conference 2022.
[40] Epstein, Z., Berinsky, A., Cole, R., Gully, A., Pennycook, G., & Rand, D. (2021). Developing an accuracy-prompt toolkit to reduce COVID-19 misinformation
online. PsyArXiv. https://doi.org/10.31234/osf.io/sjfbn
[25] European Centre for Disease Prevention and
Control. (2021). Behavioural Insights research to
support the response to COVID-19: A survey of
implementation in the EU/EEA [Technical Report].
European Centre for Disease Prevention and Control.
https://www.ecdc.europa.eu/sites/default/files/
documents/Behavioural-Insights-research-to%20
support-the-response-to-COVID-19.pdf
[21] Furnham, A., & Boo, H. C. (2011). A literature review
of the anchoring effect. The Journal of Socio-Economics, 40(1), 35–42. https://doi.org/10.1016/j.socec.2010.10.008
[3] Greifeneder, R., Jaffe, M., Newman, E., & Schwarz, N.
(Eds.). (2021). The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation (Online).
Routledge. https://doi.org/10.4324/9780429295379
[41] Guess, A. M., Lerner, M., Lyons, B., Montgomery, J.
M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A digital
media literacy intervention increases discernment
between mainstream and false news in the United
States and India. Proceedings of the National Academy
of Sciences of the United States of America, 117(27),
15536–15545. https://doi.org/10.1073/
pnas.1920498117
[50] Haciyakupoglu, G., Hui, J. Y., Suguna, V. S., & Leong,
D. (2018). Countering Fake News: A survey of recent
global initiatives (p. 24) [Policy Report]. https://www.
think-asia.org/bitstream/handle/11540/8063/
PR180307_Countering-Fake-News.pdf?sequence=1
[45] Hong, S.-M., & Faedda, S. (1996). Refinement of the
Hong Psychological Reactance Scale. Educational and
Psychological Measurement, 56(1), 173–182. https://
doi.org/10.1177/0013164496056001014
[30] Julienne, H., Lavin, C., Belton, C., Barjaková, M.,
Timmons, S., & Lunn, P. (2020). Behavioural pre-testing
of COVID Tracker, Ireland’s contact-tracing app. ESRI
Working Paper 687 December 2020. [Working Paper].
https://www.esri.ie/system/files/publications/
WP687_1.pdf
[10] Knuutila, A., Neudert, L.-M., & Howard, P. N. (2022).
Who is afraid of fake news? Modeling risk perceptions
of misinformation in 142 countries. Harvard Kennedy
School Misinformation Review. https://doi.
org/10.37016/mr-2020-97
[48] Krause, N. M., Freiling, I., & Scheufele, D. A. (2022).
The “Infodemic” Infodemic: Toward a More Nuanced
Understanding of Truth-Claims and the Need for (Not)
Combatting Misinformation. The ANNALS of the
American Academy of Political and Social Science,
700(1), 112–123. https://doi.
org/10.1177/00027162221086263
[6] Lesher, M., Pawelec, H., & Desai, A. (2022). Disentangling untruths online: Creators, spreaders and how
to stop them (Policy Brief No. 23; Going Digital Toolkit
Note). OECD Publishing. https://goingdigital.oecd.org/
data/notes/No23_ToolkitNote_UntruthsOnline.pdf
[17] Van der Linden, S., Panagopoulos, C., &
Roozenbeek, J. (2020). You are fake news: Political bias
in perceptions of fake news. Media, Culture & Society,
42(3), 460–470. https://doi.
org/10.1177/0163443720906992
[15] Van der Linden, S., Roozenbeek, J., Maertens, R., Basol, M., Kácha, O., Rathje, S., & Traberg, C. S.
(2021). How Can Psychological Science Help Counter
the Spread of Fake News? The Spanish Journal of
Psychology, 24. https://doi.org/10.1017/SJP.2021.23
[36] Loomba, S., de Figueiredo, A., Piatek, S. J., de Graaf,
K., & Larson, H. J. (2021). Measuring the impact of
COVID-19 vaccine misinformation on vaccination
intent in the UK and USA. Nature Human Behaviour,
5(3), 337–348. https://doi.org/10.1038/s41562-021-01056-1
[13] Lorenz-Spreen, P., Lewandowsky, S., Sunstein, C. R.,
& Hertwig, R. (2020). How behavioural sciences can
promote truth, autonomy and democratic discourse
online. Nature Human Behaviour, 4(11), 1102–1109.
https://doi.org/10.1038/s41562-020-0889-7
[9] Marshall, H., & Drieschova, A. (2018). Post-Truth
Politics in the UK’s Brexit Referendum. New Perspectives, 26(3), 89–105. https://doi.
org/10.1177/2336825X1802600305
[14] Matasick, C., Alfonsi, C., & Bellantoni, A. (2020).
Governance responses to disinformation: How open
government principles can inform policy options
(Working Paper No. 39; OECD Working Papers on
Public Governance, p. 45). OECD Publishing. https://
www.oecd-ilibrary.org/governance/governance-responses-to-disinformation_d6237c85-en
[29] Murphy, R. (2020). Using Behavioural Science to
Improve Hand Hygiene in Workplaces and Public
Places (A Department of Health Research Working
An international effort using behavioural science to tackle the spread of misinformation
Paper) [Working Paper]. Research Services and Policy
Unit, Department of Health of Ireland.
[20] Newman, N., Fletcher, R., Robertson, C. T., Eddy, K.,
& Nielsen, R. K. (2022). Digital News Report 2022
(Digital News Report 2, p. 164). Reuters Institute for
the Study of Journalism. https://reutersinstitute.
politics.ox.ac.uk/digital-news-report/2022
[31] Organisation for Economic Co-operation and Development (OECD). (n.d.). Understanding the
challenges posed by mis- and disinformation to develop
better policy responses. Retrieved May 3, 2022, from
https://www.oecd.org/gov/open-government/understanding-disinformation-to-develop-better-policy-responses.htm
[1] OECD. (2014). Recommendation of the Council on
Digital Government Strategies. OECD Publishing.
https://www.oecd.org/gov/digital-government/
Recommendation-digital-government-strategies.pdf
[24] OECD. (2020). Regulatory policy and COVID-19:
Behavioural insights for fast-paced decision making.
OECD. https://www.oecd.org/coronavirus/policy-responses/regulatory-policy-and-covid-19-behavioural-insights-for-fast-paced-decision-making-7a521805/#section-d1e637
[32] OECD. (2021a). Governing Cross-Border Challenges (Achieving Cross-Border Government Innovation).
OECD Publishing. https://cross-border.oecd-opsi.org/
reports/governing-cross-border-challenges/
[33] OECD. (2021b). Surfacing Insights and Experimenting Across Borders (Achieving Cross-Border Government Innovation). OECD Publishing. https://cross-border.oecd-opsi.org/reports/surfacing-insights-and-experimenting-across-borders/
[2] OECD. (2021c). OECD Report on Public Communication: The Global Context and the Way Forward.
OECD. https://doi.org/10.1787/22f8031c-en
[34] OECD. (2022). Delivering and Enabling Impactful Cross-Border Solutions (Achieving Cross-Border Government Innovation). OECD Publishing. https://cross-border.oecd-opsi.org/reports/governing-cross-border-challenges/
[26] Office of Evaluation Sciences. (2021). Using
Behavioral Science to Increase COVID-19 Vaccination
Uptake: Synthesis of Evidence from the Office of
Evaluation Sciences Portfolio. Office of Evaluation
Sciences. https://oes.gsa.gov/assets/publications/
OES-vaccine-paper-2-page-summary.pdf
[28] Office of the U.S. Surgeon General. (2021). A
Community Toolkit for Addressing Health Misinformation. Office of the U.S. Surgeon General. https://www.
hhs.gov/sites/default/files/health-misinformation-toolkit-english.pdf
[22] Ognyanova, K., Lazer, D., Robertson, R. E., & Wilson,
C. (2020). Misinformation in action: Fake news
exposure is linked to lower trust in media, higher trust
in government when your side is in power. Harvard
Kennedy School Misinformation Review. https://doi.
org/10.37016/mr-2020-024
[38] Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A.
A., Eckles, D., & Rand, D. G. (2021). Shifting attention to
accuracy can reduce misinformation online. Nature,
592, 590–595. https://doi.org/10.1038/s41586-021-03344-2
[37] Pennycook, G., & Rand, D. G. (2022). Accuracy
prompts are a replicable and generalizable approach
for reducing the spread of misinformation. Nature
Communications, 13(1), 2333. https://doi.
org/10.1038/s41467-022-30073-5
[11] Posetti, J., & Bontcheva, K. (2020). Disinfodemic:
Deciphering COVID-19 Disinformation (No. 1;
Disinfodemic). UNESCO. https://unesdoc.unesco.org/
ark:/48223/pf0000374416/PDF/374416eng.pdf.
multi
[35] Privy Council Office of Canada. (2020). Canada
COVID-19 Snapshot Monitoring (COSMO Canada):
Monitoring knowledge, risk perceptions, preventive
behaviours, and public trust in the current coronavirus
outbreak in Canada. https://www.psycharchives.org/
en/item/74b8ee9c-1670-4dd5-9e30-39dbed8f1528
[16] Roozenbeek, J., Schneider, C. R., Dryhurst, S., Kerr,
J., Freeman, A. L. J., Recchia, G., van der Bles, A. M., &
van der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world. Royal Society
Open Science, 7(10), 201199. https://doi.org/10.1098/
rsos.201199
[19] Sweller, J. (1988). Cognitive Load During Problem
Solving: Effects on Learning. Cognitive Science, 12,
257–285.
[51] Terracino, J. B., Munro, C., Matasick, C., & Jofre, S.
W. (2022). Draft Action Plan on Public Governance for
Combating Mis-information and Dis-information.
OECD Publishing.
[52] United Nations. (2022). Global Issues. United
Nations; United Nations. https://www.un.org/en/
global-issues
[53] World Bank Group. (2022). Global Themes. https://
www.worldbank.org/en/about/unit/global-themes
[27] World Health Organisation (WHO). (2020). Survey
tool and guidance: Rapid, simple, flexible behavioural
insights on COVID-19. WHO. https://www.euro.who.
int/en/health-topics/health-emergencies/coronavirus-covid-19/publications-and-technical-guidance/
risk-communication-and-community-engagement/
who-tool-for-behavioural-insights-on-covid-19/
survey-tool-and-guidance-behavioural-insights-on-covid-19-produced-by-the-who-european-region
[54] WHO. (2022). Global Policy Group. https://www.
who.int/director-general/global-policy-group
[18] Zimmermann, F., & Kohring, M. (2020). Mistrust,
Disinforming News, and Vote Choice: A Panel Survey
on the Origins and Consequences of Believing Disinformation in the 2017 German Parliamentary Election.
Political Communication, 37(2), 215–237. https://doi.
org/10.1080/10584609.2019.1686095