
Managing false information

in health emergencies:
an operational toolkit
2024
ABSTRACT

The proposed toolkit provides procedures for the detection and handling of false information, following a five-step process involving signal detection, verification, risk assessment, response design and outreach. A valuable resource for authorities and other stakeholders, this toolkit helps facilitate active infodemic management, promoting accurate information dissemination and informed public health decisions.

KEYWORDS

EMERGENCIES, EMERGENCY PREPAREDNESS, HEALTH COMMUNICATION, COMMUNITY PARTICIPATION, INFODEMIC, SIGNAL DETECTION

Document number: WHO/EURO:2024-8271-48043-71198

© World Health Organization 2024

Some rights reserved. This work is available under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 IGO licence (CC BY-NC-SA 3.0 IGO; https://creativecommons.org/licenses/by-nc-sa/3.0/igo).

Under the terms of this licence, you may copy, redistribute and adapt the work for non-commercial purposes, provided the work is appropriately cited, as indicated below. In any use of this work, there should be no suggestion that WHO endorses any specific organization, products or services. The use of the WHO logo is not permitted. If you adapt the work, then you must license your work under the same or equivalent Creative Commons licence. If you create a translation of this work, you should add the following disclaimer along with the suggested citation: “This translation was not created by the World Health Organization (WHO). WHO is not responsible for the content or accuracy of this translation. The original English edition shall be the binding and authentic edition: Managing false information in health emergencies: an operational toolkit. Copenhagen: WHO Regional Office for Europe; 2024.”

Any mediation relating to disputes arising under the licence shall be conducted in accordance with the mediation rules of the World Intellectual Property Organization (http://www.wipo.int/amc/en/mediation/rules/).

Suggested citation. Managing false information in health emergencies: an operational toolkit. Copenhagen: WHO Regional Office for Europe; 2024. Licence: CC BY-NC-SA 3.0 IGO.

Cataloguing-in-Publication (CIP) data. CIP data are available at http://apps.who.int/iris.

Sales, rights and licensing. To purchase WHO publications, see http://apps.who.int/bookorders. To submit requests for commercial use and queries on rights and licensing, see https://www.who.int/about/policies/publishing/copyright.

Third-party materials. If you wish to reuse material from this work that is attributed to a third party, such as tables, figures or images, it is your responsibility to determine whether permission is needed for that reuse and to obtain permission from the copyright holder. The risk of claims resulting from infringement of any third-party-owned component in the work rests solely with the user.

General disclaimers. The designations employed and the presentation of the material in this publication do not imply the expression of any opinion whatsoever on the part of WHO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. Dotted and dashed lines on maps represent approximate border lines for which there may not yet be full agreement.

The mention of specific companies or of certain manufacturers’ products does not imply that they are endorsed or recommended by WHO in preference to others of a similar nature that are not mentioned. Errors and omissions excepted, the names of proprietary products are distinguished by initial capital letters.

All reasonable precautions have been taken by WHO to verify the information contained in this publication. However, the published material is being distributed without warranty of any kind, either expressed or implied. The responsibility for the interpretation and use of the material lies with the reader. In no event shall WHO be liable for damages arising from its use.

All photos: ©WHO/EURO

Art-direction and layout: www.freightdesign.co.uk


Contents
Acknowledgements iv
Abbreviations v
What is the operational toolkit for detecting and addressing false information in health emergencies? 1
    Introduction 1
    Purpose of the toolkit 1
    Structure of the toolkit 3
    Ethical considerations 5
The five phases of false information management 6
Phase 1: Signal detection 7
    Description 8
    Key steps in the signal detection phase 9
Phase 2: Signal verification 15
    Description 16
    Key steps in the signal verification phase 16
    How to verify collected signals 16
Phase 3: Risk assessment 17
    Description 17
    Key steps in the risk assessment phase 18
    Reporting on the outcome of the risk assessment 21
Phase 4: Response design 23
    Description 24
    Key steps in the response design phase 24
    How to develop effective response messages 25
    Debunking as a reactive response technique 26
    Prebunking as a proactive response technique 26
    A comparison between debunking and prebunking 33
Phase 5: Outreach 35
    Description 36
    Key steps in the outreach phase 36
    Outreach case study 37
References 39
Further reading 42



Acknowledgements

The WHO Regional Office would like to acknowledge the many stakeholders who reviewed and shared feedback on this document, including members of the Technical Advisory Group for Risk Communication, Community Engagement and Infodemic Management (RCCE-IM) in the WHO European Region, especially Dr Seema Yasmin, Stanford University, United States; and Robert Steiner, University of Toronto, Canada.

This document has been developed with contributions from experts from the RCCE-IM team at the WHO Regional Office for Europe. The main author is Ravi Sreenath, and Stefan Voinea and Kimberly Rambaud made substantial technical contributions to the content of this implementation tool. Cristiana Salvi supervised the process, provided guidance and made significant contributions to the technical review. Additional staff members and consultants within the WHO Regional Office for Europe who offered technical review and guidance included Altug Akin, Philippe Borremans, Ben Duncan, Olha Izhyk, and Leonardo Palumbo.

The WHO Regional Office also expresses gratitude to Dr Gerald Rockenschaub (former Regional Emergency Director, WHO Regional Office for Europe) for his support and review of this tool.



Abbreviations

CSOs civil society organizations
HSE Health Service Executive (Ireland)
IM infodemic management
MMR measles, mumps and rubella
RCCE-IM risk communication, community engagement and infodemic management




This toolkit provides operational support to stakeholders engaged in infodemic management in the WHO European Region, in the context of health emergencies.


What is the operational toolkit for
detecting and addressing false
information in health emergencies?

Introduction

As a result of the COVID-19 pandemic, there has been an alarming acceleration in the creation and distribution of “information disorders” such as misinformation, disinformation and malinformation, across both digital and physical spaces (1). This scenario was aggravated by the current information environment, where factual information is sometimes disregarded and conspiracy theories find fertile ground. These conditions have consistently eroded trust in authorities, undermined advances in public health, complicated health decision-making and put lives at risk.

WHO defines this information disorder, or infodemic, as an overabundance of information, including false or misleading information, in digital and physical environments during an emergency. Infodemic management (IM) is the systematic use of risk- and evidence-based analysis and approaches to manage harmful information and reduce its impact on health behaviours during health emergencies.

While offline information tracking and data collection is a valuable source of information, this toolkit is mainly focused on online data monitoring.

IM is a crucial part of an integrated emergency public health intervention that includes Risk Communication, Community Engagement and Infodemic Management (RCCE-IM). During emergency response, the primary role of IM is to detect, prevent and address various forms of health information disorders, contributing to an improved health information ecosystem.

By effectively managing an infodemic, accurate information dissemination can be promoted and communities can be aided to make well-informed decisions for health protection. This in turn contributes to establishing structures, systems and skills for sustained IM. During the COVID-19 pandemic in the European Region, there was a marked increase in requests by Member States to the WHO Regional Office for Europe for IM capacity-building support. This surge in demand has been accompanied by calls for assistance in implementing strategies to mitigate and manage the spread of false information.

Purpose of the toolkit

The purpose of this toolkit is to provide operational support to national authorities, partners, civil society organizations (CSOs) and other stakeholders engaged in IM in the WHO European Region, in the context of health emergency preparedness and response.

As part of comprehensive IM, this toolkit focuses on actionable tips and steps that support the detection and evaluation of false health information to inform response. This in turn will prevent and mitigate the impact of harmful information on public health.

The toolkit complements IM initiatives to translate specific country needs into action, in two major ways:

• Operationalizing IM: The toolkit is an extension of Advancing infodemic management in risk communication and community engagement in the WHO European Region: implementation guidance (2) and provides actionable tools for the execution of IM tasks, as well as examples and case studies. This can help reduce the risk of errors, improving overall quality and efficiency.



• Reinforcing learning: The toolkit can serve as a reinforcement tool for Member States and other relevant stakeholders who take part in IM workshops or trainings. It serves as a reference guide with key concepts and processes which enable participants to reinforce and apply IM knowledge in their day-to-day work.

Structure of the toolkit

The toolkit describes a five-phase process (Fig. 1) to deal with false information signals during a health emergency. It can also be applied to managing other components of an infodemic, such as public questions, concerns and information voids (where people seek accurate health information but cannot find it). The five phases (detection, verification, risk assessment, response design and outreach) are briefly summarized below and then expanded on throughout the toolkit, with sections on each respective phase containing short, actionable guidance and, as needed, relevant checklists, algorithms and infographics.

By employing the toolkit, valuable knowledge can be acquired and consolidated into an infodemic insights report, which can serve as a pivotal resource for guiding RCCE-IM interventions.

Phase 1: Signal detection
The first phase is monitoring and signal detection. This phase involves actively monitoring the information ecosystem to identify potential signals of false information or rumours related to public health. This requires the use of various online and offline tools and methods, such as social listening, media monitoring, community engagement and expert networks, to identify signals in real time. The outcome of this phase is a set of identified signals that need to be verified and further assessed in the subsequent phases of the process.

Phase 2: Signal verification
The signal verification phase involves determining whether a signal is true or false, and identifying the source of information. In this phase, information is gathered from various sources to validate or invalidate the signal. This can include fact-checking the information, analysing the credibility of the source, and assessing the accuracy and consistency of the information across multiple sources. The outcome of this phase is a determination of whether the signal is true, false or partially true, and the confidence level of that determination.

Phase 3: Risk assessment
The risk assessment phase involves performing an integrated analysis of the verified signal, evaluating its potential impact on public health and assessing the level of risk associated with it. The outcome of this phase is a determination of the potential consequences of the false information signal on public health, guiding the development of action (including no action) or a response strategy based on the following criteria: 1) source credibility; 2) spread of the false information; and 3) public health consequences.
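The three criteria above lend themselves to a simple triage sketch. The example below is illustrative only: the 0-3 scales, the equal weighting and the action thresholds are hypothetical assumptions, not values defined by this toolkit.

```python
# Illustrative triage of a verified false-information signal using the three
# risk-assessment criteria named above. Scales, weights and thresholds are
# hypothetical placeholders, not WHO-defined values.
from dataclasses import dataclass

@dataclass
class RiskInputs:
    source_credibility: int   # 0 = fringe account, 3 = trusted, high-reach source
    spread: int               # 0 = isolated post, 3 = viral across platforms
    health_consequences: int  # 0 = negligible, 3 = direct risk to life

def assess_risk(inputs: RiskInputs) -> str:
    """Return a coarse recommendation: 'no action', 'monitor' or 'respond'."""
    score = inputs.source_credibility + inputs.spread + inputs.health_consequences
    if score <= 2:
        return "no action"
    if score <= 5:
        return "monitor"
    return "respond"

# A low-spread rumour from a fringe source with no health impact needs no action;
# a viral claim from a credible-looking source urging refusal of care does.
print(assess_risk(RiskInputs(0, 1, 0)))  # -> no action
print(assess_risk(RiskInputs(2, 3, 3)))  # -> respond
```

In practice the three criteria would be scored by analysts against evidence gathered in the verification phase, not assigned mechanically.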
Fig. 1. The key phases to detect and address false information signals during a health emergency

[Figure: the five phases shown as a pipeline (1. Signal detection; 2. Signal verification; 3. Risk assessment; 4. Response design; 5. Outreach), grouped into three stages: MONITOR (1), UNDERSTAND (2+3) and RESPOND (4+5), with community engagement shown alongside.]
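Read as a process, Fig. 1 amounts to a linear pipeline in which outreach feeds back into detection. A minimal sketch of that flow, with invented function and field names (the verdict and risk values are stand-ins for the real verification and assessment work), might look like this:

```python
# A minimal sketch of Fig. 1's five-phase flow as a processing pipeline.
# Function names and Signal fields are illustrative, not part of the toolkit.
from dataclasses import dataclass, field

@dataclass
class Signal:
    text: str
    verdict: str = "unverified"                # later: true / false / partially true
    risk: str = "unassessed"                   # later: low / medium / high
    trail: list = field(default_factory=list)  # phases the signal has passed through

def detect(raw_post: str) -> Signal:           # Phase 1: MONITOR
    return Signal(text=raw_post, trail=["detection"])

def verify(s: Signal) -> Signal:               # Phase 2: UNDERSTAND
    s.verdict = "false"                        # stand-in for fact-checking
    s.trail.append("verification")
    return s

def assess(s: Signal) -> Signal:               # Phase 3: UNDERSTAND
    s.risk = "high"                            # stand-in for the risk criteria
    s.trail.append("risk assessment")
    return s

def design_response(s: Signal) -> Signal:      # Phase 4: RESPOND
    s.trail.append("response design")
    return s

def outreach(s: Signal) -> Signal:             # Phase 5: RESPOND (feeds back to 1)
    s.trail.append("outreach")
    return s

signal = outreach(design_response(assess(verify(detect("vaccine rumour")))))
print(signal.trail)
# -> ['detection', 'verification', 'risk assessment', 'response design', 'outreach']
```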
Phase 4: Response design
In case a response is needed, the response design phase involves developing effective interventions to counter false information and address the identified risks. It includes segmenting audiences; crafting accurate, clear and relevant messages; selecting appropriate communication channels; and determining the best timing and frequency for message targeting. This section also provides tactical guidance on designing “debunks” and “prebunks” as response interventions to counter false information. The outcome of this phase is the design of an effective response.

Phase 5: Outreach
In this final phase of the process, the response plan is implemented and key messages are targeted to segmented audiences, encouraging them to adopt the behaviour that supports the intended public health outcomes. The outreach phase can also link back to the first phase of signal detection through a feedback loop. The outcome of this phase is enabling people to take informed decisions to protect their health, through refuting false information and promoting accurate information and advice.

Operational challenges to detecting and addressing false information

When setting up a system for detecting harmful health information, a variety of operational factors come into play. Recognizing and understanding these aspects, especially in the context of health emergencies, is crucial for designing effective strategies to combat false information and mitigate its negative impacts. The operational challenges encountered should not be a reason for inaction, but rather a catalyst to innovate and implement robust, sustainable interventions.

1. Addressing false information is considerably more resource-intensive than producing it. The process often mirrors the act of trying to extinguish a fire with a single water droplet, particularly for resource-strapped public health authorities combating a steady stream of harmful health information. Advancements in artificial intelligence models such as ChatGPT have further simplified the creation of false information, driving its cost almost to zero. In contrast, a social listening system demands considerable resources, expertise and effort to monitor, analyse and respond effectively. To address this disparity, leveraging partnerships with technology companies, investing in automated detection systems and fostering international collaboration can create a more scalable and resource-efficient approach to combating false health information.

2. Social listening data lacks offline context. Online listening primarily captures the views and opinions of a specific demographic, leaving out substantial sections of the population. From an operational perspective, online data needs to be complemented with offline research using diverse sources, such as offline surveys and community engagement, for an inclusive and accurate understanding of signals.

3. Human analysts are needed to make sense of automatically generated data. Social media posts often lack context and are riddled with language complexities, making fully automated data interpretation impossible. It is crucial to employ human analysts to discern the true meaning of shared information, evaluate its accuracy and assess associated health risks (3).

4. Response strategies may not reach the same audience as the initial false information. Due to the “filter bubble” effect (4), outreach activities often fail to target the same audience that was initially exposed to misinformation. Algorithmic personalization on social media platforms tailors content to user preferences and viewpoints, thus creating a “bubble” that primarily exposes users to information reinforcing their existing views.


Ethical considerations

Signal detection is critical for IM and for detecting and addressing false information during response, yet this process raises ethical considerations that must be addressed. Here are some key challenges and considerations regarding social listening in the context of RCCE-IM:

1. Privacy and data protection. Privacy considerations are paramount when engaging in social listening to detect health misinformation. Focusing on themes and trends rather than specific individuals helps minimize the risk to individual privacy. Analysing anonymized and aggregated data can still provide valuable insights without exposing personally identifiable information (5).

2. Transparency and potential for overreach. Monitoring systems need to be set up with transparency about the purpose and scope of the monitoring. It is important to be transparent about the data being collected, how it will be used and who will have access to it. There is a risk of overreach if the data collected is used for purposes beyond its intended scope.

3. Algorithmic misrepresentation of intent. Artificial intelligence-assisted social listening tools, such as sentiment analysis algorithms, may not always accurately determine the intent behind a post or message. This can lead to misinterpretation and potential harm if misinformation is incorrectly identified. Human oversight and intervention are essential to validate and interpret the results accurately.

4. Bias, discrimination and cultural sensitivity. Different cultures may express health-related information differently. It is crucial to ensure that the monitoring methods used do not perpetuate or reinforce biases, discrimination or stigmatization of particular groups or individuals. It is also essential to consider cultural context and avoid misjudging or misrepresenting cultural expressions as misinformation.

5. Validity and feasibility. In the WHO European Region, many digital platforms do not allow researchers to access and analyse data, resulting in validity and feasibility challenges (6). Policy-makers and platform operators should collaborate to devise secure and privacy-conscious mechanisms that allow entities with legitimate purposes to access relevant data for analysis.
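Point 1 above can be made concrete: analyse themes in aggregate and keep only one-way hashes of author identifiers, never the handles themselves. The sketch below uses invented posts and theme keywords; it illustrates the privacy practice rather than a WHO-specified procedure.

```python
# Sketch of the privacy practice in point 1: analyse themes in aggregate and
# replace author identifiers with one-way hashes so no personally identifiable
# information is stored. The posts and theme keywords are invented examples.
import hashlib
from collections import Counter

THEMES = {
    "vaccine safety": ["side effect", "unsafe", "blood clot"],
    "treatment myths": ["home remedy", "miracle cure"],
}

def anonymize(author: str) -> str:
    """One-way hash so individual accounts cannot be read back from reports."""
    return hashlib.sha256(author.encode()).hexdigest()[:12]

def aggregate_themes(posts):
    counts = Counter()
    for author, text in posts:
        _ = anonymize(author)  # store only the hash, never the handle
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            if any(k in lowered for k in keywords):
                counts[theme] += 1
    return counts

posts = [
    ("@user1", "Heard the jab causes blood clots"),
    ("@user2", "This home remedy cures everything!"),
    ("@user3", "Worried about side effects"),
]
print(aggregate_themes(posts))
# -> Counter({'vaccine safety': 2, 'treatment myths': 1})
```

Reports built from such counts describe trends without exposing any individual account.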



The five phases of false
information management

1 Signal detection

2 Signal verification

3 Risk assessment

4 Response design

5 Outreach

The signal detection phase is critical to identifying potential outbreaks of false information and to helping public health officials develop effective response strategies.


Phase 1: Signal detection

Description

Signal detection involves establishing systematic processes to track information and data regarding a specific health topic, while also monitoring for potential emerging questions, concerns, rumours and false information. This can include monitoring social media, news outlets, online forums, call centre data and more.

The signal detection phase is critical to identifying potential outbreaks of false information and to helping public health officials develop effective response strategies. By monitoring and analysing data effectively, public health officials can create risk communication strategies that are informed by users and that support people’s decision-making on their health protection. The goal of this phase is to identify false information during a health emergency. It lays the foundation for the subsequent phases of verifying and addressing false information.

An infodemic is not solely composed of misinformation or disinformation; it also includes legitimate questions, concerns and information voids that people have. These signals serve as precursors to the spread of false information. Detecting these signals and addressing them (by filling information voids, answering questions and alleviating concerns) is generally easier for health systems. Once misinformation and disinformation take root, their mitigation becomes more challenging (Fig. 2).

As introduced above, while offline information and data tracking is a valuable source of insights and will be mentioned in this toolkit, the focus will be online data monitoring. This is primarily due to the broader reach, immediacy and dynamic nature of digital platforms, which provide real-time insights and facilitate prompt interventions in addressing false information.

Fig. 2. An infodemic is made up of more than misinformation

[Figure: a spectrum running from information voids, questions and concerns (where health systems have more influence) to misinformation and disinformation (where they have less influence); narratives grow along this spectrum and, if sustained, carry increasing potential for harm.]

Source: (7).



Key steps in the signal detection phase

Step 1: Define the scope and objectives of the signal detection phase
The first step in the signal detection phase is to define the scope and objectives. This involves identifying the specific health-related topic or issue that requires monitoring, as well as the specific location or language, timeframe, goals and objectives of this exercise. Examples of scope setting include:

• What are the false information narratives surrounding mpox in the affected and vulnerable communities in country X?

• Who are the key generators of false narratives surrounding avian influenza spreading to cats from birds in country Y?

The key aspects to be refined in this step include:

• Selecting a specific topic that requires monitoring: determining the particular health topic or issue that needs close observation, such as a particular disease outbreak or a misleading health narrative.

• Choosing analysis methods: selecting appropriate techniques for analysing the gathered data (see Step 3 below for available methods).

• Defining boundaries of the analysis: establishing the time period for monitoring, target geographic areas to focus on, languages to consider and specific platforms where the information may be spreading.

Step 2: Develop a data collection plan and start collecting data
The next step in the signal detection phase is to identify and select the data sources that will be used to monitor the spread of health-related false information. Data sources can include publicly available social media platforms, online news sources, online forums, blogs and other relevant sources of information. The selection of data sources considers both the geographical region of interest and the health concern under examination. This targeted strategy ensures that monitoring is finely tuned to the locations and themes key to the study. For example, in some parts of the WHO European Region, Telegram and VKontakte are the most popular social media messaging channels, while in others Facebook or TikTok can be much more relevant.

Once data sources have been identified, the next step is to develop a data collection plan. Data collection is most often conducted through social listening. This involves determining the frequency and tools for data collection through social listening, as well as the specific data points that will be collected.

An essential aspect of the data collection plan is defining the relevant keywords and phrases to effectively monitor health-related false information. These keywords act as the search queries for social listening tools to gather specific data. The selection of keywords should be comprehensive and updated regularly to adapt to evolving trends and emerging false information.
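The elements of such a plan (sources, frequency, keywords and data points) can be captured in a structured, shareable record so the whole monitoring team works from the same definition. A minimal sketch, with entirely invented example values:

```python
# Step 2's data collection plan expressed as a structured record, so it can be
# versioned and shared across the monitoring team. All values are invented
# examples; real plans depend on the country, topic and platforms in scope.
data_collection_plan = {
    "topic": "mpox false-information narratives",  # from the scope in Step 1
    "region": "country X",
    "languages": ["en", "ru"],
    "sources": ["Telegram", "VKontakte", "news sites", "public forums"],
    "frequency": "daily",                          # how often data is pulled
    "data_points": ["post text", "date", "platform", "engagement counts"],
    "keywords_review": "weekly",                   # keep search terms current
}

# A quick completeness check before monitoring starts:
required = {"topic", "sources", "frequency", "data_points"}
missing = required - data_collection_plan.keys()
print("plan complete" if not missing else f"missing: {missing}")
# -> plan complete
```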



Defining search terms

The quality and accuracy of the risk signals and false-information narratives identified depend significantly on the search terms used to find them. These search terms can be categorized into static and dynamic terms for clarity and ease of understanding.

• Static search terms: fixed terms used consistently for each report on a specific topic. Translations into the target languages of focus countries are also included.

• Dynamic search terms: terms that change based on current events, misinformation trends and the prevailing information landscape. They are combined with the static search terms using appropriate Boolean operators (AND, OR, NEAR/5, etc.). These terms need to be updated regularly and translated into the local languages of focus countries.

In the COVID-19 example:

• COVID-19-focused static search terms:
covid OR COVID-19 OR couronne OR corona OR корона OR coronavirus OR коронавирус OR pandemic OR pandémie OR pandemie OR pandemia OR пандемия

• COVID-19-focused dynamic search terms: the terms below are an extract from the dynamic terms list of one internal IM report (WHO Regional Office for Europe, unpublished, 2021). Please note that these terms are for illustrative purposes only and need to be updated and refined for each production of an IM report based on current context.

NEAR/5 (“cardiac-related death” OR “blood clotting” OR transmission OR booster* OR jab OR vaccin* OR “Swab test” OR “blood clots” OR “vaccine side effects” OR “prolonged symptoms” OR “vaccine passport” OR immunity OR lockdown OR masks OR “mask wearing” OR mask OR deoxygenation OR “insulin cost” OR Ivermectin OR “big pharma” OR bigpharma OR “lab-created” OR “US Labs” OR “United states labs” OR “wuhan lab” OR depopulation OR fertility OR “home remedy” OR “magical cure” OR herbs OR rash OR rashes OR DNA OR dna OR fertility OR pharma OR pharmac*)

NOT (Halloween OR wearables OR pele)
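The combination of static and dynamic term lists with Boolean operators can be automated rather than hand-edited for every report. The sketch below assembles a generic query string; exact syntax (quoting rules, NEAR/5 support) varies by social listening tool, and the term lists are abbreviated examples, not a recommended set.

```python
# Assembling a search query from static and dynamic term lists, as described
# above. Exact Boolean syntax varies by social listening tool; this sketch
# targets a generic "term OR term" grammar with AND / NOT combinators.
def or_group(terms):
    """Quote multi-word terms and join the list with OR inside parentheses."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

static_terms = ["covid", "COVID-19", "corona", "pandemic"]
dynamic_terms = ["blood clotting", "vaccine passport", "big pharma"]
exclusions = ["Halloween", "wearables"]

query = (
    or_group(static_terms)
    + " AND " + or_group(dynamic_terms)
    + " NOT " + or_group(exclusions)
)
print(query)
```

Keeping the dynamic list in a separate, regularly reviewed file makes the weekly keyword update a data change rather than a query rewrite.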



If there is a budget available for data collection, a diverse range of digital tools can be utilized to run searches and conduct sophisticated analyses in real time or periodically. These tools and platforms are primarily designed to cater to the needs of the private sector, serving purposes such as market research and brand reputation monitoring. Examples of widely used paid social listening tools and services include Talkwalker, CrowdTangle, Sprout Social and Brandwatch, and an example list of free tools can be found in Box 1.¹ However, it is important to note that while these tools offer valuable functionalities, they may not be optimally suited for professionals seeking specific insights in the field of public health.

Box 1. Free social listening tools

1. Google Alerts (8) enables users to monitor the web for specific keywords and phrases. Users can set up alerts to receive notifications when new content is published that matches their search criteria.

2. Google Trends (9) allows users to see what people are searching for on Google. It provides insights into search interest over time and by location, as well as related topics and queries. Google Trends can be used to identify trending topics and keywords related to health-related misinformation, allowing users to monitor and analyse conversations and develop appropriate responses.

3. Followerwonk (10) provides free options that enable users to focus their social listening efforts on Twitter. It provides a range of analytics and insights, including follower demographics, social authority and social influence.

4. Hoaxy (11) is a web-based tool that visualizes the spread of articles online. It searches for claims and fact-checking going back to 2016 and tracks the sharing of articles. It can be used to identify the spread of misinformation and disinformation.

5. Bot Sentinel (12) identifies and tracks bots and trolls on Twitter. It provides a range of analytics and insights, including bot scores, troll scores and sentiment analysis.

6. Talkwalker Free Social Search (13) is a social listening tool which provides a variety of metrics, including top themes, influencers, engagement, sentiment and reach. The free version of Talkwalker is limited to 7 days of historical data.

7. RAND Corporation has compiled a repository of free tools that can work for specific use cases in addressing false information (14).

1 The mention of these tools and services (both paid and free) does not constitute an endorsement by the World Health Organization.
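The alerting pattern behind several of the tools in Box 1 can also be approximated in-house: scan each batch of collected items for monitored phrases and surface matches for analyst review. The items and phrases below are invented examples, and a real system would add deduplication and language handling.

```python
# A minimal in-house analogue of the alerting pattern behind tools like
# Google Alerts (Box 1): scan newly collected items for monitored phrases
# and flag matches for an analyst. The items and phrases here are invented.
monitored_phrases = ["measles outbreak", "vaccine recall", "miracle cure"]

new_items = [
    {"source": "forum", "text": "Another thread pushing a miracle cure for flu"},
    {"source": "news", "text": "Health ministry updates travel advice"},
    {"source": "blog", "text": "Rumours of a vaccine recall are spreading"},
]

def scan(items, phrases):
    """Return (phrase, source) pairs for every match, ready for human review."""
    hits = []
    for item in items:
        text = item["text"].lower()
        for phrase in phrases:
            if phrase in text:
                hits.append((phrase, item["source"]))
    return hits

for phrase, source in scan(new_items, monitored_phrases):
    print(f"ALERT: '{phrase}' seen on {source}")
# -> ALERT: 'miracle cure' seen on forum
#    ALERT: 'vaccine recall' seen on blog
```

As noted in the operational challenges, such automated flags are only a starting point; human analysts still judge meaning, accuracy and risk.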



Best practices: online data sources commonly used during the COVID-19 pandemic

Given the large volume of content posted each day on social media, analysts need to take a targeted approach. Good practices demonstrated by national health authorities during the COVID-19 pandemic include:
• using Google Trends and other free tools to check what people in their country or region are searching for online;
• monitoring the online reaction (views, likes and shares) and comments produced in response to social media posts made by health authorities;
• conducting short opinion polls (one or two questions) using the polling feature on social media channels of the health authority or WHO Country Office;
• monitoring comments and reactions to news stories posted on social media by leading national, local and/or international news organizations;
• monitoring conversations in publicly available and open access key health and/or news discussion forums online; and
• monitoring social media posts made by societal leaders and influencers and the responses to these posts.

Data collected using the practices mentioned above has been used by national public health authorities to conduct a signal analysis and risk assessment of their findings. Typically, one or more members of an RCCE-IM team would review the signals and produce a report on how public sentiment was evolving, the main rumours and false information circulating, new narratives emerging and the latest trends in what people were searching for online. This was done daily by multiple health authorities during the most acute phases of the pandemic, or weekly, bi-weekly or monthly during less acute phases.

Key resourcing issues to address when setting up a signal detection system

1. Define a realistic labour input from your team: e.g. 7 staff hours per week (1 hour per day, or 2 half days), 20 staff hours per week (one staff member working 50% or two working 25% on this task), 40 staff hours per week (one person working full time on monitoring).
2. Make a realistic assessment of how many social media accounts and other online sources can be read and analysed in the time available, then prioritize among accounts and other online sources to be monitored.
3. Create a monitoring plan linked to the signal detection objectives. This should define the sources to be monitored, how often content will be reviewed (e.g. daily, weekly or monthly) and what will be looked for.

Step 3: Analyse data and identify trends

Data analysis should be conducted to identify patterns and trends in the information being monitored. This can involve identifying the sources of false information, the types of false information being spread, and the rate and speed of dissemination. There are various methods that analysts can apply to the collected data to detect false information (Table 1). Most of these analytical methods are readily available in various social listening tools. Detailed guides on these methods are beyond the scope of this toolkit.
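The resourcing guidance above (weekly staff hours versus how many sources can be reviewed, and how often) can be turned into a quick capacity estimate. The sketch below is an illustrative aid, not part of the toolkit; the per-source review time is an assumption you would replace with your own team's figures.

```python
def sources_coverable(staff_hours_per_week: float,
                      minutes_per_source_check: float,
                      checks_per_week: int) -> int:
    """Estimate how many sources a team can monitor each week.

    staff_hours_per_week: total analyst time budgeted for monitoring.
    minutes_per_source_check: assumed average time to review one source once.
    checks_per_week: how often each source is reviewed (e.g. 7 = daily).
    """
    minutes_available = staff_hours_per_week * 60
    minutes_per_source = minutes_per_source_check * checks_per_week
    return int(minutes_available // minutes_per_source)

# Example: 20 staff hours/week, 10 minutes per check, daily review
print(sources_coverable(20, 10, 7))  # → 17
```

Running the numbers for a smaller budget (7 staff hours per week) shows why prioritization among sources is essential: the same assumptions cover only six sources.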



Step 4: Adjust signal detection plan
The signal detection plan should be regularly reviewed and adjusted based on the findings of the analysis. If the volume of social listening data is low or if the results are not relevant, adjust the data sources, search terms or data collection methods, or identify new trends or areas of concern to monitor.

Table 1. Methods used by analysts to detect false information

Text analysis
This involves analysing the language used in social media posts and online content to identify patterns and trends in how information is being shared. Text analysis can be used to identify the use of specific keywords or phrases that may be associated with false information, as well as to track changes in the language used to describe a particular health issue.
• Example: In the case of an mpox outbreak in Country X, analysts use social listening tools to find media articles and social media posts and categorize them based on the WHO Public health taxonomy for social listening on mpox conversations (15). Posts are assigned one of the following categories: cause, illness, treatment, interventions or meta-conversation.

Mention volume trend analysis
This involves tracking the volume of mentions of a particular health issue or topic over time to identify changes in how the issue is being discussed online. By analysing the content of these mentions, public health officials can identify whether false information is becoming viral and can take steps to counteract this by promoting accurate information and targeting messaging and outreach efforts to the areas where the false information is most prevalent.
• Example: While monitoring the volume of mentions of mpox in Country X over time, analysts notice a sudden increase in false narratives, which is linked to an influencer making a misinformed claim on social media.

Sentiment analysis
This involves analysing the emotional tone of social media posts and online content to identify patterns in how people are reacting to a particular health issue. False messaging is engineered to go viral. Content that provokes strong negative emotions, such as hate, disgust and indignation, is more likely to spread quickly. Sentiment analysis can be used to identify the spread of false or misleading information that is generating strong emotional reactions among the public.
• Example: During the mpox outbreak in Country X, analysts use social media analysis tools to automatically assess mpox conversations and assign them a defining sentiment: positive, negative or neutral. An unexpected rise in negative emotions is linked to false claims, allowing targeted risk communication to calm public anxiety.

Network analysis
This involves mapping the connections between individuals and groups who are sharing information about a particular health issue. Network analysis can be used to identify key influencers who are spreading false information and to track the spread of false information across different social media platforms.
• Example: Mapping the network of connections between social media health influencers in Country X that are sharing information about mpox allows the analyst to understand that a network of only 10 people is responsible for more than 80% of the reach of the mpox-related social media posts. This leads the analyst to suggest (further) engaging those influencers in public outreach on the topic.

Geographic analysis
This involves analysing the geographic distribution of social media posts and online content to identify patterns in how information is being shared across different regions. Geographic analysis can be used to identify areas where false information is particularly prevalent and to target messaging and outreach efforts to these areas.
• Example: Using Google Trends to look up which regions in the country are most interested in mpox and to identify key questions and concerns, the analyst can then propose region-specific RCCE-IM interventions.
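The mention-volume trend analysis described in Table 1 can be approximated with a simple spike detector that flags any day whose mention count far exceeds the recent baseline. This is a hypothetical sketch: the seven-day window and the threshold factor are assumptions, and dedicated social listening tools implement far more robust versions of this idea.

```python
def detect_spikes(daily_mentions, window=7, factor=3.0):
    """Flag indices where the day's mentions exceed `factor` times
    the mean of the preceding `window` days."""
    spikes = []
    for i in range(window, len(daily_mentions)):
        baseline = sum(daily_mentions[i - window:i]) / window
        if baseline > 0 and daily_mentions[i] > factor * baseline:
            spikes.append(i)
    return spikes

# A sudden jump on the last day against a stable baseline of ~10 mentions/day
counts = [9, 11, 10, 10, 12, 9, 10, 11, 95]
print(detect_spikes(counts))  # → [8]
```

A flagged day would then be examined manually, for instance to check whether the jump is linked to an influencer post, before any response is considered.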



Leverage community engagement for signal detection
Community engagement plays a crucial role in signal detection. Two methods can be employed to
incorporate community engagement into signal detection.

1. Conduct regular RCCE-IM surveys. Identify communities at risk, or those that have historically been at risk in previous health emergencies. This can be done through mapping community-based actors and structures or organizing community meetings in your country or area. Local emergency responders, such as national Red Cross and Red Crescent societies, may have established community listening systems. Leverage those systems or develop mechanisms to regularly engage the communities to understand their concerns, questions and any rumours or false information that is circulating.
2. Understand how the community interacts with health information. Take into account the
community context, ranging from preferred communications channels and style, to the main
community influencers. This can be done through key informant interviews or focus groups with
community members. Existing behavioural and cultural insights studies may provide insights into
preferred channels and trusted sources of information.




Comprehensive signal
verification ensures that the
subsequent risk assessment
phase is based on verified
and credible information.



Phase 2: Signal verification

Description

The goal of signal verification is to verify the accuracy of relevant signals identified during the signal detection phase. The focus is on gathering additional information and evidence to determine whether the signals represent real and accurate information or if they are false or misleading. Signal verification helps ensure that the subsequent risk assessment phase is based on verified and credible information.

Key steps in the signal verification phase

The main steps in the signal verification phase are the following.
1. Identify the source of the signal to determine its credibility.
2. Check for supporting evidence that can corroborate the signal.
3. Check for contextual information, such as the timing and location of the signal.
4. Verify with experts or authorities to validate the information.
5. Summarize findings to inform the next phase (risk assessment).

These activities help to ensure that the signals detected during the surveillance phase are validated, and that the information used in the subsequent risk assessment phase is accurate and reliable.

How to verify collected signals

Simple techniques and action-oriented steps for signal verification are described in Table 2.

Table 2. Signal verification activities

Cross-checking
Cross-checking can be done by verifying the information from trusted sources to determine its accuracy. This can include conducting keyword searches on different search engines, reviewing social media platforms and checking news articles from various sources.

Source verification
Source verification involves determining the credibility and reliability of the sources that provided the information. This can be done by checking the background of the sources, their track record in providing accurate information and their affiliations.

Fact-checking
Fact-checking involves verifying the accuracy of the information by consulting reliable sources such as scientific research, government agencies and reputable news organizations. Fact-checking resources such as the European Digital Media Observatory directory (16) can also be used.

Expert consultation
Expert consultation involves seeking the opinion of subject matter experts such as epidemiologists, clinicians and researchers to verify the accuracy and relevance of the information.

Documentation and reporting
Documentation and reporting involve keeping track of the sources of the information, of the verification process and of the results of the verification. This can help in identifying patterns and trends in false information and can be used for future reference.
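To support the documentation and reporting activity in Table 2, the verification steps can be recorded as a small structured checklist so that each signal's verification status is traceable. The field names below are illustrative, not prescribed by the toolkit.

```python
from dataclasses import dataclass, field

@dataclass
class SignalVerification:
    """Record of the verification activities applied to one signal."""
    signal: str
    cross_checked: bool = False
    source_verified: bool = False
    fact_checked: bool = False
    expert_consulted: bool = False
    notes: list = field(default_factory=list)

    def status(self) -> str:
        """'verified' once all activities are done, 'pending' if some are,
        'unverified' if none have been carried out yet."""
        done = [self.cross_checked, self.source_verified,
                self.fact_checked, self.expert_consulted]
        if all(done):
            return "verified"
        return "pending" if any(done) else "unverified"

v = SignalVerification("Claim: vaccine X causes illness Y")
v.cross_checked = True
v.notes.append("No corroboration found in major news outlets")
print(v.status())  # → pending
```

Keeping such records per signal makes it easy to compile the patterns and trends mentioned above for future reference.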




Not all signals need a response.


Conducting a risk assessment
allows responses to high-risk
signals to be prioritized.



Phase 3: Risk assessment

Description

The goal of the risk assessment phase is to assess the public health threat posed by the signals identified during the signal detection and signal verification phases.

By performing an integrated analysis of social listening signals and other data sources, the outcome of this phase is a decision on whether the identified signal warrants a response or not. Not all signals need to be responded to, and conducting a risk assessment allows responses to high-risk signals to be prioritized while also avoiding the amplification of low-risk rumours and false information.

The results of the risk assessment phase then inform the response design phase, in which appropriate responses are developed based on the identified risks. This phase also links to the outreach phase, as the risk assessment may identify specific audiences that are particularly vulnerable to the risks associated with the false information and who may require targeted outreach efforts.

Online social listening in combination with offline community engagement can be used to identify communities particularly vulnerable to false health information by tracking, analysing and synthesizing community inputs, both digital and offline. This process can help identify questions, queries, concerns, complaints and suggestions shared by communities, which can be integrated, categorized and analysed to produce actionable insights. By understanding the information needs of vulnerable communities, RCCE-IM interventions can be developed to address their specific concerns and promote accurate health information.

Key steps in the risk assessment phase

Three questions guide the risk assessment:
1. How extensively has the signal spread among the target audience?
2. How influential is the source of the signal?
3. What level of risk does the signal pose to public health?

Risk assessment of infodemic signals is not a rigid formulaic process. While these guiding questions and risk assessment frameworks such as the one in Fig. 3 are helpful in reducing bias and noise in judgment (17), they should be seen as tools that inform a human decision. The analyst's deep contextual understanding is crucial for accurate risk assessment.

Examples of risk assessment for specific signals are provided in the sections overleaf to illustrate this process at a tactical level.



Fig. 3. Example risk evaluation matrix

Risk to vaccine hesitancy and demand
• Low risk: low risk to vaccine demand
• Medium risk: potential to trigger hesitancy to vaccinate
• High risk: potential to lead to vaccine refusals

Reach and scope of misinformation
• Low risk: limited potential reach or scope
• Medium risk: moderate potential reach or scope
• High risk: wide or cross-country reach or scope

Likelihood of issue spread or escalation
• Low risk: unlikely to spread in community or online
• Medium risk: spreading in community and/or online
• High risk: spreading rapidly in community and online

Response capacity
• Low risk: strong messaging and capacity in place
• Medium risk: limited existing messages and resources to manage crisis
• High risk: limited existing messages and capacity exceeded

General public trust
• Low risk: remaining trust in government, health services, vaccines
• Medium risk: reduced trust in government, health services, vaccines
• High risk: outward displays of mistrust in government, health services, vaccines

Response
• Low risk: monitor closely, consider prebunking
• Medium risk: debunk, raise trusted voices
• High risk: debunk, raise trusted voices

A closer look at each question


How extensively has the signal spread among
the target audience?
The initial step in risk assessment involves
determining the current and potential virality of
the signal. By conducting signal detection and
verification, statistical information about the signal
can be gathered, including the following.

• Reach: The number of unique users who have


seen a piece of content. This metric can be used
to measure the size of the audience and the
potential impact of this content.
• Impressions: The number of times a piece of
content has been displayed. This metric can
be used to measure the potential reach of this
message.
• Engagement rate: The percentage of users who
have engaged with a piece of content, such as
likes, comments and shares. This metric can
be used to measure the effectiveness of this
content and the level of audience engagement.
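The metrics above combine into a basic engagement-rate calculation. A minimal sketch, assuming engagement rate is defined as total engagements divided by impressions (platforms differ in the exact definition, and some divide by reach instead):

```python
def engagement_rate(likes: int, comments: int, shares: int,
                    impressions: int) -> float:
    """Engagements as a percentage of impressions."""
    if impressions == 0:
        return 0.0
    return 100 * (likes + comments + shares) / impressions

# A post seen 50 000 times with modest interaction
print(round(engagement_rate(120, 15, 40, 50_000), 2))  # → 0.35
```

A post with many views but a very low engagement rate, like the first example below, may need only monitoring rather than a direct response.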



Example post (content altered to protect privacy):

YourName @yourname
I've done my research, and I'm convinced that these COVID vaccines are just a scheme for government control. Why be a pawn? Think for yourself and refuse the shot
#antivax #COVIDisnotreal
5:29 PM · Feb 15, 2023 · 10.7K Views
1 · 1 · 390

YourName
Reflection: This hypothetical tweet has minimal retweets and likes (engagement) even if it has garnered a significant number of views (more than 10 000).

Recommendation: Analysis shows that the signal has a low virality (not many similar posts have been shared after its posting). An analyst could recommend not to debunk or respond directly, but to simply keep monitoring for similar themes.

How influential is the source of the signal?

The second goal of the assessment is to determine the influence of the source regarding public health and scientific matters. Social media platforms provide a voice and the ability to connect with broad audiences. Often the influence does not depend on the level of expertise on the subject matter, but on the extent of the outreach. While it is concerning if a health professional or expert shares false information, in some cases it can be even more detrimental and harmful to trust if an athlete or musician with a large fanbase expresses skepticism towards COVID-19 vaccination.


Example post (content altered to protect privacy):

YourName @yourname
Are you aware that COVID vaccine contains graphene oxide, a toxic chemical causing health issues? We are just test subjects in their experiment. Don't follow the herd, refuse the vaccine!
#grapheneoxide #conspiracy
1:26 PM · Apr 2, 2023 · 829.4K Views
269 · 5,304 · 9,092 · 1,599



Reflection: In order to assess the author's influence, the following metrics and aspects can be considered.

Follower count: One of the most straightforward metrics is the number of followers; a larger follower count generally indicates a larger audience and greater influence. While it is challenging to set a definitive threshold for a "large enough" follower count needing a response, a general guideline might be to prioritize sources with followers exceeding 100 000. Instead of a fixed number, an analyst could use a percentage-based system as a guideline: for example, if a source reaches or influences more than 1% of the target demographic or population, it necessitates a response.

Engagement: Beyond the raw follower count, the interaction the source has with its audience is crucial. A source with a smaller follower count but high engagement may have a more dedicated and active audience that is more likely to act on their message.

Domain credibility: Even if the source is influential and credible in a different domain (like music or sports), some of that influence carries over into other domains and can still make their messages impactful.

Past endorsements: If the source has previously been endorsed or amplified by other influential figures or entities, this can boost their influence and credibility among certain audiences.

Recommendation: Analysis reveals that the author has great influence on their large number of followers, even though the author is not a scientific organization or subject matter expert. The post has received significant engagement and almost a million views. An analyst could recommend a targeted debunking response to this signal.

What level of risk does the signal pose to public health?

When conducting our risk assessment, the last question calls for reflection on the threat to public health and safety due to the spread of false information. A few key points to consider are:

• Severity: what is the potential harm or impact of the information on public health?
• Vulnerability: which groups of people are particularly vulnerable to the health risk?

Example: During the COVID-19 pandemic, an influential public figure suggested that injecting bleach or disinfectants could cure or prevent the disease (19).

Reflection: This is a dangerous and potentially deadly suggestion, as bleach and disinfectants are toxic substances that can cause serious harm to the human body.

Recommendation: This signal not only has high reach and comes from an influential figure, but the suggestion to inject bleach made by the public official also poses a major health risk. An analyst's recommendation could be to immediately issue a public health alert advising people not to inject bleach as it can cause serious harm and even death.

Reporting on the outcome of the risk assessment

• To effectively communicate risk assessment results, it is crucial to generate regular reports that are readily shared with stakeholders. Stakeholders encompass a wide range of individuals and groups, including internal team members, other sector authorities and administration levels, CSOs, fact-checkers, international organizations and other pertinent parties.
• For more information on building an infodemic insights report, read the WHO and United Nations Children's Fund manual How to build an infodemic insights report in 6 steps (20).
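The influence thresholds quoted earlier in this phase (an absolute guideline of 100 000 followers, or reaching more than 1% of the target population) can be sketched as a simple triage rule. Both thresholds are illustrative guidelines from the text, not fixed rules, and would be tuned to the local context:

```python
def warrants_response(followers: int, target_population: int,
                      abs_threshold: int = 100_000,
                      pct_threshold: float = 0.01) -> bool:
    """True if the source's audience is large enough to prioritize,
    either in absolute terms or relative to the target population."""
    if followers >= abs_threshold:
        return True
    return target_population > 0 and followers / target_population >= pct_threshold

print(warrants_response(250_000, 5_000_000))  # → True (absolute threshold)
print(warrants_response(60_000, 4_000_000))   # → True (1.5% of population)
print(warrants_response(5_000, 10_000_000))   # → False
```

As with the risk matrix, such a rule only flags candidates for response; engagement, domain credibility and past endorsements still require the analyst's judgement.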



Leverage community engagement for risk assessment
Risk assessment becomes more accurate and relevant when community engagement is used in the
process. Two approaches illustrate how community engagement can be used during risk assessment:

1. Triangulate using other public engagement mechanisms: if other engagement mechanisms are set up in the community, such as rumour reporting (21), a signal to be assessed can be cross-checked against them to gain a fuller understanding of whether the signal is also appearing in community settings. Local emergency responders might have community listening systems that can also be used to triangulate rumours.

2. Tap into established relationships: now is the time to engage with CSOs and community leaders.
Tapping into established relationships enables a secondary step in risk assessment which allows for
assessment of the reach of the signal and how the signal is perceived by a particular community.
This step plays a pivotal role in fostering trust, which is essential for an effective RCCE-IM
intervention.




Effective response interventions must recognize that knowledge alone does not guarantee action. People's behaviour is influenced by beliefs, cultural norms, emotions and social pressures.



Phase 4: Response design

Description

The response design phase is focused on designing an effective response plan as needed, including the development of key messages, selection of appropriate communication channels, and the creation of materials and resources.

The response design phase relies upon insights generated in the risk assessment phase. The high-priority signals identified during the risk assessment phase form the basis for designing response interventions. Response can be carried out through both online and offline channels. Interventions can include targeted messaging, engaging with key influencers, developing risk communication materials or other activities to address a specific false information narrative.

This phase also involves the ongoing monitoring of the situation to ensure that response interventions are effective. Adjustments need to be made based on new developments and feedback from stakeholders.

Effective response interventions must recognize that knowledge alone does not guarantee action (22). People's behaviour is influenced by a complex interplay of beliefs, cultural norms, emotions and social pressures. For example, even when parents understand the importance of keeping a child with measles at home to prevent spreading the infection, they may still send them to school due to work commitments, social obligations or misunderstanding the severity of the situation. By integrating behavioural frameworks, interventions can target these underlying factors, resonating with the way people actually think and behave. This approach is more likely to lead to meaningful changes in behaviour, as it considers not just what people need to know, but also what motivates them to act.

Key steps in the response design phase

There are 10 main activities in the process of designing a response to false information.

1. Identify the target audience(s) for the response to determine who needs to receive the message and what their characteristics and communication preferences are.
• Who should take action?
2. Define the goals and objectives of the response in a clear and measurable way to achieve the desired outcomes.
• What do we want our target audience to do?
3. Identify and engage stakeholders and partners who can support the response effort and engage them in the process. Co-design response efforts where appropriate.
• Who can help us achieve our desired outcome?
4. Develop a rapid response outline that sets out the tactics, timelines and resources needed to implement the response effectively.
• How do we plan to achieve our desired outcomes?
5. Develop response messages by crafting clear, concise and compelling messages that are tailored to the target audience and that address the specific concerns and false information being circulated.
• What are the key actionable messages that can help us achieve our desired outcomes?
6. Determine the response channels and select the most effective channels and trusted messengers to deliver the response messages to the target audience(s), such as social media, traditional media, CSOs and other offline channels.
• What are the channels that our audience(s) use and the influencers they trust?



7. Create response materials and develop a range of materials to support the response, such as tweets, longer posts, social media tiles, fact sheets, infographics, videos and other social media content.
• What are the materials that our audience(s) would be most engaged with?
8. Test and refine the response by conducting small-scale tests of the response messages and materials with members of the target audience(s) and refine them based on feedback.
• How are our messages and materials received by target audience(s)?
9. Monitor and evaluate the response by regularly monitoring the effectiveness of the response, gathering feedback from the target audience(s), and evaluating the impact of the response on attitudes and behaviours via CSOs and other on-the-ground partners.
• What are the key performance indicators we need to put in place to evaluate our intervention(s)?
10. Learn and enhance the response by incorporating feedback and adjusting key messages or messaging formats as necessary.
• What are the main findings we have learned from monitoring and evaluation that need to be reflected in our plan?

How to develop effective response messages

The characteristics of the most effective response messages are presented in Table 3.

When considering response interventions, there can be two approaches.

1. Developing corrective messaging: Corrective messaging involves creating and disseminating accurate information to directly counteract the false information that has been spread. This messaging should be carefully crafted to speak to the specific needs of the target population and ensure that it effectively addresses the specific concerns and questions raised by the original false information. One example of corrective messaging is the debunking strategy.
2. Developing counter messaging: Counter messaging involves creating messages that offer a different perspective or alternative explanation of the issues at hand without correcting the false information directly. This can be an effective strategy for addressing false information that is difficult to correct. One example of counter messaging is the prebunking strategy.

The following two sections will go into detail on these two response techniques, debunking and prebunking.

Table 3. Characteristics of effective response messages

Timeliness: The response message should be delivered as quickly as possible to prevent the spread of false information or confusion.

Clarity: The message should be clear and easy to understand, use simple language and avoid jargon.

Accuracy: The response message should be based on accurate and reliable information from credible sources.

Specificity: The message should be specific to the topic or issue being addressed rather than general or vague.

Consistency: The message should be consistent with other messages from the same source and with information from other credible sources.

Actionability: The message should provide clear and actionable steps that the target audience can take to protect themselves or address the issue.

Empathy: The message should be delivered in a tone that is empathetic and understanding of the concerns, emotions and beliefs of the target audience.



Debunking as a reactive response technique

What is debunking?

Debunking is a method for exposing and correcting false or misleading information. This involves using evidence-based and logical arguments to challenge and disprove claims that are not supported by facts or scientific evidence. Debunking techniques can include fact-checking, source verification, expert opinions and the development of critical thinking skills. By using a debunking intervention, it is possible to reduce the spread of false information and promote accurate information.

How does debunking work?

When people are exposed to false information repeatedly, they may begin to accept it as true, even if it goes against their preexisting beliefs. However, when the false information is challenged and corrected, people may adjust their beliefs accordingly.

Table 4 describes the key factors that determine the effectiveness of the debunking technique.

Example: One example of debunking in a public health emergency is WHO's Mythbusters webpage (23), which during the acute phase of the emergency provided accurate and reliable information about COVID-19 and addressed some of the common myths and misconceptions circulating in the media and social networks.

Table 4. Key factors determining effective debunking

Factor Description

Debunking is most effective when it’s done quickly and before the false information has a chance
to spread widely and become entrenched in people’s beliefs.
Timing Example: When a celebrity posts on Facebook a misleading fact about harms caused by vaccines,
a health organization promptly replies with accurate information, preventing the misinformation
from spreading widely.

The effectiveness of debunking can vary depending on the audience. Some people may be more
resistant to changing their beliefs, particularly if those beliefs are deeply held and important
to their identity. Once we have a clear understanding of our target audience, we can frame the
accurate information in a way that is relevant and resonant with that specific audience. This
Audience could include using techniques like storytelling, personal anecdotes or emotional appeals to help
our audience connect with the information on a deeper level.
Example: A local public health entity recognizes that some older adults in their area are resistant
to a new medical treatment, so they organize a town hall meeting with trusted local doctors to
connect with the audience’s values and experiences.

MANAGING FALSE INFORMATION IN HEALTH EMERGENCIES: AN OPERATIONAL TOOLKIT 2024 26


Message framing: The way the debunking message is framed can affect its effectiveness. Research has shown that debunking messages that focus on the correct information rather than the false information are more effective (24). Also to be considered:
Emotions: Our beliefs can be driven by emotions such as fear or anger. By acknowledging and empathizing with these emotions, trust and credibility can be built with the target audience, making them more receptive to the correct information.
Context: False information is often misleading because it lacks important context. By providing added context, such as explaining the limitations of a study or the broader context of a news story, the effects of the false information can be countered.
Example: A public health campaign seeks to debunk common misconceptions about the flu vaccine. Instead of solely focusing on the false information, the campaign emphasizes the correct information, using relatable stories and statistics. They also include a video with personal experiences from individuals who benefited from the vaccine. To connect with emotions, the campaign acknowledges common fears and concerns, providing reassurance through expert testimonials. It also provides context by explaining how vaccines are tested and approved.

Trustworthiness of the source: The credibility of the source delivering the debunking message is important. People are more likely to accept debunking information from sources they trust; these may or may not be public health officials and health-care workers, but the engagement of trusted influencers is key to establishing this trust.
Example: To counter misinformation about a public health crisis, a government agency collaborates with faith leaders, leveraging their credibility and trust within the community.

Clarity and simplicity of the message: Debunking messages that are simple, clear and easy to understand are most effective. Visuals can include tools such as graphs, charts and infographics, which are effective at debunking false information, as they can help make complex information more accessible.
Example: Instead of publishing a 50-page brochure, an environmental organization creates an easy-to-understand video debunking false information regarding the health impacts of climate change. They use simple language and clear visuals and make the information more accessible to the public.

Selecting channels: By using channels where the target audience is most active and engaged, the likelihood of the message being seen and engaged with is increased. Therefore, it is important to research and identify these channels, such as social media platforms, email newsletters or community groups, and utilize them for targeted messaging.
Example: A nonprofit organization aimed at addressing mental health in war refugees researches the platforms that their demographic use to access health information. Based on the research, the nonprofit launches campaigns on those platforms to spread the message effectively.

Consensus: Debunking can be more effective when there is a consensus among experts or authoritative sources on the correct information. Working in partnership with other health agencies and relevant stakeholders can amplify the debunking messaging.
Example: To debunk the myth that drinking cold water is unsafe during heat waves because "blood vessels would explode" (25), CSOs and public health experts come together to issue a joint statement, demonstrating a unified agreement on the facts.
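The factors above can be treated as a pre-release checklist for a debunking message. The sketch below is purely illustrative (the factor keys, function name and scoring are assumptions for this example, not part of any WHO tooling): each Table 4 factor becomes a yes/no check, and any unmet factor is flagged before the message is disseminated.

```python
# Illustrative pre-release checklist for a debunking message,
# based on the factors in Table 4. Names and structure are
# hypothetical, not an official WHO tool.

DEBUNKING_FACTORS = [
    "timing",           # released before the falsehood entrenches?
    "audience",         # framed for the specific target audience?
    "message_framing",  # leads with the correct information?
    "trustworthiness",  # delivered by a source the audience trusts?
    "clarity",          # simple language, supporting visuals?
    "channels",         # uses channels the audience actually follows?
    "consensus",        # backed by a joint or expert position?
]

def review_message(checks: dict[str, bool]) -> list[str]:
    """Return the factors that still need attention before release."""
    return [f for f in DEBUNKING_FACTORS if not checks.get(f, False)]

gaps = review_message({
    "timing": True, "audience": True, "message_framing": True,
    "trustworthiness": True, "clarity": False, "channels": True,
    "consensus": False,
})
print(gaps)  # → ['clarity', 'consensus']
```

A message that clears every check is ready for outreach; any flagged factor points back to the corresponding row in Table 4.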



When – and why – does debunking not work?
While debunking can be an effective response technique, not all attempts to debunk false information are successful. In some cases, debunking efforts can backfire, leading target audiences to become even more entrenched in their false beliefs. There are many psychological factors at play (see Table 5).

Table 5. Factors at play in unsuccessful debunking

Backfire effect: Rarely, when people encounter information that contradicts their beliefs, they may become even more entrenched in their false beliefs. This is known as the backfire effect, and it can occur when debunking is done in a way that challenges people's identity or core values (26).
Example: Some people may believe that ivermectin is a natural and safe alternative to vaccines, which they perceive as risky or harmful. If they are confronted with evidence that ivermectin is not effective or safe for COVID-19, they may feel threatened and defensive, and reject the correction. They may also rationalize their belief by finding flaws in the evidence or sources, or by seeking out more supportive information.
Mitigation: Engage respectfully and empathetically, affirming the individual's values and identity before presenting contradictory evidence.

Familiarity effect: Repeated exposure to false information can make it seem more familiar and therefore more believable. Debunking may not be effective in correcting false information that has already become familiar to people (27).
Example: Individuals may have been exposed many times to the claim that ivermectin is effective against COVID-19, getting this message from social media, news outlets or from friends and family. If they are presented with a debunking message that contradicts this claim, they may not pay attention to it or remember it, because it is less familiar than the false information.
Mitigation: Repeat the accurate health information frequently and through various channels, to build familiarity with the truth.

Overconfidence bias: People may believe that they are less susceptible to false information than others, which can make them resistant to correction. Such biases have been recorded at higher rates in people with higher educational attainment (28,29).
Example: Some people may think that they are well-informed about ivermectin and COVID-19, and that they can distinguish between true and false information better than others. If they are exposed to a debunking message that challenges their belief, they may dismiss it as irrelevant or inaccurate, because they trust their own judgment more than the external source.
Mitigation: Frame the debunking information in a way that appeals to the individual's sense of intelligence and critical thinking.

Confirmation bias: People may seek out and believe information that confirms their preexisting beliefs, while discounting information that contradicts them. This can make them resistant to correction (30).
Example: Some may have a strong preference for ivermectin over vaccines for COVID-19, because of their personal values, experiences or emotions. If they encounter a debunking message that shows that ivermectin is not effective or safe for COVID-19, they may ignore it or reject it, because it does not fit with their worldview. They may also look for more information that supports their belief in ivermectin.
Mitigation: Present information from sources that align with the target audience's worldview and create opportunities for active engagement.



Debunking techniques
There are several debunking techniques (31) that
can be used to counter false information. Two of
the most effective techniques – “truth sandwich”
and the refutation technique – are presented
below.

Truth sandwich
The “truth sandwich” is a technique used to refute
health misinformation, which involves presenting
the truth, briefly describing the falsehood, and then
repeating the truth (32). This technique is designed
to avoid further spreading misinformation while still
addressing it.
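The three-part structure of the truth sandwich lends itself to a simple template. The sketch below is an illustrative helper only (the function name and wording pattern are assumptions, not an official WHO message format): it assembles a message so that the accurate information opens and closes it, with the falsehood named only briefly in between.

```python
def truth_sandwich(truth: str, falsehood: str, reinforcement: str) -> str:
    """Compose a truth-sandwich message: truth, brief falsehood, truth again.

    The falsehood is kept short and framed explicitly as a false claim,
    so the accurate information both opens and closes the message.
    """
    return "\n".join([
        truth,                                   # lead with the fact
        f"Some falsely claim that {falsehood}",  # name the myth briefly
        reinforcement,                           # close by repeating the truth
    ])

msg = truth_sandwich(
    "MMR vaccines are safe and important vaccines for children.",
    "vaccines cause autism; there is no scientific evidence for this.",
    "Numerous studies show vaccines are safe and do not cause autism.",
)
print(msg)
```

The fixed ordering is the point of the design: the myth never appears first or last, which reduces the risk of the message itself amplifying the falsehood.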

Example: Truth sandwich

False information: "Measles, mumps and rubella (MMR) vaccines are not safe and can cause autism."

Applying the truth sandwich technique:
• Truth: MMR vaccines are safe and important vaccines for children.
• Falsehood: While some non-experts believe that vaccines can cause autism, there is no scientific evidence to support this claim.
• Truth: In fact, numerous studies have shown that vaccines are safe and do not cause autism. Getting vaccinated is the best way to protect yourself and others from serious diseases; talk to your doctor about vaccination.

Example: A WHO Regional Office for Europe campaign message debunking a common myth regarding COVID-19 vaccines and fertility (WHO Regional Office for Europe, unpublished, 2023).

By using the truth sandwich technique, false information can be corrected while still acknowledging and addressing it. This can help to build trust and credibility with the target audience and make them more receptive to the correct information.

Refutation technique
The refutation technique involves directly refuting false information with evidence and alternative information. It aims to correct false information by presenting accurate information in a clear and concise manner. While refutation can be a valuable tool, it should be complemented with other approaches, such as proactive communication, building trust and promoting accurate information, to effectively address the challenges posed by false information. In some cases, attempts to refute false information can backfire and reinforce people's beliefs in the false information, leading to further entrenchment (33).



Example: Using the refutation technique to debunk a piece of false information:
False information: “The COVID-19 vaccine contains a microchip that the government will use to
track your movements.”

Refutation technique response:


• Identify the false information: The claim that the COVID-19 vaccine contains a microchip that
will be used to track people is false.
• Present evidence: There is no evidence to support this claim. The vaccines have undergone
rigorous testing and have been shown to be safe and effective at preventing COVID-19.
• Explain the evidence: The COVID-19 vaccines do not contain any tracking devices or microchips.
They work by teaching the body how to recognize and fight the virus that causes COVID-19.
• Provide alternative information: The vaccines have been authorized for emergency use by the
United States Food and Drug Administration, and have been administered to millions of people
with few serious side effects.
• Repeat and reinforce: It is important to get accurate information about the COVID-19 vaccines
from reliable sources, such as WHO or your health-care provider. Vaccines are a safe and effective
way to protect yourself and others from COVID-19.
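The five steps above always follow the same order. As an illustrative sketch (the class and step names are assumptions for this example, not toolkit terminology), the technique can be captured as a small template that renders a refutation in that fixed sequence:

```python
from dataclasses import dataclass

# Ordered steps of the refutation technique, mirroring the example above.
# The step names are illustrative, not official toolkit terms.
REFUTATION_STEPS = [
    "identify",     # name the false claim explicitly
    "evidence",     # present the evidence against it
    "explain",      # explain what the evidence shows
    "alternative",  # supply accurate alternative information
    "reinforce",    # repeat and point to reliable sources
]

@dataclass
class Refutation:
    identify: str
    evidence: str
    explain: str
    alternative: str
    reinforce: str

    def render(self) -> str:
        """Emit the refutation message in the fixed five-step order."""
        return "\n".join(getattr(self, step) for step in REFUTATION_STEPS)
```

Keeping the order fixed in one place means every refutation produced from the template identifies the claim before presenting evidence and always ends by reinforcing the accurate information.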



Prebunking as a proactive response technique

What is prebunking?
Prebunking is a proactive response approach to address rumours, false information and disinformation by preemptively providing accurate information to the public before the false information spreads, and equipping individuals with the skills needed to identify inaccurate information.

The goal of prebunking is to inoculate individuals against false information by providing them with accurate information before they are exposed to falsehoods, as well as by making them aware of the tactics used to spread false information.

This approach is similar to a vaccine, which works by providing the body with a small, harmless amount of an antigen to build immunity against disease. In both cases, the goal is to prevent harm by preparing the individual's body or mind to recognize and resist harmful agents before they encounter them in real-life scenarios. Prebunking helps to "immunize" individuals against false information by providing them with the cognitive tools and knowledge needed to identify and reject false information.

Behavioural psychology explains why prebunking works
Prebunking is built on inoculation theory, which was developed in the 1960s by social psychologists, and it aims to train people to recognize tactics used to manipulate information, much like vaccines train the immune response against a virus. Research has shown that prebunking can be more effective than debunking in reducing the belief in and spread of misinformation (34).

The key aspects of prebunking are described below.

1. Inoculation: Prebunking works by inoculating people against false information. By exposing people to small doses of false information and then immediately providing them with accurate information, prebunking can help people build up resistance to future false information on the same topic (34–36).

2. Social proof: Social proof is a psychological phenomenon where people assume the actions of others to reflect correct behaviour for a given situation (37). Just as vaccines work by building herd immunity in populations, prebunking helps to build a community of individuals who are better equipped to recognize and resist false information. This can ultimately help to reduce the overall spread and impact of harmful false information. Prebunking uses social proof to nudge people towards more accurate beliefs and behaviours. This means that by providing accurate information that is supported by social norms and trusted sources, prebunking can encourage people to adopt more accurate beliefs and behaviours (38).

3. Self-affirmation: Prebunking works by using self-affirmation techniques to build resistance against persuasion. By providing people with opportunities to affirm their values and identity before presenting them with information that challenges their beliefs, prebunking can help people feel less defensive and more open to considering new information.
• For example, an RCCE-IM campaign might ask individuals to reflect on a personal value that is important to them, such as their family structure or their health decisions, and then ask them to make a short video or a social media post about why that value is important to them. This activity could be completed before individuals are exposed to false information about a health topic, such as the safety of a particular medication.
• Research suggests that this type of self-affirmation activity can increase individuals' confidence in their own values and beliefs, which in turn can make them more resistant to the influence of false information. When people are confident in their own values and beliefs, they are less likely to be swayed by false information that conflicts with those values (39,40).



4. Memory bias (misinformation effect): Prebunking helps counteract memory bias, which is the tendency to remember false information even after it has been corrected (41). By providing accurate information before false information is encountered, prebunking can help people remember accurate information instead of false information (42).

5. Trust enhancement: Prebunking enhances trust between the public and the authoritative sources of information. By providing accurate information in a proactive and contextualized way, prebunking can demonstrate the credibility of the authoritative sources and enhance trust (36).

Prebunking techniques used in the public health context
Prebunking can be used in the public health context to inoculate against harmful health information, as in the examples listed below.

• Researchers have used brief inoculation videos to train people in the detection of flawed arguments, as an example of passive inoculation (43). The videos expose participants to a single misleading technique, providing both a forewarning and an explanation of the manipulation technique. This exposure helps to enhance the participants' ability to detect and resist false information.
• Games can be an effective technique to inoculate against false claims.
• The Social Decision-Making Lab at the University of Cambridge, supported by the World Health Organization, built the game GoViral! (44). When a player enters the game, they are encouraged to "walk a mile in the shoes of a manipulator to get to know their tactics from the inside" and "see it as ruining the magician's trick so that we don't fall for it next time around." In this simulation exercise, players learn how filter bubbles create echo chambers of false information and how to manipulate negative emotions to stoke outrage and build influence. This method of exposing players to false information is also referred to as active inoculation. Unlike passive inoculation, where individuals are directly informed why the information is false, active inoculation requires them to learn by actively constructing the misinformation themselves in a controlled environment.
• A similar game called Bad News (45) educates players on six prevalent tactics used in spreading fake news:
  1. Impersonation: Pretending to be someone else or representing a group to make the information seem more credible.
  2. Polarization: Exploiting political divisions to create a wider gap between groups.
  3. Emotional language: Using excessive emotional words to twist the original news, provoking intense feelings.
  4. Conspiracy creation: Crafting or encouraging conspiracy theories to interpret recent happenings.
  5. Trolling: Targeting users, celebrities or organizations to give the illusion of widespread agreement or disagreement with a statement.
  6. Discrediting: Attacking the credibility of individuals, institutions or well-accepted truths to sow doubt among the audience.



A comparison between debunking and prebunking

Debunking is a reactive approach for addressing rumours and disinformation by correcting false information after it has already been disseminated. In contrast, prebunking is preventive and proactive and aims to prevent the impact of false information (Table 6).

Table 6. Comparing debunking and prebunking

Debunking: a reactive response. Identify false information signals; refute false information through response/message development; communicate the refutation to the audience. Occurs after the false information has spread.

Prebunking: a proactive response. Identify the potential for false information to spread; anticipate false information and prepare a response; communicate accurate information before false information spreads. Occurs before the false information has a chance to spread.

Leverage community engagement for response design


Community engagement helps shape contextually relevant responses by capturing insights into local
knowledge, attitudes and perceptions that influence behaviour. These are two ways through which
community engagement supports response in its design phase:
• Providing community insights:
Community engagement creates an essential feedback loop between the designers of the response
and the community. Obtaining direct insights from the community can support the design of long-
term interventions. CSOs and community actors know the intended audiences, their attitudes,
practices and beliefs and can be a valuable source of insights to design the most effective response
based on these.
• Co-designing messages and interventions: CSOs and community actors should be involved in the development and testing of messages to ensure appropriateness and understanding. Involving them in the design of interventions can ensure that interventions resonate more with target audiences and are more effective in achieving desired outcomes.




Outreach empowers
individuals to make informed
decisions to protect their health.
This is achieved by refuting false
information and promoting
accurate information and advice.



Phase 5: Outreach

Description
Outreach is the final phase in the process, where the response plan is implemented and key messages are targeted to the intended audiences. The goal of this phase is to engage the audience and promote behaviour change that supports the intended public health outcomes.

The outreach phase should also link back to the first phase of signal detection through a feedback loop. Once a response intervention is implemented, it is important to continue monitoring the situation to ensure that it is reaching the intended audiences and has the desired impact. The feedback loop can also provide insight into how messages are being perceived by the target audiences, allowing for continuous improvement of the response messages and the overall IM process.

Furthermore, the feedback loop can provide valuable information to the signal detection phase, as any new signals or emerging issues can be detected early on and incorporated into the IM process, enabling the process to be more proactive and responsive to emerging issues.

Key steps in the outreach phase
The key activities in this phase are summarized in Table 7.

Table 7. Key outreach phase activities

Disseminating messages: Disseminate the messages through the most effective channels to reach the intended audience(s), such as social media platforms, email newsletters, websites, leaflets and other online and offline communication channels.

Amplifying messages: Use a variety of strategies to amplify the messages and increase their reach, such as partnering with influencers, engaging with online and offline communities, or paid promotion of social media posts.

Monitoring feedback: Monitor feedback from the target audience(s) (how they receive and perceive the messages) and adjust messaging as needed based on their response and engagement.

Evaluating impact: Evaluate the impact of the messaging on the target audience(s) (whether they accept and uptake advice) and make adjustments to the outreach strategy as needed to improve effectiveness.

Maintaining engagement: Continue to engage with the target audience(s) over time, building trust and establishing a relationship that supports ongoing communication and engagement.
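The feedback loop between outreach and signal detection can be pictured as a simple monitor-and-adjust cycle. The sketch below is a hypothetical illustration (the metric names, function name and the 2% threshold are invented for this example, not part of the toolkit): per-channel engagement is reviewed, and channels falling short are flagged for message or channel adjustments.

```python
# Hypothetical monitor-and-adjust loop linking outreach back to
# signal detection. Metric names and the threshold are invented
# for illustration only.

ENGAGEMENT_THRESHOLD = 0.02  # e.g. a 2% engagement rate per channel

def review_outreach(metrics: dict[str, float]) -> dict[str, str]:
    """For each channel, decide whether to keep or adjust the messaging."""
    decisions = {}
    for channel, engagement_rate in metrics.items():
        if engagement_rate >= ENGAGEMENT_THRESHOLD:
            decisions[channel] = "keep"    # message is reaching the audience
        else:
            decisions[channel] = "adjust"  # revisit framing or channel choice
    return decisions

print(review_outreach({"social_media": 0.035, "newsletter": 0.008}))
# → {'social_media': 'keep', 'newsletter': 'adjust'}
```

In practice the "adjust" branch would feed observations back into the signal detection phase, so that new rumours surfacing in low-performing channels are picked up early.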



Outreach case study
This case study describes how Ireland's Health Service Executive (HSE) used social media for two-way communication, social listening and countering false information (46).

Context: When the COVID-19 pandemic hit, Ireland's HSE started mapping the information needs of people contacting their call centre and used this to develop a script, which answered the most frequently asked questions about COVID-19.

The social media team was also receiving a large number of information requests, most of them as direct messages via HSE's Twitter and Instagram accounts. The HSE social media team collaborated with technical experts on answering these questions.

Muiriosa Ryan, Social Media Manager at HSE, remembered:
When the government was going to announce a new initiative on COVID-19 testing or a change in the travel rules, we knew the public were going to have a lot of questions. HSE's call centre, content and social media teams worked together with HSE's public health experts to keep the common talking points and FAQ [frequently asked questions] document on COVID-19 and the information on HSE's website up to date and relevant. Answering questions on COVID-19 from 07:00 until 22:00, seven days a week, became a routine task for the social media team.

Surge resources
In 2019, HSE's social media team consisted of four staff: one manager, two executives and an assistant. In 2020, the team was assigned three extra staff, bringing the total to seven.

HSE's social media team counters false information
Countering online misinformation from anti-vaccine campaigners and their allies immediately became a high priority for the HSE social media team. Social Media Manager Muiriosa Ryan also stated that "Twitter put a button on its site for all users in Ireland linking to HSE's vaccine website to make reliable information more accessible," and "They [the social media companies] have generally been pretty good at taking down misinformation when we report it. Content that HSE reports gets fast-tracked for action seven days a week. Our biggest challenge is finding the time to keep up with all the misinformation being posted."



Leverage community engagement for outreach
Engaging communities in the outreach phase bridges the gaps between health institutions and the
public. These are some ways in which community engagement can support the outreach process:

• Supporting message deployment: Nongovernmental entities often hold a high level of trust and
respect within the communities they serve. Trusted influencers, CSOs and community actors can
encourage target audiences to refute false information and accept and uptake accurate advice.

• Selecting communication channels: It is vital to involve CSOs and community actors to identify
the most suitable channels to reach our target audiences. CSOs and other community groups often
have their own newsletters, websites and other online and offline communication channels that
may be used to reach target audiences. Furthermore, CSOs can support offline responses for
example through community sessions focusing on topics related to a specific false narrative or by
organizing sessions with public health experts and community members.
• Providing feedback on outreach: CSOs and community partners are best positioned to get
feedback on how messages are perceived and their influence on behavioural change. Therefore, it
is recommended to work with CSOs to track feedback from message recipients and to help shape
iterations and follow-up.
• Building back better together: Involve partners and stakeholders in lessons learned and "building back better" efforts. Intra- and after-action reviews with communities are essential to identify resource gaps, the most effective measures, challenges and recommendations to strengthen future responses.



References 2

1. Wardle C, Derakhshan H. Information disorder: 7. Purnat TD, Nguyen T, and Briand S


toward an interdisciplinary framework for (editors). Managing Infodemics in the 21st
research and policy making. Strasbourg: Council Century: Addressing New Public Health
of Europe; 2017 (https://rm.coe.int/information- Challenges in the Information Ecosystem.
disorder-toward-an-interdisciplinary-framework- Cham: Springer Cham; 2023.
for-researc/168076277c).
8. Google alerts. Mountain View: Google; 2023
2. Advancing infodemic management in risk (https://www.google.com/alerts).
communication and community engagement
in the WHO European Region: implementation 9. Google trends. Mountain View: Google; 2023
guidance. Copenhagen: WHO Regional (https://trends.google.com/trends/).
Office for Europe; 2022 (https://www.who.
10. Followerwonk. Seattle: Followerwonk; 2023
int/europe/publications/i/item/WHO-
(https://followerwonk.com/).
EURO-2022-5842-45607-65433).
11. Hoaxy2 beta. Bloomington: Indiana university
3. Lohiniva AL, Sibenberg K, Austero S, Skogberg N.
Observatory on social media; 2023 (https://
Social Listening to Enhance Access to Appropriate
hoaxy.osome.iu.edu/).
Pandemic Information Among Culturally
Diverse Populations: Case Study From Finland. 12. Bot Sentinel. Hasbrouck Heights: Bot Sentinel;
JMIR Infodemiology. 2022;2(2):e38343. doi: 2023 (https://botsentinel.com/).
10.2196/38343.
13. Talkwalker. Luxembourg: Talkwalker; 2023
4. Policy guide on children and digital connectivity. (https://www.talkwalker.com/social-media-
New York: United Nations Children’s Fund; 2018 analytics-search).
(https://www.unicef.org/esa/media/3141/file/
PolicyLab-Guide-DigitalConnectivity-Nov.6.18- 14. Tools That Fight Disinformation Online. Santa
lowres.pdf). Monica: RAND; 2023 (https://www.rand.
org/research/projects/truth-decay/fighting-
5. Lotto M, Hanjahanja-Phiri T, Padalko H, disinformation/search.html).
Oetomo A, Butt ZA, Boger J et al. Ethical
principles for infodemiology and infoveillance 15. WHO releases a public health taxonomy for
studies concerning infodemic management social listening on monkeypox conversations.
on social media. Front Public Health. 2023 Mar In: World Health Organization [website]. Geneva:
23;11:1130079. doi: 10.3389/fpubh.2023.1130079. World Health Organization; 2022 (https://www.
who.int/news/item/26-09-2022-who-releases-a-
6. Roozenbeek J, Zollo F. Democratize social-media public-health-taxonomy-for-social-listening-on-
research - with access and funding. Nature. 2022 monkeypox-conversations).
Dec;612(7940):404. doi: 10.1038/d41586-022-
04407-8. 16. Fact-checking. Florence: European digital media
observatory (https://edmo.eu/fact-checking/).

2 All online weblinks were accessed 8 August 2023

39 MANAGING FALSE INFORMATION IN HEALTH EMERGENCIES: AN OPERATIONAL TOOLKIT 2024


17. Vaccine Misinformation Management Field 25. VERA FILES FACT CHECK: Viral advisory against
Guide. New York: United Nations Children’s Fund; drinking cold water amid heat NOT TRUE.
2020 (https://vaccinemisinformation.guide/). Quezon City: Vera Files; 2021 (https://verafiles.
org/articles/vera-files-fact-check-viral-advisory-
18. Kahneman D, Sibony O, Sunstein CR. Noise: a against-drinking- cold-w).
flaw in human judgment. Boston: Hachette Book
Group; 2021. 26. Nyhan B. Why the backfire effect does not explain
the durability of political misperceptions. Proc Natl Acad Sci U S A. 2021;118(15):e1912440117. doi: 10.1073/pnas.1912440117.

19. Coronavirus: Outcry after Trump suggests injecting disinfectant as treatment. London: British Broadcasting Corporation; 2020 (https://www.bbc.com/news/world-us-canada-52407177).

20. WHO/UNICEF How to build an infodemic insights report in 6 steps. Geneva: World Health Organization; 2023 (https://www.who.int/publications/i/item/9789240075658).

21. Rumour Tracker Programme: A community-based approach to address information gaps and misinformation on COVID-19. Geneva: World Health Organization; 2022 (https://cdn.who.int/media/docs/default-source/science-translation/case-studies-1/cs13_rumourtracking.pdf?sfvrsn=829a4b42_4).

22. Technical note from the WHO Technical Advisory Group on behavioural insights and science for health. Geneva: World Health Organization; 2021 (https://www.who.int/publications/m/item/technical-note-from-the-who-technical-advisory-group-on-behavioural-insights-and-science-for-health).

23. Coronavirus disease (COVID-19) advice for the public: Mythbusters. In: World Health Organization [website]. Geneva: World Health Organization; 2022 (https://www.who.int/emergencies/diseases/novel-coronavirus-2019/advice-for-public/myth-busters).

24. Chan MS, Jones CR, Hall Jamieson K, Albarracín D. Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychol Sci. 2017;28(11):1531-1546. doi: 10.1177/0956797617714579.

27. Nourbakhsh A, Liu X, Li Q, Shah S. Mapping the echo-chamber: detecting and characterizing partisan networks on Twitter [conference paper]. Proceedings of the 2017 International Conference on Social Computing, Behavioral-Cultural Modeling, & Prediction and Behavior Representation in Modeling and Simulation; 2017 (http://sbp-brims.org/2017/proceedings/papers/challenge_papers/MappingTheEcho-Chamber.pdf).

28. Swire-Thompson B, Miklaucic N, Wihbey JP, Lazer D, DeGutis J. The backfire effect after correcting misinformation is strongly associated with reliability. J Exp Psychol Gen. 2022 Jul;151(7):1655-1665. doi: 10.1037/xge0001131.

29. Albarracín D, Albarracín J, Chan MS, Hall Jamieson K. Creating Conspiracy Beliefs: How Our Thoughts Are Shaped. Cambridge: Cambridge University Press; 2022.

30. Soldá A. Overconfidence as an interpersonal strategy [PhD thesis]. Brisbane: Queensland University of Technology; 2020. doi: 10.5204/thesis.eprints.135191.

31. Amazeen M. The Debunking Handbook 2020. Fairfax: George Mason University Center for Climate Change Communication; 2020 (https://www.bu.edu/com/research/the-debunking-handbook-2020/).

32. Conger K. How misinformation, medical mistrust fuel vaccine hesitancy. In: Stanford Medicine [website]. Stanford: Stanford Medicine; 2021 (https://med.stanford.edu/news/all-news/2021/09/infodemic-covid-19.html).

MANAGING FALSE INFORMATION IN HEALTH EMERGENCIES: AN OPERATIONAL TOOLKIT 2024 40


33. MacFarlane D, Tay LQ, Hurlstone MJ, Ecker UKH. Refuting Spurious COVID-19 Treatment Claims Reduces Demand and Misinformation Sharing. J Appl Res Mem Cogn. 2021;10(2):248-258. doi: 10.1016/j.jarmac.2020.12.005.

34. Roozenbeek J, Van der Linden S, Nygren T. Prebunking interventions based on "inoculation" theory can reduce susceptibility to misinformation across cultures. Cambridge (MA): Harvard Kennedy School; 2023 (https://misinforeview.hks.harvard.edu/article/global-vaccination-badnews/).

35. Garcia L, Shane T. A guide to prebunking: a promising way to inoculate against misinformation. New York: First Draft; 2021 (https://firstdraftnews.org/articles/a-guide-to-prebunking-a-promising-way-to-inoculate-against-misinformation/).

36. Harjani T, Roozenbeek J, Biddlestone M, van der Linden S, Stuart A, Iwahara M et al. A Practical Guide to Prebunking Misinformation. United Kingdom: University of Cambridge, BBC Media Action, Jigsaw; 2022 (https://interventions.withgoogle.com/static/pdf/A_Practical_Guide_to_Prebunking_Misinformation.pdf).

37. Cialdini RB. Influence: Science and practice. Vol. 4. Boston: Pearson Education; 2009.

38. The gentle science of persuasion, part three: Social proof. Tempe: Arizona State University; 2007 (https://news.wpcarey.asu.edu/20070103-gentle-science-persuasion-part-three-social-proof).

39. Iles IA, Gillman AS, Platter HN, Ferrer RA, Klein WMP. Investigating the Potential of Inoculation Messages and Self-Affirmation in Reducing the Effects of Health Misinformation. Sci Commun. 2021;43:6. doi: 10.1177/10755470211048.

40. Carnahan D, Hao Q, Jiang X, Lee H. Feeling fine about being wrong: The influence of self-affirmation on the effectiveness of corrective information. Hum Commun Res. 2018;44(3):274–298. doi: 10.1093/hcr/hqy001.

41. Loftus EF. Planting misinformation in the human mind: a 30-year investigation of the malleability of memory. Learn Mem. 2005 Jul-Aug;12(4):361-6. doi: 10.1101/lm.94705.

42. Ecker UKH, Lewandowsky S, Cook J, Schmid P, Fazio L, Brashier N et al. The psychological drivers of misinformation belief and its resistance to correction. Nat Rev Psychol. 2022;1:13–29. doi: 10.1038/s44159-021-00006-y.

43. Lewandowsky S, van der Linden S. Countering Misinformation and Fake News Through Inoculation and Prebunking. Eur Rev Soc Psychol. 2021;1–38. doi: 10.1080/10463283.2021.1876983.

44. Social decision-making lab at the University of Cambridge, Drog, Tilt, Gusmanson, United Kingdom Cabinet Office. GoViral! [online game]. Cambridge: University of Cambridge; 2023 (https://www.goviralgame.com).

45. Social decision-making lab at the University of Cambridge, Gusmanson. Bad News [online game]. Cambridge: University of Cambridge; 2023 (https://www.getbadnews.com).

46. Risk communication and community engagement: a compendium of case studies in times of COVID-19. Copenhagen: WHO Regional Office for Europe; 2022 (https://apps.who.int/iris/handle/10665/363343).


Further reading

1. Wang S, Pang MS, Pavlou P. Cure or Poison? Identity Verification and the Posting of Fake News on Social Media. J Manag Inf Syst. 2021;38:1011–1038. doi: 10.1080/07421222.2021.1990615.

2. Kolluri NL, Murthy D. CoVerifi: A COVID-19 news verification system. Online Soc Netw Media. 2021;22:100123. doi: 10.1016/j.osnem.2021.100123.

3. Tschiatschek S, Singla A, Rodriguez M, Merchant A, Krause A. Fake News Detection in Social Networks via Crowd Signals. WWW '18: Companion Proceedings of the The Web Conference 2018. 2018;517-524. doi: 10.1145/3184558.3188722.

4. Torres R, Gerhart N, Negahban A. Combating fake news: An investigation of information verification behaviors on social networking sites [conference paper]. Hawaii International Conference on System Sciences. 2018. doi: 10.24251/HICSS.2018.499.

5. Ecker UKH, Lewandowsky S, Cook J, Schmid P, Fazio L, Brashier N et al. The psychological drivers of misinformation belief and its resistance to correction. Nat Rev Psychol. 2022;1(1):13–29.

6. Van der Linden S. Foolproof: why we fall for false information and how to build immunity. New York: Harper Collins; 2023.

7. van der Linden S. Misinformation: susceptibility, spread, and interventions to immunize the public. Nat Med. 2022;28(3):460-467. doi: 10.1038/s41591-022-01713-6.

8. Young K, Hyunji L. Debunking misinformation in times of crisis: Exploring misinformation correction strategies for effective internal crisis communication. J Contingencies Crisis Manag. 2022;31. doi: 10.1111/1468-5973.12447.

9. Mourali M, Drake C. The Challenge of Debunking Health Misinformation in Dynamic Social Media Conversations: Online Randomized Study of Public Masking During COVID-19. J Med Internet Res. 2022 Mar 2;24(3):e34831. doi: 10.2196/34831.

10. Whitehead HS, French CE, Caldwell DM, Letley L, Mounier-Jack S. A systematic review of communication interventions for countering vaccine misinformation. Vaccine. 2023;41(5):1018-1034. doi: 10.1016/j.vaccine.2022.12.059.

11. How 'prebunking' can fight fast-moving vaccine lies. In: PBS News Hour [website]. Washington DC: PBS; 2021 (https://www.pbs.org/newshour/health/how-prebunking-can-fight-fast-moving-vaccine-lies).

12. University of Cambridge. Social media experiment reveals potential to 'inoculate' millions of users against misinformation. Rockville: ScienceDaily; 2022 (www.sciencedaily.com/releases/2022/08/220824152220.htm).

13. Bond S. False information is everywhere. 'Pre-bunking' tries to head it off early. In: npr [website]. Washington DC: npr; 2022 (https://www.npr.org/2022/10/28/1132021770/false-information-is-everywhere-pre-bunking-tries-to-head-it-off-early).

14. Google to Expand False information 'Prebunking' in Europe. In: VOA [website]. Washington DC: VOA; 2023 (https://www.voanews.com/a/google-to-expand-misinformation-prebunking-in-europe/6960557.html).



The World Health Organization (WHO) is a specialized agency of the United Nations created
in 1948 with the primary responsibility for international health matters and public health.
The WHO Regional Office for Europe is one of six regional offices throughout the world, each
with its own programme geared to the particular health conditions of the countries it serves.
Member States
Albania
Andorra
Armenia
Austria
Azerbaijan
Belarus
Belgium
Bosnia and Herzegovina
Bulgaria
Croatia
Cyprus
Czechia
Denmark
Estonia
Finland
France
Georgia
Germany
Greece
Hungary
Iceland
Ireland
Israel
Italy
Kazakhstan
Kyrgyzstan
Latvia
Lithuania
Luxembourg
Malta
Monaco
Montenegro
Netherlands (Kingdom of the)
North Macedonia
Norway
Poland
Portugal
Republic of Moldova
Romania
Russian Federation
San Marino
Serbia
Slovakia
Slovenia
Spain
Sweden
Switzerland
Tajikistan
Türkiye
Turkmenistan
Ukraine
United Kingdom
Uzbekistan

World Health Organization
Regional Office for Europe
UN City, Marmorvej 51,
DK-2100 Copenhagen Ø,
Denmark

TEL +45 45 33 70 00
FAX +45 45 33 70 01
EMAIL [email protected]
WEB www.who.int/europe

WHO/EURO:2024-8271-48043-71198
