Managing false information in health emergencies: an operational toolkit

2024
ABSTRACT
The proposed toolkit provides procedures for the detection and handling of false information, following a five-step process involving signal detection,
verification, risk assessment, response design and outreach. A valuable resource for authorities and other stakeholders, this toolkit helps facilitate active
infodemic management, promoting accurate information dissemination and informed public health decisions.
KEYWORDS
Some rights reserved. This work is available under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 IGO licence (CC BY-NC-SA 3.0 IGO; https://creativecommons.org/licenses/by-nc-sa/3.0/igo).

Under the terms of this licence, you may copy, redistribute and adapt the work for non-commercial purposes, provided the work is appropriately cited, as indicated below. In any use of this work, there should be no suggestion that WHO endorses any specific organization, products or services. The use of the WHO logo is not permitted. If you adapt the work, then you must license your work under the same or equivalent Creative Commons licence. If you create a translation of this work, you should add the following disclaimer along with the suggested citation: “This translation was not created by the World Health Organization (WHO). WHO is not responsible for the content or accuracy of this translation. The original English edition shall be the binding and authentic edition: Managing false information in health emergencies: an operational toolkit. Copenhagen: WHO Regional Office for Europe; 2024.”

Any mediation relating to disputes arising under the licence shall be conducted in accordance with the mediation rules of the World Intellectual Property Organization (http://www.wipo.int/amc/en/mediation/rules/).

Suggested citation. Managing false information in health emergencies: an operational toolkit. Copenhagen: WHO Regional Office for Europe; 2024. Licence: CC BY-NC-SA 3.0 IGO.

Cataloguing-in-Publication (CIP) data. CIP data are available at http://apps.who.int/iris.

Sales, rights and licensing. To purchase WHO publications, see http://apps.who.int/bookorders. To submit requests for commercial use and queries on rights and licensing, see https://www.who.int/about/policies/publishing/copyright.

Third-party materials. If you wish to reuse material from this work that is attributed to a third party, such as tables, figures or images, it is your responsibility to determine whether permission is needed for that reuse and to obtain permission from the copyright holder. The risk of claims resulting from infringement of any third-party-owned component in the work rests solely with the user.

General disclaimers. The designations employed and the presentation of the material in this publication do not imply the expression of any opinion whatsoever on the part of WHO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. Dotted and dashed lines on maps represent approximate border lines for which there may not yet be full agreement.

The mention of specific companies or of certain manufacturers’ products does not imply that they are endorsed or recommended by WHO in preference to others of a similar nature that are not mentioned. Errors and omissions excepted, the names of proprietary products are distinguished by initial capital letters.

All reasonable precautions have been taken by WHO to verify the information contained in this publication. However, the published material is being distributed without warranty of any kind, either expressed or implied. The responsibility for the interpretation and use of the material lies with the reader. In no event shall WHO be liable for damages arising from its use.

All photos: ©WHO/EURO
Phase 5: Outreach
Description
Key steps in the outreach phase
Outreach case study
References
Further reading
Phase 4: Response design
In case a response is needed, the response design phase involves developing effective interventions to counter false information and address the identified risks. It includes segmenting audiences; crafting accurate, clear and relevant messages; selecting appropriate communication channels; and determining the best timing and frequency for message targeting. This section also provides tactical guidance on designing “debunks” and “prebunks” as response interventions to counter false information. The outcome of this phase is the design of an effective response.

Phase 5: Outreach
In this final phase of the process, the response plan is implemented and key messages are targeted to segmented audiences, encouraging them to adopt the behaviour that supports the intended public health outcomes. The outreach phase can also link back to the first phase of signal detection through a feedback loop. The outcome of this phase is enabling people to take informed decisions to protect their health, through refuting false information and promoting accurate information and advice.

…information, driving its cost almost to zero. In contrast, a social listening system demands considerable resources, expertise and effort to monitor, analyse and respond effectively. To address this disparity, leveraging partnerships with technology companies, investing in automated detection systems and fostering international collaboration can create a more scalable and resource-efficient approach to combating false health information.

2. Social listening data lacks offline context. Online listening primarily captures the views and opinions of a specific demographic, leaving out substantial sections of the population. From an operational perspective, online data needs to be complemented with offline research using diverse sources, such as offline surveys and community engagement, for an inclusive and accurate understanding of signals.

3. Human analysts are needed to make sense of automatically generated data. Social media posts often lack context and are riddled with language complexities, making automated data interpretation impossible. It’s crucial to employ human analysts to discern the true meaning of shared information, evaluate its accuracy and assess associated health risks (3).
Figure. The five phases of managing false information: 1. Signal detection; 2. Signal verification; 3. Risk assessment; 4. Response design; 5. Outreach.
Source: (7).
1 The mention of these tools and services (both paid and free) does not constitute an endorsement by the World Health Organization.
Text analysis
This involves analysing the language used in social media posts and online content to identify patterns and trends in how information is being shared. Text analysis can be used to identify the use of specific keywords or phrases that may be associated with false information, as well as to track changes in the language used to describe a particular health issue.
• Example: In the case of an mpox outbreak in Country X, analysts use social listening tools to find media articles and social media posts and categorize them based on the WHO Public health taxonomy for social listening on mpox conversations (15). Posts are assigned one of the following categories: cause, illness, treatment, interventions or meta-conversation.
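For teams that want to prototype this kind of text analysis, a minimal keyword-matching sketch in Python is shown below; the keyword lists, posts and matching rule are hypothetical illustrations and do not reproduce the WHO taxonomy itself.

```python
# Minimal sketch of keyword-based text analysis (hypothetical keywords, not the WHO taxonomy).
from collections import Counter

TAXONOMY = {  # illustrative keyword lists only
    "cause": ["origin", "spread", "how do you get"],
    "illness": ["symptom", "rash", "fever"],
    "treatment": ["treatment", "antiviral", "cure"],
    "interventions": ["vaccine", "isolation", "contact tracing"],
    "meta-conversation": ["media", "coverage", "rumour"],
}

def categorize(post: str) -> str:
    """Assign a post to the taxonomy category with the most keyword hits."""
    text = post.lower()
    hits = {cat: sum(kw in text for kw in kws) for cat, kws in TAXONOMY.items()}
    best = max(hits, key=hits.get)
    return best if hits[best] > 0 else "uncategorized"

posts = [
    "New rash and fever reported after the festival",  # hypothetical posts
    "Is there a vaccine for mpox yet?",
]
print(Counter(categorize(p) for p in posts))
```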
Mention volume trend analysis
This involves tracking the volume of mentions of a particular health issue or topic over time to identify changes in how the issue is being discussed online. By analysing the content of these mentions, public health officials can identify whether false information is becoming viral and can take steps to counteract this by promoting accurate information and targeting messaging and outreach efforts to the areas where the false information is most prevalent.
• Example: While monitoring the volume of mentions of mpox in Country X over time, analysts notice a sudden increase of false narratives, which is linked to an influencer making a misinformed claim on social media.
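A simple way to operationalize trend analysis is to compare each day's mention count with a recent baseline. The sketch below is purely illustrative, with hypothetical daily counts and an arbitrary spike threshold.

```python
# Sketch of mention-volume trend analysis: flag days whose mention count far exceeds
# the average of the preceding window (hypothetical daily counts, illustrative threshold).
from statistics import mean

daily_mentions = [120, 130, 115, 140, 125, 900, 950]  # sudden jump on the last two days

def flag_spikes(counts, window=5, factor=3.0):
    spikes = []
    for i in range(window, len(counts)):
        baseline = mean(counts[i - window:i])
        if counts[i] > factor * baseline:
            spikes.append((i, counts[i], round(baseline, 1)))
    return spikes

print(flag_spikes(daily_mentions))  # e.g. [(5, 900, 126.0), (6, 950, 282.0)]
```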
Sentiment analysis
This involves analysing the emotional tone of social media posts and online content to identify patterns in how people are reacting to a particular health issue. False messaging is engineered to go viral. Content that provokes strong negative emotions, such as hate, disgust and indignation, is more likely to spread quickly. Sentiment analysis can be used to identify the spread of false or misleading information that is generating strong emotional reactions among the public.
• Example: During the mpox outbreak in Country X, analysts use social media analysis tools to automatically assess mpox conversations and assign them a defining sentiment: positive, negative or neutral. An unexpected rise in negative emotions is linked to false claims, allowing targeted risk communication to calm public anxiety.
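As a rough illustration of lexicon-based sentiment tagging (real monitoring platforms typically use trained models or established lexicons), a toy Python sketch with made-up word lists might look like this:

```python
# Toy lexicon-based sentiment tagger; the word lists are illustrative only.
NEGATIVE = {"toxic", "dangerous", "scam", "afraid", "angry"}
POSITIVE = {"safe", "effective", "relieved", "grateful"}

def sentiment(post: str) -> str:
    """Label a post positive, negative or neutral by counting lexicon hits."""
    words = set(post.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = [
    "The mpox vaccine is safe and effective",
    "This outbreak is a scam and the treatment is toxic",
]
print({p: sentiment(p) for p in posts})
```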
Network analysis
This involves mapping the connections between individuals and groups who are sharing information about a particular health issue. Network analysis can be used to identify key influencers who are spreading false information and to track the spread of false information across different social media platforms.
• Example: Mapping the network of connections between social media health influencers in Country X that are sharing information about mpox allows the analyst to understand that a network of only 10 people is responsible for more than 80% of the reach of the mpox-related social media posts. This leads the analyst to suggest (further) engaging those influencers in public outreach on the topic.
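One concrete check behind a finding like this is reach concentration: how few accounts generate most of the total reach. The sketch below is illustrative only, with hypothetical account names and reach figures.

```python
# Sketch of a reach-concentration check: how few accounts generate most of the reach?
# Account names and reach figures are hypothetical.
reach_by_account = {
    "influencer_a": 520_000, "influencer_b": 310_000, "influencer_c": 95_000,
    "user_d": 4_000, "user_e": 2_500, "user_f": 1_200,
}

def top_accounts_covering(reach: dict, share: float = 0.8) -> list:
    """Return the smallest set of accounts whose combined reach exceeds `share` of the total."""
    total = sum(reach.values())
    covered, selected = 0, []
    for account, r in sorted(reach.items(), key=lambda kv: kv[1], reverse=True):
        selected.append(account)
        covered += r
        if covered / total >= share:
            break
    return selected

print(top_accounts_covering(reach_by_account))  # a handful of accounts dominate the conversation
```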
Geographic analysis
This involves analysing the geographic distribution of social media posts and online content to identify patterns in how information is being shared across different regions. Geographic analysis can be used to identify areas where false information is particularly prevalent and to target messaging and outreach efforts to these areas.
• Example: Using Google Trends to look up which regions in the country are most interested in mpox and to identify key questions and concerns, the analyst can then propose region-specific RCCE-IM interventions.
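A minimal sketch of this idea, using hypothetical per-region counts of monitored and flagged posts rather than any real data source, could rank regions by the prevalence of flagged content:

```python
# Sketch of geographic analysis: rank regions by the share of monitored posts flagged
# as false information (all counts are hypothetical).
flagged = {"Region North": 42, "Region South": 7, "Region East": 19}       # flagged posts
monitored = {"Region North": 300, "Region South": 280, "Region East": 120}  # all monitored posts

prevalence = {region: flagged[region] / monitored[region] for region in monitored}
for region, share in sorted(prevalence.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{region}: {share:.0%} of monitored posts flagged")
```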
1. Conduct regular RCCE-IM surveys. Identify communities at risk or those that have been historically at risk in previous health emergencies. This can be done through mapping community-based actors and structures or organizing community meetings in your country or area. Local emergency responders, such as national Red Cross and Red Crescent societies, may have established community listening systems. Leverage those systems or develop mechanisms to regularly engage the communities to understand their concerns, questions and any rumours or false information that are circulating.
2. Understand how the community interacts with health information. Take into account the community context, ranging from preferred communication channels and style to the main community influencers. This can be done through key informant interviews or focus groups with community members. Existing behavioural and cultural insights studies may provide insights into preferred channels and trusted sources of information.
Comprehensive signal
verification ensures that the
subsequent risk assessment
phase is based on verified
and credible information.
• Cross-checking: Cross-checking can be done by verifying the information from trusted sources to determine its accuracy. This can include conducting keyword searches on different search engines, reviewing social media platforms and checking news articles from various sources.
• Source verification: Source verification involves determining the credibility and reliability of the sources that provided the information. This can be done by checking the background of the sources, their track record in providing accurate information and their affiliations.
• Fact-checking: Fact-checking involves verifying the accuracy of the information by consulting reliable sources such as scientific research, government agencies and reputable news organizations. Fact-checking resources such as the European Digital Media Observatory directory (16) can also be used.
• Expert consultation: Expert consultation involves seeking the opinion of subject matter experts such as epidemiologists, clinicians and researchers to verify the accuracy and relevance of the information.
• Documentation and reporting: Documentation and reporting involve keeping track of the sources of the information, of the verification process and of the results of the verification. This can help in identifying patterns and trends in false information and can be used for future reference.
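To support documentation and reporting, a simple structured record for each verified signal can help keep the log consistent; the field names below are illustrative, not a prescribed WHO schema.

```python
# Sketch of a verification log entry for "Documentation and reporting".
# Field names and values are illustrative only.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class VerificationRecord:
    signal: str            # the claim or post being assessed
    sources_checked: list  # where it was cross-checked (search engines, fact-checkers, experts)
    verdict: str           # e.g. "false", "misleading", "accurate", "unverified"
    date_verified: date = field(default_factory=date.today)
    notes: str = ""

log = [
    VerificationRecord(
        signal="Claim that drinking cold water during heat waves is unsafe",
        sources_checked=["national health agency website", "EDMO fact-check directory"],
        verdict="false",
        notes="No supporting evidence found in any credible source.",
    )
]
print(log[0].verdict)
```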
Reach and scope of misinformation: limited potential reach or scope; moderate potential reach or scope; wide or cross-country reach or scope.
Hypothetical tweet from YourName (@yourname): “I've done my research, and I'm convinced that these COVID vaccines are just a scheme for government control. Why be a pawn? Think for yourself and refuse the shot #antivax #COVIDisnotreal” (5.29 PM · Feb 15, 2023 · 10.7K Views).

Reflection: This hypothetical tweet has minimal retweets and likes (engagement) even if it has garnered a significant number of views (more than 10 000).

Recommendation: Analysis shows that the signal has a low virality (not many similar posts have been shared after its posting). An analyst could recommend not to debunk or respond directly, but to simply keep monitoring for similar themes.

How influential is the source of the signal?
The second goal of the assessment is to determine the influence of the source regarding public health and scientific matters. Social media platforms provide a voice and the ability to connect with broad audiences. Often the influence does not depend on the level of expertise on the subject matter, but on the extent of the outreach. While it is concerning if a health professional or expert shares false information, in some cases it can be even more detrimental and harmful to trust if an athlete or musician with a large fanbase expresses skepticism towards COVID-19 vaccination.

Hypothetical tweet from YourName (@yourname): “Are you aware that COVID vaccine contains graphene oxide, a toxic chemical causing health issues? We are just test subjects in their experiment. Don't follow the herd, refuse the vaccine! #grapheneoxide #conspiracy” (1.26 PM · Apr 2, 2023 · 829.4K Views).
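To make the two assessment questions above concrete, the sketch below combines hypothetical engagement metrics into a rough virality and influence rating; the metrics and thresholds are arbitrary placeholders, not toolkit guidance.

```python
# Rough, illustrative scoring of the two assessment questions: is the signal going viral,
# and how influential is the source? All thresholds and inputs are hypothetical.
def assess_signal(views: int, retweets: int, likes: int,
                  similar_posts: int, follower_count: int) -> dict:
    engagement_rate = (retweets + likes) / max(views, 1)
    virality = "high" if engagement_rate > 0.01 or similar_posts > 50 else "low"
    influence = "high" if follower_count > 100_000 else "low"
    action = "design a response" if "high" in (virality, influence) else "keep monitoring"
    return {"virality": virality, "influence": influence, "suggested_action": action}

# The first hypothetical tweet: many views but almost no engagement or similar posts.
print(assess_signal(views=10_700, retweets=1, likes=1, similar_posts=2, follower_count=800))
```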
1. Triangulate using other public engagement mechanisms: if other engagement mechanisms are set up in the community, such as rumour reporting (21), a signal to be assessed can be cross-checked against them to gain a fuller understanding of whether the signal is also appearing in community settings. Local emergency responders might have community listening systems that can also be used to triangulate rumours.
2. Tap into established relationships: now is the time to engage with CSOs and community leaders. Tapping into established relationships enables a secondary step in risk assessment, which allows for assessment of the reach of the signal and of how the signal is perceived by a particular community. This step plays a pivotal role in fostering trust, which is essential for an effective RCCE-IM intervention.
• Timeliness: The response message should be delivered as quickly as possible to prevent the spread of false information or confusion.
• Clarity: The message should be clear and easy to understand, use simple language and avoid jargon.
• Accuracy: The response message should be based on accurate and reliable information from credible sources.
• Specificity: The message should be specific to the topic or issue being addressed rather than general or vague.
• Consistency: The message should be consistent with other messages from the same source and with information from other credible sources.
• Actionability: The message should provide clear and actionable steps that the target audience can take to protect themselves or address the issue.
• Empathy: The message should be delivered in a tone that is empathetic and understanding of the concerns, emotions and beliefs of the target audience.
Example: One example of debunking in a public health emergency is the WHO Mythbusters webpage (23), which during the acute phase of the emergency provided accurate and reliable information about COVID-19 and addressed some of the common myths and misconceptions circulating in the media and on social networks.
• Timing: Debunking is most effective when it’s done quickly and before the false information has a chance to spread widely and become entrenched in people’s beliefs.
  Example: When a celebrity posts on Facebook a misleading fact about harms caused by vaccines, a health organization promptly replies with accurate information, preventing the misinformation from spreading widely.
• Audience: The effectiveness of debunking can vary depending on the audience. Some people may be more resistant to changing their beliefs, particularly if those beliefs are deeply held and important to their identity. Once we have a clear understanding of our target audience, we can frame the accurate information in a way that is relevant and resonant with that specific audience. This could include using techniques like storytelling, personal anecdotes or emotional appeals to help our audience connect with the information on a deeper level.
  Example: A local public health entity recognizes that some older adults in their area are resistant to a new medical treatment, so they organize a town hall meeting with trusted local doctors to connect with the audience’s values and experiences.
• Message framing: The way the debunking message is framed can affect its effectiveness. Research has shown that debunking messages that focus on the correct information rather than the false information are more effective (24). Also to be considered:
  Emotions: Our beliefs can be driven by emotions such as fear or anger. By acknowledging and empathizing with these emotions, trust and credibility can be built with the target audience, making them more receptive to the correct information.
  Context: False information is often misleading because it lacks important context. By providing added context, such as explaining the limitations of a study or the broader context of a news story, the effects of the false information can be countered.
  Example: A public health campaign seeks to debunk common misconceptions about the flu vaccine. Instead of solely focusing on the false information, the campaign emphasizes the correct information, using relatable stories and statistics. They also include a video with personal experiences from individuals who benefited from the vaccine. To connect with emotions, the campaign acknowledges common fears and concerns, providing reassurance through expert testimonials. It also provides context by explaining how vaccines are tested and approved.
• Trustworthiness of the source: The credibility of the source delivering the debunking message is important. People are more likely to accept debunking information from sources they trust; these may or may not be public health officials and health-care workers, but the engagement of trusted influencers is key to establishing this trust.
  Example: To counter misinformation about a public health crisis, a government agency collaborates with faith leaders, leveraging their credibility and trust within the community.
• Clarity and simplicity of the message: Debunking messages that are simple, clear and easy to understand are most effective. Visuals can include tools such as graphs, charts and infographics, which are effective at debunking false information, as they can help make complex information more accessible.
  Example: Instead of publishing a 50-page brochure, an environmental organization creates an easy-to-understand video debunking false information regarding the health impacts of climate change. They use simple language and clear visuals and make the information more accessible to the public.
• Selecting channels: By using channels where the target audience is most active and engaged, the likelihood of the message being seen and engaged with is increased. Therefore, it is important to research and identify these channels, such as social media platforms, email newsletters or community groups, and utilize them for targeted messaging.
  Example: A nonprofit organization aimed at addressing mental health in war refugees researches the platforms that their demographic uses to access health information. Based on the research, the nonprofit launches campaigns on those platforms to spread the message effectively.
• Consensus: Debunking can be more effective when there is a consensus among experts or authoritative sources on the correct information. Working in partnership with other health agencies and relevant stakeholders can amplify the debunking messaging.
  Example: To debunk the myth that drinking cold water is unsafe during heat waves because “blood vessels would explode” (25), CSOs and public health experts come together to issue a joint statement, demonstrating a unified agreement on the facts.
• Backfire effect: Rarely, when people encounter information that contradicts their beliefs, they may become even more entrenched in their false beliefs. This is known as the backfire effect and it can occur when debunking is done in a way that challenges people’s identity or core values (26).
  Example: Some people may believe that ivermectin is a natural and safe alternative to vaccines, which they perceive as risky or harmful. If they are confronted with evidence that ivermectin is not effective or safe for COVID-19, they may feel threatened and defensive, and reject the correction. They may also rationalize their belief by finding flaws in the evidence or sources, or by seeking out more supportive information.
  Mitigation: Engage respectfully and empathetically, affirming the individual’s values and identity before presenting contradictory evidence.
• Familiarity effect: Repeated exposure to false information can make it seem more familiar and therefore more believable. Debunking may not be effective in correcting false information that has already become familiar to people (27).
  Example: Individuals may have been exposed many times to the claim that ivermectin is effective against COVID-19, getting this message from social media, news outlets or from friends and family. If they are presented with a debunking message that contradicts this claim, they may not pay attention to it or remember it, because it is less familiar than the false information.
  Mitigation: Repeat the accurate health information frequently and through various channels, to build familiarity with the truth.
• Overconfidence bias: People may believe that they are less susceptible to false information than others, which can make them resistant to correction. Such biases have been recorded at higher rates in people with higher educational attainment (28,29).
  Example: Some people may think that they are well-informed about ivermectin and COVID-19, and that they can distinguish between true and false information better than others. If they are exposed to a debunking message that challenges their belief, they may dismiss it as irrelevant or inaccurate, because they trust their own judgment more than the external source.
  Mitigation: Frame the debunking information in a way that appeals to the individual’s sense of intelligence and critical thinking.
• Confirmation bias: People may seek out and believe information that confirms their preexisting beliefs, while discounting information that contradicts them. This can make them resistant to correction (30).
  Example: Some may have a strong preference for ivermectin over vaccines for COVID-19, because of their personal values, experiences or emotions. If they encounter a debunking message that shows that ivermectin is not effective or safe for COVID-19, they may ignore it or reject it, because it does not fit with their worldview. They may also look for more information that supports their belief in ivermectin.
  Mitigation: Present information from sources that align with the target audience’s worldview and create opportunities for active engagement.
Truth sandwich
The “truth sandwich” is a technique used to refute
health misinformation, which involves presenting
the truth, briefly describing the falsehood, and then
repeating the truth (32). This technique is designed
to avoid further spreading misinformation while still
addressing it.
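As a minimal sketch of how the truth-sandwich structure could be templated (the message text and function are illustrative only, not a WHO-provided tool):

```python
# Sketch of the "truth sandwich" structure described above: truth, brief falsehood, truth again.
def truth_sandwich(truth: str, falsehood: str) -> str:
    return (
        f"{truth}\n"
        f"A false claim circulating online says {falsehood}\n"
        f"To repeat the facts: {truth}"
    )

print(truth_sandwich(
    truth="Vaccines approved for use have been tested for safety and effectiveness.",
    falsehood="that vaccines are an untested experiment.",
))
```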
Figure. Communicate accurate information before false information spreads; communicate the refutation to the audience.
Outreach empowers
individuals to make informed
decisions to protect their health.
This is achieved by refuting false
information and promoting
accurate information and advice.
Context: When the COVID-19 pandemic hit, Ireland’s HSE started mapping the information needs of people contacting their call centre and used this to develop a script, which answered the most frequently asked questions about COVID-19.

The social media team was also receiving a large number of information requests, most of them as direct messages via HSE’s Twitter and Instagram accounts. The HSE social media team collaborated with technical experts on answering these questions.

Muiriosa Ryan, Social Media Manager at HSE, remembered: “When the government was going to announce a new initiative on COVID-19 testing or a change in the travel rules, we knew the public were going to have a lot of questions. HSE’s call centre, content and social media teams worked together with HSE’s public health experts to keep the common talking points and FAQ [frequently asked questions] document on COVID-19 and the information on HSE’s website up to date and relevant. Answering questions on COVID-19 from 07:00 until 22:00, seven days a week, became a routine task for the social media team.”

HSE’s social media team counters false information
Countering online misinformation from anti-vaccine campaigners and their allies immediately became a high priority for the HSE social media team. Social Media Manager Muiriosa Ryan also stated that “Twitter put a button on its site for all users in Ireland linking to HSE’s vaccine website to make reliable information more accessible”, and “They [the social media companies] have generally been pretty good at taking down misinformation when we report it. Content that HSE reports gets fast tracked for action seven days a week. Our biggest challenge is finding the time to keep up with all the misinformation being posted.”
• Supporting message deployment: Nongovernmental entities often hold a high level of trust and respect within the communities they serve. Trusted influencers, CSOs and community actors can encourage target audiences to refute false information and to accept and take up accurate advice.
• Selecting communication channels: It is vital to involve CSOs and community actors to identify
the most suitable channels to reach our target audiences. CSOs and other community groups often
have their own newsletters, websites and other online and offline communication channels that
may be used to reach target audiences. Furthermore, CSOs can support offline responses for
example through community sessions focusing on topics related to a specific false narrative or by
organizing sessions with public health experts and community members.
• Providing feedback on outreach: CSOs and community partners are best positioned to get
feedback on how messages are perceived and their influence on behavioural change. Therefore, it
is recommended to work with CSOs to track feedback from message recipients and to help shape
iterations and follow-up.
• Build back better together: Involve partners and stakeholders in lessons learned and “building
back better” efforts. Intra- and after-action reviews with communities are essential to identify
resource gaps, the most effective measures, challenges and recommendations to strengthen future
responses.
References

1. Wang S, Pang MS, Pavlou P. Cure or Poison? Identity Verification and the Posting of Fake News on Social Media. J Manag Inf Syst. 2021;38:1011–1038. doi: 10.1080/07421222.2021.1990615.
2. Kolluri NL, Murthy D. CoVerifi: A COVID-19 news verification system. Online Soc Netw Media. 2021;22:100123. doi: 10.1016/j.osnem.2021.100123.
3. Tschiatschek S, Singla A, Rodriguez M, Merchant A, Krause A. Fake News Detection in Social Networks via Crowd Signals. WWW ’18: Companion Proceedings of The Web Conference 2018. 2018;517–524. doi: 10.1145/3184558.3188722.
4. Torres R, Gerhart N, Negahban A. Combating fake news: An investigation of information verification behaviors on social networking sites [conference paper]. Hawaii International Conference on System Sciences. 2018. doi: 10.24251/HICSS.2018.499.
5. Ullrich EKH, Lewandowsky S, Cook J, Schmid P, Fazio L, Brashier N et al. The psychological drivers of misinformation belief and its resistance to correction. Nat Rev Psych. 2022;1(1):13–29.
6. Van der Linden S. Foolproof: why we fall for false information and how to build immunity. New York: Harper Collins; 2023.
7. van der Linden S. Misinformation: susceptibility, spread, and interventions to immunize the public. Nat Med. 2022;28(3):460–467. doi: 10.1038/s41591-022-01713-6.
8. Young K, Hyunji L. Debunking misinformation in times of crisis: Exploring misinformation correction strategies for effective internal crisis communication. J Contingencies Crisis Manag. 2022;31. doi: 10.1111/1468-5973.12447.
9. Mourali M, Drake C. The Challenge of Debunking Health Misinformation in Dynamic Social Media Conversations: Online Randomized Study of Public Masking During COVID-19. J Med Internet Res. 2022 Mar 2;24(3):e34831. doi: 10.2196/34831.
10. Whitehead HS, French CE, Caldwell DM, Letley L, Mounier-Jack S. A systematic review of communication interventions for countering vaccine misinformation. Vaccine. 2023;41(5):1018–1034. doi: 10.1016/j.vaccine.2022.12.059.
11. How ‘prebunking’ can fight fast-moving vaccine lies. In: PBS News Hour [website]. Washington DC: PBS; 2021 (https://www.pbs.org/newshour/health/how-prebunking-can-fight-fast-moving-vaccine-lies).
12. University of Cambridge. Social media experiment reveals potential to ‘inoculate’ millions of users against misinformation. Rockville: ScienceDaily; 2022 (www.sciencedaily.com/releases/2022/08/220824152220.htm).
13. Bond S. False information is everywhere. ‘Pre-bunking’ tries to head it off early. In: npr [website]. Washington DC: npr; 2022 (https://www.npr.org/2022/10/28/1132021770/false-information-is-everywhere-pre-bunking-tries-to-head-it-off-early).
14. Google to Expand False information ‘Prebunking’ in Europe. In: VOA [website]. Washington DC: VOA; 2023 (https://www.voanews.com/a/google-to-expand-misinformation-prebunking-in-europe/6960557.html).
39. Iles IA, Gillman AS, Platter HN, Ferrer RA, Klein WMP. Investigating the Potential of Inoculation Messages and Self-Affirmation in Reducing the Effects of Health Misinformation. Sci Commun. 2021;43:6. doi: 10.1177/10755470211048.