
MGMT3601 – Week 9

Misinformation

MGMT3601 – Information in a Networked World


March 11, 2025
Housekeeping
• Show up to class, that’s easy grades! Participation marks extend all the way up to the end of the
semester.
• Schedule for group presentations has been updated with topics and pairings for class evaluation – your
group is expected to attend class on the day you evaluate another group. Make sure that your
presentation follows the required shark-tank style.
• Every team is expected to have their presentation and related handout ready for March 18th.
• Peer evaluation will be done in class on April 3rd.
• Research papers are due on April 10.
• Quiz 2 is now open!
Today’s agenda - Misinformation
1. Concepts and issues of misinformation
2. Mechanisms of misinformation
3. Misinformation in the digital age
4. Experts, authority and content regulation
5. AI and deepfakes
6. Impacts of misinformation on society and businesses
7. Discussion – Solutions to misinformation
March 11
Concepts and issues of misinformation
What’s the deal with the dissemination of
false information?
Definitions
Misinformation
False or inaccurate information
Disinformation
False information deliberately intended to mislead
Fake news / information disorder
False or misleading information presented as news
Propaganda
Content deliberately used to fabricate and manage attitudes, values, and knowledge – especially around political perspectives
The characteristics of false information often make it indistinguishable from true information
Who can disseminate false information?
Answer: Everyone!
- Corporations and NGOs
- Governments
- Media and journalism
- Citizens and colleagues
- Etc.
Potential impacts of getting false information
- Loss of trust
- Political influence
- Power imbalance
- Health consequences, even death
- Barriers to solutions
- Conspiracy theories
- Isolation
- Economic loss
- Etc.
What is more important: the truth, or what
people believe is the truth?
March 11
Mechanisms of misinformation
Mechanisms - Individual
1) Is the information compatible with what I believe?
2) Is the story coherent?
3) Is the information from a credible source?
4) Do others believe this information?

5) Mental models
6) Retrieval failure
7) Fluency and familiarity
8) Reactance
Mechanisms – Individual (mental models)
“If a retraction invalidates a central piece of information (e.g., factor B, the presence of gas
and paint), people will be left with a gap in their model of the event and an event
representation that just "doesn’t make sense" unless they maintain the false assertion.”
(Lewandowsky et al., 2012)
Mechanisms – Individual (fluency)
“Without direct questions about truth values, people may rely on their metacognitive
experience of fluency during thinking about an event to assess plausibility of their thoughts,
a process that would give well-formed, coherent models an advantage – as long as thoughts
flow smoothly, people may see little reason to question their veracity (Schwarz et al.,
2007).” (Lewandowsky et al., 2012)
Mechanisms - Collective
“Not all information is equally valuable to individuals. We are more likely to share
information from and with people we consider members of our group, when we believe that
it is true, and when the information is novel or urgent.” (McBride, 2021)

You can think of individuals as parts of a hive mind


March 11
Misinformation in the digital age
What changed? What do you believe are some
of the most critical transformations that came
with the development of digital technologies
when you think about misinformation?
What changed?
1) Fast-paced creation and dissemination of content
2) New formats of information dissemination
3) "Breaking" or at least deep change in information gatekeeping ecosystem
4) Materialization of trust issues toward information gatekeepers
5) Authority crisis
Filter bubble
“A filter bubble is an algorithmic bias that skews or limits the information an individual user sees
on the internet. The bias is caused by the weighted algorithms that search engines, social media
sites and marketers use to personalize user experience (UX).
Filter bubbles, which affect an individual's online advertisements, social media newsfeeds and web
searches, essentially insulate the person from outside influences and reinforce what the individual
already thinks. The word bubble, in this context, is a synonym for isolation; its context comes from
a medical device called the isolator, a plastic bubble that was infamously used to sequester a
young patient with immunodeficiencies in the 1970s.”
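
To make the mechanism concrete, here is a minimal, hypothetical sketch (not taken from any real platform) of a personalized feed ranker whose weighting of past engagement reinforces what a user already clicks on. All names, weights, and data are illustrative assumptions.

```python
# Hypothetical sketch of a personalized feed ranker that reinforces prior
# engagement, illustrating how a filter bubble can emerge. Not a real
# platform algorithm; all weights and data are invented.
from collections import defaultdict

def rank_feed(candidate_items, user_topic_clicks, exploration_weight=0.1):
    """Score items mostly by how often the user already clicked that topic."""
    total_clicks = sum(user_topic_clicks.values()) or 1
    scored = []
    for item in candidate_items:
        # The score is dominated by past engagement with the item's topic...
        familiarity = user_topic_clicks.get(item["topic"], 0) / total_clicks
        # ...with only a small term left for anything outside the bubble.
        score = (1 - exploration_weight) * familiarity + exploration_weight * item["base_quality"]
        scored.append((score, item))
    return [item for score, item in sorted(scored, key=lambda x: x[0], reverse=True)]

# Example: a user who mostly clicks "politics" keeps seeing politics first,
# even when other items are of higher base quality.
clicks = defaultdict(int, {"politics": 40, "science": 2})
feed = rank_feed(
    [{"topic": "politics", "base_quality": 0.4},
     {"topic": "science", "base_quality": 0.9},
     {"topic": "health", "base_quality": 0.8}],
    clicks,
)
print([item["topic"] for item in feed])  # politics ranks first despite lower quality
```

Because the familiarity term dominates, content outside the user's established topics rarely surfaces, which is the insulating effect the definition above describes.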
Relationship to expertise
Our relationship to expertise and authority is being vastly transformed by the possibilities of
creating and disseminating content online
“Two things are clear. One, there are more capabilities to consume a variety of media content,
from streaming videos to reading niche newsletters and blogs. Two, people worldwide have grown
accustomed to the habit of consuming content types across the border and from a variety of
sources. But the scary part is that people are naturally prone to trust the transparency of
the content they consume rather than taking a step back and evaluating the source prior to
diving into the contents.” (Fowler, 2022)
Regulation of content in online spheres
Bots and trolls
“The stunning levels of Twitter bot activity on topics related to global heating and the
climate crisis is distorting the online discourse to include far more climate science
denialism than it would otherwise.
An analysis of millions of tweets from around the period when Donald Trump announced the
US would withdraw from the Paris climate agreement found that bots tended to applaud the
president for his actions and spread misinformation about the science.” (Milman, 2018)
Question
Should digital platforms and innovations have the power to decide what is true or not? How
can we make sure they remain accountable?
March 11
AI and deepfakes
Transformations of reporting
“Many newsrooms went further, providing their employees and audiences with statements
or guidelines describing how they intended to approach the use of generative AI in their
workflows and news products. Some even began publishing a few experimental articles
written by ChatGPT. Very few, however, have yet taken specific steps to pragmatically and
routinely apply these technologies in their newsrooms. Change is in the air, but specific
initiatives are harder to find.” (Caswell, 2023)
Solutions from AI
“One example of an application of such technology is fact-checking. Creating software that’s able to
learn based on human fact-checkers’ work ethic in order to verify the information and sources
within an article is the AI’s essence in its simplest form, and this can be a powerful starting point in
the line of defense against fake news.
Another obvious avenue is fake news detection. AI is easily trainable in identifying examples of news
that are factually correct, and by tapping into AI’s ability to discern anomalies or deviations from
the norm, it’s possible to develop a solution that can continuously monitor and compare the
truthfulness of articles and report back on the results it finds.” (Fowler, 2022)
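
As a rough illustration of the "fake news detection" idea in the quote above, the sketch below trains a toy text classifier on labeled articles using scikit-learn (TF-IDF features plus logistic regression). The articles and labels are invented for illustration; a production system would need large labeled corpora, source verification, and human review.

```python
# Minimal sketch of the "fake news detection" idea: train a text classifier on
# articles labeled reliable/unreliable and score new ones. Toy data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

articles = [
    "Health agency confirms vaccine trial results in peer-reviewed study",
    "Miracle cure doctors don't want you to know erases all disease overnight",
    "Central bank publishes quarterly inflation figures",
    "Secret document proves the moon landing was filmed in a studio",
]
labels = [1, 0, 1, 0]  # 1 = reliable, 0 = misinformation (toy labels)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(articles, labels)

new_article = "Anonymous post claims miracle cure suppressed by doctors"
prob_reliable = model.predict_proba([new_article])[0][1]
print(f"Estimated probability the article is reliable: {prob_reliable:.2f}")
```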
ChatGPT
Image generation
“While the most popular uses of generative AI so far are for satire and entertainment
purposes, the sophistication of their technology is growing fast. A number of prominent
researchers, technologists and public figures have signed an open letter asking for a
moratorium of at least six months on the training and research of AI systems more powerful
than GPT-4, a large language model created by US company Open AI. “Should we let
machines flood our information channels with propaganda and untruth?” they ask.” (Kahn,
2023)
Deepfakes
Question
How can we adapt to the lasting effects of AI?
March 11
Impacts of misinformation on businesses and society
Political influence
“One of the first ones that I really dug into was the hashtag #MediaLiesAgain, which just popped up out of
nowhere and all of a sudden had thousands of tweets associated with it. I ran some basic analytics just to
see what the activity looked like over time, and who were the top people tweeting about it.
The activity pattern was very abnormal. Beyond the time factor of being 3 a.m., these tweets were being
pushed in waves. You’d see these big spikes in activity where a bunch of tweets will come out all at once and
then it drops back off.
[…]
This pattern was not organic — it was not people sitting at their computers deciding all at once to tweet
about media lies. A couple of the tweets were geo-tagged to Macedonia, which was known to host these fake
news farms.” (Garossino and Orr Bueno, 2017)
Health consequences
COVID misinformation led to at least 2,800 deaths in Canada, $300M in costs: report, CTV
NEWS
Economic loss
According to Statista, the economic loss from misinformation in 2020 was 78 billion dollars

“Many social-media websites struggle to maximize user engagement while minimizing the amount
of misinformation shared and reshared. The stakes are high for Facebook, Twitter, and their rivals,
which generate most of their revenue from advertising. Viral content leads to higher user
engagement, which in turn leads to more advertising revenue. But content-management
algorithms designed to maximize user engagement may inadvertently promote content of dubious
quality—including fake news.” (Jacobs, 2018)
Barriers to solutions
March 11
End of class discussion – Solutions to counter misinformation
Fact-checking
“Other scholars argue that it is actually not “motivated reasoning” but “lack of reasoning”
that is to blame. If people accept incorrect information, then it may simply come down to
“lazy thinking” and a failure to make the effort to realign their views with inconvenient
truths. Whatever the cause, there seems to be little that fact-checking in the strictest
sense can do to alter deeply-held beliefs that are based on inaccurate information.”
(Tompkins, 2020)
More regulations
“Despite these challenges, Germany has been eminent in the number and quality of legal and social
initiatives against disinformation. The amendment to the Interstate Media Treaty (Medienstaatsvertrag)
added measures to fight against disinformation and misinformation. The state media authorities have
received competence to initiate proceedings against media outlets if the journalistic due diligence
obligations have not been adequately respected. Messages or advertising created by social bots must also
be marked clearly, which may prevent automated amplification of certain content. To increase the
transparency of online advertisements, all advertisements, whether political, social or religious, must be
labelled as such, and their advertisers or their buyers must be clearly indicated.” (Bayer, 2021)
Empathy
“By bonding over the values we truly share, and by connecting them to climate, we can
inspire one another to act together to fix this problem. But it all begins with understanding
who we already are, and what we already care about – because chances are, whatever that
is, it’s already being affected by climate change, whether we know it or not.” (Hayhoe, 2021)
Transparency
“The proliferation of artificial intelligence (AI) and generative AI technologies, combined with the
ever-present challenges of misinformation and disinformation, make the need for transparency
and access to reliable sources of information more pressing than ever. These technologies have
the power to shape public opinion, influence political outcomes and impact individual lives. In this
environment, ensuring the public has access to accurate and timely government-held information
about present and historical decisions and events is key to maintaining the integrity of our
democratic processes and upholding trust in our public institutions.” (Kosseim, 2024)
