Annotated Bibliography
This article claims that misinformation indeed cost human lives during the
COVID-19 pandemic, drawing on scientific research and report data. In addition, the
author proposes solutions for dealing with fake news and misinformation that are
feasible and to the point.
The evidence can be used to back up my own claim that misinformation and fake
news cause social problems; the pandemic is just a lens that helps us look into the issue.
However, the proposed solutions are tailored to COVID-19 and public health concerns
in particular, not to misinformation in general.
“THE DAILY ME.” Republic: Divided Democracy in the Age of Social Media, by Cass R.
Sunstein, Princeton University Press, 2017, pp. 1–30.
Cass Sunstein is currently the Robert Walmsley University Professor at Harvard and
the founder and director of the Program on Behavioral Economics and Public Policy
at Harvard Law School. Professor Sunstein is now working on a variety of projects
involving fake news and freedom of speech. In this chapter, he illustrates the
potential problems of personalized news feeds, approaching the issue from the angles
of people's freedom of choice, humans' natural tendency toward homophily, and the
architecture of serendipity that democracy requires. The author believes a
personalized news feed system (i.e., the "Daily Me") will create gated communities
and promote polarization and fragmentation. He proposes that a well-functioning
system of free expression must meet two distinctive requirements: first, people
should be exposed to materials that they would not have chosen in advance; second,
many or most citizens should have a wide range of common experiences.
Sunstein's chapter helped me see how the modern social media system threatens the
well-being of democracy and why a recommendation system or information-filtering
architecture can be a danger to it.
Tufekci, Zeynep. “We're Building a Dystopia Just to Make People Click on Ads.” TED, 2017,
www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click
_on_ads?referrer=playlist-the_race_for_your_attention.
In this TED Talk, Tufekci convinced me of the potential danger of AI, not from the
traditional humanoid-robot perspective but in terms of how the people in power will
use AI to control us and to manipulate us in novel, hidden, subtle, and unexpected
ways. She explains that with AI, a persuasion architecture can be built at the scale
of billions: it can target, make inferences about, and understand individuals one by
one by figuring out our weaknesses. This persuasion architecture is the
recommendation system and the underlying classification AI that collects and
analyzes user data in order to sell not only products but also news, information, and
politics.
This talk warned me of a clear role switch: in the world of information, users of
social media platforms are no longer customers but products to be sold to
advertisers. The platforms' intention is therefore simply to capture as much of our
time as possible. This is the fundamental business logic of the large social media
tech companies. They are not unbiased and neutral information intermediaries; they
are corporations aiming for profit.
This journal article investigates how Twitter has enforced its content-vetting policies
during the COVID-19 pandemic. According to the author, Twitter has broadened its
definition of harmful content as a strategy to combat misinformation on its platform.
During the pandemic, social media has served as a main channel for sharing,
receiving, and engaging with virus-related content; however, the problem of
misinformation has also grown into an infodemic. This article will help me examine,
for one platform specifically, how content policies have been updated and redefined
in response to massive amounts of misleading information.
Bulanda, Jennifer Roebuck, et al. “Vaccine Opposition in the COVID-19 Age.” Social
Problems in the Age of COVID-19 Vol 1, edited by Glenn W. Muschert et al., 1st ed., Bristol
University Press, 2020, pp. 122–33. JSTOR, doi:10.2307/j.ctv15d81tx.17.
This book chapter describes how vaccine opposition has become a growing social
problem because of abundant misinformation, economic uncertainty, and the
politicization of the pandemic. The authors support their claim with extensive
scientific research comparing the history of vaccination with today's situation. An
effective vaccine, together with the majority of people's commitment to take it, is the
key factor in ending the epidemic. However, with growing anti-vaccination sentiment,
the outlook is not optimistic. The authors propose six inspirational recommendations
as remedies.
This book provides a unique lens into how social media and the wide spread of
misinformation can cause social problems and cost human lives. A detailed case
study will be used in my essay to illustrate the severity of the misinformation
problem. Because this source is timely and relates to what we are all currently
experiencing, it can help all readers see the problem.
Sunstein, Cass. “Is Social Media Good or Bad for Democracy?” Sur: Revista Internacional
de Direitos Humanos, vol. 15, no. 27, July 2018, pp. 83–89.
This article, from Professor Sunstein again, explores the harm social media can do
to democracy from the perspective of polarization. It targets the "personalized
experience" specifically, which social media companies pursue as a main competitive
advantage to attract users. He explains how this mechanism risks keeping users in
an "information cocoon" and threatens the information flow that a democracy
requires. The paper also proposes a better architecture to resolve this problem.
This article does not look at misinformation in particular but focuses solely on how
the "personalized experience" feature of social media jeopardizes the basis for a
healthy democracy, which brings new perspectives into the discussion.
Nechushtai, Efrat, and Seth C. Lewis. “What Kind of News Gatekeepers Do We Want
Machines to Be? Filter Bubbles, Fragmentation, and the Normative Dimensions of
Algorithmic Recommendations.” Computers in Human Behavior, vol. 90, Jan. 2019, pp.
298–307. EBSCOhost, doi:10.1016/j.chb.2018.07.043.
Sætra, Henrik Skaug. “The Tyranny of Perceived Opinion: Freedom and Information in the
Era of Big Data.” Technology in Society, vol. 59, Elsevier Ltd, Nov. 2019. EBSCOhost,
doi:10.1016/j.techsoc.2019.101155.
This article helps me understand what AI systems have fundamentally threatened
and changed in a well-functioning democracy. It provides a more detailed analysis
that combines the previous key concepts, such as echo chambers and homophily.
Niemiec, Emilia. “COVID-19 and Misinformation: Is Censorship of Social Media a Remedy
to the Spread of Medical Misinformation?” EMBO Reports, Oct. 2020. EBSCOhost,
doi:10.15252/embr.202051420.
This is a very interesting article for understanding freedom of speech and the
remedies for misinformation, because it raises the questions I would ask: Is a
content policy a form of censorship? Should we pass the responsibility of resolving
misinformation solely to social media companies? Did we just give them more
influence over users, and thus over society as a whole?
Miller, Michael L., and Cristian Vaccari. “Digital Threats to Democracy: Comparative Lessons
and Possible Remedies.” International Journal of Press/Politics, vol. 25, no. 3, July 2020, pp.
333–56. EBSCOhost, doi:10.1177/1940161220922323.
This article compares possible remedies to various digital threats to democracy
across 23 countries on four continents. It also points out three challenges for future
research on digital media and politics.
This article looks not only at the US but more generally across various societies:
how the problem of media arose, how it has impacted citizens, and what solutions
different regions have proposed. I think it will provide extra insight into remedies
and a possible future outlook for the problem.
Aswad, Evelyn Mary. “In a World of ‘Fake News,’ What’s a Social Media Platform to Do?”
Utah Law Review, no. 4, Utah Law Review, Aug. 2020, pp. 1009–28.
This essay informed me of the legal and policy guidelines that the content policies
created by social media platforms should follow so that those policies do not
become a form of censorship.