Cho Micah, 2024, BS
On my honor as a University Student, I have neither given nor received unauthorized aid on this
assignment as defined by the Honor Guidelines for Thesis-related Assignments
Advisor
Caitlin D. Wylie, Department of Engineering and Society
Introduction and Background
In March 2018, Christopher Wylie, a former employee of the political consulting firm Cambridge Analytica, shocked the world by releasing documents to the British newspaper the Observer which revealed that millions of Facebook profiles had been unethically harvested by his former employer in a worldwide violation of user privacy. In a statement to the Observer, Wylie
admitted, “We exploited Facebook to harvest millions of people’s profiles. And built models to
exploit what we knew about them and target their inner demons” (Cadwalladr & Graham-
Harrison, 2018). The data was harvested through a third-party application called “this is your digital life,” developed by researcher Aleksandr Kogan. The application paid Facebook users to complete a survey which would supposedly be used for “academic use.”
Kogan, however, had an agreement to sell the data he collected to Cambridge Analytica, whose
CEO at the time was Alexander Nix (Cadwalladr & Graham-Harrison, 2018). Nix and his firm
then used the harvested data to deliver political messages to Facebook users through microtargeted advertisements.
Nix’s intention was to influence elections: specifically, the political messages were meant to sway voters in the 2016 U.S. presidential election. Nix was even caught claiming
credit for President Donald J. Trump’s election, saying “We did all the research, all the data, all
the analytics, all the targeting. We ran all the digital campaign, the television campaign and our
data informed all the strategy” (McKee, 2018). Nix directly admitted to using the harvested data
to microtarget Facebook users while also asserting that his firm’s work did influence the election
results.
This incident, which later became known as the Facebook-Cambridge Analytica scandal,
has raised concerns about user privacy violations in an age where technology platforms have
access to vast amounts of personal information. In this paper, I will investigate the responsibilities of
developers and technology platforms in handling user data through an examination of the
scandal. Specifically, I will analyze the roles that Facebook, Aleksandr Kogan, and Alexander
Nix played through the lens of duty ethics and ethics of care. My analysis will show that all three
parties have a duty of care towards their users in handling their personal data and that each
individual party failed in that duty. Furthermore, my analysis will show that failures in duties of
Analysis
I. Facebook
Technology companies handle vast amounts of personal data and therefore possess a duty
of care towards the users whose data they are handling. Facebook, one of the largest technology
platforms in the world that handles the personal data of hundreds of millions of users, is no
exception to this duty. Duty of care, in the context of technology platforms, refers to the
responsibility to protect users from harm that can result from usage, exposure, or exploitation of
their personal data. In the Facebook-Cambridge Analytica scandal, Facebook failed in their duty
of care towards their users. In this section, I will examine how Facebook failed to provide a
protective technology platform, adequate user consent mechanisms, and transparency to its
users.
First, as part of its duty of care, Facebook has a responsibility to develop technology that protects user data and does not leave user data vulnerable. However, an analysis
of an API version that Facebook released in 2010 demonstrates that Facebook failed in this duty.
In 2010, Facebook updated its platform by adding Graph API v1.0, an open graph tool. Through this API, third-party developers and applications could retrieve the personal data of all of a user’s Facebook “friends” by gaining consent from only the individual user (Mitra, 2018). This open graph
API exploit was how Kogan was able to harvest the data of so many Facebook profiles through
his application. While this may not violate the privacy of the individual user, it clearly violates the privacy of all of the user’s connections, who never gave the third-party application consent to access and use their data. Facebook failed to
properly manage access levels in this API release. In other words, the platform failed to control who could access which data and what could be done with it. Proper access level management is
necessary for any technology platform to protect user data. This flaw in the technology that
Facebook released left users and their data vulnerable. Since Facebook did not have control over
who could access their data, Facebook users were not protected from harm that could result from
their data. Overall, through the release of their open graph API, Facebook failed to fulfill their duty of care.
Second, Facebook has a responsibility to provide adequate user consent mechanisms. This must be accomplished in order to respect each user’s autonomy, a
key component in the fulfillment of the duty of care. Facebook’s main failure in providing
adequate user consent mechanisms was allowing third-party developers and applications to
access the personal data of the connections of an individual Facebook user. One user’s consent is not adequate consent to access the personal data of that user’s Facebook friends. Aleksandr Kogan, the developer of the “this is your digital life” application,
claimed in a statement to CNBC: “Each user who authorized the app was presented with both a
list of exact data we would be collecting, and also a Terms of Service detailing the commercial
nature of the project and the rights they gave us as far as the data” (Aiello, 2018). Even if Kogan’s third-party application provided adequate consent mechanisms for the users who installed it, Facebook’s platform was inherently flawed in its ability to protect those users’ connections. The API functionality was too easy to exploit and left users vulnerable, as shown by the fact that millions of users had their data harvested without their consent.
Finally, as a part of its duty of care towards users, Facebook has a responsibility of transparency: the company must inform its users of the outcomes of the scandal, what caused those outcomes, and the actions being
taken. Five days after Wylie blew the whistle on Cambridge Analytica, Facebook CEO Mark
Zuckerberg released a statement on his own platform. Part of his statement read:

“In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who shared their data as well as some of their friends' data. Given the way our platform worked at the time this meant Kogan was able to access tens of millions of their friends' data.” (Zuckerberg, 2018, p. 1)
Zuckerberg simply states that Kogan was able to harvest data because of “the way our platform
worked.” He fails to inform his users of the cause of the incident, namely the inherent flaw in his
platform which allowed for the harvesting of so much data. As a part of the duty of care, it is
necessary to inform users of this cause so that they are able to make informed choices on
what steps they can take to protect their own data. Furthermore, if the cause is not revealed, users
will come to distrust the company. While Zuckerberg’s post did outline measures that Facebook would take to protect privacy in the future, an analysis of Facebook’s actions one year after the scandal reveals that none of these promised measures had been acted upon.
According to Wong (2019), Facebook had not yet pursued a forensic audit of Cambridge
Analytica and had not investigated “all apps that had access to large amounts of information” as
of 2019. Zuckerberg had also promised a “clear history” tool for Facebook users, but as of 2019 the tool was not available and had no timeline for release, and Facebook had provided no updates on any of these promises. These failures to inform users of the cause of the scandal and of the actions taken in response demonstrate a lack of transparency by Facebook in the aftermath of the
scandal. Technology companies have a responsibility to reveal such information when their users are affected. In sum, Facebook, through its role in the scandal, failed in its duty of care towards its users and should be held morally responsible
for the breach of data privacy that occurred. Facebook failed in its responsibilities of providing a
protective platform, adequate user consent mechanisms, and transparency in the aftermath of the
scandal.
II. Aleksandr Kogan

Technology companies are not the only entities that possess this duty of care. Each
individual developer and engineer also has the responsibility to protect users from harm by
developing technology that protects user data and does not leave user data vulnerable. It is
necessary for individuals to hold this responsibility in addition to companies and platforms for two
reasons. First, companies are often influenced by money and will choose to do what is
financially best over what is best for their users. Individual developers must mitigate this by
choosing to do what is best for users. Second, company platform policies are not always reliable,
as they are often not written by technical experts, so developers cannot blindly follow policies that may endanger users. For these two reasons, developers hold a duty of care towards
users and their data. In this section, I will examine how Aleksandr Kogan, developer of the “this is your digital life” application, failed in his duty of care by taking advantage of a loophole in Facebook’s platform.

Throughout the development of the “this is your digital life” application, Aleksandr Kogan did not consider the privacy of the users whose data he collected. Kogan was a psychologist with a doctorate from the University of Hong Kong, who was working as an
assistant professor at Cambridge University when he was first contacted by SCL Elections, a
Cambridge Analytica entity, in 2014 (Davies et al., 2018). Prior to being contacted by
Cambridge Analytica, Kogan had developed a personality quiz application, called “this is your
digital life” which he claimed was for academic research. To Kogan’s users, the application was
nothing more than a digital survey that required them to first log in to their Facebook accounts. However, what users of the application did not realize was that by logging into their
Facebook account, they were authorizing the application to collect the personal information on
their account, along with the personal information of all the accounts they were Facebook
“friends” with. In an interview on 60 Minutes with Lesley Stahl, Kogan admitted that he did
harvest the data of each user’s Facebook connections even though these users “didn’t opt-in
explicitly” and also acknowledged that the ability to do this was a feature of the Facebook
platform (Stahl, 2018). In the same interview, Kogan also admitted to providing the harvested
data to Cambridge Analytica, knowing that it would be used for microtargeting in campaign
advertisements.
Kogan’s actions and awareness of what he was doing demonstrate a failure in his duty of
care as a developer. Kogan knew that the ability to harvest data of social connections of
Facebook users was a loophole in the Facebook platform that was created by poor policy and left
user data vulnerable, but still chose to exploit that loophole. Furthermore, Kogan was aware that
these social connections did not give consent for their data to be accessed, so his application did
not take user privacy into consideration. Ultimately, Kogan neglected to assume any role in
protecting personal data despite his duty of care towards users of his application, and along with
Facebook and Alexander Nix, should be held responsible for the breach of data that occurred in
the scandal.
III. Alexander Nix

Cambridge Analytica, as a political consulting firm that utilized personal data, also possesses a duty of care. Specifically, companies that use personal data have a responsibility to use only data that has been collected with consent, and to use that data in an ethical manner.
Companies that use harvested data possess this duty of care for the same reasons that companies
that handle data, like Facebook, have a duty of care. Namely, these companies have a
responsibility to protect their users’ data in order to foster a culture of trust with users.
Furthermore, by using data that was unethically harvested, they are encouraging data mining
practices that violate user privacy. Thus, Cambridge Analytica possesses a duty of care to use
properly harvested data, and to not encourage unethical data harvesting practices. In this section,
I will analyze how Cambridge Analytica, and specifically CEO Alexander Nix, engaged in unethical data practices and failed in this duty.

Nix was directly responsible for his company’s decision to use the data Kogan had
collected and for arranging for the usage of microtargeted advertisements. While being secretly
recorded by Channel 4 News, Nix bragged about his company’s role in the election of Donald J. Trump:

“We did all the research, all the data, all the analytics, all the targeting. We ran all the digital campaign, the television campaign, and our data informed all the strategy.” (McKee, 2018, p. 1)
Nix acknowledges, through this statement, that he exploited Facebook users by using their data
to target them with digital advertising. In an effort to gain financial profit along with a strong
reputation as a political consulting firm, Nix completely neglected the users whose data he
exploited and instead used them as a means for his and his company’s benefit. Through this
neglect, Nix broke the culture of trust between his firm and the users whose data he was
handling. Furthermore, had his firm not been caught, Nix’s violations in data collection would
only have encouraged other developers to unethically harvest data in order to profit through
companies like Cambridge Analytica. Therefore, Nix’s actions demonstrate a failure in his duty
of care towards the users whose data he was handling. Nix could have shown care towards these
users by simply declining to use the data he had access to. Had he chosen not to use the data, the scandal would not have happened, and Nix would not have lost his position. However, since he chose to use
the data, he failed in his duty of care and should be held morally responsible for the data breach
that occurred in the scandal, along with Facebook and Kogan. CEOs can show care towards
their clients and users both by making decisions that prioritize the culture of trust that they have
with their users and by forming their company with other individuals who prioritize this culture
of trust.
Conclusion
In the aftermath of the scandal, Facebook removed the open graph API tool from its
platform that opened the door for the scandal. CEO Mark Zuckerberg also promised to investigate other third-party applications with access to large amounts of user data and to place further restrictions on developers’ data access. However, despite these promises, the company had made no updates on them as of 2019 (Wong, 2019).
Kogan was accused by the Federal Trade Commission (FTC) of misleading his survey takers and
reached a settlement in 2019 in which he was required to destroy the data he harvested from
Facebook (Federal Trade Commission, 2019). He was not required to confirm or deny the
allegations against him. Nix faced similar allegations by the FTC and was also not required to
confirm or deny the allegations against him (Federal Trade Commission, 2019). Nix was
immediately suspended by the board of Cambridge Analytica, which later declared bankruptcy
(McKee, 2018). No individual involved faced criminal punishment after the scandal, leaving many questioning
the safety and security of their information in the digital world. More research is needed on how
responsibility can be assigned and what actions should be taken to punish responsible
parties. Furthermore, more research is needed on how companies and individual developers can be held accountable by the public and the users whose privacy depends on them.
Overall, Facebook, Aleksandr Kogan, and Alexander Nix failed in their duties of care
that they each owed to their users. Facebook failed to provide a technology platform that
adequately protected users, while Kogan exploited weaknesses in Facebook’s platform and
provided data to Nix, who unethically used the data to influence an election. Through my
analysis, it is apparent that the failures of technology platforms and developers in their duty of
care towards users can result in violations of user privacy. Had one of these three parties acted
ethically in their care for their users, this scandal could have been entirely avoided.
References
Aiello, C. (2018, March 21). Developer behind the App at the Center of the Data Scandal. CNBC. kogan-facebook-shouldve-known-how-app-data-was-being-used.html
Cadwalladr, C., & Graham-Harrison, E. (2018, March 17). Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach. The Guardian. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
Davies, H., et al. (2018, April 24). How Academic at Centre of Facebook Scandal Tried – and Failed – to Spin Personal Data into Gold. The Guardian. https://www.theguardian.com/news/2018/apr/24/aleksandr-kogan-cambridge-analytica-facebook-data-business-ventures
Detrow, S. (2018, March 20). What Did Cambridge Analytica Do During the 2016 Election? NPR. https://www.npr.org/2018/03/20/595338116/what-did-cambridge-analytica-do-during-the-2016-election
Federal Trade Commission. (2019, July 24). FTC Sues Cambridge Analytica, Settles with Former CEO and App Developer. releases/2019/07/ftc-sues-cambridge-analytica-settles-former-ceo-app-developer
Fernekes, C., & Harbath, K. (2023, March 16). History of the Cambridge Analytica Controversy. controversy/
Hanna, M. J., & Isaak, J. (2018). User Data Privacy: Facebook, Cambridge Analytica, and Privacy Protection. IEEE. https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8436400
Mckee, R. (2018, March 20). Alexander Nix, Cambridge Analytica CEO, Suspended after Data Breach Claims. The Guardian. analytica-suspends-ceo-alexander-nix
Mitra, R. (2018, June 15). How the Facebook API Led to the Cambridge Analytica Fiasco. APIacademy. https://apiacademy.co/2018/06/how-the-facebook-api-led-to-the-cambridge-analytica-fiasco/
Stahl, L. (2018, April 22). Aleksandr Kogan: The Link Between Cambridge Analytica and Facebook. 60 Minutes, CBS News. between-cambridge-analytica-and-facebook/
Wong, J. C. (2019, March 18). The Cambridge Analytica Scandal Changed the World – But it Didn’t Change Facebook. The Guardian. https://www.theguardian.com/technology/2019/mar/17/the-cambridge-analytica-scandal-changed-the-world-but-it-didnt-change-facebook
Zuckerberg, M. (2018, March 21). [Facebook post]. Facebook. https://www.facebook.com/zuck/posts/10104712037900071