
AI Ethics in Predictive Policing

From Models of Threat to an Ethics of Care

Peter M. Asaro

Digital Object Identifier 10.1109/MTS.2019.2915154


Date of publication: 30 May 2019

IEEE Technology and Society Magazine, June 2019
The adoption of data-driven organizational management — which includes big data, machine learning, and artificial intelligence (AI) techniques — is growing rapidly across all sectors of the knowledge economy. There is little doubt that the collection, dissemination, analysis, and use of data in government policy formation, strategic planning, decision execution, and the daily performance of duties can improve the functioning of government and the performance of public services. This is as true for law enforcement as for any other government service.

Significant concerns have been raised, however, around the use of data-driven algorithms in policing, law enforcement, and judicial proceedings. This includes predictive policing — the use of historic crime data to identify individuals or geographic areas with elevated risks for future crimes, in order to target them for increased policing. Predictive policing has been controversial for multiple reasons, including questions of prejudice and precrime1 and effectively treating people as guilty of (future) crimes for acts they have not yet committed and may never commit. This central controversy over prejudice and precrime is amplified and exacerbated by concerns over the implicit biases contained in historic data sets, and the obvious implications for racial, gendered, ethnic, religious, class, age, disability, and other forms of discriminatory policing, as well as by how the use of predictive information systems shapes the psychology and behavior of police officers.

As more bureaucratic processes are automated, there are growing concerns over the fairness, accountability, and transparency of the algorithms used to make consequential decisions that determine people's life opportunities and rights. Less discussed are the ways in which the introduction of data-centric processes and data-driven management has significant consequences for the techno-social and spatio-temporal structure of organizations [1], as well as for the priorities of organization management, the nature of labor, and the quality of results [2]. Such is the nature of contemporary technocratic governance [3]. Yet neither the increasing collection of and reliance on data, nor the specific socio-technical and spatio-temporal organization of governmental institutions, is determined by the technology alone, or by the utility of data. Nor is the kind of analysis performed on that data, or the specific problems to which the data is addressed, pre-determined or "natural" in any meaningful sense. Rather, there are myriad social, institutional, and individual values that go into the decisions of which data to collect, when and where to collect it, how to encode it, how to assemble it in databases, how to interpret it, and how to use it to address social, institutional, and individual concerns. It is those values which are the primary concern of ethics in information systems design.

This article outlines a new ethical approach that balances the promising benefits of AI with the realities of how information technologies and AI algorithms are actually adopted, applied, and used. It proposes that AI ethics should be driven by a substantive and systemic Ethics of Care, rather than by narrow Models of Threat based on utilitarian risk and threat models. While it focuses on law enforcement policies and policing practices, it hopes to contribute to the broader discussion over the ethical application of AI technologies in government policy-making and the delivery of public and commercial services more generally. The paper concludes that while data-driven AI techniques could have many socially beneficial applications, actually realizing those benefits requires careful consideration of how systems are embedded in, and shape, existing practices, beyond questions of de-biasing data. Absent such consideration, most applications are likely to have unjust, prejudicial, and discriminatory consequences. This conclusion supports a proposed Ethics of Care in the application of AI, which demands moral attention to those who may be negatively impacted by the use of technology.

Recent Excitement about AI

There is a recent and widespread excitement about the application of artificial intelligence to nearly every aspect of society — from commerce to government. AI, as a scientific research field, has long sought to develop computer programs to perform tasks that were previously thought to require human intelligence. This somewhat abstract and conditional definition has given rise to a wide array of computational techniques, from logical inference to statistical machine learning, that enable computers to process large and complex datasets and quickly provide useful information. Whether through traversing long chains of inference or sifting through vast amounts of data to find patterns, AI aims to provide logically sound and evidence-based insights into datasets. Insofar as these datasets accurately represent phenomena in the world, such AI techniques can potentially provide useful tools for analyzing that data and choosing intelligent actions in response to that analysis, all with far less human labor and effort. This is the traditional approach of AI, or what we might consider artificial specialized intelligence. This type of AI is essentially about creating a customized piece of

1 "Precrime" is a science fiction concept that first appeared in the writings of Philip K. Dick in a novel [19] that was later turned into a major Hollywood movie [20].

software to address a complex issue or solve a specific problem by automating what would otherwise require human mental effort.2

Specialized AI is best seen as an extension of more traditional practices such as software engineering, IT systems design, database management, and data science, which deploys a range of AI techniques to automate the search for solutions to problems that currently require substantial human mental labor and skill. Much of the current excitement around AI is focused on "deep learning" machine learning techniques that use many-layered "deep" neural networks that can find complex patterns in large datasets ("big data"). Far from artificial sentience, consciousness, or general intelligence, we could consider this as enthusiasm for "statistics on steroids." Commercial and governmental institutions have long used statistics to develop representations of the world that can inform future actions and policies. In this sense, the AI revolution is really a continuation, and massive acceleration, of much longer and older trends of datafication and computerization. What is new and unprecedented is the sheer volume of data, the speed at which it can now be effectively processed, the sophistication of the analysis of that data, the degree of automation, and the consequent lack of direct human oversight that is possible.

As data-driven organizational management — led by big data, machine learning, and AI techniques — continues to accelerate, and more processes are automated, there are growing concerns over the social and ethical implications of this transformation. Machine ethics is concerned with how autonomous systems can be imbued with ethical values. "AI ethics" considers both designing AI to explicitly recognize and solve ethical problems, and the implicit values and ethics of implementing various AI applications and making automated decisions with ethical consequences. This paper will consider the latter, implicit view, which corresponds to what is sometimes called "robot ethics," to distinguish it from explicit "machine ethics" [4]. Ideally, the explicit ethics, implicit ethics, and the embedding and regulation of the system in society should all align [5].

The outputs of predictive policing algorithms clearly have ethical consequences, even if the systems under consideration make no attempt at explicit ethical reasoning. In the predictive policing systems under consideration, there is little or no effort to design the systems to frame their analysis or results as ethical decisions or to perform ethical analyses. What is of concern to the public, and in this paper, is how well the systems are designed, and the ethical implications of introducing them into police practices.

There is a growing body of research examining the ways in which data-driven algorithms are being used in an increasing number of critical decision processes, often with little or no accountability [6]–[9], and sometimes with little or no real understanding of how they function in the real world or why they reach the results they do in particular cases [10]–[12]. Consequently, there are many ways for such systems to "go wrong." Sometimes this is due to a well-intentioned but mathematically naive understanding of how such systems work. This includes the failure to understand how statistical outliers may be mishandled or misrepresented, or how historical data patterns can be self-reinforcing — such as denying credit and charging higher interest rates to poorer individuals and communities, thus systematically denying them opportunities to escape poverty. Sometimes this is due to the intended desire to transfer responsibility and blame to an automated process, and relieve human agents of their responsibility. And sometimes there may be malevolent motives behind using data in obviously discriminatory ways — such as purging voter rolls to deny eligible voters to an opposing political party. But these are ultimately "narrow" views of AI ethics, which look to improving the accuracy and performance of the technology while largely ignoring the context of use. It has also been argued that the focus of AI ethics on "solving" the bias problem is a distraction from other and more important ethical and social issues [13]. Without discounting the value of such narrow approaches, this paper will examine the importance of taking a broader ethical perspective on AI, and the problems that will not be fixed through fairness, accountability, and transparency alone.

Two Approaches to AI Ethics

This paper aims to go beyond the ways in which data and AI algorithms might be biased or unaccountable, and consider the ethics of how AI systems are embedded in social practices. Because AI ostensibly automates various forms of human reasoning, consideration, and judgement, the accuracy or fairness of such processes alone does not guarantee that their use will provide just, ethical, and socially desirable results. Rather, careful attention must be paid to the ways in which the implementation of such systems changes the practices of those who use them. In order to redirect attention to the bigger picture of the socio-technical embeddedness of AI when considering ethics, the paper will formulate two broad concepts of AI ethics, which will be named

2 Some theorists have speculated about the possibility or consequences of an artificial general intelligence (AGI), which might be able to learn with little or no direct instruction from humans, and in some sense recognize problems on its own that are in need of solution, and then adapt itself to solve them. AGI is not technologically feasible for the foreseeable future, and as such it will not be given much consideration here.

"Models of Threat" and an "Ethics of Care."3 It will first outline these concepts in broad terms. It will then examine two illustrative cases, in the area of predictive policing, which epitomize each approach. It concludes with some observations and reflections on how to design better and more ethical AI through an Ethics of Care approach.

Perhaps the greatest ethical concerns over algorithmic decisions have been raised around the use of data-driven algorithms in policing, law enforcement, and judicial proceedings. One well-researched and much-discussed example from the Florida judicial system involves the use of algorithms to predict future recidivism in convicts as a basis for determining the length of their sentences.4 Another growing application is predictive policing — the use of historic crime data to identify individuals or geographic areas with elevated risks for future crimes, in order to target them for increased policing. Predictive policing has been controversial — as it aspires to prevent crime, it also raises questions of prejudice and precrime, effectively treating individuals and communities as guilty of (future) crimes for acts they have not yet committed and may never commit [21], [22]. This central controversy of prejudice and precrime is amplified and exacerbated by more general concerns over the implicit biases contained in historic data sets, and the obvious implications for racial, gendered, ethnic, religious, class, age, disability, and other forms of discriminatory policing.

Predictive policing as a term can refer to a variety of technologies and practices. The technical usage of the term usually refers to algorithmic processes for predicting locations or individuals with high probabilities of being involved in future crime, based upon historical data patterns [23]. Recent approaches utilize "big data" techniques and arguably entail forms of mass surveillance of the public [24]. However, these recent algorithmic techniques and applications have their roots in much older practices of collecting and utilizing comparative statistics (better known as CompStat) about crimes to manage large police forces, which began in New York City in 1995. While many CompStat programs utilized computer programs to calculate the statistics from crime and accident reports and arrest records, and in some cases automatically generate "pin-maps" of crime activity, CompStat was really a set of data collection, analysis, and management practices rather than a piece of software [25]. And CompStat has seen its share of criticism, including from former police officers [26].

Moreover, the algorithmic techniques that are increasingly being employed by police forces draw upon data that goes well beyond the digitized crime reports of the CompStat legacy, or automatically generated "heat maps" of areas of high crime activity.5 In recent years, police departments have begun deploying and integrating large-scale video surveillance systems, traffic cameras, license-plate and face recognition technologies, audio gun-shot locators, cellphone interceptors, aerial surveillance, and a host of other surveillance and data-collection technologies. As these systems become networked and produce large amounts of data, there is increased pressure to analyze, integrate, and utilize this data for improving law enforcement, which leads to increased reliance on automation and algorithms for sorting and sifting through that data and translating it into policing priorities and strategies. As such, the term predictive policing can be taken to refer to a broad class of algorithmic and data-driven practices and software tools utilized by police forces. Predictive policing is also a good example of how AI might be deployed more generally, and of the ethical challenges that may arise.

A general approach to AI ethics is characterized here as an "Ethics of Care." The paper uses predictive policing, and the design of AI-based systems within it, to lay out the framework for an AI Ethics of Care. In particular, we look at two recent, but very different, implementations of data-driven interventions on youth gun violence in Chicago, Illinois, U.S.A. Predictive policing is particularly good for this purpose for several reasons. As should be clear from the discussion above, policing is an area that gives rise to a number of critical ethical and legal issues, and has relevance not only to society at large, but to a host of other governmental functions and other industries. It is also an area that has an historical practice of data collection, and recent trials in

3 Neither term is original, and each is meant to evoke traditions of thought and their general perspective, while not necessarily implying that the specific projects described were conscious of, or directly influenced by, those traditions. "Threat Modeling" has been an important methodology in cybersecurity for identifying, assessing, prioritizing, and mitigating threats and vulnerabilities since at least the early 2000s [14], while "Threat Perception" has been a key concept in international relations and political psychology in assessing military threats and deterrence strategies [15]. "Ethics of Care" has been gaining popularity in medical and educational ethics since its introduction by Carol Gilligan to explain moral development in child psychology in the late 1970s, and its extension by Nel Noddings into a moral theory based on interpersonal relationships of caregiving and receiving in the early 1980s [16].

4 In an analysis of 7000 sentencing cases in Broward County, Florida, over the period 2012–2013 that used the COMPAS software, journalists found similar error rates in the assessment and sentencing of white and black convicts, but diametrically opposed in their direction. White convicts were more likely to be erroneously predicted not to commit future crimes, while black convicts were more likely to be erroneously predicted to commit future crimes, resulting in shorter sentences for white convicts and longer sentences for black convicts [17]. Another study of the same dataset shows that amateur humans are able to make better predictions than the COMPAS software, using the same six factors as the software, and even better predictions can be made using just two factors — defendant's age and number of past convictions [18].

5 Such "heat maps" have become ubiquitous in the age of big data, and are even reproduced, albeit at lower resolution, on real estate websites such as Trulia.com [27].
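The two-factor result cited above — that defendant's age and number of past convictions alone can rival a complex risk tool — can be illustrated with a minimal sketch. The weights, normalization ranges, and threshold below are hypothetical, chosen only to show the structure of such a predictor; they are not the COMPAS model or the code from the studies cited:

```python
# Minimal sketch of a two-factor recidivism "predictor": a simple
# linear rule over defendant age and number of past convictions.
# All weights and cutoffs are hypothetical, for illustration only.

def two_factor_risk_score(age: int, prior_convictions: int) -> float:
    """Return a risk score in [0, 1]: younger defendants with more
    prior convictions score higher (hypothetical weighting)."""
    # Normalize age into [0, 1], treating 18-75 as the working range.
    age_factor = max(0.0, min(1.0, (75 - age) / (75 - 18)))
    # Saturate prior convictions at 10 before normalizing.
    prior_factor = min(prior_convictions, 10) / 10
    return 0.5 * age_factor + 0.5 * prior_factor

def predicted_to_reoffend(age: int, prior_convictions: int,
                          threshold: float = 0.5) -> bool:
    """Binarize the continuous score at a chosen threshold."""
    return two_factor_risk_score(age, prior_convictions) >= threshold

# A 20-year-old with 6 priors scores above the threshold;
# a 60-year-old with 1 prior scores below it.
young_many = predicted_to_reoffend(20, 6)
older_few = predicted_to_reoffend(60, 1)
```

Even this toy makes the ethical point concrete: the score encodes value choices — which factors count, how they are weighted, and where the threshold sits — none of which is dictated by the data itself.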

the application of AI techniques to those practices. Further, the algorithms of predictive policing embed values, and make designations and decisions with implicit ethical consequences.

The Ethics of Care approach has a history of its own as well, and is similar in some ways to concepts in related fields, including the "Duty to Protect" in policing [28] and the "Duty of Care" in law [29]. In contrast, the Models of Threat approach construes the world and the individuals within it as risks and threats which must be managed, mitigated, and eliminated. The later discussion section will consider what it means to implement the Ethics of Care approach, following the examples. First we give a brief sketch of each approach.

The Models of Threat approach begins from the assumption that the world can be classified into clear categories, i.e., threats and non-threats, and that this is the first step in choosing an appropriate action to take.6 It focuses on capturing and processing increasing amounts and types of data, and processing this data to provide increasingly accurate classifiers of what constitutes a threat, and predictors of the likelihood and risk from that threat. It largely assumes that the actions that will be taken to address threats and risks are independent of the observation, collection, and analysis of data. This approach also assumes that the primary values are in the accuracy, precision, fidelity, and comprehensiveness of the data model, and in the correctness of its classifications and the reliability of its predictions. This approach could also be characterized as taking a narrow view, being very detail-oriented, atomistic, and deeply analytic.

By contrast, the Ethics of Care approach is holistic, and takes a broad, big-picture view of the values and goals of systems design. It considers the interaction and interrelation between an action or intervention and the nature of classifying things and predicting outcomes within specific contexts. The goals and values of an Ethics of Care approach are to benefit everyone represented by the system, as well as those who use the system, and the society as a whole. The Ethics of Care approach recognizes the complexity of social relations and socio-technical systems, including the organization using the system, and does not expect more and better data simply to solve complex social and institutional problems, but rather to provide opportunities for finding better solutions, better actions, and better policies than those already considered.

The traditional notion of Ethics of Care is that interpersonal relationships form the basis for normativity, and should be guided by benevolence [16].7 When it comes to law enforcement, we can see the Models of Threat approach seeking to better identify violations of the law, and to predict when and where violations will occur, so as to better deploy police officers to respond. It might also aim to assist police in identifying perpetrators and bringing them to justice. The Ethics of Care approach might instead consider the factors that lead people to violate the law, and seek out new interventions that make crimes less likely, thus requiring fewer resources to enforce the law. It would also view the relationship between law enforcement and the community as primary, and consider how any new data tool might impact that relationship.

A Note on "Precrime"

Beyond the practical socio-technical meanings of predictive policing, there is also a deeply troubling connotation to the term, captured in the concept of "precrime." This notion is more philosophical in nature, and draws upon our concepts of guilt, responsibility, agency, causality, and their temporality, as well as the means and ultimate aims of law enforcement in the regulation of society. The term is also mentioned extensively by nearly every press article about predictive policing, and the commercial software startup PredPol, which supplies Los Angeles and many other police departments with data analysis software, states prominently on its "About" page that it is not selling "Minority Report" technology [30]. Yet the notion of precrime has powerful cultural meanings for good reasons beyond the popularity of sci-fi.

The basic idea of precrime stems from the view that the goal of policing is the reduction and, ultimately, the elimination of crime altogether. While investigating crimes after they occur and responding to crimes-in-action are good, it would be even better to prevent crimes before they happen, or so this line of thinking goes. This view tends to emphasize deterrence over other key elements of criminal justice — retribution and reformation. The goal is to disrupt or dissuade criminality before it manifests. While crime prevention could focus on eliminating the means of committing

6 This is not to say that the world, or its representation in a computational model, is necessarily discrete. One could represent the likelihood that an individual or area might present a threat or risk as a continuous variable. And while the scale and threshold for action on the basis of that variable might not be predetermined, or determined by the system, it is expected that such metrics will influence the decisions and actions of police officers with respect to those individuals and areas — i.e., that the threat or risk represented by the calculation can and should result in actions.

7 According to the Internet Encyclopedia of Philosophy, "Normatively, care ethics seeks to maintain relationships by contextualizing and promoting the wellbeing of caregivers and care receivers in a network of social relations. Most often defined as a practice or virtue rather than a theory as such, 'care' involves maintaining the world of, and meeting the needs of, our self and others. It builds on the motivation to care for those who are dependent and vulnerable, and it is inspired by both memories of being cared for and the idealizations of self. Following in the sentimentalist tradition of moral theory, care ethics affirms the importance of caring motivation, emotion and the body in moral deliberation, as well as reasoning from particulars." [16]

crimes,8 it more often focuses on motives, and as such employs psychological theories of choice and sociological theories of behavior, and generally focuses on maximizing the likelihood and cost of penalties for wrongdoing through stricter enforcement and harsher penalties.9 The temporality also becomes deeply problematic here. There is an obvious utility in preventing crimes before they occur, but our notions of individual responsibility, guilt, and punishment rest on the commission of acts — of actually doing certain things that constitute crimes — rather than imagining, desiring, or simply being psychologically pre-disposed or circumstantially inclined toward doing things which would be criminal. In some instances, planning or discussing criminal acts with others are acts that can themselves constitute a lesser crime, such as conspiracy or solicitation to commit a crime, and a failed attempt, e.g., to kill someone, can still constitute the crime of attempted murder even if nobody is actually hurt. But there are, and should be, different standards for citizens who have committed no crime, those in the act of committing a crime, those suspected of a crime, those convicted of a crime, and those who have served their sentences for a crime. How should law enforcement treat "those 'likely' to commit a crime"? And does the epistemic basis for that likelihood determination matter?

The classification of individuals also becomes critical here. When we say that an individual is "likely to commit a crime," is that based on their individual behavior and actions, or because of membership in a certain demographic group? "Profiling" becomes problematic in the latter case, when individuals are classified according to population-level statistics and biases. Statistics are notorious for not distinguishing correlations in data from causal reasons, and it would be unjust to treat people with suspicion for coincidental correlations when the underlying causal mechanisms for criminal behavior are absent. This kind of profiling becomes deeply problematic when it becomes prejudicial, and the correlation is taken as itself constitutive of guilt, or as warranting a presumption of guilt, rather than a presumption of innocence.10

According to the U.S. legal system, criminal liability and guilt depend upon a combination of actus reus (the "guilty act") and mens rea ("the guilty mind"). That is, one must actually commit the act for which one is held responsible, and one must have had in mind the intention, or at least the awareness, that one was doing something wrong, or should have known (as mere ignorance of the law is not a suitable defense). From this perspective, one cannot be guilty of a crime before actually committing the act, and should not be held liable for a crime not committed. And this is where precrime clashes with fundamental concepts of justice. If society, and police, act upon precrimes, and those suspected of them, in the same way as already-committed crimes, then they are treating as guilty, or at the very least as suspect, those who have not yet, and not actually, committed a crime. This is a profound form of prejudice, in which judgments are made not only before relevant evidence of a criminal act can be obtained and analyzed, but before such evidence can even exist. Rather, judgement is passed on information derived from statistical inference, patterns, trends, and probabilities. But a statistical likelihood of an event is neither an event nor an act.11 And it is fundamentally unjust to treat someone as guilty of a crime they did not commit. Moreover, it is powerfully felt as an injustice when individuals and communities are treated "as if" they are guilty of doing something they have not yet, or not individually, done, based simply on their being members of a category or demographic group. Indeed, the imposition of social categories can even give rise to new social identities [35] — and thus machine-generated categories are likely to create new types of people. This makes the creation and designation of a "criminal type" deeply problematic.

Still, there is a practical concern that law enforcement cannot ignore information about likely crimes without sacrificing their duty to prevent crime. While the scope and nature of that duty are themselves contested, this is a powerful intuition. Indeed, it is the same intuition that motivates much data-driven management. That is, if we can use historical data to predict future trends and events, and thus better allocate valuable resources towards fulfilling a mission or goal, then we should do so. While not incorrect — certainly better use of information can improve policing in many ways — if pursued without careful consideration, caution, and

8 For instance, adding better locks to protect property, such as ignition immobilizers on cars, or making it more difficult to resell stolen goods [31]. In some cases, increasing the policing of crimes may actually have counterintuitive effects of increasing crime, according to an economic analysis of the theft of art works [32].

9 Rarely do these approaches take into account the outright irrationality or the failure of individuals to actually think about committing crimes in rational terms. This is because cognition in the wild follows other lines of reason and risk assessment, from inflamed passions, to rational biases, to human necessity.

10 For example, if one is worried about a copycat bombing like the Boston Marathon bombing, it might make sense to flag individuals who shop for pressure cookers and backpacks. However, one should still presume there is a reasonable explanation for this rather than presuming they must be terrorists for doing so [32].

11 Just consider gambling on horse races, which historically gave rise to modern statistics [33]. Oddsmakers go to great lengths to provide accurate statistical predictions of the chances for each horse in a race. Yet whichever horse is the favorite to win does not necessarily win — the actual outcome of the race matters. The favorite only wins about 1/3 of the time [34]. Gambling would not make sense if this were not the case — though in many games of chance it can be argued that it is mathematically irrational to place bets at all.

JUNE 2019 ∕ IEEE TECHNOLOGY AND SOCIETY MAGAZINE 45


d licensed use limited to: CENTRE FOR DEVELOPMENT OF ADVANCED COMPUTING - CDAC - NOIDA. Downloaded on July 21,2020 at 16:58:13 UTC from IEEE Xplore. Restrictio
Unfortunately, the strength of this intuition and its simple logic make it an easy policy argument to make in many institutional and bureaucratic settings. One might even argue that this is the “default” policy argument in the age of data, and thus Models of Threat is the default approach to predictive policing. And it is safe to assume that, without critical reflection and active awareness on the part of systems designers, something similar will be the likely default goal of most AI systems. To better understand how the design of systems can mitigate or exacerbate the problems inherent in data-driven management, we now turn to two examples of predictive policing.

One City, Two Cases of Predictive Policing
The City of Chicago, IL, has seen a spike in gun violence in recent years. The city has led the United States in the number of shootings and gun homicides, peaking with 758 total homicides and more than 4300 shootings in 2016, and down slightly in 2017 [36]. This has led to a serious effort by the Chicago Police Department (CPD) to address this spike by focusing on the neighborhoods and individuals most likely to become involved in gun violence. A number of studies, experiments, and policies have been tested and implemented in recent years. By comparing different applications of data-driven interventions occurring in the same city in the same time period, we can develop insights into the implications of data for shaping policing practices.

Two such experiments, in particular, offer a good insight into the ways in which data can be applied to address gun violence, and also into the ways that the implementation and utilization of those insights can have radically different social and ethical implications. One, called the Strategic Subjects List, has been the subject of critical scrutiny by journalists and researchers. More often called the “heat list” by police officers, it was first used by CPD in 2012, and its use continues, though under a revised set of guidelines following criticism of the early uses described here. The other started in the summer of 2011 as a pilot research program implemented by the City of Chicago, and was studied the following year by University of Chicago researchers. Called One Summer, it has since been adopted as an annual program by the City of Chicago. While both started out as academic research projects, were analyzed by outside researchers in 2012, and utilized data to assess and identify youth at risk of becoming involved in gun violence, in most other ways the two programs are very different.

The two projects can best be characterized as illustrative case studies, embodying two different philosophies of predictive policing, and perhaps two extremes thereof. They accordingly have very different ways of thinking about what being an “at-risk” youth means, and consequently pursue very different approaches to intervening so as to reduce that risk. More importantly, they also had very different outcomes in terms of their effectiveness in reducing gun violence and in influencing the life outcomes for those identified as “at-risk” in each program. In short, the Strategic Subjects List can be described as taking a “Models of Threat” approach to at-risk youth. That is, at-risk youth in that project are primarily viewed as threats to the community because they are at-risk, and interventions are targeted at increased police scrutiny and enforcement against those individuals. The One Summer program, by contrast, takes an “Ethics of Care” approach to at-risk youth, in which such youth are given access to social services and resources aimed at reducing their risks of becoming involved in violence.12 Like their philosophies, their outcomes were also dramatically different, despite resting on similar data-driven assessments of being “at-risk.”

The Heat List
The Strategic Subject List (SSL) algorithm was developed as an experiment by a researcher at the Illinois Institute of Technology, and was utilized by CPD starting in 2012 and continuing to the present. In its early iterations and implementations, it took data about individuals from CPD arrest records, taking into account some 48 factors, including number of arrests, convictions, drug arrests, gang affiliations, and being the victim of crimes or violence [38]. The SSL then went further, taking into account these factors for the individual’s social network, as determined by who was arrested together with an individual [39]. These factors were weighted and compiled into an overall SSL score from 1–500. The initial implementation contained over 398 000 individuals drawn from police arrest records, and identified 1400 as being at “high-risk” of being involved in violence. While some 258 received the top score of 500 points, only 48% of these had previously been arrested for a gun crime, and many people on the list had never themselves been arrested, but rather were victims or were in the social networks of victims or perpetrators [39]. Many police officers reported that they were not fully informed of how the list was compiled. They assumed, or were led to believe, that everyone on the list was a perpetrator of violence and was likely to commit more violence, whereas the SSL scores combined those at risk of being victims with those at risk of being perpetrators in a single metric of “being involved in violence.”

12 The slogan of the One Summer program is “Nothing Stops a Bullet Like a Job” [37].
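The scoring scheme described above, in which dozens of weighted factors are compiled into a single score from 1 to 500, can be sketched in a few lines of Python. The factor names, weights, and saturation point below are invented for illustration; the actual SSL factors and their weights were not public for most of the program's life.

```python
# Hypothetical sketch of a weighted risk score in the style described
# above. All factor names, weights, and the saturation point are
# invented; the real SSL used some 48 factors with undisclosed weights.

def risk_score(person, weights, max_raw):
    """Weighted sum of a person's factors, rescaled to the range 1-500."""
    raw = sum(weight * person.get(factor, 0)
              for factor, weight in weights.items())
    raw = max(0.0, min(raw, max_raw))  # clamp before rescaling
    return 1 + round(499 * raw / max_raw)

WEIGHTS = {
    "arrests": 2.0,             # prior arrests
    "violent_arrests": 4.0,     # prior arrests involving violence
    "victimizations": 3.0,      # times the person was a victim
    "co_arrest_violence": 2.5,  # violence among co-arrestees (network)
}
MAX_RAW = 50.0  # assumed raw value at which the score saturates at 500

person = {"arrests": 3, "violent_arrests": 1,
          "victimizations": 2, "co_arrest_violence": 2}
print(risk_score(person, WEIGHTS, MAX_RAW))  # -> 211
```

Even in this toy version, the conflation criticized above is visible: victimizations and arrests raise the same score, so a high number cannot by itself distinguish a likely victim from a likely perpetrator.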
The practical use of the SSL list and scores was somewhat haphazard in its early years.13 While there was no official policy regarding its use, it did feature in some CompStat reports [40], and was used by police officers in some more controversial ways. The first of these, called “custom notification,” involved police officers making personal visits to high-risk individuals, informing them of their presence on the list, and further, informing them that they would be subjected to additional police scrutiny [41]. In other words, they were told that the police were “watching them” more carefully, and they should expect more police encounters. The other, and more common, use of the SSL was as a “heat list” following a violent crime, in order to round up the “usual suspects” from the list for questioning — in this case, people in the vicinity of the crime who had high scores on the list. As a result, people on the list were far more likely to be detained and arrested by police, simply for being on the list. A detailed RAND study showed that the use of the heat list in this way had no statistical impact on the likelihood of individuals on the list being involved in gun violence, nor on the overall gun violence in their communities [42]. It did, however, radically increase the likelihood of being arrested and convicted of a crime for those people on the list.

The Ethics of Care approach might instead consider factors that lead people to violate the law, and seek out new interventions that make crimes less likely.

Further, the data and algorithm behind the SSL were not shared publicly, making it difficult to determine whether the list simply replicated long-standing racial and class discrimination. The CPD told the Chicago Tribune that,

“[The SSL] is not based on race, ethnicity or geographic location… We don’t use it to target certain individuals other than we pay a visit to their residence to offer them services to get out of the (gang).”

But a California-based group that defends civil liberties in the digital world raised concern that the arrest data that goes into the SSL could be inherently biased against African-Americans and other minorities:

“Until they show us the algorithm and the exhaustive factors of what goes into the algorithm, the public should be concerned about whether the program further replicates racial disparities in the criminal justice system,”

said Adam Schwartz, a staff attorney for the Electronic Frontier Foundation [41].

That same Chicago Tribune article indicates that 85% of the 2100 shooting victims so far that year had been on the SSL, but does not indicate how they scored, or whether they were all in the list of 1400 high-risk individuals or the longer list of 398 000 individuals included in the dataset.

Both of the main applications of the SSL, the “custom notification” warnings and using the “heat list” to bring people in for questioning, contain elements of precrime. In the warnings, there is a sense in which the police still cannot arrest an individual before a crime, but they do attempt to intimidate and threaten an individual who, in the majority of cases, has never been arrested for a violent crime. While the police do offer to “help individuals to leave gangs,” it is not clear what specific services they offered, or whether those services are effective in either helping individuals get out of gangs or in avoiding future violence. Similarly, rounding up people in the area who appear on the “heat list” may be an expedient tool, but it is no substitute for doing the policework of a real investigation, or following leads from witnesses and suspects. Indeed, it may impede or undermine community-oriented policing strategies. While police may complain that witnesses, and even victims, are often unwilling to cooperate with police, these heavy-handed tactics of rounding up suspects based on data-driven lists only further break down trust between communities and the police. As such, these uses of the SSL actually work against confidence-building efforts by police, while offering little or no demonstrative positive results [42], [43].

Both applications also appear to engage in victim-blaming. In some cases literally so, insofar as the SSL combines victims and perpetrators in a single category of “being a party to violence” or being at risk of being “involved in violence.” It makes little sense to show up at someone’s door to tell them that they may be the victims of violence,14 and less sense to threaten them with increased surveillance, or to round them up for questioning after a violent crime.

13 It is also worth noting that the SSL, and the data and algorithms upon which it was based, was kept private by the CPD. It was only after a long legal battle that the Chicago Sun-Times newspaper was able to force the CPD to make the SSL and its data public [39].

14 Making someone aware of a specific threat against them would be helpful, but people are usually aware of the fact that they live in a violent neighborhood. Nonspecific warnings are of little help, as has been seen with color-coded threat risks from the Department of Homeland Security, which do not specify any particular location or type of activity to be on the lookout for.
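As the EFF attorney's comment above suggests, the most basic check outside groups would want to run is whether such a list touches different demographic groups at very different rates. A minimal audit sketch follows; the group labels, counts, and the 0.8 threshold are illustrative assumptions, and a real audit would require the actual list, arrest records, and population data.

```python
# A minimal disparity audit on synthetic records: compare the rate at
# which different demographic groups end up on a watch list. All data
# here is invented for illustration.

from collections import defaultdict

def selection_rates(records):
    """records: iterable of (group, on_list) pairs -> rate per group."""
    listed, total = defaultdict(int), defaultdict(int)
    for group, on_list in records:
        total[group] += 1
        listed[group] += int(on_list)
    return {g: listed[g] / total[g] for g in total}

def rate_ratios(rates, reference_group):
    """Each group's selection rate relative to a reference group.
    Ratios far from 1.0 flag a disparity worth investigating (the
    'four-fifths rule' in employment law uses 0.8 as a rough line)."""
    ref = rates[reference_group]
    return {g: r / ref for g, r in rates.items()}

records = ([("A", True)] * 30 + [("A", False)] * 70 +
           [("B", True)] * 10 + [("B", False)] * 90)
rates = selection_rates(records)
print(rate_ratios(rates, "B"))  # group A is listed at 3x group B's rate
```

A check like this says nothing about why the rates differ — as the article argues, biased arrest records can produce disparate lists even from a facially neutral algorithm — but it is the kind of transparency that secrecy about the data and algorithm forecloses.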
Detailed analysis of the effects of these practices bears out the futility of these interventions. Accordingly, this approach can best be characterized as “Models of Threat.” Individuals on the SSL are seen as threats, and are themselves threatened and subjected to additional police attention, and are much more likely to be questioned and arrested. Indeed, from a crime statistics perspective, the success of a police department rests on the number of violent crimes, and many gun crimes are the result of and/or give rise to retaliation, so it makes sense to combine the victims and perpetrators of violence in a single metric. In other words, individuals likely to be involved in violence are a “threat” to the department’s CompStat numbers, regardless of whether they are victims or perpetrators. Thus, in a Models of Threat approach, even a victim is viewed as a “threat.” Yet, in any commonsense approach to violence, there should be a difference in how one approaches or intervenes with an individual who is likely to be a victim versus someone likely to be a perpetrator.15 It would be difficult to argue this approach has improved policing — for instance, by making police work more efficient according to its own metrics — when it has been proven to have no effect on violent crime on either an individual or community level. And while conflating victims and perpetrators is poor data practice, it is not clear that “getting the data right” would actually improve the results of the SSL. It is hoped that an AI ethics would be able to avoid such ineffectual and counterproductive applications. But to do so, it must look beyond the numbers and datasets, to understand how data and information systems are embedded in communities and policing practices.

Nothing Stops a Bullet Like a Job
The Ethics of Care approach offers a stark contrast to the Models of Threat. One Summer started as a pilot program in the summer of 2011 by the City of Chicago. In 2012 it became part of a controlled study (One Summer Plus) by researchers at the University of Chicago Crime Lab. The basic idea was to intervene with at-risk youth by providing them with summer jobs, for 8 weeks and 25 hours a week at minimum wage, mostly working for organizations focused on their local communities. According to the City’s press release about the program, “at-risk” was defined by a combination of attending an at-risk school and a review of individual applications, as follows.

More than 700 youth ages 14–21 were selected to participate in One Summer Plus in 2012 from an open application process available at thirteen Chicago public schools located in high-violence and low-income neighborhoods. Applicants faced a number of challenges; the year before they entered the program, they had missed an average of six weeks of school and about 20 percent had been arrested [44].

As a data-driven technique, it was largely the schools that were identified through historical data. While the methodology used to identify the 13 schools is not discussed in detail, presumably it was based on the geographic location of historical incidence of violence, and the proximity of those schools to violent areas, in combination with demographic income data. But it is important to note that individual students were initially identified only by virtue of attending a designated school. The accepted applicants may have been further screened for factors such as school attendance, previous arrests, or other factors. But it is worth noting that this was not a highly sophisticated data-driven technique for identifying which individual youth were “at-risk.” As far as the program was concerned, anyone living in a low-income, high-violence area was “at-risk,” and more detailed or nuanced classifications were not essential to participation or effectiveness.

Researchers studying One Summer found a 51% reduction in involvement in violence-related arrests among youth who participated in the program compared to the control group that did not participate.16 Their analysis of the data from the initial study, and of subsequent years, demonstrates that this was not simply the result of getting them off the streets for 25 hours per week, but that there were significant changes in their cognitive and behavioral approaches to school, work, and becoming involved in violence [46]. Much of this was attributed to improved impulse control, learned both through their employment and through training sessions they received as part of the program. There were also economic benefits resulting from the additional income received by the participants and their families, and participants were much more likely to seek and get jobs after participating in the program.

The One Summer program provides a good illustration of an Ethics of Care approach insofar as it focuses on the contextual manifestations of violence, and seeks a means of directly intervening to change that context. Rather than focusing on the metric or individual “threat,” Ethics of Care focuses on the system.

15 The assumption made by researchers in doing this appears to be that there is significant overlap in the categories of victims and perpetrators. This is especially true given the cyclical nature of gun violence in Chicago, driven by rivalries and revenge killings that beget further revenge killings. Still, associating with people connected to violence might make you more likely to become a victim of violence without becoming more likely to commit violence.

16 Subsequent research places the figure at a 43% reduction in violent arrests [45].
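The headline figures for One Summer reduce to a comparison of arrest rates between the program group and the control group. A sketch with invented counts follows; the published studies [44]-[46] use the actual randomized-trial data and proper statistical inference, not this simple point estimate.

```python
# Sketch of a relative-reduction calculation of the kind behind
# figures like "a 51% reduction in violence-related arrests."
# The counts below are hypothetical.

def relative_reduction(treated_arrests, treated_n,
                       control_arrests, control_n):
    """Fraction by which the treated group's arrest rate falls
    below the control group's rate."""
    treated_rate = treated_arrests / treated_n
    control_rate = control_arrests / control_n
    return 1 - treated_rate / control_rate

# Hypothetical: 7 arrests among 350 participants vs. 14 among 350 controls.
print(relative_reduction(7, 350, 14, 350))  # -> 0.5, i.e., a 50% reduction
```

Note that the number itself says nothing about mechanism; as the researchers' follow-up analysis shows, establishing why the reduction occurred (impulse control, income, time off the streets) requires looking well beyond the rates.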
Ethics of Care also starts from respecting people, and maintains a focus on the duties and responsibilities to the individuals it deals with. By contrast, a Models of Threat approach sees people as statistics, and treats the individuals on a list as threats, whether they have done anything or not, and regardless of whether they are victims or perpetrators — thereby undermining their humanity. Ethics of Care sees the individual as having rights and deserving of respect, and sees those at risk as being in need of care. An Ethics of Care does not disregard data, but rather utilizes data in the service of performing a duty in a manner that respects everyone involved. That respect extends to taking the effort and care to understand a situation from multiple perspectives, including those of citizens and working police — and how data gets used and how it relates to the lived world. Indeed, as the RAND researcher who studied the SSL says, data and AI ethics is less about sophisticated data analysis techniques and more about understanding context:

The biggest issue for those agencies considering predictive policing is not the statistical model or tool used to make forecasts. Getting predictions that are somewhat reasonable in identifying where or who is at greater risk of crime is fairly easy. Instead, agencies should be most concerned about what they plan to do as a result [47].

There is a deeper lesson in this observation — the possibility of action, and the types of interventions envisioned, can strongly shape data representations, and the value of various kinds of data. While the current fashion is to collect any and all available data, in the hope that something useful might be inferable from it, there is still value in considering what actions are available to address a problem. This also means using data to find new means of acting and intervening, and better understanding the problem, rather than simply making the current means of addressing a problem more efficient. Indeed, many AI ethicists concerned about AGI worry that a hyper-efficient AGI might be so good at achieving a set goal, or maximizing a certain value, that it does so to the great detriment of other human values.17 In the case of policing, many current policies and tactical goals of policing could be dangerous, unjust, and counterproductive if executed with complete accuracy and efficiency. And most people would not be happy living in a society where every violation of the law was detected and punished strictly and with perfect efficiency. At the least, this would require rethinking many laws, policies, and punishments [48]. In order to better appreciate how actions and practice could or should shape data, particularly for AI ethics, we turn now to a discussion of what a framework for AI ethics drawn from an Ethics of Care would look like.

AI Ethics of Care: From Data to Models to Implementation
The Ethics of Care has its own history, coming out of feminist thought. As a general normative theory, it has been criticized for failing to question what is right to do, in favor of seeking what is best to do in the circumstances. But as an approach to practical applied ethics, it has proven illuminating in areas such as educational and healthcare ethics [49], [50]. It is proposed that policing, like education and healthcare, aims to “serve and protect” the community with limited resources,18 and as such is also a good candidate for an Ethics of Care. It is further proposed that, in trying to improve the management of a broad variety of governmental, non-profit, and commercial organizations with data-driven techniques, AI ethics can also draw upon the Ethics of Care, as robot ethics has done [53]. In this section we look at how an Ethics of Care can be applied to data science and AI, from data collection, to data modeling, to data-driven policies and actions, drawing upon practical examples from data-driven policing.

17 Nick Bostrom’s infamous paperclip maximizer, which quickly and efficiently turns the world into paperclips at the expense of everyone and everything else, is an example of this.

18 The motto of the Los Angeles Police Department, “To Protect and To Serve,” was introduced in 1955 following a contest at their police academy, won by Officer Joseph S. Dorobek [28]. It, and its variants, have since been adopted as the motto of numerous police departments across the United States. But what do these words really mean? The topic has been much discussed within police departments. In 1998, an Ohio police officer offered his views in Police Magazine:

While what constitutes “protect” may be open to some debate, it seems to be more clear-cut than does the word “serve.” It’s obvious that we protect the citizens and their property from the criminal element. The word “serve” on the other hand is somewhat ambiguous. What “to serve” may mean to one law enforcement agency it may mean quite the opposite to another. “To serve” also takes on a different meaning depending upon department size. For example, I know a chief in a small village not far from the city where I work. He recently had a call to “assist the woman.” We all get these types of calls, but his was to assist the woman in re-hanging her draperies! To serve? Is that what people want? A tax supported drapery service? [51].

There are two striking aspects to this passage and the article, which also seems representative of the views of many police officers, and much of the public. The first striking aspect is the extent to which “service” is framed as a question of resources. Of course, the police are public servants, as are other agents and officers of government. But they also have a specific function, and should have priorities within that function. Indeed, the rest of the article is devoted to discussing the way nonemergency calls are overloading 9-1-1 operators and keeping police from getting to real emergencies. “In many small cities, the police are the only visible and accessible arm of the local government available after 5 p.m. and on weekends. Because of that we become the water department, the street department, the dog warden, etc. — and people begin to expect it from us.” [51].

Of course, the “public” within the concept of public servant should be understood to include everyone in the community, not just “citizens” or “taxpayers” or even just “law abiding” people. Police have a duty to serve everyone, including the “criminal element.”

Following several court and Supreme Court decisions in the United States, there is now a legal precedent that police do not have a specific legal duty to protect, or even to enforce the law or court orders. At least in terms of having a duty to lend aid or to protect a particular individual, a police officer is not compelled by the law to intervene, put themselves at risk, or act to enforce applicable laws. The court has upheld the discretion of police to decide when and where to enforce the law or protect individuals from danger [52].
Predictive policing, as the application of AI techniques to policing data, has its roots in much older practices of collecting crime data. Yet it also has the potential to draw upon data from other sources in increasingly networked police departments, and increasingly digitally surveilled communities. Ethical questions arise at almost every stage of data collection and analysis, from where data is collected and sensors are placed, to how data is encoded, to existing biases in segregated communities and policing practices, to the ways data is used in police management and police encounters with the public. For building a more general approach to AI ethics, it is useful to separate these problems out and identify the key ethical issues, and how AI researchers and system designers might think about and address them.

Data: From CompStat to Critical Data Science
Information and communication technologies (ICT) have long been central to policing. From the keeping of criminal records and crime statistics and their collection in databases, to the use of police boxes, telephones, radio dispatching, and 9-1-1 emergency call centers, many ICT technologies have become as closely associated with policing as badges and handcuffs. Initially, these technologies were analog — paper records, photographs and inked fingerprints, dedicated police telephone boxes, and wireless radios. With the computerization of businesses and government agencies from the 1960s to 1990s, many aspects of police work also became digitized and computerized. Police patrol cars began getting computers in the early 1980s, which allowed officers to check vehicle license plates, and eventually to check individuals for outstanding warrants. The transition from paper to digital records for crime reports soon led to interest in compiling crime statistics at a local level for use in guiding the management of patrols and policing priorities. CompStat, short for Comparative Statistics, was the result. Initially adopted by the New York City police department in 1995, similar practices have since been adopted across the country, especially in large urban departments.

CompStat as a mere data gathering and management practice has not been without its critics. In 2010, John Eterno and Eli Silverman, retired New York police captains turned university professor and criminology professor, respectively, published a book-length criticism of CompStat practices in the NYPD [54]. The book argues that there was widespread misreporting of crimes across NYPD precincts, which took the form of downgrading the seriousness of reported crimes in an effort to show annual improvements in serious crime statistics. They argued that this systematic downgrading of crime statistics was the result of pressure from police leadership and administration. They further argued that pressures to increase police stops, especially in the era of “stop and frisk” in New York City, were highly racially discriminatory. The book caused enough controversy and embarrassment for the NYPD that the Police Commissioner ordered an independent study to review CompStat [55]. That review did indeed find serious systemic reporting errors. It did not, however, find evidence that this was the result of administrative pressure, though the review did not investigate that point exhaustively, nor did it seriously assess systemic racism within CompStat’s data collection practices.

What emerges from the investigations and reports into CompStat, from a data science and AI ethics perspective, is the susceptibility of data to political and bureaucratic pressure. While it may be convenient to assume that a given dataset offers an accurate representation of the world, this should not be taken for granted. In this case there were widespread and systematic errors in the reported data. If that data were to be used by predictive policing algorithms, those errors could have a significant impact on policing practices. And if that data is indeed racially biased, as it most likely is, it could further bias policing practices. But without an awareness of these issues, and of the potential for inaccurate data or latent bias within data, the designers of those AI algorithms may be creating garbage-in-garbage-out systems, while believing that they are producing quality systems (as measured by their available data). The lesson for AI ethics is to never take for granted the accuracy of given data, but to be suspicious; to seek out likely ways in which political, economic, or social pressures may have influenced historical datasets; to consider how those pressures may be shaping current data collection practices; and to be sensitive to the ways in which new data practices may transform social practices, and how that relates to the communities and individuals a system aims to care for.

With the growing popularity of AI, and increasing concerns about its impact on society, universities and professional organizations have recognized the problem and taken up the challenge of teaching ethics to the next generation of AI designers. Today, many undergraduate and graduate programs teaching AI include ethical training, but its adoption has been uneven and more could be done. Many online and professional training programs still lack critical design and ethical thinking, in favor of teaching the latest techniques and tools over good design. Professional organizations including IEEE, ACM, and AAAI have also led initiatives to develop ethical standards and codes of ethics, and organize a growing number of conferences and workshops on AI ethics. These are all positive developments, and it is hoped that this paper will contribute to the discussion of the ethical design of AI, especially as AI comes to be applied in an increasing number of socially significant and ethically consequential decisions.
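The CompStat episode above is, at bottom, a measurement problem: if some fraction of serious crimes is recorded as minor, then recorded counts systematically understate serious crime, and any statistic or model built on the recorded data inherits the error. A toy simulation of this effect follows; the 20% downgrade rate is an invented illustration, not an estimate of actual NYPD misreporting.

```python
# Toy simulation of systematic downgrading: each "true" serious crime
# is recorded as minor with some probability, so the recorded serious
# count understates the true count. The downgrade rate is invented.

import random

def record_crimes(true_serious, true_minor, downgrade_rate, rng):
    """Return (recorded_serious, recorded_minor) after downgrading."""
    recorded_serious, recorded_minor = 0, true_minor
    for _ in range(true_serious):
        if rng.random() < downgrade_rate:
            recorded_minor += 1   # serious crime logged as a minor one
        else:
            recorded_serious += 1
    return recorded_serious, recorded_minor

serious, minor = record_crimes(1000, 2000, 0.2, random.Random(0))
print(serious, minor)  # roughly 800 and 2200: a ~20% undercount of serious crime
```

A predictor trained on the recorded counts would see a flattering downward "trend" in serious crime that reflects reporting pressure, not the world — exactly the garbage-in-garbage-out risk the section describes.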
this paper will contribute to the discussion of the ethical design of AI, especially as AI comes to be applied in an increasing number of socially significant and ethically consequential decisions.

While not every AI system developer can become an expert in the application domain of their techniques, the basics of critical data analysis should be taught alongside statistical and machine learning techniques. In particular, system designers should be adept at recognizing the necessary characteristics of an adequate dataset, and what can and cannot be reasonably drawn from a given dataset. In many cases, only domain experts will have the kind of cultural knowledge to identify exogenous influences. This fact supports a systems design approach that includes domain experts as well as critical social scientists as members of design teams, and recognizes and respects the necessity of their expertise in shaping the ultimate system design [56].

Models Matter

A dataset on its own is just a collection of numbers delimited by some kind of file structure. Even decisions as to how to represent a data field with a number — binary, integer, real, pointer, formula — can have consequences for how that data gets processed. Numbers are abstract values, which are then represented by digital numerals within computational systems. How they are numerically represented can matter. But often it is far more important how we choose to represent the world through numbers. Even when we are simply “counting” things in the world, we are also engaged in processes of classification and categorization. The data “model” that a system employs involves myriad representational choices, and seeks to serve various purposes [57].

The most obvious case in law enforcement is to characterize the law, and represent violations of the law. But there are many possible computational models of any given set of legal rules and codes, and they may not always represent the same mappings of events in the world as computational encodings.

Consider the case of CompStat crime underreporting discussed above. We could look to New York Penal Law §155.05 and §155.25 for a definition of “Petit Larceny,” which is theft or withholding of property valued at less than $1000 (and not a firearm, automobile, or credit card) [58]. What if a bike has been stolen, which cost a little more than $1000 when it was new, but it is used and would likely not sell for that much, nor would an insurance company compensate its loss for more than $1000? Determining the appropriate crime requires estimating the value of the property. This is a non-trivial categorization — an auction might determine the current market value, or a bike sales expert might be able to give an appraisal, but these may not agree on the price, nor be available means for a law enforcement officer. To some extent there is discretion on the part of law enforcement, prosecutors, and judges as to how to appraise and categorize such a crime — and they may take factors into account other than the strict value of the property. But once categorized, that discretionary nature tends to be erased — the crime becomes defined through its given category, documented and entered into data collection systems. AI systems designers need to be sensitive to these types of processes. Indeed, understanding data collection and critical data representation issues should be integral to computer and information science education. Taking care in the design of AI means being able to determine what an adequate dataset is, and being able to think critically about how to define it, and what the implications of various choices of categorization are. How best to do this, in general, is a matter for further research.

Putting AI Into Practice

The discussion so far has focused on input — how data is structured and collected. But the presentation of data analysis, and its impact on individual and institutional practices, must also be taken into account. A good example of such an issue can be seen in the use of the SSL by Chicago police. In principle, the SSL could have been used to recruit youth for the One Summer program. The choice by precincts and officers to use the list for “custom notification” and for “heat lists” following crimes is not disconnected from the design of a system like the SSL. While data scientists and software engineers may wish to wash their hands of responsibility for how officers actually use their tools, they cannot. At the very least this constitutes a sort of negligence and a failure to warn.

Many officers were not properly or fully informed of how the list was put together, and held mistaken and problematic understandings of what it was and how it worked. The officers also lacked training, guidance, and direction on how to use the system, if indeed there ever was a comprehensive plan as to how to deploy and use the system. These factors surely contributed to its misuse, and all but guaranteed its ineffectual use.

An Ethics of Care approach ought to ensure that the operators of AI systems, and the users of the data they generate, are aware of the scope and limitations of those systems. It may be too much to expect them to fully understand the computational techniques — indeed, even AI experts may find the performance of certain machine learning systems inscrutable. But this does not mean that people who use these systems can be ignorant of what the system can and cannot do, how reliable it is, and what its limitations in representing the world are.

Designers also need to be aware of the context in which AI systems will be deployed and used.
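The bike-theft example above can be sketched as a short program. This is an illustrative sketch only: the `categorize_theft` helper, the category labels, and the appraisal figures are hypothetical, and the $1000 threshold is a simplified reading of the statute cited above. The point it makes is that several defensible appraisals of the same stolen property can straddle the statutory line, while the record entered into a data collection system retains only the final category, not the discretionary judgment behind it.

```python
from decimal import Decimal

# Simplified threshold loosely following the petit larceny definition
# discussed above (property valued at less than $1000); illustrative only.
LARCENY_THRESHOLD = Decimal("1000")

def categorize_theft(appraised_value: Decimal) -> str:
    """Collapse a continuous, contested value estimate into a legal category."""
    if appraised_value < LARCENY_THRESHOLD:
        return "petit_larceny"
    return "grand_larceny"

# Three plausible appraisals of the same used bike (hypothetical figures):
appraisals = {
    "original_retail": Decimal("1099.00"),
    "used_market": Decimal("850.00"),
    "insurance_payout": Decimal("999.99"),
}

for source, value in appraisals.items():
    # The stored record keeps only the category; which appraisal was
    # used, and why, is erased once the crime is encoded.
    print(f"{source}: {categorize_theft(value)}")
```

Note that the sketch uses `Decimal` rather than a float, so the threshold comparison is exact; this is one small instance of the earlier point that how a value is numerically represented can matter for how the data gets processed.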

It should not be hard to predict what police might do with a “heat list,” if one has a realistic sense of police work and the pressures operating within precincts and departments. This again points to the need for domain experts and participatory design [56]. One imagines that a police sergeant on the design team of the SSL would have pointed out the likely misuses of the system. Prototyping and testing could also help reveal such tendencies, as well as short-term and long-term evaluations of the system implementation.

Transparency over the algorithms, data, and practices of implementation is also necessary. While the Chicago Police Department sought to avoid the embarrassment of releasing the details of the SSL, it would be impossible for independent outside researchers to evaluate its impacts — positive and negative — without access to the data and algorithms. It should not take a prolonged lawsuit from a newspaper for government agencies to share public data. Of course, as more and more commercial systems, like PredPol,19 make the algorithms and even the data proprietary, they will fall under intellectual property protections. This means private companies will be processing the data, and will not be required to reveal their algorithms, or subject them to independent outside scrutiny. In some cases, private companies are even withholding crime data from the cities that produced it because they have formatted it in a database for their system, and even encrypted it, such that it cannot be used if the city changes to another software platform [59].

19 PredPol is a commercial software company developing data management and predictive data systems for police departments [30].

Central Issues Facing AI Predictive Policing

It is hoped that this article has shed light upon some of the central issues facing AI ethics in general and predictive policing in particular. While the use of data and AI in policing is not intrinsically or necessarily unethical, it must be done with care to avoid unjust and unethical impacts. First among these issues is that while AI ethics needs to understand the computational techniques it deploys, it also needs a critical understanding of the datasets it operates on, how data is collected, and the social organizations and biases that those datasets may represent. This requires understanding how data practices are embedded within socio-technical systems, and not blindly analyzing data assuming that it is without bias. It is also important to understand how the use of AI tools and techniques will impact the beliefs and practices of those who engage with them. Datasets and their computational analysis have the power to “make up people” in the sense of Hacking [36], and also to prejudge them according to statistical patterns and categories. Even when statistically justified, such categories, and the actions of government agents on the basis of those categories, may disrespect individual rights and human dignity, and undermine justice.

By taking an Ethics of Care approach to AI systems design and ethics, designers should have a greater awareness of and respect for these issues. While any design approach is ultimately limited in its ability to mitigate all possible failures and harms, an Ethics of Care can help mitigate the most significant and widespread flaws in AI systems that will impact people’s lives in consequential ways. An AI Ethics of Care has the potential to apply to areas far beyond predictive policing, and can inform many applications of AI for consequential decisions.

Acknowledgment

This work was supported in part by a Beneficial AI research grant from the Future of Life Institute.

Author Information

Peter Asaro is Associate Professor and Director of Graduate Studies in the School of Media Studies at The New School, New York, NY. He is also Visiting Professor at the Munich Center for Technology in Society at TU Munich, and Affiliate Scholar at Stanford Law School’s Center for Internet and Society. Email: [email protected].

References

[1] P. Miller and T. O’Leary, “Accounting, ‘economic citizenship’ and the spatial reordering of manufacture,” Accounting, Organizations and Society, vol. 19, no. 1, pp. 15-43, 1994.
[2] S. Zuboff, In the Age of the Smart Machine: The Future of Work and Power. Basic, 1988.
[3] L. Winner, Autonomous Technology. Cambridge, MA: M.I.T. Press, 1977.
[4] P. Asaro and W. Wallach, “An introduction to machine ethics and robot ethics,” in Machine Ethics and Robot Ethics (The Library of Essays on the Ethics of Emerging Technologies), W. Wallach and P. Asaro, Eds. Routledge, 2017; http://peterasaro.org/writing/WALLACH%20ASARO%20(Machine%20Ethics%20Robot%20Ethics)%20.pdf.
[5] P. Asaro, “What should we want from a robot ethic?,” Int. Rev. Information Ethics, vol. 6, no. 12, pp. 9-16, 2006.
[6] D.K. Citron, “Technological due process,” University of Maryland Legal Studies, Res. Pap. no. 2007-26; Washington Univ. Law Rev., vol. 85, pp. 1249-1313, 2007; https://ssrn.com/abstract=1012360.
[7] F. Pasquale, The Black Box Society. Cambridge, MA: Harvard Univ. Press, 2015.
[8] A. Selbst and S. Barocas, “Regulating inscrutable systems,” presented at WeRobot 2017, 2017; http://www.werobot2017.com/wp-content/uploads/2017/03/Selbst-and-Barocas-Regulating-Inscrutable-Systems-1.pdf.
[9] R. Caplan, J. Donovan, L. Hanson, and J. Matthews, “Algorithmic accountability: A primer,” Data & Society Tech. Rep., Apr. 18, 2018; https://datasociety.net/output/algorithmic-accountability-a-primer/.
[10] V. Eubanks, Automating Inequality: How High-tech Tools Profile, Police and Punish the Poor. St. Martin’s, 2017.
[11] S.U. Noble, Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NY: New York Univ. Press, 2018.
[12] C. O’Neil, Weapons of Math Destruction. Crown Random House, 2017.

[13] J. Powles and H. Nissenbaum, “The seductive diversion of ‘solving’ bias in artificial intelligence,” Medium, Dec. 7, 2018; https://medium.com/s/story/the-seductive-diversion-of-solving-bias-in-artificial-intelligence-890df5e5ef53.
[14] “Threat model,” Wikipedia, https://en.wikipedia.org/wiki/Threat_model, accessed May 2019.
[15] J.G. Stein, “Threat perception in international relations,” in The Oxford Handbook of Political Psychology, 2nd ed., L. Huddy, D.O. Sears, and J.S. Levy, Eds. Oxford, U.K.: Oxford Univ. Press, 2013.
[16] M. Sander-Staudt, “Care ethics,” Internet Encyclopedia of Philosophy, https://www.iep.utm.edu/care-eth/, accessed May 2019.
[17] J. Angwin, J. Larson, S. Mattu, and L. Kirchner, “Machine bias: There’s software used across the country to predict future criminals. And it’s biased against blacks,” ProPublica, May 23, 2016; https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
[18] J. Dressel and H. Farid, “The accuracy, fairness, and limits of predicting recidivism,” Science Advances, vol. 4, no. 1, Jan. 17, 2018; http://advances.sciencemag.org/content/4/1/eaao5580/tab-pdf.
[19] P.K. Dick, The Minority Report, 1956.
[20] Minority Report, Steven Spielberg, dir., 2002.
[21] A. Shapiro, “Reform predictive policing,” Nature, Jan. 25, 2017; https://www.nature.com/news/reform-predictive-policing-1.21338.
[22] A.G. Ferguson, The Rise of Big Data Policing: Surveillance, Race and the Future of Law Enforcement. New York, NY: New York Univ. Press, 2017.
[23] H. Kerrigan, “Data-driven policing,” Governing the States and Localities, May 2011; http://www.governing.com/topics/public-justice-safety/Data-driven-Policing.html.
[24] S. Brayne, “Big Data surveillance: The case of policing,” American Sociological Rev., vol. 82, no. 5, pp. 977-1008, 2017.
[25] “CompStat,” Wikipedia, https://en.wikipedia.org/wiki/CompStat, accessed May 2019.
[26] G. Rayman, “NYPD commanders critique Comp Stat and the reviews aren’t good,” Village Voice, Oct. 18, 2010; https://www.villagevoice.com/2010/10/18/nypd-commanders-critique-comp-stat-and-the-reviews-arent-good/.
[27] “Crime data in Chicago,” Trulia.com, https://www.trulia.com/real_estate/Chicago-Illinois/crime/, accessed May 2019.
[28] “The origin of the LAPD motto,” BEAT Mag., Dec. 1963; http://www.lapdonline.org/history_of_the_lapd/content_basic_view/1128.
[29] “Duty of care,” Wikipedia, https://en.wikipedia.org/wiki/Duty_of_care, accessed May 2019.
[30] “Overview,” PredPol, 2018; http://www.predpol.com/about/.
[31] J. Barro, “Here’s why stealing cars went out of fashion,” NYTimes, Aug. 11, 2014; https://www.nytimes.com/2014/08/12/upshot/heres-why-stealing-cars-went-out-of-fashion.html.
[32] F. Chen and R. Regan, “Arts and craftiness: An economic analysis of art heists,” J. Cultural Economics, vol. 41, no. 3, pp. 283-307, Aug. 2017; https://economiststalkart.org/2016/05/31/why-are-there-so-many-art-thefts-and-what-can-be-done-about-them/.
[33] A. Gabbatt, “New York woman visited by police after researching pressure cookers online,” The Guardian, Aug. 1, 2013; https://www.theguardian.com/world/2013/aug/01/new-york-police-terrorism-pressure-cooker.
[34] I. Hacking, The Emergence of Probability: A Philosophical Study of Early Ideas About Probability, Induction and Statistical Inference, 2nd ed. Cambridge, U.K.: Cambridge Univ. Press, 2006 (1975).
[35] R. Nilsen, “How well do horse racing favorites perform?,” Feb. 12, 2012; http://agameofskill.com/how-well-do-horse-racing-favorites-perform/.
[36] I. Hacking, “Making up people,” in Reconstructing Individualism: Autonomy, Individuality and the Self in Western Thought, T.C. Heller, Ed. Stanford Univ. Press, 1986, pp. 222-236.
[37] Y. Romanyshyn, “Chicago homicide rate compared: Most big cities don’t recover from spikes right away,” Chicago Tribune, Sept. 26, 2017; http://www.chicagotribune.com/news/data/ct-homicide-spikes-comparison-htmlstory.html.
[38] University of Chicago, “One Summer Project,” Urban Labs, https://urbanlabs.uchicago.edu/projects/one-summer-chicago-plus-nothing-stops-a-bullet-like-a-job, accessed May 2019.
[39] “Strategic Subject List,” Chicago Data Portal, https://data.cityofchicago.org/Public-Safety/Strategic-Subject-List/4aki-r3np, accessed May 2019.
[40] M. Dumke and F. Main, “A look inside the watch list Chicago Police fought to keep secret,” Chicago Sun-Times, May 18, 2017; https://chicago.suntimes.com/politics/what-gets-people-on-watch-list-chicago-police-fought-to-keep-secret-watchdogs/.
[41] Y. Kunichoff and P. Sier, “The contradictions of Chicago Police’s secretive list,” Chicago Mag., Aug. 2017; http://www.chicagomag.com/city-life/August-2017/Chicago-Police-Strategic-Subject-List/.
[42] J. Gorner, “With violence up, Chicago Police focus on a list of likeliest to kill, be killed,” Chicago Tribune, July 22, 2016; http://www.chicagotribune.com/news/ct-chicago-police-violence-strategy-met-20160722-story.html.
[43] J. Saunders, P. Hunt, and J.S. Hollywood, “Predictions put into practice: A quasi-experimental evaluation of Chicago’s predictive policing pilot,” J. Experimental Criminology, vol. 12, no. 3, pp. 347-371, Sept. 2016.
[44] M.K. Sparrow, Handcuffed: What Holds Policing Back, and the Keys to Reform. Brookings Inst. Press, 2016.
[45] Office of the Mayor, “Study: Chicago’s One Summer Plus Youth Employment Program cuts violent crime arrests in half,” Press Release, City of Chicago, Chicago, IL, Aug. 6, 2013; https://www.cityofchicago.org/city/en/depts/mayor/press_room/press_releases/2013/august_2013/study_chicago_s_onesummerplusyouthemploymentprogramcutsviolentcr.html.
[46] S.B. Heller, “Summer jobs reduce violence among disadvantaged youth,” Science, vol. 346, no. 6214, pp. 1219-1223, Dec. 5, 2014.
[47] J. Hollywood, “CPD’s ‘Heat List’ and the dilemma of predictive policing,” RAND Blog, Sept. 2016; https://www.rand.org/blog/2016/09/cpds-heat-list-and-the-dilemma-of-predictive-policing.html.
[48] W. Hartzog, G. Conti, J. Nelson, and L.A. Shay, “Inefficiently automated law enforcement,” Michigan State Law Rev., pp. 1763-1796, 2015; https://pdfs.semanticscholar.org/ec71/95d72b4ea51c9c6cc5d6a0e153448bbf702e.pdf.
[49] “Ethics of care,” Wikipedia, https://en.wikipedia.org/wiki/Ethics_of_care, accessed May 2019.
[50] V. Held, Ethics of Care: Personal, Political and Global, 2nd ed. Oxford Univ. Press, 2006.
[51] M. Burg, “To serve and protect?,” Police Mag., 1998; http://www.policemag.com/channel/patrol/articles/1998/12/to-serve-and-protect.aspx.
[52] K. Keopong, “The police are not required to protect you,” Barnes Law, June 26, 2016; http://www.barneslawllp.com/police-not-required-protect/.
[53] A. Van Wynsberghe, “Designing robots for care: Care centered value-sensitive design,” Science and Engineering Ethics, vol. 19, no. 2, pp. 407-433, 2013.
[54] J. Eterno and E. Silverman, The Crime Numbers Game: Management by Manipulation. CRC, 2010.
[55] D.N. Kelley and S.L. McCarthy, “The Report of the Crime Reporting Review Committee to Commissioner Raymond W. Kelley concerning CompStat Auditing,” NYPD, Apr. 8, 2013 (released July 2013); http://www.nyc.gov/html/nypd/downloads/pdf/public_information/crime_reporting_review_committee_final_report_2013.pdf.
[56] P. Asaro, “Transforming society by transforming technology: The science and politics of participatory design,” Accounting, Management and Information Technologies, vol. 10, no. 4, pp. 257-290, 2000; http://peterasaro.org/writing/Asaro%20PD.pdf.
[57] G.C. Bowker and S.L. Star, Sorting Things Out: Classification and its Consequences. Cambridge, MA: M.I.T. Press, 2000.
[58] “New York State Penal Code,” New York Laws, 2019; http://ypdcrime.com/penal.law/article155.htm?#p155.05.
[59] E. Joh, “The undue influence of surveillance technology companies on policing,” New York Univ. Law Rev., 2017.
