AI Ethics in Predictive Policing
Peter M. Asaro
Authorized licensed use limited to: CENTRE FOR DEVELOPMENT OF ADVANCED COMPUTING - CDAC - NOIDA. Downloaded on July 21, 2020 at 16:58:13 UTC from IEEE Xplore. Restrictions apply.
The adoption of data-driven organizational management — which includes big data, machine learning, and artificial intelligence (AI) techniques — is growing rapidly across all sectors of the knowledge economy. There is little doubt that the collection, dissemination, analysis, and use of data in government policy formation, strategic planning, decision execution, and the daily performance of duties can improve the functioning of government and the performance of public services. This is as true for law enforcement as any other government service.

Significant concerns have been raised, however, around the use of data-driven algorithms in policing, law enforcement, and judicial proceedings. This includes predictive policing — the use of historic crime data to identify individuals or geographic areas with elevated risks for future crimes, in order to target them for increased policing. Predictive policing has been controversial for multiple reasons, including questions of prejudice and precrime1 and effectively treating people as guilty of (future) crimes for acts they have not yet committed and may never commit. This central controversy over prejudice and precrime is amplified and exacerbated by concerns over the implicit biases contained in historic data sets, and the obvious implications for racial, gendered, ethnic, religious, class, age, disability, and other forms of discriminatory policing, as well as how the use of predictive information systems shapes the psychology and behavior of police officers.

As more bureaucratic processes are automated, there are growing concerns over the fairness, accountability, and transparency of the algorithms used to make consequential decisions that determine people's life opportunities and rights. Less discussed are the ways in which the introduction of data-centric processes and data-driven management have significant consequences on the techno-social and spatio-temporal structure of organizations [1], as well as on the priorities of organization management, the nature of labor, and the quality of results [2]. Such is the nature of contemporary technocratic governance [3]. Yet neither the increasing collection and reliance on data, nor the specific socio-technical and spatio-temporal organization of governmental institutions, is determined by the technology alone, nor by the utility of data. Nor is the kind of analysis performed on that data, or the specific problems to which the data is addressed, pre-determined or “natural” in any meaningful sense. Rather, there are myriad social, institutional, and individual values that go into the decisions of which data to collect, when and where to collect it, how to encode it, how to assemble it in databases, how to interpret it, and how to use it to address social, institutional, and individual concerns. It is those values which are the primary concern of ethics in information systems design.

This article outlines a new ethical approach that balances the promising benefits of AI with the realities of how information technologies and AI algorithms are actually adopted, applied, and used. It proposes that AI ethics should be driven by a substantive and systemic Ethics of Care, rather than by narrow Models of Threat based on utilitarian risk and threat models. While it focuses on law enforcement policies and policing practices, it hopes to contribute to the broader discussion over the ethical application of AI technologies in government policy-making and the delivery of public and commercial services more generally. The paper concludes that while data-driven AI techniques could have many socially beneficial applications, actually realizing those benefits requires careful consideration of how systems are embedded in, and shape, existing practices, beyond questions of de-biasing data. Absent such consideration, most applications are likely to have unjust, prejudicial, and discriminatory consequences. This conclusion supports a proposed Ethics of Care in the application of AI, which demands moral attention to those who may be negatively impacted by the use of technology.

Recent Excitement about AI

There is a recent and widespread excitement about the application of artificial intelligence to nearly every aspect of society — from commerce to government. AI, as a scientific research field, has long sought to develop computer programs to perform tasks that were previously thought to require human intelligence. This somewhat abstract and conditional definition has given rise to a wide array of computational techniques, from logical inference to statistical machine learning, that enable computers to process large and complex datasets and quickly provide useful information. Whether through traversing long chains of inference or sifting through vast amounts of data to find patterns, AI aims to provide logically sound and evidence-based insights into datasets. Insofar as these datasets accurately represent phenomena in the world, such AI techniques can potentially provide useful tools for analyzing that data and choosing intelligent actions in response to that analysis, all with far less human labor and effort. This is the traditional approach of AI, or what we might consider artificial specialized intelligence. This type of AI is essentially about creating a customized piece of

1. “Precrime” is a science fiction concept that first appeared in the writings of Philip K. Dick in a novel [19] that was later turned into a major Hollywood movie [20].
“Models of Threat” and an “Ethics of Care.”3 It will first outline these concepts in broad terms. It will then examine two illustrative cases, in the area of predictive policing, which epitomize each approach. It concludes with some observations and reflections on how to design better and more ethical AI through an Ethics of Care approach.

Perhaps the greatest ethical concerns over algorithmic decisions have been raised around the use of data-driven algorithms in policing, law enforcement, and judicial proceedings. One well-researched and much discussed example from the Florida judicial system involves the use of algorithms to predict future recidivism in convicts as a basis for determining the length of their sentences.4 Another growing application is predictive policing — the use of historic crime data to identify individuals or geographic areas with elevated risks for future crimes, in order to target them for increased policing. Predictive policing has been controversial — as it aspires to prevent crime, it also raises questions of prejudice and precrime and effectively treating individuals and communities as guilty of (future) crimes for acts they have not yet committed and may never commit [21], [22]. This central controversy of prejudice and precrime is amplified and exacerbated by more general concerns over the implicit biases contained in historic data sets, and the obvious implications for racial, gendered, ethnic, religious, class, age, disability, and other forms of discriminatory policing.

Predictive policing as a term can refer to a variety of technologies and practices. The technical usage of the term usually refers to algorithmic processes for predicting locations or individuals with high probabilities of being involved in future crime, based upon historical data patterns [23]. Recent approaches utilize “big data” techniques and arguably entail forms of mass surveillance of the public [24]. However, these recent algorithmic techniques and applications have their roots in much older practices of collecting and utilizing comparative statistics (better known as CompStat) about crimes to manage large police forces, which began in New York City in 1995. While many CompStat programs utilized computer programs to calculate the statistics from crime and accident reports and arrest records, and in some cases automatically generate “pin-maps” of crime activity, CompStat was really a set of data collection, analysis, and management practices rather than a piece of software [25]. And CompStat has seen its share of criticism, including from former police officers [26].

Moreover, the algorithmic techniques that are increasingly being employed by police forces draw upon data that goes well beyond the digitized crime reports of the CompStat legacy, or automatically generated “heat maps” of areas of high crime activity.5 In recent years, police departments have begun deploying and integrating large-scale video surveillance systems, traffic cameras, license-plate and face recognition technologies, audio gun-shot locators, cellphone interceptors, aerial surveillance, and a host of other surveillance and data-collection technologies. As these systems become networked and produce large amounts of data, there is increased pressure to analyze, integrate, and utilize this data for improving law enforcement, which leads to increased reliance on automation and algorithms for sorting and sifting through that data and translating it into policing priorities and strategies. As such, the term predictive policing can be taken to refer to a broad class of algorithmic and data-driven practices and software tools utilized by police forces. Predictive policing is also a good example of how AI might be deployed more generally, and the ethical challenges that may arise.

A general approach to AI ethics is characterized here as an “Ethics of Care.” It uses predictive policing, and the design of AI-based systems within it, to lay out the framework for an AI Ethics of Care. In particular we look at two recent, but very different, implementations of data-driven interventions on youth gun violence in Chicago, Illinois, U.S.A. Predictive policing is particularly good for this purpose for several reasons. As should be clear from the discussion above, policing is an area that gives rise to a number of critical ethical and legal issues, and has relevance not only to society at large, but to a host of other governmental functions and other industries. It is also an area that has an historical practice of data collection, and recent trials in

3. Neither term is original, and each is meant to evoke traditions of thought and their general perspective, while not necessarily implying that the specific projects described were conscious of, or directly influenced by, those traditions. “Threat Modeling” has been an important methodology in cybersecurity for identifying, assessing, prioritizing, and mitigating threats and vulnerabilities since at least the early 2000s [14], while “Threat Perception” has been a key concept in international relations and political psychology in assessing military threats and deterrence strategies [15]. “Ethics of Care” has been gaining popularity in medical and educational ethics since its introduction by Carol Gilligan to explain moral development in child psychology in the late 1970s and its extension by Nel Noddings into a moral theory based on interpersonal relationships of caregiving and receiving in the early 1980s [16].

4. In an analysis of 7000 sentencing cases in Broward County, Florida, over the period 2012-2013 that used the COMPAS software, journalists found similar error rates in the assessment and sentencing of white and black convicts, but diametrically opposed in their direction. White convicts were more likely to be erroneously predicted not to commit future crimes, while black convicts were more likely to be erroneously predicted to commit future crimes, resulting in shorter sentences for white convicts and longer sentences for black convicts [17]. Another study of the same dataset shows that amateur humans are able to make better predictions than the COMPAS software, using the same six factors as the software, and even better predictions can be made using just two factors — defendant's age and number of past convictions [18].

5. Such “heat maps” have become ubiquitous in the age of big data, and are even reproduced, albeit at lower resolution, on real estate websites such as Trulia.com [27].
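The error-rate asymmetry described in footnote 4 can be made concrete with a small computation. The sketch below uses synthetic, illustrative numbers rather than the actual COMPAS data; it shows how two groups can receive identical overall accuracy from a predictor while the errors run in opposite directions, over-flagging one group and under-flagging the other.

```python
from collections import Counter

def error_rates(records):
    """Compute per-group false-positive rate, false-negative rate, and accuracy.

    Each record is (group, predicted_reoffend, actually_reoffended).
    A false positive is a person flagged as high-risk who did not reoffend;
    a false negative is a person cleared who did reoffend.
    """
    counts = Counter()
    for group, pred, actual in records:
        counts[(group, pred, actual)] += 1
    rates = {}
    for g in {r[0] for r in records}:
        fp = counts[(g, True, False)]
        tn = counts[(g, False, False)]
        fn = counts[(g, False, True)]
        tp = counts[(g, True, True)]
        rates[g] = {
            "FPR": fp / (fp + tn),   # non-reoffenders wrongly flagged
            "FNR": fn / (fn + tp),   # reoffenders wrongly cleared
            "accuracy": (tp + tn) / (tp + tn + fp + fn),
        }
    return rates

# Synthetic, illustrative numbers (NOT the real COMPAS figures): both groups
# get 65% accuracy, but group A is over-flagged and group B under-flagged.
data = (
    [("A", True, False)] * 30 + [("A", False, False)] * 40
    + [("A", True, True)] * 25 + [("A", False, True)] * 5
    + [("B", False, True)] * 30 + [("B", True, True)] * 40
    + [("B", False, False)] * 25 + [("B", True, False)] * 5
)

rates = error_rates(data)
```

Reporting group-wise rates of this kind, rather than a single accuracy number, is what distinguishes "similar error rates" from similarly directed errors.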
crimes,8 it more often focuses on motives, and as such employs psychological theories of choice and sociological theories of behavior, and generally focuses on maximizing the likelihood and cost of penalties for wrongdoing by stricter enforcement and harsher penalties.9 The temporality also becomes deeply problematic here. There is an obvious utility in preventing crimes before they occur, but our notions of individual responsibility, guilt, and punishment rest on the commission of acts — of actually doing certain things that constitute crimes — rather than imagining, desiring, or simply being psychologically pre-disposed or circumstantially inclined toward doing things which would be criminal. In some instances, planning or discussing criminal acts with others are acts that can themselves constitute a lesser crime, such as conspiracy or solicitation to commit a crime, and a failed attempt, e.g., to kill someone, can still constitute the crime of attempted murder even if nobody is actually hurt. But there are, and should be, different standards for citizens who have committed no crime, those in the act of committing a crime, those suspected of a crime, those convicted of a crime, and those who have served their sentences for a crime. How should law enforcement treat “those ‘likely’ to commit a crime”? And does the epistemic basis for that likelihood determination matter?

The classification of individuals also becomes critical here. When we say that an individual is “likely to commit a crime,” is that based on their individual behavior and actions, or because of membership in a certain demographic group? “Profiling” becomes problematic in the latter case, when individuals are classified according to population-level statistics and biases. Statistics are notorious for not distinguishing correlations in data from causal reasons, and it would be unjust to treat people with suspicion for coincidental correlations when the underlying causal mechanisms for criminal behavior are absent. This kind of profiling becomes deeply problematic when it becomes prejudicial, and the correlation is taken as itself constitutive of guilt, or warranting a presumption of guilt, rather than a presumption of innocence.10

According to the U.S. legal system, criminal liability and guilt depend upon a combination of actus reus (the “guilty act”) and mens rea (“the guilty mind”). That is, one must actually commit the act for which one is held responsible, and one must have had in mind the intention, or at least the awareness, that one was doing something wrong, or should have known (as mere ignorance of the law is not a suitable defense). From this perspective, one cannot be guilty of a crime before actually committing the act, and should not be held liable for a crime not committed. And this is where precrime clashes with fundamental concepts of justice. If society, and police, act upon precrimes, and those suspected of them, in the same way as already committed crimes, then they are treating as guilty, or at the very least as suspect, those who have not yet, and not actually, committed a crime. This is a profound form of prejudice, in which judgments are made not only before relevant evidence of a criminal act can be obtained and analyzed, but before such evidence can even exist. Rather, judgement is passed on information derived from statistical inference, patterns, trends and probabilities. But a statistical likelihood of an event is neither an event nor an act.11 And it is fundamentally unjust to treat someone as guilty of a crime they did not commit. Moreover, it is powerfully felt as an injustice when individuals and communities are treated “as if” they are guilty of doing something they have not yet, or not individually, done, based simply on their being members of a category or demographic group. Indeed, the imposition of social categories can even give rise to new social identities [35] — and thus machine-generated categories are likely to create new types of people. This makes the creation and designation of a “criminal type” deeply problematic.

Still, there is a practical concern that law enforcement cannot ignore information about likely crimes without sacrificing their duty to prevent crime. While the scope and nature of that duty are themselves contested, this is a powerful intuition. Indeed, it is the same intuition that motivates much data-driven management. That is, if we can use historical data to predict future trends and events, and thus better allocate valuable resources towards fulfilling a mission or goal, then we should do so. While not incorrect — certainly better use of information can improve policing in many ways — if pursued without careful consideration, caution, and

8. For instance, adding better locks to protect property, such as ignition immobilizers on cars, or making it more difficult to resell stolen goods [31]. In some cases, increasing the policing of crimes may actually have counter-intuitive effects of increasing crime, according to an economic analysis of the theft of art works [32].

9. Rarely do these approaches take into account the outright irrationality or the failure of individuals to actually think about committing crimes in rational terms. This is because cognition in the wild follows other lines of reason and risk assessment, from inflamed passions, to rational biases, to human necessity.

10. For example, if one is worried about a copycat bombing like the Boston Marathon bombing, it might make sense to flag individuals who shop for pressure cookers and backpacks. However, one should still presume there is a reasonable explanation for this rather than presuming they must be terrorists for doing so [32].

11. Just consider gambling on horse races, which historically gave rise to modern statistics [33]. Oddsmakers go to great lengths to provide accurate statistical predictions of the chances for each horse in a race. Yet, whichever horse is the favorite to win does not necessarily win — the actual outcome of the race matters. The favorite only wins about 1/3 of the time [34]. Gambling would not make sense if this were not the case — though in many games of chance it can be argued that it is mathematically irrational to place bets at all.
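The point of footnote 11, that a statistical favorite is not an outcome, is easy to check numerically. The sketch below uses hypothetical win probabilities (the four horses and their odds are invented for illustration): even when the favorite is correctly identified, it loses most individual races.

```python
import random

def simulate_races(probs, n_races, seed=0):
    """Count how often each horse wins when race outcomes follow `probs`."""
    rng = random.Random(seed)
    horses = list(range(len(probs)))
    wins = [0] * len(probs)
    for _ in range(n_races):
        # Draw one winner per race according to the stated win probabilities.
        winner = rng.choices(horses, weights=probs)[0]
        wins[winner] += 1
    return wins

# Hypothetical odds: horse 0 is the clear favorite at 35%.
probs = [0.35, 0.25, 0.20, 0.20]
wins = simulate_races(probs, 10_000)

# The favorite wins more races than any rival, yet loses roughly
# two out of every three individual races.
favorite_share = wins[0] / 10_000
```

The favorite is the best single prediction, and still the wrong one most of the time; acting on the prediction as if it were the event conflates the two.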
The practical use of the SSL list and scores was somewhat haphazard in its early years.13 While there was no official policy regarding its use, it did feature in some CompStat reports [40], and was used by police officers in some more controversial ways. The first of these, called “custom notification,” involved police officers making personal visits to high-risk individuals, informing them of their presence on the list, and further, informing them that they would be subjected to additional police scrutiny [41]. In other words, they were told that the police were “watching them” more carefully, and they should expect more police encounters. The other, and more common, use of the SSL was as a “heat list” following a violent crime, in order to round up the “usual suspects” from the list for questioning, in this case people in the vicinity of the crime who had high scores on the list. As a result, people on the list were far more likely to be detained and arrested by police, simply for being on the list. A detailed RAND study showed that the use of the heat list in this way had no statistical impact on the likelihood of individuals on the list being involved in gun violence, nor on the overall gun violence in their communities [42]. It did, however, radically increase the likelihood of being arrested and convicted of a crime for those people on the list.

Further, the data and algorithm behind the SSL were not shared publicly, making it difficult to determine whether the list simply replicated long-standing racial and class discrimination. The CPD told the Chicago Tribune that,

“[The SSL] is not based on race, ethnicity or geographic location... We don't use it to target certain individuals other than we pay a visit to their residence to offer them services to get out of the (gang).”

But a California-based group that defends civil liberties in the digital world raised concern that the arrest data that goes into the SSL could be inherently biased against African-American and other minorities:

“Until they show us the algorithm and the exhaustive factors of what goes into the algorithm, the public should be concerned about whether the program further replicates racial disparities in the criminal justice system,”

said Adam Schwartz, a staff attorney for the Electronic Frontier Foundation [41].

The Ethics of Care approach might instead consider factors that lead people to violate the law, and seek out new interventions that make crimes less likely.

That same Chicago Tribune article indicates that 85% of the 2100 shooting victims so far that year had been on the SSL, but does not indicate how they scored or whether they were all in the list of 1400 high-risk individuals, or the longer list of 398,000 individuals included in the dataset.

Both of the main applications of the SSL, the “custom notification” warnings and using the “heat list” to bring people in for questioning, contain elements of precrime. In the warnings, there is a sense in which the police still cannot arrest an individual before a crime, but they do attempt to intimidate and threaten an individual who, in the majority of cases, has never been arrested for a violent crime. While the police do offer to “help individuals to leave gangs,” it is not clear what specific services they offered, or whether those services are effective in either helping individuals get out of gangs or in avoiding future violence. Similarly, rounding up people in the area who appear on the “heat list” may be an expedient tool, but it is no substitute for doing the policework of a real investigation, or following leads from witnesses and suspects. Indeed, it may impede or undermine community-oriented policing strategies. While police may complain that witnesses, and even victims, are often unwilling to cooperate with police, these heavy-handed tactics of rounding up suspects based on data-driven lists only further break down trust between communities and the police. As such, these uses of the SSL actually work against confidence-building efforts by police, while offering little or no demonstrative positive results [42], [43].

Both applications also appear to engage in victim-blaming. In some cases literally so, insofar as the SSL combines victims and perpetrators in a single category of “being a party to violence” or at-risk of being “involved in violence.” It makes little sense to show up at someone's door to tell them that they may be the victims of violence,14 and less sense to threaten them with

13. It is also worth noting that the SSL, and the data and algorithms upon which it was based, was kept private by the CPD. It was only after a long legal battle that the Chicago Sun-Times newspaper was able to force the CPD to make the SSL and its data public [39].

14. Making someone aware of a specific threat against them would be helpful, but people are usually aware of the fact that they live in a violent neighborhood. Nonspecific warnings are of little help, as has been seen with color-coded threat risks from the Department of Homeland Security, which do not specify any particular location or type of activity to be on the lookout for.
Care also starts from respecting people and maintains a focus on the duties and responsibilities to the individuals it deals with. By contrast, a Models of Threat approach sees people as statistics, and treats the individuals on a list as threats, whether they have done anything or not, and regardless of whether they are victims or perpetrators — thereby undermining their humanity. Ethics of Care sees the individual as having rights and deserving of respect, and sees those at risk as being in need of care. An Ethics of Care does not disregard data, but rather utilizes data in the service of performing a duty in a manner that respects everyone involved. That respect extends to taking the effort and care to understand a situation from multiple perspectives, including that of citizens and working police — and how data gets used and how it relates to the lived world. Indeed, as the RAND researcher who studied the SSL says, data and AI ethics is less about sophisticated data analysis techniques and more about understanding context:

The biggest issue for those agencies considering predictive policing is not the statistical model or tool used to make forecasts. Getting predictions that are somewhat reasonable in identifying where or who is at greater risk of crime is fairly easy. Instead, agencies should be most concerned about what they plan to do as a result [47].

There is a deeper lesson in this observation — the possibility of action, and the types of interventions envisioned, can strongly shape data representations, and the value of various kinds of data. While the current fashion is to collect any and all available data, in the hope that something useful might be inferable from it, there is still value in considering what actions are available to address a problem. This also means using data to find new means of acting and intervening, and better understanding the problem, rather than simply making the current means of addressing a problem more efficient. Indeed, many AI ethicists concerned about AGI worry that a hyper-efficient AGI might be so good at achieving a set goal, or maximizing a certain value, that it does so to the great detriment of other human values.17 In the case of policing, many current policies and tactical goals of policing could be dangerous, unjust, and counter-productive if executed with complete accuracy and efficiency. And most people would not be happy living in a society where every violation of the law was detected and punished strictly and with perfect efficiency. At least this would require rethinking many laws, policies, and punishments [48]. In order to better appreciate how actions and practice could or should shape data, particularly for AI ethics, we turn now to a discussion of what the framework for AI ethics drawn from an Ethics of Care would look like.

AI Ethics of Care: From Data to Models to Implementation

The Ethics of Care has its own history, coming out of feminist thought. As a general normative theory, it has been criticized for failing to question what is right to do, in favor of seeking what is best to do in the circumstances. But as an approach to practical applied ethics, it has proven illuminating in areas such as educational and healthcare ethics [49], [50]. It is proposed that policing, like education and healthcare, aims to “serve and protect” the community with limited resources,18 and as such is also a good candidate for an Ethics of Care. It is further proposed that in trying to improve the management of a broad variety of governmental, non-profit, and commercial organizations with data-driven techniques, AI ethics can also draw upon the Ethics of Care, as robot ethics has done [53]. In this section we look at how an Ethics of Care can be applied to data science and AI, from data collection, to data modeling, to

17. Nick Bostrom's infamous paperclip maximizer, which quickly and efficiently turns the world into paperclips at the expense of everyone and everything else, is an example of this.

18. The motto of the Los Angeles Police Department, “To Protect and To Serve,” was introduced in 1955 following a contest at their police academy, won by Officer Joseph S. Dorobek [28]. It, and its variants, have since been adopted as the motto of numerous police departments across the United States. But what do these words really mean? The topic has been much discussed within police departments. In 1998, an Ohio police officer offered his views in Police Magazine:

While what constitutes “protect” may be open to some debate, it seems to be more clear-cut than does the word “serve.” It's obvious that we protect the citizens and their property from the criminal element. The word “serve” on the other hand is somewhat ambiguous. What “to serve” may mean to one law enforcement agency it may mean quite the opposite to another. “To serve” also takes on a different meaning depending upon department size. For example, I know a chief in a small village not far from the city where I work. He recently had a call to “assist the woman.” We all get these types of calls, but his was to assist the woman in re-hanging her draperies! To serve? Is that what people want? A tax supported drapery service? [51].

There are two striking aspects to this passage and the article, which also seems representative of the views of many police officers, and much of the public. The first striking aspect is the extent to which “service” is framed as a question of resources. Of course, the police are public servants, as are other agents and officers of government. But they also have a specific function, and should have priorities within that function. Indeed, the rest of the article is devoted to discussing the way nonemergency calls are overloading 9-1-1 operators and keeping police from getting to real emergencies. “In many small cities, the police are the only visible and accessible arm of the local government available after 5 p.m. and on weekends. Because of that we become the water department, the street department, the dog warden, etc. — and people begin to expect it from us.” [51].

Of course, the “public” within the concept of public servant should be understood to include everyone in the community, not just “citizens” or “taxpayers” or even just “law abiding” people. Police have a duty to serve everyone, including the “criminal element.”

Following several court and Supreme Court decisions in the United States, there is now a legal precedent that police do not have a specific legal duty to protect, or even to enforce the law or court orders. At least in terms of having a duty to lend aid or to protect a particular individual, a police officer is not compelled by the law to intervene, put themselves at risk, or act to enforce applicable laws. The court has upheld the discretion of police to decide when and where to enforce the law or protect individuals from danger [52].
this paper will contribute to the discussion of the ethical design of AI, especially as AI comes to be applied in an increasing number of socially significant and ethically consequential decisions.

While not every AI system developer can become an expert in the application domain of their techniques, the basics of critical data analysis should be taught alongside statistical and machine learning techniques. In particular, system designers should be adept at recognizing the necessary characteristics of an adequate dataset, and what can and cannot reasonably be drawn from a given dataset. In many cases, only domain experts will have the kind of cultural knowledge needed to identify exogenous influences. This fact supports a systems design approach that includes domain experts as well as critical social scientists as members of design teams, and recognizes and respects the necessity of their expertise in shaping the ultimate system design [56].

[Footnote, continued:] nor be available means for a law enforcement officer. To some extent there is discretion on the part of law enforcement, prosecutors, and judges as to how to appraise and categorize such a crime — and they may take factors into account other than the strict value of the property. But once categorized, that discretionary nature tends to be erased — the crime becomes defined through its given category, documented and entered into data collection systems. AI systems designers need to be sensitive to these types of processes. Indeed, understanding data collection and critical data representation issues should be integral to computer and information science education. Taking care in the design of AI means being able to determine what an adequate dataset is, being able to think critically about how to define it, and understanding what the implications of various choices of categorization are. How best to do this, in general, is a matter for further research.
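The discretion-erasing effect of categorization described in the footnote can be made concrete with a short sketch. Everything below is hypothetical illustration: the incident data, names, and record structure are not drawn from any real system, and the $1,000 cutoff simply echoes the petit/grand larceny line in New York's penal law [58].

```python
# Hypothetical sketch: how encoding an appraised value as a crime category
# discards the underlying value, and with it the discretion used to appraise it.

GRAND_LARCENY_THRESHOLD = 1000  # dollars; cf. New York's petit/grand line [58]

def categorize(appraised_value: float) -> str:
    """Encode a continuous appraisal as a discrete crime category."""
    if appraised_value > GRAND_LARCENY_THRESHOLD:
        return "grand larceny"
    return "petit larceny"

# Hypothetical incidents: (incident_id, appraised property value in dollars).
incidents = [("A1", 950.0), ("A2", 1001.0), ("A3", 98000.0)]

# Only the category enters the data collection system...
records = [(iid, categorize(value)) for iid, value in incidents]

# ...so a downstream model sees the $1,001 and $98,000 thefts as identical,
# and cannot recover the appraisals or the judgment calls behind them.
assert records == [
    ("A1", "petit larceny"),
    ("A2", "grand larceny"),
    ("A3", "grand larceny"),
]
```

The point is not the particular threshold but the one-way nature of the encoding: any critical assessment of the dataset's adequacy has to happen before, or alongside, this step, because it cannot be reconstructed afterward.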
[13] J. Powles and H. Nissenbaum, "The seductive diversion of 'solving' bias in artificial intelligence," Medium, Dec. 7, 2018; https://medium.com/s/story/the-seductive-diversion-of-solving-bias-in-artificial-intelligence-890df5e5ef53.
[14] Wikipedia, "Threat model," https://en.wikipedia.org/wiki/Threat_model, accessed May 2019.
[15] J.G. Stein, "Threat perception in international relations," in The Oxford Handbook of Political Psychology, 2nd ed., L. Huddy, D.O. Sears, and J.S. Levy, Eds. Oxford, U.K.: Oxford Univ. Press, 2013.
[16] M. Sander-Staudt, "Care ethics," Internet Encyclopedia of Philosophy, https://www.iep.utm.edu/care-eth/, accessed May 2019.
[17] J. Angwin, J. Larson, S. Mattu, and L. Kirchner, "Machine Bias: There's software used across the country to predict future criminals. And it's biased against blacks," ProPublica, May 23, 2016; https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
[18] J. Dressel and H. Farid, "The accuracy, fairness, and limits of predicting recidivism," Science Advances, vol. 4, no. 1, Jan. 17, 2018; http://advances.sciencemag.org/content/4/1/eaao5580/tab-pdf.
[19] P.K. Dick, The Minority Report, 1956.
[20] Minority Report, Steven Spielberg, dir., 2002.
[21] A. Shapiro, "Reform predictive policing," Nature, Jan. 25, 2017; https://www.nature.com/news/reform-predictive-policing-1.21338.
[22] A.G. Ferguson, The Rise of Big Data Policing: Surveillance, Race and the Future of Law Enforcement. New York, NY: New York Univ. Press, 2017.
[23] H. Kerrigan, "Data-driven policing," Governing the States and Localities, May 2011; http://www.governing.com/topics/public-justice-safety/Data-driven-Policing.html.
[24] S. Brayne, "Big Data surveillance: The case of policing," American Sociological Rev., vol. 82, no. 5, pp. 977-1008, 2017.
[25] Wikipedia, "CompStat," https://en.wikipedia.org/wiki/CompStat, accessed May 2019.
[26] G. Rayman, "NYPD commanders critique Comp Stat and the reviews aren't good," Village Voice, Oct. 18, 2010; https://www.villagevoice.com/2010/10/18/nypd-commanders-critique-comp-stat-and-the-reviews-arent-good/.
[27] "Crime data in Chicago," Trulia.com; https://www.trulia.com/real_estate/Chicago-Illinois/crime/, accessed May 2019.
[28] "The origin of the LAPD motto," BEAT Mag., Dec. 1963; http://www.lapdonline.org/history_of_the_lapd/content_basic_view/1128.
[29] Wikipedia, "Duty of care," https://en.wikipedia.org/wiki/Duty_of_care, accessed May 2019.
[30] "Overview," PredPol, 2018; http://www.predpol.com/about/.
[31] J. Barro, "Here's why stealing cars went out of fashion," NYTimes, Aug. 11, 2014; https://www.nytimes.com/2014/08/12/upshot/heres-why-stealing-cars-went-out-of-fashion.html.
[32] F. Chen and R. Regan, "Arts and craftiness: An economic analysis of art heists," J. Cultural Economics, vol. 41, no. 3, pp. 283-307, Aug. 2017; https://economiststalkart.org/2016/05/31/why-are-there-so-many-art-thefts-and-what-can-be-done-about-them/.
[33] A. Gabbatt, "New York woman visited by police after researching pressure cookers online," The Guardian, Aug. 1, 2013; https://www.theguardian.com/world/2013/aug/01/new-york-police-terrorism-pressure-cooker.
[34] I. Hacking, The Emergence of Probability: A Philosophical Study of Early Ideas About Probability, Induction and Statistical Inference, 2nd ed. Cambridge, U.K.: Cambridge Univ. Press, 2006 (1975).
[35] R. Nilsen, "How well do horse racing favorites perform?," Feb. 12, 2012; http://agameofskill.com/how-well-do-horse-racing-favorites-perform/.
[36] I. Hacking, "Making up people," in Reconstructing Individualism: Autonomy, Individuality and the Self in Western Thought, T.C. Heller, Ed. Stanford Univ. Press, 1986, pp. 222-236.
[37] Y. Romanyshyn, "Chicago homicide rate compared: Most big cities don't recover from spikes right away," Chicago Tribune, Sept. 26, 2017; http://www.chicagotribune.com/news/data/ct-homicide-spikes-comparison-htmlstory.html.
[38] University of Chicago, "One Summer Project," Urban Labs, https://urbanlabs.uchicago.edu/projects/one-summer-chicago-plus-nothing-stops-a-bullet-like-a-job, accessed May 2019.
[39] "Strategic Subject List," Chicago Data Portal; https://data.cityofchicago.org/Public-Safety/Strategic-Subject-List/4aki-r3np, accessed May 2019.
[40] M. Dumke and F. Main, "A look inside the watch list Chicago Police fought to keep secret," Chicago Sun-Times, May 18, 2017; https://chicago.suntimes.com/politics/what-gets-people-on-watch-list-chicago-police-fought-to-keep-secret-watchdogs/.
[41] Y. Kunichoff and P. Sier, "The contradictions of Chicago Police's secretive list," Chicago Mag., Aug. 2017; http://www.chicagomag.com/city-life/August-2017/Chicago-Police-Strategic-Subject-List/.
[42] J. Gorner, "With violence up, Chicago Police focus on a list of likeliest to kill, be killed," Chicago Tribune, July 22, 2016; http://www.chicagotribune.com/news/ct-chicago-police-violence-strategy-met-20160722-story.html.
[43] J. Saunders, P. Hunt, and J.S. Hollywood, "Predictions put into practice: A quasi-experimental evaluation of Chicago's predictive policing pilot," J. Experimental Criminology, vol. 12, no. 3, pp. 347-371, Sept. 2016.
[44] M.K. Sparrow, Handcuffed: What Holds Policing Back, and the Keys to Reform. Brookings Inst. Press, 2016.
[45] Office of the Mayor, "Study: Chicago's One Summer Plus Youth Employment Program cuts violent crime arrests in half," Press Release, City of Chicago, Chicago, IL, Aug. 6, 2013; https://www.cityofchicago.org/city/en/depts/mayor/press_room/press_releases/2013/august_2013/study_chicago_s_onesummerplusyouthemploymentprogramcutsviolentcr.html.
[46] S.B. Heller, "Summer jobs reduce violence among disadvantaged youth," Science, vol. 346, no. 6214, pp. 1219-1223, Dec. 5, 2014.
[47] J. Hollywood, "CPD's 'Heat List' and the dilemma of predictive policing," RAND Blog, Sept. 2016; https://www.rand.org/blog/2016/09/cpds-heat-list-and-the-dilemma-of-predictive-policing.html.
[48] W. Hartzog, G. Conti, J. Nelson, and L.A. Shay, "Inefficiently automated law enforcement," Michigan State Law Rev., pp. 1763-1796, 2015; https://pdfs.semanticscholar.org/ec71/95d72b4ea51c9c6cc5d6a0e153448bbf702e.pdf.
[49] Wikipedia, "Ethics of care," https://en.wikipedia.org/wiki/Ethics_of_care, accessed May 2019.
[50] V. Held, Ethics of Care: Personal, Political and Global, 2nd ed. Oxford Univ. Press, 2006.
[51] M. Burg, "To serve and protect?," Police Mag., 1998; http://www.policemag.com/channel/patrol/articles/1998/12/to-serve-and-protect.aspx.
[52] K. Keopong, "The police are not required to protect you," Barnes Law, June 26, 2016; http://www.barneslawllp.com/police-not-required-protect/.
[53] A. Van Wynsberghe, "Designing robots for care: Care centered value-sensitive design," Science and Engineering Ethics, vol. 19, no. 2, pp. 407-433, 2013.
[54] J. Eterno and E. Silverman, The Crime Numbers Game: Management by Manipulation. CRC, 2010.
[55] D.N. Kelley and S.L. McCarthy, "The Report of the Crime Reporting Review Committee to Commissioner Raymond W. Kelly concerning CompStat auditing," NYPD, Apr. 8, 2013 (released July 2013); http://www.nyc.gov/html/nypd/downloads/pdf/public_information/crime_reporting_review_committee_final_report_2013.pdf.
[56] P. Asaro, "Transforming society by transforming technology: The science and politics of participatory design," Accounting, Management and Information Technologies, vol. 10, no. 4, pp. 257-290, 2000; http://peterasaro.org/writing/Asaro%20PD.pdf.
[57] G.C. Bowker and S.L. Star, Sorting Things Out: Classification and its Consequences. Cambridge, MA: MIT Press, 2000.
[58] "New York State Penal Code," New York Laws, 2019; http://ypdcrime.com/penal.law/article155.htm?#p155.05.
[59] E. Joh, "The undue influence of surveillance technology companies on policing," New York Univ. Law Rev., 2017.