Introduction
The New Jim Code
Naming a child is serious business. And if you are not White in the United States, there is much
more to it than personal preference. When my younger son was born I wanted to give him an
Arabic name to reflect part of our family heritage. But it was not long after 9/11, so of course I
hesitated. I already knew he would be profiled as a Black youth and adult, so, like most Black
mothers, I had already started mentally sparring those who would try to harm my child, even
before he was born. Did I really want to add another round to the fight? Well, the fact is, I am also
very stubborn. If you tell me I should not do something, I take that as a dare. So I gave the child an
Arabic first and middle name and noted on his birth announcement: “This guarantees he will be
flagged anytime he tries to fly.”
If you think I am being hyperbolic, keep in mind that names are racially coded. While they are one
of the everyday tools we use to express individuality and connections, they are also markers that
interact, as forms of data, with numerous technologies, like airport screening systems and police
risk assessments. Depending on one’s name, one is more likely to be detained by state
actors in the name of “public safety.”
Just as in naming a child, there are many everyday contexts – such as applying for jobs, or
shopping – that employ emerging technologies, often to the detriment of those who are racially
marked. This book explores how such technologies, which often pose as objective, scientific, or
progressive, too often reinforce racism and other forms of inequity. Together, we will work to
decode the powerful assumptions and values embedded in the material and digital architecture of
our world. And we will be stubborn in our pursuit of a more just and equitable approach to tech –
ignoring the voice in our head that says, “No way!” “Impossible!” “Not realistic!” But as activist and
educator Mariame Kaba contends, “hope is a discipline.”1 Reality is something we create together,
except that so few people have a genuine say in the world in which they are forced to live. Amid so
much suffering and injustice, we cannot resign ourselves to this reality we have inherited. It is time
to reimagine what is possible. So let’s get to work.
Everyday Coding
Each year I teach an undergraduate course on race and racism and I typically begin the class with
an exercise designed to help me get to know the students while introducing the themes we will
wrestle with during the semester. What’s in a name? Your family story, your religion, your
nationality, your gender identity, your race and ethnicity? What assumptions do you think people
make about you on the basis of your name? What about your nicknames – are they chosen or
imposed? From intimate patterns in dating and romance to large-scale employment trends, our
names can open and shut doors. Like a welcome sign inviting people in or a scary mask repelling
and pushing them away, this thing that is most ours is also out of our hands.
The popular book and Netflix documentary Freakonomics describe the process of parents naming
their kids as an exercise in branding, positioning children as more or less valuable in a competitive
social marketplace. If we are the product, our names are the billboard – a symptom of a larger
neoliberal rationale that subsumes all other sociopolitical priorities to “economic growth,
competitive positioning, and capital enhancement.”2 My students invariably chuckle when the
“baby-naming expert” comes on the screen to help parents “launch” their newest offspring. But the
fact remains that naming is serious business. The stakes are high not only because parents’
decisions will follow their children for a lifetime, but also because names reflect much longer
histories of conflict and assimilation and signal fierce political struggles – as when US immigrants
from Eastern Europe anglicized their names, or when African Americans at the height of the Black Power
movement took Arabic or African names to oppose White supremacy.
I will admit, something that irks me about conversations regarding naming trends is how distinctly
African American names are set apart as comically “made up” – a pattern continued in
Freakonomics. This tendency, as I point out to students, is a symptom of the chronic anti-
Blackness that pervades even attempts to “celebrate difference.” Blackness is routinely conflated
with cultural deficiency, poverty, and pathology … Oh, those poor Black mothers, look at how they
misspell “Uneeq.” Not only does this reek of classism, but it also harbors a willful disregard for
the fact that everyone’s names were at one point made up!3
Usually, many of my White students assume that the naming exercise is not about them. “I just
have a normal name,” “I was named after my granddad,” “I don’t have an interesting story, prof.”
But the presumed blandness of White American culture is a crucial part of our national narrative.
Scholars describe the power of this plainness as the invisible “center” against which everything else
is compared and as the “norm” against which everyone else is measured. Upon further reflection,
what appears to be an absence in terms of being “cultureless” works more like a superpower.
Invisibility, with regard to Whiteness, offers immunity. To be unmarked by race allows you to reap
the benefits but escape responsibility for your role in an unjust system. Just check out the hashtag
#CrimingWhileWhite to read the stories of people who are clearly aware that their Whiteness
works for them like an armor and a force field when dealing with the police. A “normal” name is
just one of many tools that reinforce racial invisibility.
As a class, then, we begin to understand that all those things dubbed “just ordinary” are also
cultural, as they embody values, beliefs, and narratives, and normal names offer some of the most
powerful stories of all. If names are social codes that we use to make everyday assessments about
people, they are not neutral but racialized, gendered, and classed in predictable ways. Whether in
the time of Moses, Malcolm X, or Missy Elliott, names have never grown on trees. They are
concocted in cultural laboratories and encoded and infused with meaning and experience –
particular histories, longings, and anxieties. And some people, by virtue of their social position, are
given more license to experiment with unique names. Basically, status confers cultural value that
engenders status, in an ongoing cycle of social reproduction.4
In a classic study of how names impact people’s experience on the job market, researchers show
that, all other things being equal, job seekers with White-sounding first names received 50 percent
more callbacks from employers than job seekers with Black-sounding names.5 They calculated that
the racial gap was equivalent to eight years of relevant work experience, which White applicants
did not actually have; and the gap persisted across occupations, industry, employer size – even
when employers included the “equal opportunity” clause in their ads.6 With emerging technologies
we might assume that racial bias will be more scientifically rooted out. Yet, rather than challenging
or overcoming the cycles of inequity, technical fixes too often reinforce and even deepen the status
quo. For example, a study by a team of computer scientists at Princeton examined whether a
popular algorithm, trained on human writing online, would exhibit the same biased tendencies that
psychologists have documented among humans. They found that the algorithm associated White-
sounding names with “pleasant” words and Black-sounding names with “unpleasant” ones.7
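To make the mechanics concrete, here is a minimal sketch of the kind of word-embedding association test the Princeton team used. The vectors below are random placeholders, so they will not show the documented gap; with embeddings actually trained on web text (GloVe, for instance), scores for White-sounding names skew toward the “pleasant” pole.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two word vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(name_vec, pleasant, unpleasant):
    # Mean similarity to "pleasant" words minus mean similarity to
    # "unpleasant" words: a positive value means the name leans "pleasant."
    return (np.mean([cosine(name_vec, p) for p in pleasant])
            - np.mean([cosine(name_vec, u) for u in unpleasant]))

# Toy stand-ins for a pretrained embedding table; a real test would
# load vectors learned from billions of words of online writing.
rng = np.random.default_rng(0)
emb = {w: rng.normal(size=50)
       for w in ["Emily", "Lakisha", "joy", "love", "agony", "failure"]}

pleasant = [emb["joy"], emb["love"]]
unpleasant = [emb["agony"], emb["failure"]]

for name in ["Emily", "Lakisha"]:
    print(name, round(association(emb[name], pleasant, unpleasant), 3))
```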
Such findings demonstrate what I call “the New Jim Code”: the employment of new technologies
that reflect and reproduce existing inequities but that are promoted and perceived as more
objective or progressive than the discriminatory systems of a previous era.8 Like other kinds of
codes that we think of as neutral, “normal” names have power by virtue of their perceived
neutrality. They trigger stories about what kind of person is behind the name – their personality
and potential, where they come from but also where they should go.
Codes are both reflective and predictive. They have a past and a future. “Alice Tang” comes from a
family that values education and is expected to do well in math and science. “Tyrone Jackson” hails
from a neighborhood where survival trumps scholastics; and he is expected to excel in sports. More
than stereotypes, codes act as narratives, telling us what to expect. As data scientist and Weapons
of Math Destruction author Cathy O’Neil observes, “[r]acism is the most slovenly of predictive
models. It is powered by haphazard data gathering and spurious correlations, reinforced by
institutional inequities, and polluted by confirmation bias.”9
Racial codes are born from the goal of, and facilitate, social control. For instance, in a recent audit
of California’s gang database, not only do Blacks and Latinxs constitute 87 percent of those listed,
but many of the names turned out to belong to babies under the age of 1, some of whom were supposedly
“self-described gang members.” So far, no one ventures to explain how this could have happened,
except by saying that some combination of zip codes and racially coded names constitutes a risk.10
Once someone is added to the database, whether they know they are listed or not, they undergo
heightened surveillance and scrutiny.
Importantly, the attempt to shroud racist systems under the cloak of objectivity has been made
before. In The Condemnation of Blackness, historian Khalil Muhammad (2011) reveals how an
earlier “racial data revolution” in the nineteenth century marshalled science and statistics to make
a “disinterested” case for White superiority:
Racial knowledge that had been dominated by anecdotal, hereditarian, and pseudo-biological
theories of race would gradually be transformed by new social scientific theories of race and
society and new tools of analysis, namely racial statistics and social surveys. Out of the new
methods and data sources, black criminality would emerge, alongside disease and
intelligence, as a fundamental measure of black inferiority.13
You might be tempted to see the datafication of injustice in that era as having been much worse
than in the present, but I suggest we hold off on easy distinctions because, as we shall see, the
language of “progress” is too easily weaponized against those who suffer most under oppressive
systems, however sanitized.
Readers are also likely to note how the term New Jim Code draws on The New Jim Crow, Michelle
Alexander’s (2012) book that makes a case for how the US carceral system has produced a “new
racial caste system” by locking people into a stigmatized group through a colorblind ideology, a way
of labeling people as “criminals” that permits legalized discrimination against them. To talk of the
new Jim Crow begs the question: What of the old? “Jim Crow” was first introduced as the title
character of an 1832 minstrel show that mocked and denigrated Black people. White people used it
not only as a derogatory epithet but also as a way to mark space, “legal and social devices intended
to separate, isolate, and subordinate Blacks.”14 And, while it started as a folk concept, it was taken
up as an academic shorthand for legalized racial segregation, oppression, and injustice in the US
South between the 1890s and the 1950s. It has proven to be an elastic term, used to describe an
era, a geographic region, laws, institutions, customs, and a
code of behavior that upholds White supremacy.15 Alexander compares the old with the new Jim
Crow in a number of ways, but most relevant for this discussion is her emphasis on a shift from
explicit racialization to a colorblind ideology that masks the destruction wrought by the carceral
system, severely limiting the life chances of those labeled criminals who, by design, are
overwhelmingly Black. “Criminal,” in this era, is code for Black, but also for poor, immigrant,
second-class, disposable, unwanted, detritus.
What happens when this kind of cultural coding gets embedded into the technical coding of
software programs? In a now classic study, computer scientist Latanya Sweeney examined how
online search results associated Black names with arrest records at a much higher rate than White
names, a phenomenon she first noticed when Google-searching her own name; the results
suggested she had a criminal record.16 The lesson? “Google’s algorithms were optimizing for the
racially discriminating patterns of past users who had clicked on these ads, learning the racist
preferences of some users and feeding them back to everyone else.”17 In a technical sense, the
writer James Baldwin’s insight is prescient: “The great force of history comes from the fact that we
carry it within us, are unconsciously controlled by it in many ways, and history is literally present
in all that we do.”18 And when these technical codes move beyond the bounds of the carceral
system, beyond labeling people as “high” and “low” risk criminals – when automated systems in
employment, education, healthcare, and housing come to make decisions about people’s
deservedness for all kinds of opportunities – then tech designers are erecting a digital caste system,
structured by existing racial inequities and not just by the colorblindness Alexander warns of. These tech
advances are sold as morally superior because they purport to rise above human bias, even though
they could not exist without data produced through histories of exclusion and discrimination.
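Sweeney’s example also shows, in miniature, how optimization launders bias. The simulation below only ever “learns from users,” yet ends up amplifying their prejudices; everything in it – the ad templates, the click rates, the audience bias – is invented for illustration.

```python
import random

random.seed(1)
templates = ["neutral_ad", "arrest_record_ad"]
shows = {t: 1 for t in templates}   # impressions served (start at 1 to avoid /0)
clicks = {t: 0 for t in templates}

def serve() -> str:
    # Mostly exploit the template with the best observed click-through
    # rate; explore occasionally so both templates keep getting tested.
    if random.random() < 0.1:
        return random.choice(templates)
    return max(templates, key=lambda t: clicks[t] / shows[t])

def user_clicks(template: str) -> bool:
    # Assumed (invented) audience bias: the arrest-themed template
    # draws more clicks when paired with a Black-sounding name.
    return random.random() < (0.15 if template == "arrest_record_ad" else 0.10)

for _ in range(10_000):
    t = serve()
    shows[t] += 1
    clicks[t] += user_clicks(t)

print("impressions:", shows)  # the biased pairing wins the vast majority
```

Because the optimizer rewards whatever got clicked before, the biased pairing earns ever more impressions, and the feedback loop hardens without anyone having written a racist rule.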
In fact, as this book shows, colorblindness is no longer even a prerequisite for the New Jim Code.
In some cases, technology “sees” racial difference, and this range of vision can involve seemingly
positive affirmations or celebrations of presumed cultural differences. And yet we are told that how
tech sees “difference” is a more objective reflection of reality than if a mere human produced the
same results. Even with the plethora of visibly diverse imagery engendered and circulated through
technical advances, particularly social media, bias enters through the backdoor of design
optimization in which the humans who create the algorithms are hidden from view.
Move Slower …
Problem solving is at the heart of tech. An algorithm, after all, is a set of instructions, rules, and
calculations designed to solve problems. Data for Black Lives co-founder Yeshimabeit Milner
reminds us that “[t]he decision to make every Black life count as three-fifths of a person was
embedded in the electoral college, an algorithm that continues to be the basis of our current
democracy.”19 Thus, even just deciding what problem needs solving requires a host of judgments;
and yet we are expected to pay no attention to the man behind the screen.20
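By way of illustration, consider a hypothetical and deliberately crude “risk score” of the kind an automated gatekeeper might run. None of these inputs, weights, or cutoffs comes from a real system; the point is that each line is a judgment someone had to make.

```python
def risk_score(applicant: dict) -> str:
    """A hypothetical scoring rule: every line encodes a human judgment."""
    score = 0
    # Judgment 1: which inputs count at all -- here, a zip code that,
    # given residential segregation, doubles as a proxy for race.
    if applicant["zip_code"] in {"63106", "60624"}:  # invented "risky" codes
        score += 2
    # Judgment 2: how a life event is interpreted -- a gap in employment
    # is scored as a personal failing rather than, say, caregiving.
    if applicant["months_unemployed"] > 6:
        score += 1
    # Judgment 3: where the cutoff sits, and so who gets labeled.
    return "high risk" if score >= 2 else "low risk"

print(risk_score({"zip_code": "63106", "months_unemployed": 2}))  # -> high risk
```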
As danah boyd and M. C. Elish of the Data & Society Research Institute posit, “[t]he datasets and
models used in these systems are not objective representations of reality. They are the culmination
of particular tools, people, and power structures that foreground one way of seeing or judging over
another.”21 By pulling back the curtain and drawing attention to forms of coded inequity, not only
do we become more aware of the social dimensions of technology but we can work together against
the emergence of a digital caste system that relies on our naivety when it comes to the neutrality of
technology. This problem extends beyond obvious forms of criminalization and surveillance.22 It
includes an elaborate social and technical apparatus that governs all areas of life.
The animating force of the New Jim Code is that tech designers encode judgments into technical
systems but claim that the racist results of their designs are entirely exterior to the encoding
process. Racism thus becomes doubled – magnified and buried under layers of digital denial. There
are bad actors in this arena that are easier to spot than others. Facebook executives who denied
and lied about their knowledge of Russia’s interference in the 2016 presidential election via social
media are perpetrators of the most broadcast violation of public trust to date.23 But the line
between bad and “neutral” players is a fuzzy one and there are many tech insiders hiding behind
the language of free speech, allowing racist and sexist harassment to run rampant in the digital
public square and looking the other way as avowedly bad actors deliberately crash into others with
reckless abandon.
For this reason, we should consider how private industry choices are in fact public policy decisions.
They are animated by political values influenced strongly by libertarianism, which extols individual
autonomy and corporate freedom from government regulation. However, a recent survey of the
political views of 600 tech entrepreneurs found that a majority of them favor higher taxes on the
rich, social benefits for the poor, single-payer healthcare, environmental regulations, parental
leave, immigration protections, and other issues that align with Democratic causes. Yet most of
them also staunchly oppose labor unions and government regulation.24 As one observer put it,
“Silicon Valley entrepreneurs don’t mind the government regulating other industries, but they
prefer Washington to stay out of their own business.”25 For example, while many say they support
single-payer healthcare in theory, they are also reluctant to contribute to tax revenue that would
fund such an undertaking. So “political values” here is less about party affiliation or what people
believe in the abstract and more to do with how the decisions of tech entrepreneurs impact
questions of power, ethics, equity, and sociality. In that light, I think the dominant ethos in this
arena is best expressed by Facebook’s original motto: “Move Fast and Break Things.” To which we
should ask: What about the people and places broken in the process? Residents of Silicon Valley
displaced by the spike in housing costs, or Amazon warehouse workers compelled to skip bathroom
breaks and pee in bottles.26 “Move Fast, Break People, and Call It Progress”?
“Data sharing,” for instance, sounds like a positive development, streamlining the bulky
bureaucracies of government so the public can access goods and services faster. But access goes
both ways. If someone is marked “risky” in one arena, that stigma follows him around much more
efficiently, streamlining marginalization. A leading Europe-based advocate for workers’ data rights
described how she was denied a bank loan despite having a high income and no debt, because the
lender had access to her health file, which showed that she had a tumor.27 In the United States,
data fusion centers are one of the most pernicious sites of the New Jim Code, coordinating “data-
sharing among state and local police, intelligence agencies, and private companies”28 and
deepening what Stop LAPD Spying Coalition calls the stalker state. Like other techy euphemisms,
“fusion” recalls those trendy restaurants where food looks like art. But the clientele of such upscale
eateries is rarely the target of data fusion centers that terrorize the residents of many cities.
If private companies are creating public policies by other means, then I think we should stop
calling ourselves “users.” Users get used. We are more like unwitting constituents who, by clicking
submit, have authorized tech giants to represent our interests. But there are promising signs that
the tide is turning.
According to a recent survey, a growing segment of the public (55 percent, up from 45 percent)
wants more regulation of the tech industry, saying that it does more to hurt democracy and free
speech than help.29 And company executives are admitting more responsibility for safeguarding
against hate speech and harassment on their platforms. For example, Facebook hired thousands
more people on its safety and security team and is investing in automated tools to spot toxic
content. Following Russia’s disinformation campaign using Facebook ads, the company is now
“proactively finding and suspending coordinated networks of accounts and pages aiming to spread
propaganda, and telling the world about it when it does. The company has enlisted fact-checkers to
help prevent fake news from spreading as broadly as it once did.”30
In November 2018, Zuckerberg held a press call to announce the formation of a “new independent
body” that users could turn to if they wanted to appeal a decision made to take down their content.
But many observers criticize these attempts to address public concerns as not fully reckoning with
the political dimensions of the company’s private decisions. Reporter Kevin Roose summarizes this
governance behind closed doors:
Shorter version of this call: Facebook is starting a judicial branch to handle the overflow for its
executive branch, which is also its legislative branch, also the whole thing is a monarchy.31
The co-director of the AI Now Institute, Kate Crawford, probes further:
Will Facebook’s new Supreme Court just be in the US? Or one for every country where they
operate? Which norms and laws rule? Do execs get to overrule the decisions? Finally, why stop
at user content? Why not independent oversight of the whole system?32
The “ruthless code of secrecy” that enshrouds Silicon Valley is one of the major factors fueling
public distrust.33 So, too, is the rabid appetite of big tech to consume all in its path, digital and
physical real estate alike. As one longtime tech consultant to companies including Apple, IBM, and
Microsoft put it, “There is so much of life that remains undisrupted. For all intents and purposes,
we’re only 35 years into a 75- or 80-year process of moving from analog to digital. The image of
Silicon Valley as Nirvana has certainly taken a hit, but the reality is that we the consumers are
constantly voting for them.”34 The fact is, the stakes are too high, the harms too widespread, the
incentives too enticing, for the public to accept the tech industry’s attempts at self-regulation.
It is revealing, in my view, that many tech insiders choose a more judicious approach to tech when
it comes to raising their own kids.35 There are reports of Silicon Valley parents requiring nannies
to sign “no-phone contracts”36 and opting to send their children to schools in which devices are
banned or introduced slowly, in favor of “pencils, paper, blackboards, and craft materials.”37 Move
Slower and Protect People? All the while, I attend education conferences around the country at
which vendors fill massive expo halls to sell educators the latest products, couched in a concern that
all students deserve access – even as the most privileged refuse it. Those afforded the luxury of opting
out are concerned with tech addiction – “On the scale between candy and crack cocaine, it’s closer
to crack cocaine,” one CEO said of screens.38 Many are also wary about the lack of data privacy,
because access goes both ways with apps and websites that track users’ information.
In fact Donald Knuth, author of The Art of Computer Programming, the field’s bible (some call him
“the Yoda of Silicon Valley”), recently commented that he feels “algorithms are getting too
prominent in the world. It started out that computer scientists were worried nobody was listening
to us. Now I’m worried that too many people are listening.”39 To the extent that social elites are
able to exercise more control in this arena (at least for now), they also position themselves as
digital elites within a hierarchy that allows some modicum of informed refusal at the very top. For
the rest of us, nanny contracts and Waldorf tuition are not an option, which is why the notion of a
personal right to refuse privately is not a tenable solution.40
The New Jim Code will not be thwarted by simply revising user agreements, as most companies
attempted to do in the days following Zuckerberg’s 2018 congressional testimony. And more and
more young people seem to know that, as when Brooklyn students staged a walkout to protest a
Facebook-designed online program, saying that “it forces them to stare at computers for hours and
‘teach ourselves,’” guaranteeing only 10–15 minutes of “mentoring” each week!41 In fact these
students have a lot to teach us about refusing tech fixes for complex social problems that come
packaged in catchphrases like “personalized learning.”42 They are sick and tired of being atomized
and quantified, of having their personal uniqueness sold to them, one “tailored” experience after
another. They’re not buying it. Coded inequity, in short, can be met with collective defiance, with
resisting the allure of (depersonalized) personalization and asserting, in this case, the sociality of
learning. This kind of defiance calls into question a libertarian ethos that assumes what we all
really want is to be left alone, screen in hand, staring at reflections of ourselves. Social theorist Karl
Marx might call tech personalization our era’s opium of the masses and encourage us to “just say
no,” though he might also point out that not everyone is in an equal position to refuse, owing to
existing forms of stratification. Move slower and empower people.
Tailoring: Targeting
In examining how different forms of coded inequity take shape, this text presents a case for
understanding race itself as a kind of tool – one designed to stratify and sanctify social injustice as
part of the architecture of everyday life. In this way, this book challenges us to question not only
the technologies we are sold, but also the ones we manufacture ourselves. For most of US history,
White Americans have used race as a tool to denigrate, endanger, and exploit non-White people –
openly, explicitly, and without shying away from the deadly demarcations that racial imagination
brings to life. And, while overt White supremacy is proudly reasserting itself with the election of
Donald Trump in 2016, much of this is newly cloaked in the language of White victimization and
false equivalency. What about a White history month? White studies programs? White student
unions? No longer content with the power of invisibility, a vocal subset of the population wants to
be recognized and celebrated as White – a backlash against the civil rights gains of the mid-
twentieth century, the election of the country’s first Black president, diverse representations in
popular culture, and, more fundamentally, a refusal to comprehend that, as Baldwin put it, “white
is a metaphor for power,” unlike any other color in the rainbow.43
The dominant shift toward multiculturalism has been marked by a move away from one-size-fits-
all mass marketing toward ethnically tailored niches that capitalize on calls for diversity. For
example, the Netflix movie recommendations that pop up on your screen can entice Black viewers
by using tailored movie posters featuring Black supporting cast members, getting them to click on an
option they might otherwise pass on.44 Why bother with broader structural changes in casting and
media representation, when marketing gurus can make Black actors appear more visible than they
really are in the actual film? It may be that the hashtag #OscarsSoWhite drew attention to the
overwhelming Whiteness of the Academy Awards, but, so long as algorithms become more
tailored, the public will be given the illusion of progress.45
Importantly, Netflix and other platforms that thrive on tailored marketing do not need to ask
viewers about their race, because they use prior viewing and search histories as proxies that help
them predict who will be attracted to differently cast movie posters. Economic recognition is a
ready but inadequate proxy for political representation and social power. This transactional model
of citizenship presumes that people’s primary value hinges on the ability to spend money and, in
the digital age, expend attention … browsing, clicking, buying. This helps explain why different
attempts to opt out of tech-mediated life can themselves become criminalized, as they threaten the digital
order of things. Analog is antisocial, with emphasis on anti … “what are you trying to hide?”
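Returning to the poster example: a minimal sketch of that proxy logic, with invented titles and viewers, shows how a recommender can sort people along racial lines without ever storing a race field.

```python
from collections import Counter

# Invented viewing histories standing in for the proxies platforms
# actually hold (titles watched, searches, time of day, and so on).
histories = {
    "viewer_1": ["queen_sugar", "insecure", "moonlight"],
    "viewer_2": ["the_crown", "stranger_things", "moonlight"],
}

# A hand-built lookup of which poster variant past viewers of each title
# clicked most -- the "training data" a recommender would learn from.
variant_affinity = {
    "queen_sugar": "black_cast_poster",
    "insecure": "black_cast_poster",
    "moonlight": "black_cast_poster",
    "the_crown": "default_poster",
    "stranger_things": "default_poster",
}

def pick_poster(history: list[str]) -> str:
    # No race field anywhere: the behavioral proxy does the sorting.
    votes = Counter(variant_affinity[title] for title in history)
    return votes.most_common(1)[0][0]

for viewer, history in histories.items():
    print(viewer, "->", pick_poster(history))
```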
Meanwhile, multiculturalism’s proponents are usually not interested in facing White supremacy
head on. Sure, movies like Crazy Rich Asians and TV shows like Black-ish, Fresh off the Boat, and
The Goldbergs do more than target their particular demographics; at times, they offer incisive
commentary on the racial–ethnic dynamics of everyday life, drawing viewers of all backgrounds
into their stories. Then there is the steady stream of hits coming out of Shondaland that
deliberately buck the Hollywood penchant for typecasting. In response to questions about her
approach to shows like Grey’s Anatomy and Scandal, Shonda Rhimes says she is not trying to
diversify television but to normalize it: “Women, people of color, LGBTQ people equal WAY more
than 50 percent of the population. Which means it ain’t out of the ordinary. I am making the world
of television look NORMAL.”46
But, whether TV or tech, cosmetic diversity too easily stands in for substantive change, with a focus
on feel-good differences like food, language, and dress, not on systemic disadvantages associated
with employment, education, and policing. Celebrating diversity, in this way, usually avoids sober
truth-telling so as not to ruin the party. Who needs to bother with race or sex disparities in the
workplace, when companies can capitalize on stereotypical differences between groups?
The company BIC came out with a line of “BICs For Her” pens that were not only pink, small, and
bejeweled, but priced higher than the non-gendered ones. Criticism was swift. Even Business
Insider, not exactly known as a feminist news outlet, chimed in: “Finally, there’s a lady’s pen that
makes it possible for the gentler sex to write on pink, scented paper: Bic for Her. Remember to dot
your i’s with hearts or smiley faces, girls!” Online reviewers were equally fierce and funny:
Finally! For years I’ve had to rely on pencils, or at worst, a twig and some drops of my
feminine blood to write down recipes (the only thing a lady should be writing ever) … I had
despaired of ever being able to write down said recipes in a permanent manner, though my
men-folk assured me that I “shouldn’t worry yer pretty little head.” But, AT LAST! Bic, the
great liberator, has released a womanly pen that my gentle baby hands can use without fear of
unlady-like callouses and bruises. Thank you, Bic!47
No, thank you, anonymous reviewers! But the last I checked, ladies’ pens are still available for
purchase at a friendly online retailer near you, though packaging now includes a nod to “breast
cancer awareness,” or what is called pinkwashing – the co-optation of breast cancer to sell
products or provide cover for questionable political campaigns.48
Critics launched a similar online campaign against an IBM initiative called Hack a Hair Dryer. In
the company’s efforts to encourage girls to enter STEM professions, they relied on tired stereotypes
of girls and women as uniquely preoccupied with appearance and grooming:
Sorry @IBM i’m too busy working on lipstick chemistry and writing down formula with little
hearts over the i’s to #HackAHairDryer49
Niche marketing, in other words, has a serious downside when tailoring morphs into targeting and
stereotypical containment. Despite decades of scholarship on the social fabrication of group
identity, tech developers, like their marketing counterparts, are encoding race, ethnicity, and
gender as immutable characteristics that can be measured, bought, and sold. Vows of
colorblindness are not necessary to shield coded inequity if we believe that scientifically calculated
differences are somehow superior to crude human bias.
Consider this ad for ethnicity recognition software developed by a Russian company, NTech Lab –
which beat Google’s FaceNet as the world’s best face recognition system, with 73.3 percent accuracy
on 1 million faces (Figure 0.1).50 NTech explains that its algorithm has “practical applications in
retail, healthcare, entertainment and other industries by delivering accurate and timely
demographic data to enhance the quality of service”; this includes targeted marketing campaigns
and more.51
What NTech does not mention is that this technology is especially useful to law enforcement and
immigration officials and can even be used at mass sporting and cultural events to monitor
streaming video feed.52 This shows how multicultural representation, marketed as an
individualistic and fun experience, can quickly turn into criminalizing misrepresentation. While
the products of some companies, such as NTech, are already being adopted for purposes of policing, other
companies – for example Diversity, Inc., which I will introduce in the next chapter – are squarely in
the ethnic marketing business, and some are even developing techniques to try to bypass human
bias. What accounts for this proliferation of racial codification?
Why Now?
Today the glaring gap between egalitarian principles and inequitable practices is filled with subtler
forms of discrimination that give the illusion of progress and neutrality, even as coded inequity
makes it easier and faster to produce racist outcomes. Notice that I said outcomes and not beliefs,
because it is important for us to assess how technology can reinforce bias by what it does,
regardless of marketing or intention. But first we should acknowledge that intentional and targeted
forms of White supremacy abound!
As sociologist Jessie Daniels documents, White nationalists have ridden the digital wave with great
success. They are especially fond of Twitter and use it to spread their message, grow their network,
disguise themselves online, and generate harassment campaigns that target people of color,
especially Black women.53 Not only does the design of such platforms enable the “gamification of
hate” by placing the
burden on individual users to report harassers; Twitter’s relatively hands-off approach when it
comes to the often violent and hate-filled content of White supremacists actually benefits the
company’s bottom line.
This is a business model in which more traffic equals more profit, even if that traffic involves
violently crashing into other users – as when Ghostbusters star Leslie Jones received constant
threats of rape and lynching after noted White supremacist Milo Yiannopoulos rallied a digital mob
against her: a high-profile example of the macro-aggressions that many Black women experience
on social media every day.54 In Daniels’ words, “[s]imply put, White supremacists love Twitter
because Twitter loves them back.”55 Jones for her part reached out to her friend, Twitter’s CEO
Jack Dorsey; and Dorsey is now considering artificial intelligence (AI) of the kind used on
Instagram to identify hate speech and harassment.56
And, while the use of social media to amplify and spread obvious forms of racial hatred is an
ongoing problem that requires systematic interventions, it is also the most straightforward to
decode, literally. For example, White supremacists routinely embed seemingly benign symbols in
online content, cartoon characters or hand signs, that disseminate and normalize their
propaganda. However, these are only the most visible forms of coded inequity in which we can
identify the intentions of self-proclaimed racists. The danger, as I see it, is when we allow these
more obvious forms of virulent racism to monopolize our attention, when the equivalent of slow
death – the subtler and even alluring forms of coded inequity – get a pass. My book hopes to focus
more of our attention on this New Jim Code.
How could a world listening to Beyoncé, tuning in to Oprah, and pining for the presidency of Obama be … racist? But alas, “Black
faces in high places” is not an aberration but a key feature of a society structured by White
supremacy.58 In hindsight, we would not point to the prominence of Black performers and
politicians in the early twentieth century as a sign that racism was on the decline. But it is common
to hear that line of reasoning today.
Tokenism is not simply a distraction from systemic domination. Black celebrities are sometimes
recruited to be the (Black) face of technologies that have the potential to deepen racial inequities.
For example, in 2018 Microsoft launched a campaign featuring the rapper Common to promote AI:
Today, right now, you have more power at your fingertips than entire generations that came
before you. Think about that. That’s what technology really is. It’s possibility. It’s adaptability.
It’s capability. But in the end it’s only a tool. What’s a hammer without a person who swings it?
It’s not about what technology can do, it’s about what you can do with it. You’re the voice, and
it’s the microphone. When you’re the artist, it’s the paintbrush. We are living in the future we
always dreamed of … AI empowering us to change the world we see … So here’s the question:
What will you do with it?59
Savvy marketing on the part of Microsoft, for sure. What better aesthetic than a Black hip-hop
artist to represent AI as empowering, forward-thinking, cool – the antithesis of anti-Black
discrimination? Not to mention that, as an art form, hip-hop has long pushed the boundaries of
technological experimentation through beatboxing, deejaying, sampling, and more. One could
imagine corporate-sponsored rap battles between artists and AI coming to a platform near you.
The democratizing ethos of Common’s narration positions the listener as a protagonist in a world
of AI, one whose voice can direct the development of this tool even though rarely a day goes by
without some report on biased bots. So what is happening behind the screens?
A former Apple employee who noted that he was “not Black or Hispanic” described his experience
on a team that was developing speech recognition for Siri, the virtual assistant program. As they
worked on different English dialects – Australian, Singaporean, and Indian English – he asked his
boss: “What about African American English?” To this his boss responded: “Well, Apple products
are for the premium market.” And this happened in 2015, “one year after [the rapper] Dr. Dre sold
Beats by Dr. Dre to Apple for a billion dollars.” The irony, the former employee seemed to imply,
was that the company could somehow devalue and value Blackness at the same time.60 It is one
thing to capitalize on the coolness of a Black artist to sell (overpriced) products and quite another
to engage the cultural specificity of Black people enough to enhance the underlying design of a
widely used technology. This is why the notion that tech bias is “unintentional” or “unconscious”
obscures the reality – that there is no way to create something without some intention and
intended user in mind (a point I will return to in the next chapter).
For now, the Siri example helps to highlight how just having a more diverse team is an inadequate
solution to discriminatory design practices that grow out of the interplay of racism and capitalism.
Jason Mars, a Black computer scientist, expressed his frustration saying, “There’s a kind of
pressure to conform to the prejudices of the world … It would be interesting to have a black guy
talk [as the voice for his app], but we don’t want to create friction, either. First we need to sell
products.”61 How does the fist-pumping empowerment of Microsoft’s campaign figure in a world
in which the voices of Black programmers like Mars are treated as conflict-inducing? Who gets
muted in this brave new world? The view that “technology is a neutral tool” ignores how race also
functions like a tool, structuring whose literal voice gets embodied in AI. In celebrating diversity,
tokenistic approaches to tech development fail to acknowledge how the White aesthetic colors AI.
The “blandness” of Whiteness that some of my students brought up when discussing their names is
treated by programmers as normal, universal, and appealing. The invisible power of Whiteness
means that even a Black computer scientist running his own company who earnestly wants to
encode a different voice into his app is still hemmed in by the desire of many people for White-
sounding voices.
So, as we work to understand the New Jim Code, it is important to look beyond marketing rhetoric
to the realities of selling and targeting diversity. One such company, Diversity, Inc., which I will
discuss in more detail in Chapter 1, creates software that helps other companies and organizations
tailor marketing campaigns to different ethnic groups. In the process it delineates over 150 distinct
ethnicities and “builds” new ones for companies and organizations that want to market their goods
or services to a subgroup not already represented in the Diversity, Inc. database. Technologies do
not just reflect racial fault lines but can be used to reconstruct and repackage social groupings in
ways that seem to celebrate difference. But would you consider this laudable or exploitative,
opportunistic or oppressive? And who ultimately profits from the proliferation of ethnically
tailored marketing? These are questions we will continue to wrestle with in the pages ahead.
Finally, the New Jim Code is part of a broader push toward privatization, in which the effort to cut
costs and maximize profits, often at the expense of other human needs, is a guiding rationale for public
and private sectors alike.62 Computational approaches to a wide array of problems are seen as not
only good but necessary, and a key feature of cost-cutting measures is the outsourcing of decisions
to “smart” machines. Whether deciding which teacher to hire or fire or which loan applicant to
approve or decline, automated systems are alluring because they seem to remove the burden from
gatekeepers, who may be too overworked or too biased to make sound judgments. Profit
maximization, in short, is rebranded as bias minimization.
But the outsourcing of human decisions is, at once, the insourcing of coded inequity. As
philosopher and sociologist Herbert Marcuse remarked, “[t]echnological rationality has become
political rationality.” Considering Marcuse’s point, as people become more attuned to racial biases
in hiring, firing, loaning, policing, and a whole host of consequential decisions – an awareness we
might take to be a sign of social progress – this very process also operates as a kind of opportunity
for those who seek to manage social life more efficiently. The potential for bias creates a demand
for more efficient and automated organizational practices, such as the employment screening
carried out by AI – an example we will explore in more depth. Important to this story is the fact
that power operates at the level of institutions and individuals – our political and mental structures
– shaping citizen-subjects who prioritize efficiency over equity.
It is certainly the case that algorithmic discrimination is only one facet of a much wider
phenomenon, in which what it means to be human is called into question. What do “free will” and
“autonomy” mean in a world in which algorithms are tracking, predicting, and persuading us at
every turn? Historian Yuval Noah Harari warns that tech knows us better than we know ourselves,
and that “we are facing not just a technological crisis but a philosophical crisis.”63 This is an
industry with access to data and capital that exceeds that of sovereign nations, throwing even that
sovereignty into question when such technologies draw upon the science of persuasion to track,
addict, and manipulate the public. We are talking about a redefinition of human identity,
autonomy, core constitutional rights, and democratic principles more broadly.64
In this context, one could argue that the racial dimensions of the problem are a subplot of (even a
distraction from) the main action of humanity at risk. But, as philosopher Sylvia Wynter has
argued, our very notion of what it means to be human is fragmented by race and other axes of
difference. She posits that there are different “genres” of humanity that include “full humans, not-
quite humans, and nonhumans,”65 through which racial, gendered, and colonial hierarchies are
encoded. The pseudo-universal version of humanity, “the Man,” she argues, is only one form, and
one that is predicated on anti-Blackness. As such, Black humanity and freedom entail thinking and
acting beyond the dominant genre, which could include telling different stories about the past, the
present, and the future.66
But what does this have to do with coded inequity? First, it’s true, anti-Black technologies do not
necessarily limit their harm to those coded Black.67 However, a universalizing lens may actually
hide many of the dangers of discriminatory design, because in many ways Black people already
live in the future.68 The plight of Black people has consistently been a harbinger of wider processes
– bankers using financial technologies to prey on Black homeowners, law enforcement using
surveillance technologies to control Black neighborhoods, or politicians using legislative
techniques to disenfranchise Black voters – which then get rolled out on an even wider scale. An
#AllLivesMatter approach to technology is not only false inclusion but also poor planning,
especially by those who fancy themselves as futurists.
Many tech enthusiasts wax poetic about a posthuman world and, indeed, the expansion of big data
analytics, predictive algorithms, and AI animates digital dreams of living beyond the human mind
and body – even beyond human bias and racism. But posthumanist visions assume that we have
all had a chance to be human. How nice it must be … to be so tired of living mortally that one
dreams of immortality. Like so many other “posts” (postracial, postcolonial, etc.), posthumanism
grows out of the Man’s experience. This means that, by decoding the racial dimensions of
technology and the way in which different genres of humanity are constructed in the process, we
gain a keener sense of the architecture of power – and not simply as a top-down story of powerful
tech companies imposing coded inequity onto an innocent public. This is also about how we (click)
submit, because of all that we seem to gain by having our choices and behaviors tracked, predicted,
and racialized. The director of research at Diversity, Inc. put it to me like this: “Would you really
want to see a gun-toting White man in a Facebook ad?” Tailoring ads makes economic sense for
companies that try to appeal to people “like me”: a Black woman whose sister-in-law was killed in a
mass shooting, who has had to “shelter in place” after a gunman opened fire in a neighboring
building minutes after I delivered a talk, and who worries that her teenage sons may be assaulted
by police or vigilantes. Fair enough. Given these powerful associations, a gun-toting White man
would probably not be the best image for getting my business.
But there is a slippery slope between effective marketing and efficient racism. The same sort of
algorithmic filtering that ushers more ethnically tailored representations into my feed can also
redirect real estate ads away from people “like me.” This filtering has been used to show higher-
paying job ads to men more often than to women, to charge more for standardized test prep
courses to people in areas with a high density of Asian residents, and many other forms of coded
inequity. In cases of the second type especially, we observe how geographic segregation animates
the New Jim Code. While the gender wage gap and the “race tax” (non-Whites being charged more
for the same services) are nothing new, the difference is that coded inequity makes discrimination
easier, faster, and even harder to challenge, because there is not just a racist boss, banker, or
shopkeeper to report. Instead, the public must hold accountable the very platforms and
programmers that legally and often invisibly facilitate the New Jim Code, even as we reckon with
our desire for more “diversity and inclusion” online and offline.
Taken together, all these features of the current era animate the New Jim Code. While more
institutions and people are outspoken against blatant racism, discriminatory practices are
becoming more deeply embedded within the sociotechnical infrastructure of everyday life.
Likewise, the visibility of successful non-White individuals in almost every social arena can obscure
the reality of the systemic bias that still affects many people. Finally, the proliferation of ever more
sophisticated ways to use ethnicity in marketing goods, services, and even political messages
generates more buy-in from those of us who may not want to “build” an ethnicity but who are part
of the New Jim Code architecture nevertheless.
Race as Technology
This field guide explores not only how emerging technologies hide, speed up, or reinforce racism,
but also how race itself is a kind of technology70 – one designed to separate, stratify, and sanctify
the many forms of injustice experienced by members of racialized groups, but one that people
routinely reimagine and redeploy to their own ends.
Human toolmaking is not limited to the stone instruments of our early ancestors or to the sleek
gadgets produced by the modern tech industry. Human cultures also create symbolic devices that
structure society. Race, to be sure, is one of our most powerful tools – developed over hundreds of
years, varying across time and place, codified in law and refined through custom, and, tragically,
still considered by many people to reflect immutable differences between groups. For that reason,
throughout this book, we will consider not only how racial logics enter the design of technology but
how race itself operates as a tool of vision and division with often deadly results.
Racism is, let us not forget, a means to reconcile contradictions. Only a society that extolled “liberty
for all” while holding millions of people in bondage would require such a powerful ideology in order
to build a nation amid so startling a contradiction. How else could one declare “[w]e hold these
truths to be self-evident, that all men are created equal, that they are endowed by their Creator
with certain unalienable Rights,” and at the same time deny these rights to a large portion of the
population71 – namely by claiming that its members, by virtue of their presumed lack of humanity,
were never even eligible for those rights?72 Openly despotic societies, by contrast, are in no need of
the elaborate ideological apparatus that props up “free” societies. Freedom, as the saying goes, ain’t
free. But not everyone is required to pay its steep price in equal measure. The same is true of the
social costs of technological progress.
Consider that the most iconic revolt “against machines,” as it is commonly remembered, was
staged by the Luddites, textile workers in nineteenth-century England. Often remembered
as people who were out of touch and hated technology, the Luddites were actually protesting the
social costs of technological “progress” that the working class was being forced to accept. “To break
the machine was in a sense to break the conversion of oneself into a machine for the accumulating
wealth of another,” according to cultural theorist Imani Perry.73 At a recent conference titled “AI &
Ethics,” the communications director of a nonprofit AI research company, Jack Clark, pointed out
that, although the term “Luddite” is often used today as a term of disparagement for anyone who is
presumed to oppose (or even question!) automation, the Luddite response was actually directed at
the manner in which machinery was rolled out, without consideration for its negative impact on
workers and society overall. Perhaps the current era of technological transformation, Clark
suggested, warrants a similar sensibility – demanding a more careful and democratic approach to
technology.74
Shifting from nineteenth-century England to late twenty-first-century Mexico, sci-fi filmmaker
Alex Rivera wrestles with a similar predicament of a near future in which workers are not simply
displaced but inhabited by technology. Sleep Dealer (2008) is set in a dystopian world of
corporate-controlled water, militarized drones, “aqua-terrorists” (or water liberators, depending on
your sympathies), and a walled-off border between Mexico and the United States. The main
protagonist, Memo Cruz, and his co-workers plug networked cables into nodes implanted in their
bodies. This enables them to operate robots on the other side of the border, giving the United
States what it always wanted: “all the work without the workers.”75
Such fictional accounts find their real-life counterpart in “electronic sweatshops,” where companies
such as Apple, HP, and Dell treat humans like automata, reportedly requiring Chinese workers to
complete tasks every three seconds over a 12-hour period, without speaking or using the bathroom.76
Indeed, as I write, over 1,000 workers at Amazon in Spain have initiated a strike over wages and
rights, following similar protests in Italy and Germany in 2017. If we probed these exploitative labor
practices, the stated intentions would likely come couched in buzzwords such as “lower costs” and
“greater efficiency,” signaling a fundamental tension and paradox – the indispensable disposability of those
whose labor enables innovation. The language of intentionality only makes one side of this
equation visible, namely the desire to produce goods faster and cheaper, while giving people “the
opportunity to work.” This fails to account for the social costs of a technology in which global forms
of racism, caste, class, sex, and gender exploitation are the nuts and bolts of development.77
“Racing” after technology, in this context, is about the pursuit of efficiency, neutrality, Ready to
Update, Install Now, I Agree, and about what happens when we (click) submit too quickly.78
Whether it is in the architecture of machines or in the implementation of laws, racial logic imposes
“race corrections” that distort our understanding of the world.79 Consider the court decision in the
case against one Mr. Henry Davis, who was charged with destruction of property for bleeding on
police uniforms after officers incorrectly identified him as having an outstanding warrant and then
beat him into submission:
On and/or about the 20th day of September 20, 2009 at or near 222 S. Florissant within the
corporate limits of Ferguson, Missouri, the above-named defendant did then and there
unlawfully commit the offense of “property damage” to wit did transfer blood to the
uniform.80
When Davis sued the officers, the judge tossed out the case, saying: “a reasonable officer could
have believed that beating a subdued and compliant Mr. Davis while causing a concussion, scalp
lacerations, and bruising with almost no permanent damage, did not violate the Constitution.”81
The judge “race-corrected” our reading of the US Constitution, making it inapplicable to the likes
of Mr. Davis – a reminder that, whatever else we think racism is, it is not simply ignorance, or a not
knowing. Until we come to grips with the “reasonableness” of racism, we will continue to look for it
on the bloody floors of Charleston churches and in the dashboard cameras on Texas highways, and
overlook it in the smart-sounding logics of textbooks, policy statements, court rulings, science
journals, and cutting-edge technologies.
Beyond Techno-Determinism
In the following chapters we will explore not only how racism is an output of technologies gone
wrong, but also how it is an input, part of the social context of design processes. The mistaken view
that society is affected by but does not affect technological development is one expression of a
deterministic worldview. Headlines abound: “Is Facebook Making Us Lonely?”;82 “Genetic
Engineering Will Change Everything Forever”;83 “Pentagon Video Warns of ‘Unavoidable’
Dystopian Future for World’s Biggest Cities.”84 In each, you can observe the conventional
relationship proffered between technology and society. It is the view that such developments are
inevitable, the engine of human progress … or decline.
An extreme and rather mystical example of techno-determinism was expressed by libertarian
journalist Matt Ridley, who surmised that not even basic science is essential, because innovation
has a trajectory all its own:
Technology seems to change by a sort of inexorable, evolutionary progress, which we probably
cannot stop – or speed up much either … Increasingly, technology is developing the kind of
autonomy that hitherto characterized biological entities … The implications of this new way of
seeing technology – as an autonomous, evolving entity that continues to progress whoever is
in charge – are startling. People are pawns in a process. We ride rather than drive the
innovation wave. Technology will find its inventors, rather than vice versa.85
Whereas hard determinists such as Ridley posit that technology has a mind of its own, soft
determinists grant that it is at least possible for people to make decisions about technology’s
trajectory. However, they still imagine a lag period in which society is playing catch-up, adjusting
its laws and norms to the latest invention. In this latter view, technology is often depicted as
neutral, or as a blank slate developed outside political and social contexts, with the potential to be
shaped and governed through human action. But, as Manuel Castells argues, “[t]he dilemma of
technological determinism is probably a false problem, since technology is society, and society
cannot be understood or represented without its technological tools.”86
Considering Castells’ point about the symbiotic relationship between technology and society, this
book employs a conceptual toolkit that synthesizes scholarship from STS and critical race studies.
Surprisingly, these two fields of study are not often put into direct conversation. STS scholarship
opens wide the “Black box” that typically conceals the inner workings of socio-technical systems,
and critical race studies interrogates the inner workings of sociolegal systems. Using this hybrid
approach, we observe not only that any given social order is impacted by technological
development, as determinists would argue, but that social norms, ideologies, and practices are a
constitutive part of technical design.
Much of the early research and commentary on race and information technologies coalesced
around the idea of the “digital divide,” with a focus on unequal access to computers and the
Internet that falls along predictable racial, class, and gender lines. And, while attention to access is
vital, especially given numerous socioeconomic activities that involve using the Internet, the larger
narrative of a techno-utopia in which technology will necessarily benefit all undergirds the “digital
divide” focus. Naively, access to computers and the Internet is posited as a solution to inequality.87
And, to the extent that marginalized groups are said to fear or lack an understanding of technology,
the “digital divide” framing reproduces culturally essentialist understandings of inequality. A focus
on technophobia and technological illiteracy downplays the structural barriers to access, and also
ignores the many forms of tech engagement and innovation practiced by people of color.
In fact, with the advent of mobile phones and wireless laptops, African Americans and Latinxs are
more active web users than White people.88 Much of the African continent, in turn, is expected to
“leapfrog” past other regions, because it is not hampered by clunky infrastructure associated with
older technologies. In “The Revolution Will Be Digitized: Afrocentricity and the Digital Public
Sphere,” Anna Everett critiques “the overwhelming characterizations of the brave new world of
cyberspace as primarily a racialized sphere of Whiteness” that consigns Black people to the low-
tech sphere – when they are present at all.89 Other works effectively challenge the “digital divide”
framing by analyzing the racialized boundary constructed between “low” and “high tech.”90
Likewise, Lisa Nakamura (2013) challenges the model minority framing of Asian Americans as the
“solution” to the problem of race in a digital culture. She explains:
Different minorities have different functions in the cultural landscape of digital technologies.
They are good for different kinds of ideological work … seeing Asians as the solution and
blacks as the problem [i.e. cybertyping] is and has always been a drastic and damaging
formulation which pits minorities against each other …91
In contrast to critical race studies analyses of the dystopian digital divide and cybertyping, another
stream of criticism focuses on utopian notions of a “race-free future” in which technologies would
purportedly render obsolete social differences that are divisive now.92 The idea that, “[o]n the
Internet, nobody knows you’re a dog” (a line from Peter Steiner’s famous 1993 New Yorker
cartoon, featuring a typing canine) exemplifies this vision. However, this idea relies on a text-only
web, which has been complicated by the rise of visual culture on the Internet.93 For example, as
already mentioned, Jessie Daniels (2009) investigates the proliferation of White nationalist
ideology and communities online, unsettling any techno-utopian hopes for a colorblind approach
to social life in a digital era. And, as Alondra Nelson shows, both the digital divide and the raceless
utopia framings posit race as a liability, as “either negligible or evidence of negligence,” so that
“racial identity, and blackness in particular, is the anti-avatar of digital life.”94 It is also worth
noting how, in both conceptions, technology is imagined as impacting racial divisions – magnifying
or obliterating them – but racial ideologies do not seem to shape the design of technology.
Race critical code studies would have us look at how race and racism impact who has access to new
devices, as well as how technologies are produced in the first place. Two incisive works are
particularly relevant for thinking about the tension between innovation and containment. In
Algorithms of Oppression Safiya Noble (2018) argues that the anti-Black and sexist Google search
results – such as the pornographic images that come up when you search for “Black girls” – grow
out of a “corporate logic of either willful neglect or a profit imperative that makes money from
racism and sexism,” as key ingredients in the normative substrate of Silicon Valley. In a similar
vein, Simone Browne (2015), in Dark Matters: On the Surveillance of Blackness, examines how surveillance technologies
coproduce notions of Blackness and explains that “surveillance is nothing new to black folks”; from
slave ships and slave patrols to airport security checkpoints and stop-and-frisk policing practices,
she points to the “facticity of surveillance in black life.”95 Challenging a technologically determinist
approach, she argues that, instead of “seeing surveillance as something inaugurated by new
technologies,” to “see it as ongoing is to insist that we factor in how racism and anti-Blackness
undergird and sustain the intersecting surveillances of our present order.”96 As both Noble and
Browne emphasize and as my book will expand upon, anti-Black racism, whether in search results
or in surveillance systems, is not only a symptom or outcome, but a precondition for the fabrication
of such technologies.97
Race as technology: this is an invitation to consider racism in relation to other forms of domination
as not just an ideology or history, but as a set of technologies that generate patterns of social
relations, and these become Black-boxed as natural, inevitable, automatic. As such, this is also an
invitation to refuse the illusion of inevitability in which technologies of race come wrapped and to
“hotwire” more habitable forms of social organization in the process.98
Race critical code studies, as I develop it here, is defined not just by what we study but also by how
we analyze, questioning our own assumptions about what is deemed high theory versus pop
culture, academic versus activist, evidence versus anecdote. The point is not just to look beneath
the surface in order to find connections between these categories, but to pay closer attention to the
surfaces themselves. Here I draw upon the idea of thin description as a method for reading
surfaces – such as screens and skin – especially since a key feature of being racialized is “to be
encountered as a surface.”99 In anthropologist John L. Jackson’s formulation, thin description is
“about how we all travel … through the thicket of time and space, about the way … both of those
trajectories might be constructively thinned, theorized, concretized, or dislodged in service to
questions about how we relate to one another in a digital age.”100 He critiques the worship of thick
description within anthropology, arguing that it “tries to pass itself off as more than it is, as
embodying an expertise that simulates (and maybe even surpasses) any of the ways in which the
people being studied might know themselves … one that would pretend to see everything and,
therefore, sometimes sees less than it could.”101
Thinness, in this way, attempts a humble but no less ambitious approach to knowledge production.
Thinness allows greater elasticity, engaging fields of thought and action too often disconnected.
This analytic flexibility, in my view, is an antidote to digital disconnection, tracing links between
individual and institutional, mundane and spectacular, desirable and deadly in a way that troubles
easy distinctions.
At the same time, thin description is a method of respecting particular kinds of boundaries.
According to Jackson,
If thick description imagines itself able to amass more and more factual information in service
to stories about cultural difference, “thin description” doesn’t fall into the trap of
conceptualizing its task as providing complete and total knowledge … So, there are secrets you
keep. That you treat very preciously. Names of research subjects you share but many more you
do not. There is information veiled for the sake of story. For the sake of much more.102
If the New Jim Code seeks to penetrate all areas of life, extracting data, producing hierarchies, and
predicting futures, thin description exercises a much needed discretion, pushing back against the
all-knowing, extractive, monopolizing practices of coded inequity.
Thinness is not an analytic failure, but an acceptance of fragility … a methodological counterpoint
to the hubris that animates so much tech development. What we know today about coded inequity
may require a complete rethinking, as social and technical systems change over time. Let’s not
forget: racism is a mercurial practice, shape-shifting, adept at disguising itself in progressive-sounding
rhetoric. If our thinking becomes too weighed down by our own assuredness, we are likely to miss
the avant-garde stylings of NextGen Racism as it struts by.
the progressive narratives that surround technology and encourages us to examine how racism is
often maintained or perpetuated through technical fixes to social problems. And finally, the next
chapters examine the different facets of coded inequity with an eye toward designing them
differently. Are you ready?
Notes
1. Kaba describes “grounded hope” as a philosophy of living that must be practiced every day; it differs from optimism and does not protect one from feeling sadness, frustration, or anger. See her “Beyond Prisons” podcast, episode 19, at
https://shadowproof.com/2018/01/05/beyond-prisons-episode-19-hope-is-a-discipline-feat-
mariame-kaba.
2. Brown 2015, p. 26.
3. Inevitably, my students turn the question back on me: “Tell us about your name, prof?” As I was
born to an African American father and a Persian Indian mother, my parents wanted me to have
a first name with Arabic origins, but one that was short enough, so English speakers wouldn’t
butcher it. They were mostly successful, except that my friends still call me “Ru” … nicknames
are a form of endearment after all. What I find amusing these days is getting messages
addressed to “Mr. Benjamin” or “Mr. Ruha.” Since Benjamin is more often used as a masculine
first name, people whom I have never met routinely switch the order in their heads and mis-
gender me as a result. I sometimes wonder whether I receive some fleeting male privilege –
more deference, perhaps. This, after all, is the reason why some of my female students say their
parents gave them more gender-neutral names: to delay (if not diminish) sexist assumptions
about their qualifications and capacities. A similar rationale holds for my Black, Asian, and Latinx
students with stereotypically White-sounding names: “My parents didn’t want me to have a hard
time,” “They wanted me to have a normal American name” (where “American” is always coded
“White”).
4. The Apples and Norths of the world tend to experience less ridicule and more fascination, owing to their celebrity parentage – which tells us that there is nothing intrinsic that makes for a “good” name.
5. So, is the solution for those with racially stigmatized names to code-switch by adopting names
that offer more currency on the job market? Or does this simply accommodate bias and leave it
in place? In a number of informal experiments, job seekers put this idea to the test. Jose Zamora
dropped one letter from his first name and found that “Joe Zamora,” with all the same education
and credentials, magically started hearing from employers. Similarly, after two years of
searching for a job, Yolanda Spivey changed the name on her résumé to “Bianca White,” and
suddenly her inbox was full of employers interested in interviewing her. What stunned Yolanda
most was that, while the same résumé was posted with her real name on the employment
website, employers were repeatedly calling “Bianca,” desperate to get an interview.
6. When the study was replicated in France, another team found that Christian-sounding names
had a similar value over and above Muslim-sounding names, and they could not explain the
difference through other factors such as experience or education.
7. Caliskan et al. 2017. Fun fact: did you know that the words “algorithm” and “algebra” both trace back to a Persian astronomer and mathematician, Muhammad Ibn Musa al-Khwarizmi – the former from his last name, Latinized as Algorithmi, the latter from al-jabr in the title of his treatise on equations? I suspect, given how his name would likely trigger
surveillance systems today, he would cheer on algorithmic audits that are trying to prevent such
biased associations!
8. I’m thinking of Browne’s (2015) “racializing surveillance,” Broussard’s (2018)
“technochauvinism,” Buolamwini’s (2016) “coded gaze,” Eubanks’ (2018) “digital poorhouse,”
Noble’s (2018) “algorithms of oppression and technological redlining,” or Wachter-Boettcher’s
(2017) “algorithmic inequity” (among other kindred formulations) as “cousin concepts” related
to the New Jim Code.
9. O’Neil 2016, p. 23.
10. Another example is Wilmer Catalan-Ramirez, an undocumented Chicago resident who was
listed without his knowledge in the city’s gang database as a member of two rival gangs (Saleh
2018).
11. See the CalGang Criminal Intelligence System report at http://www.voiceofsandiego.org/wp-
content/uploads/2016/08/CalGangs-audit.pdf. See also Harvey 2016.
12. Harvey 2016.
13. Muhammad 2011, p. 20, emphasis added; see also Zuberi 2003.
14. Wacquant 2017, p. 2.
15. Wacquant 2017; emphasis added.
16. Sweeney 2013.
17. boyd and Elish 2018.
18. Baldwin 1998, p. 723.
19. In her letter to Zuckerberg, Milner (2018) continues:
“Histories of redlining, segregation, voter disenfranchisement and state sanctioned violence
have not disappeared, but have been codified and disguised through new big data regimes.”
20. This refers to a classic line in the film The Wizard of Oz, in which the Wizard attempts to conceal his machinations: “Pay no attention to the man behind the curtain.”
21. boyd and Elish 2018.