China Report (2021)
ARTICLE 19 works for a world where all people everywhere can freely express themselves and actively
engage in public life without fear of discrimination. We do this by working on two interlocking freedoms,
which set the foundation for all our work. The Freedom to Speak concerns everyone’s right to express and
disseminate opinions, ideas, and information through any means, as well as to disagree with, and question,
power-holders. The Freedom to Know concerns the right to demand and receive information from power-holders,
for transparency, good governance, and sustainable development. When either of these freedoms
comes under threat, through the failure of power-holders to adequately protect them, ARTICLE 19 speaks with
one voice, through courts of law, through global and regional organisations, and through civil society
wherever we are present.
ARTICLE 19
60 Farringdon Road
UK
www.article19.org
A19/DIG/2021/001
Text and analysis Copyright ARTICLE 19, November 2020 (Creative Commons License 3.0)
____________________________________________________________________________
About Creative Commons License 3.0: This work is provided under the Creative Commons Attribution-
Non-Commercial-ShareAlike 2.5 license. You are free to copy, distribute and display this work and to
make derivative works, provided you: 1) give credit to ARTICLE 19; 2) do not use this work for commercial
purposes; 3) distribute any works derived from this publication under a license identical to this one. To
access the full legal text of this license, please visit: http://creativecommons.org/licenses/by-nc-sa/2.5/legalcode
ARTICLE 19 · Emotional Entanglement: China’s emotion recognition market and its implications for human rights ·2021
Contents
Executive Summary 5
Acknowledgements 9
Glossary 10
List of Abbreviations 11
Introduction 12
Why China? 13
Methodology 14
Use Cases 17
Public Security 19
Foreign Emotion Recognition Precursors as Motivation 19
Three Types of Security-Use Contexts and Their Rationales 19
Public Security Implementations of Emotion Recognition 20
Driving Safety 23
In-Vehicle Emotion Recognition 23
Insurance Companies and Emotion Recognition of Drivers 23
Emotion Recognition Outside of Cars 24
State and Tech Industry Interest 24
Education 25
Emotion and Edtech 25
China’s Push for ‘AI+Education’ 25
Chinese Academic Research on Emotion Recognition in Education 25
China’s Market for Emotion Recognition in Education 26
Emotion Recognition in Online and In-Person Classrooms 29
Students’ Experiences of Emotion Recognition Technologies 30
Teachers’ Experiences of Emotion Recognition Technologies 31
School Administrators’ Perceptions of Emotion Recognition Technologies 32
Parents’ Perceptions of Emotion Recognition Technologies 34
Right to Privacy 36
Right to Protest 38
Non-Discrimination 38
Function Creep 39
National Law 45
Chinese Constitution 45
Data Protection 45
Instruments 46
Biometric Data 47
Standardisation 47
Ethical Frameworks 48
Recommendations 49
To the Chinese Government 50
To the International Community 50
To the Private Companies Investigated in this Report 50
To Civil Society and Academia 50
Endnotes 51
Executive Summary
In this report, ARTICLE 19 provides evidence and analysis of the burgeoning market for emotion recognition technologies in China and its detrimental impact on individual freedoms and human rights, in particular the right to freedom of expression. Unlike better-known biometric applications, like facial recognition, that focus on identifying individuals, emotion recognition purports to infer a person’s inner emotional state. Applications are increasingly integrated into critical aspects of everyday life: law enforcement authorities use the technology to identify ‘suspicious’ individuals, schools monitor students’ attentiveness in class, and private companies determine people’s access to credit.

Our report demonstrates the need for strategic and well-informed advocacy against the design, development, sale, and use of emotion recognition technologies. We emphasise that the timing of such advocacy – before these technologies become widespread – is crucial for the effective promotion and protection of people’s rights, including their freedoms to express and opine. High school students should not fear the collection of data on their concentration levels and emotions in classrooms, just as suspects undergoing police interrogation must not have assessments of their emotional states used against them in an investigation. These are but a glimpse of uses for emotion recognition technologies being trialled in China.

This report describes how China’s adoption of emotion recognition is unfolding within the country, and the prospects for the technology’s export. It aims to:

1. Unpack and analyse the scientific foundations on which emotion recognition technologies are based;
2. Demonstrate the incompatibility between emotion recognition technology and international human rights standards, particularly freedom of expression, and the potential and ongoing detrimental impact of this technology on people’s lives;
3. Provide rich detail on actors, incentives, and the nature of applications within three emotion recognition use cases in the Chinese market: public security, driving, and education;
4. Analyse the legal framework within which these use cases function; and
5. Set out recommendations for stakeholders, particularly civil society, on how to respond to the human rights threats posed by emotion recognition technologies in China.

This report will better equip readers to understand the precise ways in which China’s legal, economic, and cultural context is different, the ways in which it is not, and why such distinctions matter. Each use case bears its own social norms, laws, and claims for how emotion recognition improves upon an existing process. Likewise, the interaction between pre-existing Chinese surveillance practices and these use cases shapes the contributions emotion recognition will make in China and beyond.

The implications of the report’s findings are twofold. First, a number of problematic assumptions (many based on discredited science) abound amongst stakeholders interested in developing and/or deploying this technology. This report unpacks and critically analyses the human rights implications of emotion recognition technologies and the assumptions implicit in their marketing in China. Second, Chinese tech firms’ growing influence in international technical standards-setting could encompass standards for emotion recognition. Using a human rights lens, the report addresses the most problematic views and practices that, if uncontested, could become codified in technical standards – and therefore reproduced in technology at a massive scale – at technical standard-setting bodies, like the International Telecommunications Union (ITU) and the Institute of Electrical and Electronics Engineers (IEEE).
Some of the main findings from the research on deployment of emotion recognition technology in China include the following:

The design, development, sale, and use of emotion recognition technologies are inconsistent with international human rights standards. While emotion recognition is fundamentally problematic, given its discriminatory and discredited scientific foundations, concerns are further exacerbated by how it is used to surveil, monitor, control access to opportunities, and impose power, making the use of emotion recognition technologies untenable under international human rights law (pp. 36–44).

The opaque and unfettered manner in which emotion recognition is being developed risks depriving people of their rights to freedom of expression, privacy, and the right to protest, amongst others. Our investigation reveals little evidence of oversight mechanisms or public consultation surrounding emotion recognition technologies in China, which contributes significantly to the speed and scale at which use cases are evolving. Mainstream media is yet to capture the nuance and scale of this burgeoning market, and evidence collection is crucial at this moment. Together, these factors impede civil society’s ability to advocate against this technology.

Emotion recognition’s pseudoscientific foundations render this technology untenable, as documented in this report. Even as some stakeholders claim that this technology can get better with time, given the pseudoscientific and racist foundations of emotion recognition on one hand, and fundamental incompatibility with human rights on the other, the design, development, deployment, sale, and transfer of these technologies must be banned.

Emotion recognition technologies’ flawed and long-discredited scientific assumptions do not hinder their market growth in China. Three erroneous assumptions underlie justifications for the use and sale of emotion recognition technologies: that facial expressions are universal, that emotional states can be unearthed from them, and that such inferences are reliable enough to be used to make decisions. Scientists across the world have discredited all three assumptions for decades, but this does not seem to hinder the experimentation and sale of emotion recognition technologies (pp. 18–35).

Chinese law enforcement and public security bureaux are attracted to using emotion recognition software as an interrogative and investigatory tool. Some companies seek procurement order contracts for state surveillance projects (pp. 18–22) and train police to use their products (p. 22). Other companies appeal to law enforcement by insinuating that their technology helps circumvent legal protections concerning self-incrimination for suspected criminals (pp. 42–43).

While some emotion recognition companies allege they can detect sensitive attributes, such as mental health conditions and race, none have addressed the potentially discriminatory consequences of collecting this information in conjunction with emotion data. Some companies’ application programming interfaces (APIs) include questionable racial categories for undisclosed reasons (p. 41). Firms that purportedly identify neurological diseases and psychological disorders from facial emotions (pp. 41–42) fail to account for how their commercial emotion recognition applications might factor in these considerations when assessing people’s emotions in non-medical settings, like classrooms.
Chinese emotion recognition companies’ stances on the relationship between cultural background and expressions of emotion influence their products. This can lead to problematic claims about emotions being presented in the same way across different cultures (p. 40) – or, conversely, to calls for models trained on ‘Chinese faces’ (p. 41). The belief that cultural differences do not matter could result in inaccurate judgements about people from cultural backgrounds that are underrepresented in the training data of these technologies – a particularly worrying outcome for ethnic minorities.

Chinese local governments’ budding interest in emotion recognition applications confers advantages on both startups and established tech firms. Law enforcement institutions’ willingness to share their data with companies for algorithm-performance improvement (p. 22), along with local government policy incentives (pp. 18, 20, 22, 24, 25, 33), enables the rapid development and implementation of emotion recognition technologies.

The emotion recognition market is championed not only by technology companies but also by partnerships linking academia, tech firms, and the state. Assertions about emotion recognition methods and applications travel from academic research papers to companies’ marketing materials (pp. 22, 25–26) and to the tech companies’ and state’s public justifications for use (pp. 20, 22–33). These interactions work in tandem to legitimise uses of emotion recognition that have the potential to violate human rights.

None of the Chinese companies researched here appears to have immediate plans to export their products. Current interest in export seems low (p. 40), although companies that already have major markets abroad, such as Hikvision and Huawei, are working on emotion recognition applications (pp. 23, 27, 29–33, 40).

People targeted by these technologies in China – particularly young adults (pp. 30–31) – predominantly report feeling distrust, anxiety, and indifference regarding current emotion recognition applications in education. While some have criticised emotion recognition in education-use scenarios (pp. 30–31, 34), it is unclear whether there will be ongoing pushback as awareness spreads.

Civil society strategies for effective pushback will need to be tailored to the context of advocacy. Civil society interventions can focus on debunking emotion recognition technology’s scientific foundations, demonstrating the futility of using it, and/or demonstrating its incompatibility with human rights. The strategy (or strategies) that civil society actors eventually employ may need to be adopted in an agile manner that considers the geographic, political, social, and cultural context of use.
Acknowledgements
Glossary
Biometric data: Data relating to physical, physiological, or behavioural characteristics of a natural person, from which identification templates of that natural person – such as faceprints or voiceprints – can be extracted. Fingerprints have the longest legacy of use for forensics and identification,1 while more recent sources include (but are not limited to) face, voice, retina and iris patterns, and gait.

Emotion recognition: A biometric application that uses machine learning in an attempt to identify individuals’ emotional states and sort them into discrete categories, such as anger, surprise, fear, happiness, etc. Input data can include individuals’ faces, body movements, vocal tone, spoken or typed words, and physiological signals (e.g. heart rate, blood pressure, breathing rate).

Facial recognition: A biometric application that uses machine learning to identify (1:n matching) or verify (1:1 matching) individuals’ identities using their faces. Facial recognition can be done in real time or asynchronously.

Machine learning: A popular technique in the field of artificial intelligence that has gained prominence in recent years. It uses algorithms trained with vast amounts of data to improve a system’s performance at a task over time.
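The discrete-category pipeline these definitions describe can be reduced to two steps: extract a numeric feature vector from a face, then map it onto one of a fixed set of emotion labels. The toy sketch below illustrates that structure only; the feature vectors, centroid values, and function names are invented for illustration and are not drawn from any vendor’s system discussed in this report.

```python
from math import dist

# Invented three-number "facial feature" centroids, one per discrete
# emotion category -- the kind of fixed label set the glossary describes.
EMOTION_CENTROIDS = {
    "anger":     (0.9, 0.1, 0.2),
    "surprise":  (0.2, 0.9, 0.7),
    "fear":      (0.3, 0.8, 0.1),
    "happiness": (0.1, 0.2, 0.9),
}

def classify_emotion(features):
    """Assign the nearest discrete emotion label to a feature vector.

    Note the structural flaw mirrored here: there is no 'none of the
    above' -- every input face receives exactly one emotion label.
    """
    return min(EMOTION_CENTROIDS,
               key=lambda label: dist(features, EMOTION_CENTROIDS[label]))

print(classify_emotion((0.15, 0.25, 0.85)))  # nearest centroid: "happiness"
```

Even this minimal sketch makes visible the forced-choice design the report critiques: the classifier always returns one of the predefined categories, regardless of whether the input expresses any emotion at all.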
List of Abbreviations
AI Artificial intelligence
1. Introduction
Human rights organisations, including ARTICLE 19, have argued that public and private actors’ use of biometrics poses profound challenges for individuals in their daily lives, from wrongfully denying welfare benefits to surveilling and tracking vulnerable individuals without justification. As they are currently used, biometric technologies thus pose disproportionate risks to human rights, in particular to individuals’ freedom of expression, privacy, freedom of assembly, non-discrimination, and due process. A central challenge for civil society actors and policymakers thus far is that pushback against these technologies is often reactive rather than proactive, reaching a crescendo only after the technologies have become ubiquitous.4

In an attempt to encourage pre-emptive and strategic advocacy in this realm, this report focuses on emotion recognition, a relatively under-observed application of biometric technology, which is slowly entering both public and private spheres of life. Emerging from the field of affective computing,5 emotion recognition is projected to be a USD65 billion industry by 2024,6 and is already cropping up around the world.7 Unlike any ubiquitous biometric technology, it claims to infer individuals’ inner feelings and emotional states – a ground truth about a subjective, context-dependent state of being. While face recognition asks who we are, emotion recognition is chiefly concerned with how we feel. Many believe this is not possible to prove or disprove.8

1. They all hint at how ‘smart city’ marketing will encompass emotion recognition.

2. They all take place in spaces that people often have no choice in interacting with, leaving no substantial consent or opt-out mechanisms for those who do not want to participate.

3. Although major Chinese tech companies – including Baidu and Alibaba – are experimenting with emotion recognition, this report focuses on the majority of commercial actors in the field: smaller startups that go unnoticed in major English-language media outlets, but that have nonetheless managed to link up with academics and local governments to develop and implement emotion recognition.

Why China?

This report focuses on China because it is a dominant market with the technologically skilled workforce, abundant capital, market demand, political motivations, and export potential for artificial intelligence (AI) that could enable rapid diffusion of emotion recognition technologies.9 Over the past few years, Chinese tech companies have fuelled an international boom in foreign governments’ acquisition of surveillance technology.10 China’s One Belt, One Road (OBOR) initiative has enabled the wide-scale implementation of Huawei’s Safe Cities policing platforms and Hikvision facial recognition cameras, in democracies and autocracies alike, without accompanying public deliberation or safeguards. In the context of facial recognition in particular, policymakers were taken aback by how quickly the Chinese companies that developed this technology domestically grew and started to export their products to other countries.11

Discussing emotion recognition technologies, Rosalind Picard – founder of the major affective computing firm Affectiva, and one of the leading researchers in the field – recently commented:

“The way that some of this technology is being used in places like China, right now […] worries me so deeply, that it’s causing me to pull back myself on a lot of the things that we could be doing, and try to get the community to think a little bit more about [...] if we’re going to go forward with that, how can we do it in a way that puts forward safeguards that protect people?”12

To effectively advocate against emotion recognition technologies, it is crucial to concentrate on the motivations and incentives of those Chinese companies that are proactive in proposing international technical standards for AI applications, including facial recognition, at convening bodies like the ITU.13 Internationally, a head start on technical standards-setting could enable Chinese tech companies to develop interoperable systems and pool data, grow more globally competitive, lead international governance on AI safety and ethics, and obtain the ‘right to speak’ that Chinese representatives felt they lacked when technical standards for the Internet were set.14 This codification reverberates throughout future markets for this particular technology, expanding the technical standards’ worldwide influence over time.

Focusing on the Chinese emotion recognition market, in particular, provides an opportunity to pre-empt how China’s embrace of emotion recognition can – and will – unfold outside of China’s borders. If international demand for emotion recognition increases, China’s pre-existing market for technology exports positions a handful of its companies to become major suppliers, following on the heels of their dominance of the facial recognition market.15 With this report, ARTICLE 19 therefore seeks to galvanise civil society attention to the increasing use of emotion recognition technologies, their pseudoscientific underpinnings, and the fundamental inconsistency of their commercial applications with international human rights standards. We seek to do so early in emotion recognition’s commercialisation, before it is widespread globally, to pre-empt the blunt and myopic ways in which adoption of this technology might grow.

Methodology

The research for this report began with a literature review built from Mandarin-language sources in two Chinese academic databases: China National Knowledge Infrastructure and the Superstar Database (超星期刊). Search keywords included terms related to emotion recognition (情绪识别), micro-expression recognition (微表情识别), and affective computing (情感计算). In parallel, the authors consulted the Chinese tech company directory Tianyancha (天眼查), where 19 Chinese companies were tagged as working on emotion recognition. Of these, eight were selected for further research because they provided technology that fit within the three use cases the report covers. The additional 19 companies investigated came up in academic and news media articles that mentioned the eight firms chosen from the Tianyancha set, and were added into the research process. Google, Baidu, and WeChat Mandarin-language news searches for these companies, as well as for startups and initiatives unearthed in the academic literature, formed the next stage of source collection.

Finally, where relevant, the authors guided a research assistant to find English-language news and academic research that shed light on comparative examples.

We mention and analyse these 27 companies based on the credibility and availability of source material, both within and outside company websites, and examples of named institutions that have pilot tested or fully incorporated these companies’ products. For a few companies, such as Miaodong
in Guizhou, news coverage is not recent and it is unclear whether the company is still operating. Nonetheless, such examples were included alongside more recently updated ones to highlight details that are valuable to understanding the broader trend of emotion recognition applications, such as access to law enforcement data for training emotion recognition models, or instances where public pushback led to modification or removal of a technology. Even if some of these companies are defunct, a future crop of competitors is likely to follow in their stead.

Finally, although other types of emotion recognition that do not rely on face data are being used in China, the report focuses primarily on facial expression-based and multimodal emotion recognition that includes face analysis, as our research revealed these two types of emotion recognition are more likely to be used in high-stakes settings.

Background to Emotion Recognition

What Are Emotion Recognition Technologies?

Emotion recognition technologies purport to infer an individual’s inner affective state based on traits such as facial muscle movements, vocal tone, body movements, and other biometric signals. They use machine learning (the most popular technique in the field of AI) to analyse facial expressions and other biometric data and subsequently infer a person’s emotional state.16

Much like other biometric technologies (like facial recognition), the use of emotion recognition involves the mass collection of sensitive personal data in invisible and unaccountable ways, enabling the tracking, monitoring, and profiling of individuals, often in real time.17

Some Chinese companies describe the link between facial recognition technologies (based on comparing faces to determine a match) and emotion recognition (analysing faces and assigning emotional categories to them) as a matter of incremental progress. For example, Alpha Hawkeye (阿尔法鹰眼), a Chinese company that supplies emotion recognition for public security, characterises it as ‘biometrics 3.0’,18 while a write-up of another company, Xinktech (云思创智), predicts ‘the rise of emotion recognition will be faster than the face recognition boom, because now there is sufficient computing power and supporting data. The road to emotion recognition will not be as long.’19

How Reliable is Emotion Recognition?

Two fundamental assumptions undergird emotion recognition technologies: that it is possible to gauge a person’s inner emotions from their external expressions, and that such inner emotions are both discrete and uniformly expressed across the world. This idea, known as Basic Emotion Theory (BET), draws from psychologist Paul Ekman’s work from the 1960s. Ekman suggested humans across cultures could reliably discern emotional states from facial expressions, which he claimed were universal.20 Ekman and Friesen also argued that micro-momentary expressions (‘micro-expressions’), or facial expressions that occur briefly in response to stimuli, are signs of ‘involuntary emotional leakage [which] exposes a person’s true emotions’.21

BET has been wildly influential, even inspiring popular television shows and films.22 However, scientists have investigated, contested, and largely rejected the validity of these claims since the time of their publication.23 In a literature review of 1,000 papers’ worth of evidence exploring the link between emotional states and expressions, a panel of authors concluded:

“very little is known about how and why certain facial movements express instances of emotion, particularly at a level of detail sufficient for such conclusions to be used in important, real-world applications. Efforts to simply ‘read out’ people’s internal states from an analysis of their facial movements alone, without considering various aspects of context, are at best incomplete and at worst entirely lack validity, no matter how sophisticated the computational algorithms”.24
Another empirical study sought to find out whether the assumption that facial expressions are a consequence of emotions was valid, and concluded that ‘the reported meta-analyses for happiness/amusement (when combined), surprise, disgust, sadness, anger, and fear found that all six emotions were on average only weakly associated with the facial expressions that have been posited as their UEs [universal facial expressions]’.25

The universality of emotional expressions has also been discredited through the years. For one, researchers found that Ekman’s methodology to determine universal emotions inadvertently primed subjects (insinuated the ‘correct’ answers) and eventually distorted results.26 The ‘natural kind’ view of emotions as something nature has endowed humans with, independent of our perception of emotions and their cultural context, has been strongly refuted as a concept that has ‘outlived its scientific value and now presents a major obstacle to understanding what emotions are and how they work’.27

Finally, empirical studies have disproved the notion of micro-expressions as reliable indicators of emotions, instead finding them to be both unreliable (due to brevity and infrequency) and discriminatory.28 Some scholars have proposed a ‘minimum universality’ of emotions, insisting ‘the finite number of ways that facial muscles can move creates a basic template of expressions that are then filtered through culture to gain meaning’.29 This is corroborated by a recent study from the University of Glasgow, which found that culture shapes the perception of emotions.30 Yet even theories of minimum universality call the utility of AI-driven emotion recognition systems into question. One scholar has suggested that, even if such technologies ‘are able to map each and every human face perfectly, the technical capacities of physiological classification will still be subject to the vagaries of embedded cultural histories and contemporary forms of discrimination and of racial ordering’.31

Even so, academic studies and real-world applications continue to be built on the basic assumptions about emotional expression discussed above, despite these assumptions being rooted in dubious scientific studies and a longer history of discredited and racist pseudoscience.32

Emotion recognition’s application to identify, surveil, track, and classify individuals across a variety of sectors is thus doubly problematic – not just because of its dangerous applications, but also because it doesn’t even work as its developers and users claim.33
2. Use Cases
Paving the Way for Emotion Recognition in China

As one of the world’s biggest adopters of facial recognition cameras, China has come under scrutiny for its tech firms’ far-reaching international sale of surveillance technology.34 The normalisation of surveillance in Chinese cities has developed in parallel with the government’s crackdown on the ethnic minority Uighur population in Xinjiang province. For Xinjiang’s majority-Muslim population, security cameras, frequent police inspections, law enforcement’s creation of Uighur DNA and voiceprint databases, and pervasive Internet monitoring and censorship of content about or related to Islam are inescapable.35

One state-sponsored security venture, the ‘Sharp Eyes’ project (雪亮工程), has come up in relation to three of the ten companies investigated in this section. Sharp Eyes is a nationwide effort to blanket Chinese cities and villages with surveillance cameras, including those with licence plate-reading and facial recognition capabilities.36 The project, which the Central Committee of the Chinese Communist Party approved in 2016, relies in part on the government procurement-order bidding process to allocate billions of yuan in funding to (foreign and domestic) firms that build and operate this infrastructure.37

A homologous concept resurgent in contemporary surveillance is the ‘Fengqiao experience’ (枫桥经验), a Mao Zedong-contrived practice in which ordinary Chinese citizens monitored and reported each other’s improper behaviour to the authorities. In a story that has come to exemplify Fengqiao, rock musician Chen Yufan was arrested on drug charges when a ‘community tip’ from within his residential area made its way to the authorities.38 President Xi Jinping has praised the return of the Fengqiao experience through neighbourhood-level community watch groups that report on suspected illegal behaviour. Though senior citizens are the backbone of this analogue surveillance, police have begun to head up watch groups, and technology companies have capitalised on the Fengqiao trend by developing local apps incentivising people to report suspicious activity in exchange for rewards, such as discounted products and services from major tech firms. In late 2018, a conference on digital innovation and social management, The New Fengqiao Experience, convened police officers and companies including Alibaba.39

Although reporting on Sharp Eyes and Fengqiao-style policing has not yet touched on emotion recognition, both are relevant for three reasons. For one, Sharp Eyes and the Fengqiao project exemplify templates for how multiple national government organisations, tech companies, and local law enforcement unite to implement surveillance technology at scale. Second, companies specialising in emotion recognition have begun either to supply technology to these projects or to incorporate both Sharp Eyes and Fengqiao into their marketing, as seen below with the companies Alpha Hawkeye (阿尔法鹰眼), ZNV Liwei, and Xinktech.40 Finally, Chinese tech firms’ commercial framing of emotion recognition as a natural next step in the evolution of biometric technology applications opens up the possibility that emotion recognition will be integrated in places where facial recognition has been widely implemented. Independent researchers are already using cameras with image resolution sufficiently high to conduct face recognition in experiments to develop emotion and gesture recognition.41

It is important to note that interest in multimodal emotion recognition is already high. Media coverage of the company Xinktech predicts that micro-expression recognition will become a ubiquitous form of data collection, fuelling the rise of ‘multimodal technology [as an] inevitable trend, a sharp weapon, and a core competitive advantage in the development of AI’.42 By one estimate, the potential market for multimodal emotion recognition technologies is near 100 billion yuan (over USD14.6 billion).43 How did multimodality garner such hype this early in China’s commercial development of emotion recognition? Part of the answer lies in how Chinese tech firms depict foreign examples of emotion recognition as having been unilateral successes – ignoring the scepticism that terminated some of these initiatives.
2. Use cases
ARTICLE 19 · Emotional Entanglement: China’s emotion recognition market and its implications for human rights ·2021
Another paper from two researchers at Sichuan Police College envisioned a Tibetan border-patrol inspection system that would fit both the ‘early warning’ and follow-up inspection functions.53 They argued that traditional border-security inspections can be invasive and time-consuming, and that the longer they take, the more the individuals being inspected feel they are being discriminated against.54 Yet if AI could be used to identify suspicious micro-expressions, they reasoned, presumably fewer people would be flagged for additional inspection, and the process would be less labour-intensive for security personnel. Moreover, the speed of the automated process is itself presented as somehow ‘fairer’ for those under inspection by taking up less of their time. In a similar framing to the Hubei Police Academy paper, the authors believed their system would be able to root out ‘Tibetan independence elements’ on the basis of emotion recognition.55 These disconcerting logical leaps are replicated in how the companies themselves market their products.

Public Security Implementations of Emotion Recognition

News coverage and marketing materials for the ten companies described in Table 1 flesh out the context in which emotion recognition applications are developed.

According to one local news story, authorities at the Yiwu Railway Station (Zhejiang) used Alpha Hawkeye’s emotion recognition system to apprehend 153 so-called ‘criminals’ between October 2014 and October 2015.56 The headline focused on the more mundane transgression that these types of systems tend to over-police: individuals’ possession of two different state ID cards. Alpha Hawkeye’s products have reportedly been used in both Sharp Eyes projects and in the OBOR ‘counterterrorism industry’.57 ZNV Liwei (ZNV力维) is also reported to have contributed technology to the Sharp Eyes surveillance project and to have provided police in Ningxia, Chongqing, Shenzhen, Shanghai, and Xinjiang with other ‘smart public security products’, though the company’s website does not indicate whether emotion recognition capabilities are among them.58 An article from 2017 indicated that Alpha Hawkeye planned to develop its own ‘high-risk crowd database’ that would match footage collected from its cameras against (unnamed) ‘national face recognition databases’.59 In coordination with local authorities, the company has conducted pilot tests in rail and subway stations in Beijing, Hangzhou, Yiwu (Zhejiang), Urumqi (Xinjiang), and Erenhot (Inner Mongolia), at airports in Beijing and Guangzhou, and at undisclosed sites in Qingdao and Jinan, although it is ambiguous whether these applications involved only face recognition or also included emotion recognition.60

The user interface for an interrogation platform from CM Cross (深圳市科思创动科技有限公司, known as 科思创动) contains a ‘Tension Index Table’ (紧张程度指数表) that conveys the level of tension a person under observation supposedly exhibits, with outputs including ‘normal’, ‘moderate attention’, and ‘additional inspection suggested’.61 Moreover, the CM Cross interrogation platform sorts the questions posed to suspects into interview types: for example, ‘conventional interrogations’, ‘non-targeted interviews’, and ‘comprehensive cognitive tests’.62

At the 8th China (Beijing) International Police Equipment and Counter-Terrorism Technology Expo in 2019, Taigusys Computing representatives marketed their interrogation tools as obviating the need for polygraph machines, and boasted that their prison-surveillance system can prevent inmate self-harm and violence from breaking out by sending notifications about inmates expressing ‘abnormal emotions’ to on-site management staff. Images of the user interface for the ‘Mental Auxiliary Judgment System’ (精神辅助判定系统) on the company’s website show that numerical values are assigned to nebulous indicators, such as ‘physical and mental balance’ (身心平衡).63
Xinktech (南京云思创智科技公司) aims to create the ‘AlphaGo of interrogation’.85 Its ‘Lingshi’ Multimodal Emotional Interrogation System (灵视多模态情绪审讯系统), showcased at the Liupanshui 2018 criminal defence law conference in Hubei, contains ‘core algorithms that extract 68 facial feature points and can detect eight emotions (calmness, happiness, sadness, anger, surprise, fear, contempt, disgust)’.86 Aside from providing a venue for the companies to showcase their products, conferences double as a site for recruiting both state and industry partners in development and implementation.

In 2018, Hangzhou-based video surveillance firm Joyware signed a cooperative agreement to develop ‘emotional AI’ with the Canadian image recognition company NuraLogix.87 NuraLogix trains models to identify facial blood flow as a measure of emotional state and other vital signs.88 ZNV Liwei has collaborated with Nanjing Forest Police College and CM Cross to establish an ‘AI Emotion Big Data Joint Laboratory’ (AI情绪大数据联合实验室), where they jointly develop ‘psychological and emotion recognition big data systems’.89 In 2019, Xinktech held an emotion recognition technology seminar in Nanjing. Media coverage of the event spotlighted the company’s cooperative relationship with the Interrogation Science and Technology Research Center of the People’s Public Security University of China, along with Xinktech’s joint laboratory with the Institute of Criminal Justice at Zhongnan University of Economics and Law, established earlier that year.90

Xinktech’s partnerships with both of these universities and Nanjing Forest Police College account for some of its training-data acquisition and model-building process – contributions that reflect a symbiotic exchange between firms and the state.91 EmoKit (翼开科技), which professed to have 20 million users of its open APIs four years ago, partnered with the Qujing Public Security Bureau (PSB) in Yunnan Province.92 According to one source, EmoKit obtained 20 terabytes of interrogation video data from a southern Chinese police department.93 In Guizhou, a startup called Miaodong (秒懂) received a similar boost from local government in 2016.94 At first, Miaodong’s interrogation software was reputedly accurate only 50% of the time. The company then came to the attention of local officials in the Guiyang High Tech Zone and teamed up with the Liupanshui PSB. After this, the PSB shared several archived police interrogation videos with Miaodong, and the company says its accuracy rates rose to 80%.95 Similarly, Xinktech partnered with police officers to label over 2,000 hours of video footage containing 4 million samples of emotion image data. When asked why Xinktech entered the public security market, CEO Ling responded: “We discovered that the majority of unicorns in the AI field are companies who start out working on government business, mainly because the government has pain points, funding, and data.”96 Exploiting these perceived ‘pain points’ further, some companies offer technology training sessions to law enforcement.

At a conference, Xinktech CEO Ling Zhihui discussed the results of Xinktech’s product applications in Wuxi, Wuhan, and Xinjiang.97 Afterwards, Ling facilitated a visit to the Caidian District PSB in Wuhan to demonstrate a pilot programme using Xinktech’s ‘Public Security Multimodal Emotion Research and Judgment System’ (公安多模态情绪研判系统).98 Xinktech reportedly also sells its ‘Lingshi’ interrogation platform to public security and prosecutorial institutions in Beijing, Hebei, Hubei, Jiangsu, Shaanxi, Shandong, and Xinjiang.99 Concurrently with the Hubei conference, Xinktech’s senior product manager led the ‘Interrogation Professionals Training for the Province-Wide Criminal Investigation Department’ (全省刑侦部门审讯专业人才培训) at the Changzhou People’s Police Academy in Jiangsu province, an event co-sponsored by the Jiangsu Province Public Security Department.100 Finally, in late 2019, EmoKit’s CEO described a pilot test wherein police in Qujing, Yunnan, would trial the company’s interrogation technology. EmoKit planned to submit results from this test run in its application to join the list of police equipment procurement entities that supply the Ministry of Public Security.101 EmoKit also purports to work with the military, with one military-cooperation contract raking in 10 million RMB (USD1.5 million), compared with 1 million RMB (USD152,000) orders in the financial and education sectors.102
An’s Property and Casualty Insurance Technology Center revealed that, in addition to driver facial expression data, the cars are outfitted to incorporate real-time data on other cars and the roads being used.113 This real-time analysis can allegedly catch drivers ‘dozing off, smoking, playing with mobile phones, carelessly changing lanes, [and] speeding’. Ping An’s Chief Scientist, Xiao Jing, praised this AI system’s acceleration of the insurance-claim investigation process.114

Emotion Recognition Outside of Cars

To date, Chinese driving-safety applications of emotion recognition capabilities tend to focus on individuals inside of cars; yet there is also emerging interest in how the technology could be used at highway toll booths. An academic paper by four researchers at the Xi’an Highway Research Institute proposes an early-warning system that would use micro-expression recognition to detect drivers likely to commit highway fare evasion.115 The authors note that, in some parts of China, highway toll booths are already outfitted with automated licence plate readers and facial recognition-equipped cameras to track the vehicles of drivers who evade tolls. In addition to their proposal that micro-expression recognition be used to detect suspects likely to commit fare evasion, they broaden the scope to include ‘early detection’ of drivers who may pose a physical threat to tollbooth operators.116 Such a system would require the creation of a facial-expression database comprising people who have evaded fares or perpetrated violence against tollbooth operators in the past, which could be shared across multiple highway systems and updated with the facial expression data it would continually amass.117 This envisioned system would connect to existing highway-monitoring systems, and could link a driver’s facial recognition and facial expression data with the same individual’s licence information, creating what the authors describe as a ‘highway traveller credit database’ (高速公路出行者信用数据库) that could be shared with the Ministry of Public Security, as well as with transportation and security-inspection authorities, as evidence of fare evasion.118 While there has been no indication that this particular project is to be trialled or implemented thus far, there are some signs of state support for the development of emotion recognition for driving safety.

State and Tech Industry Interest

The city of Guangzhou issued a suggested research topic, ‘Research and Development on Applications of Video Surveillance Inside and Outside of Vehicles’, in its 2019 special application guide for ‘smart connected cars’. Specifically, the application guide expressed interest in supporting ‘recognition of and feedback on mental state, emotional conditions, vital signs, etc. to improve security in the driver’s seat’, and ‘achievable emotion recognition of drivers, automated adjustment of the vehicle’s interior aroma/music category/colour of ambient lighting to stabilize the driver’s mental state’.119

Tech companies that have provided emotion recognition capabilities in other use cases have shown interest in the driving-safety subsector. Xinktech, for instance, has mentioned highway management as a next step for its ‘Lingshi’ multimodal emotion analysis system.120 The company is also researching in-car emotion recognition for potential use in taxis. In making a case for studying the emotional expressions of taxi drivers and passengers before and after incidents of verbal and physical conflict erupt, Xinktech CEO Ling Zhihui cited a series of murders and rapes that drivers for ride-hailing company Didi Chuxing committed. Ling suggests these data can be used to train early-warning models that would alert Didi’s customer-service representatives to intervene and prevent passenger harm.121 Much like with technologies that purport to predict acts of terror, these ‘solutions’ could instead cause problems for drivers incorrectly flagged as at-risk of harming a passenger. Recording suspicious behaviour in the driver’s ridesharing profile, banning the driver from the platform, or escalating the situation to involve the police are all potential negative outcomes if emotion recognition is applied in this setting.
plans.130 The presumed causal link between content taught and students’ outward expressions of interest is the foundation of an argument for personalised learning that many firms (described in China’s Market for Emotion Recognition in Education) repeat. Another study applies deep-learning algorithms to identify students’ real-time facial expressions in massive open online courses (MOOCs).131 The authors believe emotion recognition benefits MOOCs in particular because teachers are not co-located with their students and need to enhance the quality of student–teacher interaction.132 Although at least one study acknowledges that equating students’ facial emotions with states of learning engagement is a highly limited approach, the main response to this shortcoming has been to iterate on unimodal emotion recognition, replacing it with multimodal alternatives that learn from past data.133

One multimodal study of Chinese MOOC participants collected facial-image and brainwave data to create a novel dataset comprised of Chinese learners (as opposed to human research subjects of other ethnicities), and to address low course-completion and participation rates in MOOCs.134 Others investigated methods for using Tencent’s near-ubiquitous messaging app, WeChat, to conduct face, expression, and gesture recognition that would be implemented in classrooms as a continuous, cost-efficient alternative to end-of-term questionnaire evaluations of teachers.135 In a similar vein, another paper suggests vocal tone-recognition technology can be used like a ‘smoke alarm’: teachers whose voices express insufficient warmth or affinity (亲和力) can receive reminders to adjust their tone.136

Academic literature within China does not touch on an important consideration in the use of emotion recognition in schools: recent research has found that current facial-emotion recognition methods demonstrate subpar performance when applied to children’s facial expressions.137 Nonetheless, as the 11 companies in Table 2 demonstrate, physical and virtual classrooms across China are test sites for emotion recognition in education. As the COVID-19 pandemic has popularised ‘blended learning’ (混合学习) – where some classroom instruction is conducted using digital technologies, while the rest retains the traditional face-to-face approach – several of these companies are prepared to absorb new demand.

China’s Market for Emotion Recognition in Education

Given how similar emotion recognition product offerings in the education field are, one way to differentiate between them is to examine how they came to incorporate emotion recognition into their core offerings. One set is companies that did not start out in the education sector but later developed their emotion recognition software and/or hardware for education use cases (Hanwang, Hikvision, Lenovo, Meezao, Taigusys Computing). Another is edtech firms with emotion recognition capabilities ostensibly built in-house (Haifeng Education, New Oriental, TAL, VIPKID). The third comprises partnerships between edtech firms and major tech companies specialising in emotion, face, voice, and gesture recognition (EF Children’s English, VTron Group). As with security applications, user interfaces from these companies illuminate which data points are used to restructure the learning experience.

As of December 2017, Hanwang Education supplied at least seven schools around China with its CCS.138 In an interview for The Disconnect, Hanwang Education’s general manager logged into the CCS user account of a teacher at Chifeng No. 4 Middle School in the Inner Mongolia Autonomous Region to demonstrate the app.139 Aside from behavioural scores, the app breaks down percentages of class time spent on each of the five behaviours it recognises, and compares individual students with the class average. For example, a student who was marked as focused 94% of the time in his English class, but was recorded as answering only one of the teacher’s questions in a week, was considered to have low class participation.140
Company (type of instruction): product name and description

EF Children’s English (英孚少儿英语), in person and online: Partners with Tencent Cloud to conduct image, emotion, and voice recognition, and receives curriculum-design assistance for EF’s product-development teams and teachers.141

Hanwang Education (汉王教育), in person: Class Care System (CCS) cameras take photos of whole classes once per second and connect to a programme that purportedly uses deep-learning algorithms to detect behaviours (including ‘listening, answering questions, writing, interacting with other students, or sleeping’) and issue behavioural scores to students every week. Scores are part of a weekly report that parents and teachers access via the CCS mobile app.142

Haifeng Education (海风教育), online: Cape of Good Hope multimodal emotion recognition platform tracks students’ eyeball movements, facial expressions, vocal tone, and dialogue to measure concentration.143

Hikvision (海康威视), in person: Smart Classroom Behaviour Management System integrates three cameras positioned at the front of the classroom, and identifies seven types of emotions (fear, happiness, disgust, sadness, surprise, anger, and neutral) and six behaviours (reading, writing, listening, standing, raising hands, and laying one’s head on a desk).144 Cameras take attendance using face recognition and scan students’ faces every 30 seconds.145

Lenovo (联想), in person: ‘Smart education solutions’ include speech, gesture, and facial emotion recognition.146

Meezao (蜜枣网), in person: Uses facial expression recognition and eye-tracking software to scan preschoolers’ faces over 1,000 times per day and generate reports, which are shared with teachers and parents.147 Reports contain data visualisations of students’ concentration levels at different points in class.148

New Oriental (新东方), blended learning: AI Dual Teacher Classrooms contain a ‘smart eye system based on emotion recognition and students’ attention levels’, which the company says can also detect emotional states, including ‘happy, sad, surprised, normal, and angry’.149 A subsidiary, BlingABC, offers online teaching tools such as the AI Foreign Teacher, which contains face- and voice-recognition functions. BlingABC counts how many words students speak, along with data about students’ focus levels and emotions, and claims reports containing this combination of data can help students, parents, and teachers zero in on exactly which parts of a lesson a student did not fully understand.150

Taigusys Computing (太古计算), in person: Collects data from three cameras, one each on students’ faces, teachers, and a classroom’s blackboard. The system detects seven emotions (neutral, happy, surprised, disgusted, sad, angry, scared) and seven actions (reading, writing, listening, raising hands, standing up, lying on the desk, playing with mobile phones).151
Taigusys Computing (太古计算), which has produced emotion recognition platforms for prison surveillance and police interrogations (see the Public Security section), has a teacher user interface that displays ‘problem student warnings’ with corresponding emotions, such as sadness and fear. Other data visualisations combine data on expression and behaviour recognition alongside academic performance to typologise students. For instance, the ‘falsely earnest type’ is assigned to a student who ‘attentively listens to lectures [but has] bad grades’, while a ‘top student’ might be one with ‘unfocused listening, strong self-study abilities, but good grades’.156

Although most of these systems are developed solely within companies, a few draw from academic partnerships and funding of smaller startups. Some of the support for emotion, gesture, and face recognition in products from one of China’s biggest edtech suppliers, TAL, comes from its Tsinghua University–TAL Intelligent Education Information Technology Research Center, and from technology TAL has acquired through FaceThink (德麟科技), an expression recognition startup it has funded.157

When it comes to selling and implementing products, several of the companies examined here have been known to play to two narratives that extend beyond education: parents’ fears about students’ safety, and ‘digital divide’ concerns that less-developed regions of China will technologically lag behind wealthier coastal provinces.

Tech companies use slightly different arguments for emotion recognition depending on students’ age group and whether the technology is to be used for online teaching or in-person classrooms. Companies that have produced online teaching platforms for nursery school-aged children, for example, market their products as critical not only to assessing young children’s budding abilities to concentrate and learn but also to protecting these students’ safety. Meezao (蜜枣网), which won awards from Microsoft Research Asia for its applications of emotion recognition technology in retail and banking before turning to the education field, provides one example.158

Meezao’s founder, Zhao Xiaomeng, cited the November 2017 RYB incident, in which Beijing kindergarten teachers were reported to have abused students with pills and needles, as having made him ‘recognise [that] emotional intelligence [technology’s] reflection of children’s emotional changes can help administrators more accurately and quickly understand hidden dangers to children’s safety, such that they can prevent malicious incidents from happening again’.159 Zhao described a trial of Meezao’s technology in a preschool classroom, where the software identified fear on many of the children’s faces when a male stranger with an athletic build entered the classroom.160
Similarly, according to VTron Group (威创集团) CEO Guo Dan, the company’s collaborations with teams from both Baidu and Megvii enable the use of:

“AI cameras, recognition algorithms, and big data analysis, to accurately obtain information on individuals’ identities and teachers’ and students’ emotions, and to provide complete solutions for [ensuring] kindergarteners can be picked up [from school] safely, for teachers’ emotional guidance, and for early warning mechanisms [that detect] child safety crises”.161

The faulty assumptions that these security arguments are based on remain unchallenged in the Chinese literature. As with narratives about ameliorating education across rural and lower-resourced regions of China, the companies make promises the technology alone cannot deliver on – and, indeed, are not held accountable for upholding.

Hardware giant Lenovo has extended its emotion recognition capabilities (originally used in customer-service settings) to Chinese classrooms.162 Lenovo has sold edtech to elementary and high schools in Sichuan, Tibet, Shandong, and Yunnan (among at least a dozen provinces the company contracts with), touting sales to these provinces as a means of closing the digital divide. However, because Lenovo’s emotion recognition feature is modular, it is difficult to pinpoint exactly how many of these schools use it.163 New Oriental (新东方), which has brought its AI Dual Teacher Classroom (AI双师课堂) to over 600 classrooms in 30 cities across China, strategically spotlights its efforts in cities like Ya’an in Sichuan province.164 Despite these sizeable user bases, in-depth testimonials of how these technologies are viewed within schools are scarce. One exception comes from the country’s best-documented – and perhaps most contentious – implementation of emotion recognition, at a high school in the coastal tech hub of Hangzhou.

Public criticism directed at various applications of emotion recognition in Chinese schools does not appear to impede the likelihood that more domestic companies will apply voice and facial expression-based emotion recognition in the education sector. Factors that contribute to this potential proliferation include the breadth of market opportunities, both within and beyond schools; perceptions of technological prestige, attributed to specific institutions and the country as a whole, for leading the adoption of these tools; and local governments’ policy support for, and subsidies of, these technologies’ installation and upkeep.

Emotion Recognition in Online and In-Person Classrooms

In May 2018, Hangzhou No. 11 Middle School held a ‘smart campus’ seminar where it unveiled a Smart Classroom Behaviour Management System (智慧课堂行为管理系统), which the world’s biggest surveillance-camera producer, Hikvision, produced in conjunction with the school.165 Computer monitors near teachers’ desks or lecture stands displayed the system’s assignment of the values A, B, and C to students, based on emotional and behavioural indicators, and included a column representing ‘school-wide expression data’. According to the school’s administration, the only behaviour registered as ‘negative’ was resting one’s head on a desk; if a student did this often enough to surpass a preset threshold, they were assigned a C value. Twenty minutes into each class, teachers’ display screens issued notifications about which students were inattentive; these notifications disappeared after three minutes.166 Outside of classrooms, monitors showing how many students were making ‘sour faces’ (苦瓜脸) and neutral faces were mounted in hallways.167 Some media accounts in English and Mandarin suggest the technology has since been scaled back, while others indicate it has been removed altogether. Yet, in its brief trial period, the Smart Classroom Behaviour Management System revealed how perceptions of emotion recognition changed social dynamics in schools.
Students’ Experiences of Emotion Recognition Technologies

Despite avowals from companies such as Meezao and Hikvision that their emotion recognition applications were designed in conjunction with the schools that use them, students appear to have been left out of these consultations. As a Hanwang Education technician put it: “We suggest the schools ask for the students’ consent before using CCS […] If they don’t, there’s nothing we can do.”168 Of the few students interviewed about their experiences of emotion recognition technologies in Chinese classrooms, none supported their schools’ use of these systems.

At Hangzhou No. 11, which claims to have tested the Hikvision Smart Classroom Behaviour Management System on only two tenth-grade classes, some students were afraid when their teachers demonstrated the technology.169 While one student’s fear was grounded in her understanding of how powerful Hikvision’s high-resolution surveillance cameras are, others worried about being academically penalised if any of their movements were recorded as unfocused.170 “Ever since the ‘smart eyes’ have been installed in the classroom, I haven’t dared to be absent-minded in class,” reflected one student at Hangzhou No. 11.171 This narrative can fuel belief in the power of a technology that potentially exceeds what it is being used for; one student at Niulanshan First Secondary School in Beijing was anxious that data about the moments when he is inattentive in class could be shared with universities he wants to attend.

Examples of behaviour changes in students bear out a concern that Chinese academics have regarding emotion recognition: namely, that students will develop ‘performative personalities’ (表演型人格), feigning interest in class if this becomes another metric on which their academic abilities are judged.172 Some students found staring straight ahead was the key to convincing the system they were focused.173 Experts who agree that the cameras elicit this performative instinct, however, are not in agreement about how to respond. Shanghai Jiaotong University professor Xiong Bingqi castigates the cameras as a “very bad influence on students’ development […] [that] take[s] advantage of students” and should be removed.174 He Shanyun, an associate professor of education at Zhejiang University, believes the ‘role-playing’ effect could be mitigated if classroom applications of emotion recognition are not tied to rewards and disciplinary measures against students, and are only used for follow-up analysis of students’ learning progress. Shanghai University of Finance and Economics law professor Hu Ling emphasised that schools needed to do the work to convince parents and students that the technology was not being used to assess academic performance.175 Yet placing the onus for seeking consent on the school alone absolves the companies of responsibility.

Niulanshan First Secondary School teamed up with Hanwang to use the company’s CCS cameras to conduct facial sampling of students every few months to account for changes in their physical appearance.176 This continual sampling – coupled with accounts from students at Hangzhou No. 11, who found their school’s face-recognition-enabled Hikvision cameras often failed when they changed hairstyles or wore glasses – suggests a converse scenario in which error-prone cameras both undermine the argument that these new technologies are foolproof and even leave students apathetic about the new measures.177 At Hangzhou No. 11, some students noticed attentive classmates were sometimes mislabelled ‘C’ for unfocused behaviour.178 Perception of this error led these students to discredit the system, with one commenting: “it’s very inaccurate, so the equipment is still continually being debugged,” and another admitting: “We don’t look at this thing too often.”179

Perceptions of inaccuracies do not always end with ignoring the technology, however. Some Chinese academics see the misjudgements of emotion
At the same time, promises about advancing personalised learning and improving teacher quality fail to elaborate on what kinds of recommendations teachers are given to achieve these vague outcomes. For example, there is no clear differentiation regarding whether data on which students were ‘attentive’ during certain parts of a lecture reflects interest in the material, approval of a teacher’s pedagogical methods, or another reason altogether – let alone guidance on how the data can be converted into personalised solutions tailored to each student.

In the aforementioned expert panel on the use of the Smart Classroom Behaviour Management System at Hangzhou No. 11, the difficulty of balancing competing interests and drawbacks to teachers and students was evident. Ni Mijing, a member of the National Committee of the Chinese People’s Political Consultative Conference and deputy director of the Shanghai Municipal Education Commission, acknowledged the value of data on students’ reactions to evaluations of teachers, and advocated openness to edtech trials as a way for schools to learn from their mistakes.189 However, he also warned:

“We should oppose using technology to judge the pros and cons of children’s study methods, [and we] should even oppose using this set of technologies to judge teachers’ teaching quality, otherwise this will produce data distortion and terrible problems for education […] Excessive top-level design and investment are very likely to become a digital wasteland, a phenomenon I call a digital curse.”190

As with students who expressed doubts about the accuracy of Hikvision’s Smart Classroom Behaviour Management System and other Hikvision face recognition cameras at their school, teachers have implied the technology has not changed much about how they do their work. For example, one teacher at Hangzhou No. 11 said: “Sometimes during class I’ll glimpse at it, but I still haven’t criticised students because their names appear on it.”191 Chinese news reporting has paid little attention to teachers’ impressions of emotion recognition, focusing more on students, as well as on the biggest champions of these technologies: school administrators.

School Administrators’ Perceptions of Emotion Recognition Technologies

School administrators have tended to take defensive stances on the value of emotion recognition technology to their schools. The defensiveness is, in part, a response to spirited public discussion about privacy concerns surrounding the use of emotion recognition in schools in China. For instance, Megvii’s implementation of an attendance-, emotion-, and behaviour-recognition camera, the MegEye-C3V-920, at China Pharmaceutical University in Nanjing met with criticism on Chinese social media.192 While social media commentary focuses on a suite of privacy and rights violations, news media accounts instead tend to focus on the risks of data leakage and third-party misuse.193

Hangzhou No. 11 Principal Ni Ziyuan responded to criticisms that the Hikvision-built Smart Classroom Behaviour Management System violates student privacy with the rebuttal that it does not store video recordings of classroom activity, but instead merely records the behavioural information extracted from video footage.194 Vice Principal Zhang Guanchao has also tried to assuage privacy concerns by pointing out that students’ data are only stored on local servers (rather than in the cloud), supposedly preventing data leaks; that the school’s leadership and middle management have differentiated permissions for who can access certain student data; and that the system only analyses the behaviour of groups, not individuals.195 Hanwang Education’s CEO maintains the company’s CCS does not share reports with third parties and that, when a parent is sent still images from classroom camera footage, all students’ faces except their child’s are blurred out.196 In general, the defences that administrators have raised ignore the concerns that students and education experts have voiced about these technologies.
Other administration accounts of the Hikvision system at Hangzhou No. 11 stand in questionable contrast with students’ and teachers’ comments. In an interview for ThePaper, Vice Principal Zhang boasted the system had achieved positive results: students were adapting their behaviour to it, and teachers used data about students’ expressions and behaviour to change their approach to teaching.197 While Hangzhou No. 11’s principal and vice principal said facial expression and behavioural data would not affect evaluations of students’ academic performance, in an interview with the Beijing News, Zhang said: “Right now [I] can’t discuss [whether this system will extend to] evaluations.”198

Despite administrators’ ardent defences of the Smart Classroom Behaviour Management System, one account suggests its use was halted the same month it was launched.199 In spring 2019, Vice Principal Zhang announced the school had modified its system to stop assessing students’ facial expressions, although the cameras would still detect students resting heads on desks and continue to issue behavioural scores.200 The contradictory statements the administration issued, along with this retraction of facial expression detection, may point to a mismatch between expectations and reality when it comes to applying emotion recognition in schools.

Positive media coverage of schools’ embrace of new technologies prevails over accounts of the ultimate underuse and distrust of emotion recognition technologies in the education sector. Moreover, school administrators continue to benefit from touting these technological acquisitions when publicising themselves to local government authorities as progressive and worthy of more funding. On a national level, being the first country to publicly trial these technologies is a source of pride. For instance, one account of TAL’s Magic Mirror posited that ‘emotion recognition technology is also China’s “representative product of independent intellectual property rights”’ – a description that reappears on Hangzhou No. 11’s official WeChat account in a write-up of the Hikvision Smart Classroom Behaviour Management System.201

At a local level, policymakers’ guidance is more directed. The May 2018 event at which the Hangzhou No. 11–Hikvision collaboration was first launched was organised by the Hangzhou Educational Technology Center – itself supervised by the Hangzhou Education Bureau. The Hangzhou Educational Technology Center is in charge of both edtech procurement and technical training for primary and secondary schools in the city.202 While Hangzhou is among China’s wealthier cities, with resources at its disposal to conduct edtech experiments, the user bases of the aforementioned tech companies are likely to grow, leading more of them to come up against the same issues Hangzhou No. 11 did. Not all municipal and provincial governments neglect public responses to these technological interventions; the Shenzhen Municipal Education Bureau decided against implementing online video surveillance of kindergarten classrooms to protect student privacy.203 Examples like this are the exception, however, and do not preclude other cities and provinces from experimenting with emotion recognition.

A central tension that schools will continue to face concerns whether emotion recognition will be used to measure academic performance, student behaviour, or both. ‘Function creep’ – technologies’ expansion into collecting data and/or executing functions they were not originally approved to collect or execute – is another possibility. For example, in acknowledging that Hangzhou No. 11’s Smart Classroom Behaviour Management System may label students who rest their heads on their desks due to illness as ‘inattentive’, Vice Principal Zhang suggested the school nurse’s office could establish ‘white lists’ of ill students to prevent them from being unfairly marked as unfocused in class.204 Similarly, Hangzhou No. 11 implemented facial recognition as a form of mobile payment authentication in its cafeteria in 2017. Not long after, the school used face recognition to monitor library loans and compile annual nutrition reports for each student, which shared information about students’ cafeteria food consumption with their parents.205
Parents’ Perceptions of Emotion Recognition Technologies

Although parents can – in theory – advocate for their children’s interests in schools, the extent to which they have done so regarding schools’ use of emotion recognition is unclear. One article, reporting on a Chinese Communist Party-sponsored technology expo that featured TAL’s Magic Mirror, quoted an impressed parent who felt the use of this technology made their child’s education better than that of their parents’ generation.206 Yet, a blog post declared that parents disliked this monitoring of their children, and that some companies subsequently removed phrases like ‘emotion recognition’, ‘facial recognition’, and ‘magic mirror’ from their marketing.207

Regardless of parents’ views on the issue, Professor Hu Ling of Shanghai University of Finance and Economics noted that “schools hold the power to evaluate, punish, and expel”, and so “parents won’t sacrifice the students’ futures by standing up against the schools”, which leaves the students [exposed]. He also commented on Chinese parents’ vigilance over their children’s academic performance and behaviour in class as a product of the national education system’s focus on testing as a central determinant of future opportunities.209

Professor Hu hit upon the question that schools will continue to revisit regarding not only emotion recognition, but also all future technological interventions that purportedly make education more efficient, effective, quantifiable, and manageable:

“The most fundamental question is, what do we expect education to become? If it is guided by efficient test-taking, it will naturally cut all classroom behaviour into fragments, layers, and scores, [and] an algorithm will evaluate if you are a child who loves to learn or if you are a child who doesn’t love to learn.”210
3. Emotion Recognition and Human Rights
International human rights are guaranteed by the Universal Declaration of Human Rights and given binding legal force through the International Covenant on Civil and Political Rights (ICCPR) and in regional treaties. States are under binding legal obligations to promote, respect, protect, and guarantee human rights in these treaties. They are also under the obligation to provide guidance to businesses on how to respect human rights throughout their operations.211

Private companies also have a responsibility to respect human rights; the Guiding Principles on Business and Human Rights provide a starting point for articulating the role of the private sector in protecting human rights in relation to digital technologies. Even though these principles are not binding, the UN Special Rapporteur on Freedom of Expression and Opinion has stated that ‘the companies’ overwhelming role in public life globally argues strongly for their adoption and implementation’.212

While we have discussed the dubious scientific foundations that underpin emotion recognition technologies, it is crucial to note that emotion recognition technologies serve as a basis to restrict access to services and opportunities, as well as to disproportionately impact vulnerable individuals in society. They are therefore fundamentally inconsistent with the international human rights standards described in this chapter.

Human dignity underpins and pervades these human rights instruments.213 As stated in the Preamble to the ICCPR, ‘[human] rights derive from the inherent dignity of the human person’, which is underscored by the fact that it [dignity] is not a concept confined to preambulatory clauses alone but is also used in the context of substantive rights.214 Emotion recognition strikes at the heart of this concept by contemplating analysing and classifying human beings into arbitrary categories that touch on the most personal aspects of their being. Overarchingly, the very use of emotion recognition imperils human dignity and, in turn, human rights – particularly given the discriminatory and discredited scientific foundations on which this technology is built.

Right to Privacy

Emotion recognition technologies require collecting sensitive personal information for both training and application. Individuals being identified, analysed, and classified may have no knowledge that they are being subjected to these processes, making the risks that emotion recognition poses to individual rights and freedoms grave. While these technologies in isolation do not necessarily identify individuals, they can be used to corroborate identities when used among other technologies that carry out identification. This significantly impedes the ability to remain anonymous, a key concept in the protection of the right to privacy as well as freedom of expression.215

A common thread across all use cases discussed in this report is the concentration of state and industry power; to use emotion recognition technologies, companies and state actors have to engage in constant, intrusive, and arbitrary qualitative judgements to assess individuals. It is important, therefore, to consider surveillance as an inevitable outcome of all emotion recognition applications. For example, all the use cases for early warning, closer monitoring, and interrogation related to public security are deployed on the grounds that they are necessary to prevent crime and ensure safety. In practice, however, they are deployed indiscriminately for fishing expeditions that are unrelated to the needs of a particular operation. Mass surveillance thus increasingly becomes an end in and of itself. Further, the stated purpose of driving-safety applications is to ensure driver and passenger safety, but the outcome includes systematic invasions of privacy and significant mission creep, in the case of biometric information potentially being used for insurance purposes. A basic tenet of international human rights law is that rights may not be violated in ways that confer unfettered discretion on entities in power, which is a feature of – not a bug in – these technologies.
Any interference with the right to privacy must be provided by law, in pursuit of a legitimate aim, and necessary and proportionate.216 Privacy concerns over biometric mass surveillance have received dedicated attention in the last few years. In a 2018 report on the right to privacy in the digital age, the UN High Commissioner for Human Rights, while discussing significant human rights concerns raised by biometric technologies, stated:

“Such data is particularly sensitive, as it is by definition inseparably linked to a particular person and that person’s life, and has the potential to be gravely abused […] Moreover, biometric data may be used for different purposes from those for which it was collected, including the unlawful tracking and monitoring of individuals. Given those risks, particular attention should be paid to questions of necessity and proportionality in the collection of biometric data. Against that background, it is worrisome that some States are embarking on vast biometric data-based projects without having adequate legal and procedural safeguards in place.”217

As noted by the UN Special Rapporteur on Privacy, ‘evidence has not yet been made available that would persuade the [Special Rapporteur] of the proportionality or necessity of laws regulating surveillance which permit bulk acquisition of all kinds of data including metadata as well as content’.218

Importantly, the nature of these technologies is also at odds with the notion of preserving human dignity, and constitutes a wholly unnecessary method of achieving the purported aims of national security, public order, and so on (as the case may be). While international human rights standards carve out national security and public order as legitimate justifications for the restriction of human rights, including privacy, these situations do not give states free rein to arbitrarily procure and use technologies that have an impact on human rights; nor do they permit states to violate rights without providing narrowly tailored justifications and valid, specific reasons for doing so.219

Right to Freedom of Expression

Freedom of expression and privacy are mutually reinforcing rights. Privacy is a prerequisite to the meaningful exercise of freedom of expression, particularly given its role in preventing state and corporate surveillance that stifles free expression. While freedom of expression is fundamental to diverse cultural expression, creativity, and innovation, as well as the development of one’s personality through self-expression, the right to privacy is essential to ensuring individuals’ autonomy, facilitating the development of their sense of self, and enabling them to forge relationships with others.220

Claims that emotion recognition technology can infer people’s ‘true’ inner states, and the making of decisions based on these inferences, have two significant implications for freedom of expression. First, it gives way to significant chilling effects on the right to freedom of expression – the notion of being not only seen and identified, but also judged and classified, functions as an intimidation mechanism to make individuals conform to ‘good’ forms of self-expression lest they be classified as ‘suspicious’, ‘risky’, ‘sleepy’, or ‘inattentive’ (depending on the use case). Second, given the wide range of current applications, it normalises mass surveillance as part of an individual’s daily life, in public and private spaces. Proposed uses, such as the research paper suggesting deployment of emotion recognition technology to identify people entering Tibet who hold pro-Tibetan-independence views, create a dangerously low threshold for authorities to misidentify self-incriminating behaviour in a region that is already over-surveilled. Importantly, freedom of expression includes the right not to speak or express oneself.221

The right to information is an important part of freedom of expression. It includes transparency regarding how state institutions operate, and making public affairs open to public scrutiny so as to enable citizens to understand the actions of their governments. The UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression emphasised that:
(e.g. people of colour, transgender, and non-binary individuals) will be disproportionately surveilled, tracked, and judged.

The UN Human Rights Council has emphasised that ‘automatic processing of personal data for individual profiling may lead to discrimination or decisions that otherwise have the potential to affect the enjoyment of human rights, including economic, social and cultural rights’.228 Although profiling can lead to discriminatory outcomes in disproportionate ways regardless of the specific technology in question, this risk is even more pronounced in the case of emotion recognition, as the criteria for classification are primed for discrimination. Consequential decisions in the contexts of hiring, national security, driving safety, education, and criminal investigations are often built on the foundations of such profiling.

Other Technical and Policy Considerations

A number of additional strategic and substantive threads of analysis in the Chinese context are worth noting. We outline these thematically below to aid in effective civil society advocacy going forward.

Function Creep

The intended goal for the use of emotion recognition systems has varied between use cases, but indications of function creep beyond the use cases discussed in this report already exist. Ping An Group’s demonstration, from late 2019, indicates the firm’s intention to move past using emotion recognition to monitor safety and avert accidents, and towards feeding it into insurance assessments.229 Meezao has already pivoted from only providing emotion recognition in schools to also offering these technologies at the China Museum of Science and Technology to collect data on children’s responses to physics and chemistry experiments.230 This function creep has happened before: in 2017, Hangzhou No. 11 introduced facial recognition authentication for cafeteria payments, and subsequently expanded its use to monitoring library loans and nutrition reports for each student, outlining food consumption information for parents.231

This function creep also stems from a more general ‘tech-solutionist’ tendency to use new technologies to solve administrative and social problems.

Growing Chorus of Technical Concerns

There is growing recognition of the limitations of emotion recognition technologies from the developers, implementers, and individuals subject to them. Experts who advocate using emotion recognition for security, in particular, acknowledge some drawbacks to this technology. However, most of their critiques address the technical concerns of surveillers at the expense of the real-life impacts on those being surveilled. For example, Wenzhou customs officials published a research paper on automated identification of micro-expressions in customs inspections, which admits that camera-footage quality, lighting, and the added anxiety and fatigue of travel can affect how micro-expressions are produced, recorded, and interpreted.232

False positives are another commonly recognised issue; however, the Chinese research and security literature often attributes these to the person under surveillance deliberately feigning emotions, rather than to the system’s own flaws. The most well-known of these is the ‘Othello error’, in which someone telling the truth unintentionally produces micro-expressions associated with liars. This is a particularly important finding from a human rights perspective, as the overarching issues surrounding dignity, privacy, and freedom of expression seem to be precluded from public deliberation and critique of emotion recognition technologies.
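The practical weight of the false-positive problem is easy to understate. The figures below are invented for illustration – they are not any vendor’s measured accuracy – but they show why a screening system that sounds accurate still buries its operators in false alarms when the behaviour being screened for is rare:

```python
# Illustrative only: invented numbers, not any vendor's measured accuracy.
# Shows why mass screening for a rare trait yields mostly false alarms.

def flagged_counts(population, base_rate, sensitivity, false_positive_rate):
    """Return (true_positives, false_positives) for one screening pass."""
    targets = population * base_rate
    non_targets = population - targets
    true_positives = targets * sensitivity
    false_positives = non_targets * false_positive_rate
    return true_positives, false_positives

# 1,000,000 travellers, 0.1% exhibit the trait being screened for;
# assume 90% sensitivity and a 5% false-positive rate.
tp, fp = flagged_counts(1_000_000, 0.001, 0.90, 0.05)
precision = tp / (tp + fp)
print(f"true positives: {tp:.0f}, false alarms: {fp:.0f}")
print(f"share of flagged people actually of interest: {precision:.1%}")
```

With these assumed numbers, roughly 900 genuine detections are swamped by some 50,000 false alarms, so fewer than 2% of the people flagged would actually be of interest – the ‘fishing expedition’ dynamic described above, expressed numerically.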
Misaligned Stakeholder Incentives

Cooperation between academic research institutions, tech companies, and local state actors reveals the perceived benefits to each group of participating in the diffusion of these technologies, which is at odds with the human rights concerns arising from them. As one study of facial recognition firms in China found, companies that received training data from the government were more likely to spin off additional government and commercial software.233 As such – and aside from procurement contracts to furnish technology for Sharp Eyes, Fengqiao, and related pre-existing government surveillance projects – emotion recognition firms may see longer-term financial opportunities and profits from these multi-institutional collaborations.

Regional and Global Impact

Throughout the literature on emotion recognition technology in China, few companies have expressed the intention of exporting their products at this phase of their development. Media coverage of EmoKit – the company that partnered with the city of Qujing, Yunnan, to pilot test its emotion recognition interrogation platform – suggested Yunnan’s geographical proximity to South and Southeast Asia could be advantageous for exports to countries that comprise the OBOR and Maritime Silk Road regions.234 While OBOR represents a terrestrial route connecting China to Europe via Central Asia, the Maritime Silk Road is the Indian Ocean-traversing counterpart that connects ports in China, South and Southeast Asia, the Middle East, and Eastern Africa. Alpha Hawkeye has allegedly supplied OBOR countries with its technology for counterterrorism and garnered interest from Southeast Asian security departments in the Philippines, Malaysia, Thailand, Myanmar, and Indonesia.235 Publicly available data have not provided additional evidence of this, however, and the company’s own media presence has dwindled in the last two years.

Yet, if the ‘biometrics 3.0’ framing of emotion recognition as a next step from face recognition persists – and if these firms demonstrate that emotion recognition capabilities are easily applied where face recognition cameras are already in use – the other places to watch for potential export are markets where Chinese tech companies have already sold face recognition cameras. For instance, Hikvision has provided surveillance equipment to schools in Canada, Denmark, Dubai, India, Japan, Malaysia, Pakistan, and South Africa,236 while Huawei has provided numerous cities around the globe – including in Asia, Africa, and Latin America – with policing platforms.237 In 2017, Huawei issued a call for proposals that included ‘dialog emotion detection based on context information’, ‘emotional state analysis based on speech audio signal’, and multimodal emotion recognition.238

Ethnicity and Emotion

Racial, gender-based, and intersectional forms of discrimination in biometric technologies like face recognition have been demonstrated in a wide range of academic and civil society research in the last few years. The UN Special Rapporteur on Contemporary Forms of Racism calls for ‘racial equality and non-discrimination principles to bear on the structural and institutional impacts of emerging digital technologies’.239 Criticisms of facial recognition technologies’ inaccuracies across skin tone and gender map onto debates around emotion recognition, along with an additional variable: cultural differences in expressions of emotion.

With some exceptions, Chinese companies tend to tout the narrative that facial emotion expressions are universal,240 but years of scientific evidence demonstrate cultural differences in facial expressions and the emotions they are interpreted to signify. This marketing strategy is unsurprising, however, given its ability to boost faith in the technology’s alleged objectivity and to unearth ‘true’ emotions, while also paving a future path to its international export. Wang Liying, a technical director at Alpha Hawkeye, proclaimed that ‘the entire recognition process is uninfluenced by expression, race, age, and shielding of the face’.241

Research suggests otherwise. In her paper ‘Racial Influence on Automated Perceptions of Emotions,’ Professor Lauren Rhue compiled a dataset of headshots of white and Black male National
Basketball Association (NBA) players to compare the emotional analysis components of Chinese face recognition company Megvii’s Face++ software to Microsoft’s Face API (application programming interface). In particular, she found ‘Face++ rates black faces as twice as angry as white faces,’ while Face API views Black faces as three times as angry as white ones.242

China is not the only country whose tech firms factor race into facial recognition and related technologies. However, its tech sector’s growing influence over international technical standards-setting for these technologies presents an opportunity to address the domestically long-ignored consequences of technological racial and ethnic profiling. Instead of this open reckoning, admission of racial inequities in training datasets tends to become a justification for the creation of datasets of ‘Chinese faces’ to reduce inaccuracies in domestic applications of emotion recognition.243 Arguments like this account for the potential bias of datasets that may over-represent a tacitly implied Han Chinese range of facial features and expressions, while failing to address if and how new datasets created within China will draw samples from China’s 56 officially recognised ethnic groups.

Some companies’ open-source APIs include race variables that raise a host of concerns about human rights implications, particularly for ethnic minorities – even before considering sources of training data, accuracy rates, and model interpretability. Baidu’s face-detection API documentation includes parameters for emotion detection as well as race, with a sample API call return including ‘yellow’ as a type of race. Taigusys Computing’s open-source expression-recognition API includes ‘yellow’, ‘white’, ‘black’, and ‘Arabs’ (黄种人,白种人,黑种人,阿拉伯人) as its four racial categories. Neither company accounts for why race would be assessed alongside emotion in their APIs. This is untenable for two reasons. First, the fundamental issues surrounding the discredited scientific foundations and racist legacy of emotion recognition make the existence of such systems (and categories) deeply problematic. Second, the solution to the discriminatory effects of these systems is not to add more nuanced alternatives for categorising race, but rather to ban the use of such technologies altogether.244

Companies’ Claims About Mental Health and Neurological Conditions

Proposed uses of emotion recognition to help people with neurological conditions, disabilities, and mental health afflictions are not new to the field. Affectiva has stated it began its work by developing a ‘Google Glass-like device that helped individuals on the autism spectrum read the social and emotional cues of other people they are interacting with’.245 While this report excludes an in-depth analysis of similar use cases, which are often carried out in medical institutions, it must take into account a critical omission in the emerging literature on commercial applications of emotion recognition in China: thus far, companies have ignored questions of how these technologies will work for neurodiverse individuals. Companies engaged in non-medical applications make particularly troubling claims about their ability to detect mental health disorders and neurological conditions (both diseases and disorders) – highly discrete categories that this literature often groups together, as though they were indistinguishable.

Chinese companies like Taigusys Computing and EmoKit have mentioned autism, schizophrenia, and depression as conditions they can diagnose and monitor using micro-expression recognition.246 Meezao CEO Zhao said the company is testing its emotion recognition technology on children with disabilities; for instance, to detect types of smiling that could serve as early indicators of epilepsy.247 One concern is that these systems will impose norms about neurotypical behaviour on people who do not display it in a way the technology is designed to detect.248 Another possible issue involves potential discrimination against people the technology perceives as exhibiting such conditions.
Although not all police officers view this approach as a way to get around the law, an equally problematic possibility is that some will believe that using emotion recognition in interrogation is more scientific and rights-protective. As far back as 2014, an academic paper from the People’s Public Security University of China, detailing how law enforcement officials could be trained to visually observe micro-expressions, made a similar argument:

“Under the new Criminal Procedure Law’s principle that ‘no individuals can be forced to prove their guilt’, it has become more difficult to obtain confessions. Criminal suspects often respond to questioning with silence and unresponsiveness. In actuality, it is common for investigations to turn up no clues. Micro-expression analysis techniques can now solve this issue.”254

Specifically, investigators would be trained to introduce ‘stimuli’ – such as the names of people and objects related to a crime – while watching for micro-expressions that correspond to these words. They would then treat terms that elicit these minute responses as ‘clues’ in a case. The paper presaged the ability to return to archival interrogation video footage to search for moments when incriminating micro-expressions appeared. When AI is brought into the procedure, even more of these moments can presumably be identified. An article about Shenzhen Anshibao confirmed the technology could be used for post-mortem emotion recognition, citing video footage of the Boston marathon bombing as an example.255

The role of security blacklists and criminal backgrounds is also critical to the justifications that companies, researchers, and the state present for emotion recognition. Advocates of emotion recognition for public security note that, while face recognition enables cross-checking against police blacklist databases, those databases fail to account for people who lack criminal records. One paper, from the Public Security College of Gansu University of Political Science and Law, laments that current facial recognition systems in China lack data on residents of Hong Kong, Taiwan, Macao, and other foreign nationals. Micro-expression recognition, the authors argue, would widen the net of ‘dangerous’ people who can be flagged in early-warning systems.256 This suggestion takes on added portent in light of China’s recent crackdowns on Hong Kong protests and the enactment of China’s new national security law there.
4. China’s Legal Framework and Human Rights

China’s legal landscape around data protection and AI is multi-layered and constantly evolving. Two of the main contributions of this report are:

1. Unpacking one national context – including incentives, actors, and narratives – within which these systems are meant to function; and

The legal preparations to ratify the ICCPR have been in motion for at least a decade, with little tangible progress.259 It is not clear what incremental advances towards this goal are implied in the 2016–2020 National Human Rights Action Plan.

National Law

Chinese Constitution
Ethical Frameworks
One of the most prominent AI ethics statements to
come out of China is from the Artificial Intelligence
Industry Alliance, which, in 2019, published a self-
discipline ‘joint pledge’ underscoring the need to:
5. Recommendations
This report has covered vast terrain: from the legacy and efficacy of emotion recognition systems to an analysis of the Chinese market for these technologies. We direct our recommendations as follows.

To the Chinese Government:

1. Ban the development, sale, transfer, and use of emotion recognition technologies. These technologies are based on discriminatory methods that researchers within the fields of affective computing and psychology contest.

2. Ensure that individuals already impacted by emotion recognition technologies have access to effective remedies for violations of their rights through judicial, administrative, legislative or other appropriate means. This should include measures to reduce legal, practical and other relevant barriers that could lead to a denial of access to remedies.

To the International Community:

1. Ban the conception, design, development, deployment, sale, import and export of emotion recognition technologies, in recognition of their fundamental inconsistency with international human rights standards.

To the Private Companies Investigated in this Report:

1. Halt the design, development, and deployment of emotion recognition technologies, as they hold massive potential to negatively affect people’s lives and livelihoods, and are fundamentally and intrinsically incompatible with international human rights standards.

2. Provide disclosure to individuals impacted by these technologies and ensure that effective, accessible and equitable grievance mechanisms are available to them for violations of their rights as a result of being targeted by emotion recognition.

To Civil Society and Academia:

1. Advocate for a ban on the design, development, testing, sale, use, import, and export of emotion recognition technology.

2. Support further research in this field, and urgently work to build resistance by emphasising human rights violations linked to uses of emotion recognition.
Endnotes
1 J. McDowell, ‘Something You Are: Biometrics vs. Privacy’, GIAC Security Essentials Project, version 1.4b, 2002, https://www.giac.org/paper/gsec/2197/are-biometrics-privacy/103735. Also see: Privacy International, ‘Biometrics: Friend or Foe?’, 2017, https://privacyinternational.org/sites/default/files/2017-11/Biometrics_Friend_or_foe.pdf

Introduction

2 K. Hill, ‘The Secretive Company that Might End Privacy as We Know It’, The New York Times, 19 January 2020, https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html

3 K. Hill, ‘The Secretive Company that Might End Privacy as We Know It’, The New York Times, 19 January 2020, https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html. Also see: Amba Kak (ed.), ‘Regulating Biometrics: Global Approaches and Urgent Questions’, AI Now Institute, 1 September 2020, https://ainowinstitute.org/regulatingbiometrics.pdf

4 For instance, controversy around facial recognition in 2020, which culminated in Big Tech backing away from the development and sale of these technologies to varying degrees, did little to scale back facial recognition’s public footprint. See e.g.: N. Jansen Reventlow, ‘How Amazon’s Moratorium on Facial Recognition Tech is Different from IBM’s and Microsoft’s’, Slate, 11 June 2020, https://slate.com/technology/2020/06/ibm-microsoft-amazon-facial-recognition-technology.html

5 R.W. Picard, ‘Affective Computing’, MIT Media Laboratory Perceptual Computing Section Technical Report, no. 321, 1995, https://affect.media.mit.edu/pdfs/95.picard.pdf

6 Market Watch, Emotion Detection and Recognition (EDR) Market Insights, Status, Latest Amendments and Outlook 2019–2025, 17 September 2020, https://www.marketwatch.com/press-release/emotion-detection-and-recognition-edr-market-insights-status-latest-amendments-and-outlook-2019-2025-2020-09-17

7 See e.g.: F. Hamilton, ‘Police Facial Recognition Robot Identifies Anger and Distress’, The Sunday Times, 15 August 2020; S. Cha, ‘“Smile with Your Eyes”: How To Beat South Korea’s AI Hiring Bots and Land a Job’, Reuters, 13 January 2020, https://in.reuters.com/article/us-southkorea-artificial-intelligence-jo/smile-with-your-eyes-how-to-beat-south-koreas-ai-hiring-bots-and-land-a-job-idINKBN1ZC022; R. Metz, ‘There’s a New Obstacle to Landing a Job After College: Getting Approved by AI’, CNN, 15 January 2020, https://edition.cnn.com/2020/01/15/tech/ai-job-interview/index.html; A. Chen and K. Hao, ‘Emotion AI Researchers Say Overblown Claims Give Their Work a Bad Name’, MIT Technology Review, 14 February 2020, https://www.technologyreview.com/2020/02/14/844765/ai-emotion-recognition-affective-computing-hirevue-regulation-ethics/; D. Harwell, ‘A Face-Scanning Algorithm Increasingly Decides Whether You Deserve the Job’, The Washington Post, 6 November 2019, https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/; iBorderCtrl, ‘IBorderControl: The Project’, 2016, https://www.iborderctrl.eu/The-project; C. Rajgopal, ‘SA Startup Launches Facial Recognition Software that Analyses Moods’, The Independent Online, 29 July 2019, https://www.iol.co.za/technology/sa-startup-launches-facial-recognition-software-that-analyses-moods-30031112

8 Please see the ‘Background to Emotion Recognition’ section in this report for a detailed analysis of this.

9 It is important to note that China is not the only country where emotion-recognition technology is being developed and deployed. For a comparative overview of emotion- and affect-recognition technology developments in the EU, US, and China, see: S. Krier, ‘Facing Affect Recognition’, in Asia Society, Exploring AI Issues Across the United States and
23 J.A. Russell, ‘Is There Universal Recognition of Emotion from Facial Expression? A Review of the Cross-Cultural Studies’, Psychological Bulletin, vol. 115, no. 1, 1994, pp. 102–141, https://doi.org/10.1037/0033-2909.115.1.102

24 L.F. Barrett et al., ‘Emotional Expressions Reconsidered: Challenges to Inferring Emotion from Human Facial Movements’, Psychological Science in the Public Interest, vol. 20, no. 1, 2019, https://journals.sagepub.com/doi/10.1177/1529100619832930

25 J.A. Russell and J.M. Fernández-Dols, ‘Coherence between Emotions and Facial Expressions’, The Science of Facial Expression, Oxford Scholarship Online, 2017, doi: 10.1093/acprof:oso/9780190613501.001.0001

26 See e.g.: L.F. Barrett, ‘What Faces Can’t Tell Us’, The New York Times, 28 February 2014, https://www.nytimes.com/2014/03/02/opinion/sunday/what-faces-cant-tell-us.html

30 C. Chen et al., ‘Distinct Facial Expressions Represent Pain and Pleasure Across Cultures’, Proceedings of the National Academy of Sciences of the United States of America, vol. 115, no. 43, 2018, pp. E10013–E10021, https://www.pnas.org/content/115/43/E10013

31 L. Stark, ‘Facial Recognition, Emotion and Race in Animated Social Media’, First Monday, vol. 23, no. 9, 3 September 2018; L. Rhue, ‘Racial Influence on Automated Perceptions of Emotions’, SSRN, November 2018, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3281765; L. Rhue, ‘Emotion-Reading Tech Fails the Racial Bias Test’, The Conversation, 3 January 2019, https://theconversation.com/emotion-reading-tech-fails-the-racial-bias-test-108404. The company referred to in Professor Rhue’s paper, Megvii, is one of the companies sanctioned in the US for supplying authorities in Xinjiang province with face-recognition cameras, used to monitor Uighur citizens.
32 Some recent examples include: L. Safra, C. Chevallier, J. Grezes, and N. Baumard, ‘Tracking Historical Changes in Trustworthiness Using Machine Learning Analyses of Facial Cues in Paintings’, Nature Communications, vol. 11, no. 4728, 2020, https://www.nature.com/articles/s41467-020-18566-7. Also see: Coalition for Critical Technology, ‘Abolish the #TechToPrisonPipeline’, Medium, 23 June 2020, https://medium.com/@CoalitionForCriticalTechnology/abolish-the-techtoprisonpipeline-9b5b14366b16

33 In a similar vein, read about physiognomy’s enduring legacy: A. Daub, ‘The Return of the Face’, Longreads, 3 October 2018, https://longreads.com/2018/10/03/the-return-of-the-face/

36 While the Chinese name 雪亮 (xuěliàng) literally translates to ‘dazzling snow’, it is often used as a figurative expression to describe ‘sharp eyes’. See: D. Bandurski, ‘“Project Dazzling Snow”: How China’s Total Surveillance Experiment Will Cover the Country’, Hong Kong Free Press, 12 August 2018, https://hongkongfp.com/2018/08/12/project-dazzling-snow-chinas-total-surveillance-experiment-set-expand-across-country/; D. Peterson and J. Rudolph, ‘Sharper Eyes: From Shandong to Xinjiang (Part 3)’, China Digital Times, 13 September 2019, https://chinadigitaltimes.net/2019/09/sharper-eyes-shandong-to-xinjiang-part-3/; S. Denyer, ‘China’s Watchful Eye: Beijing Bets on Facial Recognition in a Big Drive for Total Surveillance’, Washington Post, 7 January 2018, https://www.washingtonpost.com/news/world/wp/2018/01/07/feature/in-china-facial-recognition-is-sharp-end-of-a-drive-for-total-surveillance/

37 Axis Communications, a Swedish company, supplied surveillance cameras to the Sharp Eyes project. See: Amnesty International, Out of Control: Failing EU Laws for Digital Surveillance Export, 21 September 2020, https://www.amnesty.org/en/documents/EUR01/2556/2020/en/

40 To read more on Alpha Hawkeye’s participation in Sharp Eyes, see: ‘原创|人工智能 情感计算反恐缉私禁毒应用新方向’ [‘Innovation | AI: New Directions in Affective Computing Counterterror, Anti-Smuggling, and Anti-Drug Applications’], 中国安防行业网 [China Security Industry Network], 17 July 2019, http://news.21csp.com.cn/C19/201907/11383210.html; ‘阿尔法鹰眼’ [‘Alpha Hawkeye’], Homepage, http://www.alphaeye.com.cn/. For more on ZNV Liwei’s
51 Ibid.

52 Ibid.

53 刘缘、庾永波 [L. Yan and L. Yongbo], ‘在安检中加强“微表情”识别的思考 – – 基于入藏公路安检的考察’ [‘Strengthening the Application of Micro-Expression in Security Inspection: Taking the Road Security Check Entering Tibet as an Example’], 《四川警察学院学报》 [Journal of Sichuan Police College], vol. 30, no. 1, February 2019.

54 Ibid.

55 Ibid.

56 Early descriptions of ‘Alpha Eye’ closely mirror later write-ups of Alpha Hawkeye, which itself has translated its name as ‘Alpha Eye’ in some images. This report assumes both names refer to the same company. 察网 [Cha Wang], ‘比“阿法狗”更厉害的是中国的“阿尔法眼”’ [‘China’s “Alpha Eye” is More Powerful Than an “Alpha Dog”’], 17 March 2016, http://www.cwzg.cn/politics/201603/26982.html; 杨丽 [Y. Li], ‘‘阿尔法眼’义乌试验两天 查到5个带两张身份证的人’ [‘“Alpha Eye” Trialled in Yiwu for Two Days, Finds 5 People Carrying Two State ID Cards’],

59 ‘生物识别3.0时代,阿尔法鹰眼想用“情感计算”布局智慧安全’ [‘In the Age of Biometrics 3.0, Alpha Hawkeye Wants to Use “Affective Computing” to Deploy Smart Security’], Sohu, 28 April 2017, https://www.sohu.com/a/137016839_114778

60 Sohu, ‘生物识别3.0时代,阿尔法鹰眼想用“情感计算”布局智慧安全’ [‘In the Age of Biometrics 3.0, Alpha Hawkeye Wants to Use “Affective Computing” to Deploy Smart Security’], 28 April 2017, https://www.sohu.com/a/137016839_114778; 《南京日报》 [Nanjing Daily], ‘多个城市已利用AI读心加强反恐安防’ [‘Several Cities Have Used AI Mind Reading to Strengthen Counterterrorist Security’], 29 September 2018, http://njrb.njdaily.cn/njrb/html/2018-09/29/content_514652.htm; Sohu, ‘iRank: 基于互联网类脑架构的阿尔法鹰眼发展趋势评估’ [‘iRank: Analysis of Alpha Hawkeye’s Internet-like Brain Architecture Development Trend’], 28 March 2018, https://www.sohu.com/a/226632874_297710; 云涌 [Yunyong (Ningbo News)], ‘专访之三:看一眼就读懂你,甬企这双“鹰眼”安防科技够“黑”’ [‘Interview 3: It Can Understand You in One Glance, This Ningbo Company’s Pair of “Hawk Eyes”
a model requires at least 10,000 such data samples, with each costing 2,000–3,000 yuan (USD 305–457).

97 Xinktech, ‘强强联合,推动行业进步 – – 云思创智受邀与“中南财经政法大学刑事司法学院”进行技术交流’ [‘Strong Alliance to Promote Industry Progress: Xinktech Invited to Conduct a Technical Exchange With the “School of Criminal Justice, Zhongnan University of Economics and Law”’], 29 December 2018, http://www.xinktech.com/news_detail9.html

98 Ibid.

99 Sohu, ‘云思创智“灵视多模态情绪研判审讯系统”亮相南京安博会’ [‘Xinktech “Lingshi Multimodal Emotion Research and Interrogation System” Appeared at Nanjing Security Expo’], 21 March 2019, https://www.sohu.com/a/302906390_120049574

100 Xinktech, ‘江苏省公安厅举办刑侦部门审讯专业人才培训 云思创智应邀分享微表情技术实战应用’ [‘Jiangsu Province’s Public Security Department Criminal Investigation Bureau Held Training for Interrogation Professionals, Xinktech Was Invited to Share Practical Applications of Micro-Expression Technology’], 29 December 2018, http://www.xinktech.com/news_detail8.html, and identical article on Xinktech’s WeChat public account: https://mp.weixin.qq.com/s/InCb8yR68v1FiMOaSJZwMA. The same year, Xinktech won the Jiangsu Province Top Ten AI Products Award; see: Xinktech, ‘云思创智“沉思者智能算法建模训练平台”荣获“2018年度江苏省十佳优秀人工智能产品”奖’ [‘Xinktech’s “Thinker Intelligent Algorithm Model-Building Training Platform” Wins “2018 Jiangsu Top Ten Excellent AI Products” Award’], 8 September 2018, http://www.xinktech.com/news_detail3.html

101 中国青年网 [Youth.cn], ‘“曲靖模式”先行项目 – – 翼开科技,成功在曲落地扎根’ [‘The First “Qujing Style” Project: EmoKit Technology Successfully Takes Root in Qujing’], 5 September 2019, http://finance.youth.cn/finance_cyxfgsxw/201909/t20190905_12062221.htm

102 赵青晖 [Z. Qinghui], ‘凭借识别人的情绪,他们做到了2000多万用户、1000多万订单’ [‘Relying on Recognizing Emotions, They Reached Over 20 Million Users and More Than 10 Million Orders’], 芯基建 [‘Core Infrastructure’ WeChat public account], 1 June 2017, https://mp.weixin.qq.com/s/JdhZbS4Ndb_mfq4dV7A0_g

103 US-based multimodal emotion-recognition provider Eyeris featured an in-vehicle product installed in a Tesla and announced interest from UK and Japanese auto manufacturers. Cf. e.g. PR Newswire, ‘Eyeris Introduces World’s First In-Cabin Sensor Fusion AI at CES2020’, 6 January 2020, https://www.prnewswire.com/news-releases/eyeris-introduces-worlds-first-in-cabin-sensor-fusion-ai-at-ces2020-300981567.html. Boston-based Affectiva is one of at least half a dozen other companies known to be developing in-vehicle emotion-based driver-safety products. Some in the field anticipate that emotion-sensing technologies in cars will become mainstream within the next three years. See M. Elgan, ‘What Happens When Cars Get Emotional?’, Fast Company, 27 June 2019, https://www.fastcompany.com/90368804/emotion-sensing-cars-promise-to-make-our-roads-much-safer. Companies working on driver-fatigue recognition include EyeSight Technologies, Guardian Optical Technologies, Nuance Automotive, Smart Eye, and Seeing Machines. For details on the Euro NCAP program, see: S. O’Hear, ‘EyeSight Scores $15M to Use Computer Vision to Combat Driver Distraction’, TechCrunch, 23 October 2018, https://techcrunch.com/2018/10/23/eyesight. The EU has sponsored the European New Car Assessment Programme, a voluntary vehicle-safety rating system that calls for inclusion of driver-monitoring systems.

104 Sohu, ‘乐视无人驾驶超级汽车亮相6股有望爆发’ [‘LeEco Driverless Supercar Unveiled, 6 Stocks Expected to Surge’], 21 April 2016, https://www.sohu.com/a/70619656_115411; M. Phenix, ‘From China, A Shot Across Tesla’s Bow’, BBC, 21 April 2016, http://www.bbc.com/autos/story/20160421-from-china-a-shot-across-teslas-bow

105 Great Wall Motor Company Limited, 2019 Corporate Sustainability Report, p. 34, https://res.gwm.com.cn/2020/04/26/1657959_130_E-12.pdf
106 See: 长城网 [Hebei.com.cn], ‘2019年那些用过就无法回头的汽车配置: L2自动驾驶’ [‘In 2019 There’s No Looking Back After Using Those Cars’ Configurations: L2 Autonomous Driving’], 3 January 2020, http://auto.hebei.com.cn/system/2020/01/02/100152238.shtml

107 Sina Cars [新浪汽车], ‘长城汽车发布生命体征监测技术 2021款WEY VV6将成为首款车型’ [‘Great Wall Motors Releases Vital Signs Monitoring Technology, the 2021 WEY VV6 Will Become the First Model’], 8 June 2020, https://auto.sina.com.cn/newcar/2020-06-08/detail-iircuyvi7378483.shtml

108 重庆晨报 [Chongqing Morning Post], ‘上游新闻直播“UNI-TECH”科技日活动,秀智能长安汽车实力圈粉’ [‘Upstream News Broadcasts “UNI-TECH” Technology Day Events, Smart Chang’an Automobile’s Strength Wins Fans’], 11 April 2020, https://www.cqcb.com/qiche/2020-04-12/2323984_pc.html

109 Ibid.

110 China Daily, ‘真车来了!华为 HiCar在卓悦中心展示硬核智慧出行服务’ [‘A Real Car Has Arrived! Huawei Showcases HiCar’s Hard-Core Smart Transportation Services at One Avenue Center’], 8 June 2020, http://cn.chinadaily.com.cn/a/202006/08/WS5eddda77a31027ab2a8ceed4.html. Startups specialising in various AI applications, including voice and emotion recognition, have also been known to supply these capabilities to car companies, such as the company AISpeech’s (思必驰) partnerships with two state-owned car manufacturers, BAIC Group and FAW Group. See: Sina Finance, ‘华为、英特尔、富士康等合作伙伴 AI企业思必驰完成E轮4.1亿元融资’ [‘Huawei, Intel, Foxconn, and Other Cooperative Partners of AI Company AISpeech Complete E Round of 410 Million Yuan Financing’], 7 April 2020, https://finance.sina.cn/2020-04-07/detail-iimxxsth4098224.d.html

111 杨雪娇 [Y. Xuejiao], ‘发力行为识别技术 太古计算的AI生意经’ [‘Generating Momentum in Behavior Recognition Technology: Taigusys Computing’s AI Business Sense’], CPS中安网 [‘CPS Zhong’an Network’ WeChat public account], 24 June 2019, https://mp.weixin.qq.com/s/Q7_Kqghotd7X38qXw4gLCg

112 Sina Cars [新浪汽车], ‘车内生物监测、乘员情绪识别,爱驰U5的黑科技你了解多少?’ [‘Biometric Monitoring and Passenger Emotion Recognition in Vehicles: How Much Do You Understand About the Aiways U5’s Black Technology?’], 19 July 2019, https://k.sina.com.cn/article_5260903737_13993053900100iug3.html; AIWAYS, ‘深化探索AI突围之路 爱驰汽车亮相2019中国国际智能产业博览会’ [‘Deepening the Exploration of AI Breakthroughs: AIWAYS Appears at the 2019 China International Smart Industry Expo’], 27 August 2019, https://www.ai-ways.com/2019/08/27/9162/

113 凤凰网 [iFeng News], ‘打造保险科技转型急先锋 平安产险携多项AI技术亮相世界人工智能大会’ [‘Creating a Pioneer in the Transformation of Insurance Technology: Ping An Property Insurance Showcases Several AI Technologies at the World Artificial Intelligence Conference’], 29 August 2019, http://sn.ifeng.com/a/20190829/7693650_0.shtml

114 Xinhua, ‘金融与科技加速融合迈入“智能金融时代”’ [‘Accelerating the Fusion of Finance and Technology into the “Era of Smart Finance”’], 30 August 2019, http://www.xinhuanet.com/2019-08/30/c_1124942152.htm

115 姬建岗、郭晓春、张敏、冯春强 [J. Jiangang et al.], ‘人脸识别技术在高速公路打逃中的应用探讨’ [‘Discussion on Application of Face Recognition Technology in Highway [Toll] Evasion’], 《中国交通信息化》 [China ITS Journal], no. 1, 2018.

116 Ibid.

117 Ibid.

118 Ibid.

119 广州市科学技术局 [Guangzhou Municipal Science and Technology Bureau], ‘广州市重点领域研发计划 2019 年度“智能网联汽车”(征求意见稿)’ [‘Guangzhou Key Areas for Research and Development Annual Plan 2019: “Smart Connected Cars” (Draft for Comments)’], pp. 5–6, http://kjj.gz.gov.cn/GZ05/2.2/201908/b6444d5e26fc4a628fd7e90517dff499/files/452599ab52df422c999075acf19a3654.pdf
142 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, pp. 7–8, https://thedisconnect.co/three/camera-above-the-classroom/. Hanwang Technology, now known as ‘China’s grandfather of facial recognition’, began working on face recognition ID in part due to its early pivot away from fingerprint-scanning and towards contactless identity verification after the SARS epidemic. During the COVID-19 pandemic, the firm has claimed it can accurately use face recognition to identify people wearing masks. See: M. Pollard, ‘Even Mask-Wearers Can Be ID’d, China Facial Recognition Firm Says’, Reuters, 9 March 2020, https://www.reuters.com/article/us-health-coronavirus-facial-recognition/even-mask-wearers-can-be-idd-china-facial-recognition-firm-says-idUSKBN20W0WL; L. Lucas and E. Feng, ‘Inside China’s Surveillance State’, Financial Times, 20 July 2018, https://www.ft.com/content/2182eebe-8a17-11e8-bf9e-8771d5404543

143 亿欧 [Yi’ou], ‘海风教育上线AI系统“好望角”,情绪识别是AI落地教育的重点方向?’ [‘Haifeng Education “Cape of Hope” AI System Goes Online, Is Emotion Recognition the Key Direction for Implementing AI in Education?’], 23 April 2018, https://www.iyiou.com/p/70791.html; Sina, ‘海风教育发布K12落地AI应用“好望角” 借助情绪识别赋能教学’ [‘Haifeng Education Releases K-12 Implementation of AI Application “Cape of Good Hope”, Empowers Teaching With Aid of Emotion Recognition’], 23 April 2018, http://edu.sina.com.cn/l/2018-04-23/doc-ifzfkmth7156702.shtml; Haifeng Education, ‘海风优势’ [‘Haifeng Advantage’], https://www.hfjy.com/hfAdvantage?uid=uid_1593910508_998072. Haifeng Education (海风教育) has a user base of over 4 million students, and focuses on K-12 online education and educational guidance in areas such as college preparedness. See: Haifeng Education, ‘海风教育好望角系统 打响“教育+AI”第一枪’ [‘Haifeng Education’s Cape of Good Hope System Fires First Shot of “AI+Education”’], 13 February 2019, https://www.chengjimanfen.com/yiduiyifudao_xinwen/414

144 杭州网 [Hangzhou.com.cn], ‘智慧课堂行为管理系统上线 教室“慧眼”锁定你’ [‘Smart Classroom Behavior Management System Goes Online, Classroom’s “Smart Eyes” Lock Onto You’], 17 May 2018, https://hznews.hangzhou.com.cn/kejiao/content/2018-05/17/content_7003432.html

145 新京报网 [The Beijing News], ‘杭州一中学课堂引入人脸识别“黑科技”’ [‘Hangzhou No. 11 Middle School Introduces “Black Technology” for Face Recognition’], 18 May 2018, http://www.bjnews.com.cn/news/2018/05/18/487458.html

146 Sohu, ‘智能错题本、人脸情绪识别、课堂即时交互、智慧云课堂 – – 联想智慧教育将为北京“新高考”赋能’ [‘Smart Wrong-Answer Book, Facial Emotion Recognition, Immediate Classroom Interaction, Smart Cloud Classroom: Lenovo Smart Education Will Enable the “New Gaokao” for Beijing’], 9 September 2019, https://www.sohu.com/a/339676891_363172; and 经济网 [CE Weekly], ‘联想打造全国首个科技公益教育平台 开播首课25万人观看’ [‘Lenovo Builds Country’s First Science and Technology Public Welfare Education Platform, Broadcasts First Class to 250,000 Viewers’], 6 March 2020, http://www.ceweekly.cn/2020/0306/289041.shtml. A 错题本 (cuòtíběn) is a workbook containing incorrect answers to questions and explaining why they are wrong.

147 Tianyancha [天眼查], ‘蜜枣网:自主研发情绪智能分析系统,深度改变零售与幼教’ [‘Meezao: Independent Research and Development of Emotion Intelligence Analytical Systems, Deeply Changing Retail and Preschool Education’], 28 June 2018, https://news.tianyancha.com/ll_i074g8rnrd.html

148 Meezao, ‘AI技术将改进基础教育的方法与效率’ [‘AI Technology Will Improve the Methods and Efficiency of Basic Education’], 26 December 2019, http://www.meezao.com/news/shownews.php?id=62

149 网易 [NetEase], ‘新东方助力打造雅安首个AI双师课堂’ [‘New Oriental Helps Build Ya’an’s First AI Dual-Teacher Classroom’], 6 September 2018, https://edu.163.com/18/0906/11/DR14325T00297VGM.html

150 极客公园 [GeekPark], ‘「今天我的课堂专注度在三位同学中最高!」比邻东方「AI 班主任」用数据量化孩子课堂表现’ [‘“Today My Class Concentration Level is the Highest Among Three Classmates!” Bling ABC New Oriental “AI Class Teacher” Uses Data to Quantify Children’s Classroom Performance’], 2 November 2018, https://www.geekpark.net/news/234556

151 Taigusys Computing, ‘产品中心: AI课堂专注度分析系统’ [‘Product Center: AI Classroom Concentration Analysis System’], http://www.taigusys.com/procen/procen162.html

152 In Mandarin, TAL goes by the name 好未来 (‘good future’). Originally named Xue’ersi (学而思), the company changed its name to TAL in 2013, but still retains the Xue’ersi name on products including the Magic Mirror. See: 天元数据 [Tianyuan Data], ‘揭秘中国市值最高教育巨头: 狂奔16年,靠什么跑出‘好未来’’ [‘Unmasking China’s Highest Market Value Education Giant: In a 16-Year Mad Dash, What to Rely On to Run Towards a “Good Future”?’], 10 June 2019, https://www.tdata.cn/int/content/index/id/viewpoint_102421.html

153 Sohu, ‘打造“未来智慧课堂” 科技让教育更懂孩子’ [‘Create “Future Smart Classroom” Technology to Make Education Understand Children More’], 23 October 2017, https://www.sohu.com/a/199552733_114988

154 李保宏 [L. Baohong], ‘人工智能在中俄两国教育领域发展现状及趋势’ [‘The Status Quo and Trend of Artificial Intelligence in the Field of Education in China and Russia’], Science Innovation, vol. 7, no. 4, 2019, p. 134, http://sciencepg.org/journal/archive?journalid=180&issueid=180070

155 宁波新闻网 [Ningbo News Network], ‘威创集团发布儿童成长平台,宣布与百度进行AI技术合作’ [‘VTron Group Releases Child Growth Platform, Announces AI Technology Cooperation with Baidu’]

156 Taigusys Computing, ‘产品中心: AI课堂专注度分析系统’ [‘Product Center: AI Classroom Concentration Analysis System’], http://www.taigusys.com/procen/procen162.html

157 For announcement of the Tsinghua collaboration, see: PR Newswire, ‘TAL Lays out Future Education in Terms of Scientific & Technological Innovation at the Global Education Summit’, 6 December 2017, https://www.prnewswire.com/news-releases/tal-lays-out-future-education-in-terms-of-scientific-technological-innovation-at-the-global-education-summit-300567747.html. For documentation of FaceThink’s partnership with TAL, see: 芥末堆 [Jiemodui], ‘FaceThink获好未来千万级投资,将情绪识别功能引入双师课堂’ [‘FaceThink Receives Tens of Millions in Investment from TAL, Introduces Emotion Recognition Function in Dual-Teacher Classrooms’], 3 May 2017, https://www.jiemodui.com/N/70500; 铅笔道 [Qianbidao], ‘获好未来投资 30岁副教授AI识别20种表情 实时记录学生上课状态’ [‘Winning Investment From TAL, 30-Year-Old Associate Professor’s AI Recognizes 20 Expressions and Records Students’ In-Class Status in Real Time’], 9 May 2017, https://www.pencilnews.cn/p/13947.html

158 Meezao, ‘创新为本,AI为善 – – 蜜枣网发布幼儿安全成长智能系统’ [‘Innovation-Oriented, AI for Good: Meezao Presents a Smart System for Children’s Safe Growth’], 7 January 2019, http://www.meezao.com/news/shownews.php?id=36; 拓扑社 [Topological Society], ‘以零售场景切入,蜜枣网利用情绪识别分析用户喜好降低流失率’ [‘Entering from Retail Scenarios, Meezao Uses Emotion Recognition to Analyze User Preferences and Reduce Turnover Rate’], QQ, 15 May 2018, https://new.qq.com/omn/
Baidu’], 1 July 2020, http://www.nbyoho.com/
news/1577803932239322942.html; 证券日报 159 For details of the RYB incident, see: C. Buckley,
网 [Securities Daily], ‘威创CPO郭丹:威创对 ‘Beijing Kindergarten Is Accused of Abuse, and
科技赋能幼教三个层次的认知与实践’ [‘VTron Internet Erupts in Fury’, The New York Times,
CPO Guo Dan: VTron’s Three-Tiered Thinking 25 November 2017, https://www.nytimes.
and Practice In Empowering Preschool com/2017/11/24/world/asia/beijing-kinder-
Education With Science and Technology’], 16 garten-abuse.html. Meezao, ‘创新为本,AI
November 2018, http://www.zqrb.cn/gscy/ 为善 --- 蜜枣网发布幼 儿安全成长智能系统’
gongsi/2018-11-16/A1542353382387.html [Innovation-Oriented, AI for Good – Meezao
Presents a Smart System for Children’s Safe
Growth’], 7 January 2019, http://www.meezao.
com/news/shownews.php?id=36
169 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, pp. 11–12, https://thedisconnect.co/three/camera-above-the-classroom/

170 Ibid.

171 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html

172 Sina, ‘杭州一中学引进智慧课堂行为管理系统引热议’ [‘Hangzhou No. 11 Middle School Introduces Smart Classroom Behavior Management System’], 18 July 2018, http://edu.sina.com.cn/zxx/2018-07-18/doc-ihfnsvyz9043937.shtml

173 D. Lee, ‘At This Chinese School, Big Brother Was Watching Students – and Charting Every Smile or Frown’, Los Angeles Times, 30 June 2018, https://www.latimes.com/world/la-fg-china-face-surveillance-2018-story.html

174 Ibid.

175 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html

176 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, p. 13, https://thedisconnect.co/three/camera-above-the-classroom/

177 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, p. 18, https://thedisconnect.co/three/camera-above-the-classroom/

178 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html

179 Ibid.

180 Ibid.

181 Hangzhou No. 11 Middle School, ‘杭州第十一中“智慧课堂管理系统”引争议——课堂需要什么样的“高科技”’ [‘“Smart Classroom Management System” in Hangzhou No. 11 Middle School Causes Controversy – What Kind of “High Tech” Does a Classroom Need?’], 8 June 2018, http://www.hsyz.cn/article/detail/idhsyz_6336.htm

182 Ibid.

183 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, p. 10, https://thedisconnect.co/three/camera-above-the-classroom/

184 网易 [NetEase], ‘新东方助力打造雅安首个AI双师课堂’ [‘New Oriental Helps Build Ya’an’s First AI Dual Teacher Classroom’], 6 September 2018, https://edu.163.com/18/0906/11/DR14325T00297VGM.html

185 岳丽丽 [Y. Lili], ‘好未来推出“WISROOM”智慧课堂解决方案,升级“魔镜”’ [‘TAL Launches “WISROOM” Smart Classroom Solution, Upgrades “Magic Mirror”’], 猎云网 [Lieyun Network], 18 July 2018, https://www.lieyunwang.com/archives/445413

186 李保宏 [L. Baohong], ‘人工智能在中俄两国教育领域发展现状及趋势’ [‘The Status Quo and Trend of Artificial Intelligence in the Field of Education in China and Russia’], Science Innovation, vol. 7, no. 4, 2019, p. 134, http://sciencepg.org/journal/archive?journalid=180&issueid=1800704; 张无荒 [Z. Wuhuang], ‘海风教育让在线教育进入智能学习时代 “好望角”AI系统发布’ [‘Haifeng Education Brings Online Education into the Era of Smart Learning, Releases “Cape of Good Hope” AI System’], Techweb, 23 April 2018, http://ai.techweb.com.cn/2018-04-23/2657964.shtml

187 葛熔金 [G. Rongjin], ‘杭州一高中教室装组合摄像头,分析学生课堂表情促教学改进’ [‘A High School in Hangzhou Equipped Classrooms with Combined Cameras, Analyzes Students’ Facial Expressions in the Classroom to Improve Teaching’], ThePaper [澎湃], 16 May 2018, https://www.thepaper.cn/newsDetail_forward_2133853
188 赵雨欣 [Z. Yuxin], ‘人工智能抓取孩子课堂情绪?在线教育还能这样玩’ [‘AI Can Capture Children’s Emotions in the Classroom? Online Education Can Do This Too’], 成都商报 [Chengdu Economic Daily], 15 December 2017, https://www.cdsb.com/Public/cdsb_offical/2017-12-15/162950465712187146040444024408208084222.html

189 Hangzhou No. 11 Middle School, ‘杭州第十一中“智慧课堂管理系统”引争议——课堂需要什么样的“高科技”’ [‘“Smart Classroom Management System” in Hangzhou No. 11 Middle School Causes Controversy – What Kind of “High Tech” Does a Classroom Need?’], 8 June 2018, http://www.hsyz.cn/article/detail/idhsyz_6336.htm

190 Ibid.

191 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html

192 腾讯网 [Tencent News], ‘人在坐,AI在看’ [‘While People Sit, AI Watches’], 3 September 2019, https://new.qq.com/omn/20190903/20190903A09VG600.html

193 For instance, one article recounted a court case from 2018, which revealed that an employee of major AI company iFlytek illegally sold student data. The employee was in charge of a school-registration management system in Anhui province, and was reported to have sold data from 40,000 students. See: 腾讯网 [Tencent News], ‘人在坐,AI在看’ [‘While People Sit, AI Watches’], 3 September 2019, https://new.qq.com/omn/20190903/20190903A09VG600.html

194 杭州网 [Hangzhou.com.cn], ‘智慧课堂行为管理系统上线 教室“慧眼”锁定你’ [‘Smart Classroom Behavior Management System Goes Online, Classroom’s “Smart Eyes” Lock Onto You’], 17 May 2018, https://hznews.hangzhou.com.cn/kejiao/content/2018-05/17/content_7003432.htm

195 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html; 新京报网 [The Beijing News], ‘杭州一中学课堂引入人脸识别“黑科技”’ [‘Hangzhou No. 11 Middle School Introduces “Black Technology” for Face Recognition’], 18 May 2018, http://www.bjnews.com.cn/news/2018/05/18/487458.html. The claim that the Smart Classroom Behavior Management System only displays data on groups rather than individuals is at odds with the description of the monitors teachers can see, which provide push notifications about which students are inattentive.

196 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, p. 10, https://thedisconnect.co/three/camera-above-the-classroom/

197 葛熔金 [G. Rongjin], ‘杭州一高中教室装组合摄像头,分析学生课堂表情促教学改进’ [‘A High School in Hangzhou Equipped Classrooms with Combined Cameras, Analyzes Students’ Facial Expressions in the Classroom to Improve Teaching’], ThePaper [澎湃], 16 May 2018, https://www.thepaper.cn/newsDetail_forward_2133853

198 新京报网 [The Beijing News], ‘杭州一中学课堂引入人脸识别“黑科技”’ [‘Hangzhou No. 11 Middle School Introduces “Black Technology” for Face Recognition’], 18 May 2018, http://www.bjnews.com.cn/news/2018/05/18/487458.html. Hikvision’s Education Industry director Yu Yuntao echoed Zhang’s statement about the expression-recognition data only being used for teachers’ reference; see: 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html

199 L. Lucas and E. Feng, ‘Inside China’s Surveillance State’, Financial Times, 20 July 2018, https://www.ft.com/content/2182eebe-8a17-11e8-bf9e-8771d5404543
200 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, p. 19, https://thedisconnect.co/three/camera-above-the-classroom/

201 Sohu, ‘打造“未来智慧课堂”科技让教育更懂孩子’ [‘Create “Future Smart Classroom” Technology to Make Education Understand Children More’], 23 October 2017, https://www.sohu.com/a/199552733_114988; Hangzhou No. 11 Middle School, ‘未来已来!未来智慧校园长啥样?快来杭十一中看看’ [‘The Future is Here! What Will the Smart Campus of the Future Look Like? Come Quick and See at Hangzhou No. 11 Middle School’], WeChat, 9 May 2018, https://mp.weixin.qq.com/s/zvH3OZH3Me2QLQB5lPA3vQ

202 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html. The same month that Hangzhou No. 11 launched its Smart Classroom Behavior Management System, the nearby Jiangsu Province Education Department, the Jiangsu Institute of Economic and Information Technology, and the Jiangsu Province Department of Finance co-released the Guiding Opinions on Jiangsu Province Primary and High School Smart Campus Construction (《江苏省中小学智慧校园建设指导意见(试行)》). This policy document specifically references smart classrooms as ‘collecting teaching and learning behavior data throughout the entire process’. See: 江苏省教育厅 [Jiangsu Education Department], ‘关于印发智慧校园建设指导意见的通知’ [‘Notice on Printing and Distributing Guiding Opinions on Smart Campus Construction’], 23 May 2018, http://jyt.jiangsu.gov.cn/art/2018/5/23/art_61418_7647103.html

203 Taigusys Computing, ‘深圳安全监控系统进校园’ [‘Shenzhen Security Surveillance System Enters Campus’], 19 January 2019, http://www.taigusys.com/news/news120.html

204 新京报网 [The Beijing News], ‘杭州一中学课堂引入人脸识别“黑科技”’ [‘Hangzhou No. 11 Middle School Introduces “Black Technology” for Face Recognition’], 18 May 2018, http://www.bjnews.com.cn/news/2018/05/18/487458.html

205 新京报网 [The Beijing News], ‘杭州一中学课堂引入人脸识别“黑科技”’ [‘Hangzhou No. 11 Middle School Introduces “Black Technology” for Face Recognition’], 18 May 2018, http://www.bjnews.com.cn/news/2018/05/18/487458.html; Hangzhou No. 11 Middle School, ‘我校隆重举行“未来智慧校园的探索与实践”活动’ [‘Our School Held a Grand Activity of “Exploration and Practice of Future Smart Campus”’], 16 May 2018, http://www.hsyz.cn/article/detail/idhsyz_6308.htm. The ‘smart cafeteria’ food-monitoring project was undertaken by the Zhejiang Primary and Secondary School Education and Logistics Management Association’s Primary and Secondary School Branch, and the Hangzhou Agricultural and Sideline Products Logistics Network Technology Co. Ltd. See: 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html

206 Sohu, ‘打造“未来智慧课堂”科技让教育更懂孩子’ [‘Create “Future Smart Classroom” Technology to Make Education Understand Children More’], 23 October 2017, https://www.sohu.com/a/199552733_114988

207 36氪 [36Kr], ‘新东方发布教育新产品“AI班主任”,人工智能这把双刃剑,教育公司到底怎么用?’ [‘New Oriental Releases New Education Product “AI Teacher”, A Double-Edged Sword of AI: How Can Education Companies Use It?’], 29 October 2018, https://36kr.com/p/1722934067201

208 Y. Xie, ‘Camera Above the Classroom’, The Disconnect, no. 3, Spring 2019, pp. 6, 21, https://thedisconnect.co/three/camera-above-the-classroom/

209 Ibid.

210 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html
3. Emotion Recognition and Human Rights

211 J. Ruggie, ‘Guiding Principles on Business and Human Rights: Implementing the United Nations “Protect, Respect and Remedy” Framework’, Report of the Special Representative of the Secretary-General on the Issue of Human Rights and Transnational Corporations and other Business Enterprises, A/HRC/17/31, UN Human Rights Council, 17th Session, 21 March 2011, https://undocs.org/A/HRC/17/31

212 D. Kaye, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, A/HRC/38/35, UN Human Rights Council, 38th Session, 6 April 2018, para 10, https://undocs.org/A/HRC/38/35

213 A. Chaskalson, ‘Dignity as a Constitutional Value: A South African Perspective’, American University International Law Review, vol. 26, no. 5, 2011, p. 1382, https://digitalcommons.wcl.american.edu/cgi/viewcontent.cgi?article=2030&context=auilr

214 C. McCrudden, ‘Human Dignity and Judicial Interpretation of Human Rights’, The European Journal of International Law, vol. 19, no. 4, 2008, http://ejil.org/pdfs/19/4/1658.pdf

215 ARTICLE 19, Right to Online Anonymity: Policy Brief, June 2015, https://www.article19.org/data/files/medialibrary/38006/Anonymity_and_encryption_report_A5_final-web.pdf. Also see: S. Chander, ‘Recommendations for a Fundamental Rights-Based Artificial Intelligence Regulation’, European Digital Rights, 4 June 2020, https://edri.org/wp-content/uploads/2020/06/AI_EDRiRecommendations.pdf

216 Article 17(1), ICCPR; Article 11, ACHR (‘2. No one may be the object of arbitrary or abusive interference with his private life, his family, his home, or his correspondence […] 3. Everyone has the right to the protection of the law against such interference […]’). Also see: UN Human Rights Committee, General Comment No. 16 (Article 17, ICCPR), 8 April 1988, para 3, http://tbinternet.ohchr.org/Treaties/CCPR/Shared%20Documents/1_Global/INT_CCPR_GEC_6624_E.doc (noting that ‘[t]he term “unlawful” means that no interference can take place except in cases envisaged by the law’, and that ‘[i]nterference authorised by States can only take place on the basis of law, which itself must comply with the provisions, aims and objectives of the Covenant’); Necessary and Proportionate: International Principles on the Application of Human Rights to Communications Surveillance, Principle 1, https://necessaryandproportionate.org/principles (these principles apply international human rights law to modern digital surveillance; an international coalition of civil society, privacy, and technology experts drafted them in 2013, and over 600 organisations around the world have endorsed them).

217 UN High Commissioner for Human Rights, The Right to Privacy in the Digital Age: Report of the United Nations High Commissioner for Human Rights, A/HRC/39/29, UN Human Rights Council, 39th Session, 3 August 2018, https://undocs.org/A/HRC/39/29

218 UN Human Rights Council, Report of the Special Rapporteur on the Right to Privacy, Joseph A. Cannataci, A/HRC/34/60, 24 February 2017, para 17, https://ap.ohchr.org/documents/dpage_e.aspx?si=A/HRC/34/60

219 Article 12.3, International Covenant on Civil and Political Rights, 16 December 1966 (23 March 1976), https://www.ohchr.org/en/professionalinterest/pages/ccpr.aspx

220 ARTICLE 19, The Global Principles on Protection of Freedom of Expression and Privacy, 19 January 2018, http://article19.shorthand.com/

221 General Comment No. 34, CCPR/C/GC/34, para 10; J. Blocher, ‘Rights To and Not To’, California Law Review, vol. 100, no. 4, 2012, pp. 761–815 (p. 770).

222 UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, UN Doc A/68/362, 4 September 2013.
223 UN Office of the High Commissioner for Human Rights, Guiding Principles on Business and Human Rights, p. 15, https://www.ohchr.org/documents/publications/guidingprinciplesbusinesshr_en.pdf

224 The right of peaceful assembly includes the right to hold meetings, sit-ins, strikes, rallies, events, or protests, both offline and online. See: UN High Commissioner for Human Rights, Impact of New Technologies on the Promotion and Protection of Human Rights in the Context of Assemblies, Including Peaceful Protests: Report of the United Nations High Commissioner for Human Rights, A/HRC/44/24, 24 June 2020, para 5, https://undocs.org/en/A/HRC/44/24

225 E. Selinger and A.F. Cahn, ‘Did You Protest Recently? Your Face Might Be in a Database’, The Guardian, 17 July 2020, https://www.theguardian.com/commentisfree/2020/jul/17/protest-black-lives-matter-database; The Wire, ‘Delhi Police Is Now Using Facial Recognition Software to Screen “Habitual Protestors”’, 29 December 2019, https://thewire.in/government/delhi-police-is-now-using-facial-recognition-software-to-screen-habitual-protestors

226 UN Human Rights Council, Report of the Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association, 17 May 2019, para 57, https://www.ohchr.org/EN/Issues/AssemblyAssociation/Pages/DigitalAge.aspx

227 Organization for Security and Co-operation in Europe, ‘Right to be Presumed Innocent and Privilege against Self-Incrimination’, Legal Digest of International Fair Trials, p. 99, https://www.osce.org/files/f/documents/1/f/94214.pdf#page=90

228 UN Human Rights Council, Resolution on the Right to Privacy in the Digital Age, A/HRC/34/L.7, 23 March 2017, p. 3, https://digitallibrary.un.org/record/1307661?ln=en

229 凤凰网 [iFeng News], ‘打造保险科技转型急先锋 平安产险携多项AI技术亮相世界人工智能大会’ [‘Creating a Pioneer in the Transformation of Insurance Technology, Ping An Property Insurance Company Showcases Several AI Technologies at the World Artificial Intelligence Conference’], 29 August 2019, http://sn.ifeng.com/a/20190829/7693650_0.shtml; Xinhua, ‘金融与科技加速融合迈入“智能金融时代”’ [‘Accelerate the Fusion of Finance and Technology Into the “Era of Smart Finance”’], 30 August 2019, http://www.xinhuanet.com/2019-08/30/c_1124942152.htm

230 Tianyancha [天眼查], ‘蜜枣网:自主研发情绪智能分析系统,深度改变零售与幼教’ [‘Meezao: Independent Research and Development of Emotion Intelligence Analytical Systems, Deeply Changing Retail and Kindergarten Education’], 28 June 2018, https://news.tianyancha.com/ll_i074g8rnrd.html; Meezao official company website, ‘蜜枣网CEO赵小蒙:除了新零售,情绪智能识别还可以改变幼教’ [‘Meezao CEO Zhao Xiaomeng: In Addition to New Retail, Smart Emotional Recognition Can Also Change Preschool Education’], 19 July 2018, http://www.meezao.com/news/shownews.php?id=35

231 新京报网 [The Beijing News], ‘杭州一中学课堂引入人脸识别“黑科技”’ [‘Hangzhou No. 11 Middle School Introduces “Black Technology” for Face Recognition’], 18 May 2018, http://www.bjnews.com.cn/news/2018/05/18/487458.html

232 蔡村、陈正东、沈蓓蓓 [C. Cun, C. Zhengdong, and S. Beibei], ‘把握瞬间真实:海关旅检应用微表情心理学的构想’ [‘Grasp the Truth in an Instant: Application of Micro-Expressions Psychology in Customs Inspection of Passengers’], 《海关与经贸研究》 [Journal of Customs and Trade], no. 3, 2018, pp. 31, 33.

233 M. Beraja, D.Y. Yang, and N. Yuchtman, Data-Intensive Innovation and the State: Evidence from AI Firms in China (draft), 16 August 2020, http://davidyyang.com/pdfs/ai_draft.pdf

234 中国青年网 [Youth.cn], ‘“曲靖模式”先行项目——翼开科技,成功在曲落地扎根’ [‘The First “Qujing Style” Project – EmoKit Technology Successfully Takes Root in Quluo’], 5 September 2019, http://finance.
235 云涌 [Yunyong (Ningbo News)], ‘专访之三:看一眼就读懂你,甬企这双“鹰眼”安防科技够“黑”’ [‘Interview 3: It Can Understand You in One Glance, This Ningbo Company’s Pair of “Hawk Eyes” Security Technology is “Black” Enough’], 4 May 2018, http://yy.cnnb.com.cn/system/2018/05/04/008748677.shtml

236 Hikvision, Success Stories, https://www.hikvision.com/content/dam/hikvision/en/brochures-download/success-stories/Success-Stories-2019.pdf. In case the original source link is broken, please contact the authors for a copy.

237 Infoweek, ‘Surveillance Networks Operated by China Spread Throughout Latin America’, 7 August 2019, https://infoweek.biz/2019/08/07/seguridad-redes-de-vigilancia-china-latinoamerica/; LaPolitica Online, ‘Huawei Lands in Mendoza to Sell its Supercameras with Facial Recognition and Big Data’, 28 April 2018, https://www.lapoliticaonline.com/nota/112439-huawei-desembarca-en-mendoza-para-vender-sus-supercamaras-con-reconocimiento-facial-y-big-data/; T. Wilson and M. Murgia, ‘Uganda Confirms Use of Huawei Facial Recognition Cameras’, The Financial Times, 21 August 2019, https://www.ft.com/content/e20580de-c35f-11e9-a8e9-296ca66511c9; S. Woodhams, ‘Huawei Says its Surveillance Tech Will Keep African Cities Safe but Activists Worry it’ll Be Misused’, Quartz, 20 March 2020, https://qz.com/africa/1822312/huaweis-surveillance-tech-in-africa-worries-activists/; B. Jardine, ‘China’s Surveillance State Has Eyes on Central Asia’, Foreign Policy, 15 November 2019

240 For examples of exceptions, see: 腾讯网 [Tencent News], ‘“智慧校园”就这么开始了,它是个生意,还是个问题?’ [‘Is This Kind of Start to “Smart Campuses” a Business or a Problem?’], 30 May 2018, https://new.qq.com/omn/20180530/20180530A03695.html; 马爱平 [M. Aiping], ‘AI不仅能认脸,还能“读心”’ [‘AI Doesn’t Just Read Faces, It Can Also “Read Hearts”’], Xinhua, 17 June 2020, http://www.xinhuanet.com/fortune/2020-06/17/c_1126123641.htm

241 《南京日报》 [Nanjing Daily], ‘多个城市已利用AI读心加强反恐安防’ [‘Several Cities Have Used AI Mind Reading to Strengthen Counterterrorist Security’], 29 September 2018, http://njrb.njdaily.cn/njrb/html/2018-09/29/content_514652.htm

242 L. Rhue, ‘Racial Influence on Automated Perceptions of Emotions’, SSRN, November 2018, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3281765; L. Rhue, ‘Emotion-reading Tech Fails the Racial Bias Test’, The Conversation, 3 January 2019, https://theconversation.com/emotion-reading-tech-fails-the-racial-bias-test-108404. Megvii is one of the companies sanctioned in the US for supplying authorities in Xinjiang province with face recognition cameras used to monitor Uighur citizens.

243 段蓓玲 [D. Beiling], ‘视频侦查主动预警系统应用研究’ [‘Applied Research on Active Early Warning System for Video Investigations’], 《法制博览》 [Legality Vision], no. 16, 2019, p. 65.

244 For Baidu’s API, see: ‘人脸检测’ [‘Face Detection’], Baidu official company website.
256 王鹏、马红平 [W. Peng and M. Hongping], ‘公共场所视频监控预警系统的应用’ [‘Research on Application of Video Monitoring and Warning System in Public Places’], 《广西警察学院学报》 [Journal of Guangxi Police College], vol. 31, no. 2, 2018, pp. 42–45.

4. China’s Legal Framework and Human Rights

257 For more context on this, see: UN Human Rights Council, Report of the Working Group on the Universal Periodic Review: China, A/HRC/11/25, 5 October 2009. Also see: W. Zeldin, ‘China: Legal Scholars Call for Ratification of ICCPR’, Global Legal Monitor, 2 February 2008, https://www.loc.gov/law/foreign-news/article/china-legal-scholars-call-for-ratification-of-iccpr/; V. Yu, ‘Petition Urges NPC to Ratify Human Rights Treaty in China’, South China Morning Post, 28 February 2013, https://www.scmp.com/news/china/article/1160622/petition-urges-npc-ratify-human-rights-treaty-china; Rights Defender, ‘Nearly a Hundred Shanghai Residents Called on the National People’s Congress to Ratify the International Covenant on Civil and Political Rights (Photo)’, 23 July 2013, http://wqw2010.blogspot.com/2013/07/blog-post_8410.html

258 Information Office of the State Council, The People’s Republic of China, National Human Rights Action Plan for China, 2016–2020, 1st ed., August 2016, http://www.chinahumanrights.org/html/2016/POLITICS_0929/5844.html

259 For tracing how this intention has spanned multiple reports, see the 2012–2015 plan: Permanent Mission of the People’s Republic of China to the United Nations Office at Geneva and Other International Organizations in Switzerland, National Human Rights Action Plan for China (2012–2015), 11 June 2012, http://www.china-un.ch/eng/rqrd/jblc/t953936.htm#:~:text=The%20period%202012%2D2015%20is,it%20is%20also%20an%20important

260 Article 40 of the Chinese Constitution: ‘The freedom and privacy of correspondence of citizens of the People’s Republic of China are protected by law. No organisation or individual may, on any ground, infringe upon the freedom and privacy of citizens’ correspondence – except in cases where, to meet the needs of state security or of investigation into criminal offences, public security or procuratorial organs are permitted to censor correspondence in accordance with procedures prescribed by law.’

261 Q. Zhang, ‘A Constitution Without Constitutionalism? The Paths of Constitutional Development in China’, International Journal of Constitutional Law, vol. 8, no. 4, October 2010, pp. 950–976, https://doi.org/10.1093/icon/mor003, https://academic.oup.com/icon/article/8/4/950/667092

262 J. Ding, ‘ChinAI #77: A Strong Argument Against Facial Recognition in the Beijing Subway’, ChinAI, 10 December 2019, https://chinai.substack.com/p/chinai-77-a-strong-argument-against

263 E. Pernot-Leplay, ‘China’s Approach on Data Privacy Law: A Third Way Between the US and EU?’, Penn State Journal of Law and International Affairs, vol. 8, no. 1, May 2020, https://elibrary.law.psu.edu/cgi/viewcontent.cgi?article=1244&context=jlia. Also see: Y. Wu et al., ‘A Comparative Study of Online Privacy Regulations in the U.S. and China’, Telecommunications Policy, no. 35, 2011, pp. 603, 613.

264 P. Triolo, S. Sacks, G. Webster, and R. Creemers, ‘China’s Cybersecurity Law One Year On’, DigiChina, 30 November 2017, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/chinas-cybersecurity-law-one-year/

265 R. Creemers, P. Triolo, and G. Webster, ‘Translation: Cybersecurity Law of the People’s Republic of China (Effective 1 June 2017)’, DigiChina, 29 June 2018, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-cybersecurity-law-peoples-republic-china/

266 G. Greenleaf and S. Livingston, ‘China’s New Cybersecurity Law – Also a Data Privacy Law?’, UNSW Law Research Paper, no. 17–19; 144 Privacy Laws & Business International Report, no. 1–7, 1 December 2016, https://papers.ssrn.
269 R. Creemers, M. Shi, L. Dudley, and G. Webster, ‘China’s Draft “Personal Information Protection Law” (Full Translation)’, DigiChina, 21 October 2020, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/chinas-draft-personal-information-protection-law-full-translation/; G. Webster and R. Creemers, ‘A Chinese Scholar Outlines Stakes for New “Personal Information” and “Data Security” Laws (Translation)’, DigiChina, 28 May 2020, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/chinese-scholar-outlines-stakes-new-personal-information-and-data-security-laws-translation/

270 P. Triolo, S. Sacks, G. Webster, and R. Creemers, ‘China’s Cybersecurity Law One Year On’, DigiChina, 30 November 2017, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/chinas-cybersecurity-law-one-year/

271 For an official translation of the 2020 version, see: State Administration for Market Supervision of the People’s Republic of China and Standardization Administration of the People’s Republic of China, Information Security Technology – Personal Information (PI) Security Specification: National Standard for the People’s Republic of China, GB/T 35273–2020, 6 March 2020, https://www.tc260.org.cn/front/postDetail.html?id=20200918200432

272 S. Sacks, ‘New China Data Privacy Standard Looks More Far-Reaching than GDPR’, CSIS, 29 January 2018, https://www.csis.org/analysis/new-china-data-privacy-standard-looks-more-far-reaching-gdpr

275 J. Ding, ‘ChinAI #84: Biometric Recognition White Paper 2019’, ChinAI, 2 March 2020, https://chinai.substack.com/p/chinai-84-biometric-recognition-white

276 G. Webster, R. Creemers, P. Triolo, and E. Kania, ‘Full Translation: China’s “New Generation Artificial Intelligence Development Plan” (2017)’, DigiChina, 1 August 2017, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/full-translation-chinas-new-generation-artificial-intelligence-development-plan-2017/

277 M. Shi, ‘Translation: Principles and Criteria from China’s Draft Privacy Impact Assessment Guide’, DigiChina, 13 September 2018, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-principles-and-criteria-from-chinas-draft-privacy-impact-assessment-guide/

278 G. Webster, ‘Translation: Chinese AI Alliance Drafts Self-Discipline “Joint Pledge”’, DigiChina, 17 June 2019, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-chinese-ai-alliance-drafts-self-discipline-joint-pledge/

279 L. Laskai and G. Webster, ‘Translation: Chinese Expert Group Offers “Governance Principles” for “Responsible AI”’, DigiChina, 17 June 2019, https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-chinese-expert-group-offers-governance-principles-responsible-ai/

280 BAAI, Beijing AI Principles, 28 May 2019, https://www.baai.ac.cn/news/beijing-ai-principles-en.html
ARTICLE 19
Free Word Centre
60 Farringdon Road
London EC1R 3GA
United Kingdom
article19.org