Autonomous Weapons and the Responsibility Gap in Light of the Mens Rea of the War Crime of Attacking Civilians in the ICC Statute
Abstract
Within the broader context of the problems raised by the interaction between humans
and machines in weaponry and targeting, this paper deals with the specific issue of the
mens rea required to establish responsibility for the war crime of indiscriminate
attacks, in the context of attacks performed with semi-autonomous weapons or with
the support of artificial intelligence (AI) in targeting decision-making. The author
presents the difficulties created by the interaction between humans and
machines, and highlights that an interpretation that would allow for risk-taking mental
elements such as dolus eventualis and recklessness in the framework of the war crime
of attacking civilians would be better able to capture the criminality of the conduct of
the person who knowingly accepts the risk of killing civilians as part of an AI-powered
attack where the result of hitting the civilian target is one of the possible outcomes.
However, the article indicates that this construction can be employed only in specific circumstances: in most human-machine teaming scenarios, even lowered mens rea requirements such as dolus eventualis would still be insufficient for the ascription of criminal responsibility for such indiscriminate attacks against civilians. This is because of the
specific risks posed by the integration of autonomy in the targeting process and the
resulting changes to the cognitive environment in which human agents operate, which
significantly affect specific components of mens rea standards.
* Post-doctoral researcher at the Graduate Institute, Geneva (Switzerland) and at the Asser
Institute, The Hague (The Netherlands). This research is part of the Swiss National Science
Foundation (SNSF)-funded project ‘Lethal Autonomous Weapon Systems and War Crimes:
Who Is to Bear Criminal Responsibility for Commission?’ (Projet 10001C_176435) led by
Prof. Paola Gaeta. I would like to thank Paola Gaeta, Guido Acquaviva, Giulia Pinzauti,
Rogier Bartels, Abhimanyu George Jain, Taylor Woodcock and the Journal’s reviewers for their
helpful advice and comments. Usual disclaimers apply. [[email protected]]
Journal of International Criminal Justice (2021), 1 of 25 doi:10.1093/jicj/mqab005
© The Author(s) (2021). Published by Oxford University Press. All rights reserved.
1. Introduction
Artificial intelligence (AI) in the weaponry industry is leading to the development of so-called fully autonomous lethal weapons, namely weapons that, once activated, select and strike targets whose engagement is not predetermined by a human (the so-called 'man out of the loop' scenario).1 This development has sparked vivid scholarly debates on the so-called 'responsibility gap', which refers, among other things, to the problems of attributing criminal responsibility for war crimes to individuals when they have no or reduced control over weapon systems that attain autonomous decision-making capacity in the targeting process.2
However, in addition to asking whether a responsibility gap arises if and
when fully autonomous weapons are developed and deployed,3 it is also press-
ing to analyse the challenges in the determination of criminal responsibility
posed by the current integration of autonomy in targeting and in weapon
systems leading to so-called human-machine teaming. It is a fact that the
growing development of autonomy in target recognition and in the analysis
of intelligence increasingly requires humans to take decisions in tandem with
machines. More and more human operators are required to process targeting
information provided by autonomous systems in order to take engagement
decisions. In semi-autonomous systems, autonomy is implemented in one or
more targeting functions (i.e. the search, identification, tracking or prioritiza-
tion of targets) and the human operator retains either the decision to engage specific targets (the final engagement step) or the decision of whether engagement
1 Much of the scholarly and political debate has long centred on the lack of and potential need
for an agreed-upon definition of autonomous weapons and it is only more recently that the
debate has moved towards discussing degrees of autonomy in weapons without focusing on a
definition. By way of simplification and to have a clear starting point for this discussion, this
article adopts the definition of autonomous weapons proposed in P. Scharre and M.C. Horowitz,
An Introduction to Autonomy in Weapon Systems (Center for a New American Security, 2015), at
16.
2 A. Matthias, ‘The Responsibility Gap: Ascribing Responsibility for the Actions of Learning
Automata’, 6 Ethics and Information Technology (2004) 175; R. Sparrow, ‘Killer Robots’, 24
Journal of Applied Philosophy (2007) 62; M. Schulzke, ‘Autonomous Weapons and Distributed
Responsibility’, 26 Philosophy and Technology (2013) 203, at 205 ff.; M. Champagne and R.
Tonkens, ‘Bridging the Responsibility Gap in Automated Warfare’, 28 Philosophy and Technology
(2015) 125; Human Rights Watch (HRW) and International Human Rights Clinic at Harvard
Law School, Mind the Gap: the Lack of Accountability for Killer Robots, 9 April 2015, available
online at www.hrw.org/report/2015/04/09/mind-gap/lack-accountability-killer-robots (visited
13 October 2020); T. Chengeta, ‘Accountability Gap: Autonomous Weapon Systems and
Modes of Responsibility in International Law’, 45 Denver Journal of International Law and
Policy (2016–2017) 1; T. Swoboda, ‘Autonomous Weapon Systems – An Alleged
Responsibility Gap’, in V.C. Müller (ed.), Philosophy and Theory of Artificial Intelligence 2017
(Springer, 2018) 302.
3 Definitional ambiguity translates into uncertainty about the current and future development of
these weapons. Many argue that autonomous weapons are yet to exist but that we are inevitably
going towards full autonomy in weapons, see in relation to drone technology W.C. Marra and
S.K. McNeil, ‘Understanding ‘‘The Loop’’: Regulating the Next Generation of War Machines’, 36
Harvard Journal of Law and Public Policy (2013) 1139, at 1160–1177. Others maintain that
autonomous weapons are already here, see for example R. Crootof, ‘The Killer Robots Are
Here: Legal and Policy Implications’, 36 Cardozo Law Review (2015) 1837, at 1868–1872.
is necessary.4 Supervised autonomous weapons, employed by at least thirty states,
present similar characteristics to autonomous weapons in that the whole engage-
ment cycle is automated, but humans supervise their operations and can
intervene.5
Human-machine teaming is, and is likely to continue being, the key para-
digm of warfare.6 Arguably, incorporating autonomy or automation7 in weap-
ons and targeting decision-making processes is ideal on the battlefield, since it
makes it possible to ‘leverage the precision and reliability of automation with-
out sacrificing the robustness and flexibility of human intelligence’.8 However,
when humans operate semi-autonomous or supervised weapons, and, more
generally, when humans team with autonomous machines in the targeting
process, the attribution of criminal responsibility for targeting-related war
crimes raises specific legal challenges.9
Among these is the question of responsibility for attacks against civilians launched by human agents using, cooperating with and interacting with autonomous processes, whenever the mental element which must be established is intent (dolus) in a sense that excludes risk-taking forms of criminal behav-
iour. For instance, this is the case for the war crime of attacking civilians
under the Statute of the International Criminal Court (ICC). The requirement
of ‘intentionality’ would exclude prima facie the responsibility of a human agent
who is simply aware of the possibility that civilians could be hit. In such cases,
4 P. Scharre, ‘Centaur Warfighting: The False Choice of Humans vs. Automation’, 30 Temple
International and Comparative Law Journal (2016) 151, at 154.
5 P. Scharre, Army of None: Autonomous Weapons and the Future of War (W.W. Norton and
Company, 2018), at 45.
6 See Scharre, supra note 4, at 153; with reference to the US ‘Third Offset Strategy’, see S. De
Spiegeleire, M. Maas, T. Sweijs, The Hague Centre for Strategic Studies (HCSS), Artificial Intelligence
and the Future of Defense: Strategic Implications for Small- and Medium-Sized Force Providers (2017),
available online at https://hcss.nl/sites/default/files/files/reports/Artificial%20Intelligence%20and%
20the%20Future%20of%20Defense.pdf (visited 13 October 2020), at 85.
7 In relation to the distinction between automated and autonomous systems, see International
Committee of the Red Cross (ICRC), Autonomy, Artificial Intelligence and Robotics: Technical
Aspects of Human Control, August 2019, available online at www.icrc.org/en/document/
autonomy-artificial-intelligence-and-robotics-technical-aspects-human-control (visited 13
October 2020), at 7: ‘Autonomy . . . is the ability of the system to act without direct human
intervention, although it is a continuum with various levels and many grey areas. . . . some
autonomous systems perform prescribed actions that are fixed in advance and do not change in
response to the environment . . . These are sometimes referred to as ‘‘automatic’’. Other systems
initiate or adjust their actions or performance based on feedback from the environment (‘‘auto-
mated’’) and more sophisticated systems combine environmental feedback with the system’s
own analysis regarding its current situation (‘‘autonomous’’). Increasing autonomy is generally
equated with greater adaptation to the environment and is sometimes presented as increased
‘‘intelligence’’—or even ‘‘AI’’—for a particular task’.
8 Scharre, supra note 4, at 152.
9 In this article, I will not need to distinguish between automation, autonomy and AI since they
have comparable effects on decision-making and on the attribution of criminal responsibility in human-machine teaming, with AI and, in particular, machine learning (ML) creating additional
problems in terms of understandability and traceability.
the human counterpart involved in the targeting decision may not have intended to attack civilians, but has 'only' taken the risk of such an occurrence.
The need to address the problem of intent in attacks against civilians flowing
from human-machine teaming situations emerges from a pre-existing problem
within the ICC Statute framework for targeting-related war crimes. This con-
cerns the vacuum resulting from the lack of criminalization of attacks that are
not intentionally directed at civilians but instead fall short of distinguishing
between lawful and unlawful targets, where a risk concerning the possibility of
attacking civilians is taken. The lack of specific criminalization under the ICC Statute and the uncertain contours10 of violations of the principle of distinction that may constitute indiscriminate attacks11 force resort to the war crime of
intentionally attacking civilians. However, under which conditions and to
what extent is it possible to ‘stretch’ this provision in order to encompass
indiscriminate attacks?
This article argues that there is room to construe the mental element re-
quirement for the war crime of attacking civilians12 in the ICC Statute so as to
include risk-taking forms of criminal behaviour such as dolus eventualis or
recklessness.13 This interpretation would allow for the ascription of criminal
10 See H. van der Wilt, ‘Towards a Better Understanding of the Concept of ‘‘Indiscriminate
Attack’’ – How International Criminal Law Can Be of Assistance’, 22 Yearbook of
International Humanitarian Law (YIHL) (2019) 29–42.
11 Several authors highlight that the concept of indiscriminate attacks ‘straddles the principles of
distinction and proportionality’, S. Townley, ‘Indiscriminate Attacks and the Past, Present, and
Future of the Rules/Standards and Objective/Subjective Debates in International Humanitarian
Law’, 50 Vanderbilt Journal of Transnational Law (2017) 1223, at 1261; see in the same vein the
analysis by van der Wilt, ibid., at 38 and 39. Art. 51(5)(b) AP I treats indiscriminate attacks as
violations of the proportionality principle. However, indiscriminate attacks entail a ‘lack of
targeting’ and seem to be better qualifiable as violations of the principle of distinction. This
paper thus approaches indiscriminate attacks from the point of view of the principle of distinc-
tion endorsing the legal qualification adopted by the ICTY in several cases and by the ICC so
far, see infra Sections 2 and 3.
12 Some of the arguments developed with regard to the interpretation of the mental element of the
war crime of attacking civilians (Art. 8(2)(b)(i) ICCSt.) could prima facie apply to other
targeting-related war crimes such as the war crimes of intentional attacks against peacekeepers
(Art. 8(2)(b)(iii)) and against cultural and religious sites (Art. 8(2)(b)(ix)), and the war crime of
disproportionate attacks (Art. 8(2)(b)(iv)). However, a more detailed analysis of each crime
would be required to determine whether mens rea standards lower than dolus directus in the
first degree could apply to other targeting-related war crimes.
13 There is no uniform definition of dolus eventualis across civil law jurisdictions. Central to the
existing competing definitions is a type of intention which does not reflect the object and
purpose of the accused but rather where the perpetrator foresees the likely risk that a forbidden
consequence will occur and nevertheless proceeds with his/her actions, G.P. Fletcher,
Rethinking Criminal Law (OUP, 2000), at 445 and 446; M.E. Badar, ‘The Mental Element in
the Rome Statute of the International Criminal Court: A Commentary from a Comparative
Criminal Law Perspective’, 19 Criminal Law Forum (2008) 473, at 489–491. There has been
enduring tension over the demarcation between dolus eventualis and recklessness. In common
law systems, recklessness refers to the mental state of the perpetrator who foresees that his/her
conduct may bring about the forbidden harmful result ‘but nevertheless takes a deliberate and
unjustifiable risk of bringing it about’, Badar, ibid., at 488 referring to R. v. G. and Another
responsibility to the human agent who, in deploying autonomous systems, envisages the risk of directing attacks against
persons or objects immune from attack and decides nonetheless to proceed
with the attack. This interpretation would partially reduce the responsibility
gap resulting from a narrower interpretation of the mental element as requir-
ing dolus directus in the first degree or in the second degree.14 At the same
time, however, this article contends that, in most human-machine teaming
scenarios, even lower types of intent such as dolus eventualis would be insuf-
ficient for the ascription of criminal responsibility for indiscriminate attacks
against civilians. This is because of the specific risks posed by the integration
of autonomy in the targeting process and the resulting changes to the cogni-
tive environment in which human agents operate, which significantly impact
specific components of mens rea standards.
The article first clarifies that, due to the intrinsic features of these technologies, the integration of
autonomy in weapons and in targeting processes is likely to result in attacks
against civilians in violation of the principle of distinction, which may be charac-
terized as indiscriminate attacks (Section 2). Secondly, it examines the debates over
how Article 30 of the ICC Statute and the definition of the war crime of attacking
civilians under Article 8 incorporate the principle of culpability (Section 3). Finally,
the article explores the impact that the cognitive environment created by the
integration of autonomy in weapons and targeting has on human decision-
making and, as a consequence, on different forms of mens rea and the ascription
of criminal responsibility for indiscriminate attacks against civilians (Section 4).
[2004] 1 AC 1034 (HL). In brief, the difference boils down to punishing the objective, actual
risk of harm (recklessness) vis-à-vis punishing the perpetrator’s posture towards risks (dolus
eventualis). Despite being aware of the risk of causing the result proscribed by the criminal
offence, the ‘reckless’ perpetrator disregards it and persists in having an ‘affirmative aversion’ to
the production of the harmful consequence, Fletcher, ibid., at 446. For the purposes of the
present paper, we are concerned with a mental state referring to the knowledge of a certain
risk and the decision to continue the attack.
14 See the ICC case law discussed in the following section. The concept of intent is one of the most
contentious issues in national criminal law and borrowing concepts from national legal dis-
course risks confusion. As aptly pointed out by Ohlin: ‘[t]he word is notoriously vague and
captures situations where the defendant desires a particular outcome as well as situations
where the defendant is aware of the practical certainty of the outcome but is indifferent to
the result’, J.D. Ohlin, ‘Targeting and the Concept of Intent’, 35 Michigan Journal of International
Law (2013) 79, at 82. For the purpose of this article, I subscribe to the view that dolus directus
in the first degree and dolus indirectus/dolus directus in the second degree ‘cover the same
conceptual territory as acting with purpose and acting with knowledge’ as adopted in the
Model Penal Code (MPC) § 2.02(2)(a)–(b) (1985), Ohlin, ibid., at 82. An additional factor
that might add confusion is that in some systems dolus indirectus is a synonym of dolus
eventualis, see F. Antolisei, Manuale di Diritto Penale (Giuffrè Editore, 2003), at 353.
The integration of autonomy in weapons and in targeting processes poses specific challenges to the ascription of criminal responsibility for war crimes, in particular with respect to the principle of culpability.15 Autonomy in targeting, either in the weapon system itself or in another stage of the targeting cycle, such as the pre-approval of target lists based on AI-powered intelligence, may in some instances lead to violations of the principle of distinction in relation to which the fulfilment of the specific mental element could become problematic.
Automated Target Recognition (ATR) is a technology used for target acquisition
and classification. ATR systems are currently implemented in some semi-
autonomous weapons and will be increasingly integrated in both autonomous
and semi-autonomous weapons. Since it was invented in the 1970s, ATR tech-
nology has relied upon the principle of pattern recognition: it identifies military
objects based on target signatures predefined by humans.16 Near-term develop-
ments17 to improve ATR performance and accuracy include the use of machine
learning (ML),18 including deep learning and deep neural networks.19
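To make the pattern-recognition logic concrete, the following minimal Python sketch is offered purely for illustration: the signature library, feature values and confidence threshold are hypothetical assumptions, not drawn from any fielded ATR system. It shows how a classifier might match sensor-derived features against predefined target signatures and recommend a candidate target once a confidence threshold is cleared, which is also the point at which a misclassification, i.e. a false positive on a civilian object, can enter the targeting cycle.

```python
from dataclasses import dataclass

# Hypothetical signature library: feature vectors predefined by humans
# (e.g. normalized radar cross-section, length, speed), in the spirit of
# classic pattern-recognition ATR. All values are invented for illustration.
SIGNATURES = {
    "main_battle_tank": [0.9, 0.8, 0.3],
    "artillery_piece": [0.7, 0.6, 0.1],
}

@dataclass
class Detection:
    features: list[float]  # features extracted from sensor data

def similarity(a: list[float], b: list[float]) -> float:
    """Crude similarity score in [0, 1]: 1 minus the mean absolute difference."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def classify(detection: Detection, threshold: float = 0.85):
    """Return (label, score) for the best-matching signature above the threshold,
    or (None, score) if no signature matches confidently enough."""
    best_label, best_score = None, 0.0
    for label, signature in SIGNATURES.items():
        score = similarity(detection.features, signature)
        if score > best_score:
            best_label, best_score = label, score
    if best_score >= threshold:
        return best_label, best_score  # candidate military object
    return None, best_score           # no engagement recommendation

# A civilian vehicle whose sensor profile happens to resemble the tank
# signature still clears the threshold: the classifier has no concept of
# civilian status under IHL, only of similarity to stored patterns.
print(classify(Detection(features=[0.88, 0.79, 0.35])))
```

The design point that matters for the argument that follows is that such a system outputs a label and a score, not a legal characterization: any assessment of civilian status remains with the human operator.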
However, there are several limitations to ATR's capacity to correctly identify targets in compliance with the international humanitarian law (IHL) principle of distinction, which constrains the way in which warfare may be waged insofar as it prohibits making the civilian population the object of an attack.20
Commentators have noted that among the main issues affecting ATR’s
15 The focus of the discussions at the Convention on Conventional Weapons (CCW) Group of
Governmental Experts (GGE) is limited to fully autonomous lethal weapon systems (LAWS) and
within this framework individual criminal responsibility for war crimes is rarely discussed.
Switzerland is one of the only States to grapple with the ascription of criminal responsibility
for international crimes committed with LAWS, dealing in particular with the mens rea element
of war crimes, see A ‘Compliance-Based’ Approach to Autonomous Weapon Systems, Working
Paper Submitted by Switzerland, UN Doc. CCW/GGE.1/2017/WP.9, 10 November 2017, §§
22–24, at 5 and 6.
16 V. Boulanin and M. Verbruggen, Stockholm International Peace Research Institute (SIPRI),
Mapping the Development of Autonomy in Weapon Systems, November 2017, available online at
www.sipri.org/sites/default/files/2017-11/siprireport_mapping_the_development_of_autonomy_in_
weapon_systems_1117_1.pdf (visited 13 October 2020), at 24 and 25.
17 See for example, the Target Recognition and Adaption in Contested Environments (TRACE)
project by the US Defense Advanced Research Projects Agency (DARPA).
18 ML is one of the approaches to AI and according to Nilsson, ‘a machine learns whenever it
changes its structure, program, or data (based on its inputs or in response to external infor-
mation) in such a manner that its expected future performance improves’, N.J. Nilsson,
Introduction to Machine Learning: An Early Draft of a Proposed Textbook (Stanford University,
1998), at 1.
19 Deep learning is an approach to ML ‘whereby the system ‘‘learns’’ how to learn: the system
transforms raw data input to representations (features) that can be effectively exploited in
machine-learning tasks ... . Deep learning allows the computer to build complex concepts
from simpler concepts. A deep-learning system can, for instance, represent the concept of an
image of a person by combining simple concepts, such as corners and contours. Deep learning
. . . has made important progress in recent years, thanks to improvements in computing power
and increased data availability and techniques to train neural networks’, Boulanin and
Verbruggen, supra note 16, at 17 citing I. Goodfellow, Y. Bengio and A. Courville, Deep
Learning (MIT Press, 2016), at 8.
20 The principle of distinction is enshrined in Art. 48 AP I with accompanying rules in Arts 51
and 52 AP I.
predictability and reliability are: first, the lack or poor quality of training and test
data; second, under- and over-fitting; and, third, a discrepancy between the data used to train the algorithm and the real data.21 Some other limitations of ATR systems are linked to their inability to assess the context in which they operate and the impossibility of programming concepts of IHL into ATR algorithms. Thus, it is doubtful that ATR could distinguish between a combatant and a person hors de combat, such as a fighter intending to surrender, or correctly identify 'dual use' objects or 'civilians directly participating in hostilities'.22 This may lead, for example, to semi-autonomous weapons mistakenly engaging civilian targets and to human operators approving, or failing to stop, an attack based on an ATR system's misidentification of targets.
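As a purely illustrative sketch of the third limitation, the discrepancy between training data and real data, the following Python example uses synthetic, invented numbers and a deliberately crude nearest-centroid rule assumed only for demonstration. It shows how a classifier that performs almost perfectly on data resembling its training set degrades once the deployment environment shifts.

```python
import random

random.seed(0)

# Synthetic, invented data: one scalar "signature strength" per detected object.
# Training distribution (e.g. an uncluttered test range): classes well separated.
train = [(random.gauss(0.8, 0.05), "military") for _ in range(200)] + \
        [(random.gauss(0.3, 0.05), "civilian") for _ in range(200)]

def centroid(data, label):
    values = [x for x, y in data if y == label]
    return sum(values) / len(values)

mil_centre = centroid(train, "military")
civ_centre = centroid(train, "civilian")

def classify(x: float) -> str:
    """Nearest-centroid rule learned from the training distribution."""
    return "military" if abs(x - mil_centre) < abs(x - civ_centre) else "civilian"

def accuracy(data) -> float:
    return sum(classify(x) == y for x, y in data) / len(data)

# Deployment distribution (e.g. a cluttered urban environment): civilian objects
# now return stronger, overlapping signatures -- the train/real-data discrepancy.
deployed = [(random.gauss(0.8, 0.05), "military") for _ in range(200)] + \
           [(random.gauss(0.6, 0.15), "civilian") for _ in range(200)]

print(f"accuracy on training-like data:    {accuracy(train):.2f}")
print(f"accuracy after distribution shift: {accuracy(deployed):.2f}")
```

Running the sketch prints near-perfect accuracy on training-like data and markedly lower accuracy after the shift, the kind of silent degradation that a human operator relying on the system may have no reason to suspect.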
Another example of human-machine teaming is the increasing use of au-
tonomy in the analysis of intelligence data. The US Department of Defense (DoD)'s
Algorithmic Warfare Cross-Functional Team (AWCFT), also known as Project
Maven, is part of the DoD Third Offset military strategy focusing on AI and
automation. Project Maven entails the use of big data and ML in order to
automate the work of analysts looking at drone-collected video surveillance
footage. The AI programme extracts objects of interest from moving or still
imagery of drone video footage and then relays its discoveries to a human
analyst.23 The human operator then takes targeting decisions based on the
suggestions given by the AI programme. Possible problems could lie, however,
in the unreliability of the AI-based intelligence analysis or the lack of trans-
parency and understandability. (Over-)reliance on the analysis made by the AI
system could lead the human operator to launch an attack against civilians or
other persons and objects immune from attack.
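As a rough illustration of where human judgment sits in such a workflow, the sketch below is purely hypothetical: the functions, confidence values and review policy are illustrative assumptions, not a description of Project Maven or of any operational system. It contrasts an operator who defers to the machine whenever the reported confidence is high with one who treats the AI output as a single input to be verified against independent sources, the difference that automation bias tends to erode.

```python
from dataclasses import dataclass

@dataclass
class Suggestion:
    object_id: str
    label: str         # e.g. "military_vehicle"
    confidence: float  # score reported by the AI analysis (invented here)

def biased_review(s: Suggestion) -> bool:
    """Operator defers to the machine whenever reported confidence is high:
    no independent check, so confident false positives pass unchallenged."""
    return s.confidence >= 0.8

def deliberate_review(s: Suggestion, independently_confirmed: bool) -> bool:
    """Operator treats the AI output as one input among others and verifies
    the object against independent sources before approving engagement."""
    return s.confidence >= 0.8 and independently_confirmed

# Toy case: the AI confidently mislabels object "obj-17", in fact a civilian truck.
suggestion = Suggestion(object_id="obj-17", label="military_vehicle", confidence=0.93)

# Independent intelligence (hypothetical) does not confirm a lawful target.
confirmed_by_other_sources = False

print(biased_review(suggestion))                                  # True: approved on AI say-so
print(deliberate_review(suggestion, confirmed_by_other_sources))  # False: engagement withheld
```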
In the aforementioned scenarios, the question arises of whether and to what
extent there is a responsibility gap for war crimes related to the violation of the
IHL principle of distinction, from the viewpoint of principal criminal responsi-
bility for commission.24 In the ICC Statute, there is a host of war crimes
specifically related to the violation of the principle of distinction. Chief among
them, Article 8(2)(b)(i) establishes that, in the context of an international
21 Boulanin and Verbruggen, supra note 16, at 24–25; J. Hughes, ‘The Law of Armed Conflict
Issues Created by Programming Automatic Target Recognition Systems Using Deep Learning
Methods’, 21 YIHL (2018) 99, at 106 and 107.
22 J. van den Boogaard, ‘Proportionality and Autonomous Weapons Systems’, 6 Journal of
International Humanitarian Legal Studies (2015) 247, at 259; Hughes, ibid., at 116 and 117.
23 L. Lewis, ‘Insights for the Third Offset: Addressing Challenges of Autonomy and Artificial
Intelligence in Military Operations’, Center for Naval Analysis, September 2017, available on-
line at www.cna.org/CNA_files/PDF/DRM-2017-U-016281-Final.pdf (visited 13 October 2020),
at 17. Project Maven was established in 2017; since December 2019 Palantir — a big US-based
data analytics company — has taken over the role of Google and started collaborating with the
DoD on this project, M. Kress, ‘Palantir Takes Over Project Maven’, 8 January 2020, available
online at http://artificialintelligencemania.com/2020/01/08/palantir-takes-over-project-maven
(visited 13 October 2020).
24 Unlike in scenarios in which fully autonomous systems are employed, where it has been argued
that a commander could be held responsible for the actions of his/her ‘subordinate’ autono-
mous weapon system, the direct application of forms of criminal participation or accessory
liability such as command responsibility to human-machine teaming scenarios is more difficult.
armed conflict (IAC), intentionally directing ‘attacks against the civilian popu-
lation as such or against individual civilians not taking direct part in hostilities’
is a war crime under the jurisdiction of the Court. The same war crime is listed under Article 8(2)(e)(i) in relation to non-international armed conflicts (NIACs) as defined in Article 8(2)(f).
Situations where, in an IAC or in a NIAC, a human operator deliberately and purposely attacks civilians fall squarely within the purview of these criminal provisions. However, when weapons and targeting rely on autonomy, this is
an unlikely scenario. The development and use of autonomy on the battlefield expressly aims to 'pierce the fog of war' and to correctly identify the military target. Autonomy may still lead to violations of the principle of distinction,
creating a context where civilians and other unlawful targets are mistakenly
attacked, unintentionally attacked, or at risk of being attacked. As mentioned
above, ATR in semi-autonomous weapons may not be fully reliable and may
not allow for correct identification of targets. AI could produce an incorrect
analysis of targeting data. Automation bias, which refers to a documented
human response triggered by reliance on automated or autonomous systems,
may affect the human capacity to call into question AI-generated targeting information25 and provoke over-reliance on such systems. Civilians may thus be attacked
as a result of mistakes or false positives26 of ATR systems or incorrect AI-
powered analysis of targeting data, on which the human operators relied.
When autonomous processes used in weapons and targeting generate targeting information relied upon by human operators, and this results, also because of the interplay of automation bias, in civilians (or other non-military targets) being indiscriminately attacked, the IHL principle of distinction27 is violated and the ensuing
attacks are to be qualified as indiscriminate attacks. In other words, autonomy
in warfare, which allegedly should lead to greater precision on the battlefield,
could lead to unreliable and wrong target recognition and engagement and to
a lack of capability to distinguish between military and non-military targets.28
In such circumstances, the prohibition of indiscriminate attacks is at risk of violation.
These attacks refer to (or entail) a lack of distinction between lawful and
unlawful targets and/or to the employment of weapons or other means that
are indiscriminate in nature. Indiscriminate attacks are, for example: attacks
that are not directed against a specific military target; attacks employing in-
discriminate weapons (i.e. weapons incapable of distinguishing between civilian and military targets); and attacks carried out without taking the necessary
precautions to spare civilians, especially failing to seek precise information on
the targets to be attacked.29
The ICC Statute is silent on the concept of indiscriminate attacks. However,
when civilians have been hit, the war crime of intentionally attacking civilians
has been interpreted to encompass this type of attack. In Katanga, where
attacks against co-located military objectives and civilians were conducted in
a manner that indicated that civilians were the primary object of the attacks,
indiscriminate attacks have been subsumed under the crime of intentional
attacks against the civilian population or individual civilians.30 In Ntaganda,
the Trial Chamber held that ‘the crime under Article 8(2)(e)(i) of the Statute
may encompass attacks that are carried out in an indiscriminate manner, that
is by targeting an area, as opposed to specific objects, or not targeting specific
military objects or persons taking a direct part in hostilities, so long as the
perpetrator was aware of the presence of civilians in the relevant area’.31 In
Mbarushimana, it was held that the war crime under Article 8(2)(e)(i) may also
be committed when the attack is launched against a military objective and
simultaneously against the civilian population or individual civilians not taking
a direct part in the hostilities.32 In Ntaganda and Mbarushimana, the Chambers
do not seem to require that the primary target was the civilian population.
This case law is in line with that of the International Criminal Tribunal for the former Yugoslavia (ICTY), which also considered that indiscriminate attacks may constitute direct attacks against civilians.33 In sum, if indiscriminate attacks
were criminalized explicitly under the ICC Statute this would be the most
appropriate path to prosecuting attacks resulting from an impaired capacity to
distinguish between civilian and military targets due to the integration of au-
tonomy in targeting. However, since indiscriminate attacks are not explicitly
criminalized as an independent war crime, the war crime of intentionally
attacking civilians must be relied upon to avoid the risk of consolidating a
gap in the framework of protection against these types of attacks. Once it is
accepted that this sort of attack can be prosecuted under the war crime of
attacking civilians (for lack of a better option), then we must consider the
challenges to attributing individual criminal responsibility (on the basis of
the elements of this crime in the context of the ICC Statute).
UPC/FPLC made no difference between the two, because in addition to attacking the opposing
forces, it in fact also wished to target the Lendu civilians’. Decision on the Confirmation of
Charges, Mbarushimana (ICC-01/04-01/10), Pre-Trial Chamber I, 16 December 2011 (hereafter
‘Mbarushimana Confirmation of Charges’), §§ 142, 151, 191, 203, 218. At § 144 specific
reference is made to the witness testimony that ‘orders to kill were generalised and directed at
the ‘‘enemy’’ without any distinction being made between combatants and civilians’.
33 The ICTY also considered that attacks employing means of combat that cannot discriminate
between civilians and civilian objects and military objectives are tantamount to directing
attacks against civilians: Judgment, Blaškić (IT-95-14-T), Trial Chamber, 3 March 2000, §§ 180, 501 and 512; Judgment, Galić (IT-98-29-T), Trial Chamber I, 5 December 2003, § 57 (hereafter 'Galić Trial Judgment'); Judgment, Milošević (IT-98-29/1-T), Trial Chamber III, 12 December 2007 (hereafter 'Milošević Trial Judgment'), § 948; Judgment, Perišić (IT-04-81-T),
Trial Chamber I, 6 September 2011, § 97; Judgment, Galić (IT-98-29-A), Appeals Chamber, 30
November 2006 (hereafter ‘Galić Appeal Judgment’), § 132.
34 There is also the question of whether the causation requirement is fulfilled, but this question
will be analysed in future research.
It is therefore crucial to establish whether and to what extent the prima facie
exclusion of risk-taking forms of criminal responsibility in these war crimes
provisions is warranted.
Secondly, and more generally, it cannot be taken for granted that the ‘inten-
tionality’ requirement of Article 8 coincides with the ‘intent’ requirement in
Article 30.
Let us examine these two issues in turn.
1. ‘Intent’ under Article 30 of the ICC Statute and Whether It Includes dolus
eventualis
(a) ‘Intent’ and ‘knowledge’ under Article 30(1) of the ICC Statute express the
volitional and cognitive components of dolus
Article 30(1) establishes that the material elements of a crime must be fulfilled,
‘unless otherwise provided’, with ‘intent’ and ‘knowledge’. Article 30(1)
expresses the volitional and cognitive components of mens rea that must cu-
mulatively exist, each to different degrees (depending on the type of dolus), for
criminal responsibility to arise.41
While recklessness is not included in Article 30 of the ICC Statute, there is some room to contend that dolus eventualis is. In order to determine this,
we must turn to the analysis of Article 30(2)(b).
(b) Article 30(2)(b) of the ICC Statute requires awareness that a consequence ‘will
occur in the ordinary course of events’
Article 30(2)(a) defines intent in relation to conduct, and requires that the
perpetrator ‘means to engage in the conduct’. This paragraph does no more
than refer to the so-called ‘voluntary act’ requirement, i.e. that conduct must
be the result of a voluntary act, thus excluding that automatism or reflex could
give rise to criminal responsibility.42
Thus, the crucial provision for determining standards of mens rea under the
ICC Statute (and whether dolus eventualis could be a sufficient mental element)
is Article 30(2)(b). Pursuant to Article 30(2)(b) a person has intent in relation
to consequences if he ‘means to cause that consequence’ (first alternative) or if
he is at least ‘aware that [the consequence] will occur in the ordinary course
of events’ (second alternative). According to some commentators, this provi-
sion refers primarily to crimes of result, i.e. those requiring that the proscribed
conduct causes a certain harmful result.43 However, as suggested by others,
41 G. Werle and F. Jeßberger, Principles of International Criminal Law (4th edn., OUP, 2020), para.
565; A. Eser, ‘Mental Elements – Mistake of Fact and Mistake of Law’, in A. Cassese, P. Gaeta
and J.R.W.D. Jones (eds), The Rome Statute of the International Criminal Court: A Commentary
(OUP, 2002) 889, at 905.
42 Ex plurimis Werle and Jeßberger, ibid., para. 571; Eser, ibid., at 913; D.K. Piragoff and D.
Robinson, in O. Triffterer and K. Ambos (eds), The Rome Statute of the International Criminal
Court: A Commentary (3rd edn., C.H. Beck/Hart/Nomos, 2016), para. 19 (Article 30).
43 Werle and Jeßberger, supra note 41, para. 572.
the provision applies to both crimes of result and crimes of conduct (the latter referring to
those fulfilled by the commission of the proscribed act).44 Indeed, ‘consequen-
ces’ not only refers to the specific consequence of the perpetrator’s conduct
required by the definition of the crime (such as the death of the victim), but
also embraces all ‘physical effects and any other detrimental results’ of the
conduct.45
Unlike the corresponding grave breach in Additional Protocol (AP) I, which
requires that an attack against civilians causes ‘death or serious injury to body
or health’, in the ICC Statute the war crime of attacking civilians is a crime of
conduct.46 The latter interpretation, propounding the applicability of Article
30(2)(b) to both crimes of result and crimes of conduct, is therefore important
with respect to the war crime of attacking civilians (as well as for other similar war crimes of conduct in the ICC Statute). On the condition that the require-
ment of ‘intentionality’ under Article 8(2)(b)(i) coincides with ‘intent’ under
Article 30, the relevant definition would be the one enshrined in subsection b)
and not the one enshrined in subsection a), as the Trial Chamber in Katanga
instead suggests. This is a relevant distinction, since there is room to contend
that ‘intent’ under Article 30(2)(b) also includes dolus eventualis. This in turn
would mean that prohibited risk-taking behaviour could also satisfy the general mental element under Article 30 of the ICC Statute, subject to the 'unless otherwise provided' clause.
The relevant part of Article 30(2)(b) for dolus eventualis is the second alter-
native. The first alternative, i.e. the perpetrator ‘means to cause’ the conse-
quence, obviously refers to dolus directus in the first degree. This interpretation
is widely endorsed in scholarship47 and in the case law of the ICC.48 As for
44 Eser, supra note 41, at 914: ‘Although this provision is primarily designed for ‘‘result crimes’’ in
the aforementioned sense, and thus, can be treated as referring to the prohibited result (such as
the death or the pain of the victim in cases of homicide or torture and their preceding causal
chains), this provision can also concern ‘‘conduct crimes’’ at least insofar as the perpetrator
must intend to procure the prohibited effect'.
45 Eser, supra note 41, at 911.
46 Under Art. 85(3) AP I, the actus reus of the war crime of attacking civilians (as well as the
other grave breaches enshrined in this article) contains the requirement that an attack against
civilians cause ‘death or serious injury to body or health’. It therefore amounts to a result
crime, as the offence is consummated only if certain proscribed consequences (i.e. deaths or
injury) take place. See P. Gaeta, ‘Serious Violations of the Law on the Conduct of Hostilities: A
Neglected Class of War Crimes?’ in F. Pocar, M. Pedrazzi and M. Frulli (eds), War Crimes and the
Conduct of Hostilities: Challenges to Adjudication and Investigation (Edward Elgar Publishing, 2013)
20, at 32 and 33.
47 Eser, supra note 41, at 914; Piragoff and Robinson, supra note 42, para. 21; S. Finnin, 'Mental
Elements under Article 30 of the Rome Statute of the International Criminal Court: A
Comparative Analysis’, 61 International and Comparative Law Quarterly (2012) 325, at 341.
48 See Decision on the Confirmation of Charges, Lubanga (ICC-01/04-01/06), Pre-Trial Chamber I,
29 January 2007 (hereafter ‘Lubanga Confirmation of Charges’), § 351; Katanga Trial Judgment,
supra note 30, § 774; Decision Pursuant to Article 61(7)(a) and (b) of the Rome Statute on the
Charges of the Prosecutor against Jean-Pierre Bemba Gombo, Bemba (ICC-01/05-01/08), Pre-
Trial Chamber II, 15 June 2009 (hereafter ‘Bemba Confirmation of Charges’), § 358.
Article 30(2)(b)’s second alternative, i.e. the perpetrator’s awareness that the
consequence will occur in the ordinary course of events, the language is more
ambiguous.49 In particular, it does not make clear whether the required level
of awareness is that a ‘consequence’ will certainly occur or that the conse-
quence will probably or possibly occur in the ordinary course of events. The first
option would lead to the conclusion that Article 30(2)(b) encompasses dolus
directus in the second degree.50 This mens rea refers to the mental state of the
perpetrator whose purpose, unlike in dolus directus in the first degree, is not to cause the forbidden result or any other detrimental effects of the prohibited conduct, but who foresees its occurrence as a necessary, certain or highly probable consequence of the achievement of his main purpose and nevertheless engages in the conduct.51 By contrast, the second option would also include dolus
eventualis.
Based on the choice of the terms ‘will occur’ instead of ‘may occur’ during
the negotiations, some commentators appear to favour a line of reasoning
excluding dolus eventualis as a sufficient mens rea under Article 30 of the ICC
Statute.52 The fact that all draft provisions on dolus eventualis and recklessness
were dropped at the Rome Conference strengthens this exclusion.53 Others
instead interpret ‘awareness that a consequence will occur in the ordinary
course of events’ as covering dolus eventualis.54 Werle and Jeßberger, on the
other hand, argue that, while Article 30 leaves no room for dolus eventualis,55
a departure from customary law and general principles of law in the ICC
Statute is normatively unwarranted.56 They suggest that the preparatory works do not militate against the inclusion of dolus eventualis and recklessness: in their view, the preparatory works mainly indicate that, in light of their differing legal traditions, the drafters were unable to 'agree on a consistent common language', not that they 'intended to require a higher standard
for the mental element, even though international customary law, as it stands,
already finds dolus eventualis to be sufficient’.57 In the latter respect, they note
49 For debates concerning the interpretation of the wording ‘will occur’, see Finnin, supra note 47,
at 346–349.
50 See Eser, supra note 41, 914 and 915; Finnin, supra note 47, at 343 and 344.
51 For ICC case law defining dolus in the second degree as enshrined in Art. 30(2)(b) ICCSt. see
Bemba Confirmation of Charges, supra note 48, § 359; Lubanga Confirmation of Charges, supra
note 48, § 352 (let. i); Katanga Confirmation of Charges, supra note 37, § 530.
52 See K. Ambos, ‘General Principles of Criminal Law in the Rome Statute’, 10 Criminal Law
Forum (1999) 1, at 21 and 22; Finnin, supra note 47, at 349; Eser, supra note 41, at 915
and 916.
53 For a discussion of the drafting history, see Finnin, supra note 47, at 344 and 345.
54 H. Jescheck, ‘The General Principles of International Criminal Law Set Out in Nuremberg, as
Mirrored in the ICC Statute’, 2 JICJ (2004) 38, at 45; O. Triffterer, ‘The New International
Criminal Law — Its General Principles Establishing International Criminal Responsibility’, in K.
Koufa (ed.), The New International Criminal Law (Sakkoulas Publications, 2003), at 706.
55 Werle and Jeßberger, supra note 41, para. 604.
56 Ibid., paras 576, 586 and 587. They argue that dolus eventualis and recklessness could come
into play via the ‘unless otherwise provided clause’, but see in relation to Art. 8(2)(b)(i) ICCSt.
below note 63.
57 Ibid., para. 576.
that dolus eventualis is a sufficient mens rea in national legal systems and in the
case law of other international tribunals and must be considered customary
international law as well as a general principle of law.58 Piragoff and Robinson
argue that ‘it might be open to the Court to interpret [Article 30] to include
some of the . . . aspects’ of dolus eventualis and recklessness and that it remains
to be determined ‘which probabilities between virtual certainty and mere pos-
sibility will suffice’.59
So far, however, the Appeals Chamber of the ICC has ruled out dolus
eventualis from the scope of Article 30(2)(b) of the ICC Statute. In
Lubanga, the Appeals Chamber overturned the approach adopted by the Pre-Trial Chamber in the same case and in other cases as well.60 It construed the expression 'a consequence will occur' as referring 'to future events in respect of which there is virtual certainty that they will occur'.61 The doctrine of stare decisis does not apply before the ICC:62 a revirement de jurisprudence bringing dolus eventualis within Article 30 therefore remains possible.
In addition, the ‘unless otherwise provided’ clause in Article 30(1) specific-
ally envisages that crimes under the ICC Statute may require different mens rea
standards. This raises the question of whether ‘intent’ under Article 30 neces-
sarily overlaps with ‘intentionality’ in the relevant war crimes definitions in
Article 8.
58 Ibid.
59 Piragoff and Robinson, supra note 42, para. 22 and text accompanying note 77.
60 In Lubanga, the Pre-Trial Chamber held that Art. 30 ICCSt. also covers ‘situations in which the
suspect (a) is aware of the risk that the objective elements of the crime may result from his or
her actions or omissions, and (b) accepts such an outcome by reconciling himself or herself
with it or consenting to it (also known as dolus eventualis)’, Lubanga Confirmation of Charges,
supra note 48, § 352 (let. ii). See in the same vein Katanga Confirmation of Charges, supra note
37, note 329.
61 Judgment on the appeal of Mr Thomas Lubanga Dyilo against his conviction, Lubanga (ICC-01/
04-01/06), Appeals Chamber, 1 December 2014, § 6; the emphasis is added. See also, in the
same vein, the Katanga Trial Judgment, supra note 30, §§ 775–777; and Bemba Confirmation of
Charges, supra note 48, §§ 360–362.
62 Art. 21(2) ICCSt.
the purposes of this Article’. The latter clause and Article 30’s ‘unless other-
wise provided’ allows for definitions or meanings of dolus/intent different from
Article 30 to be provided in respect to specific crimes.64
Irrespective of interpretations of Article 30 as including/excluding dolus
eventualis, one must therefore turn to the ‘intentionality’ requirement of
Article 8(2)(b)(i) for the war crime of attacking civilians and inquire into
whether it also encompasses dolus eventualis.
63 Werle and Jeßberger, supra note 41, para. 580 and quote in note 120. However, relying on a
literal interpretation of the crime definition and on the interplay with the crime of Article
8(2)(b)(iv) ICCSt., they reach the conclusion that ‘intentionality’ under Art. 8(2)(b)(i) equates
to the ‘purpose of attacking civilians’, a higher mental element than Art. 30, ibid., para. 1416.
64 Piragoff and Robinson, supra note 42, para. 16.
65 Dörmann, supra note 29, at 131.
66 As opposed to other targeting-related war crimes, see for example Art. 8(2)(b)(iii), Element 5
requiring awareness ‘of the factual circumstances that established that protection’.
67 See Katanga Trial Judgment, supra note 30, § 808; the emphasis is added. See also Ntaganda
where the Trial Chamber required that ‘the perpetrator was aware of the presence of civilians
in the relevant area’, Ntaganda Trial Judgment, supra note 29, § 921.
that some of the targets are civilians is sufficient. A level of knowledge near to
certainty or high probability would be in line with the Trial Chamber ruling
that dolus directus in the first degree is ‘first and foremost’68 the mens rea of the
war crime of attacking civilians.
However, it seems difficult to reconcile these knowledge requirements with
the intrinsic features of the war crime of indiscriminate attacks. Indiscriminate
attacks (understood as violations of the principle of distinction) are characterized
by the lack of certain knowledge of the civilian status of the targets69 and by
the awareness of the possibility that some of the targeted persons might have
civilian status70 or by the perpetrator’s ‘awareness of his lack of awareness’ as
to whether some civilians might be hit.
The irreconcilable tension between ‘intentionality’, strictly construed as dolus
directus in the first degree (which must be accompanied by knowledge of the
civilian status), and the intrinsic features of indiscriminate attacks (which entail a 'lack of targeting') may translate into gaps in the attribution of
criminal responsibility for violations of the principle of distinction, including
those stemming from human-machine teaming.
Construing the 'intentionality' requirement as a mens rea akin to dolus directus in the first degree, as the ICC has done so far, would
inevitably lead to the exclusion of a great array of indiscriminate attacks from
the scope of Article 8(2)(b)(i). In sum, one faces an either/or situation. Either
one considers that ‘intentionality’ in the war crime of targeting civilians
includes at least the most serious forms of risk-taking behaviour (dolus even-
tualis), thus allowing the punishment of most types of indiscriminate attacks.
Alternatively, one contends that the requirement of 'intentionality' means intent in the first or second degree, thus ruling out the criminality of most instances of
indiscriminate attacks under the ICC Statute. It is unlikely that this exclusion
was intended by the drafters of the ICC Statute.71 At the same time, as men-
tioned above, it does not find support in the case law of the ICC on indiscrim-
inate attacks.72
the ‘willfulness’ requirement of Article 85, that the mens rea of the war crime
of attacks against civilians is direct intent and recklessness. In Galić ‘the Trial
Chamber found that the perpetrator must undertake the attack ‘‘wilfully’’,
which it defines as wrongful intent, or recklessness, and explicitly not ‘‘mere
negligence’’. . .’.74
In some of the following cases, however, the ICTY moved from recklessness to the concept of indirect intent. In Strugar, the Appeals Chamber held that
the mens rea ‘encompasses both the notion of ‘‘direct intent’’ and ‘‘indirect
intent’’’.75 Indirect intent in Strugar was defined as knowledge of a high degree
of risk and knowledge of probability of committing a crime.76 While knowledge
of possibility and dolus eventualis were excluded in Strugar, in Prlić the ICTY
endorsed the dolus eventualis standard.77
A full assessment of the ICTY case law on the mens rea of the war crime of
attacking civilians and its inconsistencies is beyond the scope of this article. It
suffices here to say that the ICTY case law goes beyond the concept of dolus
directus in the first degree and second degree to include risk-based forms of
mens rea, thus allowing the prosecution of indiscriminate attacks.
Moreover, ICTY cases posit the requirement of knowledge as to the civilian
character of the victims in terms of actual knowledge (‘the perpetrator was
aware’)78 or constructive knowledge (the perpetrator . . . should have been
aware of the civilian status of the persons attacked’79 or ‘it was impossible
not to know, that civilians or civilian property were being targeted’).80 In
Galić, in defining the degree of knowledge with regard to the civilian status,
the Trial Chamber held that ‘full awareness of the high risk that the target was
a civilian’ was sufficient.81
74 Galić Appeal Judgment, supra note 33, § 140 citing Galić Trial Judgment, supra note 33, § 54.
See also Milošević Trial Judgment, supra note 33, § 951: 'The Prosecution must establish that
the Accused wilfully made the civilian population or individual civilians the object of acts of
violence . . . the notion of ‘‘wilfully’’ incorporates the concept of recklessness, whilst mere
negligence is excluded’.
75 Judgment, Strugar (IT-01-42-A), Appeals Chamber, 17 July 2008, § 270; Judgment, Martić (IT-
95-11-T), Trial Chamber, 12 June 2007, § 72; however, the accompanying footnote cites the ICRC's commentary to Art. 85(3)(a) AP I, which interprets 'wilfully' as encompassing wrongful
intent and recklessness.
76 Judgment, Strugar (IT-01-42-T), Trial Chamber, 31 January 2005, §§ 235 and 236 (hereafter
‘Strugar Trial Judgment’).
77 Judgment, Prlić et al. (IT-04-74-T), Trial Chamber III, 29 May 2013, § 192 endorsing the
rulings in Galić: ‘Regarding the mental element required for the crime of attacks on the civilian
population, the Tribunal's case-law has settled that the perpetrator of the crime is required to
have acted with intent, which encompasses dolus eventualis whilst excluding negligence. . . .
Thus for there to be intent, the perpetrator has to have acted knowingly and wilfully, that is to
say, perceiving his acts and their consequences and purposing that they should come to pass.
Dolus eventualis occurs when the perpetrator, without being certain that the result will take
place, accepts it in the event it does come to pass. Conduct is negligent when the perpetrator
acts without having his mind on the act or its consequences’.
78 Galić Trial Judgment, supra note 33, § 55.
79 Ibid.
80 Strugar Trial Judgment, supra note 76, § 280.
81 Galić Trial Judgment, supra note 33, § 433.
While the case law of international criminal tribunals is not part of the
applicable law of the ICC, it has often been used as an interpretative aid for
the relevant provisions of the ICC Statute and has authoritative weight. This is
particularly evident if one considers that the Introduction to the Elements of
Crimes explicitly stipulates that ‘[t]he elements for war crimes under Article 8,
paragraph 2, of the Statute shall be interpreted within the established framework
of the international law of armed conflict’. Therefore, the grave breach of attack-
ing civilians under AP I, and the interpretation of the ‘willfulness’ requirement
given by the ICTY, may offer valuable interpretative guidance in relation to the
mens rea of Article 8(2)(b)(i) of the ICC Statute.82
Furthermore, the war crime of attacking civilians as adjudicated by the ICTY
is directed at the same objective as the corresponding war crime provision
under the ICC Statute: the criminalization of attacks against civilians and ci-
vilian objects. The ICC Statute war crime is even more protective. The drafting
of the war crime provisions under Article 8(2)(b)(i) and 8(2)(e)(i) as crimes of
conduct indicates that these norms intend to protect civilians against the risk of
being attacked, more precisely against the risk of being injured or murdered.
These norms aim at criminalizing the targeting of civilians regardless of
whether the attack resulted in death or injury. Endorsing risk-taking mental elements, such as dolus eventualis (or recklessness), helps the war crime
of attacking civilians under the ICC Statute to achieve its underlying objective
and fills the gap left by the lack of specific criminalization of indiscriminate
attacks in the ICC Statute.
One reaches similar conclusions by reverting to the additional source
indicated in Article 21(1)(c) of the ICC Statute, namely ‘general principles
of law derived by the Court from national laws of legal systems of the
world’. General principles are applicable ‘failing’ the sources listed in the previous paragraphs (a) and (b), and if ‘not inconsistent with th[e] Statute and
with international law and internationally recognized norms and stand-
ards’.83 As discussed above, dolus eventualis and similar standards such as
recklessness can be considered general principles of law.84 The inclusion of dolus eventualis is not contrary to the ICC Statute in light of the ‘unless otherwise provided’ clause of Article 30, which allows standards lower than dolus in the first and second degree.
82 See K. Dörmann, ‘War Crimes under the Rome Statute of the International Criminal Court,
with a Special Focus on the Negotiations on the Elements of Crimes’, 7 Max Planck Yearbook of
United Nations Law (2003) 341, at 353. Dörmann argues that more in-depth inquiry from the
ICC is necessary with respect to the relationship between Article 30 ICCSt. and the ‘willfulness’
requirement as interpreted in the case law of the ICTY. The same can be argued with respect to
the relationship between the ‘intentionality’ requirement in the ICC war crime provisions and
‘willfulness’ as interpreted by the ICTY.
83 Art. 21(1)(c) ICCSt.
84 See Werle and Jeßberger, supra note 41, para. 576. With specific reference to national prosecutions of war crimes, see Haque, referring to the fact that under the Uniform Code of Military Justice (UCMJ) US soldiers were found guilty of recklessly killing civilians: A.A.
Haque, ‘What the Kunduz Report Gets Right (and Wrong)’, Just Security, 10 May 2016,
available online at www.justsecurity.org/30986/kunduz-report-and-wrong (visited 13 October
2020). But see some domestic practice excluding recklessness with respect to war crimes (whether violating the principle of distinction or that of proportionality): ‘U.S. Department of Defense Releases
Report of Investigation Finding that October 2015 Air Strike on Doctors Without Borders
Hospital in Kunduz, Afghanistan, Was Not a War Crime’, 110 American Journal of International Law (2016) 579, at 585–586; W. Heintschel von Heinegg and P. Dreist, ‘The 2009 Kunduz Air Attack: The Decision of the Federal Prosecutor-General on the Dismissal of Criminal Proceedings Against Members of the German Armed Forces’, 53 German Yearbook of International Law (2010) 833, at 844.
85 M.L. Cummings, ‘Automation Bias in Intelligent Time Critical Decision Support Systems’, 2
Collection of Technical Papers AIAA 1st Intelligent Systems Technical Conference (2004) 557, at
558 and 559, available online at https://arc.aiaa.org/doi/abs/10.2514/6.2004-6313 (visited
13 October 2020).
86 K.L. Mosier and L.J. Skitka, ‘Human Decision Makers and Automated Decision Aids: Made for
Each Other?’ in R. Parasuraman and M. Mouloua (eds), Automation and Human Performance:
Theory and Applications (CRC Press, 1996) 201, at 205.
ing the lawfulness of the target or some relevant features of the target or
consequences of the attack.
Automation bias might be due to psychological processes by which human
operators are inclined to interpret contradictory information as consistent with
information provided by automated or AI-based decision aids (assimilation or
confirmation bias).87 Alternatively, human operators may reduce their cognitive effort to seek out other information, a process described as the ‘tendency toward ‘‘satisficing’’ in judgment and decision making’.88
Decreased vigilance and situational awareness of human operators, which
are common when some targeting functions are delegated to an automated or
AI system, also affect judgment and decision-making. They may lead to the
failure to detect errors or to assess the risks related to a certain targeting
decision.89 Finally, the complexity of AI and a lack of training can prevent the human cooperating with an autonomous system from fully understanding the risks that employing AI on the battlefield can entail.
87 Ibid., at 204.
88 Ibid.
89 Ibid., at 207 and 208.
90 G.P. Fletcher, Basic Concepts of Criminal Law (OUP, 1998), at 115; italic emphasis in the
original.
the duty to take precautions and acquire the necessary knowledge about the
object of the attack, the human operator can be held responsible to the extent
that the violation of the duty to take precautions evidences the intent to hit
civilians.91
Secondly, automation bias, and in particular over-reliance on AI, has an impact on the assessment of the degree of probability of hitting civilians. Certainty or a high probability of the risk materializing could lead to a finding of responsibility for dolus directus in the second degree, whereas mere probability or possibility could lead to ascribing responsibility on the basis of dolus eventualis (or recklessness).
Given the uncertainty and diverging interpretations of the mens rea of the
war crime of attacking civilians and indiscriminate attacks (and the possible
exclusion of dolus eventualis), the degree of probability may have a conclusive
bearing on the question of whether the human operator can be considered
responsible.
Thirdly, autonomy in targeting may affect the predisposition towards risks.
The subjective posture towards the risk of crimes being committed (i.e. civilians
being hit or killed) is the factor distinguishing dolus eventualis from recklessness
(approval or indifference in the former versus aversion in the latter). Over-
reliance on autonomous processes may lead human agents to discard the risk
and be more confident that no crime will materialize. If dolus eventualis is accepted as a sufficient mens rea standard while recklessness is excluded, the predisposition towards risk becomes determinative for the attribution of criminal responsibility. If human operators are aware of the risk but, owing to over-reliance on autonomy, remain confident that no criminal consequence will come about and dismiss that risk, they will be exempted from criminal responsibility.
Fourthly, and importantly, autonomy and automation bias can preclude the
formation of the awareness of the civilian nature of the object of the attack,
which, because of its relevance, will be treated separately in the next section.
91 Dörmann, supra note 29, at 132: ‘The required mens rea may be inferred from the fact that the
necessary precautions (in the sense of Art. 57 AP I, e.g. the use of available intelligence to
identify the target) were not taken before and during an attack’.
possibility (in the war crime of indiscriminate attacks) must exist for intent
to be shaped.
The civilian or otherwise protected nature of the object of the attack is part
of the material elements of the war crime of attacking civilians and, more
generally, targeting-related war crimes. Intent must cover all the material
elements of the crime. Despite not being mentioned in the Elements of Crimes, awareness of the civilian status is the crucial cognitive component of
the intent to target civilians. A misperception of ‘the facts underlying the
material elements of the crime’92 would enable a defendant to invoke the
defense of mistake of fact, which is recognized in Article 32(1) of the ICC
Statute and across national systems.93 Mistake of fact over the existence of
the material elements of the crime, such as the civilian status of the object of
the attack, negates the perpetrator’s intent and leads to the exclusion of crim-
inal responsibility.94 The civilian status of the object of the attack has both a descriptive and a normative character. Only the descriptive character is relevant for mistake of fact: the incorrect assumption that under IHL it is permissible to target a Red Cross vehicle, for example, is a mistake of law, which does not exclude criminal liability under Article 32(2) of the ICC Statute95 (and across most national systems).
Mistake of fact also applies to scenarios in which the human operator of an autonomous system acts without knowledge or awareness of the relevant facts.96 For
example, because of automation bias or because of the complexity of AI, the
human operator may trust the autonomous system and ‘press the button’
without perceiving that the semi-autonomous weapon is launching an attack
against a civilian object. Or human operators, because of the same lack of
perception or knowledge of the civilian status of persons in the targeted
area, may fail to intervene and stop an attack. Lack of knowledge about ma-
terial elements of crimes is treated in the same way as mistake of fact and
excludes criminal responsibility because the mental element is lacking.
Thus, when AI is employed in targeting, the defense of mistake of fact can have a potentially wide impact in preventing the attribution of criminal responsibility.
5. Conclusion
Autonomy, when used in weapons and targeting and giving rise to human-machine teaming scenarios, can prevent the ascription of criminal responsibility for attacks violating the principle of distinction, in light of the gap in the
‘intentionality’ requirement for the war crime of attacking civilians.
The so-called ‘responsibility gap’ problem has been widely discussed in relation to fully autonomous weapons and the unforeseeable mistakes of AI, which cannot, in principle, be criminally attributed to human agents. Such mistakes fall outside the fault-based forms of criminal responsibility that characterize the framework applicable to conduct-of-hostilities war crimes. Unintentional attacks
against civilians stemming from the unforeseen decisions of an autonomous
process would escape responsibility under the current legal framework.
Debates on full autonomy in weapons have overshadowed the problems
raised by current applications of AI and autonomy in weapons and targeting.
A responsibility gap may also arise in relation to foreseeable and accepted risks stemming from reliance on autonomy in human-machine teaming situations.
This paper has inquired into the impact of autonomy on the prosecution of
targeting-related war crimes, focusing on indiscriminate attacks and the war
crime of attacking civilians. A responsibility gap may follow from the lack of a specific war crime provision on indiscriminate attacks under the ICC Statute and the uncertainty surrounding the contours of its criminalization and its elements, thus forcing recourse to the provision on the war crime of intentionally attacking civilians (Article 8(2)(b)(i) in IAC and Article 8(2)(e)(i) in NIAC). This leads to problems surrounding the ‘intentionality’ requirement
and to asking to what extent this provision can be reconciled with the specific
features of indiscriminate attacks.
This article has demonstrated that endorsing risk-based forms of criminal responsibility such as dolus eventualis or recklessness is key to increasing the possibility of ascribing criminal responsibility for indiscriminate attacks, including when autonomy is integrated into weapons and targeting.
However, even under lower standards of mens rea, the effects of autonomy itself, exacerbated by automation bias, on the ascription of criminal responsibility could be profound. Autonomy on the battlefield might well ‘pierce the fog of war’, but at the price of distorting human operators’ knowledge of risks and their risk propensity. Moreover, autonomy may impede the
formation of knowledge of the civilian status of the targets, thus opening the
door to reliance on the defense of mistake of fact, which negates intent and
prevents the ascription of criminal responsibility.
Ultimately, this article has sought to demonstrate that the debate on the ‘responsibility gap’, which characterizes the development of so-called fully autonomous lethal weapons, does not pertain only to the advent of future technology. The gap is already open.
The integration of autonomy into weapons and targeting accentuates the pre-existing vacuum in the criminalization of indiscriminate attacks in the ICC Statute and highlights the obstacles to prosecuting, as war crimes, targeting decisions and attacks based on risk-taking behavior.