Lucas Kello - The Virtual Weapon and International Order-Yale University Press (2019)
THE VIRTUAL WEAPON AND INTERNATIONAL ORDER
LUCAS KELLO
Copyright © 2017 Lucas Kello
All rights reserved. This book may not be reproduced in whole or in part, in any form
(beyond that copying permitted by Sections 107 and 108 of the U.S. Copyright Law and
except by reviewers for the public press) without written permission from the publishers.
For information about this and other Yale University Press publications, please contact:
U.S. Office: [email protected] yalebooks.com
Europe Office: [email protected] yalebooks.co.uk
A catalogue record for this book is available from the British Library.
To Stanley Hoffmann
Contents
List of Tables ix
Acknowledgments x
Introduction 1
Notes 258
Bibliography 301
Index 315
Tables
Acknowledgments
A Permanent Revolution
Every historical era begins with a revolution: it comes of age when
revolution becomes the new normal. The Reformation began when a
disaffected Augustinian friar asked, What authority has the Pope? It
achieved its peak when the schism in Christianity became a source not
of religious war but of stable social structures. The Romantic period
started with the philosophical challenge, What is a state if not also an
integral nation? It matured when nationalism in Europe became less a
cause of violence than of political cohesion. The present era, too, can be
defined by a revolutionary question, one rooted in technology: What
limits has space? For cyberspace, a defining motif of our era, smashes
the constraints of geography on the speed and range of human action.
Never before has technology permeated society so completely or influ-
enced the dealings of states and peoples so intricately as in the present.
Yet despite significant experience with the related technology, the cyber
age remains in a revolutionary condition. We have yet to master the
forces of change that define our times.
The distinguishing feature of revolution is that it challenges not just
the rational but also the moral order. Contemporary observers struggle
to make sense of both sets of problems as they relate to the virtual
weapon of cyberspace – malicious computer code designed to manipu-
late the functions of machines or else seize, corrupt, or disclose their
sensitive data. This lag in understanding is especially acute in the study
threats permanently reside at its core. States and other actors use cyber-
space to penetrate each other’s most basic infrastructures. Often,
breaches are unknown to the defender even after an attack has begun.
Here, then, is another splendid reversal of the established security para-
digm: in the past, the enemy’s presence in essential domestic terrains
signaled the failure of security policy; today it is a starting axiom.
A second difficulty that sustains the cyber revolution concerns the
technology’s scientific complexity. The correct interpretation of cyber
phenomena by strategists presupposes a strong technical understanding
of a capability that is highly esoteric, more so than the nuclear revolution
– the twentieth century’s great technological current in security affairs.
When in the 1940s nuclear physicists conceived of the fission bomb,
even the most eminent among them could not foresee its manifold
effects on human biology, social structures, or the environment. Ingenious
speculators even believed that a nuclear explosion risked igniting the
Earth’s atmosphere. Within a decade, however, the nuclear attacks on
Hiroshima and Nagasaki, as well as successive atmospheric tests, revealed
to a high degree of certainty the broad range of effects associated with
fission (and later fusion) explosions. By contrast, the virtual weapon’s
behavioral properties are far more elusive to scientists who wish to model
them with certainty – for two reasons. First, it is difficult to ascertain the
weapon’s direct effects, in other words those that unfold within the
logical environment of the target machine complex, because of the intri-
cacy of modern hardware and software. Second, the weapon’s indirect
effects, namely, those that hinder activity beyond the logical perimeter of
the compromised computer host,9 are even more difficult to ascertain,
because modern society’s reliance on complex computer systems for core
functions of government and the economy, and the interconnectedness
of both these systems and the functions they support, mean that a
cyberattack can produce cascading effects that affect essential activities
across a range of sectors and jurisdictions. These indirect effects are
largely unknowable before attack – and possibly afterward as well.10 The
vast range of unknown and inadvertent effects impedes the interpreta-
tion of probable consequences of cyberattack. Three decades after the
virtual weapon’s birth, scientific knowledge of its behavioral properties
and effects remains rudimentary.
A third complication concerns the volatility of change: the problem
of how to dominate a technology whose technical features change far
Few studies of this sort exist. One of the central premises of this
book is that cyberspace has a growing – indeed decisive – importance
for world affairs and not just, as many people already know, personal
and commercial life. Yet there is only a modest commitment within the
international relations community to explore the subject. True, a growing
number of works – some masterful – address the strategic, tactical, legal,
ethical, and economic aspects of the cyber question. There is, for
example, Lawrence Lessig’s foundational (and superbly written) book
on the interrelation of cyberspace, law, and social norms;11 Gregory
Rattray’s analysis of “strategic information warfare,” one of the earliest
attempts to conceptualize the effects of information systems on
strategic affairs (as distinct from issues of crime and espionage);12 Jack
Goldsmith and Tim Wu’s deflation of high aspirations of a borderless
Internet beyond the state system’s clutches;13 Martin Libicki’s sober
study of the limits and possibilities of hostile conquest or his examina-
tion of the challenges of deterrence in the new domain;14 Jason Healey’s
comprehensive historical account of cyber conflict;15 Michael Schmitt’s
volume on the legal and ethical travails of international cyber conduct;16
or Adam Segal’s review of how nations, small and large, use the Internet
to achieve strategic and tactical ends outside cyberspace.17
On this basis, some scholars claim, rather cavalierly and compla-
cently, that in international relations the field of cyber studies is
flourishing.18 It is not. For the existing literature betrays an important
gap: it barely, and sometimes not at all, integrates the virtual weapon
into the theoretical matter of international relations. Notable exceptions
to this trend of neglect exist, to be sure – for instance, Joseph Nye’s work
on “power diffusion” in the cyber domain and its implications for global
power structures; or Nazli Choucri’s exploration of “cyber politics” as a
new manifestation of anarchic international relations, drawing explicitly
on the discipline’s theoretical debates and frameworks.19 But the analysis
of other foundational concepts and questions remains primitive. Scholars
have not systematically addressed how cyberspace affects notions such
as international order and stability; institutions and regimes; identity,
norms, and ideology; and the balance of power – which are the disci-
pline’s prime units of intellectual currency. Security studies thinkers in
particular have barely begun to apply their conceptual toolkits to clarify,
model, or explain developments in the new domain. In a realm of study
that should be theirs, they have provided no school – so far.
for the interpretation of novelties that the old thinking leads them arbi-
trarily to neglect. This book assumes, as a core methodological principle,
that in a new realm of strategic action involving a largely untested tech-
nology, sometimes the most important events in theory construction are
those that did not occur but may plausibly happen.
This book eschews the tendencies and fashions of scientism that
prevail within certain quarters of security studies. By pursuing a deep
analysis of single cases, not always or even primarily involving tradi-
tional national defense purposes, this work strives to provide broader
and truer insights about the nature and scope of the contemporary cyber
revolution than would be possible through the canvassing of a large
number of data points that represent only narrow segments of a much
vaster and varied world of action (both proven and plausible).
Second, the book argues that understanding the cyber question
requires a new paradigm of security commensurate with it, one that
privileges not just physical but also nonphysical threats and that elevates
nonstate actors to a higher level of theoretical existence than the tradi-
tionalist viewpoint allows. This work is a sequel to the few existing
political science books on cyber issues, but it strives to avoid repeating
errors of conviction and omission: of obstinately applying old security
paradigms to the study of new problems, while neglecting alternative
theoretical lenses that may yield a truer picture of present trends.
The book’s main argument is that cyber politics exhibits two states of
nature. One is the traditional system of states locked in familiar contests
for security, but featuring a largely untested weapon whose use is diffi-
cult to model and regulate even among rational contenders. The other
is a chaotic “global” milieu comprising nontraditional players whose
aims and motives may be subversive of national or international order
and who may not accept or even understand (seasoned statesmen often
fail to grasp it) the delicate political framework of the anarchic states
system.
Therein lies the book’s main theoretical contribution: it presents a
framework that clarifies how these two universes converge and collide.
On the one hand, the study shows how the wide diffusion of cyber
capabilities enables new modes of coordination between state and
nonstate players who share certain goals and adversaries. An example of
this phenomenon is the reported collusion between Iran and company
insiders at Saudi Aramco, the Saudi Arabian national oil firm and the
THEORY AND CONCEPTS
CHAPTER 1
and because technical analysis is not the same as strategic learning, a tech-
nical criterion will not always seem appropriate to international relations
specialists. It may, in fact, impede learning because absorption with the
virtual weapon’s technical complexity obscures the purposes that drive its
use. And then there is the scope of the policymaker’s conception: the
translation of both technical knowledge and political axioms into policies
to defeat relentless threats. The clash of conceptions and the absence of
bridges to join them render the task of statesmanship even harder than if
its problems were solely technical or solely political. Thus whether technical
mastery obstructs strategic learning, or whether strategy inhibits policy,
depends on a resolution of the contest of views within the academy.
But a resolution of what kind and at what price? Three broad
approaches are possible. One is multidisciplinarity. This recognizes the
necessity for diverse viewpoints without, however, seeking to organize
them into a harmonious school of learning. Each profession is left to
pursue its own solitary path of knowledge. Another is interdisciplinarity.
Similar to the first approach, it respects the integrity of distinct view-
points, but it strives to fuse their teachings into shared insights, though
without elevating any single one above the others. The third approach is
unidisciplinarity. This strives to absorb alternative conceptions: in
computer science, to subsume the laws of human behavior within the
logic of computer programming; in political science, to reduce techno-
logical phenomena to the imperatives of governments and their peoples
– in short, to deny that machines and humans have separate essences.2
This approach is the natural temptation of each discipline in dealing
with the fashions of other disciplines, for scientific professions define
themselves not by tearing down walls of learning but by building esoteric
bodies of knowledge within them; not by recognizing and reconciling
external differences but by immersing in internal divisions. These inter-
necine quarrels are often so fierce that they destroy opportunities to
cross the quiet but much deeper divide that separates professions
from each other. Or else, if the internal divisions can be set aside, what
may emerge is a “bridging” discipline,3 a field of study that is so vast in
scope and so pluralistic in methods – strategic studies, for example,
which combines insights from perspectives as diverse as philosophy and
geography – that it is not a definable discipline.
There exists, then, a continuum of resolution: segmentation, synthesis,
and absorption. Segmentation, or multidisciplinarity, suffers the common
THE QUEST FOR CYBER THEORY 25
The uneven attendance sheet and disarray within the chamber reflect
the fact that cyber studies is both very old and very young. The field is
old in the sense that computer specialists have strived to understand the
behavior of machines and their ability to enhance (or degrade) human
activity ever since Alan Turing first laid down the principles of computing
in a seminal paper published in 1936.7 Famously, Turing proved that it
was theoretically possible to invent a machine that could solve finite
mathematical problems by interpreting simple symbols – 0’s and 1’s.8
According to the “Church-Turing thesis,” every conceivable computer
program can be written as a Turing machine program, in other words as
a particular kind of mathematical function. There flowed from this real-
ization a series of engineering quests of vast proportions. Almost
immediately, in the early 1940s, inventors applied the Turing principle
to create machines that could solve pressing problems of military tactics,
such as the tabulation of ballistic trajectories or the decryption of
German cyphers during the Second World War. The principle has since
been applied to address a limitless array of economic and social prob-
lems: the prediction of climatic patterns, the facilitation of global
financial transactions, the collection of citizen data by governments, and
so on. Almost any human task can be rendered as a mathematical func-
tion. Consequently, no aspect of contemporary public or private life is
beyond the reach, in some way, of Turing machines.
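The Turing construction described above can be made concrete with a short sketch. The program below is an illustrative toy, not code drawn from Turing’s paper: the function name, the rule-table format, and the example machine are all my own assumptions. It simulates a machine that reads and writes symbols on a tape according to a finite table of rules; per the Church-Turing thesis, any computer program can in principle be recast as such a table.

```python
# Minimal Turing machine simulator (illustrative sketch). A machine is a
# rule table mapping (state, symbol) -> (symbol to write, head move, next state).

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    cells = dict(enumerate(tape))      # sparse tape: cell index -> symbol
    head, steps = 0, 0
    while state != "halt" and steps < max_steps:
        symbol = cells.get(head, blank)            # unwritten cells read as blank
        write, move, state = rules[(state, symbol)]
        cells[head] = write                        # write, then move the head
        head += 1 if move == "R" else -1
        steps += 1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example rule table: a machine that inverts every bit on the tape,
# halting when it reads the first blank cell to its right.
INVERT = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(INVERT, "0110"))  # -> 1001
```

The simulator itself knows nothing about bit inversion; all of the machine’s “behavior” lives in the rule table, which is the point of Turing’s abstraction: hardware is generic, and the program is data.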
The field is also old in the sense that technical specialists have long
reflected on the security problems that arise from the invention of
machines that are controllable by abstract data. The benefits of cyber-
space apart, there is a basic truth to Turing’s invention that generations
of technicians have been unable to escape: because there is no absolute
standard for the interpretation of code, no computer program in the
world is perfectly secure. A phrase by Kenneth Thompson immortalized
the problem: ‘How to trust trust?’9 If no amount of scrutiny can protect
you against imperfectly trusted code, if the only software worthy of trust
is that which the machine operator wrote totally (and even here human
error may occur), and if computer programs contain code from multiple
coders – sometimes hundreds or thousands of them – then actions in
cyberspace always entail some measure of misplaced trust in the correct
workings of machines, in the benign intentions of their inventors, or
in machines’ ability to interpret correctly those intentions. When mali-
cious code abuses that trust and alters computer behavior, it would
Schools of Skepticism
It is superfluous – almost – to state that security studies scholars are
skeptical of the existence of a cyber danger: they have barely acknowl-
edged the issue, as reflected in the scant relevant literature. There has
been little systematic theoretical or empirical analysis of cyber issues
from their perspective.14 What accounts for this scholarly void? The
question is important: cyber studies can thrive only if the factors –
primarily intellectual, but also institutional – that inhibit it are
overcome. There are not one but two schools of skepticism, one intrinsic
to the cyber question, the other inherent in the intellectual fashions of
security studies.
First, some thinkers – call them deep skeptics – evade the subject
because of methodological obstacles, especially the related technology’s
scientific complexity. There is a sense in some quarters of political
science that cyber studies is fraught with intellectual obstacles. Holders
of this view emphasize two in particular.
One concerns the paucity of cases available to propose, test, and
refine theoretical claims about empirical phenomena. Paradoxically, this
problem reflects a combination of too much and too little data. Reports
of hostile cyber events are profuse, with governments and private
industry registering incidents, separately and mostly disjointedly, on an
ongoing basis. At the same time, the tendency of governments to over-
classify information has produced a data gap that complicates these
tasks. The most important tactical maneuvers in cyberspace remain
shrouded in secrecy,15 complicating scholarly investigation of cyberat-
tack as an instrument of foreign and defense policy. Archival material of
the kind security studies thinkers have often relied on to uncover the
motives and aims of conflict will likely remain inaccessible for many
years still. Thus it is often difficult to ascertain the relevance of cases to
security studies, given either poor techniques of data collection or the
lack of suitable metrics to codify the events. A recent comment by a
British official illustrates the data paradox. Asked about breaches of
government machines, he stated, “Sixty-eight cyber incidents from all
types of organisations were voluntarily reported to the National Cyber
Security Centre (NCSC) in its first month of operation (1–31 October
2016),” to which he added the obfuscation, “These incidents vary in
scale, nature and target,” on which there was no clarification: “For secu-
rity reasons the Government does not comment on specific details of
section assesses the price paid for the scholarly void that is a conse-
quence of the prevailing skepticism.
who, recognizing that a new genus of conflict was in the offing, proved
inordinately obstinate in searching for satisfactory answers to the hazards
it portended. Nevertheless, in the race against Iran’s development of a
nuclear bomb, they decided to act. It still remains for security studies
scholars to develop a theoretical scheme to address the quandaries of
cyber conflict for the future.
The torments of decision-making faced by practitioners are an oppor-
tunity for scholars. Whatever aspect of the cyber question one considers
– its strategic, tactical, or moral problems – there is in it a chance to demon-
strate the merits of academic insight in the resolution of pressing policy
challenges. The impression of a cyber danger seems only to intensify at
the highest strata of government as new threats come to light – from the
destruction of corporate networks at Sony Pictures Entertainment to
the infiltration of politicians’ personal files during national elections (to
which we will return later). So long as it persists, the reluctance to assess
the defining revolution of our times risks nourishing practitioners’ precon-
ception that security studies is, as one U.S. diplomat put it, “irrelevant” and
“locked within the circle of esoteric scholarly discussion.”38
the meaning of core notions and common terms as well as allocate priori-
ties of investigation among professions other than international relations.
All of these functions can help to address the rhetorical hysterics and
conceptual convulsions that prevail in much of the public perception of
cyber issues. Errors of rhetoric such as “Cyber Pearl Harbor!” or “Cyber
9/11!” or “Cybergeddon!” misrepresent the essence of cyber threats no
less (perhaps more) than skeptical disbelief. They make the task of
theoretical and strategic adjustment harder than if it involved merely
the analysis of phenomena about which observers know nothing at all.
Cyber studies, in other words, starts from a basis worse than zero
knowledge; it must begin from negative knowledge.
The schematization below fills the conceptual void and clarifies
prevailing misconceptions. It contains the following six elements:
cyberspace and cyber domain, cybersecurity, malware and cyberweapon,
cybercrime, cyberattack, and cyber exploitation and “kompromat.”61
1. Cyberspace and cyber domain. Cyberspace is the most elemental
concept in the new field: it establishes the technical markers within
which the virtual weapon can operate. One common definition construes
cyberspace as all computer systems and networks in existence, including
air-gapped systems.62 Another excludes isolated nodes.63 For the
purposes of this study, the first definition is appropriate. Total isolation
of computer systems is rarely feasible today. The ubiquity of computing
devices, ranging from removable drives to personal laptops – each a
potential carrier of malware – has multiplied the access vectors through
which an attacker can bridge an air gap. Moreover, the computer systems
likeliest to be shielded by an air gap (e.g. nuclear facilities) are ordinarily of
high significance to national security and therefore should not be
excluded from the plane of action. Cyberspace can thus be conceived as
comprising three partially overlapping terrains: (a) the Internet, encom-
passing all interconnected computers, including (b) the World Wide
Web, consisting only of nodes accessible via a URL interface; and (c) a
cyber “archipelago” comprising all other computer systems that exist in
theoretical seclusion (in other words not connected to the Internet or
the web and thus not to be confused with the “deep web,” or the
compartments of the public web whose interactions are not known to
an outsider because they transpire behind authentication controls).64
This conceptualization reflects an important consideration in security
planning: not all threats propagated through the web can transmit via
the wider Internet, and those that are transmissible cannot use the Internet
to breach the cyber archipelago. On these terms, there are two basic kinds
of target: remote-access and closed-access, each of which is susceptible
to different methods of approach in a cyberattack.
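The three-terrain scheme and the remote-/closed-access distinction lend themselves to a toy model. The sketch below is purely illustrative: the `Node` type, its field names, and the example labels are my own assumptions, not the author’s. It classifies a machine into the Internet, the web, or the cyber archipelago, and derives the corresponding target type.

```python
# Toy rendering of the three-terrain taxonomy described above:
# (a) the Internet, (b) the web (Internet nodes reachable via a URL),
# (c) the air-gapped cyber "archipelago". All names are illustrative.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    internet_connected: bool   # is the machine on the Internet at all?
    has_url: bool              # is it reachable through a URL interface?

def terrain(node: Node) -> str:
    if not node.internet_connected:
        return "archipelago"                   # theoretically secluded system
    return "web" if node.has_url else "internet"

def access_type(node: Node) -> str:
    # Remote-access targets can be approached over the network; closed-access
    # targets require bridging the air gap (e.g. via a removable drive).
    return "closed-access" if terrain(node) == "archipelago" else "remote-access"

print(terrain(Node("enrichment-plant PLC", False, False)))   # -> archipelago
print(access_type(Node("public web server", True, True)))    # -> remote-access
```

The design choice mirrors the text: terrain is a property of the machine’s connectivity alone, while the method of attack (remote versus closed access) follows from the terrain rather than from anything about the attacker.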
Some observers have contested the definition of cyberspace as
comprising only machines and networks. They argue that it also
encompasses “technicians, network operators, vendors, regulators, and
institutions.”65 Although there is no axiomatic definition of cyberspace
– the meaning of the term is disputed – there are strong reasons to resist
conflating the technical and the social planes in a single, catch-all
term.66 We already possess a suitable term for the expansive notion:
cyber domain, which encompasses the bevy of human and institutional
actors that operates and regulates cyberspace itself. The two notions, it
is important to realize, are distinct. Cyberspace is a technical plane
comprising machines and networks whose uniform feature is manipu-
lability by code; by contrast, the cyber domain is primarily a political
and social plane subject to wholly different interventions and behavioral
rules. We require separate concepts to capture their separate essences.67
2. Cybersecurity. Cybersecurity consists of measures to protect cyber-
space from hostile action. It can also be conceived of as a state of affairs:
the absence of unauthorized intrusion into computer systems and the
preservation of their proper functioning. Crucially, the concept also encompasses measures to
protect the cyber domain from threats emanating from the technical
plane: i.e. it means the safety and survivability of functions operating
beyond cyberspace but still reliant on a computer host, to which they are
linked at the logical or information layer.68 Insofar as measures of secu-
rity are the purview of the military or impinge on military capabilities,
they constitute cyber defense.
As we saw above, an alternative conception of cybersecurity, often
labeled “information security,” involves government protection of chan-
nels of information flow in domestic society (e.g. Internet censorship).
This area of concern dominates cybersecurity planning in authoritarian
nations. The Chinese government, for instance, operates a vast array of
Internet surveillance and suppression techniques – the so-called Great
Firewall. Officials often utilize these controls to stem activity that Beijing
regards as subversive, such as social media postings by the Uighur Muslim
minority, who represent 40 percent of the population of China’s Xinjiang
region.69 As one important study found, Chinese censorship does
packets harmful are not their contents, which in lower quantities are
harmless, but their intense volume.
We arrive at a thorny conceptual question: should malware be labeled
a “weapon” at all? It is reasonable to impose limits on this language.
After all, two of the defining features of weaponized code (as we shall see
in Chapter 2) are its intangible payload and effects, which are often
nonviolent and which have not caused fatalities.
Not all forms of malware are weapons from an international relations
perspective. In particular, we may question whether intelligence-gathering
code merits the label. Some forms of it do not because they have innoc-
uous aims: an undisruptive intrusion into a machine to detect harmful
agents in it does not paint a convincing picture of a weapon. But at least
two kinds of intelligence agents may be regarded as such. One is exploita-
tive code that produces harm which impinges on national security – e.g.
an artifact that seizes military or industrial secrets or else sensitive infor-
mation about politicians in the midst of a national election. Another
example is code that gathers data about a target system, which the attacker later
uses to customize a disruptive payload.
The crucial definitional criterion of a virtual weapon lies in its
intended and possible effects. On these terms, the vast majority of
malware is, from a national security perspective, uninteresting. It involves
criminal actions at the low end of the spectrum of effects: credit card
fraud, identity theft, web defacement, and so on. These uses of malware
are not weapons from an international relations perspective because
their possible effects are not of a magnitude that states would regard as
relevant to their core interests. Other forms of code potentially meet this
criterion of relevance: the destruction of nuclear enrichment centrifuges,
the incapacitation of financial infrastructures, the crippling of govern-
mental networks, the theft of prized military and commercial secrets,
the capture of personal data whose public divulgence influences the
outcome of national elections, and so on – actions that may damage
the political, economic, or military security of nations.
A further question concerns physical objects: can they be elements of
the weapon? Attackers sometimes embed backdoors into hardware
components – for instance, a hardware Trojan, or a maliciously modified
circuit that enables an intruder to monitor or modify the circuit’s
contents and transmissions.75 Or they use USB drives and portable
computers to deliver payloads to the target. We may be tempted to treat
also has been rising steadily, as later chapters will discuss; it now includes
physical destruction of essential infrastructure. In addition, the virtual
weapon in all its advanced forms poses enormous defense challenges
while disturbing interstate strategic stability. Whether security scholars
grasp these implications of the cyber revolution for international secu-
rity depends on their ability to break free from their preconceptions as
to what constitutes a serious threat.
THE CYBER CURSE 59
The reason for the absence of a cyber death is not hard to fathom. It
inheres in the uniquely virtual method of harm. To reach its target, a
traditional weapon has to traverse a geographic medium: land, sea, air, or
outer space. Upon arrival, it inflicts direct material results: the piercing of
flesh by an arrow’s tip, the detonation of an explosive charge of TNT
against the hull of a ship, the fission of uranium isotopes in the atmosphere,
and so on. Traditionally, then, overt violence was the only method
of harm that weapons could inflict. Even if they produced no political or
economic effects, the weapons’ methods of operation were still
inherently violent. There is no such thing as a nonviolent attack in
conventional domains of conflict. Militaries, to be sure, commonly
employ means of “psychological” and “information” warfare to supple-
ment their tactical operations, but these nonviolent actions are not
commonly regarded as attacks, for information itself does not directly
inflict damage upon the physical world.
The cyber revolution has dramatically altered this situation. Malware
can travel the vast information skin of the Internet. It obeys the protocols
of TCP/IP, not the laws of geography.14 It is little constrained by space
and obliterates traditional distinctions between local and distant conflict.
The payload is an intangible: it operates through complex coding, which
means that the weapon’s charge – a series of malicious 0’s and 1’s – is not
the most proximate cause of damage. Instead, the payload requires
a remote object – such as a programmable logic controller, or PLC, a
machine that automates industrial processes – which can be manipulated.
Information is no longer just a supplement of national power; it has
become force itself. The virtual weapon – information stored as electrons
– can inflict harm upon the political and social world without ever exiting
the intangible minds of Turing’s machines. Machine functions have
replaced violent charges in the behavior of weapons.
The absence of death and the intangibility of most direct effects are
not convincing grounds to refute the virtual weapon’s potency, although
theories wedded to the notion of violent loss of life may lead us toward
that conclusion. Cyberattack need not result in physical destruction to
pose a serious danger to society. “It may not be a bomb coming down our
middle chimney of our house,” Jonathan Zittrain explained, “but it could
be something that greatly affects our way of life.”15 Or as General Martin
Dempsey, Chairman of the Joint Chiefs of Staff, stated, “The uncom-
fortable reality of our world is that bits and bytes can be as threatening
cause of injury and may not even be violent. Second, the conception of
war as the use of armed force sets high thresholds in terms of scope,
duration, and intensity that cyber actions may not meet.24 Third, the
perpetrators of a cyberattack can be nonstate parties who are not typi-
cally considered subjects of international law and thus not subject to its
restraining procedures. Fourth, an offensive cyber operation by non-
traditional players, such as that conducted against Estonia, need not
involve the strategic purposes of states or their militaries. Fifth, at least
in the case of a generalized cyberattack, the important distinction
between military and civilian targets dissolves due to the broad diffu-
sion of computer systems in society and their interdependencies.
Despite a long record, dating from 1988, of dealing with major disruptive
cyber incidents, the difficulties of conceptualization persist. Leading
politicians are given to passionate and incautious depictions. In 2016,
Ted Cruz, a leading contender in the U.S. Republican Party’s presidential
primary campaign, warned that Russia and China were waging “cyber
war” against the United States, although the hostile activity he had in
mind – espionage – produced no intentional direct effects and was
not prohibited under international law.25 Similarly, John McCain, the
incoming chairman of the U.S. Senate Armed Services Committee,
floridly decried the Sony Pictures attack as an “act of war,”26 chiding
President Obama for describing it as an act of “vandalism.” “The presi-
dent does not understand that this is a manifestation of a new form of
warfare,” said the senator. “When you destroy economies, when you are
able to impose censorship on the world, and especially the United States
of America, it’s more than vandalism. It’s a new form of warfare.”27 Such
reactions may in the future embarrass those who articulate them; today,
however, they pervade the public perception.
Clearer-minded officials have struggled to correct persistent
mischaracterizations. U.S. Defense Secretary Ashton Carter rejected
the description of Chinese cyber espionage as an act of war. The White
House declared that while the Sony Pictures attacks amounted to “a
serious matter of national security,” they did not meet the criteria of an
act of war. Obama himself publicly repudiated the “cyber war” label,
sending a message to Washington as much as to Pyongyang that his
administration would not respond with conventional force. Were it not
for the seniority of the speakers, these corrective statements might have
been lost among the melee of gross simplifications that often ensues
THE CYBER CURSE 65
cavalierly, “If you shut down our power grid, maybe we will put a missile
down one of your smokestacks.”30 The implications of this principle for
international security are potentially serious: a cyber event can occur
that does not meet the traditional definition of war but nevertheless
elicits a reprisal of commensurate severity.31
The known cases of cyberattack display an almost sequential accre-
tion of harm that exposes the tenuity of skeptical thinking about the
scope of technological possibility. Estonia proved the technology’s
ability to convulse the economic affairs of a small nation; Georgia, the
ability to hinder civilian defense during a military invasion; Stuxnet and
the Ukrainian malware incident, physical infrastructural damage; Saudi
Aramco, systems-wide computer malfunction in a large corporation. Yet
these cases, however alarming, do not convey the limits of possibility of
cyber conflict. Scientists and experts widely recognize the potential for
graver consequences. Officials at the U.S. Department of Homeland
Security, for instance, have identified sixty-five facilities in the United
States against which a single cyberattack could cause “catastrophic
harm,” which they defined as “causing or having the likelihood to cause
$50 billion in economic damage, 2,500 fatalities, or a severe degradation
of our national security.”32
In the future, war by malware may occur if a cyberattack results in a
similar number of deaths or level of physical destruction as a major
kinetic strike. “I believe that what is termed an act of war should follow
the same practices as in other domains because it is the seriousness, not
the means of an attack that matters most,” stated Carter. “Malicious
cyber activities could result in death, injury or significant destruction,
and any such activities would be regarded with the utmost concern and
could well be considered ‘acts of war.’ ”33 War-like scenarios are not
difficult to conjure. Based on extrapolations of a cyberattack simulation
conducted by the National Academy of Sciences in 2007, penetration of
the control system of the U.S. electrical grid could cause “hundreds or
even thousands of deaths” as a result of human exposure to extreme
temperatures.34 Such an attack would be all the more damaging because,
at least initially, officials would be unable to detect the source of the
problem. Other calamitous cyberattack simulations involve the derail-
ment of trains transporting hazardous chemical materials or the
contamination of public water supplies.35 The absence to date of more
severe cyberattacks, therefore, does not prove the impotence of the new
Complications of Defense
Security planners repeatedly warn that, in the cyber domain, the offense
holds the advantage.38 Some skeptics seek to dispel this notion by empha-
sizing the high costs of staging a destructive cyberattack. They cite
Stuxnet to make their point: the operation required years of meticulous
planning, involved a preliminary intrusion into the Natanz PLC to gain
knowledge of the target, manipulated no fewer than six vulnerabilities in
the PLC environment39 – each an expensive technical feat – and required
a skilled operative on site or nearby to deliver the worm across the air gap.
Moreover, once the worm’s coding secrets were revealed, systems opera-
tors were able to patch the programming defects that the worm exploited,
rendering knowledge of these weaknesses useless to aspiring prolifer-
ants.40 For these reasons, skeptics assert, the defense, not the offense, has
the advantage.41
This conclusion is only half complete, however. It ignores or down-
plays the other half of the strategic picture: the enormous costs of
defense against a cyberattack. Four such costs in particular are notable.42
First, there is the problem of offensive unpredictability. The use of
code to achieve destructive direct effects requires the manipulation
of vulnerabilities in the target’s computer system. By definition, the
defender is unaware of such zero-day weaknesses – hence their name.
The universe of unknown and manipulable weaknesses renders a cyber-
attack difficult to predict or even imagine, complicating the design of
measures to repulse it. Incomplete knowledge of defensive weaknesses
also hinders remediation of intrusion post facto, because this requires
understanding the zero-day vulnerabilities that the offensive payload
exploits. Furthermore, the abundance of zero-day flaws and other
possible access vectors that an attacker can utilize complicates the inter-
ception of malware in transit. True, malware is often easier to detect and
neutralize when it is traveling across computer nodes – for example, via
the Internet – toward its destination than when it is already lodged
within it. Again, however, this is merely a relative statement: it says
nothing about the defender’s absolute costs (which are often high) in
conducting a successful interception.
The Stuxnet operation demonstrates these points. Stealth was a
genial feature of this multistage operation. The method of access, which
may have involved the use of infected removable drives, was unantici-
pated. For three years, the Stuxnet worm and its antecedents, which
acted as “beacons” for the offense, resided in the logical environment of
the target PLC – that is, the PLC, the machines used to program it, and
the machines it governed – without the plant operators noticing their
presence. Remarkably, the worm was able to mask its damaging effects
from the controllers even after the attack sequence had begun. Only a
few months later did the Iranians determine, with outside assistance,
the source of the centrifuge malfunction. “These guys know the centri-
fuges better than the Iranians,” said Ralph Langner, an expert on Stuxnet
and the first to divine its geopolitical intent. “[They] know everything.
They know the timing, they know the inputs, and they know it
by heart.”43
The second cost of defense concerns the problem of offense unde-
tectability. Perhaps the most worrisome feature of the cyber strategic
landscape is the possibility that attack code will reside undiscovered in
a defender’s computer system. Even after the attack sequence has begun,
detection of the threat can be slow if the defender does not realize that
malware is the cause of the mechanical malfunction.44 The difficulties
of detection are especially complicated when dealing with a complex
defensive terrain comprising multiple nodes and entry points – that is,
the most prized kinds of systems: for instance, the financial exchange
systems that process electronic securities trades or the systems that
manage a city’s civilian power supply.45 According to a report by Verizon,
private firms take an average of 240 days to spot network intrusions.46
The problem of detection lag also applies to the most sensitive govern-
mental quarters of cyberspace. Examples of the lag abound. In April
2014, computer security specialists discovered the presence of exploita-
tive malware in the computer network of the Office of Personnel
Management (OPM); one year later, they discovered a second intrusion
in OPM’s network, concluding that the intrusion was not just “histor-
ical, but an ongoing breach.”47 By the time the breach was seemingly
defeated, the intruders had exfiltrated millions of secret personnel
files, including the sensitive records of security clearance applications
the primary aim of security planning was to prevent the enemy’s pres-
ence in the home terrain. In the new domain, it must be a
starting assumption of strategy that the enemy is already inside. One
often hears FBI Director Robert Mueller’s famous refrain: “There
are only two types of companies: those that have been hacked and those
that will be hacked.”56 It is more accurate, however, to distinguish
between organizations that know they have been hacked and those
that do not know – and might never find out. The ability of advanced
adversaries to reside permanently within essential infrastructures
proves a maxim paraphrased from British politician Stanley Baldwin’s
remark about strategic warfare in the 1930s: malware will always get
through.57
A central task of security policy, then, is to detect the enemy’s
presence in the defender’s computer terrain. The failure to do so raises
enormous dangers, because residency within the logical habitat of a
machine complex affords the invader means to deprive the defense of
the ability to manage its own protection in at least two ways. One is
peer-to-peer monitoring, which allows an attacker to adjust the attack
sequence remotely and in real time. Another is the use of an intelligent
malware agent with self-adaptive capacities that enable it to learn and
override defensive acts. The attackers of Natanz, for instance, inter-
cepted security updates to the facility’s PLC. The Stuxnet worm would
identify and co-opt these updates before the PLC operators could
implement them.58 The ability of malware to generate multiple versions
of itself means that the threat variants during a cyberattack are theo-
retically limitless.
Nevertheless, a permanent breach of a computer system need not
entail permanent insecurity if the defensive terrain can be organized in
concentric zones of access so that the most prized nodes are quaran-
tined from less secure compartments. This approach, however, runs
counter to the very purpose of “information” technologies, namely, to
ease transmission of data between machines. Therein lies the root
dilemma of cybersecurity: an impregnable computer system may be
inaccessible to legitimate users, while an accessible machine is inher-
ently manipulable by pernicious code.
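The concentric-zone architecture described above can be sketched in miniature. The following is my own illustrative toy model, not anything drawn from the text; the zone names and the simple ordering rule are assumptions made for the example.

```python
# A minimal sketch of concentric access zones (illustrative only, not
# from the text). Index 0 is the most exposed ring; higher indices are
# progressively quarantined compartments holding the most prized nodes.
ZONES = ["internet", "corporate network", "operations", "plant control"]

def can_access(actor_zone: int, resource_zone: int) -> bool:
    """An actor may reach resources in its own zone or any outer
    (less protected) zone, but never an inner one."""
    return resource_zone <= actor_zone

# The root dilemma in miniature: the innermost zone is safest precisely
# because almost nothing -- legitimate users included -- can reach it.
assert can_access(actor_zone=3, resource_zone=0)
assert not can_access(actor_zone=0, resource_zone=3)
```

Tightening the rule makes the inner compartments harder both to attack and to use, which is precisely the trade-off between impregnability and accessibility that the passage identifies.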
The difficulties of detecting the foe’s presence complicate the task
of “compellence,” which Thomas Schelling described as the act of
convincing an opponent to withdraw from territory that he has
and intelligence to tailor the payload. At the same time, the costs to
the defender, who has more node interdependencies to map and
greater vulnerabilities to patch, also increase exponentially. The result
is a fundamental offense–defense imbalance. Whereas the attacker
only needs to understand the procedures of entry and attack that it
decides to employ, the defender must continuously protect the entire
network surface against the vast universe of conceivable attacks. The
growing tendency to connect critical computer systems to the Internet
is multiplying the available points of entry for use in customized
cyberattacks. Moreover, society’s mounting reliance on interconnected
computer systems to support basic economic and social functions is
increasing the opportunities to cause harm through a generalized cyber-
attack. The expanding network surface provides conditions for a shock
offensive or, as John Mearsheimer puts it, “the ability to choose the
main point” – indeed, multiple points simultaneously – “of attack for
the initial battles, to move forces there surreptitiously, and to surprise
the defender.”62
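The imbalance sketched in this paragraph can be expressed as a toy probability model of my own devising (an illustration, not a calculation from the text): if each of n entry points is independently left exposed with some small probability q, the attacker needs only one exposed point, while the defender must cover all n.

```python
# Toy model (illustrative assumption, not the author's): the defender
# secures each of n entry points independently, leaving any single one
# exposed with probability q; the attacker succeeds if at least one
# entry point remains exposed.

def breach_probability(q: float, n: int) -> float:
    """Probability that at least one of n entry points is exposed."""
    return 1.0 - (1.0 - q) ** n

# Even with 99 percent per-point patching discipline, an expanding
# network surface drives the odds of a successful entry upward.
for n in (10, 100, 1000):
    print(n, round(breach_probability(0.01, n), 3))
```

The defender's cost of keeping q small grows with the whole surface, while the attacker's cost of finding a single exposed point falls as the surface expands, which is the asymmetry the passage describes.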
Fourth, supply-chain risks present a major challenge to defenders.
Computer systems rely more and more on off-the-shelf and offshore
manufacturers for components, introducing vulnerabilities into the
supply chain. Foreign agents or private contractors could preload soft-
ware or hardware components with malware, whether for attack or
exploitative purposes. Apple security experts, for example, reportedly
worry that the company’s cloud service, “iCloud,” has been compromised
by vendors who have installed “back door” technologies for the
purposes of government spying.63 In 2009, Britain’s Joint Intelligence
Committee warned that Chinese-stocked components of British
Telecom’s phone network could be preloaded with malware or zero-day
weaknesses, giving Beijing the ability to interrupt the country’s power
and food supplies. A “sleeper” payload of this kind could be remotely
executed to achieve a preferred outcome in a future diplomatic or mili-
tary crisis. In 2012, the U.S. House of Representatives Intelligence
Committee warned that machine parts supplied by Huawei, a Chinese
company founded by a former officer of the People’s Liberation Army,
could be used to exfiltrate data from government computers. Supply-
chain risks are also a concern of the West’s adversaries. The Chinese
government, for example, recently banned the use of the Windows 8
operating system in its computers.64 Protection against such risks
notions of war and peace are polar binaries, denying that a major cyber
action is warlike is to affirm that it is peacelike. “If Chinese or Russian
spies had backed a truck up to the State Department, smashed the glass
doors, tied up the guards and spent the night carting off file cabinets, it
would constitute an act of war,” quipped Jim Lewis, a researcher at the
Center for Strategic and International Studies. “But when it happens in
cyberspace, we barely notice.”75
Second, for this reason, the notion of peace fails to capture the
essence of our strategic problem. The absence of war no longer means
the existence of peace – if peace means not merely the silence of guns
but, more fundamentally, a state of affairs to which statesmen can gener-
ally aspire as the maximal condition of coexistence within the system of
international anarchy. In the past, nations willingly exited a state of
peace when the perceived gains of war or the desire to preempt its losses
were greater than in peacetime, but always the situation of peace –
whatever its final shape – was a prima facie desirable object of statecraft.
Thus we intuitively know that a given state of affairs violates the concep-
tual limits of peace when some of the system’s main units no longer
accept it as a desirable or even bearable state of affairs.
Much of cyber activity is neither recognizably war nor recognizably
peace; it falls between the definitional bounds of these two binary
concepts without satisfying either one. It is symptomatic of this reality
that the comments of public officials and writings of analysts abound
with statements conflating the meanings of the two notions. “Russia
and China see cyber operations as a part of a warfare strategy during
peacetime,” stated the U.S. Republican Party in its 2016 electoral plat-
form.76 One commentator similarly depicted cyber activity as “warfare
during peacetime.”77 Another wrote about “Waging war in peacetime.”78
In these depictions of the contests in the new domain, it has become
difficult to distinguish a situation in which nations are at war from
when they are at peace; peace can become a form of war.
How many more conflations of elemental concepts can the design of
strategy tolerate? The thesis of “war in peacetime” has intuitive appeal
because it conveys the essential truth that much of cyber activity is
neither peace nor war. Yet because the view merges elements of each of
these two notions, it violates them both. It contains an obvious logical
fallacy: because there is neither real peace nor real war, there is therefore
both. The main result of these utterances is further confusion in the
public perception. Again their effect is to neglect the real and difficult
analytical question before us: if neither peace nor war, then what to call
the new phenomena?
Let us leave by the wayside the two oppositional notions of war and
peace, neither of which captures the essence of the contemporary situa-
tion. Let us resist, too, the urge to discard the distinction between war
and peace. Instead, let us refer to the costly but nonviolent, incessant
though often imperceptible hostilities in the new domain as a new state
of affairs – a situation of unpeace, or mid-spectrum rivalry lying below
the physically destructive threshold of interstate violence, but whose
harmful effects far surpass the tolerable level of peacetime competition
and possibly, even, of war.
By focusing so much attention on the important observation that
major cyber actions are not war, analysts and politicians have lost
sight of an equally important reality: neither are they peace, at least not
peace as statesmen have come to know and tolerate the condition. One
might also apply the label “nonwar” to describe mid-spectrum activity.
Unpeace is a better term because the vast majority of sub-threshold
cyber incidents do not even remotely approximate the criterion of phys-
ical destruction in war. In other words, their nonviolent modality is
closer to peace, though their effects are distinctly not peaceful.
The notion of unpeace is, in our context, similar in essence to the colloquial
usage of the term – the case of a tragically “unpeaceful marriage,” for
example. A couple in this situation is not at “war” because the two parties
are still wedded. That is, they have not divorced, that final severance of
relations that frequently brings forth fierce exchanges of legal blows in
court. Yet neither is the situation one of peace, for the shaky union may
experience severe flare-ups of treachery and abuse which (even if physical
violence does not occur) can inflict far more psychological and emotional
damage upon the parties than skirmishes on the judicial battlefield. Now,
refine the plane of deception and misdeeds such that the husband holds
– secretly – the login credentials of his wife’s banking, social media, as
well as work and personal email accounts; the keys to the parents-in-law’s
house; and remote access to the mobile devices of them all. The man’s
opportunities to cause his spouse harm and misery without crossing the
judicial threshold of nuptial war multiply exponentially.
Trapped in rigid mental reflexes, some skeptics will dismiss the label
of unpeace as theoretical conjuring. Minds that are enchanted by old
concepts do not easily adopt new ones. But other, more flexible minds
may find in this term a welcome means to escape the conceptual tangles
of the peace–war binary that so blights popular characterizations of the
new realm of action. Let us continue to strive for new breakthroughs in
concepts that are better suited to capturing the unprecedented trends of
security in our times.
CHAPTER 3
TECHNOLOGICAL REVOLUTION AND INTERNATIONAL ORDER
who wield them; in short, to neglect a central truth: ideals, beliefs, and
habits are what define a revolutionary situation in the political system.
As important as the nature of a new technology, therefore, are the
nature and purposes of its possessors. Confronted with a potentially
transforming invention, international relations specialists must ask not
only whether it alters the physical power of states – a question that links
their concerns with the applied knowledge of engineers – but also
whether it affects states’ ability to realize revolutionary ideals. Even
more fundamentally, analysts must ask: does the technology signifi-
cantly diminish the supremacy of the main units – states – such that the
international system’s very structure is in peril?
These questions demand of traditional international relations theory
more than it can honestly give. Surprisingly little has been written about
the general relationship between new technology and international
order.1 True, analysts have long sparred over the impact of specific tech-
nological breakthroughs on strategic affairs – the influence of nuclear
arms on superpower rivalry,2 the effects of bombers and cruise missiles
on the logic behind deterrence,3 the implications of advances in robotics
on war and warriors,4 and so on. Yet the investigation of how technology
affects foundational questions of system, order, and anarchy remains
rudimentary. Theorists show a natural reverence for such questions; if
they do not invoke them, then they do not believe that the questions
apply. Thus the gap in thinking betrays a sense of skepticism – more
visceral than analytical – about the transforming potential of new tech-
nology, one that features in contemporary debates about cyberspace.
New inventions may influence ordinary international contests involving
interstate coercion and territorial conquest, but they are not normally a
menace to the international political order, much less a threat to the
state system’s very constitution.
To this school of technological doubters one can attach a by now
familiar label: Clausewitzian skepticism. It displays the core tenets of
intellectual reactionism in security studies – a focus on the dealings of
states (usually only large ones), an emphasis on the technologies of mili-
tary conflict, and an assumption of states’ unitary rationality. This
fixation with mechanical interstate dealings reflects the broader
Westphalian mold in which established theories are cast. It derives from
a rigid model of “system” that does not allow much possibility for deep
change in the units’ basic purposes or composition. The absence of an
are the main and – crucially – irreducible units to which all other agents,
domestic and international, are subordinate.
A second set of assumptions relates to structure: those features of the
international system that cannot properly be comprehended as proper-
ties of units because they apply across all or a group of them.11 Structure
has both social and material elements. The social element reflects the
common (though not always explicit) assumption among theorists that
the competing units share a basic interest in survival and the preserva-
tion of order, which moderates the intensity of rivalry among them,
especially the resort to violence. For this reason, some political thinkers
describe international politics as comprising not only a mechanical
system of rational interactions – the world, writ large, of John Stuart
Mill’s detached Homo economicus who prioritizes defined ends according
to their numerical value12 – but also a “society of states” – Immanuel
Kant or Woodrow Wilson’s realm of players whose values, ideologies,
and passions always shape and sometimes displace selfish conveniences.
In the words of E. H. Carr: “No political society, national or
international, can exist unless people submit to certain rules of conduct.”13
The existence of a society does not mean that the units share common
interests at all times; merely that when these interests diverge, some-
times greatly, the units commonly accept the contenders’ right to protect
their own interests even if in certain instances the clash involves an
outbreak of limited violence. This common social fabric defines the basic
parameters within which states respond to the system’s material struc-
ture, such as the balance of power or international commercial flows,
which define the context, and often the drama, in which the units pursue
basic goals without, however, altering their essence.
Third are the system’s procedures: the rules, laws, norms, and institu-
tions that help to sustain this temperance of behavior and facilitate
cooperation even in the absence of a central authority to suppress the
will of the units. Views on the importance of the system’s procedural
machinery (such as international organizations or normative conven-
tions) in averting conflict and preserving order diverge widely. Some
scholars depict institutional mechanisms as tools that large powers use
to seize the distributional advantages of cooperation. Others portray
them as necessary conditions for cooperation to occur. For one side,
institutions are a device of anarchic power politics; for the other, they
are a means to overcome its crudities.14 Whatever one’s position in these
for the abolition of the state itself, or of the nation, do not figure in these
prescriptions [my italics].”37 But the liquefaction of the state within a
global proletariat did not feature in the Soviet program of change. It was
an end point more imaginable than realizable, less a command to polit-
ical action than a topic for sedate discussion among theoreticians.38 Yet
Moscow’s pursuance of “world revolution” in the form of a sustained
campaign to refashion, often by force of arms, other nations in the figure
of its own domestic image represented a direct challenge to the
Westphalian credo of ideological pluralism in the states system – an
immoderate quest to mold a global identity of purpose.39
Another historical example of systemic revision is the contemporary
project of European union, at least in its deeper political purposes.
Following the cataclysm of the Second World War, the leaders of West
Germany, France, Italy, Belgium, the Netherlands, and Luxembourg –
“the Six” – sought to secure permanent peace on the Continent. This
aim had enjoyed a long history in European political thought, existing,
as Andrew Hurrell noted, as an “imaginable political reality” since at
least the seventeenth century.40 Most famously, it formed the principal
concern of Kant’s system of international ethics and was the basis of his
assault on Grotius and other pluralists as “sorry comforters.”41 Political
movements agitating for lasting peace proliferated well before the
Second World War, emerging in Britain as early as the eighteenth
century.42 But at no point prior to 1939 did the aim acquire a sustained
following among leaders in positions of national authority within a core
group of countries – no less France and Germany, on whose soil Europe’s
most sanguinary battles were fought. The peace aim represented a major
departure from the dogma of power politics, which accepted as legiti-
mate only the avoidance of generalized war among the great powers
because it could jeopardize the continuance of the states system itself.
Limited war, by contrast, was deemed permissible in proportion to
balancing needs, as often arose. What is more, in the logic of power
politics, the aspiration to permanent peace seems pernicious, because to
eradicate the use of force is to remove the final means in the preserva-
tion of the balance of power. It is an invitation to universal conquest:
permanent peace at the cost of permanent subjugation – and after much
violence. On these terms, the logic of power politics deemed only a
transitory peace attainable or even desirable; peace was a contingent but
not an absolute goal. On the contrary, European federalists extolled
has long ceased being a creature of their control.67 Thus, contra John
Ruggie, who argues that political forces become “preponderant” only
after a technology’s emergence, the Internet is proving far more difficult
to control now than at its stage of creation, when political goals were
supreme.68
Even in its deterministic stage, however, technology can be but one
among several important factors that shape the scope of possible polit-
ical action. And it may not even be the most significant. It would be
wrong to presume that technology can operate entirely beyond reach
of the political forces that motivate its development and guide its
consequent uses.69
between 1960 and 1989, when Soviet economic growth was the lowest
in the world,75 or between 1992 and 1998, when the Russian economy
declined on average almost 7 percent per year.76 In the absence of nuclear
weapons, the post-1945 bipolar order would have been less pure in the
sense that the two superpowers would have been less able to influence
their interactions both with each other and with their respective smaller
partners. In the post-1991 period of American preeminence, it may
have led to the absorption – peacefully or forcefully – of all fifteen
former Soviet Republics (and not just Estonia, Latvia, and Lithuania)
into the Western security architecture.77
It is important to note, however, the limits of the nuclear revolution in
this regard: it has not equalized large and small states. Acquisition of the
bomb has not consecrated Israel, North Korea, Pakistan, or even India as
global powers equal to the larger nuclear states – the United States,
Russia, and China – even if the elevated status of the lesser powers would
be inconceivable without their nuclear arsenals. In other words, the
expansion of systemic disruption via the proliferation of nuclear arms has
not given rise to a multipolar order, although it may have increased the
scope of maneuver of the nuclear powers over all others.
Technological Volatility
A new technology may contribute to instability among rational states
because it complicates the task of evaluating the true balance of military
capabilities among them. This can occur if the pace of technological
development is faster than strategic planners’ ability to measure its
effects on the means of war; or if these effects are hidden or so scientifi-
cally complex that the contenders cannot decipher them.
The emergence of submarine warfare illustrates the first instance. In the
early twentieth century, British military commanders regarded the sub-
marine as a strategic relic even though they perceived its potential effects
on naval strategy. Submarine warfare, in the words of one British admiral,
was “underhand, unfair, and damned un-English.”78 Military planners
regarded the beloved Dreadnoughts – the colossal “castles of steel” – as
tactically superior to submarines. As a result, the Royal Navy invested
heavily in the production of surface ships in whose ultimate superiority
over submersible vessels they firmly believed.79 Reality proved otherwise.
At the start of the First World War, few viable means existed to detect the
presence of enemy ships below water. Even if their detection was possible,
viable defenses against them were slow to develop.80 Consequently, during
the opening months of the war, the Royal Navy lost four capital ships –
three of them in a single action – to German U-boats. The development of
this new form of warfare outpaced the ability of military commanders to
learn the lessons of defense (or, for that matter, offense).81
The early stages of the nuclear revolution illustrate the second
problem. Consider, for instance, nuclear fallout – the chief collateral
effect of nuclear explosions. When in 1952 the United States detonated
the first megaton hydrogen bomb, few scientists correctly anticipated the
degree of fallout, even though the phenomenon was well known to
them.82 Technical analyses of the effects of the first Soviet thermonuclear
test in 1953 also diverged wildly. The Chairman of the U.S. Atomic Energy
Commission, Lewis Strauss, observed that individuals injured by the
blast were recovering swiftly from the effects of radiation.83 This finding
disturbed A. H. Sturtevant of the California Institute of Technology, who
claimed that the pathological effects (e.g. permanent genetic mutations)
endured beyond the first generation of victims.84 General Curtis LeMay,
the longtime commander of Strategic Air Command, which held responsibility
over nuclear forces, took the opposite position,
testifying that a person could avoid deleterious health effects by lying
under three feet of dirt!85 For ten years the fallout debate raged and
hindered the development of nuclear strategy. It obscured understanding
of the extent of civilian deaths, the uninhabitability of populated areas,
the contaminability of food and water supplies – questions that were at
the core of political, doctrinal, tactical, even ethical debates about the
management of this new genus of warfare. The scientific debate waned
only in 1963 with the signing of the Partial Nuclear Test Ban Treaty,
which prohibited testing in outer space, the atmosphere, and below water,
and thus placed an artificial freeze on the pace of technological develop-
ment. In this way, a procedural device in international law enabled
strategic knowledge to match scientific realities; it helped to control the
destabilizing effects of the nuclear revolution.
famine would ensue. In addition, these events may produce the dis-
integration of internal state structures. Territorial boundaries would
become indistinguishable, national jurisdictions would be rendered
meaningless, and local or transnational authorities may become the new
agents of a chaotic system of global politics.
the rigid parameters of the Conventional Model, that is, the basic
purposes of the units or their supremacy next to other players.
This chapter has outlined a more expansive revolutions framework.
It distinguished changes in scale from changes in kind. Not all revolu-
tions are equal in transforming potential; in fact, some reflect a basic
continuity in international affairs because they leave untouched the
established model’s defining assumptions. The least severe form of revo-
lution – if it is that – is systemic disruption, or third-order technological
revolution. It transpires within the material and social confines of the
state-centric universe. A more fundamental form of change, systemic
revision, or second-order technological revolution, implies a deep internal
reform of this universe. It occurs if new technology empowers a state or
group of states that repudiate the basic ends that the dominant units
regard as legitimate and which, in the analyst’s eye, seemed “rational.”
Systems change – first-order technological revolution – is the most
radical form of transformation, because it affects the system’s basic
constitution: the traditional supremacy of states over other actors. It
happens when technology gives rise to a new class of actors whose
motives and aims the units are unable to grasp and whose actions they
are unable to control even as they are greatly affected by the alien players.
The next part of the book will elaborate on these distinctions as they
apply to the contemporary cyber age.
PART II
DEGREES OF THE CYBER REVOLUTION
CHAPTER 4
Technological Volatility
Technology itself is a destabilizing factor: cyberweapons are so novel
and the vulnerabilities they seek to manipulate so inscrutable as to
impede interpretation of the probable effects of their use. Put simply, it
is difficult to know how pernicious code will behave.
One difficulty concerns collateral damage. A poorly designed cyber-
weapon can cause far-reaching effects beyond the intended target if it
infects a large number of third-party machines. The danger of collateral
damage is due to the ease and rapidity with which malware travels the
Internet. It also arises from the intertwining of many military and
communications networks. In the lead-up to the NATO air campaign
to depose Muammar Gaddafi in 2011, American officials reportedly
weighed the option of using weaponized code to blanket Libya’s govern-
ment communications systems. They decided against the move because
this infrastructure was closely entwined with civilian systems.
One important means of limiting collateral effects and of concen-
trating the force of the blow upon a single system is to customize the
malware. The Stuxnet worm illustrates this design feature. It altered
the rotor speed of IR-1 centrifuges that were joined to Siemens S7-315
and S7-417 industrial controllers, which governed respectively the
plant’s Centrifuge Drive System and the Cascade Protection System.15
The worm’s handlers had hoped it would not escape the air gap that
enveloped the facility. But like a fish in an open-top tank that is dipped
in the sea, the artifact travelled speedily beyond its designated logical
habitat, infecting tens of thousands of other machines in more than one
hundred countries in less than one year.16 Newly infected systems
included industrial controllers that the worm did not affect because
they did not match the precise configuration at Natanz. Thus although
the propagation technique (it could infect almost any machine running
a Windows operating system) was not customized,17 the payload that
caused physical damage to Natanz’s centrifuges was closely tailored to
their controller.
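The gating logic described above – indiscriminate propagation paired with a payload that fires only on a precise target fingerprint – can be sketched in a few lines of Python (an illustrative sketch only: the controller model numbers come from the text, while every function name and field name below is hypothetical):

```python
# Illustrative sketch of payload customization: the worm propagates to any
# host it can reach, but the destructive payload executes only on machines
# matching a precise hardware fingerprint. All field names are hypothetical.

TARGET_CONTROLLERS = {"S7-315", "S7-417"}  # controller models named in the text

def fingerprint_matches(host: dict) -> bool:
    """True only if the host matches the intended plant configuration."""
    return (host.get("controller_model") in TARGET_CONTROLLERS
            and host.get("cascade_layout") == "ir1-cascade")  # hypothetical marker

def visit(host: dict) -> str:
    # Propagation is indiscriminate; the payload is not.
    if fingerprint_matches(host):
        return "payload-executed"   # sabotage the matched configuration
    return "dormant"                # infected third party, left unharmed

# A non-matching third-party machine is infected but unaffected:
print(visit({"controller_model": "S7-1200"}))                                  # dormant
# Only the fingerprinted configuration triggers the destructive routine:
print(visit({"controller_model": "S7-417", "cascade_layout": "ir1-cascade"}))  # payload-executed
```

The asymmetry in the sketch mirrors the account above: broad infection (the propagation routine ran on almost any Windows machine) combined with narrow effect (the payload checked for the exact Natanz configuration before acting).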
Malware customization only partly resolves the problem of uninten-
tional civilian harm. Although it reduces the scope of direct effects
beyond the target machine, the attack’s indirect consequences can still
be enormous if the affected machine (or the secondary activity, such as
power generation, that it governs) supports essential social and economic
functions. These indirect effects are difficult to model or predict. But the
effects of technical errors in stock-trading systems offer a glimpse of
the chaos that a major cyberattack on these systems might produce. In
the famed “flash crash” incident of May 2010, a design flaw in the “sell
algorithm” of a trading house seeking to sell 75,000 E-Mini futures contracts
prompted a spiraling trade volume and loss of liquidity that saw the
value of the Dow Jones fall by nearly 1,000 points in a matter of a few
minutes.18 Two years later, a software error caused the equities trading
company Knight Capital Group to issue faulty orders, resulting in losses
of $440 million.19 In July 2015, the New York Stock Exchange halted
the trading of $28 trillion worth of equities for nearly four hours because
of the discovery of a coding error.20 None of these technical crashes
points to an impending scenario of financial Armageddon, as some
alarmists have warned. Yet they reveal the fragility of the underlying
technology of modern financial infrastructures. What an inadvertent
technical error can cause, a highly customized weapon can multiply
several times over. A major and intentional interruption of stock-trading
platforms could create psychological reverberations that undermine
public confidence in the entire financial system.
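The feedback mechanism behind such spirals can be caricatured in a toy simulation (deliberately crude, not a model of the actual 2010 “sell algorithm” or of real markets; every parameter value below is invented):

```python
# Toy illustration of the feedback loop described in the text: a sell
# algorithm pegs its order rate to recent trading volume, but its own
# sells - echoed by reactive algorithms - inflate that very volume.
# All parameter values are invented for illustration.

def simulate(position=75_000, share=0.10, base_volume=1_000, feedback=15):
    """Sell `share` of recently observed volume each minute until done."""
    remaining, volume, sells = position, base_volume, []
    while remaining > 0:
        sold = min(remaining, max(1, int(share * volume)))
        sells.append(sold)
        remaining -= sold
        # Crude stand-in for "hot potato" trading: the volume the algorithm
        # tracks is swollen by a multiple of its own last sale.
        volume = base_volume + feedback * sold
    return sells

sells = simulate()
print(f"position dumped in {len(sells)} steps; first sales: {sells[:4]}")
```

With share × feedback greater than 1 the loop is self-amplifying: each minute’s selling exceeds the last until the position is exhausted within a dozen or so steps; set the feedback low enough and the same algorithm meters out its sales gradually.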
A related difficulty is the potential for blowback: the possibility that
the negative effects of a cyberattack will be felt by the attacker or its
allies, whether through the self-propagative tendencies of malware
(causing direct effects on home computer systems) or through cascading
economic damage (indirect effects on one’s home society).21 Part of the
problem arises because of the interconnectedness of computer systems
and networks. Take down servers in Iraq’s financial and banking sector
and you might also disrupt cash dispensers far away in Europe. This was
THIRD-ORDER CYBER REVOLUTION 125
adventures in the future. In fact, the general reluctance of the Bush and
Obama administrations to disclose or claim offensive actions may
reflect a desire to conceal the strategic returns of a still largely unproven
capability to which the American economy and society are immensely
vulnerable.58 Before Stuxnet struck in 2009, Iran’s arsenal of weaponized
code was empty or negligible. The Iranian cyber army’s chief capacities
were in the area of surveillance. Its main preoccupation was the moni-
toring and interception of the “Green Movement,” a popular uprising
that engulfed the nation’s urban centers in 2009. The dazzling spectacle
at Natanz altered these priorities and activities. Within three years, in
the summer of 2012, Iranian agents reportedly disrupted U.S. financial
operations with DDoS attacks, which some American officials regarded
as a muted retaliation for Stuxnet.59 Later that year, Iranian agents
crafted the “Shamoon” virus that incapacitated about thirty thousand
machines at Saudi Aramco, the world’s largest oil firm. Thus it was not
long before the despair and embarrassment of Natanz evolved into a
recognition that the new technology’s offensive dominance afforded its
possessors a new capacity for potent action in foreign policy.
A third and graver problem concerns the use of active defense, a class of
proactive measures that involves preemption or prevention of cyberattack
by infiltrating or disrupting an opponent’s computer systems (Chapter 9
will discuss this problem further).60 Cyber artifacts are most easily detected
in transit, by which time it may be too late to neutralize them. Therefore,
the existence of a wide defensive gap has produced a tendency among the
most capable players toward the persistent and absolute penetration of
adversaries’ computer systems. Both the American and British governments
operate mass surveillance programs. Officials at U.S. Cyber Command
have boasted that they can penetrate almost any computer system in the
world. The revelations of covert surveillance programs by Edward Snowden
in 2013 show that Washington has put this claim to wide practice abroad.
In 2016, the British Parliament passed the Investigatory Powers Act – the
so-called Snoopers’ Charter – which will expand the foreign surveillance
powers of the electronic spy agency,
Government Communications Headquarters (GCHQ).
Some observers like to claim that such systematic programs of
privacy infringement show that the state has become dominant in
private life. This is a mischaracterization of the current situation. The
compulsion to engage in surveillance is a reflection of the defender’s
anguish. It gives credence to the presumption that “malware will always
The speed at which code can travel and execute eliminates temporal
limitations to the infliction of harm across national borders. The new
capability pushes the upper speed of weapons systems from Mach 20,
the speed of the fastest intercontinental ballistic missiles, to the velocity
of electrons.
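The disparity can be made concrete with back-of-the-envelope arithmetic (a rough sketch using ballpark physical constants and a straight-line distance, not operational data; note that data signals in optical fiber travel at roughly two-thirds the speed of light):

```python
# Rough travel-time comparison over an intercontinental distance:
# an ICBM at roughly Mach 20 versus a data signal in optical fiber.

MACH = 343.0              # speed of sound at sea level, m/s
ICBM_SPEED = 20 * MACH    # ~6.9 km/s, the figure cited in the text
FIBER_SPEED = 2.0e8       # m/s, ~2/3 the speed of light in glass fiber

distance = 10_000_000.0   # 10,000 km, an intercontinental range, in meters

icbm_minutes = distance / ICBM_SPEED / 60
signal_ms = distance / FIBER_SPEED * 1000

print(f"ICBM:   ~{icbm_minutes:.0f} minutes")      # ~24 minutes
print(f"Signal: ~{signal_ms:.0f} milliseconds")    # ~50 milliseconds
```

Even allowing generously for routing and processing delays, a packet crosses the planet three orders of magnitude faster than the fastest missile – the collapse of strategic depth that the surrounding discussion describes.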
Consequently, the interactions of a cyber conflict unfold in
milliseconds – an infinitesimally narrow response time for which
existing crisis-management procedures, which move at the speed of
bureaucracy, are not adequate. The attacker, to be sure, also faces bureau-
cratic encumbrances. But he has the advantage of being able to preselect
the initial, and possibly decisive, moves in the conflict before it begins.
This is particularly true of a multistage cyberattack involving the
preliminary intrusion into the defender’s terrain. The longer the enemy
resides inside the target system undetected, the more information on its
design and the defender’s habits he is able to obtain and the greater
will be his opportunities to derive tactical gains in the actual battle.
The defender, by contrast, faces immediate policy hurdles. Who has the
tactical capacity to respond? To whom does the authority to initiate
the response belong? What retaliatory action, if any, is adequate?
In answering these questions, traditional precedents that regulate the
role of government agencies in the conduct of national defense can be
difficult to interpret in a cyber emergency. And even where the necessary
tactical action is known, the authority to initiate it may be unclear; or
else the ability of operational and command structures to implement it
may not exist. To illustrate, the U.S. National Security Agency (NSA)
has authority to defend and retaliate against foreign-based cyberattacks,
but it may lack access to the forensic data necessary to tailor a timely
defensive response if such information resides in private computer
systems or in foreign jurisdictions. “I can’t defend the country until I’m
into all the networks,” remarked NSA Director Keith Alexander in
2013.63 Once again, the government’s drive to monitor civilian networks
is a sign of weakness, for in the absence of threat information therein, it
cannot mount a full defensive strategy.
Private actors have few means to carry out a response, particularly of
a punitive kind, against an attacker who operates from one or multiple
nations, or whose computers and servers reside in several foreign
jurisdictions. In August 2013, members of the Syrian
Electronic Army, a foreign group sympathetic to President Bashar
Al-Assad, attacked the domain name registrar of The New York Times,
crashing the newspaper’s website for several hours. In another intrusion
earlier that year, malware of Chinese origin could have “wreaked havoc”
on the company’s news coverage during the night of the 2012 U.S. pres-
idential election. Yet in neither instance could the publisher have taken
independent action – legally or effectively – against the culprits in Syria,
China, or wherever else they may have resided.
Another incident that smashed through multiple borders of the rigid
Westphalian system was the December 2014 cyberattack against
Sony Pictures. Its political master, North Korea’s supreme leader Kim
Jong-un, was in Pyongyang. The perpetrators reportedly routed the
attack via computer infrastructure located in northeastern China. The
firm and its affected machines were located in the United States. Its
parent company, the technology conglomerate Sony, was based in Japan.
With which government did responsibility for the reply rest – the
United States, Japan, or both? Which nation – North Korea, China, or
both – should have been the object of punitive measures? In the midst of
the unfolding crisis Michael Lynton, Chief Executive Officer of Sony
Pictures, observed: “We are the canary in the coal mine, that’s for sure.
There’s no playbook for this, so you are in essence trying to look at the
situation as it unfolds and make decisions without being able to refer to
a lot of experiences you’ve had in the past or other people’s experiences.
You’re on completely new ground.”64 Much the same might have been
said of the frequently clueless international relations specialist.
The implementation of automated responses, or “tactical fires,” can
go far toward restoring strategic depth. Automation is already a common
feature in the detection of weaponized code. Conventionally, however,
the defeat of sophisticated weapons requires the intervention of engi-
neers, a process that can take months to complete if the code is
especially intricate or if the vulnerabilities it exploits are unclear. The
retaliatory response, too, requires a human decision-making process
that must weigh policy, legal, ethical, and institutional considerations.
In 2014, DARPA, the same Pentagon agency that funded the
ARPANET, the modern Internet’s precursor, launched a program to
develop software that could repair itself automatically.65 A similar
program seeks to automate responses to distributed denial-of-service
attacks, which work by overwhelming computer services by flooding
them with information. Cloud computing infrastructure, for example,
Escalatory Dangers
Together and individually, the forces of systemic disruption in the cyber
domain produce risks of an unwanted or accelerating conflict among
contenders who wish to avoid it. The technology’s volatility complicates
the design of models to contain a conflict within tolerable bounds.
Players in a cyber conflict may never command good knowledge of the
consequences of their tactical moves, as American decision-makers
discovered when considering cyberattacks against Saddam Hussein’s
foreign laborers over the heads of their own leaders. Yet despite Russian
Communists’ ideological fervor against established diplomatic conven-
tions, despite their abstract yearning for a stateless society, theirs was not
a political revolution of the first order. Unlike the programs of universal
conquest of Napoleon or Hitler, Russia’s new leaders did not pursue a
generalized war aimed at liquidating the nation state within a World
(or European) Leviathan.1 Instead, the Communist creed was for the
Russians largely an instrument of the national interest. They recognized
the imperative of strengthening their transformed state within a larger
and almost universally hostile states system, some of whose basic
conventions – trade agreements, bilateral treaties, international organi-
zations, and so on – they grudgingly adopted following the setbacks to
world revolution in 1920–21.2 Soviet Russia (and from 1922, the Soviet
Union) was not an agent of systems change. It accepted the necessity
of a fragmented political system built around the organizing principle
of sovereignty. As Lenin observed of Bukharin in a denunciation that
later applied generally to the Soviet ruling class, “Scratch a Russian
Communist and you will find a Russian chauvinist.”3
Yet the Soviet entity was revolutionary in other ways: it was an
agent of second-order revolution, or systemic revision. Its ideology and
autocratic character gave shape to a foreign policy that defied core inter-
national principles and institutions. Leon Trotsky, the People’s Commissar for Foreign Affairs,
displayed a revolutionary zeal in his astounding revelation in 1917 of the
secret treaties of the Tsarist and Provisional governments, which exposed
their complicity in imperialist dealings and annexationist cravings. This
act may not be surprising in today’s era of WikiLeaks, but in those times
it challenged the integrity of government and the cherished customs of
diplomacy.4 More fundamentally, succeeding generations of Soviet
leaders, beginning with Stalin, discarded received principles of inter-
national conduct by repeatedly invading nations that they sought to
reduce to a Communist servility – Finland in 1939, the Baltic states in
1940 (and again in 1944), Hungary in 1956, Czechoslovakia in 1968,
Afghanistan in 1979. In this context, we can stress the importance of
Moscow’s aspiration to world revolution.5 Although it took a back seat to
the consolidation of the Soviet state in the Westphalian order, it remained
a centerpiece of the Soviet worldview and motivated a foreign policy that
strove to refashion smaller neighbors in the master’s bizarre image of
itself. This was not a unit whose vision of itself and of its place in the
SECOND-ORDER CYBER REVOLUTION 145
world could easily be squeezed into the tidy Conventional Model of
international relations, no matter how hard theorists today might try.6
In this ideological endeavor, new technology – the nuclear bomb –
was a secondary but nevertheless crucial enabling factor. The Soviet
Union’s acquisition of a fission bomb in 1949 cast a tremor over East–
West rivalry that enabled Moscow to carve out its sphere of influence by
force without fear that Western nations would thwart its military
designs. But nuclear technology had also a restraining effect on Soviet
expansionism. As David Armstrong explains, “Nuclear weapons made
accommodation with the capitalist powers an urgent long-term neces-
sity rather than a short-term dialectical tactic.”7 It was not nuclear or
some other technology in the end that drew a curtain on the Communist
Revolution and its chief agent. Rather, this came about largely because
of internal political forces: notably, the rise to power in 1985 of Mikhail
Gorbachev, who embraced the principles of international society, as
revealed for example by his celebrated decision not to veto the United
Nations resolution that authorized the use of force against Iraq – a
Cold War ally of Moscow – following its illegal annexation of Kuwait
in 1990. In sum, the relationship between the Communist and nuclear
revolutions shows that transforming technology can fuel or restrain, but
not by itself instigate or overturn, a concerted program of systemic
revision.
We can observe a similar pattern of effects in the virtual weapon’s
impact upon the conduct of North Korea – the quintessential revolu-
tionary state in our times, more so even than the Soviet Union was in
the last century, because the cult of Lenin and Stalin never acquired the
dynastic character that has perpetuated the House of Kim through
generations. North Korean leaders have fanned antagonisms both
regionally and globally in order to disturb international stability. They
seek to score political points at home by instigating chaos abroad. In
this program of foreign subversion, they have benefited from the
widening scope of strategic maneuver that the virtual weapon affords:
the peculiar methods of unpeace, or what Chapter 2 described as
harmful activity that falls legally and conceptually below the threshold
of war, though it may inflict more economic and political harm than a
single act of war, and whose effects seem less tolerable to ordinary
participants in the international jungle than the familiar means of
peaceful rivalry.
regional military bases of the United States, it cannot yet easily reach the
despised enemy’s frigid shores of Alaska, much less the temperate and
strategically more significant urban centers of Seattle or Los Angeles,
although its focused quest to assemble long-range ICBMs such as the
Taepodong-2 reveals such an intention.18
To close this capabilities gap, North Korea has turned to the develop-
ment of unconventional technology. Of special importance to the Kims’
revolutionary program both at home and abroad is the ultimate technology
of conflict, nuclear weapons. The country has built between two and three
dozen nuclear warheads.19 Although the evaluation of North Korean
nuclear doctrine is notoriously difficult – its conceiver Kim Il-sung left no
written records of his motives and his offspring did not air theirs – its basic
elements may be inferred from the regime’s political purposes of survival.
One possible goal is Korean unification: a reversal by sheer military
force of the artificial separation of the brotherly peoples by a menacing
imperial power.20 As Young Ho Park observed, although the main thrust
of North Korea’s unification strategy is the enlistment of anti-government
activists in the South, not war, “it is difficult to say that North Korea has
totally given up the idea of unifying Korea by force of arms.”21 Thus it is
conceivable that nuclear weapons are a final instrument of unification.
But unification of the Koreas by nuclear war or blackmail is no more
achievable today than was the forcible fusion of the two Germanys by a
nuclear-armed Soviet Union during the Cold War.22 Nuclear status is
likelier to prevent forced unification on the South’s terms than it is to
bring it about as the North desires.
Therein lies a more plausible purpose for the weapons: existential
deterrence, or the prevention of a military invasion by the United States
and its ally to the south.23 In this interpretation, the North’s nuclear
arms are the central component of an interplay between domestic and
international politics. They serve as a deterrent shield to preserve the
nation’s independence and protect the regime’s existence against conven-
tionally superior enemies.24 But the deterrent function will not be
complete until the North is able to detonate nuclear weapons on the
other side of the Pacific. The country, however, only very recently
mastered the technology of miniaturization. Presently it can mount a
warhead onto a medium-range Rodong missile, but not onto a long- or
full-range intercontinental missile, which at any rate it does not possess.25
Yet even when the North breaks through the intercontinental missile
barrier (the world awaits with trepidation the answer to the question,
“What happens when it does?”),26 its nuclear capability will provide
only limited strategic gains. The country already possesses in its vast
arsenal of rocket forces the ability – at a moment’s notice – to obliterate South
Korea’s (and Japan’s) industrial heartland and to wipe out a large number
of the 28,500 American soldiers stationed on the peninsula. The addi-
tion of a few American cities to this list of possible horrors would
significantly strengthen but not fundamentally alter the deterrence calculus
under which the United States already seeks to avoid a military showdown.
For this same reason, the North will not likely be the first to launch a
nuclear strike or unleash its conventional rockets, except in the face of
an imminent threat to the survival of the Kim house.
Instead, the rewards of nuclear weapons are perhaps more political
than strategic. Scott Sagan has made the important point that states
sometimes acquire nuclear arms not only to thwart foreign threats but
also to secure domestic political interests.27 “Nuclear weapons, like other
weapons, are more than tools of national security,” he explained. “They
are political objects of considerable importance in domestic debates and
internal bureaucratic struggles and can also serve as international
normative symbols of modernity and identity.”28 In addition, as we shall
see below, they are also instruments that repressive leaders can use to
defy the international consensus for domestic purposes of regime legiti-
macy – a different manifestation of the domestic proliferation model
than the one Sagan supposes. Normally, this model invokes the complex-
ities of internal bureaucratic interests. As Victor Cha explained, in some
states – that is, the “rational” units such as India and Pakistan that
inhabit the Conventional Model – the drive toward the bomb may be
the method by which rising civilian technocrats displace an “old, corrupt,
and inefficient military bureaucracy.”29 These factors need not apply,
however, to a nation in which power is so highly concentrated in the
figure of a single man. To be sure, the North Korean military elites on
whose support the regime’s continuity relies require appeasement and
inducements.30 Kim Jong-un must, like any despot, satisfy the interests
of a cadre of underlings that together constitute a government elite
(though they may not be a “ruling” elite).31 But to invoke the image of
nagging lobbyists, overbearing bureaucrats, and cunning legislators who
can twist Kim’s arm is to miss the unsurpassed despotic nature and
ideological fervor of his regime.
and the best organized.”33 The North’s arsenal of code inspires awe as
much for its proven destructive capacity as for the revolutionary purposes
that drive its use. The record of this behavior illustrates the new scope
for strategic maneuver that the technology affords the revolutionary
state in its relations with ordinary states whose goals it repudiates and
conventions it defies.
The North’s cyber arsenal traces its origins to the electronic warfare
(EW) capability of the Korean People’s Army in the 1970s. EW denotes
offensive or defensive activities within the electromagnetic spectrum.
Strictly speaking, it is different from cyber operations because it does
not involve the use of code to alter machine functions, but it can impair
them significantly if an attacker uses electromagnetic energy, such as a
concentrated “pulse,” to incapacitate the machines’ electronics. On these
terms, the common label “cyber electronic warfare” is misleading because
the activity affects cyberspace but does not use it to inflict harm:
cyberspace is the object, not the medium, of harm. Nevertheless,
because electromagnetic and cyber operations often rely on similar
technologies and expertise, it is natural that the latter capability grew
out of the former.34 Following a study of the lessons of the 1990–91
Gulf War,35 the KPA expanded its EW capability to include assets in
information warfare, a broad notion that encompasses weaponized code
but also the use of propaganda and disinformation (discussed further in
Chapter 8 in the Russian context).
The most important step in North Korea’s ascent in the hierarchy of
cyber powers was the establishment in 1998 of the famed Bureau 121.
The unit operates within the Reconnaissance General Bureau, the
military’s principal clandestine and intelligence body. As the nation’s
principal cyber outfit, it is charged with a singular responsibility: offen-
sive activities in foreign cyberspace.36 To nourish this unit with skills,
North Korea has developed a small but highly specialized computer
industry. Although it hosts only approximately one thousand IP
addresses – far fewer than the 112 million in South Korea, whose
capital city, Seoul, was designated “the world’s leading digital city” and
which far surpasses the North in technological ability37 – this native
technological base is not primitive. It has developed its own intranet
(although state websites are hosted abroad in China); tablet computers,
for example, the “Samjiyon,” which features applications tailored to it;
and a rudimentary social network, the oddly named “Best Korea’s Social
researchers who seek to answer it, for the North is no more forth-
coming about the aims of its cyber program than it is about the goals
of its nuclear program. Nevertheless, the public record of events offers
important clues about these goals. Three stand out.
One goal is offensive: North Korea’s cyber arsenal provides a piercing
but not overtly violent tool to seize foreign economic assets – all the
while avoiding the penalties of doing so with traditional military means.
For example, the computer security firm Symantec tied North Korean
hackers to attempts to penetrate the Tien Phong Bank in Vietnam.46
More ominously, in February 2016 the hackers used code to penetrate
the computers of Bangladesh Bank, the country’s central bank. The attackers
exploited weaknesses in the SWIFT interbank messaging system, which
handles billions of dollars of transactions each day, to seize $81 million
from the bank’s account in the Federal Reserve Bank of New York.47
(The size of the heist might have been much larger had the culprits not
misspelled the name of an entity receiving the illicit cash.)48 Such seizures
of foreign money enable North Korea to offset the internal pains of
Juche and the economic sanctions for its maligned nuclear program.
A second important goal is psychological: the cyber actions dilute
the confidence of North Korea’s adversaries in their own superior
economic and military base. Recall from the discussion in Chapter 4
that the virtual weapon has enormous depth of power; players can use it
to penetrate an opponent’s essential computer systems remotely and
sometimes permanently. In April 2011, nearly half of the machines in
the South Korean Nonghyup agricultural bank crashed. For several
days, tens of millions of bank customers were unable to access their
cash.49 In December 2014, the agency that operates the South’s twenty-
three nuclear-power facilities, Korea Hydro and Nuclear Power Co.
Ltd., reported an intrusion into its computer systems. Days later, a
Twitter user demanded that three reactors shut down, or else the plants
would suffer unspecified destruction (the intruders did not disrupt the
reactor cores’ activity or seize critical systems data).50 In March 2016,
the North reportedly hacked the mobile phones of several key govern-
ment and media employees in the South.51
These incidents did not produce large-scale physical damage. Yet they
serve as a warning to North Korea’s enemies that it can utilize cyberspace
to penetrate permanently – and, if necessary, disrupt – the computer
infrastructures that sustain the essential economic and civilian functions
aims while avoiding the stiff penalties that accompany war. Obama himself
made this point plain in declaring, publicly and unequivocally, that the
cyberattacks against Sony Pictures were not an “act of war.” Promising to
retaliate “proportionally” – in other words, not militarily – he signed an
executive order in January 2015 imposing new financial sanctions on the
reclusive nation, excoriating the North for its “provocative, destabilizing,
and repressive actions and policies, particularly its destructive and coercive
cyber attack.”58
The sanctions froze the U.S.-based assets of the regime, government offi-
cials and agents, and officials of the Workers’ Party of Korea. The country
also suffered unspecified blackouts in its Internet service, an exclusive tool
of the regime.60 Overall, these mild penalties could not have bitten deeply
into the hardened skin of Juche’s princes.
But evasion of severe penalties is not the new weapons’ only attrac-
tion. Like nuclear arms, offensive code offers an awe-inspiring techno-
logical medallion that the regime can brandish, cavalierly and
disruptively, to distract and stir the pride of a pauperized people, who,
caught in the glare of the cult leader’s repeated acts of defiance against
the international order, cannot discern their own misery. Insubordination
in foreign cyberspace serves the purpose of stiffening submission in the
domestic political space.
FIRST-ORDER CYBER REVOLUTION
Thus the dilution of the states’ role – where dilution occurs – does
not mean that they are becoming irrelevant, as naïve observers have
sometimes claimed. Rather, the point is that the technology is partially
but notably diminishing states’ supremacy and even primacy. It empowers
actors residing outside the Conventional Model, even as the alien players
assert their influence over the traditional units and threaten their basic
purposes. It is, in short, a partial but still notable force of systems change.
This trend is evident in three ways. First, states are not the sole
relevant source of threats to national security; nontraditional players
such as political activists, criminal syndicates, and religious extremist
groups can also cause significant harm to governmental, financial, and
even military interests. Second, states are not the only, or in some cases
even the primary, national-security providers. In some areas of national
security, such as the operation of vital computer infrastructures or the
collection and encryption or decryption (in forensic investigations, for
instance) of personal data, the private sector is more important. Third,
the traditional units are no longer able to maintain full control over the
dynamics of conflict among them. Civilian culprits can disturb –
sometimes gravely – the fragile political framework of interstate affairs.
New departures in thinking about security are therefore necessary.
McAfee Labs surmised that the two artifacts were the work of the same
actors. “One thing for sure is that the Stuxnet team is still active,” they
concluded shortly after Duqu’s discovery.43
Flame is another possible successor to Stuxnet. It emerged in May
2012, when Iran’s computer emergency response team, MAHER,
discovered it in computers belonging to the country’s main oil facility on
Kharg Island.44 Machines in Lebanon, Egypt, Syria, Sudan, and the
Palestinian territories were also infected. According to MAHER,
various features of Flame put it in “close relation” to both Stuxnet and
Duqu.45 This raised the prospect of design emulation. Flame featured file-naming conventions similar to those of its two possible ancestors. More
revealing were the similarities in the propagation method. The virus
used different exploits to manipulate the same two programming defects
in Windows software – namely, a “special trick” to infect USB drives and
“privilege-escalation” weaknesses – that Stuxnet used to disperse itself.46
For this reason, according to Hathaway, Flame represents a possible attempt to “replicate” Stuxnet.47 Yet Flame’s code was more complicated
than Stuxnet’s.48 And unlike Stuxnet, Flame’s payload had no destruc-
tive intent. As a data-sweeping agent, it was more similar to Duqu, yet
Flame’s technical profile was distinct from Duqu’s in other respects.49
Conceivably, Flame represents a case of proliferation of either one or
both of its predecessors. Its similarities with Stuxnet and Duqu mean
that its creation may have involved some combination of line replication
and design emulation. At the same time, it was sufficiently distinct from
them to affect third-party systems.
One view holds that Flame was created by a distinct developer team –
an instance of horizontal proliferation.50 Researchers at the Budapest
University of Technology and Economics claimed that “sKyWIper
[Flame] and Duqu (Stuxnet) have many differences, and it seems
plausible that sKyWIper was not made by the same developer
team.”51 We may never know the facts. Duqu’s and Flame’s perpetrators
remain unidentified. For this reason, Kaspersky and Symantec have chal-
lenged the depiction of these cases as horizontal proliferation. “Stuxnet
and Duqu belonged to a single chain of attacks,” remarked Eugene
Kaspersky. “The Flame malware looks to be another phase in this war.”52
Regardless of the connection (or not) between Flame and other
malware, the weapon has its own proliferation contender: the Shamoon
virus. In August 2012, it struck the computer systems of the oil firm
Saudi Aramco, the world’s largest company by market value. U.S. Secretary of Defense Leon Panetta described the operation as “a signifi-
cant escalation of the cyber threat.”53 Shamoon was a self-propagating
agent. It manipulated infected computers to corrupt other nodes
connected to them via the company’s internal network, after which the
virus wiped the host machine’s master boot record. The attack’s direct
effects were astounding: almost thirty thousand machines were rendered
inoperable. The indirect effects may have been greater still. The attack
prompted the company to bar external access to its computer networks,
further disrupting corporate activities. Aramco’s vice-president for
corporate planning alleged that “the main target in this attack was to
stop the flow of oil and gas to local and international markets.”54 That
did not occur. Nevertheless, some analysts refuse to rule out such an
eventuality in the future.55
Shamoon’s crafters may have emulated some design elements of
Flame. According to technicians at Kaspersky Lab, the wiper payload
of the virus borrowed design principles from a similar component in
Flame-infected Iranian computers. Yet there were also notable differences. Shamoon used a different wiper pattern that lacked the distinctive service
names and driver filenames of Flame. It was also sloppier: “silly” flaws in
the code included the use of an erroneous date comparison and the
exchange of uppercase letters for lowercase ones in the code’s format
string, which impeded its execution.56
The identity of the perpetrators remains obscure. Surprisingly,
Kaspersky experts believe that Shamoon was likely a “copycat” artifact,
the work of “script kiddies inspired by the story [of Flame].”57 An
obscure jihadist militant group, the Cutting Sword of Justice, claimed
responsibility for the attack in protest against Riyadh’s regional foreign
policy. The date that is hardcoded into the body of the virus is the
same as the date on which the group claimed Aramco’s computers were
hit.58 Officials have openly wondered whether Shamoon was Tehran’s
reprisal for Flame.59 Saudi Arabia is a close regional security partner
of the United States and has vehemently advocated military action
against Iranian nuclear facilities. As one study of the operation observed,
Iran had both the geopolitical motive and the capability to carry
it out.60
Ambiguities about the attackers’ identity complicate the assessment
of whether Shamoon qualifies as horizontal proliferation. But if
to corporate workstations. They will not normally seize the prized sensi-
tive data of senior government officials.68 Nor will they instruct dams to
open, derail trains carrying hazardous materials, or blank out air-traffic-
control systems (all in principle achievable by weaponized code).69 But
they can still cause considerable damage, which can be significant even
by the high standards of national security. Culprits have used botnets to
interrupt essential computer functions at moments of acute international
tension. Most famously, they were the prime means of attack used against
Estonia’s financial and governmental infrastructures in the spring of
2007 after the government removed a Soviet war memorial, the statue
of “the Bronze Soldier,” which many citizens regarded as an occupation
monument, from central Tallinn to a military cemetery. In August 2008,
botnet commanders orchestrated DDoS attacks that disrupted the oper-
ations of Georgia’s national bank and government communications
systems during the country’s military invasion by Russia. More recently,
in 2016, they impaired the functions of Dyn, a company that manages
domain name services on which important commercial and media serv-
ices rely. The attacks, which like the others came in waves, prevented
millions of people in numerous countries from accessing popular
websites such as Reddit, Airbnb, Netflix, CNN, The New York Times, The
Boston Globe, The Financial Times, and The Guardian.
None of these attacks produced loss of life. None caused irreversible
damage to computer functions. Yet for about three weeks the attacks against Estonia intermittently convulsed this small nation’s governmental services and economic affairs such as online banking, a vital activity in a country whose population conducted 98 percent of financial transactions online. “An attack against the state” is how Estonian
Defense Minister Jaak Aaviksoo described the incident.70 The attacks
against Georgia hindered the government’s ability to organize national
defenses and procure essential war materiel while it was fending off a
Russian land invasion. The attacks against Dyn disrupted the daily lives
of millions of individuals around the globe.
In all of these cases the main agents behind the botnets were private
actors. There is reason to suspect some degree of Russian government
complicity in the Estonian attacks. But it is clear from the available
facts that their main perpetrators were private, not governmental.71
Anonymous private citizens participated in the campaign en masse
from multiple national jurisdictions, including from Estonian soil.
176 DEGREES OF THE CYBER REVOLUTION
The U.S. Director of National Intelligence, James Clapper, stated that a nonstate actor was behind the
attacks.75 The computer security firm Flashpoint corroborated Clapper’s
assessment, pointing out that the “Mirai” botnet infrastructure involved
in the attacks was popular among the video-gaming community, which comprises hackers eager to burnish their computational prowess.
Not ideology or politics but the vainglorious impulses of anonymous
denizens of the Internet’s dark quarters paralyzed broad sectors of the
Internet.76
Overall the picture of cyber arms proliferation to nonstate actors is
mixed. Customized weapons of the kind that can cause direct destruc-
tive effects to infrastructures are in principle the cheapest – costless, in
fact – to replicate precisely. Yet this method of proliferation is unlikely
to yield a payload that is reusable against a target other than the original.
Possibly the replicated weapon will prove useless later even against the
original target if its operators have patched the previously unknown
vulnerabilities on which the weapon relies to gain access or execute the
payload. Emulation of weaponized code is far more expensive, requiring
a significant redevelopment effort – the harvesting of new zero-day
weaknesses, the design of a highly customized payload against a new
target, the opening of precise access vectors into it, and so on. Yet the
impact factor of a successful emulation can be high, as the possible sons
of Stuxnet, Flame, and Duqu revealed. By contrast, a generalized weapon
such as a botnet DDoS tool imposes only moderate costs on the aspiring
proliferant. Many other actors have freely employed the services of
botnets. Their direct and indirect effects, however, are demonstrably less
significant than what is achievable with weaponized code.
Yet the main reason for preoccupation when considering the growing
influence of actors alien to the states system is not their capabilities,
which are conclusively lower than the power of the system’s masters,
but the nature of their aims, which can be more subversive and wilder
and which may clash with the purposes of states. “We’ve had this
disparity or contrast between the capability of the most sophisticated
cyber actors, nation-state cyber actors, which are clearly Russia and
China,” remarked Clapper, noting that these familiar opponents have
a “more benign intent.” But “then you have other countries who have a
more nefarious intent,” he continued, a possible allusion to revolutionary
states such as North Korea. “And then even more nefarious are non-
nation-state actors,”77 such as political hacktivists, criminal syndicates,
178 DEGREES OF THE CYBER REVOLUTION
Defense Fragmentation
A second manifestation of systems change involves the provision of
national security: governments are not the supreme, and in some cases
not even the primary, defenders against cyber threats in the way that
they are against conventional threats. Here, we must draw a distinction
between two basic forms of defensive activity: active and passive.
The label “active defense” broadly denotes the virtual weapon’s use
outside the defender’s or other friendly terrain to prevent or preempt a
hostile cyber action.78 This does not imply the use of any specific kind
of malware: the defending action can be exploitive, disruptive, or both.
Rather, it means that the activity transpires outside the home network
terrain for strictly defensive purposes.
States are the main players in the conduct of active defense. They
possess the legal prerogative to inflict harm on attacking machines
residing outside their jurisdiction. By contrast, most domestic penal
codes – for example, the U.S. Computer Fraud and Abuse Act – prohibit
private actors from carrying out this activity. Even when they are the
targets of a major impending offensive strike, it falls to the government,
not the victims, to conduct (or not to conduct) active defense.
Governments have good reasons to retain absolute rights over active
defense. But many voices, including some in government, have called for
an expansion of these rights to the private sector. We will deal with these reasons and the countervailing arguments in Chapter 9. For now, it suffices
to recognize that when it comes to cyber offense-as-defense, the govern-
ment’s role is as the framers of the Conventional Model would expect:
supreme.
Passive defense is more important and more common in the
cyber realm than active defense. Passive measures such as resiliency and
redundancy – the equivalents of underground shelters and target
dispersal in nuclear defense – aim to thicken the defensive glacis and
expressed a concern for the public good involving the use of one of its
products, but that the firm was in a position to adjudicate against the
government’s wishes between two seemingly incommensurable public
goods. In the end, the FBI resorted to the services of private hackers to
crack the device.86 The irony of the case’s conclusion is no boon to the
statist sentiment: the world’s most powerful government overcame
the resistance of company executives more powerful than itself in
dealing with a matter of national security significance only because the
state recruited or bought the sympathies of another private player. One
way or another, the private sector was supreme over the sovereign.
The influence of the private sector in the functions of cyberspace
ranges from relevance to primacy to supremacy. Even in the realm of
active defense, where the government’s prerogative is unquestionably
supreme, the private sector retains some relevance because the govern-
ment cannot operate fully in those quarters that it does not control or
because the government lacks access to private data that it requires to
tailor its foreign operations. The Apple-FBI saga shows that in core
areas of national security such as in the aftermath of a terrorist atrocity
inspired by foreign miscreants, the state cannot operate with full lati-
tude because it lacks access to sensitive private information or other
resources in the domestic cyberspace that it requires for its investiga-
tion. Private technology behemoths such as Apple can enable or deny
this access, often in defiance of law enforcement needs. When opposi-
tion in the boardrooms is unbending, the sovereign can lose not merely
its customary supremacy but also its primacy; it may remain relevant
only at the sufferance of other profit seekers who themselves are in a
position to arbitrate among clashing public interests.
Not only can private players use cyberspace to harm the interests of nations
or to hinder the organization of their own defenses, they can also disturb
the interactions among nations.
Cyber conflict can fit four basic frames of agent interaction: state-to-
state, in which one state targets another state’s strategic computer assets,
such as in the Stuxnet operation (this category includes the use by
government of obedient civilian proxies); private-to-state, which includes
cyberattacks by militant groups or hacktivists, such as in the Estonian
crisis; private-to-private, involving an exchange of attacks between
nonstate entities such as private companies; and state-to-private, in
which a state attacks the private computer systems of another nation,
possibly for commercial or other economic gain (see Table 1).
time that NATO ever invoked its collective defense clause, Article 5,
which stipulates that an armed attack on one alliance member is an
attack on all of them. Another important clause, Article 4, which
provides for formal consultations in situations where a member’s “terri-
torial integrity, political independence or security” is threatened, is less
well known, but it has also been invoked rarely – only five times in almost
seventy years. An invocation of one or both of these articles would repre-
sent a major diplomatic and even military development.98 Few people
understand that the Estonian crisis is the only situation other than the
9/11 attacks that immediately prompted a discussion within or among
NATO member states regarding the invocation of the two articles.
Hostilities in Estonian cyberspace quickly raised the matter of the arti-
cles’ applicability to this case. In the midst of the crisis, Harri Tiido, at
the time Estonia’s ambassador to NATO, recalls the alliance’s Secretary
General, Jaap de Hoop Scheffer, grabbing him by the arm while alliance
officials drafted a press release and asking whether it was his govern-
ment’s intention to invoke Article 4.
The answer from Tallinn was no: officials would not invoke either
clause. They concluded – wisely – that neither the intensity of the attacks,
which, though potent, did not involve physical destruction or loss of life,
nor their duration met the high bar for an armed attack. The desire of
decision-makers to shape the narrative of the attacks also factored into
the decision not to escalate internationally. “The events were interpreted
largely as a domestic disturbance – not an international incident,” said
Lifländer. “The Estonian government wanted this perception to prevail:
this was nothing but hooligans to be dealt with by law enforcement.
International escalation would have broken this narrative.”99
Yet escalation was in the air. According to Lifländer, who was on the
scene in Estonia, and Kaljurand, who was stationed in Moscow,
the collective defense implications were considered within
national decision-making circles. Tiido reported that in the month
following the attacks, the Estonian government brought the matter to
the attention of the alliance “first and foremost as a political issue,”
that is, “to make the point that cyberattacks can be as damaging as
conventional weapons.”100
In short order, then, a crisis that unknown private actors outside of
NATO territory intensified, if not also precipitated, placed the question
and the prospect of the ultimate diplomatic showdown – an activation of
A Cyber Breakdown?
There is something remarkable about Ene Ergma’s nuclear analogy
above: it betrays the belief that a major cyberattack could produce a
breakdown of modern society. She is not the only public official to
conjure images of a technologically induced cataclysm; one often hears
senior officials utter the slogans “Cyber 9/11” and “Cyber Pearl Harbor.”
Probably these are nothing more than rhetorical devices to raise public
awareness about a growing cyber threat that some quarters of society
still dismiss as uninteresting or unimportant. But could the virtual
weapon’s unrestrained power produce so catastrophic a disintegration of
modern society that the states system itself collapses?
Security planners at the Pentagon have explored the plausibility of
such a scenario. Possibilities have included a coordinated and rolling
attack on power grids; the malfunction of oil-refining facilities in Texas;
the disabling of hydraulic systems in Californian ports; and the blinding
of the air-defense radar systems that monitor U.S. air traffic.113 Others
have warned about the possible collapse of the computer infrastructure
that sustains the Internet. Some of the firms that operate this infra-
structure, such as Dyn and Verisign, which register top-level domains
(for example, .com, .net, .co.uk), have experienced a growing number
of DDoS attacks. Bruce Schneier warns that the attacks appear to
be probing operations: sophisticated state actors are calibrating their
weapons in ways that would enable concurrent attacks across multiple
vectors.114
dangers that private culprits will intervene in ways that accelerate the
crisis or that move it in a direction that the system’s old stewards do not
want to go.
Security policy, in short, now has to be conducted against and by not
only states but also a growing universe of other players of unclear origin
and identity. States can no longer conduct their foreign and defense
policies as if forces of discord emanating from beyond the states system
were suppressible merely by their own will. Ambassador Ducaru, who
works at the frontlines of policy planning against unconventional threats
to NATO, a cherished organ of the Westphalian order whose forms
reflect the finest prejudices of theorists, regards the growing ability
of the alien players to wield the virtual weapon against both states
and each other as “a fundamental change in the international threat
landscape” – even if they are not as mighty as nations, even if policy-
makers must preserve the official doctrine that the largest among the
ordinary units pose the gravest peril to national and regional security.
The partial forces of systems change – the most important of which
is what a senior official in the British Cabinet Office described as “the
democratization of threat actors”116 – produce opportunities for convergence between the world of states and the chaotic global universe that also includes nontraditional players. Convergence enables cooperation between states and nonstate actors who share certain goals and adversaries, and it can take a defensive or an offensive form.
Defensively, the aim may be to enhance cooperation between the
government and the private sector in the protection of either party’s
computer terrain. Offensively, the goal may be to collude in the design
of sophisticated attack tools targeting foreign machines.
The main distinguishing feature of the cyber revolution is that it may be the first technological revolution of the first order in the international system. That technology is also at least partly a primary, rather than merely a secondary, cause of change makes it all the more distinctive.
These developments have profound implications for the understanding
and practice of international order and security. Vanished is the secure
belief in the state as both the supreme source of threats to national
security and the supreme protector against them. A central feature
of the present technological revolution is that the state has become a
lesser agent in the management of its own security affairs, regardless
of the continued indispensability of governments in the organization of
PROBLEMS OF STRATEGY AND POLICY
CHAPTER 7
Doctrinal Problems
The ordinary goal of deterrence policy is to prevent an attack. Thus
it is futile to speak of “degrees” of success in deterrence: it either succeeds
or it fails. Failure is absolute. There is no reverting to a situation in
which the attack did not happen. Consider the nuclear realm of conflict
from which deterrence theory emerged. Initial attempts to enshrine
strategies of “limited war” in policy did not go far.1 Aversion to the loss
of a single American or Soviet city and its entire population convinced
the superpowers that an exchange of nuclear blows must be avoided at
all costs. With a quotient of horror measured in millions of lives, the
idea of limited losses is repugnant. All tactical amputations seem like a
strategic fatality. Thus the temptation of deterrence policy has been
to prevent all attacks unconditionally: to pursue deterrence for its
own sake.
This policy dogma remains entrenched in contemporary strategy
in other realms of conflict. Policymakers repeatedly emphasize the
need, as a senior official in the British Cabinet Office put it, “to increase
deterrence across the board in the cyber domain.”2 They stress the
goal of shoring up defenses to deny adversaries opportunities to compro-
mise networks – “all the way from protecting ordinary computer users
to protecting nuclear command codes.”3
Yet the strategy of total deterrence that has succeeded in the nuclear
realm fails in the cyber domain. Denial is notoriously difficult to achieve.
As we discussed in Chapter 2, sophisticated cyber operations are hard
to foil. Therefore, the emphasis of deterrence policy is on retribution:
preventing attacks by expressing a readiness to punish them with
maximum available force. The “equivalence principle” that features
prominently in American and British national cyber strategies stipulates
that a major attack may invite a military reprisal – a pledge that even a
proportionate response will be intense. As Britain’s Chancellor of the
Exchequer George Osborne defiantly stated in November 2015, “We
reserve the right to respond to a cyber attack in any way that we choose.”4
The prevailing deterrence posture in the cyber domain suffers from
two major deficiencies. The first involves the dangers of conflict escalation.
So far, the equivalence principle has worked – at least in deterring
acts of cyberwar.5 At the time of writing, there has been no cyberwar.
But the warning of severe retaliation creates pressures for an acceler-
ating crisis if an exchange of blows occurs. If the logic of penalties fails,
the price of a successful cyberattack is the risk of a spiraling war in the
conventional domain. In the high spectrum of action encompassing
destructive activity, therefore, the logics of deterrence and escalation
control are at odds. The logic of penalties in the cyber domain aggra-
vates the difficulties of conflict control in other domains. The new tech-
nology gives a new twist to the remark by Raymond Aron: “Everything
that increases the likelihood of escalation in advance contributes to
deterrence but also makes it, by definition, more difficult to limit war if
it breaks out after all.”6 Second, the posture fails in the middle range of
action: it has not succeeded in preventing acts of unpeace, the non-
violent but highly damaging activity described in Chapter 2. This
activity continues unabated; in fact, it has grown in scope and severity as
nations find new ways to employ virtual weapons to harm the national
interests and disrupt the internal political affairs of adversaries without
ever firing a shot – for instance, Russia’s reported hacking and publica-
tion of the defamatory email contents of American politicians during
the presidential election in 2016. It would be misguided to dismiss such
activity – drawing from Clausewitzian prejudices – as tolerable because
it falls short of traditional war or interstate coercion. Recall a chief
feature of the virtual weapon: it enables new forms of strategic effect
that do not involve violence.
THE DETERRENCE PUZZLE 197
even if an air gap envelops the target system. In 2009, the Stuxnet worm
reportedly bridged the wall of air at the Natanz nuclear facility in Iran
by infecting the personal machine of an unsuspecting plant operator.11
The sheer speed of malware means that it can travel the globe via the
Internet almost instantaneously. Then there are the problems of perma-
nent residency of malware in the defender’s system. Because the payload
is an intangible with almost no physical signatures, and because it
exploits zero-day coding vulnerabilities that the defender does not know
about or has not patched, the presence of advanced malware may be
impossible to detect. This last factor is an especially worrisome feature
of cyberspace, for sophisticated intrusion provides the attacker with the
means to predict and thwart the defender’s defensive measures. Likelier
than the defender denying the attacker in transit is the attacker denying
the defender in situ.
Another problem is the difficulty of anticipating attack effects.
Direct effects are in principle reproducible and measurable. Technicians
can model them in a controlled laboratory setting if they possess rele-
vant data about the target system. For example, the U.S. National
Security Agency simulated the Stuxnet worm’s payload in a replicated
system in order to customize it to the exact specifications of the Natanz
plant’s industrial control system and to understand the direct effects
that the worm would unleash.12 The indirect effects, however, can vary.
Modern society relies on complex computer systems to support core
governmental and economic activities. A high degree of interconnected-
ness exists both between these systems and between the external
functions they support. Consequently, a major cyberattack against one
system can cause cascading harm that impairs vital activities across a
broad range of industries and jurisdictions. These indirect effects are
largely unknowable before the attack occurs; it is likely they are impos-
sible to model afterward as well. Furthermore, the fragmentation of
responsibility to secure vital infrastructures within and across the public
and private sectors will limit the recuperative powers of society during
an emergency. The impossibility of understanding, and thus of preparing
for, the full indirect effects of a cyberattack complicates denial by means
of redundancy and resilience; both will be more difficult to attain than
heretofore.
Deterrence by punishment – preventing a major cyberattack by
the pledge of severe penalties – is more promising.13 The equivalence
system from which they were seized, the opponent can use this informa-
tion either for purposes of industrial espionage or to design weaponized
code. Consider the PLA’s massive hacking operation against American
computer systems that security analysts uncovered in 2013. “We know
foreign countries and companies swipe our corporate secrets,” remarked
U.S. President Barack Obama at the time. “Now our enemies are also
seeking the ability to sabotage our power grid, our financial institutions,
our air-traffic controllers.”26 The discovery of exploitative malware
in vital infrastructures raises the question: is its purpose intelligence
gathering, destruction, or both? Valid as the distinction between attack
and exploitation may be from a conceptual and legal perspective, doubts
about the ultimate aim of exploitation, and the risk that the defender
misreads it as the initial phase of an attack, may lead to unnecessary
preemptive action.
Two steps can go a long way to reducing these escalatory dangers.
One is to specify clearer thresholds of attack beyond which the equiva-
lence principle applies. To be sure, the current posture of declaratory
ambiguity has advantages. As Thomas Schelling, a maven of nuclear
conflict studies, argued, an uncertain promise of retaliation is more
effective than a certain one.27 A reasonable adversary with access to
capable analytical resources may correctly infer from public statements
and previous experience where those lines exist – most of the time. For
example, the absence of a punitive response from NATO in the aftermath
of the 2007 attacks on Estonia showed Moscow that the allies
drew the equivalence line well above even a large, politically motivated
DDoS attack that paralyzes a nation’s economic and financial activities.
This does not, however, reveal where the line actually lies. Alliance and
member state officials have persistently refused to draw it any more
clearly. At a recent summit, NATO Secretary General Jens Stoltenberg
stated: “A severe cyber attack may be classified as a case for the alliance.
Then NATO can and must react.” But he then added ambiguity to this
declaration: “How, that will depend on the severity of the attack,” he
said, without specifying degrees of severity.28 And if loss of life is a
necessary criterion of equivalence, what number of deaths signifies an
armed attack has occurred? Western adversaries will be forgiven if they
are left guessing whether a destructive attack on a civilian power grid or
transportation system is a cause to activate the alliance’s collective
defense clause.
but a series of actions; not one-off effects but cumulative effects. Factors to
consider in determining the accumulation of damage are the intensity of
harm (physical versus intangible effects), the timescale of harm (prolonged
accumulation or acute accumulation), and the range of harm inflicted
upon friendly interests (damage to national interests only versus damage
to allies or an alliance). Uri Tor has proposed a similar notion, “restrictive
cumulative deterrence,” which accepts the inevitability of cyber aggres-
sion. This view, however, does not stray far from the received paradigm,
for it entails “attacking the rival repeatedly in response to specific behav-
iors, over a long period of time.”43 By contrast, the approach developed
below prescribes a punctuated regime of punishment: not continuous
reprisals for persistent and sometimes simultaneous actions – an approach
that saps the political will and disables the bureaucratic ability to respond
– but a graduated scheme in which penalties are meted out over time and
at a moment of the defender’s choosing.
Drawing from the example above, punctuated deterrence would
consider not merely Russia’s action in the given electoral context, but also
its preceding sub-threshold actions, possibly including actions against
allied parties – for instance, the attempt to subvert the electoral process of
friendly nations or allies. The aim of this approach is to increase the
attacker’s expectation of the costs that the victim imposes on him
following the attack (Ha), as expressed in the classical deterrence formula
in Table 2 below. (It is also possible, of course, to diminish the attacker’s
expected benefits (Ba) and the harm that he expects to inflict (Hv) by
way of a counter-information warfare strategy that seeks to dispel
falsities.)
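The cost–benefit logic behind these symbols can be written in the standard form of rational deterrence theory. What follows is a hedged sketch using the variables named in the text; the author’s exact notation appears in Table 2, which may differ in form:

```latex
% A sketch of the classical deterrence condition, using the symbols
% named in the text; the exact form appears in Table 2 and may differ.
% Deterrence holds when the costs the attacker expects the victim to
% impose on him (H_a) outweigh his expected benefits (B_a) together
% with the harm he expects to inflict on the victim (H_v):
\[
  H_a \;>\; B_a + H_v
\]
% Punctuated deterrence raises H_a by pricing the adversary's hostile
% actions as a cumulative series rather than as isolated incidents;
% a counter-information strategy instead lowers B_a and H_v.
```

On this reading, the two strategies are complementary: one raises the left-hand side of the inequality, the other shrinks the right.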
The accretional principle of punctuated deterrence faces a major
psychological barrier: the adversary may not perceive that his actions
constitute a coherent series of moves, even if their damaging conse-
quences accumulate coherently in the eyes of the victim. The possibility
of misperception will be especially pronounced if different elements of
the adversary’s decision-making apparatus pursue disparate actions and if
it lacks a single policy framework to tie them together. In this case, the
burden of supplying a framework of cumulative penalties falls on the
defender. Diplomatic and signaling procedures will be required to demon-
strate to adversaries that for the purposes of punishment, individual
hostile actions are regarded as a comprehensive package of offensive
activity rather than being disparate actions that merit isolated responses.
210 PROBLEMS OF STRATEGY AND POLICY
RUSSIA AND CYBERSPACE 213
leading to the closure of the institution’s networks for two weeks.3 Yet
none of these incidents was of such a magnitude as to disturb interstate
dealings, even if they affected core national security interests. Oracles of
strategy had warned of these dangers with the dire slogan: “Cyberwar is
coming!”4 Few heeded the call until the Estonian crisis – a demonstra-
tion of new technological potential as significant for the cyber age as the
explosions over Hiroshima and Nagasaki were for the nuclear era.5
Against this backdrop, the Estonian spectacle had an inaugural signifi-
cance: it launched an era of unpeace – a state of incessant, intolerable,
but so far largely unpunished activity into which we have since been
dragged only further.6
Security planners and foreign policy specialists began at last to awaken
to the gravity of the cyber threat. Before the events in Estonia, no country
had published a dedicated cybersecurity strategy; today dozens of coun-
tries have done so. No Western intelligence agency had rated the cyber
threat as a central concern of national security; presently they regard it as
one of the most pressing issues, if not the foremost menace. In 2007,
basic questions of security doctrine converged upon the world of
diplomacy. How can nations deter major foreign attacks on essential
infrastructures? What techniques can reduce the intensity of a cyber
conflict following a failure to prevent it? Do the laws of war apply to the
restraint of such conduct? How can governments stem the acquisition of
weapons whose transmission is nearly costless and instantaneous because
the payload is an intangible?
Yet for the nation whose citizens fired this virtual weapon, the move
was no leap into unknown doctrine. Like the waves of cyberattacks that
ensued against Georgia, Ukraine, and other nations that Moscow
perceived as adversaries, the campaign against Estonia in 2007 repre-
sented a refinement of Russia’s century-old doctrine of information
warfare,7 whose central purpose is to disorient the enemy’s leadership,
to diminish his capacity for cohesive action, to undermine his confi-
dence in the integrity of his political institutions – in short, to weaken
the enemy in his own perception. The doctrine’s expansion to account
for the new possibilities of cyberspace, including the disruption of
foreign elections, as the world witnessed during the American presiden-
tial contest in 2016, represents a fusion of two elements: the classical
techniques of disinformation in the battle over human minds; and the
use of information as force in the battle over machine functions. And
enemy has, no matter how well developed his forces and means of armed
conflict may be,” declared Gerasimov, “forms and methods for over-
coming them can be found.”27 Thus the doctrine’s main purpose is not
to augment war, though this may be so in limited instances, but to
circumvent it by opposing the adversary where his vulnerability is
greatest and his doctrinal understanding retarded: in the disruption of
the open and sometimes fragile information spaces of liberal democra-
cies. The very openness that Western nations believe makes their
political system superior to all others is, in the Russian perception, a
source of weakness.
The Russian notion of “reflexive control” follows this thinking.28
Timothy Thomas defined the notion as “a means of conveying to a
partner or an opponent specially prepared information to incline him to
voluntarily make the predetermined decision,”29 one that is advanta-
geous to Russia. The program has two faces. One involves influencing
the minds of elites and publics abroad in order to lure the adversary into
a limited confrontation that Russia desires but does not wish to initiate
or be seen to initiate. This approach was on display in the Russo-
Georgian War of 2008. As James Sherr related, “Putin primed the
mechanism for war” by preparing belligerent action but “was assiduous in
ensuring that [Georgian President] Saakashvili started it.”30 Surreptitious
acts of aggression by Russia or its local proxies combined with false news
reports and statements to needle Georgia into invading the disputed
territory of South Ossetia, which in turn prompted Russia’s larger inva-
sion of Georgia. Moscow maneuvered within Georgia’s information
space to manufacture a conflict that sapped NATO’s political will to
welcome the country into the alliance.
A second face of reflexive control works in the opposite direction: it
employs information warfare to shift the adversary’s foreign and security
policy toward a less confrontational, even amicable relationship with
Russia. The goal is to draw the other side into a course of action that is
not in his interests, but rather than needling, the method is wheedling.
This approach was on display in the struggle over the control of Ukraine’s
eastern region from 2014 to the present. Perhaps a torrent of social
media and news postings favorable to Russia and its magnificent leader
would erode the adversaries’ will to penalize it for its aggressions. To this
effect, Russia has employed “troll armies” comprising countless anonymous hirelings who counter remarks critical of it.31 One former recruit
the campaign had declared the NATO alliance “obsolete” while praising
Putin as a strong leader of his nation.37
This cyber operation combined two techniques: a classic exploitation
action involving the seizure of private records of secret conversations
among politicians; and kompromat, or the public divulgence of exfil-
trated data that is timed to inflict reputational harm on a political or
strategic target. The first technique was an intelligence-gathering
activity which large nations routinely carry out against each other and
that no international treaty prohibits. The second move was stunning
and more controversial, for it marked the first time in history that a state
applied exploitative methods to inflict reputational harm on foreign
politicians in order to disrupt and possibly subvert their country’s elec-
toral process. Drawing from the technical concepts described in Chapter
1, the operation was not a cyberattack because it produced no notable
direct effects upon the compromised machines. But we can regard what-
ever effects the kompromat action exerted on the election outcome and
on the political system generally as indirect results. These effects are
conceivably more important. Recall a general claim of this book: more
important than the direct effects that code exerts upon machines are the
indirect effects it generates within the political and social world.
What were the indirect effects in this case? Wasserman Schultz’s remarks caused
a fury within large quarters of her party’s base. Particularly incensed
were Sanders’ supporters, many of whose allegiance to the candidate
surpassed their commitment to the party. Party officials were meant to
observe a strict neutrality in their posture toward the candidates for
nomination. Never mind that the perception of prejudice against
Sanders already existed; the emails proved that the insistent rumors
were fact. Sanders urged his supporters to forgive the party whose chairwoman had privately betrayed him. But the Democratic National
Convention, which began only days after the hacking fiasco, was at
times a scene of disorder. Insurrectionists within the Sanders faction
met his earnest supplications for support of Clinton with boos. An
untold number of them proclaimed that they would not vote for the
designated party candidate, Clinton, in the election. A larger number of
other party adherents who had only warily supported her until then may
not have materialized at the booths on election day.38 Four months later
Clinton lost the election, one of the closest on record. Winning the
popular vote by almost three million votes, she succumbed to Trump in
the contest for the Electoral College by small margins (tens of thou-
sands of votes) in crucial states that sophisticated but unreliable
electoral models had projected were hers to take. Clinton and some
commentators claimed that the DNC twist, combined with the contro-
versy over FBI Director James Comey’s ongoing investigation of her use
of a private email server at home while serving as Secretary of State,39
denied Clinton an otherwise safe victory against her highly controver-
sial Republican opponent.
Every honest historian knows that it is impossible to prove counter-
factual arguments. The causes of Clinton’s loss are multifactorial; the
evidence that the WikiLeaks embarrassment drained her of supporters
circumstantial.40 The skeptic will carp that the Russian maneuvers did
not conclusively derail Clinton’s path to the White House. Nor did they
directly damage the United States’ military or other significant material
interests. This thinking is misguided. Even if the information opera-
tions did not deliver the presidency to Trump, their effects on the West’s
relations with Russia and on debates about the integrity and survivability
of democratic electoral processes are greater than those that even some
armed conflicts have produced. Anyone who witnessed the tense drama of the
election will recall the fierce controversy that erupted in the aftermath
of the hacking revelations. Reince Priebus, then Wasserman Schultz’s counterpart at the Republican National Committee, offered the taunting
statement: “Today’s events show really what an uphill climb the
Democrats are facing this week in unifying their party. Starting out
the week by losing your party chairman over longstanding bitterness
between factions is no way to keep something together.”41 True, the
bitterness had existed, but not the scintillating personal controversies
and the large insurrection against the party leaders. The explosion of a
single meteor in cyberspace combined with events outside of it to shake
the celestial arrangement of American politics.
The truth is that we shall never know the answer to this political
mystery. But we must examine it further, for in it lie four clues to Russia’s
integration of cyberspace into modern information warfare and to revo-
lutionary trends in contemporary security affairs.
The first clue concerns complications of defense: the intruders pene-
trated the DNC’s email systems with apparent ease. Already in July
2015, Russian intelligence organs had reportedly gained access to their
target; the intruders remained undiscovered for at least eleven months.42
your life more difficult. This is why one sees offensive actions.”2 Yet
presently in the United States, Britain, and many other domestic juris-
dictions, the authority to implement active defense belongs exclusively
to the government. Top American officials have called for changes in
law and policy that would bolster the private sector’s use of active defense
techniques such as “strikeback” or “hackback” technology: in effect,
the arming of the civilian quarters of cyberspace.3 The main body of
government opinion has successfully resisted these calls – so far.
This chapter asks: what are the possible strategic and other conse-
quences of enabling the private sector to arm itself with active defenses?
Little or no systematic analysis of this question exists. The chapter
argues that while the potential defensive and other benefits of private
sector arms are significant, the risks to defenders, innocent parties, and
international conflict stability are notably greater.
But first, a clarification of key terms is in order. The label “private
sector” in this analysis denotes the entirety of nonstate groups and individuals who make up the economy and society and who are not under
direct state control but are possibly under its informal direction.
Conceptually, the difference between formal state “control” and informal
“direction” is subtle but crucial: the former implies membership of the
state; the latter, exclusion. On this basis, the private sector encompasses
some forms of proxy actors such as criminal syndicates or privately owned
firms (for example, Kaspersky Lab)4 that have established informal
working relationships with the government, but it excludes state-affiliated
civilian actors such as paramilitary militias (for example, Estonian Cyber
Defense League) or publicly controlled firms (Huawei).5 Consistent
with the framework of concepts introduced in Chapter 1, the term
“cyberweapon” or “arm” signifies the software and hardware instruments
necessary to carry out cyber exploitation or attack. The term “active
defense” – a contested and ambiguous notion – is broadly construed to
denote the use of such instruments outside the defender’s or other friendly
terrain to prevent or preempt attack. This interpretation of active defense
does not imply the use of any specific kind of cyberweapon; merely that
the activity transpires in extra-defensive terrains (more on this below).
The chapter has three sections: first, it defines the concept of active
defense; second, it reviews the current state of private sector active defense;
and, finally, it analyzes potential strategic benefits and risks associated
with the development of private sector arms.
PRIVATE SECTOR ACTIVE DEFENSE 231
Defensive Purpose
As the label implies, the aim of active defense is to enhance the security
of the defender’s assets: to deny proactively but not to penalize the
attacker. The attacker, by definition, is affected only if he engages or
prepares to engage or is perceived to engage the target. Thus the essence
of active defense lies in the eye of the defender. It entails the reasonable
perception – not necessarily the fact – of an adversary’s intention and
capability to attack. For this reason, retaliation to deter future attack
does not qualify as active defense unless it seeks to degrade the
attack sequence itself and transpires while the threat is still active.
Offensive activity that extends beyond the minimum threshold of
action necessary to neutralize an imminent threat or endures after
the threat has subsided also does not constitute active defense.
Here the criterion of imminence is debatable: does it include only
tactical or also broader strategic threats? History provides a clue. As
early as 1936, Japan presented a strategic threat to the United States by
Out-of-Perimeter Location
The “active” quality of the concept refers not to offensive activity, as
some thinkers suppose (see below), but to the activity’s out-of-perimeter
location. Passive measures are those the defender conducts within his
own terrain; active measures are those he conducts outside it – that is,
within adversarial or neutral terrain, including the terrain of innocent
parties whose computer identity or functions the attacker has usurped.
Tactical Flexibility
There is one sense in which common ambiguities in the meaning of
active defense are warranted: the concept implies nothing about the
scale, type, or intensity of the defender’s action. Tactically, active defense
may involve a variety of actions – intelligence collection, disruption
(including destruction), or some combination of the two. On this basis,
it is possible to conceive of three broad sorts of active defense: nondis-
ruptive, disruptive, or mixed (in other words similar to a “multistage”
cyberattack that involves both preliminary exploitation and subsequent
disruption).11 It is therefore imprecise to define active defense simply
as offensive action to defeat an ongoing attack, although some observers
suggest this interpretation,12 because the concept could, in fact, involve
entirely nondisruptive measures, such as the insertion of exploitative
beacons in enemy networks to capture threat data.
In sum, the chief distinguishing features of active defense are not the
scale, intensity, or form of activity but rather defensive measures of threat
neutralization – whether nondisruptive or disruptive or both – that a
defender implements outside his own or other friendly terrain. Table 3
summarizes and illustrates the differences between passive and active
defense.
and section (a)(5), which forbids the intentional use of computer code to
impair the operations of a protected computer system.13 Moreover, the
Federal Wiretap Act’s section 2511(2)(a)(i) forbids the unauthorized
interception or recording of electronic communication transiting between
machines. The fines for infringement of these rules can be severe.
The legal consequences of the recently passed Cybersecurity
Information Sharing Act for private sector active defense in the United
States are unclear. Possibly the bill will broaden the monitoring powers
of private actors, but only if they work in conjunction with government
authorities: in other words, as an informal arm of the state. Probably the
changes will not be drastic. Although the bill allows the deployment of
“countermeasures” that legitimately target threats and which damage
data or machines on other networks, legally such countermeasures must
be deployed within the defender’s own network. Any resulting damage to
external parties must therefore be unintentional.14 Thus CISA’s provision
for countermeasures does not satisfy the out-of-perimeter criterion of
active defense; it is beyond the scope of the present analysis.
There is little case law that elucidates the legal ramifications
associated with the use of private sector arms.15 Yet the prevailing legal
viewpoint is clear: the practice of active defense is unlawful – if only
because of the activity’s second defining characteristic, that is, the inten-
tional intrusion into or disruption of computers to which the defender
lacks authorized access. Some officials have vocally pressed for changes
in U.S. federal law that would allow the greater use of private active
defense.16 For now, however, the legal environment remains unequivo-
cally proscriptive.
A second viewpoint concerns policy: official opinion reflects and
supports the prevailing legal condition. The U.S. Department of Justice
strongly discourages exploitative active defense. One of its guidebooks
states:
Possible Benefits
The development of private sector arms may yield at least four positive
consequences: improvement of strategic depth; closer civil–military
a specific attack sequence is defeated but the attacker retains the ability
to redeploy, or it could be strategic – in other words, the attacker is
dissuaded from or deprived of the ability to attack the target again.
Exploitative tools could also support the government’s own threat-
monitoring and neutralization effort without themselves engaging in
disruptive action. For example, third-party threat-intelligence compa-
nies may sell their services to the government, thereby serving as
intermediaries between the victim and the government – an arrange-
ment that could help to preserve the victim’s anonymity.
A second advantage is enhanced civil–military integration. Western
societies face an acute shortage of workers trained in technical disci-
plines relevant to cybersecurity, such as computer science and software
engineering. The relevant skills base resides primarily in the private
sector. Large technology firms (for example, Google, Apple, Microsoft)
are able to offer salaries many times higher than military and security
agencies (USCYBERCOM, NSA, GCHQ) can match. “We are
competing in a tough marketplace against a private sector that is in a
position to offer a lot more money,” lamented the U.S. Secretary of
Homeland Security, Jeh Johnson. “We need more cybertalent without a
doubt in D.H.S., in the federal government, and we are not where we
should be right now, that is without a doubt.”29 Similarly, in Britain, the
government skills gap is so severe that former GCHQ Director Iain
Lobban said that his agency might have to employ non-nationals for a
brief period – that is, before they, too, are inevitably absorbed by the
private sector.30 Another drain on skills occurs when defense contractors
hire away the personnel of government agencies, only to sell their
services back to the government later.
Governments have reacted to this asymmetry in the technological skills
base in two ways: first, by attempting to assimilate civilian talent into
loose state structures such as military reserves; and second, by cooper-
ating with private technology providers to develop joint capabilities.
In the first approach, the government assumes a direct role in equip-
ping the private sector. It drafts, trains, arms, and retrains elements of
the civilian population in the methods of cyber operations. This may be
achieved by establishing a voluntary paramilitary defense force, such as
Estonia’s Cyber Defense League (Küberkaitseliit), a civilian defense
organization that supports the military and Ministry of Defense;31 or
by way of conscription, as in Israel’s Unit 8200, whose ranks include
drafted servicemen who after an initial term of service enter the army
reserves.32 This approach has achieved moderate success in small nations
such as Estonia and Israel, which have vibrant technological innovation
hubs and a popular tradition of mass conscription. But it has paid only
limited returns in large countries such as the United States and Britain
where the National Guard or Reserves and the Territorial Army often
fail to attract high-skilled elements of the civilian workforce.33
The second approach entails an extension of the concept of “private
military companies” (PMCs) into the new domains. PMCs provide
military and security services – even armed force – to the state or to
other private entities.34 This approach may be better suited to large
nations with sizeable private technology industries but poorly devel-
oped traditions of military service. It would, however, require a greater
commitment on behalf of participating companies to develop the sorts
of strategic and tactical technologies that governments need to achieve
national security goals. Some firms already provide the U.S. and other
governments with sophisticated surveillance tools such as tracking and
eavesdropping software.35 Few companies, however, have invested in
the other side of active defense – advanced disruptive tools – because
of the legal and policy prohibitions or because the business case for
doing so is not clear. Yet the private sector is well poised to develop
them. Cisco’s dominance of the router market, Google’s near monopoly
of online searches, and Microsoft’s preponderance in the sale of desktop
operating systems afford these firms tremendous (and legal) access to a
significant proportion of Internet traffic and global hardware compo-
nents. Some of this access is directly relevant to the harvesting of
zero-day vulnerabilities and to the design of access vectors and payloads
that governments require to mount sophisticated cyber operations.36
CISA’s relaxation of prohibitions against private sector exploitation
performed under government sanction may foster more cooperation of
this sort, although at present the structural incentives for such coopera-
tion are not clear.
In brief, some elements of the technological sector possess, merely by
their existence, a latent capacity to acquire sophisticated cyberweapons.
The development of private sector cyber arms under informal govern-
ment direction could enable governments to harness the civilian sector’s
technological prowess while avoiding the cumbersome organizational
costs of traditional military formations.
Possible Risks
The use of cyber arms by the private sector entails at least three risks:
foreign government penalties; innocent third-party harm; and inad-
vertent or accelerating international conflict. The last directly involves
state interests and is potentially the gravest.
can be very difficult to attribute; and they often use multiple neutral
machines and networks to access the target. Almost inevitably, there-
fore, active defense measures will impair to some degree the operations
or data of third-party computer users, either because the defender
misattributes the source of the attack to a machine that is in fact not
involved, the attacker having employed spoofing software that alters the
compromise indicators (for example, the IP address); or because
the defender correctly attributes the source or transit point of the attack
but the identified machine is itself innocent, the attacker having
hijacked it. And as the number of injured parties multiplies, the potential
for the conflict to accelerate and broaden grows.
The third type of danger is the gravest of all: inadvertent and esca-
lating conflict, or the possibility of unwanted international crises. Some
international relations thinkers have questioned the ability of private
actors to destabilize the dynamics of interstate security competitions.43
A world not far from the one in which we live challenges this view.
Extending the private sector’s ability to carry out active defense may
produce instability in the following ways, inter alia:
Cyber Futures
Introduction
1. The words are those of Nick Harvey, the United Kingdom’s Armed Forces Minister.
Nick Hopkins, “UK Developing Cyber-Weapons Programme to Counter Cyber War
Threat,” The Guardian, May 30, 2011.
2. Within international security studies, examples of skeptical thinking include the
following works: Erik Gartzke, “The Myth of Cyberwar: Bringing War in Cyberspace
Back Down to Earth,” International Security, Vol. 38, No. 2 (Fall 2013), pp. 41–73; Jon
R. Lindsay, “Stuxnet and the Limits of Cyber Warfare,” Security Studies, Vol. 22, No. 3
(2013), pp. 365–404; and Brandon Valeriano and Ryan C. Maness, Cyber War Versus
Cyber Realities: Cyber Conflict in the International System (Oxford: Oxford University
Press, 2015).
3. For the sake of simplicity, this work refers to cyberspace as a single technology – much
as observers often refer to nuclear enrichment facilities, warheads, delivery vehicles, and
so on as nuclear technology. But readers should note that cyberspace is, in fact, a series
of technologies – each complex in its own right – that encompasses servers, routers,
machine nodes, and various forms and layers of applications and software architecture.
4. See Cliff Stoll, The Cuckoo’s Egg: Tracking a Spy through the Maze of Computer Espionage
(New York: Doubleday, 1989), Epilogue.
5. See John Arquilla and David Ronfeldt, Cyber War Is Coming! (Santa Monica, CA:
RAND, 1993).
6. See Arnaud de Borchgrave, Frank J. Cilluffo, Sharon L. Cardash, and Michele M.
Ledgerwood, Cyber Threats and Information Security: Meeting the 21st Century Challenge
(Washington, D.C.: U.S. Department of Justice, Office of Justice Programs, 2001).
7. The cyberattacks against Estonian computer systems in the spring of 2007 thrust this
question upon national security planners.
8. See Marvin Minsky, The Society of Mind (New York: Simon and Schuster, 1988). The
notion of the comparability of the human and machine minds is rooted in the earliest
attempts to create artificial intelligence. In the 1960s, for instance, Frank Rosenblatt
sought to design a mechanical brain, Perceptron, which he described as “a machine which
senses, recognizes, remembers, and responds like the human mind.” Minsky and his
colleague Seymour Papert achieved fame by criticizing this design – ambitious for its
times – as severely limited because it could not solve even simple logical problems. See
Marvin Minsky and Seymour Papert, Perceptrons: An Introduction to Computational
Geometry (Cambridge, MA: The MIT Press, 1969); and Gary Marcus, “Is ‘Deep Learning’
a Revolution in Artificial Intelligence?,” The New Yorker (November 25, 2012).
9. On the distinction between direct and indirect cyberattack effects, see Lucas Kello,
“The Meaning of the Cyber Revolution,” International Security, Vol. 38, No. 2 (Fall
2013), p. 19.
10. See Lucas Kello, “The Virtual Weapon: Dilemmas and Future Scenarios,” Politique
étrangère, Vol. 79, No. 4 (Winter 2014–15), p. 6.
11. See Lawrence Lessig, Code (New York: Basic Books, 1999).
12. See Gregory J. Rattray, Strategic Warfare in Cyberspace (Cambridge, MA: MIT Press,
2001).
13. See Jack Goldsmith and Tim Wu, Who Controls the Internet? Illusions of a Borderless
World (New York: Oxford University Press, 2006).
14. See Martin C. Libicki, Conquest in Cyberspace: National Security and Information Warfare
(New York: Cambridge University Press, 2007); and Martin C. Libicki, Cyberdeterrence
and Cyberwar (Santa Monica, CA: RAND Corporation, 2009).
15. See Jason Healey, A Fierce Domain: Conflict in Cyberspace, 1986 to 2012 (Arlington, VA:
Cyber Conflict Studies Association, 2012).
16. See Michael N. Schmitt, ed., Tallinn Manual on the International Law Applicable to
Cyber Warfare (Cambridge: Cambridge University Press, 2013).
17. See Adam Segal, The Hacked World Order: How Nations Fight, Trade, Maneuver, and
Manipulate in the Digital Age (New York: PublicAffairs, 2015). Other important book-
length studies of cyber issues include Chris C. Demchak, Wars of Disruption and
Resilience: Cybered Conflict, Power, and National Security (Athens, GA: University of
Georgia Press, 2011); Derek S. Reveron, ed., Cyberspace and National Security: Threats,
Opportunities, and Power in a Virtual World (Washington, D.C.: Georgetown University
Press, 2012); and Martin C. Libicki, Crisis and Escalation in Cyberspace (Santa Monica,
CA: RAND Corporation, 2012).
18. See Valeriano and Maness, Cyber War Versus Cyber Realities, p. 61.
19. See Joseph S. Nye, The Future of Power (New York: PublicAffairs, 2011), Chapter 5; and
Nazli Choucri, Cyber Politics in International Relations (Cambridge, MA: MIT Press,
2012). Other works focus less on developing theoretical frameworks and more on
explaining and predicting empirical trends. See, for instance, Valeriano and Maness,
Cyber War Versus Cyber Realities.
20. See, for instance, Friedrich Kratochwil, “The Embarrassment of Change: Neo-Realism
as the Science of Realpolitik without Politics,” Review of International Studies, Vol. 19,
No. 1 ( January 1993), pp. 63–80.
21. A notable work in this regard is P. W. Singer and Allan Friedman, Cybersecurity and
Cyberwar: What Everyone Needs to Know (Oxford: Oxford University Press, 2014).
22. In this book, the label “international security studies,” or plainly security studies, denotes
the subfield of the discipline of international relations. Thus the label automatically
captures the theories, concepts, concerns, and aims of international relations. Where the
arguments of the book relate more generally to the discipline rather than the subfield,
the book uses the label “international relations.”
23. On the value of single case studies, see Alexander L. George and Andrew Bennett, Case
Studies and Theory Development in the Social Sciences (Cambridge, MA: MIT Press,
2005), pp. 81–82. See also Chapter 1 of this book.
24. See, for example, Défense et Sécurité nationale: Le Livre blanc (Paris: La Documentation
française, June 2008); A Strong Britain in an Age of Uncertainty: The UK National Security
Strategy (London: Cabinet Office, 2010); and James R. Clapper to the U.S. Senate
Intelligence Committee (Washington, D.C.: U.S. Government Printing Office, March
12, 2013).
25. See Donna Miles, “Stavridis Spotlights Top National Security Issues,” American Forces
Press Service, U.S. Department of Defense, March 15, 2012. See also comments by
Keith B. Alexander to the U.S. Senate Committee on Armed Services (Washington,
D.C.: U.S. Government Printing Office, April 15, 2010), p. 219; Joseph S. Nye, Jr.,
“Nuclear Lessons for Cyber Security?” Strategic Studies Quarterly, Vol. 5, No. 4 (Winter
2011), pp. 18–38; and Kello, “The Meaning of the Cyber Revolution.”
26. See, for instance, Kenneth A. Oye, ed., Cooperation Under Anarchy (Princeton, N.J.:
Princeton University Press, 1986).
27. Among American officials, the operation was known by the code name “Olympic
Games.”
12. See Tyler Moore, Richard Clayton, and Ross Anderson, “The Economics of Online
Crime,” Journal of Economic Perspectives, Vol. 23, No. 3 (Summer 2009), pp. 3–20.
13. See Sharon S. Dawes, “The Continuing Challenges of E-Governance,” Public
Administration Review, Vol. 68, No. 1 [Special Issue] (December 2008), pp. 586–602.
14. The number of scholarly publications that focus on international security aspects of the
cyber question is small. These works include Ronald J. Deibert, “Black Code: Censorship,
Surveillance, and Militarization of Cyberspace,” Millennium, Vol. 32, No. 2 (December
2003), pp. 501–30; Johan Eriksson and Giampiero Giacomello, “The Information
Revolution, Security, and International Relations: The (IR)relevant Theory?”
International Political Science Review, Vol. 27, No. 3 ( July 2006), pp. 221–44; Lene
Hansen and Helen Nissenbaum, “Digital Disaster, Cyber Security, and the Copenhagen
School,” International Studies Quarterly, Vol. 53, No. 4 (December 2009), pp. 1,155–75;
Mary M. Manjikian, “From Global Village to Virtual Battlespace: The Colonizing
of the Internet and the Extension of Realpolitik,” International Studies Quarterly, Vol.
54, No. 2 ( June 2010), pp. 381–401; Lucas Kello, “The Meaning of the Cyber
Revolution: Perils to Theory and Statecraft,” International Security, Vol. 38, No. 2 (Fall
2013), pp. 7–40; Erik Gartzke, “The Myth of Cyberwar: Bringing War in Cyberspace
Back Down to Earth,” International Security, Vol. 38, No. 2 (Fall 2013), pp. 41–73; and
Jon R. Lindsay, “The Impact of China on Cybersecurity: Fiction and Friction,”
International Security, Vol. 39, No. 3 (Winter 2014/15), pp. 7–47. See Introduction, p. 9.
15. See David E. Sanger, Confront and Conceal: Obama’s Secret Wars and Surprising Use of
American Power (New York: Crown, 2012), p. 291.
16. In the official statement, the last point came first. See Ben Gummer, “Government
Departments: Cybercrime: Written question – 55021,” UK Cabinet Office (December
5, 2016), http://www.parliament.uk/business/publications/written-questions-answers-statements/written-question/Commons/2016-11-28/55021/.
17. One poll in Britain found that three-fourths of firms did not report computer breaches
to the police, let alone the public. See Kate Palmer, “Businesses Keep Quiet over Cyber
Attacks, as EU Cracks Down on Underreporting,” The Telegraph (March 3, 2016),
http://www.telegraph.co.uk/business/2016/03/02/businesses-keep-quiet-over-cyber-attacks-as-eu-cracks-down-on-un/.
18. Stephen M. Walt, “Is the Cyber Threat Overblown?” Stephen M. Walt blog, Foreign
Policy (March 30, 2010), http://walt.foreignpolicy.com/posts/2010/03/30/is_the_cyber_threat_overblown. Elsewhere, Walt calls for systematic study of the cyber issue
by a “panel of experts.” See Stephen M. Walt, “What Does Stuxnet Tell Us about the
Future of Cyber-Warfare?” Stephen M. Walt blog, Foreign Policy (October 7, 2010),
http://walt.foreignpolicy.com/posts/2010/10/07/what_does_stuxnet_tell_us_about_the_future_of_cyber_warfare.
19. See Thomas G. Mahnken, “Cyber War and Cyber Warfare,” in Kristin M. Lord and
Travis Sharp, eds., America’s Cyber Future: Security and Prosperity in the Information Age
(Washington, D.C.: Center for a New American Security, 2011); and Thomas Rid,
“Cyber War Will Not Take Place,” Journal of Strategic Studies, Vol. 35, No. 1 (February
2012), pp. 5–32.
20. See Thomas Rid, “Think Again: Cyberwar,” Foreign Policy, Vol. 192 (March/April
2012), pp. 80–84.
21. Joseph S. Nye, Jr., “Cyber War and Peace,” Project Syndicate (April 10, 2012).
22. Rid, “Think Again: Cyberwar,” p. 84. Rid does not explain why sophisticated cyberattack should
not therefore concern lesser powers.
23. See Michael Howard, Clausewitz: A Very Short Introduction (Oxford: Oxford University
Press, 2002), p. 22.
24. Bruce Schneier, “Threat of ‘Cyberwar’ Has Been Hugely Hyped,” CNN ( July 7, 2010).
Similarly, Schneier also warned about the dangerous implications of the “militarization”
of cyberspace for the civilian control of the Internet and for the protection of user
privacy. See Schneier, “Militarizing Cyberspace Will Do More Harm Than Good,” The
Irish Times (November 29, 2012).
25. See Barry Buzan and Lene Hansen, The Evolution of International Security Studies
(Cambridge: Cambridge University Press, 2009), p. 12.
26. See Stephen M. Walt, “The Enduring Relevance of the Realist Tradition,” in Ira
Katznelson and Helen V. Milner, eds., Political Science: State of the Discipline (New York:
W. W. Norton, 2002), p. 220.
27. See Stanley Hoffmann, The State of War: Essays on the Theory and Practice of International
Politics (New York: Praeger, 1965), pp. 7–8.
28. The quote appears in Niall Ferguson, Kissinger, 1923–1968: The Idealist (London:
Penguin, 2015), p. 336 (emphasis mine). It is from a letter by Kissinger to Arthur
Schlesinger dated February 16, 1955. Shortly after, Kissinger developed his argument in
a book-length study, Nuclear Weapons and Foreign Policy (New York: Council on Foreign
Relations, 1957).
29. Holsti wrote: “power may be viewed from several aspects: it is a means, it is based on
capabilities, it is a relationship, and a process, and it can also be a quantity.” Kalevi J.
Holsti, “The Concept of Power in the Study of International Relations,” Background,
Vol. 7, No. 4 (February 1964), p. 182.
30. Nuclear Tipping Point, documentary film (2010).
31. See, for instance, Friedrich Kratochwil, “The Embarrassment of Change: Neo-Realism
as the Science of Realpolitik without Politics,” Review of International Studies, Vol. 19,
No. 1 ( January 1993), pp. 63–80.
32. Stephen M. Walt, “The Relationship between Theory and Policy in International
Relations,” Annual Review of Political Science, Vol. 8 (2005), pp. 41–42.
33. For a historical elaboration of the problem of technological revolution and strategic
adaptation, see Chapter 3.
34. Joseph S. Nye, Jr., “Nuclear Lessons for Cyber Security?” Strategic Studies Quarterly,
Vol. 5, No. 4 (Winter 2011), p. 19.
35. For an excellent account of the Stuxnet deliberations, see Sanger, Confront and Conceal,
Chapter 8.
36. This was “a huge amount of code” compared to other known malware. See Sharon
Weinberger, “Is this the Start of Cyberwarfare?” Nature, Vol. 474, pp. 142–43.
37. Despite the attack code’s sophistication, analysts have found “basic errors” in its design
and in the code itself. For example, Tom Parker observed that “the command-and-
control mechanism is poorly done and sends its traffic in the clear and the worm ended
up propagating on the Internet, which was likely not the intent.” Dennis Fisher, “Stuxnet
Authors Made Several Basic Errors,” Threatpost ( January 18, 2011), https://threatpost.com/stuxnet-authors-made-several-basic-errors-011811/74856/. See also Nate Lawson,
“Stuxnet is Embarrassing, Not Amazing,” rdist ( January 17, 2011), https://rdist.root.org/2011/01/17/stuxnet-is-embarrassing-not-amazing/#comment-6451. Despite
these criticisms, computer security specialists broadly regard Stuxnet as an ingenious weapon.
38. David Newsom, “Foreign Policy and Academia,” Foreign Policy, Vol. 101 (Winter 1995),
p. 66.
39. Although Stuxnet’s custodians sought to contain the worm within the Natanz facility,
thousands of external machines were infected (more than 40 percent of them outside
Iran).
40. An example of such new legislation is the Cybersecurity Information Sharing Act
(CISA) that was signed into law in December 2015.
41. James Blitz, “UK Becomes First State to Admit to Offensive Cyber Attack Capability,”
Financial Times (September 29, 2013).
42. Mark Pomerleau, “Carter Looking to Drop ‘Cyber Bombs’ on ISIS,” Defense Systems
(February 29, 2016).
43. See, for instance, Louis Klarevas, “Political Realism: A Culprit for the 9/11 Attacks,”
Harvard International Review, Vol. 26, No. 3 (Fall 2004), pp. 18–21.
44. Valeriano and Maness, Cyber War Versus Cyber Realities. Similarly, the authors state:
“Evidence and facts are needed in order to counter hype and bluster” about the cyber
threat (p. 209).
45. Sam Jones, “Ministry of Defence Fends Off ‘Thousands’ of Daily Cyber Attacks,” The
Financial Times ( June 25, 2015), http://www.ft.com/intl/cms/s/0/2f6de47e-1a9a-11e5-8201-cbdb03d71480.html.
46. Shaun Walker, “Kremlin Pours Cold Water on MI5 Chief ’s Claims of Russian Threat,”
The Guardian (November 1, 2016).
47. See Michael S. Schmidt, “New Interest in Hacking as Threat to Security,” The New York
Times (March 13, 2012).
48. See Jack Kim, “North Korea Mounts Long-Running Hack of South Korea Computers,
Says Seoul,” Reuters ( June 13, 2016), http://www.reuters.com/article/us-northkorea-southkorea-cyber-idUSKCN0YZ0BE.
49. See “Military Investigators Raid Cyber Command in Hacking Probe,” Yonhap News
Agency (December 13, 2016), http://english.yonhapnews.co.kr/northkorea/2016/12/13/0401000000AEN20161213006500315.html.
50. François Clemenceau and Antoine Malo, “Le Drian sur le cyberespionnage: La France
n’est pas à l’abri, il ne faut pas être naïf,” Le Journal du Dimanche ( January 7, 2017),
http://www.lejdd.fr/International/Le-Drian-sur-le-cyberespionnage-La-France-n-est-pas-a-l-abri-il-ne-faut-pas-etre-naif-837985#xtor=CS1-4.
51. See Amy Chozick, “Hillary Clinton Blames F.B.I. Director for Election Losses,” The
New York Times (November 12, 2016); and Jason Blakely, “Is Political Science this
Year’s Election Casualty?” The Atlantic (November 14, 2016).
52. The journal International Security, published by MIT Press, features primarily works of
international relations.
53. The quote is by the journalist Peter Passell. Joseph B. Treaster, “Herman Kahn Dies;
Futurist and Thinker on Nuclear Strategy,” The New York Times ( July 8, 1983).
54. The comment is by political scientist Roman Kolkowicz. Michael Intriligator, Roman
Kolkowicz, and Andrzej Korbonski, “Bernard Brodie, Political Science: Los Angeles,”
Calisphere, University of California (September 1979).
55. PhD in International Relations, University of Chicago.
56. Master’s in Mathematical Logic, Columbia University. Some observers regard mathe-
matics as a natural science; indeed, many university departments group it with physics,
chemistry, biology, etc. But the “Queen of the Sciences,” as the German mathematician Carl
Friedrich Gauss called mathematics, is also the basis for much non-scientific inquiry –
ranging from logic to aesthetics. At any rate, it is certainly not an engineering or tech-
nical science.
57. AB and PhD in Government, Harvard University.
58. AB and PhD in Government, Harvard University.
59. BA in Economics, University of California, Berkeley; PhD in Economics, Harvard
University.
60. This trend is reflected in the overtly technical tone of important works of military
tactics, such as Martin C. Libicki, Conquest in Cyberspace: National Security and
Information Warfare (New York: Cambridge University Press, 2007).
61. The proposed framework draws from, but also adapts, concepts introduced in William
A. Owens, Kenneth W. Dam, and Herbert S. Lin, eds., Technology, Policy, Law, and
Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities (Washington, D.C.:
National Academies Press, 2009).
62. See Richard A. Clarke and Robert K. Knake, Cyber War: The Next Threat to
National Security and What to Do About It (New York: HarperCollins, 2010), p. 69. For an
instructive, more technical review of some of these concepts, see Matthew Monte,
Network Attacks and Exploitation: A Framework (Indianapolis, IN: John Wiley and Sons,
2015).
63. See German Federal Ministry of the Interior, Cyber Security Strategy for Germany
(Berlin: German Federal Ministry of the Interior, February 2011), p. 14.
64. See Andrew Greenberg, “Hacker Lexicon: What is the Dark Web?,” Wired (November
19, 2014). The Dark Web is also not to be confused with the Deep Web, of which it is
a small part and which denotes the vast swathes of the Internet that search engines such
as Google do not index. See “Going Dark: The Internet Behind the Internet,” NPR
(May 25, 2014).
65. See, for instance, Jon R. Lindsay and Lucas Kello, “Correspondence: A Cyber
Disagreement,” International Security, Vol. 39, No. 2 (Fall 2014), p. 187.
66. Some working concepts omit social agents. See, for example, Nazli Choucri and David
Clark, “Cyberspace and International Relations: Towards an Integrated System,” paper
presented at the Massachusetts Institute of Technology, Cambridge, Massachusetts,
August 2011, p. 8.
67. See Lindsay and Kello, “Correspondence: A Cyber Disagreement,” pp. 188–92.
68. The logical layer comprises the service platforms on which computer systems and
networks function (e.g. software applications). The information layer includes the data
that flow between interconnected nodes. The physical layer comprises physical machines.
On the “layers” model, see Choucri and Clark, “Cyberspace and International Relations.”
69. An information security operation of this kind occurred in November 2015 in the after-
math of the Paris massacre perpetrated by Islamic State operatives. See Paul Mozur,
“China Cuts Mobile Service of Xinjiang Residents Evading Internet Filters,” The New
York Times (November 23, 2015).
70. Gary King, Jennifer Pan, and Margaret E. Roberts, “How Censorship in China Allows
Government Criticism but Silences Collective Expression,” American Political Science
Review, Vol. 107, No. 2 (May 2013), pp. 326–43.
71. For a gripping account of Russian Internet censorship generally and of this case partic-
ularly, see Andrei Soldatov and Irina Borogan, The Red Web: The Struggle between
Russia’s Digital Dictators and the New Online Revolutionaries (New York: PublicAffairs,
2015). See also Hal Roberts and Bruce Etling, “Coordinated DDoS Attack during
Russian Duma Elections,” Internet and Democracy Blog, Berkman Center for Internet
and Society, Harvard University (December 8, 2011).
72. For an excellent discussion of freedom of speech over the Internet, see Timothy Garton
Ash, Free Speech: Ten Principles for a Connected World (New Haven, CT: Yale University
Press, 2016).
73. See Steven M. Bellovin, Susan Landau, and Herbert S. Lin, “Limiting the Undesired
Impact of Cyber Weapons: Technical Requirements and Policy Implications,” unpub-
lished paper, p. 3.
74. See Luke Harding, “Top Democrat’s Emails Hacked by Russia after Aide Made Typo,
Investigation Finds,” The Guardian (December 14, 2016).
75. See Mohammad Tehranipoor and Farinaz Koushanfar, “A Survey of Hardware Trojan
Taxonomy and Detection,” IEEE Design and Test of Computers, Vol. 27, No. 1 (2010),
pp. 10–25; and Masoud Roustami, Farinaz Koushanfar, Jeyavijayan Rajendran, and
Ramesh Karri, “Hardware Security: Threat Models and Metrics,” ICCAD ‘13 Proceedings
of the International Conference on Computer-Aided Design, IEEE (November 18–21,
2013), pp. 819–23.
76. Therefore, in instances where the sole medium of entry to a target is the Internet or
some physical device, the weapon lacks a penetration element all its own.
77. See Cecilia Kang, “A Tweet to Kurt Eichenwald, a Strobe and a Seizure. Now, an
Arrest,” The New York Times (March 17, 2017).
78. On problems of global governance in the cyber domain, see, for example, Laura
DeNardis, Protocol Politics: The Globalization of Internet Governance (Cambridge, MA:
MIT Press, 2009); Joseph S. Nye, Jr., “The Regime Complex for Managing Global
Cyber Activities,” Global Commission on Internet Governance, Issue Paper Series, No. 1
(May 2014); and Lucas Kello, “Cyber Security: Gridlock and Innovation,” Beyond
Gridlock (Cambridge: Polity, 2017). Cyber governance, which deals with the manage-
ment of cybersecurity issues, especially cyberattacks and cyber espionage, should not be
confused with Internet governance, an older field that involves the management of the
globe’s computer network infrastructure.
79. See Thomas C. Reed, At the Abyss: An Insider’s History of the Cold War (New York:
Random House, 2005), Chapter 17.
80. The term “industrial controller” signifies computer systems that govern processes of
industrial production. It includes supervisory control and data acquisition (SCADA)
systems and programmable logic controllers (PLCs).
81. See Owens, Dam, and Lin, Technology, Policy, Law, and Ethics, pp. 1–2.
82. For technical details on Stuxnet’s destructive procedure, see Nicholas Falliere, Liam O.
Murchu, and Eric Chien, “W32.Stuxnet Dossier,” ver. 1.4 (Cupertino, CA: Symantec,
February 2011).
83. The standard usage can be relabeled as follows: “first-order” direct effects exerted on an
industrial controller; and “second-order” direct effects influencing machine parts
governed by it.
84. For a similar definition, see Nye, “Nuclear Lessons for Cyber Security?” p. 21.
85. On hacktivism as a modern form of political activism, see François Paget, Cybercrime
and Hacktivism (Santa Clara, CA: McAfee, 2010), pp. 10–12.
86. Bellovin, Landau, and Lin, “Limiting the Undesired Impact of Cyber Weapons,”
pp. 4–5.
87. See David D. Clark and Susan Landau, “Untangling Attribution,” in Proceedings of a
Workshop on Deterring Cyberattacks: Informing Strategies and Developing Options for U.S.
Policy (Washington, D.C.: National Academies Press, 2010), pp. 25–40.
88. See, for example, Alexander Klimburg and Heli Tirmaa-Klaar, Cybersecurity and
Cyberpower: Concepts, Conditions, and Capabilities for Cooperation for Action within the
EU (Brussels: European Parliament Directorate General for External Policies of the
Union, Policy Department, April 2011), p. 5.
89. Bruce Schneier, “When Does Cyber Spying Become a Cyber Attack?,” Defense One
(March 10, 2014), http://www.defenseone.com/technology/2014/03/when-does-cyber-spying-become-cyber-attack/80206/.
90. The common term is “advanced persistent threat,” or APT, which refers to an actor
(such as a large state) able to penetrate an adversary’s computer systems persistently and
successfully. I prefer the term “advanced persistent adversary” because APT emphasizes
the threat posed by the agent rather than the agent posing the threat, which is the more
important element.
91. Stephen M. Walt, “The Renaissance of Security Studies,” International Studies Quarterly,
Vol. 35, No. 2 (1991), p. 212.
92. See Adam P. Liff, “Cyberwar: A New ‘Absolute Weapon’? The Proliferation of
Cyberwarfare Capabilities and Interstate War,” Journal of Strategic Studies, Vol. 35, No.
3 ( June 2012), pp. 401–28; Thomas Rid, Cyber War Will Not Take Place (London: Hurst,
2013); and Erik Gartzke, “The Myth of Cyberwar: Bringing War in Cyberspace Back
Down to Earth,” International Security, Vol. 38, No. 2 (Fall 2013), pp. 41–73.
93. For a discussion of the shape of the cyber danger, see Chapter 2.
94. Brandon Valeriano and Ryan C. Maness, “The Dynamics of Cyber Conflict between
Rival Antagonists, 2001–11,” Journal of Peace Research, Vol. 51, No. 3 (2014), p. 355.
This analysis ignores nonstate actors unless “they are considered part of a state’s national
security apparatus, or if the initiators are clearly acting on behalf of their home govern-
ment” (p. 355). Thus it excludes a vast number of unaffiliated or subversive players. The
authors’ book-length study repeats this omission. See Valeriano and Maness, Cyber War
Versus Cyber Realities.
Force Press Service, U.S. Department of Defense (May 18, 2012), http://www.defense.gov/news/newsarticle.aspx?id=116394.
18. See President of Estonia Toomas H. Ilves, address given at the European Union
Ministerial Conference on Critical Infrastructure Protection, Tallinn, Estonia (April
27, 2009).
19. Thomas Rid, “Cyber War Will Not Take Place,” Journal of Strategic Studies, Vol. 35,
No. 1 (February 2012), p. 12.
20. See U.S. Cyber Consequences Unit (US-CCU), Overview by the US-CCU of the Cyber
Campaign against Georgia in August 2008, Special Report, US-CCU (August 2009),
http://www.registan.net/wp-content/uploads/2009/08/US-CCU-Georgia-Cyber-Campaign-Overview.pdf.
21. See “The Shamoon Attacks,” Symantec Official Blog (August 16, 2012); and
Christopher Bronk and Eneken Tikk-Ringas, “The Cyber Attack on Saudi Aramco,”
Survival: Global Politics and Strategy, Vol. 55, No. 2 (2013), pp. 81–96.
22. See David E. Sanger, “Obama Order Sped Up Wave of Cyberattacks against Iran,” The
New York Times ( June 1, 2012).
23. See Gabi Siboni and Zvi Magen, “The Cyber Attack on the Ukrainian Electrical
Infrastructure: Another Warning,” INSS Insights, No. 798 (February 17, 2016).
24. See Eneken Tikk-Ringas quoted in “Could Cyber Skirmish Lead to War?,” NBC
News ( June 11, 2010), http://www.nbcnews.com/technology/could-cyber-skirmish-lead-u-s-war-6C10406234.
25. Kim Chipman, “Cruz Says Russia, China Have Committed Acts of ‘Cyber War,’ ”
Bloomberg (August 7, 2015).
26. Oliver Laughland and Dominic Rushe, “Sony Pulling the Interview Was ‘a Mistake’
Says Obama,” The Guardian (December 20, 2014).
27. Richard Waters, “US Struggles to Find Response to Hack Attack on Sony,” The
Financial Times (December 21, 2014).
28. See Rid, “Cyber War Will Not Take Place.”
29. On calculated ambiguity in other domains of conflict, see Scott D. Sagan, “The
Commitment Trap: Why the United States Should Not Use Nuclear Weapons to
Deter Biological and Chemical Weapons Attacks,” International Security, Vol. 24, No. 4
(Spring 2000), pp. 85–115.
30. Siobhan Gorman and Julian E. Barnes, “Cyber Combat: Act of War,” Wall Street Journal
(May 30, 2011).
31. See Chapter 7.
32. See Congressional Record, U.S. Senate, Proceedings and Debates of the 114th Congress, First
Session, Vol. 161, No. 126 (August 5, 2015), p. S6338.
33. Aaron Boyd, “SecDef Nominee: Cyber Threats Require Holistic Defense Strategy,”
Federal Times (February 4, 2015).
34. National Research Council of the National Academies, Terrorism and the Electric Power
Delivery System (Washington, D.C.: National Academies Press, 2012), p. 16.
35. See Barack H. Obama, “Taking the Cyberattack Threat Seriously,” op-ed, Wall Street
Journal ( July 19, 2012).
36. Some officials speculate that Iran retaliated for Stuxnet with DDoS attacks against U.S.
financial institutions. See Senator Joseph Lieberman, interview on Newsmakers,
C-SPAN (September 23, 2012).
37. See Adam P. Liff, “Cyberwar: A New ‘Absolute Weapon’? The Proliferation of
Cyberwarfare Capabilities and Interstate War,” Journal of Strategic Studies, Vol. 35, No.
3 ( June 2012), p. 401.
38. See William J. Lynn III, “Defending a New Domain,” Foreign Affairs, Vol. 89, No. 5
(September 2010), pp. 97–108.
39. The term “PLC environment” denotes the PLC computers and the engineering stations
used to program them.
40. For a discussion of potential reproduction, in whole or in part, of the Stuxnet worm, see
Chapter 6.
41. See Thomas Rid, “Think Again: Cyberwar,” Foreign Policy, Vol. 192 (March/April
2012). A more nuanced argument about offense superiority in limited scenarios
involving specific organizational abilities and technologies appears in Rebecca Slayton,
“What Is the Cyber Offense-Defense Balance? Conceptions, Causes, and Assessment,”
International Security, Vol. 41, No. 3 (Winter 2016–17), pp. 72–109.
42. On the difficulties of cyber defense, see Stewart Baker, Natalia Filipiak, and Katrina
Timlin, In the Dark: Crucial Industries Confront Cyberattacks (Santa Clara, CA: Center
for International and Strategic Studies and McAfee, 2011); and John Arquilla,
“Cyberwar Is Already Upon Us,” Foreign Policy (February 27, 2012), www.foreignpolicy.com/articles/2012/02/27/cyberwar_is_already_upon_us.
43. Paul Roberts, “Update – Stuxnet Expert: Analysis Shows Design Flaw, Not Vulnerability
Sunk Siemens,” Threatpost ( January 19, 2012).
44. According to one report, the average detection time of zero-day attacks is approxi-
mately ten months. The median is eight months. See Leyla Bilge and Tudor Dumitras,
“Before We Knew It: An Empirical Study of Zero-Day Attacks in the Real World,”
Proceedings of the 2012 ACM Conference on Computer and Communications Security,
October 16–18, 2012, p. 834.
45. The problem of undetectable malware is reflected in the technical community’s
common fixation with the search for viable means of identifying APTs.
46. This figure is a simplification. The lag time between compromise and detection depends
on the class and effects of the hostile action. A higher figure applies to cyber exploitation
rather than cyberattacks. Indeed, some attacks – such as ransomware, which incapacitates
the target machine – may be discovered immediately. See 2016 Data Breach Investigations
Report, Verizon (April 24, 2016), pp. 10–11. The ensuing policy process, from the moment
investigators identified North Korea as the culprit to its public outing, took longer than
the interval between the initial discovery of the breach and the attribution to North
Korea.
47. Edward W. Krippendorf vs. United States of America, Office of Personnel Management; and
Keypoint Government Solutions, Case 1:15 cv 01321 (August 14, 2015), p. 25.
48. See Jessica Silver-Greenberg, Matthew Goldstein, and Nicole Perlroth, “Hackers’
Attack on JPMorgan Chase Affects Millions,” The New York Times (October 2, 2014).
49. See Danny Yadron, “Three Months Later, State Department Hasn’t Rooted Out
Hackers,” Wall Street Journal (February 19, 2015).
50. Andrea Shalal, “Nearly Every U.S. Arms Program Found Vulnerable to Cyber Attacks,”
Reuters (January 21, 2015).
51. Lorenzo Franceschi-Bicchierai, “FBI Says a Mysterious Hacking Group Has Had
Access to US Government Files for Years,” Motherboard (April 4, 2016).
52. In one case, the detection lag may have been about one year. See Kim Zetter, “Kaspersky
Finds New Nation-State Attack – In its Own Network,” Wired (October 6, 2015).
53. Franceschi-Bicchierai, “FBI Says a Mysterious Hacking Group Has Had Access to US
Government Files for Years.”
54. See Luke Harding, “Top Democrat’s Emails Hacked by Russia after Aide Made Typo,
Investigation Finds,” The Guardian (December 14, 2016).
55. See Sam Thielman, “Yahoo Hack: 1bn Accounts Compromised by Biggest Data Breach
in History,” The Guardian (December 15, 2016).
56. Robert S. Mueller, III, “Remarks at RSA Cyber Security Conference, San Francisco,
CA” (March 21, 2012), https://archives.fbi.gov/archives/news/speeches/combating-
threats-in-the-cyber-world-outsmarting-terrorists-hackers-and-spies.
57. Baldwin’s original dictum was: “The bomber will always get through.” Yet the inventive
acuity of British scientists proved his dire prediction wrong, as the Royal Air Force’s
victory against the Luftwaffe during the Battle of Britain in 1940 showed. See Chapter
3 in this volume.
58. See Matthew Monte, Network Attacks and Exploitation (Indianapolis, IN: Wiley, 2015),
p. 150.
59. See Thomas C. Schelling, Arms and Influence (New Haven, CT: Yale University Press,
1966).
NOTES to pp. 71–7 269
60. According to sources, the data breach, which involved the exfiltration of several
terabytes of information, occurred in 2007 and 2008. See Daniel Nasaw, “Hackers
Breach Defences of Joint Strike Fighter Jet Programme,” The Guardian (April 21,
2009).
61. The diplomatic dispute over the control of the Spratly and Paracel Islands and other
island archipelagos in the South China Sea has a long and complex history. The islands
underwent military occupations by various powers, the most recent of which was
China’s stationing of an advanced surface-to-air missile system (HQ–9) on Woody
Island in February 2016 – a move that the United States and regional countries have
denounced. The Philippines government has taken its dispute with China over the
Spratlys to the Permanent Court of Arbitration in The Hague, which in July 2016 ruled
against China’s historic claims to the islands’ resources.
62. John J. Mearsheimer, Conventional Deterrence (Ithaca, N.Y.: Cornell University Press,
1983), p. 26.
63. See Amir Efrati and Steve Nellis, “Inside Apple’s Cloud Infrastructure Troubles,”
Business Insider (March 23, 2016).
64. See “Chinese Government Bans Windows 8 from its Computers,” The Guardian (May
20, 2014).
65. An example of this effort is the proposal for a new legislative bill (H.R.5793 – Cyber
Supply Chain Management and Transparency Act of 2014, 113th Congress, 2013–14)
requiring all private contractors that supply software or hardware to the U.S. govern-
ment to provide “a bill of materials of all third party and open source components used,”
demonstrate that those components do not have known vulnerabilities, provide “secure
update mechanisms” when a new vulnerability is detected, and supply remediation
“within a reasonable specified time.”
66. In 2010, Google announced that sophisticated Chinese agents had breached its systems,
and in 2011, unknown parties compromised RSA’s authentication products. This was
followed by attempts to penetrate computers at Lockheed Martin, an RSA client.
67. See Sean M. Lynn-Jones, “Offense-Defense Theory and its Critics,” Security Studies,
Vol. 4, No. 4 (Summer 1995), p. 665.
68. Brian Groom, “Ministers Warn on Threat from Cyber Attacks,” The Financial Times
(September 4, 2012).
69. Robert O’Harrow, Jr., “Understanding Cyberspace Is Key to Defending against Digital
Attacks,” Washington Post (June 2, 2012).
70. See Lucas Kello, “The Meaning of the Cyber Revolution: Perils to Theory and
Statecraft,” International Security, Vol. 38, No. 2 (Fall 2013), p. 22.
71. Brandon Valeriano and Ryan C. Maness, “The Coming Cyberpeace: The Normative
Argument against Cyberwarfare,” Foreign Affairs (May 13, 2015).
72. See Michael Riley, “How Russian Hackers Stole the Nasdaq,” Bloomberg (July 17, 2014).
73. See Dave Majumdar, “America’s F–35 Stealth Fighter vs. China’s New J–31: Who
Wins?” National Interest (September 25, 2015).
74. Josh Rogin, “NSA Chief: Cybercrime Constitutes the ‘Greatest Transfer of Wealth in
History,’” Foreign Policy (July 9, 2012).
75. James Lewis, “Significant Cyber Incidents Since 2006,” Center for Strategic and
International Studies (August 2016).
76. John Barrasso (chairman), Mary Fallin, and Virginia Foxx, Republican Platform 2016
(Cleveland, OH: Consolidated Solutions, 2016), p. 53.
77. James R. Van de Velde, “War in Peace,” American Interest (September 6, 2016).
78. Fergus Hanson, “Waging War in Peacetime: Cyber Attacks and International Norms,”
The Interpreter (October 20, 2015).
29. See Thomas Berger, “Changing Norms of Defense and Security in Japan and Germany,”
in Peter J. Katzenstein, ed., The Culture of National Security: Norms and Identity in World
Politics (New York: Columbia University Press, 1996), pp. 317–56.
30. See Stephen M. Walt, Revolution and War (Ithaca, N.Y.: Cornell University Press, 1996).
Nevertheless, advances in communications technology – not least the Internet and social
media – may create conditions in which political winds of domestic revolution blow
rapidly and unexpectedly across national borders. For a discussion of this phenomenon
in the context of the Arab Spring, see Philip N. Howard, Aiden Duffy, Deen Freelon,
Muzammil M. Hussain, Will Mari, and Marwa Maziad, “Opening Closed Regimes:
What Was the Role of Social Media during the Arab Spring?” (2011), available at
SSRN: https://ssrn.com/abstract=2595096 or http://dx.doi.org/10.2139/ssrn.2595096.
31. See Walt, Revolution and War.
32. Bull, The Anarchical Society, p. 25.
33. For an elaboration of this point, see John Vincent, Human Rights and International
Relations: Issues and Responses (Cambridge: Cambridge University Press, 1996).
34. Bull, The Anarchical Society, p. 22. See also Hedley Bull, Justice in International Relations:
Hagey Lectures (Waterloo: University of Waterloo, 1983), p. 13. This understanding of
the connection between world and international society is similar to Locke’s social-
contract theory, in which the legitimacy of sovereignty is thought to derive from the
ability of the state to protect its citizens’ natural rights. Other thinkers, such as Martin
Wight and Andrew Linklater, view the relationship between state sovereignty and indi-
vidual moral claims as oppositional. See Martin Wight, International Theory: The Three
Traditions (London: Leicester University Press, 1991); and Andrew Linklater,
“Citizenship and Sovereignty in the Post-Westphalian State,” European Journal of
International Relations, Vol. 2, No. 1 (March 1996), pp. 77–103.
35. Gilpin, War and Change in World Politics, p. 211. Thucydides himself claimed to have
written for posterity. See also Barry Buzan and Richard Little, International Systems in
World History: Remaking the Study of International Relations (Oxford: Oxford University
Press, 2000), p. 2. For other theories of international change that conform to rational-
choice assumptions, see Paul Kennedy, The Rise and Fall of the Great Powers: Economic
Change and Military Conflict from 1500 to 2000 (New York: Vintage Books, 1987); and
Charles Kindleberger, The World in Depression, 1929–1939 (Berkeley, CA: University of
California Press, 1996).
36. The analysis in this section combines procedural reforms, or what Gilpin termed
“interaction changes,” with larger changes in the international system’s material struc-
ture. Together, these two elements, which Gilpin treats separately, are the main features
of systemic disruption.
37. Bull, Anarchical Society, p. 311.
38. Marx’s theory of the state is rooted in his understanding of the state as a guarantor of
private property and, therefore, his expectation that a classless society would also be
stateless. In pursuing this vision, Lenin in fact violated it: he emphasized the necessity
of an intermediary stage, “the dictatorship of the proletariat,” in which the state’s influ-
ence on economic and personal life was supreme. See Friedrich Engels and Karl Marx,
The German Ideology, second edition (London: Laurence and Wishart, 1974); and
Vladimir Lenin, The State and Revolution: The Marxist Theory of the State and the Tasks
of the Proletariat in the Revolution (London: Union Books, 2013).
39. See Chapter 5.
40. See Andrew Hurrell, On Global Order: Power, Values and the Constitution of International
Society (Oxford: Oxford University Press, 2007), p. 69. For an intellectual history of war
and peace in European thought, see Michael Howard, War in Modern History (Oxford:
Oxford University Press, 1976); and Justine Lacroix and Kalypso A. Nicolaidïs, eds.,
European Stories: Intellectual Debates on Europe in National Contexts (Oxford: Oxford
University Press, 2011). On modern peace theories and proposals, see F. H. Hinsley,
Power and the Pursuit of Peace: Theory and Practice in the History of Relations between
States (Cambridge: Cambridge University Press, 1967).
41. See Immanuel Kant, Perpetual Peace and Other Essays on Politics, History and Morals,
trans. T. Humphrey (Indianapolis, IN: Hackett, 1983).
42. See Hurrell, On Global Order, p. 69.
43. This form of moderate federalism differed from the vision of radical federalists such as
Altiero Spinelli, Carlo Sforza, and Helmuth von Moltke, whose central concern was the
wholesale eradication of the nation state within a single European entity. See Walter
Lipgens, A History of European Integration (Oxford: Oxford University Press, 1982);
and Walter Lipgens and Wilfried Loth, eds., Documents on the History of European
Integration: The Struggle for European Union by Political Parties and Pressure Groups in
Western European Countries, 1945–1950 (New York: De Gruyter, 1988).
44. The evaluation of systemic revision confronts an analytical challenge: analysts must
determine whether observed changes in states’ shared goals and principles signal the
emergence of new political ideals or represent, instead, a mere adaptation of previously
existing values. Or more succinctly: At what point does normative change become
systemic revision? See Barry Buzan, From International to World Society? English School
Theory and the Social Structure of Globalisation (Cambridge: Cambridge University
Press, 2004), p. 182.
45. This concept is different from Geoffrey Herrera’s “system change,” a label he employs
interchangeably with the more common “systemic change.” See Herrera, Technology and
International Transformation.
46. See Richard Rosecrance, The Resurgence of the West: How a Transatlantic Union Can
Prevent War and Restore the United States and Europe (New Haven, CT: Yale University
Press, 2013).
47. See Margaret E. Keck and Kathryn Sikkink, Activists Beyond Borders: Advocacy Networks
in International Politics (Ithaca, N.Y.: Cornell University Press, 1998).
48. See Herrera, Technology and International Transformation, p. 22.
49. Bull’s original term was “world political system.”
50. Bull, Anarchical Society, p. 266.
51. Ibid., p. 268. Critical theories of international relations also supply a range of concepts
with which to build a model of the system that does not privilege the state. This
perspective emphasizes the importance not only of structures of power but also of the
very nature of power, especially shifting hierarchical relationships among state and
nonstate players; it focuses not just on the actions of observers and practitioners but
also on their minds, especially normative choices in favor of or against a particular social
conception of hierarchical agents; in brief, it draws attention to the question of how a
new system comes about rather than how established systems adapt and endure. The
critical approach, however, has its limits. Not least of these is the essential duality in
which deep change occurs. Forces of change in the contemporary world exist within,
and are thus shaped and constrained by, the very social arrangement that they strive to
supplant. This limits, as Robert Cox acknowledged, “the range of choice to alternative
orders which are feasible transformations of the existing world.” (Robert W. Cox,
“Social Forces, States, and World Orders,” Millennium: Journal of International Studies,
Vol. 10, No. 2 [1981], p. 130.) The basic theoretical challenge, then, is to provide
conceptual benchmarks of revolutionary change that separate changes within the
international system and more exceptional changes of the system itself – the central
aim of this chapter.
52. David Armstrong, Revolution and World Order: The Revolutionary State in International
Society (Oxford: Clarendon Press, 1993), p. 33. There were, in fact, two treaties – of
Osnabrück and Münster.
53. See Robert Jackson, Sovereignty: The Evolution of an Idea (Cambridge: Polity, 2007),
p. 38. Andrew Hurrell disputes the Westphalia-based story of the states system’s genesis,
noting that the legal and moral practices of modern international society developed
between 1750 and 1914. See Hurrell, On Global Order, p. 54; and Andreas Osiander,
“Sovereignty, International Relations and the Westphalian Myth,” International
Organization, Vol. 55, No. 2 (April 2001), pp. 251–87.
54. Wight, International Theory, p. 41. Wight cites Christian Wolff as a principal exponent
of this dogma.
55. See J. D. Cockcroft and E. T. S. Walton, “Experiments with High Velocity Positive Ions.
II. The Disintegration of Elements by High Velocity Protons,” Proceedings of the Royal
Society of London (July 1932).
56. In 1934, Hungarian-born physicist Leo Szilard observed that neutrons could sustain a nuclear
chain reaction that could generate an enormous amount of energy. See Leo Szilard and
T. H. Chalmers, “Detection of Neutrons Liberated from Beryllium by Gamma Rays:
A New Technique for Inducing Radioactivity,” Nature, Vol. 134 (September 1934),
pp. 494–95.
57. Oak Ridge housed one of the first centers of nuclear medicine in the United States.
Doctors used cesium–137 to kill cancerous tissue. See Alan Taylor, “The Secret City,”
The Atlantic (June 25, 2012).
58. On technological determinism, see John G. Ruggie, “International Responses to
Technology: Concepts and Trends,” International Organization (Summer 1975),
pp. 558–83; Merritt R. Smith and Leo Marx, eds., Does Technology Drive History? The
Dilemma of Technological Determinism (Cambridge, MA: MIT Press, 1994); and
Herrera, Technology and International Transformation, pp. 29–39.
59. According to one interesting study, globalization can reduce states’ reliance on the use of
military force to pursue national interests. That is, integration provides “an additional mech-
anism for competition beyond cheap talk, but short of military violence.” Erik Gartzke and
Quan Li, “War, Peace, and the Invisible Hand: Positive Political Externalities of Economic
Globalization,” International Studies Quarterly, Vol. 47 (November 2003), pp. 561–86.
60. Skolnikoff, The Elusive Transformation, p. 11.
61. For an example of such a view, see Keir A. Lieber, War and the Engineers: The Primacy of
Politics over Technology (Ithaca, N.Y.: Cornell University Press, 2005).
62. See William Potter, ed., International Nuclear Trade and Nonproliferation: The Challenge of
Emerging Suppliers (Lexington, MA: Lexington Books, 1990); Chaim Braun and
Christopher F. Chyba, “Proliferation Rings: New Challenges to the Nuclear
Nonproliferation Regime,” International Security, Vol. 29, No. 2 (Fall 2004), pp. 5–49;
Olav Njølstad, Nuclear Proliferation and International Order: Challenges to the Non-
Proliferation Regime (New York, NY: Routledge, 2011).
63. See Vinton G. Cerf and Robert E. Kahn, “A Protocol for Packet Network Interconnection,”
IEEE Transactions on Communications, Vol. COM–22, No. 5 (1974); and John Naughton, A
Brief History of the Future: Origins of the Internet (London: Weidenfeld and Nicolson,
1999).
64. David D. Clark, “The Design Philosophy of the DARPA Internet Protocols,”
Proceedings of the SIGCOMM ’88, Computer Communication Review, Vol. 18, No. 4
(1988), pp. 106–14. See also Clark, “Designs for an Internet,” Draft version 2.0 ed. s.l.,
unpublished manuscript (2016), p. 22.
65. Clark, “Designs for an Internet,” p. 22.
66. On the history of ARPANET, see Clark, “The Design Philosophy of the DARPA
Internet Protocols”; Barry M. Leiner, “Brief History of the Internet” (The Internet
Society, http://www.internetsociety.org/internet/internet–51/history-internet/brief-
history-internet); Janet Abbate, Inventing the Internet (Cambridge, MA: MIT Press,
1999), Chapters 1–4; and Naughton, A Brief History of the Future.
67. This trend may reverse if countries such as Russia and China succeed in efforts to
impose sovereign control over the Internet. See Chapter 1 of this book.
68. See Ruggie, “International Responses to Technology,” p. 558.
69. The notion of “sociotechnical” systems captures the interaction of technology and poli-
tics. See Herrera, Technology and International Transformation.
70. In fact, this is the essence of Gilpin’s definition of systemic change. See Gilpin, War and
Change in World Politics, p. 9.
71. Thucydides, History of the Peloponnesian War, trans. Steven Lattimore (Indianapolis, IN:
Hackett, 1998).
72. The disturbance of the balance resulted largely from the growth of Athens’ navy and
from its construction of a defensive wall around the port of Piraeus, the completion of
which would have complicated Sparta’s ability to check Athenian expansion in the
Aegean region. See Thucydides, History of the Peloponnesian War, pp. 33 and 67–70.
73. See Paul M. Kennedy, The Rise and Fall of the Great Powers: Economic Change and Military
Conflict from 1500 to 2000 (New York: Random House, 1987), pp. 148–52; and Paul
Bairoch, “International Industrialization Levels from 1750 to 1980,” Journal of European
Economic History, Vol. 11 (1982). Notably, Kennedy’s work contains only passing refer-
ences to the important role of technological innovation in the historical rise and fall of
great powers. He notes, for example, that “The advanced technology of steam engines
and machine-made tools gave Europe decisive economic and military advantages [over
non-European societies during the eighteenth and nineteenth centuries]. The improve-
ments in the muzzle-loading gun (percussion caps, rifling, etc.) were ominous enough;
the coming of the breechloader, vastly increasing the rate of fire, was an even greater
advance; and the Gatling guns, Maxims and light field artillery put the final touches to
a new ‘firepower revolution’ which quite eradicated the chances of a successful resistance
by indigenous peoples reliant upon older weaponry” (p. 150). Yet this work does not
develop a theory of technological revolution and great power transition.
74. For the purposes of illustrating the effects of nuclear arms on the international balance
of power, this discussion does not consider the revolutionary nature of Soviet foreign
policy (for a discussion, see Chapter 5). Insofar as these aims were revolutionary, the
nuclear revolution in this instance produced second-order, not third-order effects.
75. See William Easterly and Stanley Fischer, “The Soviet Economic Decline: Historical
and Republican Data,” NBER Working Paper No. 4735 (May 1994), p. 1.
76. See William H. Cooper, Russia’s Economic Performance and Policies and Their Implications
for the United States (Washington, D.C.: Congressional Research Service, June 29,
2009), p. 5.
77. Ukraine’s fate after 1991 demonstrates the expansionist dangers that non-nuclear states
face against nuclear powers. Although some Ukrainian leaders have strived to steer
their country towards NATO and the European Union, a Russian military incursion
that began in 2014 has stalled both accessions. Had Ukraine retained the nuclear deterrent
that it inherited from the Soviet Union, its Western-oriented leaders may have
realized their political ambitions. See John J. Mearsheimer, “The Case for a Ukrainian
Nuclear Deterrent,” Foreign Affairs (Summer 1993).
78. The words are from Admiral Sir Arthur Wilson. Quoted in Stephen W. Roskill, Naval
Policy between the Wars (London: Walker, 1968), p. 231.
79. The Royal Navy Submarine Service was founded in 1901. By August 1914, it numbered
71 vessels, many of which, however, were training vessels. This number is larger than the
number of vessels (61) that the German Navy fielded at any one time during the war.
The largest fleet belonged to the French, who possessed 123 vessels, of which few were
fit for battle.
80. The most reliable form of submarine detection – the sighting of a periscope – almost
always came too late to deflect a fatal blow. Defensive measures – such as minefields, net
barrages, and depth charges – were only partly effective; the first sinking of a German
submarine (U–68) by depth charges did not occur until March 1916. See Richard
Compton-Hall, Submarines and the War at Sea (London: Macmillan, 1991); and Robert
K. Massie, Castles of Steel: Britain, Germany and the Winning of the Great War at Sea
(New York: Random House, 2003).
81. As developments in antisubmarine warfare (e.g. sonar) later showed, there was nothing
about the nature of the new technology that intrinsically favored the offense over the
defense.
82. Henry A. Kissinger, Nuclear Weapons and Foreign Policy (New York: Council on Foreign
Relations, 1957), p. 73.
83. Some observers question whether the “Joe 4” test was a true thermonuclear detonation,
because its yield (400 kilotons of TNT) was less than what was normal for a hydrogen
bomb (in comparison, the American test, “Ivy Mike,” yielded 10.4 megatons). See
Michael Kort, The Columbia Guide to the Cold War (New York: Columbia University
Press, 1998), p. 187.
84. See “Statement by Lewis L. Strauss, Chairman, United States Atomic Energy
Commission,” USAEC Release (March 31, 1954); A. H. Sturtevant, “Social Implications
of the Genetics of Man,” Science, Vol. 120 (September 10, 1954), pp. 405–07;
Ralph E. Lapp, “Radioactive Fall-out,” Bulletin of the Atomic Scientists, Vol. 11 (February
1955), pp. 206–09; and Carolyn Kopp, “The Origins of the American Scientific Debate
over Fallout Hazards,” Social Studies of Science, Vol. 9 (1979), pp. 404–6.
85. See U.S. Senate, Study of Airpower: Hearings before the Subcommittee on the Air Force of
the Committee on Armed Services (Washington, D.C.: Government Publication Office,
1956), p. 165.
86. See National Intelligence Estimate: Soviet Capabilities for Clandestine Attack against the
U.S. with Weapons of Mass Destruction and the Vulnerability of the U.S. (Langley, VA:
Central Intelligence Agency, 1951).
87. See Donald P. Steury, Intentions and Capabilities: Estimates on Soviet Strategic Forces,
1950–1983 (Langley, VA: Center for the Study of Intelligence, Central Intelligence
Agency, 1996), p. 18; and Report of the Defense Science Board Task Force on Preventing
and Defending Against Clandestine Nuclear Attack (Washington, D.C.: Office of the
Under Secretary of Defense For Acquisition, Technology, and Logistics, June 2004).
For a discussion of this problem in the context of nuclear terrorism, see Benjamin E.
Schwartz, Right of Boom: What Follows an Untraceable Nuclear Attack? (New York:
Overlook Press, 2015).
88. Charles L. Glaser and Chaim Kauffman, “What is the Offense-Defense Balance and
Can We Measure It?” International Security, Vol. 22, No. 4 (Spring 1998), pp. 44–82.
89. See Stephen Van Evera, “The Cult of the Offensive and the Origins of the First
World War,” International Security, Vol. 9, No. 1 (Summer 1984), p. 59.
90. See Marshal Joffre, Mémoires du Maréchal Joffre (Paris: Librairie Plon, 1932), p. 33; and
Van Evera, “The Cult of the Offensive and the Origins of the First World War,” p. 61.
91. The German plan for a Swiss invasion, “Operation Tannenbaum,” envisaged a concen-
trated thrust from Lake Geneva to Lake Constance. Its bold authors warned about the
perilous Jura Mountains and the steep banks of the Aare River. On Germany’s war
plans in Switzerland, see Werner Roesch, Bedrohte Schweiz: Die deutschen
Operationsplanungen gegen die Schweiz im Sommer/Herbst 1940 und die
Abwehrbereitschaft der Armee im Oktober 1940 (Frauenfeld: Huber, 1986).
92. See Lieber, War and the Engineers, p. 115.
93. Lindemann himself sought to prioritize other inventions, such as aerial mines, over
the radar’s development. The Germans had not invested in the development of radar
technology because Hitler did not regard it as necessary for the attainment of crushing
offensive victories. See William Manchester and Paul Reid, The Last Lion: Winston
Spencer Churchill, Defender of the Realm, 1940–1965 (New York: Little, Brown, and
Company, 2012), Chapter 1.
94. See Lucas Kello, “Security,” in Joel Krieger, ed., The Oxford Companion to International
Relations, third edition (Oxford: Oxford University Press, 2014).
95. Thucydides’ tragic tale of Athens and Sparta was later translated and elaborated on by
the seventeenth-century English philosopher Thomas Hobbes. See Thomas Hobbes,
Leviathan, edited with an introduction by C. B. Macpherson (New York: Penguin,
1968). In modern times, the logic of the security dilemma has induced nations to adopt
strategic postures that also resulted in war: Germany’s bellicose reaction in 1914 to
perceived encirclement by France and Russia; Japan’s attack on the United States in
1941 in anticipation of a military contest in the Pacific; or the U.S.-led invasion of Iraq
in 2003 to dismantle suspected weapons-of-mass-destruction sites. In all of these
cases, the despotic and militaristic regimes of Germany, Japan, and Iraq aggravated the
security dilemma, showing that domestic-level factors can combine with systemic
factors to increase the chances of conflict.
96. Robert O. Keohane and Joseph S. Nye, Jr., Power and Interdependence (New York:
Longman, 1979), p. 40.
97. Or in the academic language of international relations, the former occurs at the second
“level of analysis,” the latter at the third. On the levels of analysis in international rela-
tions – or “images,” as Waltz called them – see Kenneth N. Waltz, Man, the State, and
War: A Theoretical Analysis (New York: Columbia University Press, 1959).
98. On Napoleon’s mastery of the weapons of war, see Martin Van Creveld, Technology and
War: From 2000 B.C. to the Present (New York: Macmillan, 1989), Chapter 3.
99. This rearming process began in earnest following the Nazi seizure of power in 1933.
Often, the process advanced by subterfuge because of the Versailles Treaty’s restric-
tions on German military capacity – for example, the use of the German Air Transport
School (Deutsche Verkehrsfliegerschule), an outwardly civilian organization, to train
Luftwaffe pilots. The key development was Hitler’s remilitarization of the Rhineland
in 1936, which made the security relationship with France dangerous. See P. Laurent,
“The Reversal of Belgian Foreign Policy, 1936–37,” Review of Politics, Vol. 31, No. 3
(July 1969), p. 372.
100. One man who correctly divined the shifting tides of strategy during the interwar
period was Colonel Charles de Gaulle, who urged the creation of a French armée de
métier comprising shock mechanized units. Believing that war had not changed in its
essence since 1918, his superiors quashed the proposal. See Charles de Gaulle, Vers
l’armée de métier (Paris: Plon, 1981).
101. Quoted in Steven Waugh, Essential Modern World History (Cheltenham: Thomas
Nelson, 2001), p. 52.
102. See A. G. Armstrong, “The Army Today,” RUSI Journal, Vol. 81, No. 523 (1936); “The
Army’s New Weapons and Equipment,” RUSI Journal, Vol. 84, No. 534 (1939); and
Damian P. O’Connor, Between Peace and War: British Defence and the Royal United Services
Institute, 1931–2010 (London: Royal United Services Institute, 2011), pp. 176–77.
103. In 1940, France had about 2,900 deployable tanks in the country’s northeast. This
number was higher than Germany’s, even if one counts the tanks that Germany seized
from Czechoslovakia. French tanks, moreover, were qualitatively superior; they
included the SOMUA S35, which packed more firepower than its German equivalent,
the Panzer III. See Julian Jackson, The Fall of France: The Nazi Invasion of 1940
(Oxford: Oxford University Press, 2003), pp. 12–13.
104. Interaction capacity signifies “the level of transportation, communication, and orga-
nization capability in the unit/system that determines what types and levels of interac-
tion are possible.” Buzan and Little, International Systems in World History, p. 441.
105. See Benedict Anderson, Under Three Flags: Anarchism and the Anti-Colonial
Imagination (London: Verso, 2005).
106. See William M. Ramsay, The Imperial Peace: An Ideal in European History (Oxford:
The Clarendon Press, 1913).
107. On Roman military technology, see Simon James, Rome and the Sword: How Warriors
and Weapons Shaped Roman History (London: Thames and Hudson, 2011).
108. See Thomas Powers, Heisenberg’s War: The Secret History of the German Bomb (New
York: De Capo Press, 2000).
109. See Robert Farley, “What if Hitler Developed Nuclear Weapons during World War
II?” The National Interest (October 8, 2016).
110. A German atomic victory, which likely would have involved the incineration of at least
London and Moscow, would have looked very different from a conventional German
victory, in which big Allied population centres may have survived the war largely
intact, much as Paris did following France’s defeat in the spring of 1940. In either
scenario, it is probable that the post-war order would have resembled a combination
of World Leviathan within the European pan-Germanic Empire, or Großgermanisches
Reich, colonial dominion in parts of Africa, and hegemony elsewhere, possibly in
conjunction with the Japanese Empire. As Geoffrey Stoakes has argued, Hitler sought
to assert direct control only over certain parts of the globe while exercising leadership
elsewhere. See Geoffrey Stoakes, Hitler and the Quest for World Dominion (Leamington
Spa: Berg, 1986), p. 235.
111. A world system is in Bull’s terms different from a world society, which as we saw
conveys a concern for the interests of actors other than states even as it retains the
assumption that states are the supreme agents of international life. See Hedley Bull,
The Anarchical Society: A Study of Order in World Politics, second edition (London:
Macmillan, 1995).
112. See Graham Allison, Nuclear Terrorism: The Greatest Preventable Catastrophe (New
York: Henry Holt, 2005), Chapter 1.
113. This would equate to approximately one or two deployed warheads. By “deployed” I
mean devices that are usable because they are mounted on missiles or other delivery
systems. If one includes total nuclear warheads, the figure rises to seven. See Shannon
N. Kile and Hans M. Kristensen, Trends in Nuclear Forces, 2016: SIPRI FACT Sheet
(Solna, Sweden: Stockholm International Peace Research Institute, June 2016), p. 2.
Nuclear devices that terrorists could plausibly deploy, such as “dirty bombs,” may
differ significantly in both the manner of deployment and explosive yield. Even
allowing for a greatly reduced range of delivery and power, the effect on international
security of a dirty nuclear attack by terrorists could be profound. On nuclear terrorism,
see Allison, Nuclear Terrorism.
114. Allison, Nuclear Terrorism, p. 227.
115. Michael J. Mills, Owen B. Toon, Julia Lee-Taylor, and Alan Robock, “Multidecadal
Global Cooling and Unprecedented Ozone Loss Following a Regional Nuclear
Conflict,” Earth’s Future, Vol. 2, No. 4 (April 2014), pp. 161–76.
116. See Ryan Rasteger, “How Many Nukes Would It Take to Render Earth Uninhabitable?”
Global Zero: A World Without Nuclear Weapons (July 9, 2015), http://www.globalzero.
org/blog/how-many-nukes-would-it-take-render-earth-uninhabitable.
117. Thomas S. Kuhn, The Structure of Scientific Revolutions, third edition (Chicago, IL:
University of Chicago Press, 1996), p. 151. Kuhn observed that intellectual resistance
to paradigmatic adjustment can last a lifetime, particularly in the case of individuals
“whose careers have committed them to an older tradition of normal science” (p. 151).
the Internet. See Peter Bradshaw, “The Interview: Sony’s Retreat Signals an
Unprecedented Defeat on American Turf,” The Guardian (December 18, 2014); and
Chapter 5.
5. B. Valeriano and C. Maness, Cyber War Versus Cyber Realities: Cyber Conflict in the
International System (Oxford: Oxford University Press, 2015), p. 6 (emphasis mine).
6. Ibid., p. 21 (emphasis mine).
7. See Lucas Kello, “The Meaning of the Cyber Revolution: Perils to Theory and
Statecraft,” International Security, Vol. 38, No. 2 (Fall 2013), pp. 7–40.
8. See Austin Wright, “Cybersecurity Tests Delayed over Vulnerability Concerns,” Politico
(November 2015), https://www.politicopro.com/defense/story/2015/11/cybersecurity-
tests-delayed-over-vulnerability-concerns-0752266.
9. See Nicholas Watt, “Trident Could Be Vulnerable to Cyber-Attack, Says Former
Defence Secretary,” The Guardian (November 23, 2015). The quoted words belong to
Franklin Miller, a former White House defense policy official under President George
W. Bush. As the Stuxnet operation showed, air gaps are not impervious to malware. All
computer systems, including those residing within the cyber archipelago, are susceptible
to cyber intrusion. Thus Miller’s statement would be correct only in the impossible case
that nuclear command and control functions, and the weapons themselves, were entirely
manually operated.
10. For a fine elaboration of this line of argumentation, see Erik Gartzke, “The Myth of
Cyberwar: Bringing War in Cyberspace Back Down to Earth,” International Security,
Vol. 38, No. 2 (Fall 2013), pp. 41–73.
11. This core feature of the Conventional Model of the international system is reflected in
the common tendency of some theorists to integrate into the study of international
politics assumptions from microeconomic theory. As Kenneth Waltz put it: “Just as
economists define markets in terms of firms, so I define international political struc-
tures in terms of states.” Kenneth N. Waltz, Theory of International Politics (New York:
McGraw-Hill, 1979), p. 94.
12. Donna Miles, “Stavridis Spotlights Top National Security Issues,” American Forces
Press Service, U.S. Department of Defense (March 15, 2012).
13. On the notion of “bounded rationality” in human decision-making, see Herbert A.
Simon, Administrative Behavior: A Study of Human Decision-Making Processes in
Administrative Organization (New York: Macmillan, 1947); and Daniel Kahneman, “A
Perspective on Judgment and Choice: Mapping Bounded Rationality,” American
Psychologist, Vol. 58, No. 9 (September 2003), pp. 697–720.
14. For a discussion of systemic disruption in the nuclear context, see Graham T. Allison,
Albert Carnesale, and Joseph S. Nye, eds., Hawks, Doves, and Owls: An Agenda for
Avoiding Nuclear War (New York: W. W. Norton, 1985), Chapter 1.
15. See Ralph Langner, To Kill a Centrifuge: A Technical Analysis of What Stuxnet’s Creators
Tried to Achieve (Arlington, VA: The Langner Group, November 2013), p. 34.
16. See Vivian Yeo, “Stuxnet Infections Spread to 115 Countries,” ZDNet Asia (August 9,
2010).
17. This is because the worm exploited a shortcut facility in Windows software. See Sharon
Weinberger, “Computer Security: Is This the Start of Cyberwarfare?” Nature, Vol. 474
(2011), pp. 142–45.
18. The algorithm failed to specify either a price or a time for the execution of the order,
producing an enormous sell pressure. See Findings Regarding the Market Events of May
6, 2010: Report of the Staffs of the CFTC and SEC to the Joint Advisory Committee on
Emerging Regulatory Issues (Washington, D.C.: U.S. Commodity Futures Trading
Commission and U.S. Securities & Exchange Commission, September 30, 2010).
19. See Nathaniel Popper, “The Stock Market Bell Rings, Computers Fail, Wall Street
Cringes,” The New York Times (July 8, 2015).
20. See Bradley Hope and Saumya Vaishampayan, “Glitch Freezes NYSE Trading for
Hours,” Wall Street Journal (July 8, 2015).
21. See William A. Owens, Kenneth W. Dam, and Herbert S. Lin, eds., Technology, Policy,
Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities
(Washington, D.C.: National Academies Press, 2009), pp. 2–32.
22. See “Cyberwar Against Iraq,” News Max (March 12, 2003). A further problem arose
when U.S. tacticians explored options to insert code into Iraq’s military command
networks. The country’s military and civilian telecommunications networks were closely
linked, thus raising risks of collateral damage if the former was attacked.
23. “Cyberwar Against Iraq,” News Max.
24. On the problem of adaptive malware and responses to it, see Sean Price,
“Adaptive Threats and Defenses,” in Harold F. Tipton and Micki Krause, eds.,
Information Security Management Handbook, sixth edition, Vol. 4 (New York: Auerbach
Publications, 2010).
25. See McAfee Labs Threats Report (Santa Clara, CA: McAfee Labs, June 2016).
26. See Peter W. Singer and Allan Friedman, Cybersecurity and Cyberwar: What Everyone
Needs to Know (Oxford: Oxford University Press, 2014), p. 31.
27. For example, Russia. See Jonathan Medalia, Comprehensive Nuclear Test-Ban Treaty:
Updated Safeguards and Net Assessments (Washington, D.C.: Congressional Research
Service, June 3, 2009).
28. Other treaties that ensued in the first ten years after the PTBT include the Outer Space
Treaty and Treaty of Tlatelolco (1967), the Seabed Arms Control Treaty (1972), and
the Anti-Ballistic Missile Treaty (1972). A Comprehensive Test-Ban Treaty was signed
in 1996 but still awaits ratification.
29. Matthew Evangelista, Unarmed Forces: The Transnational Movement to End the Cold War
(Ithaca, N.Y.: Cornell University Press, 1999).
30. See Impact of a Threshold Test Ban Treaty on Soviet Military Programs: National
Intelligence Estimate Number 11–11–66, United States Intelligence Board (May 25,
1966).
31. See Nuclear Weapons Life Cycle (Washington, D.C.: National Nuclear Security
Administration), https://nnsa.energy.gov/ourmission/managingthestockpile/nwlifecycle.
32. See William J. Broad and David E. Sanger, “As U.S. Modernizes Nuclear Weapons,
‘Smaller’ Leaves Some Uneasy,” The New York Times (January 11, 2016).
33. See “Treaty Banning Nuclear Weapon Tests in the Atmosphere, in Outer Space and
Under Water,” U.S. Department of State, http://www.state.gov/t/isn/4797.htm.
34. Against the backdrop of the recent controversy over Hillary Clinton’s use of private email
servers in official State Department communications, Graham gleefully stated: “I
haven’t worried about an email being hacked, since I’ve never sent one. I’m, like, ahead
of my time.” Yet pictures exist of the senator using a mobile phone – hence, some of his
communications are susceptible to hacking. See “Quotation of the Day,” The New York
Times (September 15, 2016).
35. On the attribution problem, see for instance, Clark and Landau, “Untangling
Attribution”; and Thomas Rid and Ben Buchanan, “Attributing Cyber Attacks,” Journal
of Strategic Studies, Vol. 38, No. 1 (2014), pp. 4–37.
36. Herbert S. Lin, “Some Interesting Aspects of Cyberconflict for Discussion,” presenta-
tion at the Harvard Kennedy School, Cambridge, Massachusetts (February 8, 2012).
37. Dmitri Galushkevich, an ethnic Russian from Tallinn, was convicted of disrupting the
website of the Reform Party of Estonian Prime Minister Andrus Ansip. He was fined
17,500 Estonian kroons (about $1,600). For more on the Estonian attacks, see Chapter 6.
38. On cyber deterrence, see Patrick M. Morgan, “Applicability of Traditional Deterrence
Concepts and Theory to the Cyber Realm,” in Proceedings of a Workshop on Deterring
Cyberattacks: Informing Strategies and Developing Options for U.S. Policy (Washington,
D.C.: National Academies Press, 2010), pp. 55–76.
39. See Jordan Carney, “Wasserman Schultz Called Top Sanders Aide a ‘Damn Liar’ in
Leaked Email,” The Hill (July 22, 2016).
40. As Chapter 8 discusses, a declassified U.S. intelligence report stated: “Russia’s goals were
to undermine public faith in the US democratic process, denigrate Secretary Clinton,
and harm her electability and potential presidency. We further assess Putin and the
the rich can buy there, though. Your ordinary comrade will have to brave the whims
of the shopkeeper who retrieves the desired merchandise from a shelf behind the
counter. See “North Koreans Experience the Marvels of a Supermarket Firsthand,”
Business Insider (February 25, 2012).
9. Han S. Park, North Korea: The Politics of Unconventional Wisdom (Boulder, CO: Lynne
Rienner, 2002).
10. A leading position in the Non-Aligned Movement distinguished North Korea from
the Soviet Union and China. See Bernd Schaefer, “North Korean ‘Adventurism’ and
China’s Long Shadow, 1966–1972,” Cold War International History Project Working
Paper, Woodrow Wilson International Center for Scholars (2004).
11. See “North Korea Defence Chief Hyon Yong-Chol ‘Executed’,” BBC (May 13, 2015).
12. See Clyde Haberman, “Bomb Kills 19, Including 6 Key Koreans,” The New York Times
(October 10, 1983).
13. Rupert Wingfield-Hayes, “The North Korean Spy Who Blew Up a Plane,” BBC News
(April 22, 2013).
14. See Anna Fifield, “Malaysian Airport Assassination Focuses New Attention on North
Korean Leader,” Washington Post (February 15, 2017).
15. North Korea performed a similar act of defiance by firing a medium-range rocket into
the Sea of Japan in February 2017. See Bryan Harris and Kana Inagaki, “North Korea
Tests Trump with Missile Launch,” The Financial Times (February 12, 2017).
16. North Korea received uranium-enrichment technology and equipment from Pakistan in
exchange for its ballistic-missile technology. In the mid-2000s, North Korea began
building a nuclear reactor in Syria, which the Israelis destroyed in September 2007 with
a surgical air strike. See Sharon A. Squassoni, Weapons of Mass Destruction: Trade between
North Korea and Pakistan (Washington, D.C.: Congressional Research Service, 2006);
and Scott Snyder, “North Korea’s Illicit Arms Trade Unmasked,” Forbes (March 19, 2014).
17. See The Military Balance, Vol. 117, No. 1 (2017), International Institute for Strategic
Studies, p. 304.
18. To reach Alaska, North Korea requires missiles with a minimum range of 7,000 km.
The Taepodong–2, a variant of which is a full-range ICBM, has a theoretical reach
of up to 12,000 km. See “Missiles,” WMD Around the World, Federation of American
Scientists, https://fas.org/nuke/guide/dprk/missile/ (last updated on October 21, 2016).
19. See NTI Nuclear Materials Security Index: Building a Framework for Assurance,
Accountability, and Action, second edition (Washington, D.C.: Nuclear Threat Initiative,
January 2014), p. 19.
20. See Park Young Ho, “South and North Korea’s Views on the Unification of the Korean
Peninsula and Inter-Korean Relations,” paper presented at the Second KRIS-Brookings
Joint Conference on Security and Diplomatic Cooperation between ROK and US for
the Unification of the Korean Peninsula (January 21, 2014), p. 5.
21. Ibid., p. 7.
22. See the record of the discussions between the North Korean defector Hwang Jang-Yop
and Selig Harrison in Don Oberdorfer, The Two Koreas (New York: Addison-Wesley,
1998), p. 401; and Hwang’s remarks in Robert Myers, Korea in the Cross Currents (New
York: M. E. Sharpe, 2000). For a general analysis of North Korea’s nuclear strategy, see
Victor D. Cha, “North Korea’s Weapons of Mass Destruction: Badges, Shields, or
Swords?” Political Science Quarterly, Vol. 117, No. 2 (2002), pp. 209–30.
23. See, for instance, John S. Park and Dong Sun Lee, “North Korea: Existential Deterrence
and Diplomatic Leverage,” in Muthiah Alagappa, ed., The Long Shadow: Nuclear
Weapons and Security in 21st Century Asia (Stanford, CA: Stanford University Press,
2008).
24. For an exposition of this argument, see Samuel S. Kim, “North Korea’s Nuclear Strategy
and the Interface between International and Domestic Politics,” Asian Perspectives,
Vol. 34, No. 1 (2010), pp. 49–85.
25. See Richard Lloyd Parry, “North Korea ‘Succeeds in Miniaturising Nuclear Warhead’,”
The Times (April 7, 2016).
26. The failed launch of an Earth observation satellite in February 2016 betrays the obsta-
cles to Pyongyang’s missile ambition.
27. See Scott D. Sagan, “Why Do States Build Nuclear Weapons? Three Models in Search
of a Bomb,” International Security, Vol. 21, No. 3 (Winter 1996–97).
28. Ibid., p. 55.
29. Cha, “North Korea’s Weapons of Mass Destruction,” p. 227.
30. As Daniel Byman explained, the regime has used economic and other inducements
(conveyed in the “military-first” or songun principle) to co-opt the country’s military
elites. See Daniel Byman, “Pyongyang’s Survival Strategy: Tools of Authoritarian
Control in North Korea,” International Security, Vol. 35, No. 1 (Summer 2010), pp. 44–74.
31. As John Park has shown, the Kim house sustains the loyalty of elites by operating a web
of state-trading companies that generate income for special interest budgets. See John
S. Park, “North Korea, Inc.: Gaining Insights into North Korean Regime Stability from
Recent Commercial Activities,” United States Institute of Peace Working Paper,
Washington, D.C. (May 2009).
32. For a discussion of the limited effects of economic sanctions on North Korea’s nuclear
ambitions, see Stephan Haggard and Marcus Noland, “Engaging North Korea: The
Efficacy of Sanctions and Inducements,” in Etel Solingen, ed., Sanctions, Statecraft, and
Nuclear Proliferation (Cambridge: Cambridge University Press, 2012).
33. “Hearings to Examine the Nomination of General Vincent K. Brooks, USA, for
Reappointment to the Grade of General and to be Commander, United Nations
Command/Combined Forces Command/United States Forces Korea,” U.S.
Congressional Committee Hearing, C-SPAN (April 19, 2016), https://www.c-span.org/
video/?408108-1/nomination-brooks/.
34. The closeness between the two domains is reflected in their operational integration
within the U.S. Marine Corps: for example, the creation of a Cyber Electronic Warfare
Coordination Center. See Matthew E. Poole and Jason C. Schuette, “Cyber Electronic
Warfare: Closing the Operational Seams,” Marine Corps Gazette, Vol. 99, No. 8 (August
2015), pp. 60–62.
35. Some analysts regard Desert Storm as the first “information war.” See Edward Mann,
“Desert Storm: The First Information War?” Airpower Journal (Winter 1994).
36. See “Chapter Six: Asia,” The Military Balance, Vol. 117, No. 1 (2017), p. 306.
37. See “Seoul Ranked World’s Top Digital City,” Chosun Ilbo ( June 20, 2007). The city has
gained plaudits for its integration of information technologies into urban planning and
services. See Anthony M. Townsend, “Seoul: Birth of a Broadband Metropolis,”
Environment and Planning B: Urban Analytics and City Science, Vol. 34, No. 3 (2007),
pp. 396–413.
38. The North Korean government severely restricts the public’s access to Internet services.
Jean H. Lee, the Associated Press bureau chief in the country, explained that ordinary
citizens have “access to the state media, information sources that are vetted by the
government, and picked and pulled from the Internet and posted to their intranet site.”
Clyde Stanhope, “How Bad Is the North Korean Cyber Threat?” HackRead (July 2016).
39. See Andrei Lankov, “Changing North Korea: An Information Campaign Can Beat the
Regime,” Foreign Affairs, Vol. 88, No. 6 (November/December 2009), p. 95.
40. See Ju-min Park and James Pearson, “In North Korea, Hackers Are a Handpicked,
Pampered Elite,” Reuters (December 5, 2014).
41. See “North Korea Boosted ‘Cyber Forces’ to 6,000 Troops, South Says,” Reuters (January
6, 2015).
42. See Youkyung Lee, “A Look at North Korea’s Cyberwar Capabilities,” Washington
Times (December 18, 2014).
43. For technical details about the Lazarus Group’s malicious activities, see Costin Raiu
and Juan Andrés Guerrero-Saade, Global Research and Analysis Team, “Operation
Blockbuster Revealed,” Securelist (February 24, 2016), https://securelist.com/blog/
incidents/73914/operation-blockbuster-revealed/.
44. Unconfirmed sources, however, suggest that a Sony Pictures employee or Russian
nationalists abroad may have assisted the attackers. See Bruce Schneier, “We Still Don’t
Know Who Hacked Sony,” Schneier on Security ( January 5, 2015).
45. But see, for instance, Christopher Whyte, “Ending Cyber Coercion: Computer Network
Attack, Exploitation and the Case of North Korea,” Comparative Strategy, Vol. 35,
No. 2 (July 2016), pp. 93–102.
46. See Symantec Security Response, “SWIFT Attackers’ Malware Linked to More
Financial Attacks,” Symantec Official Blog (May 26, 2016).
47. See Nicole Perlroth and Michael Corkery, “North Korea Linked to Digital Attacks on
Global Banks,” The New York Times (May 26, 2016).
48. See Kim Zetter, “That Insane, $81m Bangladesh Bank Heist? Here’s What We Know,”
Wired (May 17, 2016).
49. See Chico Harlan and Ellen Nakashima, “Suspected North Korean Cyberattack on a
Bank Raises Fears for S. Korea, Allies,” Washington Post (August 29, 2011).
50. See “Cyber-Attacks on South Korean Nuclear Power Operator Continue,” The
Guardian (December 28, 2014).
51. See “N. K. Hacked Government Bigwigs,” Korea Herald (March 7, 2016).
52. See Ju-min Park, “South Korea Group Launches Anti-North Leaflets amid Threats
from Pyongyang,” Reuters (October 25, 2014).
53. See Lankov, “Changing North Korea”; and Jieun Baek, North Korea’s Hidden Revolution:
How the Information Underground Is Transforming a Closed Society (New Haven, CT:
Yale University Press, 2016).
54. Following the cyberattacks, suspected North Korean agents also delivered emails
which vaguely threatened physical attacks against theaters that screened the film. See
Tatiana Siegel, “Sony Hack: New Leaked Docs Reveal Michael Lynton’s Email Inbox,”
Hollywood Reporter (December 16, 2014).
55. Chang Jae-soon, “Obama Vows to ‘Respond Proportionally’ to Sony Hack Blamed on
N. Korea,” Yonhap News Agency (December 20, 2014).
56. The newspaper had published satirical images of the prophet Muhammad that offended
some religious sensibilities.
57. Brent Lang and Ted Johnson, “Fear and Censorship: Paris, Sony Attacks Put Creative
Freedoms under Fire,” Variety (January 7, 2015).
58. Barack Obama, “Executive Order 13687 – Imposing Additional Sanctions With Respect
to North Korea” (January 2, 2015), published online by Gerhard Peters and John T.
Woolley, The American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=108103.
59. See Michael Daniel, “Our Latest Tool to Combat Cyber Attacks: What You Need to
Know,” The White House (April 1, 2015), https://obamawhitehouse.archives.gov/
blog/2015/04/01/our-latest-tool-combat-cyber-attacks-what-you-need-know.
60. Some American officials denied that their country caused the interruption of
North Korea’s Internet. Others, however, affirmed this. See Chris Strohm, “North
Korea Web Outage Response to Sony Hack, Lawmaker Says,” Bloomberg (March 17,
2015).
61. See Roger Hurwitz, “Keeping Cool: Steps for Avoiding Conflict and Escalation in
Cyberspace,” Georgetown Journal of International Affairs (July 2015), pp. 17–23.
62. Quoted in Stanhope, “How Bad Is the North Korean Cyber Threat?” The original
source language is available here: http://nk.joins.com/news/view.asp?aid=12640100.
63. See David E. Sanger and William J. Broad, “Trump Inherits a Secret Cyberwar Against
North Korean Missiles,” The New York Times (March 4, 2017). On active defense, see
Chapter 9.
3. See Jack Goldsmith and Tim Wu, Who Controls the Internet? Illusions of a Borderless
World (New York: Oxford University Press, 2006).
4. Lawrence Lessig, Code (New York: Basic Books, 1999), p. 298.
5. See for instance Ronald Deibert, John Palfrey, Rafal Rohozinski, and Jonathan Zittrain,
eds., Access Contested: Security, Identity, and Resistance in Asian Cyberspace (Cambridge, MA: MIT
Press, 2011).
6. Kenneth N. Waltz, Theory of International Politics (New York: McGraw-Hill, 1979),
pp. 93–94.
7. See Robert O. Keohane, After Hegemony: Cooperation and Discord in the World Political
Economy (Princeton, N.J.: Princeton University Press, 1984); and Lisa L. Martin,
“Neoliberalism,” in Tim Dunne, Milja Kurki, and Steve Smith, eds., International
Relations Theories: Discipline and Diversity (Oxford: Oxford University Press, 2007).
8. See Hedley Bull, The Anarchical Society: A Study of Order in World Politics, second edition
(London: Macmillan, 1995).
9. See Alexander Wendt, Social Theory of International Politics (Cambridge: Cambridge
University Press, 1999); and Christian Reus-Smit, The Moral Purpose of the State:
Culture, Social Identity, and Institutional Rationality in International Relations (Princeton,
N.J.: Princeton University Press, 1999).
10. David A. Lake, “The State and International Relations,” in Christian Reus-Smit and
Duncan Snidal, eds., The Oxford Handbook of International Relations (Oxford: Oxford
University Press, 2015), p. 2.
11. Ronald Rogowski, “Institutions as Constraints on Strategic Choice,” in David A. Lake
and Richard Powell, eds., Strategic Choice and International Relations (Princeton, N.J.:
Princeton University Press, 1999).
12. See Arnold Wolfers, “ ‘National Security’ as an Ambiguous Symbol,” Political Science
Quarterly, Vol. 67 (December 1952), pp. 481–502. A countervailing view is found in
Stephen D. Krasner, Defending the National Interest: Raw Materials Investments and US
Foreign Policy (Princeton, N.J.: Princeton University Press, 1978).
13. Mill himself never used this term, although it appeared in reaction to his writings.
See John Stuart Mill, Essays on Some Unsettled Questions of Political Economy, second
edition (London: Longmans, 1874), Essay 5; and Joseph Persky, “The Ethology
of Homo Economicus,” Journal of Economic Perspectives, Vol. 9, No. 2 (Spring 1995),
p. 222.
14. Khan has claimed in his defense that Pakistan’s Prime Minister, Benazir Bhutto,
instructed him to exchange technological secrets with North Korea. See Mark
Fitzpatrick, “Dr. A. Q. Khan and the Rise and Fall of Proliferation Networks,” Nuclear
Black Markets (London: International Institute for Strategic Studies, 2007).
15. John Mueller, “Simplicity and the Spook: Terrorism and the Dynamics of Threat
Exaggeration,” International Studies Perspectives (2005), p. 220.
16. Even some prominent social theorists are content with keeping the lid closed. Alexander
Wendt, for instance, regards “the constitution of states as ‘unitary actors,’ which is the
starting point for theorizing about the international system.” Alexander Wendt, Social
Theory of International Politics (Cambridge: Cambridge University Press, 1999), p. 195.
17. Christopher Rhoads and Farnaz Fassihi, “Iran Vows to Unplug the Internet,” Wall Street
Journal (May 28, 2011).
18. See Saeed Kamali Dehghan, “Iran Clamps Down on Internet Use,” The Guardian
(January 5, 2012).
19. On the high costs of mounting a sophisticated cyberattack such as the Stuxnet opera-
tion, see for example Jon Lindsay, “Stuxnet and the Limits of Cyber Warfare,” Security
Studies, Vol. 22, No. 3 (2013), pp. 365–404.
20. For a discussion of this topic, see Joseph S. Nye, Jr., The Future of Power (New York:
PublicAffairs, 2011), Chapter 5.
21. Ash Carter, “Drell Lecture: ‘Rewiring the Pentagon: Charting a New Path on
Innovation and Cybersecurity,’ ” Stanford University (April 23, 2015), http://archive.
defense.gov/speeches/speech.aspx?SpeechID=1935.
“Fanny,” which security researchers have also tied to Duqu and Flame. See Boldizsár
Bencsáth, “Duqu, Flame, Gauss: Followers of Stuxnet,” RSA Conference Europe 2012,
http://www.rsaconference.com/writable/presentations/file_upload/br-208_bencsath.
pdf. In this scenario, Stuxnet, Duqu, and Flame may have been “stepbrothers.”
63. See Jamie Dettmer, “Digital Jihad: ISIS, Al Qaeda Seek a Cyber Caliphate to Launch
Attacks on US,” Fox News (September 14, 2014).
64. See Daniel Byman and Jeremy Shapiro, “Be Afraid. Be a Little Afraid: The Threat of
Terrorism from Western Fighters in Syria and Iraq,” Foreign Policy and Brookings,
Policy Paper No. 34 (Washington, D.C.: Brookings, November 2014).
65. See Giles Hogben, ed., Botnets: Detection, Measurement, Disinfection, and Defence
(Heraklion: European Network and Information Security Agency, 2011), p. 13.
66. Brian Proffitt, “How to Build a Botnet in 15 Minutes,” Readwrite (July 31, 2013).
67. See Brian Krebs, “Stress-Testing the Booter Services, Financially,” KrebsonSecurity.com
(August 15, 2016).
68. Some botnets, however, have been used to conduct governmental espionage – for
example, GOZ, a botnet that issued detailed queries to computers in Georgia and
Turkey in search of classified documents. See Michael Sandee, GameOver Zeus:
Background on the Badguys and the Backends (Delft: Fox-IT, July 2015), p. 9.
69. Some forms of sophisticated espionage code, though, began their life as botnet
components – for instance, BlackEnergy, a malware package that targeted Ukrainian
government institutions in 2014. See BlackEnergy and Quedagh: The Convergence of
Crimeware and APT Attacks, White Paper (Helsinki: F-Secure, 2014).
70. “US Warns Cyber-Attacks Will Increase,” Financial Times (May 18, 2007).
71. See Noah Shachtman, “Kremlin Kids: We Launched the Estonian Cyber War,” Wired
(March 11, 2009).
72. U.S. Cyber Consequences Unit (US-CCU), Overview by the US-CCU of the Cyber
Campaign against Georgia in August 2008, Special Report, US-CCU (August 2009), www.
registan.net/wp-content/uploads/2009/08/US-CCU-Georgia-Cyber-Campaign-
Overview.pdf, p. 3.
73. Ibid.
74. According to the U.S. Cyber Consequences Unit, the criminal groups participating in
the DDoS attacks “wanted to claim credit” for them. See ibid.
75. James R. Clapper and Charlie Rose, “A Conversation with James Clapper,” Council on
Foreign Relations (October 25, 2016), http://www.cfr.org/intelligence/conversation-
james-clapper/p38426.
76. Flashpoint investigators dismissed with moderate confidence the claims of authorship
by self-styled Russian state actor “Jester” and by supporters of Wikileaks. See Allison
Nixon, John Costello, and Zach Wikholm, “An After-Action Analysis of the Mirai
Botnet Attacks on Dyn,” Flashpoint (October 25, 2016), https://www.flashpoint-intel.
com/action-analysis-mirai-botnet-attacks-dyn/.
77. Clapper and Rose, “A Conversation with James Clapper.”
78. The meaning of the notion is, in fact, ambiguous and contested. For a more detailed
discussion, see Chapter 9.
79. The fifth largest market, the Shanghai Stock Exchange, is a non-profit organization
under the direct supervision of the China Securities Regulatory Commission, which is
an institution of the State Council – hence it is not privately operated.
80. The two exceptions are Paraguana Refining Centre, which belongs to the Venezuelan
state oil firm PDVSA; and the Ras Tanura Refinery, which is the property of the
publicly owned firm Saudi Aramco. See “Top 10 Large Oil Refineries,”
HydrocarbonsTechnology.com (September 30, 2013).
81. The Pentagon’s Defense Advanced Research Projects Agency (DARPA) supported but
did not drive the protocol transition. See John Naughton, A Brief History of the Future:
The Origins of the Internet (London: Weidenfeld and Nicolson, 1999), pp. 166–67; and
Vint Cerf, “How the Internet Came to Be,” in Bernard Adoba, ed., The Online User’s
Encyclopedia (Boston, MA: Addison-Wesley, 1993).
How One Interagency Group Made a Major Difference (Washington, D.C.: National
Defense University Press, June 2012); and Steve Abrams, “Beyond Propaganda: Soviet
Active Measures in Putin’s Russia,” Connections: The Quarterly Journal, Vol. 15, No. 1
(2016), p. 12.
14. KGB agents at Soviet embassies around the globe disseminated the U.S. National
Security Council document, titled “Carter’s Secret Plan to Keep Black Africans and
Black Americans at Odds.” See Schoen and Lamb, Deception, Disinformation, and
Strategic Communications, p. 24; and J. Michael Waller, Strategic Influence: Public
Diplomacy, Counterpropaganda, and Political Warfare (Washington, D.C.: Institute of
World Politics Press, 2009), pp. 159–61.
15. See Tony Barber, “Russia’s Dark Art of Disinformation,” The Financial Times
(September 16, 2016); and Thomas Boghardt, “Soviet Bloc Intelligence and Its AIDS
Disinformation Campaign,” Studies in Intelligence, Vol. 53, No. 4 (December 2009), pp.
1–24. On KGB disinformation campaigns generally, see Ladislav Bittman, The KGB
and Soviet Disinformation: An Insider’s View (Oxford: Pergamon, 1985); and Ion Mihai
Pacepa and Ronald Rychlak, Disinformation: Former Spy Chief Reveals Secret Strategies
for Undermining Freedom, Attacking Religion, and Promoting Terrorism (Washington,
D.C.: WND Books, 2013).
16. Valeriy Gerasimov, “Ценность науки в предвидении” [“The Value of Science is in
Foresight”], Военно-промышленный курьер (February 27, 2013), http://vpk-news.
ru/sites/default/files/pdf/VPK_08_476.pdf. For an English translation and commen-
tary, see “The ‘Gerasimov Doctrine’ and Russian Non-Linear War,” at https://
inmoscowsshadows.wordpress.com/2014/07/06/the-gerasimov-doctrine-and-russian-
non-linear-war/. See also Sergey Chekinov and Sergey A. Bogdanov, “The Nature and
Content of a New-Generation War,” Military Thought, No. 4 (2013).
17. See Jolanta Darczewska, The Anatomy of Russian Information Warfare. The Crimean
Operation: A Case Study (Warsaw: Centre for Eastern Studies, May 2014).
18. Keir Giles, “Russia’s ‘New’ Tools for Confronting the West: Continuity and Innovation
in Moscow’s Exercise of Power,” Research Paper, Russia and Eurasia Programme,
Chatham House (March 2016), p. 10. Gerasimov himself has pointed as evidence of
this reality to the popular uprisings in the Arab world in 2011 and in Ukraine in 2014.
See “Little Green Men”: A Primer on Modern Russian Unconventional Warfare, Ukraine
2013–2014 (Unclassified version) (Fort Bragg, N.C.: The United States Army Special Operations Command,
2015), p. 16.
19. See James Risen, “The C.I.A. in Iran – A Special Report; How a Plot Convulsed Iran
in ’53 (and in ’79),” The New York Times (April 16, 2000).
20. See Adam Taylor, “Before ‘Fake News’ There Was Soviet ‘Disinformation’,” Washington
Post (November 26, 2016).
21. See Andrei Soldatov and Irina Borogan, The Red Web: The Struggle between Russia’s
Digital Dictators and the New Online Revolutionaries (New York: PublicAffairs, 2015),
p. 25.
22. One notable incident was the DDoS attack against liberal and independent websites
during the 2011 Duma elections. See Hal Roberts and Bruce Etling, “Coordinated
DDoS Attack During Russian Duma Elections,” Internet and Democracy Blog
(December 8, 2011), http://blogs.harvard.edu/idblog/2011/12/08/coordinated-ddos-
attack-during-russian-duma-elections/.
23. See Samuel P. Huntington, The Clash of Civilizations and the Remaking of World Order
(New York: Simon and Schuster, 1996).
24. Indeed, Huntington’s conception of a thin basis of universal values among countries in
different civilizations has affinities with Bull’s “pluralist” notion of international society.
See ibid., p. 54.
25. See Dina Newman, “Russian Nationalist Thinker Dugin Sees War with Ukraine,” BBC
News (July 10, 2014), http://www.bbc.co.uk/news/world-europe-28229785.
26. “Putin Deplores Collapse of USSR,” BBC (April 25, 2005).
27. “The ‘Gerasimov Doctrine’ and Russian Non-Linear War.”
296 NOTES to pp. 217–23
28. See James Sherr, “Ukraine and the Black Sea Region: The Russian Military Perspective,”
in Stephen Blank, ed., The Russian Military in Contemporary Perspective (Carlisle, PA:
Strategic Studies Institute, U.S. Army War College, forthcoming).
29. Timothy L. Thomas, “Russia’s Reflexive Control Theory and the Military,” Journal of
Slavic Military Studies, Vol. 17, No. 2 (2004), p. 237. See also Maria Snegovaya,
“Executive Summary: Putin’s Information Warfare in Ukraine: Soviet Origins of
Russia’s Hybrid Warfare,” Institute for the Study of War (September 2015).
30. Sherr, “Ukraine and the Black Sea Region.”
31. See Daisy Sindelar, “Inside Russia’s Disinformation Campaign,” Defense One (August
12, 2014); and Patrick Michael Duggan, “Strategic Development of Special Warfare in
Cyberspace,” Joint Force Quarterly, Vol. 79, No. 4 (October 2015).
32. Shaun Walker, “Salutin’ Putin: Inside a Russian Troll House,” The Guardian (April 2,
2015).
33. Sindelar, “Inside Russia’s Disinformation Campaign.”
34. For a review of Dugin’s school of thought on information warfare, see Little Green Men,
p. 16.
35. Officially, Clinton did not win the party’s nomination until the Democratic National
Convention selected her four days later, but by the time of the email leaks it was clear that
Sanders would likely lose. Trump had obtained the Republican Party’s nomination on July
19, 2016.
36. See “Background to ‘Assessing Russian Activities and Intentions in Recent US Elections’.”
See also Adam Meyers, “Danger Close: Fancy Bear Tracking of Ukrainian Field Artillery
Units,” Crowdstrike Blog (December 22, 2016), https://www.crowdstrike.com/blog/
danger-close-fancy-bear-tracking-ukrainian-field-artillery-units/; and Matt Flegenheimer,
“Countering Trump, Bipartisan Voices Strongly Affirm Findings on Russian Hacking,” The
New York Times (January 5, 2017).
37. See Krishnadev Calamur, “NATO Shmato?,” The Atlantic (July 21, 2016).
38. The deep cleavage within the Democratic Party pre-dated the hacking event. According to an
Economist/YouGov poll conducted four months earlier, 55 percent of Sanders supporters would
feel “dissatisfied” or “upset” if Clinton won the Democratic Party’s nomination. The party nomi-
nation boosted Clinton’s support within her party, but the cleavage persisted. In a survey the
month after the Convention, about one-third of Sanders sympathizers still rejected her. See “The
Economist/YouGov Poll,” YouGov UK (March 10–12, 2016), https://d25d2506sfb94s.cloudfront.
net/cumulus_uploads/document/055qdf83nv/econTabReport.pdf; David Weigel, “Sanders
Absolves Clinton on Hacked Emails, but Other Voices on the Left Are Angry,” Washington Post
(October 12, 2016); and Harry Enten, “About a Third of Bernie Sanders’ Supporters Still Aren’t
Backing Hillary Clinton,” FiveThirtyEight (August 8, 2016), http://fivethirtyeight.com/features/
about-a-third-of-bernie-sanders-supporters-still-arent-backing-hillary-clinton/?ex_
cid=538twitter.
39. See Sari Horwitz, “FBI Director James B. Comey Under Fire for His Controversial
Decision on the Clinton Email Inquiry,” Washington Post (October 29, 2016).
40. For his part, Trump dismissed this assessment as sore-losing bluster. See Harry Enten,
“How Much Did Wikileaks Hurt Hillary Clinton?,” FiveThirtyEight (December 23,
2016), https://fivethirtyeight.com/features/wikileaks-hillary-clinton/.
41. Sabrina Siddiqui, “Priebus and Manafort Seize on Wasserman Schultz DNC
Resignation,” The Guardian (July 25, 2016).
42. “Background to ‘Assessing Russian Activities and Intentions in Recent US Elections’:
The Analytic Process and Cyber Incident Attribution,” Report of the Director of
National Intelligence (January 6, 2017).
43. See Lorenzo Franceschi-Bicchierai, “How Hackers Broke Into John Podesta and Colin
Powell’s Gmail Accounts,” Motherboard (October 20, 2016), https://motherboard.vice.com/
en_us/article/how-hackers-broke-into-john-podesta-and-colin-powells-gmail-accounts.
44. See Jeffrey Carr, “FBI/DHS Joint Analysis Report: A Fatally Flawed Effort,” Medium
(December 30, 2016), https://medium.com/@jeffreycarr/fbi-dhs-joint-analysis-report-
a-fatally-flawed-effort-b6a98fafe2fa.
45. The origin of the Vermont power grid story was an anonymous source who spoke to the
Washington Post. The incident was not an attack, because it did not involve a disruption
of computer functions – merely the discovery of malware in a single computer. Nor was
it an action directed against the national power grid, because the infected machine was
not connected to the grid system of the affected Vermont company, Burlington
Electric Department. See Richard Chirgwin, “Russian ‘Grid Attack’ Turns Out to Be
a Damp Squib,” The Register (January 3, 2017).
46. Melissa Chan, “Julian Assange Says a ‘14-Year-Old Kid Could Have Hacked Podesta’
Emails,” Time (January 4, 2017).
47. Guccifer 2.0 divulged some material on outlets such as a WordPress blog, Gawker, and
DCLeaks. The DCLeaks website was established in June 2016 for the purpose of
publishing leaked emails belonging to prominent government and military figures in
the United States. See “ThreatConnect Identifies DC Leaks as Another Russian-
backed Influence Outlet,” ThreatConnect (August 12, 2016).
48. See Thomas Rid, “All Signs Point to Russia Being Behind the DNC Hack,” Motherboard
(July 25, 2016).
49. “Opening Statement by SASC Chairman John McCain at Hearing on Foreign Cyber
Threats to the United States,” Floor Statements, official webpage of U.S. Senator John
McCain (January 5, 2017), https://www.mccain.senate.gov/public/index.cfm/
floor-statements?ID=810C2B63-6714-4DF0-A337-5B46D9C6BBD9.
50. “Trump Questions Claims of Russian Hacking: ‘I Know Things Others Don’t’,” The
Guardian (January 1, 2017).
Elizabeth Weise, “Tech Crowd Goes Wild for Trump’s ‘400-Pound Hacker’,” USA Today
(September 27, 2016). Some security analysts and journalists also questioned Russia’s
authorship of the DNC hack. See, for example, Matt Taibbi, “Something About This
Russia Story Stinks,” Rolling Stone (December 30, 2016).
52. Confronted by the allegation that the Kremlin sought to help him win the election, the
president-elect replied: “I think it’s [the allegation] ridiculous . . . No, I don’t believe it
at all.” Elise Viebeck, “Trump Denies CIA Report that Russia Intervened to Help Him
Win Election,” Washington Post (December 11, 2016).
53. Ibid.
54. Polling data pointed to a Clinton victory throughout much of the post-convention
stage. See Vann R. Newkirk II, “What Went Wrong With the 2016 Polls?” The Atlantic
(November 9, 2016).
55. The title of the Clinton video was “How 100% of the Clintons’ ‘Charity’ Went to . . .
Themselves.” The title of the Trump video was “Trump Will Not Be Permitted To
Win” (that is, by the U.S. political establishment). For details on Russian disinformation
activities during the campaign, see “Background to ‘Assessing Russian Activities and
Intentions in Recent US Elections’,” pp. 3–4.
56. See “Background to ‘Assessing Russian Activities and Intentions in Recent US
Elections’.”
57. Ibid.
58. See Evan Osnos, David Remnick, and Joshua Yaffa, “Trump, Putin, and the New Cold
War,” The New Yorker (March 6, 2017).
59. “Background to ‘Assessing Russian Activities and Intentions in Recent US Elections’.”
60. Lauren Gambino, “Obama Orders Sanctions against Russia in Response to US Election
Interference,” The Guardian (December 29, 2016).
61. Ibid.
2. Author interview with a senior official in the British Cabinet Office (February 17,
2017).
3. “Chairman of the U.S. House Intelligence Committee Mike Rogers in Washington
Post Live: Cybersecurity 2014,” Washington Post (October 2, 2014).
4. See C. Matlack, M. Riley, and J. Robertson, “The Company Securing Your
Internet Has Close Ties to Russian Spies,” Bloomberg (March 15, 2015).
5. Huawei describes itself as an employee-owned “collective,” but some commentators
have questioned its freedom from Chinese state control. See Richard McGregor, The
Party: The Secret World of China’s Communist Rulers (New York: HarperCollins, 2010);
and M. Rogers and C. A. D. Ruppersberger, Investigative Report on the U.S. National
Security Issues Posed by Chinese Telecommunications Companies Huawei and ZTE, U.S.
House of Representatives 112th Congress, Permanent Select Committee on Intelligence
(October 8, 2012).
6. See Department of Defense Strategy for Operating in Cyberspace (Washington, D.C.: U.S.
Department of Defense, July 2011), p. 7.
7. See Cyber Security Strategy, 2014–2017 (Tallinn: Ministry of Economic Affairs and
Communications, 2014).
8. C. Green, “UK Becomes First Country to Disclose Plans for Cyber Attack Capability,”
Information Age (September 30, 2013).
9. Robert M. Lee, The Sliding Scale of Cyber Security – A SANS Analyst Whitepaper (Boston,
MA: SANS Institute, 2015), pp. 9–11.
10. Honeypots are decoy data or systems that a defender uses to lure an attacker in order
to study and disrupt his methods. See Loras R. Even, Honey Pot Systems Explained
(Boston, MA: SANS Institute, July 12, 2000). A sinkhole is a DNS server that supplies
false data to prevent the attacker from using the true domain name. See Guy Bruneau,
DNS Sinkhole (Boston, MA: SANS Institute, August 7, 2010).
11. See David D. Clark and Susan Landau, “Untangling Attribution,” Harvard National
Security Journal (March 2011).
12. See, for instance, Alexander Klimburg and Jason Healey, “Strategic Goals and Stakeholders,”
in Alexander Klimburg, ed., National Cyber Security Framework and Manual (Tallinn:
NATO Cooperative Cyber Defence Centre of Excellence, 2012), pp. 74–5 and 80; Tim
Maurer and Robert Morgus, Compilation of Existing Cybersecurity and Information Security
Related Definitions (New America, October 2012), p. 71; and Jay P. Kesan and Carol M.
Hayes, “Mitigative Counterstriking: Self-Defense and Deterrence in Cyberspace,” Harvard
Journal of Law and Technology, Vol. 25, No. 2 (Spring 2012), p. 460.
13. There are debates about the requirements of “authorization.” See Searching and Seizing
Computers and Obtaining Electronic Evidence in Criminal Investigations, third edition
(Washington, D.C.: Department of Justice Office of Legal Counsel, 2009).
14. It is unclear, however, whether intentional damage resulting from actions taken entirely
within one’s networks is lawful. See “Cyber-Surveillance Bill to Move Forward, Secretly”
(Washington, D.C.: Center for Democracy and Technology, March 4, 2015).
15. One notable case is Susan Clements-Jeffrey v. Absolute Software, involving a company
that used beacon technology to capture explicit data from a computer whose operator
did not know it was stolen. The court ruled against the company. See “Absolute Software
Settles Lawsuit Over Nude Photos,” Forbes (September 6, 2011).
16. See remarks by the Homeland Security Secretary, Janet Napolitano, in Joseph Menn,
“Hacked Companies Fight Back with Controversial Steps,” Reuters (June 18, 2012);
and remarks by Chairman of the U.S. House Intelligence Committee Mike Rogers in
“Washington Post Live: Cybersecurity 2014,” Washington Post (October 2, 2014).
17. Best Practices for Victim Response and Reporting of Cyber Incidents (Washington, D.C.:
Department of Justice, April 2015).
18. Michael S. Rogers, “Cyber Threats and Next-Generation Cyber Operations,” Keynote
Speech at the Annual Cybersecurity Technology Summit, AFCEA, Washington, D.C.
(April 2, 2015).
19. Interview with John Lynch, Steptoe Cyberlaw Podcast (January 21, 2016).
20. Hannah Kuchler, “Cyber Insecurity: Hacking Back,” The Financial Times (July 27,
2015).
21. See Tom Spring, “Spam Slayer: Bringing Spammers to Their Knees,” PCWorld (July 18,
2008).
22. See Kim Zetter, “FBI vs. Coreflood Botnet: Round 1 Goes to the Feds,” Wired (April
11, 2011).
23. See Brian Krebs, “ ‘Operation Tovar’ Targets ‘Gameover’ ZeuS Botnet, CryptoLocker
Scourge,” KrebsonSecurity (June 2, 2014).
24. Kuchler, “Cyber Insecurity.”
25. These consequences are positive from the perspective of private defenders and their
parent governments; other players may not share this view.
26. The attackers activated the “Wiper” malware on November 24; the FBI publicly
attributed the attack to North Korea on December 19. See “Update on Sony Investigation,”
Federal Bureau of Investigation (December 19, 2014), https://www.fbi.gov/news/
pressrel/press-releases/update-on-sony-investigation.
27. See Peter W. Singer and Allan Friedman, Cybersecurity and Cyberwar (Oxford: Oxford
University Press, 2014).
28. To share information derived from classified sources, the U.S. government resorts to
four select commercial service providers: AT&T, CenturyLink, Lockheed Martin, and
Verizon. See Andy Ozment, DHS’s Enhanced Cybersecurity Services Program Unveils New
“Netflow” Service Offering (Washington, D.C.: U.S. Department of Homeland Security,
January 26, 2016), https://www.dhs.gov/blog/2016/01/26/dhs%E2%80%99s-enhanced-
cybersecurity-services-program-unveils-new-%E2%80%9Cnetflow%E2%80%
9D-service-offering.
29. Ron Nixon, “Homeland Security Dept. Struggles to Hire Staff to Combat Cyberattacks,”
International New York Times (April 6, 2016).
30. See Oliver Wright, “GCHQ’s ‘Spook First’ Programme to Train Britain’s Most Talented
Tech Entrepreneurs,” The Independent (January 1, 2015); and Jamie Collier, “Proxy
Actors in the Cyber Domain” (unpublished paper).
31. See Christian Czosseck, Rain Ottis, and Anna-Maria Talihärm, “Estonia After the
2007 Cyber Attacks: Legal, Strategic and Organisational Changes in Cyber Security,”
in M. Warren, ed., Case Studies in Information Warfare and Security (Reading: Academic
Conferences and Publishing International Limited, 2013).
32. See Lior Tabansky and Itzhak Ben Israel, Striking with Bits? The IDF and Cyber-
Warfare (Cham: Springer, 2015).
33. See “National Guard to Stand Up 13 New Cyber Units in 23 States,” Army Times
(December 15, 2015).
34. See James Pattison, The Morality of Private War: The Challenge of Private Military
Companies and Security Companies (Oxford: Oxford University Press, 2014); and A.
Alexandra, D.-P. Baker, and M. Caparini, eds., Private Military Companies: Ethics,
Policies and Civil-Military Relations (London: Routledge, 2008).
35. See Sari Horwitz, Shyamantha Asokan, and Julie Tate, “Trade in Surveillance
Technology Raises Worries,” Washington Post (December 1, 2011).
36. See Lillian Ablon, Martin C. Libicki, and Andrea A. Golay, Markets for Cybercrime Tools
and Stolen Data (Santa Monica, CA: RAND, March 14, 2014).
37. See James Vincent, “Edward Snowden Claims Microsoft Collaborated with NSA and
FBI to Allow Access to User Data,” The Independent (July 12, 2013).
38. See Gabrielle Coppola, “Israeli Entrepreneurs Play Both Sides of the Cyber Wars,”
Bloomberg (September 29, 2014).
39. See Stephen Krasner, “State Power and the Structure of International Trade,” World
Politics, Vol. 28, No. 3 (April 1976), pp. 317–47; and Richard N. Rosecrance, The
Resurgence of the West: How a Transatlantic Union Can Prevent War and Restore the United
States and Europe (New Haven, CT: Yale University Press, 2013).
40. See Florian Egloff, “Cybersecurity and the Age of Privateering: A Historical Analogy,”
Cyber Studies Working Paper No. 1, University of Oxford (March 2015).
41. See Fernand Braudel, The Mediterranean and the Mediterranean World in the Age of
Philip II (Berkeley, CA: University of California Press, 1995).
42. Some thinkers question whether attribution is as hard as many observers believe it to
be. See Jon R. Lindsay, “Tipping the Scales: The Attribution Problem and the Feasibility
of Deterrence against Cyberattack,” Journal of Cybersecurity, Vol. 1, No. 1 (2015),
pp. 53–67; and Thomas Rid, “Attributing Cyber Attacks,” Journal of Strategic Studies
(2015), p. 38.
43. See Brandon Valeriano and Ryan C. Maness, Cyber War Versus Cyber Realities: Cyber
Conflict in the International System (Oxford: Oxford University Press, 2015).
44. Legal scholars who support the “natural law” tradition developed by Aquinas, Locke,
and Vattel have challenged the positivist doctrine’s position as the legitimate source of
international law. See James L. Brierly, The Basis of Obligations in International Law
(Oxford: Clarendon Press, 1958); and Hersch Lauterpacht, International Law and
Human Rights (London: Stevens and Sons, 1950).
10 Cyber Futures
1. See Lawrence Lessig, “The Path of Cyberlaw,” Yale Law Journal, Vol. 104, No. 7 (May 1995),
pp. 1,743–55; and Lawrence Lessig, “The Zones of Cyberspace,” Symposium: Surveying Law
and Borders, Stanford Law Review, Vol. 48 (1995–96), pp. 1,403–11.
2. See, for example, Rein Turn, “Privacy Protection and Security in Transnational Data
Processing Systems,” Stanford Journal of International Law, Vol. 16 (Summer 1980),
pp. 67–86.
3. See Elaine McArdle, “The New Age of Surveillance,” Harvard Law Bulletin (Spring
2016).
4. Parallel developments in the area of quantum cryptography will not neutralize this
threat. Computer specialists expect that the technique will reinforce only “symmetric
cryptography,” an encryption method that is more cumbersome than asymmetric
encryption and that is used to secure Top Secret government documents and other
highly sensitive data. Workshop on European Cybersecurity: Future Trends and Policy
Challenges, January 27–28, 2017, Cyber Studies Programme, University of Oxford.
Bibliography
Abbate, Janet. Inventing the Internet. Cambridge, MA: MIT Press, 1999
Ablon, Lillian, Martin C. Libicki, and Andrea A. Golay. Markets for Cybercrime Tools and
Stolen Data. Santa Monica, CA: RAND, March 14, 2014
Abrams, Steve. “Beyond Propaganda: Soviet Active Measures in Putin’s Russia,” Connections:
The Quarterly Journal, Vol. 15, No. 1 (2016)
Allison, Graham. Nuclear Terrorism: The Greatest Preventable Catastrophe. New York: Henry
Holt, 2005
Allison, Graham, Albert Carnesale, and Joseph S. Nye, eds. Hawks, Doves, and Owls: An
Agenda for Avoiding Nuclear War. New York: W. W. Norton, 1985
Allison, Graham and Philip D. Zelikow. The Essence of Decision: Explaining the Cuban
Missile Crisis, second edition. New York: Longman, 1999
Alvargonzález, David. “Multidisciplinarity, Interdisciplinarity, Transdisciplinarity, and
the Sciences,” International Studies in the Philosophy of Science, Vol. 25, No. 4 (2011),
pp. 387–403
Anderson, Benedict. Under Three Flags: Anarchism and the Anti-Colonial Imagination.
London: Verso, 2005
Andres, Richard B. “Cyber-Gang Warfare: State-Sponsored Militias Are Coming to a
Server Near You,” Foreign Policy (February 11, 2013)
Armstrong, A. G. “The Army Today,” RUSI Journal, Vol. 81, No. 523 (1936)
——. “The Army’s New Weapons and Equipment,” RUSI Journal, Vol. 84, No. 534 (1939)
Armstrong, David. Revolution and World Order: The Revolutionary State in International
Society. Oxford: Clarendon Press, 1993
Aron, Raymond. Penser la guerre, Clausewitz, Vol. 2, L’âge planétaire. Paris: Gallimard, 1976
Arquilla, John. “Cyberwar Is Already Upon Us,” Foreign Policy (February 27, 2012)
Arquilla, John and David Ronfeldt. Cyber War Is Coming! Santa Monica, CA: RAND, 1993
——. Networks and Netwars: The Future of Terror, Crime, and Militancy. Santa Monica, CA:
RAND, 2001
Ash, Timothy Garton. Free Speech: Ten Principles for a Connected World. New Haven, CT:
Yale University Press, 2016
Axelrod, Robert. The Evolution of Cooperation. New York: Basic Books, 1984
Axelrod, Robert and Rumen Iliev. “Timing of Cyber Conflict,” Proceedings of the National
Academy of Sciences of the United States of America, Vol. 111, No. 4 (January 2014),
pp. 1,298–1,303
Baek, Jieun. North Korea’s Hidden Revolution: How the Information Underground Is
Transforming a Closed Society. New Haven, CT: Yale University Press, 2015
Bairoch, Paul. “International Industrialization Levels from 1750 to 1980,” Journal of
European Economic History, Vol. 11 (1982)
Baker, Stewart, Natalia Filipiak, and Katrina Timlin. In the Dark: Crucial Industries Confront
Cyberattacks. Santa Clara, CA: Center for International and Strategic Studies and
McAfee, 2011
Barzashka, Ivanka. “Are Cyber-Weapons Effective?” The RUSI Journal, Vol. 158, No. 2 (2013)
Bellamy, Alex J. Global Politics and the Responsibility to Protect: From Words to Deeds. London:
Routledge, 2011
Bellovin, Steven M., Susan Landau, and Herbert S. Lin. “Limiting the Undesired Impact of
Cyber Weapons: Technical Requirements and Policy Implications,” unpublished paper
Bencsáth, Boldizsár. “Duqu, Flame, Gauss: Followers of Stuxnet,” RSA Conference Europe
2012
Berger, Thomas. “Changing Norms of Defense and Security in Japan and Germany,” in
Peter J. Katzenstein, ed., The Culture of National Security: Norms and Identity in World
Politics. New York: Columbia University Press, 1996
Bilge, Leyla and Tudor Dumitras. “Before We Knew It: An Empirical Study of Zero-Day
Attacks in the Real World,” Proceedings of the 2012 ACM Conference on Computer and
Communications Security, October 16–18, 2012
Bishop, Matt. “What is Computer Security?,” IEEE Security & Privacy (January/February
2003), pp. 67–69
Bittman, Ladislav. The KGB and Soviet Disinformation: An Insider’s View. Oxford: Pergamon,
1985
Blakely, Jason. “Is Political Science This Year’s Election Casualty?” The Atlantic (November
14, 2016)
Boghardt, Thomas. “Soviet Bloc Intelligence and Its AIDS Disinformation Campaign,”
Studies in Intelligence, Vol. 53, No. 4 (December 2009), pp. 1–24
Braudel, Fernand. The Mediterranean and the Mediterranean World in the Age of Philip II.
Berkeley, CA: University of California Press, 1995
Braun, Chaim and Christopher F. Chyba. “Proliferation Rings: New Challenges to the
Nuclear Nonproliferation Regime,” International Security, Vol. 29, No. 2 (Fall 2004),
pp. 5–49
Brierly, James L. The Basis of Obligations in International Law. Oxford: Clarendon Press,
1958
Brodie, Bernard. The Absolute Weapon. New York: Harcourt, 1946
——. “The Anatomy of Deterrence,” in Bernard Brodie, ed., Strategy in the Missile Age.
Princeton, N.J.: Princeton University Press, 1958
Bronk, Christopher and Eneken Tikk-Ringas. “The Cyber Attack on Saudi Aramco,”
Survival: Global Politics and Strategy, Vol. 55, No. 2 (2013), pp. 81–96
Brown, Michael E., Sean M. Lynn-Jones, and Steven E. Miller. Debating the Democratic
Peace. Cambridge, MA: MIT Press, 1996
Bruneau, Guy. DNS Sinkhole. Boston, MA: SANS Institute, August 7, 2010
Buchanan, Ben. The Cybersecurity Dilemma: Hacking, Trust and Fear Between Nations.
Oxford: Oxford University Press, 2017
Bull, Hedley. Justice in International Relations: Hagey Lectures. Waterloo: University of
Waterloo, 1983
——. The Anarchical Society: A Study of Order in World Politics, second edition. London:
Macmillan, 1995
Buzan, Barry. From International to World Society? English School Theory and the Social
Structure of Globalisation. Cambridge: Cambridge University Press, 2004
Buzan, Barry and Lene Hansen. The Evolution of International Security Studies. Cambridge:
Cambridge University Press, 2009
Buzan, Barry and Richard Little. International Systems in World History: Remaking the Study
of International Relations. Oxford: Oxford University Press, 2000
Geers, Kenneth. “The Challenge of Cyber Attack Deterrence,” Computer Law and Security
Review, Vol. 26, No. 3 (May 2010), pp. 298–303
George, Alexander L. and Andrew Bennett. Case Studies and Theory Development in the
Social Sciences. Cambridge, MA: MIT Press, 2005
Gerasimov, Valeriy. “Ценность науки в предвидении” [“The Value of Science is in Foresight”],
Военно-промышленный курьер (February 27, 2013)
Giles, Keir. “Russia’s ‘New’ Tools for Confronting the West: Continuity and Innovation in
Moscow’s Exercise of Power,” Research Paper, Russia and Eurasia Programme, Chatham
House (March 2016)
Gilpin, Robert. War and Change in World Politics. Cambridge: Cambridge University Press,
1981
Glaser, Charles L. and Chaim Kaufmann. “What is the Offense-Defense Balance and Can
We Measure It?” International Security, Vol. 22, No. 4 (Spring 1998), pp. 44–82
Goldsmith, Jack L. “Against Cyberanarchy,” University of Chicago Law Review, Vol. 65 (Fall
1998), p. 1,199
——. “WCIT-12: An Opinionated Primer and Hysteria-Debunker,” Lawfare (blog)
(November 30, 2012)
——. “Disconcerting U.S. Cyber Deterrence Troubles Continue,” Lawfare (September 15,
2015)
Goldsmith, Jack L. and Tim Wu. Who Controls the Internet? Illusions of a Borderless World.
New York: Oxford University Press, 2006
Guri, Mordechai, Yosef Solewicz, Andrey Daidakulov, and Yuval Elovici. “DiskFiltration:
Data Exfiltration from Speakerless Air-Gapped Computers via Covert Hard Drive
Noise,” arXiv.org (August 11, 2016)
Haggard, Stephan and Marcus Noland. “Engaging North Korea: The Efficacy of Sanctions
and Inducements,” in Etel Solingen, ed., Sanctions, Statecraft, and Nuclear Proliferation.
Cambridge: Cambridge University Press, 2012
Hansen, Lene and Helen Nissenbaum. “Digital Disaster, Cyber Security, and the
Copenhagen School,” International Studies Quarterly, Vol. 53, No. 4 (December 2009),
pp. 1,155–75
Harknett, Richard J. “The Logic of Conventional Deterrence and the End of the Cold
War,” Security Studies, Vol. 4, No. 1 (1994), pp. 86–114
——. “Toward a Logic of Offensive Persistence,” International Security (2017)
Harknett, Richard J. and Hasan B. Yalcin. “The Struggle for Autonomy: A Realist Structural
Theory of International Relations,” International Studies Review, Vol. 14 (2012),
pp. 499–521
Hathaway, Melissa E. “Leadership and Responsibility for Cybersecurity,” Georgetown
Journal of International Affairs (2012)
Hathaway, Melissa E. and Alexander Klimburg. “Preliminary Considerations: On National
Cyber Security,” in Alexander Klimburg, ed., National Cyber Security Framework Manual.
Tallinn: NATO CCD-COE, 2012
Healey, Jason. A Fierce Domain: Conflict in Cyberspace, 1986 to 2012. Arlington, VA: Cyber
Conflict Studies Association, 2012
Herrera, Geoffrey L. Technology and International Transformation: The Railroad, the Atom
Bomb, and the Politics of Technological Change. New York: State University of New York
Press, 2006
Herz, John H. International Politics in the Atomic Age. New York: Columbia University Press,
1959
Hinsley, F. H. Power and the Pursuit of Peace: Theory and Practice in the History of Relations
between States. Cambridge: Cambridge University Press, 1967
Ho, Park Young. “South and North Korea’s Views on the Unification of the Korean
Peninsula and Inter-Korean Relations,” Paper presented at the Second KRIS-Brookings
Joint Conference on Security and Diplomatic Cooperation between ROK and US for the
Unification of the Korean Peninsula (January 21, 2014)
Hobbes, Thomas. Leviathan, edited with an introduction by C. B. Macpherson. New York:
Penguin, 1968
Hoeber, Francis. Slow to Take Offense: Bombers, Cruise Missiles and Prudent Deterrence.
Washington, D.C.: Georgetown University Center for Strategic and International
Studies, 1977
Hoffmann, Stanley. The State of War: Essays on the Theory and Practice of International Politics.
New York: Praeger, 1965
——. Gulliver’s Troubles, Or the Setting of American Foreign Policy. New York: McGraw-Hill,
1968
——. Duties Beyond Borders: On the Limits and Possibilities of Ethical International Politics.
Syracuse, N.Y.: Syracuse University Press, 1981
Hogben, Giles, ed. Botnets: Detection, Measurement, Disinfection, and Defence. Heraklion:
European Network and Information Security Agency, 2011
Holsti, Kalevi J. “The Concept of Power in the Study of International Relations,” Background,
Vol. 7, No. 4 (February 1964)
Howard, Michael. War in European History. Oxford: Oxford University Press, 1976
——. Clausewitz: A Very Short Introduction. Oxford: Oxford University Press, 2002
Howard, Philip N., Aiden Duffy, Deen Freelon, Muzammil M. Hussain, Will Mari, and
Marwa Maziad. “Opening Closed Regimes: What Was the Role of Social Media During
the Arab Spring?,” (2011), available at SSRN: https://ssrn.com/abstract=2595096 or
http://dx.doi.org/10.2139/ssrn.2595096
Huntington, Samuel P. The Clash of Civilizations and the Remaking of World Order. New York:
Simon and Schuster, 1996
Hurrell, Andrew. On Global Order: Power, Values and the Constitution of International Society.
Oxford: Oxford University Press, 2007
Hurwitz, Roger. “Keeping Cool: Steps for Avoiding Conflict and Escalation in Cyberspace,”
Georgetown Journal of International Affairs (July 2015), pp. 17–23
Hwang, Jang-Yop and Selig Harrison. The Two Koreas. New York: Addison-Wesley, 1998
Jackson, Julian. The Fall of France: The Nazi Invasion of 1940. Oxford: Oxford University
Press, 2003
Jackson, Robert. Sovereignty: The Evolution of an Idea. Cambridge: Polity, 2007
James, Simon. Rome and the Sword: How Warriors and Weapons Shaped Roman History.
London: Thames and Hudson, 2011
Jervis, Robert. “Cooperation under the Security Dilemma,” World Politics, Vol. 30, No. 2
(January 1978), pp. 167–214
——. The Meaning of the Nuclear Revolution: Statecraft and the Prospect of Armageddon.
Ithaca, N.Y.: Cornell University Press, 1989
Jo, Dong-Joon and Erik Gartzke. “Determinants of Nuclear Weapons Proliferation,” Journal
of Conflict Resolution, Vol. 51, No. 1 (February 2007), pp. 167–94
Joffre, Joseph. Mémoires du Maréchal Joffre. Paris: Librairie Plon, 1932
Jupille, Joseph, James A. Caporaso, and Jeffrey T. Checkel. “Integrating Institutions:
Rationalism, Constructivism, and the Study of the European Union,” Comparative
Political Studies, Vol. 36, No. 7 (February–March 2003), pp. 7–40
Kahn, Herman. On Escalation: Metaphors and Scenarios. London: Pall Mall Press, 1965
Kahneman, Daniel. “A Perspective on Judgment and Choice: Mapping Bounded
Rationality,” American Psychologist, Vol. 58, No. 9 (September 2003), pp. 697–720
Kant, Immanuel. Perpetual Peace and Other Essays on Politics, History and Morals, trans. T.
Humphrey. Indianapolis, IN: Hackett, 1983
Keck, Margaret E. and Kathryn Sikkink. Activists Beyond Borders: Advocacy Networks in
International Politics. Ithaca, N.Y.: Cornell University Press, 1998
Kello, Lucas. “The Meaning of the Cyber Revolution: Perils to Theory and Statecraft,”
International Security, Vol. 38, No. 2 (Fall 2013), pp. 7–40
——. “Security,” in Joel Krieger, ed., The Oxford Companion to International Relations, third
edition. Oxford: Oxford University Press, 2014
——. “The Virtual Weapon: Dilemmas and Future Scenarios,” Politique étrangère, Vol. 79,
No. 4 (Winter 2014–15), pp. 139–50
——. “Cyber Security: Gridlock and Innovation,” in Beyond Gridlock. Cambridge: Polity,
2017
Kennedy, Paul. The Rise and Fall of the Great Powers: Economic Change and Military Conflict
from 1500 to 2000. New York: Vintage Books, 1987
Keohane, Robert O. and Joseph S. Nye, Jr. Power and Interdependence. New York: Longman,
1979
Keohane, Robert O. After Hegemony: Cooperation and Discord in the World Political Economy.
Princeton, N.J.: Princeton University Press, 1984
Kesan, Jay P. and Carol M. Hayes. “Mitigative Counterstriking: Self-Defense and Deterrence
in Cyberspace,” Harvard Journal of Law and Technology, Vol. 25, No. 2 (Spring 2012)
Kile, Shannon N. and Hans M. Kristensen. Trends in Nuclear Forces, 2016: SIPRI Fact
Sheet. Solna, Sweden: Stockholm International Peace Research Institute, June 2016
Kim, Samuel S. “North Korea’s Nuclear Strategy and the Interface between International
and Domestic Politics,” Asian Perspective, Vol. 34, No. 1 (2010), pp. 49–85
Kindleberger, Charles. The World in Depression, 1929–1939. Berkeley, CA: University of
California Press, 1996
King, Gary, Jennifer Pan, and Margaret E. Roberts. “How Censorship in China Allows
Government Criticism but Silences Collective Expression,” American Political Science
Review, Vol. 107, No. 2 (May 2013), pp. 326–43
Kissinger, Henry A. Nuclear Weapons and Foreign Policy. New York: Council on Foreign
Relations, 1957
Klarevas, Louis. “Political Realism: A Culprit for the 9/11 Attacks,” Harvard International
Review, Vol. 26, No. 3 (Fall 2004)
Klimburg, Alexander and Heli Tiirmaa-Klaar. Cybersecurity and Cyberpower: Concepts,
Conditions, and Capabilities for Cooperation for Action within the EU. Brussels: European
Parliament Directorate General for External Policies of the Union, Policy Department,
April 2011
Klimburg, Alexander and Jason Healey. “Strategic Goals and Stakeholders,” in Alexander
Klimburg, ed., National Cyber Security Framework and Manual. Tallinn: NATO
Cooperative Cyber Defence Centre of Excellence, 2012
Klotz, Audie and Cecelia M. Lynch. Strategies for Research in Constructivist International
Relations. London: M. E. Sharpe, 2007
Kopp, Carolyn. “The Origins of the American Scientific Debate over Fallout Hazards,”
Social Studies of Science, Vol. 9 (1979), pp. 404–06
Kort, Michael. The Columbia Guide to the Cold War. New York: Columbia University Press,
1998
Krasner, Stephen. “State Power and the Structure of International Trade,” World Politics,
Vol. 28, No. 3 (April 1976), pp. 317–47
——. Defending the National Interest: Raw Materials Investments and US Foreign Policy.
Princeton, N.J.: Princeton University Press, 1978
——. “Globalization and Sovereignty,” in David A. Smith, Dorothy J. Solinger, and Steven
C. Topik, eds., States and Sovereignty in the Global Economy. London: Routledge, 1999
Kratochwil, Friedrich. “The Embarrassment of Change: Neo-Realism as the Science of
Realpolitik without Politics,” Review of International Studies, Vol. 19, No. 1 ( January
1993), pp. 63–80
Kroenig, Matthew. “Importing the Bomb: Sensitive Nuclear Assistance and Nuclear
Proliferation,” Journal of Conflict Resolution, Vol. 53, No. 2 (April 2009)
Kuhn, Thomas S. The Structure of Scientific Revolutions, third edition. Chicago, IL: University
of Chicago Press, 1996
Laboratory of Cryptography and Systems Security. Duqu: A Stuxnet-Like Malware Found in
the Wild. Budapest: Budapest University of Technology and Economics, October 14, 2011
Lacroix, Justine and Kalypso A. Nicolaïdis, eds. European Stories: Intellectual Debates on
Europe in National Contexts. Oxford: Oxford University Press, 2011
Lake, David A. “The State and International Relations,” in Christian Reus-Smit and
Duncan Snidal, eds., The Oxford Handbook of International Relations. Oxford: Oxford
University Press, 2015
Langner, Ralph. “Stuxnet and the Hacker Nonsense,” Langner.com (blog) (February 14,
2011)
Majumdar, Dave. “America’s F-35 Stealth Fighter vs. China’s New J-31: Who Wins?” The
National Interest (September 25, 2015)
Manchester, William and Paul Reid. The Last Lion: Winston Spencer Churchill, Defender of
the Realm, 1940–1965. New York: Little, Brown, and Company, 2012
Manjikian, Mary M. “From Global Village to Virtual Battlespace: The Colonizing of the
Internet and the Extension of Realpolitik,” International Studies Quarterly, Vol. 54, No. 2
( June 2010), pp. 381–401
Mann, Edward. “Desert Storm: The First Information War?” Airpower Journal (Winter
1994)
Marcus, Gary. “Is ‘Deep Learning’ a Revolution in Artificial Intelligence?” The New Yorker
(November 25, 2012)
Martin, Lisa L. “Neoliberalism,” in Tim Dunne, Milja Kurki, and Steve Smith, eds.,
International Relations Theories: Discipline and Diversity. Oxford: Oxford University
Press, 2007
Massie, Robert K. Castles of Steel: Britain, Germany and the Winning of the Great War at Sea.
New York: Random House, 2003
Maurer, Tim and Robert Morgus. Compilation of Existing Cybersecurity and Information
Security Related Definitions. New America, October 2012
Mearsheimer, John J. Conventional Deterrence. Ithaca, N.Y.: Cornell University Press, 1983
——. “The Case for a Ukrainian Nuclear Deterrent,” Foreign Affairs (Summer 1993)
Medalia, Jonathan. Comprehensive Nuclear Test-Ban Treaty: Updated Safeguards and Net
Assessments. Washington, D.C.: Congressional Research Service, June 3, 2009
Mill, John Stuart. Essays on Some Unsettled Questions of Political Economy. London:
Longmans, Green, Reader, and Dyer, 1874
Mills, Michael J., Owen B. Toon, Julia Lee-Taylor, and Alan Robock. “Multidecadal Global
Cooling and Unprecedented Ozone Loss Following a Regional Nuclear Conflict,” Earth’s
Future, Vol. 2, No. 4 (April 2014), pp. 161–76
Minsky, Marvin. The Society of Mind. New York: Simon and Schuster, 1988
Minsky, Marvin and Seymour Papert. Perceptrons: An Introduction to Computational
Geometry. Cambridge, MA: The MIT Press, 1969
Mohan, Vivek and John Villasenor. “Decrypting the Fifth Amendment: The Limits of Self-
Incrimination in the Digital Era,” University of Pennsylvania Journal of Constitutional
Law Heightened Scrutiny, Vol. 15 (October 2012), pp. 11–28
Monte, Matthew. Network Attacks and Exploitation: A Framework. Indianapolis, IN: John
Wiley and Sons, 2015
Moore, Tyler, Richard Clayton, and Ross Anderson. “The Economics of Online Crime,”
Journal of Economic Perspectives, Vol. 23, No. 3 (Summer 2009), pp. 3–20
Moravcsik, Andrew. The Choice for Europe: Social Purpose and State Power from Messina to
Maastricht. Ithaca, NY: Cornell University Press, 1998
Morgan, Patrick M. Deterrence. Beverly Hills, CA: Sage, 1977
——. “Applicability of Traditional Deterrence Concepts and Theory to the Cyber Realm,”
in Proceedings of a Workshop on Deterring Cyberattacks: Informing Strategies and Developing
Options for U.S. Policy. Washington, D.C.: National Academies Press, 2010, pp. 55–76
Mueller, John. “Simplicity and the Spook: Terrorism and the Dynamics of Threat
Exaggeration,” International Studies Perspectives (2005)
——. “Think Again: Nuclear Weapons,” Foreign Policy (December 18, 2009)
Myers, Robert. Korea in the Cross Currents. New York: M. E. Sharpe, 2000
National Research Council of the National Academies. Terrorism and the Electric Power
Delivery System. Washington, D.C.: National Academies Press, 2012
Naughton, John. A Brief History of the Future: Origins of the Internet. London: Weidenfeld
and Nicolson, 1999
Newsom, David. “Foreign Policy and Academia,” Foreign Policy, Vol. 101 (Winter 1995)
Njølstad, Olav. Nuclear Proliferation and International Order: Challenges to the Non-
Proliferation Regime. New York: Routledge, 2011
Nye, Joseph S., Jr. The Future of Power. New York: PublicAffairs, 2011
——. “Nuclear Lessons for Cyber Security?” Strategic Studies Quarterly, Vol. 5, No. 4
(Winter 2011), pp. 18–38
——. “The Regime Complex for Managing Global Cyber Activities,” Global Commission on
Internet Governance, Issue Paper Series, No. 1 (May 2014)
——. “Deterrence and Dissuasion in Cyberspace,” International Security, Vol. 41, No. 3
(Winter 2016/2017), pp. 44–71
O’Connor, Damian P. Between Peace and War: British Defence and the Royal United Services
Institute, 1931–2010. London: Royal United Services Institute, 2011
Ogburn, William, ed. Technology and International Relations. Chicago, IL: University of
Chicago Press, 1949
Oren, Ido. “The Subjectivity of the ‘Democratic’ Peace: Changing U.S. Perceptions of
Imperial Germany,” in Michael E. Brown, Sean M. Lynn-Jones, and Steven E. Miller,
Debating the Democratic Peace: An International Security Reader. Cambridge, MA: The
MIT Press, 1996, pp. 263–300
Orlov, Aleksandr. The Secret History of Stalin’s Crimes. New York: Jarrolds, 1953
Osiander, Andreas. “Sovereignty, International Relations and the Westphalian Myth,”
International Organization, Vol. 55, No. 2 (April 2001), pp. 251–87
Owens, William A., Kenneth W. Dam, and Herbert S. Lin, eds. Technology, Policy, Law, and
Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities. Washington, D.C.:
National Academies Press, 2009
Oye, Kenneth A., ed. Cooperation Under Anarchy. Princeton, NJ: Princeton University Press,
1986
Pacepa, Ion Mihai and Ronald Rychlak. Disinformation: Former Spy Chief Reveals Secret
Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism.
Washington, D.C.: WND Books, 2013
Paget, François. Cybercrime and Hacktivism. Santa Clara, CA: McAfee, 2010
Park, Han S. North Korea: The Politics of Unconventional Wisdom. Boulder, CO: Lynne
Rienner, 2002
Park, John S. “North Korea, Inc.: Gaining Insights into North Korean Regime Stability
from Recent Commercial Activities,” United States Institute of Peace Working Paper,
Washington, D.C. (May 2009)
Park, John S. and Dong Sun Lee. “North Korea: Existential Deterrence and Diplomatic
Leverage,” in Muthiah Alagappa, ed., The Long Shadow: Nuclear Weapons and Security in
21st Century Asia. Stanford, CA: Stanford University Press, 2008
Pattison, James. The Morality of Private War: The Challenge of Private Military Companies
and Security Companies. Oxford: Oxford University Press, 2014
Persky, Joseph. “The Ethology of Homo Economicus,” Journal of Economic Perspectives, Vol. 9,
No. 2 (Spring 1995)
Poole, Matthew E. and Jason C. Schuette. “Cyber Electronic Warfare. Closing the
Operational Seams,” Marine Corps Gazette, Vol. 99, No. 8 (August 2015), pp. 60–62
Potter, William, ed. International Nuclear Trade and Nonproliferation: The Challenge of
Emerging Suppliers. Lexington, MA: Lexington Books, 1990
Powell, Robert. Nuclear Deterrence Theory: The Search for Credibility. Cambridge: Cambridge
University Press, 1990
Powers, Thomas. Heisenberg’s War: The Secret History of the German Bomb. New York: Da
Capo Press, 2000
Ramsay, William M. The Imperial Peace: An Ideal in European History. Oxford: The
Clarendon Press, 1913
Rasteger, Ryan. “How Many Nukes Would It Take to Render Earth Uninhabitable?,” Global
Zero: A World Without Nuclear Weapons ( July 9, 2015)
Rattray, Gregory J. Strategic Warfare in Cyberspace. Cambridge, MA: MIT Press, 2001
Reed, Thomas C. At the Abyss: An Insider’s History of the Cold War. New York: Random
House, 2005
Reus-Smit, Christian. The Moral Purpose of the State: Culture, Social Identity, and Institutional
Rationality in International Relations. Princeton, N.J.: Princeton University Press, 1999
Reveron, Derek S., ed. Cyberspace and National Security: Threats, Opportunities, and Power in
a Virtual World. Washington, D.C.: Georgetown University Press, 2012
Rid, Thomas. “Cyber War Will Not Take Place,” Journal of Strategic Studies, Vol. 35, No. 1
(February 2012), pp. 5–32
——. “Think Again: Cyberwar,” Foreign Policy, Vol. 192 (March/April 2012), pp. 80–84
——. Cyber War Will Not Take Place. London: C. Hurst and Co., 2013
——. “All Signs Point to Russia Being Behind the DNC Hack,” Motherboard ( July 25, 2016)
Rid, Thomas and Ben Buchanan. “Attributing Cyber Attacks,” Journal of Strategic Studies,
Vol. 38, No. 1–2 (2015), pp. 4–37
Roberts, Hal and Bruce Etling. “Coordinated DDoS Attack During Russian Duma
Elections,” Internet and Democracy Blog, Berkman Center for Internet and Society,
Harvard University (December 8, 2011)
Roesch, Werner. Bedrohte Schweiz: Die deutschen Operationsplanungen gegen die Schweiz im
Sommer/Herbst 1940 und die Abwehrbereitschaft der Armee im Oktober 1940. Frauenfeld:
Huber, 1986
Rogin, Josh. “NSA Chief: Cybercrime Constitutes the ‘Greatest Transfer of Wealth in
History’,” Foreign Policy ( July 9, 2012)
Rogowski, Ronald. “Institutions as Constraints on Strategic Choice,” in David A. Lake
and Richard Powell, eds., Strategic Choice and International Relations. Princeton, N.J.:
Princeton University Press, 1999
Rosecrance, Richard. The Resurgence of the West: How a Transatlantic Union Can Prevent War
and Restore the United States and Europe. New Haven, CT: Yale University Press, 2013
Roskill, Stephen W. Naval Policy between the Wars. London: Walker, 1968
Rostami, Masoud, Farinaz Koushanfar, Jeyavijayan Rajendran, and Ramesh Karri.
“Hardware Security: Threat Models and Metrics,” ICCAD ’13 Proceedings of the
International Conference on Computer-Aided Design, IEEE (November 18–21, 2013),
pp. 819–23
Ruggie, John G. “International Responses to Technology: Concepts and Trends,”
International Organization, Vol. 29, No. 3 (Summer 1975), pp. 557–83
——. “Continuity and Transformation in the World Polity: Toward a Neorealist Synthesis,”
World Politics, Vol. 35, No. 2 ( January 1983)
Russett, Bruce M. “The Calculus of Deterrence,” Journal of Conflict Resolution, Vol. 7, No. 2
( June 1963), pp. 97–109
——. Grasping the Democratic Peace: Principles for a Post-Cold War World. Princeton, N.J.:
Princeton University Press, 1993
Sagan, Scott D. “Why Do States Build Nuclear Weapons? Three Models in Search of a
Bomb,” International Security, Vol. 21, No. 3 (Winter 1996/1997)
——. “The Commitment Trap: Why the United States Should Not Use Nuclear Weapons
to Deter Biological and Chemical Weapons Attacks,” International Security, Vol. 24, No.
4 (Spring 2000), pp. 85–115
Sandee, Michael. GameOver Zeus: Background on the Badguys and the Backends. Delft:
Fox-IT, July 2015
Sanger, David E. Confront and Conceal: Obama’s Secret Wars and Surprising Use of American
Power. New York: Crown, 2012
Schaefer, Bernd. “North Korean ‘Adventurism’ and China’s Long Shadow, 1966–1972,”
Cold War International History Project Working Paper, Woodrow Wilson International
Center for Scholars (2004)
Schelling, Thomas C. The Strategy of Conflict. Cambridge, MA: Harvard University Press,
1960
——. Arms and Influence. New Haven, CT: Yale University Press, 1966
Schelling, Thomas C. and Morton H. Halperin. Strategy and Arms Control. Washington,
D.C.: Twentieth Century Fund, 1961
Schmitt, Michael N., ed. Tallinn Manual on the International Law Applicable to Cyber
Warfare. Cambridge: Cambridge University Press, 2013
Schneier, Bruce. “When Does Cyber Spying Become a Cyber Attack,” Defense One (March
10, 2014)
——. “We Still Don’t Know Who Hacked Sony,” Schneier on Security ( January 5, 2015)
——. “Someone Is Learning How to Take Down the Internet,” Lawfare
(September 13, 2016)
Schoen, Fletcher and Christopher J. Lamb. Deception, Disinformation, and Strategic
Communications: How One Interagency Group Made a Major Difference. Washington, D.C.:
National Defense University Press, June 2012
Schwartz, Benjamin E. Right of Boom: What Follows an Untraceable Nuclear Attack? New
York: Overlook Press, 2015
Segal, Adam. “The Code Not Taken: China, the United States, and the Future of Cyber
Espionage,” Bulletin of the Atomic Scientists, Vol. 69, No. 5 (2013), pp. 38–45
——. The Hacked World Order: How Nations Fight, Trade, Maneuver, and Manipulate in the
Digital Age. New York: PublicAffairs, 2015
Sheldon, Robert and Joe McReynolds. “Civil-Military Integration and Cybersecurity: A
Study of Chinese Information Warfare Militias,” in Jon R. Lindsay, Tai Ming Cheung,
and Derek S. Reveron, eds., China and Cybersecurity: Espionage, Strategy, and Politics in the
Digital Domain. New York: Oxford University Press, 2015
Sherr, James. “Ukraine and the Black Sea Region: The Russian Military Perspective,” in
Stephen Blank, ed., The Russian Military in Contemporary Perspective. Carlisle, PA:
Strategic Studies Institute, U.S. Army War College, forthcoming
Shoemaker, Dan, Anne Kohnke, and Ken Sigler. A Guide to the National Initiative for
Cybersecurity Education (NICE) Cybersecurity Workforce Framework (2.0). London: CRC
Press, 2016
Siboni, Gabi and Zvi Magen. “The Cyber Attack on the Ukrainian Electrical Infrastructure:
Another Warning,” INSS Insights, No. 798 (February 17, 2016)
Simon, Herbert A. Administrative Behavior: A Study of Human Decision-Making Processes in
Administrative Organization. New York: Macmillan, 1947
Singer, Peter W. Wired for War: The Robotics Revolution and Conflict in the 21st Century. New
York: Penguin, 2009
Singer, Peter W. and Allan Friedman. Cybersecurity and Cyberwar: What Everyone Needs to
Know. Oxford: Oxford University Press, 2014
Skolnikoff, Eugene B. The Elusive Transformation: Science, Technology, and the Evolution of
International Politics. Princeton, N.J.: Princeton University Press, 1993
Slayton, Rebecca. “What Is the Cyber Offense-Defense Balance? Conceptions, Causes, and
Assessment,” International Security, Vol. 41, No. 3 (Winter 2016/2017), pp. 72–109
Smeets, Max. “A Matter of Time: On the Transitory Nature of Cyberweapons,” Journal of
Strategic Studies (February 2017), pp. 1–28
Smith, Merritt R. and Leo Marx, eds. Does Technology Drive History? The Dilemma of
Technological Determinism. Cambridge, MA: MIT Press, 1994
Snegovaya, Maria. “Executive Summary: Putin’s Information Warfare in Ukraine:
Soviet Origins of Russia’s Hybrid Warfare,” Institute for the Study of War (September
2015)
Snyder, Glenn H. Deterrence and Defense. Princeton, N.J.: Princeton University Press, 1961
Soldatov, Andrei, and Irina Borogan. The Red Web: The Struggle Between Russia’s Digital
Dictators and the New Online Revolutionaries. New York: PublicAffairs, 2015
Squassoni, Sharon A. Weapons of Mass Destruction: Trade Between North Korea and Pakistan.
Washington, D.C.: Congressional Research Service, 2006
Steury, Donald P. Intentions and Capabilities: Estimates on Soviet Strategic Forces, 1950–1983.
Langley, VA: Center for the Study of Intelligence, Central Intelligence Agency, 1996
Stoakes, Geoffrey. Hitler and the Quest for World Dominion. Leamington Spa: Berg, 1986
Stoll, Cliff. The Cuckoo’s Egg: Tracking a Spy through the Maze of Computer Espionage. New
York: Doubleday, 1989
Sturtevant, A.H. “Social Implications of the Genetics of Man,” Science, Vol. 120 (September
10, 1954)
Symantec. W32.Duqu: The Precursor to the Next Stuxnet, ver. 1.4 (November 23, 2011)
Szilard, Leo and T. H. Chalmers. “Detection of Neutrons Liberated from Beryllium by
Gamma Rays: A New Technique for Inducing Radioactivity,” Nature, Vol. 134 (September
1934), pp. 494–95
Tabansky, Lior and Itzhak Ben Israel. Striking with Bits? The IDF and Cyber-Warfare.
Cham: Springer, 2015
Tehranipoor, Mohammad and Farinaz Koushanfar. “A Survey of Hardware Trojan
Taxonomy and Detection,” IEEE Design and Test of Computers, Vol. 27, No. 1 (2010), pp.
10–25
Thomas, Timothy L. “Russia’s Reflexive Control Theory and the Military,” Journal of Slavic
Military Studies, Vol. 17, No. 2 (2004)
Thompson, Kenneth. “Reflections on Trusting Trust,” Communications of the ACM, Vol. 27,
No. 8 (August 1984), pp. 761–63
Thucydides. History of the Peloponnesian War, trans. Steven Lattimore. Indianapolis, IN:
Hackett, 1998
Tipton, Harold F. and Micki Krause, eds. Information Security Management Handbook, sixth
edition, Vol. 4. New York: Auerbach Publications, 2010
Tor, Uri. “ ‘Cumulative Deterrence’ as a New Paradigm for Cyber Deterrence,” Journal of
Strategic Studies, Vol. 40, No. 1 (2017), pp. 92–117
Townsend, Anthony M. “Seoul: Birth of a Broadband Metropolis,” Environment and
Planning B: Urban Analytics and City Science, Vol. 34, No. 3 (2007), pp. 396–413
Turing, Alan M. “On Computable Numbers, with an Application to the Entscheidungsproblem,”
Proceedings of the London Mathematical Society, Second Series, Vol. 42 (1937 [delivered to
the Society in 1936]), pp. 230–65
Turn, Rein. “Privacy Protection and Security in Transnational Data Processing Systems,”
Stanford Journal of International Law, Vol. 16 (Summer 1980), pp. 67–86
Ulam, Adam B. Bolsheviks: The Intellectual, Personal and Political History of the Triumph of
Communism in Russia. New York: Macmillan, 1968
——. Stalin: The Man and His Era. London: Tauris, 1989
U.S. Cyber Consequences Unit (US-CCU). Overview by the US-CCU of the Cyber Campaign
against Georgia in August 2008, Special Report, US-CCU (August 2009)
U.S. Defense Science Board. Report of the Defense Science Board Task Force on Preventing and
Defending Against Clandestine Nuclear Attack. Washington, D.C.: Office of the Under
Secretary of Defense for Acquisition, Technology, and Logistics, June 2004
U.S. Department of Defense. Department of Defense Strategy for Operating in Cyberspace.
Washington, D.C.: U.S. Department of Defense, July 2011
U.S. Senate. Study of Airpower: Hearings before the Subcommittee on the Air Force
of the Committee on Armed Services. Washington, D.C.: Government Printing Office,
1956
Valeriano, Brandon and Ryan C. Maness. “The Dynamics of Cyber Conflict between Rival
Antagonists, 2001–11,” Journal of Peace Research, Vol. 51, No. 3 (2014)
——. Cyber War Versus Cyber Realities: Cyber Conflict in the International System. Oxford:
Oxford University Press, 2015
Van Creveld, Martin. Technology and War: From 2000 B.C. to the Present. New York:
Macmillan, 1989
Van Evera, Stephen. “The Cult of the Offensive and the Origins of the First World War,”
International Security, Vol. 9, No. 1 (Summer 1984)
Venter, H. S. and J. H. P. Eloff. “A Taxonomy of Information Security Technologies,”
Computers and Security, Vol. 22, No. 4 (May 2003), pp. 299–307
Vincent, John. Human Rights and International Relations: Issues and Responses. Cambridge:
Cambridge University Press, 1996
Wall, David S. Cybercrime: The Transformation of Crime in the Information Age. Cambridge:
Polity Press, 2007
Waller, J. Michael. Strategic Influence: Public Diplomacy, Counterpropaganda, and Political
Warfare. Washington, D.C.: Institute of World Politics Press, 2009
Walt, Stephen M. Revolution and War. Ithaca, N.Y.: Cornell University Press, 1996
——. “The Enduring Relevance of the Realist Tradition,” in Ira Katznelson and Helen V.
Milner, eds., Political Science: State of the Discipline. New York: W. W. Norton, 2002
——. “The Relationship between Theory and Policy in International Relations,” Annual
Review of Political Science, Vol. 8 (2005)
——. “Is the Cyber Threat Overblown?” Stephen M. Walt blog, Foreign Policy (March 30,
2010)
——. “What Does Stuxnet Tell Us about the Future of Cyber-Warfare?” Stephen M. Walt
blog, Foreign Policy (October 7, 2010)
Waltz, Kenneth N. Man, the State, and War: A Theoretical Analysis. New York: Columbia
University Press, 1959
——. Theory of International Politics. New York: McGraw-Hill, 1979
Waugh, Steven. Essential Modern World History. Cheltenham: Thomas Nelson, 2001
Weart, Spencer R. Never at War: Why Democracies Will Not Fight Each Other. New Haven,
CT: Yale University Press, 1998
Weinberger, Sharon. “Computer Security: Is This the Start of Cyberwarfare?” Nature,
Vol. 474 (2011), pp. 142–45
Welsh, Jennifer M. “Implementing the ‘Responsibility to Protect’: Where Expectations
Meet Reality,” Ethics and International Affairs, Vol. 24, No.4 (Winter 2010), pp. 415–30
Wendt, Alexander. Social Theory of International Politics. Cambridge: Cambridge University
Press, 1999
Whyte, Christopher. “Power and Predation in Cyberspace,” Strategic Studies Quarterly
(Spring 2015), pp. 100–18
——. “Ending Cyber Coercion: Computer Network Attack, Exploitation and the Case of
North Korea,” Comparative Strategy, Vol. 35, No. 2 ( July 2016), pp. 93–102
Wight, Martin. International Theory: The Three Traditions. London: Leicester University
Press, 1991
Wilson, Geoff and Will Saetren. “Quite Possibly the Dumbest Military Concept Ever: A
‘Limited’ Nuclear War,” The National Interest (May 27, 2016)
Wohlstetter, Albert. “The Delicate Balance of Terror,” in Henry A. Kissinger, ed., Problems
of National Strategy: A Book of Readings. New York: Praeger, 1965
Wolfers, Arnold. “ ‘National Security’ as an Ambiguous Symbol,” Political Science Quarterly,
Vol. 67 (December 1952), pp. 481–502
——. Discord and Collaboration. Baltimore, MD: Johns Hopkins Press, 1962
Youngblood, Dawn. “Interdisciplinary Studies and the Bridging Disciplines: A Matter of
Process,” Journal of Research Practice, Vol. 3, No. 2 (2007)
Zumwalt, Elmo R. “Correspondence: An Assessment of the Bomber-Cruise Missile
Controversy,” International Security, Vol. 2, No. 1 (Summer 1977), pp. 47–58
Index
Clinton, Hillary 41, 131, 221, 222, 223, 224, 225–6
Compellence 71
Congress of Disciplines 17, 27–9, 42–4, 80, 142, 247
Crowdstrike 188, 237
Cruz, Ted 64
Cyber archipelago 45, 46, 63, 165
  Definition, 45
  Iran, 63
  Penetration of, 46, 165
Cyber domain
  Definition 46
Cyber exploitation 45, 53–5, 56, 142, 207, 230
  Definition, 54
  and Kompromat 53 see also Kompromat
Cyberattack
  Customized 23–57, 160–92
  Definition, 51
  Generalized 23–57, 160–92
Cybercrime 45, 256
  Definition 50
Cybersecurity 4, 25, 31, 32, 45–6, 47, 71, 138, 160–92, 213, 229, 231, 236, 239
  Definition 46–7
Cyberspace
  Definition 45–6
Cyberwar 3, 40, 44, 51, 52, 58, 62, 121, 125, 196, 206
  Definition, 52
Cyberweapons 8, 45, 47, 55, 123, 129, 135–6, 169, 174, 189, 203, 230, 240, 243, 245
  Botnets 50, 130, 174–5, 176, 177, 184, 185, 217, 237, 253, 255
  Design emulation 168, 169, 171
  Line replication 168, 169, 171
  Phishing 48, 135, 224
  Zero-day vulnerabilities 48, 68, 73, 76, 136, 168, 169, 170, 173, 174, 176, 177, 198, 200, 240
Dark Web 45, 176, 177
Dell 237
Dempsey, Martin 61
Deterrence 195–211, 252, 255
  Accretional principle 205–11
  Calculated ambiguity 65
  Cross-domain 15, 199, 205
  Definition 56
  Denial mechanism 18, 196–7, 197–205
  Entanglement mechanism 18, 97
  Equivalence principle 56, 65, 125, 142, 196, 198–9, 202, 203, 204, 205, 206, 207
  Punctuated 18, 196–7, 205–11, 255
  Punishment mechanism 104, 158, 197–200, 206–11
  Total 196
Distributed denial of service (DDoS) attack 7, 47, 48, 50, 52, 62, 130, 137, 140, 141, 174, 175, 177, 189, 199, 204
Doty, Paul 42
Dowding, Hugh 134
Ducaru, Sorin 187, 191, 199, 205, 208
Dugin, Alexandr 218
Duqu 170–1, 173, 177
Dyn 175, 176, 189
Eisenhower, Dwight 42, 249
Electronic warfare (EW) 152
Engels, Friedrich 90
Ergma, Ene 187, 189
Estonia 66, 102, 163, 183, 217, 220–1, 240
  Cyberattacks (2007) 13, 52, 55, 57, 62, 64, 66, 120, 130, 175–6, 183, 184–5, 186–8, 189, 199, 204, 207, 212, 213, 220–1, 253
  Cyber Defense League (Küberkaitseliit) 188, 230, 239
  Mutual Assistance Treaty with Russia 130, 185, 199
European Union (EU) 185, 190
European Atomic Commission 103
F-Secure 170
FireEye 138
First World War 36, 102, 109, 214
Flame 171–2, 173, 177
France 88–9, 91, 97, 108, 125, 134, 135, 201, 251
French Army 109
Fukuyama, Francis (“End of History”) 212
Gaddafi, Muammar 62, 66, 88, 121, 123, 142, 176, 183, 217, 219, 255
Georgia cyberattacks (2008) 55, 62, 66, 175, 176, 183, 213, 255
Gerasimov, Valery 216, 218, 219
“Gerasimov doctrine” 216
Gilpin, Robert 87, 90
Goldsmith, Jack 9, 161, 180, 206
Google 48, 74, 181, 239, 240, 241
Gorbachev, Mikhail 145
Graham, Lindsey 129
Guderian, Heinz 134
Halperin, Morton 132
Hammond, Philip 38
Hitler, Adolf 89, 94–5, 105, 144, 218
Stuxnet 17, 36, 38, 39, 52, 55, 65, 67, 69, 71, 76, 120, 123–4, 125, 127, 129, 135, 136, 137, 161, 167, 168, 169, 170–1, 173, 177, 183, 198, 201, 241, 253
Submarine warfare 36, 102–3, 121
Sudan 75, 171, 190
SWIFT 154
Syria 38, 140, 148, 171, 173
Syrian Electronic Army 139–40
Systemic disruption 13, 14, 15, 17, 18, 82, 83, 85–6, 86–90, 92, 96, 100–7, 112, 115, 119–42, 251, 252, 253 see also technological revolution, third order
Systemic revision 13, 14, 17, 83, 85–6, 90–2, 95, 96, 108–10, 115, 143–59, 200, 252, 253 see also technological revolution, second order
Systems change 13, 18, 83, 86, 92–5, 100, 110–14, 115, 144, 160–92, 200, 239, 245, 246, 252, 253, 254 see also technological revolution, first order
Technological life cycle 98–100, 128
  Technology-in-being 98–9, 128
  Technology-in-the-making 98, 257
Technological revolution 80–115, 119–42, 143–59, 160–92
  First order, definition 13, 92–5
  Second order, definition 13, 90–2
  Third order, definition 13, 86–90
Tenet, George 74
Thompson, Kenneth 28
Thucydides 90, 100–1
Tiido, Harri 186, 199
Tokyo Stock Exchange 179
Trotsky, Leon 90, 144, 214
Trump, Donald 41, 131, 218, 221–2, 223, 224, 225, 226, 227
Turing, Alan 4–5, 28
UAVs 54
Ukraine 63, 213, 217–18, 219, 220, 226
United Nations (UN) 135, 145, 180
  Group of Government Experts (UNGGE) 157
  International Telegraph Union (ITU) 180
United States 31, 39, 47, 53, 55, 64, 66, 67, 75–6, 96, 97, 98, 101, 102, 103, 104, 111, 113, 120, 125, 126, 127, 135, 140, 146, 147, 149, 150, 153, 156, 166, 170, 172, 179, 180, 185, 200, 203, 205, 217, 221, 223, 225, 226, 230, 231, 235, 240, 241, 256
  Central Intelligence Agency (CIA) 74, 141, 217
  Computer Fraud and Abuse Act 178, 234
  Cyber Command 137, 207
  Cybersecurity Information Sharing Act (CISA) 235, 238
  Democratic Party 48, 53, 70, 130, 221–8
  Department of Defense 70, 98, 125, 140, 164, 179, 189, 231
  Department of Homeland Security 39, 66
  Department of Justice 235, 236
  Department of State 70, 77
  Electrical grid 66
  Federal Bureau of Investigation (FBI) 70, 71, 181–2, 223, 224, 236, 237, 255
  Joint Strike Fighter F-35 72
  National Security Agency (NSA) 76, 139, 141, 198, 236, 239
  National Security Council (NSC) 167
  Office of Personnel Management (OPM) 69
  Republican Party 64, 77, 108, 131, 221, 223
  Strategic Command 103
United States Steel Corporation 188
Unpeace 17, 18, 78, 79, 145, 146, 196, 205, 206, 207, 208, 211, 213, 216, 220, 226, 249–50, 255, 256
  Definition 78
URL 45
Vatican City 185
Verisign 189
Walt, Stephen 31, 35, 55, 89
Waltz, Kenneth 83, 87, 93, 101, 162, 166
Wasserman Schultz, Debbie 131, 221, 223
Westphalia, Treaty of 94
Wight, Martin 90, 95
WikiLeaks 144, 221, 223, 224, 225
Wilson, Woodrow 84
Wohlstetter, Albert 43
Wolfers, Arnold
World Wide Web, definition 45
Yahoo 70, 74
Zero-day vulnerabilities 48, 68, 73, 76, 136, 168, 169, 170, 173, 174, 176, 177, 198, 200, 240