LAWS and Yes
Week 2 tutorials
Welcome to CS0888: AI & New Tech Law
3 tasks:
1. Nametags for the semester—the coloured papers
• Print the name you’d like to be called (e.g., MARK)
• Use my markers, not pen, so it’s legible from a distance
• Each week I re-collect them, then redistribute them the following week
2. Seating chart for the semester is being passed around
• Please sit where you’d like to stay
3. Download handout for today’s tutorial
• NTULearn >> Tutorial handouts >> “wk2 LAWS handout”
Lethal Autonomous Weapons Systems & rogue AGI
• LAWS
• Definition & state of the art
• Arguments against/for LAWS
• Examples & options
• Contrast rogue AI & the control problem
Lethal Autonomous Weapons Systems: definition
State of the art: Some unknowns
• Fully autonomous weapons researched, probably developed
• Not yet prominently deployed
• Include:
• Offensive weapons for attacking
• Defensive weapons for repelling attacks
State of the art: Israel’s Iron Dome system
• AWS that intercepts (detects & destroys) unmanned rockets targeted at populated areas
• High degree of autonomy—but human at least “on the loop”
State of the art: Drones
• Unmanned (uncrewed) Aerial Vehicles (drones) with varying mixes of autonomy & human operators are routinely used:
• By both Russia & Ukraine (Feb 2022–now)
• By both sides in the Israel–Hamas war (Oct 2023–now)
• By the US to assassinate Iranian General Qassem Soleimani, 2020, with a Predator drone
• For surveillance & as weapons (“kamikaze” or recoverable)
Lethal Autonomous Weapons Systems & rogue AI
• LAWS
• Definition & state of the art
• Arguments against/for LAWS
• Examples & options
• Contrast rogue AI & the control problem
Argument against (L)AWS
• Must maintain “meaningful human control”
• Human “in the loop”
• Loop: the decision-making process
• Terms indicating that a human is involved, but less so: human “nearby the loop” or “on the loop”
Afghan scout scenario
• Girl sent to report US/allied positions to Taliban soldiers
• She was a legal target under int’l law of war
• Soldiers’ mercy based on ethical (not legal) considerations
Lethal Autonomous Weapons Systems
• LAWS
• Definition & state of the art
• Arguments against/for LAWS
• Examples & options
• Contrast rogue AI & the control problem
LAWS: A current threat
• “a psychopathic leader in control of a sophisticated ANI system portends a far greater risk in the near term”…
• …than a rogue AGI
• Amir Husain, security-software entrepreneur
LAWS example: Swarming drones
• “as flying robots become smaller, their manoeuvrability increases…. They have a shorter range, yet they [could] carry a lethal payload—perhaps a one-gram shaped charge to puncture the human cranium.”
• “[O]ne can expect [them to be] deployed in the millions, the agility & lethality of which will leave humans utterly defenceless.”
• Video: 5:17 on
LAWS option 1: Regulate development, use, possession with int’l treaties
• Restrict what machines can do, globally—through an international treaty
• E.g., prohibit fully autonomous combat decisions
• NGO Campaign to Stop Killer Robots lobbying for a global ban; Int’l Red Cross support
• No United Nations agreement on LAWS to date, after a decade of effort
• Opposed: Australia, Israel, Russia, UK, US
LAWS option 1: Regulate development, use, possession with int’l treaties: Precedents
• Compare int’l agreements on Weapons of Mass Destruction
• “a class of weaponry with the potential to, in a single moment, kill millions of civilians, jeopardize the natural environment, & fundamentally alter the world & the lives of future generations through their catastrophic effects” (a UN definition)
• Nuclear, chemical or biological weapons—e.g., nuclear test ban treaties
• …& agreements on certain other weapons
• Permanently blinding lasers
• Landmines
LAWS option 1: Regulate development, use, possession with int’l treaties: Precedents
• Compare nuclear non-proliferation treaties:
• No spread of weapons beyond the 5 nations that already were “nuclear weapons states” before 1967: US, Russia, UK, France, China
• The 5 must share nuclear energy tech (for non-military use) with the 191 state parties to the treaty (including SG)
• But the 191 can’t access nuclear weapons
• Non-parties India, Pakistan, Israel & North Korea have nukes
US:
• One can’t regulate what doesn’t exist yet
Small state:
• “We are all aware that a developing country does not have the technology that we are discussing…. How are we going to defend ourselves?”
• Representative of Cuban delegation
As reported in New York Times, “A.I. is making it easier to kill (you)” (2019)
LAWS option 2: Arm ourselves
• “The ‘choice’ is really no choice at all: We must fight AI with AI”
• Development inevitable (?)
• No bans: Because of the speed & complexity of LAWS battles, “human input in certain conflicts is not only unnecessary but also dangerous”
• Husain, The Sentient Machine
Your view
• Should nations agree NOT to develop machines that can kill entirely autonomously, without a human in the loop?
• Should SG sign such an agreement if it happens?
Lethal Autonomous Weapons Systems
• LAWS
• Definition & state of the art
• Arguments against/for LAWS
• Examples & options
• Contrast rogue AI & the control problem
The control problem
• Maintaining control of AI, especially ASI
• Or even AGI
• Ensuring that AI doesn’t go rogue—that AI’s goals remain aligned with ours
Debate positions
Control: Indifference
• “It…seems perfectly possible to have a superintelligence whose sole goal is something completely arbitrary, such as to manufacture as many paperclips as possible, & who would resist with all its might any attempt to alter this goal.”
• Nick Bostrom, Oxford U, 2003
ASI control: Learn from our everyday
behaviour
• “if we see the distance between ourselves and the ants as equivalent to the distance between a superintelligence and ourselves,
• then maybe [ASI] just doesn’t care as well.”
• Singler (2022)
Control: Maybe ASI can’t get mad
No sentience
• “…An AI system that has the equivalent of a neocortex…
• …but not the other parts of the brain [that produce emotion]…
• …will not spontaneously develop human-like emotions and drives….
• …So if we don’t put [emotions & drives] in machines, they won’t just suddenly appear.”
• Hawkins interview (2021)
Control: Safeguards can be built in
• Some say rules can be programmed into ASI to keep it under control