by Paul Scharre
autonomy (continued)
evolution of, 14–23
and human-machine relationship, 28–30
importance to robots, 15
and intelligence, 28–33
intelligence vs., 50
and jammed communication channels, 15–16
limits to, 23–25
and personnel costs, 16
and swarming, 17–23
task dimension of, 28
theoretical basics, 26–34
B-52 bomber, 174
B-59 (Soviet submarine), 311, 318
BAE Systems, 108–9
ballistic missiles, 40–41, 139, 141–43
Ballmer, Steve, 241
Bandar Abbas airport (Iran), 169
bandwidth, 327–28
Barksdale Air Force Base, 174
Basic AI Drives, The (Omohundro), 237–38
Bat radar-guided anti-ship bomb, 96
battle damage assessment (BDA), 55
battle network, 43–44
Battlestar Galactica (film), 223
Belfiore, Michael, 76
Berman, Greg, 207
Biological Weapons Convention (BWC), 344
Black Monday market crash (1987), 199, 206
Blade Runner (film), 234
bombing raids, aerial, 275–76, 278, 341–42
Boomerang shot detection system, 113
bordeebook (online bookseller), 205
Borrie, John
on incidents of mass lethality, 193
on risks of autonomous weapons, 150–51, 158
on system accidents, 189
on unanticipated failures, 154
on unintended lethal effects, 351
Bostrom, Nick, 237, 239
botnets, 212
Boyd, John, 23–24
Breakout (video game), 248
Brimstone missile, 105–8, 117, 326, 353
Bringsjord, Selmer, 245
brinksmanship, 207–8
brittleness, 145–47
and accidents, 155
and adversary innovation, 177–78
and counter-autonomy, 221
in neural networks, 182
in stock trading algorithms, 204
Brizzolara, Bob, 22
Brumley, David, 217, 219–22
on dangers of AI, 246
on ecosystem of autonomous systems, 247
on fear of AI, 241
on future of U.S. cybersecurity, 226–27
on introspective systems, 226
Brzezinski, Zbigniew, 173
bullets, expanding, 343
Bush, George H. W., 279
BWC (Biological Weapons Convention), 344
C++, 131
Cambodia, 288
Cameron, James, 264
Campaign to Stop Killer Robots, 7, 252, 271–72, 349, 353
Camp Roberts, California, 11–13
Canning, John, 261, 355
CAPTOR encapsulated torpedo mine, 51
Carnegie Mellon University, 219, 220
Carpenter, Charli, 264–65
Carr, Sir Roger, 109
cars, see automobiles
Carter, Jimmy, 173
Castro, Fidel, 307
casualties, civilian, see civilian casualties
CCW, see Convention on Certain Conventional Weapons
centaur chess, 321–22, 325
centaur warfighters, 321–30
advantages of, 321–23
in degraded communication conditions, 327–30
human’s role as moral agent and fail-safe, 323–25
speed as limiting factor, 325–26
targeting decisions, 98
Center for a New American Security, 7
centralized coordination, 19, 20f
Challenger space shuttle disaster, 154
chatbots, 236
chemical weapons, 331, 340–41, 343
Chemical Weapons Convention, 266–67
chess, 150, 235, 321–22, 325, 380n–381n
Chicago Mercantile Exchange, 204, 388n
chicken, game of, 311–12
China, 5, 62, 207–9
circuit breaker
to halt stock trading, 206–7, 230, 389n
human intervention as, 148, 190, 210, 228–30, 247
civilian casualties
and accuracy of autonomous weapons, 6, 272, 284
autonomous weapons and, 8, 282
and debate over arms control, 342, 348–49, 355
and IHL principles, 251, 252
in internecine wars, 289
military necessity vs., 348
precautions in attack, 258
and proportionality, 255–57
and tank warfare, 115
civil liability, 261–62, 401n
Civil War, U.S., 35–36, 259, 274
CIWS (Phalanx Close-In Weapon System), 111
Clark, Micah, 234–36, 242
Clune, Jeff, 182–87
cluster munitions, 267, 342–43, 349; see also land mines
CODE, see Collaborative Operations in Denied Environments
codes of conduct, 251, 357, 362
cognitive electronic warfare, 81–83
cognitization of machines, 5
Colby, Elbridge, 299
Cold War
arms control treaties, 331, 340
Cuban Missile Crisis, 307, 310–11
DARPA during, 76
“Dead Hand” doomsday device, 313–14
Hawk system and, 91
missile launch false alarm (1983), 1–2
nuclear weapons and stability, 298–302
offset strategies, 59
Cole, USS, 22
Collaborative Operations in Denied Environments (CODE) program, 72–76, 117, 253, 327–28
collateral damage, 74, 97–98, 113; see also civilian casualties
collective intelligence, 20–21
Columbia space shuttle disaster, 154
command-and-control, 162–67, 306–11
Command and Decision (C&D) computer, 163, 167, 168
commander’s intent, 308
command-guided weapons, 40
commercial drones, see DIY drones
common-mode failure, 155
“Communicating Intent and Imparting Presence” (Shattuck), 308
communications
autonomous weapons, 82–83
centaur warfighters and, 327–30
and CODE program, 72
jamming/disruption of, 15, 55–56, 303–4
complexity
and deep learning, 186
as double-edged sword, 179
complex systems
automation and, 156–59
coupling in, 152
computer viruses, 211–13
computer vision, 86–87
Conficker worm, 225–26
conscience, 265
consensus-based coordination, 19, 20f
consequentialism, 271–73, 281, 286, 295–96
context
and AGI, 231
failure of autonomous weapons to understand, 6
and human intervention, 91, 321
Convention on Certain Conventional Weapons (CCW), 268, 346, 347, 351, 354, 359
convolutional neural networks, 129, 133
cooperative targets, 84–85
coordinated behavior, 18–21; see also swarming
Coreflood botnet, 223
costs
of missiles, 54
of personnel, 16
counter-autonomy, 220–21
counter-rocket, artillery, and mortar (C-RAM) system, 323–25
coupling, in complex systems, 152
“Creating Moral Buffers in Weapon Control Interface Design” (Cummings), 278
crisis stability, 298–99
crossbow, 331
cruise missiles, 40–41
CS gas, 266–67
Cuban Missile Crisis, 207, 307, 308, 310–11, 317–18
Cummings, Mary “Missy,” 277–78
customary laws, 264
Cyber Grand Challenge, 217–22, 226, 246
cyberwarfare, 211–30
autonomous cyberweapons, 222–27
autonomy in, 215–16
Cyber Grand Challenge, 217–22
DoD policy on cyberweapons, 227–28
malware, 211–13
speed in cyberwarfare, 229–30
Stuxnet worm, 213–16
Danks, David, 310, 316
Danzig, Richard, 247
DARPA (Defense Advanced Research Projects Agency), 76–88
ACTUV, 78–79
CODE program, 72–76
Cyber Grand Challenge, 217–22
FLA, 68–71
Grand Challenge, 216–17
TRACE, 84–88
transparency in description of weapons research, 111
TTO, 79–83
data breaches, 212
datalinks, 55
Davis, Duane, 12, 13, 18, 19, 21
DDoS (distributed denial of service) attacks, 212–13
“Dead Hand” doomsday device, 313–14, 409n
“dead man’s switch,” 313
decision authority, 166
decision boundary, 185f
decision-making
by autonomous weapons, 4–5
by humans on the battlefield, 2–4
decision trees, 97
Deep Blue, 150, 321, 380n–381n
deep learning neural networks, 87f, 124–28
and adversarial images, 180–87
and Breakout, 248
and computer vision, 86–87
and cyberwarfare, 226
vulnerability of, 180–88
DeepMind, 125–27, 247–48
DEFCON 2, 307
defense-dominant regime, 299, 300
defensive supervised autonomous weapons, 89
definitions, arms control and, 346–47, 349
dehumanization, 279–80
Dela Cuesta, Charles, 131–32
deontological ethics, 272, 285, 286, 294–96
Department of Defense (DoD)
autonomous cyberweapons policy, 228
autonomous weapons policy, 6, 89–90, 272, 293
CODE system, 75–76
cultural resistance to robotic weapons, 61
cyberdefense at, 216
cyberweapon policy, 227–28
drone budget, 13–14
“roadmaps” for unmanned system investment, 15–17, 25
Third Offset Strategy, 59
Department of Defense Law of War Manual, 245, 269
Dick, Philip K., 234
dictates of public conscience, 263–66
Dietterich, Tom, 243–44
disarming of weapons, 261
distinction, principle of, 252–55
distributed denial of service (DDoS) attacks, 212–13
DIY (do-it-yourself) drones, 120–34
and high-school students’ mastery of technology, 130–33
hunting indoor targets, 121–24
neural networks and, 128–30
DMZ (demilitarized zone), 105, 112–13, 260
Docherty, Bonnie, 261–63, 267–68
Dr. Strangelove (film), 312
doctrine (Aegis C&D), 163–68, 170
DoD Directive 3000.09, 75, 89–92, 227–28, 347
doomsday machine, 312–14
Doppler shift, 86
Dow Jones Industrial Average, 199, 204
drones; see also specific drones
accuracy of, 282
DIY, 120–34
DoD spending on, 14
growth of, 102
nations making military use of, 363n
prevalence of, 56
reusability of, 56
Senkaku Islands incident, 208
stealth drones, 56, 61–62, 209, 354
for surveillance, 13–14
world-wide proliferation of, 103m
Duke University, 277
Dunlap, Charles, 263, 266–69
Ecuador, 103
Eisen, Michael, 205
Eisenhower, Dwight David, 76
electronic attacks, 55–56
electronic warfare, 81–83; see also communications; cyberwarfare
Ellison, Harlan, 234
emergent coordination, 19–20, 20f
E-minis, 203–4, 206, 388n
empathy, 273–74
encapsulated torpedo mines, 51
Environmental Modification Convention, 344
ESGRUM armed robot boat, 103
essential operators, 322, 323
Estonia, 212
Ethical Autonomy Project, 7
ethical governors, 281, 283
ethics, 271–96
authorization of autonomous weapons, 90–93
autonomous weapons’ potential for behaving more ethically than humans, 279–84
battlefield decision-making, 2–4
consequences of autonomous weapons, 272
consequences of removing moral responsibility for killing, 275–79
criticisms of weapons vs. criticisms of war in general, 294–96
DIY drones and, 133–34
empathy/mercy in war, 273–74
human dignity and autonomous weapons, 287–90
and inhumanity of autonomous weapons, 285–87
life-and-death choices, 4–5
military professionals’ role, 290–94
EURISKO, 239–40
Ex Machina (film), 236–37
Exocet missile, 169
expanding bullets, 343
experimental weapons programs, 59–77
CODE, 72–76, 117, 253, 327–28
DARPA and, 76–77
FLA, 68–71
LRASM, 62–68, 66f–67f, 353
X-47B, 17, 60–62
F-14 fighter aircraft, 169–70
F-15 fighter aircraft, 208, 327
F-16 fighter aircraft, 28, 141, 143, 157
F-18 Hornet fighter jet, 143, 160, 192, 278
F-22 fighter aircraft, 157
F-35 Joint Strike Fighter, 157, 222
FAA (Federal Aviation Administration), 120, 161
“fail-deadly” mechanism, 313
failures (accidents), 137–60; see also specific accidents
accountability gap, 262–63
automation’s role in, 155–59
fully autonomous systems, 193–94
humans vs. machines, 193–94
impossibility of testing all scenarios, 149–50
inevitability of, 154–55, 175–79
normal accidents, 150–54, 159–60
risk of, 189–95
runaway gun, 190–91
unanticipated consequences, 145–47
Falcon torpedo, 39
false alarms, see near-miss accidents
fast inshore attack craft (FIAC), 107
Fast Lightweight Autonomy (FLA), 68–71
fate, 360–63
feasible precautions, 258
Federal Aviation Administration (FAA), 120, 161
Federal Bureau of Investigation (FBI), 223
FIAC (fast inshore attack craft), 107
Financial Industry Regulatory Authority, 204
fire-and-forget munitions, 42, 105–8, 192
firebombing, 279
Fire Inhibit Switch (FIS), 165, 167, 168
Fire Scout drone, 209
first-mover advantage, 221, 298, 300, 302, 304
FIRST Robotics Competition, 130–31
first-strike instability, 298
FIS (Fire Inhibit Switch), 165, 167, 168
fixed autonomous weapons, 353, 354
FLA (Fast Lightweight Autonomy), 68–71
flag of truce, 260
“Flash Crash,” 199–201, 203–4, 206–7
“flash war,” 210
Fog of War (documentary), 279
fooling (adversarial) images, 180–87, 181f, 183f, 185f, 253, 384n
ForAllSecure, 220
Ford, Harrison, 234
Foxer acoustic decoy, 40
Frankenstein complex, 234
Franz Ferdinand (archduke of Austria), 208
fratricide
acoustic shot detection system and, 113–14
Patriot missile system and, see Patriot missile system
Fukushima Daiichi nuclear plant disaster, 154–55, 176–77
fully autonomous weapon systems, 30f, 47f, 329f
basics, 46–50
current near-fully autonomous weapons, 50–51
dangers of, 148–49, 193–95, 329–30
semiautonomous weapons vs., 48f
futures contracts, 203
G7e/T4 Falke (“Falcon”) torpedo, 39
G7es/T5 Zaunkönig (“Wren”) torpedo, 39–40
Galluch, Peter, 163–69, 194, 329–30
gas, poison, 340–43
Gates, Bill, 232–33
Gates, Robert, 25
Gatling, Richard, 35, 36
Gatling gun, 35–36
general intelligence, see artificial general intelligence
General Robotics Automation Sensing and Perception (GRASP), 70
Geneva Conventions, 251, 259, 269; see also international humanitarian law
Georgia, Republic of, 213
Gerasimov, Valery, 117
Germany
aerial bombardments, 341–42
offense-dominant regime, 299–300
poison gas use, 340, 343
precision-guided torpedoes, 39–40
U.S. bombing campaigns against, 282
GGE (Group of Governmental Experts), 346
ghost tracks, 142
Global Hawk drone, 17
go (game), 124–28, 149, 150; see also AlphaGo
goal-driven AI systems, 238–40
goal misalignment, 243
goal-oriented behavior, 32
Goethe, Johann Wolfgang von, 148
golems, 234
Good, I. J., 233
Google, 125, 128
Goose, Steve, 252, 266–68, 271, 349, 351
GPS (global positioning system), 41
GRASP (General Robotics Automation Sensing and Perception), 70
Gray Eagle drone, 17
greedy shooter algorithm, 12, 21
Grossman, Dave, 275–77, 290
ground combat robots, 111–17
Group of Governmental Experts (GGE), 346
Guadalcanal, Battle of, 101
Guardium, 102
Guarino, Alessandro, 226
Gulf of Tonkin incident, 208, 389n
Gulf War (1991), 44, 279, 340
Haas, Michael Carl, 306–9, 317, 330
hacking, 157, 246–47; see also electronic attacks
hacking back, 223–24, 228
Hague Convention (1899), 343
Hague Convention (1907), 260–61
Hambling, David, 114
HARM, see High-speed Anti-Radiation Missile
Harpoon anti-ship missile, 42, 62
Harpy, 5, 47–48, 47f, 52, 55, 117, 353
Harpy 2 (Harop), 55
Hawking, Stephen, 232
Hawkins, Jeff, 241
Hawley, John, 171–72, 177, 189, 193