Army of None

by Paul Scharre


  HARD PROBLEMS, IMPERFECT INSTITUTIONS

  Humanity is at the threshold of a new technology that could fundamentally change our relationship with war. The institutions that human society has for dealing with these challenges are imperfect. Getting agreement in the CCW is challenging, given its structure as a consensus-based organization. It’s possible that fully autonomous weapons are a bad idea, whether for legal, moral, or strategic reasons, but that restraint among nations is nevertheless doomed to fail. It wouldn’t be the first time. For now, nations, NGOs, and international organizations like the ICRC continue to meet in the CCW to discuss the challenges of autonomous weapons. Meanwhile, technology races forward.

  Conclusion

  NO FATE BUT WHAT WE MAKE

  In the Terminator films, Sarah Connor and her son John, who will eventually lead the resistance against the machines, are hounded by an enemy even worse than the Terminators: fate. No matter how many times Sarah and John defeat Skynet, it still returns to haunt them in yet another film. Part of this is good Hollywood business. Sarah and John are victims of being in a film series where a sequel is a surefire moneymaker. But their trap of fate is also essential to the storytelling of the Terminator movies. In film after film, Sarah and John are perpetually hunted by Terminators sent back from the future to kill them and prevent John from eventually leading the human resistance against the machines. The essence of the events that propel the story is a time-travel paradox: if the Terminators succeeded in killing Sarah or John, then John couldn’t lead the human resistance, which would negate the reason for killing them in the first place. Meanwhile, Sarah and John attempt to destroy Skynet before it can come into existence. It’s another paradox: if they succeeded, Skynet would never send a Terminator back in time to attack them, erasing their motivation for destroying Skynet in the first place.

  Sarah and John are forever trapped in a battle against Skynet across the past, present, and future. Judgment Day continues to occur, no matter their actions, although the date keeps shifting (conveniently, to just a few years after each film’s release date, keeping Judgment Day forever in the audience’s future). In spite of this, Sarah and John fight against fate, never wavering in their faith that this time they will be able to defeat Skynet for good and avert Judgment Day. In Terminator 2: Judgment Day, John Connor quotes his mother as saying, “The future’s not set. There’s no fate but what we make for ourselves.” The line returns again and again in subsequent movies: “There is no fate but what we make.”

  Of course, Sarah Connor is right. In the real world, the future isn’t written. The visions of possible futures presented in this book—scary visions, good visions—are only wisps of imagination. The real future unfolds one step at a time, one day at a time, one line of code at a time. Will the future be shaped by technology? Of course. But that technology is made by people. It’s being crafted by people like Duane Davis, Bradford Tousley, and Brandon Tseng. It’s shaped by government officials like Larry Schuette, Frank Kendall, and Bob Work. Their decisions are guided by voices like Stuart Russell, Jody Williams, and Ron Arkin. Each of these individuals has choices and, collectively, humanity has choices.

  The technology to enable machines that can take life on their own, without human judgment or decision-making, is upon us. What we do with that technology is up to us. We can use artificial intelligence to build a safer world, one with less human suffering, fewer accidents, fewer atrocities, and one that keeps human judgment where it is needed. We can preserve a space for empathy and compassion, however rare they may be in war, and leave the door open to our better angels. Or we can become seduced by the allure of machines—their speed, their seeming perfection, their cold precision. We can delegate power to the machines, trusting that they will perform their assigned tasks correctly and without hesitation, and hope that we haven’t got it wrong, that there are no flaws lurking in the code for unforeseen events to trigger or enemies to exploit.

  There are no easy answers. If war could be averted and nations could secure their peace through treaties and not force of arms, they would have done so long ago. Militaries exist as a means to defend people from those who are not deterred by laws or international goodwill. To ask nations to surrender a potential means to defend themselves is to ask them to take a grave and weighty gamble.

  And yet . . .

  Despite this—despite the reality that there are no police to enforce the laws of war and that only the victors decide who stands trial. . . . Despite the reality that might, not right, decides who wins and dies on the battlefield. . . . Despite all this, codes of conduct have governed human behavior in war for millennia. Even the earliest of these codes contain guidance for which weapons could be used in war and which were beyond the pale. Barbed and poison-tipped arrows were surely useful in war. Yet they were wrong nevertheless.

  Human societies have cooperated time and again to restrain the worst excesses in war, to place some actions or means of killing out of bounds, even when life and death are at stake. Sometimes this cooperation has failed, but the miracle is that sometimes it hasn’t. In the modern era, militaries have largely stepped away from chemical weapons, biological weapons, blinding lasers, land mines, and cluster munitions as weapons of war. Not all militaries, but most of them. Nuclear powers have further agreed to limit how nuclear weapons are deployed in order to improve strategic stability. These rules are sometimes broken, but the fact that restraint exists at all among states that otherwise fear each other shows that there is hope for a better world.

  This restraint—the conscious choice to pull back from weapons that are too dangerous, too inhumane—is what is needed today. No piece of paper can prevent a state from building autonomous weapons if they desire it. At the same time, a pell-mell race forward in autonomy, with no clear sense of where it leads us, benefits no one. States must come together to develop an understanding of which uses of autonomy are appropriate and which go too far and surrender human judgment where it is needed in war. These rules must preserve what we value about human decision-making, while attempting to improve on the many human failings in war. Weighing these human values is a debate that requires all members of society, not just academics, lawyers, and military professionals. Average citizens are needed too, because ultimately autonomous military robots will live—and fight—in our world.

  Machines can do many things, but they cannot create meaning. They cannot answer these questions for us. Machines cannot tell us what we value, what choices we should make. The world we are creating is one that will have intelligent machines in it, but it is not for them. It is a world for us.

  Illustrations

  U.S. Marine Corps officers with a Gatling gun in Washington, DC, 1896. Through automation, the Gatling gun allowed four men to perform the same work as a hundred. Richard Gatling built his gun in the hopes that it would reduce the number of soldiers on the battlefield, thus saving lives.

  A British machine gun crew in gas masks during the Battle of the Somme, July 1916. The Gatling gun paved the way for the machine gun, which brought a new level of destruction to war that European nations were not prepared for. At the Battle of the Somme, Britain lost 20,000 men in a single day.

  The destroyer USS Fitzgerald fires a Harpoon missile during a joint training exercise with Japan, 2016. The Harpoon is a fire-and-forget semiautonomous anti-ship missile. The human chooses the enemy ship to be destroyed and the missile uses automation to avoid other nearby ships. Missiles of this type are in widespread use around the world and have been used for decades.

  The Tomahawk Land Attack Missile (TLAM) Block IV, also called “Tactical Tomahawk” or TLAM-E, flies over China Lake, California. The Tactical Tomahawk is a “net-enabled” weapon with a communications link back to human controllers, allowing commanders to redirect the missile while in flight. Advanced missiles increasingly have communications links, which give commanders more control and increase weapons’ effectiveness.

  U.S. Marines remove a training AGM-88 High-Speed Anti-Radiation Missile (HARM) from an F/A-18C Hornet on the deck of the USS Theodore Roosevelt aircraft carrier, 2015. The HARM is a fire-and-forget semiautonomous homing missile used to destroy enemy radars.

  An Israeli Harpy loitering munition launching. The Harpy is a fully autonomous anti-radar weapon and has been sold to a number of countries: Chile, China, India, South Korea, and Turkey. Similar to the HARM, the Harpy is intended to destroy radars. The key difference is that the Harpy can loiter for 2.5 hours, allowing it to search over a wide area for enemy targets, whereas the HARM is only aloft for approximately 4.5 minutes.

  A U.S. Navy Aegis warship fires a missile as part of a live-fire exercise off the coast of North Carolina, 2017. The Aegis air and missile defense system has semiautonomous (human in the loop) and supervised autonomous (human on the loop) modes. Supervised autonomy is vital for defending ships against short-warning saturation attacks. At least thirty nations have ship- or land-based defensive supervised autonomous weapons similar to Aegis.

  A U.S. Army Patriot battery along the Turkey-Syria border, 2013. U.S. Patriot batteries were deployed to Turkey during the Syrian civil war to aid in its defense. The Patriot, a land-based supervised autonomous air and missile defense system, was involved in two fratricide incidents in 2003 that highlighted some of the dangers of automation in weapon systems.

  An MQ-1 Predator at Creech Air Force Base, Nevada, 2016. At least ninety nations have drones and over a dozen have armed drones. As automation increases, future drones will be increasingly autonomous, raising new possibilities and challenges in warfare.

  The X-45A uninhabited aircraft in an experimental test flight, 2002. Today’s drones are not survivable in contested environments because of their lack of stealth characteristics. The X-45A paved the way for future stealth combat drones, which are in development by leading military powers around the globe. Stealth combat drones would operate in contested environments in which communications may be jammed, raising questions about which tasks the aircraft should be allowed to perform when operating autonomously.

  The X-47B autonomously lands on the USS George H. W. Bush in 2013, marking the first time an uninhabited aircraft landed on an aircraft carrier. Demonstrating autonomous carrier landings was a significant milestone for uninhabited aircraft. Earning warfighters’ trust is a major limiting factor in fielding more advanced military robotic systems. Despite technological opportunities, the U.S. Navy is not developing a carrier-based uninhabited combat aircraft.

  The X-47B autonomously refuels from a K-707 tanker over the Chesapeake Bay, 2015, demonstrating the first aerial refueling of an uninhabited aircraft. Autonomous aerial refueling is an important enabler for making uninhabited combat aircraft operationally relevant.

  The Israeli Guardium uninhabited ground vehicle. The armed Guardium has reportedly been sent on patrol near the Gaza border, although humans remain in control of firing weapons. Countries have different thresholds for risk with armed robots, including lethal autonomy, depending on their security environment.

  An uninhabited, autonomous boat near Virginia Beach as part of a U.S. Navy demonstration of swarming boats, 2016. Swarms are the next evolution in autonomous systems, allowing one human to control many uninhabited vehicles simultaneously, which autonomously cooperate to achieve a human-directed goal.

  Deputy Secretary of Defense Bob Work speaks at the christening of DARPA’s Sea Hunter, or Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV), 2016. Work has been a major advocate of robotics, autonomous systems, and human-machine teaming to maintain U.S. military superiority. At the Sea Hunter’s christening, Work envisioned “wolf packs” of uninhabited warships like the Sea Hunter plying the seas in search of enemy submarines.

  The Sea Hunter gets under way on the Willamette River following its christening in Portland, Oregon, 2016. At $2 million each, Sea Hunters cost a fraction of the price of a $1.6 billion destroyer, allowing the United States to field them in large numbers, if it desires.

  A B-1B bomber launches a Long-Range Anti-Ship Missile (LRASM) in a flight demonstration, 2013. The semiautonomous LRASM incorporates a number of advanced autonomous guidance features that allow it to avoid pop-up threats while en route to its human-designated target.

  A Long-Range Anti-Ship Missile (LRASM) about to hit a target ship in a demonstration test, 2013. Humans remain “in the loop” for LRASM targeting decisions. Similar to the Harpoon, a human operator chooses the enemy ship to be attacked and the LRASM uses automation to maneuver and identify the intended target while avoiding other nearby ships.

  A modified quadcopter autonomously navigates through a warehouse as part of DARPA’s Fast Lightweight Autonomy (FLA) program, 2016. FLA quadcopters use onboard sensors to detect the surrounding environment and autonomously navigate through cluttered terrain.

  Under Secretary of Defense Frank Kendall watches as a soldier from the 4th Battalion, 17th Infantry Regiment, 1st Brigade Combat Team, 1st Armored Division demonstrates a micro drone at Fort Bliss, Texas, 2015. When he was under secretary of defense for acquisition, technology, and logistics, Kendall was one of three officials who would have had responsibility for authorizing the development of any autonomous weapon under current DoD policy.

  Screenshot from a DARPA video of the prototype human-machine interface for the CODE program. The machine automatically detected an enemy tank, developed a targeting solution, and estimated likely collateral damage. In this mode, however, the human remains “in the loop” and must approve each engagement.

  Navy Captain Pete Galluch (right), commander of the Aegis Training and Readiness Center, demonstrates to the author “rolling green” to authorize lethal engagements in an Aegis simulator in Dahlgren, Virginia, 2016. Navy commanders are able to use the highly automated and lethal Aegis weapon system safely in large part because humans retain tight control over its operation.

  Researchers from the Naval Postgraduate School (NPS) launch a small drone as part of a thirty-drone swarm at Camp Roberts, California, 2015. Researchers at NPS are experimenting with swarm versus swarm combat, exploring new tactics for how to control friendly swarms and how to defeat enemy swarms.

  Two drones fly in formation over Camp Roberts, California, as part of a Naval Postgraduate School experiment in cooperative autonomy, or swarming. Swarms raise novel command-and-control challenges for how to optimize autonomous behavior and cooperation for large numbers of systems while retaining human control over the swarm as a whole.

  Deputy Secretary of Defense Bob Work (left) and Office of Naval Research program officer Lee Mastroianni discuss the prototype Low-Cost Unmanned Aerial Vehicle Swarming Technology (LOCUST) drone, 2016. The tube-launched LOCUST is intended to pave the way for swarms of low-cost drones.

  A Counter Rocket, Artillery, and Mortar (C-RAM) system from the 2nd Battalion, 44th Air Defense Artillery Regiment at Bagram Airfield in Afghanistan. The C-RAM, which has a high degree of automation but also retains a human in the loop, is an example of the kind of “centaur” human-machine teaming that Bob Work has advocated for.

  Left to right: Steve Goose (Human Rights Watch), Jody Williams (Nobel Women’s Initiative), the author Paul Scharre (Center for a New American Security), and Thomas Nash (Article 36) at the 2016 United Nations Convention on Certain Conventional Weapons (CCW) meeting on lethal autonomous weapons. In 2016, CCW member states agreed to form a Group of Governmental Experts (GGE) to discuss lethal autonomous weapon systems, but there is no consensus among states on what to do about autonomous weapons.

  The DJI Spark, which retailed for $499 as of August 2017, can autonomously track and follow moving objects, avoid obstacles, and return home when its battery runs low. The hobbyist drone market has exploded in recent years, making the technology widely available and inexpensive. Non-state groups have already used small, cheap weaponized drones for aerial attacks. Over time, hobbyist drones will become increasingly autonomous.

  A student robotics project at Thomas Jefferson High School in Alexandria, Virginia, to build a bicycle with an automatic gear shifter, akin to an automatic transmission in a car. As robotics and autonomous technology advance, increasingly capable robotic systems will be available to DIY hobbyists.

  Vice Chairman of the Joint Chiefs General Paul Selva (left) looks on as Deputy Secretary of Defense Bob Work speaks to reporters on the defense budget, 2016. Speaking at a conference in 2016, General Selva said that delegating responsibility for lethal force decisions was a “fairly bright line that we’re not willing to cross.”

  An X-47B experimental drone takes off from the USS Theodore Roosevelt aircraft carrier, 2013. Robotic technology will continue to evolve, with increasingly autonomous systems available to nations and non-state groups around the globe.

 
