Deserts of Fire


by Douglas Lain


  But all Kyra could think was: No one would have to do what my father did.

  The process of getting security clearance took a while. Kyra’s mother was surprised when Kyra called to tell her that government investigators might come to talk to her, and Kyra wasn’t sure how to explain why she took this job when there were much better offers from other places. So she just said, “This company helps veterans and soldiers.”

  Her mother said, carefully, “Your father would be proud of you.”

  Meanwhile, they assigned her to the civilian applications division, which made robots for factories and hospitals. Kyra worked hard and followed all the rules. She didn’t want to mess up before she got to do what she really wanted. She was good at her job, and she hoped they noticed.

  Then, one morning, Dr. Stober, the head roboticist, called her to join him in a conference room.

  Kyra’s heart was in her throat as she walked over. Was she going to be let go? Had they decided that she couldn’t be trusted because of what had happened to her father? That she might be emotionally unstable? She had always liked Dr. Stober, who seemed like a good mentor, but she had never worked with him closely.

  “Welcome to the team,” said a smiling Dr. Stober. Besides Kyra, there were five other programmers in the room. “Your security clearance arrived this morning, and I knew I wanted you on this team right away. This is probably the most interesting project at the company right now.”

  The other programmers smiled and clapped. Kyra grinned shyly at each of them in turn as she shook their outstretched hands. They all had reputations as stars of the company.

  “You’re going to be working on the AW-1 Guardians, one of our classified projects.”

  One of the other programmers, a young man named Alex, cut in: “These aren’t like the field transport mules and remote surveillance craft we already make. The Guardians are unmanned, autonomous flying vehicles about the size of small trucks, armed with machine guns and missiles.”

  Kyra noticed that Alex was really excited by the weapons systems.

  “I thought we already make those,” Kyra said.

  “Not exactly,” Dr. Stober said. “Our other combat systems are meant for surgical strikes in remote places or prototypes for frontline combat, where basically anything that moves can be shot. But these are designed for peacekeeping in densely populated urban areas, especially places where there are lots of Westerners or friendly locals to protect. Right now we still have to rely on human operators.”

  Alex said in a deadpan voice, “It would be a lot easier if we didn’t have to worry about collateral damage.”

  Dr. Stober noticed that Kyra didn’t laugh and gestured for Alex to stop. “Sarcasm aside, as long as we’re occupying their country, there will be locals who think they can get some advantage from working with us and locals who wish we’d go away. I doubt that dynamic has changed in five thousand years. We have to protect those who want to work with us from those who don’t, or else the whole thing falls apart. And we can’t expect the Westerners doing reconstruction over there to stay holed up in walled compounds all the time. They have to mingle.”

  “It’s not always easy to tell who’s a hostile,” Kyra said.

  “That’s the heart of the issue. Most of the time, much of the population is ambivalent. They’ll help us if they think it’s safe to do so, and they’ll help the militants if they think that’s the more convenient choice.”

  “I’ve always said that if they choose to help the militants blend in, I don’t see why we need to be that careful. They made a decision,” Alex said.

  “I suppose some interpretations of the rules of engagement would agree with you. But we’re telling the world that we’re fighting a new kind of war, a clean war, one where we hold ourselves to a higher standard. Nowadays, how people see the way we conduct ourselves is just as important as what we actually do.”

  “How do we do that?” Kyra asked before Alex could further derail the conversation.

  “The key piece of software we have to produce needs to replicate what the remote operators do now, only better. The government has supplied us with thousands of hours of footage from the drone operations during the last decade or so. Some of them got the bad guys, and some of them got the wrong people. We’ll need to watch the videos and distill the decision-making process of the operators into a formal procedure for identifying and targeting militants embedded in urban conditions, eliminate the errors, and make the procedure repeatable and applicable to new situations. Then we’ll improve it by tapping into the kind of big data that individual operators can’t integrate and make use of.”

  The code will embody the minds of my father and others like him so that no one would have to do what they did, endure what they endured.

  “Piece of cake,” said Alex. And the room laughed, except for Kyra and Dr. Stober.

  Kyra threw herself into her work, a module they called the ethical governor, which was responsible for minimizing collateral damage when the robots fired upon suspects. She was working on a conscience for killing machines.

  She came in on the weekends and stayed late, sometimes sleeping in the office. She didn’t view it as a difficult sacrifice to make. She couldn’t talk about what she was working on with the few friends she had, and she didn’t really want to spend more time outside the office with people like Alex.

  She watched the videos of drone strikes over and over. She wondered if any were missions her father had flown. She understood the confusion, the odd combination of power and powerlessness experienced when watching a man one is about to kill through a camera, the pressure to decide.

  The hardest part was translating this understanding into code. Computers require precision, and the need to articulate vague hunches had a way of forcing one to confront the ugliness that could remain hidden in the ambiguity of the human mind.

  To enable the robots to minimize collateral damage, Kyra had to assign a value to each life that might be endangered in a crowded urban area. One of the most effective ways for doing this—at least in simulations—also turned out to be the most obvious: profiling. The algorithm needed to translate racial characteristics and hints about language and dress into a number that held the power of life and death. She felt paralyzed by the weight of her task.

  “Everything all right?” Dr. Stober asked.

  Kyra looked up from her keyboard. The office lights were off; it was dark outside. She was practically the last person left in the building.

  “You’ve been working a lot.”

  “There’s a lot to do.”

  “I’ve reviewed your check-in history. You seem to be stuck on the part where you need the facial recognition software to give you a probability on ethnic identity.”

  Kyra gazed at Dr. Stober’s silhouette in the door to her office, back-lit by the hall lights. “There’s no API for that.”

  “I know, but you’re resisting the need to roll your own.”

  “It seems … wrong.”

  Dr. Stober came in and sat down in the chair on the other side of her desk. “I learned something interesting recently. During World War II, the US Army trained dogs for warfare. They would act as sentries, guards, or maybe even as shock troops in an island invasion.”

  Kyra looked at him, waiting.

  “The dogs had to be trained to tell allies apart from enemies. So they used Japanese-American volunteers to teach the dogs to profile, to attack those with certain kinds of faces. I’ve always wondered how those volunteers felt. It was repugnant and yet it was also necessary.”

  “They didn’t use German-American or Italian-American volunteers, did they?”

  “No, not that I’m aware of. I’m telling you this not to dismiss the problematic nature of your work, but to show you that the problem you’re trying to solve isn’t entirely new. The point of war is to prefer the lives of one group over the lives of another group. And short of being able to read everyone’s minds, you must go with shortcuts and snap heuristics to tell apart those who must die from those who must be saved.”

  Kyra thought about this. She could not exempt herself from Dr. Stober’s logic. After all, she had lamented her father’s death for years, but she had never shed a tear for the thousands he had killed, no matter how many might have been innocent. His life was more valuable to her than all of them added together. His suffering meant more. It was why she was here.

  “Our machines can do a better job than people. Attributes like appearance and language and facial expressions are but one aspect of the input. Your algorithm can integrate the footage from city-wide surveillance by thousands of other cameras, the metadata of phone calls and social visits, individualized suspicion built upon data too massive for any one person to handle. Once the programming is done, the robots will make their decisions consistently, without bias, always supported by the evidence.”

  Kyra nodded. Fighting with robots meant that no one had to feel responsible for killing.

  Kyra’s algorithm had to be specified exactly and submitted to the government for approval. Sometimes the proposals came back, marked with questions and changes.

  She imagined some general (advised, perhaps, by a few military lawyers) looking through her pseudocode line by line:

  A target’s attributes would be evaluated and assigned numbers. Is the target a man? Increase his suspect score by thirty points. Is the target a child? Decrease his suspect score by twenty-five points. Does the target’s face match any of the suspected insurgents with at least a fifty-percent probability? Increase his suspect score by five hundred points.

  And then there was the value to be assigned to the possible collateral damage around the target. Those who could be identified as Americans or had a reasonable probability of being Americans had the highest value. Then came native militia forces and groups who were allied with US forces, and the local elites. Those who looked poor and desperate were given the lowest values. The algorithm had to formalize anticipated fallout from media coverage and politics.

  Kyra was getting used to the process. After the specifications had gone back and forth a few times, her task didn’t seem so difficult.

  Kyra looked at the number on the check. It was large.

  “It’s a small token of the company’s appreciation for your efforts,” said Dr. Stober. “I know how hard you’ve been working. We got the official word on the trial period from the government today. They’re very pleased. Collateral damage has been reduced by more than eighty percent since they started using the Guardians, with zero erroneous targets identified.”

  Kyra nodded. She didn’t know if the eighty percent was based on the number of lives lost or the total amount of points assigned to the lives. She wasn’t sure she wanted to think too hard about it. The decisions had already been made.

  “We should have a team celebration after work.”

  And so for the first time in months, Kyra went out with the rest of the team. They had a nice meal, some good drinks, sang karaoke. And Kyra laughed and enjoyed hearing Alex’s stories about his exploits in war games.

  “Am I being punished?” Kyra asked.

  “No, no, of course not,” Dr. Stober said, avoiding her gaze. “It’s just administrative leave until … the investigation completes. Payroll will still make bi-weekly deposits, and your health insurance will continue, of course. I don’t want you to think you’re being scapegoated. It’s just that you did most of the work on the ethical governor. The Senate Armed Services Committee is really pushing for our methodology, and I’ve been told that the first round of subpoenas is coming down next week. You won’t be called up, but we’ll likely have to name you.”

  Kyra had seen the video only once, and once was enough. Someone in the market had taken it with a cellphone, so it was shaky and blurry. No doubt the actual footage from the Guardians would be much clearer, but she wasn’t going to get to see that. It would be classified.

  The market was busy, the bustling crowd trying to take advantage of the cool morning air. It looked, if you squinted a bit, like the farmer’s market where Kyra sometimes bought her groceries. A young American man, dressed in the distinctive protective vest that expat reconstruction advisors and technicians wore over there, was arguing with a merchant about something, maybe the price of the fruit he wanted to buy.

  Reporters had interviewed him afterwards, and his words echoed in Kyra’s mind: “All of a sudden, I heard the sounds made by the Guardians patrolling the market change. They stopped to hover over me, and I knew something was wrong.”

  In the video, the crowd was dispersing around him, pushing, jostling with each other to get out of the way. The person who took the video ran, too, and the screen was a chaotic blur.

  When the video stabilized, the vantage point was much farther away. Two black robots about the size of small trucks hovered in the air above the kiosk. They looked like predatory raptors. Metal monsters.

  Even in the cellphone video, it was possible to make out the recorded warnings the robots projected via loudspeakers in the local language. Kyra didn’t know what the warnings said.

  A young boy, seemingly oblivious to the machines hovering above him, was running at the American man, laughing and screaming, his arms opened wide as if he wanted to embrace the man.

  “I just froze. I thought, oh God, I’m going to die. I’m going to die because this kid has a bomb on him.”

  The militants had tried to adapt to the algorithms governing the robots by exploiting certain weaknesses. Because they realized that children were assigned a relatively high value for collateral damage purposes and a relatively low value for targeting purposes, they began to use more children for their missions. Kyra had had to tweak the algorithm and the table of values to account for these new tactics.

  “All of your changes were done at the request of the Army and approved by them,” said Dr. Stober. “Your programming followed the updated rules of engagement and field practices governing actual soldiers. Nothing you’ve done was wrong. The Senate investigation will be just a formality.”

  In the video, the boy kept on running towards the American. The warnings from the hovering Guardians changed, got louder. The boy did not stop.

  A few more boys and girls, some younger, some older, came into the area cleared by the crowd. They ran after the first boy, shouting.

  The militants had developed an anti-drone tactic that was sometimes effective. They’d send the first bomber out alone to draw the fire of the drones, and while the operators were focused on him, a swarm of backup bombers would rush the target as the drones shot up the first man.

  Robots could not be distracted. Kyra had programmed them to react to such tactics.

  The boy was now only a few steps away from the lone American. The Guardian hovering on the right took a single shot. Kyra flinched at the sound from the screen.

  “It was so loud,” said the young man in his interview. “I had heard the Guardians shoot before, but only from far away. Up close was a completely different experience. I heard the shot with my bones, not my ears.”

  The child collapsed to the ground immediately. Where his head had been, there was now only empty space. The Guardians had to be efficient when working in a crowd. Clean.

  A few more loud shots came from the video, making Kyra jump involuntarily. The cellphone owner panned his camera over, and there were a few more bundles of rags and blood on the ground. The other children.

  The crowd stayed away, but a few of the men were coming back into the clearing, moving closer, raising their voices. But they didn’t dare to move too close to the stunned young American, because the two Guardians were still hovering overhead. It took a few minutes before actual American soldiers and the local police showed up at the scene and made everyone go home. The video ended there.

  “When I saw that dead child lying in the dust, all I could feel was relief, an overwhelming joy. He had tried to kill me, and I had been saved. Saved by our robots.”


  Later, when the bodies were searched by the bomb-removal robots, no explosives were found.

  The child’s parents came forward. They explained that their son wasn’t right in the head. They usually locked him in the house, but that day, somehow he had gotten out. No one knew why he ran at that American. Maybe he thought the man looked different and he was curious.

  All the neighbors insisted to the authorities that the boy wasn’t dangerous. Never hurt anyone. His siblings and friends had been chasing after him, trying to stop him before he got into any trouble.

  His parents never stopped crying during the interview. Some of the commenters below the interview video said that they were probably sobbing for the camera, hoping to get more compensation out of the American government. Other commenters were outraged. They constructed elaborate arguments and fought each other in a war of words in the comment threads, trying to score points. Some commenters brought up the point, again, that comments on news reports really ought to be moderated.

  Kyra thought about the day she made the changes in the programming. She had been sipping a frappé because the day was hot. She remembered deleting the old value of a child’s life and putting in a new one. It had seemed routine, just another change like hundreds of other tweaks she had already made. She remembered deleting one IF and adding another, changing the control flow to defeat the enemy. She remembered feeling thrilled at coming up with a neat solution to the nested logic. It was what the Army had requested, and she had decided to do her best to give it to them faithfully.

  “Mistakes happen,” said Dr. Stober. “The media circus will eventually end, and all the hand-wringing will stop. News cycles are finite, and something new will replace all this. We just have to wait it out. We’ll figure out a way to make the system work better next time. This is better. This is the future of warfare.”

  Kyra thought about the sobbing parents, about the dead child, about the dead children. She thought about the eighty-percent figure Dr. Stober had quoted. She thought about the number on her father’s scorecard, and the parents and children and siblings behind those numbers. She thought about her father coming home.

 
