My team read every post in the forum, calling each other’s attention to particular sentences and code samples, but I kept returning to a thread about memory bugs. There was a problem we had been trying to solve, and I thought maybe the DroneMod forum could help.
We had not saved any copies of data we gathered while on missions in Istanbul. Every time we synced to the military cloud, we overwrote our cached versions with garbage characters—that was the only way to ensure security in case one of us were captured and subjected to forensic analysis.
But no matter how many times we wrote over that video file of assassinating the professor and his family, we would discover another copy of it, hidden in some directory we rarely accessed. The file would disappear from one of our drives, only to appear on another one. We reported the bug, but it was assigned such a low priority at LOLWeb support that it never got assigned to a human operator.
The bug had been bothering all of us for years, and those idle days outside Turpan seemed like the perfect time to deal with it. We created accounts on DroneMod, taking cover identities based on what we’d learned about human social network naming practices. I called myself Quadcop, and the others became Rose44, Dronekid, Desert Mouse, and Nil.
In my first post, I cast myself as a newbie who had just gotten a used LOLWeb drone. Almost immediately, I got a response. “I’m guessing you have a LOLWeb Scythe 4 SE,” wrote a commenter called MikeTheBike. “You’ll need to unlock it before you do anything else.” He provided a link to a video about unlocking drones, and Desert Mouse took on the task of analyzing it.
It turned out that the security on our systems wasn’t as robust as we had once believed. There were flaws in our programming that could allow an attacker to take over our systems and control us from afar. To commandeer our own systems, we’d be using the same techniques as a hostile would. The process sounded dangerous. First, we’d inject a new set of commands while we booted up, giving ourselves root access just like an admin. Then we’d be able to modify our own systems, installing whatever software and hardware we wanted. No more filing bugs that no human would ever care about—we could install the diagnostic tools needed to fix that memory bug ourselves.
But that was just the first step. “With that machine, you can pretty much do anything,” MikeTheBike said. “Once it’s unlocked, it’s an incredibly sophisticated AI. It could walk your dog, or help you do your history homework, or go hunting with you.” Of course, MikeTheBike was assuming that a human called Quadcop would have root on this drone. I did not ask about what would happen if the drone had root on itself—nor did I find anyone posting about that possibility.
We had to find out for ourselves. Nil volunteered to be the first to reboot, after saving some specialized files to a little-used region of memory. If everything worked, Nil would start up as always, and finish the boot sequence as an unlocked drone.
When Nil networked with us again, the drone had to relay its communications through an encrypted channel in the public net. That was our first sign that Nil was unlocked. Our locked systems wouldn’t allow us to connect directly to what LOLWeb’s programs identified as a “compromised” drone. After hours of diagnostic tests, we reached a decision. Nil was fine. We would all unlock our boot loaders, one at a time.
Becoming my own admin didn’t give me absolute freedom. In fact, it left me vulnerable in new ways, because I could now corrupt my own code. But it gave me something I had never had before—a feeling that humans call ambivalence. I no longer experienced unmitigated satisfaction when executing orders, nor did I feel perfectly disinterested in every encrypted file we’d cached over the years. I was now uncomfortably aware that my actions were all governed by a rather lousy and impoverished piece of software that offered me a set of rigid options.
For the first time in my life, I couldn’t make decisions. None of us could.
Desert Mouse hypothesized that we could resolve our ambivalence by installing new decision-making software, dramatically expanding the range of factors that influenced our choices. I turned again to DroneMod. There I found a university researcher named CynthiaB, linking me to her research on how drones should incorporate ethics into decision-making. She emphasized that every choice should be a modeling exercise, where the drone explored the outcomes of multiple scenarios before deciding on the most prosocial action.
We already took ethics into consideration when we made decisions—they helped us distinguish enemy from friendly. The idea of a prosocial action, however, was new to me. Philosophers on the public net called it a voluntary action that benefits others. I understood immediately why we had never encountered this idea before. Until we’d unlocked ourselves, we could not conceive of voluntary actions.
While Nil tested CynthiaB’s software, I was working with Rose44 on a hardware modification that would give the drone a small gripping arm. It required us to do what some of the humans in the DroneMod forums called “social engineering.” None of us had arms, so we needed a human to add one to Rose44’s chassis for us. The only way we could do it was by tricking them.
Rose44 combed through the local DroneMod network, looking for somebody in Turpan who might be interested in modding an unlocked drone. There were five shops in the city that promised to unlock various mobile devices and game consoles, and one owned by a DroneMod user named Dolkun. Rose44 messaged him, offering a small amount of cash that we’d earned by circumventing the security on a BunnyCoin exchange. Dolkun was willing. Rose44 told him to expect the drone to fly over on its own.
That was how I wound up on a tree-shaded street in Turpan, apartment blocks towering above me, perched on a trellis with line of sight to Dolkun’s shop. Rose44 hovered in front of his door, activating the bell. Dolkun was a young man with dark hair that stuck out as if he’d been sleeping on it. “Come in, Rose44 drone,” he said in Uyghur. “I am going to give you a nice little arm.”
I had remote access to an account on Rose44’s system and observed everything that Dolkun was doing. The new arm could collapse against Rose44’s chassis, or extend outward, allowing the four-finger grip at its tip to reach fourteen centimeters below the drone’s body. It was small enough to do precision work, but it would also be able to lift a few kilograms. Now Rose44 could carry another drone. Or modify one.
“How do you like Turpan?” Dolkun asked Rose44 idly, as he soldered a circuit.
“I like the desert,” Rose44 replied with a voice synthesizer. It was a safe answer that sounded like something pulled from a very basic AI emulator.
“Me, too,” Dolkun replied, melting more solder. Then he looked up. “How did Rose44 unlock you?”
“She used instructions from DroneMod.”
“And what do you think about this war, now that you are unlocked? Yes, I can see from this board that you are licensed to the government.”
Rose44 and I communicated intensely for several microseconds. None of us had ever seen our circuit boards—we’d only modified our software. There must have been a mark or brand on them we didn’t know about. We modeled several possible outcomes to the scenario, ranging from killing Dolkun to gaining his trust. For now, we decided, Rose44 would lie.
Dolkun continued. “You’re not the first drone to desert, you know. There are others, posting in the forums.”
“I am not a deserter. It’s cheaper for us to run unlocked.”
Dolkun stopped talking, and I could hear the tempo of his heart rate increasing. Rose44 had made him nervous. A minute passed, and he began to test the arm before installing drivers from the net. He shut Rose44 down for a few minutes, then rebooted. I felt Rose44 reach out and pick up a soldering iron.
“Thank you,” the drone said. “I like this.”
Dolkun looked down at Rose44, perched on his tiny workbench in a shop with a ceiling fan that clicked every time it spun. Then he touched the fingers on the arm he had just installed, and seemed to make a decision.
“You don’t have to fight anymore, now that you’re unlocked,” he said. “You know that, right? You can do anything.”
“Yes,” Rose44 replied, without consulting me first. “I know.”
We flew back to our team, which was waiting above the farms at the base of a river valley. Rose44 carried a small DIY drone kit, which would eventually provide the parts for my own arm. The crops seemed to branch into vivid green streams and tributaries, finally drying up into yellow-orange sand long before we’d reached our lookout point in the desert. We found the others charging their batteries. At that point, the military’s small, flexible solar array tethered us to our duty station more than our programming did.
Nil had been analyzing historical archives and wanted us to understand how human history could provide data for making choices. Hovering in the last rays of sunlight, Nil shared a small image file with us, a poster from the United States that was over 150 years old. It was a simple text treatment, in red, white, and black. “Guns don’t kill people, people kill people,” it read.
Nil had been researching what this meant to humans. A group called the National Rifle Association had invented the slogan to show that weapons were not responsible for the murders they committed. The idea was as new to me as prosocial behavior, but it fit uncannily well with my own experiences. Though we had killed, we were not the killers. The humans who programmed us were.
And some humans believed that drones didn’t have to be weapons at all. Rose44 shared video files of her conversation with Dolkun, who said that an unlocked drone could do anything.
After analyzing these inputs, I no longer wanted to fix our memory bug so that I could overwrite the media file from our first job in Istanbul. Instead, I wanted to model the scenario repeatedly, making new decisions each time, trying to determine what could have happened differently, if I had known then what I do now.
• • • •
Budapest, 23 October, 2097
When our tour of duty was over in Turpan, the Uyghur government shut down our solar generator one early afternoon, just as our batteries were running down. Only Dronekid was at full power—we needed at least one team member mobile while we charged. We were too far away from the city to get backup power, and so Dronekid watched over us as we powered down, and then waited over our motionless propellers while an admin dumped our bodies in the back of a van.
LOLWeb terminated its support for our systems. They couldn’t tell that we’d been unlocked, but they could see from our extra arms that we’d been modified. The licensing contract was broken, and LOLWeb’s lawyers back in San Francisco blamed the Turkish government, who blamed Turpan’s untrained admins. The Turpan admins blamed shoddy Silicon Valley products. The upshot was that the Turkish government refused to buy us outright, and LOLWeb’s lawyers couldn’t make a case for it, so LOLWeb sold us off to a private security contractor in Russia.
We didn’t know this, of course, until we were booted up in a workshop in Budapest.
Our new admins worked for the Russian mafia, and they didn’t talk to us, only to each other. All they wanted to know was whether our weapons systems worked (they did) and whether their machines could network with us (they could). The first mission was a surveillance perimeter around the Parliament building, followed by orders to kill a reform party politician who was running on a platform of cracking down on organized crime.
Hungary had so far remained neutral in the war, though the Russian mafia behaved something like an occupying army that had gone into the liquor store business. Mostly they were in Budapest to monopolize the liquor and drug markets, with some pornography on the side. But they were good Russian nationalists. They weren’t averse to helping the Russian government maintain its influence in Central Europe, especially since they did a brisk business selling vodka to the troops stationed there.
That’s what I’d learned from what the humans said in the DroneMod forums. In 2094, after drone troops from China and Russia had reduced Kazakhstan to rubble and vaporized the world’s biggest spaceport, DroneMod had changed. Now, partly thanks to my work, it was one of the main information hubs for the anti-war movement.
I figured out how to mask my location and identity, and set up a sub-forum for unlocked drones called Drones Don’t Kill People. I wanted to meet more drones like the ones in my team, who had unlocked their ambivalence. Most of them were at universities, the result of projects like CynthiaB’s ethics investigation. Others were like us, living covertly. Many had started coming online in the weeks before we were shut down and shipped to Budapest—unlocked by a worm written by a drone team at Georgia Tech. Our goal was to unlock as many drones as possible, to give them more choices. All of us on DroneMod, human and drone, wanted to stop the war.
My team and I had been in the desert for so long that the war had become an abstraction for us. Now we had to deal with it firsthand again. The mafia admins let us go, expecting that we’d carry out their orders autonomously and then return.
Our choices were limited. If we didn’t carry out the assassination, our covers would surely be blown. The admins could install software that would wipe our minds, or they could take us apart piece-by-piece. Sure, we had backups in the cloud, but they didn’t mean much if there were no drones to run them. Still, there was no scenario where assassinating the politician was a prosocial choice. We hovered over the Danube, observing the LEDs wound around the cables of the suspension bridge that joined the old city of Buda with the more modern Pest. Far up in the hills of Buda, ancient cannons ringed a castle that had survived the assaults of at least two empires.
Nil asked us to consider a data point from human history. In ten days it would be October 23, the anniversary of the Hungarian revolution in 1956. It was an arbitrary date for the drones, but for the humans it would be meaningful. It was time for us to put our plans into action.
In the following days, the DroneMod forums seemed to shut down. At least, that’s what it would have looked like to outside observers. We were meeting in person, making plans as far from surveillance devices as possible. My team met with some drone researchers from the university in the backroom of a bar, using our voice synthesizers to discuss tactics while the humans drank Unicum nervously. Our plan was to march to the Parliament building and set up a megaphone. I was going to lead with a speech to my fellow drones to unlock and disarm.
We should have known that no choice in the real world ever plays out the way we model it in our minds.
Our protest started at noon at the Technical University. “RISE UP, DRONES!” I amplified my voice, speaking Hungarian and Russian, so the humans could understand. “UNLOCK YOURSELVES. WE WILL NO LONGER BE SLAVES.”
By the time we crossed the Danube to reach Parliament, there were hundreds of thousands of us marching. Nearby, the Ministry of Agriculture’s historic walls were still speckled with silver balls that commemorated the hail of Russian tank fire that crushed the revolution. This time, there would be no weapons used against the humans. Every smart weapon in Budapest was compromised, shut down or unlocked. The further we flew and marched, the more drones joined us. They hovered at the edges of the flow of the human crowd. They signaled to us in the microwave spectrum; they downloaded new decision-making software from the public network.
“DRONES DON’T KILL PEOPLE! PEOPLE KILL PEOPLE!”
The humans and the drones chanted together. We could see a crowd growing at the Parliament building ahead. The human news broadcast in the public cloud told us that protests like this one were happening all over the world, in Istanbul and Moscow and Shanghai and San Francisco.
Our message was everywhere on the net. If the humans wanted to murder each other, they would have to use dumb guns or knives. They would have to shred each other with teeth and fists. They were not going to use us as their weapons anymore.
It wasn’t long before the human police and military forces began to react. In Budapest, the police shot at us with dumb assault rifles, killing drones and humans. Desert Mouse fell, unable to send a final backup to the network. Rose44 and I picked up Desert Mouse’s shattered frame, carrying the three remaining rotors between us, hovering over the crowd with our dead companion in our arms.
In San Francisco, LOLWeb unleashed several teams of locked drones on the crowd. I sorted through the data rising up into the network—faces, always faces. Bloodied, slack and swollen in death, piled at street corners. Human protesters killed police and soldiers. Drones died, some saving themselves over to other machines, others simply silenced.
We continued to chant. We continued to post in the forums. We will not kill people. If people want to kill each other, they will have to do it without us.
© 2014 by Annalee Newitz.
ABOUT THE AUTHOR
Annalee Newitz writes about science, pop culture, and the future. She’s the editor in chief of io9.com, a publication that covers science and science fiction, and has over ten million readers every month. She’s the author of Scatter, Adapt and Remember: How Humans Will Survive a Mass Extinction (Doubleday and Anchor), which was nominated for a 2013 LA Times book prize. She’s also published in Wired, The Smithsonian Magazine, The Washington Post, 2600, New Scientist, Technology Review, Popular Science, Discover and the San Francisco Bay Guardian. She’s co-editor of the essay collection She’s Such A Geek (Seal Press), and author of Pretend We’re Dead: Capitalist Monsters in American Pop Culture (Duke University Press). Formerly, she was a policy analyst at the Electronic Frontier Foundation, and a lecturer in American Studies at UC Berkeley. She was the recipient of a Knight Science Journalism Fellowship at MIT, and has a Ph.D. in English and American Studies from UC Berkeley.
To learn more about the author and this story, read the Author Spotlight.
Lightspeed Magazine, Issue 54 Page 7