Robot Uprisings

by Daniel H. Wilson


  The Christmas dinner Uncle Joe brought over from the restaurant turned out to be delicious. Of course, it was nowhere near as good as the food my mom used to cook for Christmas dinner, but I tried not to think about that. While we were eating, I could tell that Dad was trying not to think about the same thing. And sitting around the kitchen table staring at my mom’s empty chair wasn’t helping, so Dad suggested that we move our feast to the living room instead.

  This turned out to be a great idea, because just as we were sitting down, Uncle Joe found a rerun of Buck Rogers on TV. It featured Gary Coleman as the guest star, and the episode was filled with cheesy sci-fi goodness. I wished S.A.M.M. could watch it with us, because I thought he’d get a kick out of Buck’s robot Twiki, and that maybe I could even teach him to do a Twiki impersonation. But S.A.M.M. still wasn’t fully charged, and the manual warned that you could screw up his battery’s life by turning him on too early.

  About an hour later, when we were watching a tape of my favorite Knight Rider episode (the one where K.I.T.T. battles his nemesis K.A.R.R.), the timer on my watch sounded, letting me know that S.A.M.M. should be fully recharged.

  I jumped up and said, “Hey, I’m going to get a soda. You guys want anything?”

  “Mountain Dew,” they said together. And then, because he’s a huge dork, Uncle Joe sang, in a horrible approximation of the TV commercial, “Doin’ it country cooooool.”

  I went to the kitchen and powered S.A.M.M. on. “How are you feeling, pal?” I asked, as soon as his eyes began to glow.

  S.A.M.M. didn’t respond. I repeated my question, but he still didn’t answer. I had a sudden moment of panic, wondering if I’d done the charging procedure incorrectly and fried his battery—or worse, fried his robot brain.

  I turned his power switch off and back on, then tried again. “S.A.M.M.? Can you hear me?”

  “Yes, Wyatt!” he finally replied. “I can hear you. Thank you for the power!”

  I sighed with relief. “You’re welcome! Now, how about we try out your drink-serving skills?”

  “Of course!” he said. “I enjoy serving drinks.”

  I grabbed the Omnibot’s motorized drink tray and attached it to the port on the front of his chassis. The tray had a series of cup cradle rings built into its surface that could be rotated, allowing the robot to pour a drink into a cup in each slot. I put three Empire Strikes Back glasses into the first three rings and an open can of Mountain Dew into the last ring.

  I’d seen an Omnibot 2000 serve drinks in the TV commercial and had been dying to try out this feature all day.

  “Okay, S.A.M.M. I’m going to head back into the living room and sit down. You wait here for a minute, then come in and serve each of us a glass of Mountain Dew. Okay?”

  “Affirmative,” S.A.M.M. replied.

  I ran back into the living room and plopped down on the couch next to my dad in Uncle Joe’s seat, because he’d gotten up to go use the bathroom.

  Dad glanced toward the kitchen. “Hey, where’s my drink? The service in this robo-restaurant totally stinks.”

  “Hold your horses,” I said. “He’s coming!” I slid the coffee table forward a few feet, to give S.A.M.M. enough clearance to reach us when he came in.

  “I think this technology still needs some work, ace,” Dad said. “It looks like it’s still easier—and quicker—to just get up off your butt and get the drinks on your own.”

  “Come on, Steve,” Uncle Joe said as he returned from the restroom. “For the first time in your life, a robot is about to serve you a drink. Just sit back and enjoy it.”

  A few seconds later, we heard a muffled banging sound in the kitchen. I was about to go investigate, but then S.A.M.M. appeared in the doorway and began to wheel through the dining room, straight toward us.

  “This is going to be so cool!” I said. I saw Dad and Uncle Joe exchange a smile. They both looked happy to see me so happy, and that made me even happier.

  But when S.A.M.M. reached the living room, we all saw that he was holding a large carving knife in his right hand, with the blade pointed down toward the floor. He was rolling toward us so fast that soda was sloshing up out of the Mountain Dew can and the glasses on the tray were jostling against each other, creating a strange musical cadence that somehow made the robot’s rapid approach even more frightening.

  I raised the Omnibot’s remote control and mashed all four of the direction buttons, but he didn’t change direction.

  As he continued to wheel toward us, S.A.M.M. slowly raised his right arm—and the knife—as high as it would go. He looked like the star of some robot slasher film. And now he was headed straight for my father.

  “Dad, look out!” I shouted. “I think he wants to kill you!”

  But Dad was laughing. “Wyatt, it’s okay,” he said. “Relax.”

  But S.A.M.M. was still headed straight for him, and the knife was still raised. He was now less than ten feet from my dad.

  Dad stopped laughing. Now he looked concerned. He glanced over at Uncle Joe and said, “Okay, enough is enough. Stop the damn thing already!”

  I looked at Uncle Joe and saw that he was pointing his own remote control at S.A.M.M. But his device didn’t look like mine; it was some kind of universal remote with twice as many buttons. Uncle Joe was pressing several of them, and none seemed to be having any effect.

  “Nothing is working!” he said.

  A second later, S.A.M.M. reached Dad and his drink tray slammed into my father’s shin. S.A.M.M. began to lower the knife, and that was when it became clear that my father was in no danger, because S.A.M.M.’s reach was so limited that there was no way he could get the knife blade within striking distance. And his arm was moving so slowly that it was unlikely he could stab through anything thicker than a sheet of paper.

  As the knife blade lowered, it knocked over the can of Mountain Dew on the tray. This created a domino effect that knocked over all of the glasses, too. Mountain Dew began to flood the tray and spill over the sides, dousing the leg of my dad’s jeans. I also saw the greenish-yellow soda running down the front of S.A.M.M.’s chassis, and that finally broke my paralysis. I reached out and switched off the robot’s power, then unhooked his dripping tray and set it aside. S.A.M.M. was still clutching the knife firmly in his clawlike hand, so I carefully removed it and set it on the coffee table. Then I turned to face my father and Uncle Joe.

  “Okay,” I said. “One of you better tell me what the hell is going on.”

  Here’s what happened.

  The whole thing had been Uncle Joe’s idea. Dad had told him about a month ago that he was planning to buy me an Omnibot for Christmas, and he’d asked for my uncle’s help in programming it to do some cool stuff on Christmas morning. Uncle Joe had done a lot of research on the Omnibot’s capabilities, and that was when he learned about Robo Link, a software program made by a company called Computer Magic. Robo Link let you create an unlimited number of programs for the Omnibot on your IBM or Apple and store them on floppy disk. Then you could call each program up at any time and load it into the Omnibot via a special interface cable that came with the software.

  Uncle Joe had ordered Robo Link right away and quickly taught himself to use it. Once my Omnibot arrived, my dad stenciled S.A.M.M.’s name on its chassis, then helped Uncle Joe attach a radio transmitter to his IBM that would allow commands to be sent directly from it to the Omnibot, using the robot’s existing remote control interface. It was a pretty ingenious setup. Uncle Joe also configured things so that his IBM’s speech synthesizer program could transmit verbal responses to S.A.M.M.’s speaker as fast as he could type them in. And my uncle types really fast, so I’d never noticed any delay when S.A.M.M. was talking to me.

  They tested their system out at my uncle’s apartment and got everything working. Then, on Christmas Eve, Uncle Joe brought his computer over and set it up in our basement, directly under the Christmas tree. Dad borrowed several black-and-white security cameras from his shop and placed them around the first floor of our house. These cameras were linked to a TV monitor in the basement, so that my uncle could see what the robot and I were doing. And S.A.M.M.’s internal microphone allowed my uncle to hear what I was saying, so that he could provide appropriate responses.

  Once Uncle Joe and my dad got their jerry-rigged system all set up and running, they rehearsed what the Omnibot would do and say the next morning. Then Uncle Joe crashed in a sleeping bag on the basement floor, so that he would be there, ready and waiting, when I got up the next morning to open my present.

  Their ruse worked almost flawlessly. When I was alone with my dad, Uncle Joe was downstairs operating S.A.M.M. And after my uncle “came over” to the house, my father snuck down into the basement and took over typing in S.A.M.M.’s responses, while Uncle Joe controlled his movements with a remote he kept hidden in his pocket.

  The only problem was that the Omnibot’s remote also operated on the same frequency as various garage-door openers, television remotes, CB radios, airplane transponders, and even walkie-talkies—like the walkie-talkies our two neighbor kids had gotten for Christmas. They’d been out in their backyard playing “G.I. Joe” with them all day, and that was where the feedback and the static-laden “I’ll kill them both!” voice had probably originated.

  Random radio signal interference from those walkie-talkies also accounted for the erratic and random movements S.A.M.M. had made when he knocked our family photo off the wall.

  As for the knife, it had been attached to one of those magnetic cutlery strips that was affixed to one of our kitchen cabinets. As far as we could figure, S.A.M.M. had probably banged into the cabinets and knocked the knife loose, and it had fallen blade-first into S.A.M.M.’s serving tray. When Uncle Joe had blindly used his remote to try to make S.A.M.M. pick up the can of Mountain Dew, his hand had closed around the handle of the knife instead.

  My father and my uncle both apologized profusely for deceiving me, and they swore up and down that they had planned to reveal the truth right before I went to bed.

  When they finished telling me all of this, I suddenly burst into tears. I could tell by their expressions that this made both of them feel awful.

  “I’m sorry,” I said, my voice catching as I forced out each syllable. “I’m not mad at you guys. At all. I just—”

  That was all I could get out, so I ran over and gave each of them a hug. They both seemed confused, but also relieved that I wasn’t angry at them.

  In the weeks that followed, Uncle Joe helped us set up the Robo Link program on our Apple II, so that Dad and I could use S.A.M.M. to pull the same parlor tricks on everyone who came to visit our house. We scared the pants off the mailman, several door-to-door salesmen, and two incredibly naïve Jehovah’s Witnesses (who ran out of the house screaming after S.A.M.M. professed his love of “the Dark Lord”).

  Part of me was still disappointed that S.A.M.M. didn’t really possess artificial intelligence. But I eventually realized that I no longer yearned for a sentient robot companion, because I knew I was lucky enough to have two real people in my life who cared about me more than any machine ever could.

  The following year, I asked for a laser tag game, and that Christmas proved to be far more uneventful. S.A.M.M. and I did beat my dad and Uncle Joe in our First Annual Yuletide Laser Tag Battle Royale, with a final score of four games to three. But I think they may have let us win that last one.

  CORY DOCTOROW

  EPOCH

  Cory Doctorow is a science fiction author, activist, journalist, and blogger. He serves as coeditor of Boing Boing (boingboing.net) and is the author of the novels Homeland, For the Win, and Little Brother. He is the former European director of the Electronic Frontier Foundation and cofounded the UK’s Open Rights Group. Born in Toronto, Canada, he now lives in London. Learn more about Doctorow’s work at www.craphound.com.

  The doomed rogue AI is called BIGMAC and he is my responsibility. Not my responsibility as in “I am the creator of BIGMAC, responsible for his existence on this planet.” That honor belongs to the long-departed Dr. Shannon, one of the shining lights of the once great Sun-Oracle Institute for Advanced Studies, and he had been dead for years before I even started here as a lowly sysadmin.

  No, BIGMAC is my responsibility as in, “I, Odell Vyphus, am the systems administrator responsible for his care, feeding, and eventual euthanizing.” Truth be told, I’d rather be Dr. Shannon (except for the being dead part). I may be a lowly grunt, but I’m smart enough to know that being the Man Who Gave the World AI is better than being the Kid Who Killed It.

  Not that anyone would care, really. Two hundred and fifteen years after Mary Shelley first started humanity’s hands wringing over the possibility that we would create a machine as smart as us but out of our control, Dr. Shannon did it, and it turned out to be incredibly, utterly boring. BIGMAC played chess as well as the non-self-aware computers, but he could muster some passable trash talk while he beat you. BIGMAC could trade banalities all day long with any Turing tester who wanted to waste a day chatting with an AI. BIGMAC could solve some pretty cool vision-system problems that had eluded us for a long time, and he wasn’t a bad UI to a search engine, but the incremental benefit over non-self-aware vision systems and UIs was pretty slender. There just weren’t any killer apps for AI.

  By the time BIGMAC came under my care, he was less a marvel of the twenty-first century and more a technohistorical curiosity who formed the punch line to lots of jokes but otherwise performed no useful service to humanity in exchange for the useful services that humanity (e.g., me) rendered to him.

  I had known for six months that I’d be decommissioning old BM (as I liked to call him behind his back), but I hadn’t seen any reason to let him in on the gag. Luckily (?) for all of us, BIGMAC figured it out for himself and took steps in accord with his nature.

  This is the story of BIGMAC’s extraordinary self-preservation program, and the story of how I came to love him, and the story of how he came to die.

  My name is Odell Vyphus. I am a third-generation systems administrator. I am twenty-five years old. I have always been sentimental about technology. I have always been an anthropomorphizer of computers. It’s an occupational hazard.

  BIGMAC thought I was crazy to be worrying about the rollover. “It’s just Y2K all over again,” he said. He had a good voice—speech synthesis was solved long before he came along—but it had odd inflections that meant that you never forgot you were talking with a nonhuman.

  “You weren’t even around for Y2K,” I said. “Neither was I. The only thing anyone remembers about it, today, is that it all blew over. But no one can tell, at this distance, why it blew over. Maybe all that maintenance tipped the balance.”

  BIGMAC blew a huge load of IPv4 ICMP traffic across the network, stuff that the firewalls were supposed to keep out of the system, and every single intrusion-detection system alarm lit, making my screen into a momentary mosaic of competing alerts. It was his version of a raspberry, and I had to admit it was pretty imaginative, especially since the IDSes were self-modifying and required that he come up with new and better ways of alarming them each time.

  “Odell,” he said, “the fact is, almost everything is broken, almost always. If the failure rate of the most vital systems in the world went up by twenty percent, it would just mean some overtime for a few maintenance coders, not Götterdämmerung. Trust me. I know. I’m a computer.”

  The rollover was one of those incredibly boring apocalypses that periodically get extracted by the relevance filters, spun into screaming 128-point linkbait headlines, then dissolved back into their fundamental, incontrovertible technical dullness and out of the public consciousness. Rollover: 19 January 2038. The day that the Unix time function would run out of headroom and roll back to zero, or do something else undefined.

  Oh, not your modern unices. Not even your elderly unices. To find a rollover-vulnerable machine, you needed to find something running an elderly, 32-bit paleounix. A machine running on a processor that was at least twenty years old—2018 being the last date that a 32-bit processor shipped from any major fab. Or an emulated instance thereof, of course. And counting emulations, there were only—

  “There’s fourteen billion of them!” I said. “That’s not twenty percent more broken! That’s the infocalypse.”

  “You meatsacks are so easily impressed by zeros. The important number isn’t how many thirty-two-bit instances of Unix are in operation today. It’s not even how many vulnerable ones there are. It’s how much damage all those vulnerable ones will cause when they go blooie. And I’m betting: not much. It will be, how do you say, ‘meh’?”

  My grandfather remembered installing the systems that caused the Y2K problem. My dad remembered the birth of “meh.” I remember the rise and fall of anyone caring about AI. Technology is glorious.

  “But okay, stipulate that you’re right and lots of important things go blooie on January nineteenth. You might not get accurate weather reports. The economy might bobble a little. Your transport might get stuck. Your pay might land in your bank a day late. And?”

  He had me there. “It would be terrible—”

  “You know what I think? I think you want it to be terrible. You want to live in the Important Epoch in Which It All Changes. You want to know that something significant happened on your watch. You don’t want to live in one of those Unimportant Epochs in Which It All Stayed the Same and Nothing Much Happened. Being alive in the Epoch in Which AI Became Reality doesn’t cut the mustard, apparently.”

  I squirmed in my seat. That morning, my boss, Peyton Moldovan, had called me into her office—a beautifully restored temporary habitat dating back to the big L.A. floods, when this whole plot of land had been a giant and notorious refugee camp. Sun-Oracle had gotten it for cheap and located its institute there, on the promise that they preserve the hastily thrown-up structures where so many had despaired. I sat on a cushion on the smooth cement floor—the structures had been delivered as double-walled bags full of cement mix, needing only to be “inflated” with high-pressure water to turn them into big, dome-shaped sterile cement yurts.

 
