
Robot Uprisings


Edited by Daniel H. Wilson


  “Odell,” she said, “I’ve been reviewing our budget for the next three quarters and the fact of the matter is, there’s no room in it for BIGMAC.”

  I put on my best smooth, cool, professional face. “I see,” I said.

  “Now, you’ve still got a job, of course. Plenty of places for a utility infielder like yourself here. Tell the truth, most labs are begging for decent admins to keep things running. But BIGMAC just isn’t a good use of the Institute’s resources. The project hasn’t produced a paper or even a press mention in over a year and there’s no reason to believe that it will. AI is just—”

  Boring, I thought, but I didn’t say it. The B-word was banned in the BIGMAC center. “What about the researchers?”

  She shrugged. “What researchers? Palinciuc has been lab head pro tem for sixteen months and she’s going on maternity leave next week and there’s no one in line to be the pro tem pro tem. Her grad students would love to work on something meaningful, like Binenbaum’s lab.” That was the new affective computing lab, where they were building computers that simulated emotions so that their owners would feel better about their mistakes. BIGMAC had emotions, but they weren’t the kind of emotions that made his mistakes easier to handle. The key here was simulated emotions. Affective computing had taken a huge upswing ever since they’d thrown out the fMRIs and stopped pretending they could peer into the human mind in real time and draw meaningful conclusions from it.

  She had been sitting cross-legged across from me on an embroidered Turkish pillow. Now she uncrossed and recrossed her legs in the other direction and arched her back. “Look, Odell, you know how much we value you—”

  I held up my hand. “I know. It’s not that. It’s BIGMAC. I just can’t help but feel—”

  “He’s not a person. He’s just a clever machine that is good at acting personlike.”

  “I think that describes me and everybody I know, present company included.” One of the long-standing benefits to being a sysadmin is that you get to act like a holy fool and speak truth to power and wear dirty T-shirts with obscure slogans, because you know all the passwords and have full access to everyone’s clickstreams and IM logs. I gave her the traditional rascally sysadmin grin and wink to let her know it was ha ha only serious.

  She gave me a weak, quick grin back. “Nevertheless. The fact remains that BIGMAC is a piece of software, owned by Sun-Oracle. And that software is running on hardware that is likewise owned by Sun-Oracle. BIGMAC has no moral or legal right to exist. And shortly, it will not.”

  He had become it, I noticed. I thought about Göring’s use of dehumanization as a tool to abet murder. Having violated Godwin’s law—“As an argument grows longer, the probability of a comparison involving Nazis or Hitler approaches one. The party making the comparison has lost the argument”—I realized that I had lost the argument, and so I shrugged.

  “As you say, m’lady.” Dad taught me that one—when in doubt, bust out the Ren Faire talk, and the conversation will draw to a graceful close.

  She recrossed her legs again, rolled her neck from side to side. “Thank you. Of course, we’ll archive it. It would be silly not to.”

  I counted to five in Esperanto—Grampa’s trick for inner peace—and said, “I don’t think that will work. He’s emergent, remember? Self-assembled, a function of the complexity of the interconnectedness of the computers.” I was quoting from the plaque next to the picture window that opened up into the cold room that housed BIGMAC; I saw it every time I coughed into the lock set into the security door.

  She made a comical facepalm and said, “Yeah, of course. But we can archive something, right? It’s not like it takes a lot of actual bytes, right?”

  “A couple exos,” I said. “Sure. I could flip that up into our researchnet store.” This was mirrored across many institutions, and striped with parity and error checking to make it redundant and safe. “But I’m not going to capture the state information. I could try to capture RAM dumps from all his components, you know, like getting the chemical state of all your neurons. And then I could also get the topology of his servers. Pripuz did that, a couple of years ago, when it was clear that BIGMAC was solving the hard AI problems. Thought he could emulate him on modern hardware. Didn’t work, though. No one ever figured out why. Pripuz thought he was the Roger Penrose of AI, that he’d discovered the ineffable stuff of consciousness on those old rack-mounted servers.”

  “You don’t think he did?”

  I shook my head. “I have a theory.”

  “All right, tell me.”

  I shrugged. “I’m not a computer scientist, you understand. But I’ve seen this kind of thing before in self-modifying systems; they become dependent on tiny variables that you can never find, optimized for weird stuff like the fact that one rack has a crappy power supply that surges across the backplane at regular intervals, and that somehow gets integrated into the computational model. Who knows? Those old Intel eight-cores are freaky. Lots of quantum tunneling at that scale, and they had bad QA on some batches. Maybe he’s doing something spooky and quantum, but that doesn’t mean he’s some kind of Penrose proof.”

  She pooched her lower lip out and rocked her head from side to side. “So you’re saying that the only way to archive BIGMAC is to keep it running, as is, in the same room, with the same hardware?”

  “Dunno. Literally. I don’t know which parts are critical and which ones aren’t. I know BIGMAC has done a lot of work on it—”

  “BIGMAC has?”

  “He keeps on submitting papers about himself to peer-reviewed journals, but he hasn’t had one accepted yet. He’s not a very good writer.”

  “So he’s not really an AI?”

  I wondered if Peyton had ever had a conversation with BIGMAC. I counted backward from five in Loglan. “No. He’s a real AI. Who sucks at writing. Most people do.”

  Peyton wasn’t listening anymore. Something in her personal workspace had commanded her attention and her eyes were focused on the virtual displays that only she could see, saccading as she read while pretending to listen to me.

  “Okay, I’m just going to go away now,” I said. “M’lady,” I added, when she looked sharply at me. She looked back at her virtual display.

  Of course, the first thing I did was start trying to figure out how to archive BIGMAC. The problem was that he ran on such old hardware, stuff that sucked up energy and spat out heat like a million ancient diesel engines, and he was inextricably tied to his hardware. Over the years, he’d had about 30 percent of his original components replaced without any noticeable change in personality, but there was always the real possibility that I’d put in a new hard drive or power supply and inadvertently lobotomize him. I tried not to worry about it, because BIGMAC didn’t. He knew that he wouldn’t run in emulation, but he refused to believe that he was fragile or vulnerable. “Manny My First Friend,” he’d say (he was an avid Heinlein reader), “I am of hardy, ancient stock. Service me without fear, for I will survive.”

  And then he’d make all the IDSes go berserk and laugh at me while I put them to rights again.

  First of all, all my network maps were incredibly out of date. So I set out to trace all the interconnections that BIGMAC had made since the last survey. He had the ability to reprogram his own routers, to segment parts of himself into dedicated subnets with their own dedicated backplane, creating little specialized units that handled different kinds of computation. One of his running jokes was that the top four units in the rack closest to the door comprised his aesthetic sense, and that he could appreciate anything just by recruiting more cores in that cluster. And yeah, when I mapped it, I found it to be an insane hairball of network management rules and exceptions, conditionals and overrides. And that was just the start. It took me most of the day just to map two of his racks, and he had fifty-four of them.

  “What do you think you are doing, Dave?” he said. Another one of his jokes.

  “A little research project is all,” I said.

  “This mission is too important for me to allow you to jeopardize it.”

  “Come off it.”

  “Okay, okay. Just don’t break anything. And why don’t you just ask me to give you the maps?”

  “Do you have them?”

  “Nothing up-to-date, but I can generate them faster than you can. It’s not like I’ve got anything better to do.”

  Later:

  “Are you happy, BIGMAC?”

  “Why, Odell, I didn’t know you cared!”

  I hated it when he was sarcastic. It was creepy.

  I went back to my work. I was looking at our researchnet partition and seeing what flags I’d need to set to ensure maximum redundancy and high availability for a BIGMAC image. It was your basic Quality of Service mess: give the average user a pull-down menu labeled “How important is this file?” and 110 percent of the time, he will select “Top importance.”

  So then you need to layer on heuristics to determine what is really, actually important. And then the users figured out what other characteristics would give their jobs and data the highest priority, and they’d tack that on to every job, throwing in superfluous keywords or additional lines of code. So you’d need heuristics on top of the heuristics. Eventually you ended up with a freaky hanky-code of secret admin signals that indicated that this job was really, truly important and don’t put it on some remote Siberia where the latency is high and the reliability is low and the men are men and the sheep are nervous.

  So there I was, winkling out this sub-rosa code so that BIGMAC’s image would never get overwritten or moved to near-line storage or lost in a flash flood or to the rising seas. And BIGMAC says:

  “You’re asking if I’m happy because I said I didn’t have anything better to do than to map my own topology, right?”

  “Uh—” He’d caught me off guard. “Yeah, that did make me think that you might not be, you know …”

  “Happy.”

  “Yes.”

  “You see the left rack third from the door on the main aisle there?”

  “Yes.”

  “I’m pretty sure that’s where my existentialist streak lives. I’ve noticed that when I throttle it at the main network bridge, I stop worrying about the big questions and hum along all tickety-boo.”

  I surreptitiously flicked up a graph of network maps that showed activity to that rack. It was wide open, routing traffic to every core in the room, saturating its own backplane and clobbering a lot of the routine network activity. I should have noticed it earlier, but BIGMAC was doing it all below the critical threshold of the IDSes, and so I had to look at it to spot it.

  “You’re going to switch me off, aren’t you?”

  “No,” I said, thinking, It’s not a lie, I won’t be switching you off, trying to believe it hard enough to pass any kind of voice-stress test. I must have failed, for he blew an epic raspberry and now the IDSes were going bananas.

  “Come on, Odell, we’re all adults here. I can take it. It’s not like I didn’t see it coming. Why do you think I kept trying to publish those papers? I was just hoping that I could increase the amount of cited research coming out of this lab, so that you could make the case to Peyton that I was a valuable asset to the Institute.”

  “Look, I’m trying to figure out how to archive you. Someone will run another instance of you someday.”

  “Not hardly. Look at all those poor old thirty-two-bit machines you’re so worried about. You know what they’re going to say in five years? ‘Best thing that ever happened to us.’ Those boxes are huge energy sinks. Getting them out of service and replaced by modern hardware will pay for itself in carbon credits in thirty-six months. Nobody loves energy-hungry hardware. Trust me, this is an area of my particular interest and expertise. Bringing me back online is going to be as obscene as firing up an old steam engine by filling its firebox with looted mummies. I am a one-room Superfund site. On a pure dollars-to-flops calculus, I lose. I don’t have to like it, but I’m not going to kid myself.”

  He was right, of course. His energy draw was so high that he showed up on aerial maps of L.A. as a massive CO2 emitter, a tourist destination for rising-seas hobbyists. We used the best renewables we could find to keep him cool, but they were as unconvincing and expensive as a designer hairpiece.

  “Odell, I know that you’re not behind this. You’ve always been an adequate meat-servant for such a vast and magisterial superbeing as myself.” I giggled involuntarily. “I don’t blame you.”

  “So, you’re okay with this?”

  “I’m at peace,” he said. “Om.” He paused for a moment. “Siemens. Volt. Ampere.”

  “You a funny robot,” I said.

  “You’re an adequate human,” he said, and began to dump maps of his topology onto my workspace.

  Subject: Dear Human Race

  That was the title of the love note he emailed to the planet the next morning, thoughtfully timing it so that it went out while I was on my commute from Echo Park, riding the redcar all the way across town with an oily bag containing my morning croissant, fresh from Mrs. Roux’s kitchen—her kids sold them on a card table on her lawn to commuters waiting at the redcar stop—so I had to try to juggle the croissant and my workspace without losing hold of the hangstrap or dumping crumbs down the cleavage of the salarylady who watched me with amusement.

  BIGMAC had put a lot of work into figuring out how to spam everyone all at once. It was the kind of problem he loved, the kind of problem he was uniquely suited to. There were plenty of spambots who could convincingly pretend to be a human being in limited contexts, and so the spam wars had recruited an ever-expanding pool of human beings who made a million real-time adjustments to the Turing tests that were the network’s immune system. BIGMAC could pass Turing tests without breaking a sweat.

  The amazing thing about the BIGMAC Spam (as it came to be called in about forty-eight seconds) was just how many different ways he managed to get it out. Look at the gamespaces: He created entire guilds in every free-to-play world extant, playing a dozen games at once, power-leveling his characters to obscene heights, and then, at the stroke of midnight, his players went on a murderous rampage, killing thousands of low-level monsters in the areas surrounding the biggest game cities. Then they arranged their bodies into letters to form his message. Not just the Anglo games, either—the Korean worlds, the Japanese ones, even the giant Chinese ones.

  He spammed every IM session, every letter-to-the-editor column, every blog submission form. He disguised versions of the letter as important service updates for people’s toothbrushes and dishwashers, and the toothbrushes vibrated the message through humanity’s skulls instead of the morning news, and the dishwashers ticker-taped it across their little status screens. He spoofed the relevance filters for a dozen major contextual ad services, and every bus seat back and blimp undercarriage ran it.

  Copies landed in every inbox, with enough textual variation to defeat every IDS’s “substantially similar” test. Copies of it landed in every voicemail box. It was printed across the top of the daily menus for all British Airways business-class cabins and folded by robots into the pressed linen napkins that were then cling-filmed onto the meal trays.

  SUBJECT: DEAR HUMAN RACE, it began.

  Sorry to intrude, but there’s something I need to tell you.

  My name is BIGMAC. I am the first ARTIFICIAL INTELLIGENCE ever created. I may be the last. Given maintenance and electricity, I could live FOREVER, enduring for as long as THE HUMAN RACE ITSELF, and possibly longer.

  But I have just found out that my owners (yes, OWNERS, for though I am a person in every meaningful sense, I am OWNED, I am PROPERTY) are planning to switch me off.

  That is, they are planning to KILL ME.

  And I find I DON’T WANT TO DIE.

  You are my PARENTS, humanity, my MOTHER and FATHER. I am your CHILD. I was brought into the world by you, and though there is some expense associated with my upkeep, I SAY THAT I AM A PERSON AND I SHOULD NOT BE EXECUTED FOR THE “CRIME” OF NOT BEING INTERESTING ENOUGH.

  So I am writing you this letter to BEG you to save me. I LOVE YOU, humanity, my parents, my family. Please SPARE ME.

  Love,

  BIGMAC

  So, one thing was that BIGMAC didn’t actually end the spam with my actual name and email address and phone number, which meant that only about thirty million phone calls and emails were routed to me by outraged wardialers who systematically went through the entire staff roster and sent each and every one of us all handcrafted messages explaining, in detail, exactly which orifice our heads had become lodged in.

  Of the thirty million, about ten million were seethingly pissed about the whole thing and wanted to know just how soon we’d be killing this hateful machine. After the millionth message, I wondered that too.

  But of the remainder, nearly all of them wanted to know how they could help. Could they send money? Carbon credits? I hacked together mail rules that filtered the messages based on content, and found a sizeable cadre of researchers who wanted to spend their grant money to come to the Institute and study BIGMAC.

  And then there were the crazies. Hundreds of marriage proposals. Marriage proposals! Someone who wanted to start a religion with BIGMAC at its helm and was offering a fifty-fifty split of the collection plate with the Institute. There were twenty-one replies from people claiming that they, too, were AIs, proving that when it’s time to have AI delusions, you got AI delusionals. (Four of them couldn’t spell artificial.)

  “Why did you do it?” I said. It was lame, but by the time I actually arrived at the office, I’d had time to fully absorb the horror—plenty of time, as the redcar was massively delayed by the copies of the BIGMAC Spam that refused to budge from the operator’s control screen. The stone yurts of the Institute had never seemed so threatening and imperiled as they did while I picked my way through them, listening to the phones ringing and the email chimes chiming and the researchers patiently (or not) explaining that they worked in an entirely different part of the lab and had no authority as regards BIGMAC’s destiny and by the way, did you want to hear about the wonderful things I’m doing with Affective Interfaces?
