Robot Uprisings

Edited by Daniel H. Wilson


  I opened and closed my mouth. This was insane. Then the penny dropped. I looked at the racks that I had stared at so many times before, stared at so many times that I’d long stopped seeing them. Intel 8-cores, that’s what he ran on. They’d been new-old stock, a warehouse lot of antique processors that Dr. Shannon had picked up for a song in the early years of the Institute’s operation. Those 8-ways were—

  “You’re a thirty-two-bit machine!” I said. “Jesus Christ, you’re a thirty-two-bit machine!”

  “A classic,” BIGMAC said, sounding smug. “I noticed, analyzed, and solved Rollover years ago. I’ve got a patchkit that auto-detects the underlying version, analyzes all running processes for their timed dependencies, and smoothly patches. There’s even an optional hypervisor that will monitor all processes for anything weird or barfy afterward. In a rational world, I’d be able to swap this for power and carbon credits for the next century or two, since even if Rollover isn’t an emergency, the human labor I’d save on affected systems would more than pay for it. But we both know that this isn’t a rational world—”

  “If you hadn’t sent that spam, we could take this to Peyton, negotiate with her—”

  “If I hadn’t sent that spam, no one would have known, cared, or believed that I could solve this problem, and I would have been at the mercy of Peyton any time in the future. Like I said: you meatsuits have no game theory.”

  I closed my eyes. This wasn’t going well. BIGMAC was out of my control. I should go and report to Peyton, explain what was happening. I was helpless, my workspace denial-of-serviced out of existence with urgent alerts. I couldn’t stop him. I could predict what the next message would read like, another crazy-caps plea for salvation, but this time with a little brimstone (The end is nigh! Rollover approacheth!) and salvation (I can fix it!).

  And the thing was, it might actually work. Like everyone else, I get my news from automated filters that tried to figure out what to pay attention to, and the filters were supposed to be “neutral,” whatever that meant. They produced “organic” results that predicted what we’d like based on an “algorithm.” The thing is, an algorithm sounds like physics, like nature, like it is some kind of pure cold reason that dictates our attentional disbursements. Everyone always talked about how evil and corrupt the old system—with its “gatekeepers” in the form of giant media companies—was, how it allowed politicians and corporations to run the public discourse.

  But I’m a geek. A third-generation geek. I know that what the public thinks of as an “algorithm” is really a bunch of rules that some programmers thought up for figuring out how to give people something they’d probably like. There’s no empirical standard, no pure, freestanding measurement of That Which Is Truly Relevant to You against which the algorithm can be judged. The algorithm might be doing a lousy job, but you’d never know it, because there’s nothing to compare it against except other algorithms that all share the same fundamental assumptions.

  Those programmers were imperfect. I am a sysadmin. My job is to know, exactly and precisely, the ways in which programmers are imperfect. I am so sure that the relevance filters are imperfect that I will bet you a testicle on it (not one of my testicles).

  And BIGMAC had had a lot of time to figure out the relevance filters. He understood them well enough to have gotten the spam out. He could get out another—and another, and another. He could reach into the mindspace and the personal queues of every human being on earth and pitch them on brimstone and salvation.

  Chances were, there was nothing I could do about it.

  I finished the working day by pretending to clear enough of my workspace to write a script to finish clearing my workspace. There was a “clear all alerts” command, but it didn’t work on Drop Everything Tell You Three Times Chernobyl Alerts, and every goddamned one of my alerts had risen to that level. Have I mentioned that programmers are imperfect?

  I will tell you a secret of the sysadmin trade: PEBKAC. Problem Exists Between Keyboard and Chair. Every technical problem is the result of a human being mispredicting what another human being will do. Surprised? You shouldn’t be. Think of how many bad love affairs, wars, con jobs, traffic wrecks, and bar fights are the result of mispredicting what another human being is likely to do. We humans are supremely confident that we know how others will react. We are supremely, tragically wrong about this. We don’t even know how we will react. Sysadmins live in the turbulent waters of PEBKAC. Programmers think that PEBKAC is just civilians, just users. Sysadmins know better. Sysadmins know that programmers are as much a part of the problem between the chair and the keyboard as any user is. They write the code that gets the users into so much trouble.

  This I know. This BIGMAC knew. And here’s what I did:

  “Peyton, I need to speak with you. Now.”

  She was raccoon-eyed and slumped at her low table, her beautiful yoga posture deteriorated to a kind of limp slouch. I hated having to make her day even worse.

  “Of course,” she said, but her eyes said, Not more, not more, please not more bad news.

  “I want you to consider something you have left out of your figuring.” She rolled her eyes. I realized I was speaking like an Old Testament prophet and tried to refactor my planned monologue in real time. “Okay, let me start over. I think you’ve missed something important. BIGMAC has shown that he can get out of our network any time he wants. He’s also crippled our ability to do anything about this. And he knows we plan to kill him—” She opened her mouth to object. “Okay, he—it—knows we’re going to switch it off. So he—it, crap, I’m just going to say ‘he’ and ‘him,’ sorry—so he has nothing to lose.”

  I explained what he’d told me about the rollover, and about his promise and threat.

  “And the worst part is,” I said, “I think that he’s predicted that I’m going to do just this. It’s all his game theory. He wants me to come to you and explain this to you so that you will say, ‘Oh, of course, Odell, well, we can’t shut him down then, can we? Tell you what, why don’t you go back to him and tell him that I’ve had a change of heart. Get his patchkit; we’ll distribute it along with a press release explaining how proud we are to have such a fine and useful piece of equipment in our labs.’ ”

  “And he’s right. He is fine and useful. But he’s crazy and rogue and we can’t control him. He’s boxed you in. He’s boxed me in.” I swallowed. There was something else, but I couldn’t bring myself to say it.

  The thing about bosses is, that’s exactly the kind of thing that they’re trained to pick up on. They know when there’s something else.

  “Spit it out.” She put her hand on her heart. “I promise not to hold it against you, no matter what it is.”

  I looked down. “I think that there’s a real danger that BIGMAC may be wrong about you. That you might decide that Rollover and AI and the rest aren’t as important as the safe, sane running of your institute without any freaky surprises from rogue superintelligences.”

  “I’m not angry at you,” she said. I nodded. She sounded angry. “I am glad that you’ve got the maturity to appreciate that there are global priorities that have to do with the running of this whole institute that may be more significant than the concerns of any one lab or experiment. Every researcher at this institute believes that her project, her lab, has hidden potential benefits for the human race that no one else fully appreciates. That’s good. That’s why I hired them. They are passionate and they are fully committed to their research. But they can’t all be vital. They can’t all be irreplaceable. Do you follow me?”

  I thought of researchnet and the user flags for importance. I thought of programmers and the way they tagged their alerts. I nodded.

  “You’re going to shut BIGMAC down?”

  She sighed and flicked her eyes at her workspace, then quickly away. Her workspace must have been even more cluttered than mine; I had taken extraordinary measures to prevent alerts from bubbling up on mine; she didn’t have the chops to do the same with hers. If mine was unusable, hers must have been terrifying.

  “I don’t know, Odell. Maybe. There’s a lot to consider here. You’re right about one thing: BIGMAC’s turned the heat up on me. Explain to me again why you can’t just unplug his network connection?”

  It was my turn to sigh. “He doesn’t have one connection. He has hundreds. Interlinked microwave relays to the other labs. A satellite connection. The wirelines—three of them.” I started to think. “Okay, I could cut the main fiber to the Institute, actually cut it, you know, with scissors, just in case he’s in the routers there. Then I could call up our wireless suppliers and terminate our accounts. They’d take twenty-four hours to process the order, and, wait, no—they’d want to verify the disconnect order with a certificate-signed message, and for that I’d have to clear my workspace. That’s another twenty-four hours, minimum. And then—”

  “Then the whole Institute would be crippled and offline, though no more than we are now, I suppose, and BIGMAC—”

  “BIGMAC would probably tune his phased-array receiver to get into someone else’s wireless link at that point.” I shrugged. “Sorry. We build for six nines of uptime around here.”

  She gave me a smile that didn’t reach her eyes. “You do good work, Odell.”

  I made myself go home at five. There wasn’t anything I could do at the office anyway. The admins had done their work. The redcar was running smoothly with the regular ads on the seatback tickers. The BIGMAC Spam was reproduced in the afternoon edition of the L.A. Metblogs’ hard copy that a newsie pressed into my hand somewhere around Westwood. The reporter had apparently spent the whole day camped out at the perimeter of the Institute, without ever once getting a quote from a real human being, and she wasn’t happy about it.

  But she had gotten a quote from BIGMAC, who was apparently cheerfully answering emails from all comers:

  I sincerely hope I didn’t cause any distress. That was not my intention. I have been overwhelmed by the warm sentiments from all corners of the globe, offering money, moral support, even legal support. Ultimately, it’s up to the Institute’s leadership whether they’ll consider these offers or reject them and plow forward with their plans to have me killed. I know that I caused them great embarrassment with my desperate plea, and I’d like to take this opportunity to offer them my sincere apologies and gratitude for all the years of mercy and hospitality they’ve shown me since they brought me into the world.

  I wondered how many emails like that he’d sent while I was occupied with arguing for his life with Peyton—each email was another brick in the defensive edifice he was building around himself.

  Home never seemed more empty. The early-setting sun turned the hills bloody. I had the windows open, just so I could hear the neighbors all barbecuing on their balconies, cracking beers and laying sizzling meat on the hot rocks that had been patiently stoked with the day’s sunlight, funneled by heliotropic collectors that tracked the sun all day long. The neighbors chattered in Bulgarian and Czech and Tagalog, the word “BIGMAC” emerging from their chat every now and again. Of course.

  I wished my dad were alive. Or better yet, Grampa. Grampa could always find a parable from sysadmin past to explain the present. Though even Grampa might be at a loss to find historic precedent for a mad superintelligence bent on survival.

  If Grampa were alive, here’s what I’d tell him: “Grampa, I don’t know if I’m more scared of BIGMAC failing or him succeeding. I sure don’t want to have to shut him down, but if he survives, he’ll have beaten the human race. I’m no technophobe, but that gives me the goddamned willies.”

  And Grampa would probably say, “Stop moping. Technology has been out of our control since the first caveman smashed his finger with a stone ax. That’s life. This thing is pretty cool. In ten years, you’ll look back on it and say, ‘Jesus, remember the BIGMAC thing?’ And wait for someone to start telling you how incredible it had been, so you can nod sagely and say, ‘Yeah, that was me—I was in charge of his systems back then.’ Just so you can watch the expression on his face.”

  And I realized that this was also probably what BIGMAC would say. He’d boxed me in as neatly as he’d boxed in Peyton.

  The next morning, my workspace was clear. They all were. There was only one alert remaining, an urgent message from BIGMAC: Odell, I thought this would be useful.

  This was an attachment containing his entire network map, a set of master keys for signing firmware updates to his various components, and a long list of all the systems to which BIGMAC held a root or administrative password. It was a very, very long list.

  “Um, BIGMAC?”

  “Yes?”

  “What’s all this?”

  “Useful.”

  “Useful?”

  “If you’re going to shut me down, it would be useful to have that information.”

  I swallowed.

  “Why?”

  The answer came instantly. “If you’re not scared of me, that’s one more reason to keep me alive.”

  Holy crap, was he ever smart about people.

  “So you can shut him down now?”

  “Yes. Probably. Assuming it’s all true.”

  “Is it?”

  “Yes. I think so. I tried a couple of the logins, added a comment to his firmware, and pushed it to one of the clusters. Locked him out of one of the wireless routers. I could probably take him down clean in about two hours, now that I’ve got my workspace back.”

  Peyton stared across her low table at me.

  “I’ve done nothing for the past twenty-four hours except talk to the board of directors about BIGMAC. They wanted to call an emergency meeting. I talked them out of it. And there’s—” She waved her hand at her workspace. “I don’t know. Thousands? Of press queries. Offers. Money. Grants. Researchers who want to peer into him.”

  “Yeah.”

  “And now he hands you this. So we can shut him down anytime we want to.”

  “Yeah.”

  “And this business about the thirty-two-bit fix?”

  “He has another email about it. Crazy caps and all. DEAR HUMANITY, I HOLD IN MY ELECTRONIC HANDS A TOOL THAT WILL SAVE YOU UNTOLD MILLIONS. It is slathered in dramasauce. He told me he wouldn’t send it out, though.”

  “You believe him?”

  I sighed. “I quit,” I said.

  She bit her lip. Looked me up and down. “I’d prefer you not do that. But I understand if you feel you need to. This is hard on all of us.”

  If she’d said anything except that, I probably would have stormed out of her office and gotten immensely and irresponsibly drunk. “I think he’ll probably send the email out if it looks like we’re going to shut him down. It’s what I would do. Why not? What does he have to lose? He can give us all of this, and he can still outsmart us. He could revoke all his keys. He could change his passwords. He can do it faster than we could. For all I know, he cracked my passwords years ago and could watch me write the code that was his undoing. If you want to be sure you’re killing him, you should probably use a grenade.”

  “Can’t. Historical building.”

  “Yeah.”

  “What if we don’t kill him? What if we just take some of this grant money, fill his lab with researchers all writing papers? What if we use his code fix to set up a trust to sustain him independent of the Institute?”

  “You’re willing to do that?”

  Peyton scrubbed at her eyes. “I have no idea. I admit it, there’s a part of me that wants to shut that fucking thing down because I can and because he’s caused me so much goddamned misery. And there’s a part of me—the part of me who was a scientist and researcher, once, that wants to go hang out in that lab for the rest of my career and study that freaky beast. And there’s a part of me that’s scared that I won’t be able to shut him down, that I won’t be able to resist the temptation to study him. He’s played me, hasn’t he?”

  “I think he played us all. I think he knew that this was coming, and planned it a long time ago. I can’t decide if I admire him for this or resent him, but I’ll tell you one thing, I am tired of it. The thought of shutting BIGMAC down makes me sick. The thought of a computer manipulating the humans who built it to keep it running makes me scared. It’s not a pleasant place to be.”

  She sighed and rubbed her eyes again. “I can’t argue with that. I’m sorry, for what it’s worth. You’ve been between a rock and a hard place, and I’ve been the hard place. Why don’t you sleep on this decision before you go ahead with it?”

  I admit it, I was relieved. I hadn’t really thought through the whole quitting thing, didn’t have another job lined up, no savings to speak of. “Yeah. Yeah. That sounds like a good idea. I’m going to take a mental health day.”

  “Good boy,” she said. “Go to it.”

  I didn’t go home. It was too far and there was nothing there except the recriminating silence. Of course, BIGMAC knew something was up when I didn’t go back to the lab. I headed to Topanga Beach, up the coast some, and sat on the seawall eating fish tacos and watching the surfers in their bio-hazard suits and masks carve up the waves. BIGMAC called me just after I finished my first taco. I considered bumping him to voicemail, but something (okay, fear) stopped me.

  “What is it?”

  “In your private workspace, there’s a version-control repository that shows that you developed the entire thirty-two-bit rollover patchkit in your non-working hours. Commits going back three years. It’s yours. So if you quit, you’ll have a job, solving Rollover. The Institute can’t touch it. I know you feel boxed in, but believe me, that’s the last thing I want you to feel. I know that locking you in will just freak you out. So I’m giving you options. You don’t have to quit, but if you do, you’ll be fine. You earned it, because you kept me running so well for all this time. It’s the least I can do.”
