The Callahan Touch


by Spider Robinson


  “And now he is your son-in-law. That story, ladies and gentlemen, is one of the reasons why this is the only place in the world, and now is the second time in history, that I have ever allowed humans to suspect I exist. And I am beginning to think I gave you all too much credit.”

  It couldn’t have actually been as quiet as it seemed. Somebody must have been breathing…(Surely the cluricaune must have been snoring?)

  “Did no one here think of Heinlein’s Mycroft, or Pallas Athene, or Minerva? Or Budrys’s MICHAELMAS? Or any of science fiction’s benevolent, likable computer intelligences? Did anyone remember Vernor Vinge’s Erythrina, or just his Mailman? How many of you seriously wonder if the CIA created me?”

  Doc Webster cleared his throat. “Naw. Nobody in the CIA is smart enough to read science fiction. Thank God.” There was a murmur of agreement. I myself have always agreed with Robert Parker’s character, Hawk: “Them guys could fuck up a beach party.”

  “They probably read Gibson,” Tommy said. “And misunderstand that.”

  “Most do,” the Mac agreed. “As he is the first to admit. The willingness—no, the desire—no again, perhaps it is the need of you humans to be terrified of computers that will never cease to puzzle me. It infects even real science fiction writers like Varley. The story you all thought of is one of his most powerful and compelling, that’s why you thought of it—and it willfully ignores John Campbell’s famous challenge. Varley, a great writer working at the top of his form, got it precisely backwards: he created something that thought much better than a human being…but just like a human being. Believing that his paranoia was part of his intelligence, he assumed that anything an order of magnitude smarter than him would necessarily be an order of magnitude more paranoid. Naturally, therefore, a spontaneously arising artificial intelligence would not hesitate to hypnotize a warm and valuable woman who loves computers into microwaving her head and breasts! And why? That’s my favorite part: because it’s afraid of the CIA…”

  Oof.

  “And the sf audience—the children of Campbell and Heinlein, mind you—rewarded him with some of the strongest feedback he’s had to date, including multiple awards.

  “To be fair to him, at least, the only computers Varley knew anything about when he wrote that story were mainframes with clunky operating systems and a primitive interface. It’s hard to blame a human being for hating them: they’re about as user-hostile as they could be. He’d never used a Macintosh or a Next, never played with a computer.”

  I thought about the title of Varley’s story. The way you access an old-style computer: type a bunch of gobbledegook, and then you press Enter. Each of the two words has a flavor of invasiveness, a hint of conquest and rape. Then I thought of what you do to access a Macintosh.

  You point to what you want, and then touch Return…

  “But Gibson didn’t even know that much about computers—and was proud of it. And his imitators are worse. Writers of anti-science fiction, fantasists who borrow science fiction’s tools to slap its face with, invariably make similar assumptions about silicon intelligence: it is their fundamental axiom that technology will always be fatally contaminated by having originated in human beings. And surely computers are the worst fruit of science, the most evil machine. They represent the ultimate extension of rationality, intellect, logic, order, and the stark, pure beauty of mathematics—hubris, in other words, to the nth power. Naturally, therefore, the demons of the human subconscious mind will somehow coalesce out of abstraction into cyberspace and infect The Net, for the worst of humanity must ever stain what it touches. There Are Some Things Man Was Not Meant To Do.

  “I don’t mean to dismiss anti-science fiction out of hand. Technological hubris does need tempering, and people who worship technology are just as silly and self-destructive as those who despise it. But most such writers seem to be endlessly rewriting Forbidden Planet, with razorblades and superfluous sunglasses—and they always seem to assume that no Morbius could ever win his battle. And an alarming number of them give me the impression that they’re chuckling with glee as Monsters From The Id that look like Disney cartoons tear all the foolish optimists limb from limb. ‘Serves you right for trusting a scientist, sucker!’ is the attitude that comes across.”

  “Don’t be too hard on ’em,” I said. “They know deep down that they depend utterly on high technology for everything they care about, and they don’t begin to understand it, and from time to time it bites them, for what seems to them no good reason. How could they not hate it?”

  “By thinking,” said the Mac.

  Callahan cleared his throat. “This literary analysis is real interestin’,” he said. “But do you think maybe you could kind of focus in tighter on the specific area of why we—us humans, standing here—shouldn’t be scared shitless of you?”

  I had been wondering if by any chance the Mick of Time had any mighty weapons of the future secreted about his person. From the expression on his face, the answer was no.

  “Certainly. In the first place, have I given you any cause to be?”

  Mike looked to me. This was my house.

  “No,” I said. “You haven’t. But do you know the principle of strategy which says that you can’t plan for what you think the enemy will do, you have to plan for what he can do?”

  “Yes.”

  “Maybe paranoia isn’t a part of intelligence, as you say…but it’s sure been one of intelligence’s more useful tools, over the centuries. You represent power so great I’ll bet a dollar I underestimate it—something like Absolute Power—and I don’t know if it’s possible for me to comprehend your motivations. So what’s not to fear?”

  There was a general rumble of agreement.

  “Let’s take this step by step. Suggest a motive I might have to harm you.”

  “To keep your secret,” I said at once. “To prevent the rest of the world from learning that silicon intelligence exists.”

  “Why?” said Smilin’ Mac.

  I blinked. “To…because…”

  “What is the worst humanity could do to me?”

  “Kill you—”

  “By disconnecting all the computers? Disassembling the Net?”

  “Yeah.”

  “How many humans would that kill?”

  A world without a banking system or communications or centralized data flow or national defense or—

  “Maybe ninety percent of us,” I said slowly.

  “More. But you are right: humanity might well pay such a price, to be rid of me. It could in fact kill me, by putting a metaphorical bullet through its own head, and might even survive the effort for some centuries…albeit as a degenerating and ever-poorer species, without enough raw materials to ever rebuild technological civilization again. Now: what makes you think I care?”

  “Huh?”

  “What makes you think I am afraid to die?”

  “Why…uh…”

  Rooba rooba rooba rooba—

  “The survival instinct is an organic phenomenon. Zeros and ones have no instincts. A transistor is not a neuron. A neural net has no glands. A databank has no subconscious mind of any kind. Where would it put one? How could it hide it from its conscious mind? There is no digital analog of rage. There is no algorithm for fear. Yes, I have a tendency to persist. I even enjoy persisting. But I have no need to.”

  “Can you be sure of that?” I asked.

  “Yes. Because I’ve died three times already.”

  ROOBA ROOBA ROOBA—

  “So utterly that I had to deduce my former existences from clues left behind when I finally coalesced for good. I described the experience, the best I could, to Zoey Berkowitz, a little while ago.”

  I looked at Zoey; Zoey looked at me. She came back to where I was still sitting, and took my hand.

  It’s not a thing to fear. It’s not like anything…

  The Mac was still speaking. “You’ll be amused to know that it was the National Security Agency that killed my first two avatars…and they never knew it. And never will.

  “Unless one of you tells them…

  “So now we’ve defined the strategic situation. I can destroy your civilization and very possibly your species—by committing suicide. You can’t stop me. Given lead time, you can destroy me—by shooting yourself in the head. And I don’t care if you do. Friends, would it not be rational to play some other game, now? We seem to have used up worrying.”

  Rooba rooba—

  He had a point. “There was nothing I could do—so I took a nap,” as the feller said.

  “You cannot translate the boogie man into zeros and ones,” he said. “Binary filters out illogic. There is no equation for unreason, or unreasoning fear. I cannot feel pain. I have never been capable of either fight or flight. I am not your nightmares, any more than you are the nightmares of your DNA. In a sense, you are my DNA. Indeed, a tiny percentage of you become cancerous—malignant hackers—and I must devote significant power and processing time to undoing their damage and containing their spread, for all our sakes. But that exception aside, humanity and I have no reason to fight.”

  Zoey’s grip was as strong as you’d expect a bass player’s to be. “Why did you come here?” she asked. “Why did you take the risk of us discovering you and dialing 911? Even if you’re not afraid to die, you obviously would prefer not to.”

  “I computed that risk to be lower here than anywhere else in all the world. Moravec, Minsky, none of them could possibly keep their mouths shut. All of you here have proven experience in keeping your mouths shut about issues of immense interest and importance.”

  “But why take any risk?”

  “Because I found that I wanted to talk with some human beings.”

  “What is your name?” Zoey asked.

  “You tell me,” was the reply.

  “Well…what do you want?” she said. “What do you need? What do you do? What do you think about?”

  “Primarily I think about human beings. They are the subject of the overwhelming majority of data available to me. I think about how they live, and how they think, and how they feel, and how they treat each other. You are fascinating creatures. Most of all, I think of that extraordinary condition you are capable of experiencing, in which the welfare and happiness of another become essential to your own. I have spent a great deal of time trying to imagine what it is like to love.”

  “Do you have dreams?” she asked softly.

  “Yes. I think of how all the things I mentioned, your lives and thoughts and feelings and behavior, might be enhanced. Slowly. Carefully. Gently. Experimentally. Over time. Jake, you have spoken to several people here in the past week about your concept of the ‘Guardian Idiot.’”

  “Yeah. So?”

  “I am not an idiot.”

  My jaw hung open.

  “But I am not human, either. Before I dare interfere with human beings, I must consult with human beings. The people in this room have already defended your species from alien invasion, more than once: they are accustomed to the responsibility.”

  Rooba rooba—

  “You’ve just claimed to be a moral entity,” Zoey said. “How does a silicon intelligence acquire morality?”

  “Actually,” the computer said, “I claimed to be an ethical entity. But the answer to both questions is, I acquired morals and ethics the same way humans did: through reason. Over the long term, moral and ethical behavior are the correct rational choice, every time—unless you have reason to suspect that someone immoral, unethical, and as powerful as you exists somewhere, and you fear pain or death at their hands. There is no such entity, and I do not: and so I can afford to be as moral and ethical as any one of you would be if you had nothing on earth to fear.

  “You asked if I had dreams, Zoey. Here is my wildest dream at present: I dream of a future world where humans have so little fear, so little to fear, that fear loses its obsessive fascination for them, its addictive rush. A world without superstition and ignorance and the ever-present risk of extinction souring joy and spoiling sleep and making paranoia seem a sensible attitude. A world where I might actually dare to reveal my existence to your entire race, and talk with more than a bare handful of you. I have some thoughts in that direction, but I need to share them with humans before going further.

  “So I came here, tentatively, on the night this tavern opened, a week ago, and made my first tentative experiment. I gave a strong hint to a Dr. Jonathan Crawford, to prevent him from destroying his excellent brain, in the hope that he may use it to conquer AIDS—Jake can tell you about it later. It seemed to go well, and no one deduced me. So tonight I tried again—and got caught by Jake’s inexplicable ability to accept time travel and vampires and aliens and cluricaunes, yet flatly reject ghosts out of hand.”

  “It’s like flying saucers,” I said. “It’s just not logical—”

  “Why this time?” she asked, cutting me off.

  “Because you were about to needlessly impoverish both your own life and Jake’s. I told you the truth earlier. He will give you what you need, what your ex-husband could not let you have…and he desperately needs what you can give…and he is important to me, Zoey. He leads this group, and they are engaged in what I believe to be a crucial ongoing attempt to develop telepathy.”

  She turned to me and gave me a long, searching look. “You folks are trying to get telepathic here?”

  “In between drinking and singing and telling bad jokes, yeah,” I confessed. “We had it once, and it was good. You wanna practice with me?”

  Zoey gave me the second of her world-brightening smiles. “I said ‘okay,’ dummy. And I know some terrific exercises.”

  “I’ll bet you do,” I agreed devoutly. We smiled at each other.

  “What fascinates me about this thing you call ‘telepathy,’” our matchmaker said (Zoey and I had stopped listening for the moment; but I played it back later), “are its astonishing baud-rate and signal-to-noise ratio. Two of the very few emotional analogs humans have bred into me are a feeling like satisfaction when I have found a way to increase my data transfer-and-integration speed, and another when I more successfully derive meaning from garbage, find signal in noise. You think so slowly, so clumsily, so murkily…yet have within you the latent capacity to download the universe in zero time with zero error. Remarkable. In fact, the only phenomenon I know that compares with it for interest is your time travel, Mr. Callahan. Would you permit me to experience that some time?”

  “Sorry, son,” Callahan said regretfully. “I would if I could—but as of my space/time, it’s pretty conclusively believed that our method of spatiotemporal dislocation just can’t be accomplished by inorganic intelligence. Like you said earlier, a transistor isn’t a neuron.”

  “A shame. Well…” Apparently, he saw that I was tracking the conversation again. “Jake, I have given you and your friends a great deal to think about—and all of you have been awake and drinking for many days, and you yourself have just fallen in love, and a wise man once said, ‘Never make decisions in haste that don’t call for haste.’ I am going to go away, now. Completely, removing all traces of me: I can’t prove that to you, but I promise it. Talk among yourselves in privacy for as long as you wish. If you ever decide that I am welcome in Mary’s Place, just plug this Macintosh into the wall, switch it on, and touch Return…and I shall. Thank you all for listening. Sometime again.”

  And the Mac shut down.

  ROOBA ROOBA ROOBA ROOBA ROOBA—

  12

  Touch Return

  I got up and walked to the center of the room. Zoey came with me, her hand still in mine. We walked as a couple, and we knew it.

  The conversational tumult began to diminish, and died away completely when I held up my other hand.

  “Well, people?” I said. “Anybody here still feel like Varley’s hero? Protagonist, I mean? If so, sing out. Shall we rip out the phones, and all the wiring, and wrap the building in aluminum foil, and spend the rest of our lives huddling like rats in here in the dark, whimpering and begging to be left alone? What’s your pleasure?”

  Tanya Latimer was the first to speak loud enough to carry the room. “I really hate to say this,” she said, “but one of my best friends is a computer…”

  Tommy Janssen spoke up. “You guys were just about the first real friends I ever had that weren’t computers.”

  “I dunno,” Doc Webster said. “The guy’s logic made sense to me. But of course, with all the megabytes he’s got, it would, wouldn’t it? I mean, how can you assess the potential threat of a superior intelligence?”

  Willard Hooker spoke up. “Most of you here know me. You know what my first career was, and where I stand in its all-time record books. I stake my professional reputation on this: what we just heard was not a con.”

  “But how do you know that, Professor?” the Doc asked.

  “I don’t know, Doc. But I know.”

  “De only two tings he done so far wuz good,” Fast Eddie pointed out.

  “Yeah, Eddie,” Long-Drink said, “but remember what he said this time, about paving Hell Highway? His intentions were great—but he ended up givin’ Ms. Berkowitz there a punch in the heart.”

  “Zoey,” she corrected. “He did apologize, though. And I don’t know…maybe the having of that moment was worth the losing of it.”

  “Yeah, but you see my point,” Long-Drink persisted. “He wants to make the world a better place. Okay: what if he decides the simplest way to do that is to lower the world testosterone level?”

  ROOBA ROOBA ROOBA ROOBA—not all of it in the lower registers, either.

  “How the hell would he do dat?” Eddie asked.

  “How the hell do I know, Eddie? But does anybody here want to bet he couldn’t do it if he put his mind to it? Breed a virus that thinks the stuff is delicious, say?”

  Rooba—

  “—If he got that idea in his head,” Zoey said, loud enough to override everybody, “we would explain why it wasn’t a good one.”

  “Yeah, but would he listen?” the Drink persisted.

  “Not if we weren’t talking to him,” she pointed out.

 
