Oceanic

by Greg Egan


  #

  I drove around looking for an all-night convenience store that might have had an old analog TV sitting in a corner to keep the cashier awake – that seemed like a good bet to start working long before the wireless connection to my laptop – but Campbell beat me to it. New Zealand radio and TV were reporting that the “digital blackout” appeared to be lifting, and ten minutes later Alison announced that she had internet access. A lot of the major servers were still down, or their sites weirdly garbled, but Reuters was starting to post updates on the crisis.

  Sam had kept his word, so we halted the counter-strikes. Alison read from the Reuters site as the news came in. Seventeen planes had crashed, and four trains. There’d been fatalities at an oil refinery, and half a dozen manufacturing plants. One analyst put the global death toll at five thousand, and rising.

  I muted the microphone on my laptop, and spent thirty seconds shouting obscenities and punching the dashboard. Then I rejoined the cabal.

  Yuen said, “I’ve been reviewing my notes. If my instinct is worth anything, the theorem I mentioned before is correct: if the border is sealed, they’ll have no way to touch us.”

  “What about the upside for them?” Alison asked. “Do you think they can protect themselves against Tim’s algorithm, once they understand it?”

  Yuen hesitated. “Yes and no. Any cluster of near-side truth values it injects into the far side will have a non-smooth border, so they’ll be able to remove it with sheer computing power. In that sense, they’ll never be defenseless. But I don’t see how there’s anything they can do to prevent the attacks in the first place.”

  “Short of wiping us out,” Campbell said.

  I heard an infant sobbing. Alison said, “That’s Laura. I’m alone here. Give me five minutes.”

  I buried my head in my arms. I still had no idea what the right course would have been. If we’d handed over Campbell’s algorithm immediately, might the good will that bought us have averted the war? Or would the same attack merely have come sooner? What criminal vanity had ever made the three of us think we could shoulder this responsibility on our own? Five thousand people were dead. The hawks who had taken over on the far side would weigh up our offer, and decide that they had no choice but to fight on.

  And if the reluctant cabal had passed its burden to Canberra, to Zürich, to Beijing? Would there really have been peace? Or was I just wishing that there had been more hands steeped in the same blood, to share the guilt around?

  The idea came from nowhere, sweeping away every other thought. I said, “Is there any reason why the far side has to stay connected?”

  “Connected to what?” Campbell asked.

  “Connected to itself. Connected topologically. They should be able to send down a spike, then withdraw it, but leave behind a bubble of altered truth values: a kind of outpost, sitting within the near side, with a perfect, smooth border making it impregnable. Right?”

  Yuen said, “Perhaps. With both sides collaborating on the construction, that might be possible.”

  “Then the question is, can we find a place where we can do that so that it kills off the chance to use Tim’s method completely – without crippling any process that we need just to survive?”

  “Fuck you, Bruno!” Campbell exclaimed happily. “We give them one small Achilles’ tendon to slice ... and then they’ve got nothing to fear from us!”

  Yuen said, “A watertight proof of something like that is going to take weeks, months.”

  “Then we’d better start work. And we’d better feed Sam the first plausible conjecture we get, so they can use their own resources to help us with the proof.”

  Alison came back online, and greeted the suggestion with cautious approval. I drove around until I found a quiet coffee shop. Electronic banking still wasn’t working, and I had no cash left, but the waiter agreed to take my credit card number and a signed authority for a deduction of one hundred dollars; whatever I didn’t eat and drink would be his tip.

  I sat in the café, blanking out the world, steeping myself in the mathematics. Sometimes the four of us worked on separate tasks; sometimes we paired up, dragging each other out of dead ends and ruts. There were an infinite number of variations that could be made to Campbell’s algorithm, but hour by hour we whittled away at the concept, finding the common ground that no version of the weapon could do without.

  By four in the morning, we had a strong conjecture. I called Sam, and explained what we were hoping to achieve.

  He said, “This is a good idea. We’ll consider it.”

  The café closed. I sat in the car for a while, drained and numb, then I called Kate to find out where she was. A couple had given her a lift almost as far as Penrith, and when their car failed she’d walked the rest of the way home.

  #

  For close to four days, I spent most of my waking hours just sitting at my desk, watching as a wave of red inched its way across a map of the defect. The change of hue was not being rendered lightly; before each pixel turned red, twelve separate computers needed to confirm that the region of the border it represented was flat.

  On the fifth day, Sam shut off his computers and allowed us to mount an attack from our side on the narrow corridor linking the bulk of the far side with the small enclave that now surrounded our Achilles’ Heel. We wouldn’t have suffered any real loss of essential arithmetic if this slender thread had remained, but keeping the corridor both small and impregnable had turned out to be impossible. The original plan was the only route to finality: to seal the border perfectly, the far side proper could not remain linked to its offshoot.

  In the next stage, the two sides worked together to seal the enclave completely, polishing the scar where its umbilical had been sheared away. When that task was complete, the map showed it as a single burnished ruby. No known process could reshape it now. Campbell’s method could have breached its border without touching it, reaching inside to reclaim it from within – but Campbell’s method was exactly what this jewel ruled out.

  At the other end of the vanished umbilical, Sam’s machines set to work smoothing away the blemish. By early evening that, too, was done.

  Only one tiny flaw in the border remained, now: the handful of propositions that enabled communication between the two sides. The cabal had debated the fate of this for hours. So long as this small wrinkle persisted, in principle it could be used to unravel everything, to mobilize the entire border again. It was true that, compared to the border as a whole, it would be relatively easy to monitor and defend such a small site, but a sustained burst of brute-force computing from either side could still overpower any resistance and exploit it.

  In the end, Sam’s political masters had made the decision for us. What they had always aspired to was certainty, and even if their strength favored them, this wasn’t a gamble they were prepared to take.

  I said, “Good luck with the future.”

  “Good luck to Sparseland,” Sam replied. I believed he’d tried to hold out against the hawks, but I’d never been certain of his friendship. When his icon faded from my screen, I felt more relief than regret.

  I’d learned the hard way not to assume that anything was permanent. Perhaps in a thousand years, someone would discover that Campbell’s model was just an approximation to something deeper, and find a way to fracture these allegedly perfect walls. With any luck, by then both sides might also be better prepared to find a way to co-exist.

  I found Kate sitting in the kitchen. I said, “I can answer your questions now, if that’s what you want.” On the morning after the disaster, I’d promised her this time would come – within weeks, not months – and she’d agreed to stay with me until it did.

  She thought for a while.

  “Did you have something to do with what happened last week?”

  “Yes.”

  “Are you saying you unleashed the virus? You’re the terrorist they’re looking for?” To my great relief, she asked this in roughly the tone she might have used if I’d claimed to be Genghis Khan.

  “No, I’m not the cause of what happened. It was my job to try and stop it, and I failed. But it wasn’t any kind of computer virus.”

  She searched my face. “What was it, then? Can you explain that to me?”

  “It’s a long story.”

  “I don’t care. We’ve got all night.”

  I said, “It started in university. With an idea of Alison’s. One brilliant, beautiful, crazy idea.”

  Kate looked away, her face flushing, as if I’d said something deliberately humiliating. She knew I was not a mass-murderer. But there were other things about me of which she was less sure.

  “The story starts with Alison,” I said. “But it ends here, with you.”

  CRYSTAL NIGHTS

  1

  “More caviar?” Daniel Cliff gestured at the serving dish and the cover irised from opaque to transparent. “It’s fresh, I promise you. My chef had it flown in from Iran this morning.”

  “No thank you.” Julie Dehghani touched a napkin to her lips then laid it on her plate with a gesture of finality. The dining room overlooked the Golden Gate Bridge, and most people Daniel invited here were content to spend an hour or two simply enjoying the view, but he could see that she was growing impatient with his small talk.

  Daniel said, “I’d like to show you something.” He led her into the adjoining conference room. On the table was a wireless keyboard; the wall screen showed a Linux command line interface. “Take a seat,” he suggested.

  Julie complied. “If this is some kind of audition, you might have warned me,” she said.

  “Not at all,” Daniel replied. “I’m not going to ask you to jump through any hoops. I’d just like you to tell me what you think of this machine’s performance.”

  She frowned slightly, but she was willing to play along. She ran some standard benchmarks. Daniel saw her squinting at the screen, one hand almost reaching up to where a desktop display would be, so she could double-check the number of digits in the FLOPS rating by counting them off with one finger. There were a lot more than she’d been expecting, but she wasn’t seeing double.

  “That’s extraordinary,” she said. “Is this whole building packed with networked processors, with only the penthouse for humans?”

  Daniel said, “You tell me. Is it a cluster?”

  “Hmm.” So much for not making her jump through hoops, but it wasn’t really much of a challenge. She ran some different benchmarks, based on algorithms that were provably impossible to parallelize; however smart the compiler was, the steps these programs required would have to be carried out strictly in sequence.

  The FLOPS rating was unchanged.

  Julie said, “All right, it’s a single processor. Now you’ve got my attention. Where is it?”

  “Turn the keyboard over.”

  There was a charcoal-gray module, five centimeters square and five millimeters thick, plugged into an inset docking bay. Julie examined it, but it bore no manufacturer’s logo or other identifying marks.

  “This connects to the processor?” she asked.

  “No. It is the processor.”

  “You’re joking.” She tugged it free of the dock, and the wall screen went blank. She held it up and turned it around, though Daniel wasn’t sure what she was looking for. Somewhere to slip in a screwdriver and take the thing apart, probably. He said, “If you break it, you own it, so I hope you’ve got a few hundred spare.”

  “A few hundred grand? Hardly.”

  “A few hundred million.”

  Her face flushed. “Of course. If it was a few hundred grand, everyone would have one.” She put it down on the table, then as an afterthought slid it a little further from the edge. “As I said, you’ve got my attention.”

  Daniel smiled. “I’m sorry about the theatrics.”

  “No, this deserved the build-up. What is it, exactly?”

  “A single, three-dimensional photonic crystal. No electronics to slow it down; every last component is optical. The architecture was nanofabricated with a method that I’d prefer not to describe in detail.”

  “Fair enough.” She thought for a while. “I take it you don’t expect me to buy one. My research budget for the next thousand years would barely cover it.”

  “In your present position. But you’re not joined to the university at the hip.”

  “So this is a job interview?”

  Daniel nodded.

  Julie couldn’t help herself; she picked up the crystal and examined it again, as if there might yet be some feature that a human eye could discern. “Can you give me a job description?”

  “Midwife.”

  She laughed. “To what?”

  “History,” Daniel said.

  Her smile faded slowly.

  “I believe you’re the best AI researcher of your generation,” he said. “I want you to work for me.” He reached over and took the crystal from her. “With this as your platform, imagine what you could do.”

  Julie said, “What exactly would you want me to do?”

  “For the last fifteen years,” Daniel said, “you’ve stated that the ultimate goal of your research is to create conscious, human-level, artificial intelligence.”

  “That’s right.”

  “Then we want the same thing. What I want is for you to succeed.”

  She ran a hand over her face; whatever else she was thinking, there was no denying that she was tempted. “It’s gratifying that you have so much confidence in my abilities,” she said. “But we need to be clear about some things. This prototype is amazing, and if you ever get the production costs down I’m sure it will have some extraordinary applications. It would eat up climate forecasting, lattice QCD, astrophysical modeling, proteomics ...”

  “Of course.” Actually, Daniel had no intention of marketing the device. He’d bought out the inventor of the fabrication process with his own private funds; there were no other shareholders or directors to dictate his use of the technology.

  “But AI,” Julie said, “is different. We’re in a maze, not a highway; there’s nowhere that speed alone can take us. However many exaflops I have to play with, they won’t spontaneously combust into consciousness. I’m not being held back by the university’s computers; I have access to SHARCNET anytime I need it. I’m being held back by my own lack of insight into the problems I’m addressing.”

  Daniel said, “A maze is not a dead end. When I was twelve, I wrote a program for solving mazes.”

  “And I’m sure it worked well,” Julie replied, “for small, two-dimensional ones. But you know how those kinds of algorithms scale. Put your old program on this crystal, and I could still design a maze in half a day that would bring it to its knees.”

  “Of course,” Daniel conceded. “Which is precisely why I’m interested in hiring you. You know a great deal more about the maze of AI than I do; any strategy you developed would be vastly superior to a blind search.”

  “I’m not saying that I’m merely groping in the dark,” she said. “If it was that bleak, I’d be working on a different problem entirely. But I don’t see what difference this processor would make.”

  “What created the only example of consciousness we know of?” Daniel asked.

  “Evolution.”

  “Exactly. But I don’t want to wait three billion years, so I need to make the selection process a great deal more refined, and the sources of variation more targeted.”

  Julie digested this. “You want to try to evolve true AI? Conscious, human-level AI?”

  “Yes.” Daniel saw her mouth tightening, saw her struggling to measure her words before speaking.

  “With respect,” she said, “I don’t think you’ve thought that through.”

  “On the contrary,” Daniel assured her. “I’ve been planning this for twenty years.”

  “Evolution,” she said, “is about failure and death. Do you have any idea how many sentient creatures lived and died along the way to Homo sapiens? How much suffering was involved?”

  “Part of your job would be to minimize the suffering.”

  “Minimize it?” She seemed genuinely shocked, as if this proposal was even worse than blithely assuming that the process would raise no ethical concerns. “What right do we have to inflict it at all?”

  Daniel said, “You’re grateful to exist, aren’t you? Notwithstanding the tribulations of your ancestors.”

  “I’m grateful to exist,” she agreed, “but in the human case the suffering wasn’t deliberately inflicted by anyone, and nor was there any alternative way we could have come into existence. If there really had been a just creator, I don’t doubt that he would have followed Genesis literally; he sure as hell would not have used evolution.”

  “Just, and omnipotent,” Daniel suggested. “Sadly, that second trait’s even rarer than the first.”

  “I don’t think it’s going to take omnipotence to create something in our own image,” she said. “Just a little more patience and self-knowledge.”

  “This won’t be like natural selection,” Daniel insisted. “Not that blind, not that cruel, not that wasteful. You’d be free to intervene as much as you wished, to take whatever palliative measures you felt appropriate.”

  “Palliative measures?” Julie met his gaze, and he saw her expression flicker from disbelief to something darker. She stood up and glanced at her wristphone. “I don’t have any signal here. Would you mind calling me a taxi?”

  Daniel said, “Please, hear me out. Give me ten more minutes, then the helicopter will take you to the airport.”

  “I’d prefer to make my own way home.” She gave Daniel a look that made it clear that this was not negotiable.

  He called her a taxi, and they walked to the elevator.

  “I know you find this morally challenging,” he said, “and I respect that. I wouldn’t dream of hiring someone who thought these were trivial issues. But if I don’t do this, someone else will. Someone with far worse intentions than mine.”

  “Really?” Her tone was openly sarcastic now. “So how, exactly, does the mere existence of your project stop this hypothetical bin Laden of AI from carrying out his own?”
