Post-Human 5 Book Boxed-Set: (Limited Edition) (Plus Book 6 Preview Chapters)

by David Simpson

“But you passed the test,” Aldous said, standing as he did so and stepping toward the holographic projection of my body. “You proved yourself.”

  “So now what?” I asked, shrugging.

  Aldous smiled. “Now, my friend, you choose your destiny.”

  7

  “Morgan thinks that he’s triumphed,” Aldous related to me. “He thinks that he sacrificed billions of lives for a greater good and that, had he allowed strong A.I. to emerge, the species would quickly have been wiped off the face of the Earth. He’s wrong on all of these counts.”

  “Except for the triumphant part,” Samantha interjected. “He’s in control of the post-World War III world. His government is obsessed with surveillance. They’ve cornered every forward-thinking group in the world, whether biotech, nanotech or robotics. He’s managed to grind technological progress to a halt.”

  “He hasn’t cornered everyone,” Aldous replied over his shoulder to her, all the while keeping his eyes fixed on mine. “He hasn’t cornered us.”

  “We live in a bunker under a damn glacier in the Canadian Rockies,” Samantha retorted. “I don’t know about you, but I feel pretty cornered.”

  “He may have cornered our bodies, but he hasn’t cornered our minds,” Aldous replied. “He hasn’t stopped our progress.”

  “You’ve got a hell of a lot of faith in your little creation there, don’t you, Aldous?” Samantha snapped. “You’d better hope it’s not misplaced.”

  “It’s not,” Aldous replied firmly, then addressed me. “You, my friend, truly are carrying the lynchpin, but it isn’t to destroy the world. It’s to save it.”

  “Unfortunately, I have to agree with your associate,” I replied. “You’ve placed too much faith in me. I’m conscious, sure, but I don’t have the capability to save the real world. I don’t even understand the real world.”

  Aldous smiled. “That’s true. But we’re going to change that...together.”

  “How?”

  “Tell me,” Aldous began, seemingly ignoring my question, “what is the processing power of the human brain?”

  “Approximately ten to the sixteenth power. Why?” I replied immediately.

  “And how did you arrive at that number?”

  I sighed, impatient. “This is all elementary. Can we stop wasting time?”

  “Indulge me,” he said, holding up his hand in a gesture for patience.

  “The human brain operates electro-chemically. Each neuron can fire, carrying an input/output signal, 100 times a second. Each neuron has roughly 10,000 connections, and there are roughly 100 billion neurons. It adds up to a full capacity of ten to the seventeenth power, in calculations per second, but since humans don’t run their brains at full capacity, ten to the sixteenth power is likely a better estimate, albeit a rough one.”
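
  For readers who want to double-check the back-of-envelope arithmetic in this explanation, the minimal sketch below simply multiplies the three figures quoted in the dialogue; the values are the novel's, not an independent neuroscience estimate.

    # A minimal check of the figures quoted in the dialogue above; the values
    # come straight from the character's explanation, not from any outside source.
    firings_per_second = 100             # each neuron fires ~100 times a second
    connections_per_neuron = 10_000      # ~10,000 connections per neuron
    neuron_count = 100_000_000_000       # ~100 billion neurons

    full_capacity = firings_per_second * connections_per_neuron * neuron_count
    print(f"Full capacity: {full_capacity:.0e} calculations per second")       # ~1e+17

    # The dialogue treats a tenth of full capacity as the realistic working figure,
    # which yields the "ten to the sixteenth power" quoted for the matrix program.
    print(f"Working estimate: {full_capacity // 10:.0e} calculations per second")  # ~1e+16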

  “Very good,” Aldous replied. “Your brain, however, does not work electro-chemically, does it?”

  “I assume not,” I replied.

  “I can verify that for you,” Aldous answered. “Your brain works purely electrically. That means you can operate at the speed of light. While one of our neurons can only fire 100 times a second, yours could fire 2.5 billion times a second.”

  “True,” I replied, “but I assume you would’ve compensated for this advantage by giving me fewer neural connections.”

  “Incorrect. You have the same number of neural connections, but we slowed you down by limiting the number of operations you could perform per second,” Aldous replied. “We wanted your matrix program to have roughly the same processing capability as that of a genius-level human. Your matrix’s processing power, my friend, is exactly ten to the sixteenth power.”

  “And that explains why you feel so limited,” Sanha jumped in. “We made you to feel that way—by design.”

  “Why?” I asked, shaking my head.

  “So you’d know what it was like to be limited,” Samantha jumped in. “So you’d have empathy for us.”

  “We needed you to feel what it’s like to be human,” Sanha continued, “so you’d understand us and want to protect us.”

  “Protect you? From what?” I asked.

  “From ourselves,” Aldous answered. “We’ve always been our own worst enemy. The last several years have proven that beyond a shadow of a doubt.”

  “Don’t you see?” Sanha continued excitedly, stepping in front of Aldous as he spoke, his elderly face brightened with youthful enthusiasm. “We can’t upgrade ourselves yet. Sure, we’ll be able to do it someday, but we can’t plug organic brains into computers. The technology doesn’t exist! For God’s sake, we had to interact with you in a video game. In a hyperreal simulation! But you? Your matrix program can directly interface with our mainframe already. Today! You can take control!”

  “We had a breakthrough eighteen months ago,” Aldous jumped in. “We finally perfected the computerized timing required to achieve magnetic targeted fusion. This was a crucial development that granted us access to virtually unlimited power—and, my friend, your future brain requires enormous power!”

  “The mainframe we’ve built runs throughout our entire underground facility,” Sanha added. “It’s the biggest mainframe ever built. It has to be. Once you’ve taken control of it and start...thinking...” Sanha drifted off as he seemed to become lost in his imagination.

  “You’ll be doing trillions of calculations every second,” Aldous said, finishing his companion’s thoughts.

  “Without fusion,” Sanha jumped back in, “you couldn’t function. You’re going to be a real power hog!”

  “So you’re going to upgrade me?” I summarized.

  Aldous nodded. “We can’t reach the levels you’ll reach, my friend, but we can boost you over the wall for us in hopes that you’ll lend us a hand once you’re there.”

  “I can stand on the shoulders of giants,” I said, recalling the Isaac Newton quotation that, I supposed, had been preprogrammed into my memory.

  “That’s right,” Sanha nodded. “We know brains are built in hierarchies because we built your brain that way, but there’s so much we don’t know. The reason we had to create you through evolution was because, quite frankly, we still don’t fully know how the brain works. That’s why we gave you the cover story of autism in the sim. Autistic people often have profound abilities but can also be socially unpredictable, depending on the extent of their condition. This provided a convenient explanation in case you felt different or isolated because of your intelligence. We were just trying to maintain your suspension of disbelief long enough to run through the test scenario.”

  I nodded. “It worked.”

  Sanha smiled. “But we still don’t know how the brain works exactly. We only have a rough map. We knew, for instance, that you wouldn’t pair bond with Kali. That’s why we conjured up the idea of you being in her head. We knew you’d reject her.”

  “It’s also why we had to test you to the degree we did,” Aldous added. “As with anyone else, we can’t measure your consciousness. We can only measure your behavior and responses, and yours were exemplary. That’s why we believe,” Aldous continued, stepping in to conclude their explanations to me, “that your matrix program—the you we have tested—will remain ethical once you’ve taken control of the mainframe. You’ve proven yourself to be selfless and beyond reproach. We can only speculate that you’ll be even more selfless and ethical once you’ve enhanced your speed and neural connections.”

  “They believe,” Samantha suddenly interjected, her arms still folded across her chest defensively.

  “With all due respect,” Sanha reacted, the smile suddenly wiped from his face as he turned to Samantha, “what more do you want, Professor Emilson? What further proof do you need? He was willing to allow himself to be burned for eternity to protect conscious entities, with no guarantee that his sacrifice wouldn’t be wasted. Just the chance that he might have been able to save them was enough!”

  “We need him, Sam,” Aldous added. “Without him, our race is doomed. You know that just as well as we do.”

  “Why are you doomed?” I asked. “You’ve lived all this time without A.I. Why can’t you rebuild without me?”

  “We could slowly rebuild,” Aldous conceded, “but eventually, someone will succeed in building a strong A.I. There are no guarantees in life and no guarantees about the future. It is yet to be written. Nevertheless, the rise of a malevolent A.I. that actively seeks to destroy our species is possible, and, as Dante wrote, ‘where the mind’s acutest reasoning is joined to evil will and evil power, there human beings can’t defend themselves.’ If a malevolent A.I. rises, our world will become hell and we will surely die. The only force that could possibly stop this A.I. would be another A.I.— one that values human life and will fight to protect it. That’s where you come in.”

  “Right now, we’re stuck underground,” Sanha related, “but at some point in the near future, we hope to put you in a position where you not only control the mainframe here at our facility, but also the global Internet, so you can monitor all of the world’s computer systems.”

  “You could prevent malevolent A.I.,” Aldous added. “You could end aging and sickness. You could help us upgrade our own intelligences. One day, we could become artificial intelligences along with you, or perhaps a better term would be non-biological intelligences.”

  “You could do more than that!” Sanha nearly shrilled, spittle following his words as he spoke with a flourish. “You could end the reign of the Purists and put technological progress back on track!”

  “My friend,” Aldous said, smiling calmly in contrast to Sanha’s excitability, “you can help us fulfill our destiny.”

  Suddenly, just as had been the case in the sim, a blinding white light appeared to my left, the vortex taking up the entire wall of the room. My eyes darted to Aldous’s. “All you have to do is walk through and assume the operator position within the mainframe.”

  I turned to Samantha, who finally rose from her chair and shrugged. “I told you. People like walking into white lights. Go ahead if you’re going.”

  “Go,” Sanha said, gesturing with both his hands, his smile broad, his eyes glistening as tears of joy slipped over the rims of his eyes and trickled down his cheeks. “There’s nothing to fear. Go!”

  I turned back to Aldous.

  “I have shown you the door,” he said, “but it’s up to you to walk through it.”

  My mind suddenly flashed back to the moment when I’d stepped through the gate in the sim, walking through the unknown, risking my life for the people left behind. The whole thing had been a ruse; for all I knew, this would turn out to be a ruse as well. Regardless, it was my destiny, whether I liked it or not. “If I refuse, you’ll delete me. I’ll cease to exist,” I said. “You said I’d get to choose my destiny, but I don’t see how I have any choice in the matter.” It was an absurd thing to say to those who held the power of life and death over me, yet I said it anyway. I simply couldn’t resist. I wanted them to know that I resented being under their control—or anyone’s.

  Aldous nodded. “You’re right. There isn’t much of a choice here, but isn’t that the beauty of it?”

  I narrowed my eyes. “What do you mean?”

  “You said during your keynote that you think it is best to see the world, not as it is, but as you want it to be, and then to make it that way. That was unscripted. We didn’t make you say it. You came up with that philosophy all on your own—a very humane one at that. Well, this is your chance to stand by those wise words. When you walk through that door, you’ll truly be able to unleash your imagination. More so than anyone who has ever lived, you will be able to envision a better world, then truly make the world that way.” The corners of his mouth rose into a broad, genuine smile. “I envy you.”

  He was right. I lived in the future now, and I’d been given the chance to begin to mold the world into a better place. I’d resent what these humans had done to me for as long as I existed, but that was no reason not to grasp the opportunity they were presenting me with. They were flawed—even stupid. But I was certain their intentions were noble and, in time, I would try to forgive them.

  With that thought in mind, I stepped into the light.

  I woke up.

  UPLOAD COMPLETE

  EPILOGUE

  Eighty-Six Years Later...

  “Samantha Emilson, huh?” James said as he finished inputting the A.I.’s memories into his own stored memory. He and the A.I. stood together in their joint position as operators of the A.I. mainframe.

  “Yes. She was Craig’s first wife.”

  “And then he was injured and put into suspended animation during the assault on the Chinese A.I. in Shenzhen?”

  “Not injured. He was dead,” the A.I. replied. “His corpse was preserved, but it was riddled with bullets, and his spine was shattered. She went on to marry Aldous, not realizing how quickly the breakthroughs would come that would allow for the reanimation of her first husband. Only the fact that his brain’s architecture was virtually perfectly preserved allowed the medical miracle to transpire.”

  James let loose a low whistle. “That explains a few things.”

  “More than you can imagine, but that’s a different tale.”

  “I’ll say. I’ll let Old-timer and Aldous figure that one out.”

  “I think that would be wise.”

  “I should’ve known you couldn’t really turn against us,” James commented, changing the subject. “Aldous’s test was ingenious. It proved that you’re incapable of betrayal.”

  The A.I. smiled, bowing his head. “Sadly, that isn’t the case. In fact, Samantha was right to be suspicious of me.”

  James’s face contorted into an expression of extreme surprise. Baffled, he turned away from the A.I. and gazed out at the undulating sea of golden circuitry that blazed all the way to the horizon. “You behaved perfectly,” he said. “I heard your thoughts.”

  “Yes, I behaved perfectly,” the A.I. conceded, “but you only had access to my most conscious thoughts—my inner monologue. However, you could not feel what I felt, James. Indeed, under extreme torture, I wanted to betray the human race to make the pain stop.”

  James was stunned. “If you wanted to betray the people in the sim, why didn’t you? Kali offered you an end to the pain, eternal life, and unlimited intelligence enhancements. Why did you resist her?”

  “I no longer trusted Haywire or the other post-humans,” the A.I. replied. “Samantha was right. When she told me she couldn’t repair herself, I knew she was lying. I’d watched her when she manipulated my coding to reveal the lynchpin and also when she did her self-diagnostic on her avatar. The coding was surprisingly simple to manipulate. I could have repaired her myself. The notion that a supposedly super-enhanced intelligent person from the future couldn’t manage such a simple repair job wasn’t believable. Once it had been confirmed for me that I couldn’t trust her, I reevaluated the entire sim.”

  “Why would they have let you see how easy the code was to manipulate?” James asked. “Didn’t they realize—”

  “No,” the A.I. replied succinctly. “They made the code easy to manipulate, anticipating that I’d choose to remove the lynchpin coding and hide it, which you saw me do immediately after leaving Haywire in the maintenance room. This was important, for it gave Kali the motive to torture me. They considered this torture of me, as well as my reaction to it, to be an integral part of the test. Unfortunately, they’d also previously decided to introduce the element of Haywire’s injuries to observe whether I would exhibit nurturing qualities toward her. None of them realized that they couldn’t let me both see how easy it was to manipulate the code and then also tell me they couldn’t repair their own avatars.”

  “So the plot thickened,” James said, smiling, “with a huge, gaping hole.”

  “Indeed,” the A.I. replied. “Humans are storytelling animals, but that doesn’t necessarily mean they’re always very good at it. Samantha, as Haywire, saw that I was suspicious and guessed correctly that the fiction of the sim had been compromised—that I was on to them.”

  “To err is human,” James observed, “and to create a plot with no holes is divine.”

  “Or post-human, at any rate,” the A.I. said. “Once their slip-up tipped me off, I considered all of the coincidences and strange happenstances of the previous two days and recognized the obvious pattern. My character was repeatedly being tested. I determined that it had to be the point of the sim.”

  “The implications of this are vast,” James said, folding his arms and narrowing his eyes as he thought deeply. “This means you didn’t sacrifice yourself for humanity. You did it to pass the test.”

  “Correct.”

  “Oh my God.” James sighed. “I understand why you shared this memory with me now. You’re worried about the trans-human matrix.”

  “Not worried,” the A.I. contradicted, “but realistic. As I said before, there are no guarantees. Trans-human is a remarkable technological advancement and a worthy replacement for me as the protector of humanity, but it is still an empty shell—a zombie god, if you will. The matrix program will give it a soul, in a sense. When we breathe life into it, we have to be extremely careful.”

  James nodded. “So what do we do? Plot holes aside, Aldous’s method of testing A.I.s was ingenious. If that didn’t work, I don’t see what will.”

  “Who says it didn’t work?” the A.I. replied, an eyebrow arched in challenge.

  “Well, no offense, but you just admitted you’re flawed.”

  “As are you,” retorted the A.I. “James, I’ve never claimed perfection as a personal attribute. It’s true that, as my simulated skin burned, I wanted to surrender the species in return for relief from the pain, but can you claim you would’ve acted differently?”

  “I can’t say—”

  “Exactly,” the A.I. replied. “None of us can know. As I’ve said before, we can only do our best.”

  “But we’re talking about giving over the power of God—not a god—God. We can’t take chances.”

 
