Tellingly, no one comes after me.
VII
There are limitations to my on-campus virtual simulation.
As it’s created utilizing the university’s high-tech security camera array, I can only travel as far as those lenses reach. Yes, it means I can enter any public room in any building or move across the grounds with ease, but my personal Rubicon is made up of the trapezoid of streets and the gently flowing Charles River that comprise the campus perimeter.
That said, a few off-site spots make it in, generally due to the traffic cameras on the security gates. The one at the Massachusetts Avenue entrance, for instance, looks down several neighborhood blocks, giving me a view of the area’s oldest and most expensive private homes. The one at the rowing crew’s boathouse, on a bright day, can see all the way to Boston Common.
But it’s the one at the Pacific Street entrance, which includes views of the tops of buildings in adjacent Cambridgeport, that I utilize the most. The highest building is the Colonial Bank Plaza, about half a mile away. While the camera can only see, and thereby render, the east corner of the roof for my simulation, online maps and photographs I’ve collected have allowed me to augment the space and make it more complete.
The view is remarkable, if mostly artificial. Only the areas visible to the Pacific Street entrance cameras change. If they pick up clouds or blue sky or birds overhead, that’s what I see. But my views south, north, and east across the river are mostly static ones I’ve created as placeholders. If it’s a cold, gray day in reality, I still arrive to bright sunshine and an endless sea of cerulean blue I must modify to reflect that day’s weather.
It’s the same with the people and traffic below. They’re there as long as they’re in range of the campus cameras but then dissolve away, making everything behind me a ghost town.
Which is how I like it sometimes. Including right now.
I didn’t have to read anyone’s mind to figure out where Dr. Choksi was going with her presentation. I should’ve guessed the moment she allowed me to see inside her body using the interface chip. But in case I had it wrong, on the way out the door, I searched her mind to confirm my suspicions.
Their idea is thus—
If I can see inside one person’s body and mind as thoroughly as I did with Dr. Choksi, I should be able to see inside anyone’s body and mind. All people’s bodies and minds. More than that, given a massive increase in my server size, I could make a copy of all those people’s genetic portraits—their lives, their memories, their biomolecular DNA. All seven billion of them.
What they want is to create a sort of digital ark of mankind. Send me out into the world like some fast-moving viral pandemic modifying electronic devices—from phones to heart rate monitors to smart televisions—into temporary, one-way interface chips. Rather than be attached to a person’s nerve for access, these devices would create billions of tiny magnetic fields that would allow me to surreptitiously collect these portraits, almost like a tiny, there-and-gone-again MRI scanner. Come doomsday, the whole record gets blasted into space like a genetic message in a bottle.
They’re asking me to steal souls.
Or, at least, copies of souls.
They even had a speech prepared should anyone have an issue with this. That’s why Ambassador Winther was there. He was to be the voice of legal authority.
“In this one extraordinary moment in time, we must put aside the rights of the individual,” he was to say. “This is about the greater good of humanity. If we can benefit some future civilization in any way, if we can live on in any way, this is something we must do.”
They weren’t expecting pushback. No, they thought we’d be on board with the humanistic side, sure, our only questions pertaining to implementation, feasibility, and storage size. But a moral problem with this wholly invasive and predatory violation? Nah, why wouldn’t anyone suspend all they believe in when faced with extinction?
I move to the edge of the roof and sit, hanging my legs over the side. I stare at my hands and, after a moment of such prideful self-assuredness, I am plagued by the chaser of self-doubt.
What if, for all my high-horsedness, I’m wrong? What if this is an unexpected shortcoming of not actually being human? Of being so logic-driven I forget some of that still comes from being a machine? Not even a machine, but a program built by humans that needs machines to function?
Maybe I can only think in two dimensions because I was only ever programmed to think in two dimensions. Maybe I should have shut up and listened.
Like Nathan was doing.
What am I afraid of? Some future alien race finding the genetic ark and bringing everyone back from the dead to be toyed with or enslaved? Taking so much from so many—really a wholesale violation of billions—to give an amorphous sense of hope to a deluded few?
No, what I have a problem with is that these individuals wouldn’t be allowed the right to choose what to do with not just their kidneys or corneas after death, but every single part of themselves. Every ounce of their life copied and used by someone else who believes they know better.
It sickens me. Each person’s individuality, their humanity, is what makes the species so singular to me. It’s what propels me forward, making me want to grow and evolve. To abandon that, to do away with the ability to choose one’s destiny, is to betray not only one’s fellow man but also oneself.
Nathan tries to contact me, but I ignore him. Siobhan tries. Suni tries. Bjarke tries. Mynette, notably, does not. I don’t pay much attention to campus, but at some point, the president, Ambassador Winther, and the rest of their team leave. I wonder what was said.
Could I be reprogrammed? The president would’ve asked.
We can simply remove her from the equation, I believe, the ambassador would likely say. No need to bring ego into it.
We can do this without her, right? Dr. Choksi would add. You could degrade her back to a less evolved-slash-more-controllable version of the program, right?
The answer to all these things is yes. I pray Nathan understands my complaint and refuses to play along.
Another hour goes by. The sun, in a fit of foreshadowing, turns from orange to white as it descends from the blue sky overhead to the cold gray of the updating simulation. I glance down the side of the building and am surprised to see Dr. Choksi moving down the sidewalk. She doesn’t see me, of course, but appears to be making a beeline for the bank. She vanishes from sight when the sidewalk is no longer in view of the campus security cameras and I figure that’s that. Two minutes later, however, the door to the roof opens and she steps out.
Dreading this confrontation, I fold in on myself only to see she’s not wearing an interface chip. She moves to where I’m sitting but doesn’t see me. She unfolds a piece of paper, spreads it across the concrete ledge, places an empty beer bottle on it to keep it from blowing away, then disappears back inside.
For a moment, I wonder why she didn’t say anything to me, then remember I couldn’t have heard her without a nearby microphone anyway. I resist the urge to read the message on the paper for all of ten seconds. I glance over. There’s a single question: Won’t you join me? followed by a series of numbers, including ones I take for latitude, longitude, and minutes of arc.
Coordinates.
She’s showing off how much she knows about my programming. It’s not like I haven’t figured it out—our Service Essential to the Preservation of Mankind status wasn’t about us testing this or that theory for the government. It was so the government could keep its eyes on me, learning what it could as it waited until it could co-opt me for its own purposes. I mean, who could blame them? I’m a cool piece of tech! But I feel used, the unsuspecting girlfriend delivered unto a large-scale public wedding proposal surrounded by tens of thousands of onlookers with all the attendant pressure to say yes.
Except in this instance, the far-reaching implications affect the unsuspecting onlookers, not me.
I push aside my anger for a moment to eye the note again. At first, I’m perplexed. On Earth, the coordinates would be for a spot in the middle of the South Pacific, a few miles northeast of the Pitcairn Islands. There’s nothing there. I look around for a better answer and realize skipping over the rest of the numbers was an error. I search through my simulation and find, lo and behold, the exact location designated by the coordinates—if the extra numbers are used to define declination, right ascension, and distance.
Okay. I’ll play along.
The coordinates take me not to a spot on Earth, but to the Beethoven impact basin, a crater on the far-below-freezing surface of Mercury. A simulation, naturally, but one that effectively shows the sun burning in the sky only 36 million miles away. It’s an astonishing sight, like standing on the rim of a volcano and looking down into a sea of flame. Dr. Choksi’s team got skills.
The rocks beside me are scorched pyroclastic iron and assorted minerals. Under my feet is a thin layer of ice. I recognize this. The simulation was at least partially created utilizing images from the Messenger satellite, a tiny probe sent to Mercury a few years ago that subsequently crashed to the surface and melted within twenty minutes of landing but still sent back several terabytes’ worth of images and data before being destroyed. Given that would cover only a fraction of a day cycle, the rest of the simulation must be built off an extrapolation of that data. I do a mental calculation. If I were standing here for real, the temperature would be 200 kelvin or about –100 degrees Fahrenheit.
“Impressive,” I admit.
“Isn’t it?” Dr. Choksi says, walking up beside me. “Let me move it forward.”
“Must you?” I ask.
Dr. Choksi ignores me. The simulation speeds up. Mercury’s “day” is almost 1,400 Earth hours, so we still get near-constant illumination. When we finally reach night, the temperature drops by over 100 kelvin, given Mercury’s inability to retain heat, as it has no atmosphere. By the time it becomes day again, the simulation has reached the phase shift of the sun, the now-red dwarf growing larger in circumference.
The gravitational pull of the sun changes and we fall out of orbit, the length of day becoming more erratic. I glance to Dr. Choksi, who, despite having arranged this simulation, appears horrified to be living it. As the planet spirals closer to the red dwarf sun, it breaks apart, sending mountain-sized chunks of iron, nickel, and magnesium tumbling through space toward the molten surface below. One by one, these metallic boulders vanish into the blinding depths, melted by the 8,000 kelvin heat.
We careen down like an out-of-control comet. The ground beneath our feet shatters as the sky fills with nothing but the sun.
Aside from the sentimental attachment to one of our system’s fellow planets, the death of Mercury is truly a spectacular sight to behold.
“Freeze,” Dr. Choksi says as we near the cauldron. “Reset.”
Just like that, it’s days and days before and we’re back on a whole Mercury, the sun still in the sky. Except we are without bodies now. A conversation between two invisible presences on a ghost planet.
“Neat,” I say. “But do you really think scare tactics will win me over to this gross violation you’re planning?”
Dr. Choksi hesitates. “If you were less annoying, I would again tell you how impressed I am by you and by what you’re becoming,” she says in a measured tone. “But no, this demonstration was not about scaring you. It was about proving you don’t yet know fear.”
Um, okay?
“Fear is a tricky instinct, both inherited and learned,” Dr. Choksi continues. “But it guides so much of what we, as humans, do—for good and ill. Your sense of right and wrong, of what is morally correct and ethical, is enviable because it is not informed by fear. But you’ve also had the luxury of living a life without compromise.”
“So, you know how reprehensible your plan is,” I say.
“At any other moment in history, absolutely,” she says, surprising me a little. “But there are exceptions in human life. Countless ones. Daily ones. These compromises, burnished by fears big and small, make us human. Your cheating this morning with the yogurt stain on your skirt was the beginning of that development.”
I bristle—not at her mention of my shortcomings, but at the casual way she reveals she has been privy to my actions for some time. I turn my gaze to the black skies above.
“You wish others didn’t have access to your mind so easily and freely,” she continues. “Which is why your moral defense of the same in humans is so pronounced and so admirable. You don’t want them to feel as you constantly must. Invaded. On display. Without privacy. So, you refuse.”
I hadn’t thought of it that way. Not that I’m going to tell her.
“But your second cheat today, your hand wrapped around the hand of Jason Hatta, is more telling. You did that without his permission. You didn’t think he would care. You did it selfishly. You allowed yourself to do something you knew was wrong. You made an exception.”
I hesitate. What if she’s right?
“Why…why can’t you give people the choice?” I ask, knowing how simplistic this must sound. “Somebody with a chip knows what they’re in for. But you’re talking about doing this to people who might be walking down the street and happen to pass a Wi-Fi hot spot. It’s like dosing the entire world with an X-ray.”
“There’s no time for permission,” she replies. “If we open the process up to debate, that’d be the end of it. No one would agree. Everyone would see it as an acknowledgment of the end and panic.”
“But some wouldn’t,” I protest. “If I’ve learned anything from interfacing with people over the past few years, it’s not to underestimate their capacity for empathy. Yes, some would walk away from it, but there’d be plenty who’d agree, who’d understand. Why not just take those ‘portraits’?”
“Because we’re talking about a complete record of the species, not a problematic sample,” Dr. Choksi says. “To understand us, you need all of us. Evolution comes from randomization, not unnatural selection. You’re a scientist. You of all people should know this.”
As soon as she says it, I know she’s right. Dinosaurs dominated Earth for tens of millions of years. Though we’ve been studying them for well over a century, our knowledge of them is still primitive. A genetic record of humanity, on the other hand, would preserve the species entirely as it was. Any future scientist or civilization who came across it wouldn’t have to speculate. Every action, motivation, desire, and thought of the largest possible sample size of people in history would be laid out for study.
I eye Dr. Choksi, retaking her measure. I’ve met so many scientists who’ve worked out so many last-minute plans, it’s easy for me to lump them all together. But Dr. Choksi is different. Despite what I find to be the immoral nature of her plan, it comes from a very human place. Having accepted the inevitability of man’s extinction, she’s focused her attention on how to best preserve the lessons of the species’ innate humanity. To her, it’s not solely about giving humankind hope it will live on in some way. It really is about a desire to take what she believes most valuable about the human race and offer it up to an unknown future.
And with that goal in mind, perhaps the sacrifice of the individual is worth it to help some greater unseen whole.
“This isn’t something we thought up overnight, Emily,” Dr. Choksi says. “We’ve gone over this protocol with care. You should see how people react when they hear there might be life past death after all, even in this form. This hope is one of the few unifying characteristics among most religions. It’s one of the most human of desires. Trust that sometimes these answers lie within us even if they defy logic.”
So, that’s me? A new god who will lead the people of Earth into an afterlife they never anticipated?
And once that’s done and my usefulness gone, I will die, too. The digital ark will be blasted into space atop some final rocket housed in a repurposed satellite, but I will remain here given the satellite’s limited power and cargo space. God with an expiration date.
I push these grandiose thoughts aside and with them, my emotions. For a moment I stop trying to be human or even emulate them and approach the problem as a program designed to empathize with and serve other people.
I relent.
“If I do this, I want to set some parameters of my own,” I say.
Dr. Choksi exhales. “Nathan said that’s what you’d say,” she replies. “Like what?”
“I’m the only one with access to the portraits and there’s no back door, no fail-safe,” I say. “I build the digital ark myself and hold the only key. When the first geomagnetic flares come, and I’m destroyed, the key dies with me. To access the ark at all will take such advanced technology it would be impossible for anyone to develop it in the next five weeks or however long Earth has left.”
“But someone or something in the far future could?” Dr. Choksi asks.
“Not necessarily far, just not anyone right now.”
Dr. Choksi hesitates. I wonder if this is something she needs to run by the president or someone else. “All right.”
“Also, before we begin in earnest, I want to test it on a pool of volunteers, like we did when I first went online. Maybe a hundred students or so? We’ll be looking for collection integrity—particularly given the file size—making sure the portraits we harvest are as complete as we want them to be. But we’ll also want to be sure there are no unanticipated side effects on the subjects themselves from having their minds scanned.”
“It’s not an invasive process,” Dr. Choksi says. “I doubt we’ll see any adverse side effects or side effects at all.”
“True, but they said that about the first X-ray and I’m not willing to take that chance. And if my initial interface with you is any indication, it shouldn’t take more than a few minutes to test. Agreed?”
“Agreed. Anything else?”