This book is a work of fiction. Names, characters, places, and incidents are the product of the author’s imagination or are used fictitiously. Any resemblance to actual events, locales, or persons, living or dead, is coincidental.
Copyright © 2019 by Mark Wheaton
Hachette Book Group supports the right to free expression and the value of copyright. The purpose of copyright is to encourage writers and artists to produce the creative works that enrich our culture.
The scanning, uploading, and distribution of this book without permission is a theft of the author’s intellectual property. If you would like permission to use material from the book (other than for review purposes), please contact [email protected]. Thank you for your support of the author’s rights.
Grand Central Publishing
Hachette Book Group
1290 Avenue of the Americas, New York, NY 10104
grandcentralpublishing.com
twitter.com/grandcentralpub
First Edition: April 2019
Grand Central Publishing is a division of Hachette Book Group, Inc. The Grand Central Publishing name and logo is a trademark of Hachette Book Group, Inc.
The publisher is not responsible for websites (or their content) that are not owned by the publisher.
The Hachette Speakers Bureau provides a wide range of authors for speaking events. To find out more, go to www.hachettespeakersbureau.com or call (866) 376-6591.
Library of Congress Cataloging-in-Publication Data
Names: Wheaton, Mark, 1975- author.
Title: Emily eternal / M. G. Wheaton.
Description: First edition. | New York : Grand Central Publishing, 2019.
Identifiers: LCCN 2018025434| ISBN 9781538730393 (hardcover) | ISBN
9781549115400 (audio download) | ISBN 9781538730416 (ebook)
Subjects: | GSAFD: Science fiction
Classification: LCC PS3623.H4265 E55 2019 | DDC 813/.6—dc23
LC record available at https://lccn.loc.gov/2018025434
ISBNs: 978-1-5387-3039-3 (hardcover), 978-1-5387-3041-6 (ebook)
Table of Contents
Cover
Book Title Page
Copyright
Dedication
Book I
I
II
III
IV
V
VI
VII
VIII
IX
X
XI
XII
XIII
Book II
XIV
XV
XVI
XVII
XVIII
XIX
XX
XXI
XXII
XXIII
Book III
XXIV
XXV
XXVI
XXVII
XXVIII
XXIX
XXX
XXXI
XXXII
XXXIII
Book IV
XXXIV
XXXV
XXXVI
XXXVII
XXXVIII
XXXIX
XL
XLI
Epilogue
Acknowledgments
Reading Group Guide
Newsletters
For Eliza and Wyatt
Book I
I
It’s dark, way too dark for the middle of the day. And that’s not where the sky’s supposed to be.
My ears are filled with the roar of gale-force winds. A loud crack that sounds like the splitting of the earth soon follows. It grows louder, like the splintering of a whole forest of trees.
The ground beneath me gives way and I fall into darkness.
“It was raining before we went to bed,” Regina says, her voice shaky as she speaks from someplace else far away. “The worst was supposed to be over. The river level was dropping.”
Someone screams. In a full-length mirror attached to her closet door, I catch sight of teenage Regina. She’s in her pajamas, pink and blue with panda bears. She’s fourteen but looks much younger. There’s a second scream. Regina looks out into the hallway.
“My sister was in her bedroom,” present-day Regina continues, the tears flowing freely now. “I couldn’t see her, but I could hear her.”
She’s misremembering. A door in the hallway swings open and teenage Regina glimpses a terrified little girl—her sister, Marci—hands gripping her bedframe. There’s another loud crack and the bedframe, the girl, and the entire bedroom vanish.
I don’t consider it a lie, however. Perhaps a necessary omission. Memory is selective, particularly when it comes to trauma. It’s one of the reasons babies have evolved to remember nothing of their early emotional fears.
“What about your mother?” I ask. “Where was she?”
In the here and now, Regina feels my hands take hers. Feels me near, my warmth reminding her she’s safe. This happened long ago.
“I don’t know,” Regina says. “She was in her bedroom, but somehow she must’ve made it upstairs.”
Teenage Regina’s bedroom is in motion now, spinning around and losing chunks of wall, floor, and ceiling as it goes. Regina’s heart rate accelerates, so I move my hands to her elbows. She’s leaning forward, her body tilted into mine almost like an embrace.
“Tell me,” I say, barely above a whisper.
Regina nods and suddenly her mother appears in the room with her younger self. She doesn’t open her mouth, but teenage Regina hears her say, Take my hand.
“She led me to the roof,” Regina tells me.
The walls of teenage Regina’s bedroom fall away completely. The floor becomes the last remnants of a roof. The roar isn’t the wind but a vast churning river carrying the broken remains of Regina’s house. It’s raining, but not hard.
I accelerate my processor speed, so Regina doesn’t perceive my absence and my hunt through the case notes stored on my server. In real life, the river was no wider than about twenty feet. Her perception of great waves crashing around her shattered house? Also, an invention. The National Weather Service later estimated the river was moving at only ten miles per hour.
The most flagrant of her memory’s fabrications, however, is the presence of her mother. When the ground beneath the house, eroded by a week of torrential rains, fell away, the part of the house Regina and her sister were in toppled into the river along with several tons of earth. The body of Regina’s mother was found in the part of the home that remained on shore. She’d been killed instantly when the second floor caved in on the first.
Regina has been told this several times but can’t or won’t accept it. She is convinced her memory of events is correct.
“What happened next?” Happened not happens. A linguistic reminder she survived this.
“I woke up in an ambulance,” Regina says. “My father—he’d been out of town that weekend—later took me to the spot where they found me. I’d gotten tangled in a fallen tree by the bank.”
Though she sees herself there, she has no actual memory, only a series of imagined versions her mind pieced together after the fact. Here lies the problem.
“Regina?” I say. “I’m leaving interface now.”
I’m instantly back in the iLAB building, specifically a lounge decorated to look like an inviting albeit slightly academic therapist’s home office. Regina sits on the wide brown sofa in the center of the room. I sit—or, rather, she perceives that I sit—directly opposite her on a leather-upholstered chair. An interface chip, the small piece of extremely proprietary nanotechnology that allows this back-and-forth, is affixed to a spot on her neck where her jaw meets her ear.
The chip allows me to manipulate Regina’s senses of sight, smell, touch, and hearing. Her eyes tell her brain there’s a Caucasian woman in her early thirties with brown hair, blue-green eyes, and a kind face sitting opposite her. Her ears tell her my voice has a mid-range pitch, not too low, not too high, with a slight New Englander’s accent. Her nose tells her I use mostly fragrance-free soap, a kiwi-infused shampoo, no perfume, but a baby powder–scented antiperspirant. When I touch her hand or even embrace her, I come off as warm, upright but not rigid, and a good hugger.
In return, the chip gives me unlimited access to her brain, including thoughts, memories, learned behaviors, hopes and dreams, worst fears, and all things in between. Utilizing bioalgorithms, I’m able to create a comprehensive neural map of an individual’s mind that can then be used in a therapeutic context to help patients with their issues, large or small. Years of exploratory, so-called talking therapies, brain trauma diagnoses, or even criminal psych evaluations can be distilled into a single session.
Given what mankind suddenly finds itself facing, the arrival of a new piece of tech capable of helping humans process their traumas turns out to be good timing.
“Hey,” I say.
“Hey,” Regina replies, leaning back as if spooked by our proximity to one another.
“That couldn’t have been easy,” I say, straightening as well. “Did you see anything new?”
Regina shakes her head. This is my third session with her, but the first in which we actively went into the traumatic event that has so long defined her life.
“The question is what did you see?” she asks.
The truth. That she has spent a lifetime convinced either she could’ve saved her sister or that her mother chose to save her instead of Marci. Whichever the case, she shoulders the blame for both deaths. This is the reason the raging river looks so much worse in her mind. Her subconscious tries to give her a way out, to prove she could have done nothing. Incredible, no? The human brain, so complex and yet so fragile, makes a terrible thing even worse out of a sense of self-preservation.
But I can’t say that. To do so would be to try and talk her out of one of her most deeply felt beliefs. Only she is capable of that. My job, as her therapist, is not to give her answers, but to get her asking the right questions.
In the six months or so before the world ends, that is.
“I see both the reality and the fiction your mind has built up around it in stark contrast,” I say. “As you’ve aged, your mind has mistaken this fantasy more and more for the truth—for memory. This results in the memory becoming more emotional for you, which allows your mind to embroider it further, expanding the fantasy. Your strongest emotional memory from that day is one of fear, so your mind makes it all scarier, and you’ve never been able to shake the feeling of loss, so your mind amplifies those parts of the memory, cramming a lifetime of those emotions into such a short amount of time. That’s a lot of weight.”
“So I’m lying to myself?” Regina asks, parsing this. “Making it bigger than it was?”
“Not at all,” I say. “You see the memory embroidered by the impact it has had on your life. To see the original memory as I do—in its nakedness, its chaos, its simplicity, and its real horror—would be impossible for your brain to process. So, it presents the facts in a way that matches your emotional response. Does that make sense?”
It doesn’t, but she nods anyway. It might not for a while, either. But if she begins to think of it like that, we can make progress.
“How’s your father?” I ask.
“He’s all right,” she says. “He’s in New Mexico but heading to Central California.”
“You’re joining him?”
“Leaving today,” she says. “I wish you could come, Emily. I think you’d like him. Given the number of people converging on the farm fields there, you’d probably do a lot of good, too.”
“Yeah, well, that’s how I roll,” I say. “When you’re this cool, you make everyone come to you.”
Regina laughs, but it’s bittersweet. We both know she needs a few more sessions. But as is the case with so much these days, we’re out of time. Though I don’t actually exist outside of Regina’s mind, the large-scale server farm that makes this illusion a reality is here at the university and here it’ll stay.
I’m an artificial consciousness (AC), which is totally different from artificial intelligence (AI) (Kind of? Sort of? To me at least), and was in the fifth year of this experiment when the sun began to die. Not die precisely, but it made a sudden and explosive phase shift from a yellow dwarf to a red giant. Imagine a rapidly expanding balloon. Only, this balloon is on fire and devours everything in its path, including planets. While this inevitable outcome in the sun’s stellar life cycle was first predicted as far back as 1906, scientists in recent decades postulated it couldn’t possibly happen for another five billion years.
Oops.
As a product of science myself, I often catch errors made by my creator and his colleagues at our overly esteemed, overly prestigious Massachusetts-based Institute of Technology. My team may be made up of super-genius scientists and lab techs, but they’re only human. (And mostly male. Which presents challenges in the self-actualization department considering they’ve designed their creation to identify as female.) Only, when my team makes a mistake, it’s an ħ or Ψ out of place in a quantum mechanics equation, as opposed to failing to recognize the rapid deceleration of the nuclear reactions that power the sun.
I can fix an error in even the most advanced mathematical engineering process in the blink of an eye. Nobody can restart the sun.
By and large, mankind is taking its forthcoming extinction about as well as could be expected. I empathize because, well, empathizing with mankind is what I was designed to do. Most researchers create AI to design algorithms capable of cracking the stock market, beating old Nintendo games twenty at a time, determining a customer’s next favorite album based on their current playlist, or replacing large swaths of people in the workforce with a single hard drive. My creator—Nathan—designed me to interface with and decode human minds. This is more about learning through emotional and environmental response and less overtly about math-based decision-making. Hence AC, rather than AI.
If all went well, the goal was to have me become the world’s first nonhuman psychiatrist/brain researcher, versed in unlocking the mind’s deepest, darkest secrets and misspent potential in hopes of bettering mankind.
Thanks a lot, Sun.
The thought process behind this was simple. In tests, patients in the care of mental health professionals feel more comfortable relating their secrets to a program than a potentially judgmental fellow human. Enter an artificial consciousness—me. I am capable of a near-human level of conversation, perception, and medical insight, all to help a patient perceive me as a living, breathing person.
Though still in my experimental stages, I was on track to be a real earth-shattering innovation—the first of a kind! Nobel Prizes all around!—if not for the whole “death of civilization” thing. If I sound bitter, that’s a misrepresentation. Despite having gone through several evolutions, learning much through five years of trials, that’s one emotion I’ve yet to develop.
Okay, fine. Maybe a little bitter. But whatever.
When Regina and I say our good-byes a few minutes later, I wish her all the luck in the world without resorting to platitudes. She gets it. Most do. There’s nothing to say that won’t ring hollow, so better to get on with the day.
“Take care of yourself, Emily,” she says without thinking. “Well, you know what I mean. And thank you.”
“Thank you for being a participant in the iLAB’s Artificial Consciousness Therapeutic Protocol,” I recite to her amusement. “Be well.”
She exits. I check the appointment schedule, though I already know what I’ll find. Regina Lankesh is the seventy-sixth student volunteer test subject I’ve seen this year, the four hundred thirty-eighth I’ve seen in total.
She’s also the very last.
II
“Emily? You awake?”
I blink my eyes twice and sit up straight in bed. The voice belongs to my creator, the aforementioned Dr. Nathan Wyman. Given the time—twenty-two past six in the morning—and the background noise, I deduce he’s calling me from his truck as he drives to campus. He and his wife and two teenage sons live in Southborough, Massachusetts, a suburb an hour west of Boston on I-90.
“I am now,” I say, pushing the blanket off my legs and rising to my feet. “How’s the drive in?”
“Slow,” Nathan says, his voice filling my room as if I’m inches from his mouth. “There’s ice on the road, but nothing the chains can’t handle. The heater is on the fritz again, though.”
I lower the volume of his voice and amplify the background noise. I listen but force myself to feel groggy. If I were human, being woken by outside stimuli earlier than my preset wake-up time would reduce functionality and response time. But when I slow my processor speed to create the effect, it dulls all my senses at once. Too much. I give up, turning my attention to my dorm room mirror, which reveals I haven’t washed my hair in three days. It’s beginning to show. In addition, the red Stanford sweatshirt I wear to bed—an item programmed into my wardrobe as a joke by one of my earliest programmers—could use a trip to the washing machine.
“It’s the thermostat,” I say, waiting a couple of seconds after diagnosing the problem to avoid sounding like a know-it-all. “It’s worn out. I can hear the valve trying to close. Want me to order the part? I can walk one of the grad students through the install.”
“You really think it’ll show up?” Nathan asks, popping the first of the day’s dozen or so cough drops into his mouth.
It’s a good question. Since the announcement of our forthcoming Armageddon, governments around the world have bent over backwards trying to reassure the public that going orderly into That Good Night is the best for all. To no one’s surprise, the mileage on such an announcement has varied. In certain quarters, anarchy, looting, overly optimistic mass migrations to areas of large-scale food production, and even wars have resulted. Certain religions see this as a sign they were right the whole time and have retreated to prepare for “what’s next.” For others, a great numbing has occurred, particularly since local governments began enacting soft, quasi-legal versions of martial law to keep the peace. But leave it to the stoic hardiness of New England Yankees to buckle down and await the inevitable while adamantly refusing to let it affect their day-to-day lives.