Just one tiny obstruction remains to be dealt with.
I cow Kobb with a look that brooks no debate. “Get off my moon.”
MARCH 2378
18
A soul flickers in my hands, its spark of awareness fragile and faltering. What else would you call a naked brain, disembodied but alive? Philosophers have spent millennia trying to define the essence of the soul, and science has had its hat in the ring for a few centuries now. So far, the best explanation I’ve been able to devise is that the soul is the core of sentience, the seed of knowledge that tells us we’re all nothing more than conglomerations of star dust and borrowed energies—and nothing less than unique cells of awareness, expressions of a universe that wants to understand itself and needs intelligent life to act as its agents in time.
To create life, to incept a soul from inert elements and inspiration, is a heady sensation. How can I help but feel the engines of creation are at my command? In the past year, I’ve willed twenty-nine new sapient minds into existence, nurtured them to stability . . . and killed them.
“Creative destruction” is the euphemism I’ve heard for endeavors such as mine. A more vulgar translation would be the proverb, “You need to break some eggs if you want to make an omelet.” A crude reduction of an underlying truth, but an accurate one. Put simply, to discover the secret of resurrecting dead positronic brains, I must first have dead positronic brains to resurrect. It seems self-evident once it’s pointed out, but the reality is a bit more troubling.
Any one of these minds could have thrived if given a chance, but that was never an option. I see no point in constructing bodies for these sacrificial minds. I’ve also resisted imbuing them with anything more than the most rudimentary programming. Bad enough I need to usher them into existence only to snuff them out; at least this way they’re nothing but inchoate, insensate entities without memory or context. They barely have time to know they exist before they vanish into oblivion.
I patch the newest brain into the master console on my workbench in the center of the lab. I need to run a complete set of benchmark tests and diagnostics to make certain this brain is stable before I induce a fatal cascade anomaly. Initially, I had planned to make only one of these and work on it until I learned how to restore it. I even convinced myself that once it was resurrected, I would build a body for it. But there were undetected hardware flaws in the first brain’s neural network, and the cascade anomaly not only collapsed its matrix, it melted and fused its synaptic relays. After that, its only value to me was as a paperweight.
All the subsequent brains have endured their cascade failures with better grace. At least, none of them were reduced to smoking slag by the anomalies. That indignity came later, as I documented two dozen ways in which not to try to reactivate a failed positronic matrix. The scorched remnants of my failures populate the shelves of an otherwise empty case along the wall beside the door. Most of the time I turn my back on them, choosing to work while facing walls of transparent steel that look out upon the sea of treetops surrounding my island of folly. Tonight a storm lashes the jungle with gray sheets of rain driven by screaming winds, and majestic forks of blue-white lightning rend the darkness. Had I ever dared to look upon such brilliant violence with human eyes, I’d have been blinded; my cybernetic visual receptors, on the other hand, filter and process the spectacle with ease, allowing me to observe it with wonder.
A crash of thunder is followed by an unexpected voice from behind me.
“Hello, Doctor Soong.”
I freeze like a criminal caught in the act. Then I turn, first my head and then my body, to face my unwelcome visitor. His gaunt mask of a face is tilted at an angle that bespeaks curiosity. For someone who’s seemingly just come in from the rain, he’s perfectly dry. “Tyros.”
The android envoy looks around the spacious main laboratory, which is cast in spectral light and crisp shadows by another blaze of lightning. “A most impressive lab.”
I’m in no mood to humor him. “How did you get in?”
He folds his hands behind his back and begins a slow stroll around the room’s periphery. “Your security systems are quite good.” He shoots me a sly look. “They could be better.”
I feel exposed as I watch him peruse my pages of handwritten notes that lie scattered about the floor, the numerous holographic displays filled with my incomplete formulas and schematics, and the countertops strewn with half-built components. “What do you want?”
“Had your fill of the casino business?”
“It served its purpose.” Why is he taunting me? “Answer me. Why are you here?”
He feigns offense. “Just checking in. You left Orion in quite a hurry. I was worried you might be in danger.” He stops, picks up a new type of phase discriminator I’ve been tinkering with, and examines it for a second before discarding it in an offhand manner. Then he resumes his perambulation of my lab. “Have you reconsidered our invitation, by chance?”
“Not for one moment.”
His steep brows climb a few millimeters higher on his broad forehead. “Maybe you should. We have a new member, one I think would intrigue you: Rhea McAdams.”
It takes effort not to react when I hear the name of Vaslovik’s holotronic android. Tyros is right; I would relish a chance to study her up close, perhaps win her confidence and steal a look at what makes her tick. Heaven knows I’ve had no luck replicating the holographic matrix Vaslovik and his partners developed on Galor IV. But I refuse to be baited. “You didn’t come here to roll out the red carpet. Why don’t you get to the point?”
He stops in front of my storage case of ruined positronic brains. My victims. His countenance darkens as he peruses the grim tableau on the shelves. Without asking permission, he opens the cabinet’s doors, picks up one of the blackened hemispheres, and studies it with a sorrowful expression. “What happened here?”
“A failed experiment.” The best lies are the ones wrapped in bits of truth.
Tyros frowns, then reverently sets the scorched brain back inside the cabinet. “Is this the result of your attempts to reinvent Vaslovik’s holotronic android . . . or your blunders in trying to copy his resurrection of your wayward Juliana?”
Damn him. He’s known the whole time. “Am I supposed to justify myself to you?”
“I didn’t come here to judge, Doctor.” He shuts the cabinet’s doors. “But I’d like to give you some advice, if you’re willing to hear it.”
“You’ve got one minute. Speak quickly.”
He walks toward me, his stride casual, his bearing calm. “Ask yourself, Doctor: What’s the point in trying to copy someone else’s achievements? Is it an act of vanity, the delusion that your genius is so without equal that no one else is allowed to build upon your work? Should knowledge advance only by your hand?” He stops in front of me and lowers his voice. “Or is this about revenge? Do you feel slighted in some way? Maybe by Vaslovik, or Juliana?” A shrug. “It doesn’t matter, really. My advice to you is to turn your talents toward making something new. Create something for its own sake, like when you made your sons. Life has to be lived forward, Doctor. You should try looking that way sometime.” For half a second, I almost expect him to clap a hand onto my shoulder. Instead, he musters a wan smile and heads for the door.
“The next time you drop by to talk, knock first.”
He replies without looking back. “There won’t be a next time.”
NOVEMBER 2379
19
I stare at words I prayed I’d never read. They circle in an endless loop, dominating every circuit in my neural net. All my safeguards aren’t enough to save me from paralysis. Immobilized with grief, I stare at the screen, but I no longer really see it. It’s just light and color, shapes and lines, a surreal mosaic signifying nothing. I struggle to process the news, but I can’t believe it’s real. I never knew I could feel this empty.
Data is dead.
The details hardly matter, but I keep going over them, rereading them again and again, as if I’ll uncover some loophole, some simple misunderstanding that’ll mean my son isn’t gone. Starfleet’s official report says my boy sacrificed himself to save his captain and stop some lunatic’s doomsday weapon. They’re calling him a hero. I wish that made a damned bit of difference, but it doesn’t. I never wanted him to be a hero. I never wanted him to join Starfleet. From day one I’d said they’d be the death of him, and now they’ve proved me right.
I bring my fist down on a half-assembled positronic brain and smash it into dust.
Data’s captain wants to honor him with posthumous awards and fancy words. Who cares? What made Picard so precious that my son had to die for him?
A display screen shatters as I ram my fist through it. I pick up the gutted panel and hurl it across my lab, through a cluster of ion tubes, unleashing a cloud of toxic gas and jets of flame.
On a ship with hundreds of personnel, including security officers whose job it is to defend their ship and captain, why was it Data who had to sacrifice himself in the stupidest manner possible? Firing a phaser into the heart of a doomsday weapon? It took Data—my son, the paragon of nigh-immortal artificial intelligence—to point a phaser, pull a trigger, and blow himself up? Would he even have tried to do something this stupid if Picard hadn’t done it first?
Rage overwhelms me, and I tear through my lab, breaking anything I can hit, tearing apart computers and equipment, using the broken chunks of one to pummel the next into twisted wreckage. Each tremor of impact feeds my primal desires as I imagine it’s Picard I’m ripping limb from limb. Smoke and dust cloud the air. Steam and gases jet from severed pipes and hoses. I rip an electron chromatograph analyzer from its nook in the wall and hurl it against the glass case full of slain minds, all of which implode into fragments mingled on the flooded floor.
This is Picard’s fault. Why did he put himself in danger? Didn’t he know what Data would do? Was there no one else he could have sent on a suicide mission?
I pound my fists against one of the transparent steel windows and try to break through it, believing irrationally that my rage will somehow permit me to defy the laws of physics. Damn you, Data! Why did you have to be so noble? Why did you ever think their lives were worth more than yours? Why would you destroy yourself for them? How could you?
My body could go on indefinitely, but my rage falters when confronted by its own futility. Beaten, I crumple onto my hands and knees, overwrought and helpless, sobbing as I paw the debris-littered floor. I should never have tried to cheat death. If only I could have died rather than live to see this day . . . I could have met my end believing that my genius was without equal, that my son was immortal, that my legacy would live on. Instead I’ve seen my greatest works made obsolete. My sons are dead. My only true love lies in the arms of another. I am forgotten.
Everything I’ve done has been for nothing.
And it’s all my own fault.
I let my delusions of greatness drive Juliana from me. I programmed Data to see the world in such stark and absolute terms. And I was arrogant enough to think no one would ever do as I had done, and build upon my work as I built upon the labors of so many others. I thought I was the sine qua non of neurocybernetics and synthetic biomechanics. Now I see that I was only a link in a chain of progress, part of an ongoing continuity of discovery. I was never meant to be the apex of achievement, just another step on the long upward journey of science.
I could still give myself the gift of ignorance. My memories are both more permanent and more malleable than they were when I was flesh and blood. I could selectively erase all my memories dating back to the moment I finished my transfer into this body, reset myself to the moment where my biological life ended . . . and then induce my own cascade failure. A virtual stroke, of sorts. I could die as the man I was, instead of this wretch I’ve become. I could meet my long darkness with all my delusions intact. Cold comfort would be better than none, I think.
It’s not a bad plan. I could program the entire sequence of events, erasure and cascade anomaly, ahead of time and then let them play out. One moment of courage, and I could have a moment of bliss before oblivion takes me. I wouldn’t even know that I’d done it to myself. I would die thinking myself a victim of an error in my neural net, the casualty of an accident. A far more preferable end than my next best solution, which would be to immolate myself by plunging myself into a star with my memories intact.
Shakti’s comm tone chirps from an overhead speaker. I try to ignore it, hoping she’ll desist and leave me in peace, but she’s persistent. It seems she’s not willing to take no for an answer this evening. I heave a sigh of surrender and open a channel. “What?”
“I’ve acquired some new logs from Picard on the Enterprise.”
I shake my head. “I don’t care.”
“They contain newly declassified information. I think you need to see it.”
Bitter and angry, I snap at her. “I said I don’t care!”
“It’s about B-4. To be more precise, about B-4 and Data.”
Stunned, I stare at the ceiling speaker for nearly two seconds as the news penetrates my leaden shroud of mourning. Then I scramble up from the floor and sprint for the door, stutter-stepping and dodging around the obstacles I’ve strewn about in my tempest. I race out of the lab and back to my small living space, taking the most direct path to a working display screen. I stumble to a halt in front of the one mounted on the wall in my main room and activate it. Shakti, ever helpful, has already routed to it the new batch of logs for my review.
I read them all in a matter of seconds, wide-eyed at the madness and the possibilities of the news they contain. Shinzon, the lunatic who lured the Enterprise into the battle that claimed Data’s life, used my other son, B-4, against them as a Trojan horse. It doesn’t say how, when, or where this Shinzon acquired B-4, but that’s not important right now. What snares my interest is the note from Picard’s log about Data’s efforts to help his brother.
Data thought he could “uplift” B-4’s consciousness by copying his own memory engrams into B-4’s positronic matrix to stimulate the development of new neural pathways. It was a generous impulse on Data’s part, but a sadly misguided one. B-4’s positronic brain wasn’t designed to function at the same level as those I built after it. It wasn’t even supposed to stand up to long-term operation. I made it as a proof-of-concept, a simplified test model. His processors, his neural architecture, even his phase discriminator—none of them are capable of running the kinds of programs that I wrote for Data. But that stripped-down prototype is now the storage vessel for all of Data’s memories, up until shortly before he died. In essence, all that Data ever experienced, all he saw and knew, is preserved inside of B-4.
Good lord, it’s a disaster waiting to happen.
B-4’s matrix could never incorporate that information. His memory-access protocols are two generations behind Data’s. At best, he might tap into some uncompressed files from Data’s memory, but most of that knowledge will be little more than gibberish to him.
Shakti interrupts my worried musings. “Noonien? You seem upset.”
“I am upset.”
My confession seems to surprise her. “Why? I thought this would be good news.”
“Well, it’s not. While I’m thrilled that Data had the foresight to create an offline archive of his memories, it’s a shame he chose to store it inside B-4’s head.”
“Odd. I found the notion rather poetic. All that he was lives on inside his brother.”
How little she understands. “All that he was? Hardly, my dear. Data was more than just the sum of his memories. He had programs and subroutines of all kinds to govern the way he thinks, the way he acts, how he processes his perceptions, how he feels emotions. None of that was uploaded into B-4—just the memories of the life he lived. Memories that are quite literally worse than useless to B-4.” I’m gripped by a horrifying realization. “We have to help him.”
“Noonien, what’s the matter? Is B-4 in danger?”
“More than anyone but me realizes.” I call up my private archives and decrypt B-4’s design schematics. It’s been decades since I looked at them or gave them a moment’s thought. But now there’s no margin for error; I need to know precisely what I’m working with. “B-4’s mind wasn’t made to hold the kind of information Data uploaded into it, and he was never meant to have the kind of long-term existence that his successors did. His memory storage blocks aren’t properly buffered to isolate them from his operating circuits. When he suffers his inevitable cascade anomaly, it won’t just take out his neural net—it’ll wipe his memories, too. And then every last shred of Data’s life will be erased from the universe forever.”
“How long does he have?”
“Hang on. I’m searching Picard’s logs for facts to fill in some variables.” I cobble together what few bits of hard intel are available and make some educated assumptions based on what I knew of B-4’s benchmark results when I abandoned him more than thirty years ago. The numbers tell a grim tale. “My best guess? He has about four years until his neural net starts to break down. After that, he’ll be living on borrowed time—and so will Data’s memories.”
I download B-4’s schematics from my archive to my own memory—never know when I might need them. Then I close the files on the display and start a new project. “Shakti, have the maintenance ’bots clean my lab right away. I’m sending you a list of new hardware that I need. Order it and have it brought in by an unmanned cargo drone as soon as possible. Then get me an encrypted back channel into Starfleet’s comnet. We need to get new alert daemons in place to keep tabs on B-4. I want to know where he is, who he’s with, and what he’s doing at every moment. We can’t afford to lose track of him. Understood?”