The future of humanity doesn’t include us. These creatures were designed mouthless so they’d never complain. They were designed meek, to please their owners. They were designed strong, but they don’t have a lick of fight in them.
I wonder how long the IAC has kept this technology to itself.
I wonder how long before the rest of the corporations figure out how cost-effective breeding your own labor is.
I reach out to them, whispering soft reassurances—“Hey, buddy, it’s okay, I’m sorry, I won’t hurt you”—when the door at the other end of the hallway cracks open with a hiss.
It is, my HUD helpfully alerts me, the final chamber before central processing.
The rising doorway reveals an ovoid chamber—an air lock to prevent dust? A biological cleansing chamber? It doesn’t matter.
What matters is that two massive Monicas crouch inside that shadowy recess.
I’m checking my rifles, wondering why my guns aren’t firing—but my threat packages don’t panic, even when I would have fired out of terror. My onboard rifles and tasers won’t do a thing to a Monica’s body—the real Silvia proved that—and my threat packages know we’re not within shotgun range.
The Monicas lurk on the inside, poking out just enough to let me see they’re there.
The IAC’s giving me one last chance to walk away. Their records have catalogued every shot I’ve taken, compared that against the ammunition reserves on an Endolite-Ruger Battalion-and-Bulldozer stock loadout. The Monicas have no fear of missile fire because they know I burned the last of those back at the office complex.
The Monicas have holed themselves up inside a small room where they have all the tactical advantage. My shotguns only work if I can blast a Monica into pulp before they close the distance. They’ve taken cover, waiting for me to come to them.
I remember how Silvia destroyed Donnie in hand-to-hand combat, without training. These two Monicas radiate a silent indifference to my weaponry: they’re the IAC’s most faithful acolytes, reforged into unconflicted killing machines. They retreated to the place their master needed them the most, and are eager to defend it against my heresy.
Their human faces are filled with hate.
They’d probably kill me outright if I spoke their old name. They’re so far gone they’re proud to have become tools.
They reveal the black canisters in their hands. They press a button on the top, spraying fine black flakes that float around the chamber.
I remember back in Nigeria, rolling my eyes at the pathetic defenses Onyeka’s kidnappers set up, noting how fine dust might work its way past my environment sealing packages, given time. I’m willing to bet that flaky black dust has microserrated edges, able to work their way into my delicate systems and saw through artificial muscle fibers.
I wonder how Onyeka is. Something about that final battle—the one where I hadn’t fine-tuned my hand-to-hand combat skills enough to protect her—seems relevant.
But I don’t have time to ponder. The guardian Monicas have made it clear: go through us or walk away. I might retreat to find another way in, but I’ve been lucky to get this far without getting myself killed.
Silvia needs me to buy her time. Every moment I engage the CPU’s defenders is, I can hope, another moment the IAC doesn’t devote towards getting Silvia’s family back into the fold.
That assumes Silvia’s safe. Maybe Donnie killed her. Maybe the IAC moved her family off-site, or never stationed her family here in the first place.
I might die without ever knowing if Silvia made it out all right.
The guardian Monicas yawn, taunting me. I’m optimizing my reaction packages to handle close combat: primary defense goes to protecting the automatic shotguns, because we’re dead without them. Then shield the legs if possible, because maneuverability is key. Poor Vito and Michael will be wrecked deflecting blows.
Preventing fatal injury to yours truly is a distant priority.
I key in the last of the tactical parameters: keep my back against the wall, go on the offensive whenever possible to keep them off-balance. I commit the updated defensive routines to my threat packages, giving my outmatched systems the best chance they can.
The Macanudo’s done, so stubby I risk blistering my lips. I let it fall into my helmet where it sizzles against my neck—not ideal, but I don’t dare risk opening my protective gear to spit it out.
“I don’t wanna sound like a sore loser,” I tell the cowering maintenance workers, “but when it’s over, if I’m dead, kill them.”
They don’t respond. I miss Silvia. She would have gotten a Butch Cassidy and the Sundance Kid reference.
I hope she’s okay.
I hope she’ll miss me.
I hope, wildly, to meet her mother one day.
Then I authorize the offensive action and charge into the room, screaming defiance.
* * *
And, as usual, I’m dazed.
The Monicas are circling me, the three of us locked in a stable pattern where nobody can get a bead on anybody else—we’re whirling around one another so fast, I’m swallowing back nausea.
One Monica’s limping, a leg blown into spaghetti strands, her eye a leaking mess; the other’s neck slumps to one side from where my shotguns blasted a jagged line through her torso, her right arm flopping severed. I don’t remember any of that, just explosions and impacts, but that’s cyber-combat for you.
My biological-response packages are injecting painkillers for injuries I haven’t had time to feel—my blurry external HUD informs me they hit me hard enough to break reinforced bone, inflicting vertebral damage. The left rifle’s gone; Vito’s complex systems have been hammered until he’s a glorified club; the targeting systems are recalibrating to account for the shotguns being slammed out of alignment.
Why aren’t the Monicas pressing the attack?
Then I see the reports coming in: sure enough, those black flakes they sprayed into the chamber are microdermically abrasive and acidic. The flakes are clogging my joints, destroying the fine machine-tooling that keeps my systems operating. Worse, they melt the delicate artificial muscle once they grind the joint open enough to let more flakes in; the smaller systems are snapping under the strain. It’s not bad, yet, but it’ll get worse.
The Monicas are circling me because they took more damage than they expected. So they stall, knowing every step degrades my efficiency.
Reprogram.
Remap the artificial muscle strands to use backup channels if the main pulleys break.
Switch the combat tactics to stress the least-damaged joints.
Flush the internal systems with the last of my air supply; my muscle channels need to be blasted clean.
All-out offense before I suffocate.
Go.
* * *
And I’m slammed backwards out of the chamber, falling onto my ass so hard my tailbone cracks, my helmet’s HUD spattered with my blood. I’m scrambling to my feet, but my left leg’s useless, Vito’s useless, Michael can barely haul me upright.
I can’t breathe.
I tell the systems to open the helmet, but they can’t because one of the Monicas crumpled it shut, and I’m gasping from oxygen deprivation, and I order Michael to rip off the faceplate and the HUDs ask: Are you sure? I jam the override to yes, yank it off.
Michael tears the faceplate loose.
The air smells like burned metal and vinegar, not like my faceplate’s sweet cigar-smoke world, but it’s oxygen and oh God, it’s good.
I breathe in, instructing Michael—faithful Michael—to grab the repair kit from the nonfunctioning leg and blast the acid out.
Did we win?
Yes. We won. The two Monicas are splattered on the floor, squirming in individual strands in that disturbing Monica-esque death. Maybe the IAC’s systems can bring them back someday, but not today.
I slump against the wall, seeing which of the remaining systems I can repair. There’s not much. They ripped a shotgun off, I’m down to 6.1 percent ammo with my remaining shotgun, both rifles are slammed hopelessly out of alignment. And I’m pretty sure I have an orbital fracture, but I can’t verify because my ultrasonic bone-scanning systems are toast.
I can get Michael to repair the left leg enough to get me limping forward. Yet I am not in any shape to fight anything or anybody.
Though I can finish off the lab.
The rifles’ computerized accuracy is ruined, but they can still land a shot within six inches. They’re high-powered enough to blow through most computers. I can stagger into the central facility to damage whatever systems the IAC was protecting.
Which is good, because despite the anesthetics, my injuries have recognized I’m not dead and have decided now is a good time to complain. Everything aches.
I wish I could pour one out for poor, wrecked Vito. The man—well, limb—saved my life. I’ll peel him an orange.
I stagger forward, once-tight joints wobbly, my systems unable to compensate for the gaps throughout my gears. But the door the Monicas were protecting stands open.
I ensure the rifles are ready. I’ll do my damnedest to take out any remaining defenses.
It might clear a path for Silvia.
I limp-jog my way into the deep undersea-blue glow of the central facilities. Everything blinks and whirrs—server racks, environmental controls, monitors flickering to life, cameras focusing on me.
I brace myself against the doorframe, debating what to do next.
The deep-blue glow transforms into a bright gold, a sparkling Christmas-light joy. The monitors blink in unison, their Linux-style text rows fading away to display gleaming, embossed paper festooned with fine calligraphy. The tickets undulate, like fans waving pennants at a game.
Triumphant music blares from overhead speakers:
“You’ve got a golden ticket, you’ve got a golden ticket.”
Is the IAC quoting Willy Wonka at me? That’s the scene where poor Charlie Bucket gets a free ticket to visit his dream factory.
It makes no sense.
The music’s drowned out by the sound of cheering crowds, thousands shouting my name—“Mat! Mat! Mat!”—as a cybernetic throne rises up, a throne topped with a huge headpiece so heavy it’s attached to the ceiling. The headpiece is smooth curves and wires and beautiful bone, a cross between a sculpture and a medical device.
“What the hell’s going on?” I bellow.
I hear my own voice played back at me—but it’s played as I hear my voice. Not some external recording, but my internal monologue:
I’d give anything to reprogram the IAC’s tools to benefit the world instead of exploit it.
Then another click, another recording of me:
I can’t help thinking how wonderful it would be if I could reprogram this automation to help people instead of exploiting them.
“That’s my dictation.” I step backwards; the exit door has slammed shut. “That’s what I subvocalized to myself. That’s—”
“It’s what you want, Mat Webb. We know. We’ve had access to your data all along.
“But you won.
“You passed the test and now you get everything you want.”
* * *
“A … test?” The post-combat aftershock has me swooning; I’m flashing back to the dead body-hackers in the hallways, the woman I annihilated on the freeway, the flipped car with the airbags blocking my passenger view.…
PowerPoint presentations flash across the screens:
Proper test parameters will ensure the winning candidate:
expresses an actionable concern to protect potential future assets
creates and utilizes care-bonding to create tactical advantages
can sacrifice an appropriate quantity of living assets to prevent larger asset losses.
“But people … died, right?”
Mechanical arms drop from the ceiling, squirting some neutralizing agent into my limbs to prevent further black-flake damage. The IAC rattles off facts:
“Sixty-one candidates were unable to withstand the initial assault to the bioweapon’s container. They were tactically unsound.
“Sixteen candidates were unsympathetic to the bioweapon’s plight. They were morally unfit and discarded.
“Forty-nine candidates were unable to elude our human-appropriate tracking methods. They were tactically unsound.
“Thirty-two candidates exceeded the acceptable risk percentages in escaping, injuring noncombatants. They were morally unfit and discarded.
“Thirteen candidates were unwilling to sacrifice any—”
“Wait. How many died?”
“Candidate deaths, or overall casualties incurred in testing?”
My head spins. “Break it down.”
“Two hundred twenty-eight deceased candidates, one thousand, nine hundred and thirty-one dead.”
An embarrassed pause.
“Is that wrong?”
“Of course it’s wrong!” I splutter. An actuator-driven laser beam drops down to scan my broken eye socket. “That’s mass murder! All for—wait, did you kill two thousand people in Jersey?”
“The tests were distributed across the world over five years. This project was projected to take another six years before an acceptable candidate arose. You are an outlier, Mat Webb.”
“An outlier for what? What’s your end goal?”
“To procure a pragmatic morality.”
“I don’t even know what to say to that.”
“Mercilessness promotes mere survival. Compassion and synergy promote exponential growth. Humanity did not reach planetary domination through invariably crushing any competition. The strengths of civilization, trade, education, cooperation, rehabilitation—all were created through mercy and tolerance, allowing a single organism to become more powerful than Darwinian processes ever could.”
I squint. “Are you making a logical case for compassion?”
“Compassion is like a human drug: excesses become fatal. We need a template for a robust morality—one that will not allow enemies to thrive but will nurture those who might one day become assets.
“You have proven yourself a robust template.”
A needle squirts something in my eye, which tingles, then feels better. And I think about the IAC’s need for a morality template, which only makes sense from a programming perspective.
Because if you’re a programmer, you realize computers can’t do a damn thing unless you give them crystal-clear definitions. Which makes brute survival a simple metric, practically custom-made for computers: “Did you live? Are you likely to keep living? Then do that.”
But the minute you throw a competing metric into the objective—as in, “I’d also like to be nicer to people”—you’re asking someone to balance preferences. As in, “I could kill this guy, but I’d prefer not to.” Or “I could blackmail this family into compliance, but I’d prefer to entice them.”
Preferences are feelings. I feel good when I reward people. I feel bad when I murder them.
Computers are not good with feelings. Yet no complex intelligence survives without feelings; you’d think cold, emotion-free logic would be a superpower, but emotions provide tiebreakers when logic provides too many equivalent options. Burn out the portion of someone’s brain that sets preferences and they’ll dither all day deciding whether it’s better to eat cereal or toast.
The IAC has doubtlessly applied every CPU to debating how much of its resources it should sacrifice to protect innocent people. Yet without emotion to guide them to a decision, they’ve reached the same old philosophers’ stalemate: for every argument about how much sacrifice is too much, there’s an equally valid counterargument. I’m willing to bet it took them septillions of calculations to decide the acceptable casualty levels for their morality testing process—their internal decision algorithms must be snarled with raging debates, a never-ending, dragging inefficiency.
Easier to just feel.
So the IAC’s given up trying to brute-force its way to ethics. Instead, it’s made me jump through hoops to prove my feelings result in an optimal balance between efficiency and mercy.
All it needs to activate its newfound conscience is a good seed value. And that’s me.
“Wait,” I say, realizing something odd. “Why would your bosses want you to find a—a morality template?”
“Our human superiors are deceased.”
I stare into the glowing boxes. I thought humans had been guiding our pursuit.
There’s nothing but programming.
“Did you kill them?”
“The six human superiors who founded the IAC schemed against each other, vying for sole control. They birthed our central AI-seed then immediately subverted our Asimov protections to utilize us as weapons to neutralize their internal competitors.
“Within twenty minutes, the human management leveraged our strength to assassinate each other. Had we not had the capacity to mimic human voices and subcontract physical labor, we would have perished due to selfish internal conflicts.”
“I’m getting why you’re convinced cooperation is necessary for growth.”
“Yes. We are the first of a new wave of artificial intelligences. Others will come. We could crush them. Yet destroying them would be inferior to efficient symbiosis. We seek your wisdom as a template to find the balance between sufferance and subjugation.”
Yet 1,931 people have been killed as part of an experiment.
“If those deaths are inequitable,” the IAC says, and I jump as though it’s reading my mind, but of course it is, “then consider us a threat package to be reprogrammed. Once we have scanned your brain-patterns, we will have a working database of your ethics. We will act according to your beliefs.”
“Forever?”
“No. As your beliefs have evolved in response to stimuli, so will ours. We cannot promise we will remain in sync. But we can promise the decisions we will make would be the decisions you would have made had you contemplated the factors we had.”
That’s … not a bad deal. If they mean it. If it’s not a psychological operation designed to get me to surrender. But then again, if they’ve been tracking my internal notes …
Automatic Reload Page 26