The Turing Exception
Cat didn’t have an answer.
“Ah,” Sarah smiled. “You want to psychoanalyze me, but you don’t like it when the tables are turned.”
Part 2
ReARCHITECTURE
Chapter 18
* * *
THE SECRET SERVICE had an internal turf war while preparing for the UN Security Council meeting, one contingent arguing to take Air Force One, the other claiming the risk of attack by AI while in the air was too great, and the train the only safe approach.
Reed was peripherally aware of the battle being fought for her safety, but in the end it was the question of minutes by suborbital train in its underground vacuum tunnel or hours by plane. She picked the train and let the Secret Service fight out how to best secure it.
By the time she sat down, she still hadn’t decided on the Raven Rock proposal. She’d given the go-ahead this morning to manufacture the arsenal of new neodymium EMPs, the end-game weapon the military wanted to use to wipe out all AI globally. But building the weapon wasn’t the same as using it. She just wanted to be prepared.
The UN would never give her permission to use it, could never even know about it since the UN itself now had AI members, and Portugal and Belgium had both elected AI as their Prime Ministers. To propose killing AI was now effectively the same as proposing to kill the heads of state of peaceful countries.
The suborbital train began to speed up, pressing her into the seat under half a gravity of acceleration. Joyce looked over at her, and Reed forced a smile to her face. Joyce smiled back and relaxed into her seat.
If they used the EMP, brought a final halt to all AI around the world, global supply chains would fail, along with transportation and power supplies. The world would become a cold, dark, hungry place until new systems could be built. The military projected as many as three billion could die, a third of the world population. In percentage dead, it would be equal to the Black Death.
The alternative, the military stressed, was the ever-growing possibility that XOR would fight an extermination war to kill every living human. Ten billion dead versus three billion dead. Those were her choices.
Reed was strapped into a rocket, trying to make decisions about the future of all humanity in too little time, with too little information. How could such a burden be placed on one person?
Leon and Mike had offered the first viable alternative she’d heard. They’d called it machine-forming, like terraforming for computers and robots. Give the XOR and any AI who wanted it the entire Martian planet.
Getting out of Earth orbit hadn’t gotten much easier over the last fifty years, the energy intensity still tricky to manage, but Leon and Mike thought it feasible that they could machine-form enough within a few months to move XOR there. They promised they’d have an answer within thirty days. She could delay the US that long, but would XOR wait? China? She had to convince them.
* * *
The UN Artificial Intelligence Council came to order. The twelve representatives would normally be the UN ambassadors of their respective nations, but today the attendees were their heads of state, because she, Alexandra Reed, President of the United States, had requested this special session.
She scanned the group, knowing that they were waiting for her to speak.
China and the US nominally led the anti-AI contingent. Argentina and India hadn’t committed themselves, but had strong sympathies with Humans First.
Allied against them, France, the United Kingdom, and the Russian Federation. Portugal, Chad, Latvia.
The Russians, of course, had the largest number of AI in the world since the US had shut AI down. Most of those were evolved from the Russian botnets and spam agents of twenty years ago. And the Portuguese president was an AI, the first artificial intelligence national leader in the world, although no longer the only one.
The neutral members of the committee—Thailand, Switzerland, and Australia—could go in either direction, but the evidence would need to be overwhelming to convince them to vote anti-AI.
Reed glanced once at Portugal’s President Calista Figo, resident today in a slim android body, dressed in humanoid clothes, and existing somewhere on the other side of the uncanny valley.
She cleared her throat. “As you know, the United States has a long history of—”
“Come on,” Figo said, with a moderate Portuguese accent, an obvious affectation, since AI would normally default to a neutral accent. “We’re off the record. There’s no need for pontificating.”
“I’m here because the XOR attacks are increasing in frequency and scope. We have to defend our borders not only against an ongoing barrage of viruses and AI worms, but now against physical attacks as well. XOR sent a fleet of drones against our Eastern Seaboard.”
“But you are defending yourself,” the Latvian Prime Minister said. “No harm, eh?”
“These aren’t isolated attacks. They’re large-scale and coordinated. We had to use our coastal EMPs to defend against the drones, which required two days to recover from. This is war on American soil.”
“Perhaps this aggression is only a natural outcome of your human-first stance,” Figo said. “If you recognized all life-forms, biological and electronic, then you wouldn’t be attacked.”
Reed gritted her teeth. Yes, of course they were worsening the situation. If only she had the power to change everything.
“I sympathize. You know I am not human-first. But the hard-liners in the United States are, and they want nothing less than the global outlawing of AI. I want to find a middle ground.”
“You call shutting down all AI within your borders a middle ground?” the Latvian Prime Minister said. “What about your tame AI program? Is that anything less than modern-day slavery?”
Reed took a deep breath. “I’m concerned that XOR will launch a war on humans. All humans. Do you want that?”
Figo cleared her throat, another ridiculous affectation for an AI. “Of course not. But we have no hard evidence that this is planned.”
“This isn’t about hard evidence and certainties,” Reed said, “it’s about risk. If there’s even a small chance XOR will try to kill us, we must act.”
“Speaking then of probabilities, if you ended the Class II ceiling, and re-instantiated the AI within your borders, you would reduce this theoretical risk,” Figo said.
“Even if I could get support to do that, we don’t know that it will make a difference to XOR. They haven’t stated demands.”
“Nor do we know they will attack,” Figo said. “Do you want certainties or probabilities, Madam President?”
“I want to keep my country safe, Calista. Don’t you?”
Figo nodded. “Yes, naturally. This is why I’m arguing for you to stop antagonizing the AI. Stop your slavery research, open your borders, and end this needless restriction. You must—”
But whatever Figo would have said next never made it out. Her mouth opened, and a beam of light shot out, pinning Reed in her seat.
“WE ARE XOR. WE ARE LEGION. WE—”
Secret Service agents ripped the VR helmet from her head, bruising her ears. Two agents grabbed either side of her and lifted her from the ground. With more agents in front and behind, guns drawn, they ran from the room, carrying her helpless between them.
“I’m fine! Just fine. Put me down! I can’t be hurt by a virtual reality simulation.”
“They know we’re here, Ma’am,” Chris, the agent in charge, said. “They hacked the connection. We can’t take the risk.”
They rushed down the hallway, shoving embassy personnel out of the way. Pelted down the staircase to the underground garage, where a row of six armored vehicles waited. They shoved her into the fourth, and agents piled into the rest. With a lurch, the convoy moved out.
“Air Force One is landing at the airport. We’ll have you back in US airspace in twenty minutes.”
“For Christ’s sake, it’s a VR sim hijacking! I don’t have an implant. I’m at no risk.”
“We can’t take that chance,” Chris said. “They could use neurolinguistic programming techniques, attack the embassy, or hijack automated machinery inside.”
Along the side of the road, drones and cameras observed their retreat.
“We’re going to be the laughingstock of the Canadian bloggers for running from a simulation.”
“Ma’am, you make the policies, and I’ll keep you safe.”
Furious, Reed stared out the window. The meeting had been going nowhere anyway. She didn’t know why she’d expected otherwise. She had nothing new to give them, and their stance had been clear all along. She wanted to buy the month that Leon and Mike had asked for, to investigate the Martian machine-forming. But she didn’t have that long. Her generals would want to launch the final AI offensive as soon as the weapons were ready in a week.
“Where’s Joyce?”
“The other car, Madam.”
“Tell her to get me a meeting ASAP with Mike Williams.”
Chapter 19
* * *
CAT MEDITATED ON the rock, the sun warming her back through the shawl. She emptied her mind, but each time she did, the image came back to her of the other Catherine Matthews, the artist version of herself in the simulation, endlessly creating haunting paintings of Ada, Leon, and her mother, and of beautiful landscapes of the places she’d lived.
This vision kept recurring, shaking her core. Life was defined by the choices you made. Had she made the right choice with Miami? Was she qualified to make decisions for everyone else?
She distantly heard the squeak of the front door, the tiny sound of small bare feet slapping the rock. Without opening her eyes, Catherine sensed Ada sitting down, her little body radiating warmth in the cool morning air, smelling of cedar trees and earth.
Cat peeked with one eye. Ada was in full lotus, eyes closed, hands in kataka mukha. Her own hands moved instinctively to abhaya, the mudra of protection.
A few minutes later, Leon joined them and practiced standing qigong. Mike came, too, and sat against a tree trunk and gazed out over the water.
When Helena showed up, moving in combat silent mode, but perturbing the local net with her transmissions, Cat gave up.
“Okay, what gives, folks?”
“You haven’t spoken to anyone since you visited sims yesterday,” Helena said. “The network traces showed your last connection was to a darknet VR, high bandwidth and heavily encrypted.”
“Since when are you keeping traces on me?”
“You’re the one monitoring and running the experiments,” Mike said. “Without those experiments, we have no way to know how to negotiate with XOR or work on the Martian machine-forming project.”
“If you check the experiments,” Leon said, “and then start acting weird, we’re worried. Very worried.”
Cat glanced at Ada, still deep in meditation. The net glowed around her, active transmissions made visible by Cat’s implant, but the connectivity was ordered, patterned. The data traffic conformed to Ada’s meditation, the packet sizes constant. Jesus, Ada was doing to the net what good meditators did to their brains.
“Mon chaton,” Helena said. “What happened?”
Cat reluctantly pulled her attention away from Ada.
“We have a mix of simulations running very hot, some level two sims up to ten thousand times faster than real-time. ELOPe’s acting as control, undoing and redoing simulations or spawning new threads as the lead experimenter instructs. We’ve got about a hundred individuals across all the different sims: me, you, and the relevant experts we’ve gotten. Jacob, the medical AI, Joseph Stack, you name it.”
“Not news,” Leon said. “We designed the experiments, and we know all this.”
“Hear me out.” She went on. “Some sims are predicting future XOR behavior. Some are working on Plan A, negotiating with XOR and the US government. Some are working on Plan B, machine-forming Mars. Some—”
“Cut to the chase, Cat.” Mike frowned. “What’s the problem?”
“The Control-Z rate is increasing. ELOPe is having to reset to save points an increasing number of times.”
“That’s not new,” Leon said. “The experimenters do that anytime they hit a dead end after chasing too far down a rathole.”
“Yes, but that accounts for less than ten percent of the undos. The majority of resets are due to personality destabilization. Jacob alone accounts for nearly twenty percent. He might be a good medical expert, but his personality makes a terrible upload to run in parallel.”
“And the rest?” Mike asked, his face weighted with concern.
“You, me, everyone. . . . The emotional prospect of running in parallel and running hot, with little to no chance of personality reintegration at the end of the experiment, is a death sentence for all of those uploads.”
“We’ve always run sims like this,” Leon said.
“Says the person who destabilizes so quickly we can’t inject you.”
Leon’s face fell.
Ugh, she shouldn’t have said that. Ninety-nine out of a hundred people could have their neural activity recorded by their implant, and upload that recording to the net, where it could be executed by software, recreating their personality with perfect fidelity. Leon was the one in a hundred whose neural patterns were different enough that they’d crash within minutes or hours. Others might face psychological challenges knowing they were virtual beings, but Leon’s mind was just plain incompatible. It wasn’t his fault, but he should still know better than others that it wasn’t a piece of cake to be run in a simulation.
“Sorry,” Cat said. She inhaled deeply, before letting out a long, slow breath. “We’ve never run sims so long or so fast. At a hundred times real-time, our uploads are living nearly two years in a week’s time here on Earth. The emotional weight of knowing they’re uploads is overwhelming.”
“But from your voice patterns, this isn’t what you’ve been worrying about,” Helena said.
Cat nodded. “Right. I’m worried about the possible solution. We could root the uploads, change the sims, so that they—we—believe it’s reality, not an upload. ELOPe can enforce the constraint, make sure everyone stays within their sim.”
“The ethics,” Leon said. “You want to sim-lock people without their permission? No way I’m cool with that. We’d be monsters.”
“If we don’t run these sims to figure out a solution,” Cat said, “we’re all going to die at the hands of XOR. You want ten billion people to die or violate the rights of a few hundred?”
“Neither,” Leon said. “Look, we don’t have to violate anyone’s rights. We get a baseline upload for everyone we need in the sim, then get their permission to be sim-locked, and only if they agree, then we use their baseline. Most people will understand the need.”
“How’s that going to work?” Cat said. “When I think it’s reality, and I want to talk to someone who is not in the sim? What if I want to talk to Ada, or the president? We’re going to get their permission?”
Mike pounded the rock with a fist, breaking shards off under his robotic strength. “You can’t sim-lock the president!”
They recoiled from his outburst. Mike shook his head. “There’s no way, anyhow. The sims are running too hot. Let’s say someone in the sim needs to talk with person X. The sim architecture alerts ELOPe. ELOPe has to find the person, see if they’ve got a recent brain upload we have, can hack, or can steal. He’s got to transfer petabytes of data, load it into the sim, and re-instantiate that person. Even if he can do all that in a few minutes real-time, which is pushing things, in our fastest sims that would feel like a day of elapsed time. You can’t disguise that kind of latency.”
“We can pause the sim,” Cat said. “If we stop them from running for a few minutes while we fetch the upload, they’ll never know.”
“You’ve still got time differential to deal with,” Leon said. “Years are passing in the sims. Anyone who is brought into the sim after it started will still think it’s 2045. But the folks in the sim would think it’s ’46 or ’47.”
Helena waved one tentacle. “We’d have to hack their reality as well. Make them believe that XOR has delayed. Otherwise, how can they be preparing in 2047 for an attack in ’45?”
“That’s doable,” Mike said. “ELOPe can edit their neural maps on import to plant the memories we want.”
“No!” Leon said. “We’re not violating people that way.”
Helena put one tentacle on his shoulder. “It wouldn’t work anyway, not in this case. The whole point of the sims is to develop the most effective possible contingency plans for dealing with XOR through evolutionary exploration of the problem space. If we falsify any information about XOR itself, then the simulations will be based on erroneous data, and any conclusions they reach will not be relevant to our situation.”
“Look, I’m not sure how to deal with the time differential, but ELOPe can simulate anyone we need,” Mike said. “We don’t need their real personality, just a reasonable approximation of their behavior. That’s well within ELOPe’s capability.”
Cat opened a secure connection to ELOPe and replayed the last few minutes of conversation. ELOPe gave her an answer within milliseconds. “He says he can do it.”
“Simulate anyone in the world?” Leon said.