“How current are you with the latest research into artificial intelligence?” Painter asked.
Gray frowned. After being recruited out of Leavenworth to work for Sigma, he had undergone a fast-tracked postdoctoral program, studying physics and biology. So he knew a fair amount about the subject, but certainly not how it connected to the night’s attack.
He shrugged. “Why are you asking?”
“The topic’s been a growing concern over at DARPA. The group’s been pouring money into various AI research programs. Both public and private. Did you know that Siri—Apple’s ubiquitous assistant—was funded through DARPA research?”
In fact, Gray hadn’t known that. He sat straighter.
“But that’s barely the tip of the proverbial iceberg. Across the globe—from corporations like Amazon and Google, to research labs in every nation—a fierce arms race is under way in artificial intelligence, to be the first to make the next breakthrough, to take the next step. And currently we are losing that race to Russia and China. Not only do such autocratic regimes appreciate the economic advantages of AI, they also see it as a means to control their populations. Already China is using an AI program to monitor and study its populace’s use of social media, coming up with a ranking, a scorecard of their loyalty. Those with low numbers find their travel limited and their access to loans restricted.”
“So be good or suffer the consequences,” Gray muttered.
“Hopefully they don’t do that with Tinder,” Kowalski said. “Guy’s got to have some privacy while looking for a booty call.”
“You have a girlfriend,” Gray reminded him.
Kowalski huffed a stream of smoke. “I said looking for . . . not going on a booty call.”
Painter drew their attention back. “And then there’s the matter of cyberespionage and cyberattacks. Like with the Russians. A single machine-learning AI can do the work of a million hackers at keyboards. We’re already seeing that with the automated bots infiltrating systems to spy, wreak destruction, or sow discord. Still, that’s all scratching the surface of where we’re headed next at breakneck speed. Right now, AI runs our search engines, voice-recognition software, and data-mining programs. The true arms race is to be the first to push the boundary even further—from AI to AGI.”
Kowalski stirred. “What’s AGI?”
“Artificial general intelligence. A humanlike state of intelligence and awareness.”
“Don’t worry.” Gray glanced over to Kowalski. “You’ll get there someday.”
Kowalski took out his cigar and used its length to flip him the bird.
Gray took no offense. “It’s good to see you’re already learning to use basic tools.”
Painter sighed heavily. “Speaking of getting there someday. The director of DARPA—General Metcalf—just returned from a world summit about this very matter: the creation of the first AGI. The summit included all the usual corporate and government players. The group’s conclusion was that there is no way to halt technical progress toward the creation of an AGI. Such a prize is too tempting to ignore, especially as whoever controls such a force will likely be unstoppable. As the Russian president said, they will control the world. So every nation, every hostile power, must pursue it at all costs. Including us.”
“How close are we to that threshold?” Gray asked.
“From a poll of the experts, ten years at the outside. Maybe as short as half that. But definitely in our lifetimes.” Painter shrugged. “And there are some indications we might have already done it.”
Gray failed to hide his shock. “What?”
1:58 A.M.
“C’mon, baby, wake up,” Monk whispered in his wife’s ear. “Kat, just give my hand a little squeeze.”
Alone in the private room in the neurology ward, he had pulled a chair to her bedside. He had never felt so helpless. Stress heightened his every sense. The chill in the room. The quiet chatter out in the hallway. The acrid scent of antiseptic and bleach. But mostly he focused on the persistent beep of the monitoring equipment, keeping track of each breath, each beat of her heart, the steady drip through her IV line.
Tension ached in his back as he hunched at her side, his muscles ready to explode if anything changed. If the EKG showed an arrhythmia, if her breathing should slow, if the flow of edema-combatting mannitol should stop.
Kat lay on her back, her head elevated to reduce the risk of further brain swelling. Bandages covered her lacerated arms. Her eyelids showed only cracks of white. Her lips pursed with each exhalation, while a nasal cannula delivered supplemental oxygen.
Keep breathing, baby.
Doctors had discussed intubating her and putting her on a ventilator, but with her pulse-ox holding steady at 98 percent, they opted to hold off, especially as tests were still being run, with additional procedures lined up. If they had to move her, it would be easier if she weren’t on a ventilator.
He stared at the pulse-ox device clipped on her index finger. He considered double-checking himself with his neuroprosthesis, but he had detached his hand from its wrist cuff. It rested on the bedside table. He was still getting used to the new prosthesis. Even detached, its synthetic skin transmitted wirelessly to the cuff, then to the microarray wired in his brain, registering the cold of the room. He willed the fingers to move and watched the disembodied digits wiggle in response.
If only I could get Kat’s fingers to do the same . . .
A scuff of a heel drew his attention to the door. A slim nurse entered with a folded blanket under one arm and a cup in one hand.
Monk reached to the table and reattached his prosthesis. He felt his cheeks flush, slightly embarrassed to be caught with his hand detached, like being caught with his fly open.
“I brought an extra blanket,” the nurse said, holding up the plastic cup. “And a few ice chips. Don’t put them in her mouth, just paint them across her bruised lips. It can be soothing, or so patients who’ve recovered from coma have reported.”
“Thank you.”
Monk took the cup, grateful to be able to offer even this small bit of comfort. As the nurse tucked in a second blanket over Kat’s lower half, Monk gently rubbed an ice chip across her lower lip, then upper, like applying lipstick—not that Kat wore much makeup. He searched her face for any reaction.
Nothing.
“I’ll leave you be,” the nurse said and exited.
Kat’s lips pinkened slightly under his care, reminding him of all the times he’d kissed her.
I can’t lose you.
As the ice chip melted away, the head of neurology entered, carrying a chart.
“We have the second set of CT results,” Dr. Edmonds said.
Monk placed the plastic cup on the table and held out his hand, wanting to see the results himself. “And?”
Edmonds passed them over. “The fracture at the base of her skull has resulted in traumatic damage to the brainstem. There’s a distinct contusion involving both the cerebellum and pons region. But the rest of her brain—her higher cerebral regions—appears undamaged.”
Monk pictured Kat being struck from behind with the mallet found on the kitchen floor.
“So far, there does not seem to be any active bleeding at the contusion site. But it’s something we’ll be monitoring with successive scans.” Edmonds stared over at Kat, though it appeared less like he was checking on her than that he was avoiding eye contact with Monk. “I also ran a long EEG, which showed a normal sleep pattern, interrupted occasionally by a wakeful response.”
“Wakeful? So she may be aware at times. Does that mean she’s not in a coma?”
Edmonds sighed. “In my professional judgment, she’s in a pseudocoma.” From his grim tone, this was not good news. “During her assessment, she showed no response to pain stimulation or loud noises. While her pupillary response to light is normal, we’re seeing only minimal spontaneous eye movements.”
During her initial neuro exam, Monk had been heartened to see Kat blink when her lashes were brushed. Still, while he had a background in medicine and biotech, he was no neurologist. “What are you trying to tell me? Be blunt.”
“I’ve consulted with everyone here. The consensus is your wife is suffering from locked-in syndrome. The brainstem trauma has cut off her higher cerebral functions from her voluntary motor control. She’s basically awake—fully aware, at times—but can’t move her body.”
Monk swallowed, his vision darkening at the edges.
Edmonds studied Kat. “I’m surprised she’s still breathing on her own. Unfortunately, I expect that function to deteriorate. Even if it doesn’t, for her long-term care, we’ll need to establish a nasogastric tube to feed her and intubate her to keep her from aspirating.”
Monk shook his head, not denying her this care, but refusing to accept this diagnosis. “So she is conscious for the most part, but unable to move or communicate.”
“Some locked-in patients learn to speak through eye movements, but in your wife’s case, she’s showing only minimal spontaneous eye movement. Not enough, we believe, to actively communicate.”
Monk stumbled back to the chair, sat down, and took Kat’s hand. “What’s the prognosis? With time, can she recover?”
“You asked me to be blunt, so I will. There is no treatment or cure. It is very rare for patients to recover or regain significant motor control. At best, some minimal arm and leg control, maybe improved eye movements.”
He squeezed her fingers. “She’s a fighter.”
“Still, ninety percent of locked-in patients die within four months.”
The doctor’s phone chimed from a holster on his belt. He tilted the screen to read the text message. “I must go,” he mumbled, distracted, and headed toward the door. “But I’ll write up orders for her intubation.”
Alone again, Monk lowered his forehead to the back of her hand. He pictured the ruins of Gray’s home, the broken crystal angel. She had fought fiercely to protect the girls. And he would do everything he could to get them back.
But in the meantime . . .
“Baby, you keep fighting,” he whispered to her. “This time, for yourself.”
2:02 A.M.
“How could that be?” Gray asked, dumbfounded by Painter’s claim. “Are you suggesting an AGI has already been created? That one already exists or existed?”
Painter lifted a palm toward Gray. “It’s possible. Back in the eighties, a researcher named Douglas Lenat created an early AI called Eurisko. It learned to create its own rules, adjusted to mistakes, even began to rewrite its own code. Most surprising of all, it began to break rules it didn’t like.”
Gray frowned. “Really?”
Painter nodded. “Lenat even tested his program against expert players of a military game. His AI defeated every opponent, three years in a row. During the later years, players changed the rules without informing the developer to better handicap the game in their favor. Still, Eurisko soundly defeated them. Following this, Lenat grew concerned about what his creation was becoming, how it was self-improving. Ultimately, he shut it down and refused to release its code. To this day, it’s still locked up. Many believe the program was on its way to becoming an AGI, all on its own.”
A trickle of dread traced through Gray. “Still, true or not, you believe there’s no stopping this from happening again in the near future.”
“That’s the consensus of the experts. But that’s not their ultimate fear.”
Gray could guess what scared them. “If the creation of an AGI is inevitable, then an ASI will not be far behind.” Before Kowalski could ask, he added, “ASI stands for artificial superintelligence.”
“Thanks for spelling that out,” Kowalski said sourly. “But what exactly is that?”
“Ever see the movie Terminator?” Gray asked. “Where robots destroy mankind in the future? That’s an ASI. A supercomputer that outgrows mankind and decides to get rid of us.”
“But it’s no longer science fiction,” Painter added. “If an AGI is right around the corner, most believe it will not stay a general intelligence for long. Such a self-aware system will seek to improve itself—and rapidly. Researchers call it a hard takeoff or intelligence explosion, where an AGI quickly grows into an ASI. With the speed of computer processing, it could be a matter of weeks, days, hours, if not minutes.”
“And then it’ll try to kill us?” Kowalski asked, sitting up.
Gray knew this was a possibility. We could be the creators of our own end.
“There is no saying for sure,” Painter cautioned. “Such a superintelligence would certainly be beyond our comprehension and understanding. We’d be little more than ants before a god.”
Gray had heard enough of these speculations. This threat could wait. He had more pressing and immediate concerns. “What does any of this have to do with the attack, with finding Seichan and Monk’s kids?”
Painter nodded, acknowledging Gray’s impatience. “I was about to get to that. Like I said from the beginning, DARPA has been pouring money into various projects. And by money, I mean billions. Last year’s budget devoted sixty million to machine-learning programs, fifty to cognitive computing, and four hundred to other projects. But what is significant—what is germane to the matter at hand—is the hundred million sent out this year under the category ‘Classified Programs.’”
“In other words,” Gray said, “covert projects.”
“DARPA has been secretly funding a handful of ventures that are not only close to developing the first AGI, but whose research is aimed at a specific goal.”
“And what’s that?”
“To make sure the first AGI to arrive on this planet is a benevolent one.”
Kowalski snorted with derision. “So, Casper the friendly robot.”
“More like ethical,” Gray corrected, well aware of this line of pursuit. “A machine that won’t try to kill us when it ascends to godhood.”
“DARPA has made this a priority,” Painter emphasized. “As have many other research groups. The Machine Intelligence Research Institute. The Center for Applied Rationality. But these organizations are vastly outnumbered by those pursuing the golden ring of an ordinary AGI.”
“That seems stupid,” Kowalski said.
“No, it’s simply cheaper. It’s much easier and faster to build the first AGI than it is to engineer the first safe AGI.”
“And with a prize this valuable,” Gray said, “caution takes a backseat to speed.”
“Knowing that, DARPA has been funding and nurturing talented individuals and projects, those that show promise of creating a friendly AGI.”
Gray sensed Painter was finally getting to his point. “And one of these programs has some bearing on what happened tonight?”
“Yes. A promising project at the University of Coimbra in Portugal.”
Gray frowned. Why did that sound familiar?
Painter reached over to the computer on his desk, tapped a few buttons, and brought up a video feed onto one of the wall monitors. The footage revealed a tabletop view into a stone room. Rows of books filled shelves to either side. A group of women stirred around the table, staring straight into the camera. Lips moved, but there was no sound.
Their positioning struck Gray as familiar. He guessed the feed came from a computer’s built-in camera. It appeared the women were studying something on the monitor in that stone room.
“This footage was taken the night of December twenty-first,” Painter said.
Again, something nagged at Gray. The date. The location. Before he could dredge it up, one of the women leaned closer. He recognized her and gasped. He stood up and crossed to the screen.
“That’s Charlotte Carson,” he said, already guessing what would happen next.
“U.S. ambassador to Portugal. She headed a network of women scientists. Bruxas International. The group funded hundreds of female researchers around the world through grants, fellowships, and awards. To accomplish this goal, Bruxas was self-supported for a long time, mostly through the largesse of two founding members—Eliza Guerra and Professor Sato—who were from old and new money respectively. But even their pockets only went so deep. In order to help more women, the group sought out additional support, collecting capital from corporations and government agencies.”
Gray glanced over to Painter. “Let me guess. Including DARPA.”
“Yes, but only to finance a specific handful of their grant recipients. Like one woman’s project called Xénese. Or in English, Genesis.”
“One of DARPA’s friendly AGI projects.”
Painter nodded. “Only Dr. Carson knew of DARPA’s interest in this project. She was sworn to secrecy. Not even the young woman running the program, Mara Silviera—a veritable genius—knew of our involvement. That’s significant.”
“Why?”
“Watch.”
By now Kowalski had joined Gray at the screen. Gray knew what was about to happen, but clearly Kowalski did not. As a group of robed and masked men burst into the room, Kowalski swore. The big man took a step back when the gunfire started. As the women’s bodies crashed to the stone floor, he turned away.
“Motherfuckers,” Kowalski mumbled.
Gray agreed with his characterization, but he kept staring. Charlotte Carson slumped to the floor, mortally wounded, blood pooling under her. Still, her face stared toward the camera, her brow bunched with confusion.
“What is she staring at?” Gray mumbled.
Answering him, Painter zoomed the view to a tiny corner of the screen. Focused on the horror of the attack, Gray had failed to note the small window open there. Painter replayed the last of the footage. A symbol of a pentagram filled the wall monitor. It started to spin wildly, then suddenly broke apart, leaving a single symbol glowing on the screen.
[Illustration of the symbol, designed by the author]
“Sigma,” Gray whispered.
Leaving the symbol hanging there, Painter turned to Gray. “This footage was only discovered by Interpol eighteen hours ago. By a computer forensics expert who was searching Mara Silviera’s lab at the university. It seems the women from Bruxas had been attending a symposium in Coimbra and had come to the university library to witness a demonstration of Mara’s program when they were attacked.”