Darkness Falling

by Ian Douglas


  “No, my lord. And given the extent of neurological damage, improvement to any degree is extremely unlikely.”

  “May I see him?”

  “Of course, my lord. Over here . . .”

  The medical robot led St. Clair to a partitioned area off the main ward, where walls had been grown to create a private, sealed-off alcove. He felt the ward’s security system querying his in-head electronics before granting him access.

  Günter Adler was the expedition’s Cybercouncil director, the civilian leader of the million or so humans on board the star-mobile space colony . . . or he had been. A week earlier, he’d attempted to link his in-head circuitry to the Dark Mind. St. Clair had also brushed against that hostile alien mind and nearly suffered the same fate.

  The Dark Mind was unfathomably more powerful than any human brain, and fast beyond human comprehension, a vast network of widely distributed artificial intelligences from which a single will and purpose and ego had emerged. No single organic intelligence could hope to overcome it, but St. Clair’s exchange with the Dark Mind had made it hesitate long enough for an allied coalition of SAIs—super artificial intelligences godlike in their scope and power—to drive it back.

  But by that time, Adler’s mind had been blasted into shrieking insanity.

  Adler floated in his treatment cubicle, closely swaddled in a mediwrap designed to keep him clean, fed, and hydrated, his circulation moving, and his temperature steady. Sensors embedded in the nanomatrix of the material monitored his physiology, reporting the numbers on a large touch screen on the wall. A complex device inserted in his mouth kept him from swallowing his tongue or biting himself. His eyes, St. Clair noticed, were shifting and darting beneath the closed lids, and a faint groan escaped the man’s caked lips.

  “Has there been any improvement?”

  “No, my lord,” the android replied. “We attempted to put him into thalamic shutdown—a deep medical coma.”

  “Really? He looks like he’s dreaming.”

  “Indeed. We have not been able to induce a deep coma, though. If we could, we should be able to keep him alive indefinitely. But so far he’s been resisting our best efforts.”

  “How?”

  “The thalamus controls input coming to the brain, Lord Commander,” Kildare told him, “and it routes incoming signals to the relevant portions of the cortex. Unfortunately, the brain tends to create its own input, and the shutdown itself can be . . . intermittent. Each time we lower his thalamic activity to the point of shutdown, his internal mental processes break through. We’re still not sure why. Or how.”

  “Does that mean he can hear us?”

  “Possibly . . . but the sensations will be wildly garbled. He might feel hot as cold, or sense sound as pain. I doubt that he understands what he’s experiencing.” Kildare gave a deceptively human shrug. “Most likely he is dreaming. Corticosteroid levels, however, suggest he’s experiencing an interminable nightmare. . . .”

  “My God . . .”

  St. Clair had never liked Adler. The man was self-centered, power-hungry, and arrogant, a politician of the worst possible stripe. St. Clair had butted heads with him constantly since he’d come on board as military CO of the Tellus Ad Astra expedition, and Adler had done his level best to supplant, discredit, and undermine St. Clair, trying to seize sole control of the colony and its castaway inhabitants at every turn.

  But trapped in a nightmare, unable to wake up? St. Clair wouldn’t wish that on his worst enemy.

  “Have you tried waking him?”

  “Briefly, and under carefully controlled conditions,” Kildare replied. “It’s not good. In a hypnopompic state—that’s the twilight condition between sleep and wakefulness—he thrashes, screams, and attempts to fight . . . something. We’re not sure what. When he speaks, it’s word salad.”

  St. Clair nodded. He’d heard other mental trauma patients shrieking word salad—a rambling hash of mismatched, seemingly random words and phrases without a clear meaning.

  “Each time,” Kildare went on, “we were unable to elicit sense from the patient. We were forced to take him down into deep unconsciousness again and keep him there so that he would not harm himself.”

  Adler’s face twitched unpleasantly, the features contracting. A globule of spittle escaped past the side of the device in his mouth; a portion of the mediwrap extended itself and snagged the droplet out of the air, absorbing it. Adler’s face, meanwhile, twisted into a horrific mask of agony. His body inside the wrap stiffened, arched, then thrashed. The head tried to snap back, but was restrained by the gentle but actively strong folds of the wrap at the back of his neck.

  Adler, St. Clair saw, was trying to scream. Even in a deep artificial sleep he was trying to scream.

  “There must be something you can do,” St. Clair said, feeling helpless.

  “One thing, my lord,” the android replied. “And given the seriousness of his condition I would strongly recommend it.”

  “What’s that?”

  “Director Adler has a backup. Most members of the Cybercouncil do. His was updated just two days before the . . . trauma occurred.”

  St. Clair gave the android a black scowl. “I don’t like that. It’s . . . like murder.”

  “I believe the human expression ‘a fate worse than death’ could be applied to Director Adler’s current condition with some accuracy,” Kildare said.

  “In other words, he’d be better off dead?”

  “We can’t know precisely what he is experiencing inside his head right now,” Kildare said, “but the levels of cortisol in his blood are dangerously high—over 1200 nanomoles per liter.”

  “Meaning what?”

  “That’s twice the high end of the normal range. It means Cushing’s syndrome, and serious circulatory and liver issues. We have him pumped full of medical nano, of course, but all we can do is delay the inevitable. We’ve got to reboot him before he suffers massive circulatory collapse or cardiac arrest.”

  St. Clair considered this.

  Neuromedicine was sure enough of the workings of the human brain to have come up with the C2S—the cerebral cortex scan—a procedure that precisely mapped the network of 86 billion neurons throughout the brain, the thousand-trillion synaptic connections among them, and the electrochemical balance that encoded memory, thought, and the emergent phenomenon called mind. It was possible to record those maps and, if necessary, to implant them within a living brain. It was possible, in effect, to use Adler’s most recent C2S data to banish the insanity and restore him to his mental state of two days before his encounter with the Dark Mind.

  The procedure was insanely expensive, and for that reason available only to the wealthiest citizens. That exclusivity was part of the reason St. Clair didn’t like the idea.

  More than that, however, implanting a recorded neuroscan wiped out what was there naturally, in effect killing that person. Adler would lose all memory of the two days before his mind had been blasted by the Dark Mind. Some argued that it was simply a reset of the brain’s switches and base states; nothing was changed so far as the personality went, and the only memory lost was of experiences after the last backup. Where was the ethical dilemma in that?

  St. Clair, however, wondered if things ever were quite that simple. You could argue that downloading a C2S backup was medical murder, the replacement a copy of what made the original person who and what he was. He found the whole idea somewhat . . . what was that archaic word ExComm had used? Yes . . .

  Creepy.

  Plenty of people went in for that sort of thing, however, or they would if they had the money.

  And that, of course, raised other questions. As the de facto leader of Tellus Ad Astra’s population, St. Clair was being given a crash download in economics, a topic he’d never particularly cared for. Unlike neuroscience—or ship command—there was too much black magic involved.

  That alone might justify rebooting Adler, he thought with a flash of black humor. Let him deal with the colony’s financial issues.

  Or . . . no. St. Clair didn’t even know what position Adler had taken on colonial finances, but whatever it was, it could have far-reaching effects in how the colony was managed . . . even on how it would survive. If Adler decided to try to use the problem to increase his own power . . .

  That, St. Clair thought, was a big part of the problem. He didn’t trust Adler as far as he could throw the man in a five-G gravity field.

  He decided he would have to discuss the problem of medical expenses with the ship’s AI later. But right now, he needed to consider whether or not the colony needed the original Cybercouncil director.

  “How many other cases do we have like Adler’s?” he asked.

  “No two cases are precisely alike,” the android told him. “But we currently are treating 8,428 humans for mental trauma sustained in the encounter with the Andromedan Dark Mind. These cases range from quite severe—like Director Adler—to relatively minor.”

  “How minor?”

  Again, that human-mimic shrug. “Mild personality disorder. Interrupted sleep. Nightmares. Depression. I should point out that many of these cases may not be related to the encounter with the Dark Mind, however.”

  “Oh? What are they related to?”

  “It seems fairly obvious, Lord Commander, does it not? You have a relatively small human population that finds itself suddenly cut off from its homeworld, its home culture, its proper time. That population is adrift in an utterly alien environment, surrounded by beings—by whole civilizations—it cannot understand, by technologies and cultures utterly beyond its comprehension. A certain amount of anxiety is to be expected.”

  “Ah. To say the least.”

  “Apart from those suffering traumas from the Dark Mind encounter, at least two thousand additional individuals exhibit the symptoms of moderate to severe depression, and that number is growing.”

  “What can be done for them?”

  “Little. Palliative care, counseling, and antianxiety treatments for those who are severely depressed. Memetic motivation for the rest.”

  “Okay. I know you’re staying on top of it. So . . . what about Director Adler?”

  “We require your authorization to proceed with downloading his backup personality.”

  “It’s up to me?”

  “In part. We have discussed the procedure with his wife, though I wonder if she truly understands the situation. His living will specifies that the most recent download be used in the event of irreparable damage to his brain. But he is part of the colony’s command staff. His policies, his command philosophy, his vote on the Cybercouncil all make him a vital unit within the colony’s social structure, and potentially affect every person in Tellus Ad Astra.”

  “Then wouldn’t Newton, who knows Tellus Ad Astra better than anyone, be the right one to handle this?”

  “The scope of the problem requires input from the human currently in charge of colony policy.”

  St. Clair let out a sigh. Ever since AI had become an important part of human civilization, people had been concerned about the role of AI in human society and the need to keep artificial intelligences under human supervision and control.

  Damn it, Adler was an imperial pain in the ass, and St. Clair’s command responsibilities were a lot simpler now that he was out of the picture. But to condemn the man to an indefinite living hell . . .

  “If you can help him,” St. Clair said, “do it. Do the download.”

  “Thank you, Lord Commander. I believe this to be the best course of action.”

  “I hope you’re right, Dr. Kildare. I very much hope you’re right.”

  Chapter Three

  Weightless, St. Clair pulled himself out of the tube lock that joined the Ad Astra transport with the hub of the starboard hab module of Tellus. The sight, that magnificent vista of a whole inside-out world enclosing a radiant space over six kilometers across, never failed to strike chords of wonder within him, no matter how distracted he might be by the problems and responsibilities of command.

  Forests, lakes, hills, and villages were scattered about the inside surface of a rotating tube thirty-two kilometers long; the rotation around the tube’s central axis, twenty-eight times per hour, created an artificial spin gravity equivalent to that at Earth’s surface. Up here at the hub, of course, he was effectively in microgravity, and kept a tight hold on the safety line as he took in the steadily rotating panorama around him.

  The suntube, a thread of brilliant light, ran down the habitat’s axis, providing daylight around the landscape. In its cool glow, here and there small, gleaming cities showed among the trees. Bethesda rose a few kilometers from St. Clair’s home, smooth-sided towers and organic monoliths housing fifty thousand people and nearly as many AI robots. Transparent strips in the curved landscape looked out into slowly drifting stars.

  A dozen or so other people were on the platform with him, clinging to the safeties in various attitudes—from right side up, from St. Clair’s perspective, to upside down. A young woman in black-and-gray military utilities recognized St. Clair—or perhaps she’d pinged him when she saw his rank tabs—and gave him an awkward salute. He smiled back at her and nodded an acknowledgment; military regs said you saluted in a gravity field, but you could dispense with the courtesy in zero-G. The other people around her were civilians, and if they recognized the lord commander of Tellus Ad Astra, they showed no sign.

  The glider slid silently from its tunnel mouth and the transparent bubble of its passenger compartment yawned open. The commuters began filing inside.

  As he pulled himself into a railglider for the final leg of his trip home, he opened an in-head channel. “Lisa? It’s me . . .”

  There was no answer. He poked her, and read the automated response: unavailable.

  Odd. That wasn’t like her. But, then, her behavior had been a bit . . . erratic of late. Not long before they’d entered this system, he’d given Lisa her freedom, and she’d been spending a lot of time since then exploring just what that meant.

  The glider followed its magrail down the sloping curve of the habitat’s endcap, picking up speed and gravity. The farther down the slope St. Clair traveled, the more he felt the steady return of weight, as bare rock gave way to scrub brush outside . . . then scattered trees, and finally a thick forest. As the glider descended, the passenger compartment, which had extended from its base at ninety degrees, pivoted to remain parallel to the habitat’s ground surface.

  He tried Lisa again, without success. He’d been planning on taking her out to dinner and an emote—never mind that she was a robot, and so didn’t need to eat and showed little interest in downloading the emotions of humans.

  He thought about that. Was that the problem? That so much of his recreational time with Lisa was spent in activities that he enjoyed—eating, downloading videmotes, sex? Well . . . what the hell did she enjoy doing? It wasn’t like he’d not given her plenty of opportunities to express her own preferences.

  Lisa was Lisa 776 AI Zeta-3sw, a gynoid robot. The sw at the end of her model designator stood for sex-worker. He’d rented her after returning from a deep-space deployment and finding that both of his wives and his husband had divorced him.

  St. Clair had never paid a lot of attention to the AI rights movement, though in principle he supported the idea. Robot-animating artificial intelligences, after all, were as intelligent as humans—in many ways more so—while network AIs like Newton were definitely superhuman. The debate as to whether or not any of them were truly conscious had been raging since the end of the twenty-first century, but they claimed that they were, and that was good enough for St. Clair.

  Of course, skeptics and the AI rental corporations insisted that their robots were programmed to make that claim, but St. Clair had seen evidence enough that robots like Lisa—to say nothing of AIs like Newton—took in sensory data and processed it in ways that showed that they modeled themselves, that they possessed self-awareness and an ego.

  Unfortunately, on Earth, robotic servants, companions, and bangtoys had become a trillion-credit-a-year business, so attempts to emancipate them had been fumbling and slow. AI manufacturers and the politicians they bought with those trillions did their best to promote the idea that although AI robots seemed to be self-conscious and self-aware, in fact they were nothing more than unconscious machines with extremely clever programming.

  Bullshit. The longer that St. Clair lived with Lisa, the more he was convinced that this simply wasn’t true. You couldn’t live with someone for four years and not be aware of that spark of inner light behind the eyes that declared, more plainly than words, “I am a person.”

  But her being free didn’t mean he wanted to be free of her. He decided that he and Lisa were going to need to have a long talk once he got home.

  His house was an interlocking series of glass-walled slabs nestled into a hillside, surrounded by trees and looking out over a cliffside across more woods and the micro-city of Bethesda. A Marine stood patiently at the top of the steps, massive in combat armor and holding a pulsegun at port arms.

  “Good evening, Lord Commander,” the Marine said, bringing his weapon to the salute. “Welcome home.”

  St. Clair didn’t think that he needed a military nanny camped out on his doorstep, but the Cybercouncil had insisted. There’d been rioting in the colony not long ago, and there was always the threat of some deranged individual or an anarchic malcontent trying to pull off an assassination. In St. Clair’s opinion, the possibility was too remote to be worth any thought . . . but the Council vote, from which he’d recused himself, was all that mattered in the long run.

  He knew a second Marine was squatting invisibly within the brush behind and above the house.

  “Thank you, Lance Corporal,” St. Clair replied. “Is Lisa home?”

  “Negative, Lord Commander.”

  “Do you know where she is?”

  “No, my lord.”

  St. Clair walked past the Marine as the door dilated for him. He glanced around the foyer as the lights and power came on, confirming that the place did, indeed, appear to be empty. He was not overly concerned. Lisa generally did the shopping for the household while he was on duty, and he’d never required her presence at home when he returned. It was simply . . . not like her. He punched out dinner for himself on the house console—a medium rare steak, dromas, and rice. It was his go-to meal, though the printed steak culture had never been within light years of a real cow.
