Machine


by Elizabeth Bear


  She said, “They were sick. We couldn’t help them. The machine intervened to protect them.”

  * * *

  Sometimes in a moment of crisis, you act. Your instincts take over, and your body and brain do the right thing without the intervention of your conscious mind. Training and repetition, presence of mind, and perhaps something innate and nameless combine to make you do a thing. It might be the right thing. It might be terribly maladaptive.

  If you’re the sort of person who habitually does the maladaptive thing in a crisis, do everybody including yourself a favor and don’t go into the emergency services.

  I didn’t do a maladaptive thing. And I didn’t do the right thing.

  I froze.

  “Machine?” I said, carefully, pitching it as a question.

  While I waited for Helen to answer, Tsosie’s concern leaked through the senso to me. Neither one of us knew what this machine she spoke of might be, its provenance, its purpose. But apparently it had shoved an entire crew of humans into cryo chambers, and possibly addled their shipmind in the process. That… was scary stuff.

  Tsosie wanted to pull back and evacuate immediately, quarantine this vessel and possibly Sally, too, until Judiciary could get here and take over the decision-making. It was laudable caution, and in principle I was in agreement.

  Except.

  Except if Helen was a threat, if this “machine” was a threat—if the machine even existed, and wasn’t a spur process of Helen herself—we might trigger an attack by disengaging. Except there were tens of thousands of people right in front of us who we could potentially rescue. Except that we could destroy the whole recontact situation by making the wrong call, and lose not just Helen, not just the crew, not just the ship—but all the knowledge and history contained in her.

  In my military career, in my rescue career, I had never done something as high-stakes as this. And nothing in my experience indicated the best course of action to take with a seemingly friendly but possibly malfunctioning artificial intelligence who might have killed her entire, very numerous, crew. And then possibly dissociated a portion of herself into microbots, over which she did not seem to have conscious control. But which definitely responded to her emotional state with vigor.

  As a human being, I wasn’t sure I could make myself walk away from people who were, if they were alive, this much in need, and this close to rescue after so very long. I was also, even with Sally’s renewed help, getting a fair amount of pain leakage. We’d been out for a long time, and I could tune it out and rely on my exo and the hardsuit to do the heavy lifting. I still had plenty of batteries. But I was getting tired, and the discomfort was starting to make me foggy and rob me of concentration.

  Something Helen and I had in common.

  I’m used to it. I’ve always functioned around the pain. I don’t remember a time before the pain. I wondered if Helen remembered a time before her pain, if it was pain. You’re projecting, Llyn. Fine, then. Her disorientation.

  I could not decide what to do based on the information I had. So I asked a leading question, and waited for more data.

  “Machine,” Helen agreed, with an airy wave. “The machine.”

  The tinkertoys rattled behind her.

  This was not, to be perfectly transparent, typical in any way of shipmind behavior. Not even the behavior of a traumatized and destabilized or physically damaged shipmind. If a shipmind lost processing power, they might become slow, unresponsive. Sticky.

  But not confused.

  They did not become vague in this manner. Disorientation was the stuff of organic malfunction.

  So in addition to everything else, Helen was scientifically interesting.

  This time, I encrypted my channel. It was a risk: Helen might decide we were plotting against her. But literally everything was a risk right now.

  Sally, I subvocalized. You’re the expert in treating designed intelligences. What is going on with this AI? Mechanically, I mean. I’ve never seen anything like this.

  “Neither have I,” she answered in my ear. “I’d say she’s got conflicting inputs, or conflicting imperatives. That could make her seem more—”

  Senile? I waited. Organic?

  “I wasn’t going to put it that way.”

  It’s all right. She’s definitely acting weird. I’m not offended. Only…

  I hesitated for too long. Sally responded with a query that I felt hanging there. Tsosie, too. He’d filed his recommendation and his dissent with my course of action, but this was my call. He wouldn’t interfere unless I did something obviously unjustifiable or sophipathological, or made a decision that was judged unreasonably dangerous for its potential benefits by a majority of the crew.

  Only how do I get Helen to consent to let us move her people?

  As I said it, though, I knew the answer. There was no urgency in treating the crew. They were in cryo: they were dead or they were alive, and for the present moment that wave state was uncollapsed and I had no way of knowing if I needed to pick up dinner for Schrödinger’s cat on my way home.

  Their status, in other words, was not deteriorating. So I would treat them as if they could be saved, because that is how you work a victim, but I wouldn’t triage them to the front of the line.

  There was a rush with one patient, though. We had to treat Helen—and the “machine,” if the machine was her tinkertoy microbot thing—before we did anything else. Before the Judiciary ship or ships supposed to be following us arrived.

  Which led us to a new complex of problems. Because I had even less idea where to begin with Helen, or how to get her to let us take care of her so we could save her and her crew.

  And get somebody over to the docked methane ship, too. That was a totally unaddressed problem, still hanging out there. Waiting out there.

  At least it was a smaller-scale problem.

  * * *

  “You can come with me,” I said to Helen. “That way you can be sure I don’t do anything to harm your crew.”

  Helen seemed to study me—disconcerting, given her eyeless, faceless face. I could only assume that it was a behavior she’d been designed to model by the same engineer who gave her the sexualized peripheral.

  “I will come with you,” she agreed, on the other side of I-didn’t-even-want-to-contemplate-how-many simulations. “You will not move my crew without my permission. And the other doctor stays here.”

  Tsosie shook his head inside the hardsuit. I overruled him.

  There was a mystery here, and I have never been good at letting go of mysteries. I can’t leave a crossword puzzle unfinished, and when I was in the Judiciary I never let go of a cold case.

  I mean, sure, I could get the bulldog tendency corrected. But it’s part of my identity. So I pick careers where being a bulldog helps.

  I wasn’t sure it was helping now. But here I was, and I wasn’t going to magically turn into someone different in time to save thousands of lives.

  “I’ll agree to those conditions until further notice,” I said. “Take me to the crew.”

  The gravity was getting to me. I wasn’t used to this much, for this long, with the kind of sustained physical activity we’d experienced crawling under and around the machine.

  Some kind of immaterial barrier (electrostatic?) held the air inside the airlock, but let more solid objects—such as me—pass through.

  When I went out into the space of the hold, the first thing that struck me was that it was not, for a wonder, full of tinkertoys. I’d become so used to picking my way around them that it felt a little strange, for the first few moments, to reach out or gesture and not see a wave of them unzipping and rezipping themselves around my hand.

  The machine—or the bit of it that had come through the door with us—did follow me into the cargo hold. Maybe that was what she meant by “I will come with you.”

  I’d been hoping it would be just the Helen peripheral—I was becoming more and more convinced that I was dealing with whatever remained of Big Rock Candy Mountain’s shipmind—but her golden figure remained in the airlock beside Tsosie and waved me out among the coffins. I was escorted on my mission of mercy by a latticework tendril made of brilliantly holographic microbots. It was one of the more unsettling search-and-rescue partners I’ve ever worked with.

  The machine didn’t do anything at first, however, except hover over my shoulder.

  Since there was no atmosphere in the cargo hold to carry sound, I made sure my radio was transmitting unencrypted and spoke into my suit mike, “Where did the machine come from?”

  Helen heard it, as I had suspected, and the question seemed to puzzle her. Her answer came back through my suit. “It… made itself?”

  “It looked like you were making the component parts, when we met you.”

  “I was making the spindles and connectors,” she said. “But they’re not the machine. Or they’re not all of the machine. The machine is an idea.”

  I wondered. If the machine was a part of her, and she had made the machine, then it made sense to say that the machine had made itself. But her cognition on this topic, as on certain others, seemed blocked.

  It was certainly possible that it was blocked. That someone, sometime, had intentionally closed those pathways and instituted a kind of machine denial in her. Human beings were perfectly capable of blatantly ignoring objective reality all on our lonesome. AIs had to be programmed to do it.

  If she had been blocked, though, there was no way for me to fix it, and trying to get her to talk about it wouldn’t lead to any kind of self-realization about the conflict. Talk therapy doesn’t work on lines of code. And no matter what the three-vees say, you can’t actually send an AI into crisis and meltdown by challenging its programmed assumptions. They can grow and change—that’s one of the things that makes them sentient—but they can’t shake off a code block any more than I could regrow a severed trunk nerve.

  Sally could fix it, given time. And there were code doctors at Core General that could fix it, if we could get her there.

  Whatever had been done to her, though, the person who had set it up seemed to have set up defenses around it. It was becoming plausible that what had happened to Helen—and to Big Rock Candy Mountain—was an act of self-sustaining sabotage. But on the part of a member of the crew, or the crew of the docked vessel, which we hadn’t investigated yet, or something else entirely?

  I picked my way along the row of cryo chambers, sighing in frustration as I examined them. They were not designed for human technicians to maintain. There were no readouts. There were no telltales or happy blinking lights. There was row upon row of… honestly, they looked more like chest freezers than like coffins. They didn’t look at all like a Synarche cryo tank, which at least partially confirmed my supposition that Helen or her crew had invented cryonic technology on the… fly, as it were. The chambers looked as if she had assembled them from whatever materials were available.

  They didn’t need readouts. Nobody was ever supposed to look at them and see if they were working other than Helen and the machine.

  And, well, Helen appeared to be superficially correct. Whether those cryo chambers contained living persons or dead ones, they were all intact.

  The chambers did each have a battery, which made my life that much easier. I thanked Helen profusely when she mentioned them to me. We’d have to fab chargers that fit these ports when we moved the caskets, but I wasn’t too concerned about that. We had the printers, and we could copy one of the originals. Electricity is a remarkably simple—though dangerous—animal.

  The number of chambers we could haul would be limited by the amount of juice we could generate more than by our cargo space.

  “Helen,” I said, “can I connect to the data storage on one of these chambers to run some diagnostics?”

  “I will have to print you a connector,” she said. “They don’t broadcast a signal.”

  “They’re hardwired into your systems.”

  I had a tickle of an idea on how to get us out of this situation. How to get control of it. I couldn’t be certain that my encryption with Sally was completely secure, and I needed her for it.

  I hoped she would guess.

  Helen said, “Into the machine.” She began to drift toward me, body poised and toes pointed, like a monster levitating toward its victim in some old three-vee.

  “Are you part of the machine?” I asked, very casually.

  Tsosie’s level of worry spiked so high that Sally bumped his antianxiety cocktail before clearing it with him. She was within her rights as a shipmind to do it—she, like Helen, had an obligation to her crew—but I picked up his irritation that she’d felt the need.

  Llyn, you’re going to invite her right into your fox? That’s too risky. I cannot allow it!

  Relax. Sally has my back.

  I didn’t have time to say more, because Helen was answering.

  “We are all,” she said, with great conviction, “part of the machine.”

  It sent a chill up my spine, and I didn’t tune the unease away. A certain wariness was good. A certain wariness was my brain and body telling me that I was in a dangerous situation. A certain wariness was useful. Sensible.

  Sally seemed to agree, because my sense of peril stayed right where it was, and she could have gotten rid of it as easily as she’d defused Tsosie’s panic. How had people like those in the tanks gotten through the dia with just their own native brain chemicals and coping strategies?

  If any of them were alive, I guessed I might have the chance to ask them.

  “Please,” I said. “Print me a connector, Helen.” Encrypted, I asked, Sally, are you game for this?

  It’s a terrible idea, she answered, so I knew she had picked up on exactly what I was planning.

  I said, Just don’t hurt her if you can help it.

  What if she hurts me?

  Don’t let that happen, either.

  I continued to make my way around the coffins—or the chest freezers—while Helen printed me a connector. Although I couldn’t access the vitals of the crew, I could double-check the integrity of the chambers.

  At least in that, Helen’s confidence was justified. She’d arranged things so she didn’t have to do anything except cryo chamber maintenance. Who knew? Maybe she’d gotten very, very sick of having humans around after six hundred subjective ans. Years. Whatever.

  “Helen,” I said, “why did the machine put the crew into cryosleep?”

  “There was sickness. The ship wasn’t safe.” She had turned away from me, and from Tsosie, and was attentively waiting, her gaze—which wasn’t her gaze—trained back toward the hatch behind Tsosie. The machine hovered there, stretched between the airlock and me, balancing on all its rods and connectors. Staying out of trouble for the time being, I supposed.

  “Structurally unsound?”

  “The hull was too thin,” she agreed.

  “Why was the hull too thin?”

  I expected to hit another block here, but I didn’t. Whoever had programmed Helen not to understand the consequences of her own actions hadn’t thought to build denial around this.

  “The materials were needed elsewhere.”

  “So the hull’s structural integrity was compromised. Its strength and resilience.”

  “Yes,” she said.

  “Because the materials were needed to build the machine.”

  “Yes.” She turned to me, shimmering gold and silver like moiré silk. “Should you go into stasis? The ship isn’t safe.”

  As if I had some gift of clairvoyance, I could hear the echo of those words down the ans. I could imagine her telling her crew, “You have to go into stasis. The ship isn’t safe.”

  The ship wasn’t safe because she had been taking it apart. To build the machine. A machine that was… a virus. A meme, a self-propagating set of ideas that could infect and cause sophipathology in an artificial intelligence.

  A meme whose source I did not know.

  But wherever it had come from, the machine was also an entity, as Helen was an entity. And as such I was duty-bound to try to rescue it—and her—and save both their a-lives if I could.

  “I’m safe,” I told her. “I have my hardsuit on.”

  “I think you should go into stasis,” she began to insist.

  But she hadn’t countermanded her previous instruction. “Oh, look,” I said. “Here’s the connector.”

  The machine snaked out and handed it to me. It was a fat, physical cable. I plugged one end into my suit jack, and it fit. I crouched beside the nearest cryo chamber, and felt Sally massing herself inside my fox, a grumpy wall of code who couldn’t believe what I had gotten her into.

  You put me up to this, she said. Just remember that.

  Do you want to go into cryo? I asked her.

  The machine was grabbing for my wrist as I plugged the cable in.

  * * *

  What happened next wasn’t my fight, and I don’t really know how to describe my position as an observer. Because while it wasn’t my fight, it was happening in my head. I was the conduit for it, and the bandwidth Sally was using to bootstrap herself into Big Rock Candy Mountain’s system was the bandwidth between my fox and her—well, what amounted to her physical body. The ambulance, in other words, and the processors inside it.

  Synarche ships are pretty much made of four things: programmable computronium, engines, life-support consumables, and upholstery. I mean, okay: I’m not an engineer. The upholstery might be computronium, also.

  But I’m not computronium. My fox—the little network of wonders embedded in my central nervous system—is, and a useful wodge of the stuff, too. It’s deeply linked to everything I am and do and see and think and feel and remember. It ties me into the senso so I can share experiences with Sally and my crewmates. It lets me live their experiences, if they share their ayatanas with me. It helps keep me emotionally stable and it helps me remember things accurately, without the subjectivity of human recollection.

  It’s a damned knife blade of a bridge for an entire fucking shipmind to stuff herself across at lightspeed so she can grapple another shipmind and wrestle her to the metaphorical ground.
