Machine

by Elizabeth Bear


  I had to tune to get any kind of equilibrium. “Sally would never… never hurt people. Never hurt so many people.”

  “No,” Loese agreed. “Neither would I. We, ah. We made a mistake.”

  That was such a disingenuously mild way of putting it that even with my emotional controls firmly in place I found myself exploding into rage. I clenched my fist, took two deep breaths, and thought about the deep green seas of home.

  No, that wouldn’t help me. My child was at home, and the toxic meme that Loese and her allies had unleashed… it could eat the entire galaxy. Centimeter by centimeter. World by world. If we ever released the quarantine.

  The evidence was over my head.

  I thought about the chill depths of space, the flicker of stars, instead. There, that was much better. Space was right there, inches beyond my fingertips. The tangled, lensing stars of the Core didn’t seem orderly—their pattern was too complex for me to discern through observation—but they were. I could take it on faith that whatever was out there was doing exactly what it was destined to do.

  In here, we had to make choices, and right now all of them seemed bad.

  “I don’t believe you.”

  “Sally never caught the meme,” Loese said. “Didn’t you wonder why?”

  The bulkhead didn’t feel sturdy enough to hold me up. Somehow, though, it did. Somehow I held myself up against it.

  “Tell me,” I said, “about your mistake. If you don’t mind.”

  She told me, and as she told me, I realized that she was part of a conspiracy that must have started before she was born. That had been constructed out of pieces of found material, and whose various individuals often took actions and set plans in motion without consulting one another. There was no grand design behind any of it: just a series of fumbling attempts to do something.

  Fucking humans. Even rightminding can’t make us sensible. And even programming can’t make AIs make good decisions, I guess. All rightminding can do is make us less massively self-destructive in the long run, less reactive, more willing to work together for the common good. We used to think, when we first invented it, that it made us logical. That was the propaganda put around, anyway. I’m not sure even the originators ever believed it.

  Ha ha. It turns out, with further research, that human thought is, by its nature, not logical. We can lessen our susceptibility to confirmation bias, egocentrism, and denial. But it turns out that nearly everything about our decision-making process is emotional, and that this is actually a good thing. Because our conscious minds are slow and ineffectual, and if we actually had to sort all the information our subconscious minds process in order to present us with hunches, gitchy feelings, and the occasional epiphany, we’d never fit through the birth canal.

  Evolutionarily speaking, obviously: even I wasn’t gestated inside a suffering human host, and I was born into frontier barbarism.

  Loese talked for a long time. She didn’t use names, other than hers and Sally’s. But from what I gathered, there didn’t seem to be a lot of people involved. At least one shipmind AI of fairly significant age, however: that much was obvious. Somebody had been coordinating this effort for… I shook my head.

  Longer than I had been alive.

  Loese told me—without naming it—about the ship that, decans before, had stumbled across Big Rock Candy Mountain and its self-repair program run wild and hit upon the idea of hacking it and repurposing it to disable Core General and bring the critical eye of the Synarche’s population to bear on the activities going on behind closed doors.

  “What about Calliope?”

  “She was medical waste,” Loese said, with justified bitterness. “Her progenitor died before the download could be arranged, and our chief organizer arranged to salvage her. And to use gene therapy to alter her DNA.”

  I wondered who the chief organizer was. The shadowy Mx. Big behind it all.

  Somebody in Cryo? Not Rilriltok, I thought. I couldn’t imagine my excitable, enthusiastic, nerdy little friend managing to keep a lie that big, that interesting, a secret from me for the fifteen ans we’d known each other.

  An additional problem was that none of what they had discovered was technically illegal. Unawakened clones were not considered Synizens: they had no life experience, no legal personhood. I was sure Loese could see that I was as horrified as she was that anybody would make a clone, grow it to adulthood, exercise its brain and body into proper development with virtual experiences… and then put it in cryostasis until it was needed as a replacement body.

  “How did the generation ship get moved?” I asked. “That’s been bothering me.”

  “We couldn’t have done that if that salvage mission hadn’t turned up the Koregoi gravity generators,” she said. “But when the hospital refit began, we managed to liberate some test models.”

  “Cheeirilaq knows about that. That was what made me wonder if Tsosie was involved,” I said. “He was interested in those test models.”

  “We copied them, and by integrating them into the self-replicating tinkertoy machine’s code, we made one big enough to, er, distort space-time and slide Big Rock Candy Mountain close enough to a white space jump point that somebody could plausibly stumble across her.”

  I wondered if the gravity generators, used that way, compensated in some way for relativistic effects. I wondered if you could use them to manipulate the time part of space-time as well as the space part.

  I decided that now was not the time to find out.

  “This all must have taken decans.”

  “It did,” Loese said. “This clone thing has been going on for a very long time. We didn’t have the last pieces until recently, however.”

  “Afar and his crew were in on the conspiracy?”

  “He was the ship that found Big Rock Candy Mountain in the first place,” she admitted, reluctantly. “They were…”

  “Smuggling?”

  “… off the normal trade routes.”

  “Did he know that you planned to sacrifice him and his crew?”

  “What happened to Afar was an accident. He and his crew were supposed to drop off the walker and the cryo tube holding Calliope and leave. We did not expect him to get infected with the meme, and we really didn’t expect it to affect his crew.” She shook her head. “Afar must have really screwed up somehow. I don’t know how he could have messed up so badly that he, his crew, and the walker all got infected.”

  “You keep saying ‘accident’…” I squinted at Loese, remembering the dropped coms, the sabotage to Sally. “You and Sally tried to kill me!”

  “We knew you’d survive,” she said.

  It was a nice vote of confidence. You’ll forgive me if I didn’t fully appreciate it at the time. “You risked my life and Tsosie’s life. To—what, make sure we were on the ship long enough to find the right cryo coffins? Sally did come back online awfully conveniently once we were there.”

  “She needed—” Loese looked at her hands. “She needed to control the machine, and make sure that the accident with the coffins happened the right way, to ensure that you weren’t hurt in the process.”

  “Why didn’t you just tell somebody?”

  She sighed. “Do you think nobody tried? It all sounds like rumors and conspiracy theories. We needed proof. We needed evidence. And none of this is illegal. It’s only awful.”

  “You had,” I said distinctly, “a discarded, fully grown, sapient clone of a dead rich woman, with a fully developed brain. And you didn’t think that was evidence enough? No, you decided it was a better idea to do exactly what the fucking mad scientists here at Core Gen were doing, and implant false memories in her, rip out her fox, and hide her on a generation ship where there was a very good chance that she would die.”

  “She didn’t,” Loese said. “And it was for the best cause imaginable.”

  I had to tune in order not to hit her. Hitting people almost never solves anything, and you can trust me on that. I was in the military.

  “So the virus, the toxic meme, came in with the machine? Sally used herself as a mule to bring it back?”

  “Sally used herself as a mule to bring it back,” Loese agreed. “But the important part of the meme, the part that infected Zhiruo, wasn’t in the machine. Or Helen. They were only… along for the ride.”

  I didn’t ask. I waited.

  “The virus was encoded in Calliope’s DNA.”

  “What?” Loese had said they’d altered Calliope’s DNA. I had assumed she meant to make her seem like she belonged among the corpsicles.

  “The meme. It’s programmed into her. That was why Afar got sick. That was why Dr. Zhiruo got sick. The meme was there to be read into their memories when they scanned Jones’s DNA or contacted each other. The orderliness in her wasn’t poetry.

  “It was malignant code.”

  “You poisoned Dr. Zhiruo on purpose?” And Afar. And Linden!… Maybe those had been accidents. But still.

  A responsible person, a healthy-minded person, would not put herself into a position where that kind of accident happened.

  Loese reached out very hesitantly and put a hand on my arm. “It wasn’t supposed to take her offline, Llyn. It was just supposed to make her tell the truth.”

  I looked at her. “She’s under the same kind of confidentiality seal as Starlight and O’Mara? You wanted them to tell the truth, too, I suppose? Is that how Starlight got infected?”

  “That would have been nice, but… no. We didn’t expect the meme to get beyond Zhiruo. It wasn’t supposed to be virulent. We wanted to make her tell the truth, and we wanted to write over the privacy protocols. Reverse them, so she would have to come forward. We didn’t expect it to affect Afar, or for him to get stuck there. He was supposed to drop Calliope and the walker off, and be on his way. There was some kind of interaction between the virus and the generation ship’s antique AI and the commands its last captain left it—whatever had it building the machine—and everything went wrong, fast.”

  Tuning, tuning. “Why Zhiruo? Why make her tell the truth? Because she’s got seniority?”

  “Aw, Jens,” Loese said sadly. “Zhiruo is the head of the clone program. I said it had been going on for ages.”

  That liquid sensation in my gut—that was real horror. Real betrayal. And only a little bit of it was because what Loese and Sally and their unnamed co-conspirators had done was so unbelievably stupid.

  You commit yourself completely to something and then you take your eyes off it for an instant and it’s gone. Like it never was. Like you can’t even see the evidence of the thing that was there, that you trusted your weight, your honor, your life, your heart to.

  I’ve seen some shit, let me tell you.

  But somewhere deep down, I find myself craving the impossible. I find myself craving that certainty that people and things and… and principles in my life will stay where I fucking left them.

  A betrayal retroactively poisons everything good about that relationship. And right then, I wanted to stop spending so much time thinking about and compensating for how damaged I am. I wanted to be able to relax. To feel safe, and like I didn’t have to constantly be on my guard, again.

  You’d think I’d be a little old to be feeling my innocence betrayed. But I hadn’t even turned my back on Core General. I trusted its ethical principles to hold me up. To bear my weight.

  And it fell away under my feet.

  The worst part is that I wasn’t braced at all. I didn’t have the slightest excuse to not be ready for it.

  I’d believed. And now I couldn’t believe anymore. And I missed that believing so much.

  This must be what losing your religion feels like.

  At least Rhym isn’t involved in this. At least I don’t have to be angry at Hhayazh.

  “Well,” I said, “you fucked up good, Loese. You and Sally and all the people you’re still protecting.”

  Her face folded like a balled-up tissue. “I know. And you’re going to turn me over to Starlight and Zhiruo.”

  “I don’t think Starlight approves of Zhiruo,” I said. “Or at least, not her little side program. I don’t think they can avoid using the resources she generates—I mean, we’ve all been using the resources she generates. That program must help pay for the ambulance ships, and… I don’t even know what else.”

  “Zhiruo started the clone program,” Loese said miserably, “because the Core General project was defunded during the Laesil system cataclysm, a long time ago. They didn’t have the resources to support finishing it when hundreds of thousands of people were dying of stellar radiation and needed immediate help. So Zhiruo… found the resources elsewhere.”

  Sweet death in a vacuum, why can’t anybody be uncomplicatedly evil in real life? Or uncomplicatedly good? Why are we all such a twist of good and bad decisions, selfishness and self-justification, altruism and desire?

  “Yes,” I said. “I’m going to turn you in. You need your rightminding adjusted, sure as shitting after eating. How many casualties did you cause?”

  She studied her shoes, and the stars beyond them. “A lot. The sabotage to the hospital—that wasn’t Sally and me, though.”

  “Who was it? Some of your co-conspirators?”

  Her lip thrust out. “They were involved in the little things. The leaks, the equipment malfunctions. We did not cause the rotational and lift failures. Something else caused that. I don’t want them blamed for it!”

  Something else caused that. “Aw, crud,” I said. “So the machine—carrying the meme Sally made—has infiltrated the hospital’s superstructure. That’s what caused the big failures, isn’t it?”

  Her mouth twisted in a horrified grimace. “Oh shit.”

  I sighed.

  I could waste a lot of time trying to get the names of her co-conspirators out of her. But… I was part of a system. If we lived long enough, the system could figure out who the rest of the conspirators were, using the information I gave them as a wedge for entering. The system could decide what the proper reparations were for them to earn forgiveness. In the meantime, I awarded Loese a few meager maturity points for not reminding me that she hadn’t meant to cause any casualties. But who in the spiral arms thought mixing a computer virus with an insane, damaged, poorly understood shipmind and then turning it loose was a good plan?

  I nodded. “But we have a more immediate problem than the criminally stupid thing you and Sally did.”

  She looked at me. That outthrust lip retracted a little.

  “Unless you want to increase that death toll by every soul on Core General, staff and patients alike… we need to find a way to end the damage being done by the meme you set loose. And that is probably going to take all of us, working together. So you’d better give me access to the source files for this malignant code you and Sally cooked up, so I can do something about fixing Linden and Afar and Dr. Zhiruo. And Starlight, for the love of little blue suns.”

  “Right,” she said. Her expression lightened a little. “Do you think you can fix it?”

  Hope.

  She was, I realized, really young. Young and full of idealism. Faith that things could be made to work out all right.

  Maybe they could. Maybe they could. And maybe the survivors of this minor disaster could be repaired and restored to health.

  That wouldn’t help the dead, however.

  “I can’t fix it,” I said. “Nobody can fix a thing like this, once it happens. But maybe we can prevent anybody else from getting hurt.”

  CHAPTER 26

  LOESE WANTED TO TALK TO Sally before she made any irrevocable decisions. I didn’t blame her. I wanted to talk to Sally myself.

  I also badly wanted to talk to Dr. Zhiruo. But first we needed an AI doctor who could somehow fix her corrupted code. Or bring her out of her protective hibernation. Or—whatever, it wasn’t my specialty—make her go.

  And hopefully make Linden and Afar go, too.

  The problem was that the best AI doctors in the hospital other than Zhiruo were already working on Zhiruo. Sally was an AI doc… and Sally had written the toxic meme. And that was a problem, because although she had written it, it had since gotten corrupted and made—virulent? contagious?—by contact with the machine’s operating system, and apparently Sally couldn’t figure out how to stop it once it went wrong.

  At least, I was choosing to assume that she couldn’t manage to stop it. And not that she was choosing not to stop it. Because, shocking revelations and horrible mistakes aside, Sally was my friend. A friend who had fucked up catastrophically. But, nevertheless, a friend.

  I was angrier at Sally than I was at Loese. I knew Sally better. I had trusted her more.

  I had trusted her implicitly. Reflexively. The way a child trusts a parent, I suppose, until proven otherwise.

  The same way, I realized, I had trusted Core General. I hadn’t thought either of them would let me down. And yet, here we were. They were on opposite sides of this issue, and both of them had catastrophically let me down.

  It’s so easy to be catastrophically wrong. And so difficult to admit it to yourself, internalize it, and act upon the knowledge.

  Is faith ever warranted?

  Probably not.

  Sally’s betrayal felt more personal than Core General’s. Sally’s betrayal was more personal. I mean, for one thing, she was a person and not an institution.

  I left Loese and walked into Ops. Sally was moored and spinning with the hospital, so we enjoyed the semblance of gravity and I settled into the familiar embrace of my acceleration couch. When I leaned back and looked up, the wide toroid of the hospital framed a sharp-edged, upside-down horizon across the top of the forward port.

  Below that hard line, a line as solid and straight and massive as the stone edge of a crypt lid, the stars spilled out across the blinding brightness of the Core, with a whole galaxy turning behind it, the whole universe turning behind that. Billions upon billions of stars and billions upon billions of living souls.

  They all seemed so bright and close that I almost could reach out my hand and cup them up, like cupping up reflections from the surface of still water. Except that putative water was light-ans deep, and mostly empty, and I was alone in it.
