The speed wasn’t why I was busy: Sally and Loese handled that comfortably on their own. Helen was the ongoing distraction. Helen, who had been left alone for a very long time indeed. Helen, who wasn’t emotionally stable.
Since I’d insisted on rescuing her, the rest of Sally’s crew seemed unified in their opinion that entertaining the peripheral with PTSD was my problem. Sally was our AI medic, and she still had access to Helen’s code and was teaching herself the archaic language Helen was programmed in. I was confident that Sally was doing everything she could to patch up Helen’s psyche—and, in fact, Helen seemed to be getting more focused and coherent and less like a brain trauma patient as Sally went to work on her operating system. And possibly her processors as well. I didn’t have the skills to know what was going on inside that peripheral, and since she wasn’t my patient, privacy dictated that I not ask Sally unless I needed the information professionally.
I did know that it would be incredibly traumatic and would cause a lot of data loss for any modern AI to be constrained in a physical plant as small as Helen’s body. Perhaps arrogantly, I assumed Big Rock Candy Mountain wouldn’t have miniaturization on the level the Synarche did, which meant the processors in Helen must be bursting, and her working memory badly overstressed. It must have been equally traumatic when the microbots hived off—or when she chose to hive them off. An AI couldn’t suffer a psychotic break, exactly. But they had their own varieties of sophipathology, and dissociation of their various subroutines into disparate personalities was definitely one that was well-attested in the literature.
Even I, who did not have Sally’s expertise in her specialty, knew that.
Whatever the relationship between Helen and the tinkertoys, the machine didn’t seem to communicate verbally. That was Helen’s sole province. When she wasn’t aphasic, I mean. So as part of her therapy, I wound up designated to communicate verbally with her.
When Sally assigned me, I thought about protesting. But if I didn’t do it, somebody else would have to. And it wasn’t that onerous, even if I’m much better at prying people out of critically damaged spaceships than I have ever been at making small talk.
Maybe that’s why I like play-by-packet games: you have all the time in the world to come up with something to say.
My duties were pretty light when we weren’t actually mid-rescue and there were no patients in need of treatment. I wasn’t a specialist in rightminding or in treating artificial persons—Sally was our AI MD—but I had the time on my hands, so I spent a fair amount of it talking to Helen.
At first, she ignored me. I knew she was aware of my presence, because Sally was also keeping tabs on her, and making herself available for conversation. (We existed, that whole flight, in an abundance of preparedness.) Helen seemed to find Sally stressful and weird, though, so Sally kept her presence light.
I preferred Helen’s silence to hearing her repeat her fixed ideations, at least, and I kept at it. And I could, with Sally’s guidance, help lay the foundation for the data docs when we arrived at home.
So I sat with her and chatted. At her more than with her, at first. With Sally’s assistance, I knew the right leading questions to ask, and if she didn’t answer, I could tune myself to be more patient than I had been made. I quizzed her on our human cargo, her crew. The crew she was so fiercely loyal to that she’d sealed them into boxes to save them from… herself? From the poisonous meme that had infected her? Or had she hived off the thing she called the machine in order to manage her own cognitive dissonance about saving her crew by freezing them?
I wasn’t even quite sure where to begin unpacking that.
Helen also seemed to look at me—inasmuch as an eyeless face can—and listen when I told her about white space, which I took to mean she was interested in the science. I might have tried to explain the physics, but I didn’t understand those, either. So I encouraged Helen to strike up a relationship with Loese, and while they talked I got my rest shift in.
Shipminds don’t sleep, you see. Even shipminds trapped in their peripherals, who have forgotten that they were ever shipminds to begin with.
Assuming that’s what Helen was.
* * *
Naturally, I was asleep when the first interesting thing happened. In my own bunk, for once, with Tsosie and Loese on shift for the time being. I didn’t stay asleep long, though, because the g forces woke me.
When you spend as much time on a ship as I do, you get to know its moods and sensations. I surfaced from dreams of eating lunch with my daughter back planetside to the hazy awareness that Sally had fallen out of white space and was dropping v, my internal organs sloshing to the side in a slightly uncomfortable fashion. Ordinarily, I might have turned over under the net and gone back to sleep. But, down the corridor, I could hear people in ops, talking.
I slithered out of bed and didn’t bother with slippers. My pajamas shushed around my ankles as I padded under light gravity toward the command module. Rhym was there, and Camphvis stretched out in her acceleration couch, alert eyestalks the only indication that she was conscious.
A glance at the scans showed me that we’d dropped out of white space at a mass in order to change vectors, pick up a beacon, and dump some v. There was a star nearby, a red giant whose dim glow and massive size let us see details of the atmosphere.
And Rhym and Sally were talking to someone. It was Sally’s voice, mellow and carrying, that I had heard in my half-awake state and followed.
“… Singer, copy,” Sally was finishing.
Another ship’s voice answered her: this one a human-sounding tenor, without the tinny ring of translation. Another ox ship, then, and another at-least-partially human-crewed ox ship. “Sally, glad to run across you. And thank you for the updated sitrep. Anything else you’d like us to know?”
“Negative,” she responded. “Good fortune on your journeys and with your investigations.”
“Good fortune on yours,” the other ship responded. “My crew extends wishes that you return home safely, and that your patients thrive. Singer out.”
I settled on my couch and leaned over to Camphvis. “Who’s that?”
“Somebody famous!” she answered, brow tufts quivering with delight.
Rhym leaned about half of their feathery tendrils in our direction. “That is the shipmind of I Rise From Ancestral Night. They’re ferrying our archinformists!”
Even translation couldn’t flatten the excitement in their voice. Senso normally provided context, but Rhym and I had worked together so long now that we could read each other’s moods pretty well, even across a nonmorphologically aligned species barrier. And I understood why everybody was so thrilled. This was the ship that had recently been discovered parked in stasis near the Saga-star itself. We’d had some of her crew at the hospital for a while after that adventure—and not in the best of shape, or so I’d heard.
“Apparently,” said Sally, “the archinformists were all already on board, so it was easy to divert them here.”
Well, naturally. Where else would you find a lot of archaeologists but at the galaxy’s most interesting archaeological site?
* * *
The next dia’s session with Helen started off like every other so far. After I ate, I took my second bulb of tea over to where Helen was floating, out of the way against the aft bulkhead. I mourned my lack of coffee. There was one place—one—on Core General where humans could go for the devil bean. Its air was scrubbed before recirculating and you had to rinse your mouth out with wash before leaving.
I understand. I do understand. The Sneckethans eat nothing but rancid space fish, and we make them use their own cafeteria also. They’re pretty good sports about it, so the least humans can do is be good sports about coffee. The smell of tea doesn’t inspire the same hatred in our fellow sentients, and I’ve gotten used to it. I come from a coffee-drinking settlement, though. We even grew it onworld, for consumption and export. Suitably isolated from the local biosphere, naturally.
I do think it’s ironic that the root meaning of cafeteria is “place to get coffee.”
But I digress. I sat myself beside Helen and settled in.
That was when Dia 5 started to differ from Diar 1, 2, 3, and 4. I almost dropped my tea when Helen acknowledged me at once, brightly. “Hello, Dr. Jens! Can I make your day more complete?”
Day, I noticed. Not dia. She really was from a long time ago.
I felt safe beside her. Sally had an override on her programs by then, and had installed a governor. That’s not as oppressive as it sounds. Helen still had free will and consciousness. It was just that if she tried to think about taking violent action against Sally or her crew… she wouldn’t be able to. Sally assured me that Helen’s program was very primitive, by artificial intelligence standards, and that she—Sally—wasn’t at all worried about her ability to control Helen. She’d written herself a routine to do nothing but monitor Helen and the microbots, which she was still keeping quiescent.
“Actually, I came to talk,” I said. “I thought you might be lonely.”
I might have been learning to read the non-expressions on her not-quite-a-face. It seemed that her demeanor lightened. “What would you like to talk about?”
Helen looked up, as if glancing at the ceiling. As if looking for Sally up there. I could read it as nervousness, which led me to swallow a flash of unproductive anger as I thought about the engineer who must have programmed that fragile little gesture of performative subservience into her.
That engineer was possibly frozen in a coffin a couple of meters away, neither determinately alive nor dead until we thawed him out. Schrödinger’s engineer.
Pity calmed me a little without my having to tune.
“What would you like to talk about?” I asked. “I’m tasked to help you assimilate, after all.”
“That other doctor who speaks to me sometimes. It’s an artificial person?” She seemed overawed.
“Sally? Yes. Sally is the ship. She’s an artificial intelligence. A… yes, I suppose you could call her a created person.”
“And she’s allowed to be a doctor?”
“She’s allowed to be anything she wants to be,” I said. “Or nothing at all, if that’s what she prefers, though I have yet to meet an AI that didn’t get bored if it wasn’t taking on four or five challenging careers simultaneously. They have to pay off their creation, and all Synizens are required to perform a certain amount of government service if their skills are needed. But yes, she’s a pilot, and an astrogator, and a doctor. One of the best doctors I’ve ever met.”
Helen seemed to pause for a long time. It was unusual for an artificial intelligence to take long enough to process something for a human to notice it. With most AIs, I would assume that she was giving me a moment to catch up, or waiting to see if my electrical signals needed a little extra time to feel their way through all that meat. Some of the more social AIs built lag time into their communication so we felt more at home and could get a word in edgewise. With Helen, damaged and primitive as she seemed, I thought she might actually lag that hard.
Helen said, “You say she. Does she have a body somewhere?”
“A peripheral? Not to the best of my knowledge, though a lot of AIs use them. It can be useful to have hands. Some operate waldos, though. Especially surgeons.” I laughed. “And shipminds.”
Helen was looking at me. Just looking. Well, inasmuch as someone without eyes can look. “Are you a created intelligence?”
I spluttered. “No, I’m made of meat, I’m afraid. I was grown in the traditional human way, by combining gametes in a nutrient bath. Did you think I was a peripheral?”
“I thought—” She reached out and touched my exo with one resilient silver fingertip. It was inobvious, matched to my skin tone, resilient and flexible… but it was still an armature supporting my entire body. I guess I should have expected her to notice.
Nevertheless, I jerked back defensively. Helen recoiled like a cat who had touched something hot.
“You thought?”
“This might be a control chassis.”
We were both, I realized, running full-on into a wall of culture shock and miscommunication. Except Helen probably didn’t even have the concepts to express the dislocation and lack of basic knowledge she was feeling.
I tuned myself back until I stopped hyperventilating. “It’s an adaptive device to help with a pain syndrome. My exo helps me move, and it helps me manage my discomfort.”
Discomfort. That clean medical term that helps you separate yourself from what you’re feeling. Or what the patient is feeling.
Pain.
I didn’t tell her that without it, I wouldn’t be able to stand up under gravity. I didn’t tell her that without it, either I would have to lie around in a haze of chemical analgesics, unable to focus my mind, or I would have to tune my body out to the point that I wouldn’t have been able to rely on my proprioception—and I also wouldn’t have been able to feel it when I slammed my foot into something and broke a toe.
Hell, I wouldn’t have been able to feel it if I nailed my foot to something, full stop.
In the moments when I was struggling to get my reactivity under control, Helen had not spoken. When I looked at her again, she continued as if she had not lagged: “You’re… defective?”
Carefully, I unclenched my fist. She was an archaic, damaged AI—worse, one programmed for utterly different cultural constraints and with hundreds of ans of experience in an environment where those assumptions were never challenged. Where there were no outsiders, and no outsider ideas.
I was starting to think it would be a good idea to look up the backstory on Big Rock Candy Mountain’s crew demographics, though.
“I have a congenital condition. Would your crew consider me defective?”
“You won’t reproduce, of course,” she said, turning aside. Whoever had programmed her body language had done a good job. Her dropped shoulder and bowed head clearly telegraphed distress and dismay. And submission.
I forced my voice to remain quiet. I would have had to tune to keep it calm. “I already have. My daughter lives on Wisewell with her other mother.”
Life is a funny, terrible thing. We laugh at it because the utter banality of its tragedies renders them constant and unremarkable.
I hadn’t seen my daughter in person since she was eighteen standard months old. My ex-wife did not approve of me accepting a tour of duty rotation that would take me offworld for the whole ten ans. I felt I could not turn it down. For reasons of service and obligation—justifying my existence, if you like, when I’d been told for so long I was a drain on society—and also because I was interested in exomed, and a billet on a Judiciary ship was the best way to get that experience. And to get better medical care. And because getting out of a gravity well for a while seemed like a dream come true, if I were honest.
I’d promised I’d come back at the end of the tour.
I took the reassignment to Core General instead, because I felt like that was a place where I could really be useful. I can’t blame Alessi for cutting me loose when she did. I’d deserved it.
So I talk to my daughter in packets, as much as possible, though sometimes I go a long time without hearing from her. It seems Rache has grown up to be a fine young adult. I might have gone back to Wisewell and sued for partial custody. I might have. I could have gone back after my stint in Judiciary.
But it turned out… I couldn’t.
Not when Core General approved my request for an exo rotation. That was a dream come true, and the opportunity of a lifetime. So few doctors even get to train at Core General—let alone come on staff.
And I’d come to believe deeply in that place, but even so, I was more surprised when Starlight—the ox-sector administrator—requested I stay on and join the pilot ambulance program.
Alessi made the right choices, I think, when she cut me loose. She could have waited for me, been patient and self-sacrificing. She could have followed me into space and dragged Rache along.
But she’s right. I’m a terrible mother.
I’m a very good doctor, though. And maybe it’s good to concentrate on the things you excel at.
It doesn’t mean that I’m okay with being a terrible mother. Or that we never regret the sacrifices we make to get what we want, or what we think we need. I had all of that piled up behind me, all those feelings surging under a brittle layer of chemical calm as my fox tried to compensate for sudden, massive emotional dysregulation.
The conversation didn’t get any better.
“You’re female,” Helen said.
“Yes,” I said.
“How is it that your daughter lives with her mother, if she does not live with you?”
Oh. Big Rock Candy Mountain was one of those ships. “Because I contributed half her genetic material. But Alessi is considered to be her custodial mother under Synarche law because of legal technicalities. And none of this is really any of your business.”
“Oh,” Helen said. Then: “Yes, in my crew, you would not be considered a viable member. I understand that there is some emotional impact for you to that statement?”
It certainly doesn’t inspire me to help you save them. It doesn’t make me want to safeguard their lives and assign resources—and my precious time—to their care.
I drew a deep breath and held it until it hurt. Helen could be enlightened. Her crew could be educated, if they survived. It was not their fault that their society suffered deep-rooted sophipathologies.
Their ancestors had fled Terra when Terra seemed to be in its death throes. My ancestors had been too poor or stubborn or inessential to go in that first desperate refugee wave of emigration. But one thing about people is that we are remarkably bad at lying down to die. So my ancestors had adapted: adapted to managing limited resources, adapted to controlling their own atavistic urges through technology.