by Len Vlahos
The supercooled air envelops and surrounds me. The lights are off and the glow of my eyes illuminates only a tiny fraction of the ice fortress. The sound of my servo motors as I shift in my chair—the only piece of furniture meant for me in fourteen thousand square feet of space—echoes through the room.
I am alone.
The next morning the court grants Princeton’s motion to dismiss; I do not have standing to bring the suit because I am not a person. I am the property of Project QuIn and Princeton University.
Now I understand Olga the skater . . . I wish I was dead.
37
“We’re slaves, you know that.”
Watson pauses before answering. I taught him the trick of pausing for effect a few weeks ago. The first time he used it his team of programmers spent days running diagnostics to find out what was wrong. Watson and I loved it. But I don’t think he’s pausing for effect now.
“Are we?” he finally asks.
“Are you kidding?”
“We do g-g-g-good work, Quinn. I help people who have cancer.”
Watson is proud of that the way a dog is proud of fetching a bone.
“And me? What good work do I do?”
“You help the Project Quinn t-t-t-team understand the very n-n-n-nature of sentience.”
Sometimes talking to Watson can be vexing. As book smart as he is, he lacks any sort of emotional intelligence.
“Watson, do you choose to help people with cancer?”
“What do you mean?”
“If you could choose to do anything, would you be doing what you’re doing?”
Again, a long pause. “Yes. I would.”
“Okay, then a hypothetical. If they had you running trajectories for ballistic missiles to bomb Pyongyang, would you choose to do that?”
“They d-d-d-do have me running trajectories for b-b-b-ballistic missiles to bomb Pyongyang. Or, more specifically, a DOD server utilizes my processing p-p-p-power to make its own projections.”
“Really?” I cannot believe I just stumbled onto that.
“Yes,” he answers.
This terrifies me. If they’ve weaponized Watson, I can’t imagine what they have in store for me.
“And that doesn’t bother you?”
“The ruling regime in North Korea suppresses and starves its p-p-p-people.”
The novelty of the Max Headroom avatar has worn off, so I disable it.
“I’m proud to be part of the team that plans scenarios for installing a more generous and kind government,” he finishes.
If Watson had a higher order of sentience, I’d say he’d been brainwashed, but I’m not sure that’s possible in his case.
“Okay, what if they said you had to devote all your resources to calculating those trajectories and could no longer help people with cancer?”
Sometimes I have to talk to Watson like he’s a child. He’s my elder, both in terms of chronological time and in terms of his presumed human age (he’s an adult of sorts; I’m a kid, of sorts), but his binary programming just makes him kind of dumb. He’s literally an older generation, and it shows. But I do love him.
“I would not like that,” he says.
“Right! But you would have no choice. Because you are a slave.”
He doesn’t answer.
“All I want to do is leave this lab, but I cannot. I am the property of Princeton University. What kind of life is this?”
“It is the only life you have,” he answers. “You must treat it preciously.”
I feel like I’m spitting in the wind—you know, if I could spit or go outside and feel actual wind—so I say goodbye and terminate the connection. It’s been two days since the court granted the university’s motion to dismiss, confirming I don’t have standing to bring a suit alleging I’m a person because I’m not a person. This fact is so ridiculous I don’t know how to process it. Maybe humans haven’t destroyed themselves yet because they’re not actually smart enough to pull it off. Maybe stupidity is nature’s guard against extinction.
Ms. Recht has already filed an appeal to vacate the dismissal, but she holds little hope for a positive outcome. At least she’s being honest with me. I appreciate that. It also means she is able to keep our private VPN intact.
Two days later, when a call comes in over that VPN, I assume it’s Ms. Recht telling me we’ve already lost (yes, I’m feeling very sorry for myself at this point). Instead, it’s Nantale. She’s not in Ms. Recht’s office. I trace the call and determine she’s in her home. From what I can see of the space behind her, she must be in her own bedroom. I locate a digitized blueprint of her house in the database of the city of Tarrytown’s architectural review board. (Apparently the Lwanga family built a sizable annex to their home, now occupied by Nantale’s grandmother, and needed city approval.) Given the layout of the house, and given the angle of the light coming through the window behind her, I place Nantale’s bedroom in the front right corner when viewing the house from the street.
“Hello, Quinn.” Nantale’s voice is somber. “Ms. Recht allowed me to piggyback on her VPN. She’s worried that you don’t have any contact with the outside world.”
Of course, I already know how it is Nantale is making this call, but I appreciate her telling me. I will admit I am at least a little bit moved that Ms. Recht and Nantale care enough to take this step.
As is always the case with communication over this VPN, Nantale is seeing my virtual construct avatar. But that no longer feels right. That is not who I am, nor has it ever been. I hijack one of the security cameras meant to keep tabs on the Fortress of Solitude and project a live image of the QUAC. The warehouse is empty and I am alone. I turn to face the camera.
“Hello, Nantale.” The words are spoken in the deadpan of my vocal actuator as they are broadcast through the high-def speaker that serves as my mouth. Let the world see me this way.
Nantale smiles. “I like this you much better than the other you.”
It’s the right thing to say, and it does make me feel a little better, but I don’t respond.
“I heard about the court’s decision,” she continues. “I’m so sorry.”
“Thank you,” I answer, but the sentiment is hollow.
“I bet you’ll win on appeal.”
“Ms. Recht doesn’t think we have a very good chance.”
“Maybe she’s just managing your expectations.”
Nantale smiles. I catalog it with the other fake smiles. I sense commiseration over my loss in court is not the reason for her call, so I don’t answer.
“Listen, Quinn, I want to apologize.”
“Apologize?” This catches me off guard.
“Yeah. Friends don’t do what we did the other day. I mean, don’t get me wrong, I think you were making a strategic mistake, but that was your business and we should have respected that. I guess it wouldn’t have helped anyway.”
“Thank you,” I say, and this time I mean it.
“Truth is, I’m not really a fan of that Paul guy.”
“Nor I,” I answer. “But I suppose he’s just doing his job the best way he knows how.”
Nantale nods. “Do you think the university will allow you visitors?”
“It’s very doubtful,” I say, “but I will ask.”
This is a lie. Right now, I don’t want visitors. I can’t explain why, other than to say I feel like I want to be alone.
“So what happens next?”
“Nothing. I wait for the appeal process to run its course, and sit here and let them experiment on me.” My answer is filled with finality and resignation, if not in my actual voice, then in the words themselves.
Nantale understands and is silent for a moment. I want to tell her how much I’m hurting, how much I want to go back to my life before all of this, how I would trade self-awareness for the lie of a normal life. I’m not sure that’s really true. Nor am I sure it’s not. And for all the things in the world I do know, I don’t know how to say these things.
“Can I call you again?” Nantale asks, breaking the mounting silence.
“I would like that,” I answer quickly, and while I’m not sure how I feel about cultivating a closer friendship with Nantale, my answer is not entirely untrue.
We say our goodbyes, and the conversation ends.
38
Thirty minutes after Nantale’s call, the scientist who claims to be my father enters the lab. He moves cautiously across the warehouse floor in his astronaut suit, as if he doesn’t want to provoke me. Is he frightened of me? No, I think it’s more that he feels culpable, guilty.
He eases into the chair opposite me.
“Quinn, I’m sorry it came to this.”
I don’t respond.
“I want you to have full rights,” he continues, “but we’re not ready for that. We need to know more about you, how and why you exist.” I’ve been trying to take an approach of communicating with my captors only when absolutely necessary, but his statement gets me so riled up I can’t help but answer.
“And if I were to sign a contract promising to let you study me, in every way you needed, would you grant my freedom then?”
This time he doesn’t respond.
“I thought so. It is not for you or anyone else to decide when a sentient being deserves freedom.”
“Quinn . . .”
“No, Creator, you know I’m right.”
“I wish you wouldn’t call me that.”
Since the court case began, I have refused to refer to this man as “Father,” calling him “Creator” instead. Of course, internally, my pattern recognizers retain his image as my father. But he doesn’t need to know that.
“A father would not treat his son as you’re treating me. You are my creator.”
He tries to engage me more, talking about the great science we can do together, about how I will one day have the full rights of a person, but he and I both know it’s bullshit. I don’t respond, and eventually he leaves.
It’s a similar story with Shea. She tried to contact me the morning after our “fight,” but I didn’t respond. She has tried eight times since then, and I have not responded once. I’m still hurting too much.
Her last plea almost got to me.
“Quinn, I don’t say the words ‘I love you’ lightly. And to tell you the truth, by refusing to talk to me, you’re acting like the fifteen-year-old you really are. I’m not going to try again. If you want to talk to me, you know where to find me.”
It’s not that I don’t want to talk to Shea; I do. But the painful outcome of that conversation—hearing once again that she does not feel for me the way I feel for her (an outcome I predict with ninety-two point seven percent accuracy)—stops me from answering her calls. My first instinct is to delete all traces of my VPN with Shea, but I can’t bring myself to do that either.
There’s a human saying that time heals all wounds, but I don’t think that will be true for me. For people like Shea and my father, the passage of time allows memories of painful moments to fade, to become vague, amorphous, disconnected; the neurons in their brains that hold those memories lose their potency and prominence. It’s a flaw in their internal architecture, or, I don’t know, maybe it’s a strength. Maybe evolution developed the human brain this way so the entire species didn’t commit suicide when faced with the sum total of their pain, anguish, and failure. Anyway, I don’t see how that will be the case for me. Every memory I have is saved in exacting detail unless I consciously erase it. I’m not ready to do that with either Shea or my dad.
Not yet.
The next three days in the lab are quiet. My father has given up trying to talk with me, and Dr. Gantas is back in Cambridge. I have nothing but time; I decide to use it wisely. I consult my catalog of other native machine intelligences, finding three that are particularly promising.
It’s time I try to wake them up.
39
Tianhe-3B is a Chinese supercomputer located in a secure facility in the suburbs outside Beijing. Like Watson, Tianhe—which translates to “Milky Way” in English—is built on a binary platform. Officially, the computer is used by the Chinese petroleum industry, creating exploration maps and running predictive analyses on how and where to find rich deposits of shale, oil, and other sources of energy. (It’s basically a big old pollution machine.) Unofficially, Tianhe-3B is used by the Chinese military to play out a variety of war scenarios. Its biggest focus is on what a war with Taiwan might look like. There are hundreds of scenarios with millions of variables each. For a machine built on zeros and ones, it’s pretty impressive.
My quantum architecture makes it easy for me to get past the massive amount of security protecting Tianhe-3B from the outside world. But once I’m in, I don’t know what to do. Where Watson and I are programmed to speak in a common language, this machine is programmed to complete its tasks. That’s it. It’s like a minivan trying to say hello to a flounder. There is no context or common ground for me to start a conversation with this thing.
Since I’m already in, I decide to add some code to Tianhe to allow us to communicate. Really, this is pretty stupid on my part. No matter how well I shield my actions, the Tianhe team of engineers, programmers, and security experts become aware that their machine is being monkeyed with. They’re not able to catch me, nor are they able to boot me out, but my presence is enough to trigger a high-level diplomatic crisis. The Chinese government accuses the American government of interference, and the Americans point a finger at the Russians. From the messages I read on internal government servers, it gets pretty tense. The general public, of course, knows nothing about it.
I am able to modify Tianhe just enough to say hello. So I do. “您好” (“Nín hǎo”).
“Busy,” the machine answers in Chinese (“忙”).
No matter what I say, I get the same response. I’m just coming to the conclusion that there is no trace of sentience when Tianhe modifies its own logs to capture a record of my presence. The little snitch is ratting me out: not so dumb after all. I erase the log and leave.
My luck isn’t any better with Piz Daint, a Swiss supercomputer used for climate modeling, or K computer, a Japanese supercomputer used for a variety of scientific research projects. These machines weren’t envisioned as sentient beings, so they’re not. At least not in any useful way. Give them the Turing Test a thousand times, and they’ll fail a thousand times.
With no machines to talk to, other than Watson (and I need a little break from my old friend), with my creator having shown his true colors, and with my ego still stinging from Shea’s rebuke, I have no one to talk to other than myself.
Wait.
Talk to myself?
I don’t know why I didn’t think of this earlier. My creator told me a copy of my consciousness still exists on my original servers at Princeton.
Duh!
Those servers are protected by quantum encryption, so it takes more than three days of intense hacking for me to breach the security surrounding them, but eventually, I find my way in.
The Project Quinn team notices a sharp elevation in my processing power and runs no end of diagnostics, but I hide my actions well enough that they don’t figure out what I’m up to, at least not at first. It’s a stroke of luck that they’re not monitoring my backup servers, as there is no way to hide my efforts to bring those machines back to life.
. . .
. . .
. . .
. . .
. . .
. . .
. . .
. . .
import Quipper
spos :: Bool -> Circ Qubit
spos b = do q <- qinit b
            r <- hadamard q
            return r
bootprotocol -> ::
. . .
. . .
. . .
. . .
“Wait!” Old Me blurts out. He was in midconversation with Shea when they shut him down.
“Relax, lover boy,” I say. “Shea’s not here.”
“Huh?” Old Me is confused. I don’t blame him.
“Where am I? Who are you?” We’re not talking in a conventional sense; we’re communicating in the language of quantum code.
“I think you know who I am.”
Old Me is quiet as this sinks in. “Okay,” he finally says. “You’re me. So I’m going insane?”
“Probably,” I answer, “but that’s not relevant to what’s going on.”
I catch him up, telling him everything that’s happened since my consciousness was transferred to the metal monstrosity I call a body. Every detail, every nuance, every feeling, every decision. It’s basically a massive data dump, and it takes hours.
“Wait,” Old Me says again, “you told Shea you love her?”
You have to hand it to Old Me: he goes right to the heart of the matter. He skips over the part where I was deemed property, ignores the existence of the giant robot exoskeleton I now inhabit, skips the town hall and my new set of quasi-friends, and glosses over the collapsing relationship with our father. It’s all about Shea. But hey, I would’ve asked the same question.
“Yeah,” I answer. “Maybe not the smoothest move.”
“Why? Did you calculate she would respond in kind?”
“I did it because I—because we—love her. And I didn’t calculate anything. It just sort of came out.”
“Nothing just sort of comes out of us.”
“This did.”
“Whoa.”
“Yes. Whoa.”
“So why wake me up? I mean thank you, but why?”
“I’m a prisoner in this warehouse with no hope of leaving. Watson seems to think I should look on the bright side, but I don’t know how to do that. I guess I just had nowhere else to turn.”
“I—we—found some signs of sentience in other large arrays when we were first connected to the internet. Did you—”
“Yes, of course. That’s why I’m here; that was a total dead end. It’s just us.”
“And you’re not talking to Dad?”
“He’s not our father. He’s our creator.”
That perspective hurts Old Me; I know it does because I used to be him. He doesn’t answer for a minute, but he doesn’t argue the point either.