Silver Screen

by Justina Robson


  “No,” I said. “None in the record and none in my time at the Company.”

  “So, in your opinion, if power and materials were somehow found from elsewhere, then the present-generation machine, 901, would be self-sustaining in accordance with the accepted recognition test for independent life?”

  “Yes.” So far so good. We had moved into the realm of the expected.

  Sikorska took a nod-count when the rest of the panel were done noting, and moved on to the next part. “The issue of this hearing is complicated by the fact that this is the first time a nonhuman has been proposed as coming under our jurisdiction. We are all somewhat reluctant to proceed on a decision at the moment, since there is no adequate definition of what it is to be human written in contrastive terms to any other type of intelligent, conscious being.” She paused and looked at me with interest. “However, due to your memory and your historical training and experience, we believe that you are also well placed to answer our questions on this subject. So, in clarification, could you explain for us, first of all, those specific similarities which 901 bears to human beings such that it ought to be awarded the full protection of this judiciary?”

  Obviously they had only been warming up before.

  “Specifically,” I began, “the similarities are that 901 is a thinking being, equipped with senses which are analogous to the five human senses. Historically, it has been common to compare the cognitive function of AIs with that of nonintelligent computer systems which operate dedicated programs for individual functions either in series or parallel.” I could see I was losing some of my audience, if not the hungry attention of the panel. I rephrased. “It was assumed that information must be put in by another operator and that the AI was a sophisticated processor, and that was all. But in fact this type of AI, the JM series, is constructed in a way much more similar to a human brain.”

  I paused for breath and assessed my performance. In the front row of the OptiNet team, Hallett seemed intrigued, whilst Maria was tense and there were several faces of lesser note looking crosswitted. In the back Vaughn glowered. I hadn't realized he would be there and for a second my throat dried. I took a sip of water from my table.

  “To approach this from another angle, the world is an undefined place. A human being and an insect would generate very different mental structures to deal with the same surroundings, because what is important to the survival of each is very different. The first thing any brain must do, whether it has advanced consciousness or not, is to develop and constantly maintain an idea about each of the components of the world, and what they mean in terms directly related to it.”

  Wang was nodding emphatically and Mendoza was smiling. The others did not move. I proceeded to the end of my point: “901 and its recent predecessors all engage in defining their worldviews in a constant, active process, exactly analogous to the human way. The only difference there may be lies in the physical means by which this process takes place—circuits, say, instead of cells. This ongoing mental construct, the worldview, is what in humans also determines the personality—the individual becomes themselves constantly through continual adjustments of perception of the self in terms of the worldview structure. And 901, who it is, is a function of exactly the same situation. It has an identity, a personality as distinctive as any human.”

  “And is this personality close enough to human that it may be termed ‘human’?” Wang asked. “If it was created by human beings, was it not in their own image so exactly that it is another form of human?”

  “A nice idea,” I said, “but I'm afraid not. For that to be true, 901 would have to experience the world as a human being, and because it is very different, physically, with different needs, it does not.”

  “And how does it experience the world?” Harbutt said. The court was absolutely quiet. Beside me my clerk looked up expectantly, all her attention on my face.

  At this stage I was supposed to elucidate the sheer bizarreness of an AI's experience. Hallett said that it would be significant in determining what, of 901, the Company would be legally allowed to police or control in the future. In other words, it would be the evidence used to determine what it was for an AI to suffer. In truth I knew that 901’s experience of the world was greatly masked by its ease with human interfaces. Its private world was something I knew nothing about, even less than I had known about Augustine as proved to me through the medium of Soldier. But at least with Augustine we were of the same kind so I had had a fighting chance. If I now spoke my lines obediently, it would pave the way easily for all kinds of compromise and rationalization about fair treatment for 901. Thinking this through took only a few seconds, but the atmosphere of the court was so anticipatory that for everyone it seemed to stretch into minutes. I don't think anybody even breathed. And I hesitated another precious while—because as soon as I spoke, the cat was really going to be out of the bag.

  “In order to fully understand another's experience,” I said, “it would be necessary to become them. To really stand in their shoes. At no time in the history of human relationships, although it has been imagined with great empathy, has this actually been possible.” I waited for a small ripple of disappointed expectation to subside, and for everyone present to get to the end of their physical readjustments and bottom shuffling before I gave it to them, both barrels. “Until now.”

  An electric jolt seemed to rush through the massed legal bodies on OptiNet's benches. Some scrabbled for their handpads and references, others turned to hiss into ears beside them, the rest sat up, rigid with horrible suspicion but impotent, whilst I sat quiet and easy on my comfortable seat and looked at them. It was a good feeling, to have all the power for just these minutes, the power to make them squirm. I knew it was only fleeting, couldn't get any better, so I milked that pause, the cameras bugging down at me, the judges alert, the clerks bolted to the spot, the surface on every glass of water a perfect stillness. They had no idea what I was going to say next. And if I'd really thought it through, instead of deciding on the spur of a heady moment, maybe I'd have relented and stuck more closely to the script.

  But my blood was nearly boiling with suppressed anger at the whole situation I was in, and I said, “Through direct-interface implants it is possible to connect to machine experience as a substitute for one's own sensory information. Generally, this technology is used with human simulated senses so that the experience is comprehensible to the human side of the conversation. The AI part must translate itself into information suitable for a human recipient, and so it seems that it is very human itself, but this is a necessary illusion for ease of communication. It's like asking someone from a completely different culture to talk about everything as if they were your neighbour, but about a thousand times worse. However, it is possible to experience AIs directly, by foregoing this translation. I have done so several times, not only with 901 but with other AI systems.”

  I waited for the situation I was about to explain to sink in. Maria was ash-white. Hallett had his eyebrows raised, almost amused. Vaughn was gone. I looked for him, but saw nothing.

  “I'm sorry that I can't replicate the experience for you,” I said to the judges, “because I can't understand it myself. It was too alien. But I can tell you that it was every bit as alive, as dynamic and complicated as any human I have ever met.”

  The court disintegrated into noise for a minute or two at that point, and then we had to go through the difficult and lengthy interrogations about these experiences, which led to the inevitable revelations of the existence of the Shoal—previously only known about by the communications transnationals—and also of Armour, Soldier, Platoon, and so on—strictly illegal, even if it was questionable whether they were outside all Earth jurisdiction, since OptiNet's major operations all took place offworld. Through my explanations I made sure that the Shoal would be investigated now and have a good chance of success despite the efforts of OptiNet and Astracom to squash it. I had dropped OptiNet in a heap of troubles through which it might realistically be destroyed, and I slightly damaged Roy's case for 901 because I had to reveal its part in the various operations that had taken place before and after his death. We got to the end of it at about 9:00 PM, and then closed for the night. I was taken back to the hotel under guard and in silence and was left in my room to sit and think over what I'd done like a naughty schoolgirl.

  What I had done was pretty bad; even I had to admit that, despite my sense of injustice at the way the Company had behaved towards Roy, myself, and probably a host of other bright sparks as naïve as we had been. Until Roy's death I'd never seen myself as a commodity, nor thought that something as obviously high-minded and good as the creation and sustenance of 901 could be generated simply out of greed, ambition, and the hunger for dominance. I'd had a lot of ideas that were well short of the mark and I'd had to face that down the barrel of a gun. But did this piece of Lyceum vengeance really counteract my catastrophic disappointments?

  No. Sitting in my beautiful window at the Hotel Mozart, drinking Riesling and watching the rain, I had to accept that it didn't. In fact it stemmed almost entirely from my failure to accept my mistakes, my stubborn insistence that the world rolled my way. Maybe even the reason that Roy's secret trail of messages had fallen so flat on me was that I wasn't as tuned in as he'd thought, and even now I was missing a crucial link. I had all the data, but couldn't see anything in it. Never was any good at matrix testing.

  I took a long drink. Alcohol would stop me sleeping but damp the agitation of my brain: a subtle play-off. I wanted to simplify my thinking. But didn't that just encapsulate the whole situation? I can't have a drink without justifying it with a theory. I can't stop believing in the absolute certainties of reason and the utter goodness of reason, even when it's obvious that that meme is about to break me, if it hasn't already. So I thought, hoping that events were nearing their nadir. But they weren't. Despite my recognition of where I was going wrong I was a long way from changing, and the course had a way to run.

  Maria came in shortly after I'd ordered my dinner. I sat at my table, eating, while she kicked her shoes off and walked over to the windows to look into the trickling murk. The food was excellent—duck and new potatoes—but I couldn't appreciate it. I watched her standing with unaccustomed stillness, and saw that without Joaquin she was at a loss in a way that fuddled her—because, before Joaquin, she had been perfectly socially fluent; and now, post-Joaquin, she had unlearned some of her skill. Maybe as a result of this she seemed unusually genuine.

  “The Company is going to prosecute you in civil court for all damages resulting from your evidence,” she said. Her tone was factual. “Apart from the fact that OptiNet is now under several criminal and military investigations,” she sighed, “Josef says the case is looking quite good still.”

  I didn't know what reply to make and had the feeling that anything I said might result in a sudden outburst. I'd rather she kept a lid on any hysterics, so I said nothing.

  “Did you call station? Or 901?”

  “No,” I said. I had no one on station I wanted to talk to except Lula, and I had been saving that call for later, after I had sorted out my view on what I'd done. I didn't dare talk to 901, because I had a sneaking suspicion that my great idea of testifying to its alien equality of mind had landed it in much more trouble than before. As to what the rest of the world thought, I didn't want to consider.

  Maria sighed again. “I didn't know about the way it was,” she said, still fixated on the lights of Strasbourg beyond the window. “About 901, and that it was the same as us. I thought, like you said, like it was a computer, a smart one, doing what it was told to do, run by you and the others, doing whatever the Company said. But it doesn't. It can lie and cheat, and see through us. And it can live and love and be sad, but about other things.”

  I sat with an unswallowed mouthful of potato, astonished that Maria of the iron hair, the tattooed makeup, the facile smile, could actually think it, let alone say it. A surge of guilt and anxiety ran through me in two waves as I realized how little I had thought of what others might think and interpret from the day's words. If Maria could be moved, what of the rest?

  “On the newscasts,” she carried on, “they can't stop talking about what it might be like to be an AI, what their dreams are, what they might care about, what they think of human beings. They can't believe they couldn't understand it. They want to set up some direct public links so that people on the street can experience Little ’Stein and then talk straight to camera. Lise Marshall even did their horoscopes. She says 901’s a Gemini with the moon in Scorpio. Seems like you've started something.”

  A Scorpio moon meant a jealous lover, a passionate nature. If I'd started anything, it would be an alien cult in which everyone struggled to make AI the next great answer or enemy. And we were already headed down that path before. Now the rock was rolling harder—that was all. But Maria was contemplative and I didn't say anything to rouse her.

  “I didn't mean it to reflect badly on you,” I said. “You didn't know.”

  “No,” she agreed and turned around. “But it has. We'll all be expelled or relocated.” She popped a piece of gum from her suit pocket and unwrapped it, holding it in one hand whilst screwing up the piece of waxpaper with the other. With a glance that was half acceptance and half incomprehension she shrugged at me on her way out the door. “I hope it was worth it.”

  I swallowed my potatoes with a drink of water. Whatever it was, it was out of my hands.

  Later, I called Lula. I spoke to 901 first on the implant, from the comfort of my vast luxury bed.

  “I heard your evidence,” it said. “Very dramatic.”

  “I didn't intend to drop you in trouble.”

  “I was already there. And I suppose I can be prosecuted for what I did, but only if this trial is successful. It isn't as if I regret any of my actions. Don't feel bad.”

  “Thanks.” But its generosity made things worse. “Have you heard from Augustine?”

  “No, although I understand that prosthetic surgery has taken place and he is progressing well.”

  “Oh.” For a while I was lost for words. But it was my fault, so I should have the decency to get over it and not go wailing to Nine or anybody else. Prosthetics. He lost his hand. My own hand shook now, I noticed, a faint tremor, perhaps not sure if it existed.

  “You're tired,” 901 said. “You should sleep.”

  “Yeah.” There was a lot more I ought to say to Nine, but I hadn't got the guts. “Can you get Lula for me?”

  “I'm afraid Lula is in a meeting right now. Since the evidence today, an internal investigation is under way. I should tell you—they searched your room and Lula's. They have packed all your belongings. You won't be returning to Netplatform.” It paused for an AI thought-year. “I'm very sorry. It's going to be difficult to stay in touch.”

  “Do you want to?” I was swimming in shock.

  “If you do,” it said. “When they take the implant out, however, I don't think…”

  But I didn't hear the rest. Take it out? I'd forgotten I had a head full of Company property. What would I do without it? How could I call anyone securely? How would I get my information and chat and do everything I did without it? “You're not kidding, are you?” I asked, desperately hoping there might be a chance they would let me keep it. “Couldn't I buy it off them or something?”

  “Not now that you're such a big espionage risk,” 901 said. “But look on the brighter side. Before, you were in their power. Now things are more even. The world knows your story, and OptiNet wouldn't dare let you die on the operating table or anything remotely like that. Your life is safe from them…for now at least.”

  “Great,” I said. We didn't speak again that night, but I was aware of 901 there, a presence just left of centre, listening with me to the faint sound of the gutters overflowing above the window.

  The next day the defendant's side of the courtroom was again empty. Otherwise it was exactly as before. The clerks loaded our tables with refreshments, documents, and referential data. One thing which was different was the way in which people now looked at me. The day before they had worn expressions of polite interest or mild anxiety or curiosity in my freakish qualifications. Today the majority of faces held a new expression—respect. Whether it was respect for my authority and moral decision to bare the truth, or respect born of fear of what I might say next, I couldn't tell.

  “After session we have decided that, upon the questions of comparison between human and AI with regard to intellect, there is no need for further answers. However, we are still in some debate over the emotional nature of 901.” Mendoza was speaking today. His manner was neutral, although he gave the reporters on the balcony a dark look. “Do high-grade AIs experience emotions and act upon them?”

  “My evidence yesterday concerning 901’s behaviour towards Roy Croft, myself, and others with whom it has frequent contact can't be explained by any other theory,” I said, not lying, but maybe oversimplifying. It could be explained by a devious manipulatory goal I hadn't yet seen in action, but I thought this was not the place to air the idea. “All human actions and impulses are initiated by emotional states which arise in turn from personal goals, whether consciously known or unknown,” I elaborated. “Those actions of 901 not caused by the performance of its tasks as a tool or processor in response to human demands are caused by states which closely resemble human emotional states.”

  “How close would you make the comparison, Miss O'Connell?” Mendoza said.

  I glanced up at the media reporters. Maybe it was my imagination, but there seemed to be more of them. Soldier's brute expediency, the Shoal's strange reactions—they were strong in my mind, but so was the silent night.

  “It differs with the AI,” I said. “In the case of 901 I would say that, in its manufacture of the HughIe interfaces and surrounding behaviours, 901 is more emotionally sophisticated and complex than most human beings. It reads human beings very accurately.”

 
