The Phoenix Code


by Catherine Asaro


  Megan knew it would cause him no discomfort to lie down here on the floor. He wouldn't be aware of anything after she turned him off. Even so, the thought of asking him to stretch out on the hard surface bothered her.

  "We can use one of the apartments," she said.

  He continued to look at her.

  "And Aris."

  "Yes?"

  She touched his arm, instinctively seeking to make human contact with him. "If you understand a person, it's customary to indicate that in some way."

  "How?"

  "Nod. Smile. Make a comment. Your knowledge base must have rules for social interaction."

  "I have many rules."

  "Don't they indicate how you should respond?"

  "Yes."

  She waited. "But?"

  A hint of animation came into his voice. "You are new."

  "So you don't know what parameters apply to me?"

  "Yes."

  "You should apply all your rules with everyone."

  "Very well. I will do so."

  "Good." Surely it couldn't be this easy. There had to be a catch here somewhere.

  They headed down a hallway in the residential area. As they walked, she regarded him with curiosity. "Aris, do you have any hobbies?"

  "I don't engage in nonfunctional activities."

  She smiled at his phrasing. "We'll have to change that."

  "Why?"

  "It's part of having a personality."

  "What nonfunctional activity should I engage in to have a personality?"

  Megan almost laughed. "Haven't you ever done anything besides interact with the Everest team?"

  "I make maps." A tinge of excitement came into his voice. "I made one of NEV-5 for Dr. Hastin. I tried to make one of MindSim, but I didn't have enough data."

  It seemed a good activity. "Do you like doing it?"

  "I don't know how to 'like.' "

  "Would you do more of it even if you didn't have to?"

  "Yes."

  She beamed at him. "Great. I'll see if I can find you some map-making programs." It was a start. Aris had a hobby.

  They went into a bachelor apartment with blue decor and holos of mountains on the walls. A comforter and piles of white pillows lay on the airbed.

  "This is nice," Megan said. "You can relax on the bed."

  Aris lay on his back with his legs straight out and his arms at his side. Sitting next to him, she said, "Does it bother you to be deactivated?"

  "Why would it bother me?"

  "It's like becoming unconscious."

  "I have no context for a response to that state."

  Megan supposed it made sense. She just wished he would respond more.

  "Dr. O'Flannery," he said. "Should I call you Megan?"

  Startled, she smiled. "Yes. That would be good."

  "Are we going to engage in sexual reproduction activi­ties now?"

  Megan gaped at him. Good grief. When she found her voice, she said, "No, we are not going to engage in sexual reproduction activities. Whatever gave you that idea?"

  "You told me to apply my rules about social interac­tions. According to those, when a woman sits with a man on a bed in an intimate setting, it implies they are about to initiate behaviors involved with the mating of your species."

  A flush spread in her face. "Aris, make a wider survey of your rules. If we were going to, uh, initiate such behaviors, we would have engaged in many other courtship procedures. We haven't, nor would it be appropriate for us to do so."

  "Why not?"

  "Well, for one thing, you're an android." She won­dered how many other surprises his evolving code would produce. Whatever else happened with this project, she doubted it would be boring.

  "None of my rules apply to human-android interac­tions," he said.

  "Make one, then. Reproductive behaviors are inappro­priate in this situation."

  "I have incorporated the new rule." He paused. "I see it would be impossible for us to mate anyway, since I will be turned off."

  Turned off? As opposed to "turned on"? She squinted at him, wondering if he could have made a joke that subtle. No, she didn't think so. It was just his deadpan delivery.

  "You may deactivate me now," he said.

  A chill ran down her back. What happened on the day when he said, "You may not deactivate me"?

  We'll deal with it, she thought. Then she said, "BioSyn?"

  "Attending." Although the resonant male voice came from the console here in the room, it originated from a powerful server in the big lab on Level Three. BioSyn linked to most of the NEV-5 computers and monitored all of Aris's activities.

  "Deactivate Aris," Megan said.

  "Done," BioSyn answered.

  Aris's eyes closed. He had neither pulse nor breath now. When he was active, his chest moved and he had a heartbeat. He was designed to pass as human; if a doctor examined him, or if he went through sensors such as an airport security check, probably nothing would give him away. A more demanding examination would reveal the truth, but he could pass a reasonable range of probes.

  She flipped open her palmtop. "Tycho, link to Aris."

  Tycho went to work, analyzing the android's quiescent brain. The software was too complex for a human to untangle; it required another computer to interpret it. If she hadn't turned Aris off, his mind would have been a moving target, evolving even as Tycho looked.

  Reading Tycho's results, Megan swore under her breath. No wonder Aris kept freezing up. His fear tolerances weren't the only ones set too low. Hastin had put so many controls on his behavior, Aris was incapable of independent thought. She studied how Aris had evolved the embryo code that Hastin had written for his mind. Yes, she saw Hastin's intent: to ensure they didn't create a monster. But his precautions were so stringent, they had crippled the android's development. Yet the code for Aris's ethics and morals was astonishingly weak. It made no sense; if Hastin had so feared that Aris might act against his makers, why design him with such a weak conscience?

  Gradually it began to make sense. The answer to her question connected to Aris's intended purpose as a spy. He needed the ability to deceive, manipulate, steal, even kill, none of which he could do with too strong a conscience. Hastin had given him a solid foundation in human morals, then set it up so Aris could act against them. Aris knew it was wrong to kill, but he could commit murder if he felt it necessary to do his job.

  Megan could see the problem. Aris didn't have the mental sophistication to deal with the contradictory ethical dilemmas or questions of moral judgment that humans often faced. His conscience was part of his hardware, so he couldn't alter it. However, his software influenced how strongly he adhered to his sense of right and wrong. She would have to alter millions, even billions, of caps on his behavior, particularly his responses to fear, anger, danger, ambiguity, and violence. That meant she also had to strengthen his aversion to acting on those responses; otherwise, she could create exactly the monster Hastin feared. In other words, she was going to pulverize Aris's ability to carry out his intended purpose.

  "Damn." No wonder Hastin had resigned.

  She knew what she had to do. It remained to be seen whether or not MindSim would fire her.

  *4*

  Rebirth

  The message was waiting in Megan's room. She walked in, fresh and showered after her workout in the gym. Although larger suites were available, she liked this one. It had a bed and armoire to the right, and a state-of-the-art console on the left. Crammed bookshelves lined the opposite wall and books lay strewn across her furniture. Most of the "books" were slick-disks for her electronic reader, but a few were genuine paper, crinkled with age. Her Escher holo hung on the wall, along with a Michael Whelan poster of the Moorcock hero Elric. A somnolent cleaning droid stood in one corner, disguising itself as a bronze lamp.

  Right now a red holo glowed on the screen of her computer, indicating someone had tried to contact her. When she flicked her finger through it, the screen lightened into a skyscape of holoclouds. Nothing else happened, though. She had no idea how long she would have to wait before whoever had sent the message picked up her response.

  Megan was about to turn away when the clouds vanished. A new image formed: Major Kenrock behind his desk. His dark hair was cut even closer to his head than the last time she had seen him and his uniform was the image of crisp perfection. The holo of a gold key glowed in a corner of the screen, indicating a secured transmission.

  "Hey, Richard," Megan said. "How are you?"

  "Very well, thank you." He gave her a measured nod that fit with his square-jawed face. "How are you settling in?"

  "Okay." She rubbed the back of her neck. "I really need that robotics expert, though. Any luck with Sundaram?"

  "It seems Arizonix Corporation is also interested in him."

  Megan grimaced. "If he signs anything with them, MindSim can kiss him good-bye."

  Kenrock gave her a wry smile. "I believe the good-bye would be sufficient. But yes, if he consults for Arizonix, we could face some thorny legal issues if we try to hire him."

  "Has he given any hints which way he's leaning?"

  "My guess? I think he'll go with Arizonix."

  "Ah, well." She tried to hide her disappointment. "We'll look into the other candidates." It was a blow; when it came to the adaptation of AI to robotics, no one could surpass Chandrarajan Sundaram.

  "How is the RS-4 unit?" Kenrock asked.

  "His name is Aris."

  Kenrock's smile was rueful. "Sorry. I should remember that. Aris."

  His amiable response didn't surprise her. People criticized Richard Kenrock for being stiff, but under his formal exterior she found him both engaging and natural.

  Megan gave him a report, describing her work for the past week. Her primary focus was the development of algorithms, software architecture, and experimental design. In addition, she supervised a pack of young, hotshot programmers at MindSim who had written most of the initial code and continued to work on the project. She did a lot of writing herself, not only because she had more knowledge and experience, but also because she loved the challenge.

  After she and Kenrock signed off, she sat thinking. Did Chandrarajan Sundaram even remember their conversation at Goddard? She shouldn't have let herself build up hope that he would accept the job.

  Ah, well. She would just have to do her best until they found another consultant. With that in mind, Megan left her room in Corridor B and went to Aris's room on Corridor C. She knocked, an old-fashioned courtesy given that the console inside would identify her no matter what she did.

  The door slid open. Inside, Aris was sitting at his workstation. A flock of holos skittered across the screen, a colorful profusion of cubes, disks, pyramids, and spheres.

  "May I come in?"

  He swiveled his chair around. After he had stared at her a while, she said, "Are you all right?"

  "No."

  "What's wrong?"

  "I can't answer your question." His expression re­minded Megan of her four-year-old niece when the girl was confused. It made her want to hug Aris. She held back, of course. Even if he understood the gesture, which she doubted, he probably wouldn't appreciate being treated like a child.

  "Which question caused the problem?" she asked.

  With her exact intonation, he said, "Are you all right?"

  She blinked at his ability to mimic her voice. "Why can't you answer?"

  In his normal baritone he said, "I don't see how 'all right' applies to me. The evolution of software is a neutral process, whereas 'all right' suggests emotional content. If I am not all right, am I somewhat wrong?"

  His literal interpretation didn't surprise her. Not only was it a trait of computers, it was also one of young children. "The reason I wondered if you were all right was because you just stared at me when I asked if I could come into your room."

  "I am not a person."

  "I'm not sure I follow."

  "I am an android."

  "Well, yes." She tried to interpret his response. "Does that affect whether or not I can come in?"

  "It depends."

  "On what?"

  "My predecessors. The other RS units. They ceased." He regarded her with his large blue eyes. "If I am not 'all right' will you take me apart too?"

  Good Lord. He thought if he gave a "wrong" answer, they would destroy him? No wonder he didn't want to respond. It also meant he was developing a sense of self-preservation. Protective impulses surged over her. Maybe it was his youthful face that made him look vulnerable, or his wary gaze, as if he had no defense against the inconstant humans around him.

  "I would never hurt you," she said.

  "Software can't be hurt."

  Then why do you look so scared? Was she reading emotions into him that weren't there? In any case, he still hadn't said she could come in. "Did Marlow Hastin ever ask permission to enter your room?"

  "No."

  "Did he request your input on anything?"

  "Rarely."

  She didn't see how anyone could work with an AI and not offer it choices. How would Aris develop? "Did you ever ask for choices?"

  He shifted in his seat and a lock of hair fell into his eyes. "No."

  "Did you want to ask?"

  He pushed back the curl. "I have no wants. I carry out program instructions."

  Softly Megan asked, "Then why did you move your hair?"

  His arm jerked. "It was in my face."

  "So?"

  This time his arm snapped out and smacked the console. He yanked it back against his side. "It's inefficient to have hair covering my eyes."

  "Why is your arm moving?" She could have asked Tycho, but she wanted to hear his own evaluation.

  "My brain is instructing it to alter position."

  His deadpan response almost made her laugh. "But is it efficient, do you think?"

  A hint of confusion showed on his face. "My analysis of your tone suggests you are teasing me."

  She smiled. "A little, I suppose."

  "Isn't teasing an expression of affection?"

  "Well, yes, sometimes."

  His voice softened. "Do you have affection for me?"

  How did she answer? If she said yes, it implied she was losing her professional objectivity. If she said no, it could damage his developing personality. Besides, in this situation, professional objectivity might be the wrong response.

  "I enjoy your company," she finally said.

  "Can you feel friendship for a machine?"

  "I'm not sure." She sighed, giving him a rueful look. "What do you say, Aris? Do we humans make sense?"

  His lips quirked upward. "I have too little experience with humans to know."

  His hint of a smile heartened Megan. "Would you like to meet more people?"

  "How?" Now his expression shifted toward wariness. None of his emotions were full-fledged, but he had made progress. "I can't leave NEV-5 and you are the only per­son here. Do you wish me to experience more with you?"

  "You might try letting me come into your room."

  "All right. Come in."

  "Thank you." Megan took a chair from the table and sat next to him. With the two of them side by side, facing his computer, their arms almost touched. The faint smell of soap came from the orange coverall he wore.

  She could see the display on the computer better now. Shapes of different colors and sizes skittered around the screen. "What are you working on?"

  "It is a game." The barest shading of excitement came into his voice. "The shapes represent rules for mathemati­cal proofs. When the shapes catch each other, it means they've made an equation allowed by the rules."

  The evolving display of color and motion intrigued her. "Do you work out the proofs ahead of time?"

  "No. I don't usually know, before they come up with a proof, that they will do it."

  "It's clever." She wondered what had motivated his de­sign. "Did Hastin ask
you to write games?"

  "He told me to solve proofs."

  Her pulse jumped. "Then designing a game to work them out was your idea?"

  "Yes."

  So he had come up with his own ideas. It indicated the fledgling expression of what might become self-determination, perhaps also creativity. "That's wonderful."

  His voice warmed. "Thank you."

  Perhaps it was time to try a more demanding environment. "Would you like to take a walk?"

  This time his face blanked. Recognizing the signs of a freeze, she spoke fast, hoping to head it off. "Aris, stand up!"

  He rose to his feet. "Where will we walk?"

  Encouraged, she stood up next to him. "That's it, isn't it? My giving you a choice is what makes you freeze."

  "I don't know how to choose."

  "We'll have to fix that."

  "Why?"

  That gave her pause, not because it was an odd question for a machine, but because she took the process of making choices for granted. "It's part of having free will. Of being human."

  "That assumes 'being human' is a good thing."

  "Do you think otherwise?"

  "I don't know. Are you more human than Hastin?"

  Again he caught her off guard. "How could I be more human than another human?"

  "The way you program my code."

  Then she understood. She softened her voice, taking the same tone she had used with one of her graduate students when he had trouble with his doctoral work. "Hastin made the best choices he could, Aris. What we're doing here, it's all new. We don't know what will work. I'm only building on previous efforts of the Everest team. We need to do more."

  "You act more alive than they do."

  "More alive?" The phrases he chose fascinated her. "What do you mean?"

  "Your face has more expressions. Your voice has more tones." Softly he added, "You keep me company."

  Good Lord. Was he lonely? The implications staggered Megan. If he could feel the desire for human company, he had come farther in his development than she realized.

  "Will you keep me company on a walk?" she asked.

  He watched her the way a child might watch a parent who had given him more freedom than he felt ready to accept. His head jerked, then his arm, then a muscle in his jaw.

  Then he moved.

 
