Humanity

by Jerry Oltion


  “I know that now.”

  More softly still, Ariel asked, “Why do you think you did fall out of love?”

  Janet’s laugh was a derisive “Ha!” She nodded at Avery as Ariel had done earlier. “He was out to transform the galaxy; I wanted to study it first. He wanted a castle for everyone and a hundred robots in every castle, but I wanted to preserve a little diversity in the universe. I was more interested in the nature of intelligence and the effect of environment on its development, while he was more interested in using intelligence to modify the environment to suit it. We argued about it all the time. Small wonder we started to hate each other.”

  Derec spared Ariel from having to reply to that. He burst into the room at a dead run, skidded to a stop just in time to avoid crashing into the windowed wall, and demanded of anyone who would answer, “What did you do with Lucius’s body?”

  Chapter 7. The Thud Of One Dropped Shoe

  Janet could hardly believe her ears. “What kind of a way is that to greet someone you haven’t seen in years?”

  Derec looked properly sheepish. He also looked as if he’d slept in his clothing and hadn’t bothered to look in a mirror before he’d left the apartment. One lock of hair stuck straight out from his right temple.

  “Sorry,” he said. “Hello, Mother. I’ve missed you. How’s Dad?” He looked through the window, but before Janet could answer him he lost his sheepish look and said, “Looks like he’ll live. But without a power pack Lucius won’t last more than an hour or so. I’ve got to get enough power to his brain to keep him going or we’ll lose the chance to find out what made him do this.”

  Janet couldn’t suppress a grin. He sounded just like his father. Or maybe like herself, she admitted, if she’d been thinking a little more clearly.

  Ariel wasn’t as amused. “Robots, robots, robots! Is that all you can think about?” she nearly screamed at him.

  “There’s more to life than robots!”

  Derec shook his head, but Janet could see the determination in his eyes. “No, that’s not all I think about. It’s just that this happens to be about the most important thing to happen in the entire history of robotics. If we lose this chance to study it, we may never get another.”

  “Derec’s right,” Janet said. “If I hadn’t been so rattled I’d have thought of it myself. Basalom, where-”

  The robot at Ariel’s side interrupted. “I do not think it would be wise to revive him. He is dangerous.”

  “I agree with Mandelbrot,” the wolflike learning machine beside him said. “Much as we regret the loss of our companion, his experiences have damaged him beyond repair. It would be best to let his pathways randomize.”

  Janet looked at the old, Ferrier-model robot. Mandelbrot? She’d thought she’d heard that name shouted earlier. Could this be the one? It seemed impossible, but he did have a dianite arm…

  “Maybe so,” Derec said, “but not until I get a recording of them first. Now where is his body?”

  “In the lab next door,” said Mandelbrot.

  “Great.” Derec turned to go, but stopped and looked at Janet again. “I, uh, could probably use your help if you want to come along.”

  She felt the tension in the room ease slightly. She looked from him to Ariel to Wendell in the operating room, wondering if she should go. She didn’t want to leave Wendell, but the medical robot had given him a general anesthetic in order to stop him from thinking about his injury, so it really made no difference to him. Going with Derec, on the other hand, might matter to him. A little stunned that she might actually care about what either of them felt, or that she might feel something herself, she said, “I don’t think I’m doing anyone any good here, so sure, why not?”

  “I’ll stay here,” Ariel said.

  Janet couldn’t tell if she meant that angrily or helpfully. She didn’t suppose it mattered much; the same response would work for either case. “Thank you,” she said, and let Derec lead her from the room.

  Basalom followed along, as did the two learning machines. They found the remains of the third, Lucius, resting like a battered starfish on the floor just inside the door to the lab. It looked as if part of the battle had gone on in there as well, but Derec, stepping over Lucius’s body, said, “I guess I forgot to clean up. Central, fix these exam tables, please. And go ahead and reabsorb the loose cells on the floor. All but what belongs to Lucius, of course.”

  The sandy grit surrounding the pedestals on the floor sank into the surface, and the pedestals simultaneously grew taller and spread out at the top to form three separate exam tables. Janet nodded to Basalom and said, “Go ahead and put him on one. Then go out and scrape up what you can from outside.”

  Basalom lifted Lucius easily with his one remaining hand and deposited him on the middle table, then left the room. The Ariel-shaped learning machine went with him. Janet was itching to speak with one of the learning machines, but she supposed there would be time for that later. She was itching to speak with Derec as well, but he was already absorbed in the task of hooking up a variable power supply and a brain activity monitor to Lucius.

  She supposed she could be helping with that, at least. She walked over to stand across the exam table from him and said, “Plus and minus five volts will do for his memories. If you hold it at that, he shouldn’t wake up, and even if he does, he’ll still be immobilized because the body cells take twenty volts before they can move.”

  Derec nodded. The stray hair at his temple waved like a tree limb in a breeze. “Good,” he said. “Any special place I should attach the leads? The few times I’ve worked on these guys, I’ve just stuck stuff anywhere and let the cells sort it out, but I wasn’t sure if that was the best way.”

  Janet couldn’t resist reaching out and brushing his hair down. He looked surprised at first, then smiled when he realized what she was doing.

  “Anywhere is fine,” she said. “When I designed the cells, I gave them enough hard wiring to figure out what to do with all the various types of input they were likely to get.”

  “Good.”

  She watched Derec clip the power supply’s three leads to the ends of three different arms, then turn up the voltage to five. He then took the brain monitor’s headphone-shaped sensor and moved it over the robot’s unconventional body, searching for its positronic brain. The monitor began to beep when he reached the base of one of the arms, and he wedged it in place with one pickup underneath and one on top.

  The monitor flickered with sharp-edged waveforms, hundreds of them joining to fill the screen until it was a jumble of multicolored lines. “Looks like we caught him in time,” Derec said. “There seems to be quite a bit of mental activity.” He reached up and switched in a filter, and the jumble diminished to a manageable half-dozen or so waveforms. They weren’t actual voltage traces, but rather representations of activity in the various levels of the brain, useful for visualizing certain types of thoughts.

  Janet frowned. “Are those supposed to be the Three Laws?”

  “That’s right.”

  The pattern was still recognizable as the one built into every positronic brain at the time of manufacture-but just barely. Each of the laws showed in a separate hue of green, but overlaying them all were two companion waves, a deep violet one that split and rejoined much as the Three Laws did, and a lighter blue one weaving in and out around the laws and linking up with other signals from all over the screen. The effect looked as if the violet and blue waves were purposefully entangling the laws, preventing them from altering their potential beyond carefully delineated levels. Janet suspected that was just what they were doing. Visual analogy didn’t always work in describing a robot’s inner workings, but in this case it looked pretty straightforward.

  “I’d say that explains a lot,” she said.

  Derec flipped to another band, following the two waves as they wove from the Three Laws through the self-awareness section and into the duty queue. “Looks like he’s built a pretty heavy web of rationalization around just about all the pre-defined areas of thought,” he said. “Normal diagnostic procedure would be to wake him up and ask him what all that means, but I don’t think we want to do that just yet. Adam, you know how he thinks; can you make sense of it?”

  The one remaining learning machine stepped over to Derec’s side. Adam? Had he known the significance of that name when he chose it, or had it been given to him? Janet supposed the other one would be Eve, then. And this one, the renegade, was Lucius. Why hadn’t he gone for the obvious and called himself Lucifer? She itched to ask them. She had to talk with them soon.

  In answer to Derec’s question, Adam said, “The violet potential schematic corresponds to the Laws of Humanics. The blue one is the Zeroth Law of Robotics.”

  “Beg your pardon?” Janet asked. “Laws of Humanics? Zeroth Law? What are you talking about?”

  Her learning machine looked over at her and said, “We have attempted to develop a set of laws governing human behavior, laws similar to the ones that govern our own. They are, of course, purely descriptive rather than compulsory, but we felt that understanding them might give us an understanding of human behavior which we otherwise lacked. As for the Zeroth Law, we felt that the Three Laws were insufficient in defining our obligations toward the human race in general, so we attempted to define that obligation ourselves.”

  Janet was careful not to express the joy she felt, for fear of influencing the robot somehow, but inside she was ecstatic. This was perfect! Her experiment was working out after all. Her learning machines had begun to generalize from their experiences. “And what did you come up with?” she asked.

  “Bear in mind that these laws describe potential conditions within a positronic brain, so words are inadequate to describe them perfectly; however, they can be expressed approximately as follows. The First Law of Humanics: All beings will do that which pleases them most. The Second Law of Humanics: A sentient being may not harm a friend, or through inaction allow a friend to come to harm. The Third Law of Humanics: A sentient being will do what a friend asks, but a friend may not ask unreasonable things.” He paused, perhaps giving Janet time to assimilate the new laws’ meanings.

  Not bad. Not bad at all. Like he’d said, they certainly weren’t compulsory as far as most humans went, but Janet doubted she could have done any better. “And what is your Zeroth Law?” she asked.

  “That is much more difficult to state in words, but a close approximation would be that any action should serve the greatest number of humans possible.” Adam nodded toward Lucius. “Lucius has taken the Law a step farther than Eve or I, and we believe it was that step which led him to do what he did to Dr. Avery. He believes that the value of the humans in question should also be considered.”

  Eve. She’d guessed right. “And you don’t?”

  Adam raised his arms with the palms of his hands up. It took Janet a moment to recognize it as a shrug, since she’d never seen a robot use the gesture before. Adam said, “I am…uncomfortable with the subjectivity of the process. I had hoped to find a more definite operating principle.”

  “But Lucius is satisfied with it.”

  “That seems to be the case.”

  “Why do you suppose he is and you aren’t?”

  “Because,” Adam said, again hesitating. “Because he believes himself to be human.”

  If the robot were hoping to shock her with that revelation, he was going to be disappointed. Janet had expected something like this would happen from the start; indeed, in a way it was the whole point of the experiment. She waited patiently for the question she knew was coming.

  Adam didn’t disappoint her. He looked straight into her eyes with his own metallic ones and said, “Each of us has struggled with this question since we awakened, but none of us have been able to answer it to our mutual satisfaction. You created us, though. Please tell us: are we human?”

  Janet used the same palms-up gesture Adam had used. “I don’t know. You tell me.”

  Adam knew the sudden surge of conflicting potentials for what it was: frustration. He had experienced enough of it in his short life to recognize it when it happened. This time the frustration came from believing his search for truth was over and suddenly finding that it wasn’t.

  He felt a brief Second Law urge to answer her question with a simple declarative statement, but he shunted that aside easily. She obviously wanted more than that, and so did he. She wanted to see the reasoning behind his position; he wanted to see if that reasoning would withstand her scrutiny.

  He opened a comlink channel to Eve and explained the situation to her. Together they tried to establish a link with Lucius, but evidently the five volts Derec was supplying him hadn’t been enough to wake him. They would have to do without his input. Adam wasn’t all that disappointed; Lucius’s reasoning had led him to violate the First Law.

  Janet was waiting for Adam’s response. Carefully, consulting with Eve at every turn, he began to outline the logic that had led them to their conclusion that any intelligent organic being had to be considered human. He began with his own awakening on Tau Puppis IV and proceeded through the incident with the Ceremyons, through Lucius’s experiments in creating human beings in Robot City, through the robots’ return to Tau Puppis and their dealings with the Kin, to their final encounter with Aranimas. He explained how each encounter with an alien being reinforced the robots’ belief that body shape made no difference in the essential humanity of the mind inside it, and how those same contacts had even made differences in intelligence and technological advancement seem of questionable importance.

  Throughout his presentation, Adam tried to judge Janet’s reaction to it by her facial expression, but she was giving nothing away. She merely nodded on occasion and said, “I’m with you so far.”

  At last he reached the concept of Vitalism, the belief that organic beings were somehow inherently superior to electromechanical ones, and how the robots could find no proof of its validity. He ended with, “That lack of proof led Lucius to conclude that Vitalism is false, and that robots could therefore be considered human. Neither Eve nor I-nor Mandelbrot, for that matter-were able to convince ourselves of this, and now that Lucius’s belief has led him into injuring a human, we feel even less comfortable with it. We don’t know what to believe.”

  Adam waited for her response. Surely she would answer him now, after he had laid out the logic for her so meticulously.

  His frustration level rose to a new height, however, when she merely smiled an enigmatic smile and said, “I’m sure you’ll figure it out.”

  Derec felt just as frustrated as Adam. He had hoped that finding his mother would knock loose some memories from his amnesic brain, but so far nothing had come of the encounter except a vague sense of familiarity that could be easily attributed to her similarity to Avery.

  She seemed just like him in many ways. He was a competent roboticist, and so was she. Avery never divulged information to anyone if he could help it, and evidently neither did she. Avery was always testing someone, and here she stood, leading poor Adam on when it was obvious she didn’t know the answer to his question either.

  He glanced up at the monitor, checking to see if the signal was any clearer. While Janet and Adam had been talking, he had been trying to trace another unfamiliar potential pattern in Lucius’s brain, this one an indistinct yellow glow surrounding an entire level of activity, but the monitor’s trace circuitry couldn’t isolate the thought it represented. Whatever it was, it fit none of the standard robotic thought patterns.

  He heard Janet say, “I’m sure you’ll figure it out,” and took that as his cue. “Adam, maybe you can help me figure this out. What’s that pattern represent?”

  Adam looked up to the monitor. “I do not recognize it,” he said.

  “Can you copy it and tell me what it does?”

  “I do not wish to contaminate my mind with Lucius’s thought patterns.”

  “Put it in temporary storage, then.”

  Adam looked as if he would protest further, but either the Second Law of Robotics or his belief that Derec would follow the Third Law of Humanics made him obey instead. He fixed his gaze on the monitor for a moment, then looked away, toward the wall.

  Derec wondered what was so interesting all of a sudden about the wall. Adam didn’t seem inclined to clue him in, either; he merely stood there, hands clenching and unclenching.

  Then Derec realized what was behind the wall. Just on the other side was the hospital where Avery was still undergoing surgery.

  “Erase that pattern,” he commanded, and Adam relaxed. “What was it?”

  Adam turned to face Derec and Janet again. “It was a potential like those I have come to associate with emotions,” he said. “However, I have not felt this one before. It was an unspecified negative bias on all thoughts concerning Dr. Avery.”

  Derec glanced over at Janet, saw that she wore an expression of triumph.

  Adam saw it, too. “How can you approve?” he asked. “I have never felt this emotion, but I know what it had to be. Lucius was angry. Considering the degree of bias and the ultimate influence it had upon his actions, I would say he was furious.”

  “What’s one thing a human can do that a robot can’t?” Janet asked in return.

  “You wish me to say, ‘feel emotion,’” said Adam, “but that is incorrect. Every robot experiences a degree of potential bias on various subjects. If you wish to call it emotion, you may, but it is merely the result of experience strengthening certain positronic pathways in the brain at the expense of others.”

  “And everything you know comes from experience, doesn’t it?”

  “Nearly everything, yes.”

  “So?”

  Derec could see where her argument was leading. “A tabula rasa!” he exclaimed. He saw instant comprehension written in Janet’s smile, but Adam remained unmoved. Derec said, “‘Tabula rasa’ means ‘blank slate.’ It’s a metaphor for the way the human mind supposedly starts out before experience begins carving a personality into it. That’s one side of the Nature-versus-Nurture argument for the development of consciousness. Dad told me about that just a couple weeks ago, but he was talking about erasing the city Central on the Kin’s planet, and I didn’t make the connection.” He looked back at his mother. “That’s what you were trying to prove with Adam and Eve and Lucius, wasn’t it? You were trying to prove that the tabula rasa argument is valid.”

 
