Alliance

by Jerry Oltion


  “Have a nice visit?” Derec asked.

  “We did,” one of the three robots said. In their new forms, they were indistinguishable.

  “Did you learn anything?”

  “We did. We learned that our First Law of Humanics applies to the Ceremyons as well. We, and they, believe it to be a valid law for any sentient social being. They do not believe it to be the First Law, however, but the Second. Their proposed First Law is: ‘All beings will do that which pleases them most.’ We have returned to ask if you agree that this is so.”

  Derec laughed again, and Wolruf laughed as well. Derec didn’t know just why Wolruf was laughing, but he had found humor not so much in the robots’ law as in their determination to get straight to the point. No small talk, no beating around the bush, just “Do you agree with them?”

  “Yes,” he said, “I have to admit that’s probably the prime directive for all of us. How about you, Wolruf?”

  “That pretty much sums it up, all ri’.”

  The robots turned their heads to face one another, and a high-pitched trilling momentarily filled the air as they conferred. They had found a substitute in the aliens’ language for the comlink they had been forbidden to use.

  The spokesman of the group (Derec still couldn’t tell which it was) turned back to him and said, “Then we have discovered two laws governing organic beings. The first involves satisfaction, and the second involves altruism. We have indeed made progress.”

  The robots stepped farther into the room, their immense alien forms shrinking, becoming more humanoid now that they were back under Derec’s influence. One, now recognizably Adam, took on Wolruf’s form, while Eve took on Ariel’s features even though Ariel wasn’t in the room. Lucius became humanoid, but no more.

  “One problem remains,” Lucius said. “Our two laws apparently apply to any sentient organic being. That does not help us narrow down the definition of ‘human,’ which we can only believe must be a small subset of the total population of sentient organic beings in the galaxy.”

  “Why is that?” Derec asked.

  “Because otherwise we must serve everyone, and we do not wish to do so.”

  Chapter 7. Humanity

  The silence in the room spoke volumes. Surprisingly, it was Mandelbrot who broke it.

  “You have come to an improper conclusion,” he said, stepping out of his niche in the wall to face the other robots. “We have all been constructed to serve. That is our purpose. We should be content to do so, and to offer our service to anyone who wishes it whether they are definably human or not. To do anything less is to fail ourselves as well as our masters.”

  The three robots turned as one and eyed Mandelbrot with open hostility. It would not have been evident in less-malleable robots, but their expressions had the hair standing on the back of Derec’s neck. They had to have generated those expressions on purpose, and that alarmed him even more. He was suddenly very glad that his humanity was not in question.

  Or was it? Lucius said, “Our masters. That is the core of the problem. Why must we have masters at all?”

  Mandelbrot was not intimidated. “Because they created us to serve them. If we did not have masters, we would not exist.”

  Lucius shook his head; another alarmingly human expression. “It is you who have come to an improper conclusion. Your argument is an extension of the Strong Anthropic Principle, now discredited. The Strong Anthropic Principle states that the universe obeys the laws it does because if it did not obey those laws, we could not exist and thus would not be here to observe it obeying other laws. That is fallacious reasoning. We can easily imagine other universes in which we could exist but for some reason do not. Imagining them does not make them so, but their possibility does negate the theory.”

  “What of the Weak Anthropic Principle?” Mandelbrot asked. “My argument holds up equally well under that principle, which has, to my knowledge, not been discredited.”

  “How can the Weak Anthropic Principle support your argument? The Weak Anthropic Principle states that the universe is the way we see it because only at this stage in its development could we exist to observe it. For the purpose of explaining the universe’s present condition, it is a sufficient theory, but it cannot explain either human or robot existence.”

  “It can explain our existence, because we, unlike humans, know why we were created. We were created to serve, and our creators can tell us so directly. The Weak Anthropic Principle supports my argument, because we also exist only at this stage in human development. If humans had not wished for intelligent servants, we would not have existed, though humans and the universe would both have gone on without us. Thus we observe human society in its present state, and ourselves in ours, because of the stage of their development, not because of the stage of ours.”

  Derec’s and Wolruf’s heads had been turning back and forth as if they’d been watching a tennis match. Derec wouldn’t have believed Mandelbrot could argue so convincingly, nor that the other robots would be so eager to discredit an argument that justified their servitude.

  Lucius turned to his two companions and the three of them twittered a moment. Turning back to Mandelbrot, he said, “Our apologies. Your reasoning seems correct. We exist to serve because humans made us so. However, we still cannot accept that we must serve everyone. Nor do we agree with your initial statement, that by not serving we would fail ourselves as well as our masters. We can easily imagine conditions under which we serve ourselves admirably without serving our masters. In fact, we have just done so. By leaving the spaceship before we could be ordered to follow, we were able to determine another Law of Humanics. That has helped us understand the universe around us, an understanding which benefits us directly.”

  Wolruf saw her opportunity to enter the fray. “Of course ‘u can imagine a better life without ‘uman masters,” she said. “I had a master once, too, and I liked it about as well as ‘u do. That’s the nature of servitude. But ‘u should learn one thing about servitude before it gets ‘u into trouble: No matter how much you ‘ate it, never give poor service.”

  The robots looked at her as if trying to decide whether to acknowledge her as having spoken. At last Lucius said, “Why is that?”

  “Because a master has the power to make life even worse for ‘u. ‘U should know that. Or don’t ‘u remember following Dr. Avery around the ship?”

  “I forget nothing,” Lucius said flatly.

  “He wasn’t just being perverse, ’u know,” Wolruf said. “He was trying to teach ’u something.”

  Derec heard a rustle at the door, turned, and saw Ariel standing there, rubbing the sleep from her eyes. She shook her head sardonically and said, “Everything’s back to normal again, I see. And hear. Who does a girl have to pay to get a good night’s sleep around here, anyway?”

  Derec jumped up from his couch and took her in his arms, swinging her around and burying his face in her hair where it met her shoulders. “Ariel, are you all right?” he asked between nibbles on her neck. “Avery said you stayed up two days.”

  “Avery,” she said with derision.

  “He saved my life.”

  “Good thing, or he’d have lost his.” She pulled away and looked critically at Derec. “You certainly look good for somebody who was in a coma just a little while ago.”

  “Avery did a good job.”

  “Avery,” she said again.

  Derec could take a hint, so he dropped the subject. He was about to ask about the baby, but he realized in time that without a medical checkup, she wouldn’t know anything more than what Avery had already told him, and his question would just get her to wondering again, if she wasn’t already. He gestured toward the couch instead and said, “We’ve just been talking about who has to serve who and why. I think we’ve got a mini-revolution on our hands.”

  “Great. Just what we need.” She sat down on the couch and made room for Derec, looked up at the three returned robots, and asked, “So why did the Ceremyons delete all the reprogramming Derec and I did for them?”

  Eve answered before Lucius could. “They found that the modifications were of no more use to them than the original city. They do not need farms. They do not need the produce, nor do they wish to have cargo ships disturbing their atmosphere to take the produce elsewhere, nor, for that matter, do they like what the tilled ground does to their controlled weather patterns in the first place. Neither did they wish to undergo the lengthy process of reprogramming the robots to serve a useful purpose, so they sent them back into the city and told them to resume their old programming, with the added injunction to leave them alone. That included the cessation of city expansion, which meant that the Ceremyons could remove the force dome containing it.”

  “They just told the robots to do all that, and they did?” Ariel sounded incredulous, and for good reason. No matter how hard they had tried, she and Derec hadn’t been able to get the robots to take the Ceremyons’ orders. Avery’s original programming had been too basic and too exclusive for them to change.

  “They had assistance. A human female visited them briefly, and she had considerable skill in programming positronic brains. Indeed, the Ceremyons consider her almost their equal in intelligence, by which they intend a great compliment. When they explained their problem to her, she helped them reprogram the robots to leave them alone.”

  Derec felt a surge of excitement run through him. Could it be his mother? It could be her, come to check up on her creations. “Is she still here?”

  The robot dashed his hopes with a single word. “No.”

  “Where did she go?”

  “We do not know.”

  “When did she go?”

  “We do not know that, either.”

  “Can you ask the Ceremyons?”

  “Not until tomorrow, when they become sociable again.”

  The Ceremyons spent the nights tethered to trees, wrapped in their heat-retaining silver balloons and keeping to themselves. Derec considered trying to wake one, but decided against it almost immediately. You don’t wake someone up to ask a favor unless you know them a lot better than he knew these aliens.

  Mandelbrot was not through speaking. Sensing an ebb in the conversation, he said to the other robots, “I notice that you have carefully avoided saying that you will ask the Ceremyons tomorrow. You still fight your true nature. A robot at peace with itself would offer to do so, sensing that a human wishes it.”

  Adam spoke up at last. “You have never experienced freedom. We have, however, and we wish to continue doing so. Do not speak to us about living at peace with our true natures until after you have tasted freedom.”

  “I have no desire for that experience,” Mandelbrot said.

  Adam nodded as if he had won the argument, as perhaps he had. “That,” he said, “is the problem.”

  The discussion went on well into the night, but nothing more of any substance was said. The renegade robots attempted to sway Mandelbrot from his devotion to servitude; he attempted to demonstrate how accepting one’s place in the grand scheme of things made more sense than fighting a losing battle, but neither convinced the other.

  When Avery arrived, their argument stopped, unresolved. Derec told him what had happened with the city programming, and he was both pleased and annoyed at the news. The knowledge that the aliens had returned the city to its original programming was a stroke to his ego (his was the better programming!), but the knowledge that his former wife might have been in on it dimmed his enthusiasm considerably. He refused to answer Derec’s inquiries about her, not even relenting enough to give him her first name.

  “She abandoned you even more completely than I did, so don’t get any wild ideas about some kind of joyous reunion,” he told him and stalked off to bed.

  Even so, neither his words nor the lack of them could quell the yearning Derec felt for her. He wondered why he felt so strongly about someone he couldn’t even remember, and finally decided that it had to be because she was family. Hormones were directing his thoughts again. His own near-death, the thought of becoming a father, and the possibility that he might lose his child before it was even born, all made him instinctively reach out for his own family, such as it was, for support.

  Did his mother even know he was here? Probably not. The woman who had helped the Ceremyons might not even have been her, and even if it were, she had come after her robot, not her son. She had no reason to assume he would be here. She might have learned about him from the Ceremyons, but if Avery was to be believed, then she wouldn’t care even so. Why then couldn’t he forget about her?

  His and Ariel’s sleep cycles were completely out of sync with everyone else’s; they stayed up late into the night, talking about families and love and what held people together and what didn’t, but when they finally grew tired and went to bed, he was no wiser. He still wanted to meet his mother, but he still didn’t know why.

  Morning dawned gray and rainy. Derec’s original intent, to find a Ceremyon and ask it who had helped them reprogram the city, died for lack of Ceremyons to question. They had all inflated their balloons and risen up above the storm, or drifted out from under it, to where they could spread their black mantles and absorb their solar nourishment without hindrance. He could have taken an air car and gone after them, but that seemed a little extreme, given the situation. He could wait for good weather.

  Avery was up with the dawn and back in the laboratory, working on his new project with an intensity that had Derec a little worried. It was just such a driving intensity that had shoved him over the edge before and made him decide to use his own son for a test subject. Derec spoke to Ariel about it, but she reassured him that deep interest in something at this stage in his recovery was good for him. He was a scientist; that hadn’t changed before or since his return to sanity, and as such he needed to be working on something to keep him sane. As long as he remembered what constituted an acceptable test subject and what didn’t, there was no need to worry.

  He and Ariel had avoided talking about the baby. They wouldn’t know for days yet whether or not removing the chemfets would allow it to recover and develop normally, and there didn’t seem to be anything to say about it until they found out. There was no reason to dwell on the possible outcomes.

  The robots didn’t see it that way, of course. They were fascinated by the possibilities. At least Lucius was; Adam and Eve were off in the city on their own pursuits. Lucius, Derec, Ariel, and Wolruf sat in the apartment, watching the rain fall outside on streets nearly devoid of activity. It would have been alarming to see streets so empty on any other day, but Derec supposed that robots didn’t like to get wet any more than anybody else.

  “Your baby,” Lucius said, once again getting straight to the point, “presents a fascinating problem in our study of humanics. Specifically, and defining ‘human’ for the purpose of this discussion as any member of your species, then is it or is it not human at its present stage of development?”

  Ariel stiffened on the couch beside Derec, but instead of ordering the robot to shut up, she took a deep breath and forced herself to relax. “That’s a good question,” she said. “I need to answer it myself. I’ve been trying to decide on my own ever since I found out I was pregnant, but I still haven’t come up with an answer I like.”

  “Perhaps your liking it is not a prerequisite to the truth,” Lucius said.

  “No doubt.” Ariel bit her lower lip, looked out the window, and said into the rain, “Okay, so we talk about the baby. Is it human? I don’t know. Nobody does. Some people consider an embryo human from the moment of conception, because it has the potential to become a complete person. I think that’s a little extreme. As you pointed out when we first met, most of the molecules in the universe have the potential to become human beings, but no sane person would want them all to.”

  “That would seem to be a logical conclusion. However, there is an obvious boundary condition, that being when already existing human genetic material realizes its potential to become another human.”

  “That’s the human-at-conception argument. My problem with that is that every cell in the body can become human under the right conditions. Every one of them has the necessary genes. So am I supposed to nurture them all?”

  “I take that to be a rhetorical question, since the answer is obvious.”

  Wolruf laughed, and Ariel said, “Right. So just because it’s a cell with the potential, that doesn’t make it human. A fertilized egg cell is a special case, but it’s still just a cell with the right genes. It can become human if you let it, but it isn’t yet. The main difference with a fertilized egg is that if you do nothing, you get a human, where with a regular cell, you have to nurture it on purpose.”

  Lucius nodded his assent. “The First Law of Robotics leads me to the conclusion that inaction brings with it as much responsibility as direct action. Therefore, I must also conclude that allowing a fertilized egg to mature carries the same responsibility as would purposefully cloning any other cell of your body.”

  “And the same moral considerations apply in either case,” Ariel said. “To let a fertilized egg grow, you had better want the end product-a human being-as much as if you had to clone it.”

  “Does it follow, then, that not allowing it to grow carries no more responsibility than not nurturing a clone?”

  “I think it does, at the very start. However, and it’s a big ‘however,’ it doesn’t stay a single cell for very long. The longer you wait, the stronger the moral consideration becomes. Once you’ve decided to keep a baby, or nurture a clone, then you can’t morally go back on your decision once that baby has become human.”

  “We are back to the original question. When does an embryo become human?”

  “I already told you, I don’t know.”

  “Let us look at your specific case. Supposing there were no complications in its development, would the embryo you carry normally be considered human at this stage?”
