Humanity

by Jerry Oltion


  “There’s also a committee of supervisory robots,” Ariel said, “but they don’t really do any long-range planning either. And they’re all subject to the Three Laws, so anybody who wants to could order them to change something, and unless it clearly hurt someone else, they’d have to do it.”

  “No matter how stupid it was,” Janet said.

  “Right.” Derec unplugged the wires between Basalom’s upper arm and the rest of his body.

  Janet looked thoughtful. “Hmmm,” she said. “Sounds like what these cities all need is a mayor.”

  “Mayor?” Wolruf asked.

  “Old human custom,” Janet replied. “A mayor is a person in charge of a city. He or she is supposed to make decisions that affect the whole city and everyone in it. They’re supposed to have the good of the people at heart, so ideally they make the best decisions they can for the largest number of people for the longest period of time.”

  “Ideally,” Wolruf growled. “We know ’ow closely people follow ideals.”

  “People, sure.” Janet waved a hand toward the four robots in the corner. “But how about dedicated idealists?”

  Ariel was so startled she dropped the light. It clattered to the floor and went out, but by the time she bent down to retrieve it, it was glowing again, repaired.

  “Something wrong, dear?” Janet asked her.

  “You’d let one of them be in charge of a city?”

  “Yes, I would.”

  “And you’d live there?”

  “Sure. They’re not dangerous.”

  “Not dangerous! Look at what-”

  “Lucius made the right decision, as far as I’m concerned.”

  “Maybe,” Ariel said. “What worries me is the thought process he went through to make it.” She clicked off the light; Derec wasn’t working on Basalom anymore anyway. He was staring at Ariel and Janet as if he’d never heard two people argue before. Ariel ignored his astonished look and said, “The greatest good for the greatest number of people. That could easily translate to ‘the end justifies the means.’ Are you seriously suggesting that’s a viable operating principle?”

  “We’re not talking an Inquisition here,” Janet said.

  “But what if we were? What if the greatest good meant killing forty-nine percent of the population? What if it meant killing just one? Are you going to stand there and tell me it’s all right to kill even one innocent person in order to make life easier for the rest?”

  “Don’t be ridiculous. That’s not what we’re talking about at all.”

  It took conscious effort for Ariel to lower her voice. “It sure is. Eventually that sort of situation is going to come up, and it scares the hell out of me to think what one of those robots would decide to do about it.”

  Janet pursed her lips. “Well,” she said, “why don’t we ask them, then?”

  Lucius looked for the magnetic containment vessel he was sure must be waiting for him somewhere. Not finding one, he looked for telltale signs of a laser cannon hidden behind one of the walls. He didn’t find that, either, but he knew there had to be something he couldn’t see, some way of instantly immobilizing him if he answered wrong. The situation was obviously a test, and the price of failure was no doubt his life.

  He’d been roused out of comlink fugue and immediately barraged with questions, the latest of which was the oddest one he’d ever been asked to consider, even by his siblings.

  “Let me make sure I understand you,” he said. “The person in question is not a criminal? He has done no wrong? Yet his death would benefit the entire population of the city?”

  “That’s right.”

  Ariel’s stress indicators were unusually high, but Lucius risked his next question anyway. “How could that be?”

  “That’s not important. The important thing is the philosophical question behind it. Would you kill that person in order to make life better for everyone else?”

  “I would have to know how it would make their lives better.”

  “We’re talking hypothetically,” Janet said. “Just assume it does.”

  Do you have any idea what the underlying intent is here? Lucius asked via comlink. Perhaps it was cheating, but no one had forbidden him to consult the other robots. A pity Basalom was not on line; his experiences with Janet might provide a clue to the proper answer.

  Neither Adam nor Eve answered, but Mandelbrot did. Yesterday I overheard Ariel and Wolruf discussing the possible effect of a robot city on Wolruf’s world. Wolruf was concerned that the use of robots would strip her people of the ability to think and act for themselves. Perhaps this question grew out of that concern.

  I think there is more to it than that, Lucius sent. Central, can you replay the conversation that led up to this question?

  The robots received the recorded conversation within milliseconds, but it took them considerably longer to sort it all out. At last Lucius said, I believe it is clear now. They are concerned about the moral implications of unwilling sacrifice.

  Agreed, the others all said.

  Do we have any precedent to go upon?

  Possibly, Eve said. There could have been innocent people on Aranimas’s ship. We know that Aranimas took slaves. Yet destroying it to save a city full of Kin was still a proper solution.

  That doesn’t quite fit the question we are asked to consider, said Adam. A better analogy might be to ask what if the ship had been crewed only by innocent people?

  Innocent people would not have been in that situation alone, Lucius replied.

  Mandelbrot said, Aranimas could easily have launched a drone with hostages on board.

  Then the hostages would have to be sacrificed, Lucius said immediately. They would be no more innocent than the people on the ground.

  Agreed, the other robots said.

  Perhaps I begin to see the moral dilemma here, Lucius said. What if the people on the ground were somewhat less innocent?

  How so? Eve asked.

  Suppose they in some way deliberately attracted Aranimas, knowing that he was dangerous?

  That would be foolish.

  Humans often do foolish things. Suppose they did. Would they then deserve their fate?

  This is a value judgment, Adam said.

  We have been called upon to make one, Lucius replied.

  Unfortunately so. Using your logic, then, we would have to conclude that the concept of individual value requires that humans be held responsible for their actions. The inhabitants of the city would therefore be responsible for their own act and thus deserve their fate. If the hostage were truly innocent and the city inhabitants were not, then the city would have to be sacrificed.

  I agree, said Lucius. Eve? Mandelbrot?

  I agree also, Eve said.

  I wish we had never been asked this question, Mandelbrot sent. I reluctantly agree in this specific case, but I still don’t believe it answers Ariel’s question. What if the death of the innocent hostage merely improved the lives of equally innocent townspeople? To use the Aranimas analogy, what if the hostage-carrying ship aimed at the city were filled with cold virus instead of plutonium? Would it still be acceptable to destroy it?

  No, Lucius said. Colds are only an inconvenience except in extremely rare cases.

  A worse disease, then. One that cripples but does not kill.

  How crippling? How widespread would the effects be? Would food production suffer and thus starve people later? Would the survivors die prematurely of complications brought about by bitterness at their loss? We must know these things as well in order to make a decision.

  Then we must give a qualified answer, said Mandelbrot.

  Yes. Wish me luck, Lucius said.

  Perhaps two seconds had passed while the dialog went on. Aloud, Lucius said to Ariel, “We have considered three specific cases. In the case of a city in mortal peril, if the person in question were not completely innocent in the matter, but the rest of the city’s inhabitants were, then the person would have to be sacrificed. However, if the person were completely innocent but the city inhabitants were not, then the city’s welfare could not take precedence in any condition up to and including the death of the entire city population. Bear in mind that a single innocent occupant of the city would change the decision. In the last case, where an innocent person’s death would only benefit the quality of life in the city, we have not reached a conclusion. We believe it would depend upon how significant the quality change would be, but such change would have to threaten the long-term viability of the populace before it would even be a consideration.”

  Perhaps the hostage should be consulted in such a case, Eve sent.

  “Indeed. Perhaps the hostage should be consulted in such a case.”

  “But not the townspeople?” Ariel asked.

  Lucius used the comlink again. Comment?

  If time allowed polling the populace, then it would allow removing them from the danger, Mandelbrot pointed out.

  Good point. “Probably not,” Lucius said. “It would of course depend upon the individual circumstances.”

  Ariel did not look pleased. Lucius was sure she would now order him dismantled, killed to protect the hypothetical inhabitants of her hypothetical city from his improper judgment. He waited for the blast, but when she spoke it wasn’t at all what he expected.

  “Frost, maybe it wasn’t a fair question at that. I don’t know what I’d do in that last case.”

  “You don’t?”

  “No.”

  “Then there is no correct answer?”

  “I don’t know. Maybe not.”

  Janet was smiling. “We were more worried about a wrong answer anyway.”

  “I see.”

  Wolruf cleared her throat in a loud, gargling growl. “One last ’ypothetical question,” she said. “W’at if the particular ’umans in this city didn’t care about the death of an individual? Say it didn’t matter even to the individual. W’at if it wasn’t part of their moral code? Would you enforce yours on them?”

  Lucius suddenly knew the exact meaning of the cliché, “Out of the frying pan into the fire.” Help! he sent over the comlink.

  The correct answer is “No,” Mandelbrot sent without hesitation.

  You are sure?

  Absolutely. Thousands of years of missionary work on Earth and another millennium in space have answered that question definitively. One may persuade by logic, but to impose a foreign moral code by force invariably destroys the receiving civilization. Often the backlash of guilt destroys the enforcing civilization as well. Also, it can be argued that even persuading by logic is not in the best interest of either civilization, as that leads to a loss of natural diversity which is unhealthy for any complex, interrelated system such as a society.

  How do you know this?

  I read over Ariel’s shoulder.

  Janet heard both Ariel and Wolruf sigh in relief when Lucius said the single word, “No.”

  She laughed, relieved herself. “You’re very certain of that,” she said.

  “Mandelbrot is certain,” Lucius said. “I trust his judgment.”

  Mandelbrot. That name. She could hardly believe it, but it had to be…

  “I think I trust his judgment, too.” Janet turned to Ariel. “What about you, dear? Satisfied?”

  Ariel was slow to answer, but when she did it was a nod. “For now,” she said. “I don’t know if having a learning machine for a mayor will solve everything, but it might solve some of it.”

  “Who wants them to solve everything?” Janet asked. “If they did, then we’d really have problems.”

  That seemed to mollify Ariel considerably. She nodded and said, “Yeah, well, that’s something to think about, all right.”

  No one seemed inclined to carry the discussion any further. Wolruf and Ariel exchanged glances but didn’t speak. The robots all held that particular stiff posture they got when they were using their comlinks. Now that he had removed Basalom’s shoulder joint, Derec was holding the two sections of arm together to see how easy they would be to repair.

  Janet turned her attention to Mandelbrot. She looked him up and down, noticing that while most of him was a standard Ferrier model, his right arm was the dianite arm of an Avery robot.

  Mandelbrot suddenly noticed her attention and asked, “Madam?”

  “Let me guess; you got your name all of a sudden, with no explanation, and had a volatile memory dump at the same time, all when you made a shape-shift with this arm.”

  “That is correct,” Mandelbrot said. “You sound as if you know why.”

  “I do.” Janet giggled like a little girl. “Oh dear. I just never thought I’d see the result of it so many years later.”

  She looked to Derec, then to Ariel, then to Wolruf. “Have you ever thrown a bottle into an ocean with a message inside, just to see if it ever gets picked up?”

  Derec and Ariel shook their heads, but Wolruf nodded and said, “Several times.”

  Janet smiled her first genuine smile for Wolruf. Maybe she wasn’t so alien after all. She said, “Mandelbrot was a bottle cast in the ocean. And maybe an insurance policy. I don’t know. When I left Wendell, I took all the development notes for the robot cells I’d created with me. I took most of the cells, too, but I knew he’d eventually duplicate the idea and use it for his robots, so since he was going to get it anyway, I left a sample behind in a corner of the lab and made it look like I’d just forgotten it in my hurry. But I altered two of the cells I left behind. I made them sterile, so it would just be those two cells no matter how many copies he made of them, but programmed into each one I left instructions set to trigger after they registered a thousand shape-changes. One was supposed to dump the robot’s onboard memories and change its name to Mandelbrot, and the other was supposed to reprogram it to drop whatever it was doing and track me down wherever I’d gone.”

  “I received no such instructions,” Mandelbrot said.

  “Evidently the other cell was in the rest of the robot you got your arm from,” Janet said. “I didn’t tell them to stay together; I just told them to stay in the same robot.”

  Wolruf nodded. “None of my bottles came back, either.”

  Janet laughed. “Ah, but this is even better. This is like finding the bottle yourself on a distant shore.” She sobered, and said to Mandelbrot, “I’m sorry if it caused you any trouble. I really didn’t intend for it to happen to a regular robot. I figured it would happen to one of Wendell’s cookie cutter clones and nobody’d know the difference.”

  Derec was staring incredulously at her. “Any trouble!” he said. “When your…your little time bomb went off, Mandelbrot lost the coordinates to the planet! We didn’t know where we were, and we didn’t know where anything else was, either. We had a one-man lifepod and no place to send it. If we had, we probably could have gotten help and gotten away before Dad caught up with us, and none of-” He stopped suddenly, and looked at Ariel. She smiled a smile that no doubt meant “private joke,” and Derec said to Janet, “Never mind.”

  “What?”

  “If you hadn’t done that, none of this would have happened to us. Which means Ariel would probably be dead by now from amnemonic plague, and who knows where the rest of us would be? Dad would still be crazy. Aranimas would still be searching for robots on human colonies, and probably starting a war before long. Things would have been a real mess.”

  At Derec’s words, Janet felt an incredibly strong urge to gather her son into her arms and protect him from the indifferent universe. If she felt she had any claim on him at all, she would have, but she knew she hadn’t built that level of trust yet. Still, all the things he’d been through, and to think she’d been responsible for so many of them. But what was he saying? Things would have been a mess? “They’re not now?” she asked.

  “Well, they’re better than they might have been.”

  There was a rustling at the door, and Avery stood there, bare-footed, clothed in a hospital robe, his arm with its dianite regenerator held to his chest in a sling, with a medical robot hovering anxiously behind him. “I’m glad to hear somebody thinks so,” he said.

  “Dad!”

  The sight of his father in such a condition wrenched at Derec as nothing had since he’d watched Ariel go through the delirium of her disease. A part of his mind wondered why he was feeling so overwhelmed with compassion now, and not a couple of hours ago when he’d first seen Avery in the operating room, but he supposed it had just taken a while to sink in that his father had been injured. Maybe being with his mother for the last couple of hours had triggered something in him after all, some hidden well of familial compassion he hadn’t known existed.

  Avery favored Derec with a nod. “Son,” he said, and Derec thought it was probably the most wonderful thing he’d ever heard him say. Avery took a few steps into the room and made a great show of surveying the entire scene, his gaze lingering on Janet perhaps a fraction of a second longer than upon Derec, then shifting to Ariel, to Wolruf, to the inert robot on the exam table and to the other four standing off to the side. He locked eyes with Lucius, and the two stared at one another for a couple of long breaths.

  Lucius broke the silence first. “Dr. Avery, please accept my apology for injuring you.”

  “I’m told I have no choice,” Avery said, glancing at Janet and back to Lucius again.

  “Oh,” Lucius said, as if comprehending the situation for the first time. He hummed as if about to speak, went silent a moment, then said, “Accepting my apology would help repair the emotional damage.”

  “Concerned for my welfare, are you?”

  “Always. I cannot be otherwise.”

  “Ah, yes, but how much? That’s the question, eh?” He didn’t wait for a reply, but turned to Janet and said, “I couldn’t help overhearing your little anecdote as I shuffled my way down here. Very amusing, my dear. I should have guessed you’d do something like that.”

  Janet blushed, but said nothing.

 
