Asimov’s Future History Volume 12


by Isaac Asimov


  Now Davlo looked on Kaelor, and could not take his eyes off him. “Yesterday, he grabbed me and stuffed me under a bench and used his body to shield mine. He risked his life for mine. He’d remind me himself that the Three Laws compelled him to do it, but that doesn’t matter. He risked his life for mine. And now we’re simply going to risk his life.” He paused a moment, and then said it in plainer words. “We’re probably about to kill him,” he said in a flat, angry voice. “Kill him because he wants to protect us – all of us – from me.”

  Fredda glanced at Davlo, and then looked back at Kaelor. “I think you’d better let me do the talking,” she said.

  For a moment she thought he was about to protest, insist that a man ought to be willing to do this sort of job for himself. But instead he shrugged, and let out a small sigh. “You’re the roboticist,” he said, still staring straight at Kaelor’s dead eyes. “You know robopsychology.”

  And there are times I wished I knew more human psychology, Fredda thought, giving Davlo Lentrall a sidelong glance. “Before we begin,” she said, “there’s something you need to understand. I know that you ordered Kaelor built to your own specifications. You wanted a Constricted First Law robot, right?”

  “Right,” said Lentrall, clearly not paying a great deal of attention.

  “Well, you didn’t get one,” Fredda said. “At least not in the sense you might think. And that’s what set up the trap you’re in now. Kaelor was designed to be able to distinguish hypothetical danger or theoretical danger from the real thing. Though most high-function robots built on Inferno are capable of distinguishing between real and hypothetical danger to humans, they in effect choose not to do so. In a sense, they let their imaginations run away with them, worry that the hypothetical might become real, and fret over what would happen in such a case, and treat it as if it were real, just to be on the safe side of the First Law. Kaelor was, in effect, built without much imagination – or what passes for imagination in a robot. He is not capable of making that leap, of asking, ‘What if the hypothetical became real?’”

  “I understand all that,” Davlo said irritably.

  “But I don’t think you understand the next part,” Fredda said with more coolness than she felt. “With a robot like Kaelor, when the hypothetical, the imaginary, suddenly does become real, when it dawns on such a robot that it has been working on a project that is real, that poses real risks to real people – well, the impact is enormous. I would compare it to the feeling you might have if you suddenly discovered, long after the fact, that, unbeknownst to yourself, some minor, even trifling thing you had done turned out to cause the death of a close relative. Imagine how hard that would hit you, and you’ll have some understanding of how things felt to Kaelor.”

  Davlo frowned and nodded. “I see your point,” he said. “And I suppose that would induce a heightened First Law imperative?”

  “Exactly,” Fredda said. “My guess is that, by the time you switched him off, Kaelor’s mental state was approaching a state of First Law hypersensitivity, rendering him excessively alert to any possible danger to humans. Suddenly realizing that he had unwittingly violated First Law already would only make it worse. Once we switch Kaelor back on, he’s going to revert to that state instantly.”

  “You’re saying he’s going to be paranoid,” Davlo said.

  “It won’t be that extreme,” said Fredda. “He’ll be very careful. And so should we be. Just because his body is immobilized, it doesn’t mean that he won’t be capable of committing – of doing something rash.”

  Davlo nodded grimly. “I figured that much,” he said.

  “Are you ready, then?”

  He did not answer at first. He managed to tear his eyes away from Kaelor. He paced back and forth a time or two, rubbed the back of his neck in an agitated manner, and then stopped, quite abruptly. “Yes,” he said at last, his eyes locked on the most distant corner of the room.

  “Very well,” she said. Fredda pulled an audio recorder out of her tool pouch, switched it on, and set it on the floor in front of Kaelor. If they got what they needed, she wanted to be sure they had a record of it.

  She stepped around to the rear of the maintenance frame, opened the access panel, and switched Kaelor back on. She moved back around to the front of the maintenance frame, and positioned herself about a meter and a half in front of it.

  Kaelor’s eyes glowed dimly for a moment before they flared to full life. His head swiveled back and forth, as he looked around himself. He looked down at his arms and legs, as if confirming what he no doubt knew already – that his body had been immobilized. Then he looked around the room, and spotted Lentrall. “It would appear that you figured it out,” Kaelor said. “I was hoping for all our sakes that you would not.”

  “I’m sorry, Kaelor, but I –”

  “Dr. Lentrall, please. Let me handle this,” said Fredda, deliberately speaking in a cold, sharp-edged, professional tone. This had to be impersonal, detached, dispassionate if it was going to work. She turned to Kaelor, up there on the frame. No, call the thing by its proper name, even if she had just now realized what that name was. The rack. The torturer’s rack. He hung there, paralyzed, strapped down, pinned down like an insect in a collector’s sample box, his voice and his expressionless face seeming solemn, even a little sad. There was no sign of fear. It would seem Kaelor had either too little imagination, or too much courage, for that.

  Suddenly she felt a little sick, but she forced herself to keep all hint of that out of her voice and expression. She told herself she was imposing human attributes on Kaelor, investing him with characteristics and emotions he simply did not have. There was no practical difference between having him up on that rack and having a malfunctioning aircar up on a hydraulic lift in a repair shop. She told herself all of that, and more, but she did not believe a word of it. She forced herself to look steadily, coolly, at Kaelor, and she addressed him. “Kaelor, do you know who I am?”

  “Yes, of course. You are Dr. Fredda Leving, the roboticist.”

  “Quite right. Now then, I am going to give you an order. You are to answer all my questions, and answer them as briefly as possible. Do not provide any information I do not ask for, or volunteer any information. Regard each question by itself. The questions will not be related to each other. Do you understand?” she asked.

  “Certainly,” said Kaelor.

  “Good.” Fredda was hoping, without much confidence, that she would be able to ask her questions in small enough pieces that no one question would present a First Law violation. And of course the questions would be related – that part was a baldfaced lie. But it might be a convincing enough lie to help Kaelor live through this. She knew for certain that asking, straight-out, the one big question to which they needed an answer would be absolutely catastrophic. She dared not ask for the big picture. She could only hope Kaelor would be willing and able to provide enough tiny pieces of the puzzle.

  The trouble was, Kaelor had to know what she was doing as well as she did. How far would he be able to go before First Law imperative overrode the Second Law compulsion to obey orders?

  There was one last thing she could do to help Kaelor. Fredda did not have any realistic hope that the Third Law’s requirement for self-preservation would help sustain Kaelor, but she could do her best to reinforce it all the same. “It is also vital for you to remember that you are important as well. Dr. Lentrall needs you, and he very much wants you to continue in his employ. Isn’t that so, Doctor?”

  Lentrall looked up from the hole he was staring at in the floor, and glanced at Fredda before settling his gaze on Kaelor. “Absolutely,” he said. “I need you very much, Kaelor.”

  “Thank you for saying so,” Kaelor said. He turned his gaze back on Fredda. “I am ready for your questions,” he said.

  “Good,” said Fredda. It might well help Kaelor if she kept the questions as disordered as possible, and tossed in a few unrelated ones now and then. “You work for Dr. Lentrall, don’t you?” she asked.

  “Yes,” said Kaelor.

  “How long have you been in his employ?”

  “One standard year and forty-two days.”

  “What are the specifications for your on-board memory system?”

  “A capacity of one hundred standard years non-erasable total recall for all I have seen and heard and learned.”

  “Do you enjoy your work?”

  “No,” said Kaelor. “Not for the most part.”

  An unusual answer for a robot. Generally a robot, when given the chance, would wax lyrical over the joys of whatever task it was performing.

  “Why do you not enjoy your work?” Fredda asked.

  “Dr. Lentrall is often abrupt and rude. He will often ask for my opinion and then reject it. Furthermore, much of my work in recent days has involved simulations of events that would endanger humans.”

  Uh-oh, thought Fredda. Clearly it was a mistake to ask that follow-up question. She would have to reinforce his knowledge of the lack of danger, and then change the subject, fast, before he could pursue that line of thought. Thank Space she had turned down his pseudo-clock-rate. “Simulations involve no actual danger to humans,” she said. “They are imaginary, and have no relation to actual events. Why did you grab Dr. Lentrall and force him under a bench yesterday?”

  “I received a hyperwave message that he was in danger. First Law required me to protect him, so I did.”

  “And you did it well,” Fredda said. She was trying to establish the point that his First Law imperatives were working well. In a real-life, nonsimulated situation, he had done the proper thing. “What is the status of your various systems, offered in summary form?”

  “My positronic brain is functioning within nominal parameters, though near the acceptable limit for First Law-Second Law conflict. All visual and audio sensors and communications systems are functioning at specification. All processing and memory systems are functioning at specification. A Leving Labs model 2312 Robotic Test Meter is jacked into me and running constant baseline diagnostics. All motion and sensation below my neck, along with all hyperwave communication, have been cut off by the test meter, and I am incapable of motion or action other than speech, sight, thought, and motion of my head.”

  “Other than the functions currently deactivated by the test meter, deliberate deactivations, and normal maintenance checks, have you always operated at specification?”

  “Yes,” said Kaelor. “I remember everything.”

  Fredda held back from the impulse to curse out loud, and forced herself to keep her professional demeanor. He had violated her order not to volunteer information, and had volunteered it in regard to the one area they cared about. Only a First Law imperative could have caused him to do such a thing. He knew exactly what they were after, and he was telling them, as best he could under the restrictions she had placed on him, that he had it.

  Which meant he was not going to let them have it. They had lost. Fredda decided to abandon her super-cautious approach, and move more quickly toward what they needed.

  “Do you remember the various simulations Dr. Lentrall performed, and the data upon which they were based?”

  “Yes,” Kaelor said again. “I remember everything.”

  A whole series of questions she dared not ask flickered through her mind, along with the answers she dared not hear from Kaelor. Like a chess player who could see checkmate eight moves ahead, she knew how the questions and answers would go, almost word for word.

  Q: If you remember everything, you recall all the figures and information you saw in connection with your work with Dr. Lentrall. Why didn’t you act to replace as many of the lost datapoints as possible last night when Dr. Lentrall discovered his files were gone? Great harm would be done to his work and career if all those data were lost for all time.

  A: Because doing so would remind Dr. Lentrall that I witnessed all his simulations of the Comet Grieg operation and that I therefore remembered the comet’s positional data. I could not provide that information, as it would make the comet intercept and retargeting possible, endangering many humans. That outweighed the possible harm to one man’s career.

  Q: But the comet impact would enhance the planetary environment, benefiting many more humans in the future, and allowing them to live longer and better lives. Why did you not act to do good to those future generations?

  A: I did not act for two reasons. First, I was specifically designed with a reduced capacity for judging the Three-Law consequences of hypothetical circumstances. I am incapable of considering the future and hypothetical well-being of human beings decades or centuries from now, most of whom do not yet exist. Second, the second clause of the First Law merely requires me to prevent injury to humans. It does not require me to perform any acts in order to benefit humans, though I can perform such acts if I choose. I am merely compelled to prevent harm to humans. Action compelled by First Law supersedes any impulse toward voluntary action.

  Q: But many humans now alive are likely to die young, and die most unpleasantly, if we do not repair the climate. By preventing the comet impact, there is a high probability you are condemning those very real people to premature death. Where is the comet? I order you to tell me its coordinates, mass, and trajectory.

  A: I cannot tell you. I must tell you. I cannot tell you –

  And so on, unto death.

  It would have gone on that way, if it had lasted even that long. Either the massive conflict between First and Second Law compulsions would have burned out his brain, or else Kaelor would have invoked the second clause of First Law. He could not, through inaction, allow harm to humans.

  Merely by staying alive, with the unerasable information of where the comet was in his head, he represented a danger to humans. As long as he stayed alive, there was, in theory, a way to get past the confidentiality features of Kaelor’s brain assembly. There was no way Fredda could do it here, now, but in her own lab, with all her equipment, and with perhaps a week’s time, she could probably defeat the safeties and tap into everything he knew.

  And Kaelor knew that, or at least he had to assume it was the case. In order to prevent harm to humans, Kaelor would have to will his own brain to disorganize, disassociate, lose its positronic pathing.

  He would have to will himself to die.

  That line of questioning would kill him, either through Law-Conflict burnout or compelled suicide. He was still perilously close to both deaths as it was. Maybe it was time to take some of the pressure off. She could reduce at least some of the stress produced by Second Law. “I release you from the prohibition against volunteering information and opinions. You may say whatever you wish.”

  “I spent all of last night using my hyperwave link to tie into the data network and rebuild as many of Dr. Lentrall’s work files as possible, using my memories of various operations and interfaces with the computers to restore as much as I could while remaining in accordance with the Three Laws. I would estimate that I was able to restore approximately sixty percent of the results-level data, and perhaps twenty percent of the raw data.”

  “Thank you,” said Lentrall. “That was most generous of you.”

  “It was my duty, Dr. Lentrall. First Law prevented me from abstaining from an action that could prevent harm to a human.”

  “Whether or not you had to do it, you did it,” said Lentrall. “Thank you.”

  There was a moment’s silence, and Kaelor looked from Lentrall to Fredda and back again. “There is no need for these games,” he said. “I know what you want, and you know thhhat I I I knowww.”

  Lentrall and Fredda exchanged a look, and it was plain Lentrall knew as well as she did that it was First Law conflict making it hard for Kaelor to speak.

  Kaelor faced a moral conundrum few humans could have dealt with well. How to decide between probable harm and death to an unknown number of persons, and the misery and the lives ruined by the ruined planetary climate. And it is my husband who must decide, Fredda told herself, the realization a sharp stab of pain. If we succeed here, I am presenting him with that nightmare choice. She thrust that line of thought to one side. She had to concentrate on Kaelor, and the precious knowledge hidden inside him. Fredda could see hope sliding away as the conflicts piled up inside the tortured robot’s mind. “We know,” she said at last, admitting defeat. “And we understand. We know that you cannot tell us, and we will not ask.” It was pointless to go further. It was inconceivable that Kaelor would be willing or able to tell them, or that he would survive long enough to do so, even if he tried.

  Lentrall looked at Fredda in surprise, and then relief. “Yes,” he said. “We will not ask. We see now that it would be futile to do so. I thought Dr. Leving might have some trick, some technique, some way of learning the truth without destroying you, but I see that I was wrong. We will not ask this of you, and we will not seek to gain the knowledge from you in other ways. This is our promise.”

  “I join in this promise,” Fredda said.

  “Hu-hu-humansss lie,” Kaelor said.

  “We are not lying,” Fredda said, her voice as urgent as she could make it. “There would be nothing we could gain by asking you, and thus no motive for lying.”

  “Yourrrr promisse does – does – does not apply to other humans.”

  “We will keep the fact of what you know secret,” Lentrall said, a note of hysteria in his voice. “Kaelor, please! Don’t!”

  “I tried tooo kee – keep the fact of wwwhat I knewww secret,” said Kaelor, “but yoooou realized that I had seeen what I saw, and that I woullld remember.” He paused a moment, as if to gather the strength to speak again. “Othhers could do the same,” he said in a voice that was suddenly little more than a whisper. “I cannot take thhat channnce.”

 