Asimov’s Future History Volume 4

by Isaac Asimov


  He said soberly, “Now what is this notion of yours concerning First Law?”

  “Will we be overheard?”

  “No. I’ve taken care.”

  Baley nodded. He said, “Let me quote the First Law.”

  “I scarcely need that.”

  “I know, but let me quote it, anyway: A robot may not harm a human being or, through inaction, allow a human being to come to harm.”

  “Well?”

  “Now when I first landed on Solaria, I was driven to the estate assigned for my use in a ground-car. The ground-car was a specially enclosed job designed to protect me from exposure to open space. As an Earthman–”

  “I know about that,” said Leebig impatiently. “What has this to do with the matter?”

  “The robots who drove the car did not know about it. I asked that the car be opened and was at once obeyed. Second Law. They had to follow orders. I was uncomfortable, of course, and nearly collapsed before the car was enclosed again. Didn’t the robots harm me?”

  “At your order,” snapped Leebig.

  “I’ll quote the Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. So you see, my order should have been ignored.”

  “This is nonsense. The robot lacked knowledge–”

  Baley leaned forward in his chair. “Ah! We have it. Now let’s recite the First Law as it should be stated: A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm.”

  “This is all understood.”

  “I think not by ordinary men. Otherwise, ordinary men would realize robots could commit murder.”

  Leebig was white. “Mad! Lunacy!”

  Baley stared at his finger ends. “A robot may perform an innocent task, I suppose; one that has no damaging effect on a human being?”

  “If ordered to do so,” said Leebig.

  “Yes, of course. If ordered to do so. And a second robot may perform an innocent task, also, I suppose; one that also can have no damaging effect on a human being? If ordered to do so?”

  “Yes.”

  “And what if the two innocent tasks, each completely innocent, amount to murder when added together?”

  “What?” Leebig’s face puckered into a scowl.

  “I want your expert opinion on the matter,” said Baley. “I’ll set you a hypothetical case. Suppose a man says to a robot, ‘Place a small quantity of this liquid into a glass of milk that you will find in such and such a place. The liquid is harmless. I wish only to know its effect on milk. Once I know the effect, the mixture will be poured out. After you have performed this action, forget you have done so.’”

  Leebig, still scowling, said nothing.

  Baley said, “If I had told the robot to add a mysterious liquid to milk and then offer it to a man, First Law would force it to ask, ‘What is the nature of the liquid? Will it harm a man?’ And if it were assured the liquid was harmless, First Law might still make the robot hesitate and refuse to offer the milk. Instead, however, it is told the milk will be poured out. First Law is not involved. Won’t the robot do as it is told?”

  Leebig glared.

  Baley said, “Now a second robot has poured out the milk in the first place and is unaware that the milk has been tampered with. In all innocence, it offers the milk to a man and the man dies.”

  Leebig cried out, “No!”

  “Why not? Both actions are innocent in themselves. Only together are they murder. Do you deny that that sort of thing can happen?”

  “The murderer would be the man who gave the order,” cried Leebig.

  “If you want to be philosophical, yes. The robots would have been the immediate murderers, though, the instruments of murder.”

  “No man would give such orders.”

  “A man would. A man has. It was exactly in this way that the murder attempt on Dr. Gruer must have been carried through. You’ve heard about that, I suppose.”

  “On Solaria,” muttered Leebig, “one hears about everything.”

  “Then you know Gruer was poisoned at his dinner table before the eyes of myself and my partner, Mr. Olivaw of Aurora. Can you suggest any other way in which the poison might have reached him? There was no other human on the estate. As a Solarian, you must appreciate that point.”

  “I’m not a detective. I have no theories.”

  “I’ve presented you with one. I want to know if it is a possible one. I want to know if two robots might not perform two separate actions, each one innocent in itself, the two together resulting in murder. You’re the expert, Dr. Leebig. Is it possible?”

  And Leebig, haunted and harried, said, “Yes,” in a voice so low that Baley scarcely heard him.

  Baley said, “Very well, then. So much for the First Law.”

  Leebig stared at Baley and his drooping eyelid winked once or twice in a slow tic. His hands, which had been clasped, drew apart, though the fingers maintained their clawed shape as though each hand still entwined a phantom hand of air. Palms turned downward and rested on knees and only then did the fingers relax.

  Baley watched it all in abstraction.

  Leebig said, “Theoretically, yes. Theoretically! But don’t dismiss the First Law that easily, Earthman. Robots would have to be ordered very cleverly in order to circumvent the First Law.”

  “Granted,” said Baley. “I am only an Earthman. I know next to nothing about robots and my phrasing of the orders was only by way of example. A Solarian would be much more subtle and do much better. I’m sure of that.”

  Leebig might not have been listening. He said loudly, “If a robot can be manipulated into doing harm to a man, it means only that we must extend the powers of the positronic brain. One might say we ought to make the human better. That is impossible, so we will make the robot more foolproof.

  “We advance continuously. Our robots are more varied, more specialized, more capable, and more unharming than those of a century ago. A century hence, we will have still greater advances. Why have a robot manipulate controls when a positronic brain can be built into the controls itself? That’s specialization, but we can generalize, also. Why not a robot with replaceable and interchangeable limbs? Eh? Why not? If we–”

  Baley interrupted. “Are you the only roboticist on Solaria?”

  “Don’t be a fool.”

  “I only wondered. Dr. Delmarre was the only–uh–fetal engineer, except for an assistant.”

  “Solaria has over twenty roboticists.”

  “Are you the best?”

  “I am,” Leebig said without self-consciousness.

  “Delmarre worked with you.”

  “He did.”

  Baley said, “I understand that he was planning to break the partnership toward the end.”

  “No sign of it. What gave you the idea?”

  “I understand he disapproved of your bachelorhood.”

  “He may have. He was a thorough Solarian. However, it did not affect our business relationships.”

  “To change the subject. In addition to developing new model robots, do you also manufacture and repair existing types?”

  Leebig said, “Manufacture and repair are largely robot conducted. There is a large factory and maintenance shop on my estate.”

  “Do robots require much in the way of repair, by the way?”

  “Very little.”

  “Does that mean that robot repair is an undeveloped science?”

  “Not at all.” Leebig said that stiffly.

  “What about the robot that was at the scene of Dr. Delmarre’s murder?”

  Leebig looked away, and his eyebrows drew together as though a painful thought were being barred entrance to his mind. “It was a complete loss.”

  “Really complete? Could it answer any questions at all?”

  “None at all. It was absolutely useless. Its positronic brain was completely short-circuited. Not one pathway was left intact. Consider! It had witnessed a murder it had been unable to halt–”

  “Why was it unable to halt the murder, by the way?”

  “Who can tell? Dr. Delmarre was experimenting with that robot. I do not know in what mental condition he had left it. He might have ordered it, for instance, to suspend all operations while he checked one particular circuit element. If someone whom neither Dr. Delmarre nor the robot suspected of harm were suddenly to launch a homicidal attack, there might be a perceptible interval before the robot could use First Law potential to overcome Dr. Delmarre’s freezing order. The length of the interval would depend on the nature of the attack and the nature of Dr. Delmarre’s freezing order. I could invent a dozen other ways of explaining why the robot was unable to prevent the murder. Being unable to do so was a First Law violation, however, and that was sufficient to blast every positronic pathway in the robot’s mind.”

  “But if the robot was physically unable to prevent the murder, was it responsible? Does the First Law ask impossibilities?”

  Leebig shrugged. “The First Law, despite your attempts to make little of it, protects humanity with every atom of possible force. It allows no excuses. If the First Law is broken, the robot is ruined.”

  “That is a universal rule, sir?”

  “As universal as robots.”

  Baley said, “Then I’ve learned something.”

  “Then learn something else. Your theory of murder by a series of robotic actions, each innocent in itself, will not help you in the case of Dr. Delmarre’s death.”

  “Why not?”

  “The death was not by poisoning, but by bludgeoning. Something had to hold the bludgeon, and that had to be a human arm. No robot could swing a club and smash a skull.”

  “Suppose,” said Baley, “a robot were to push an innocent button which dropped a booby-trap weight on Delmarre’s head.”

  Leebig smiled sourly. “Earthman, I’ve viewed the scene of the crime. I’ve heard all the news. The murder was a big thing here on Solaria, you know. So I know there was no sign of any machinery at the scene of the crime, or of any fallen weight.”

  Baley said, “Or of any blunt instrument, either.”

  Leebig said scornfully, “You’re a detective. Find it.”

  “Granting that a robot was not responsible for Dr. Delmarre’s death, who was, then?”

  “Everyone knows who was,” shouted Leebig. “His wife! Gladia!”

  Baley thought: At least there’s a unanimity of opinion. Aloud he said, “And who was the mastermind behind the robots who poisoned Gruer?”

  “I suppose...” Leebig trailed off.

  “You don’t think there are two murderers, do you? If Gladia was responsible for one crime, she must be responsible for the second attempt, also.”

  “Yes. You must be right.” His voice gained assurance. “No doubt of it.”

  “No doubt?”

  “Nobody else could get close enough to Dr. Delmarre to kill him. He allowed personal presence no more than I did, except that he made an exception in favor of his wife, and I make no exceptions. The wiser I.” The roboticist laughed harshly.

  “I believe you knew her,” said Baley abruptly.

  “Whom?”

  “Her. We are discussing only one ‘her.’ Gladia!”

  “Who told you I knew her any more than I know anyone else?” demanded Leebig. He put his hand to his throat. His fingers moved slightly and opened the neck seam of his garment for an inch downward, leaving more freedom to breathe.

  “Gladia herself did. You two went for walks.”

  “So? We were neighbors. It is a common thing to do. She seemed a pleasant person.”

  “You approved of her, then?”

  Leebig shrugged. “Talking to her was relaxing.”

  “What did you talk about?”

  “Robotics.” There was a flavor of surprise about the word as though there were wonder that the question could be asked.

  “And she talked robotics too?”

  “She knew nothing about robotics. Ignorant! But she listened. She has some sort of field-force rigmarole she plays with; field coloring, she calls it. I have no patience with that, but I listened.”

  “All this without personal presence?”

  Leebig looked revolted and did not answer.

  Baley tried again, “Were you attracted to her?”

  “What?”

  “Did you find her attractive? Physically?”

  Even Leebig’s bad eyelid lifted and his lips quivered. “Filthy animal,” he muttered.

  “Let me put it this way, then. When did you cease finding Gladia pleasant? You used that word yourself, if you remember.”

  “What do you mean?”

  “You said you found her pleasant. Now you believe she murdered her husband. That isn’t the mark of a pleasant person.”

  “I was mistaken about her.”

  “But you decided you were mistaken before she killed her husband, if she did so. You stopped walking with her some time before the murder. Why?”

  Leebig said, “Is that important?”

  “Everything is important till proven otherwise.”

  “Look, if you want information from me as a roboticist, ask it. I won’t answer personal questions.”

  Baley said, “You were closely associated with both the murdered man and the chief suspect. Don’t you see that personal questions are unavoidable? Why did you stop walking with Gladia?”

  Leebig snapped, “There came a time when I ran out of things to say; when I was too busy; when I found no reason to continue the walks.”

  “When you no longer found her pleasant, in other words.”

  “All right. Put it so.”

  “Why was she no longer pleasant?”

  Leebig shouted, “I have no reason.”

  Baley ignored the other’s excitement. “You are still someone who has known Gladia well. What could her motive be?”

  “Her motive?”

  “No one has suggested any motive for the murder. Surely Gladia wouldn’t commit murder without a motive.”

  “Great Galaxy!” Leebig leaned his head back as though to laugh, but didn’t. “No one told you? Well, perhaps no one knew. I knew, though. She told me. She told me frequently.”

  “Told you what, Dr. Leebig?”

  “Why, that she quarreled with her husband. Quarreled bitterly and frequently. She hated him, Earthman. Didn’t anyone tell you that? Didn’t she tell you?”

  15: A Portrait Is Colored

  BALEY TOOK IT between the eyes and tried not to show it.

  Presumably, living as they did, Solarians considered one another’s private lives to be sacrosanct. Questions concerning marriage and children were in bad taste. He supposed then that chronic quarreling could exist between husband and wife and be a matter into which curiosity was equally forbidden.

  But even when murder had been committed? Would no one commit the social crime of asking the suspect if she quarreled with her husband? Or of mentioning the matter if they happened to know of it?

  Well, Leebig had.

  Baley said, “What did the quarrels concern?”

  “You had better ask her, I think.”

  He better had, thought Baley. He rose stiffly. “Thank you, Dr. Leebig, for your cooperation. I may need your help again later. I hope you will keep yourself available.”

  “Done viewing,” said Leebig, and he and the segment of his room vanished abruptly.

  For the first time Baley found himself not minding a plane flight through open space. Not minding it at all. It was almost as though he were in his own element.

  He wasn’t even thinking of Earth or of Jessie. He had been away from Earth only a matter of weeks, yet it might as well have been years. He had been on Solaria only the better part of three days and yet it seemed forever.

  How fast could a man adapt to nightmare?

  Or was it Gladia? He would be seeing her soon, not viewing her. Was that what gave him confidence and this odd feeling of mixed apprehension and anticipation?

  Would she endure it? he wondered. Or would she slip away after a few moments of seeing, begging off as Quemot had done?

  She stood at the other end of a long room when he entered. She might almost have been an impressionistic representation of herself, she was reduced so to essentials.

  Her lips were faintly red, her eyebrows lightly penciled, her earlobes faintly blue, and, except for that, her face was untouched. She looked pale, a little frightened, and very young.

  Her brown-blond hair was drawn back, and her gray-blue eyes were somehow shy. Her dress was a blue so dark as to be almost black, with a thin white edging curling down each side. She wore long sleeves, white gloves, and flat-heeled shoes. Not an inch of skin showed anywhere but in her face. Even her neck was covered by a kind of unobtrusive ruching.

  Baley stopped where he was. “Is this close enough, Gladia?”

  She was breathing with shallow quickness. She said, “I had forgotten what to expect really. It’s just like viewing, isn’t it? I mean, if you don’t think of it as seeing.”

  Baley said, “It’s all quite normal to me.”

  “Yes, on Earth.” She closed her eyes. “Sometimes I try to imagine it. Just crowds of people everywhere. You walk down a road and there are others walking with you and still others walking in the other direction. Dozens–”

  “Hundreds,” said Baley. “Did you ever view scenes on Earth in a book-film? Or view a novel with an Earth setting?”

  “We don’t have many of those, but I’ve viewed novels set on the other Outer Worlds where seeing goes on all the time. It’s different in a novel. It just seems like a multiview.”

  “Do people ever kiss in novels?”

  She flushed painfully. “I don’t read that kind.”

  “Never?”

  “Well–there are always a few dirty films around, you know, and sometimes, just out of curiosity–It’s sickening, really.”

  “Is it?”

  She said with sudden animation, “But Earth is so different. So many people. When you walk, Elijah, I suppose you even t-touch people. I mean, by accident.”

  Baley half smiled. “You even knock them down by accident.” He thought of the crowds on the Expressways, tugging and shoving, bounding up and down the strips, and for a moment, inevitably, he felt the pang of homesickness.

 
