Asimov, Isaac - The Naked Sun
"The truth."
"It's double-talk to me. How else can you put it?"
"It means a certain weakening of the First Law."
"Why so? A child is disciplined for its own future good. Isn't that the theory?"
"Ah, the future good!" Leebig's eyes glowed with passion and he seemed to grow less conscious of his listener and correspondingly more talkative. "A simple concept, you think. How many human beings are willing to accept a trifling inconvenience for the sake of a large future good? How long does it take to train a child that what tastes good now means a stomach-ache later, and what tastes bad now will correct the stomach-ache later? Yet you want a robot to be able to understand?
"Pain inflicted by a robot on a child sets up a powerful disruptive
potential in the positronic brain. To counteract that by an antipotential triggered through a realization of future good requires enough paths and bypaths to increase the mass of the positronic brain by 50 per cent, unless other circuits are sacrificed."
Baley said, "Then you haven't succeeded in building such a robot."
"No, nor am I likely to succeed. Nor anyone."
"Was Dr. Delmarre testing an experimental model of such a robot at the time of his death?"
"Not of such a robot. We were interested in other more practical things also."
Baley said quietly, "Dr. Leebig, I am going to have to learn a bit more about robotics and I am going to ask you to teach me."
Leebig shook his head violently, and his drooping eyelid dipped further in a ghastly travesty of a wink. "It should be obvious that a course in robotics takes more than a moment. I lack the time."
"Nevertheless, you must teach me. The smell of robots is the one thing that pervades everything on Solaria. If it is time we require, then more than ever I must see you. I am an Earthman and I cannot work or think comfortably while viewing."
It would not have seemed possible to Baley for Leebig to stiffen his stiff carriage further, but he did. He said, "Your phobias as an Earthman don't concern me. Seeing is impossible."
"I think you will change your mind when I tell you what I chiefly want to consult you about."
"It will make no difference. Nothing can."
"No? Then listen to this. It is my belief that throughout the history of the positronic robot, the First Law of Robotics has been deliberately misquoted."
Leebig moved spasmodically. "Misquoted? Fool! Madman! Why?"
"To hide the fact," said Baley with complete composure, "that robots can commit murder."
14
A Motive Is Revealed
LEEBIG'S MOUTH widened slowly. Baley took it for a snarl at first and then, with considerable surprise, decided that it was the most unsuccessful attempt at a smile that he had ever seen.
Leebig said, "Don't say that. Don't ever say that."
"Why not?"
"Because anything, however small, that encourages distrust of robots is harmful. Distrusting robots is a human disease!"
It was as though he were lecturing a small child. It was as though he were saying something gently that he wanted to yell. It was as though he were trying to persuade when what he really wanted was to enforce on penalty of death.
Leebig said, "Do you know the history of robotics?"
"A little."
"On Earth, you should. Yes. Do you know robots started with a Frankenstein complex against them? They were suspect. Men distrusted and feared robots. Robotics was almost an undercover science as a result. The Three Laws were first built into robots in an effort to overcome distrust and even so Earth would never allow a robotic society to develop. One of the reasons the first pioneers left Earth to colonize the rest of the Galaxy was so that they might establish societies in which robots would be allowed to free men of poverty and toil. Even then, there remained a latent suspicion not far below, ready to pop up at any excuse."
"Have you yourself had to counter distrust of robots?" asked Baley.
"Many times," said Leebig grimly.
"Is that why you and other roboticists are willing to distort the facts just a little in order to avoid suspicion as much as possible?"
"There is no distortion!"
"For instance, aren't the Three Laws misquoted?"
"No!"
"I can demonstrate that they are, and unless you convince me otherwise, I will demonstrate it to the whole Galaxy, if I can."
"You're mad. Whatever argument you may think you have is fallacious, I assure you."
"Shall we discuss it?"

"If it does not take too long."

"Face to face? Seeing?"

Leebig's thin face twisted. "No!"
"Good-by, Dr. Leebig. Others will listen to me."
"Wait. Great Galaxy, man, wait!"
"Seeing?"
The roboticist's hands wandered upward, hovered about his chin. Slowly a thumb crept into his mouth and remained there. He stared, blankly, at Baley.
Baley thought: Is he regressing to the pre-five-year-old stage so that it will be legitimate for him to see me?
"Seeing?" he said.
But Leebig shook his head slowly. "I can't. I can't," he moaned, the words all but stifled by the blocking thumb. "Do whatever you want."
Baley stared at the other and watched him turn away and face the wall. He watched the Solarian's straight back bend and the Solarian's face hide in shaking hands.
Baley said, "Very well, then, I'll agree to view."
Leebig said, back still turned, "Excuse me a moment. I'll be back."
Baley tended to his own needs during the interval and stared at his fresh-washed face in the bathroom mirror. Was he getting the feel of Solaria and Solarians? He wasn't sure.
He sighed and pushed a contact and a robot appeared. He didn't turn to look at it. He said, "Is there another viewer at the farm, besides the one I'm using?"
"There are three other outlets, master."
"Then tell Klorissa Cantoro--tell your mistress that I will be using this one till further notice and that I am not to be disturbed."
"Yes, master."
Baley returned to his position where the viewer remained focused on the empty patch of room in which Leebig had stood. It was still empty and he settled himself to wait.
It wasn't long. Leebig entered and the room once more jiggled as the man walked. Evidently focus shifted from room center to man center without delay. Baley remembered the complexity of viewing controls and began to feel a kind of appreciation of what was involved.
Leebig was quite master of himself now, apparently. His hair was slicked back and his costume had been changed. His clothes fitted loosely and were of a material that glistened and caught highlights. He sat down in a slim chair that folded out of the wall.
He said soberly, "Now what is this notion of yours concerning First Law?"
"Will we be overheard?"
"No. I've taken care."
Baley nodded. He said, "Let me quote the First Law."
"I scarcely need that."
"I know, but let me quote it, anyway: A robot may not harm a human being or, through inaction, allow a human being to come to harm."
"Well?"
"Now when I first landed on Solaria, I was driven to the estate assigned for my use in a ground-car. The ground-car was a specially enclosed job designed to protect me from exposure to open space. As an Earthman--"
"I know about that," said Leebig impatiently. "What has this to do with the matter?"
"The robots who drove the car did not know about it. I asked that the car be opened and was at once obeyed. Second Law. They had to follow orders. I was uncomfortable, of course, and nearly collapsed before the car was enclosed again. Didn't the robots harm me?"
"At your order," snapped Leebig.
"I'll quote the Second Law: A robot must obey the orders given it
by human beings except where such orders would conflict with the First Law. So you see, my order should have been ignored."
"This is nonsense. The robot lacked knowledge--"
Baley leaned forward in his chair. "Ah! We have it. Now let's recite the First Law as it should be stated: A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm."
"This is all understood."
"I think not by ordinary men. Otherwise, ordinary men would realize robots could commit murder."
Leebig was white. "Mad! Lunacy!"
Baley stared at his finger ends. "A robot may perform an innocent task, I suppose; one that has no damaging effect on a human being?"
"If ordered to do so," said Leebig.
"Yes, of course. If ordered to do so. And a second robot may perform an innocent task, also, I suppose; one that also can have no damaging effect on a human being? If ordered to do so?"
"Yes."
"And what if the two innocent tasks, each completely innocent, completely, amount to murder when added together?"
"What?" Leebig's face puckered into a scowl.
"I want your expert opinion on the matter," said Baley. "I'll set you a hypothetical case. Suppose a man says to a robot, 'Place a small quantity of this liquid into a glass of milk that you will find in such and such a place. The liquid is harmless. I wish only to know its effect on milk. Once I know the effect, the mixture will be poured out. After you have performed this action, forget you have done so.'"
Leebig, still scowling, said nothing.
Baley said, "If I had told the robot to add a mysterious liquid to milk and then offer it to a man, First Law would force it to ask, 'What is the nature of the liquid? Will it harm a man?' And if it were assured the liquid was harmless, First Law might still make the robot hesitate and refuse to offer the milk. Instead, however, it is told the milk will be poured out. First Law is not involved. Won't the robot do as it is told?"
Leebig glared.
Baley said, "Now a second robot has poured out the milk in the first place and is unaware that the milk has been tampered with. In all innocence, it offers the milk to a man and the man dies."
Leebig cried out, "No!"
"Why not? Both actions are innocent in themselves. Only together are they murder. Do you deny that that sort of thing can happen?"
"The murderer would be the man who gave the order," cried Leebig.
"If you want to be philosophical, yes. The robots would have been the immediate murderers, though, the instruments of murder."
"No man would give such orders."
"A man would. A man has. It was exactly in this way that the murder attempt on Dr. Gruer must have been carried through. You've heard about that, I suppose."
"On Solaria," muttered Leebig, "one hears about everything."
"Then you know Gruer was poisoned at his dinner table before the eyes of myself and my partner, Mr. Olivaw of Aurora. Can you suggest any other way in which the poison might have reached him? There was no other human on the estate. As a Solarian, you must appreciate that point."
"I'm not a detective. I have no theories."
"I've presented you with one. I want to know if it is a possible one. I want to know if two robots might not perform two separate actions, each one innocent in itself, the two together resulting in murder. You're the expert, Dr. Leebig. Is it possible?"
And Leebig, haunted and harried, said, "Yes," in a voice so low that Baley scarcely heard him.
Baley said, "Very well, then. So much for the First Law."
Leebig stared at Baley and his drooping eyelid winked once or twice in a slow tic. His hands, which had been clasped, drew apart, though the fingers maintained their clawed shape as though each hand still entwined a phantom hand of air. Palms turned downward and rested on knees and only then did the fingers relax.
Baley watched it all in abstraction.
Leebig said, "Theoretically, yes. Theoretically! But don't dismiss the First Law that easily, Earthman. Robots would have to be ordered very cleverly in order to circumvent the First Law."
"Granted," said Baley. "I am only an Earthman. I know next to nothing about robots and my phrasing of the orders was only by way of example. A Solarian would be much more subtle and do much better. I'm sure of that."
Leebig might not have been listening. He said loudly, "If a robot
can be manipulated into doing harm to a man, it means only that we must extend the powers of the positronic brain. One might say we ought to make the human better. That is impossible, so we will make the robot more foolproof.
"We advance continuously. Our robots are more varied, more specialized, more capable, and more unharming than those of a century ago. A century hence, we will have still greater advances. Why have a robot manipulate controls when a positronic brain can be built into the controls itself? That's specialization, but we can generalize, also. Why not a robot with replaceable and interchangeable limbs? Eh? Why not? If we--"
Baley interrupted. "Are you the only roboticist on Solaria?"
"Don't be a fool."
"I only wondered. Dr. Delmarre was the only--uh--fetal engineer, except for an assistant."
"Solaria has over twenty roboticists."
"Are you the best?"
"I am," Leebig said without self-consciousness.
"Delmarre worked with you."
"He did."
Baley said, "I understand that he was planning to break the partnership toward the end."
"No sign of it. What gave you the idea?"
"I understand he disapproved of your bachelorhood."
"He may have. He was a thorough Solarian. However, it did not affect our business relationships."
"To change the subject. In addition to developing new model robots, do you also manufacture and repair existing types?"
Leebig said, "Manufacture and repair are largely robot-conducted. There is a large factory and maintenance shop on my estate."
"Do robots require much in the way of repair, by the way?"
"Very little."
"Does that mean that robot repair is an undeveloped science?"
"Not at all." Leebig said that stiffly.
"What about the robot that was at the scene of Dr. Delmarre's murder?"
Leebig looked away, and his eyebrows drew together as though a painful thought were being barred entrance to his mind. "It was a complete loss."
"Really complete? Could it answer any questions at all?"
"None at all. It was absolutely useless. Its positronic brain was completely short-circuited. Not one pathway was left intact. Consider! It had witnessed a murder it had been unable to halt-"
"Why was it unable to halt the murder, by the way?"
"Who can tell? Dr. Delmarre was experimenting with that robot. I do not know in what mental condition he had left it. He might have ordered it, for instance, to suspend all operations while he checked one particular circuit element. If someone whom neither Dr. Delmarre nor the robot suspected of harm were suddenly to launch a homicidal attack, there might be a perceptible interval before the robot could use First Law potential to overcome Dr. Delmarre's freezing order. The length of the interval would depend on the nature of the attack and the nature of Dr. Delmarre's freezing order. I could invent a dozen other ways of explaining why the robot was unable to prevent the murder. Being unable to do so was a First Law violation, however, and that was sufficient to blast every positronic pathway in the robot's mind."
"But if the robot was physically unable to prevent the murder, was it responsible? Does the First Law ask impossibilities?"
Leebig shrugged. "The First Law, despite your attempts to make little of it, protects humanity with every atom of possible force. It allows no excuses. If the First Law is broken, the robot is ruined."
"That is a universal rule, sir?"
"As universal as robots."
Baley said, "Then I've learned something."
"Then learn something else. Your theory of murder by a series of robotic actions, each innocent in itself, will not help you in the case of Dr. Delmarre's death."
"Why not?"
"The death was not by poisoning, but by bludgeoning. Something had to hold the bludgeon, and that had to be a human arm. No robot could swing a club and smash a skull."
"Suppose," said Baley, "a robot were to push an innocent button which dropped a booby-trap weight on Delmarre's head."
Leebig smiled sourly. "Earthman, I've viewed the scene of the crime. I've heard all the news. The murder was a big thing here on Solaria, you know. So I know there was no sign of any machinery at the scene of the crime, or of any fallen weight."
Baley said, "Or of any blunt instrument, either."

Leebig said scornfully, "You're a detective. Find it."

"Granting that a robot was not responsible for Dr. Delmarre's death, who was, then?"
"Everyone knows who was," shouted Leebig. "His wife! Gladia!"

Baley thought: At least there's a unanimity of opinion. Aloud he said, "And who was the mastermind behind the robots who poisoned Gruer?"
"I suppose..." Leebig trailed off.
"You don't think there are two murderers, do you? If Gladia was responsible for one crime, she must be responsible for the second attempt, also."
"Yes. You must be right." His voice gained assurance. "No doubt of it."
"No doubt?"
"Nobody else could get close enough to Dr. Delmarre to kill him. He allowed personal presence no more than I did, except that he made an exception in favor of his wife, and I make no exceptions. The wiser I." The roboticist laughed harshly.
"I believe you knew her," said Baley abruptly.
"Whom?"
"Her. We are discussing only one 'her.' Gladia!"
"Who told you I knew her any more than I know anyone else?" demanded Leebig. He put his hand to his throat. His fingers moved slightly and opened the neck-seam of his garment for an inch downward, leaving more freedom to breathe.
"Gladia herself did. You two went for walks."
"So? We were neighbors. It is a common thing to do. She seemed a pleasant person."
"You approved of her, then?"
Leebig shrugged. "Talking to her was relaxing."
"What did you talk about?"
"Robotics." There was a flavor of surprise about the word as though there were wonder that the question could be asked.