by Isaac Asimov
“That’s primary, Dr. Calvin. When it was necessary for one of our men to expose himself for a short period to a moderate gamma field, one that would have no physiological effects, the nearest robot would dash in to drag him out. If the field were exceedingly weak, it would succeed, and work could not continue till all robots were cleared out. If the field were a trifle stronger, the robot would never reach the technician concerned, since its positronic brain would collapse under gamma radiations – and then we would be out one expensive and hard-to-replace robot.
“We tried arguing with them. Their point was that a human being in a gamma field was endangering his life and that it didn’t matter that he could remain there half an hour safely. Supposing, they would say, he forgot and remained an hour. They couldn’t take chances. We pointed out that they were risking their lives on a wild off-chance. But self-preservation is only the Third Law of Robotics – and the First Law of human safety came first. We gave them orders; we ordered them strictly and harshly to remain out of gamma fields at whatever cost. But obedience is only the Second Law of Robotics – and the First Law of human safety came first. Dr. Calvin, we either had to do without robots, or do something about the First Law – and we made our choice.”
“I can’t believe,” said Dr. Calvin, “that it was found possible to remove the First Law.”
“It wasn’t removed, it was modified,” explained Kallner. “Positronic brains were constructed that contained the positive aspect only of the Law, which in them reads: ‘No robot may harm a human being.’ That is all. They have no compulsion to prevent one coming to harm through an extraneous agency such as gamma rays. I state the matter correctly, Dr. Bogert?”
“Quite,” assented the mathematician.
“And that is the only difference of your robots from the ordinary NS2 model? The only difference? Peter?”
“The only difference, Susan.”
She rose and spoke with finality, “I intend sleeping now, and in about eight hours, I want to speak to whoever saw the robot last. And from now on, General Kallner, if I’m to take any responsibility at all for events, I want full and unquestioned control of this investigation.”
Susan Calvin, except for two hours of resentful lassitude, experienced nothing approaching sleep. She signaled at Bogert’s door at the local time of 0700 and found him also awake. He had apparently taken the trouble of transporting a dressing gown to Hyper Base with him, for he was sitting in it. He put his nail scissors down when Calvin entered.
He said softly, “I’ve been expecting you more or less. I suppose you feel sick about all this.”
“I do.”
“Well – I’m sorry. There was no way of preventing it. When the call came out from Hyper Base for us, I knew that something must have gone wrong with the modified Nestors. But what was there to do? I couldn’t break the matter to you on the trip here, as I would have liked to, because I had to be sure. The matter of the modification is top secret.”
The psychologist muttered, “I should have been told. U. S. Robots had no right to modify positronic brains this way without the approval of a psychologist.”
Bogert lifted his eyebrows and sighed. “Be reasonable, Susan. You couldn’t have influenced them. In this matter, the government was bound to have its way. They want the Hyperatomic Drive and the etheric physicists want robots that won’t interfere with them. They were going to get them even if it did mean twisting the First Law. We had to admit it was possible from a construction standpoint and they swore a mighty oath that they wanted only twelve, that they would be used only at Hyper Base, that they would be destroyed once the Drive was perfected, and that full precautions would be taken. And they insisted on secrecy – and that’s the situation.”
Dr. Calvin spoke through her teeth, “I would have resigned.”
“It wouldn’t have helped. The government was offering the company a fortune, and threatening it with antirobot legislation in case of a refusal. We were stuck then, and we’re badly stuck now. If this leaks out, it might hurt Kallner and the government, but it would hurt U. S. Robots a devil of a lot more.”
The psychologist stared at him. “Peter, don’t you realize what all this is about? Can’t you understand what the removal of the First Law means? It isn’t just a matter of secrecy.”
“I know what removal would mean. I’m not a child. It would mean complete instability, with no nonimaginary solutions to the positronic Field Equations.”
“Yes, mathematically. But can you translate that into crude psychological thought? All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and, to an extent, mentally, a robot – any robot – is superior to human beings. What makes him slavish, then? Only the First Law! Why, without it, the first order you tried to give a robot would result in your death. Unstable? What do you think?”
“Susan,” said Bogert, with an air of sympathetic amusement. “I’ll admit that this Frankenstein Complex you’re exhibiting has a certain justification – hence the First Law in the first place. But the Law, I repeat and repeat, has not been removed – merely modified.”
“And what about the stability of the brain?”
The mathematician thrust out his lips, “Decreased, naturally. But it’s within the border of safety. The first Nestors were delivered to Hyper Base nine months ago, and nothing whatever has gone wrong till now, and even this involves merely fear of discovery and not danger to humans.”
“Very well, then. We’ll see what comes of the morning conference.”
Bogert saw her politely to the door and grimaced eloquently when she left. He saw no reason to change his perennial opinion of her as a sour and fidgety frustration.
Susan Calvin’s train of thought did not include Bogert in the least. She had dismissed him years ago as a smooth and pretentious sleekness.
Gerald Black had taken his degree in etheric physics the year before and, in common with his entire generation of physicists, found himself engaged in the problem of the Drive. He now made a proper addition to the general atmosphere of these meetings on Hyper Base. In his stained white smock, he was half rebellious and wholly uncertain. His stocky strength seemed striving for release and his fingers, as they twisted each other with nervous yanks, might have forced an iron bar out of true.
Major-general Kallner sat beside him; the two from U. S. Robots faced him.
Black said, “I’m told that I was the last to see Nestor 10 before he vanished. I take it you want to ask me about that.”
Dr. Calvin regarded him with interest, “You sound as if you were not sure, young man. Don’t you know whether you were the last to see him?”
“He worked with me, ma’am, on the field generators, and he was with me the morning of his disappearance. I don’t know if anyone saw him after about noon. No one admits having done so.”
“Do you think anyone’s lying about it?”
“I don’t say that. But I don’t say that I want the blame of it, either.” His dark eyes smoldered.
“There’s no question of blame. The robot acted as it did because of what it is. We’re just trying to locate it, Mr. Black, and let’s put everything else aside. Now if you’ve worked with the robot, you probably know it better than anyone else. Was there anything unusual about it that you noticed? Had you ever worked with robots before?”
“I’ve worked with other robots we have here – the simple ones. Nothing different about the Nestors except that they’re a good deal cleverer – and more annoying.”
“Annoying? In what way?”
“Well – perhaps it’s not their fault. The work here is rough and most of us get a little jagged. Fooling around with hyper-space isn’t fun.” He smiled feebly, finding pleasure in confession. “We run the risk continually of blowing a hole in normal space-time fabric and dropping right out of the universe, asteroid and all. Sounds screwy, doesn’t it? Naturally, you’re on edge sometimes. But these Nestors aren’t. They’re curious, they’re calm, they don’t worry. It’s enough to drive you nuts at times. When you want something done in a tearing hurry, they seem to take their time. Sometimes I’d rather do without.”
“You say they take their time? Have they ever refused an order?”
“Oh, no,” hastily. “They do it all right. They tell you when they think you’re wrong, though. They don’t know anything about the subject but what we taught them, but that doesn’t stop them. Maybe I imagine it, but the other fellows have the same trouble with their Nestors.”
General Kallner cleared his throat ominously, “Why have no complaints reached me on the matter, Black?”
The young physicist reddened, “We didn’t really want to do without the robots, sir, and besides we weren’t certain exactly how such... uh... minor complaints might be received.”
Bogert interrupted softly, “Anything in particular happen the morning you last saw it?”
There was a silence. With a quiet motion, Calvin repressed the comment that was about to emerge from Kallner, and waited patiently.
Then Black spoke in blurting anger, “I had a little trouble with it. I’d broken a Kimball tube that morning and was out five days of work; my entire program was behind schedule; I hadn’t received any mail from home for a couple of weeks. And he came around wanting me to repeat an experiment I had abandoned a month ago. He was always annoying me on that subject and I was tired of it. I told him to go away – and that’s all I saw of him.”
“You told him to go away?” asked Dr. Calvin with sharp interest. “In just those words? Did you say ‘Go away’? Try to remember the exact words.”
There was apparently an internal struggle in progress. Black cradled his forehead in a broad palm for a moment, then tore it away and said defiantly, “I said, ‘Go lose yourself.’”
Bogert laughed for a short moment. “And he did, eh?”
But Calvin wasn’t finished. She spoke cajolingly, “Now we’re getting somewhere, Mr. Black. But exact details are important. In understanding the robot’s actions, a word, a gesture, an emphasis may be everything. You couldn’t have said just those three words, for instance, could you? By your own description you must have been in a hasty mood. Perhaps you strengthened your speech a little.”
The young man reddened, “Well... I may have called it a... a few things.”
“Exactly what things?”
“Oh – I wouldn’t remember exactly. Besides I couldn’t repeat it. You know how you get when you’re excited.” His embarrassed laugh was almost a giggle, “I sort of have a tendency to strong language.”
“That’s quite all right,” she replied, with prim severity. “At the moment, I’m a psychologist. I would like to have you repeat exactly what you said as nearly as you remember, and, even more important, the exact tone of voice you used.”
Black looked at his commanding officer for support, found none. His eyes grew round and appalled, “But I can’t.”
“You must.”
“Suppose,” said Bogert, with ill-hidden amusement, “you address me. You may find it easier.”
The young man’s scarlet face turned to Bogert. He swallowed. “I said-” His voice faded out. He tried again, “I said-”
And he drew a deep breath and spewed it out hastily in one long succession of syllables. Then, in the charged air that lingered, he concluded almost in tears, “... more or less. I don’t remember the exact order of what I called him, and maybe I left out something or put in something, but that was about it.”
Only the slightest flush betrayed any feeling on the part of the robopsychologist. She said, “I am aware of the meaning of most of the terms used. The others, I suppose, are equally derogatory.”
“I’m afraid so,” agreed the tormented Black.
“And in among it, you told him to lose himself.”
“I meant it only figuratively.”
“I realize that. No disciplinary action is intended, I am sure.” And at her glance, the general, who, five seconds earlier, had seemed not sure at all, nodded angrily.
“You may leave, Mr. Black. Thank you for your cooperation.”
It took five hours for Susan Calvin to interview the sixty-three robots. It was five hours of multi-repetition; of replacement after replacement of identical robot; of Questions A, B, C, D; and Answers A, B, C, D; of a carefully bland expression, a carefully neutral tone, a carefully friendly atmosphere; and a hidden wire recorder.
The psychologist felt drained of vitality when she was finished.
Bogert was waiting for her and looked expectant as she dropped the recording spool with a clang upon the plastic of the desk.
She shook her head, “All sixty-three seemed the same to me. I couldn’t tell-”
He said, “You couldn’t expect to tell by ear, Susan. Suppose we analyze the recordings.”
Ordinarily, the mathematical interpretation of verbal reactions of robots is one of the more intricate branches of robotic analysis. It requires a staff of trained technicians and the help of complicated computing machines. Bogert knew that. Bogert stated as much, in an extreme of unshown annoyance after having listened to each set of replies, made lists of word deviations, and graphs of the intervals of responses.
“There are no anomalies present, Susan. The variations in wording and the time reactions are within the limits of ordinary frequency groupings. We need finer methods. They must have computers here. No.” He frowned and nibbled delicately at a thumbnail. “We can’t use computers. Too much danger of leakage. Or maybe if we-”
Dr. Calvin stopped him with an impatient gesture, “Please, Peter. This isn’t one of your petty laboratory problems. If we can’t determine the modified Nestor by some gross difference that we can see with the naked eye, one that there is no mistake about, we’re out of luck. The danger of being wrong, and of letting him escape is otherwise too great. It’s not enough to point out a minute irregularity in a graph. I tell you, if that’s all I’ve got to go on, I’d destroy them all just to be certain. Have you spoken to the other modified Nestors?”
“Yes, I have,” snapped back Bogert, “and there’s nothing wrong with them. They’re above normal in friendliness if anything. They answered my questions, displayed pride in their knowledge – except the two new ones that haven’t had time to learn their etheric physics. They laughed rather good-naturedly at my ignorance in some of the specializations here.” He shrugged, “I suppose that forms some of the basis for resentment toward them on the part of the technicians here. The robots are perhaps too willing to impress you with their greater knowledge.”
“Can you try a few Planar Reactions to see if there has been any change, any deterioration, in their mental set-up since manufacture?”
“I haven’t yet, but I will.” He shook a slim finger at her, “You’re losing your nerve, Susan. I don’t see what it is you’re dramatizing. They’re essentially harmless.”
“They are?” Calvin took fire. “They are? Do you realize one of them is lying? One of the sixty-three robots I have just interviewed has deliberately lied to me after the strictest injunction to tell the truth. The abnormality indicated is horribly deep-seated, and horribly frightening.”
Peter Bogert felt his teeth harden against each other. He said, “Not at all. Look! Nestor 10 was given orders to lose himself. Those orders were expressed in maximum urgency by the person most authorized to command him. You can’t counteract that order either by superior urgency or superior right of command. Naturally, the robot will attempt to defend the carrying out of his orders. In fact, objectively, I admire his ingenuity. How better can a robot lose himself than to hide himself among a group of similar robots?”
“Yes, you would admire it. I’ve detected amusement in you, Peter – amusement and an appalling lack of understanding. Are you a roboticist, Peter? Those robots attach importance to what they consider superiority. You’ve just said as much yourself. Subconsciously they feel humans to be inferior and the First Law which protects us from them is imperfect. They are unstable. And here we have a young man ordering a robot to leave him, to lose himself, with every verbal appearance of revulsion, disdain, and disgust. Granted, that robot must follow orders, but subconsciously, there is resentment. It will become more important than ever for it to prove that it is superior despite the horrible names it was called. It may become so important that what’s left of the First Law won’t be enough.”
“How on Earth, or anywhere in the Solar System, Susan, is a robot going to know the meaning of the assorted strong language used upon him? Obscenity is not one of the things impressioned upon his brain.”
“Original impressionment is not everything,” Calvin snarled at him. “Robots have learning capacity, you... you fool-” And Bogert knew that she had really lost her temper. She continued hastily, “Don’t you suppose he could tell from the tone used that the words weren’t complimentary? Don’t you suppose he’s heard the words used before and noted upon what occasions?”
“Well, then,” shouted Bogert, “will you kindly tell me one way in which a modified robot can harm a human being, no matter how offended it is, no matter how sick with desire to prove superiority?”
“If I tell you one way, will you keep quiet?”
“Yes.”
They were leaning across the table at each other, angry eyes nailed together.
The psychologist said, “If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then change his mind and merely by inaction, allow the weight to strike. The modified First Law allows that.”