The Caves of Steel


by Isaac Asimov


  “It strikes me very unpleasantly.”

  “Suppose you had to leave the City at night and walk cross country for half a mile or more.”

  “I—I don’t think I could be persuaded to.”

  “No matter how important the necessity?”

  “If it were to save my life or the lives of my family, I might try …” He looked embarrassed. “May I ask the point of these questions, Mr. Baley?”

  “I’ll tell you. A serious crime has been committed, a particularly disturbing murder. I’m not at liberty to give you the details. There is a theory, however, that the murderer, in order to commit the crime, did just what we were discussing; he crossed open country at night and alone. I was just wondering what kind of man could do that.”

  Dr. Gerrigel shuddered. “No one I know. Certainly not I. Of course, among millions I suppose you could find a few hardy individuals.”

  “But you wouldn’t say it was a very likely thing for a human being to do?”

  “No. Certainly, not likely.”

  “In fact, if there’s any other explanation for the crime, any other conceivable explanation, it should be considered.”

  Dr. Gerrigel looked more uncomfortable than ever as he sat bolt upright with his well-kept hands precisely folded in his lap. “Do you have an alternate explanation in mind?”

  “Yes. It occurs to me that a robot, for instance, would have no difficulty at all in crossing open country.”

  Dr. Gerrigel stood up. “Oh, my dear sir!”

  “What’s wrong?”

  “You mean a robot may have committed the crime?”

  “Why not?”

  “Murder? Of a human being?”

  “Yes. Please sit down, Doctor.”

  The roboticist did as he was told. He said, “Mr. Baley, there are two acts involved: walking cross country, and murder. A human being could commit the latter easily, but would find difficulty in doing the former. A robot could do the former easily, but the latter act would be completely impossible. If you’re going to replace an unlikely theory by an impossible one—”

  “Impossible is a hell of a strong word, Doctor.”

  “You’ve heard of the First Law of Robotics, Mr. Baley?”

  “Sure. I can even quote it: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.” Baley suddenly pointed a finger at the roboticist and went on, “Why can’t a robot be built without the First Law? What’s so sacred about it?”

  Dr. Gerrigel looked startled, then tittered, “Oh, Mr. Baley.”

  “Well, what’s the answer?”

  “Surely, Mr. Baley, if you even know a little about robotics, you must know the gigantic task involved, both mathematically and electronically, in building a positronic brain.”

  “I have an idea,” said Baley. He remembered well his visit to a robot factory once in the way of business. He had seen their library of book-films, long ones, each of which contained the mathematical analysis of a single type of positronic brain. It took more than an hour for the average such film to be viewed at standard scanning speed, condensed though its symbolisms were. And no two brains were alike, even when prepared according to the most rigid specifications. That, Baley understood, was a consequence of Heisenberg’s Uncertainty Principle. This meant that each film had to be supplemented by appendices involving possible variations.

  Oh, it was a job, all right. Baley wouldn’t deny that.

  Dr. Gerrigel said, “Well, then, you must understand that a design for a new type of positronic brain, even one where only minor innovations are involved, is not the matter of a night’s work. It usually involves the entire research staff of a moderately sized factory and takes anywhere up to a year of time. Even this large expenditure of work would not be nearly enough if it were not that the basic theory of such circuits has already been standardized and may be used as a foundation for further elaboration. The standard basic theory involves the Three Laws of Robotics: the First Law, which you’ve quoted; the Second Law which states, ‘A robot must obey the orders given by human beings except where such orders would conflict with the First Law,’ and the Third Law, which states, ‘A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.’ Do you understand?”

  R. Daneel, who, to all appearances, had been following the conversation with close attention, broke in. “If you will excuse me, Elijah, I would like to see if I follow Dr. Gerrigel. What you imply, sir, is that any attempt to build a robot, the working of whose positronic brain is not oriented about the Three Laws, would require first the setting up of a new basic theory and that this, in turn, would take many years.”

  The roboticist looked very gratified. “That is exactly what I mean, Mr.…”

  Baley waited a moment, then carefully introduced R. Daneel: “This is Daneel Olivaw, Dr. Gerrigel.”

  “Good day, Mr. Olivaw.” Dr. Gerrigel extended his hand and shook Daneel’s. He went on, “It is my estimation that it would take fifty years to develop the basic theory of a non-Asenion positronic brain—that is, one in which the basic assumptions of the Three Laws are disallowed—and bring it to the point where robots similar to modern models could be constructed.”

  “And this has never been done?” asked Baley. “I mean, Doctor, that we’ve been building robots for several thousand years. In all that time, hasn’t anybody or any group had fifty years to spare?”

  “Certainly,” said the roboticist, “but it is not the sort of work that anyone would care to do.”

  “I find that hard to believe. Human curiosity will undertake anything.”

  “It hasn’t undertaken the non-Asenion robot. The human race, Mr. Baley, has a strong Frankenstein complex.”

  “A what?”

  “That’s a popular name derived from a Medieval novel describing a robot that turned on its creator. I never read the novel myself. But that’s beside the point. What I wish to say is that robots without the First Law are simply not built.”

  “And no theory for it even exists?”

  “Not to my knowledge, and my knowledge,” he smiled self-consciously, “is rather extensive.”

  “And a robot with the First Law built in could not kill a man?”

  “Never. Unless such killing were completely accidental or unless it were necessary to save the lives of two or more men. In either case, the positronic potential built up would ruin the brain past recovery.”

  “All right,” said Baley. “All this represents the situation on Earth. Right?”

  “Yes. Certainly.”

  “What about the Outer Worlds?”

  Some of Dr. Gerrigel’s self-assurance seemed to ooze away. “Oh dear, Mr. Baley, I couldn’t say of my own knowledge, but I’m sure that if non-Asenion positronic brains were ever designed or if the mathematical theory were worked out, we’d hear of it.”

  “Would we? Well, let me follow up another thought in my mind, Dr. Gerrigel. I hope you don’t mind.”

  “No. Not at all.” He looked helplessly first at Baley, then at R. Daneel. “After all, if it is as important as you say, I’m glad to do all I can.”

  “Thank you, Doctor. My question is, why humanoid robots? I mean that I’ve been taking them for granted all my life, but now it occurs to me that I don’t know the reason for their existence. Why should a robot have a head and four limbs? Why should he look more or less like a man?”

  “You mean, why shouldn’t he be built functionally, like any other machine?”

  “Right,” said Baley. “Why not?”

  Dr. Gerrigel smiled a little. “Really, Mr. Baley, you were born too late. The early literature of robotics is riddled with a discussion of that very matter and the polemics involved were something frightful. If you would like a very good reference to the disputations among the functionalists and anti-functionalists, I can recommend Hanford’s History of Robotics. Mathematics is kept to a minimum. I think you’d find it very interesting.”


  “I’ll look it up,” said Baley, patiently. “Meanwhile, could you give me an idea?”

  “The decision was made on the basis of economics. Look here, Mr. Baley, if you were supervising a farm, would you care to build a tractor with a positronic brain, a reaper, a harrow, a milker, an automobile, and so on, each with a positronic brain; or would you rather have ordinary unbrained machinery with a single positronic robot to run them all. I warn you that the second alternative represents only a fiftieth or a hundredth the expense.”

  “But why the human form?”

  “Because the human form is the most successful generalized form in all nature. We are not a specialized animal, Mr. Baley, except for our nervous system and a few odd items. If you want a design capable of doing a great many widely various things, all fairly well, you could do no better than to imitate the human form. Besides that, our entire technology is based on the human form. An automobile, for instance, has its controls so made as to be grasped and manipulated most easily by human hands and feet of a certain size and shape, attached to the body by limbs of a certain length and joints of a certain type. Even such simple objects as chairs and tables or knives and forks are designed to meet the requirements of human measurements and manner of working. It is easier to have robots imitate the human shape than to redesign radically the very philosophy of our tools.”

  “I see. That makes sense. Now isn’t it true, Doctor, that the roboticists of the Outer World manufacture robots that are much more humanoid than our own?”

  “I believe that is true.”

  “Could they manufacture a robot so humanoid that it would pass for human under ordinary conditions?”

  Dr. Gerrigel lifted his eyebrows and considered that. “I think they could, Mr. Baley. It would be terribly expensive. I doubt that the return could be profitable.”

  “Do you suppose,” went on Baley, relentlessly, “that they could make a robot that would fool you into thinking it was a human?”

  The roboticist tittered. “Oh, my dear Mr. Baley. I doubt that. Really. There’s more to a robot than just his appear—”

  Dr. Gerrigel froze in the middle of the word. Slowly, he turned to R. Daneel, and his pink face went very pale.

  “Oh, dear me,” he whispered. “Oh, dear me.”

  He reached out one hand and touched R. Daneel’s cheek gingerly. R. Daneel did not move away but gazed at the roboticist calmly.

  “Dear me,” said Dr. Gerrigel, with what was almost a sob in his voice, “you are a robot.”

  “It took you a long time to realize that,” said Baley, dryly.

  “I wasn’t expecting it. I never saw one like this. Outer World manufacture?”

  “Yes,” said Baley.

  “It’s obvious now. The way he holds himself. The manner of his speaking. It is not a perfect imitation, Mr. Baley.”

  “It’s pretty good, though, isn’t it?”

  “Oh, it’s marvelous. I doubt that anyone could recognize the imposture at sight. I am very grateful to you for having me brought face to face with him. May I examine him?” The roboticist was on his feet, eager.

  Baley put out a hand. “Please, Doctor. In a moment. First, the matter of the murder, you know.”

  “Is that real, then?” Dr. Gerrigel was bitterly disappointed and showed it. “I thought perhaps that was just a device to keep my mind engaged and to see how long I could be fooled by—”

  “It is not a device, Dr. Gerrigel. Tell me, now, in constructing a robot as humanoid as this one, with the deliberate purpose of having it pass as human, is it not necessary to make its brain possess properties as close to that of the human brain as possible?”

  “Certainly.”

  “Very well. Could not such a humanoid brain lack the First Law? Perhaps it is left out accidentally. You say the theory is unknown. The very fact that it is unknown means that the constructors might set up a brain without the First Law. They would not know what to avoid.”

  Dr. Gerrigel was shaking his head vigorously. “No. No. Impossible.”

  “Are you sure? We can test the Second Law, of course—Daneel, let me have your blaster.”

  Baley’s eyes never left the robot. His own hand, well to one side, gripped his own blaster tightly.

  R. Daneel said calmly, “Here it is, Elijah,” and held it out, butt first.

  Baley said, “A plain-clothes man must never abandon his blaster, but a robot has no choice but to obey a human.”

  “Except, Mr. Baley,” said Dr. Gerrigel, “when obedience involves breaking the First Law.”

  “Do you know, Doctor, that Daneel drew his blaster on an unarmed group of men and women and threatened to shoot?”

  “But I did not shoot,” said Daneel.

  “Granted, but the threat was unusual in itself, wasn’t it, Doctor?”

  Dr. Gerrigel bit his lip. “I’d need to know the exact circumstances to judge. It sounds unusual.”

  “Consider this, then. R. Daneel was on the scene at the time of the murder, and if you omit the possibility of an Earthman having moved across open country, carrying a weapon with him, Daneel and Daneel alone of all the persons on the scene could have hidden the weapon.”

  “Hidden the weapon?” asked Dr. Gerrigel.

  “Let me explain. The blaster that did the killing was not found. The scene of the murder was searched minutely and it was not found. Yet it could not have vanished like smoke. There is only one place it could have been, only one place they would not have thought to look.”

  “Where, Elijah?” asked R. Daneel.

  Baley brought his blaster into view, held its barrel firmly in the robot’s direction.

  “In your food sac,” he said. “In your food sac, Daneel!”

  13.

  SHIFT TO THE MACHINE

  “That is not so,” said R. Daneel, quietly.

  “Yes? Well let the Doctor decide. Dr. Gerrigel?”

  “Mr. Baley?” The roboticist, whose glance had been alternating wildly between the plain-clothes man and the robot as they spoke, let it come to rest upon the human being.

  “I’ve asked you here for an authoritative analysis of this robot. I can arrange to have you use the laboratories of the City Bureau of Standards. If you need any piece of equipment they don’t have, I’ll get it for you. What I want is a quick and definite answer and hang the expense and trouble.”

  Baley rose. His words had emerged calmly enough, but he felt a rising hysteria behind them. At the moment, he felt that if he could only seize Dr. Gerrigel by the throat and choke the necessary statements out of him, he would forego all science.

  He said, “Well, Dr. Gerrigel?”

  Dr. Gerrigel tittered nervously and said, “My dear Mr. Baley, I won’t need a laboratory.”

  “Why not?” asked Baley apprehensively. He stood there, muscles tense, feeling himself twitch.

  “It’s not difficult to test the First Law. I’ve never had to, you understand, but it’s simple enough.”

  Baley pulled air in through his mouth and let it out slowly. He said, “Would you explain what you mean? Are you saying that you can test him here?”

  “Yes, of course. Look, Mr. Baley, I’ll give you an analogy. If I were a Doctor of Medicine and had to test a patient’s blood sugar, I’d need a chemical laboratory. If I needed to measure his basal metabolic rate, or test his cortical function, or check his genes to pinpoint a congenital malfunction, I’d need elaborate equipment. On the other hand, I could check whether he were blind by merely passing my hand before his eyes and I could test whether he were dead by merely feeling his pulse.

  “What I’m getting at is that the more important and fundamental the property being tested, the simpler the needed equipment. It’s the same in a robot. The First Law is fundamental. It affects everything. If it were absent, the robot could not react properly in two dozen obvious ways.”

  As he spoke, he took out a flat, black object which expanded into a small book-viewer. He inserted a well-worn spool into the
receptacle. He then took out a stop watch and a series of white, plastic slivers that fitted together to form something that looked like a slide rule with three independent movable scales. The notations upon it struck no chord of familiarity to Baley.

  Dr. Gerrigel tapped his book-viewer and smiled a little, as though the prospect of a bit of field work cheered him.

  He said, “It’s my Handbook of Robotics. I never go anywhere without it. It’s part of my clothes.” He giggled self-consciously.

  He put the eyepiece of the viewer to his eyes and his finger dealt delicately with the controls. The viewer whirred and stopped, whirred and stopped.

  “Built-in index,” the roboticist said, proudly, his voice a little muffled because of the way in which the viewer covered his mouth. “I constructed it myself. It saves a great deal of time. But then, that’s not the point now, is it? Let’s see. Umm, won’t you move your chair near me, Daneel?”

  R. Daneel did so. During the roboticist’s preparations, he had watched closely and unemotionally.

  Baley shifted his blaster.

  What followed confused and disappointed him. Dr. Gerrigel proceeded to ask questions and perform actions that seemed without meaning, punctuated by references to his triple slide rule and occasionally to the viewer.

  At one time, he asked, “If I have two cousins, five years apart in age, and the younger is a girl, what sex is the older?”

  Daneel answered (inevitably, Baley thought), “It is impossible to say on the information given.”

  To which Dr. Gerrigel’s only response, aside from a glance at his stop watch, was to extend his right hand as far as he could sideways and to say, “Would you touch the tip of my middle finger with the tip of the third finger of your left hand?”

  Daneel did that promptly and easily.

  In fifteen minutes, not more, Dr. Gerrigel was finished. He used his slide rule for a last silent calculation, then disassembled it with a series of snaps. He put away his stop watch, withdrew the Handbook from the viewer, and collapsed the latter.

  “Is that all?” said Baley, frowning.

  “That’s all.”

 