The Robots of Dawn

by Isaac Asimov


  Fastolfe said, “Now that they’ve gone—” He paused and shook his head slowly in rueful conclusion. “Except that they haven’t. Ordinarily, it is customary for the robots to leave before lunch actually begins. Robots do not eat, while human beings do. It therefore makes sense that those who eat do so and that those who do not leave. And it has ended by becoming one more ritual. It would be quite unthinkable to eat until the robots left. In this case, though—”

  “They have not left,” said Baley.

  “No. I felt that security came before etiquette and I felt that, not being an Auroran, you would not mind.”

  Baley waited for Fastolfe to make the first move. Fastolfe lifted a fork, so did Baley. Fastolfe made use of it, moving slowly and allowing Baley to see exactly what he was doing.

  Baley bit cautiously into a shrimp and found it delightful. He recognized the taste, which was like the shrimp paste produced on Earth but enormously more subtle and rich. He chewed slowly and, for a while, despite his anxiety to get on with the investigation while dining, he found it quite unthinkable to do anything but give his full attention to the lunch.

  It was, in fact, Fastolfe who made the first move. “Shouldn’t we make a beginning on the problem, Mr. Baley?”

  Baley felt himself flush slightly. “Yes. By all means. I ask your pardon. Your Auroran food caught me by surprise, so that it was difficult for me to think of anything else.—The problem, Dr. Fastolfe, is of your making, isn’t it?”

  “Why do you say that?”

  “Someone has committed roboticide in a manner that requires great expertise—as I have been told.”

  “Roboticide? An amusing term.” Fastolfe smiled. “Of course, I understand what you mean by it.—You have been told correctly; the manner requires enormous expertise.”

  “And only you have the expertise to carry it out—as I have been told.”

  “You have been told correctly there, too.”

  “And even you yourself admit—in fact, you insist—that only you could have put Jander into a mental freeze-out.”

  “I maintain what is, after all, the truth, Mr. Baley. It would do me no good to lie, even if I could bring myself to do so. It is notorious that I am the outstanding theoretical roboticist in all the Fifty Worlds.”

  “Nevertheless, Dr. Fastolfe, might not the second-best theoretical roboticist in all the worlds—or the third-best, or even the fifteenth-best—nevertheless possess the necessary ability to commit the deed? Does it really require all the ability of the very best?”

  Fastolfe said calmly, “In my opinion, it really requires all the ability of the very best. Indeed, again in my opinion, I myself could only accomplish the task on one of my good days. Remember that the best brains in robotics—including mine—have specifically labored to design positronic brains that could not be driven into mental freeze-out.”

  “Are you certain of all that? Really certain?”

  “Completely.”

  “And you stated so publicly?”

  “Of course. There was a public inquiry, my dear Earthman. I was asked the questions you are now asking and I answered truthfully. It is an Auroran custom to do so.”

  Baley said, “I do not, at the moment, question that you were convinced you were answering truthfully. But might you not have been swayed by a natural pride in yourself, one that might also be typically Auroran?”

  “You mean that my anxiety to be considered the best would make me willingly put myself in a position where everyone would be forced to conclude I had mentally frozen Jander?”

  “I picture you, somehow, as content to have your political and social status destroyed, provided your scientific reputation remained intact.”

  “I see. You have an interesting way of thinking, Mr. Baley. This would not have occurred to me. Given a choice between admitting I was second-best and admitting I was guilty of, to use your phrase, a roboticide, you are of the opinion I would knowingly accept the latter.”

  “No, Dr. Fastolfe, I do not wish to present the matter quite so simplistically. Might it not be that you deceive yourself into thinking you are the greatest of all roboticists and that you are completely unrivaled, clinging to that at all costs, because you unconsciously—unconsciously, Dr. Fastolfe—realize that, in fact, you are being overtaken—or have even already been overtaken—by others?”

  Fastolfe laughed, but there was an edge of annoyance in it. “Not so, Mr. Baley. Quite wrong.”

  “Think, Dr. Fastolfe! Are you certain that none of your roboticist colleagues can approach you in brilliance?”

  “There are only a few who are capable of dealing at all with humaniform robots. Daneel’s construction created virtually a new profession for which there is not even a name—humaniformicists, perhaps. Of the theoretical roboticists on Aurora, not one, except for myself, understands the workings of Daneel’s positronic brain. Dr. Sarton did, but he is dead—and he did not understand it as well as I do. The basic theory is mine.”

  “It may have been yours to begin with, but surely you can’t expect to maintain exclusive ownership. Has no one learned the theory?”

  Fastolfe shook his head firmly. “Not one. I have taught no one and I defy any other living roboticist to have developed the theory on his own.”

  Baley said, with a touch of irritation, “Might there not be a bright young man, fresh out of the university, who is cleverer than anyone yet realizes, who—”

  “No, Mr. Baley, no. I would have known such a young man. He would have passed through my laboratories. He would have worked with me. At the moment, no such young man exists. Eventually, one will; perhaps many will. At the moment, none!”

  “If you died, then, the new science dies with you?”

  “I am only a hundred and sixty-five years old. That’s metric years, of course, so it is only a hundred and twenty-four of your Earth years, more or less. I am still quite young by Auroran standards and there is no medical reason why my life should be considered even half over. It is not entirely unusual to reach an age of four hundred years—metric years. There is yet plenty of time to teach.”

  They had finished eating, but neither man made any move to leave the table. Nor did any robot approach to clear it. It was as though they were transfixed into immobility by the intensity of the back and forth flow of talk.

  Baley’s eyes narrowed. He said, “Dr. Fastolfe, two years ago I was on Solaria. There I was given the clear impression that the Solarians were, on the whole, the most skilled roboticists in all the worlds.”

  “On the whole, that’s probably true.”

  “And not one of them could have done the deed?”

  “Not one, Mr. Baley. Their skill is with robots who are, at best, no more advanced than my poor, reliable Giskard. The Solarians know nothing of the construction of humaniform robots.”

  “How can you be sure of that?”

  “Since you were on Solaria, Mr. Baley, you know very well that Solarians can approach each other with only the greatest of difficulty, that they interact by trimensional viewing—except where sexual contact is absolutely required. Do you think that any of them would dream of designing a robot so human in appearance that it would activate their neuroses? They would so avoid the possibility of approaching him, since he would look so human, that they could make no reasonable use of him.”

  “Might not a Solarian here or there display a surprising tolerance for the human body? How can you be sure?”

  “Even if a Solarian could, which I do not deny, there are no Solarian nationals on Aurora this year.”

  “None?”

  “None! They do not like to be thrown into contact even with Aurorans and, except on the most urgent business, none will come here—or to any other world. Even in the case of urgent business, they will come no closer than orbit and then they deal with us only by electronic communication.”

  Baley said, “In that case, if you are—literally and actually—the only person in all the worlds who could have done it, did you kill Jander?”

  Fastolfe said, “I cannot believe that Daneel did not tell you I have denied this deed.”

  “He did tell me so, but I want to hear it from you.”

  Fastolfe crossed his arms and frowned. He said, through clenched teeth, “Then I’ll tell you so. I did not do it.”

  Baley shook his head. “I believe you believe that statement.”

  “I do. And most sincerely. I am telling the truth. I did not kill Jander.”

  “But if you did not do it, and if no one else can possibly have done it, then—But wait. I am, perhaps, making an unwarranted assumption. Is Jander really dead or have I been brought here under false pretenses?”

  “The robot is really destroyed. It will be quite possible to show him to you, if the Legislature does not bar my access to him before the day is over—which I don’t think they will do.”

  “In that case, if you did not do it, and if no one else could possibly have done it, and if the robot is actually dead—who committed the crime?”

  Fastolfe sighed. “I’m sure Daneel told you what I have maintained at the inquiry—but you want to hear it from my own lips.”

  “That is right, Dr. Fastolfe.”

  “Well, then, no one committed the crime. It was a spontaneous event in the positronic flow along the brain paths that set up the mental freeze-out in Jander.”

  “Is that likely?”

  “No, it is not. It is extremely unlikely—but if I did not do it, then that is the only thing that can have happened.”

  “Might it not be argued that there is a greater chance that you are lying than that a spontaneous mental freeze-out took place?”

  “Many do so argue. But I happen to know that I did not do it and that leaves only the spontaneous event as a possibility.”

  “And you have had me brought here to demonstrate—to prove—that the spontaneous event did, in fact, take place?”

  “Yes.”

  “But how does one go about proving the spontaneous event? Only by proving it, it seems, can I save you, Earth, and myself.”

  “In order of increasing importance, Mr. Baley?”

  Baley looked annoyed. “Well, then, you, me, and Earth.”

  “I’m afraid,” said Fastolfe, “that after considerable thought, I have come to the conclusion that there is no way of obtaining such a proof.”

  17

  Baley stared at Fastolfe in horror. “No way?”

  “No way. None.” And then, in a sudden fit of apparent abstraction, he seized the spicer and said, “You know, I am curious to see if I can still do the triple genuflection.”

  He tossed the spicer into the air with a calculated flip of the wrist. It somersaulted and, as it came down, Fastolfe caught the narrow end on the side of his right palm (his thumb tucked down). It went up slightly and swayed and was caught on the side of the left palm. It went up again in reverse and was caught on the side of the right palm and then again on the left palm. After this third genuflection, it was lifted with sufficient force to produce a flip. Fastolfe caught it in his right fist, with his left hand nearby, palm upward. Once the spicer was caught, Fastolfe displayed his left hand and there was a fine sprinkling of salt in it.

  Fastolfe said, “It is a childish display to the scientific mind and the effort is totally disproportionate to the end, which is, of course, a pinch of salt, but the good Auroran host is proud of being able to put on a display. There are some experts who can keep the spicer in the air for a minute and a half, moving their hands almost more rapidly than the eye can follow.

  “Of course,” he added thoughtfully, “Daneel can perform such actions with greater skill and speed than any human. I have tested him in this manner in order to check on the workings of his brain paths, but it would be totally wrong to have him display such talents in public. It would needlessly humiliate human spicists—a popular term for them, you understand, though you won’t find it in dictionaries.”

  Baley grunted.

  Fastolfe sighed. “But we must get back to business.”

  “You brought me through several parsecs of space for that purpose.”

  “Indeed, I did.—Let us proceed!”

  Baley said, “Was there a reason for that display of yours, Dr. Fastolfe?”

  Fastolfe said, “Well, we seem to have come to an impasse. I’ve brought you here to do something that can’t be done. Your face was rather eloquent and, to tell you the truth, I felt no better. It seemed, therefore, that we could use a breathing space. And now—let us proceed.”

  “On the impossible task?”

  “Why should it be impossible for you, Mr. Baley? Your reputation is that of an achiever of the impossible.”

  “The hyperwave drama? You believe that foolish distortion of what happened on Solaria?”

  Fastolfe spread his arms. “I have no other hope.”

  Baley said, “And I have no choice. I must continue to try; I cannot return to Earth a failure. That has been made clear to me.—Tell me, Dr. Fastolfe, how could Jander have been killed? What sort of manipulation of his mind would have been required?”

  “Mr. Baley, I don’t know how I could possibly explain that, even to another roboticist, which you certainly are not, and even if I were prepared to publish my theories, which I certainly am not. However, let me see if I can’t explain something.—You know, of course, that robots were invented on Earth.”

  “Very little concerning robotics is dealt with on Earth—”

  “Earth’s strong antirobot bias is well-known on the Spacer worlds. But the Earthly origin of robots is obvious to any person on Earth who thinks about it. It is well-known that hyperspatial travel was developed with the aid of robots and, since the Spacer worlds could not have been settled without hyperspatial travel, it follows that robots existed before settlement had taken place and while Earth was still the only inhabited planet. Robots were therefore invented on Earth by Earthpeople.”

  “Yet Earth feels no pride in that, does it?”

  “We do not discuss it,” said Baley shortly.

  “And Earthpeople know nothing about Susan Calvin?”

  “I have come across her name in a few old books. She was one of the early pioneers in robotics.”

  “Is that all you know of her?”

  Baley made a gesture of dismissal. “I suppose I could find out more if I searched the records, but I have had no occasion to do so.”

  “How strange,” said Fastolfe. “She’s a demigod to all Spacers, so much so that I imagine that few Spacers who are not actually roboticists think of her as an Earthwoman. It would seem a profanation. They would refuse to believe it if they were told that she died after having lived scarcely more than a hundred metric years. And yet you know her only as an early pioneer.”

  “Has she got something to do with all this, Dr. Fastolfe?”

  “Not directly, but in a way. You must understand that numerous legends cluster about her name. Most of them are undoubtedly untrue, but they cling to her, nonetheless. One of the most famous legends—and one of the least likely to be true—concerns a robot manufactured in those primitive days that, through some accident on the production lines, turned out to have telepathic abilities—”

  “What!”

  “A legend! I told you it was a legend—and undoubtedly untrue! Mind you, there is some theoretical reason for supposing this might be possible, though no one has ever presented a plausible design that could even begin to incorporate such an ability. That it could have appeared in positronic brains as crude and simple as those in the prehyperspatial era is totally unthinkable. That is why we are quite certain that this particular tale is an invention. But let me go on anyway, for it points out a moral.”

  “By all means, go on.”

  “The robot, according to the tale, could read minds. And when asked questions, he read the questioner’s mind and told the questioner what he wanted to hear. Now the First Law of Robotics states quite clearly that a robot may not injure a human being or, through inaction, allow a human being to come to harm, but to robots generally that means physical harm. A robot who can read minds, however, would surely decide that disappointment or anger or any violent emotion would make the human being feeling those emotions unhappy, and the robot would interpret the inspiring of such emotions under the heading of ‘harm.’ If, then, a telepathic robot knew that the truth might disappoint or enrage a questioner or cause that person to feel envy or unhappiness, he would tell a pleasing lie instead. Do you see that?”

  “Yes, of course.”

  “So the robot lied even to Susan Calvin herself. The lies could not long continue, for different people were told different things that were not only inconsistent among themselves but unsupported by the gathering evidence of reality, you see. Susan Calvin discovered she had been lied to and realized that those lies had led her into a position of considerable embarrassment. What would have disappointed her somewhat to begin with had now, thanks to false hopes, disappointed her unbearably.—You never heard the story?”

  “I give you my word.”

  “Astonishing! Yet it certainly wasn’t invented on Aurora, for it is equally current on all the worlds.—In any case, Calvin took her revenge. She pointed out to the robot that, whether he told the truth or told a lie, he would equally harm the person with whom he dealt. He could not obey the First Law, whatever action he took. The robot, understanding this, was forced to take refuge in total inaction. If you want to put it colorfully, his positronic pathways burned out. His brain was irrecoverably destroyed. The legend goes on to say that Calvin’s last word to the destroyed robot was ‘Liar!’”

  Baley said, “And something like this, I take it, was what happened to Jander Panell. He was faced with a contradiction in terms and his brain burned out?”

  “It’s what appears to have happened, though that is not as easy to bring about as it would have been in Susan Calvin’s day. Possibly because of the legend, roboticists have always been careful to make it as difficult as possible for contradictions to arise. As the theory of positronic brains has grown more subtle and as the practice of positronic brain design has grown more intricate, increasingly successful systems have been devised to have all situations that might arise resolve into nonequality, so that some action can always be taken that will be interpreted as obeying the First Law.”

 
