Asimov’s Future History Volume 4

by Isaac Asimov


  And then, little by little, he collected his thoughts and knew that he was hugging not Daneel but R. Daneel–Robot Daneel Olivaw. He was hugging a robot and the robot was holding him lightly, allowing himself to be hugged, judging that the action gave pleasure to a human being and enduring that action because the positronic potentials of his brain made it impossible to repel the embrace and so cause disappointment and embarrassment to the human being.

  The insurmountable First Law of Robotics states: “A robot may not injure a human being–” and to repel a friendly gesture would do injury.

  Slowly, so as to reveal no sign of his own chagrin, Baley released his hold. He even gave each upper arm of the robot a final squeeze, so that there might seem to be no shame to the release.

  “Haven’t seen you, Daneel,” said Baley, “since you brought that ship to Earth with the two mathematicians. Remember?”

  “Of a certainty, Partner Elijah. It is a pleasure to see you.”

  “You feel emotion, do you?” said Baley lightly.

  “I cannot say what I feel in any human sense, Partner Elijah. I can say, however, that the sight of you seems to make my thoughts flow more easily, and the gravitational pull on my body seems to assault my senses with lesser insistence, and that there are other changes I can identify. I imagine that what I sense corresponds in a rough way to what it is that you may sense when you feel pleasure.”

  Baley nodded. “Whatever it is you sense when you see me, old partner, that makes it seem preferable to the state in which you are when you don’t see me, suits me well–if you follow my meaning. But how is it you are here?”

  “Giskard Reventlov, having reported you–” R. Daneel paused.

  “Purified?” asked Baley sardonically.

  “Disinfected,” said R. Daneel. “I felt it appropriate to enter then.”

  “Surely you would not fear infection otherwise?”

  “Not at all, Partner Elijah, but others on the ship might then be reluctant to have me approach them. The people of Aurora are sensitive to the chance of infection, sometimes to a point beyond a rational estimate of the probabilities.”

  “I understand, but I wasn’t asking why you were here at this moment. I meant why are you here at all?”

  “Dr. Fastolfe, of whose establishment I am part, directed me to board the ship that had been sent to pick you up for several reasons. He felt it desirable that you have one immediate item of the known in what he was certain would be a difficult mission for you.”

  “That was a kindly thought on his part. I thank him.”

  R. Daneel bowed gravely in acknowledgment. “Dr. Fastolfe also felt that the meeting would give me”–the robot paused–“appropriate sensations.”

  “Pleasure, you mean, Daneel.”

  “Since I am permitted to use the term, yes. And as a third reason–and the most important–”

  The door opened again at that point and R. Giskard walked in.

  Baley’s head turned toward it and he felt a surge of displeasure. There was no mistaking R. Giskard as a robot and its presence emphasized, somehow, the robotism of Daneel (R. Daneel, Baley suddenly thought again), even though Daneel was far the superior of the two. Baley didn’t want the robotism of Daneel emphasized; he didn’t want himself humiliated for his inability to regard Daneel as anything but a human being with a somewhat stilted way with the language.

  He said impatiently, “What is it, boy?”

  R. Giskard said, “I have brought the book-films you wished to see, sir, and the viewer.”

  “Well, put them down. Put them down.–And you needn’t stay. Daneel will be here with me.”

  “Yes, sir.” The robot’s eyes–faintly glowing, Baley noticed, as Daneel’s were not–turned briefly to R. Daneel, as though seeking orders from a superior being.

  R. Daneel said quietly, “It will be appropriate, friend Giskard, to remain just outside the door.”

  “I shall, friend Daneel,” said R. Giskard.

  It left and Baley said with some discontent, “Why does it have to stay just outside the door? Am I a prisoner?”

  “In the sense,” said R. Daneel, “that it would not be permitted for you to mingle with the ship’s company in the course of this voyage, I regret to be forced to say you are indeed a prisoner. Yet that is not the reason for the presence of Giskard.–And I should tell you at this point that it might well be advisable, Partner Elijah, if you did not address Giskard–or any robot–as ‘boy.’”

  Baley frowned. “Does it resent the expression?”

  “Giskard does not resent any action of a human being. It is simply that ‘boy’ is not a customary term of address for robots on Aurora and it would be inadvisable to create friction with the Aurorans by unintentionally stressing your place of origin through habits of speech that are nonessential.”

  “How do I address it, then?”

  “As you address me, by the use of his accepted identifying name. That is, after all, merely a sound indicating the particular person you are addressing–and why should one sound be preferable to another? It is merely a matter of convention. And it is also the custom on Aurora to refer to a robot as ‘he’–or sometimes ‘she’–rather than as ‘it.’ Then, too, it is not the custom on Aurora to use the initial ‘R.’ except under formal conditions where the entire name of the robot is appropriate–and even then the initial is nowadays often left out.”

  “In that case–Daneel,” (Baley repressed the sudden impulse to say “R. Daneel”) “how do you distinguish between robots and human beings?”

  “The distinction is usually self-evident, Partner Elijah. There would seem to be no need to emphasize it unnecessarily. At least that is the Auroran view and, since you have asked Giskard for films on Aurora, I assume you wish to familiarize yourself with things Auroran as an aid to the task you have undertaken.”

  “The task which has been dumped on me, yes. And what if the distinction between robot and human being is not self-evident, Daneel? As in your case?”

  “Then why make the distinction, unless the situation is such that it is essential to make it?”

  Baley took a deep breath. It was going to be difficult to adjust to this Auroran pretense that robots did not exist. He said, “But then, if Giskard is not here to keep me prisoner, why is it–he–outside the door?”

  “Those are the instructions of Dr. Fastolfe, Partner Elijah. Giskard is to protect you.”

  “Protect me? Against what?–Or against whom?”

  “Dr. Fastolfe was not precise on that point, Partner Elijah. Still, as human passions are running high over the matter of Jander Panell–”

  “Jander Panell?”

  “The robot whose usefulness was terminated.”

  “The robot, in other words, who was killed?”

  “Killed, Partner Elijah, is a term that is usually applied to human beings.”

  “But on Aurora distinctions between robots and human beings are avoided, are they not?”

  “So they are! Nevertheless, the possibility of distinction or lack of distinction in the particular case of the ending of functioning has never arisen–to my knowledge. I do not know what the rules are.”

  Baley pondered the matter. It was a point of no real importance, purely a matter of semantics. Still, he wanted to probe the manner of thinking of the Aurorans. He would get nowhere otherwise.

  He said slowly, “A human being who is functioning is alive. If that life is violently ended by the deliberate action of another human being, we call that ‘murder’ or ‘homicide.’ ‘Murder’ is, somehow, the stronger word. To be witness, suddenly, to an attempted violent end to the life of a human being, one would shout ‘Murder!’ It is not at all likely that one would shout ‘Homicide!’ It is the more formal word, the less emotional word.”

  R. Daneel said, “I do not understand the distinction you are making, Partner Elijah. Since ‘murder’ and ‘homicide’ are both used to represent the violent ending of the life of a human being, the two words must be interchangeable. Where, then, is the distinction?”

  “Of the two words, one screamed out will more effectively chill the blood of a human being than the other will, Daneel.”

  “Why is that?”

  “Connotations and associations; the subtle effect, not of dictionary meaning, but of years of usage; the nature of the sentences and conditions and events in which one has experienced the use of one word as compared with that of the other.”

  “There is nothing of this in my programming,” said Daneel, with a curious sound of helplessness hovering over the apparent lack of emotion with which he said this (the same lack of emotion with which he said everything).

  Baley said, “Will you accept my word for it, Daneel?”

  Quickly, Daneel said, almost as though he had just been presented with the solution to a puzzle, “Without doubt.”

  “Now, then, we might say that a robot that is functioning is alive,” said Baley. “Many might refuse to broaden the word so far, but we are free to devise definitions to suit ourselves if it is useful. It is easy to treat a functioning robot as alive and it would be unnecessarily complicated to try to invent a new word for the condition or to avoid the use of the familiar one. You are alive, for instance, Daneel, aren’t you?”

  Daneel said, slowly and with emphasis, “I am functioning!”

  “Come. If a squirrel is alive, or a bug, or a tree, or a blade of grass, why not you? I would never remember to say–or to think–that I am alive but that you are merely functioning, especially if I am to live for a while on Aurora, where I am to try not to make unnecessary distinctions between a robot and myself. Therefore, I tell you that we are both alive and I ask you to take my word for it.”

  “I will do so, Partner Elijah.”

  “And yet can we say that the ending of robotic life by the deliberate violent action of a human being is also ‘murder’? We might hesitate. If the crime is the same, the punishment should be the same, but would that be right? If the punishment of the murder of a human being is death, should one actually execute a human being who puts an end to a robot?”

  “The punishment of a murderer is psychic-probing, Partner Elijah, followed by the construction of a new personality. It is the personal structure of the mind that has committed the crime, not the life of the body.”

  “And what is the punishment on Aurora for putting a violent end to the functioning of a robot?”

  “I do not know, Partner Elijah. Such an incident has never occurred on Aurora, as far as I know.”

  “I suspect the punishment would not be psychic-probing,” said Baley. “How about ‘roboticide’?”

  “Roboticide?”

  “As the term used to describe the killing of a robot.”

  Daneel said, “But what about the verb derived from the noun, Partner Elijah? One never says ‘to homicide’ and it would therefore not be proper to say ‘to roboticide.’”

  “You’re right. You would have to say ‘to murder’ in each case.”

  “But murder applies specifically to human beings. One does not murder an animal, for instance.”

  Baley said, “True. And one does not murder even a human being by accident, only by deliberate intent. The more general term is ‘to kill.’ That applies to accidental death as well as to deliberate murder–and it applies to animals as well as human beings. Even a tree may be killed by disease, so why may not a robot be killed, eh, Daneel?”

  “Human beings and other animals and plants as well, Partner Elijah, are all living things,” said Daneel. “A robot is a human artifact, as much as this viewer is. An artifact is ‘destroyed,’ ‘damaged,’ ‘demolished,’ and so on. It is never ‘killed.’”

  “Nevertheless, Daneel, I shall say ‘killed.’ Jander Panell was killed.”

  Daneel said, “Why should a difference in a word make any difference to the thing described?”

  “‘That which we call a rose by any other name would smell as sweet.’ Is that it, Daneel?”

  Daneel paused, then said, “I am not certain what is meant by the smell of a rose, but if a rose on Earth is the common flower that is called a rose on Aurora, and if by its ‘smell’ you mean a property that can be detected, sensed, or measured by human beings, then surely calling a rose by another sound-combination–and holding all else equal–would not affect the smell or any other of its intrinsic properties.”

  “True. And yet changes in name do result in changes in perception where human beings are concerned.”

  “I do not see why, Partner Elijah.”

  “Because human beings are often illogical, Daneel. It is not an admirable characteristic.”

  Baley sank deeper into his chair and fiddled with his viewer, allowing his mind, for a few minutes, to retreat into private thought. The discussion with Daneel was useful in itself, for while Baley played with the question of words, he managed to forget that he was in space, to forget that the ship was moving forward until it was far enough from the mass centers of the Solar System to make the Jump through hyperspace; to forget that he would soon be several million kilometers from Earth and, not long after that, several light-years from Earth.

  More important, there were positive conclusions to be drawn. It was clear that Daneel’s talk about Aurorans making no distinction between robots and human beings was misleading. The Aurorans might virtuously remove the initial “R.,” the use of “boy” as a form of address, and the use of “it” as the customary pronoun, but from Daneel’s resistance to the use of the same word for the violent ends of a robot and of a human being (a resistance inherent in his programming which was, in turn, the natural consequence of Auroran assumptions about how Daneel ought to behave) one had to conclude that these were merely superficial changes. In essence, Aurorans were as firm as Earthmen in their belief that robots were machines that were infinitely inferior to human beings.

  That meant that his formidable task of finding a useful resolution of the crisis (if that were possible at all) would not be hampered by at least one particular misperception of Auroran society.

  Baley wondered if he ought to question Giskard, in order to confirm the conclusions he reached from his conversation with Daneel–and, without much hesitation, decided not to. Giskard’s simple and rather unsubtle mind would be of no use. He would “Yes, sir” and “No, sir” to the end. It would be like questioning a recording.

  Well, then, Baley decided, he would continue with Daneel, who was at least capable of responding with something approaching subtlety.

  He said, “Daneel, let us consider the case of Jander Panell, which I assume, from what you have said so far, is the first case of roboticide in the history of Aurora. The human being responsible–the killer–is, I take it, not known.”

  “If,” said Daneel, “one assumes that a human being was responsible, then his identity is not known. In that, you are right, Partner Elijah.”

  “What about the motive? Why was Jander Panell killed?”

  “That, too, is not known.”

  “But Jander Panell was a humaniform robot, one like yourself and not one like, for instance, R. Gis–I mean, Giskard.”

  “That is so. Jander was a humaniform robot like myself.”

  “Might it not be, then, that no case of roboticide was intended?”

  “I do not understand, Partner Elijah.”

  Baley said, a little impatiently, “Might not the killer have thought this Jander was a human being, that the intention was homicide, not roboticide?”

  Slowly, Daneel shook his head. “Humaniform robots are quite like human beings in appearance, Partner Elijah, down to the hairs and pores in our skin. Our voices are thoroughly natural, we can go through the motions of eating, and so on. And yet, in our behavior there are noticeable differences. There may be fewer such differences with time and with refinement of technique, but as yet they are many. You–and other Earthmen not used to humaniform robots–may not easily note these differences, but Aurorans would. No Auroran would mistake Jander–or me–for a human being, not for a moment.”

  “Might some Spacer, other than an Auroran, make the mistake?”

  Daneel hesitated. “I do not think so. I do not speak from personal observation or from direct programmed knowledge, but I do have the programming to know that all the Spacer worlds are as intimately acquainted with robots as Aurora is–some, like Solaria, even more so–and I deduce, therefore, that no Spacer would miss the distinction between human and robot.”

  “Are there humaniform robots on the other Spacer worlds?”

  “No, Partner Elijah, they exist only on Aurora so far.”

  “Then other Spacers would not be intimately acquainted with humaniform robots and might well miss the distinctions and mistake them for human beings.”

  “I do not think that is likely. Even humaniform robots will behave in robotic fashion in certain definite ways that any Spacer would recognize.”

  “And yet surely there are Spacers who are not as intelligent as most, not as experienced, not as mature. There are Spacer children, if nothing else, who would miss the distinction.”

  “It is quite certain, Partner Elijah, that the–roboticide–was not committed by anyone unintelligent, inexperienced, or young. Completely certain.”

  “We’re making eliminations. Good. If no Spacer would miss the distinction, what about an Earthman? Is it possible that–”

  “Partner Elijah, when you arrive in Aurora, you will be the first Earthman to set foot on the planet since the period of original settlement was over. All Aurorans now alive were born on Aurora or, in a relatively few cases, on other Spacer worlds.”

  “The first Earthman,” muttered Baley. “I am honored. Might not an Earthman be present on Aurora without the knowledge of Aurorans?”

  “No!” said Daneel with simple certainty.

  “Your knowledge, Daneel, might not be absolute.”

  “No!” came the repetition, in tones precisely similar to the first.

  “We conclude, then,” said Baley with a shrug, “that the roboticide was intended to be roboticide and nothing else.”

 
