She spoke them carefully, clearly, quoting word for word the famous bold print on page one of the “Handbook of Robotics.”
“I’ve heard of them,” said Quinn, carelessly.
“Then the matter is easy to follow,” responded the psychologist, dryly. “If Mr. Byerley breaks any of those three rules, he is not a robot. Unfortunately, this procedure works in only one direction. If he lives up to the rules, it proves nothing one way or the other.”
Quinn raised polite eyebrows, “Why not, doctor?”
“Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of a good many of the world’s ethical systems. Of course, every human being is supposed to have the instinct of self-preservation. That’s Rule Three to a robot. Also every ‘good’ human being, with a social conscience and a sense of responsibility, is supposed to defer to proper authority; to listen to his doctor, his boss, his government, his psychiatrist, his fellow man; to obey laws, to follow rules, to conform to custom – even when they interfere with his comfort or his safety. That’s Rule Two to a robot. Also, every ‘good’ human being is supposed to love others as himself, protect his fellow man, risk his life to save another. That’s Rule One to a robot. To put it simply – if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man.”
“But,” said Quinn, “you’re telling me that you can never prove him a robot.”
“I may be able to prove him not a robot.”
“That’s not the proof I want.”
“You’ll have such proof as exists. You are the only one responsible for your own wants.”
Here Lanning’s mind leaped suddenly to the sting of an idea, “Has it occurred to anyone,” he ground out, “that district attorney is a rather strange occupation for a robot? The prosecution of human beings – sentencing them to death – bringing about their infinite harm-”
Quinn grew suddenly keen, “No, you can’t get out of it that way. Being district attorney doesn’t make him human. Don’t you know his record? Don’t you know that he boasts that he has never prosecuted an innocent man; that there are scores of people left untried because the evidence against them didn’t satisfy him, even though he could probably have argued a jury into atomizing them? That happens to be so.”
Lanning’s thin cheeks quivered, “No, Quinn, no. There is nothing in the Rules of Robotics that makes any allowance for human guilt. A robot may not judge whether a human being deserves death. It is not for him to decide. He may not harm a human – variety skunk, or variety angel.”
Susan Calvin sounded tired. “Alfred,” she said, “don’t talk foolishly. What if a robot came upon a madman about to set fire to a house with people in it? He would stop the madman, wouldn’t he?”
“Of course.”
“And if the only way he could stop him was to kill him-”
There was a faint sound in Lanning’s throat. Nothing more.
“The answer to that, Alfred, is that he would do his best not to kill him. If the madman died, the robot would require psychotherapy because he might easily go mad at the conflict presented him – of having broken Rule One to adhere to Rule One in a higher sense. But a man would be dead and a robot would have killed him.”
“Well, is Byerley mad?” demanded Lanning, with all the sarcasm he could muster.
“No, but he has killed no man himself. He has exposed facts which might represent a particular human being to be dangerous to the large mass of other human beings we call society. He protects the greater number and thus adheres to Rule One at maximum potential. That is as far as he goes. It is the judge who then condemns the criminal to death or imprisonment, after the jury decides on his guilt or innocence. It is the jailer who imprisons him, the executioner who kills him. And Mr. Byerley has done nothing but determine truth and aid society.
“As a matter of fact, Mr. Quinn, I have looked into Mr. Byerley’s career since you first brought this matter to our attention. I find that he has never demanded the death sentence in his closing speeches to the jury. I also find that he has spoken on behalf of the abolition of capital punishment and contributed generously to research institutions engaged in criminal neurophysiology. He apparently believes in the cure, rather than the punishment of crime. I find that significant.”
“You do?” Quinn smiled. “Significant of a certain odor of roboticity, perhaps?”
“Perhaps. Why deny it? Actions such as his could come only from a robot, or from a very honorable and decent human being. But you see, you just can’t differentiate between a robot and the very best of humans.”
Quinn sat back in his chair. His voice quivered with impatience. “Dr. Lanning, it’s perfectly possible to create a humanoid robot that would perfectly duplicate a human in appearance, isn’t it?”
Lanning harrumphed and considered, “It’s been done experimentally by U. S. Robots,” he said reluctantly, “without the addition of a positronic brain, of course. By using human ova and hormone control, one can grow human flesh and skin over a skeleton of porous silicone plastics that would defy external examination. The eyes, the hair, the skin would be really human, not humanoid. And if you put a positronic brain, and such other gadgets as you might desire inside, you have a humanoid robot.”
Quinn said shortly, “How long would it take to make one?”
Lanning considered, “If you had all your equipment – the brain, the skeleton, the ovum, the proper hormones and radiations – say, two months.”
The politician straightened out of his chair. “Then we shall see what the insides of Mr. Byerley look like. It will mean publicity for U. S. Robots – but I gave you your chance.”
Lanning turned impatiently to Susan Calvin, when they were alone. “Why do you insist-?”
And with real feeling, she responded sharply and instantly, “Which do you want – the truth or my resignation? I won’t lie for you. U. S. Robots can take care of itself. Don’t turn coward.”
“What,” said Lanning, “if he opens up Byerley, and wheels and gears fall out what then?”
“He won’t open Byerley,” said Calvin, disdainfully. “Byerley is as clever as Quinn, at the very least.”
The news broke upon the city a week before Byerley was to have been nominated. But “broke” is the wrong word. It staggered upon the city, shambled, crawled. Laughter began, and wit was free. And as the far off hand of Quinn tightened its pressure in easy stages, the laughter grew forced, an element of hollow uncertainty entered, and people broke off to wonder.
The convention itself had the air of a restive stallion. There had been no contest planned. Only Byerley could possibly have been nominated a week earlier. There was no substitute even now. They had to nominate him, but there was complete confusion about it.
It would not have been so bad if the average individual were not torn between the enormity of the charge, if true, and its sensational folly, if false.
The day after Byerley was nominated – perfunctorily, hollowly – a newspaper finally published the gist of a long interview with Dr. Susan Calvin, “world famous expert on robopsychology and positronics.”
What broke loose is popularly and succinctly described as hell.
It was what the Fundamentalists were waiting for. They were not a political party; they made pretense to no formal religion. Essentially they were those who had not adapted themselves to what had once been called the Atomic Age, in the days when atoms were a novelty. Actually, they were the Simple-Lifers, hungering after a life, which to those who lived it had probably appeared not so Simple, and who had been, therefore, Simple-Lifers themselves.
The Fundamentalists required no new reason to detest robots and robot manufacturers; but a new reason such as the Quinn accusation and the Calvin analysis was sufficient to make such detestation audible.
The huge plants of the U. S. Robot & Mechanical Men Corporation were a hive that spawned armed guards. It prepared for war.
Within the city, the house of Stephen Byerley bristled with police.
The political campaign, of course, lost all other issues, and resembled a campaign only in that it was something filling the hiatus between nomination and election.
Stephen Byerley did not allow the fussy little man to distract him. He remained comfortably unperturbed by the uniforms in the background. Outside the house, past the line of grim guards, reporters and photographers waited according to the tradition of the caste. One enterprising ‘visor station even had a scanner focused on the blank entrance to the prosecutor’s unpretentious home, while a synthetically excited announcer filled in with inflated commentary.
The fussy little man advanced. He held forward a rich, complicated sheet. “This, Mr. Byerley, is a court order authorizing me to search these premises for the presence of illegal... uh... mechanical men or robots of any description.”
Byerley half rose, and took the paper. He glanced at it indifferently, and smiled as he handed it back. “All in order. Go ahead. Do your job. Mrs. Hoppen” – to his housekeeper, who appeared reluctantly from the next room – “please go with them, and help out if you can.”
The little man, whose name was Harroway, hesitated, produced an unmistakable blush, failed completely to catch Byerley’s eyes, and muttered, “Come on,” to the two policemen.
He was back in ten minutes.
“Through?” questioned Byerley, in just the tone of a person who is not particularly interested in the question, or its answer.
Harroway cleared his throat, made a bad start in falsetto, and began again, angrily, “Look here, Mr. Byerley, our special instructions were to search the house very thoroughly.”
“And haven’t you?”
“We were told exactly what to look for.”
“Yes?”
“In short, Mr. Byerley, and not to put too fine a point on it, we were told to search you.”
“Me?” said the prosecutor with a broadening smile. “And how do you intend to do that?”
“We have a Penet-radiation unit-”
“Then I’m to have my X-ray photograph taken, hey? You have the authority?”
“You saw my warrant.”
“May I see it again?”
Harroway, his forehead shining with considerably more than mere enthusiasm, passed it over a second time.
Byerley said evenly, “I read here as the description of what you are to search; I quote: ‘the dwelling place belonging to Stephen Allen Byerley, located at 355 Willow Grove, Evanstron, together with any garage, storehouse or other structures or buildings thereto appertaining, together with all grounds thereto appertaining’... um... and so on. Quite in order. But, my good man, it doesn’t say anything about searching my interior. I am not part of the premises. You may search my clothes if you think I’ve got a robot hidden in my pocket.”
Harroway had no doubt on the point of to whom he owed his job. He did not propose to be backward, given a chance to earn a much better – i.e., more highly paid – job.
He said, in a faint echo of bluster, “Look here. I’m allowed to search the furniture in your house, and anything else I find in it. You are in it, aren’t you?”
“A remarkable observation. I am in it. But I’m not a piece of furniture. As a citizen of adult responsibility – I have the psychiatric certificate proving that – I have certain rights under the Regional Articles. Searching me would come under the heading of violating my Right of Privacy. That paper isn’t sufficient.”
“Sure, but if you’re a robot, you don’t have Right of Privacy.”
“True enough, but that paper still isn’t sufficient. It recognizes me implicitly as a human being.”
“Where?” Harroway snatched at it.
“Where it says ‘the dwelling place belonging to’ and so on. A robot cannot own property. And you may tell your employer, Mr. Harroway, that if he tries to issue a similar paper which does not implicitly recognize me as a human being, he will be immediately faced with a restraining injunction and a civil suit which will make it necessary for him to prove me a robot by means of information now in his possession, or else to pay a whopping penalty for an attempt to deprive me unduly of my Rights under the Regional Articles. You’ll tell him that, won’t you?”
Harroway marched to the door. He turned. “You’re a slick lawyer-” His hand was in his pocket. For a short moment, he stood there. Then he left, smiled in the direction of the ‘visor scanner, still playing away – waved to the reporters, and shouted, “We’ll have something for you tomorrow, boys. No kidding.”
In his ground car, he settled back, removed the tiny mechanism from his pocket and carefully inspected it. It was the first time he had ever taken a photograph by X-ray reflection. He hoped he had done it correctly.
Quinn and Byerley had never met face-to-face alone. But visorphone was pretty close to it. In fact, accepted literally, perhaps the phrase was accurate, even if to each, the other were merely the light and dark pattern of a bank of photocells.
It was Quinn who had initiated the call. It was Quinn, who spoke first, and without particular ceremony, “Thought you would like to know, Byerley, that I intend to make public the fact that you’re wearing a protective shield against Penet-radiation.”
“That so? In that case, you’ve probably already made it public. I have a notion our enterprising press representatives have been tapping my various communication lines for quite a while. I know they have my office lines full of holes; which is why I’ve dug in at my home these last weeks.” Byerley was friendly, almost chatty.
Quinn’s lips tightened slightly, “This call is shielded – thoroughly. I’m making it at a certain personal risk.”
“So I should imagine. Nobody knows you’re behind this campaign. At least, nobody knows it officially. Nobody doesn’t know it unofficially. I wouldn’t worry. So I wear a protective shield? I suppose you found that out when your puppy dog’s Penet-radiation photograph, the other day, turned out to be overexposed.”
“You realize, Byerley, that it would be pretty obvious to everyone that you don’t dare face X-ray analysis.”
“Also that you, or your men, attempted illegal invasion of my Rights of Privacy.”
“The devil they’ll care for that.”
“They might. It’s rather symbolic of our two campaigns isn’t it? You have little concern with the rights of the individual citizen. I have great concern. I will not submit to X-ray analysis, because I wish to maintain my Rights on principle. Just as I’ll maintain the rights of others when elected.”
“That will, no doubt, make a very interesting speech, but no one will believe you. A little too high-sounding to be true. Another thing,” a sudden, crisp change, “the personnel in your home was not complete the other night.”
“In what way?”
“According to the report,” he shuffled papers before him that were just within the range of vision of the visiplate, “there was one person missing – a cripple.”
“As you say,” said Byerley, tonelessly, “a cripple. My old teacher, who lives with me and who is now in the country – and has been for two months. A ‘much-needed rest’ is the usual expression applied in the case. He has your permission?”
“Your teacher? A scientist of sorts?”
“A lawyer once – before he was a cripple. He has a government license as a research biophysicist, with a laboratory of his own, and a complete description of the work he’s doing filed with the proper authorities, to whom I can refer you. The work is minor, but is a harmless and engaging hobby for a – poor cripple. I am being as helpful as I can, you see.”
“I see. And what does this... teacher... know about robot manufacture?”
“I couldn’t judge the extent of his knowledge in a field with which I am unacquainted.”
“He wouldn’t have access to positronic brains?”
“Ask your friends at U. S. Robots. They’d be the ones to know.”
“I’ll put it shortly, Byerley. Your crippled teacher is the real Stephen Byerley. You are his robot creation. We can prove it. It was he who was in the automobile accident, not you. There will be ways of checking the records.”
“Really? Do so, then. My best wishes.”
“And we can search your so-called teacher’s ‘country place,’ and see what we can find there.”
“Well, not quite, Quinn.” Byerley smiled broadly. “Unfortunately for you, my so-called teacher is a sick man. His country place is his place of rest. His Right of Privacy as a citizen of adult responsibility is naturally even stronger, under the circumstances. You won’t be able to obtain a warrant to enter his grounds without showing just cause. However, I’d be the last to prevent you from trying.”
There was a pause of moderate length, and then Quinn leaned forward, so that his imaged face expanded and the fine lines on his forehead were visible, “Byerley, why do you carry on? You can’t be elected.”
“Can’t I?”
“Do you think you can? Do you suppose that your failure to make any attempt to disprove the robot charge – when you could easily, by breaking one of the Three Laws – does anything but convince the people that you are a robot?”
“All I see so far is that from being a rather vaguely known, but still largely obscure metropolitan lawyer, I have now become a world figure. You’re a good publicist.”
“But you are a robot.”
“So it’s been said, but not proven.”
“It’s been proven sufficiently for the electorate.”
“Then relax – you’ve won.”
“Good-by,” said Quinn, with his first touch of viciousness, and the visorphone slammed off.
“Good-by,” said Byerley imperturbably, to the blank plate.
Byerley brought his “teacher” back the week before election. The air car dropped quickly in an obscure part of the city.