Asimov's Future History Volume 1


by Isaac Asimov


  “The labor unions are naturally against us, but surely we may expect cooperation from the large universities. The robot, Easy, will help you by relieving you of scholastic drudgery – by assuming, if you permit it, the role of galley slave for you. Other universities and research institutions will follow your lead, and if it works out, then perhaps other robots of other types may be placed and the public’s objections to them broken down by stages.”

  Minott murmured, “Today Northeastern University, tomorrow the world.”

  Angrily, Lanning whispered to Susan Calvin, “I wasn’t nearly that eloquent and they weren’t nearly that reluctant. At a thousand a year, they were jumping to get Easy. Professor Minott told me he’d never seen as beautiful a job as that graph he was holding and there was no mistake on the galley or anywhere else. Hart admitted it freely.”

  The severe vertical lines on Dr. Calvin’s face did not soften. “You should have demanded more money than they could pay, Alfred, and let them beat you down.”

  “Maybe,” he grumbled.

  Prosecution was not quite done with Professor Hart. “After Dr. Lanning left, did you vote on whether to accept Robot EZ-27?”

  “Yes, we did.”

  “With what result?”

  “In favor of acceptance, by majority vote.”

  “What would you say influenced the vote?” Defense objected immediately.

  Prosecution rephrased the question. “What influenced you, personally, in your individual vote? You did vote in favor, I think.”

  “I voted in favor, yes. I did so largely because I was impressed by Dr. Lanning’s feeling that it was our duty as members of the world’s intellectual leadership to allow robotics to help Man in the solution of his problems.”

  “In other words, Dr. Lanning talked you into it.”

  “That’s his job. He did it very well.”

  “Your witness.”

  Defense strode up to the witness chair and surveyed Professor Hart for a long moment. He said, “In reality, you were all pretty eager to have Robot EZ-27 in your employ, weren’t you?”

  “We thought that if it could do the work, it might be useful.”

  “If it could do the work? I understand you examined the samples of Robot EZ-27’s original work with particular care on the day of the meeting which you have just described.”

  “Yes, I did. Since the machine’s work dealt primarily with the handling of the English language, and since that is my field of competence, it seemed logical that I be the one chosen to examine the work.”

  “Very good. Was there anything on display on the table at the time of the meeting which was less than satisfactory? I have all the material here as exhibits. Can you point to a single unsatisfactory item?”

  “Well –”

  “It’s a simple question. Was there one single solitary unsatisfactory item? You inspected it. Was there?”

  The English professor frowned. “There wasn’t.”

  “I also have some samples of work done by Robot EZ-27 during the course of his fourteen-month employ at Northeastern. Would you examine these and tell me if there is anything wrong with them in even one particular?”

  Hart snapped, “When he did make a mistake, it was a beauty.”

  “Answer my question,” thundered Defense, “and only the question I am putting to you! Is there anything wrong with the material?”

  Dean Hart looked cautiously at each item. “Well, nothing.”

  “Barring the matter concerning which we are here engaged, do you know of any mistake on the part of EZ-27?”

  “Barring the matter for which this trial is being held, no.”

  Defense cleared his throat as though to signal end of paragraph. He said, “Now about the vote concerning whether Robot EZ-27 was to be employed or not. You said there was a majority in favor. What was the actual vote?”

  “Thirteen to one, as I remember.”

  “Thirteen to one! More than just a majority, wouldn’t you say?”

  “No, sir!” All the pedant in Dean Hart was aroused. “In the English language, the word ‘majority’ means ‘more than half.’ Thirteen out of fourteen is a majority, nothing more.”

  “But an almost unanimous one.”

  “A majority all the same!”

  Defense switched ground. “And who was the lone holdout?”

  Dean Hart looked acutely uncomfortable. “Professor Simon Ninheimer.”

  Defense pretended astonishment. “Professor Ninheimer? The head of the Department of Sociology?”

  “Yes, sir.”

  “The plaintiff?”

  “Yes, sir.”

  Defense pursed his lips. “In other words, it turns out that the man bringing the action for payment of $750,000 damages against my client, United States Robots and Mechanical Men Corporation, was the one who from the beginning opposed the use of the robot – although everyone else on the Executive Committee of the University Senate was persuaded that it was a good idea.”

  “He voted against the motion, as was his right.”

  “You didn’t mention in your description of the meeting any remarks made by Professor Ninheimer. Did he make any?”

  “I think he spoke.”

  “You think?”

  “Well, he did speak.”

  “Against using the robot?”

  “Yes.”

  “Was he violent about it?”

  Dean Hart paused. “He was vehement.”

  Defense grew confidential. “How long have you known Professor Ninheimer, Dean Hart?”

  “About twelve years.”

  “Reasonably well?”

  “I should say so, yes.”

  “Knowing him, then, would you say he was the kind of man who might continue to bear resentment against a robot, all the more so because an adverse vote had –”

  Prosecution drowned out the remainder of the question with an indignant and vehement objection of his own. Defense motioned the witness down and Justice Shane called luncheon recess.

  Robertson mangled his sandwich. The corporation would not founder for loss of three-quarters of a million, but the loss would do it no particular good. He was conscious, moreover, that there would be a much more costly long-term setback in public relations.

  He said sourly, “Why all this business about how Easy got into the university? What do they hope to gain?”

  The Attorney for Defense said quietly, “A court action is like a chess game, Mr. Robertson. The winner is usually the one who can see more moves ahead, and my friend at the prosecutor’s table is no beginner. They can show damage; that’s no problem. Their main effort lies in anticipating our defense. They must be counting on us to try to show that Easy couldn’t possibly have committed the offense – because of the Laws of Robotics.”

  “All right,” said Robertson, “that is our defense. An absolutely airtight one.”

  “To a robotics engineer. Not necessarily to a judge. They’re setting up a position from which they can demonstrate that EZ-27 was no ordinary robot. It was the first of its type to be offered to the public. It was an experimental model that needed field-testing and the university was the only decent way to provide such testing. That would look plausible in the light of Dr. Lanning’s strong efforts to place the robot and the willingness of U. S. Robots to lease it for so little. The prosecution would then argue that the field-test proved Easy to have been a failure. Now do you see the purpose of what’s been going on?”

  “But EZ-27 was a perfectly good model,” argued Robertson. “It was the twenty-seventh in production.”

  “Which is really a bad point,” said Defense somberly. “What was wrong with the first twenty-six? Obviously something. Why shouldn’t there be something wrong with the twenty-seventh, too?”

  “There was nothing wrong with the first twenty-six except that they weren’t complex enough for the task. These were the first positronic brains of the sort to be constructed and it was rather hit-and-miss to begin with. But the Three Laws held in all of them! No robot is so imperfect that the Three Laws don’t hold.”

  “Dr. Lanning has explained this to me, Mr. Robertson, and I am willing to take his word for it. The judge, however, may not be. We are expecting a decision from an honest and intelligent man who knows no robotics and thus may be led astray. For instance, if you or Dr. Lanning or Dr. Calvin were to say on the stand that any positronic brains were constructed ‘hit-and-miss,’ as you just did, prosecution would tear you apart in cross-examination. Nothing would salvage our case. So that’s something to avoid.”

  Robertson growled, “If only Easy would talk.”

  Defense shrugged. “A robot is incompetent as a witness, so that would do us no good.”

  “At least we’d know some of the facts. We’d know how it came to do such a thing.”

  Susan Calvin fired up. A dullish red touched her cheeks and her voice had a trace of warmth in it. “We know how Easy came to do it. It was ordered to! I’ve explained this to counsel and I’ll explain it to you now.”

  “Ordered to by whom?” asked Robertson in honest astonishment. (No one ever told him anything, he thought resentfully. These research people considered themselves the owners of U. S. Robots, by God!)

  “By the plaintiff,” said Dr. Calvin.

  “In heaven’s name, why?”

  “I don’t know why yet. Perhaps just that we might be sued, that he might gain some cash.” There were blue glints in her eyes as she said that.

  “Then why doesn’t Easy say so?”

  “Isn’t that obvious? It’s been ordered to keep quiet about the matter.”

  “Why should that be obvious?” demanded Robertson truculently.

  “Well, it’s obvious to me. Robot psychology is my profession. If Easy will not answer questions about the matter directly, he will answer questions on the fringe of the matter. By measuring increased hesitation in his answers as the central question is approached, by measuring the area of blankness and the intensity of counterpotentials set up, it is possible to tell with scientific precision that his troubles are the result of an order not to talk, with its strength based on First Law. In other words, he’s been told that if he talks, harm will be done a human being. Presumably harm to the unspeakable Professor Ninheimer, the plaintiff, who, to the robot, would seem a human being.”

  “Well, then,” said Robertson, “can’t you explain that if he keeps quiet, harm will be done to U. S. Robots?”

  “U. S. Robots is not a human being and the First Law of Robotics does not recognize a corporation as a person the way ordinary laws do. Besides, it would be dangerous to try to lift this particular sort of inhibition. The person who laid it on could lift it off least dangerously, because the robot’s motivations in that respect are centered on that person. Any other course –” She shook her head and grew almost impassioned. “I won’t let the robot be damaged!”

  Lanning interrupted with the air of bringing sanity to the problem. “It seems to me that we have only to prove a robot incapable of the act of which Easy is accused. We can do that.”

  “Exactly,” said Defense, in annoyance. “You can do that. The only witnesses capable of testifying to Easy’s condition and to the nature of Easy’s state of mind are employees of U. S. Robots. The judge can’t possibly accept their testimony as unprejudiced.”

  “How can he deny expert testimony?”

  “By refusing to be convinced by it. That’s his right as the judge. Against the alternative that a man like Professor Ninheimer deliberately set about ruining his own reputation, even for a sizable sum of money, the judge isn’t going to accept the technicalities of your engineers. The judge is a man, after all. If he has to choose between a man doing an impossible thing and a robot doing an impossible thing, he’s quite likely to decide in favor of the man.”

  “A man can do an impossible thing,” said Lanning, “because we don’t know all the complexities of the human mind and we don’t know what, in a given human mind, is impossible and what is not. We do know what is really impossible to a robot.”

  “Well, we’ll see if we can’t convince the judge of that,” Defense replied wearily.

  “If all you say is so,” rumbled Robertson, “I don’t see how you can.”

  “We’ll see. It’s good to know and be aware of the difficulties involved, but let’s not be too downhearted. I’ve tried to look ahead a few moves in the chess game, too.” With a stately nod in the direction of the robopsychologist, he added, “With the help of the good lady here.”

  Lanning looked from one to the other and said, “What the devil is this?”

  But the bailiff thrust his head into the room and announced somewhat breathlessly that the trial was about to resume.

  They took their seats, examining the man who had started all the trouble.

  Simon Ninheimer owned a fluffy head of sandy hair, a face that narrowed past a beaked nose toward a pointed chin, and a habit of sometimes hesitating before key words in his conversation that gave him an air of a seeker after an almost unbearable precision. When he said, “The Sun rises in the – uh – east,” one was certain he had given due consideration to the possibility that it might at some time rise in the west.

  Prosecution said, “Did you oppose employment of Robot EZ-27 by the university?”

  “I did, sir.”

  “Why was that?”

  “I did not feel that we understood the – uh – motives of U. S. Robots thoroughly. I mistrusted their anxiety to place the robot with us.”

  “Did you feel that it was capable of doing the work that it was allegedly designed to do?”

  “I know for a fact that it was not.”

  “Would you state your reasons?”

  Simon Ninheimer’s book, entitled Social Tensions Involved in Space-Flight and Their Resolution, had been eight years in the making. Ninheimer’s search for precision was not confined to his habits of speech, and in a subject like sociology, almost inherently imprecise, it left him breathless.

  Even with the material in galley proofs, he felt no sense of completion. Rather the reverse, in fact. Staring at the long strips of print, he felt only the itch to tear the lines of type apart and rearrange them differently.

  Jim Baker, Instructor and soon to be Assistant Professor of Sociology, found Ninheimer, three days after the first batch of galleys had arrived from the printer, staring at the handful of paper in abstraction. The galleys came in three copies: one for Ninheimer to proofread, one for Baker to proofread independently, and a third, marked “Original,” which was to receive the final corrections, a combination of those made by Ninheimer and by Baker, after a conference at which possible conflicts and disagreements were ironed out. This had been their policy on the several papers on which they had collaborated in the past three years and it worked well.

  Baker, young and ingratiatingly soft-voiced, had his own copies of the galleys in his hand. He said eagerly, “I’ve done the first chapter and it contains some typographical beauts.”

  “The first chapter always has them,” said Ninheimer distantly.

  “Do you want to go over it now?”

  Ninheimer brought his eyes to grave focus on Baker. “I haven’t done anything on the galleys, Jim. I don’t think I’ll bother.”

  Baker looked confused. “Not bother?”

  Ninheimer pursed his lips. “I’ve asked about the – uh – workload of the machine. After all, he was originally – uh – promoted as a proofreader. They’ve set a schedule.”

  “The machine? You mean Easy?”

  “I believe that is the foolish name they gave it.”

  “But, Dr. Ninheimer, I thought you were staying clear of it.”

  “I seem to be the only one doing so. Perhaps I ought to take my share of the – uh – advantage.”

  “Oh. Well, I seem to have wasted time on this first chapter, then,” said the younger man ruefully.

  “Not wasted. We can compare the machine’s result with yours as a check.”

  “If you want to, but –”

  “Yes?”

  “I doubt that we’ll find anything wrong with Easy’s work. It’s supposed never to have made a mistake.”

  “I dare say,” said Ninheimer dryly.

  The first chapter was brought in again by Baker four days later. This time it was Ninheimer’s copy, fresh from the special annex that had been built to house Easy and the equipment it used.

  Baker was jubilant. “Dr. Ninheimer, it not only caught everything I caught – it found a dozen errors I missed! The whole thing took it twelve minutes!”

  Ninheimer looked over the sheaf, with the neatly printed marks and symbols in the margins. He said, “It is not as complete as you and I would have made it. We would have entered an insert on Suzuki’s work on the neurological effects of low gravity.”

  “You mean his paper in Sociological Reviews?”

  “Of course.”

  “Well, you can’t expect impossibilities of Easy. It can’t read the literature for us.”

  “I realize that. As a matter of fact, I have prepared the insert. I will see the machine and make certain it knows how to – uh – handle inserts.”

  “It will know.”

  “I prefer to make certain.”

  Ninheimer had to make an appointment to see Easy, and then could get nothing better than fifteen minutes in the late evening.

  But the fifteen minutes turned out to be ample. Robot EZ-27 understood the matter of inserts at once.

  Ninheimer found himself uncomfortable at close quarters with the robot for the first time. Almost automatically, as though it were human, he found himself asking, “Are you happy with your work?”

  “Most happy, Professor Ninheimer,” said Easy solemnly, the photocells that were its eyes gleaming their normal deep red.

  “You know me?”

  “From the fact that you present me with additional material to include in the galleys, it follows that you are the author. The author’s name, of course, is at the head of each sheet of galley proof.”

 
