Asimov's Future History Volume 1


by Isaac Asimov


  “More harmless than I am,” said Lanning. “I could be goaded into striking you. Easy could not be. You know the Three Laws of Robotics, I presume.”

  “Yes, of course,” said Goodfellow.

  “They are built into the positronic patterns of the brain and must be observed. The First Law, the prime rule of robotic existence, safeguards the life and well-being of all humans.” He paused, rubbed at his cheek, then added, “It’s something of which we would like to persuade all Earth if we could.”

  “It’s just that he seems formidable.”

  “Granted. But whatever he seems, you’ll find that he is useful.”

  “I’m not sure in what way. Our conversations were not very helpful in that respect. Still, I agreed to look at the object and I’m doing it.”

  “We’ll do more than look, Professor. Have you brought a book?”

  “I have.”

  “May I see it?”

  Professor Goodfellow reached down without actually taking his eyes off the metal-in-human-shape that confronted him. From the briefcase at his feet, he withdrew a book.

  Lanning held out his hand for it and looked at the backstrip. “Physical Chemistry of Electrolytes in Solution. Fair enough, sir. You selected this yourself, at random. It was no suggestion of mine, this particular text. Am I right?”

  “Yes.”

  Lanning passed the book to Robot EZ-27.

  The professor jumped a little. “No! That’s a valuable book!”

  Lanning raised his eyebrows and they looked like shaggy coconut icing. He said, “Easy has no intention of tearing the book in two as a feat of strength, I assure you. It can handle a book as carefully as you or I. Go ahead, Easy.”

  “Thank you, sir,” said Easy. Then, turning its metal bulk slightly, it added, “With your permission, Professor Goodfellow.”

  The professor stared, then said, “Yes – yes, of course.”

  With a slow and steady manipulation of metal fingers, Easy turned the pages of the book, glancing at the left page, then the right; turning the page, glancing left, then right; turning the page and so on for minute after minute.

  The sense of its power seemed to dwarf even the large cement-walled room in which they stood and to reduce the two human watchers to something considerably less than life-size.

  Goodfellow muttered, “The light isn’t very good.”

  “It will do.”

  Then, rather more sharply, “But what is he doing?”

  “Patience, sir.”

  The last page was turned eventually. Lanning asked, “Well, Easy?”

  The robot said, “It is a most accurate book and there is little to which I can point. On line 22 of page 27, the word ‘positive’ is spelled p-o-i-s-t-i-v-e. The comma in line 6 of page 32 is superfluous, whereas one should have been used on line 13 of page 54. The plus sign in equation XIV-2 on page 337 should be a minus sign if it is to be consistent with the previous equations –”

  “Wait! Wait!” cried the professor. “What is he doing?”

  “Doing?” echoed Lanning in sudden irascibility. “Why, man, he has already done it! He has proofread that book.”

  “Proofread it?”

  “Yes. In the short time it took him to turn those pages, he caught every mistake in spelling, grammar and punctuation. He has noted errors in word order and detected inconsistencies. And he will retain the information, letter-perfect, indefinitely.”

  The professor’s mouth was open. He walked rapidly away from Lanning and Easy and as rapidly back. He folded his arms across his chest and stared at them. Finally he said, “You mean this is a proofreading robot?”

  Lanning nodded. “Among other things.”

  “But why do you show it to me?”

  “So that you might help me persuade the university to obtain it for use.”

  “To read proof?”

  “Among other things,” Lanning repeated patiently.

  The professor drew his pinched face together in a kind of sour disbelief. “But this is ridiculous!”

  “Why?”

  “The university could never afford to buy this half-ton – it must weigh that at least – this half-ton proofreader.”

  “Proofreading is not all it will do. It will prepare reports from outlines, fill out forms, serve as an accurate memory-file, grade papers –”

  “All picayune!”

  Lanning said, “Not at all, as I can show you in a moment. But I think we can discuss this more comfortably in your office, if you have no objection.”

  “No, of course not,” began the professor mechanically and took a half-step as though to turn. Then he snapped out, “But the robot – we can’t take the robot. Really, Doctor, you’ll have to crate it up again.”

  “Time enough. We can leave Easy here.”

  “Unattended?”

  “Why not? He knows he is to stay. Professor Goodfellow, it is necessary to understand that a robot is far more reliable than a human being.”

  “I would be responsible for any damage –”

  “There will be no damage. I guarantee that. Look, it’s after hours. You expect no one here, I imagine, before tomorrow morning. The truck and my two men are outside. U. S. Robots will take any responsibility that may arise. None will. Call it a demonstration of the reliability of the robot.”

  The professor allowed himself to be led out of the storeroom. Nor did he look entirely comfortable in his own office, five stories up.

  He dabbed at the line of droplets along the upper half of his forehead with a white handkerchief.

  “As you know very well, Dr. Lanning, there are laws against the use of robots on Earth’s surface,” he pointed out.

  “The laws, Professor Goodfellow, are not simple ones. Robots may not be used on public thoroughfares or within public edifices. They may not be used on private grounds or within private structures except under certain restrictions that usually turn out to be prohibitive. The university, however, is a large and privately owned institution that usually receives preferential treatment. If the robot is used only in a specific room for only academic purposes, if certain other restrictions are observed and if the men and women having occasion to enter the room cooperate fully, we may remain within the law.”

  “But all that trouble just to read proof?”

  “The uses would be infinite, Professor. Robotic labor has so far been used only to relieve physical drudgery. Isn’t there such a thing as mental drudgery? When a professor capable of the most useful creative thought is forced to spend two weeks painfully checking the spelling of lines of print and I offer you a machine that can do it in thirty minutes, is that picayune?”

  “But the price –”

  “The price need not bother you. You cannot buy EZ-27. U. S. Robots does not sell its products. But the university can lease EZ-27 for a thousand dollars a year – considerably less than the cost of a single microwave spectrograph continuous-recording attachment.”

  Goodfellow looked stunned. Lanning followed up his advantage by saying, “I only ask that you put it up to whatever group makes the decisions here. I would be glad to speak to them if they want more information.”

  “Well,” Goodfellow said doubtfully, “I can bring it up at next week’s Senate meeting. I can’t promise that will do any good, though.”

  “Naturally,” said Lanning.

  The Defense Attorney was short and stubby and carried himself rather portentously, a stance that had the effect of accentuating his double chin. He stared at Professor Goodfellow, once that witness had been handed over, and said, “You agreed rather readily, did you not?”

  The Professor said briskly, “I suppose I was anxious to be rid of Dr. Lanning. I would have agreed to anything.”

  “With the intention of forgetting about it after he left?”

  “Well –”

  “Nevertheless, you did present the matter to a meeting of the Executive Board of the University Senate.”

  “Yes, I did.”

  “So that you agreed in good faith with Dr. Lanning’s suggestions. You weren’t just going along with a gag. You actually agreed enthusiastically, did you not?”

  “I merely followed ordinary procedures.”

  “As a matter of fact, you weren’t as upset about the robot as you now claim you were. You know the Three Laws of Robotics and you knew them at the time of your interview with Dr. Lanning.”

  “Well, yes.”

  “And you were perfectly willing to leave a robot at large and unattended.”

  “Dr. Lanning assured me –”

  “Surely you would never have accepted his assurance if you had had the slightest doubt that the robot might be in the least dangerous.”

  The professor began frigidly, “I had every faith in the word –”

  “That is all,” said Defense abruptly.

  As Professor Goodfellow, more than a bit ruffled, stood down, Justice Shane leaned forward and said, “Since I am not a robotics man myself, I would appreciate knowing precisely what the Three Laws of Robotics are. Would Dr. Lanning quote them for the benefit of the court?”

  Dr. Lanning looked startled. He had been virtually bumping heads with the gray-haired woman at his side. He rose to his feet now and the woman looked up, too – expressionlessly.

  Dr. Lanning said, “Very well, Your Honor.” He paused as though about to launch into an oration and said, with laborious clarity, “First Law: a robot may not injure a human being, or, through inaction, allow a human being to come to harm. Second Law: a robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. Third Law: a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.”

  “I see,” said the judge, taking rapid notes. “These Laws are built into every robot, are they?”

  “Into every one. That will be borne out by any roboticist.”

  “And into Robot EZ-27 specifically?”

  “Yes, Your Honor.”

  “You will probably be required to repeat those statements under oath.”

  “I am ready to do so, Your Honor.” He sat down again.

  Dr. Susan Calvin, robopsychologist-in-chief for U. S. Robots, who was the gray-haired woman sitting next to Lanning, looked at her titular superior without favor, but then she showed favor to no human being. She said, “Was Goodfellow’s testimony accurate, Alfred?”

  “Essentially,” muttered Lanning. “He wasn’t as nervous as all that about the robot and he was anxious enough to talk business with me when he heard the price. But there doesn’t seem to be any drastic distortion.”

  Dr. Calvin said thoughtfully, “It might have been wise to put the price higher than a thousand.”

  “We were anxious to place Easy.”

  “I know. Too anxious, perhaps. They’ll try to make it look as though we had an ulterior motive.”

  Lanning looked exasperated. “We did. I admitted that at the University Senate meeting.”

  “They can make it look as if we had one beyond the one we admitted.”

  Scott Robertson, son of the founder of U. S. Robots and still owner of a majority of the stock, leaned over from Dr. Calvin’s other side and said in a kind of explosive whisper, “Why can’t you get Easy to talk so we’ll know where we’re at?”

  “You know he can’t talk about it, Mr. Robertson.”

  “Make him. You’re the psychologist, Dr. Calvin. Make him.”

  “If I’m the psychologist, Mr. Robertson,” said Susan Calvin coldly, “let me make the decisions. My robot will not be made to do anything at the price of his well-being.”

  Robertson frowned and might have answered, but Justice Shane was tapping his gavel in a polite sort of way and they grudgingly fell silent.

  Francis J. Hart, head of the Department of English and Dean of Graduate Studies, was on the stand. He was a plump man, meticulously dressed in dark clothing of a conservative cut, and possessing several strands of hair traversing the pink top of his cranium. He sat well back in the witness chair with his hands folded neatly in his lap and displaying, from time to time, a tight-lipped smile.

  He said, “My first connection with the matter of the Robot EZ-27 was on the occasion of the session of the University Senate Executive Committee at which the subject was introduced by Professor Goodfellow. Thereafter, on the tenth of April of last year, we held a special meeting on the subject, during which I was in the chair.”

  “Were minutes kept of the meeting of the Executive Committee? Of the special meeting, that is?”

  “Well, no. It was a rather unusual meeting.” The dean smiled briefly. “We thought it might remain confidential.”

  “What transpired at the meeting?”

  Dean Hart was not entirely comfortable as chairman of that meeting. Nor did the other members assembled seem completely calm. Only Dr. Lanning appeared at peace with himself. His tall, gaunt figure and the shock of white hair that crowned him reminded Hart of portraits he had seen of Andrew Jackson.

  Samples of the robot’s work lay scattered along the central regions of the table and the reproduction of a graph drawn by the robot was now in the hands of Professor Minott of Physical Chemistry. The chemist’s lips were pursed in obvious approval.

  Hart cleared his throat and said, “There seems no doubt that the robot can perform certain routine tasks with adequate competence. I have gone over these, for instance, just before coming in and there is very little to find fault with.”

  He picked up a long sheet of printing, some three times as long as the average book page. It was a sheet of galley proof, designed to be corrected by authors before the type was set up in page form. Along both of the wide margins of the galley were proofmarks, neat and superbly legible. Occasionally, a word of print was crossed out and a new word substituted in the margin in characters so fine and regular it might easily have been print itself. Some of the corrections were blue to indicate the original mistake had been the author’s, a few in red, where the printer had been wrong.

  “Actually,” said Lanning, “there is less than very little to find fault with. I should say there is nothing at all to find fault with, Dr. Hart. I’m sure the corrections are perfect, insofar as the original manuscript was. If the manuscript against which this galley was corrected was at fault in a matter of fact rather than of English, the robot is not competent to correct it.”

  “We accept that. However, the robot corrected word order on occasion and I don’t think the rules of English are sufficiently hidebound for us to be sure that in each case the robot’s choice was the correct one.”

  “Easy’s positronic brain,” said Lanning, showing large teeth as he smiled, “has been molded by the contents of all the standard works on the subject. I’m sure you cannot point to a case where the robot’s choice was definitely the incorrect one.”

  Professor Minott looked up from the graph he still held. “The question in my mind, Dr. Lanning, is why we need a robot at all, with all the difficulties in public relations that would entail. The science of automation has surely reached the point where your company could design a machine, an ordinary computer of a type known and accepted by the public, that would correct galleys.”

  “I am sure we could,” said Lanning stiffly, “but such a machine would require that the galleys be translated into special symbols or, at the least, transcribed on tapes. Any corrections would emerge in symbols. You would need to keep men employed translating words to symbols, symbols to words. Furthermore, such a computer could do no other job. It couldn’t prepare the graph you hold in your hand, for instance.”

  Minott grunted.

  Lanning went on. “The hallmark of the positronic robot is its flexibility. It can do a number of jobs. It is designed like a man so that it can use all the tools and machines that have, after all, been designed to be used by a man. It can talk to you and you can talk to it. You can actually reason with it up to a point. Compared to even a simple robot, an ordinary computer with a non-positronic brain is only a heavy adding machine.”

  Goodfellow looked up and said, “If we all talk and reason with the robot, what are the chances of our confusing it? I suppose it doesn’t have the capability of absorbing an infinite amount of data.”

  “No, it hasn’t. But it should last five years with ordinary use. It will know when it will require clearing, and the company will do the job without charge.”

  “The company will?”

  “Yes. The company reserves the right to service the robot outside the ordinary course of its duties. It is one reason we retain control of our positronic robots and lease rather than sell them. In the pursuit of its ordinary functions, any robot can be directed by any man. Outside its ordinary functions, a robot requires expert handling, and that we can give it. For instance, any of you might clear an EZ robot to an extent by telling it to forget this item or that. But you would be almost certain to phrase the order in such a way as to cause it to forget too much or too little. We would detect such tampering, because we have built-in safeguards. However, since there is no need for clearing the robot in its ordinary work, or for doing other useless things, this raises no problem.”

  Dean Hart touched his head as though to make sure his carefully cultivated strands lay evenly distributed and said, “You are anxious to have us take the machine. Yet surely it is a losing proposition for U. S. Robots. One thousand a year is a ridiculously low price. Is it that you hope through this to rent other such machines to other universities at a more reasonable price?”

  “Certainly that’s a fair hope,” said Lanning.

  “But even so, the number of machines you could rent would be limited. I doubt if you could make it a paying proposition.”

  Lanning put his elbows on the table and earnestly leaned forward. “Let me put it bluntly, gentlemen. Robots cannot be used on Earth, except in certain special cases, because of prejudice against them on the part of the public. U. S. Robots is a highly successful corporation with our extraterrestrial and spaceflight markets alone, to say nothing of our computer subsidiaries. However, we are concerned with more than profits alone. It is our firm belief that the use of robots on Earth itself would mean a better life for all eventually, even if a certain amount of economic dislocation resulted at first.

 
