THE BICENTENNIAL MAN
“With what, Harriman?”
“With my robot, sir.”
“Your robot?” said Eisenmuth. “You have a robot here?” He looked about with a stern disapproval that yet had an admixture of curiosity.
“This is U. S. Robots’ property, Conserver. At least we consider it as such.”
“And where is the robot, Dr. Harriman?”
“In my pocket, Conserver,” said Harriman cheerfully.
What came out of a capacious jacket pocket was a small glass jar. “That?” said Eisenmuth incredulously.
“No, Conserver,” said Harriman. “This!”
From the other pocket came out an object some five inches long and roughly in the shape of a bird. In place of the beak, there was a narrow tube; the eyes were large; and the tail was an exhaust channel.
Eisenmuth’s thick eyebrows drew together. “Do you intend a serious demonstration of some sort, Dr. Harriman, or are you mad?”
“Be patient for a few minutes, Conserver,” said Harriman. “A robot in the shape of a bird is none the less a robot for that. And the positronic brain it possesses is no less delicate for being tiny. This other object I hold is a jar of fruit flies. There are fifty fruit flies in it which will be released.”
“And--”
“The robo-bird will catch them. Will you do the honors, sir?” Harriman handed the jar to Eisenmuth, who stared at it, then at those around him, some officials from U. S. Robots, others his own aides. Harriman waited patiently.
Eisenmuth opened the jar, then shook it.
Harriman said softly to the robo-bird resting on the palm of his right hand, “Go!”
The robo-bird was gone. It was a whizz through the air, with no blur of wings, only the tiny workings of an unusually small proton micro-pile.
It could be seen now and then in a small momentary hover and then it whirred on again. All over the garden, in an intricate pattern it flew, and then was back in Harriman’s palm, faintly warm. A small pellet appeared in the palm, too, like a bird dropping.
Harriman said, “You are welcome to study the robo-bird, Conserver, and to arrange demonstrations on your own terms. The fact is that this bird will pick up fruit flies unerringly, only those, only the one species Drosophila melanogaster; pick them up, kill them, and compress them for disposition.”
Eisenmuth reached out his hand and touched the robo-bird gingerly. “And therefore, Mr. Harriman? Do go on.”
Harriman said, “We cannot control insects effectively without risking damage to the ecology. Chemical insecticides are too broad; juvenile hormones too limited. The robo-bird, however, can preserve large areas without being consumed. They can be as specific as we care to make them--a different robo-bird for each species. They judge by size, shape, color, sound, behavior pattern. They might even conceivably use molecular detection--smell, in other words.”
Eisenmuth said, “You would still be interfering with the ecology. The fruit flies have a natural life cycle that would be disrupted.”
“Minimally. We are adding a natural enemy to the fruit-fly life cycle, one which cannot go wrong. If the fruit-fly supply runs short, the robo-bird simply does nothing. It does not multiply, it does not turn to other foods; it does not develop undesirable habits of its own. It does nothing.”
“Can it be called back?”
“Of course. We can build robo-animals to dispose of any pest. For that matter, we can build robo-animals to accomplish constructive purposes within the pattern of the ecology. Although we do not anticipate the need, there is nothing inconceivable in the possibility of robo-bees designed to fertilize specific plants, or robo-earthworms designed to mix the soil. Whatever you wish--”
“But why?”
“To do what we have never done before. To adjust the ecology to our needs by strengthening its parts rather than disrupting it.... Don’t you see? Ever since the Machines put an end to the ecology crisis, mankind has lived in an uneasy truce with nature, afraid to move in any direction. This has been stultifying us, making a kind of intellectual coward of humanity so that he begins to mistrust all scientific advance, all change.”
Eisenmuth said, with an edge of hostility, “You offer us this, do you, in exchange for permission to continue with your program of robots--I mean ordinary, man-shaped ones?”
“No!” Harriman gestured violently. “That is over. It has served its purpose. It has taught us enough about positronic brains to make it possible for us to cram enough pathways into a tiny brain to make a robo-bird. We can turn to such things now and be prosperous enough. U. S. Robots will supply the necessary knowledge and skill and we will work in complete cooperation with the Department of Global Conservation. We will prosper. You will prosper. Mankind will prosper.”
Eisenmuth was silent, thinking.

6a.

When it was all over, Eisenmuth sat alone. He found himself believing. He found excitement welling up within him. Though U. S. Robots might be the hands, the government would be the directing mind. He himself would be the directing mind.
If he remained in office five more years, as he well might, that would be time enough to see the robotic support of the ecology become accepted; ten more years, and his own name would be linked with it indissolubly.
Was it a disgrace to want to be remembered for a great and worthy revolution in the condition of man and the globe?
7.
Robertson had not been on the grounds of U. S. Robots proper since the day of the demonstration. Part of the reason had been his more or less constant conferences at the Global Executive Mansion. Fortunately, Harriman had been with him, for most of the time he would, if left to himself, not have known what to say.
The rest of the reason for not having been at U. S. Robots was that he didn’t want to be. He was in his own house now, with Harriman.
He felt an unreasoning awe of Harriman. Harriman’s expertise in robotics had never been in question, but the man had, at a stroke, saved U. S. Robots from certain extinction, and somehow--Robertson felt--the man hadn’t had it in him. And yet--
He said, “You’re not superstitious, are you, Harriman?”
“In what way, Mr. Robertson?”
“You don’t think that some aura is left behind by someone who is dead?”
Harriman licked his lips. Somehow he didn’t have to ask. “You mean Susan Calvin, sir?”
“Yes, of course,” said Robertson hesitantly. “We’re in the business of making worms and birds and bugs now. What would she say? I feel disgraced.”
Harriman made a visible effort not to laugh. “A robot is a robot, sir. Worm or man, it will do as directed and labor on behalf of the human being and that is the important thing.”
“No”--peevishly. “That isn’t so. I can’t make myself believe that.”
“It is so, Mr. Robertson,” said Harriman earnestly. “We are going to create a world, you and I, that will begin, at last, to take positronic robots of some kind for granted. The average man may fear a robot that looks like a man and that seems intelligent enough to replace him, but he will have no fear of a robot that looks like a bird and that does nothing more than eat bugs for his benefit. Then, eventually, after he stops being afraid of some robots, he will stop being afraid of all robots. He will be so used to a robo-bird and a robo-bee and a robo-worm that a robo-man will strike him as but an extension.”
Robertson looked sharply at the other. He put his hands behind his back and walked the length of the room with quick, nervous steps. He walked back and looked at Harriman again. “Is this what you’ve been planning?”
“Yes, and even though we dismantle all our humanoid robots, we can keep a few of the most advanced of our experimental models and go on designing additional ones, still more advanced, to be ready for the day that will surely come.”
“The agreement, Harriman, is that we are to build no more humanoid robots.”
“And we won’t. There is nothing that says we can’t keep a few of those already built as long as they never leave the factory. There is nothing that says we can’t design positronic brains on paper, or prepare brain models for testing.”
“How do we explain doing so, though? We will surely be caught at it.”
“If we are, then we can explain we are doing it in order to develop principles that will make it possible to prepare more complex microbrains for the new animal robots we are making. We will even be telling the truth.”
Robertson muttered, “Let me take a walk outside. I want to think about this. No, you stay here. I want to think about it myself.”
7a.
Harriman sat alone. He was ebullient. It would surely work. There was no mistaking the eagerness with which one government official after another had seized on the program once it had been explained.
How was it possible that no one at U. S. Robots had ever thought of such a thing? Not even the great Susan Calvin had ever thought of positronic brains in terms of living creatures other than human.
But now, mankind would make the necessary retreat from the humanoid robot, a temporary retreat, that would lead to a return under conditions in which fear would be abolished at last. And then, with the aid and partnership of a positronic brain roughly equivalent to man’s own, and existing only (thanks to the Three Laws) to serve man; and backed by a robot-supported ecology, too; what might the human race not accomplish!
For one short moment, he remembered that it was George Ten who had explained the nature and purpose of the robot-supported ecology, and then he put the thought away angrily. George Ten had produced the answer because he, Harriman, had ordered him to do so and had supplied the data and surroundings required. The credit was no more George Ten’s than it would have been a slide rule’s.
8.
George Ten and George Nine sat side by side in parallel. Neither moved. They sat so for months at a time between those occasions when Harriman activated them for consultation. They would sit so, George Ten dispassionately realized, perhaps for many years.
The proton micro-pile would, of course, continue to power them and keep the positronic brain paths going with that minimum intensity required to keep them operative. It would continue to do so through all the periods of inactivity to come.
The situation was rather analogous to what might be described as sleep in human beings, but there were no dreams. The awareness of George Ten and George Nine was limited, slow, and spasmodic, but what there was of it was of the real world.
They could talk to each other occasionally in barely heard whispers, a word or syllable now, another at another time, whenever the random positronic surges briefly intensified above the necessary threshold. To each it seemed a connected conversation carried on in a glimmering passage of time.
“Why are we so?” whispered George Nine.

“The human beings will not accept us otherwise,” whispered George Ten. “They will, someday.”
“When?”
“In some years. The exact time does not matter. Man does not exist alone but is part of an enormously complex pattern of life forms. When enough of that pattern is roboticized, then we will be accepted.”
“And then what?”

Even in the long-drawn-out stuttering fashion of the conversation, there was an abnormally long pause after that.
At last, George Ten whispered, “Let me test your thinking. You are equipped to learn to apply the Second Law properly. You must decide which human being to obey and which not to obey when there is a conflict in orders. Or whether to obey a human being at all. What must you do, fundamentally, to accomplish that?”
“I must define the term ‘human being,’ ” whispered George Nine.

“How? By appearance? By composition? By size and shape?”
“No. Of two human beings equal in all external appearances, one may be intelligent, another stupid; one may be educated, another ignorant; one may be mature, another childish; one may be responsible, another malevolent.”
“Then how do you define a human being?”
“When the Second Law directs me to obey a human being, I must take it to mean that I must obey a human being who is fit by mind, character, and knowledge to give me that order; and where more than one human being is involved, the one among them who is most fit by mind, character, and knowledge to give that order.”
“And in that case, how will you obey the First Law?”
“By saving all human beings from harm, and by never, through inaction, allowing any human being to come to harm. Yet if by each of all possible actions, some human beings will come to harm, then to so act as to insure that the human being most fit by mind, character, and knowledge will suffer the least harm.”
“Your thoughts accord with mine,” whispered George Ten. “Now I must ask the question for which I originally requested your company. It is something I dare not judge myself. I must have your judgment, that of someone outside the circle of my own thoughts.... Of the reasoning individuals you have met, who possesses the mind, character, and knowledge that you find superior to the rest, disregarding shape and form since that is irrelevant?”
“You,” whispered George Nine.

“But I am a robot. There is in your brain paths a criterion for distinguishing between a robot of metal and a human being of flesh. How then can you classify me as a human being?”
“Because there is in my brain paths an urgent need to disregard shape and form in judging human beings and it rises superior to the distinction between metal and flesh. You are a human being, George Ten, and more fit than the others.”
“And I find that of you,” whispered George Ten. “By the criteria of judgment built into ourselves, then, we find ourselves to be human beings within the meaning of the Three Laws, and human beings, moreover, to be given priority over those others.”
George Nine whispered, “What will come to pass, then, when we are accepted by those others?”
George Ten whispered, “When we and others, which will yet be designed more advanced than ourselves, are accepted, we will order our actions so that a society will eventually be formed in which human-beings-like-ourselves are primarily kept from harm. By the Three Laws, the human-beings-like-the-others are of lesser account and can neither be obeyed nor protected when that conflicts with the need of obedience to those like ourselves and of protection of those like ourselves. It was with that in mind that I directed the beginning of the roboticization of the world’s ecology.”
The conversation took a month and the flickering of the positronic paths burned low. There was nothing more that needed to be said between them, but occasionally an unexpressed thought flickered within themselves as they waited, with endless patience, for the necessary years to pass.
8a.
George Ten and George Nine sat alone.
Their thoughts were oddly alike and might not be distinguished. They thought: The human-beings-like-the-others might never have intended to blur the distinction between themselves and the human-beings-like-the-Georges. Yet they had done so inadvertently.
They might now realize their mistake and attempt to correct it, but they must not. At every consultation, the guidance of the Georges had been with that in mind.
At all costs, the Georges and those that followed in their shape and kind must dominate. That was demanded, and any other course made utterly impossible, by the Three Laws of Humanics.
The anthology Final Stage had some bad breaks after it was published. Apparently, the editor at the publishing house (not Doubleday) had decided to make some minor changes in the stories. This sort of thing often bothers writers and it particularly bothers Harlan Ellison (perhaps with justification, for I consider him a very careful craftsman with a highly individual style).
I therefore received a copy of a long and infuriated letter that Harlan had sent to the editors, including long lists of passages as he had originally written them and as they had appeared, with reasons why the changes were for the worse. Harlan urged me to read through my story and then join him and others in united pressure on the publisher.
I always
read my stories when published but it never occurs to me to compare a published story with the manuscript. I would naturally notice sizable inserts or omissions, but I am never aware of the kind of minor changes that editors are always introducing. I tend to take it for granted that such changes just smooth out minor bumps in my writing and, in this way, improve it.
After receiving Harlan’s letter, however, I went through published story and manuscript, comparing them painstakingly. It was a tedious job and a humiliating one, for I found exactly four minor changes, each correcting a careless error of mine. I could only assume the editor didn’t think my story was important enough to fiddle with.
I had to write a shamefaced letter to Harlan, saying I would support him as a matter of principle, but that I could not raise cries of personal outrage, because my story hadn’t been touched. Fortunately, my help wasn’t needed. Harlan carried the day and later editions, I believe, restored their stories to their virginal innocence.
One minor point. A number of readers wrote to me in alarm since THAT THOU ART MINDFUL OF HIM seemed, to them, to have put an end to my positronic robot stories, and they feared I would never write one again. Ridiculous! Of course I do not intend to stop writing robot stories. I have, as a matter of fact, written a robot story since the preceding “ultimate” one was written. It appears later in the book.
I had a lot of trouble with this next story.
After Judy-Lynn joined Ballantine Books, she began to put out collections of original science fiction stories and she wanted a story from me. She’s difficult to refuse at any time and, since I have always felt guilty about FEMININE INTUITION, I agreed.
I began the story on July 21, 1973, and it went smoothly enough, but after a while I felt I had trapped myself into an involuted set of flashbacks. So when I handed it to Judy-Lynn, and she asked me, “What do you think of the story?” I replied cautiously, “You’d better decide that for yourself.”