Caliban
(Caliban - 1)
Isaac Asimov, Roger MacBride Allen
To five wondrous creatures, named in the order of their appearance on this planet:
Aaron
Victoria
Benton
Jonathan
and Meredith
Acknowledgments
This book would not have been possible without the support, and especially the patience, of David Harris, John Betancourt, Byron Preiss, Susan Allison, Ginjer Buchanan, and Peter Heck. There was many a slip between the cup and the lip, but thanks to their collective efforts, never a drop of the good stuff was lost. The book stands as proof once again that every writer needs at least one editor, and sometimes five or six is no bad idea. Thanks are also due to Thomas B. Allen and Eleanore Fox, neither of whom had time to read the manuscript, and both of whom did.
I. A Robot May Not Injure a Human Being, or, Through Inaction, Allow a Human Being to Come to Harm.
II. A Robot Must Obey the Orders Given It by Human Beings Except Where Such Orders Would Conflict with the First Law.
III. A Robot Must Protect Its Own Existence As Long As Such Protection Does Not Conflict with the First or Second Law.
…THE Spacer-Settler struggle was at its beginning, and at its end, an ideological contest. Indeed, to take a page from primitive studies, it might more accurately be termed a theological battle, for both sides clung to their positions more out of faith, fear, and tradition than through any carefully reasoned marshaling of the facts.
Always, whether acknowledged or not, there was one issue at the center of every confrontation between the two sides: robots. One side regarded them as the ultimate good, while the other saw them as the ultimate evil.
Spacers were the descendants of men and women who had fled semi-mythical Earth with their robots when robots were banned there. The exiles traveled in crude starships on the first wave of colonization. With the aid of their robots, the Spacers terraformed fifty worlds and created a culture of great beauty and refinement, where all unpleasant tasks were left to the robots. Ultimately, virtually all work was left to the robots. Having colonized fifty planets, the Spacers called a halt, and set themselves no other task than enjoying the fruits of their robots’ labor.
The Settlers were the descendants of those who stayed behind on Earth. Their ancestors lived in great underground Cities, built to be safe from atomic attack. It is beyond doubt that this way of life induced a certain xenophobia into Settler culture. That xenophobia long survived the threat of atomic war, and came to be directed against the smug Spacers—and their robots.
It was fear that had caused Earth to cast out robots in the first place. Part of it was an irrational fear of metal monsters wandering the landscape. However, the people of Earth had more reasonable fears as well. They worried that robots would take jobs—and the means of making a living—from humans. Most seriously, they looked to what they saw as the indolence, the lethargy and decadence of Spacer society. The Settlers feared that robots would relieve humanity of its spirit, its will, its ambition, even as they relieved humanity of its burdens.
The Spacers, meanwhile, had grown disdainful of the people they perceived to be grubby underground dwellers. Spacers came to deny their own common ancestry with the people who had cast them out. But so, too, did they lose their own ambition. Their technology, their culture, their worldview, all became static, if not stagnant. The Spacer ideal seemed to be a universe where nothing ever happened, where yesterday and tomorrow were like today, and the robots took care of all the unpleasant details.
The Settlers set out to colonize the galaxy in earnest, terraforming endless worlds, leapfrogging past the Spacer worlds and Spacer technology. The Settlers carried with them the traditional viewpoints of the home world. Every encounter with the Spacers seemed to confirm the Settlers’ reasons for distrusting robots. Fear and hatred of robots became one of the foundations of Settler policy and philosophy. Robot hatred, coupled with the rather arrogant Spacer style, did little to endear Settler to Spacer.
But still, sometimes, somehow, the two sides managed to cooperate, however great the degree of friction and suspicion. People of goodwill on both sides attempted to cast aside fear and hatred to work together—with varying success.
It was on Inferno, one of the smallest, weakest, most fragile of the Spacer worlds, that Spacer and Settler made one of the boldest attempts to work together. The people of that world, who called themselves Infernals, found themselves facing two crises. Everyone knew about their ecological difficulties, though few understood their severity. Settler experts in terraforming were called in to deal with that crisis.
But it was the second crisis, the hidden crisis, that proved the greater danger. For, unbeknownst to themselves, the Infernals and the Settlers on that aptly named world were forced to face a remarkable change in the very nature of robots themselves…
—Early History of Colonization, by Sarhir Vadid, Baleyworld University Press, S.E. 1231
1
THE blow smashed into her skull.
Fredda Leving’s knees buckled. She dropped her tea mug. It fell to the floor and shattered in a splash of brown liquid. Fredda crumpled toward the ground. Her shoulder struck the floor, smashing into the broken shards of the cup. They slashed into her left shoulder and the left side of her face. Blood poured from the wounds.
She lay there, on her side, motionless, curled up in a ghoulish mockery of the fetal position.
For the briefest of moments, she regained consciousness. It might have been a split second after the attack, or two hours later; she could not say. But she saw them, there was no doubt of that. She saw the feet, the two red metallic feet, not thirty centimeters from her face. She felt fear, astonishment, confusion. But then her pain and her injury closed over her again, and she knew no more.
ROBOT CBN-001, also known as Caliban, awoke for the first time. In a world new to him, his eyes switched on to glow a deep and penetrating blue as he looked about his surroundings. He had no memory, no understanding to guide him. He knew nothing.
He looked down at himself and saw he was tall, his body metallic red. His left arm was half-raised. He was holding it straight out in front of him, his fist clenched. He flexed his elbow, opened his fist, and stared at his hand for a moment. He lowered his arm. He moved his head from side to side, seeing, hearing, thinking, with no recollection of experience to guide him. Where am I, who am I, what am I?
I am in a laboratory of some sort, I am Caliban, I am a robot. The answers came from inside him, but not from his mind. From an on-board datastore, he realized, and that knowledge likewise came from the datastore. So that is where answers come from, he concluded.
He looked down to the floor and saw a body lying on its side there, its head near his feet. It was the crumpled form of a young woman, a pool of blood growing around her head and the upper part of her body. Instantly he recognized the concepts of woman, young, blood, the answers flitting into his awareness almost before he could form the questions. Truly a remarkable device, this on-board datastore.
Who is she? Why does she lie there? What is wrong with her? He waited in vain for the answers to spring forth, but no explanation came to him. The store could not—or would not—help him with those questions. Some answers, it seemed, it would not give. Caliban knelt down, peered at the woman more closely, dipped a finger in the pool of blood. His thermocouple sensors revealed that it was already rapidly cooling, coagulating. The principle of blood clotting snapped into his mind. It should be sticky, he thought, and tested the notion, pressing his forefinger to his thumb and then pulling them apart. Yes, a slight resistance.
But blood, and an injured human. A strange sensation stole over him, as he knew there was some reaction, some intense, deep-rooted response that he should have—some response that was not there at all.
The blood was pooling around Caliban’s feet now. He rose to his full two-meter height again and found that he did not desire to stand in a pool of blood. He wished to leave this place for more pleasant surroundings. He stepped clear of the blood and saw an open doorway at the far end of the room. He had no goal, no purpose, no understanding, no memory. One direction was as good as another. Once he started moving, there was no reason to stop.
Caliban left the laboratory, wholly and utterly unaware that he was leaving a trail of bloody footprints behind. He went through the doorway and kept on going, out of the room, out of the building, out into the city.
SHERIFF’S Robot Donald DNL-111 surveyed the blood-splattered floor, grimly aware that, on all the Spacer worlds, only in the city of Hades on the planet of Inferno could a scene of such violence be reduced to a matter of routine.
But Inferno was different, which was of course the problem in the first place.
Here on Inferno it was happening more and more often. One human would attack another at night—it was nearly always night—and flee. A robot—it was nearly always a robot—would come across the crime scene and report it, then suffer a major cognitive dissonance breakdown, unable to cope with the direct, vivid, horrifying evidence of violence against a human being. Then the med-robots would rush in. The Sheriff’s dispatch center would summon Donald, the Sheriff’s personal robot, to the scene. If Donald judged the situation warranted Kresh’s attention, Donald instructed the household robot to waken Sheriff Alvar Kresh and suggest that he join Donald at the scene.
Tonight the dismal ritual would be played out in full. This attack, beyond question, required that the Sheriff investigate personally. The victim, after all, was Fredda Leving. Kresh would have to be summoned.
And so some other, subordinate robot would waken Kresh, dress him, and send him on his way here. That was unfortunate, as Kresh seemed to feel Donald was the only one who could do it properly. And when Alvar Kresh woke in a bad mood, he often flew his own aircar in order to work off his tension. Donald did not like the idea of his master flying himself in any circumstances. But the thought of Alvar Kresh in an evil mood, half-asleep, flying at night, was especially unpleasant.
But there was nothing Donald could do about all that, and a great deal to be done here. Donald was a short, almost rotund robot, painted a metallic shade of the Sheriff’s Department’s sky-blue and carefully designed to be an inconspicuous presence, the sort of robot that could not possibly disturb or upset or intimidate anyone. People responded better to an inquisitive police robot if it was not obtrusive. Donald’s head and body were rounded, the sides and planes of his form flowing into each other in smooth curves. His arms and legs were short, and no effort had been made to put anything more than the merest sketch of a human face on the front of his head.
He had two blue-glowing eyes, and a speaker grille for a mouth, but otherwise his head was utterly featureless, expressionless.
Which was perhaps just as well, for had his face been mobile enough to do so, he would have been hard-pressed to formulate an expression appropriate to his reaction now. Donald was a police robot, relatively hardened to the idea of someone harming a human, but even he was having a great deal of trouble dealing with this attack. He had not seen one this bad in a while. And he had never been in the position of knowing the victim. And it was, after all, Fredda Leving herself who had built Donald, who had named him. Donald found that personal acquaintance with the victim only made his First Law tensions worse.
Fredda Leving was crumpled on the floor, her head in a pool of her own blood, two trails of bloody footprints leading from the scene in different directions, out two of the four doors to the room. There were no footprints leading in.
“Sir—sir—sir?” The robotic voice was raspy and rather crudely mechanical, spoken aloud rather than via hyperwave. Donald turned and looked at the speaker. It was the maintenance robot that had hyperwaved in the alarm.
“Yes, what is it?”
“Will she—will she—will she be all—all right right?” Donald looked down at the small tan robot. It was a DAA-BOR unit, not more than a meter and a half high. The word-stutter in its speech told Donald what he knew already. Before very much longer, this little robot was likely to be good for little more than the scrap heap, a victim of First Law dissonance.
Theory had it that a robot on the scene should be able to provide first aid, with the medical dispatch center ready to transmit any specialized medical knowledge that might be needed. But a serious head injury, with all the potential for brain damage, made that impossible. Even leaving aside the question of having surgical equipment in hand, this maintenance robot did not have the brain capacity, the fine motor skills, or the visual acuity needed to diagnose a head wound. The maintenance robot must have been caught in a classic First Law trap, knowing that Fredda Leving was badly injured, but knowing that any inexpert attempt to aid her could well injure her further. Caught between the injunction to do no harm and the command not to allow harm through inaction, the DAA-BOR’s positronic brain must have been severely damaged as it oscillated back and forth between the demands for action and inaction.
“I believe that the medical robots have the situation well in hand, Daabor 5132,” Donald replied. Perhaps some encouraging words from an authority figure like a high-end police robot might do some good, help stabilize the cognitive dissonance that was clearly disabling this robot. “I am certain that your prompt call for assistance helped to save her life. If you had not acted as you did, the medical team might well not have arrived in time.”
“Thank—thank—thank you, sir. That is good to know.”
“One thing puzzles me, however. Tell me, friend—where are all the other robots? Why are you the only one here? Where are the staff robots, and Madame Leving’s personal robot?”
“Ordered—ordered away,” the little robot answered, still struggling to get its speech under greater control. “Others ordered to leave area earlier in evening. They are in—are in the other wing of the laboratory. And Madame Leving does not bring a personal robot with her to work.”
Donald looked at the other robot in astonishment. Both statements were remarkable. That a leading roboticist did not keep a personal robot was incredible. No Spacer would venture out of the house without a personal robot in attendance. A citizen of Inferno would be far more likely to venture out stark naked than without a robot—and Inferno had a strong tradition of modesty, even among Spacer worlds.
But that was as nothing compared to the idea of the staff robots being ordered to leave. How could that be? And who ordered them to go? The assailant? It seemed an obvious conclusion. For the most fleeting of seconds, Donald hesitated. It was dangerous for this robot to answer such questions, given its fragile state of mind and diminished capacity. The additional conflicts between First and Second Laws could easily do irreparable harm. But no, it was necessary to ask the questions now. Daabor 5132 was likely to suffer a complete cognitive breakdown at any moment in any event, and this might be the only chance to ask. It would have been far better for a human, for Sheriff Kresh, to do the asking, but this robot could fail at any moment. Donald resolved to take the chance. “Who gave this order, friend? And how did you come to disobey that order?”
“Did not disobey! Was not present when order given. Sent—I was sent—on an errand. I came back after.”
“Then how do you know the order was given?”
“Because it was given before! Other times!”
Other times? Donald was more and more amazed. “Who gave it? What other times? Who gave the order? Why did that person give the order?”
Daabor 5132’s head jerked abruptly to one side. “Cannot say. Ordered not to tell. Ordered—we were ordered not to say we were sent away, either—but now going away caused harm to human—harm—harm—harm—”
And with a low strangling noise, Daabor 5132 froze up. Its green eyes flared bright for a moment and then went dark.
Donald stared sadly at what had been a reasoning being brief moments before. There could be no question that he had chosen rightly. Daabor 5132 would have failed within a few minutes in any event.
At least there was the hope that a skilled human roboticist could get further information out of the other staff robots.
Donald turned away from the ruined maintenance robot and directed his attention back toward the human victim on the floor, surrounded by the med-robots.
It was the sight that had destroyed the Daabor robot, but Donald knew he was, quite literally, made of sterner stuff. Fredda Leving herself had adjusted his First, Second, and Third Law potential with the express purpose of making him capable of performing police work.
Donald 111 stared at the scene before him, feeling the sort of First Law tension familiar to a sheriff’s robot: Here was a human being in pain, in danger, and yet he could not act. The med-robots were here for that, and they could aid Fredda Leving far more competently than he ever could. Donald knew that, and restrained himself, but the First Law was quite clear and emphatic: A robot may not injure a human being, or, through inaction, allow a human being to come to harm. No loopholes, no exceptions.
But to aid this human would be to interfere with the work of the med-robots, thus at least potentially bringing harm to Fredda Leving. Therefore, to do nothing was to aid her. But he was enjoined against doing nothing, and yet to aid her would be to interfere—Donald fought down the tremors inside his mind as his positronic brain dealt with the same dissonance that had destroyed Daabor 5132. Donald knew that his police-robot adjustments would see to it he survived the episode, as he had so many in the past, but that did not make it any less unpleasant.