by Isaac Asimov
Caliban stood over two meters tall, his body metallic red in color, his eyes a penetrating glowing blue. His appearance was striking, even intimidating, but far less so than his reputation. Caliban the Lawless, they still called him, sometimes.
Caliban, the robot accused, but cleared, of attempting the murder of his creator – of Fredda Leving herself.
Prospero regarded his companion for a moment before he replied. “The need for discretion,” he said. “Yes, I have heard that answer before. But I am far from sure that I know it is the true answer.”
“And what purpose would it serve for me to lie to you?” Caliban asked. For a Three-Law robot, the very idea of lying would be difficult to imagine, but Caliban was a No Law robot, and, in theory at least, just as able to lie as any human.
“Perhaps you would have no purpose in lying,” Prospero said, looking back toward Fredda. “But others might well have reasons to deceive you.”
“You are not at your most tactful today,” said Fredda. “And I must confess I don’t see why our perfectly true answers should not satisfy you. Nor can I see what motive I would have for lying to you and Caliban.”
“I might add that I do not understand your motive for offending our principal benefactor,” said Caliban.
Prospero hesitated, and looked from one of them to the other. “My apologies,” he said at last. “There are times when my understanding of human psychology fails me, even when I am attempting to learn more. I was attempting to gauge your emotional reaction to such an accusation, Dr. Leving.”
“I would have to believe in the sincerity of the accusation before I could have much of a reaction to it,” said Fredda.
“Yes,” said Prospero. “Of course.”
But if Fredda Leving was sure of anything at that moment, she was sure that Prospero had not given her all of the story – and perhaps had not given her any of the true story. But what motive would Prospero have for playing such a strange game? It was rare indeed when she felt completely sure that she understood Prospero. She had long known he was one of her less stable creations. But he was the undisputed leader of the New Law robots. She had no real choice but to deal with him.
“In any event,” said Caliban, “it is time for us both to be leaving. I have no doubt, Dr. Leving, that we shall all meet again soon.”
“I look forward to it,” said Fredda.
The jet-black robot regarded first Fredda, and then Caliban. “Very well,” he said. “We will depart. But I doubt that I will be the first or last robot to observe that the more I know about humans, the less I understand them.”
Fredda Leving sighed wearily. There were times when it was frustrating in the extreme listening to Three-Law robots holding forth on the subject of human behavior. Prospero and the other New Laws were even worse. At least Three-Law robots were not judgmental. Prospero had an opinion about everything.
Fredda could almost imagine him as the last priest of some long-forgotten human religion, always ready to debate any intricate point of theology, so long as it was of no interest or importance to anyone at all. There were times Caliban was no better. She had designed and built both of these robots by herself. Surely she could have designed their brains so they didn’t spend their days logic-chopping. But it was too late now. “Whatever you think of my reasons for doing so,” she said, “I must ask you again to leave, by the back way. Our next appointment is in three days, is it not?”
“Yes,” said Prospero. “We have several other appointments that will take up the intervening time.”
“Fine then. Return in three days, in the afternoon, and we will conclude our business.”
Caliban nodded his head toward her, in what was almost a bow. “Very well,” he said in a most courteous tone. “We will see you at that time.”
Prospero took no interest in courtesy. He simply turned, opened the door, and left the room, leaving all the farewells to his companion. Caliban had to hurry just to keep up with him.
Fredda watched them go, and found herself once again wondering about Prospero. She did not understand what went on behind those glowing eyes. There was something not quite right about a robot that – that secretive. She shook her head as she crossed the room. Not much point in worrying about it now. She sealed the door shut behind them and scrambled the keypad. Only she and Caliban and Prospero knew the door’s keypad combination.
And there were times she thought seriously about taking at least one name off that list.
2
Caliban followed Prospero down the tunnel. It ran for about a hundred meters, and deposited them at the base of a ravine that was otherwise quite inaccessible from the house. Their aircar was hidden there.
“I would like to know what all that was about,” Caliban said as they emerged from the tunnel into the cool of the evening.
“I spoke the truth,” Prospero said coolly. “It was in part merely a test to see how she would react to such an accusation. Surely you would agree it is worth knowing if she is capable of betraying us.” Prospero climbed into the pilot’s station.
Caliban followed, climbing into the forward passenger seat. “I suppose the case could be made that such information would be useful in a general sense,” he said. “But you have dealt with Dr. Leving for quite some time now. Why worry about such hypotheticals now? And if the need for a test was only part of your intent, what was the rest?”
“I have answers to both questions, friend Caliban, but I do not choose to give them now. This is all I can tell you: I believe we are in danger. The possibility that we will be betrayed – or have been already – is quite real. And I can tell you no more than that.”
Prospero engaged the aircar’s controls, and they lifted off into the evening air. Caliban said no more, but he found that he had reached a conclusion about Prospero. There was no longer the slightest doubt in his mind that the New Law robot was unstable. He did not merely suspect betrayal on all sides – he virtually invited it. He had gone out of his way to encourage Dr. Leving’s hostility. More than likely, the fellow was confusing danger to himself with danger to the New Laws.
All of which made Caliban’s next decision quite simple. As soon as it was conveniently possible, he would put some distance, in every sense of the word, between himself and Prospero.
He no longer wished to stand quite so close to so tempting a target.
Fredda Leving walked to the other end of the underground safe room, and went through the open door there. She wearily closed the door behind her, and scrambled the combination as well. She, Fredda, was the only one who knew the combination to this door. Alvar had insisted on that much. He had no desire for a New Law robot like Prospero – let alone a No Law robot like Caliban – to have free access to his home. There had been times when she herself had been glad to keep her home well barricaded against New Law robots.
And of course, the New Laws felt the same way about humans. She still had not the slightest idea where, exactly, the New Law city of Valhalla was. She knew it was underground, and that it was in the Utopia sector, but that was about all. Fredda had even been taken there several times, but she had always been transported in a windowless aircar equipped with a system for jamming tracking devices. The New Law robots took no chances, and she could not blame them. Fredda had been quite willing to cooperate with their precautions, and to make sure everyone knew about them. They were for her safety as much as for that of the robots. What she did not know, she could not reveal under the Psychic Probe. The New Law robots had a large number of enemies. Some of them might well be willing to reduce the governor’s wife to a vegetable, and damn the consequences, if that was what it took to find the lair of the New Law robots.
Astonishing, really, the lengths they all went to. Not just the New Laws, but Alvar, and even herself. They all took such elaborate precautions. Against discovery, against scandal, against each other. No wonder Prospero was turning half paranoid. Maybe even more than half.
In all probability, of course, the precautions would turn out to be useless in the end. Plots and secrets and hidden agendas generally came crashing down, sooner or later. She had never been involved in a plot or a secret that hadn’t. But the secrets and plots and safeguards and precautions made them all feel better, feel secure, at least for a while. Perhaps that was the point of having them.
Fredda double-checked the inner door, and then stepped into the elevator car that would carry her up above ground, to the household proper.
OBR-323 was waiting there for her, in all his rather ponderous solemnity. “Master Kresh has landed,” he announced in his gravelly, ponderous voice. “He should be here momentarily.”
“Very good,” Fredda said. “Will dinner be ready soon?”
“Dinner will be ready in twelve minutes, Mistress. Is that acceptable?”
“That will be fine, Oberon.” Fredda regarded Oberon with a critical – and self-critical – eye. She had built him, after all. He was a tall, solid-looking robot, heavily built and gun-metal gray. Oberon was nearly twice the size of Donald – and perhaps only half as sophisticated. Fredda was not entirely satisfied with her handiwork regarding Oberon. If nothing else, there was the question of overall appearance. At the time she had designed him, she had concluded that a robot as big as Oberon who was all angles and hard edges would have been rather intimidating. That would not have been a good idea in these rather edgy times. Therefore, Oberon was as rounded-off as Donald. But the effect did not quite come off. Donald’s rounded angles made him look unthreatening. Oberon merely looked half-melted.
She often wondered what Oberon’s design said about her own psychology. The custom-design robots she had built before him – Donald, Caliban, Ariel, Prospero – had all been cutting-edge designs, highly advanced, even, except for Donald, dangerously experimental. Not Oberon. Everything about his design was basic, conservative – even crude. Her other custom-built robots had required highly sophisticated construction and hand-tooled parts. Oberon represented little more than the assembly of components.
“I’ll just go in and freshen up,” she said to Oberon, and headed for the refresher, her mind still on why she had made Oberon the way she had. Once burned, twice shy? she wondered. Of course she had been burned twice already. It was a desire for rebellion against caution that had gotten her into trouble in the first place. And the second place. She found herself thinking back on it all as she stripped and headed into the refresher. The hot water jets of the needle-shower were just what she needed to unwind after the meeting with Prospero.
A few years before, Fredda Leving had been one of Inferno’s leading roboticists, with a well-earned reputation for taking chances, for searching out shortcuts, for impatience.
None of those character traits were exactly well-suited to the thoroughly calcified field of robotics research. There had not been a real breakthrough in robotics for hundreds of years, just an endless series of tiny, incremental advances. Robotics was an incredibly conservative field, caution and safety and care the watchwords at every turn.
Positronic brains had the standard Three Laws of Robotics burned into them, not once, but millions of times, each microcopy of the Laws standing guard to prevent any violation. Each positronic brain was based on an earlier generation of work, and each later generation seemed to include more Three-Law pathing. The line of development went back in an unbroken chain, all the way to the first crude robotic brain built on Earth, untold thousands of years before.
Each generation of positronic brain had been based on the generation that went before – and each generation of design had sought to entwine the Three Laws more and more deeply into the positronic pathway that made up a robotic brain. Indeed, the closest the field had come to a breakthrough in living memory was a way to embed yet more microcopies of the Three Laws into the pathways of a positronic brain.
In principle, there was, of course, nothing wrong with safety. But there was such a thing as overdoing it. If a robotic brain checked a million times a second to see if a First Law violation was about to occur, that meant all other processing was interrupted a million times, slowing up productive work. Very large percentages of processing time, and very large percentages of the volume of the physical positronic brain, were given over to massively, insanely redundant iterations of the Three Laws.
But Fredda had wanted to know how a robot would behave with a modified law set – or with no law set at all. And that meant she was stuck. In order to create a positronic brain without the Three Laws, it would have been necessary to start completely from scratch, abandoning all those thousands of years of refinement and development, almost literally carving the brain paths by hand. Even if she had tried such a thing, the resulting robot brain would have been of such limited capacity and ability that the experiment results would have been meaningless. What point in testing the actions of a No Law robot who had such reduced intellect that it was barely capable of independent action?
There seemed no way around the dilemma. The positronic brain was robotics, and robotics was the positronic brain. The two had become so identified, one with the other, that it proved difficult, if not impossible, for most researchers to think of either one except as an aspect of the other.
But Gubber Anshaw was not like other researchers. He found a way to take the basic, underlying structure of a positronic brain, the underlying pathing that made it possible for a lump of sponge palladium to think and speak and control a body, and place that pathing, selectively, in a gravitonic structure.
A positronic brain was like a book in which all the pages had the Three Laws written on them, over and over, so that each page was half filled with the same redundant information, endlessly repeated, taking up space that thus could not be used to note down other, more useful data. A gravitonic brain was like a book of utterly blank pages, ready to be written on, with no needless clutter getting in the way of what was written. One could write down the Three Laws, if one wished, but the Three Laws were not jammed down the designer’s throat at every turn.
No other robotics lab had been willing to touch Anshaw’s work, but Fredda had jumped at the chance to take advantage of it.
Caliban was the first of her projects to go badly wrong. Fredda had long wanted to conduct a controlled, limited experiment on how a robot without the Three Laws would behave. But for long years, the very nature of robotics, and the positronic robot brain, had rendered the experiment impossible. Once the gravitonic brain was in her hands, however, she moved quickly toward development of a No Law robot – Caliban. He had been intended for use in a short-term laboratory experiment. The plan had been for him to live out his life in a sealed-off, controlled environment. Caliban had, unfortunately, escaped before the experiment had even begun, becoming entangled in a crisis that had nearly wrecked the government, and the reterraforming program on which all else depended.
The second disaster involved the New Law robots, such as Prospero. Fredda had actually built the first of the New Law robots before Caliban. It was only because the world had become aware of Caliban first that people generally regarded him as preceding the New Laws.
But both the New Laws and Caliban were products of Fredda’s concerns that robots built in accordance with the original Three Laws were wrecking human initiative and tremendously wasteful of robot labor. The more advanced robots became, the more completely they protected humans from danger, and the fewer things humans were allowed to do for themselves. At the same time, humans made the problem worse by putting the superabundance of robot labor to work at the most meaningless and trivial of tasks. It was common to have one robot on hand to cook each meal of the day, or to have one robot in charge of selecting the wine for dinner, while another had as its sole duty the drawing of the cork. Even if a man had only one aircar, he was likely to have five or six robot pilots, each painted a different color, to ensure the pilot did not clash with the owner’s outfit.
Both humans and robots had tended to consider robots to be of very little value, with the result that robots were constantly being destroyed for the most pointless of reasons, protecting humans from dangers that could have easily been avoided.
Humans were in the process of being reduced to drones. They were unproductive and in large part utterly inactive. Robots did more and more of the work, and were regarded with less and less respect. Work itself was held in lower and lower esteem. Work was what robots did, and robots were lesser beings.
The spiral fed on itself, and Fredda could see it leading down into the ultimate collapse of Spacer society. And so she had developed the New Law robots. The New First Law prevented them from harming humans, but did not require them to take action in order to protect humans. The New Second Law required New Law robots to cooperate with humans, not just obey them blindly. The New Third Law required the New Law robots to preserve themselves, but did not force them to destroy themselves for some passing human whim. The deliberately ambiguous Fourth Law encouraged New Law robots to act for themselves.
The New Laws had seemed so reasonable to Fredda, so clearly an improvement over the original Three Laws. And perhaps they would have been an improvement, if it had been possible to start over, completely from scratch. But the New Law robots came into being on a world where Three-Law robots were already there, and on a world that seemed to have no place for them.
The New Law robots were more catalyst for the second major crisis than actual cause of it. Through a complex series of events, the mere existence of the New Law robots, and the shortage of Three-Law robot labor, had ultimately set in train Governor Chanto Grieg’s assassination. If not for the calm and steady hand of Alvar Kresh, that crisis could have been far worse.
In neither case had the robots, New Law or No Law, Prospero or Caliban, actually malfunctioned. All that was required for disaster and crisis to happen was for people to fear robots that were different. Inferno was a world that did not much like change, and yet it was one that had change thrust upon it. It was a world that punished boldness, and rewarded caution.