“I mean that we have rung out the changes on positronics. Certainly today’s positronic brain is far superior to the original units. It has been greatly advanced and improved. There have been many refinements in it. But the positronic brain’s basic design hasn’t changed in thousands of years. It would be as if we were still using chemical rockets for spacecraft, instead of hyperdrive. The positronic brain is an incredibly conservative design that puts tremendous and needless limits on what robots can do. Because the Three Laws are embedded in its design, the positronic brain is seen as the only possible design for use in robots. That is an article of belief, of faith, even among robotics researchers. But gravitonics could change all that.
“Gravitonic brains currently have one or two minor drawbacks, but they are at the beginning of their development. They promise tremendous advantages over positronics, in terms of flexibility and capacity.”
“Well, you certainly sound like a true believer yourself,” Kresh said dryly. There is none so faithful as the converted, he thought. “Very well, Terach. I may well wish to talk with you later, but that will do it for now. You may go.”
Jomaine nodded and stood up. He hesitated before heading for the door. “Ah, one question,” he said. “What is the prognosis for Fredda Leving?”
Kresh’s face hardened. “She’s still unconscious,” he said, “but they expect her to awaken sometime in the next day or so, and go on to a rapid and complete recovery. They are using the most advanced regeneration techniques to stimulate recovery. I understand her head injury should be completely healed within two days.”
Jomaine Terach smiled and nodded. “That’s excellent news,” he said. “The staff here will be delighted to hear it—ah, that is, if I’m allowed to tell them.”
Kresh waved his hand in negligent dismissal. “Go right ahead, Terach. It’s public knowledge—and she’s under heavy guard.”
Terach pasted on a patently false smile, nodded nervously, and left the room.
Kresh watched him go. “What’s your reading, Donald?” he asked, without looking over at the robot. No one talked about it much, but advanced police robots were specially engineered to detect the body’s involuntary responses to questions. In effect, Donald was a highly sophisticated lie detector.
“I should remind you that Jomaine Terach quite possibly knows about my capabilities as a truth-sensor. I have never met him before, but a records-check confirms that he was on staff here during my construction. That does add a variable. However, suffice to say that he was highly agitated, sir. Far more so than any of the others, and, in my opinion, more so than would be accounted for solely by surprise and concern over the attack on Lady Leving. Voice stress and other indicators confirm that he was concealing something.”
That didn’t surprise Alvar. All witnesses concealed things. “Was he lying?” he asked. “Lying directly?”
“No, sir. But he was most concerned to learn we knew about the gravitonic brains. I found this confusing, as he went to some length to discuss them. I formed the impression that he was intent on steering the interrogation away from some other point.”
“You caught that, too, I see. The damnable thing is that I can’t imagine what point he was trying to lead us away from. My hunch is he thinks we know more than we do.”
“That is my opinion as well.”
Alvar Kresh drummed his fingers on the table and stared at the door Jomaine Terach had used to leave the room.
There was more going on here than the attack on Leving. Something else was up. Something that involved the Governor, and Leving, and Welton, and the Settler-Spacer relationship on Inferno.
Indeed, the attack was already beginning to recede in importance in his mind. That was merely the loose thread he was tugging on. He knew that if he left it alone, the rest of it would never be revealed. Pull it too hard, and it would snap, break its connections to the rest of the mystery. But play the investigation of the attack carefully, tug the thread gently, and maybe he could use it to unravel the whole tangled problem.
Alvar Kresh was determined to find out all he could.
Because something big was going on.
JOMAINE Terach left the interview room. His personal robot, Bertran, was waiting outside in the hall and dutifully followed him as Jomaine hurried back to his own laboratory.
Sheriff Kresh had made Bertran wait outside the room during the interrogation. It was just a little harassment, Jomaine told himself, another way for Kresh to get and keep me unnerved. And yes, he admitted to himself, it had worked. Spacers in general, and Infernals particularly, did not like to be without their robots.
Only after he was in his own lab, only after Bertran had followed him in and shut the door safely behind him, did Jomaine allow himself to succumb to the fears he was feeling. He crossed the room hurriedly. He dropped back into his favorite old armchair and breathed a sigh of relief.
“Sir, are you all right?” Bertran asked. “I fear the bad news about Lady Leving and the police interview have greatly distressed you.”
Jomaine Terach nodded tiredly. “That they have, Bertran. That they have. But I’ll be fine in just a moment. I just need to think for a bit. Why don’t you bring me some water and then retire to your niche for a while?”
“Very good, sir.” The robot stepped to the lab sink, filled a glass, and brought it back. Jomaine watched as Bertran went over to his wall niche and dropped back into standby mode.
That was the way it was supposed to be. A robot did what you told it to do and then got out of the way. That was how it had been for thousands of years. Did they really dare try and change that? Did Fredda Leving truly think she could overturn everything that completely?
And did she truly have to make a deal with the devil, with Tonya Welton, in order to make it happen?
Well, at any rate, he had managed to steer things away from any discussion of the Three Laws. If he had been forced to sacrifice a few tidbits about gravitonics in order to accomplish that, so be it. It would all be public in a day or so, anyway.
They were safe for the moment. But still, the project was madness. Caliban was madness. Building him had been a violation of the most basic Spacer law and philosophy, but Fredda Leving had gone ahead, anyway. Typical bullheadedness.
Never mind theory and philosophy, she had said. They were an experimental lab, not a theory shop that never acted on its ideas. It was time to take the next step, she said. It was time to build a gravitonic robot with no limits on its mind whatsoever. A blank slate, that’s what she had called Caliban. An experimental robot, to be kept inside the lab at all times, never to leave. A robot with no knowledge of other robots, or the Settlers, or anything beyond human behavior and a carefully edited source of knowledge about the outside world. Then let it live at the lab, under controlled conditions, and see what happens. See what rules it developed for its own behavior.
Did she truly have to build Caliban?
No, ask the question directly, he told himself. We’ve all hedged around it long enough. And yes, that was the deadly secret question. No one else knew. With Caliban broken free of the lab, with Fredda unconscious, there was no one else in the wide world who could ask the question.
So Jomaine asked it of himself.
Did she really have to build a robot that did not have the Three Laws?
4
SIMCOR Beddle lifted his left hand, tilted his index finger just so, and Sanlacor 123 pulled back his chair with perfect timing, getting it out from behind him just as Simcor was getting up, so that the chair never came in contact with Simcor’s body as he rose.
There was quite a fashion for using detailed hand signals to command robots, and Simcor was a skilled practitioner of the art.
Simcor turned and walked away from the breakfast table, toward the closed door to the main gallery, Sanlacor hard on his heels. The door swung open just as he arrived at it. The Daabor unit on the other side of the door had no other job in the world but to open it. The machine marked out its existence by standing there, watching for anyone who might approach from its side of the door, and listening for footsteps from inside the room.
But Simcor Beddle, leader of the Ironheads, had no time to think about how menial robots spent their days. He was a busy man.
He had a riot to plan.
Simcor Beddle was a small, rotund man, with a round sallow face and hard, gimlet eyes of indeterminate color. His hair was glossy black, and just barely long enough to lie flat. He was heavy-set; there was no doubt about that. But there was nothing soft about him. He was a hard, determined man, dressed in a rather severe military-style uniform.
Managing his forces, that was the main thing. Keeping them from getting out of control was always a problem. His Ironheads were a highly effective team of rowdies, but they were rowdies all the same—and as such, they easily grew restive and bored. It was necessary to keep them busy, active, if he were to keep them under any sort of control at all.
No one quite knew where the Ironheads had gotten their name, but no one could deny it was appropriate. They were stubborn, pugnacious, bashing whatever was in their way whenever they saw fit. Maybe it was that stubbornness that earned them their name. More likely, though, it was their fanatical defense of the real Ironheads—robots. Well, granted, no one used anything as crude as raw iron to make robot bodies, but robots were as hard, as strong, as powerful, as iron.
Not that the Ironheads held robots themselves in any special esteem. If anything, Ironheads were harder on their robots than the average Infernal. But that was not the point. Robots gave humans such freedom, such power, such comfort. Those things were the birthright of every Infernal, indeed of all Spacers, and the Ironhead movement was determined to preserve and expand that birthright by any means necessary.
And making life unpleasant for the Settlers certainly fit into that category.
Simcor smiled to himself. That was getting to be a bad habit, thinking in speeches like that. He crossed to the far side of the gallery, toward his office, and another door robot swung the door wide as he approached. He entered the room, quite unaware of Sanlacor moving ahead of him to pull out his chair from his desk for him.
But he did not sit down. Instead, he made a subtle gesture with his right hand. The room robot, Brenabar, was at his side instantly, bringing Simcor’s tea. He took the cup and saucer and sipped thoughtfully for a moment. He nodded his head a precise five degrees down toward the desktop, and spoke one word. “Settlertown.”
Sanlacor, anticipating his master, was already at the view controls, and in less than a second, the bare desktop was transformed into a detailed map of Settlertown. Simcor handed his teacup to the empty air without looking, and Brenabar took it from him smoothly.
Kresh’s deputies were sure to be ready for them, after last night. Simcor had superb connections inside the Sheriff’s Department, and he knew everything Kresh knew about the attack on Fredda Leving. In fact, he knew quite a bit more. He had heard a recording of that lecture of hers. Damnable, treasonous stuff. Simcor smiled. Not that she was likely to make any more such speeches. Everything was working his way.
But he had to concentrate on the plans for today. He had to assume the Sheriff’s Department was ready for trouble. Once the Ironheads started the ruckus, they would only have a few minutes before the law stepped in to protect the damned Settlers.
So they would have to do as much damage as possible in those first few minutes. Under the circumstances, it was too much to hope they would be able to penetrate the underground section of Settlertown again. No sense wasting effort in the attempt. This time, it would have to be on the surface, at ground level. Simcor Beddle laid his hands on the desktop and stared thoughtfully at the map of his enemy’s stronghold.
IT was morning in the city of Hades. Caliban knew that much for certain, if very little else of any substance. By now he was no longer sure what he knew.
But he was beginning to believe something was wrong. Something was terribly wrong.
It was as if Caliban’s utterly blank memory and the precise but limited information in the datastore were the double lenses of a distorted telescope, utter ignorance and expert knowledge combining to twist and warp all he saw. The world his eyes and mind presented to him was a crazed and frightening patchwork.
In the busiest part of the city’s midtown, he turned off the sidewalk and found a bench set in a quiet corner of a tiny park, well out of sight from any casual passersby. He sat down and began reviewing all that he had seen as he had walked the streets of Hades.
There was something distinctly unreal, and somewhat alarming, about the world around him. He had come to realize just how clean, perfect, idealized, precise were the facts and figures, maps, diagrams, and images that leapt up from the datastore. But the real-world objects that corresponded to the datastore’s concepts were far less precise.
Further exploration confirmed that false voids and featureless buildings were not the only flaws in the datastore map.
The map likewise did not report which blocks were busy, full of people and robots, and which were empty, semi-abandoned, even starting to decay.
Some new buildings had materialized since the map was stored in his datastore, and other, older buildings that seemed whole and complete in the datastore had vanished from reality.
No image in the datastore showed anything to be worn-out or dirty, but the real world was full of dust and dirt, no matter how vigorously the maintenance robots worked to keep it all clean.
Caliban found the differences between idealized definitions and real-world imperfections deeply disturbing. The world he could see and touch seemed, somehow, less real than the idealized, hygienic facts and images stored deep inside his brain.
But it was more than buildings and the map, or even the datastore, that confused him.
It was human behavior he found most bewildering. When Caliban first approached a busy intersection, the datastore showed him a diagram of the correct procedure for crossing a street safely. But human pedestrians seemed to ignore all such rules, and common sense, for that matter. They walked wherever they pleased, leaving it to the robots driving the groundcars to get out of the way.
Something else about the datastore was strange, even disturbing: There was a flavor of something close to emotion about much of its data. It was as if the opinions, the feelings, of whoever implanted the information into the datastore had been stored there as well.
He was growing to understand the datastore on something deeper than an intellectual level. He was learning the feel of it, gaining a sense of how it worked, developing reflexes to help him use it in a more controlled and useful manner, keep it from spewing out knowledge he did not need. Humans had to learn to walk: That was one of many strange and needless facts the datastore had provided. Caliban was coming to realize that he had to learn how to know, and remember.
Confusion, muddle, dirt, inaccurate and useless information—those he could perhaps learn to accept. But it was far more troubling that, on many subjects, the datastore was utterly—and deliberately—silent. Information he most urgently wanted was not only missing but excised, purposely removed. There was a distinct sensation of emptiness, of loss, that came to him when he reached for data that should have been there and it was not. There were carved-out voids inside the datastore.
There was much he desperately wanted to know, but there was one thing in particular, one thing that the store did not tell him, one thing that he most wanted to know: Why didn’t it tell him more? He knew it should have been able to do so. Why was all information on that place where the sign said Settlertown deleted from the map? Why had all meaningful references to robots been deleted? There was the greatest mystery. He was one, and yet he scarcely knew what one was. Why was the datastore silent on that of all subjects?
Humans he knew about. At his first sight of that woman he saw when he awoke, he had immediately known what a human was, the basics of their biology and culture. Later, when he glanced at an old man, or one of the rare children walking the street, he knew all basic generalities concerning those classes of person—their likely range of temperament, how it was best to address them, what they were and were not likely to do. A child might run and laugh, an adult was likely to walk more sedately, an elder might choose to move more slowly still.
But when he looked at another robot, one of his fellow beings, his datastore literally drew a blank. There was simply no information in his mind.
All he knew about robots came from his own observation. Yet his observations had afforded him little more than confusion.
The robots he saw—and he himself—appeared to be a cross between human and machine. That left any number of questions unclear. Were robots born and raised like humans? Were they instead manufactured, like all the other machines that received detailed discussion in the datastore? What was the place of the robot in the world? He knew the rights and privileges of humans—except as they pertained to robots—but he knew nothing at all of how robots fit in.
Yes, he could see what went on around him. But what he saw when he looked was disturbing, and baffling. Robots were everywhere—and everywhere, in every way, robots were subservient. They fetched and they carried, they walked behind the humans. They carried the humans’ loads, opened their doors, drove their cars. It was patently clear from every scrap of human and robot behavior that this was the accepted order of things. No one questioned it.
Except himself, of course.
Who was he? What was he? What was he doing here? What did it all mean?
He stood up and started walking again, not with any real aim in mind, but more because he could not bear to sit idle any longer. The need to know, to understand who and what he was, was getting stronger all the time. There was always the chance that the answer, the solution, was just around the corner, waiting to be discovered.
He left the park and turned left, heading down the broad walkways of downtown.