Utopia c-3


by Isaac Asimov


  “Very good,” Fredda said. “Will dinner be ready soon?”

  “Dinner will be ready in twelve minutes, Mistress. Is that acceptable?”

  “That will be fine, Oberon.” Fredda regarded Oberon with a critical—and self-critical—eye. She had built him, after all. He was a tall, solid-looking robot, heavily built and gun-metal gray. Oberon was nearly twice the size of Donald—and perhaps only half as sophisticated. Fredda was not entirely satisfied with her handiwork regarding Oberon. If nothing else, there was the question of overall appearance. At the time she had designed him, she had concluded that a robot as big as Oberon who was all angles and hard edges would have been rather intimidating. That would not have been a good idea in these rather edgy times. Therefore, Oberon was as rounded-off as Donald. However, Fredda was not entirely satisfied with the overall effect. Donald’s rounded angles made him look unthreatening. Oberon merely looked half-melted.

  She often wondered what Oberon’s design said about her own psychology. The custom-design robots she had built before him—Donald, Caliban, Ariel, Prospero—had all been cutting-edge designs, highly advanced, even, except for Donald, dangerously experimental. Not Oberon. Everything about his design was basic, conservative—even crude. Her other custom-built robots had required highly sophisticated construction and hand-tooled parts. Oberon represented little more than the assembly of components.

  “I’ll just go in and freshen up,” she said to Oberon, and headed for the refresher, her mind still on why she had made Oberon the way she had. Once burned, twice shy? she wondered. Of course she had been burned twice already. It was a desire for rebellion against caution that had gotten her into trouble in the first place. And the second place. She found herself thinking back on it all as she stripped and headed into the refresher. The hot water jets of the needle-shower were just what she needed to unwind after the meeting with Prospero.

  A few years before, Fredda Leving had been one of Inferno’s leading roboticists, with a well-earned reputation for taking chances, for searching out shortcuts, for impatience.

  None of those character traits were exactly well-suited to the thoroughly calcified field of robotics research. There had not been a real breakthrough in robotics for hundreds of years, just an endless series of tiny, incremental advances. Robotics was an incredibly conservative field, caution and safety and care the watchwords at every turn.

  Positronic brains had the standard Three Laws of Robotics burned into them, not once, but millions of times, each microcopy of the Laws standing guard to prevent any violation. Each positronic brain was based on an earlier generation of work, and each later generation seemed to include more Three-Law pathing. The line of development went back in an unbroken chain, all the way to the first crude robotic brain built on Earth, untold thousands of years before.

  Each generation of positronic brain had been based on the generation that went before—and each generation of design had sought to entwine the Three Laws more and more deeply into the positronic pathway that made up a robotic brain. Indeed, the closest the field had come to a breakthrough in living memory was a way to embed yet more microcopies of the Three Laws into the pathways of a positronic brain.

  In principle, there was, of course, nothing wrong with safety. But there was such a thing as overdoing it. If a robotic brain checked a million times a second to see if a First Law violation was about to occur, that meant all other processing was interrupted a million times, slowing up productive work. Very large percentages of processing time, and very large percentages of the volume of the physical positronic brain, were given over to massively, insanely redundant iterations of the Three Laws.

  But Fredda had wanted to know how a robot would behave with a modified law set—or with no law set at all. And that meant she was stuck. In order to create a positronic brain without the Three Laws, it would have been necessary to start completely from scratch, abandoning all those thousands of years of refinement and development, almost literally carving the brain paths by hand. Even if she had tried such a thing, the resulting robot brain would have been of such limited capacity and ability that the experiment results would have been meaningless. What point in testing the actions of a No Law robot who had such reduced intellect that it was barely capable of independent action?

  There seemed no way around the dilemma. The positronic brain was robotics, and robotics was the positronic brain. The two had become so identified, one with the other, that it proved difficult, if not impossible, for most researchers to think of either one except as an aspect of the other.

  But Gubber Anshaw was not like other researchers. He found a way to take the basic, underlying structure of a positronic brain, the underlying pathing that made it possible for a lump of sponge palladium to think and speak and control a body, and place that pathing, selectively, in a gravitonic structure.

  A positronic brain was like a book in which all the pages had the Three Laws written on them, over and over, so that each page was half filled with the same redundant information, endlessly repeated, taking up space that thus could not be used to note down other, more useful data. A gravitonic brain was like a book of utterly blank pages, ready to be written on, with no needless clutter getting in the way of what was written. One could write down the Three Laws, if one wished, but the Three Laws were not jammed down the designer’s throat at every turn.

  No other robotics lab had been willing to touch Anshaw’s work, but Fredda had jumped at the chance to take advantage of it.

  Caliban was the first of her projects to go badly wrong. Fredda had long wanted to conduct a controlled, limited experiment on how a robot without the Three Laws would behave. But for long years, the very nature of robotics, and the positronic robot brain, had rendered the experiment impossible. Once the gravitonic brain was in her hands, however, she moved quickly toward development of a No Law robot—Caliban. He had been intended for use in a short-term laboratory experiment. The plan had been for him to live out his life in a sealed-off, controlled environment. Caliban had, unfortunately, escaped before the experiment had even begun, becoming entangled in a crisis that had nearly wrecked the government, and the reterraforming program on which all else depended.

  The second disaster involved the New Law robots, such as Prospero. Fredda had actually built the first of the New Law robots before Caliban. It was only because the world had become aware of Caliban first that people generally regarded him as preceding the New Laws.

  But both the New Laws and Caliban were products of Fredda’s concerns that robots built in accordance with the original Three Laws were wrecking human initiative and were tremendously wasteful of robot labor. The more advanced robots became, the more completely they protected humans from danger, and the fewer things humans were allowed to do for themselves. At the same time, humans made the problem worse by putting the superabundance of robot labor to work at the most meaningless and trivial of tasks. It was common to have one robot on hand to cook each meal of the day, or to have one robot in charge of selecting the wine for dinner, while another had as its sole duty the drawing of the cork. Even if a man had only one aircar, he was likely to have five or six robot pilots, each painted a different color, to ensure the driver did not clash with the owner’s outfit.

  Both humans and robots had tended to consider robots to be of very little value, with the result that robots were constantly being destroyed for the most pointless of reasons, protecting humans from dangers that could have easily been avoided.

  Humans were in the process of being reduced to drones. They were unproductive and in large part utterly inactive. Robots did more and more of the work, and were regarded with less and less respect. Work itself was held in lower and lower esteem. Work was what robots did, and robots were lesser beings.

  The spiral fed on itself, and Fredda could see it leading down into the ultimate collapse of Spacer society. And so she had developed the New Law robots. The New First Law prevented them from harming humans, but did not require them to take action in order to protect humans. The New Second Law required New Law robots to cooperate with humans, not just obey them blindly. The New Third Law required the New Law robots to preserve themselves, but did not force them to destroy themselves for some passing human whim. The deliberately ambiguous Fourth Law encouraged New Law robots to act for themselves.

  The New Laws had seemed so reasonable to Fredda, so clearly an improvement over the original Three Laws. And perhaps they would have been an improvement, if it had been possible to start over, completely from scratch. But the New Law robots came into being on a world where Three-Law robots were already there, and on a world that seemed to have no place for them.

  The New Law robots were more catalyst for the second major crisis than actual cause of it. Through a complex series of events, the mere existence of the New Law robots, and the shortage of Three-Law robot labor, had ultimately set in train Governor Chanto Grieg’s assassination. If not for the calm and steady hand of Alvar Kresh, that crisis could have been far worse.

  In neither case had the robots, New Law or No Law, Prospero or Caliban, actually malfunctioned. All that was required for disaster and crisis to happen was for people to fear robots that were different. Inferno was a world that did not much like change, and yet it was one that had change thrust upon it. It was a world that punished boldness, and rewarded caution.

  And Fredda had suffered punishment enough. Small wonder, then, that Fredda had built herself such a cautious, stolid, lumpen robot as Oberon. But small wonder too that she was already tired of caution.

  Fredda shut off the needle-shower and activated the air blowers to dry herself off. She smiled, and reminded herself that even the simple act of taking a shower by herself, bathing herself, represented a revolution. Ten years before, such a thing would have been unthinkable, scandalous. There would have been a waterproofed domestic robot to take her clothes off for her, activate the shower system for her, push the dry button for her, and dress her again, in clothes selected by the robot.

  She stepped out of the refresher and started picking out the clothes for her evening outfit. Something easy and casual for a night at home. Strange to think that she had left it to a robot to pick out her clothes for her, not so very long ago. Now it was a real pleasure, a savored luxury, to choose the clothes for an evening at home.

  Feeling well-scrubbed and revived by her shower, she threw open the closet and selected her clothes for the evening. Something subdued, but not too understated. She decided on her dark-blue sheath skirt, and a black pullover to go with it. She dressed, and then paused in front of the mirror to consider the effect.

  The outfit looked good on her. She selected earrings, and a silver brooch that would be set off by the black top. She looked back in the mirror and considered the effect.

  Fredda was small and fine-boned, with blue eyes and curly black hair she wore short. She was round-faced and snub-nosed. In short, she looked like what she was—a youthful woman given to sudden enthusiasm, and equally sudden outbursts of temper.

  The world of Inferno approved of seniority and experience. This did not make things any easier for Fredda Leving. She was a mere forty years old. By Infernal standards, that was just barely old enough for respectability—or it would have been if she had looked that age. Fredda had a naturally youthful appearance, and she was perverse enough to do everything she could to preserve the appearance of youth. At a time of life when most other Infernal women were glad to be acquiring a properly mature appearance, Fredda still looked to be no more than twenty-five years of age.

  The hell with what they thought. Fredda knew she looked good—and looked better in the outfit she had picked out for herself. Certainly better than in anything Oberon would have selected. Pleased with her appearance, she headed out into the main salon, proud of having chosen just the right clothing.

  A silly thing, a small thing, but there it was. Making choices, however trivial, for oneself, was a liberation. There had been a time, and not so long ago, when Fredda, and Alvar, and thousands, millions of other people on Inferno had been little more than well-trained slaves to their own servants. Awakened at the hour the robots thought best, washed by the robots, dressed by the robots in clothes the robots picked out. Up until a few years ago, many clothes did not even have fasteners the wearer could attach or undo. The wearer was completely dependent on his or her dresser robot to get the garment on or off.

  Once dressed, you were fed the breakfast, lunch, and dinner selected by the robot cook to be most commensurate with the dictates of the First Law injunction to do no harm. Then your pilot robot flew you to this appointment or that—all appointments, of course, having been made by your secretary robot.

  You would get to wherever it was without ever knowing where it was, because you trusted in the robot to remember the address and know the best routes there. More than likely, your robots knew better than you what you were supposed to do there. Then the pilot robot flew you home, because you certainly wouldn’t know how to find your own way home, either. At the end of the day, you were undressed and then bathed again by the robots, and buttoned or zipped or clipped into pajamas by the robots, and then tucked into bed by them.

  A whole day, each day, every day, with the robots making every single personal decision, with the servants controlling your every movement. A whole day spent in an incredibly luxurious cage, without your ever being so much as aware that the cage existed.

  Fredda could not quite believe that she had ever allowed herself to live that way—but she had. Incredible. At least now she was conscious of the fact that Oberon had selected the dinner menu for her, and their dinner time. At least now, Oberon inquired if the mealtime he had selected was right, rather than informing her when she would eat. Tonight it was her choice to let the robots handle dinner. Another night, she might dictate the meal in every detail. Scandal of scandals, she had even been known to burn a meal for herself once in a while. If the tyrannical rule of the servants had not been completely shattered, at least it had been recognized for what it was, and thus weakened.

  Fredda knew that she was not the only one who had taken back at least some control of her own life from the robots. She also knew that her research, her speeches, the turmoil she had caused were a large part of the reason. But beyond doubt, the presence of the Settlers had been a major influence as well. And then there was the bald fact that there simply weren’t as many robots available for private use these days. People were more careful with the limited amount of robot labor still available. They tended not to waste so much of it on trivial tasks.

  The revolution was far from complete, of course. There were still many Infernals out there who had not managed the change in attitude, who clung to the old ways, who rallied around the Ironhead calls for more and better robots as the solution to everything.

  But for whatever reason, or reasons, and by however many fits and starts, the change was happening. All over the planet, Infernals had come to realize just how dependent on robots they were, and had begun to back off just a little. And, much to the horror of Simcor Beddle and the Ironheads, people were starting to discover they liked having a bit more freedom in their lives.

  From Fredda’s point of view, all of it seemed good, positive change for the better. But she had learned, over the past few years, just how frightening—and genuinely dangerous—change, even change for the good, could be. There would be some unintended consequence, or someone left behind, someone who felt disaffected and threatened. Or else someone who was not harmed in the least by the turmoil, but found a way to take advantage of it, to the detriment of others.

  Or perhaps she was being too pessimistic. Perhaps the days of Inferno in upheaval, of the planet lurching from crisis to crisis, were over. And yet even steady, incremental change and improvement, of the sort her Alvar had presided over in the last few years, could bring jarring dislocations.

  The days ahead were likely to be…interesting.

  She heard the sound of her husband and Donald coming in from the rooftop landing pad, and hurried to meet them.

  3

  “THEY WERE HERE again,” Kresh said as he kissed his wife. It was not a question, and Fredda knew better than to pretend she didn’t know who he meant.

  “Yes,” she said carefully. “They’ve just left.”

  “Good,” Kresh said as he eased himself down into his favorite chair. “I don’t like having them around.”

  “Nor do I, Dr. Leving,” Donald 111 announced. “The danger represented by the presence of those two pseudo-robots is far greater than you believe.”

  “Donald, I built both of those pseudo-robots, as you insist on calling them,” Fredda said, feeling as much amusement as irritation. “I understand fully what they are capable of.”

  “I am not at all sure that is the case, Dr. Leving,” Donald said. “But if you will insist on meeting them when I am not present, there is nothing I can do to prevent you from doing so. I would urge you once again to exercise extreme caution when you deal with them.”

  “I will, Donald, I will,” Fredda said, her voice a bit tired. She had built Donald, too, of course. She knew as well as anyone that the First Law forced Donald to mention the potential danger to her at every opportunity. For all of that, it was still tedious to hear the same warning over and over again. Donald, and most other Three-Law robots, referred to Caliban and Prospero—and all New Law robots—as pseudo-robots because they did not possess the Three Laws. By definition, a robot was a sentient being imbued with the Three Laws. Prospero was possessed of the New Laws, and Caliban had no laws at all. They might look like robots, and in some ways act like robots, but they were not robots. Donald saw them as a perversion, as unnatural beings that had no proper place in the universe. Well, perhaps he would not phrase it in quite that way, but Fredda knew she was not far off the mark.
