Utopia

by Isaac Asimov


  Caliban did not answer. There were times humans would say more in reply to silence than they would to words.

  This seemed to be one of those times. “Look,” said Fiyle. “One, I don’t have to justify myself to you. Two, I’m not making any charge at all for this one. All I want to do is make sure the world knows. I’m trying to do that the best way I know how. A guy like me can’t exactly call a press conference. Not without getting arrested. Three, no one has ever gotten killed because of something I’ve said. I hand out little tidbits, gossip, things that let one side confirm what it already knows about the other. That’s all. Worst I ever did was turn in a dirty cop—and it turned out he’d already gotten himself killed, anyway. I just deal in small-time information.” Fiyle paused a moment and frowned. “At least, all that was true until now. Until this. There has never been anything bigger than this. These guys have found a way to dig themselves an ocean. A sea, anyway. A polar sea.”

  “That’s absurd,” Prospero objected. “There is no way they could accomplish such a thing.”

  Caliban thought for a moment. “It is a sensible goal, at least. A polar sea with proper communication to the Southern Ocean would do a great deal to moderate the climate. But friend Prospero is correct. There is no way to do such a thing.”

  Fiyle nodded his agreement. “In the normal course of events, digging an ocean would be an impossibly huge project. Way beyond the capacity of Inferno’s engineers. Of anyone’s engineers. But all of a sudden someone dealt us a wild card.”

  “Go on,” Caliban said.

  Fiyle leaned forward in his chair, and went on in an earnest tone of voice. “There’s a guy by the name of Davlo Lentrall. He was working on something called Operation Snowball. A small-scale, low-budget project that’s been running for a few years now. You find comets in suitable orbits, set mining machines and robots on them, and, quite literally, set the robots to work making snowballs, mining hunks of ice. You load the snowballs into a linear accelerator that fires them toward the planet, one after another, over and over, working nonstop, around the clock. You fire the snowballs toward Inferno, one after another, over and over and over again, millions of them, until the whole mass of the comet is delivered to the planet in five- or ten-kilo chunks.

  “Each snowball vaporizes as it enters Inferno’s atmosphere—and there’s another five or ten kilos’ worth of water vapor in the atmosphere. Repeat five or ten or twenty million times, and you’ll get a substantial increase in the amount of water on the planet. Some of the water escapes to space, and some of what’s in the comet isn’t water—but the other elements serve as nutrients, and we can use those too. Every little bit helps—that’s the Operation Snowball motto. They’ve chewed up nine or ten small comets that way in the last few years.”

  “I have heard of the project, and seen the constant streams of meteors that sometimes appear in one part of the sky or another. What of it?”

  “Lentrall found Comet Grieg while he was doing a scan for comets suitable for Snowball. Except Grieg wasn’t suitable for Operation Snowball. It had too little water ice, and too much stony material. And that should have been the end of it—except for two things.

  “The first thing was that Lentrall saw how close the comet was going to come to Inferno. The second thing was that Lentrall was—and is—an arrogant, ambitious little man who wanted to be a big man. He was sick and tired of pushing numbers around for Operation Snowball. He was looking for a way out, a way up. Something big. And he found it.”

  “And what, exactly, was that something big?”

  “Deliberately dropping a comet on the planet in order to dig that polar sea and its outlets,” said Fiyle. “And who cares if the New Law robots get in the way?”

  A human would have professed shock and refused to believe such a thing could be. But Caliban was not a human, and he had never suffered from the human need to try to reshape reality by denying that the unpleasant parts of it could exist. Instead he moved on to the next logical question. But, even as he asked it, somehow he already knew what the answer had to be. “You refer to the New Law robots being in the way. Assuming they do drop a comet on the planet—where, precisely, do they intend to drop it?” he asked.

  “On the Utopia region,” Fiyle said. “And if it’s anywhere near where I think it is, your hidden city of Valhalla is right in the middle of ground zero.”

  SOPHON-06 WATCHED PLACIDLY as Gubber Anshaw unplugged the test meter from his diagnostic socket.

  “That will do for this trip,” Gubber said cheerfully.

  “Do I still register as sane on all of your meters, Dr. Anshaw?” asked Sophon-06.

  “So far as I can tell,” Gubber replied. “I have yet to work out what, exactly, should be defined as sanity among New Law robots.”

  “I thought the majority was always sane,” Lancon-03 suggested from across the room.

  The human shook his head as he put away his equipment. “I don’t believe that is true for my species,” he said. “At least I hope it isn’t. As for your species, I am still at the beginning of my studies. I’ve done tests on dozens of the New Law robots in Valhalla. The vast majority of the New Law robots seem to fall within a narrow band of personality types. You are a careful, earnest, thoughtful group. The world, the universe, is a very new place to you, and you seek to explore yourselves and it at the same time. You want to know where you belong.”

  “And you see that as the primary motivation for New Law behavior?” asked Sophon-06.

  Gubber thought for a moment. “There is a very ancient procedure used by humans to examine their own drives and impulses. It has gone under many names, indeed many disguises, as the millennia have passed. But the basics are always the same. The subject is required to speak to a listener, but it is not what the listener hears that matters. What is important is that the subject is forced to order his or her thoughts and express them coherently. In the act of speaking to the listener, the subject speaks to himself or herself, and thus is able to perform a self-examination.”

  “In other words, it does not matter what you think our basic drives are,” said Sophon-06. “What is important is that we take the opportunity to ask that question of ourselves, in the most objective way possible.”

  “It is useful to ask the question,” said Gubber. “But it is also important to express the answer.”

  “Or at least an answer,” said Lancon-03. “So come, friend Sophon. Tell us. What is it that you think drives the New Law robots?”

  Sophon sat motionless, deep in thought. “It is certainly a question that goes to the center of things,” he said at last. “Why do we hide away here in Valhalla, obsessed with secrecy? Why do we seek to develop our own aesthetic, our own way of looking at the world? Why are we driven to improve and demonstrate our skills as terraformers? I think all of these can be explained by our desire to survive. We hide to avoid destruction, we seek acts of creation to develop a system of reference for the greater universe, and we sharpen our skills to ensure that we are of more use alive than dead.”

  Gubber considered Sophon-06 thoughtfully. A coldblooded, even brutal, analysis, but cogent for all of that. It came closer to the truth than most theories did. “It has been interesting, as always,” he said, preparing to take his leave. “I look forward to my next visit.”

  Lancon-03 nodded thoughtfully, mimicking the human gesture. “I am glad to hear it,” she said. “I hope we are still here when the time for that visit comes.”

  GUBBER HAD MADE the trip from Valhalla often enough to take all of the journey’s odd features for granted. One never came in or went out by the same route, and one rode in a different sort of sealed and windowless vehicle each time one arrived or departed. Nor did one journey to or from Depot ever take anything like the same amount of time as the one before it or the one after. As Sophon-06 had observed, the New Law robots invested a great deal of effort in order to stay hidden. Gubber therefore paid no attention to the journey back and forth to Depot. He had something else on his mind: the question of New Law robot sanity.

  Well, what was sanity, anyway? Surely it was something more than the will of the majority. He had never given much thought to defining the term. It was simply one of those concepts that were hard to define, and yet easy to recognize. One could say with a high degree of assurance that a given being was sane, even if one could not define the term.

  And, of course, the converse was true. Which was why Gubber Anshaw always preferred to schedule his visits to Valhalla for times when Prospero was not there. Not that it was always possible to do so. Gubber had simply been lucky this time.

  He did not like Prospero. He did not like dealing with Prospero. The other New Law robots were thoughtful, careful, reticent beings. Prospero was none of those things.

  And, if one defined the other New Law robots as sane, Gubber Anshaw was far from certain that Prospero was that, either.

  Part 2.

  Impact Minus Fifty-Five

  9

  ALVAR KRESH STARED out the window of his home, and watched the rain come pouring down. Rain—lifegiving, welcoming rain. It was a rare thing for the city of Hades, and always a most welcome one.

  But the rain and the darkness made the way impossible to see, and made the going slippery. Flash floods could wash out the road altogether. It was best to stay in one place, to stay inside, home and dry, out of the rain. But Kresh could see another, larger, and more dangerous storm, one that had swept across the planet, Comet Grieg bearing down in its wake. In that larger storm, the storm of politics and decision and danger, Kresh had no choice but to move forward, to venture out and choose the direction that would lead to safety.

  If any direction could do so. If there was any way to choose a path, or any way to know that it would lead in the direction it seemed to go.

  What was to be done?

  Alvar Kresh had faced many decisions in his life, made many choices that affected many people, but never had he felt the loneliness of decision more. If only Lentrall had discovered his damnable comet sooner. If only there were more time.

  “What am I going to do?” he asked the rain, speaking softly enough that his voice would not carry. But there were no answers, no guidance there. He turned and looked around his living room. Fredda and Donald were there, watching him, waiting for him to speak to them.

  It was a big, comfortable, informal room. Fredda had redecorated it in soft and gentle colors, pastel shades of yellow and white, with soft rugs and comfortable chairs and cheerful abstract murals on the walls. Kresh would not have picked out any of it for himself, and yet, somehow, it all suited him very well. It felt more like a home than any place he had ever lived by himself. Warm, and safe, and bright.

  But then Kresh saw the room flash white for a split second as a lightning bolt lit up the window behind him. The thunder came quickly after, a booming roar that seemed powerful enough to shake the room apart.

  A well-timed reminder, it was, that they were not safe, that they could build all the buildings and walls and barriers they liked. The world would still be outside, unpredictable, uncontrollable, unknowable.

  And why merely imagine the chance of Comet Grieg being spotted earlier? Comet Grieg could just as easily have been left undiscovered until it was much closer, until it was too late to even consider diverting it. Or else the comet’s natural, undiverted orbit could just as easily have been too far off to even contemplate moving it. Or the damned thing could have been heading in for an unplanned, uncontrolled direct hit on the planet. What would they have done then?

  But no. “What if” was no longer the question. Alvar Kresh, and Alvar Kresh alone, had to answer another question.

  “What now?” he asked Fredda and Donald. “What is to be done?”

  There was a long moment’s pause before either of them replied, the rain on the roof a fitting, brooding background to the mood of the room.

  “I don’t know,” said Fredda at last. “Either leave the comet alone or bring it in to drop on our heads. Those are the two things you can do. It seems to me that either one could save all the life on the planet from destruction—or actually bring on that destruction. Are we doomed if we do nothing? Can we drop the comet without killing us all?”

  Kresh made a thoughtful little noise in his throat. “That’s what it comes down to, isn’t it?” He considered for a moment, and then went on. “Of course, the traditional Spacer response would be to do nothing at all,” said Kresh. “Let it alone, let it pass. If there is no way to know if it would be better to act, why then far better to leave the thing alone. If you do nothing, then there is nothing you can be blamed for if things go wrong.”

  “Another proud legacy of the Three Laws,” Fredda said. “Be safe, do nothing, take no chances.”

  “If the Three Laws teach humans to avoid taking needless risks now and again, I for one see that as a very strong argument in their favor,” said Donald, speaking for the first time. “But even the First Law contains an injunction against inaction. A robot cannot stand idly by. It must act to prevent harm to humans.”

  Kresh looked toward Donald with a smile. “Are you saying that a robot faced with this decision would choose to bring down the comet? Is that what you would do?”

  Donald held up his hands, palms out, and shook his head back and forth vigorously. “By no means, Governor. I am quite literally incapable of making this decision. It would be a physical impossibility for me to do it, and more than likely suicidal to attempt it. So it would be for any properly constructed Three-Law robot.”

  “How so?”

  “The First Law enjoins us against doing harm to humans, and against inaction at such times when robotic action would prevent harm to humans.” Donald’s speech became labored as he spoke. It was plain that even discussing the issues in a hypothetical context was difficult for him. “In this case, either action or inaction might or might not cause or prevent harm to humans. Attempting to deal with such a difficult problem, with the lives of so many present and potential humans in the bal—balance would cause… would cause irreparable damage to any pospospositronic brain, as the question produced cascading First-Law/First-Law conflictzzz.” Donald’s eyes grew a bit dimmer, and his movements seemed oddly sluggish as he put his arms back down at his side.

  “All right, Donald,” said Kresh, in his firmest and most reassuring voice. He stepped over to the robot and put his hand on Donald’s sloping shoulder. “It’s all right. You are not the one who will have to make that decision. I order you to stop considering it at this time.” There were times when only the words of a robot’s direct master could be enough to snap the robot out of such a state.

  Donald’s eyes faded out all but completely for a moment, and then came back to their normal brightness. He seemed to be looking at nothing at all for a few seconds, but then his eyes began to track again, and he looked straight at Kresh. “Thank—thank you, sir. It was most unwise of me to consider the question so closely, even when called upon to do so.”

  Kresh nodded absently, knowing that he had brought it on himself. He had asked Donald why a robot could not make such a decision, and a question was, in essence, an order. It required constant caution, endless care, to deal with the delicacy of a Three-Law robot’s sensibilities and sensitivities. Sometimes Kresh was deeply tired of it all. There were even times when he was ready to concede that the Settlers might have a point. Maybe some parts of life would be easier without robots.

  Not as if they had such an option at the moment. But if robots could not be trusted to face such a situation… Kresh turned toward Donald again. “Donald, I hereby order you to turn around and face the wall, and to shut off all your audio inputs until you see my wife or me waving at you. Do you understand?”

  “Yes, sir. Of course.” Donald turned his back on Kresh and Fredda. “I have now shut down my audio receptors.”

  “Very good,” said Kresh. More damn fool precautions, but that couldn’t be helped. At least now Donald would be unable to hear or eavesdrop. Now they would be able to talk without fear of saying the wrong thing in front of the robot and accidentally setting up a damn fool First Law crisis. Kresh turned toward Fredda. “What about the Robotic Planetary Control Center?” he asked. “I wanted to consult with it—and with the Computational Planetary Control Center—before I reached a decision.”

  “Well, what about them?” Fredda asked.

  The two control centers were the heart of the reterraforming effort, performing all the calculations and analyses of each new project before it was launched. The original intent had been to build a single control center. There were two basic designs to choose between. One was a Settler-style computational unit, basically a massively complex and powerful, but quite nonsentient, computer. The other was a Spacer-style robotic unit that would be based on a hugely powerful positronic brain, fully imbued with the Three Laws. It would, in effect, be a robot mind without a robot body.

  There had been a tremendous controversy over whether to trust the fate of the planet to a mindless machine, or to a robotic brain that would refuse to take necessary risks. It was easy to imagine a robotic control unit choosing to avoid harm to one human, rather than permit a project vital to the future of the planet to proceed. The robotics experts all promised that it didn’t work that way, but experts had been wrong before. Governor Grieg had died before he could reveal his choice between the two systems. In one of the first acts of his administration, Kresh had decided to build both, and interconnect them so that the two systems worked in consensus with each other. In theory, if the two systems could not reach agreement on what to do, or not to do, they were to call in human referees to decide the issue. In practice, the two systems had agreed with each other far more often than anyone could have hoped. Thus far, there had only been a half dozen or so very minor issues that had required human decisions.

 
