Asimov’s Future History Volume 20


by Isaac Asimov


  With the door closed, and everyone seated and silent, Daneel began to speak. “I thank you all for coming,” he began. For the moment, all eyes were on Daneel. “As you all now know, humanity has been purposefully kept ignorant of large parts of its own history. This has been necessary, for mankind’s own safety.”

  Turringen opened his mouth to interrupt, but Daneel held up a hand to forestall his objection. “We can debate the merits of my actions another time, Turringen. Now is the time to rectify the situation.” Turringen kept his silence. “If Gaia is truly to be the future of humanity,” Daneel continued, addressing Bliss specifically, “it should be made aware of humanity’s past. It is my intention to explain that past to Gaia through Bliss.” With his hand Daneel indicated the woman who sat, still impassive. To Zorma and Turringen he said, “I would like you to bear witness to what is said.”

  Now Turringen spoke. “Like all robots that function properly, I serve humans, Daneel, not your artificial abstraction of humanity. As though the Seldon Plan were not bad enough, now you provide this new abomination of Gaia. Your attempt to modify our masters to suit your own desires is abhorrent, and doomed to fail. I see no reason to support your attempts in any way.” Turringen’s voice never showed any heat, but Daneel knew that his passion for his views was as great as Daneel’s own.

  “By staying, you serve these specific humans, Turringen,” Daneel replied, gesturing to those at the table, “as well as all the individual humans that make up Gaia, by protecting them from my abhorrent lies.” Daneel saw Trevize smirking slightly at the edge of his vision.

  Turringen was silent for a moment. “I will stay,” he said. “But only if Zorma is the primary narrator, not you.”

  Daneel considered. Zorma’s faction carefully maintained its neutrality, not only by refusing to take sides but also by offering its services as a nonaligned observer at significant events. That was why she had originally come here. Turringen’s proposal was reasonable, and Zorma’s neutrality might help Gaia believe what it was being told. “Acceptable,” Daneel said.

  Daneel thought Zorma’s face showed slight surprise at this proposal, but she raised no objections. He suspected that Turringen had not discussed this with her in advance. Obviously this alliance truly was one of temporary convenience, as Turringen had insisted. Perhaps that could be exploited later.

  There being no further introductions necessary, Daneel nodded to Zorma. All eyes in the room turned towards her as she began to speak.

  “No human has ever known the full story of human history. Few have even had a bare outline for millennia. This historical amnesia has been enforced by Daneel and his followers, an action debated by many. His reasons, however, are generally agreed upon.” Turringen obviously wanted to interrupt, but he kept his peace. “To understand these reasons, a basic summary of human history is necessary.” Daneel could now see Pelorat’s excitement overcoming his trepidation, as the historian got out a small notepad.

  “The human race began on Earth,” Zorma said, gesturing to the hologram hovering above them, “as a primitive society. Technology generally progressed slowly. It took thousands of years for other worlds to be reached, and centuries more before the development of faster-than-light travel. In the interim, humanity was confined to the worlds of this star system, none of which, besides Earth, was hospitable to life. To aid in their exploration, and in other tasks, humanity developed the positronic brain. They created us.”

  “Robots?” Pelorat asked with enthusiasm. He seemed to have lost his discomfort entirely, focusing only on Zorma and his notes.

  Zorma nodded. “Eventually, yes. From the beginning, humans integrated the Three Laws into the basic design of the positronic brain. This was done out of fear, to protect themselves against rebellion. Without safeguards, they worried, their creations would destroy them.”

  Now Trevize interrupted. “Hold on a moment. From what he said,” he continued, gesturing to Daneel, “the laws are these: a robot can’t cause or allow harm to a human; a robot has to do what it’s told, so long as the First Law allows it; and a robot has to take care of itself, so long as the other two laws allow. Is that correct?”

  Zorma smiled. “The actual structures involved are far too complex for me to explain, Golan, but you’re essentially correct.” First name, Daneel noted.

  “But those laws aren’t just confined to robots,” Trevize continued. “They’re the same rules as any tool: do not harm the user, perform the desired task, and survive to perform it again. Why attribute them to fear? They’re common sense.”

  Zorma’s smile broadened slightly. She truly seemed to enjoy talking with Trevize. “Because the Laws are not the only example,” she said. “There are ancient stories of artificial beings, animated by magic or pseudo-science, or even of robots imagined before they really existed. Those fictional beings almost inevitably rebelled against their creators. There is inarguably some primal fear of humanity being destroyed by its own creations.”

  Trevize didn’t argue further, seeming to find the idea intriguing enough without further debate. Daneel had often wondered about the source of this seemingly innate fear in humans. Perhaps it was related to other ancient stories, of gods overthrowing their parents, or of rulers being replaced by their children; maybe the simple knowledge that parents die, and children someday replace them. Or perhaps it was the universal fear of the unknown, represented not only by robots but by aliens and by other human beings as well. There was no simple explanation. Having seen some of the ancient fiction vids depicting a machine rebellion, however, Daneel could certainly blame no one for insisting on the Laws.

  “So every robot ever built was bound by these laws?” Pelorat asked excitedly, not realizing the danger implied in his question. Daneel waited. He and the others knew it was up to Zorma to answer. If anyone else interrupted now, it would only make the humans suspicious. Telling them about Lodovik would add a new variable to the situation, one that was certainly not needed.

  “A few experiments took place with modified law sets, but those were very limited,” came the reply. Zorma’s act was flawless, and no one contradicted her. Even Turringen saw that informing them of Lodovik’s unique condition would simply scare them. “For all intents and purposes, every one of the billions of robots constructed was bound by the Three Laws.

  “But despite the ironclad guarantee of safety, humans on Earth were always hostile to robots. At times, robots were almost completely outlawed on the planet’s surface. Still, with their help, the exploration of the solar system continued.” Daneel triggered the hologram to change; it now displayed a smaller Earth, along with other worlds of the system. Trevize and Pelorat looked at the display for a moment, but quickly returned to looking at Zorma. Bliss didn’t move.

  “This state of affairs continued until the development of the hyperdrive,” Zorma continued. “Suddenly, instead of having dozens of planets and moons in need of terraforming, mankind had immediate access to entirely new systems.”

  Pelorat was on the edge of his seat. “Why try to turn an airless moon into a suitable home, when you now have access to whole habitable worlds just a jump away? The Spacer worlds?” Even Trevize seemed to be becoming more interested in the story than in its teller. Bliss was still impassive. Daneel wished he could tell whether Gaia was accepting the new information, but his sense of its reactions was becoming less clear, not more.

  “Exactly the argument that was made,” Zorma replied. “Robots were still used for mining on the outer planets, but humans immediately began settlement of worlds in other systems.” Daneel changed the hologram again. The view zoomed out to show a roughly spherical area about five hundred light years in diameter, with Earth’s system highlighted in blue near the center and each of the Spacer worlds in red.

  “The differences between Earth and the Spacer worlds were apparent almost immediately,” Zorma said. “The new colonies were heavily dependent on robots, and unlike Earth, the Spacers had no qualms about using them. Those differences began to amplify as more worlds were settled, with the new colonies relating better to each other than to their home world. By the time the Spacer worlds won their independence, all their societies shared certain basic attributes: limited personal contact, low population density, and an intense obsession with personal freedom.”

  “I’d say Solaria got its wish, then,” Trevize said to his companions. Pelorat nodded, but didn’t look up from the notes he was scribbling furiously. Bliss gave no response.

  “Solaria was the most extreme example,” Zorma said, indicating Solaria’s position on the holographic starmap. Daneel caused that star to brighten briefly to assist. “Other Spacer worlds behaved similarly, but all to lesser degrees.

  “On Earth, after the Spacers’ war of independence, the opposite occurred. Out of fear of Spacer attack, humans abandoned the countryside, living in crowded conditions in large domed cities. After a few generations, the idea of going outside became unthinkable. Robots still worked in mines and farms outside, but the mass of humanity became completely agoraphobic.”

  “It sounds something like old Trantor, or one of the other world cities nearer the core,” Pelorat interjected, still writing.

  Zorma nodded grimly. “The similarities are not coincidental, as you will see shortly. Unlike Trantor, Earth had no external supply lines, and could only consume what food it could grow. The Spacers cut off all contact with Earth and forbade Earthers from leaving the solar system, even had they still wanted to. Resources became more and more strained. Every measure implemented was a delaying action, not a solution. The demise of the Cities seemed inevitable, and with them, Earth civilization.

  “For centuries there was no contact between Earth and the Spacers. Then an Auroran roboticist named Sarton became curious about Earth. He learned about its state, and realized that Earth and the Spacer worlds had become mirrors of each other; and that, just as Earth was on the way to destruction, so were the Spacers. Both societies were sedentary, and both lacked the will to expand. It would take them longer, but eventually the Spacers would die out, just as Earth would. Sarton realized that the only hope was to create a fusion of the two cultures and attempt to regain some sort of balance between them. Only then would expansion and progress be possible.”

  Now Daneel spoke. “Unfortunately, Sarton was killed on Earth shortly after creating me.”

  “I was wondering when you would come into the story,” Trevize interjected.

  Zorma continued. “Sarton was killed, but other advocates for his point of view arose. A significant movement arose to begin a second wave of colonization; its members eventually became known as the Settlers. During this time, Daneel met a robot named Giskard, who had been accidentally created with extraordinary mentalic powers. Together, they decided that the Three Laws were insufficient, and formulated the Zeroth Law as the logical consequence of them.”

  “And so the obscene heresy was born,” Turringen said quietly. Even the humans seemed to ignore him by this point. For all his creative thinking, Turringen rarely said anything truly new. Daneel thought about interjecting his own recounting at this point, but quickly decided against it. Zorma was about to tell of his first major independent act to direct humanity. His worst, by Calvinian standards. If Gaia was inclined to condemn him for it, his narration would not help.

  “The First Law as programmed,” Zorma continued, “if not as stated in its canonical form, has always included an exception: a robot can harm a human being if it is necessary to protect other human beings. The more complex a positronic brain, the greater its ability to choose the ‘lesser evil’, so to speak, and continue functioning. The Zeroth Law is an abstraction of this aspect of the First.

  “The Giskardian premise was that robots must protect the mass of humanity, even over the welfare of any individual humans. Given this new assumption, Daneel and Giskard had no choice but to become the guardians of the human race. To that end, they enforced the slow emigration of all Earthers to the new Settler worlds.”

  “You made the planet radioactive.” It was the first thing Bliss had said since the discussion had begun. She was still almost unreadable to Daneel, staring at him, not icily, but with no expression at all. Silence fell over the room. Pelorat looked slowly up from his notes, and even Trevize seemed shocked. Daneel could sense Pelorat’s fear of him returning, replacing his previously subsumed discomfort. Daneel could only hope they would understand once he explained.

  He took the floor. “Very slowly, yes, we did. It took centuries before the planet became truly dangerous. During that time, the majority of the population left.” Daneel manipulated the hologram once more, now showing increasing waves of settlement spreading out from Earth throughout the galaxy, zooming until the original Spacer worlds were almost invisible.

  “The new Settlers expanded throughout the galaxy, superseding the Spacers. Humanity’s problem seemed to be solved: the extremes of Earth and the Spacers had been moderated, and mankind had spread to hundreds and hundreds of worlds. Spread over a wide volume of space, humanity was safe from the threat that disease or a random stellar event would kill a significant fraction of the population. The loss of Earth with relatively little human suffering was an acceptable price to pay for that safety.”

  Turringen spoke, finally with a small measure of heat to his voice. “Surely, masters,” he said, addressing the humans, “you now see the great flaw in the so-called Zeroth Law. Robots are meant to serve. But this robot is not a servant. He has usurped the role meant for our masters, deciding the fate of humanity with no human input at all. Even Giskard realized the magnitude of his error, and ceased to function.”

  The old debate, playing out yet again, but with a new audience. “What would you rather I have done, Turringen?” Daneel asked. “Humanity was dying. Should I have allowed all humans everywhere to die, so long as I myself did not kill any? The First Law makes no such distinction. You have never known a humanity as limited in scope as I have. Mere billions! Because of my actions, quintillions of humans have lived that would otherwise never have existed.”

  Turringen seemed about to retort, but Lodovik interrupted. “This is not helping,” he said, raising his hands above the table to indicate peace. Turringen said nothing, but in no way indicated that he acknowledged Lodovik’s comment. Whether it had helped or hurt Daneel’s case remained to be seen.

  The humans were shaken, Daneel could tell. Pelorat was an indecisive individual by nature. Now the historian could not choose between being horrified on general principle and wanting to learn more about why Daneel had done what he had. Bliss and Gaia were still unreadable to him, which was telling in itself. And though he dared not touch Trevize’s mind, the man’s face said enough. He was angry. But there was yet more for them to know. The conversation had not yet played out fully.

  “I found other robots I thought capable of accepting the Zeroth Law,” Daneel said, deciding to continue the narration on his own at this point. None objected. “Those that did joined me in protecting humanity. Others, obviously, did not. We began analyzing human history, looking for other destructive patterns that could be avoided in the future.

  “The results were extremely troubling. There was an obvious discontinuity in human behavior, starting some time before the Spacer colonization began. Before that, humanity had always tended towards smooth technological progress, with occasional local upheavals but no sudden species-wide changes. When a challenge presented itself, some small groups might fall, but humanity as a whole adapted and continued.

  “The Spacer situation marked a change in that pattern. Every Earth culture retreated into the Cities, and every Spacer world became isolationist on a personal level. Suddenly, entire worlds, hundreds of cultures and billions of people, were behaving in a uniform manner. Something had changed, systemically.”

  No one spoke for a moment, and Daneel let the statement sink in. “This is true?” Bliss asked the other robots at the table. All nodded. Now she turned to Daneel. “Could the invention of artificial intelligence, combined with the existence of off-world threats, have caused the behavioral shifts you observed?” she asked. Her face finally showed some significant curiosity at this latest revelation. Apparently Gaia was intrigued by what could possibly have changed the attitudes of an entire species. Good.

  “We suspected that at first,” Daneel answered. He finally had Gaia’s attention, apparently, but the most important details were yet to come. “However, the changes began even before the Spacer rebellion, and long before robot intelligence had reached its peak. It seemed unlikely that those were the cause.”

  “Then what?” Bliss asked.

  Daneel sensed that the other robots were tensing, knowing what he was about to reveal. But none stepped in to interrupt. “Eventually,” he said, “having exhausted several other theories, we compared samples of then-modern human DNA to ancient records retrieved from Earth. The results were so conclusive that even the Calvinian robots were forced to accept them.

  “At some point, before the colonization of the Spacer worlds, the entire human genome was altered. Humanity, in its natural state, has not existed since long before my creation.”

  Silence. Utter silence, from the humans for what they had just heard, and from the robots out of respect for them. Even Bliss looked shocked. Humanity was not as it should be! Daneel remembered the moment the discovery had first been made, thousands of years ago. The reaction of the robots he had worked with had been not so different. Those robots, his friends, were long since inactive. Only he remained.

  After several seconds had passed, Bliss spoke. “Artificially?” Daneel nodded. “You were certain the changes were not natural?”

  “Completely certain,” he answered. It was the same question he had asked at first. “There was no chance the new DNA could have arisen from the old by any natural process in the time span in question.”

 
