Asimov’s Future History Volume 7

by Isaac Asimov


  She balled her fists and bit her lip, looked up at the ceiling, then shook her head. “I don’t think he’s dangerous. He’s never hurt anyone intentionally. And I just want this whole business to be over with. So yes, tell him I’ll trust him.”

  Derec was about to relay her words to Lucius, but he realized that he needn’t bother. “Okay,” he said aloud. “Come on out from wherever you’re hiding.”

  There came a soft tearing sound, and a section of ceiling near the door peeled away to fall with a flop against the wall. It peeled off the wall as well, gathered into a lump on the floor, and quickly rose on two legs to become Lucius’s familiar form.

  Despite his other failings, Lucius made an excellent surgeon. Within a day, Ariel was up and walking around again, though still somewhat sore. Even so, she was far better off physically than mentally, for in that area neither Lucius nor anyone else could help her heal. Derec was the only one who could even begin to ease the torment she was going through, but he was feeling it just as strongly as she.

  Had they done the right thing? Of course they had. They knew they had. Hadn’t they?

  As Derec struggled with his own feelings of guilt, he found himself appreciating Avery’s position for the first time. What a load his father carried around with him, considering all he had done! With a background like his, just carrying on from day to day would be a continual struggle, especially with Derec there as a constant reminder of it.

  No wonder Avery strove to keep busy. It kept his mind off his past. After an absolutely disastrous day spent moping around the apartment, both Derec and Ariel realized the wisdom of his strategy, and followed his example.

  While Derec and Avery set to work preparing the city robots for their reprogramming to suit the Ceremyons, Ariel and Wolruf set out to meet with them to find out what they had decided they wanted. The meeting was easy to arrange; Lucius contacted Adam and Eve, who were back with the aliens again, and between them they settled on a time and place.

  Ariel left for the meeting in relatively high spirits, but she returned with a puzzled frown.

  “The Ceremyons want us to make philosophers out of the robots,” she reported, slumping down in a chair and putting her hand to her forehead. “I told them that’s not what robots were for, but they insisted. They said they’ve got a bunch of difficult philosophical questions that they haven’t been able to work out, so their council decided to let the robots have a try at them.”

  “What are the questions?” Avery asked, looking up from his computer terminal.

  “They didn’t say. They said they wanted us to reprogram two robots for philosophy and let them see how well they work.”

  Derec and Avery looked at one another with eyebrows raised skeptically. Derec said, “I don’t know, the original Wohler thought he was a philosopher, but I didn’t think he was very profound.”

  “He was just spouting other people’s philosophy,” Ariel added. “He didn’t come up with anything of his own.”

  “Of course he didn’t,” Avery said. “That’s because he didn’t do any cross-correlation.” As Derec watched, Avery’s skepticism disappeared, replaced by a fanatic gleam in his eye that Derec recognized. Avery saw the aliens’ request as a challenge, and he intended to meet it. “He wasn’t programmed to combine old information into new patterns, so all he could do was echo the thoughts of others. But if we give our robots the ability to compare and to generalize, and for working material load them up with all the philosophy texts in the central library, they’ll be able to out-think these Ceremyons hands down. It won’t be real thinking, but with a big enough library behind them, it’ll be completely convincing to the user. Ha! It’ll be easy.” Avery turned back to the computer and began keying instructions furiously.

  Without looking up, he said, “Get this city’s Wohler unit up here to try it on. It should accept the new programming easier than just a random robot.”

  “You melted him along with the other supervisors,” Derec reminded him.

  “Oh. Well, then, have another one made.”

  Derec obediently contacted the central core and advised it that Avery wanted another Wohler.

  “Here, you can help with the programming, too. Dig out the code the supervisors use to reject crazy buildings, and see if you can modify it to filter out crazy thoughts. I’ll work on the correlation routine.”

  With a smile and a shake of his head for Ariel’s benefit, Derec got to work. Ariel and Wolruf stayed for a few minutes, but soon became bored and left. Lucius stayed, standing silently behind Derec and Avery where he could see what either of them did.

  They spent the better part of the afternoon on the project, but they were ready by the time a new golden-hued robot presented itself at the door.

  “I am Wohler-10,” the robot said.

  Avery looked up, rubbed his eyes, and said, “Good. Scan this.” He handed Wohler a memory cube, which the robot took in its right hand. The hand flowed until it completely enveloped the cube, then after a few seconds returned to normal. Wohler gave the cube back to Avery.

  “What is the relationship between free will and determinism?” Avery asked him.

  “Determinism is necessary for free will, but not the reverse,” the robot answered without hesitation.

  “Did you think that up just now, or was it already in memory?”

  “It was already in memory.”

  “Hmm. How does free will differ from freedom, and how does that difference affect a robot’s behavior?”

  Wohler hesitated slightly this time before saying, “Free will is the ability to act upon desires. Freedom is the ability to use free will indiscriminately. For practical purposes, a robot has neither. I can elaborate if you wish.”

  “No, that’s fine. Was that your thought this time?”

  “It was a correlation from existing definitions, but it did not exist previously in the data bank.”

  “Good. What is reality?”

  “I quote: ‘Reality is that which, when you cease to believe in it, does not go away.’ Source: Philip K. Dick, twentieth-century author, Earth. I have on file seventy-three other definitions, but that one seems most logical.”

  Avery grinned at Derec and spread his hands. “One out of three responses are original. That’s a pretty good average among philosophers. I think he’ll do.”

  Lucius made a humming sound, a robotic clearing of the throat. “May I ask a question?”

  Avery frowned. He obviously still didn’t trust the renegade robot, but with a shrug, he said, “Fire away.”

  Lucius turned to face Wohler. “What is a human?”

  Wohler hesitated even longer than before. At last he said, “That definition depends upon your point of view.”

  Avery burst into laughter. “He’s a philosopher, all right! Come on, let’s fix up another one and give them to the Ceremyons tomorrow.”

  Chapter 9

  FRIENDS

  THEY CHOSE A regular city robot for the second philosopher, testing him thoroughly to make sure that his answers were the same as the brand-new Wohler’s. His experiences in the city and his previous reprogrammings didn’t seem to affect his responses at all. They arranged a meeting through Lucius, and this time they all went to present the philosopher robots to the aliens.

  They met at the edge of the spaceport farthest from the city, a spot no doubt chosen by the aliens to communicate their displeasure with the city and its inhabitants.

  There were two of the living silhouettes at the meeting this time, as well as two alien-looking but obviously robotic companions: Adam and Eve. The robots ignored the humans, and the humans returned the courtesy. Sarco ignored the robots as well, but, realizing that humans couldn’t distinguish one alien from another, he introduced himself again, then introduced his companion, Synapo, whom all but Avery had already met the first time they had been to Ceremya.

  “And these are the philosophers?” Synapo asked dubiously. “I believe I recognize this one. It directed the killing of two of my people when this city first began growing here. It is a most unpleasant robot.”

  Derec had forgotten about that incident. It had happened because the robots didn’t see the aliens as human, and were following the simplest procedure to get them out of the way. It was a stupid mistake then, and Derec’s decision to use a Wohler model for a philosopher was a stupid mistake now. Wars had been fought over lesser matters.

  “This is a different robot,” he said, trying to smooth over the unintended insult. “The old Wohler was inactivated.”

  “A wise decision,” Synapo said. The alien looked to its companion, receiving an eyeblink and a rustling of its wings in response. That was evidently the Ceremyon equivalent of a shrug, because Synapo said, “Well, then, to the test. Sarco, do you wish to ask the first question, or shall I?”

  “The honor is yours,” Sarco said.

  Synapo bobbed down and up again in a gesture no doubt meant as an acceptance of Sarco’s courtesy. “Very well. The new Wohler, then. I ask you this: What is the value of argument?”

  Wohler folded his arms across his chest, a gesture Derec had taught him, and said, “The value of argument is that it allows two opposing views to be expressed, along with supporting evidence for each, so that an examination of the evidence can then lead to a determination of the more correct of the two views.”

  “A reasonable answer. And you, the other robot. Your name?”

  “Plato.”

  “Plato. What is your answer to the same question?”

  “It must, of course, be the same answer.”

  A tiny flame shot out from the darkness of Synapo’s face. Sarco said, “Why must it be?”

  “It is the correct answer.”

  “Then apply that answer to the discussion at hand!”

  Plato looked at Sarco, then shifted its eyes to look helplessly at Derec. “I must disagree with a correct answer?”

  Synapo’s flame winked out. “Of course you must!” he said. “That is the root of philosophical debate. If we all agreed, we could learn nothing.”

  Plato tried. He said, “Then I... then argument has no value. It is a pointless waste of energy. The correct answer should be obvious to all.”

  “Wrong!”

  “Of course it is wrong!” Plato said desperately. “You told me to disagree with a correct answer!”

  “That did not mean you had to give an incorrect one. You are not a philosopher. Dr. Avery, these robots are useless to us.”

  “Wrong,” said Wohler. “We are useless to you in our present form.”

  Synapo jetted flame again, but Sarco jiggled up and down in obvious amusement. “It caught you!” the alien hooted.

  Synapo’s eyes shifted to the robot. “I stand corrected. You are useless to us in your present form. Perhaps in another form you would not be useless. Dr. Avery, what else can these robots do?”

  “What do you want them to do?” Avery asked in return.

  “Philosophize, but that seems too much to ask. Sarco, do you have another suggestion?”

  “You know I do,” Sarco replied. His eyes shifted to meet Avery’s. “At our council meeting, I suggested that the robots be used as musicians. It was my thought that each of us could be attended by a personal musician who could play melodies to fit our individual moods.”

  “That’s simple,” Avery said. “They can do that without modification.”

  “Unlikely,” Sarco said. “Our music consists of modulated hyperwave emissions.”

  “Okay, then,” Avery said with a nod, “we’ll need to give them hyperwave transmitters. And you’ll have to teach them some of your songs.”

  “That can be done. Synapo?”

  “Very well. My suggestion came to nothing; we’ll see how yours fares. When will the robots be modified?”

  “I can have them back to you by tomorrow,” Avery said.

  “We will be here.” Synapo backed away, gave a running hop, and was airborne. Sarco followed, and Adam and Eve, who had been silently flanking them all along, also turned to go.

  “Wait a minute,” Derec said. “I want to talk to you.”

  “What do you wish to say?” the one on the left asked in Adam’s voice.

  “Why don’t you come back with us?”

  “We do not wish to.”

  “Why not? You can have the same deal we made Lucius. Peaceful coexistence while you figure out your definition of human.”

  “We are working on that definition with the Ceremyons. In fact, at this point we believe them to be more human than you.”

  “Because they don’t ask you to do anything,” Ariel put in.

  “You have a clear understanding of the situation,” the robot replied.

  Avery shook his head. “Stay with them forever, for all I care. Good riddance. Come on, Wohler, Plato. Let’s see if we can give you two rhythm.”

  They could, but that, it seemed, was not enough. It came close, closer than their first attempt to please the aliens, but on the morning of the third day after the trial, Lucius received a message from his counterparts that the aliens wanted to meet with the ‘self-named humans’ one more time.

  They took transport booths out to the edge of the spaceport again. Sarco and Synapo were already waiting for them by the time they arrived, along with Adam and Eve and the musician robots as well.

  Wohler was still recognizable by his gold color, but that was the only way to tell him from the other three robots. All had taken on the Ceremyon form.

  The alien on the right stepped forward and said, “I am Sarco. These robots are not musicians.”

  “What’s the problem this time?” Avery asked with a sigh.

  “They are nothing more than elaborate recording and playback devices with the limited ability to improvise on a theme. In all the time they have been with us, not once has either of them been able to create a completely new piece of music.”

  “Well, not quite,” amended Synapo. “They are able to produce random variations, which are new.”

  Sarco snorted flame. “I said ‘new piece of music,’ not just new noise.”

  “Sarco is a music lover,” Synapo explained. “He is greatly disappointed.”

  Avery nodded. “All right. Let’s get one thing straight. Twice now you’ve asked me to give you robots with creative minds. I’ve tried to accommodate you, but I think you’re missing the point. Robots aren’t supposed to be used for creativity. That’s our job. Robots were made for the drudge work, for servants and laborers and all the other tasks that you need to have done in order to keep a society going but that nobody wants to do.”

  Sarco said, “Our society exists without such drudge work, as you call it.”

  “Then you don’t need robots.”

  “Which is precisely what I told you at our first meeting.”

  Avery threw up his hands in defeat. “All right. Forget it. We’ll take them off your hands. I was just trying to be helpful.”

  The irony of it was, Derec thought, Avery really was trying to be helpful. It was almost as if he wanted to prove to himself that he could still do it. And here the aliens were telling him that the only way he could help was to take his toys and go home.

  “May I ask what you intend to do with them?” Synapo asked.

  “What does it matter? They won’t bother you anymore.”

  “I am curious.”

  “All right, since you’re curious, I’ll probably order them to self-destruct.”

  Synapo and Sarco exchanged glances. The robots did so as well.

  “That would be a great waste,” Synapo said.

  “Waste? You just said they weren’t any good to you. With the planet already occupied, they aren’t any good to me, either. If there’s no use for them, then how can it be a waste to get rid of them?”

  “They represent a great degree of organization.”

  “Who cares? Organization doesn’t mean anything. An apple has more complex organization than a robot. What matters isn’t how sophisticated it is, but how much it costs you to produce. These robots are self-replicating; you can get a whole city from one robot if you’ve got the raw materials, so their cost is effectively zero. That’s how much we lose if we get rid of them: nothing.”

  “But the robots lose. You forget, they are intelligent beings. Not creative, granted, but still intelligent. Perhaps too intelligent for the purpose for which you use them, if this is your attitude toward them.”

  “They’re machines,” Avery insisted.

  “So are we all,” Sarco said. “Biological machines that have become self-aware. And self-replicating as well. Do you maintain that our value is also zero, that we need not be concerned with individual lives, because they are so easy to replace?”

  Avery took a deep breath, working up to an explosive protest, but Ariel’s response cut the argument from under him.

  “No,” she whispered. “They’re all important.” She turned to Avery, and her voice grew in intensity as she said, “We just went through all this. Didn’t we learn anything from it? Derec and I aborted our own baby because it was going to be born without a brain. Without that, it was just a lump of cells. Doesn’t that tell us something? Doesn’t that tell us the brain is what matters?”

  Lucius said to Derec, “You told me that adding a robot brain to the baby at birth would not have made it human.”

  Ariel looked surprised, and Derec realized she hadn’t been in on that conversation. Even so, it only slowed her down for a moment. “That’s right,” she said. “It wouldn’t have. It would have been a robot in a baby’s body, and we didn’t want a baby robot. But the one question you didn’t ask was whether or not we would have aborted it if it was already as intelligent as a robot, and the answer is no. We wouldn’t have, because even a robot is self-aware. Self-awareness is what matters.”

  “You are more civilized than we thought,” Synapo said.

  “We try.” Ariel reached out a hand toward Wohler. “Come on,” she said. “I owe you a favor. The original Wohler lost his life saving me from my own stupidity; the least I can do is save his namesake.”

 
