Suspicion


by Mike McQuay


  Rydberg spoke. “We are in a standby security mode that renders certain information classified by our programming.”

  “Did our arrival prompt the institution of the security mode?” Katherine asked.

  “No,” Euler said. “It was in effect when you arrived. If, in fact, you arrived when you said you did. We must ask you again how you came to be here.”

  Derec decided to try a little truth. It couldn’t hurt as long as no mention was made of the Key. Perhaps a dose of the truth might get them to open up about the Key’s existence. “We materialized out of thin air atop this very building.”

  “And where were you before that?” Wohler, the gold one, asked.

  Derec walked slowly around the circle, studying his questioners. “A Spacer way station named Rockliffe near Nexon, right on the edge of the Settlement Worlds quarantine zone.”

  Arion, the mannequin, asked, “What means, then, did you use to get from one place to the other?”

  “No means,” Derec said. “We were simply transported here.”

  There was silence for a moment. “This does not coordinate with any information extant in memory,” Avernus said, his large dome following Derec’s progress around the circle.

  “You’ve found no ship that could have brought us,” Derec said, “and I’m sure you’ve searched.”

  “That is correct,” Euler said, “and our radar picked up no activity that could have been construed to be a vessel in our atmosphere.”

  “I can’t explain it beyond that,” Derec said. “Now, you answer a question for me. Where did you come from?”

  “Who are you addressing?” Euler asked.

  “All of you,” Derec said.

  Avernus answered. “All of them except for me were constructed here, on Robot City,” he said. “I was… awakened here, but believe I was constructed elsewhere.”

  “Where?”

  “I do not know,” the large robot replied. “My first i/o memories are of this place. Nothing in my pre-programming suggested anything of an origin.”

  “Are you trying to say,” Katherine broke in, “that all of you know nothing but the company of other robots? That your entire existence is here?”

  “Correct,” Rydberg said. “Our master programming is well aware of human beings and their societies, but no formal relationship exists between our species.”

  “Then how did you come to build this place?” Derec asked. “How, then, did it become important to you to make a world for humans?”

  “We are incomplete without human beings,” Waldeyer said, his squat dome swiveling to Derec and then Katherine. “The very laws that govern our existence revolve around human interaction. We exist to serve independent thought, the higher realms of creativity that we are incapable of alone. We discovered this very quickly, without being told. Alone, we simply exist to no end, no purpose. Even artificial intelligence must have a reason to utilize itself. This world is the first utilization of that intelligence. We’ve been building it for humans, in order to make the perfect atmosphere in which human creativity can flourish to the greater completeness of us all. Without this world we are nothing. With it, we are vital contributing factors to the ongoing evolution of the universe.”

  “Why would that matter to you?” Katherine asked.

  “I have a theory about that,” Dante said, his elongated eyes glowing bright yellow. “We are the product, the child if you will, of higher realms of creative thought. It seems impossible that the drives of that creative thought wouldn’t permeate every aspect of our programming. We want for nothing. We desire nothing. Yet, the incompleteness of our inactivity makes us… feel, for lack of a better word, useless and extraneous. Given the total freedom of our own world, we were driven to function in service.”

  Derec suddenly felt a terrible sadness well up in him for these unhappy creatures of man’s intelligence. “You’ve done all this, even though you never knew if any people would come here?”

  “That is correct,” Euler said. “Then David came, and we thought that all would be right. Then came his death, then the calamities, then you… suspects to murder. We never meant for anything to be this way.”

  “When you say calamities,” Derec said, “are you speaking of the problems with the storms?”

  “Yes,” Rydberg said. “The rains threaten our civilization itself, and it’s all our own fault. We are breaking apart from the inside out, with nothing to be done about it.”

  “I don’t understand,” Derec said.

  “We don’t expect you to, nor can we tell you why it must be this way,” Euler said.

  Derec thought about the hot air pumping through the reservoir. “Is the city’s rapid growth rate normal?” he asked.

  “No,” Euler said. “It coincides with David’s death.”

  “Is it because of David’s death?”

  “We do not know the answer to that,” Euler said.

  “Wait a moment,” Katherine said, walking away from the circle to sit on the floor, her back up against the north wall. “I want to talk to you about our connection with all this… and why Rydberg called this a preliminary trial.”

  “You were the one who first mentioned the concept of trial,” the robot replied, leaning out of the circle to stare at her. “I only used that term to make you feel comfortable.”

  “Okay,” she said. “I’ll play. You say this is a civilization of robots that have never had human interaction, yet obviously someone gave you your initial programming and ability to perform the work on this city.”

  “Someone… yes,” Euler said.

  “Someone who’s in charge,” she said.

  “No,” Euler said. “We are now in group communication with our master programming unit, but it simply provides us with information from which logical decisions are made. Our overall philosophy is service; our means are logical. Other than that, our society has no direction.”

  “Then why put us on trial at all?” she asked.

  “Respect for human life is our First Law,” Rydberg said. “When we envisioned our perfect human/robot world, we saw a world in which all shared respect for the First Law. We envisioned a system of humanics that would guide human behavior, just as the Laws of Robotics guide our behavior. Of course, we have been working entirely from theory, but we have made a preliminary list of three laws that would provide the basis for an understanding of humans.”

  “Cute,” Katherine said. “Now they want us to follow the Laws of Robotics.”

  Derec interrupted her complaint. “Wait. Let’s see what they’ve come up with.”

  “Thank you, Friend Derec. Our provisional First Law of Humanics is: A human being may not injure another human being, or, through inaction, allow a human being to come to harm.”

  “Admirable,” conceded Derec, “even if it isn’t always obeyed. What is your Second Law?”

  Rydberg’s hesitation before answering gave Derec the clear impression that the robot wanted to ask a question of its own, but his took precedence under the Second Law of Robotics.

  “The Second Law of Humanics is: A human being must give only reasonable orders to a robot and require nothing of it that would needlessly put it into the kind of dilemma that might cause it harm or discomfort.”

  “Still admirable, but still too altruistic to be always obeyed. And the third?”

  “The Third Law of Humanics is: A human being must not harm a robot, or through inaction, allow a robot to come to harm, unless such harm is needed to keep a human being from harm or to allow a vital order to be carried out.”

  “Not only is your experience with humans limited, so is your programming,” Derec said, shaking his head. “These ‘laws’ might describe a utopian society of humans and robots, but they certainly don’t describe the way humans really behave.”

  “We have become aware of that,” said Rydberg. “Obviously, we are going to have to reconsider our conclusions. Since your arrival we have been subjected to human lies and deceit, concepts beyond our limited understanding.”

  “But the First Law must stand!” Avernus said loudly, his red photocells glowing brightly. “Human or robot, all are subject to respect for life.”

  “We certainly aren’t arguing that point,” Derec said.

  “No!” Katherine said, standing angrily and walking back to the circle. “What we’re talking about is the lack of respect with which we’re being treated here!”

  “Kath… ” Derec began.

  “Shut up,” Katherine said. “I’ve been listening to you having wonderful little philosophical conversations with your robot buddies, and I’m getting a little tired of it. Listen, folks. First thing, I demand that you give us access to communications with the outside and that you let us leave. You have no authority to hold us here.”

  “This is our world,” Euler said. “We mean no offense, but all societies are governed by laws, and we fear you have broken our greatest law.”

  “And what if we have?” she asked. “What happens then?”

  “Well,” Euler said. “We would do nothing more than keep you from the society of other humans whom you could harm.”

  “Great. So, how do you prove we did anything in order to hold us?”

  “Process of elimination,” Waldeyer said. “Friend Derec has previously suggested some other possible avenues of explanation, but we feel it is incumbent upon both of you to explore them, not because we are trying to make it difficult for you, but because we respect your creative intelligence more than we respect our own deductive intelligence in an area like this.”

  Derec watched as Katherine ran her hands through her long black hair and took several deep breaths as she tried to get herself together and in a position to work with this. “All right,” she said, more calmly. “You said before that you won’t let us see the body.”

  “No,” Euler said. “We said that we can’t let you see the body.”

  “Why?”

  There was silence. Finally Rydberg spoke. “We don’t know where it is,” he said. “The city began replicating too quickly and we lost it.”

  “Lost it?” Derec said.

  Derec knew it was impossible for a robot to be or look embarrassed, but that was exactly the feeling he was getting from the entire group.

  “We really have no idea of where it is,” Euler said.

  Derec saw an opening and quickly took it. “In order to do this investigation and prove that we’re innocent of any First Law transgressions, we must have freedom of movement around your city.”

  “We exist to protect your lives,” Euler said. “You’ve been caught in the rains; you know how dangerous they are. We can’t let you out under those conditions.”

  “Is there advance warning of the rain?” he asked.

  “Yes,” Rydberg said. “The clouds build in the late afternoon, and the rain comes at night.”

  “Suppose we promise to not go out when the conditions are unfavorable?” Derec asked.

  Wohler, the golden robot, said, “What are human promises worth?”

  Katherine pushed her way beneath the hands of the robots to stand in the center of the circle. “What are our lives worth without freedom?”

  “Freedom,” Wohler echoed.

  A dark cloud passed above the skylight, plunging the room into a gray, melancholy half-light, illumination provided by a score of CRT screens, many of them now showing pictures of madly roiling clouds.

  The circle broke immediately, the robots, agitated, hurrying toward the door.

  “Come,” Euler said, motioning to the humans. “The rains are approaching. We must get you back to shelter. There is so much to do.”

  “What about my suggestion?” Derec called loudly to them.

  “Hurry,” Euler called, waving his arm as Derec and Katherine walked toward him. “We will think about it and let you know tomorrow.”

  “And if we can investigate and prove our innocence,” Katherine said, “will you then let us contact the outside?”

  Euler stood still and fixed her with his photocells. “Let me put it this way,” he said. “If you don’t prove your innocence, you’ll never be allowed to contact the outside.”

  Chapter 5. A Witness

  Derec sat before the CRT screen on the apartment table and watched the “entertainment” that Arion was providing him in the form, at this moment, of sentences and their grammatical diagrams. Before that it had been a compendium of various failed angle trisection theorems, and before that, an incredibly long list of the powers of ten and the various words that had been invented to describe the astronomical numbers those powers represented. It was an insomniac’s nightmare.

  It was a dark, gray morning, the air heavy with the chill of the night and the rain that had pounded Robot City for many hours. The sky was slate as the remnants of the night’s devastation drifted slowly away on the wings of the morning.

  He felt like a caged animal, his nerves jangling madly with the notion that he couldn’t leave the apartment if he wanted to. They had been dropped off in the early evening after the meeting at the Compass Tower and hadn’t seen a supervisor robot since. The CRT had no keyboard and only received whatever data they chose to show him from moment to moment. At this particular time, they apparently felt the need to amuse him; but the time filler of the viewscreen only increased his frustration.

  He hadn’t slept well. The apartment only had one bed and Katherine was using it. Derec slept on the couch. It had been too short for him, and that didn’t make sleeping any easier. But that wasn’t the real reason he’d been awake.

  It was the rain.

  He couldn’t get out of his head the fact that the reservoir had been nearly filled when he’d been flung into it the night before. How, then, could it possibly hold the immense amounts of water that continued to pour into it with each successive rainfall? He’d worried over that point: the more rain, the greater the worry. The fact that the supervisors hadn’t contacted him since before the storm seemed ominous. All of their efforts seemed to revolve around the weather problems.

  How did the weather tie in with the rapid growth rate of the city? Were the two linked?

  “You’re up early,” came Katherine’s voice behind him.

  He turned to see her, face soft from sleep, framed by the diffused light. She looked good, a night’s sleep bringing out her natural beauty. She was wrapped in the pale green cover from her bed. He wondered idly what she was wearing beneath it, then his thoughts turned unconsciously to his awakening, after the explosion in Aranimas’s ship, in the medical wing of Rockliffe Station to find her naked on the bed beside him. Embarrassed, he pushed that thought aside, but its residue left another thought from that time, something he had completely forgotten about.

  “Can I ask you a question?” he said.

  Her face darkened and he watched her tighten up. “What is it?” she asked.

  “When we were at Rockliffe, Dr. Galen mentioned you had a chronic condition,” he said. “Later, when he began to talk about it, you shut him up.”

  She walked up to look at the screen, refusing to meet his gaze. “You’re mistaken,” she said. “I’m fine… the picture of health.”

  She turned slightly from him, and there seemed to be a small catch in her voice. When she turned back, her face was set firm, quite unlike the vulnerable morning creature he’d seen a moment ago. “What’s happening on the screen?” she asked.

  He looked. A pleasant, always changing pattern of computer-generated images was juicing through the CRT, accompanied by a random melody bleeped out of the machine’s tiny speaker.

  “You make it very hard for me to believe you,” he said, ignoring the screen. “Why, when we need total honesty and trust between us, do I feel that you’re holding back vital information from me?”

  “You’re just paranoid,” she said, and he could tell he was going to get nothing from her. “And if you don’t change the subject quickly, I’m going to find myself getting angry, and that’s no way to start the day.”

  He reluctantly agreed. “I’m worried about the rains,” he said. “They were worse last night than the night before.”

  She sat at the table with him. “Well, if this place is getting ready to have major problems, I hope we’re out of here before they happen. We’ve got to get something going with the murder investigation.”

  “Do you know what makes rain?” he asked, ignoring the issue of the murder.

  “What has that got to do with our investigation?” she asked, on edge.

  “Nothing,” he said. “I’m just wondering about these rains, I… ”

  “Don’t say it,” she replied, holding up a hand. “You’re worried about your robot friends. Well, let me tell you something: your friends are in the process of keeping us locked up for the rest of our lives… ”

  “Not locked up, surely,” he interrupted.

  “This is serious!” she said, angry now. “We have a very good chance of being kept prisoner here for life. You know, once they make a decision like that, I see no reason that they would ever change it. Don’t you understand the gravity of the situation?”

  He looked at her calmly, placing a hand over hers on the table. She drew it away, and he felt his own anger rise, then rapidly subside. “I understand the problem,” he explained, “but I fear the problem with the city is more pressing, more… immediate.”

  “But it’s not our problem. The murder is.”

  “Indulge me,” he said. “Let’s talk about weather for just a minute.”

  She sighed, shaking her head. “Let’s see what I remember,” she said. “Molecules respond to heat, separating, moving more quickly. Water molecules are no exception. On a hot day, they rise into the atmosphere and cling to dust particles in the air. When they rise into the cooler atmosphere, they turn into clouds. When the clouds get too heavy, too full of water, they return to the ground in the form of rain.”

  “Okay,” he said. “And wind is simply the interplay of heat and cold in the atmosphere.”

 
