Asimov’s Future History Volume 8

by Isaac Asimov


  “Would someone please explain this to me?” Mia asked.

  “Derec built it, let him try,” Ariel said in disgust.

  “Under normal circumstances...” Derec started. His throat caught, and he coughed. “Normal circumstances... whatever that means... the Three Laws are built into every positronic brain, part of the core template. They represent the First Principles for a robot, the foundation on which all its subsequent learning and experience rests. The initial designers set it up so that almost all the secondary programming requires the presence of those laws in order to function. It would require a complete redesign of all the manufacturing methods as well as the basic pathways themselves to build a brain without the Three Laws. They aren’t just hardwired into the brain, they are basic to the processes of constructing one.”

  “Is that what you did? Redesign everything?”

  “No. I’m not that good. Nor am I that irresponsible. All I did was set in place a new set of parameters for the application of the Laws.”

  “You circumvented them,” Ariel snapped.

  “I did not. I took an accepted standard practice and stretched it.”

  “What practice?” Mia asked.

  “Setting conditions of when the robot perceives that it is responsible for a violation,” Derec explained. “Think about it. According to the First Law, if a robot followed it absolutely, all robots would collapse. ‘A robot may not injure a human being, or, through inaction, allow a human being to come to harm.’ Consider that as an absolute. Human beings are always coming to harm. Time and distance alone guarantee that a robot can’t act to prevent that harm in all cases. If some kind of buffer zone, a hierarchical response, weren’t in place, the instant robots realized how many humans came to harm because they weren’t doing something to prevent it, we would have no functional robot population. So we establish a reasonable limitation for its application. If a robot does nothing to stop the human it is standing right next to from being killed–say, by a speeding vehicle, out of control–then it fails and collapse follows. If that same robot can do nothing to prevent the same fate happening to a human a kilometer away from it, the only way it fails is if another human forcefully asserts that it was at fault. It does not automatically perceive itself as responsible.”

  Mia nodded. “That makes sense.”

  “Practical engineering,” Derec said. “But then we run into a functional brick wall when it comes to certain tasks. Law enforcement, for one. Most positronic brains cannot cope with violent crime. A few have been programmed to dissociate under very specific circumstances so that, say, witnessing dead humans at a crime scene won’t create a Three Law crisis. But they still can’t make arrests because that offers the potential of harm to a human being.”

  “But a criminal–” Mia began.

  “Is still a human being,” Derec insisted. “What a human has done by violating a law does not mitigate the robot’s adherence to the Three Laws.”

  “Unless you redefine harm,” Ariel said. “Which is how you got around the Three Law imperative.”

  “That’s an oversimplification,” Derec replied. “What I redefined was the sphere of responsibility. Bogard is just as committed to the Three Laws as any other robot, but it has a broader definition of that commitment. It can make the determination that limiting a human’s freedom of action, even if it results in a degree of harm–bruises or strained muscles, for instance–may prevent harm from coming to other humans.”

  “You’re telling me you gave it a moral barometer?” Ariel demanded.

  “Sort of. It relies on the human to which it is assigned to make that determination. It also recognizes a human prerogative to go in harm’s way should circumstances require risk to prevent further harm.”

  “And when confronted with a clear case of Three Law violation?”

  “It has memory buffers and a failsafe that shunts the data out of the primary positronic matrix. It prevents positronic collapse and allows for the opportunity for further evaluation. In a proper lab debriefing, the cause-and-effect of a situation can be properly explained and set in context. The robot can be reset and returned to duty.”

  “You gave it selective amnesia,” Ariel said. “It can allow a human to come to harm and still function because after the fact it doesn’t remember doing it.”

  “That’s why Bogard left data out of its report,” Mia said.

  “That’s why I have to take Bogard back to Phylaxis to debrief it.”

  Mia nodded thoughtfully. “So why don’t you approve, Ariel?”

  “A robot is a machine,” she said. “A very powerful machine. It is intelligent, it can make decisions. I want them inextricably joined to the Three Laws so that they can never–never–circumvent their concern for my safety. If they fail to protect me, I want them shut down. I don’t want them thinking it over. I don’t want to ever be considered a secondary or tertiary concern by a robot who may decide that I ought to be sacrificed for the good of the many. Or of a specific individual. I think loosening the bonds like this can only lead to operational conflicts that will result in unnecessary harm.”

  “That’s the only way to construct a robot bodyguard, though,” Derec said.

  “There should be no such thing, then!” Ariel shouted. “It didn’t work! Somewhere in its sloppy brain it made a decision and sacrificed Senator Eliton! Explain to me how that was for anyone’s greater good!”

  Derec stared at her, ashamed. He could think of no answer to give her. In fact, he had no answer for himself.

  Twenty-One

  MIA WATCHED THE argument escalate, amazed at Ariel. She had always seen her friend as impatient but controlled, usually even-tempered, never enraged and irrational. But this was a side of Ariel with which Mia had no experience. The unreasoned hatred she directed at Bogard reminded Mia more of an anti-robot fanatic than of a Spacer who ought to be at ease with robots.

  “Ariel–” Derec said tightly, obviously reining in his own anger.

  Ariel left the room.

  Derec closed his eyes, leaning back in his chair.

  “You two have known each other a long time?” Mia asked.

  Derec gave a wan smile. “Too long, I sometimes think. In a way, I’ve known her all my life.”

  “You’re not–”

  “Related? No. It’s just I–we–both had amnemonic plague. Burundi’s Fever. We’ve been complete amnesiacs. When I had my bout, Ariel was the first human I came into contact with.”

  “And you were with her when she had hers?”

  Derec nodded.

  “So... why don’t you explain this dispute to me. I didn’t understand half of what you were talking about.”

  Derec drew a deep breath, clearly uncomfortable. “Well. I started investigating the way positronic memory works, especially in the aftermath of collapse. Sometimes you can recover a collapsed positronic brain–not often, but it can happen. There’s something... unpredictable... in the way they collapse. I was curious about that.”

  “Having been an amnesiac have anything to do with this?”

  “More than a little. What differs between human and robot is in the way we’re locked into our perceptual realities. The way we interface with the world. Humans have a plasticity robots lack–we can indulge fiction, for instance, and know the difference, even when it’s a full-sensory entertainment that is designed to mimic reality in the finest detail. A robot can’t do that. Tell its senses that what it is perceiving is ‘real,’ and it acts upon that stimulus. It can’t make the intuitive distinction. If what it perceives causes a conflict with its Three Law imperatives, collapse is likely unless quickly resolved.”

  “Even a fictional crisis?” Mia asked.

  Derec nodded. “Exactly. Convince a robot a lie is real, and it has no way to treat the lie as a conditional reality pending further data, like a human does. Now in either case, unacceptable realities can cause breakdowns. Humans still suffer nervous collapses, psychotic amnesia, reactive psychoses–a variety of disorders in which the brain tries to deal with an emotional or physical shock that the mind cannot accept. It happens faster and under more concrete conditions to a robot. But in the case of humans, the attempted resolution is also an attempt to circumvent the trauma to allow the organism to continue functioning.”

  “Amnesia victims can still carry on living even if they can’t remember who they are or where they came from.”

  “Simply put, yes. I wanted to see if the same sort of mechanism could be duplicated in a positronic brain.”

  Mia looked over at Bogard. “It seems you succeeded.”

  “Not completely. What I established was a bypass, true. The memory is still there, but inaccessible to the primary matrix. I shunted it over to a buffer. Eventually, it has to be dealt with or Bogard will start suffering from diagnostic neurosis.”

  “What’s that?”

  “It’s what I tried to explain to you before. A positronic brain runs a self-diagnostic every few hours. At some point, Bogard’s diagnostic will register the absence of a specific memory node as a chronic problem. It won’t be able to fix it, so Bogard will start feeling the need to be serviced. It can impair function.”

  Mia felt a ripple of anxiety. She still did not want to release Bogard. “So tell me why Ariel doesn’t like this.”

  “Anything that tampers with the full function of the Three Laws she sees as a step away from heresy,” Derec said. “In her view, by giving Bogard the ability to continue functioning in the wake of a Three Law conflict that should shut it down, I’ve created a monster.” He grunted. “It was her work that gave me the direction to go, though.”

  “How’s that?”

  “Her doctoral thesis from Calvin. ‘Three Law Conflict Under Alternative Concretizations.’ Basically, she proposed the possibility of an informational loop that is created when a robot has incomplete data which strongly suggests the necessity of action.” Derec frowned. “Unfortunately, it can’t make the determination of which kind of action because the information is incomplete. It starts running probability scenarios, to fill in–basically by Occam’s Razor–the blanks in its information so it can make a decision. But it still can’t. It can theoretically create its own delusional scenario wherein collapse is imminent based on an unreal situation. One of Ariel’s inferences was that a positronic brain could be lied to on a fundamental level and thus create a false standard of reality for it. The hierarchical response to perception would be distorted. And it would be stuck in it, the loop causing a cascade of alternative perceptions.”

  “How would you do that? Just walk up to it and say ‘By the way, black is white, and people can fly’?”

  “No, the hardwiring prevents the brain from accepting that kind of input. It would have to be a more direct interference, like a virus that could change pathways in the brain structure. Something that would directly affect the positronic pathways themselves.”

  “Doesn’t that describe what happened to the RI at Union Station?”

  Derec looked worriedly at her. “Yes. That’s what I wanted to see by doing the physical inspection. There’s evidence of a direct intervention at certain sensory nodes, but we can’t tell which ones they were.”

  “Could anyone have accessed your research?”

  Derec shook his head, but Mia saw uncertainty in his face. The notion had occurred to him, but he did not want to give it too much consideration.

  “If I understand everything you’ve told me so far,” Mia continued, “that means that Bogard could only have malfunctioned if it had been ordered to do so. If it had been given a set of operational parameters that allowed it to perceive reality a little bit differently.”

  “I suppose... yes, that’s logical. But–”

  “So someone told it to fail. That’s the only way it could have. Correct? Because it’s been performing flawlessly for me.”

  Derec shifted uncomfortably. “I’m not sure I completely follow.”

  “Somehow, someone convinced Bogard that Eliton was in no danger. Someone programmed it to ignore the real situation.”

  “But it went into full protect mode. It enshielded Eliton.”

  “Bogard was linked to the RI’s sensory net at the time.”

  Derec stared at her for a long time, then nodded. “Bogard would have been affected by the same thing that altered the RI’s realtime perception. But it still enshielded Eliton...”

  “It looked to me like Eliton had ordered Bogard to go protect Humadros.”

  “Eliton would have needed the override codes. He didn’t have them, did he?”

  “No... unless someone gave them to him.” Mia rapped her fist on the arm of her chair. “I have to get access to the Service datum.”

  “There may be a way to do that. But I have to debrief Bogard before we go much further. You have to release it to me.”

  Mia knew he was right. Bogard carried information they needed and she did not have the skills to get to it. But she still walked painfully and badly, and she felt more vulnerable than she ever had in her life.

  Bogard could not protect her forever, though. The only way to feel safe again, on her own, would be to solve this. For that, Derec needed Bogard. She had to hand it over to him.

  Do Spacers feel this way all the time? she wondered. Vulnerable without their robots? Lost, insecure, incapable? Maybe it wasn’t such a bad thing to get rid of them...

  The thought shocked her. Robots were tools. If people had allowed themselves to become so overdependent on them that they could no longer function without them, was that the fault of the tool? Hardly. But it was always easier to change an environment than change the people who lived in it. Getting rid of robots was far easier than making the necessary–and probably beneficial–changes in people.

  But what do I know? I’m just a cop, not a philosopher.

  “Very well, Mr. Avery,” she said. She turned to the robot. “Bogard?”

  Mia knocked on Ariel’s bedroom door. She heard nothing and nearly returned to the living room when it opened. Ariel had changed her clothes. She had an appointment with the representative from the Settler’s Coalition this afternoon, Mia remembered.

  “Sorry,” Ariel said.

  “For what?”

  Ariel shrugged, smiled, scowled, and turned away with a look of disgust, all within the space of a second. “I get a little irrational on certain subjects. I hate losing control in front of people.”

  “You enjoy it in private?”

  Ariel looked startled, then laughed. She came out and went into the living room. She stopped and looked around. “Where are they?”

  “I, uh, turned Bogard over to Derec. It was time.”

  Ariel nodded thoughtfully. “You know, this is the only thing Derec and I have ever disagreed on.”

  “I’m sure.”

  Ariel smiled wryly. “You’re very politic. But I’m talking about serious issues, not the annoying debris that clutters up anyone’s life.”

  “Besides bodyguard work, what else would such a robot be good for?”

  “Good for? Nothing.” Ariel paused. “No, that’s not fair. Quite a few situations would suit a robot that was able to more loosely interpret the Three Laws. Starships rarely have mixed bridge crews. Usually, the robotic contingents are on stand-by till needed. It’s still dangerous to travel space. Exploration, certain kinds of lab work, some heavy industries. It’s feasible that where now we have to either leave it all to robots or all to people, a robot like Bogard would make it possible for a mixed presence. Police work, certainly. Forensics robots are lab technicians that are specifically programmed to ignore the fact that a corpse was once a human being in the sense of a person in need of protection. But those are very limited refinements, nothing like what Derec has done.”

  “You really disapprove,” Mia said.

  “It really scares me.”

  Silence stretched between them. Mia could find no worthwhile response to her friend’s admission. She shrugged.

  “We have plenty to keep us busy. You got another interesting call while you and Derec were gone.”

  Mia ran a query through the public datum on the garage. The place was serviced under general contract by a maintenance firm called Cyvan. Cyvan Services did upkeep on a variety of storage facilities and a few government buildings. They were owned by a holding company, though: Glovax Diversified. It took time to find out who owned Glovax, but the answer, though not the one she would have guessed, did not surprise her.

  “Imbitek. Just like the ambulance.”

  “Imbitek bought the ambulance?” Ariel asked.

  “Not directly. It took me the whole time you were gone to track that one down. The bid was submitted by Holden Transport and Combined Services, who took possession. But the credits came from Lexington-Siever Financial. Holden Transport banks with them, but it’s not their primary account, just a short term cache for incidental expenditures. It was certainly large enough for this purchase, but the credits were replenished the next day from another in the same institution–a private account for one Nis Garvander, who sits on the board of directors of Glovax Diversified. Three days after the transfer of funds, Garvander closed his account down.”

  “Glovax...” Ariel mused. “So we’re right back at Imbitek.”

  “Looks that way.”

  Ariel tapped her lips. “But this is still circumstantial, isn’t it? After all, Imbitek suffered casualties. How come DyNan didn’t?”

  “Good question. Rega Looms certainly looks like the more obvious suspect.”

  “Which might be exactly why DyNan took no hits.”

  Mia looked at Ariel. “You’re thinking they were set up.”

  Ariel nodded. Mia respected Ariel’s perceptiveness. She would have made a good security specialist.

  “But,” Mia said, “we shouldn’t discount DyNan completely. I’m trying to find out if there have been any funds exchanged between the two companies. It’s still possible that DyNan engineered everything and left this trail just so we’d find it.”

 
