Derec felt inexplicably reluctant to open it. He looked at Ariel, his heart pounding. “Do you still have that key?”
Ariel handed him the seal key from the warehouse. Derec switched it on and ran it along the seam. The crate lid popped open.
Nestled within padding lay a plastic-wrapped mound of silver-and-gold webbing, wrapped tightly and mingled with darker nodes.
“Damn,” Ariel hissed.
“What is it?” Mia asked.
“A positronic brain,” Derec said. “Absolute contraband.”
“Close it up,” Ariel said.
Derec complied, then looked at Mia. “You called the police, didn’t you?”
Mia nodded. “Something came up. I had Bogard issue a dispatch through their channels. Not much, just a cruiser to go look-see.”
“Bogard is just full of tricks,” Ariel said icily. “Pity it couldn’t do its primary job as well.”
Derec looked at her. She was staring at Bogard, arms folded, an expression of unconcealed resentment on her face.
“You sent me a message,” Derec said, “after the Incident. You said ‘I see you got your wish.’ What did you mean by that?”
“You knew it was from me. Can’t you figure out what it means?”
“Eliton’s death was--”
“Beside the point. You got what you wanted by being able to create a dangerous robot. I don’t think you wanted to kill Eliton. I think you wanted to build robots, any way you could, any way you wanted.”
“How does that follow?” Derec asked. “With Eliton’s death, no one will be able to build robots on Earth.”
“I don’t think it matters. Someone will hire you to build bodyguards now, no matter what.”
“Excuse me, but I failed to do that.”
Ariel waved her hand dismissively. “Glitches. No one on Earth would buy it, but Spacers understand that prototypes always have bugs to be worked out. It brought back three of the assassins. That’s the point that won’t be missed. You’ve got your opportunity to build your special positronics now. More leeway, more freedom of action, more humanlike. Hell, they’ll even make mistakes.”
“Ariel--”
“Pardon me,” Mia said. “I feel like I’ve come in at the tail end of a very complicated argument.”
“Derec and I disagree fundamentally over Bogard. What it represents.”
“I gathered that much. Why?”
“Derec’s robot here is the product of an attempt to circumvent Three Law programming--”
“That’s a complete mischaracterization!” Derec shouted. “You never understood what I was after!”
“Really? Tell me something--why is that robot still functioning?”
“What? I don’t--”
“It failed,” Ariel snapped. “It let a human in its care die. It stood witness to dozens of fatalities and injuries. It should be a mass of collapsed positronic gelatin. Instead, it is fully functional.”
“What good would it be if it had collapsed?”
“It would be inert. It would pose no further threat.”
“What threat?”
“The threat of negligence!”
“It wasn’t negligent! Look at how it performed for Mia.”
“Then why is Eliton dead?”
“We don’t know he’s dead!”
“As far as Bogard is concerned, he is! Why?”
Derec did not know. Of everything he had intended in designing and building Bogard, that was precisely the thing which ought never to have happened. He looked at the unmoving, unmoved machine and wondered what had gone so profoundly wrong that it had allowed the human it was programmed expressly to protect to die.
“Would someone please explain this to me?” Mia asked.
“Derec built it, let him try,” Ariel said in disgust.
“Under normal circumstances...” Derec started. His throat caught, and he coughed. “Normal circumstances... whatever that means... the Three Laws are built into every positronic brain, part of the core template. They represent the First Principles for a robot, the foundation on which all its subsequent learning and experience rests. The initial designers set it up so that almost all the secondary programming requires the presence of those laws in order to function. It would require a complete redesign of all the manufacturing methods as well as the basic pathways themselves to build a brain without the Three Laws. They aren’t just hardwired into the brain, they are basic to the processes of constructing one.”
“Is that what you did? Redesign everything?”
“No. I’m not that good. Nor am I that irresponsible. All I did was set in place a new set of parameters for the application of the Laws.”
“You circumvented them,” Ariel snapped.
“I did not. I took an accepted standard practice and stretched it.”
“What practice?” Mia asked.
“Setting conditions of when the robot perceives that it is responsible for a violation,” Derec explained. “Think about it. According to the First Law, if a robot followed it absolutely, all robots would collapse. ‘A robot may not injure a human being, or, through inaction, allow a human being to come to harm.’ Consider that as an absolute. Human beings are always coming to harm. Time and distance alone guarantee that a robot can’t act to prevent that harm in all cases. If some kind of buffer zone, a hierarchical response, weren’t in place, the instant robots realized how many humans came to harm because they weren’t doing something to prevent it, we would have no functional robot population. So we establish a reasonable limitation for its application. If a robot does nothing to stop the human it is standing right next to from being killed--say, by a speeding vehicle, out of control--then it fails and collapse follows. If that same robot can do nothing to prevent the same fate happening to a human a kilometer away from it, the only way it fails is if another human forcefully asserts that it was at fault. It does not automatically perceive itself as responsible.”
Mia nodded. “That makes sense.”
“Practical engineering,” Derec said. “But then we run into a functional brick wall when it comes to certain tasks. Law enforcement, for one. Most positronic brains cannot cope with violent crime. A few have been programmed to dissociate under very specific circumstances so that, say, witnessing dead humans at a crime scene won’t create a Three Law crisis. But they still can’t make arrests because that offers the potential of harm to a human being.”
“But a criminal--” Mia began.
“Is still a human being,” Derec insisted. “What a human has done by violating a law does not mitigate the robot’s adherence to the Three Laws.”
“Unless you redefine harm,” Ariel said. “Which is how you got around the Three Law imperative.”
“That’s an oversimplification,” Derec replied. “What I redefined was the sphere of responsibility. Bogard is just as committed to the Three Laws as any other robot, but it has a broader definition of that commitment. It can make the determination that limiting a human’s freedom of action, even if it results in a degree of harm--bruises or strained muscles, for instance--may prevent harm from coming to other humans.”
“You’re telling me you gave it a moral barometer?” Ariel demanded.
“Sort of. It relies on the human to which it is assigned to make that determination. It also recognizes a human prerogative to go in harm’s way should circumstances require risk to prevent further harm.”
“And when confronted with a clear case of Three Law violation?”
“It has memory buffers and a failsafe that shunts the data out of the primary positronic matrix. It prevents positronic collapse and allows for the opportunity for further evaluation. In a proper lab debriefing, the cause-and-effect of a situation can be properly explained and set in context. The robot can be reset and returned to duty.”
“You gave it selective amnesia,” Ariel said. “It can allow a human to come to harm and still function because after the fact it doesn’t remember doing it.”
“That’s why Bogard left data out of its report,” Mia said.
“That’s why I have to take Bogard back to Phylaxis to debrief it.”
Mia nodded thoughtfully. “So why don’t you approve, Ariel?”
“A robot is a machine,” she said. “A very powerful machine. It is intelligent, it can make decisions. I want them inextricably joined to the Three Laws so that they can never--never--circumvent their concern for my safety. If they fail to protect me, I want them shut down. I don’t want them thinking it over. I don’t want to ever be considered a secondary or tertiary concern by a robot who may decide that I ought to be sacrificed for the good of the many. Or of a specific individual. I think loosening the bonds like this can only lead to operational conflicts that will result in unnecessary harm.”
“That’s the only way to construct a robot bodyguard, though,” Derec said.
“There should be no such thing, then!” Ariel shouted. “It didn’t work! Somewhere in its sloppy brain it made a decision and sacrificed Senator Eliton! Explain to me how that was for anyone’s greater good!”
Derec stared at her, ashamed. He could think of no answer to give her. In fact, he had no answer for himself.
TWENTY-ONE
Mia watched the argument escalate, amazed at Ariel. She had always seen her friend as impatient but controlled, usually even-tempered, never enraged and irrational. But this was a side of Ariel with which Mia had no experience. The unreasoned hatred she directed at Bogard reminded Mia more of an anti-robot fanatic than of a Spacer who ought to be at ease with robots.
“Ariel--” Derec said tightly, obviously reining in his own anger.
Ariel left the room.
Derec closed his eyes, leaning back in his chair.
“You two have known each other a long time?” Mia asked.
Derec gave a wan smile. “Too long, I sometimes think. In a way, I’ve known her all my life.”
“You’re not--”
“Related? No. It’s just I--we--both had amnemonic plague. Burundi’s Fever. We’ve been complete amnesiacs. When I had my bout, Ariel was the first human I came into contact with.”
“And you were with her when she had hers?”
Derec nodded.
“So... why don’t you explain this dispute to me. I didn’t understand half of what you were talking about.”
Derec drew a deep breath, clearly uncomfortable. “Well. I started investigating the way positronic memory works, especially in the aftermath of collapse. Sometimes you can recover a collapsed positronic brain--not often, but it can happen. There’s something... unpredictable... in the way they collapse. I was curious about that.”
“Did having been an amnesiac have anything to do with this?”
“More than a little. What differs between human and robot is in the way we’re locked into our perceptual realities. The way we interface with the world. Humans have a plasticity robots lack--we can indulge fiction, for instance, and know the difference, even when it’s a full-sensory entertainment that is designed to mimic reality in the finest detail. A robot can’t do that. Tell its senses that what it is perceiving is ‘real,’ and it acts upon that stimulus. It can’t make the intuitive distinction. If what it perceives causes a conflict with its Three Law imperatives, collapse is likely unless quickly resolved.”
“Even a fictional crisis?” Mia asked.
Derec nodded. “Exactly. Convince a robot a lie is real, and it has no way to treat the lie as a conditional reality pending further data, like a human does. Now in either case, unacceptable realities can cause breakdowns. Humans still suffer nervous collapses, psychotic amnesia, reactive psychoses--a variety of disorders in which the brain tries to deal with an emotional or physical shock that the mind cannot accept. It happens faster and under more concrete conditions to a robot. But in the case of humans, the attempted resolution is also an attempt to circumvent the trauma to allow the organism to continue functioning.”
“Amnesia victims can still carry on living even if they can’t remember who they are or where they came from.”
“Simply put, yes. I wanted to see if some sort of the same mechanism could be duplicated in a positronic brain.”
Mia looked over at Bogard. “It seems you succeeded.”
“Not completely. What I established was a bypass, true. The memory is still there, but inaccessible to the primary matrix. I shunted it over to a buffer. Eventually, it has to be dealt with or Bogard will start suffering from diagnostic neurosis.”
“What’s that?”
“It’s what I tried to explain to you before. A positronic brain runs a self-diagnostic every few hours. At some point, Bogard’s diagnostic will register the absence of a specific memory node as a chronic problem. It won’t be able to fix it, so Bogard will start feeling the need to be serviced. It can impair function.”
Mia felt a ripple of anxiety. She still did not want to release Bogard. “So tell me why Ariel doesn’t like this.”
“Anything that tampers with the full function of the Three Laws she sees as a step away from heresy,” Derec said. “In her view, by giving Bogard the ability to continue functioning in the wake of a Three Law conflict that should shut it down, I’ve created a monster.” He grunted. “It was her work that gave me the direction to go, though.”
“How’s that?”
“Her doctoral thesis from Calvin. ‘Three Law Conflict Under Alternative Concretizations.’ Basically, she proposed the possibility of an informational loop that is created when a robot has incomplete data which strongly suggests the necessity of action.” Derec frowned. “Unfortunately, it can’t make the determination of which kind of action because the information is incomplete. It starts running probability scenarios, to fill in--basically by Occam’s Razor--the blanks in its information so it can make a decision. But it still can’t. It can theoretically create its own delusional scenario wherein collapse is imminent based on an unreal situation. One of Ariel’s inferences was that a positronic brain could be lied to on a fundamental level and thus create a false standard of reality for it. The hierarchical response to perception would be distorted. And it would be stuck in it, the loop causing a cascade of alternative perceptions.”
“How would you do that? Just walk up to it and say ‘By the way, black is white, and people can fly’?”
“No, the hardwiring prevents the brain from accepting that kind of input. It would have to be a more direct interference, like a virus that could change pathways in the brain structure. Something that would directly affect the positronic pathways themselves.”
“Doesn’t that describe what happened to the RI at Union Station?”
Derec looked worriedly at her. “Yes. That’s what I wanted to see by doing the physical inspection. There’s evidence of a direct intervention at certain sensory nodes, but we can’t tell which ones they were.”
“Could anyone have accessed your research?”
Derec shook his head, but Mia saw uncertainty in his face. The notion had occurred to him, but he did not want to give it too much consideration.
“If I understand everything you’ve told me so far,” Mia continued, “that means that Bogard could only have malfunctioned if it had been ordered to do so. If it had been given a set of operational parameters that allowed it to perceive reality a little bit differently.”
“I suppose... yes, that’s logical. But--”
“So someone told it to fail. That’s the only way it could have. Correct? Because it’s been performing flawlessly for me.”
Derec shifted uncomfortably. “I’m not sure I completely follow.”
“Somehow, someone convinced Bogard that Eliton was in no danger. Someone programmed it to ignore the real situation.”
“But it went into full protect mode. It enshielded Eliton.”
“Bogard was linked to the RI’s sensory net at the time.”
Derec stared at her for a long time, then nodded. “Bogard would have been affected by the same thing that altered the RI’s realtime perception. But it still enshielded Eliton...”
“It looked to me like Eliton had ordered Bogard to go protect Humadros.”
“Eliton would have needed the override codes. He didn’t have them, did he?”
“No... unless someone gave them to him.” Mia rapped her fist on the arm of her chair. “I have to get access to the Service datum.”
“There may be a way to do that. But I have to debrief Bogard before we go much further. You have to release it to me.”
Mia knew he was right. Bogard carried information they needed and she did not have the skills to get to it. But she still walked painfully and badly, and she felt more vulnerable than she ever had in her life.
Bogard could not protect her forever, though. The only way to feel safe again, on her own, would be to solve this. For that, Derec needed Bogard. She had to hand it over to him.
Do Spacers feel this way all the time? she wondered. Vulnerable without their robots? Lost, insecure, incapable? Maybe it wasn’t such a bad thing to get rid of them...
The thought shocked her. Robots were tools. If people had allowed themselves to become so overdependent on them that they could no longer function without them, was that the fault of the tool? Hardly. But it was always easier to change an environment than change the people who lived in it. Getting rid of robots was far easier than making the necessary--and probably beneficial--changes in people.
But what do I know? I’m just a cop, not a philosopher.
“Very well, Mr. Avery,” she said. She turned to the robot. “Bogard?”
Mia knocked on Ariel’s bedroom door. She heard nothing and nearly returned to the living room when it opened. Ariel had changed her clothes. She had an appointment with the representative from the Settler’s Coalition this afternoon, Mia remembered.
“Sorry,” Ariel said.
“For what?”
Ariel shrugged, smiled, scowled, and turned away with a look of disgust, all within the space of a second. “I get a little irrational on certain subjects. I hate losing control in front of people.”