Unwise Child


by Randall Garrett


  9

  Captain Sir Henry (Black Bart) Quill was seated in an old-fashioned, formyl-covered, overstuffed chair, chewing angrily at the end of an unlighted cigar. His bald head gleamed like a pink billiard ball, almost matching the shining glory of his golden insignia against his scarlet tunic.

  Mike the Angel had finally found his way through the maze of underground passageways to the door marked _wardroom 9_ and had pushed it open gingerly, halfway hoping that he wouldn't be seen coming in late but not really believing it would happen.

  He was right. Black Bart was staring directly at the door when it slid open. Mike shrugged inwardly and stepped boldly into the room, flicking a glance over the faces of the other officers present.

  "Well, well, well, Mister Gabriel," said Black Bart. The voice was oily,but the oil was oil of vitriol. "You not only come late, but you comeincognito. Where is your uniform?"

  There was a muffled snicker from one of the junior officers, but it wasn't muffled enough. Before Mike the Angel could answer, Captain Quill's head jerked around.

  "That will do, Mister Vaneski!" he barked. "Boot ensigns don't snickerwhen their superiors--_and_ their betters--are being reprimanded! I onlyuse sarcasm on officers I respect. Until an officer earns my sarcasm, hegets nothing but blasting when he goofs off. Understand?"

  The last word was addressed to the whole group.

  Ensign Vaneski colored, and his youthful face became masklike. "Yes, sir. Sorry, sir."

  Quill didn't even bother to answer; he looked back at Mike the Angel, who was still standing at attention. Quill's voice resumed its caustic saccharinity. "But don't let that go to your head, Mister Gabriel. I repeat: Where is your pretty red spaceman's suit?"

  "If the Captain will recall," said Mike, "I had only twenty-four hours'notice. I couldn't get a new wardrobe in that time. It'll be in on thenext rocket."

  Captain Quill was silent for a moment, then he simply said, "Very well," thus dismissing the whole subject. He waved Mike the Angel to a seat. Mike sat.

  "We'll dispense with the formal introductions," said Quill. "CommanderGabriel is our Engineering Officer. The rest of these boys all know eachother, Commander; you and I are the only ones who don't come fromChilblains Base. You know Commander Jeffers, of course."

  Mike nodded and grinned at Peter Jeffers, a lean, bony character who had a tendency to collapse into chairs as though he had come unhinged. Jeffers grinned and winked back.

  "This is Lieutenant Commander von Liegnitz, Navigation Officer;Lieutenant Keku, Supply; Lieutenant Mellon, Medical Officer; and EnsignVaneski, Maintenance. You can all shake hands with each other later;right now, let's get on with business." He frowned, overshadowing hiseyes with those great, bushy brows. "What was I saying just beforeCommander Gabriel came in?"

  Pete Jeffers shifted slightly in his seat. "You were sayin', suh, that this's the stupidest dam' assignment anybody evah got. Or words to that effect." Jeffers had been born in Georgia and had moved to the south of England at the age of ten. Consequently, his accent was far from standard.

  "I think, Mister Jeffers," said Quill, "that I phrased it a bit moredelicately, but that was the essence of it.

  "The _Brainchild_, as she has been nicknamed, has been built at greatexpense for the purpose of making a single trip. We are to take her, andher cargo, to a destination known only to myself and von Liegnitz. Wewill be followed there by another Service ship, which will bring us backas passengers." He allowed himself a half-smile. "At least we'll get toloaf around on the way back."

  The others grinned.

  "The _Brainchild_ will be left there and, presumably, dismantled."

  He took the unlighted cigar out of his mouth, looked at it, and absently reached in his pocket for a lighter. The deeply tanned young man who had been introduced as Lieutenant Keku had just lighted a cigarette, so he proffered his own flame to the captain. Quill puffed his cigar alight absently and went on.

  "It isn't going to be easy. We won't have a chance to give the ship ashakedown cruise because once we take off we might as well keepgoing--which we will.

  "You all know what the cargo is--Cargo Hold One contains the greatestsingle robotic brain ever built. Our job is to make sure it gets to ourdestination in perfect condition."

  "Question, sir," said Mike the Angel.

  Without moving his head, Captain Quill lifted one huge eyebrow and glanced in Mike's direction. "Yes?"

  "Why didn't C.C. of E. build the brain on whatever planet we're going toin the first place?"

  "We're supposed to be told that in the briefing over at the C.C. of E.labs in"--he glanced at his watch--"half an hour. But I think we can allget a little advance information. Most of you men have been around herelong enough to have some idea of what's going on, but I understand thatMister Vaneski knows somewhat more about robotics than most of us. Doyou have any light to shed on this, Mister Vaneski?"

  Mike grinned to himself without letting it show on his face. The skipper was letting the boot ensign redeem himself after the _faux pas_ he'd made.

  Vaneski started to stand up, but Quill made a slight motion with his hand and the boy relaxed.

  "It's only a guess, sir," he said, "but I think it's because the robotknows too much."

  Quill and the others looked blank, but Mike narrowed his eyes imperceptibly. Vaneski was practically echoing Mike's own deductions.

  "I mean--well, look, sir," Vaneski went on, a little flustered, "theystarted to build that thing ten years ago. Eight years ago they startedteaching it. Evidently they didn't see any reason for building it offEarth then. What I mean is, something must've happened since then tomake them decide to take it off Earth. If they've spent all this muchmoney to get it away, that must mean that it's dangerous somehow."

  "If that's the case," said Captain Quill, "why don't they just shut thething off?"

  "Well--" Vaneski spread his hands. "I think it's for the same reason. Itknows too much, and they don't want to destroy that knowledge."

  "Do you have any idea what that knowledge might be?" Mike the Angelasked.

  "No, sir, I don't. But whatever it is, it's dangerous as hell."

  * * * * *

  The briefing for the officers and men of the _William Branchell_--the _Brainchild_--was held in a lecture room at the laboratories of the Computer Corporation of Earth's big Antarctic base.

  Captain Quill spoke first, warning everyone that the project was secret and asking them to pay the strictest attention to what Dr. Morris Fitzhugh had to say.

  Then Fitzhugh got up, his face ridged with nervousness. He assumed the air of a university professor, launching himself into his speech as though he were anxious to get through it in a given time without finishing too early.

  "I'm sure you're all familiar with the situation," he said, as thoughapologizing to everyone for telling them something they alreadyknew--the apology of the learned man who doesn't want anyone to thinkhe's being overly proud of his learning.

  "I think, however, we can all get a better picture if we begin at thebeginning and work our way up to the present time.

  "The original problem was to build a computer that could learn byitself. An ordinary computer can be forcibly taught--that is, atechnician can make changes in the circuits which will make the robot dosomething differently from the way it was done before, or even make itdo something new.

  "But what we wanted was a computer that could learn by itself, acomputer that could make the appropriate changes in its own circuitswithout outside physical manipulation.

  "It's really not as difficult as it sounds. You've all seenautoscribers, which can translate spoken words into printed symbols. Anautoscriber is simply a machine which does what you tell itto--literally. Now, suppose a second computer is connected intimatelywith the first in such a manner that the second can, on order, changethe circuits of the first. Then, all that is needed is...."

  Mike looked around him while the roboticist went on. The men were looking pretty bored. They'd come to get a briefing on the reason for the trip, and all they were getting was a lecture on robotics.

  Mike himself wasn't so much interested in the whys and wherefores of the trip; he was wondering why it was necessary to tell anyone--even the crew. Why not just pack Snookums up, take him to wherever he was going, and say nothing about it?

  Why explain it to the crew?

  "Thus," continued Fitzhugh, "it became necessary to incorporate into thebrain a physical analogue of Lagerglocke's Principle: 'Learning is aresult of an inelastic collision.'

  "I won't give it to you symbolically, but the idea is simply that anorganism learns _only_ if it does _not_ completely recover from theeffects of an outside force imposed upon it. If it recovers completely,it's just as it was before. Consequently, it hasn't learned anything.The organism _must change_."

  He rubbed the bridge of his nose and looked out over the faces of the men before him. A faint smile came over his wrinkled features.

  "Some of you, I know, are wondering why I am boring you with this longrecital. Believe me, it's necessary. I want all of you to understandthat the machine you will have to take care of is not just an ordinarycomputer. Every man here has had experience with machinery, from thevery simplest to the relatively complex. You know that you have to becareful of the kind of information--the kind of external force--you givea machine.

  "If you aim a spaceship at Mars, for instance, and tell it to go_through_ the planet, it might try to obey, but you'd lose the machinein the process."

  A ripple of laughter went through the men. They were a little more relaxed now, and Fitzhugh had regained their attention.

  "And you must admit," Fitzhugh added, "a spaceship which was given thatsort of information might be dangerous."

  This time the laughter was even louder.

  "Well, then," the roboticist continued, "if a mechanism is capable oflearning, how do you keep it from becoming dangerous or destroyingitself?

  "That was the problem that faced us when we built Snookums.

  "So we decided to apply the famous Three Laws of Robotics propoundedover a century ago by a brilliant American biochemist and philosopher.

  "Here they are:

  "'_One: A robot may not injure a human being, nor, through inaction,allow a human being to come to harm._'

  "'_Two: A robot must obey the orders given it by human beings exceptwhere such orders would conflict with the First Law._'

  "'_Three: A robot must protect its own existence as long as suchprotection does not conflict with the First or Second Law._'"

  Fitzhugh paused to let his words sink in, then: "Those are the ideal laws, of course. Even their propounder pointed out that they would be extremely difficult to put into practice. A robot is a logical machine, but it becomes somewhat of a problem even to define a human being. Is a five-year-old competent to give orders to a robot?

  "If you define him as a human being, then he can give orders that mightwreck an expensive machine. On the other hand, if you don't define thefive-year-old as human, then the robot is under no compulsion to refrainfrom harming the child."

  He began delving into his pockets for smoking materials as he went on.

  "We took the easy way out. We solved that problem by keeping Snookumsisolated. He has never met any animal except adult human beings. Itwould take an awful lot of explaining to make him understand thedifference between, say, a chimpanzee and a man. Why should a hairy peltand a relatively low intelligence make a chimp non-human? After all,some men are pretty hairy, and some are moronic.

  "Present company excepted."

  More laughter. Mike's opinion of Fitzhugh was beginning to go up. The man knew when to break pedantry with humor.

  "Finally," Fitzhugh said, when the laughter had subsided, "we must askwhat is meant by 'protecting his own existence.' Frankly, we've beendriven frantic by that one. The little humanoid, caterpillar-trackmechanism that we all tend to think of as Snookums isn't reallySnookums, any more than a human being is a hand or an eye. Snookumswouldn't actually be threatening his own existence unless his brain--nowin the hold of the _William Branchell_--is destroyed."

  As Dr. Fitzhugh continued, Mike the Angel listened with about half an ear. His attention--and the attention of every man in the place--had been distracted by the entrance of Leda Crannon. She stepped in through a side door, walked over to Dr. Fitzhugh, and whispered something in his ear. He nodded, and she left again.

  Fitzhugh, when he resumed his speech, was rather more hurried in his delivery.

  "The whole thing can be summed up rather quickly.

  "Point One: Snookums' brain contains the information that eight years ofhard work have laboriously put into it. That information is morevaluable than the whole cost of the _William Branchell_; it's worthbillions. So the robot can't be disassembled, or the information wouldbe lost.

  "Point Two: Snookums' mind is a strictly logical one, but it isoperating in a more than logical universe. Consequently, it is unstable.

  "Point Three: Snookums was built to conduct his own experiments. Toforbid him to do that would be similar to beating a child for actinglike a child; it would do serious harm to the mind. In Snookums' case,the randomity of the brain would exceed optimum, and the robot wouldbecome insane.

  "Point Four: Emotion is not logical. Snookums can't handle it, except ina very limited way."

  Fitzhugh had been making his points by tapping them off on his fingers with the stem of his unlighted pipe. Now he shoved the pipe back in his pocket and clasped his hands behind his back.

  "It all adds up to this: Snookums _must_ be allowed the freedom of theship. At the same time, every one of us must be careful not to ... topush the wrong buttons, as it were.

  "So here are a few _don'ts_. Don't get angry with Snookums. That wouldbe as silly as getting sore at a phonograph because it was playing musicyou didn't happen to like.

  "Don't lie to Snookums. If your lies don't fit in with what he knows tobe true--and they won't, believe me--he will reject the data. But itwould confuse him, because he knows that humans don't lie.

  "If Snookums asks you for data, qualify it--even if you know it to betrue. Say: 'There may be an error in my knowledge of this data, but tothe best of my knowledge....'

  "Then go ahead and tell him.

  "But if you absolutely don't know the answer, tell him so. Say: 'I don'thave that data, Snookums.'

  "Don't, unless you are...."

  He went on, but it was obvious that the officers and crew of the _William Branchell_ weren't paying the attention they should. Every one of them was thinking dark gray thoughts. It was bad enough that they had to take out a ship like the _Brainchild_, untested and jerry-built as she was. Was it necessary to have an eight-hundred-pound, moron-genius child-machine running loose, too?

  Evidently, it was.

  "To wind it up," Fitzhugh said, "I imagine you are wondering why it'snecessary to take Snookums off Earth. I can only tell you this: Snookumsknows too much about nuclear energy."

  Mike the Angel smiled grimly to himself. Ensign Vaneski had been right; Snookums was dangerous--not only to individuals, but to the whole planet.

  Snookums, too, was a juvenile delinquent.

 
