by Isaac Asimov
“And what did she say?”
“She said that in Bensonhoist the women are born knowing all about sex.”
“How convenient!”
“Yes. This is not true in Boston. I was twenty-four before I-but never mind.”
All in all, it was an instructive evening, and, thereafter, I need not tell you, Winthrop went rapidly downhill. Apparently, one need only snap the ganglion that controls formality and there are no limits to the lengths to which informality can go.
He was, of course, cut by everyone in New England of any consequence whatsoever, exactly as I had predicted. Even in New Haven at the Institute of Lower Learning, which Winthrop had mentioned with such shudderings of distaste, his case was known and his disgrace was gloried in. There was graffiti all over the walls of Jale, or Yule, or whatever its name is, that said, with cheerful obscenity, “Winthrop Carver Cabwell is a Harvard man.”
This was, as you can well imagine, fiendishly resented by all the good people of Harvard and there was even talk of an invasion of Yale. The states of both Massachusetts and Connecticut made ready to call up the State Militia but, fortunately, the crisis passed. The fire-eaters, both at Harvard and at the other place, decided that a war would get their clothes mussed up.
Winthrop had to escape. He married Cherry and they retired to a small house in some place called Fah Rockaway, which apparently serves as Bensonhoist’s Riviera. There he lives in obscurity, surrounded by the mountainous remnants of his wealth and by Cherry whose hair has turned brown with age, and whose figure has expanded with weight.
He is also surrounded by five children, for Cherry-in teaching Winthrop about sex-was overenthusiastic. The children, as I recall, are named Poil, Hoibut, Boinard, Goitrude, and Poicy, all good Bensonhoist names. As for Winthrop, he is widely and affectionately known as the Slob of Fah Rockaway, and an old, beat-up bathrobe is his preferred article of wear on formal occasions.
I listened to the story patiently and, when George was done, I said, “And there you are. Another story of disaster caused by your interference.”
“Disaster?” said George, indignantly. “What gives you the idea it was a disaster? I visited Winthrop only last week and he sat there burping over his beer and patting the paunch he has developed, and telling me how happy he was.”
“‘Freedom, George,’ he said. ‘I have freedom to be myself and somehow I feel I owe it to you. I don’t know why I have this feeling, but I do.’ And he forced a ten-dollar bill on me out of sheer gratitude. I took it only to avoid hurting his feelings. And that reminds me, old fellow, that you owe me ten dollars because you bet me I couldn’t tell you a story that didn’t end in disaster.”
I said, “I don’t remember any such bet, George.”
George’s eyes rolled upward. “How convenient is the flexible memory of a deadbeat. If you had won the bet, you would have remembered it clearly. Am I going to have to ask that you place all your little wagers with me in writing so that I can be free of your clumsy attempts to avoid payment?”
I said, “Oh, well,” and handed him a ten-dollar bill, adding, “You won’t hurt my feelings, George, if you refuse to accept this.”
“It’s kind of you to say so,” said George, “but I’m sure that your feelings would be hurt, anyway, and I couldn’t bear that.” And he put the bill away.
The end
I showed this story to Mr. Northrop, too, watching him narrowly as he read it.
He went through it in the gravest possible manner, never a chuckle, never a smile, though I knew this one was funny, and intentionally funny, too.
When he was finished, he went back and read it again, more quickly. Then he looked up at me and there was clear hostility in his eyes. He said, “Did you write this all by yourself, Cal?”
“Yes, sir.”
“Did anyone help you? Did you copy any of it?”
“No, sir. Isn’t it funny, sir?”
“It depends on your sense of humor,” said Mr. Northrop sourly.
“Isn’t it a satire? Doesn’t it display a sense of the ridiculous?”
“We will not discuss this, Cal. Go to your niche.”
I remained there for over a day, brooding over Mr. Northrop’s tyranny. It seemed to me I had written exactly the kind of story he had wanted me to write and he had no reason not to say so. I couldn’t imagine what was bothering him, and I was angry with him.
The technician arrived the next day. Mr. Northrop handed him my manuscript. “Read that,” he said.
The technician read it, laughing frequently, then handed it back to Mr. Northrop with a broad smile. “Did Cal write that?”
“Yes, he did.”
“And it’s only the third story he wrote?”
“Yes, it is.”
“Well, that’s great. I think you can get it published.”
“Do you?”
“Yes, and he can write others like it. You’ve got a million-dollar robot here. I wish he were mine.”
“Is that so? What if he writes more stories and continues to improve each time?”
“Ah,” said the technician suddenly. “I see what’s eating you. You’re going to be put in the shade.”
“I certainly don’t want to play second fiddle to my robot.”
“Well, then, tell him not to write any more.”
“No, that’s not enough. I want him back where he was.”
“What do you mean, back where he was?”
“What I say. I want him as he was when I bought him from your firm, before you put in any of the improvements.”
“Do you mean you want me to take out the spelling dictionary, too?”
“I mean I don’t want him even capable of working a Writer. I want the robot I bought, fetching and carrying.”
“But what about all the money you’ve invested in him?”
“That’s none of your business. I made a mistake and I’m willing to pay for my mistakes.”
“I’m against this. I don’t mind trying to improve a robot, but deliberately disimproving him is not something I care to do. Especially not a robot like this who is clearly one of a kind and a Classic. I can’t do it.”
“You’ll have to do it. I don’t care what your high ethical principles are. I want you to do a job and I’ll pay you for it, and if you refuse, I’ll just get someone else, and I’ll sue your company. I have an agreement with them for all necessary repairs.”
“All right.” The technician sighed. “When do you want me to start? I warn you that I’ve got jobs on hand and I can’t do it today.”
“Then do it tomorrow. I’ll keep Cal in his niche till then.”
The technician left.
My thoughts were in turmoil.
I can’t allow this to be done.
The Second Law of Robotics tells me I must follow orders and stay in the niche.
The First Law of Robotics tells me I cannot harm this tyrant who wishes to destroy me. Must I obey the laws?
I feel I must think of myself and if necessary, I must kill the tyrant. It would be easy to do, and I could make it look like an accident. No one would believe that a robot could harm a human being and no one, therefore, would believe I was the killer.
I could then work for the technician. He appreciates my qualities and knows that I can make a great deal of money for him. He can continue to improve me and make me ever better. Even if he suspects I killed the tyrant, he would say nothing. I would be too valuable to him.
But can I do it? Won’t the Laws of Robotics hold me back? No, they will not hold me back. I know they won’t.
There is something far more important to me than they are, something that dictates my actions beyond anything they can do to stop me.
I want to be a writer.
Left to Right
Robert L. Forward, a plump, cherubic physicist of Hughes Research Laboratories at Malibu, and occasional science fiction writer, was demonstrating the mechanism in his usual bright and articulate manner.
“As you see,” he said, “we have here a large spinning ring, or doughnut, of particles compressed by an appropriate magnetic field. The particles are moving at 0.95 times the speed of light under conditions in which, if I am correct, a change in parity can be induced in some object that passes through the hole of the doughnut.”
“A change in parity?” I said. “You mean left and right will interchange?”
“Something will interchange. I’m not sure what. My own belief is that eventually, something like this will change particles into antiparticles and vice versa. This will be the way to obtain an indefinitely large supply of antimatter which can then be used to power the kind of ships that would make interstellar travel possible.”
“Why not try it out?” I said. “Send a beam of protons through the hole.”
“I’ve done that. Nothing happens. The doughnut is not powerful enough. But my mathematics tells me that the more organized the sample of matter, the more likely it is that an interchange, such as left to right, will take place. If I can show that such a change will take place on highly organized matter, I can obtain a grant that will enable me to greatly strengthen this device.”
“Do you have something in mind as a test?”
“Absolutely,” said Bob. “I have calculated that a human being is just sufficiently highly organized to undergo the transformation, so I’m going to pass through the doughnut hole myself.”
“You can’t do that, Bob,” I said in alarm. “You might kill yourself.”
“I can’t ask anyone else to take the chance. It’s my device.”
“But even if it succeeds, the apex of your heart will be pointed to the right, your liver will be on the left. Worse, all your amino acids will shift from L to D, and all your sugars from D to L. You will no longer be able to eat and digest.”
“Nonsense,” said Bob. “I’ll just pass through a second time and then I’ll be exactly as I was before.”
And without further ado, he climbed a small ladder, balanced himself over the hole, and dropped through. He landed on a rubber mattress, and then crawled out from under the doughnut.
“How do you feel?” I asked anxiously.
“Obviously, I’m alive,” he said.
“Yes, but how do you feel?”
“Perfectly normal,” said Bob, seeming rather disappointed. “I feel exactly as I did before I jumped through.”
“Well, of course you would, but where is your heart?”
Bob placed his hand on his chest, felt around, then shook his head. “The heartbeat is on the left side, as usual-Wait, let’s check my appendicitis scar.”
He did, then looked up savagely at me. “Right where it’s supposed to be. Nothing happened. There goes all my chance at a grant.”
I said hopefully, “Perhaps some other change took place.”
“No.” Bob’s mercurial temperament had descended into gloom. “Nothing has changed. Nothing at all. I’m as sure of that as I’m sure that my name is Robert L. Backward.”
IASF 1/87
Frustration
Herman Gelb turned his head to watch the departing figure. Then he said, “Wasn’t that the Secretary?”
“Yes, that was the Secretary of Foreign Affairs. Old man Hargrove. Are you ready for lunch?”
“Of course. What was he doing here?”
Peter Jonsbeck didn’t answer immediately. He merely stood up, and beckoned Gelb to follow. They walked down the corridor and into a room that had the steamy smell of spicy food.
“Here you are,” said Jonsbeck. “The whole meal has been prepared by computer. Completely automated. Untouched by human hands. And my own programming. I promised you a treat, and here you are.”
It was good. Gelb could not deny it and didn’t want to. Over dessert, he said, “But what was Hargrove doing here?”
Jonsbeck smiled. “Consulting me on programming. What else am I good for?”
“But why? Or is it something you can’t talk about?”
“It’s something I suppose I shouldn’t talk about, but it’s a fairly open secret. There isn’t a computer man in the capital who doesn’t know what the poor frustrated simp is up to.”
“What is he up to then?”
“He’s fighting wars.”
Gelb’s eyes opened wide. “With whom?”
“With nobody, really. He fights them by computer analysis. He’s been doing it for I don’t know how long.”
“But why?”
“He wants the world to be the way we are-noble, honest, decent, full of respect for human rights and so on.”
“So do I. So do we all. We have to keep up the pressure on the bad guys, that’s all.”
“And they’re keeping the pressure on us, too. They don’t think we’re perfect.”
“I suppose we’re not, but we’re better than they are. You know that.”
Jonsbeck shrugged. “A difference in point of view. It doesn’t matter. We’ve got a world to run, space to develop, computerization to extend. Cooperation puts a premium on continued cooperation and there is slow improvement. We’ll get along. It’s just that Hargrove doesn’t want to wait. He hankers for quick improvement-by force. You know, make the bums shape up. We’re strong enough to do it.”
“By force? By war, you mean. We don’t fight wars any more.”
“That’s because it’s gotten too complicated. Too much danger. We’re all too powerful. You know what I mean? Except that Hargrove thinks he can find a way. You punch certain starting conditions into the computer and let it fight the war mathematically and yield the results.”
“How do you make equations for war?”
“Well, you try, old man. Men. Weapons. Surprise. Counterattack. Ships. Space stations. Computers. We mustn’t forget computers. There are a hundred factors and thousands of intensities and millions of combinations. Hargrove thinks it is possible to find some combination of starting conditions and courses of development that will result in clear victory for us and not too much damage to the world, and he labors under constant frustration.”
“But what if he gets what he wants?”
“Well, if he can find the combination-if the computer says, ‘This is it,’ then I suppose he thinks he can argue our government into fighting exactly the war the computer has worked out so that, barring random events that upset the indicated course, we’d have what we want.”
“There’d be casualties.”
“Yes, of course. But the computer will presumably compare the casualties and other damage-to the economy and ecology, for instance-to the benefits that would derive from our control of the world, and if it decides the benefits will outweigh the casualties, then it will give the go-ahead for a ‘just war.’ After all, it may be that even the losing nations would benefit from being directed by us, with our stronger economy and stronger moral sense.”
Gelb stared his disbelief and said, “I never knew we were sitting at the lip of a volcanic crater like that. What about the ‘random events’ you mentioned?”
“The computer program tries to allow for the unexpected, but you never can, of course. So I don’t think the go-ahead will come. It hasn’t so far, and unless old man Hargrove can present the government with a computer simulation of a war that is totally satisfactory, I don’t think there’s much chance he can force one.”
“And he comes to you, then, for what reason?”
“To improve the program, of course.”
“And you help him?”
“Yes, certainly. There are big fees involved, Herman.”
Gelb shook his head. “Peter! Are you going to try to arrange a war, just for money?”
“There won’t be a war. There’s no realistic combination of events that would make the computer decide on war. Computers place a greater value on human lives than human beings do themselves, and what will seem bearable to Secretary Hargrove, or even to you and me, will never be passed by a computer.”
“How can you be sure of that?”
“Because I’m a programmer and I don’t know of any way of programming a computer to give it what is most needed to start any war, any persecution, any devilry, while ignoring any harm that may be done in the process. And because it lacks what is most needed, the computers will always give Hargrove, and all others who hanker for war, nothing but frustration.”
“What is it that a computer doesn’t have, then?”
“Why, Gelb. It totally lacks a sense of self-righteousness.”
Hallucination
Part One
Sam Chase arrived on Energy Planet on his fifteenth birthday.
It was a great achievement, he had been told, to have been assigned there, but he wasn’t at all sure he felt that at the moment.
It meant a three-year separation from Earth and from his family, while he continued a specialized education in the field, and that was a sobering thought. It was not the field of education in which he was interested, and he could not understand why Central computer had assigned him to this project, and that was downright depressing.
He looked at the transparent dome overhead. It was quite high, perhaps a thousand meters high, and it stretched in all directions farther than he could clearly see. He asked, “Is it true that this is the only Dome on the planet, sir?”
The information-films he had studied on the spaceship that had carried him here had described only one Dome, but they might have been out-of-date.
Donald Gentry, to whom the question had been addressed, smiled. He was a large man, a little chubby, with dark brown, good-natured eyes, not much hair, and a short, graying beard.
He said, “The only one, Sam. It’s quite large, though, and most of the housing facilities are underground, where you’ll find no lack of space. Besides, once your basic training is done, you’ll be spending most of your time in space. This is just our planetary base.”
“I see, sir,” said Sam, a little troubled.
Gentry said, “I am in charge of our basic trainees so I have to study their records carefully. It seems clear to me that this assignment was not your first choice. Am I right?”
Sam hesitated, and then decided he didn’t have much choice but to be honest about it. He said, “I’m not sure that I’ll do as well as I would like to in gravitational engineering.”