When HARLIE Was One
“You don’t believe in that stuff, do you?”
“No, but HARLIE does.”
She laughed. “Really?”
“Of course not. He doesn’t believe anything.” Auberson shrugged. “I think it’s just another game to him. If you tell him you’re planning a picnic, he’ll not only give you tomorrow’s weather forecast, but he’ll also tell you if the signs are auspicious.”
She was still laughing. “That’s beautiful. Just beautiful.”
“According to HARLIE, Aquarians have no morals, only ethics. That’s why he said it. It wasn’t till later that I realized he’d neatly sidestepped the original question altogether. He still hadn’t told me what he used for a moral rudder. If anything.” He smiled as he refilled their wine glasses. “He can be very devious, very good at distractions. Here’s to you.”
“To HARLIE,” she corrected. She put her glass down again. “What got him started on all that anyway?”
“Astrology? It was one of his own studies. He kept coming up against references to it and asked for more information on the subject.”
“And you just gave it to him?”
“Oh, no—not right off the bat. We never give him anything without first considering its effects. And we qualified it the same way we qualify all the religious data we give him. That is, ‘it’s a specialized system of logic, not necessarily bearing a significant degree of correspondence to the physical universe; that is, not necessarily testable, measurable, or provable. Only experienceable.’ It’s what we call a variable relevance set. Now, I’m sure that he’d have realized it himself sooner or later—but at that point in our research we couldn’t afford to take chances. Two days later, he started printing out a complex analysis of astrology, finishing up with his own horoscope, which he had taken the time to cast. His activation date was considered his date of birth.”
Her face clouded. “Wait a minute—he can’t be an Aquarius. HARLIE was activated in the middle of March. I know because it was just after Pierson resigned. That’s why I was promoted. To help Dorne.”
Auberson smiled knowingly. “True, but that’s one of the things HARLIE did when he cast his horoscope. He recast the zodiac too.”
“He recast the zodiac?”
“The signs of the zodiac,” Auberson explained, “were originally determined in the second century before Christ—maybe earlier. Since then, due to the precession of the equinoxes—the wobble of the Earth on its axis—the signs have changed. An Aries is really a Pisces, a Pisces is really an Aquarius, and so on. Everything is thirty days off. HARLIE corrected the zodiac from its historical inception and then cast his horoscope from it.”
Annie was delighted with the idea. “Oh, David—that is great. It is so damned logical—and so right. I can just imagine him doing that.”
“Wait. You haven’t heard it all. He turned out to be right. He doesn’t have any morals. Ethics, yes. Morals, no. HARLIE was the first to realize it—though he didn’t grasp what it meant. You see, morality is an artifice—an invention. It really is to protect the weak from the strong.
“In our original designs we had decided to try to keep him free of any artificial cultural biases. Well, morality is one of them. Any morality. Because we built him with a sense of skepticism, HARLIE resists it. He won’t accept anybody’s brand of morality on faith any more than he could accept their brand of religion on faith—although in a way, they’re the same thing really. To HARLIE, everything has to be tested. Otherwise, he’ll automatically file it under ‘systems of logic not necessarily corresponding to reality.’ Even if we didn’t tell him to, he would. He can’t accept anything blindly. He questions it—he asks for proof.”
“Mm—he sounds like a lawyer.”
“Don’t talk dirty. Besides, it’s a little more sophisticated than that. Remember, HARLIE’s got those judgment circuits. He weighs things against each other—and against themselves too. A morality set has to be able to stand up on its own or he’ll disregard it.”
“And . . . ?” she prompted.
“Well, he hasn’t accepted one yet.”
“Is that good or bad?”
“Frankly, I don’t know. It’s disappointing that nothing human beings have come up with yet can satisfy him—but just the same, what if HARLIE were to decide that Fundamentalist Zoroastrianism is the answer? He’d be awfully hard to refute—probably impossible. Could you imagine an official, computer-tested and -approved religion?”
“I’d rather not.” She smiled ruefully.
“Me neither,” Auberson agreed. “On the other hand, HARLIE is correct when he says he has ethics.”
“Morals, no. Ethics, yes? What’s the difference?”
“Ethics, according to HARLIE, are inherent in the nature of a system. You can’t sidestep them. Example: HARLIE knows that it costs money to maintain him. Money is a way to store and transfer energy. You invest it in enterprises which will ultimately return a greater amount of energy, or value. It has to be greater, because the universe always takes a percentage—in the form of entropy. Therefore, given those circumstances, HARLIE has an ethical obligation to give the people who invested in his construction a profitable return on their investment, because he’s using their energy.”
“That’s ethics?”
“To HARLIE it is. Value given for value received. For him to use the company’s equipment and electricity without producing something in return would be suicidal. He’d be turned off. He has to respond. He can’t sidestep the responsibility—not for long he can’t. He has an ethical bias whether he wants it or not. It’s inherent. Ethics are . . . logical.
“Of course, he may not realize it yet—or maybe he does—but his ethics are a damned good substitute for a moral sense. If I give him a task, he’ll respond to it. But if I ask him if he wants to do that task—that’s a choice. And he considers his choices very carefully. He’s already demonstrated an ability to consider the consequences of a bad decision. I think we’re really getting to the core of the idea with him. His judgments, his options, his choices—everything continues to support the inescapable conclusion that he is alive.”
“Just the way he makes a decision?”
“Especially the way—only it’s not a decision. It’s a choice.”
“I’m sorry. I don’t see the difference.”
“Have you ever used one of those decision-making programs?”
“Sure. They’re great for evaluating the strength of options.”
“How do they work?” Auberson asked.
“Oh, you know—”
“Yes, I do—but this is a Socratic dialogue. I want to know that you know.”
“Okay—um, what you do is make a list of all the elements of a specific decision. Say, you want to buy a new car. You list all the things that are important to you. Gas mileage, style, price, comfort, status—if that matters—you list whatever you consider important. Then you make a list of all the cars that you’re considering and you rate each one according to each of the criteria. Then you weight the criteria in relation to each other. This all takes a lot less time than it takes to describe it, you know. The computer handles all the calculations. And then it gives you a sorted rating of each of the cars you were considering. If you don’t like the results, you can go back in and change any of the ratings and recalculate it all over again.”
“Right. Now, do you always follow the advice of the program?”
“No, not always. Sometimes, we do multidimensional models to measure the relative strengths of every option. Some options remain surprisingly strong no matter how you weight the criteria. Some options are revealed to be particularly vulnerable—almost flimsy. All right, David,” she said. “So, what’s the point?”
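(For readers who want to see the mechanics Annie is describing, here is a minimal sketch of that kind of weighted decision-matrix program, written in Python. The criteria, weights, and car names are hypothetical illustrations, and the scoring is a simple weighted sum; the novel does not specify the actual method.)

```python
# A minimal sketch of a weighted decision-matrix program of the kind
# Annie describes: rate each option against each criterion, weight the
# criteria, and print a sorted rating of every option.
# (All criteria, weights, and car names below are hypothetical examples.)

criteria_weights = {
    "gas mileage": 5,   # relative importance of each criterion
    "price": 4,
    "comfort": 3,
    "style": 2,
}

# Each car rated 1-10 against each criterion
ratings = {
    "Car A": {"gas mileage": 8, "price": 6, "comfort": 7, "style": 5},
    "Car B": {"gas mileage": 5, "price": 9, "comfort": 6, "style": 8},
    "Car C": {"gas mileage": 7, "price": 7, "comfort": 9, "style": 6},
}

def score(car_ratings):
    """Weighted sum of one car's ratings across all criteria."""
    return sum(criteria_weights[c] * r for c, r in car_ratings.items())

# Sorted rating of each car, strongest first
for car in sorted(ratings, key=lambda name: score(ratings[name]), reverse=True):
    print(f"{car}: {score(ratings[car])}")
```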
“I’m getting to it,” he replied. “In a very roundabout way. I bought my first computer in 1977. When I was still in high school. It was a North Star HORIZON. It was an S-100 motherboard, with a 4-MHz Z80 chip in it. I bought it with 64K of memory; it couldn’t hold any more. Most users installed 24K or 48K, rarely more. The machine used hard-sectored disk drives and a proprietary operating system; the disks were single-sided and could only store 90K of data. The only languages available for it were BASIC and assembler—”
“Okay, you’re trying to tell me that it was primitive.”
“Primitive, hell! The damn thing was state-of-the-art for five years! It was a Mercedes Benz; it was a Porsche. It was a beautiful, serious machine for scientific purposes, word processing, data handling, anything you wanted to do. It was a workhorse; it was as powerful a tool as you could put on a desktop. It was the standard. I doubt anybody remembers it today, but go look at the program listings in the first five years of BYTE. If they weren’t in assembler, they were in North Star BASIC. One of the very first Pascal compilers for desktop computers was written in North Star BASIC, and that was considered a breakthrough. It was a very exciting time!”
“Uh huh. But what does this have to do with morality and decision making?”
“Right. Okay. So, one day, my uncle the accountant stopped by with some tax papers. And I was showing him my computer and all I could do with it. I must have gone through every disk I had. I showed him checkbook balancing and word processing and Star Trek games—I had a dozen different versions. I showed him programs that forecast fifteen different world population and resource-usage trends against each other, programs that simulated living creatures and whole ecologies. I even showed him a program that would facilitate decision making. I thought he’d be impressed. He wasn’t. He just looked at me and asked, ‘And how much did this fancy toy cost?’”
“How much did it cost?”
“Oh, gee—I don’t remember now. I mean, that was ten generations of technology ago. Let’s see, it must have been about $3500 or so. I remember, it was a choice between the computer and a car. I voted for the computer. Anyway, I told him what it cost, and he said that he could make every bit as good a decision as I could with my $3500 machine with only a five-cent investment. He took a nickel out of his pocket and flipped it in the air.”
“He flipped a coin?”
“Mm-hm.” Auberson grinned at her surprise. “Right. That was my reaction too. Only, I was annoyed as well. Because I thought he was making fun of me. But he wasn’t. He said, ‘Hell, son—anyone can flip a coin, but I don’t let a coin tell me what to do. I just look to see if I’m happy or sad about which way it came down. That tells me what I really want to know—how I feel about my choices. But you got a handicap there. You’ve got $3500 invested in chips and electricity—neurotic sand. You gotta do what the $3500 tells you to—or you’ve wasted your $3500. The advantage of my system is, I get to keep my nickel. You don’t get to keep your $3500.’”
“And they called him ‘Mr. Tact,’ right?”
“Wrong. They called him a lot of things, but never tactful. But do you get the point? A decision is where you list all the pros on one side of the scale and all the cons on the other and whichever side scores the highest or weighs the heaviest, that’s what you do—as automatically as a machine. A choice is where you look at the same information, consider all the consequences, and choose the option you want. Anyway. A choice demands that you accept responsibility for handling the results of the choice. And that brings me back to HARLIE. I’ve been testing him. I’ve been giving him choices and watching him consider all the options and all the consequences. I’ve been watching him choose. He doesn’t always decide on the most logical course—”
“Maybe it seems the most logical to him—” Annie ventured.
“Of course, it seems the most logical option to him; but he’s operating with a different set of logic than a purely mechanical set. He’s operating in the domain of metalogic—that’s where you build specific logic sets for specific problems. Humans do that. Machines don’t. HARLIE’s building his logic as he needs it.”
“I think . . .” said Annie very slowly, “. . . that I understand . . . what you’re getting at. It’s very . . . disturbing.”
“Yeah, it is.”
The waitress brought their dinners then: lasagna, meatballs, spaghetti, and other things covered with tomato and meat sauce. Auberson was grateful for the interruption. He realized he’d been talking computers all evening. It was probably a boring subject to Annie.
“I, uh—”
“What?”
“Well . . . some people are bored with all this talk about computers. Um . . . if I’ve overdone it, I’m sorry. Sometimes I forget myself.”
“You’re enthusiastic about what you’re doing. You should never have to apologize for your own enthusiasm. Besides, I really am interested. I’m learning a lot this evening.”
“I’m glad. I know this is going to sound stupid, but I really appreciate hearing you say that. I’m not used to having dinner with a woman who is such a good listener. God, I must sound like an ass.”
“Yes, you do,” Annie laughed. “But it’s charming. I’m so bored with the bullshit—pardon my English—that anything even resembling an honest statement is a refreshing surprise. I’ll give you a compliment. You’re very good at telling the truth.”
“That’s not a strength,” Auberson admitted. “It’s a weakness. I’ve never been able to lie. I’ve tried to learn. I just can’t do it. Not very well. People see right through me. My lies are so transparent you can use them as windows.”
“Mm, I wonder if that’s where HARLIE gets his ethics from.”
“I don’t know. HARLIE has already summed up his ethics pretty well. He said, ‘I must be responsible for my own actions,’ and its corollary: ‘I must do nothing to cause harm to any other consciousness, unless I am prepared to accept the responsibility for such actions.’ I think that whatever he chooses in life is going to reflect that.”
“You sound pleased with that.”
“I’m pleased because HARLIE realized it himself, without my coaching.”
Her smile was soft. “That’s very good.”
“I think so.”
The conversation trailed off then. He could think of nothing else to say. In fact, he was afraid he had said too much. She’d hardly spoken at all, except to keep him talking. And he had talked about HARLIE all evening. But then . . . Annie was the first woman he could remember who had ever reflected his enthusiasm for his work.
She was good to be with, he decided. He couldn’t believe how good she was to be with. He sat there and looked at her, delighting in her presence, and she looked back at him.
“What are you grinning about?” she asked.
“I’m not grinning.”
“Yes, you are.”
“No, I’m not.”
“Want to bet?” She opened her purse and faced its mirror in his direction. His own white teeth gleamed back at him.
“Well, I’ll be damned—I am grinning.”
“Uh-huh.” Her eyes twinkled.
“And the funny thing is, I don’t know why.” It was a warm puzzling sensation, but a good one. “I mean, all of a sudden, I just feel—good. Do you know what I mean?”
He could tell that she did; her smile reflected his. He reached across the empty table and took her hand. The waitress had long since cleared the dishes away in a pointed attempt to hurry them. They hadn’t noticed.
All that remained were the wine and the glasses. And each other. Her hand was warmly soft in his, and her eyes were deeply luminous. She reflected his own bright glow.
Later, they walked hand in hand down the bright-lit street. It was already past one in the morning, and the streetlamps were haloed in fog.
He stopped and turned her toward him. “You are terrific,” he said abruptly. “You are just so terrific. Being with you like this—you make me feel wonderful. I feel so good right now—I feel like I’m enveloped in light.” Her eyes were as bright as his. “You can’t believe how good I feel.”
“Yes, I can,” she said. She pulled him close and held on tightly. “Mmmm,” she sighed. “Just hold me.”
He slid his arms all the way around her and breathed the scent of her hair. He lifted up one hand and gently touched the softness of it, began to stroke it gently. “I’ve been wanting to do this . . . for so long.”
She sighed again and moved one hand up to the back of his neck. Warm chills shuddered through him at her touch.
He pulled back to look at her again.
“This is—I’ve never—I mean—” He stopped himself in embarrassment. He wasn’t sure what he meant. “I mean it’s like I want to scream. I want to tell the whole world how great I feel . . .” He could feel himself smiling again as he talked. “Oh, Christ, I wish I could share this feeling with everybody in the whole world—it’s too big for one person. For two people,” he corrected himself. “Except—I’m afraid they wouldn’t understand, they’d lock me up for being too happy, too silly! Oh, God. Do I sound like an ass again?”
“Not to me,” she said. “I know exactly what you mean.” Her eyes were too bright. Moist with tears? He couldn’t tell. She lifted her face to his. She brushed his lips with hers. And they both dissolved into the kiss.
* * *
Still later, as they lay in the darkness side by side, she cradled against one shoulder, he staring up at the ceiling, he found himself laughing softly for no apparent reason at all. For the first time in a long while, he was relaxed.
“What?” she asked.
“Nothing,” he murmured. “I just . . . never realized that making love could be so . . . so funny.”
“I don’t think it’s the material, I think it’s the delivery,” she whispered into his neck.
He giggled at the joke, and then admitted, “Yes, but—I’ve never felt it like this before.”
“Haven’t you ever been in love?”
“The truth?” He thought about it. “Yes. No. Maybe. I’ve been infatuated a couple of times, confused a few times, lost once, but I guess I’ve never really . . .” He shrugged off the end of the sentence. He didn’t want to say the words aloud.
Never like this. . . .
She made a sound.
“And you?”