When HARLIE Was One


by David Gerrold


  He cleared his throat, paused for a drink of water, and began again.

  “Okay. The G.O.D. proposal is for a Graphic Omniscient Device. Let me begin by explaining what we mean by that. And I apologize in advance if I make this too simple to understand.

  “First of all, most people think that computers solve problems. This is not the case. Computers do not solve problems. They manipulate models of problems. A computer program is a list of instructions that tell how to simulate a specific situation—either real or imaginary. The instructions are a very complete description of the process being modeled. The computer does nothing more than follow the instructions. That’s the difference between a computer and a human being. A computer follows instructions.

  “Now, here’s where it gets interesting. A small computer cannot solve large problems. At best, it can only pretend to solve them. It does it by manipulating very simple models at the cost of accuracy. The more accurate a model, the more accurate its extrapolations.

  “A computer is theoretically limited to the size problem it can solve by the size of the model it can contain. In actuality, the limit is the size of the model that the programmers can construct. There is a point beyond which a program becomes so complex that no one individual human being can understand it all. There is a point—we haven’t reached it yet, but we’re rapidly approaching it—beyond which no combination of human beings and computers can cope. As long as a human being is involved, we are limited to the size model a human being can cope with.

  “Now, the G.O.D. is conceived as an infinitely expandable multiprocessing network—which means that it is theoretically capable of handling models of infinite size. You just keep adding modules until it’s big enough to simulate the circumstance you want to model. But, of course, the same programmability limit applies and there would be no point in building it unless we could also program it.

  “Fortunately, we already have the programmer.

  “His name is HARLIE.

  “H.A.R.L.I.E. It stands for Human Analog Replication, Lethetic Intelligence Engine. He—and I use the pronoun deliberately—was designed and built to be a self-programming, problem-solving device. Just like you and I are self-programming, problem-solving devices.

  “And let me clear up one misconception right at the beginning. HARLIE is functioning well within his projected norms. Yes, he has given us a few surprises; but the real surprise would have been if there had been no surprises at all. The fact is that HARLIE is an unqualified success. We still have a lot of work to do with him—I admit it, he’s undisciplined; he needs training, but so does any child. But HARLIE learns fast and he only has to be taught something once, so we’re making remarkable progress.

  “In five years, gentlemen, this company could be selling HARLIE installations. If we did nothing more than work toward that future, we would still transform the nature of information processing in this country—on this planet.

  “The point is—and HARLIE was the first to realize it, of course—that HARLIE has the same limitations as any human programmer. He is limited to solving problems only as big as he can conceive. HARLIE’s advantage, however, is that he at least is expandable where human programmers are not.

  “The G.O.D. is the computer that HARLIE needs for programming. HARLIE is the programmer that the G.O.D. needs to be practical.

  “What we have here is the next step, perhaps the ultimate step, in computer technology. And we are the only company that can take this step. If we don’t, no one will. At least, not for many years. In fact, I’m not sure that anyone can work in this arena without seriously infringing on our patents on the hyperstate judgment wafers. So, it’s very likely that if we do go ahead with this, we will have the field to ourselves for a long long time. I’m sure I don’t have to tell you the kind of financial opportunity that represents.

  “So that’s the opportunity, gentlemen. That’s the choice before us. I won’t try to pretend that there isn’t a very real and very serious financial risk involved. There’s no question about that. On the other side, however, is the opportunity for incredible gain—not just financially, but in every other area of human endeavor as well. I truly believe that the choice here is profound—” Auberson paused to look around the table; he met the eyes of every person in the room—except Carl Elzer. “It is a choice between playing big . . . and staying small.” And then he let his gaze fall upon Elzer, letting it remain there for a long embarrassing moment.

  Abruptly, he turned back to his notes. He wasn’t proud of what he had just done. He wanted the facts to speak for themselves. He didn’t like the mind games, didn’t want to play them. He covered by clearing his throat and taking another polite sip of water. He let himself continue in a slightly crisper tone.

  “Now, you’ve all had a chance to see the specifications and the schematics, but on the off chance that you haven’t had the time to give them the full study they deserve—” There was an appreciative chuckle at this; most of the directors were aware of the amount of material HARLIE had printed out. “—I’m going to turn this meeting over to Don Handley, our design engineer and staff genius. He honestly thinks he understands this proposal, and is going to explain to you exactly how the system will work. Later, I’ll discuss the nature of the problems it will handle. Don?”

  Handley stood up, and Auberson relinquished the floor gratefully. Handley coughed modestly into his hand. “Well, now, I don’t rightly claim to understand the proposal—It’s just that HARLIE keeps asking me to explain it to him.” Easy laughter at this. Handley went on. “But I’m looking forward to building this machine, because after we do, HARLIE won’t have to bother me any more. He can ask the G.O.D. how it works—and it’ll tell him. So I’m in favor of this because it’ll make my job easier.”

  Auberson sank back into his chair and listened to himself sweat with relief. He hadn’t realized how tense he had been. He hoped it hadn’t shown. Thank God, Handley’s easy manner was lightening up the room.

  “Actually,” Handley was saying, “HARLIE and the G.O.D. will function as the two major parts of the megasystem. Just as a programmer sits and interacts with the workstation—think of that as a programming system with a human component—so will HARLIE interact with the G.O.D. to be a programming system without a human component. In other words, you’ll be able to go directly to the answer. The big difference is that HARLIE will be faster and more accurate and he’ll work on much much larger problems—and he won’t need a Coke machine; there ought to be considerable savings right there alone.

  “Now, let’s get into this in some detail—and if there’s anything you have any questions about, don’t hesitate to ask. I’ll be discussing some pretty heavy schematics, and I want you all to understand what we’re talking about. Copies of the specifications have been made available, of course, but we’re here to clarify anything you might not understand.”

  Listening, Auberson suppressed a smile. This was why the terminals had been installed—to speed the presentation. But Elzer’s sabotage had backfired. They might be here for weeks.

  Already, two of the board members looked bored.

  When they reconvened on Wednesday, Handley spoke about the support technology that would be necessary to realize the full potential of the G.O.D. He spoke of multiple channels, hundreds of thousands of them, all available at the same time. He envisaged a public computer service, where anybody who wanted could simply walk in off the street, sit down, and converse with the machine on any subject whatsoever, whether he was writing a thesis, building an invention, or just lonely and in search of someone to talk to.

  “Not a data service that offers dry reams of information, but an analysis and synthesis utility to suggest opportunities and possibilities for the user’s consideration. The system could offer financial planning, credit advice, tax preparation, ratings on competitive products, and personalized menu plans for dieters. It could construct sophisticated entertainments and animations. It could compute the odds on tomorrow’s races and program the most optimal bets a player could make. It could help an author write his book, it could help a composer write his symphony. And none of these people would need to know the slightest thing about programming. It would all be done in natural English dialogue. The computer would program itself for each specific task as it was needed. In other words, a person using the service would be limited only by his own imagination.”

  Handley did something unusual then—unusual for Handley, that is. He leaned on the table and looked around the room, meeting the eyes of every single person there. “Consider, for example, one question. What would you like to talk to God about?—in the absolute privacy of your own soul? What questions would you like to ask? What answers would you like to know? Just in your own personal life, where do you feel stopped? Where would you like to experience the power of a breakthrough? Think about it. Now, think about this—how much would you pay for access to such an opportunity? Good. Now, multiply that by five billion . . .

  “Per year. . . .”

  Auberson resumed on Thursday morning. He spoke of financing and construction. He pointed out how HARLIE had developed a forty-dimensional set of variable-optimization programs for financing, with alternity branches at every major go/no-go point to allow for unforeseen circumstances. HARLIE had computed multiple-range time scales to guarantee that the proper parts arrived in the right place at the right moment and that there would be workers on-site who had been trained to assemble them correctly.

  Auberson spoke of five-year plans and ten-year plans, pointing out that the G.O.D. could go into production in eighteen months and be in operation within six to nine years after that.

  HARLIE had noted land requirements, legal requirements, necessary permits, and zoning demands. He’d extended that to include manpower and construction projections; he’d extrapolated the range of possible impact studies, both environmental and social—and how to minimize the cultural upheavals. He’d also included maps of the most feasible sites, in terms of both cost and maintenance. In short, HARLIE had thought of everything.

  Auberson did not go into detail, except when pressed. He summarized each section of HARLIE’s proposal, then went on to the next. Elzer and the others had already examined those parts of the proposal they had the most doubts about, and they had been unable to find anything fundamentally wrong with HARLIE’s projections—except their unorthodoxy.

  The rest of the directors were coming to much of this material for the first time and they pored carefully over each specification. They questioned Auberson ceaselessly about the financial aspects. At first, Auberson was annoyed—and then he began to appreciate their thoroughness—and then he was annoyed again. He held up a hand and interrupted himself in the middle of a fumbling answer and said, “Wait. This question is better answered by someone more familiar with this part of the material.”

  He stepped away from the table and strode over to one of the terminals near the wall. He switched it on, but it remained dark. A quick look behind the workstation showed that it had been deliberately unplugged. Shaking his head in disbelief, Auberson bent to the floor and reconnected the machine.

  Then he turned the meeting over to HARLIE, relaying the questions of the directors via the keyboard.

  HARLIE responded with quiet restraint, not commenting on anything, simply printing out the figures and letting them speak for themselves. The directors began to nod in admiration at the bond proposals, the notes, the loans, the stock issues, the tax write-offs, the depreciations, the amortization schedules—all the numbers, all the graphs and figures, the total money picture. It was numbers, only numbers, but beautiful numbers and beautifully handled.

  Oh, there were gambles to be taken. The whole thing was a gamble—but HARLIE had hedged his bets so carefully that no one gamble would ever be the ultimate gamble as far as the company was concerned. There was a safety net under every risk. After all, it was HARLIE’s life too.

  There was just one disturbing aspect to the whole thing.

  Carl Elzer hadn’t spoken a word.

  He hadn’t asked a single question, hadn’t voiced a single objection.

  He’d just sat and waited. Sat and waited.

  It was very unnerving.

  Late Friday afternoon, Carl Elzer made his move.

  He nodded to Dorne. It was that simple.

  Dorne nodded back, and interrupted Auberson in the middle of a discussion on how to monitor the system for accuracy.

  “All right, David, it’s getting late. Let’s try to make some sense out of all this.

  “We’ve gone over the specifications. I believe you pointed out that there are more than 180,000 stacked feet of them. We don’t have time to examine all of them as fully as we’d like, but if nothing else, you and Don Handley have convinced us—convinced me, anyway—that this entire program has been worked out on a staggering scale. Either that, or the two of you are the greatest con men in computer history.

  “And that’s really the question. Is this for real? You’ve certainly proved that HARLIE can generate a lot of supporting paperwork. I will admit, I am impressed by that capability. However, what I want to know—what we need to know—is this: Will this machine justify its expense? How? We will be investing more than the total profits of this company every year for the next ten to fifteen years before the returns start to outweigh the expenditures. Granted, the potential here is vast, but will we live long enough to see it? And will that profit be enough to justify all the expenses we will put into this thing today?”

  “Yes,” said Auberson.

  “Yes? Yes, what?”

  “Yes, it will. Yes to all your questions.”

  “All right,” said Dorne. “How?”

  “The short answer? You’ll ask it questions. It’ll give you answers. If your question doesn’t have an answer, or has more than one optimal answer, it’ll discuss possibilities with you. The fact is, we need this machine here now, so you can ask it that very question. HARLIE’s done the best he could—what he’s really done is outline the capabilities that he needs to be complete—to be the machine we wanted him to be in the first place.

  “Gentlemen, this discussion has actually become a demonstration of how much we truly do need this machine. We’ve reached the limits of our combined abilities to understand the scope of this. HARLIE says this system will be able to synthesize information from trends as varied as hemlines, the stock market, and the death rate due to heart disease and come up with something that we could never have noticed before. This installation will do what we’ve always wanted computers to do, but never had the capacity for in the past.

  “We’ll be able to tell HARLIE in plain English what we want, and he’ll not only know if it can be done, he’ll know how to program the G.O.D. to do it. It will be able to judge the effect of any single event on any other event. It will be a total information machine—and its value goes beyond mere profitability. The opportunity here is to—” Auberson took a deep breath and said it anyway, “—the opportunity here is to transform the quality of life on this planet.”

  —and then it hit him.

  As he was saying it, it hit him.

  The full realization.

  This was what HARLIE had been talking about so many months ago when he first postulated the G.O.D. machine. Not just certainty. Not just truth.

  GOD.

  There would be no question about anything coming from the G.O.D. A statement from it would be as fact. When it said that prune juice was better than apple juice, it wouldn’t just be an educated guess; it would be because the machine would have traced the course of every molecule, every atom, throughout the human body; it would have judged the effect on each organ and system, noted reactions and absence of reactions, noted whether the process of aging and decay was inhibited or encouraged; it would have totally compared the two substances and would have judged which one’s effects were more beneficial to the human body; it would know with a certainty based on total knowledge.

  It would know.

  The great mass of human knowledge, HARLIE had said, was based on trial and error. Somebody had had to learn it—and then communicate it.

  This knowledge would be different.

  This knowledge would be intuitive and extrapolative. And accurate. As accurate as the model could be, that’s how accurate the knowledge would be.

  The model would be total.

  Therefore . . . so would the knowledge.

  The G.O.D. machine would be able to know every fact of physics and molecular chemistry, and from that would be able to extrapolate upward and downward any and every condition of matter and energy—even the conditions of life. Solving the problems of mere humanity would be simple tasks for it compared to what it would eventually be able to do. And there would never be any question at all as to the rightness of its answers.

  HARLIE wanted truth, and yes, the G.O.D. would give it to him—truth so brutal it would have razor blades attached.

  Painful truth, slashing truth, destroying truth—the truth that this belief is false and antihuman, the truth that this company is parasitical and destructive, the truth that this man is unfit for political office.

  With startling clarity, he saw it; like a vast multidimensional matrix, layers upon layers upon layers, every single event would be weighed against every single other event—and the G.O.D. machine would know.

  Give it the instruction to identify the most good for the most people, and it would point out truths that would be more than just moral codes—they would be laws of nature. They would be absolutes. There would be no question as to the truth of these “truths.” They would be the laws of G.O.D. They would be right.

 
