
The Soul of a New Machine


by Tracy Kidder


  Alsing's childhood did not leave him with an abundance of sweet memories. Discovering the telephone was one of his fondest. When he was eight and in third grade, his family moved from central Massachusetts to Evansville, Indiana. He loathed that place. "I was smaller, paler, weaker, less rugged and I had a funny accent. I remember discovering in third grade that there was a pecking order — 'Hey, there's a pile and I'm on the bottom of it.' I was a very depressed kid in third grade." He vividly recalled the day when he skipped recess, usually a painful event for him anyway, and instead worked at his desk on the design of a telephone. He wanted to find out how the thing could possibly capture a voice. It seemed to him an improbable instrument, one that shouldn't work. He read about telephones in several encyclopedias. Then he took the family phone apart. Finally, he figured it out to his satisfaction. "This was a fantastic high, something I could get absorbed in and forget that I had these other social problems."

  One day, while at home in Evansville, he was prowling around the basement and noticed wires running across the ceiling of the coal bin. He traced them back to the phone upstairs. He got ahold of some batteries, an old microphone and an ancient set of earphones, and from the coal bin — sitting there all alone in the grime, with the earphones on — he tapped the family's telephone. Once in a while, he accidentally short-circuited it, but his father, an engineer for Westinghouse who designed refrigerators, was indulgent on that score.

  Some years later, Alsing's family returned to New England, and he enrolled at the University of Massachusetts. He made mostly poor marks. Then, in his junior year, he took a course in the theory of digital circuits. It entailed, as such courses always do, the study of Boolean algebra. Alsing's world was never the same after that. Fathoming this algebra, Alsing felt as some children do when all at once they know how to read. Boolean algebra was something that made perfect sense, and thus it was a rare commodity for him. He called it beautiful.

  At the heart of the computer lies a device made up largely of transistors. Engineers call this device a gate, and the analogy is apt. You might think of it as a newfangled, automatic barnyard gate in an electrical fence. If the gate is closed, current flows through it along the length of the fence; but the current stops at the gate when it stands open. You open and close this gate by sending signals to it down two or more (let's say it's two) control wires. How can you add two numbers together with electricity? For Alsing, the question had the force of those that the telephone had long ago aroused in him. You assumed, first of all, that you were going to do your adding in binary arithmetic. It was simple, once you got the hang of it. You can count as high as you like in binary, but you use only the integers 0 and 1. The zero of the familiar decimal arithmetic is 0 in binary, and the one in decimal is also 1 in binary; but two in decimal is 10 in binary, three is 11, and so on.
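  As an aside, and not part of Kidder's text, the counting scheme just described can be written out in a couple of lines of Python:

    # The decimal-to-binary correspondence described in the passage.
    for decimal in range(4):
        print(decimal, format(decimal, "b"))
    # prints:
    # 0 0
    # 1 1
    # 2 10
    # 3 11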

  Next, you let a high voltage represent the binary integer 1 and a low voltage represent the binary integer 0. Then you build a gate, which for all practical purposes is a binary device: it's either opened or closed. When opened it passes on a low voltage, which stands for 0; closed, it passes on the symbol for 1. What's crucial is how this gate responds to signals from the two wires that control it. You can build your gate in such a way that it will close and pass on the electrical symbol for a 1 if, and only if, one of its control wires sends it a 1 and the other sends it a 0. A gate built this way thus will add 0 and 1 and yield the right answer every time.

  But if you want to add 1 and 1 and get the right answer, which is binary 10, then you have to modify your circuit, and if you want to add large numbers together, you have to build a large array of gates and control wires. Suppose you want to build an adder that can combine two 32-bit packets. The diagram for such a circuit looks forbiddingly complex. How can you be sure that for every possible set of signals you put into this circuit, it will produce just the right pattern of opened and closed gates to yield up the right answer? You need a set of rules. Boolean algebra, for instance.

  Conventional algebra sets rules about the relationships between numbers. Boolean algebra expresses relationships between statements. It is a system of logic; it sets general conditions under which combinations of statements are either true or false. In this sense, it is a binary system. What Alsing studied was in fact a simplified form of Boolean algebra, one that had been tailored to digital circuits. "If, and only if, A is true AND B is true, then their combination is also true" — that is one of the statements of the algebra. There are others, and it turns out that gates can be built so that they behave precisely according to these statements. Indeed, the kinds of gates are named for the Boolean statements that they mimic; there are, among others, AND gates, NAND gates, OR gates, NOR gates. Boolean algebra provides one systematic way for engineers to design their circuits. It is less important for that purpose than formerly, but it was a crucial tool in Alsing's college days.
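  The correspondence between Boolean statements and gates can also be sketched in a few lines of Python, again as an illustrative aside rather than anything from the book; only the 32-bit width comes from the passage, and the half adder and full adder below are generic textbook constructions, not the circuits Alsing studied.

    # Gates as tiny Boolean functions, named for the statements they mimic.
    def AND(a, b):  return a & b
    def OR(a, b):   return a | b
    def NAND(a, b): return 1 - (a & b)
    def NOR(a, b):  return 1 - (a | b)
    def XOR(a, b):  return a ^ b   # 1 if, and only if, exactly one input is 1

    # A half adder handles 0 + 1 correctly, but 1 + 1 needs a second
    # output, the carry, just as the passage says.
    def half_adder(a, b):
        return XOR(a, b), AND(a, b)          # (sum bit, carry bit)

    def full_adder(a, b, carry_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, OR(c1, c2)

    # A 32-bit adder is 32 full adders chained together by their carries.
    def add_32bit(x_bits, y_bits):           # bits listed least significant first
        result, carry = [], 0
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        return result

  Chaining the carry of each full adder into the next is what turns a handful of single-bit gates into the forbiddingly complex 32-bit array of gates and control wires mentioned above.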

  The time was the middle sixties. The computer was a famous and closely held marvel; the day had not arrived when curious undergraduates could buy the parts and figure out the thing by building one themselves. But here, in the course that Alsing took, lay the secret of the machine. "It was so simple, so elegant," he said. He got an A in that course. So he took a course in how to program a computer by using a high-level computer language called FORTRAN, which was developed for scientists mainly. Alsing got an A in that. He flunked everything else — because of an actual computer.

  It was an IBM machine, archaic now but gaudy then. The university owned it, in effect, and it lay inside a room that none but the machine's professional caretakers could enter during the day. But Alsing found out that a student could just walk into that room at night and play with the computer. Alsing didn't drink much and he never took any other drugs. "I was a midnight programmer," he confessed.

  During the first nights after he learned to write a computer program, Alsing would go off from the computer room and search the empty building, looking for a classroom with a blackboard and some chalk. He posed problems for himself and on the blackboard wrote up little programs for their automatic solution. Then he hurried back to the computer to try out his programs on the machine. That was what made it fun; he could actually touch the machine and make it obey him. "I'd run a little program and when it worked, I'd get a little high, and then I'd do another. It was neat. I loved writing programs. I could control the machine. I could make it express my own thoughts. It was an expansion of the mind to have a computer."

  About ten other young, male undergraduates regularly attended these sessions of midnight programming. "It was a whole subculture. It's been popularized now, but it was a secret cult in my days," said Alsing. "The game of programming — and it is a game — was so fascinating. We'd stay up all night and experience it. It really is like a drug, I think." A few of his fellow midnight programmers began to ignore their girlfriends and eventually lost them for the sake of playing with the machine all night. Some started sleeping days and missed all their classes, thereby ruining their grades. Alsing and a few others flunked out of school.

  In a book called Computer Power and Human Reason, a professor of computer science at MIT named Joseph Weizenbaum writes of a malady he calls "the compulsion to program." He describes the afflicted as "bright young men of disheveled appearance, often with sunken, glowing eyes," who play out "megalomaniacal fantasies of omnipotence" at computer consoles; they sit at their machines, he writes, "their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler's on the rolling dice." Was this a portrait of Alsing playing with that old IBM?

  In order to explain the hold that machine once had on him, Alsing had insisted that we confront Trixie late at night. As if preparing for a journey or an execution, we had dined well that evening. And when we'd gotten to the basement, he'd decided to start me on the way to addiction with small stuff — that is, with Adventure. I'd played and played. One by one the night owls of the basement — and there were many in the Eclipse Group then — had departed. I'd kept on playing.

  When I finally quit, I felt weary in my bones. I was actually sweating; my shirt stuck to my back. Things around me kept going in and out of focus. I looked at Alsing, and the rims of his eyes were red. He said he could remember experiencing weariness like this during his midnight-programming days, but he had been younger then. Weariness had been a badge and part of the fun. Some of his cohorts had suffered, it was true. "But college kids are vulnerable. They can get taken down by girls, or drink, or by programming." As for him, he felt that he had gained far more than he had lost.

  Up until that year when he discovered the machine, his life, every way he had looked at it, had seemed chaotic. He had done fairly well in a course in psychology, but in nothing else. He had proven to himself that he was an inveterate failure. Now, finally, he felt truly interested in something. He left college for a year. When he came back, he took a number of courses in electrical engineering and became a straight-A student. He took a job with DEC when he graduated, then went to Data General, partly because he thought it would be a lively place to work; he got that idea from the famous first ad and from the angry talk about Data General he heard in the hallways at DEC.

  At Data General, the number of your security badge reflects your longevity: the lower the number, the longer your service. By 1979 new employees were receiving badges numbered in five figures. Alsing wore number 150, which was low enough to confer upon him some status and, he thought, perhaps a measure of security. According to several stories, de Castro had shown a special loyalty toward those who had joined his company early on and stayed. At Data General, Alsing became a microcoder. In fact, he was Data General's first and most prolific one.

  Most new microcoders, on their first job, have the odd feeling that what they're doing can't possibly be real. "I didn't fully believe, until I saw it work, that microcode wasn't just a lie," said Alsing's main submanager, Chuck Holland, remembering the first code that he wrote. At the level of the microcode, physical and abstract meet. The microcode controls the actual circuits.

  A stairway of sorts leads down into a computer. At the top stand the high-level languages. A number of these exist, and more are on the way all the time. They vaguely resemble human languages, and they remain the same no matter what computer a programmer uses. Alsing wanted me to write a little program in the high-level language called BASIC, which resembles a pidgin English. In one part of my program I ordered a simple division of two numbers, a command that in BASIC is represented by a slash: "/". I typed the program into Trixie via Alsing's terminal. What happened to the "/"?

  Inside Trixie's storage system lay a number of programs called interpreters. There was one for BASIC, and that interpreter program worked on my high-level program. Actually, the interpreter gave Trixie instructions as to how to begin translating my program into instructions that Trixie's circuits could respond to. Computers compute in order to compute. The interpreter for BASIC had Trixie translate the "/" into so-called assembly language.

  It is commonplace today for programmers to stick exclusively to high-level languages and never look inside their machines. Alsing felt that they were missing something. He remembered learning assembly language during his time of midnight programming. "It was neat to learn it. I could skip the middleman and talk right to the machine. It was also great for me to learn that priestly language. I could talk to God, just like IBM." Written down outside the machine, assembly language is a list of mnemonics, such as ADD, Skip On Equal, and so on. It contains the names of all the roughly two hundred basic operations that Trixie can perform. Instructions, these operations are called. Inside the machine, these instructions exist, of course, in the form of electrical charges. No single instruction in Trixie's set was equivalent to the "/". The "/" became a series of several discrete instructions.
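  As another aside, and not the Eclipse's actual instruction sequence, the way a single high-level "/" unpacks into a series of simpler steps can be suggested in Python, here as division by repeated subtraction:

    # A hypothetical unpacking of "/" into simpler operations.
    def divide(dividend, divisor):
        quotient = 0
        while dividend >= divisor:   # compare: one simple operation
            dividend -= divisor      # subtract: another
            quotient += 1            # increment: another
        return quotient, dividend    # quotient and remainder

    # divide(7, 2) returns (3, 1)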

  Not many years before, in most computers, the "/", once it had been translated into its constituent assembly-language instructions, would have been fed right into the actual circuits. A computer in which the stairway ends there, at the level of assembly language, is said to be "hard-wired." It has circuits especially designed to perform each basic operation in the machine's instruction set. But by the seventies — again, in most computers — assembly language no longer went directly to the circuits, but was itself translated into another language, called microcode.

  For each assembly-language instruction there exists a microprogram, and most microprograms consist of several microinstructions. Each of Trixie's microinstructions, in turn, consists of 75 bits. Seen written on a page, a microinstruction is a string of 0's and 1's. These correspond directly, of course, to strings of high and low voltages stored in a special place inside the computer — a "microstorage" compartment. Each string of 75 bits is divided into portions, and each portion is destined for some part or parts of the machine's circuitry. The 75 bits of each microinstruction are the actual signals that will make the gates in the circuits open and close in just the right patterns. So my "/" became a linked list of, let's say, 10 assembly-language instructions, each of which became a microprogram, each of which consisted, on the average, of 3 microinstructions, each of which consisted of 75 bits. The simple "/" was now platoons of signals, which were sent out one after the other, causing Trixie's circuits to take the two numbers I had provided for division, to translate these numbers into electrical code, to determine which was to be divided by which, to run the now encoded numbers through the Arithmetic and Logic Unit in such a way as to divide them (a labyrinthine passage itself), and to put the answer somewhere for the next step in my program. Actually, far more microsteps than that occurred. Indeed, the physical machine responds only to microcode. It was microcode, at bottom, that caused Trixie to translate my "/" into microcode. In this sense, the computer chases its tail.
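  The idea of a 75-bit microinstruction divided into portions can be sketched as well. Only the 75-bit width comes from the passage; the field names and widths below are hypothetical, invented purely for illustration, and are not Trixie's actual microword format.

    # Slicing a 75-bit microword into named portions (hypothetical layout).
    HYPOTHETICAL_FIELDS = [
        ("alu_operation",  8),    # which operation the ALU performs
        ("source_a",      10),    # where the first operand comes from
        ("source_b",      10),    # where the second operand comes from
        ("destination",   10),    # where the result goes
        ("next_address",  12),    # which microinstruction runs next
        ("control_lines", 25),    # raw open/close signals for individual gates
    ]
    assert sum(width for _, width in HYPOTHETICAL_FIELDS) == 75

    def decode(microword):
        """Split a 75-character string of 0's and 1's into named portions."""
        fields, position = {}, 0
        for name, width in HYPOTHETICAL_FIELDS:
            fields[name] = microword[position:position + width]
            position += width
        return fields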

  Since I'd asked for it, the result of this one operation in my little program — the quotient of my division — returned through another complex process, at the end of which it appeared on the screen of Alsing's cathode-ray tube in the form of a decimal number. The entire journey, from the moment when I ordered that the program run, to the time when the answer to the division appeared, occurred in an imperceptible moment, no more noticeable than the time it takes for a light to go on after its switch is thrown. But the speed seemed less impressive than the sheer volume of action that had occurred. I had this small revelation: division was, after all, the name of something intricate.

  Some modern computers, most notably the machines of Seymour Cray, remain hard-wired; they respond directly to the electrical equivalent of assembly language. Microcode just makes the language more specific. Microcode is, in this sense, like early Old English, in which there was no word for fighting and a poet who wished to convey the idea of battle had to describe one.

  The chief advantage of microcode is flexibility, which accrues mainly to the builder of computers. If, for instance, unforeseen defects crop up after a machine has gone to market — and this almost always happens — the manufacturer can often repair them without changing printed-circuit boards. Often the microcode can be altered instead, at considerably smaller cost. Eagle would be designed to make such changes especially cheap and painless. The code would exist on a so-called floppy disk, like a 45-rpm record, not in unalterable ROMs inside the machine. Each morning, when starting up the computer, the operator would play the disk into the circuits of Eagle's microstorage compartment. If the code had to be changed, the engineers could merely change the floppy disk and send a new copy to customers.

  Writing microcode, however, is no simple task. The code is by definition intricate. To make the machine execute just one of its two hundred or three hundred basic instructions, the coder usually has to plan the passage of hundreds of signals through hundreds of gates. Limited storage space forces the coder to economize — to make one microinstruction accomplish more than one task, for example. At the same time, though, the coder must take care that one microinstruction does not foul up the performance of another.

  Alsing had written fat volumes of microcode, in an extremely quirky way. The Eclipse was to be Data General's first microcoded machine. Alsing signed up for the job — there was signing up in those days, too — and then he procrastinated. Month after month, his boss would ask him how the code was coming along and he would say: "Fine. A few problems, but pretty well." In fact, though, he had not yet written a single line of code. Finally, he could sense that his boss and some of his colleagues were growing angry; failure had become an almost palpable object — a pair of headlights coming toward him down the wrong side of a road. Scared, he packed up the necessary circuit diagrams, specs and manuals and went to the Boston Public Library.

  The Eclipse contained 195 assembly-language instructions, which in the end Alsing encoded in some 390 microinstructions, many of which performed multiple duties. He said he wrote most of those microinstructions in two weeks at the library. Perhaps it really took him less; West believed that Alsing did it all in two days and nights. "Without question he did," West insisted.

  It was always this way with Alsing. The summer before the Eagle project began, he was assigned to write the code for a new Eclipse. As usual, he stalled, and when he felt that he was about to get in trouble, he went home with an armful of books.

  He lived about fifteen miles from Data General, in a new ranch house. "My microporch," he said, showing me into a small screened porch. We looked out on a nearby grove of white pines and smelled the needles on the floor of the woods. The room contained a bright green outdoor carpet, an electric grill for barbecues, some uncomfortable-looking wrought iron chairs and a table with a glass top. That summer, he had asked his wife to keep his three sons away from there for a while, had deposited his manuals on the tabletop, and had started to think. Again, he did the entire job in a rush, and finished in about two weeks. "A quick hit," he said.

 
