
In the Beginning...Was the Command Line


by Neal Stephenson


  When I moved on to college, I did my computing in large, stifling rooms where scores of students would sit in front of slightly updated versions of the same machines and write computer programs: these used dot-matrix printing mechanisms, but were (from the computer’s point of view) identical to the old teletypes. By that point, computers were better at time-sharing—that is, mainframes were still mainframes, but they were better at communicating with a large number of terminals at once. Consequently, it was no longer necessary to use batch processing. Card readers were shoved out into hallways and boiler rooms, and batch processing became a nerds-only kind of thing, and consequently took on a certain eldritch flavor among those of us who even knew it existed. We were all off the batch, and on the command line, interface now—my very first shift in operating system paradigms, if only I’d known it.

  A huge stack of accordion-fold paper sat on the floor underneath each one of these glorified teletypes, and miles of paper shuddered through their platens. Almost all of this paper was thrown away or recycled without ever having been touched by ink—an ecological atrocity so glaring that those machines were soon replaced by video terminals—so-called glass teletypes—which were quieter and didn’t waste paper. Again, though, from the computer’s point of view, these were indistinguishable from World War II-era teletype machines. In effect we still used Victorian technology to communicate with computers until about 1984, when the Macintosh was introduced with its Graphical User Interface. Even after that, the command line continued to exist as an underlying stratum—a sort of brainstem reflex—of many modern computer systems all through the heyday of graphical user interfaces, or GUIs, as I will call them from now on.

  GUIs

  Now the first job that any coder needs to do when writing a new piece of software is to figure out how to take the information that is being worked with (in a graphics program, an image; in a spreadsheet, a grid of numbers) and turn it into a linear string of bytes. These strings of bytes are commonly called files or (somewhat more hiply) streams. They are to telegrams what modern humans are to Cro-Magnon man, which is to say, the same thing under a different name. All that you see on your computer screen—your Tomb Raider, your digitized voice mail messages, faxes, and word-processing documents written in thirty-seven different typefaces—is still, from the computer’s point of view, just like telegrams, except much longer and demanding of more arithmetic.
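
  To make this concrete, here is a minimal sketch in Python (my own illustration, not anything from the original essay; the grid and its layout are invented) of a spreadsheet-style grid of numbers being flattened into a linear string of bytes and then rebuilt:

    import struct

    # A tiny "spreadsheet": a grid of numbers, held in a convenient
    # two-dimensional shape while you work on it. (Invented data.)
    grid = [
        [3.14, 2.72],
        [1.41, 1.62],
    ]

    # To become a file or stream, the grid must be flattened into one
    # linear run of bytes: here, each number as 8 bytes, row by row.
    stream = b"".join(struct.pack("<d", cell) for row in grid for cell in row)
    print(len(stream))  # 32 bytes: four numbers, eight bytes apiece

    # Reading it back, the two-dimensional shape must be reconstructed
    # from nothing but the byte order the coder chose.
    values = [v for (v,) in struct.iter_unpack("<d", stream)]
    restored = [values[i:i + 2] for i in range(0, len(values), 2)]
    assert restored == grid

  Everything interesting about the file, the fact that it was ever a grid at all, lives in the code, not in the bytes.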

  The quickest way to get a taste of this is to fire up your web browser, visit a site on the Net, and then select the View/Document Source menu item. You will get a bunch of computer code that looks something like this:

  <!-- markup details are illustrative -->
  <HTML>
  <HEAD>
  <TITLE>C R Y P T O N O M I C O N</TITLE>
  </HEAD>
  <BODY BGCOLOR="#000000" TEXT="#FFFFFF">
  <CENTER>
  <H1>C R Y P T O N O M I C O N</H1>
  </CENTER>
  </BODY>
  </HTML>

  This crud is called HTML (HyperText Markup Language) and it is basically a very simple programming language instructing your web browser how to draw a page on a screen. Anyone can learn HTML and many people do. The important thing is that no matter what splendid multimedia web pages they might represent, HTML files are just telegrams.

  When Ronald Reagan was a radio announcer, he used to call baseball games that he did not physically attend by reading the terse descriptions that trickled in over the telegraph wire and were printed out on a paper tape. He would sit there, all by himself in a padded room with a microphone, and the paper tape would creep out of the machine and crawl over the palm of his hand printed with cryptic abbreviations. If the count went to three and two, Reagan would describe the scene as he saw it in his mind’s eye: “The brawny left-hander steps out of the batter’s box to wipe the sweat from his brow. The umpire steps forward to sweep the dirt from home plate,” and so on. When the cryptogram on the paper tape announced a base hit, he would whack the edge of the table with a pencil, creating a little sound effect, and describe the arc of the ball as if he could actually see it. His listeners, many of whom presumably thought that Reagan was actually at the ballpark watching the game, would reconstruct the scene in their minds according to his descriptions.

  This is exactly how the World Wide Web works: the HTML files are the pithy description on the paper tape, and your web browser is Ronald Reagan. The same is true of graphical user interfaces in general.
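
  A toy version of this act can be sketched in a few lines of Python (my own example; the markup is invented): the standard library’s HTMLParser sits in Reagan’s padded room, reads the terse tagged stream, and announces a page it has never seen:

    from html.parser import HTMLParser

    # The paper tape: a terse, tagged description coming over the wire.
    tape = "<h1>C R Y P T O N O M I C O N</h1><p>A novel by Neal Stephenson.</p>"

    # A toy "browser": it sees only tags and text, never a page,
    # and reconstructs the scene from the cryptic abbreviations.
    class ToyBrowser(HTMLParser):
        def handle_starttag(self, tag, attrs):
            if tag == "h1":
                print("=== ", end="")

        def handle_data(self, data):
            print(data)

    ToyBrowser().feed(tape)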

  So an OS is a stack of metaphors and abstractions that stands between you and the telegrams, embodying the various tricks the programmer used to convert the information you’re working with—be it images, e-mail messages, movies, or word-processing documents—into the necklaces of bytes that are the only things computers know how to work with. When we used actual telegraph equipment (teletypes) or their higher-tech substitutes (“glass teletypes,” or the MS-DOS command line) to work with our computers, we were very close to the bottom of that stack. When we use most modern operating systems, though, our interaction with the machine is heavily mediated. Everything we do is interpreted and translated time and again as it works its way down through all of the metaphors and abstractions.
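
  You can poke at the two ends of that stack from any scripting language. A minimal sketch in Python (the file name and contents are invented for the demonstration):

    # Make a small document to inspect. (Invented contents.)
    with open("memo.txt", "w", encoding="utf-8") as f:
        f.write("Dear sir,\nIt has come to my attention...\n")

    # Near the bottom of the stack: the necklace of bytes itself.
    with open("memo.txt", "rb") as f:
        raw = f.read()
    print(raw)  # b'Dear sir,\nIt has come to my attention...\n'

    # A layer up: the same bytes, mediated. The OS and runtime decode
    # them into characters and split them into "lines", a metaphor the
    # bytes themselves know nothing about.
    with open("memo.txt", "r", encoding="utf-8") as f:
        print(f.readlines())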

  The Macintosh OS was a revolution in both the good and bad senses of that word. Obviously it was true that command line interfaces were not for everyone, and that it would be a good thing to make computers more accessible to a less technical audience—if not for altruistic reasons, then because those sorts of people constituted an incomparably vaster market. It was clear that the Mac’s engineers saw a whole new country stretching out before them; you could almost hear them muttering, “Wow! We don’t have to be bound by files as linear streams of bytes anymore, vive la revolution, let’s see how far we can take this!” No command line interface was available on the Macintosh; you talked to it with the mouse, or not at all. This was a statement of sorts, a credential of revolutionary purity. It seemed that the designers of the Mac intended to sweep command line interfaces into the dustbin of history.

  My own personal love affair with the Macintosh began in the spring of 1984 in a computer store in Cedar Rapids, Iowa, when a friend of mine—coincidentally, the son of the MGB owner—showed me a Macintosh running MacPaint, the revolutionary drawing program. It ended in July of 1995 when I tried to save a big important file on my Macintosh PowerBook and instead of doing so, it annihilated the data so thoroughly that two different disk crash utility programs were unable to find any trace that it had ever existed. During the intervening ten years, I had a passion for the MacOS that seemed righteous and reasonable at the time but in retrospect strikes me as being exactly the same sort of goofy infatuation that my friend’s dad had with his car.

  The introduction of the Mac triggered a sort of holy war in the computer world. Were GUIs a brilliant design innovation that made computers more human-centered and therefore accessible to the masses, leading us toward an unprecedented revolution in human society, or an insulting bit of audiovisual gimcrackery dreamed up by flaky Bay Area hacker types that stripped computers of their power and flexibility and turned the noble and serious work of computing into a childish video game?

  This debate actually seems more interesting to me today than it did in the mid-1980s. But people more or less stopped debating it when Microsoft endorsed the idea of GUIs by coming out with the first Windows system. At this point, command-line partisans were relegated to the status of silly old grouches, and a new conflict was touched off: between users of MacOS and users of Windows.

  There was plenty to argue about. The first Macintoshes looked different from other PCs even when they were turned off: they consisted of one box containing both CPU (the part of the computer that does arithmetic on bits) and monitor screen. This was billed, at the time, as a philosophical statement of sorts: Apple wanted to make the personal computer into an appliance, like a toaster. But it also reflected the purely technical demands of running a graphical user interface. In a GUI machine, the chips that draw things on the screen have to be integrated with the computer’s central processing unit, or CPU, to a far greater extent than is the case with command line interfaces, which until recently didn’t even know that they weren’t just talking to teletypes.

  This distinction was of a technical and abstract nature, but it became clearer when the machine crashed. (It is commonly the case with technologies that you can get the best insight about how they work by watching them fail.) When everything went to hell and the CPU began spewing out random bits, the result, on a CLI machine, was lines and lines of perfectly formed but random characters on the screen—known to cognoscenti as “going Cyrillic.” But to the MacOS, the screen was not a teletype but a place to put graphics; the image on the screen was a bitmap, a literal rendering of the contents of a particular portion of the computer’s memory. When the computer crashed and wrote gibberish into the bitmap, the result was something that looked vaguely like static on a broken television set—a “snow crash.”
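
  The difference is easy to simulate in Python (a rough sketch; the sizes and the brightness ramp are my own inventions): take the same random garbage and interpret it first as a teletype would, as character codes, then as a bitmapped screen would, as pixel brightnesses:

    import os

    garbage = os.urandom(240)  # the random bits a dying CPU might spew

    # CLI machine: every byte is a character code, so the screen fills
    # with lines of perfectly formed but meaningless text ("going Cyrillic").
    print("".join(chr(32 + b % 95) for b in garbage))

    # GUI machine: the screen is a bitmap, a literal rendering of memory.
    # Map each byte to a brightness and the same garbage becomes static
    # (a "snow crash").
    ramp = " .:-=+*#%@"
    for row in range(6):
        chunk = garbage[row * 40:(row + 1) * 40]
        print("".join(ramp[b * len(ramp) // 256] for b in chunk))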

  And even after the introduction of Windows, the underlying differences endured; when a Windows machine got into trouble, the old command line interface would fall down over the GUI like an asbestos fire curtain sealing off the proscenium of a burning opera. When a Macintosh got into trouble, it presented you with a cartoon of a bomb, which was funny the first time you saw it.

  These were by no means superficial differences. The reversion of Windows to a CLI when it was in distress proved to Mac partisans that Windows was nothing more than a cheap facade, like a garish afghan flung over a rotted-out sofa. They were disturbed and annoyed by the sense that lurking underneath Windows’ ostensibly user-friendly interface was—literally—a subtext.

  For their part, Windows fans might have made the sour observation that all computers, even Macintoshes, were built on that same subtext, and that the refusal of Mac owners to admit that fact to themselves seemed to signal a willingness, almost an eagerness, to be duped.

  Anyway, a Macintosh had to switch individual bits in the memory chips on the video card, and it had to do it very fast and in arbitrarily complicated patterns. Nowadays this is cheap and easy, but in the technological regime that prevailed in the early 1980s, the only realistic way to do it was to build the motherboard (which contained the CPU) and the video system (which contained the memory that was mapped onto the screen) as a tightly integrated whole—hence the single, hermetically sealed case that made the Macintosh so distinctive.

  When Windows came out, it was conspicuous for its ugliness, and its current successors, Windows 95, 98, and Windows NT, are not things that people would pay money to look at either. Microsoft’s complete disregard for aesthetics gave all of us Mac-lovers plenty of opportunities to look down our noses at them. That Windows looked an awful lot like a direct ripoff of MacOS gave us a burning sense of moral outrage to go with it. Among people who really knew and appreciated computers (hackers, in Steven Levy’s nonpejorative sense of that word), and in a few other niches such as professional musicians, graphic artists, and schoolteachers, the Macintosh, for a while, was simply the computer. It was seen as not only a superb piece of engineering, but an embodiment of certain ideals about the use of technology to benefit mankind, while Windows was seen as both a pathetically clumsy imitation and a sinister world domination plot rolled into one. So, very early, a pattern had been established that endures to this day: people dislike Microsoft, which is okay; but they dislike it for reasons that are poorly considered, and in the end, self-defeating.

  CLASS STRUGGLE ON THE DESKTOP

  Now that the Third Rail has been firmly grasped, it is worth reviewing some basic facts here. Like any other publicly traded, for-profit corporation, Microsoft has, in effect, borrowed a bunch of money from some people (its stockholders) in order to be in the bit business. As an officer of that corporation, Bill Gates has only one responsibility, which is to maximize return on investment. He has done this incredibly well. Any actions taken in the world by Microsoft—any software released by them, for example—are basically epiphenomena, which can’t be interpreted or understood except insofar as they reflect Bill Gates’s execution of his one and only responsibility.

  It follows that if Microsoft sells goods that are aesthetically unappealing, or that don’t work very well, it does not mean that they are (respectively) philistines or half-wits. It is because Microsoft’s excellent management has figured out that they can make more money for their stockholders by releasing stuff with obvious, known imperfections than they can by making it beautiful or bug-free. This is annoying, but (in the end) not half so annoying as watching Apple inscrutably and relentlessly destroy itself.

  Hostility toward Microsoft is not difficult to find on the Net, and it blends two strains: resentful people who feel Microsoft is too powerful, and disdainful people who think it’s tacky. This is all strongly reminiscent of the heyday of Communism and Socialism, when the bourgeoisie were hated from both ends: by the proles, because they had all the money, and by the intelligentsia, because of their tendency to spend it on lawn ornaments. Microsoft is the very embodiment of modern high-tech prosperity—it is, in a word, bourgeois—and so it attracts all of the same gripes.

  The opening “splash screen” for Microsoft Word 6.0 summed it up pretty neatly: when you started up the program you were treated to a picture of an expensive enamel pen lying across a couple of sheets of fancy-looking handmade writing paper. It was obviously a bid to make the software look classy, and it might have worked for some, but it failed for me, because the pen was a ballpoint, and I’m a fountain pen man. If Apple had done it, they would’ve used a Mont Blanc fountain pen, or maybe a Chinese calligraphy brush. And I doubt that this was an accident. Recently I spent a while reinstalling Windows NT on one of my home computers, and many times had to double-click on the “Control Panel” icon. For reasons that are difficult to fathom, this icon consists of a picture of a clawhammer and a chisel or screwdriver resting on top of a file folder.

  These aesthetic gaffes give one an almost uncontrollable urge to make fun of Microsoft, but again, it is all beside the point—if Microsoft had done focus group testing of possible alternative graphics, they probably would have found that the average mid-level office worker associated fountain pens with effete upper management toffs and was more comfortable with ballpoints. Likewise, the regular guys, the balding dads of the world who probably bear the brunt of setting up and maintaining home computers, can probably relate best to a picture of a clawhammer—while perhaps harboring fantasies of taking a real one to their balky computers.

  This is the only way I can explain certain peculiar facts about the current market for operating systems, such as that ninety percent of all customers continue to buy station wagons off the Microsoft lot while free tanks are there for the taking, right across the street.

  A string of ones and zeroes was not a difficult thing for Bill Gates to distribute, once he’d thought of the idea. The hard part was selling it—reassuring customers that they were actually getting something in return for their money.

  Anyone who has ever bought a piece of software in a store has had the curiously deflating experience of taking the bright shrink-wrapped box home, tearing it open, finding that it’s ninety-five percent air, throwing away all the little cards, party favors, and bits of trash, and loading the disk into the computer. The end result (after you’ve lost the disk) is nothing except some images on a computer screen, and some capabilities that weren’t there before. Sometimes you don’t even have that—you have a string of error messages instead. But your money is definitely gone. Now we are almost accustomed to this, but twenty years ago it was a very dicey business proposition.

  Bill Gates made it work anyway. He didn’t make it work by selling the best software or offering the cheapest price. Instead he somehow got people to believe that they were receiving something valuable in exchange for their money. The streets of every city in the world are filled with those hulking, rattling station wagons. Anyone who doesn’t own one feels a little weird, and wonders, in spite of himself, whether it might not be time to cease resistance and buy one; anyone who does, feels confident that he has acquired some meaningful possession, even on those days when the vehicle is up on a lift in a repair shop.

  All of this is perfectly congruent with membership in the bourgeoisie, which is as much a mental as a material state. And it explains why Microsoft is regularly attacked, on the Net and elsewhere, from both sides. People who are inclined to feel poor and oppressed construe everything Microsoft does as some sinister Orwellian plot. People who like to think of themselves as intelligent and informed technology users are driven crazy by the clunkiness of Windows.

  Nothing is more annoying to sophisticated people than to see someone who is rich enough to know better being tacky—unless it is to realize, a moment later, that they probably know they are tacky and they simply don’t care and they are going to go on being tacky, and rich, and happy, forever. Microsoft therefore bears the same relationship to the Silicon Valley elite as the Beverly Hillbillies did to their fussy banker, Mr. Drysdale—who is irritated not so much by the fact that the Clampetts moved to his neighborhood as by the knowledge that when Jethro is seventy years old, he’s still going to be talking like a hillbilly and wearing bib overalls, and he’s still going to be a lot richer than Mr. Drysdale.

 
