The Crazy Years

by Spider Robinson


  Character Defects

  FIRST PRINTED MARCH 2000

  YOU KNOW HOW I FEEL about software. But I don’t just want to scrap Windows—and Mac, and BEOS, and Linux, and Unix—I don’t merely propose we rethink the whole “desktop and applications” metaphor, as I suggested in the last essay. No, I want to go back even further, back to fundamentals, and reform ASCII itself.

  ASCII, the American Standard Code for Information Interchange, is or should be something to be grateful for. Almost nothing about computers is standardized—but ASCII is, which is why it is (barely) possible for all of us here on the Tower of Babel to exchange data at all. It’s the set of characters that every computer recognizes, the ones you’re limited to in a text-only document or message, the ones sent to your computer’s printer if you select Fastest output, the ones most e-mails are made of.

  It’s wonderful that ASCII exists. It would be even better if it didn’t suck. As it is presently constituted, ASCII makes it enormously difficult for me to do my job. Thanks to ASCII, every time I write something my attention is constantly yanked away from what I’m writing, and forced onto how I’m writing it—sometimes two or three times in a single sentence…or even more.

  It just happened. In the last sentence of the previous paragraph, it happened three times. Did you notice?

  First I wanted to italicize the word “constantly.” If I were writing copy intended to be printed out on paper, such as a story manuscript, I would simply have italicized the word and kept on writing. But I’m writing copy that will be submitted electronically, as e-mail, a text-only document, so I’m restricted to the ASCII character set.

  And ASCII has no way to indicate italic characters. (Or boldface characters either, for that matter.)

  Sure, there are workarounds. That’s the problem: there are several.

  One of my editors likes italic text rendered in CAPITAL LETTERS, because that stands out and catches his eye. But another insists I denote italics in lowercase, but _with underlines before and after_—because that way she can convert my copy back into actual italics by simple global-replace, without having to retype all the italicized copy to get rid of the uppercase letters. Yet another editor prefers that I indicate italics with, of all things, *asterisks*.

  I could solve the problem just by developing a more phlegmatic style. Or, all the damn editors could get together and hammer out a standard method of indicating italics, and I could just learn it and stop thinking about it. (Until I switch back to working on my novel again…)

  But forget italics: ASCII doesn’t even have a complete set of a writer’s most basic tools: punctuation marks, for example. We don’t have many, and they’re crucial: our only way of giving the reader stage directions. ASCII—apparently created by people who’d never written anything more stylistically sophisticated than a user’s manual—lacks a couple of absolutely fundamental punctuation marks: the em-dash…and the ellipsis.

  When I’m writing a book, to be printed out and mailed to New York, I indicate an em-dash with an em-dash. (The term means literally, “a dash as wide as a letter m.”) But when I’m writing newspaper copy to be submitted by e-mail, thanks to ASCII I must indicate an em-dash with a pair of hyphens. If a linebreak happens to occur between those two hyphens, the intended em-dash is easily overlooked.

  (Some editors therefore ask that I put a space on either side of the double-hyphen -- like this -- quadrupling the number of keystrokes required. I plan to ignore these editors as long as possible.)

  When I’m writing a book, an ellipsis takes a single keystroke…and my word processor is smart enough not to break a line in the middle of the ellipsis. When writing e-mail or news copy, I must use three periods to indicate an ellipsis, tripling the number of keystrokes, and inevitably a line break will come just there.

  (One editor actually requested spaces around all the periods—sextupling the number of required keystrokes. I let her live…)

  Worst of all, I have to remember which convention to use. That means I can never put my full attention on what I’m writing: I must constantly remind myself what format I’m using at the moment.

  Imagine that a small change has been made in the driving laws. From now on, you must use only your right hand to drive—on streets whose names begin with A, C, E, G, I, K, M, O, Q, S, U, W and Y. On all other streets, you must drive with only your left hand. But during the last quarter of every hour, the rules reverse. Make a mistake and a policeman will stop you and lecture you, wasting your time.

  Now imagine that you spend all day on the road, every day…

  Accents? Don’t even think about accents. In the same sense that one might argue that a power structure is racist or sexist or classist, ASCII is clearly linguist. Its creators made no provision or accommodation whatsoever for speakers of any language but English. No French accents grave or virgule, no Hispanic inverted-question-mark or tailed-n, no Germanic umlaut, no Cyrillic “zh” character, no circumflex, not so much as a diphthong. I live in a bilingual country, where accents can be—have been!—important enough to start a riot. But I bet they won’t be in twenty years. Even the most rabidly separatist Québecois will probably have abandoned them, because they can’t be e-mailed: there just is no convenient way to render accents in ASCII.

  And can somebody explain to me why there are ASCII characters for “plus,” “minus” and “multiplied by”…but not for “divided by” or “the square root of”?

  For that matter, how can it be that in the universal world standard character set, there is no provision for either superscript or subscript? Could ASCII possibly have been standardized so geologically long ago that its creators had never heard of math or chemistry, exponents or isotopes, never envisioned anyone wanting to say “two to the fourth power” or “two hydrogen atoms and one oxygen atom” without having to switch to a graphics program? Without subscripts, how can you know whether by “U-235,” I mean an isotope of uranium or a Nazi submarine? ASCII can’t even directly express Robert A. Heinlein’s version of The Number of the Beast; it must be paraphrased as something like “six to the sixth-to-the-sixth power,” robbing it of its beauty.

  And as long as we’re going back to basics—I know this point is now moot, since floppy disks seem to have joined eight-track tapes and pagers on the trash heap of technological history, but doesn’t it tell you something that every single blank 3.5" floppy disk ever sold was manufactured upside down?

  Take a look at one. The blank label will almost certainly have text printed on it somewhere—the word “index” if nothing else. So will the little metal sliding door that protects the naked disk within: it’ll have the manufacturer’s name in big letters. But if you orient things so all that text is right-side up, you will find that the label itself is now down at the bottom end of the disk. Where, if you file your disks vertically like everyone else on the planet, you won’t be able to read it. And now all your disks will have that metal slide—the part that if a careless sleeve happened to accidentally swipe it open, all your data would be at risk—sticking up.

  Not being an idiot, you probably store your floppies in the caddy upside down so you can protect your data and actually read the furshlugginer labels. So the manufacturer’s logo on the slide is down at the bottom, both hard to see and upside-down: it can’t do much subliminal good as advertising unless you customarily shop standing on your head.

  How much faith would you have in a car that said adnoH on the trunk? The first time I ever so much as approached a computer, even before I was ready to switch it on, I contemplated that startup floppy…and wondered about the caliber of the minds to whom I was about to entrust my professional livelihood for the next few decades.

  Let’s start over and approach this business as though doing it intelligently were a good thing.

  Space

  Headline

  FIRST PRINTED APRIL 2001

  ON APRIL 12, 1961, Yuri Gagarin climbed into a deathtrap called Vostok, yelled “Poyekhali!”—“Let’s go!”—and was blasted into orbit. Vostok’s hatch blew off on the way down, and Major Gagarin had to eject. Having flown 200 miles high, he came down the last 23,000 feet by parachute. This was kept a state secret until recently, because under international rules, “a pilot must stay with his craft from takeoff to landing before any record is ratified,” and the Soviets feared technical disqualification. Ridiculous, of course: landing literally on his own two feet made his accomplishment more heroic. But one wonders if carrying that secret affected his foolish decision, seven years later, to become a test-pilot, like America’s astronauts. His plane augered in just months before Apollo 11 fulfilled the promise his historic flight had made.

  Fifty years ago the first animals were fired into space and recovered alive: one monkey and eleven mice. Forty years ago, the first human orbited Earth and returned alive, setting an altitude record of 200 miles. Apollo 8 raised that record by 240,000 miles; Bill Anders took the first photograph of the whole Earth from space and days later, in lunar orbit, also shot the first Earthrise ever seen—changing the perspective of the human race forever. Right after that, Neil Armstrong and Buzz Aldrin became the first to walk on another planet. Altogether, twenty-seven people, all American males, have made the half million mile roundtrip to the moon. They’re the only people since the dawn of time to actually know—from the evidence of their personal eyeballs—just how incredibly tiny, lonely and fragile our planet is.

  The moon has been lonely now forty years.

  There’ve been a hundred Shuttle flights. Mir became the first Soviet product in history to exceed its warranty, bless its scattered shards. But none of these went much higher than Gagarin did forty years ago: about two hundred miles. The crew of Apollo 17 were the last people ever to see the entire Earth at once…and by then, hardly anyone on it cared anymore.

  The one future that no science fiction writer of my generation ever dreamed came to pass: man actually did go to space, brushed his very fingertips against incalculable wealth, endless adventure and the first truly infinite frontier…then yawned and quit. The Apollo Program is one of very few things the US has ever spent money on that returned its investment, in hard cash—thirteen times over, so far. NASA will be happy to show you the figures. So much for “throwing money away in space when we have so many problems here at home.” Space is a better investment than real estate and oil combined.

  The smart money’s beginning to figure that out, and the tide has finally started to turn. In 1997, for the first time ever, there were more commercial launches than government ones. They generated revenues of $85 billion. Two thousand satellites will go up in the next ten years. NASA’s finally found an identity it can sell to Congress and is energetically exploring the solar system with unmanned probes, simpler and cheaper than spaceships. Priceless data is pouring in. We’re finally beginning to get a handle on how planets form.

  And how they’re destroyed. The current best theory for how the Moon got there is that it’s a chunk of Earth, blasted clear by some unimaginably violent collision. Think about that a moment. If we saw something coming today, all we could do about it is go two hundred miles up and shake our fists at it.

  My personal hopes rest not on governments or corporations, but on individuals: crackpots and dreamers. Right now a dozen small outfits are competing diligently to design cheap earth-to-orbit vehicles. And once you’re in orbit, you’re halfway to anywhere.

  NASA today represents adventure for robots; private space development is adventure for stockholders. I want adventure for ordinary people. As far as I know, except for a few amateur guitarists, no artists have ever been sent to space.

  But a new window of opportunity is about to open. Launch costs are dropping and will plummet shortly. Russia sold a slot in their space station crew for $20 million. Humanity can send up a dancer/poet/painter/ composer—or for that matter a priest/rabbi/mullah/imam—any time it wants to badly enough. Maybe government and industry can do without a new frontier, but art needs one desperately, and so does the human spirit.

  I propose that the next contribution to space exploration should be the arts. So far only jocks, geeks, congressmen and construction workers have been to space. Every one came back profoundly spiritually affected. Let us develop and launch a new generation of dancers, writers, composers and other Muse-chasers…and pray that through their creations they’ll be able to convey to the rest of the race that our mutual salvation, our destiny, free energy, unimaginable beauty and infinite possibility all hang just over our heads, waiting for us to evolve the wit to make ladders.

  And we’re running out of fossil fuels fast…

  “…still I persist in wondering”

  FIRST PRINTED JULY 2004

  “And still I persist in wondering, whether folly must always be our Nemesis…”

  —EDGAR PANGBORN, “MY BROTHER LEOPOLD”

  WHAT’S THAT WORD FOR AN ORGANISM too dumb to evolve any further, that engages in suicidally stupid behavior even in the face of irrefutable evidence that it won’t work, has never worked and never will work? Oh yeah: “human,” that’s the word I’m looking for.

  No matter how sophisticated, enlightened or even kind a society may painstakingly make itself—no matter how clever, fair or even wise the systems it may devise for the correct assignment and smooth transfer of power—it always seems to work out that sooner or later, leadership of the land somehow falls into the hands of a Major Bonehead. Doom follows.

  His exact nature can vary, but some characteristics are invariant. The Major Bonehead—well, that’s one right there: he’s always a belligerent militarist, so General Bonehead would seem more correct…except that he’s always totally ignorant of military matters, so Major is in fact more appropriate. He’s always a rabid isolationist who claims to be responding to his people’s demands for his full attention; they soon realize how much better off they were when his attention was elsewhere. He’s always convinced that everything bad is the fault of people who are smart and know things; naturally he drives them from the land.

  An excellent example of what I mean is discussed in Deputy Comment Editor Val Ross’s book The Road to There. In 1402 the brilliant eunuch Chen Ho conned the Ming Emperor Zhu Di into underwriting the most amazing fleet the world had ever seen—not just a lot of ships (300), and not just big ones (nine-masters, 120 meters long), but ships vastly superior to anything Europe would produce for centuries. They basically conquered the western Pacific as far as Australia and Vietnam, and brought home unimaginable wealth in the form of knowledge: new ideas, insights, technologies. Then Zhu Di died, and his successor, Major Bonehead…excuse me, Zhu Zhanji…declared, “I do not care for foreign things.” The construction of sea-going ships was forbidden on pain of death. Going to sea in a boat with more than one mast was deemed espionage. Most of Chen Ho’s priceless documents were burned, as “obvious exaggerations.” The government focused its efforts on helping the starving people, and China entered a Dark Age it has struggled cyclically to leave ever since.

  A similar phenomenon has occurred recently in the United States of America. For the past half century, with only brief and ineffective exceptions, its leaders have tended to be reactionary, isolationist, cheap and profoundly proud of their admittedly remarkable ignorance. The most recent avatar has pulled off the astounding feat of personally pissing away, in only two years, more international good will than the United States has had to waste since the period immediately following the end of World War II, disgracing and betraying the dead of the Twin Towers by beating, bombing and bullying utterly innocent people in their name—sublimely oblivious of the historical irony in the British Empire being the only major nation on earth to stand by them against the wicked freedom-hating French! I think it’s fairly clear—no, pellucidly clear—that the omphaloskeptic spirit of Zhu Zhanji has fallen over America like a funeral shroud in recent decades. Seal the borders, paint over the windows, suspend the civil rights of all noncitizens and quit wasting money on that scientific research crap—it only causes problems anyway. 9/11 could not have suited the present administration better if they’d invented it (he said subjunctively).

  Historians will spend centuries debating the exact point at which America began to lose it. For my money, the fateful moment came when Richard Nixon—as he spoke to Armstrong and Aldrin on the moon by phone, congratulating them for their magnificent achievement—used his other hand to gut NASA’s budget, for the unforgivable sin of having been thought up by a Democrat. It took years for the word to get out, but it was on that day that any hope of a meaningful American space program began to die a slow horrid death. Today, after thirty years in which even an accidental Democratic administration gave its space portfolio to Dan Quayle, America has—let’s face it—a space shuttle that doesn’t go anywhere useful, a space station that won’t do anything useful even if ever they finish it and a lot of blather about a Mars mission that we all know will happen shortly after biogeneticists produce a winged pig. Plus a bunch of spy satellites, which they basically inherited from the Eisenhower and Kennedy administrations, and robot probes.

  When China—clueless, backward China—successfully sent up its own first astronaut, only thirty years late, the general American reaction was, how quaint. Next, they’ll invent the electric guitar. Even more amusement was provoked by China’s straightforward admission that it has a most ambitious long-term space program. Not only does it envision a real, high-orbit space station, it openly plans a permanent settlement on the moon. As far as I can tell, not a single civilian flinched when that statement was made, and not a single commentator frowned.

  And now with all those other responsibilities on his plate, George W. Bush has suddenly remembered to mention that he, too, always meant to establish a permanent lunar base, a Bridge Between Worlds (he just forgot to mention it before), and by golly, what’s wrong with now? It isn’t as though he were running a half-trillion-dollar deficit or something…
