The Gentle Seduction


by Marc Stiegler


  In large information spaces, the maps themselves will have maps showing their interconnections, in a manner analogous to maps in an ordinary atlas. The best road maps will never be more than one button away, no matter where you are in the information space. They will always have a small dot inside, showing YOU ARE HERE. And these maps, unlike ordinary geographic maps, will support teleportation—if you see a place you'd like to go, just point, click, and prepare for landing.

  These map buttons, and all the buttons binding the information space together, will give the reader unprecedented control over his own exploration. Because the reader will have choices of what to read next, hypermedia literature will be interactive in a way that no current form of art can equal (with the exception of computer games, which have not yet been recognized as art).

  Earlier, I mentioned an analogy between hypermedia links and the suggested reading lists in the encyclopedia. Anyone who has run such a series of links knows that it can be fun, even with slow, finger-based indexing. But it can also be frustrating. Hypermedia will bring the fun back into learning again.

  Definition Of Singularity

  I hope I’ve succeeded in defining hypermedia. Our next topic, as you may recall from the road map at the beginning of this article, is the technological Singularity.

  “Singularity” is the term first used by Vernor Vinge7 to describe the result of an exponential increase in technological sophistication. As the rate of technological advance rises beyond the point where normal human beings can comprehend it, mankind will encounter problems and solutions that cannot even be understood, much less described, in today's context. The people who enter the epoch of the Singularity will find all their material needs fulfilled. They will be effectively immortal. Looking back from their present, they will consider the concerns of our generation (such as wars, environmental pollution, and bureaucracies) to be appalling yet quaint, just as we might view the concerns of stone-age hunters. By analogy, we, today, can understand the problems beyond the Singularity only to the extent that a prehistoric hunter can understand ours: how would you explain the idea of "cutting red tape" to a Cro-Magnon?

  Those of you who have been reading Analog's editorials and fact articles have encountered the idea of Singularity several times. One of the transformational upcoming developments is nanotechnology. With trillions of self-replicating nanorobots scattered through the solar system, we can build machines and products sufficient to fulfill all imaginable material desires (though we will surely imagine outrageous new material desires once nanotechnology starts pouring forth this cornucopia).

  Alas, there's one little problem: there are limits to growth in the rate of improvement of technology. People who predict exponential growth for systems are almost always wrong. In practice, systems tend to follow S curves: after a period of exponential growth, some limiting factor intervenes and constrains the growth so that it asymptotically approaches an upper bound (See Figure 6).

  For technological progress, the limiting factor is the human mind. As the velocity of change increases, we humans, who are developing that technology, spend increasingly more time just learning recent technology, leaving us less time to create even better technology (I myself read twenty or more magazines a month, but I churn out about four articles a year). As the complexity of the tasks increases, there will be fewer of us who can understand it well enough to make the next improvement.

  A nanotechnology spaceship factory will require the careful orchestration of thousands of kinds of nanorobots in a harmonious collusion. This problem is tantamount to building a complete ecosystem of organisms, with the constraint that the ecosystem not only sustain itself but also create a complex machine. Who among us— what thousand-man team among us—has the requisite set of skills to set up this extraordinary symphony?

  We know the answer in a vague way. "Computers," we wave our hands, "will augment our minds in constructing these systems." Yes, they will—but how? Word processors won't make the difference. Not even the sophisticated simulation tools used to design aircraft today can make the whole difference—we need to be able to design something worth simulating before we can check it out.

  Relationship Of Hypermedia and Singularity

  Hypermedia is part of the answer. Hypermedia will give us an indexing system that is over a hundred times faster than traditional indexes such as tables of contents.

  That suddenly sounds mundane: how big a deal is it to have an indexing system that is 100 times faster? Indexing, after all, is such a dull chore—of course it is, because it is so important to so many different activities.

  Is doing the same old thing 100 times faster a big deal? Let me propose the Magnitude Theorem, about the consequences of orders of magnitude of change: If a process becomes ten times faster or ten times cheaper or ten times better, it is not the same process. If a process becomes 100 times better, it is no longer even recognizable. Airplanes are rarely thought of as horses that are 100 times faster. We can best demonstrate the meaning of the Magnitude Theorem with respect to hypermedia indexing with an example.

  A Child Dying Of Adrenoleukodystrophy

  In November, 1987, Newsweek ran an article about a heroic couple. Their child had a very rare disease, adrenoleukodystrophy, known as ALD. The disease was characterized by the accumulation in the blood of very long-chain saturated fatty acids, known as VLCFAs. The VLCFAs attacked the nervous system, leading to death in a few years.

  ALD had no known cure; the doctors threw up their hands and went on to assist others whom they knew how to treat. Most parents would have thrown up their hands at that point as well. But this couple did not surrender so easily.

  They started their own research, and soon found that the scattered researchers on ALD had never met. So they convened a meeting of all the ALD researchers in the world. None of these men had any solutions either—at least, none that they could implement in less than ten years, using advanced genetic engineering. But one researcher had found, in test tube experiments, that a monounsaturated fat, oleic acid, reduced VLCFA production.

  So the couple gave up their jobs to pursue a cure for ALD. Their research became more intense, this time searching for a company that could produce oleic acid in a purified form.

  After finding a company that could manufacture oleic acid, after testing it for toxicity, they started giving their son oleic acid. It reduced the levels of VLCFAs—but not enough. The couple realized that, to make further progress, they needed to understand why oleic acid helped, so they could develop something even better.

  Again, research. With months of effort, including the finding of an article from a Polish medical journal, they developed a theory about oleic acid's success. Comparing and crosslinking accounts of animal experiment successes with the kinds of chemicals used in those experiments, it seemed that monounsaturated long-chain fatty acids monopolized the elongation process, blocking production of the toxic saturated VLCFAs.

  Research! Now they needed the longest-chain monounsaturate they could identify—the longest one that was not toxic. Erucic acid, from rapeseed oil, was a long chain indeed. But it caused heart disease in animals. More research! Animals, they learned, metabolize erucic acid differently from humans; no heart disease or any other problem in humans had ever been identified.

  And research. They had to find a company that could purify the oil sufficiently to make it useful. Again, after a long search, they found one.

  When at last they could treat their son with erucic acid, his VLCFA levels dropped to normal in three weeks. Unfortunately, it had taken years for the couple to complete the long search—the long cross-indexing of existing information—to find the cure. Their son was already in a coma. At the time of this writing, it was unclear whether he could recover.

  Is it obvious how hypermedia could have affected this effort? If databases on ALD, biochemistry, molecular structures, chemical manufacturers, and ongoing research activities had been interlinked in a hypermedia information space, the effort that took these people years could have been completed in a few months (See Figure 7 for a picture of a “Hypermall,” where future searches for such cures might begin).

  Those parents could have saved the life of their child. Even more incredibly, they could have saved the life of their child cheaply—without sacrificing their own lives to the effort. Their search for a cure could have been a modest activity, rather than a heroic event.

  Hypermedia information spaces could open up a breathtaking alternative for those of us faced with seemingly insurmountable problems: if no one else has a cure, OK, I'll invent one! If no one else has a device, OK, I'll invent one! With hypermedia information spaces at our disposal, our ability to keep up with the technology explosion will itself explode.

  Next Steps in Hypermedia Development

  Putting all the world's data into an information space would be a huge undertaking—just digitizing it would be an enormous task, and beyond that lies the effort of putting in the crosslinks, the hypermedia buttons. Putting the world's knowledge into hypermedia might become the titanic yet vital project for the information age that the transcontinental railroad was for the industrial age. Building the information space will give us the same increase in speed and power for moving information that the railroad gave us for moving material goods.

  Several organizations are working toward the building of a global information space, albeit slowly. Apple Computer is probably the leader in the use of hypermedia on personal workstations, having introduced HyperCard, the most-raved-about hypermedia product in history. At its first public presentation, it received a standing ovation from the audience.

  Both Apple and Microsoft, the two principal drivers of personal computer technology, have made major commitments to the optical storage devices needed to inexpensively store hypermedia databases. At last year's Comdex, Kodak displayed an optical disk juke box that could store half a terabyte of information, enough to store a century of Scientific Americans, 400 times over.

  The rise of digital information standards, such as PostScript and SGML, will reduce the agonizing costs now incurred by anyone trying to collect large blocks of data from diverse sources. These standards were not designed as data formats for hypermedia information, but their widespread adoption will nonetheless help by creating a smaller set of formats from which conversion will be necessary.

  Researchers at IRIS, the Institute for Research in Information and Scholarship at Brown University, have built curriculum materials for English and biology in their own hypermedia system, Intermedia, with more to follow. Key goals include building easy-to-use tools for creating information spaces (called webs in Intermedia) and allowing the information spaces to grow without bound (see figure 8).8

  Perhaps the most visionary hypermedia undertaking is the Xanadu project, started by Ted Nelson (the same Ted Nelson who coined the term hypermedia in the first place).9 The Xanadu project is developing a hypertext publishing network capable of interlinking millions of documents for thousands of users. Xanadu is also building several examples of front ends, or user interface software, to this information space for personal computers such as the Macintosh and the IBM PC; the long-term plan is to support third-party front end developers.

  Xanadu incorporates many significant features beyond the basic hypermedia concept. Xanadu will maintain version control of all the documents in its information space. Links to one version of a document are also present in all other versions (as long as any of the linked data is still present). Thus the reader may trace the evolution of a concept. It also allows the original author to update and correct his work, based on the comments and criticisms others have leveled at his document (and which have been attached to his document by later readers).

  The basic links in a Xanadu information space are two-way, i.e., when a link is installed, it puts a button at both ends, allowing the reader to go in either direction (which is considerably different from HyperCard and Guide). Thus when an author inserts backward references to earlier works, the system automatically creates forward references. This will fulfill the scholar's greatest fantasy, giving him a bibliography that lists not only the material that predates an article, but also all the works created later (several years ago, Analog published a story about a thiotimoline-operated typewriter. This typewriter could print material from the future, offering a similar forward referencing capability; the idea was hysterically funny because it was so self-evidently impossible. I would have added a reference here to the issue of Analog that has the story—but the effort to find it is overwhelming, until we get Analog into hypermedia).
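  The two-way linking described above can be sketched as a simple symmetric adjacency structure. This is only an illustration of the idea, not Xanadu's actual design; the class, method, and document names below are invented for the example.

```python
class InfoSpace:
    """A toy information space with Xanadu-style two-way links (illustrative only)."""

    def __init__(self):
        # document id -> set of document ids it is linked with
        self.links = {}

    def add_link(self, doc_a, doc_b):
        # Installing one link places a "button" at both ends,
        # so the reader can travel in either direction.
        self.links.setdefault(doc_a, set()).add(doc_b)
        self.links.setdefault(doc_b, set()).add(doc_a)

    def neighbors(self, doc):
        return self.links.get(doc, set())


space = InfoSpace()
# An author cites an earlier work; the earlier work automatically
# gains a "forward reference" pointing at the new article.
space.add_link("new-article", "classic-paper-1969")
forward_refs = space.neighbors("classic-paper-1969")
```

  Because the structure is symmetric, the scholar's forward bibliography falls out for free: asking for the neighbors of the older document yields every later work that linked to it.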

  Xanadu even has a reasonable answer to the question, "How does the author get paid?" The creators of Xanadu database material (and anyone can be a creator here) will receive royalties based on the number of times their material is accessed; the reader will be charged based on the number of kilobytes of data he reads.
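  As a rough illustration of this payment model, the accounting could work as sketched below. The rates, function names, and author names are all invented for the example; the source describes only the principle (authors earn per access, readers pay per kilobyte).

```python
# Assumed rates, for illustration only.
ROYALTY_PER_ACCESS = 0.002   # paid to the author each time material is accessed
CHARGE_PER_KILOBYTE = 0.001  # charged to the reader per kilobyte read


def settle(access_log):
    """access_log: list of (author, kilobytes_read) pairs for one reader's session.

    Returns (royalties owed per author, total bill for the reader).
    """
    royalties, reader_bill = {}, 0.0
    for author, kilobytes in access_log:
        royalties[author] = royalties.get(author, 0.0) + ROYALTY_PER_ACCESS
        reader_bill += kilobytes * CHARGE_PER_KILOBYTE
    return royalties, reader_bill


royalties, bill = settle([("nelson", 42.0), ("stiegler", 10.0), ("nelson", 5.0)])
```

  Note that the two sides of the ledger are decoupled: the author is paid per access regardless of document size, while the reader's bill scales with how much data was actually delivered.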

  Even the Library of Congress is exploring the application of optical media in its quest for self-improvement. Anyone who has ever attempted to use the Library will appreciate their sense of urgency: the card catalog is not a boxful of index card racks, it is a series of rooms, full of boxes full of index card racks. A subject such as "Advertising" sprawls across half a dozen racks. Stoic is the researcher who selects a handful of books from that mammoth collection, necessarily at random, then waits several hours for retrieval—only to find that these weren't quite the books he had in mind.

  As information spaces like the Library of Congress get linked up, new commercial enterprises will arise that blend a bit of the editor's role, the publisher's role, and the reviewer's role. How will the average reader separate the wheat from the chaff? Part of the answer will be that respected reviewers and editors will construct link-sets that point out all the documents that they thought were excellent.

  Other value-added retailers will build unique, cross-pollinating link sets that highlight the interrelationships between items with no visible connection. Harmonic oscillators from physics have applications in fields from molecular biology to cosmology; a unilateral pull-out of Soviet forces from Europe several years ago, heralded by some news people as an overture of peace, turned out to be a preparatory step for the invasion of Afghanistan a few months later—long after everyone had forgotten the connection.

  A link-set spanning just the history of the United States might save us from the great danger to technology that I alluded to earlier: the danger that, as our ability to process paper increases, bureaucrats will increase the amount of paper. One set of buttons that I am personally eager to insert into an American history information space is a set of links connecting governmental regulations with the consequences of those regulations—all of those consequences. In the early days of railroads, short-haul passengers felt outrage that the railroads charged almost as much for short local runs as they charged to go the long distance from New York to Chicago. These angry citizens put the railroads under government regulation, and this fixed the problem: the long distance fares were increased.10 This might sound like a strange fluke—but the same thing happened when government took regulatory control of the airlines. The future is all too predictable for those who remember the past—for those who have a rich set of interconnections showing the relations between those past events.

  Perhaps easily accessible linkages, reiterating these relationships between laws and consequences, would help Americans to understand their vital role as cultural engineers. With such an understanding, our interaction with bureaucracies such as the government could become more rational. We could make institutions more effective—or we could intentionally make them less effective, based on a deeper understanding of effective government.

  Just this one clear articulation of the relationship between people and institutions could pay for the entire effort of building our information space. Who knows where we might go from there?

  Author's Note: Needless to say, this document was first drafted in hypermedia, then translated to linear form.

  References

  1Literary Machines, Theodor Nelson, Project Xanadu, 1987. This book discusses hypertext from the perspective of hypertext's originator, and as such is as close to a bible as one can get in the field. A hypertext version of Literary Machines is now available from Owl International.

  2Guide is a product of Owl International. For further information contact Ed Taylor or Jamie Welch at (800) 344-9737, or write to Owl International, 14218 NE 21st St., Bellevue, WA 98007.

  3HyperCard is a trademark of Apple Computer, Inc.

  4David's Sling, Marc Stiegler, Baen Books, 1988.

  5The Elements of Style, Strunk and White, Macmillan Publishing Company, 1979.

 
