TechGnosis


by Erik Davis


  Given the aura that surrounds such discoveries, the timing of Nag Hammadi’s unexpected blast from the past has led some myth-minded moderns to suspect that something more than happenstance was afoot. After all, history too has its poetic logic; apparently random accidents can strike deep chords of synchronicity, especially once those events are played through the organ of the mind, with its constant search for harmony and melody. In the words of June Singer, a contemporary Jungian gnostic of sorts, “What a coincidence, what a meaningful coincidence, that those Egyptian peasants stumbled upon that jar just at the end of the Second World War, after the Holocaust and after the dropping of the atomic bombs on Hiroshima and Nagasaki.”1 Singer points out that the Nag Hammadi codices themselves tell us to pay attention to the timing of their return to the world. The tractate known as the Gospel of the Egyptians claims:

  The great Seth wrote this book with letters in one hundred and thirty years. He placed it in the mountain that is called Charaxio, in order that, at the end of the times and eras … it may come forth and reveal this incorruptible, holy race of the great savior.2

  Now, 1945 was not exactly the end of times and eras, though one can forgive the citizens of Hiroshima, Nagasaki, and Dresden for thinking otherwise. But the atomic bomb was destined to inflict a world-rending dread on postwar life, and its solar powers were shrouded from the beginning with apocalyptic imagery. Immediately following that summer’s first Trinity test blast, in the New Mexican wasteland known as the Jornada del Muerto, Robert Oppenheimer recalled a quotation from the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” In the next decades, many feared that a cataclysmic incandescence was only a red phone call away, though few expected a savior any more holy than the tense stalemate of detente.

  Despite the danger that wayward nuclear weapons still pose today, the mushroom cloud has mostly evaporated in our imaginations, dissipating into a more amorphous apocalyptic atmosphere laced with airborne viruses, biological weapons, toxic fumes, and greenhouse gases. With this in mind, we might even say that the most world-shaking explosion in the 1940s was not atomic but informational. When Marshall McLuhan perversely described the atomic bomb as “information,” he probably was testing out one of his patented rhetorical shocks. But he may have glimpsed a deeper revelation as well. For if the information age was born in the electric nineteenth century, and nurtured in the first decades of the radio-crazed twentieth, World War II marked its glorious coming of age.

  This rite of passage was certainly not without its nightmares, especially when it came to the electronic media’s increasing ability to mesmerize hearts and minds. Technology critics who fear the power of mass media thought control still point to the German fascists, whose culture industry engineered a dark consensus reality with fiendish acumen. In the words of Albert Speer, the showman behind the Third Reich’s Nuremberg pep rallies,

  Hitler’s dictatorship was the first dictatorship of an industrial state in this age of modern technology, a dictatorship which employed to perfection the instruments of technology to dominate its own people … 80 million persons could be made subject to the will of one individual.3

  Besides staging megawatt mass spectacles, the Nazi propagandists exploited the sonorous immediacy of the radio with sorcerous brilliance, allowing Hitler to, as he himself put it, make his way with the ease of a somnambulist. To fight the Axis powers, the Allies also exploited new information technologies to the max. In both theaters of war, radar played a pivotal, if often overlooked, role, with microwaves giving the Allies a distinct tactical edge toward the end of the war, especially in coordinating D-day and the bombing raids on Germany. The war also saw the creation of the Z3, the world’s first programmable digital computer, invented in 1941 by an ardent Nazi and used to design some of Germany’s flying bombs. Secret codes were cranked out on both sides of the barbed-wire fence, and in Britain, Alan Turing used some of the earliest digital computers to unscramble German Enigma messages. Such efforts also fired up the burners for the kind of code paranoia that would come to typify postwar espionage, as civilian censors among the Allies, fearing the propagation of encrypted information, went so far as to rearrange stamps on outgoing letters, ban crossword puzzles and toddler drawings from the mail, and in one case spin the dials on an entire shipment of watches to scramble any possible hidden messages.

  Following 1945, the war’s intense electronic development found its way into civilian life, especially in the United States. ENIAC, the first electronic programmable computer, made its U.S. debut in 1946, stirring the public imagination with the “electronic genius” of its “superbrain.” A few years later, Bell Labs’ revolutionary transistor started replacing the vacuum tubes previously used in computers and other electronic devices, initiating the spiral of miniaturization and circuit-board complexity that has led us today to the realms of nanotechnology and quantum computing. In the late 1940s, theoretical developments like information theory and cybernetics laid the groundwork for new forms of information-driven social organization, while consumer culture kicked into electric overdrive. The first generation of media mutants was born, baby boomers destined to grow up in the first modern suburbs, soak up the first commercial television broadcasts, and blow their minds and turn global culture inside out when they eventually got their gadget-happy hands on electric guitars, Marx, and LSD, whose psycho-shamanistic powers were first uncorked at a Swiss pharmaceutical corporation in 1943.

  But what does this explosion of information culture and electronic media have to do with a stack of Coptic religious texts crumbling in a jar in upper Egypt? Obviously, an incalculable historical, cultural, and spiritual divide exists between the mystical aspirations of ancient dualists and the cultures and concepts that would come to surround information and its technologies in the twentieth century. But from a hermetic perspective, which reads images and synchronicities at least as deeply as facts, the mythic structures and psychology of Gnosticism seem strangely resonant with the digital Zeitgeist and its paradigm of information. As we’ll see, Gnostic myth anticipates the more extreme dreams of today’s mechanistic mutants and transhuman cowboys, especially their libertarian drive toward freedom and self-divinization, and their dualistic rejection of matter for the incorporeal possibilities of mind. Gnostic lore also provides a mythic key for the kind of infomania and conspiratorial thinking that comes to haunt the postwar world, with its terror of nefarious cabals, narcotic technologies, and invisible messengers of deception.

  Gnosis forms one of the principal threads in the strange and magnificent tapestry of Western esotericism, and I must emphasize that my use of its lore is not intended to belittle its possibly illuminating powers. Hermetic scholars or occult traditionalists would write off any similarities between Gnostic religion and contemporary technoculture as, at best, the latter’s demonic and infantile parody of the former. But the authenticity of spiritual ideas and religious experiences does not really concern me here; rather I am attempting to understand the often unconscious metaphysics of information culture by looking at it through the archetypal lens of religious and esoteric myth. Inauthentic or not, these patterns of thought and experience have played and continue to play a role in how humans relate to technology, and especially the technologies of information. But before we crack open the techgnostic jar and let its speculative genies loose, it seems important to wrestle a bit with the concept of information itself, that strange new angel that lends its name to the age.

  The Mythinformation Age

  Information gathering defines civilization as much as food gathering defines the nomadic cultures that preceded the rise of urban communities, agricultural surplus, and stratified social hierarchies. From the moment the first scribe took up a reed and scratched a database into the cool clay of Sumer, information flow has been an instrument of human power and control—religious as well as economic and political. It is hardly accidental that the first real writing machine emerges hand in hand with urban civilization, nor that the technology was initially devoted to recording the transfer of goods into the hands of priests.

  But it wasn’t until the twentieth century that information became a thing in itself. People began to devote themselves more and more to collecting, analyzing, transmitting, selling, and using the stuff. Even more significantly, they built machines to automate and perform these tasks with a level of power and efficiency far beyond the builders themselves, and this information combustion fueled the expanding apparatus of science, commerce, and communications. In many people’s minds, what was once merely a category of knowledge began to mutate into a new unit of reality itself, one that took its place alongside matter and energy as one of the fundamental building blocks of the cosmos. If electricity is the soul of the modern age, information is its spirit.

  In the simplest everyday terms, “information” suggests a practical chunk of reified experience, a unit of sense lodged on the hierarchy of knowledge somewhere between data and report. Though an essentially incorporeal and “mental” element, information nonetheless seems to derive from the external physical world, tightly bound to mundane materials like newsprint or a thermometer or sound waves emerging from a herald’s mouth. Information emerges in the spark gap between mind and matter. In the middle of the twentieth century, scientifically rigorous definitions of the stuff began to appear, definitions that were destined to invade biology, social science, and popular culture, thereby transforming our understanding of ourselves and our social institutions. Computers brought the logical machinery of data processing into everyday life, while new communication technologies wove human beings into a global web of messages and signals.

  Inevitably, information became one of those concepts whose meaning expands even as it begins to evaporate. You could fill a million DustBusters with the fuzzy thinking that “information” has produced, especially as the technical term collided with social and cultural forms of knowledge. At the same time, the constantly shifting borderlines around the term have lent the concept an incorporeal mystique; despite its erstwhile objectivity, information has become an almost luminescent icon, at once fetish and Logos. Straddling mind and matter, science and psyche, hard drives and DNA, information has come to spawn philosophies both half-baked and profound, while also reconstructing, perhaps dangerously, our images of the self and its cosmic home. Gnosticism is hardly the only passageway into the storehouse of archetypes lurking beneath the secular mask of information, but it underscores the metaphysical patterns and Promethean fire that the new category of reality unleashed into the postwar mind.

  In the late 1940s, a Bell Labs researcher named Claude Shannon announced the birth of information theory, an abstract technical analysis of messages and communication. Shannon’s exacting description of information, initially embraced by scientists and engineers, planted the seeds of the concept’s later flowering. The theoretical tools that Shannon created apply to any scenario in which a message is passed from a sender to a receiver along a communication channel—in principle, they can describe a conversation in a barroom, the replication of genetic material, or an episode of Mad Men bounced off a wheezy satellite into millions of monitors across the planet. For the heroic message to reach its goal, it must survive the onslaught of “noise”—the chance fluctuations, interference, and transmission errors that inevitably degrade signals as they make their way through an error-ridden and analog world. The popular kids’ game of telephone—where a whispered phrase is passed mouth to ear through a circle of people, a process that inevitably mutates the message—provides a good playground image for the semantic drift of such signal degradation; the interruptions that plague mobile phones furnish a more visceral taste of noise in all its cranky glory.

  In the face of this formidable foe, Shannon’s celebrated second theorem proved that any message can be coded in such a way that its chances of surviving the journey through the valley of noise come as close to certainty as we care to make them. The only limitation that needs to be factored into the equation is the natural carrying capacity of the channel—that is, its bandwidth. Shannon did not provide the “ideal code” of his second theorem—dubbed the holy grail of information theory—but he did show that such nearly perfect communication was technically possible. More generally, his theory showed that the integrity of messages can be maintained by translating them into digital codes of varying degrees of complexity, redundancy, and bandwidth-sapping accuracy. Messages are not sent unalloyed but are embedded within additional information—the equivalent of decoder rings, say, or data that allow the recipient to know that the message received is really the proper one. This additional, or “meta,” information relies heavily on redundancy, a kind of repetition that ensures that the message will prevail even if noise takes a meaty bite out of it along the way.
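
  To make the redundancy gambit concrete, here is a minimal sketch in Python of the crudest possible code, a simple repetition scheme: each bit is sent five times and the receiver takes a majority vote, so the message usually limps through a noisy channel intact. This is nowhere near Shannon’s elusive ideal code, and the function names and the ten percent flip probability are illustrative assumptions rather than anything drawn from his papers.

    import random

    def noisy_channel(bits, flip_prob=0.1):
        # Flip each bit with probability flip_prob, simulating channel noise.
        return [bit ^ 1 if random.random() < flip_prob else bit for bit in bits]

    def encode_repetition(bits, copies=5):
        # Crude redundancy: repeat every bit several times before sending.
        return [bit for bit in bits for _ in range(copies)]

    def decode_repetition(received, copies=5):
        # Majority vote over each group of copies recovers the original bit
        # unless noise has corrupted most of the group.
        return [int(sum(received[i:i + copies]) > copies // 2)
                for i in range(0, len(received), copies)]

    message = [1, 0, 1, 1, 0, 0, 1, 0]
    unprotected = noisy_channel(message)
    protected = decode_repetition(noisy_channel(encode_repetition(message)))
    print("sent       :", message)
    print("unprotected:", unprotected)   # often arrives with flipped bits
    print("protected  :", protected)     # usually arrives intact

  The price, of course, is bandwidth: five bits on the wire for every bit of message, which is precisely the trade-off between accuracy and carrying capacity that Shannon’s theory quantifies.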

  All this was great news for Shannon’s employers, who were fruitfully multiplying telephone lines across the land and applying wartime communications know-how to civilian life. But like the sciences of complexity and chaos theory today, information theory also became a Big Idea, one that people in many disciplines hoped would revise and clarify the known world. Once information received an abstract and universal form, it somehow became more real—not just a turn of phrase or a squiggle on some Bell Labs blackboard, but a force in the world, an objective yet essentially mind-like material that could help explicate any number of seemingly unrelated phenomena by boiling them down to the crisp binary unit of the bit.

  So in the 1950s and 1960s, social scientists, psychologists, biologists, corporate managers, and media organizations began reimagining and reorganizing their fields of expertise with information theory in mind. Shannon’s nuts-and-bolts picture of signal and noise, sender and receiver, started shaping the culture at large. The paradigm of information began to invade humanist discourses, promising to efficiently clean up all sorts of messy problems concerning language, learning, thought, and social behavior—all of which could now be seen as depending on more or less efficient systems of information processing. The budding technocracy of postwar society seemed to have found its lingua franca: an objective, utilitarian, and computational language of control with which to master the carnival of human being.

  All of this set information on a collision course with meaning—that signifying magic that, for all the analyses of linguists, sociologists, and cognitive scientists, remains one of the trickiest, most seductive, and most consternating glyphs in the human equation. Meaning is at once the mundane foundation of the mind’s trivial pursuits and the inspiration for our most intimate, creative, and spiritual quests. But meaning, even strictly linguistic meaning, is notoriously slippery stuff. Though the attempt to reconceive meaning under the abstract sign of information is vital for the technology of communication, the absolute dominance of the information model may well exact a withering cost. Information theory is fine and good if you are talking about radio transponders, telephone lines, and drive-through kiosks at Taco Bell, but its universal application saps the marrow from the rich lifeworld of meanings that humans actually inhabit—a world whose nuanced ambiguities are better captured by, say, Shakespeare’s soliloquies and Yoruban myth than by statistical algorithms. As the technology critic Theodore Roszak puts it, “for the information theorist, it does not matter if we are transmitting a fact, a judgment, a shallow cliché, a deep teaching, a sublime truth, or a nasty obscenity.”4 But today many people confuse information and meaning, which leads to a rather disturbing paradox: our society has come to place an enormous value on information even though information itself can tell us nothing about value.

  But let’s be fair. If you have had the pleasure of downloading crystal-clear images of Martian real estate through little copper wires into your home computer, you probably recognize that dodging the briar patch of value judgment and semantic ambiguity has its technical advantages. Besides, the information paradigm does provide a number of powerful ways to think about what we mean by meaning. To start with, information seems to have something to do with novelty. For you to provide me with genuine information, you must tell me something new. That is, information requires a degree of uncertainty on the part of the receiver. If you are so predictable that nothing you tell me is a surprise, then nothing you say is really information, even if the signal is crystal clear. On the other hand, for me to understand you in the first place, you need to be somewhat predictable—which is why much of the language we blurt out or text is made up of redundancy, a thick wad of repeated cues and familiar syntactical rules that themselves signify little at all. This structural redundancy ensures that not too much novelty occurs, because such wide degrees of freedom might lead us into the chaos of a schizophrenic’s word salad, or the interminable ambiguities and connotations of Finnegans Wake.
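
  Shannon’s measure of this surprise, entropy, the average number of bits of novelty per symbol, can be sketched in a few lines of Python under the simplifying (and admittedly un-Joycean) assumption that each character arrives independently of its neighbors. The sample strings and the name entropy_bits are illustrative only.

    from collections import Counter
    from math import log2

    def entropy_bits(text):
        # Average surprise per character, in bits, under a simple
        # symbol-frequency model of the source.
        counts = Counter(text)
        total = len(text)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    predictable = "aaaaaaaaaaaaaaab"            # almost pure redundancy
    varied = "the quick brown fox jumps"        # more symbols, more surprise

    print(round(entropy_bits(predictable), 2))  # about 0.34 bits per character
    print(round(entropy_bits(varied), 2))       # about 4.16 bits per character

  A string of nearly identical characters carries almost no information in this sense, no matter how cleanly it is transmitted, while the scruffier English sentence carries several bits per character; a wholly predictable speaker, however lucid, tells us nothing new.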

  Communicating information is not simply a matter of cramming data into an envelope and sending it off; information is also something constructed by the receiver. In this sense, an element of “subjectivity” eventually enters into any communications circuit, because the question of how much information is received depends in part on how the receiver (which may be purely mechanical) is primed to parse the incoming message and code. To explain the role that receivers play in processing information, the science writer Jeremy Campbell uses the example of three students listening to an economics professor, only two of whom know English, and only one of whom actually studies economics. For the non–English speaker, the noises spilling out of the old fellow’s mouth are so uncertain, so unpredictable, that no information gets through. By virtue of shared language alone, the English speakers both receive more information, but the future mutual fund manager reaps the most, because his foreknowledge of economics concepts and jargon makes the professor’s data-dump even more predictable, but still surprising enough to generate novel differences.
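
  A toy rendering in Python, with invented word probabilities and a crude “parse threshold” standing in for the ability to parse the code at all, captures the shape of Campbell’s classroom: the unprimed listener extracts nothing, the English speaker a good deal, and the hypothetical economics student the most, not because each word startles him more but because he can parse all of them. None of the numbers come from Campbell; they are assumptions chosen only to make the ordering visible.

    from math import log2

    PARSE_THRESHOLD = 0.001   # words a listener barely expects at all are sheer noise

    def information_received(message, model):
        # A listener extracts surprise only from words she can parse; words her
        # model has never seen, or treats as wildly improbable, contribute nothing.
        bits = 0.0
        for word in message.split():
            p = model.get(word, 0.0)
            if p >= PARSE_THRESHOLD:
                bits += -log2(p)
        return bits

    lecture = "price elasticity makes marginal revenue fall when demand is inelastic"

    # Three hypothetical receivers, with made-up expectations about the lecture's words.
    non_english_speaker = {}   # parses nothing: every word is noise
    english_speaker = {"price": 0.01, "makes": 0.05, "revenue": 0.005,
                       "fall": 0.02, "when": 0.1, "demand": 0.01, "is": 0.2}
    economics_student = {"price": 0.05, "elasticity": 0.05, "makes": 0.05,
                         "marginal": 0.05, "revenue": 0.05, "fall": 0.05,
                         "when": 0.1, "demand": 0.05, "is": 0.2,
                         "inelastic": 0.02}

    for name, model in [("non-English speaker", non_english_speaker),
                        ("English speaker", english_speaker),
                        ("economics student", economics_student)]:
        print(name, round(information_received(lecture, model), 1), "bits")

  Run as written, the sketch reports zero bits for the first listener, roughly 36.5 for the second, and roughly 41.5 for the third; the figures mean nothing in themselves, but the ordering is the point, and the collapse of unparsable speech into silence is the formula’s blunt version of noise.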

 
