It was a late spring afternoon, almost twilight. We were on a northbound streetcar passing through the middle of an intersection when a westbound streetcar blindsided us. It was quite an impact. At first I didn’t know what had happened. My daughter and I were sitting at the back, and there was an incredible percussion as the whole streetcar lurched to one side. It was such a large noise that it was somehow strangely muted, more like a shape than a sound. When I look back now, I think my hearing must have been affected by my adrenaline cutting in, because that’s when time really slowed down. What happened next took place in lucid slow motion.
The impact of the westbound streetcar knocked us off the rails and sent our streetcar sliding sideways up the street. Trailing glass, sparks and metal, we skidded towards an oncoming southbound streetcar that was already too close to stop. My window faced the driver of the approaching streetcar. His eyes met mine, and an acknowledgement of some sort passed between us. Then he broke off his gaze. I could see the tendons in his arms as he furiously, but agonizingly slowly, pumped some kind of hand-brake lever. I could also see the lights of the advertisements in his streetcar and the stoic, clenched looks on the passengers’ faces.
Then he hit us. It was a tremendous collision, and the entire view swung away. Everyone in our streetcar who hadn’t already been knocked out of their seats went flying across the aisles. I remember seeing the safety glass erupt into the interior like floating mist, and there was a kind of smell, not of burning, but a sharp, high smell, almost chemical. All of this took place in deafening silence. Then, like a film resuming normal speed, everything started happening in real time, and I was surrounded by noise. People moaning, crying, the tinkle of glass still crumbling out of the windows and the hiss of air escaping from a broken pneumatic brake.
Fortunately my daughter and I were unhurt, though a day or so afterwards I noticed two bruises on my hips, one on each side. My adrenaline must have completely anesthetized me. I’ve read reports of others who’ve had similar experiences of slow-motion time during accidents or competitive sports, so it seems we are able to experience accelerated time for brief periods. But we cannot experience anything like what high-speed film can capture now—bullets slowly plunging through apples, hovering hummingbirds flapping their wings at the speed of a crow.
Photography opened up a frontier of perception that has entertained, astonished and informed us like nothing else. With new eyes we look into a world of time that was once shut away from us. If I were to set up a time-lapse camera in my backyard, my garden would be almost unrecognizable. The rising stems of the peonies would writhe and extend like leafy fingers reaching blindly out of the soil. My tulips would flip open each morning and snap shut at night. The tendrils of the clematis would grope like octopus tentacles as they climbed the trellis beside the patio. There would be purpose here, for at this speed the difference between plant and animal disappears.
MEDIA TIME NOW
The modern world abridges all historical times as readily as it reduces space. Everywhere and every age have become here and now. History has been abolished by our new media.
—Marshall McLuhan, Understanding Media
Here, at the beginning of a new millennium, time is just another medium, one of many vying for our attention. Clock signals are electronically relayed around the world; they course through our computers, televisions, radios and telephones. The globe is connected in a single, precise instant that ticks inexorably onwards wherever we are.
We know this, and we know our biological clock ticks onwards too, yet we sometimes act as if we don’t really believe it, and we seem to believe it less and less every decade. Why? Because we are the inheritors of a century of time manipulation in films, television, pop music (with its sampled repetitions) and telecommunications. We call people in different time zones, so the sun can be shining brightly through our windows while we speak to someone watching the moon rise through theirs. Time has become at once both complex and infinitely malleable.
Live international television broadcasts and jets have shrunk the world, as the adage has it. Distance isn’t what it used to be. But they have also shrunk time, at least time in the older, grander sense. Our instantaneous, multi-tasking workplaces allow us to fit more work into shorter periods of time, and as we become increasingly wireless our private realm shrinks accordingly; everyone is connected to everyone else all the time. This comes with a price, for our bodies have their own sense of time, which arose out of the cycles of the natural world, from the alternation of night and day. Personal retreat, distance and privacy are becoming scarce, and there is hardly any time to reflect, to meditate, to drift—natural states of mind that allow us the freedom to disengage from the world. Disengagement has become a luxury. Taking your time has become a recreational indulgence.
On television the flow of time is constantly interrupted by commercials that are paragons of time condensation. How do you sell your product in thirty seconds? Film techniques of editing and narrative compression are honed to their essence here. Recently it seems that the flow of time itself is the theme of some television ads. The ability to digitize films and separate the various action components of a scene and then play them at different speeds is now a cliché in automobile commercials. Cars cruise at normal speed under time-lapse clouds, or winter turns to spring in the wake of a sleek sedan. How advertisers capture our viewing time for their own ends has turned into the central struggle of our temporal economy, and as communications media fuse with entertainment media, the tussle feels increasingly intimate, even invasive.
THE GREAT ATTRACTOR
The future ain’t what it used to be.
—Yogi Berra
We are all passengers of “now”; this is our common bond. If we were suddenly transported to another century, we would be temporally and culturally marooned, wandering like orphans through an alien sensibility. If we came upon other time travellers from our own time, we would greet them like brothers or sisters, no matter what country they were from, because the bond of “now” is probably even stronger than nationality. But our “now” is a Hollywood concoction, the result of a century of time manipulation by various media, and it stretches from the Jurassic era to the distant future.
Time-based art forms, particularly television and cinema, have brought representations of all periods—past, present and future—into our timeless present. Here, in McLuhan’s “end of history,” all fashions, all cultures, all stories exist at once. This puts us in a special situation, a paradoxical one, where the present, with its idiomatic sayings, dress codes and assumptions, is advantaged over not just the past but the future as well. The evolution of fashion, if such a thing could be said to exist, is now stymied by a future that we’ve already visited. There is nowhere to go.
The future, at least as it looks in science fiction films, is either inhabited by the kind of people who attend Star Trek conventions or is a high-tech showroom where villainous scientists battle oppressed heroes in a dysfunctional society. Everyone seems to be dressed in vaguely utilitarian designer clothes, like discards from Pierre Cardin’s abstract-modernist line of the 1960s. It is certainly no place for the urbane sophisticates of today’s popular culture. For them, the future is only slightly more repellent than the earnestness of the recent past. But within that recent past, the last thirty years or so, there is something that pop culture keeps circling back to, as if there were a kind of cultural black hole, a great attractor that has warped time around it. The present can’t seem to get beyond it, can’t reach escape velocity.
The only way today’s fashionistas are able to ward off the painful earnestness of the past and yet still rummage through it for “new” looks is with special talismans, the irony of “retro” accessories. This irony lets them sample the past without stigma while appropriating its enduring charisma. Perhaps an anxiety lurks behind our timeless present, a sort of claustrophobia. It may be unconscious, but it is irrevocably there, its hum underscoring our restless nostalgia. We are trapped somehow. And the gravitational time-centre of our nostalgia seems to be located somewhere in the late sixties, even for those born decades later. Like a massive stellar object, its tremendous “retro” gravity sucks the future back into it, halting the forward progression of cultural time. But if the great attractor is a lost paradise that we can watch, that we can mimic, we still cannot inhabit it, even if we dress the part.
What was it about that period that is so compelling now? Was it people’s unquestioning faith in the future? Was it the technology? Certainly, the American manned space program, including six moon landings between 1969 and 1972, was a technological high-water mark that has dwarfed anything since. And then there were the regular supersonic passenger flights of the Concorde, which, like the last remnant of a glorious age, outlasted the peak of Western culture by two decades. This period was also marked by unprecedented liberalism, from topless bathing suits to love-ins. The only living vestiges of our once-promised future—where lunar colonies and civilian space travel would have been commonplace—are the Internet, personal computers, digital toys, cellphones and next year’s car designs. And even car designers have succumbed to the retro impulse with the resurrection of the Volkswagen Beetle, Mini Cooper and Ford Thunderbird.
Perhaps history hasn’t really ended, as McLuhan pronounced, but it has certainly stalled. When nostalgia erupts so persistently into the present, the forward movement of history, particularly a history without a future to pull it forward, is arrested. As Herman Melville wrote, “The poor old Past, The Future’s slave,” and in the normal state of affairs the future does completely determine the past. All the past can do is freeze the gestures made by the future as it touches the present. But in the present cultural era, which began sometime in the late 1980s, when “retro” really burst forth in earnest with period-mixing films like Tim Burton’s Batman (in which the technology and design mixed elements of the twenties, forties and eighties), both the past and the future seem to be slaves to the present. Meanwhile, the present is looking “elsewhen.”
The late sixties were a technological high-water mark, but they stand as a microcosm of the previous high points of human achievement: the Roman Empire, the Egyptian and Mayan civilizations. It’s almost as if, as a species, we have a fear of success. But then, over time, civilizations rise and fall, like waves. So perhaps we will again rise to the engineering heights attained in the late sixties, will again have manned missions to the moon and supersonic passenger jets. But until then we at least have our digital media, and the chronosphere that they create.
The timeless “now” staged by celebrity culture is further underscored by the unnatural agelessness of its stars. Because of plastic surgery, they seem to occupy a null space in the flow of years. They are timeless beacons, Dorian Grays. George Hamilton (preserved with a perpetual tan like a Danish peat-bog mummy) once quipped that his peeling sunburn revealed a tan from 1955. When “ageless” stars finally flame out, the spectacle, as chronicled in tabloids, is mesmerizing. It is like watching a vampire impaled with a wooden stake. They pass from youth to old age in the span of a few years.
Of course, we’ve been exposed to nested time periods—at least historic ones—for thousands of years, when you consider the architecture of cities. Any large city has old and new buildings side-by-side—Art Deco beside Mies van der Rohe, Louis Sullivan beside Frank Gehry—and in older European cities these architectural periods can be hundreds of years apart, medieval churches beside Bauhaus office buildings. Rome, already known as the “eternal city,” is one of the most extraordinary examples of different time periods existing cheek by jowl. It is a chronopolis, an archipelago of eras, so that a walk through Rome is a kind of random time travel, taking visitors and residents forwards and backwards willy-nilly, thousands of years at once. The Pantheon has stood intact, like an architectural living fossil, for millennia. It has presided over the rise and ruin of the Colosseum, outlasted empires and religions. Modern office towers soar beside it, and under the greatest palace of Rome, the Palazzo Venezia, lie Etruscan tombs.
REAL TIME AND TIME LAGS
In a media world of unreal time—where daily talk shows and the evening news are pre-taped, where Star Trek rubs shoulders with documentary footage from the First World War—broadcasts that are genuinely live have an instant cachet. The Olympics, live sports programs, unfolding disasters, car chases, all draw huge audiences. But a few years ago, another, more complicated kind of live broadcast arose—“real time”—and two-way video conferencing was one of the first examples. Video conferencing purported to take place “in real time” because a computer processed the signals so quickly that there was no delay between sender and receiver, something that had not been possible earlier. The result was a two-way visual phone conversation. Nothing radical in that, or so it seemed. But the fact that a computer was working furiously to translate, send and then untranslate the digital signals underlay the whole process. It wasn’t like video surveillance, which might also be said to take place in real time, but which merely reflects reality; there is no processing of the signal. Real time is synonymous with a kind of hyper-time, where electronic circuits race furiously just to stay in the present moment. It is as if the higher the price paid to be here and now, the more privileged the present moment.
The term “real time” was readily adopted and assigned to all sorts of things—for instance, the stock market crawlers that tick away at the bottom of your screen, or the “real-time four-wheel-drive” feature of some sport/utility vehicles (SUVs), which automatically switches from two- to four-wheel drive when you go off-road. Real time means that we are moving so fast we have caught up to the present—which waits for no one, save the fastest.
So “real time” is a commodity, something you can buy if you have enough money; those without it are marooned in what must be “unreal time,” where computers aren’t fast enough and cars don’t have automatic asymmetrical 4WD. (In one sense, “real time” is reminiscent of “quality time,” those hours that busy people snatch between appointments and jobs to be with their children.) On the other hand, television’s version of “real time” is trickier, particularly in live broadcasts. Sometimes people do or say things that networks would prefer they didn’t, and so broadcasts have a built-in time delay of several seconds that allows producers to hit a “dump” button and delete the offending sequence. We don’t notice a built-in time lag the way we notice the delay of an audio feed on a live television report from halfway around the world, where reporters blankly wait for the anchor’s next question. Yet there are other built-in time lags—one military and the other biological—that are almost completely invisible.
In the command-and-control situation room of modern battleships, a wall-mounted screen shows real-time positions of enemy boats, planes and missiles. During the Falkland Islands war in 1982, H.M.S. Sheffield’s radar defence shield was down temporarily while a satellite phone call was made back to Fleet Headquarters in England. When the radar screen came back online, it showed two enemy planes, thirty-three kilometres away—much too close for comfort, and, as it turned out, too late for response. Exocet guided missiles were already skimming towards H.M.S. Sheffield, just above the waves, at the speed of sound. There was not enough time to launch electronic countermeasures. All the officers could do was helplessly watch as the incoming blips tracked closer and closer.
But there was a further wrinkle. Because missiles and jets move quickly, most situation screens for both land and naval battle zones have built into them a computerized time delay of slightly more than 0.2 seconds. This delay counteracts the minimum human reaction time. The delay had been built into the command-and-control screen of Sheffield, which meant that—like a sinister video game—the officers saw the missile hit the target on the screen a moment before the ship itself exploded into flame.
Our 0.2-second reaction time is itself affected by a deeper neurological phenomenon, one that some neurologists believe is proof of an immaterial soul. A few decades ago, a neurophysiologist by the name of Benjamin Libet discovered a perplexing paradox about the nervous system and the brain. During brain operations, when patients were conscious and parts of their cortex were exposed, Libet conducted experiments on how long it took for a pain impulse to reach the cortex, and then how long it took for the patient to report the pain. With the patient’s consent he would prick the skin and measure when the incoming impulses hit the part of the brain that represented that area.
A pain impulse takes only 15/1,000 of a second to reach the cortex, but the brain takes half a second to translate that signal into conscious awareness. You can see the spikes on an oscillograph wired directly into the brain; it’s hard science. Yet all patients reported the sensation after only 0.2 seconds had passed. Libet was stumped. How could you be consciously aware of something before your brain had delivered the information to you? He hypothesized that there must be some sort of perceptual mechanism that antedated the signal in time. But what was this “perceptual mechanism”? The question became a famous bone of contention among neurologists, and continues to be so up to the present day. How is it possible for the brain to “jump time”?
The famous neurologist John C. Eccles said he had the answer. He believed that Libet’s “perceptual mechanism” was evidence of an immaterial, self-conscious mind, one that scanned active modules in the brain, getting the jump on the slower processing of the cortex. Other brain processes, he noted, indicated the presence of something that wasn’t the product of neurons and electrical-chemical impulses. So in his opinion, the evidence for an immaterial mind was just too overwhelming. In a conversation he had with the philosopher Karl Popper in 1974, he said that he was “constrained to believe there is what we might call a supernatural origin of my unique self-conscious mind.” The mind time-travels. Continuously, every moment we are alive, our consciousness is time-lapsed by a tiny, mysterious, 0.2-second journey back in time. Our ability to be in the present moment is more—much more—than it seems to be.
Soul of the World