The experts disagree; that is their business. So sometimes firsthand accounts are useful. Here is a sixteen-year-old, writing to the New York Times in 2010 in reaction to the article “Growing Up Digital, Wired for Distraction”:
Cynicism aside: Teenagers are dysfunctional. Anything remotely educational or not associated with “Gossip Girl,” Facebook or the like is seen as pointless—when it’s actually quite the reverse. Dependency on these technological media will result in a shallow and socially inept generation.
The teenage brain has evolved into a vestige—an appendix of sorts. Read. Draw. Go to the museum. Do something that’s not completely mindless. School has stigmatized learning. Of course we’re going to shut off our brains and rot in front of a computer—it’s just easier. YouTubing, watching “Glee,” playing Xbox? These are passive activities with easily attainable yet meaningless highs.
Just a kid? The film critic Antonia Quirke (I quoted her on Jaws) is old enough to be that young woman’s mother, and she sent me this e-mail reporting on the sudden riots that hit north London in August 2011 and that seemed to be organized by cell-phone connections:
About 100 children (some as young as 9) in an estate directly over the canal all gathered on Monday night at dusk and started yelling and screaming and photographing each other (this fixation with commemoration freaks me out most of all. I was up at Hampstead ladies pond the other day when a group of 16-year-olds were all photographing each other ALL day. Every action was merely a setup for a snap, every sensation a pose) and it got louder and louder, like the drums in King Kong, it was truly AWFUL. I called the police who told me, “It’s like world war 2 out there tonight. Don’t leave the boat” [Quirke lives on a canal barge] and so I didn’t. Then they all ran off and joined the riots down the road. Sudden and total silence.
It was John Berger who first noted that while a photograph seemed to summon presence, it also evoked absence. The base function of film and its successors is not just to join us to reality but to adjust us to the screens that keep us from it. Watching television footage of the devastation in Japan in March 2011, I saw an endless shot of empty automobiles carried in tidy, prim reverse on the flood, backing and turning corners in a multistory parking garage. I was horrified and helpless, but I thought it was comic. I wondered if it was a scene from a Jacques Tati film. When my youngest son saw the second plane enter the World Trade Center tower on live television, he asked, “What movie is that from, Dad?”
The film form has always been fascinated by the screen, and the way it converted “reality” or the prospect of it into a set (a place where some defined action would occur). In the same way, the makers of film have been challenged by the thing they call off-screen space. The thought of getting “into” the screen from our off-screen space has always been alluring, just as infants believe there must be real people behind or within the screen, while older children want to enter its flat world because the lovely, lifelike dream seems to be there and “on.” Buster Keaton played with the fancy in Sherlock Jr. (1924), Godard in Les Carabiniers (1963), and Woody Allen in The Purple Rose of Cairo (1985). But there are other ways of doing it: the past, the flashback, can be an interior or inset existence, so tempting but elusive. In Alf Sjöberg’s Miss Julie (1951), the past and the present share the frame. We feel that overlap in Citizen Kane, and we see it in an array of optical effects—wipes, fades, and dissolves (not common now)—that remind us of the existence of the screen. The best thing 3-D can do is say: look, it’s all a trick on one screen. From the early days of sound (which was rich with off-screen suggestions), movie was drawn to another inner flatness, the mirror, as a double of its own existence and an unconscious way of saying, “Look at the show. Look at yourself. Are you there? Are you ‘on’?”
In François Truffaut’s 1969 picture, Mississippi Mermaid, Jean-Paul Belmondo has been tricked into marriage by Catherine Deneuve, who pretends to be his mail-order bride invited to the faraway island of Réunion. He knows she’s “wrong,” but she is so lovely that he falls for her, or into her—so enigmatic and yet so potent, she is a screen that slides into his life, like a blade. Then she vanishes, with the money from his tobacco plantation. She is his movie, and he wants to run it again just as most of us, when young, carry a movie’s dream home in our head and start to play it again, with variants and extensions.
So he goes back from Réunion (a dot in the Indian Ocean) to France, looking for her. He has a breakdown and one day, while he is in the hospital with other patients, there she is, Deneuve, on a screen, a television screen, dancing in a nightclub. So he pursues her and confronts her, and straightaway, like an actress who had the lines ready, she confesses and tells him a rigmarole of explanation while she sits in front of her mirror and the film presents her as a double image. He doesn’t need to hear it. He loves her still, so he believes her even while he can see the warning in the looking glass behind the real woman.
You could go back over the history of the movies and make an anthology of such moments, where the form of a film is saying, “Can you trust what you’re seeing?” But in 1998 one picture caught the whole duplicity. The Truman Show was so complete an insight that if it had been the first film you ever saw you might never have needed to see another.
Truman Burbank (Jim Carrey) lives in Seahaven, a hideous display of perfection, a perpetuity of advertising. The buildings are new, bright, flawless, and impersonal. The sun shines as steadily as lighting. The people are amiable; they smile. And the production didn’t have to build this warning of a place, as Fritz Lang built Metropolis. They found it, in the designed community of Seaside, Florida, an “ideal” place to live, they say. Truman seems happy there. He is cheery with everyone, although Jim Carrey’s nervy good nature never wipes away the thought that he might snap and turn into…the Grinch, Stanley Ipkiss (The Mask, 1994), The Cable Guy (1996), or worse. Carrey has always seemed unstable with repressed possibilities.
But Truman begins to wonder what is happening. How is it that other people seem always to be in the same place every day, uttering the same banal lines? Perhaps this has struck you about your life, too, until you wonder whether each new day is just one more take, a way of getting it right, or keeping it from going wrong, life as a loop, a way not to worry. “You have a good day!” the extras insist. Truman hears talk on his car radio: it seems to be a crew that is watching him, making him the center of attention, the subject of a show.
Indeed, there is a thing called The Truman Show, a television reality show, 24/7 before that phrase was common, and it’s just Truman’s life: having breakfast, going to work, chatting with his wife, Meryl (Laura Linney), sleeping. It’s a bland, safe life and a parody of all the positive, high-key “we’ll-sort-it-out” television shows of the 1950s and ’60s. It’s also a payoff for any thought you ever had that the shows didn’t just serve the commercials; they were helpless extensions of them.
Andrew Niccol had written the script on spec, without a deal or any money. Once it was bought (by the producer Scott Rudin), and once Peter Weir was established as director, Niccol did many rewrites, attempting to get the tone Weir wanted. The storyline could have turned dark or horrific; it might have been a new version of Invasion of the Body Snatchers—for this bland Seahaven is emptiness occupying us. Weir preferred to keep a balance between desire and dread; he wanted us to be uncertain at most moments whether we were amused or afraid.
I can’t rid myself of simultaneous feelings of admiration and loathing for Christof (Ed Harris), the director of the show or its auteur. He’s a master at what he does, and we like good directors, don’t we? Yet it’s just as clear that the movie he’s making, the Show, is ghastly, dead, and so conformist as to be a complacent, residential fascism with Seahaven as the new model camp.
Like a young hero, Truman determines to escape. The big movie of his life (the death of his father at sea) has given him the jitters over taking to the water. But he is brave. He gets a small boat and sails away from Seahaven, until he reaches that place where the sea meets the great dome protecting and imprisoning The Truman Show. It is a diorama, an effect, but it is a screen—as in a screen that masks something as well as showing us. It is one of the most expressive moments in the history of film, and you must see the movie to find out what happens next.
There is another naked use of screens within the obvious movie screen: back projection. It was an industrial and economic convenience: a background image could be projected on a sound stage behind actors who tried to give the impression of being in the real location. One standard use of this device was for scenes with a person in a car. The actor pretended to drive while sitting in a mock-up of a vehicle as the road unwound behind him. Fans could blow a breeze on him—the sight of wind is one of the loveliest things.
Back projection declined as a habit as audiences became more critical, or as our need for credibility in the dream proved nagging. But Alfred Hitchcock persisted with it later than anyone. It’s not just Janet Leigh in the car in Psycho, but also the detective, Arbogast, falling backward on the staircase in the same film. People said, “Oh, that’s just Hitch. He liked to keep everything under control. He preferred being in a studio. He was too large to get out and about.” He was also an intuitive genius who felt the dislocation of back projection as a model for our separation and unease. Seen today, his use of it begins to look avant-garde or daring.
Back projection had a variant, the traveling matte, in which a second image was married to a first. For Gone With the Wind, the artist Jack Cosgrove did elaborate background paintings on glass that were bonded with the foreground action. The trick there was meticulous and unnoticed; it was part of the illusion. But tastes have changed. When Michael Powell and Emeric Pressburger made Black Narcissus (1947), instead of going to the Himalayas (where the story was set), they built exquisite sets with painted-glass perspectives to represent them. (Alfred Junge was the art director.) When the film was released, Powell cried out proudly, “You can’t tell we’re not there!” Perhaps in 1947 the eye was deceived. If you look at the same film today, you know it is fake, but it is the artifice that is most appealing. Nowadays we always know we’re not there.
The neurotic urge persists in filmmaking—to have it fool the eye—but the eye has become more numb and less confident. Who can forget the theatrical aplomb of Lars von Trier putting the town of Dogville (2003) on a marked-up stage, with a diorama sky behind it? Wasn’t the folding up of whole blocks of Paris in Christopher Nolan’s Inception (2010) a landmark in surrealism’s dream?
Inception is a rare kind of picture nowadays. It’s filled with technology and emotion. It cost about $160 million and it grossed over $800 million. It won several Oscars for craftwork, and was nominated for Best Picture, though not for Nolan’s direction. In those areas it lost to The King’s Speech, by Tom Hooper, which cost $15 million and grossed about $138 million. The King’s Speech was a pleasant picture about an ordinary king dealing with a small personal problem. It was entertainingly acted and expertly dressed and it had the veneer of an independent picture that carried reassurance and no threat. It could have been made in 1937–38, and so anyone fond of the movies of that moment took heart and comfort. Inception, on the other hand, wanted to introduce marvels we had not seen before in the service of a brand-new, half-wry yet disturbing portrait of the mind at work, awake, in dreams, or approaching death with dread and desire. It was an attempt at a mainstream movie, but it was drawn to neurology and dream, and one day it will loom over The King’s Speech as much as Kiss Me Deadly (1955) now seems modern and poisonous while Marty (Best Picture in 1955) is hard to sit through.
Of course, this is esoterica now. Not even the secure place of film in the universities can persuade us that these differences of opinion matter. This may be the place to admit, and apologize for, the people and films omitted or shortchanged in this book: Michael Haneke, Krzysztof Kieślowski, Satyajit Ray, Abbas Kiarostami, Chen Kaige, Zhang Yimou, Wong Kar-wai, Andrei Tarkovsky, Aleksandr Sokurov, Madonna, and so many others you can add. On the other hand, I made space for Muybridge, I Love Lucy, television as a whole, the money and the deals, pornography and video games, the cell phone, streaming, and all the things that make up the shapes on our screens. This is a history of a larger process than “cinema is everything,” and it is a book about worrying over the general impact of moving imagery and our becoming more removed from or helpless about reality. Mark Zuckerberg is very optimistic about Facebook and the other social media of the last twenty years. He believes they spread openness and connectivity, and does not seem to notice the havoc that has come in the same era to our understanding, our economy, our community, and our belief that we can sustain ourselves.
So trying not to worry is a great goal, but deciding not to worry may be a larger mistake encouraged by our screens. Looking at video games and pornography for this book, as I had not done before, was not comforting. But in the twenty-first century, I have had exceptional experiences with a range of pictures that would include Faithless (2000, made by Liv Ullmann from an Ingmar Bergman script), Moulin Rouge (2001), Mulholland Dr. (2001), The Piano Teacher (2001), The Best of Youth (2003), Birth (2004), A History of Violence (2005), The Lives of Others (2006), No Country for Old Men (2007), There Will Be Blood (2007), Red Riding (2009), The Way Back (2010), The Arbor (2010), A Separation (2011), and Sarah Polley’s Stories We Tell (2012).
We can see most of those tonight, or in nights to come, along with My Man Godfrey (1936), Midnight (1939), and Keeper of the Flame (1942)—three films that have not been mentioned so far. (In Godfrey, William Powell says, “The only difference between a man and a derelict is a job.” Unemployment is a deficit disorder, too.) No matter how small the screen has become, it is big enough to hold all our films. But its montage grows more profuse and uncontainable all the time. One hundred billion hours in 2025? So it’s a game, just like awarding the Oscars, to propose the best films ever made.
In 2002 the London movie magazine Sight & Sound held its critics’ poll on the best films. Since 1952 the magazine has run this poll at ten-year intervals; it is probably the most illustrious of top tens. And in 2002 the results were as follows: Citizen Kane (the top film since 1962), Vertigo, Renoir’s La Règle du Jeu, the first two parts of The Godfather, Ozu’s Tokyo Story, 2001, Battleship Potemkin, Sunrise, 8½, and Singin’ in the Rain. That is still only a collection of opinions against which we can nurse our disagreements. But what is notable is that the newest film in the top ten went as far back as 1974, The Godfather: Part II.
Another poll is set for 2012, and I will be surprised if Kane is dislodged—after all, it is a terrific film (as the RKO ads said in 1941), and it is not easy to believe it has been overshadowed. But isn’t movie made by and for young people? Shouldn’t there be something more recent, more now? Or is insisting on Kane just another funeral observation? Orson’s youth is still dazzling and intimidating, but why is it unsurpassed?
Cinephilia and cinema must look after themselves, but we may have a harder time of it. I mean the mass society that was first identified through novelties such as electric light, photography, and film shows, and by world war, genocide, indoctrination, and that ultimate huddling effect, poverty. It was enchanting in those first decades to see that film was a natural extension of photography and graphic arts, of theater, musical performance, and literature. It was harder to perceive that the deeper significance of a mass medium was to occupy most of the people enough of the time, to reconcile them in hardship, and to let them feel the great crowd was a “togetherness,” a pact and a purpose (as opposed to a basis for impersonality, fear, and loneliness). That process reached its climax in the Second World War, when—whether it looked like Hope and Crosby or agonizing newsreel—the movie screen was helping us feel a common purpose and the hope of its victory.
The victory of the Second World War was not shared universally. Some countries were defeated and devastated. Some of the principal battlegrounds remained unfree though their threatened liberty had prompted war. Chill pressures of imminent destruction and endemic cruelty dominated the cultural atmosphere. The number of displaced persons was beyond care or solution. But the victory was widely identified as American (especially in American eyes), and it reinforced the widespread feeling that the movies were American, so that Hollywood was briefly a cultural center. That attitude was often contested. Nations wanted their own pictures, and many put out great work, but the public at large abided by the notion that so many climatic conditions of film—space, light, music, glamour, romance, suspense, being good-looking—were American tricks.
No one quite admitted that other ingredient, fantasy, and its constant struggle with the factual nature of film. As American victory hardened into empire, so the uninhibited element of fantasy fed into the American soul. It bred several dangerous fallacies: that happiness was an American right; that individuals could be free in a mass society; that American power would endure because the United States was the greatest of all nations. All those foolish principles are endangered now, and the steady process of watching a glorious but unattainable reality has warped our judgment and made us bitter. That is why the transition from movies to commercials was so disastrous and why the adoption of advertising in American television has been as damaging as the reluctance to have a welfare state that can sustain a mass society. These problems exist in much of the rest of the world, because the inspiration and the lies of Hollywood have gone far afield. But the crisis is sharpest in America, which has now moved so swiftly from confidence to its opposite.