Movies and Meaning - Pearson New International Edition


by Stephen Prince


  In contrast to the profusion of sound detail in contemporary film, the audio design of early sound films included less information. Occasionally, one finds an incomplete sound hierarchy in these films, a mix of dialogue, effects, and music that runs counter to the practices that would soon become normative in the industry.

In Sergei Eisenstein’s first sound film, Alexander Nevsky (1938), about a Russian folk hero who repulsed a German invasion in the thirteenth century, music and dialogue tend to predominate in the sound structure of the film, with background ambient sound and sound effects used less extensively. Some scenes or shots completely lack the ambient sound and effects that are clearly denoted by the images and action.

At the beginning of the movie, for example, a group of Mongol warriors visits Alexander Nevsky’s fishing village. Viewers hear the sounds of their horses and armor as they arrive, but when they leave, they make no sound at all. Their exit is completely silent.


ALEXANDER NEVSKY (1938)

The soundtracks of many films in the early sound period have a minimal range of effects and ambient noise, even when images, such as these shots showing a screaming child, suggest highly specific sounds. Frame enlargements.

Later in the film, during the visually impressive sequence that details the burning of the city of Pskov and the slaughter of its inhabitants by the invading German army, viewers hear only music and dialogue, with no sound effects at all. Even close-ups of screaming, crying children lack the sounds the images imply.

During the climax of the film, the epic battle on a frozen lake between Nevsky’s armies and the invading Germans, music and sound effects alternate one at a time. The music plays for a while and then stops, and viewers hear sound effects (swords clashing, men shouting). Then the sound effects stop, and the music begins again. These manipulations may strike a modern moviegoer’s ears as rather crude and unrealistic, both because of the peculiar way effects and music have been edited so that they are never present together and because of the lack of detail in the film’s audio space compared with its often striking images.

While most films establish a clear hierarchy of sound relationships that gives the voice pride of place and surrounds it with music and important sound effects, Altman’s work and Eisenstein’s Alexander Nevsky are significant alternatives to this practice. Their deviant structure demonstrates, by what it omits, the prevalence of the conventional hierarchy in which voice, effects, and music are present together but at carefully regulated volumes.

SOUND PERSPECTIVE Sound perspective designates the ways that sound conveys properties of the physical spaces seen on screen. Sound perspective in film is based on correspondences with the viewer’s acoustic perception of space in everyday life. The sound of an approaching or receding object, for example, changes its pitch in a predictable way depending on its direction of movement, a phenomenon known as the Doppler effect.
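The underlying physics can be stated compactly. For a listener at rest and a sound source moving directly toward or away from the listener at speed $v_s$, the perceived frequency $f'$ is related to the emitted frequency $f$ and the speed of sound $v$ by

$$ f' = f\,\frac{v}{v \mp v_s}, $$

with the minus sign for an approaching source (the pitch rises) and the plus sign for a receding one (the pitch falls).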

Sound designers routinely use Doppler effects to acoustically convey the movement of a sound-producing object through three-dimensional space. Recall that the sound engineers at Lucasfilm added Doppler effects to the arrows whizzing at Indiana Jones to give them a convincing three-dimensional presence in the scene. In the Star Wars films and other science fiction pictures, Doppler effects spatialize the approach of CGI or miniature-model spacecraft and help sell these special-effect images to viewers.


  FILMMAKER SPOTLIGHT

  Ben Burtt

Ben Burtt has created some of the most famous sounds in modern movies—the mechanical breathing of Darth Vader, the crack of Indiana Jones’ whip, the voices of E.T. and R2D2, and the resonant hum of Luke Skywalker’s lightsaber. His long association with George Lucas and Steven Spielberg helped to change modern movie sound by emphasizing the invention of original sounds for a production.

Before the 1970s, film studios compiled stock libraries of audio effects that they used and re-used in their films. Warner Bros.’ movies sounded different from Paramount’s films because each studio drew from its in-house audio archive. Many of the gunshots heard in years of Warner Bros. gangster movies were created originally for G-Men (1935), and the studio recorded numerous bullet ricochets for The Charge of the Light Brigade (1936). Audiences continued to hear these ricochets in movies for decades. They were even altered to become the cartoon sound of the Road Runner dashing away. Burtt points out that one ricochet was played backwards to supply the sound of Superman landing on the 1960s-era television show. These practices meant that sound effects often were repetitive, familiar, and unsurprising.

The “Wilhelm scream” is a famous example of recycled audio. The scream—recorded for a scene in Distant Drums (1951) where a man is bitten by an alligator—was archived at Warners and used in many of the studio’s films, including The Charge at Feather River (1953), where it was used when a character named Wilhelm is shot with an arrow. Researching sound at studio libraries, Burtt noticed this recurring scream and named it “Wilhelm,” after the Feather River character. The scream is very distinctive, and Burtt affectionately used it in several Star Wars movies. It is now a kind of legendary audio effect and can be heard in Inglourious Basterds (2009), Monsters vs. Aliens (2009), and Iron Man 2 (2010).

When George Lucas hired Burtt to design the sounds of Star Wars (1977), he wanted the film to have an original audio profile, not recycled sound effects. He broke with existing studio practices. As a result, Star Wars sounded unlike earlier science fiction films. The Federation’s beat-up spaceships, for example, sounded like Model T automobiles rather than having the electronic hum so common in 1950s-era sci-fi. Sound was uniquely assertive in defining the experience, emotions, characters, and settings of Star Wars. Burtt’s creative sound design on that film helped to usher in a new era of audio invention in motion pictures in which sound was conceived as an active part of the moviegoer’s experience.

Because many of the characters and situations for which Burtt had to invent sound were novel and imaginary, he tried to map them onto sound experiences familiar to viewers, ones they would associate with particular emotions. To create the sounds of large spaceships in the Star Wars movies, Burtt blended audio of thunder, animal growls, and jet airplanes. Animal sounds are also part of the mix for the engines on military vehicles appearing in Raiders of the Lost Ark (1981).

Because sounds taken in isolation are often hard to identify, sound design is an art of elegant substitution, and a designer needs to be able to think analytically about sound in order to get the right combinations of elements. Burtt created Darth Vader’s heavy breathing by putting a microphone inside a scuba tank regulator. Lightsabers were the combined sounds of Simplex movie projectors, the electric hum of television sets, and a moving microphone re-recording the combined sounds to convey a sense of object movement. Raccoon noises helped supply the voice of E.T. and the skittery sounds of the cockroach that appears in WALL-E (2008).

THE EMPIRE STRIKES BACK (20TH CENTURY FOX, 1980)

Darth Vader’s lack of humanity was unforgettably characterized by the mechanical sound of his breathing, a sound designed by Ben Burtt to embody the essence of the character. Frame enlargement.

Because WALL-E is a computer-animated film, no audio was recorded as part of a production track. All of the images were created without sound, requiring Burtt to invent the mechanical sounds of the robots WALL-E and EVE, their planet, and spacecraft. He created more than 2,000 individual sounds for the film. The noise of wind, for example, was produced by running audio of Niagara Falls through an echo chamber, and when the wind is heard from inside WALL-E’s trailer, the sound is produced by dragging a canvas bag across the floor. Why not simply record wind? Because literal, realistic sound sources often are insufficiently dramatic. Other sources can be blended and manipulated to evoke the necessary emotional tone or personality. The sound of EVE’s laser gun in the film is produced by stretching a slinky out to full length, placing a microphone at one end and tapping the other end. Because of the slinky’s length, high-frequency tones reach the microphone first, then the mid-tones, and then the low. The resulting sound is metallic and resembles an explosive discharge, making a good fit with the images of EVE shooting her laser gun.

Burtt’s work on the Star Wars and Indiana Jones films, and Walter Murch’s inventive work for Francis Coppola on Apocalypse Now (1979), helped to make the 1970s the first great era of sound design in motion pictures. The inventive, singular audio profiles of these films established the essential role that original sound, expressly designed for a film, would play in the art of cinema. ■

WALL-E (Pixar, 2008)

Robots in love—WALL-E and EVE are personified through highly articulated and individuated sounds. WALL-E, a low-tech robot, has very mechanical sounds, while EVE’s are more electronic and airy. Sound design plays a major role in bringing this sci-fi world to life. Frame enlargement.

  Sound perspective also can be created by using reverberance and changes in volume. Direct sound is sound that comes immediately from the source. It is spoken or recorded directly into the microphone, and because of this, it typically carries minimal or no reverberance and conveys little environmental information. By contrast, reflected sound carries reverberance. It reflects off of surrounding surfaces in the environment to produce reverberation. Differing surfaces reflect sounds in differing ways, and these differences convey important information about the kind of physical environment in which the sound is occurring. Hard surfaces such as glass or metal tend to bounce sound very quickly and very efficiently, whereas softer surfaces such as carpeting or cushioned furniture are less reflective. They tend to absorb sound and, in extreme cases, may deaden sound. In The Conversation (1974), the noises of the murder that Harry Caul hears through an adjoining hotel room wall are muffled and deadened.
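Architectural acoustics gives this relationship between surfaces and reverberance a standard quantitative form. Sabine’s classic approximation expresses a room’s reverberation time $T_{60}$ (the time for a sound to decay by 60 decibels) in terms of the room’s volume $V$ in cubic meters and the absorption of its surfaces:

$$ T_{60} \approx \frac{0.161\,V}{\sum_i S_i\,\alpha_i}, $$

where each surface of area $S_i$ has an absorption coefficient $\alpha_i$, near 0 for hard glass or metal and approaching 1 for heavy carpet or drapery. Hard-surfaced spaces therefore ring with long reverberation, while heavily absorptive ones sound muffled and dead.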


Sound environments, then, can be characterized in terms of their sound-reflective or sound-deadening properties. Sound designers pay close attention to these features so that the audio environments they create for a film match the physical conditions of the scene or shot. Sound needs to reverberate in Edward Scissorhands’s huge, vacant castle but not on the western plains in Dances with Wolves.

Another very important characteristic of sound in the audio environment is ambient sound. As explained earlier, this term refers to generalized noises in the recording environment. If shooting takes place out-of-doors, ambient sounds may include an airplane traveling overhead, the cries of children playing in the distance, or the sound of wind in the trees. Ambient sound is found in all recording environments, even in an empty room. When a scene occurs in an empty room, the soundtrack will not be dead or silent. It will carry room tone, the acoustical properties of the room itself, the barely perceptible sounds that it makes. Room tone is a very low level of ambient noise, and it indicates that the audio environment created by contemporary sound design is never silent or dead but always conveys some audio information.

  Sound perspective often correlates with visual perspective. If the action is presented in long shot, viewers also hear the sound as if in long shot. As the sound source gets more distant from the camera in a reverberant environment, the properties of reflected sound increase. As the sound source comes closer to the camera, the amount of reflected sound decreases. By varying the amount of reflected sound, filmmakers establish the location of a sound source within the visual space on screen.
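This correlation has a simple acoustic basis. In a reverberant room, the direct sound weakens with distance $r$ from the source (its intensity falls roughly as $1/r^2$, about 6 dB for every doubling of distance), while the diffuse reflected sound stays at nearly the same level throughout the room, so the balance between them falls off approximately as

$$ \frac{E_{\text{direct}}}{E_{\text{reverberant}}} \propto \frac{1}{r^2}. $$

Beyond the room’s critical distance, a listener, or a microphone, hears mostly reflections, which is why distant framing is matched by a more reverberant, roomier sound.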

If the action is presented to the viewer in close-up, direct sound should predominate over reflected sound. The actors’ voices should be intimate and sound as if they are spoken closely to the microphone. Sound designer Walter Murch has stated that he records not just sounds in the environment but also the spaces between the listener and those sounds. In actual practice, however, microphone placement does not exactly parallel camera placement. While the difference in camera placement between a close-up and a long shot may be very great, the difference in actual microphone placement may only be a matter of several feet. Moreover, many contemporary films invert visual and sound perspectives by filming actors in long shot and miking them for direct sound. Peter Weir’s Dead Poets Society (1989) deals with the relationship between an unconventional English teacher (Robin Williams) and his students in an elite prep school in 1959. One of the boys discusses with his friend his excitement over getting the lead role in the school play. The two boys stand on a pier next to the water and are filmed in extreme long shot. Their voices, however, are miked in intimate terms. The audio space is very close. The visual space is very distant.

THE OTHERS (MIRAMAX, 2001)

Digital, multichannel sound can create tremendously vivid sound perspective. When Grace (Nicole Kidman) is terrorized by what she believes are ghosts, disembodied voices fly around the room, jumping from channel to channel, speaker to speaker, across the front soundstage and into the rear surrounds. As the unseen spirits flutter about the character, the sound reproduces this action in three-dimensional audio space. The effect becomes subjective, immersing the film viewer in the character’s experience. Frame enlargement.

  Sound Perspective in Early Cinema As with other attributes of film structure, filmmakers did not grasp the complexities of sound design all at once. Sound technology came to the movies in the late 1920s, and filmmakers gradually discovered the creative possibilities of sound and how to use it in a rich and naturalistic fashion. As a result, and because early sound technology was quite limiting, the soundtracks in many early films tend to be less detailed and less reflective of the realities of sound space.

French director René Clair’s Under the Roofs of Paris (1930), for example, is a mixture of pantomime, music, and dialogue. Much of the film was shot silent, with a few talking sequences added later. At the beginning of the film, the camera booms down from the rooftops to the streets of Paris where a song salesman is performing a new tune for a group of onlookers. Viewers hear the song throughout the camera movement, and as the camera draws closer, the song’s volume increases. There is, however, no apparent change in reverberation.

  At the end of the scene, the camera booms back up to the rooftops. This time the volume of the song does not decrease as much as it should given the amount of physical space the camera crosses. Again, there is no change in reverberation. The perspectives established by visual space and audio space do not correlate very well.

UNDER THE ROOFS OF PARIS (TOBIS, 1930)

Correct sound perspective is not a feature of every film. The relationship of audio space and camera perspective often proves to be quite flexible. In Under the Roofs of Paris, as the camera travels from the rooftops to the street below, the appropriate changes in audio space do not occur. Frame enlargement.


  Case Study JACQUES TATI

As with all rules and conventions of film structure, sound perspective can be satirized and played with by smart filmmakers. French director Jacques Tati was one of the masters of sound cinema. Tati was a pantomime comedian whose films bear some relationships to silent comedies. Dialogue in his films is minimal, and the sound space is dominated by a multitude of carefully organized environmental sounds. Tati postdubbed his soundtracks

moment, however, a tiny figure appears in the distance at the end of the hallway. This is joke number one, reversing the expectation viewers developed based on the probable sound space–image space relation. The glass, metal, and tile hallway conveys the reverberant footsteps very effectively; they remain loud and only grow slightly in volume as the man approaches. This is joke number two.

 
