Chapter 16 Main Points

► Sound effects (SFX) can be classified as anything sonic that is not speech or
music. They bring realism and added dimension to a production; they shape what
you see.
► Sound effects perform two general functions: contextual and narrative.
Contextual sound emanates from and duplicates a sound source as it is. It is also
known as diegetic sound—coming from within the story space. (Nondiegetic, or
extradiegetic, sound comes from outside the story space; music underscoring
is an example of nondiegetic sound.) Narrative sound adds more to a scene than
what is apparent.
► Narrative sound can be descriptive or commentative.
► Sound effects can break the screen plane; define space; focus attention;
establish locale; create environment; emphasize, intensify, and exaggerate
action; depict identity; set pace; provide counterpoint; create humor, metaphor,
paradox, and irony; symbolize meaning; animate inanimate objects; and unify
transition.
► Silence can be used to enhance sonic effect, particularly in situations where
sound is expected or anticipated.
► Generally there are six types of sound effects: hard (or cut) effects, soft effects,
Foley effects, ambience (or background) effects, electronic effects, and design
effects.
► The two primary sources of sound effects are prerecorded and produced.
► Prerecorded sound effects, which can number from several dozen to several
thousand, are available in libraries. The major advantage of sound-effect libraries
is that, for relatively little cost, many different and perhaps difficult-to-produce
sounds are at your fingertips. The disadvantages include lack of control over the
dynamics and timing of the effects, possible mismatches in ambience, the
possibility that the effects may sound canned, and effects that are not long
enough for your needs.
► Sound-effect libraries (and any recorded sound) can be manipulated with a
variety of methods, such as varying a sound’s playing speed, playing it
backward, looping a sound, and altering it through signal processing.
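These manipulations are easy to sketch in code. The snippet below is a minimal illustration in NumPy, with a synthesized tone standing in for a recorded effect; the sample rate, speed factor, and filter coefficient are arbitrary choices, and the one-pole low-pass stands in for signal processing in general.

```python
import numpy as np

SR = 44100  # sample rate in Hz (an assumed value)

# Stand-in for a prerecorded effect: a one-second 440 Hz tone.
t = np.linspace(0, 1.0, SR, endpoint=False)
effect = np.sin(2 * np.pi * 440 * t).astype(np.float32)

# 1. Vary playing speed: resample by skipping indices.
#    Playing at 2x speed halves the duration and raises pitch an octave.
speed = 2.0
faster = effect[(np.arange(int(len(effect) / speed)) * speed).astype(int)]

# 2. Play backward: reverse the sample order.
reversed_fx = effect[::-1]

# 3. Loop: tile the effect to extend a short sound.
looped = np.tile(effect, 3)

# 4. Simple signal processing: a one-pole low-pass filter darkens the timbre.
alpha = 0.1
lowpassed = np.empty_like(effect)
acc = 0.0
for i, x in enumerate(effect):
    acc += alpha * (x - acc)  # smoothed running value
    lowpassed[i] = acc
```

Each operation changes a different perceptual attribute: speed changes alter both pitch and duration, reversal alters the attack-decay envelope, and looping extends length at the cost of possible repetition artifacts.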
► Producing live sound effects in the studio goes back to the days of radio drama.
Producing effects in synchronization with the picture is known as Foleying,
named for former film soundman Jack Foley, although today Foley refers to
produced effects in general.
► Many types of sound effects can be produced vocally.
► Foley effects are produced in a specially designed sound studio known as a
Foley stage, an acoustically dry studio to keep ambience from adding unwanted
sounds to the recording.
► The keys to creating a sound effect are analyzing its sonic characteristics and
then finding a sound source, whatever it may be, that contains similar qualities.
► The capacitor microphone is most frequently used in recording Foley effects
because of its ability to pick up subtleties and capture transients (fast bursts of
sound). Tube-type capacitors help smooth transients and warm digitally recorded
effects.
► A critical aspect of Foley recording is making sure that the effects sound
consistent and integrated and do not seem to be outside or on top of the sound
track.
► Microphone placement is critical in Foley production because the on-screen
environment and perspective must be replicated as closely as possible on the
Foley stage.
► Important to successful Foleying is having a feel for the material—to become, in
a sense, the on-screen actor by recognizing body movements and the sounds
they generate.
► In addition to the obvious preparations for Foley recording—making sure that all
props are on hand and in place—it is also important to be suitably dressed and
physically fit.
► Instead of, or in addition to, Foley recording, many directors choose to capture
authentic sounds as they occur on the set during production, by recording them
separately in the field, or both.
► If sounds are recorded with the dialogue as part of the action, getting the
dialogue is always more important than getting the sound effect.
► When a director wants sound recorded with dialogue, the safest course of action
is to record it with a different mic than the one being used for the talent.
► An advantage of multiple-miking sound effects is the ability to capture various
perspectives of the same sound.
► In field recording, utmost care should be taken to record the sound effect in
perspective, with no unwanted sound—especially wind and handling noise.
► As with most sound-effect miking, the directional capacitor is preferred for field
recording. Parabolic, middle-side (M-S), and stereo mics may also be used in the
field.
► A key to collecting successful live effects in the field is making sure that the
recorded effect sounds like what it is supposed to be, assuming that is your
intention.
► Ambience sound effects, also known as background effects or backgrounds, are
used to fill the screen space, providing a sense of environment, location, time of
day, indoor or outdoor settings, and so on. They also smooth changes in
presence, provide continuity in scene changes, give context to the principal
sound, and mask noise on dialogue tracks.
► One approach to creating ambience is known as worldizing—recording the
sound of a room either to add the sound of that space to a dry recording or to
enhance or smooth ambient backgrounds that are already part of the dialogue
track.
► Sound effects can also be generated electronically with synthesizers and
computers and by employing MIDI—an approach called electronic Foley. A
synthesizer is an audio instrument that uses sound generators to create
waveforms. Computer-generated sound effects can be created with
preprogrammed software or software that allows sounds to be produced from
scratch.
► An important sound-shaping capability of many electronic keyboards and
computer software programs is sampling, a process whereby digital audio
representing a sonic event, acoustic or electroacoustic, is stored on disk or in
electronic memory. It can then be signal-processed into a different sound or
expanded to serve as the basis for a longer sonic creation.
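The "expanded to serve as the basis for a longer sonic creation" idea can be sketched simply: the hypothetical snippet below tiles a short stored sample into a longer bed, crossfading each seam so the loop points do not click. The decaying tone, fade length, and repeat count are arbitrary stand-ins for a real sample and real editorial choices.

```python
import numpy as np

SR = 44100  # assumed sample rate in Hz

# A short stored "sample": half a second of a decaying 220 Hz tone.
t = np.linspace(0, 0.5, SR // 2, endpoint=False)
sample = (np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)).astype(np.float32)

def extend_by_crossfade(x, repeats, fade=1024):
    """Tile a sample into a longer bed, crossfading each seam to avoid clicks."""
    out = x.copy()
    ramp = np.linspace(0.0, 1.0, fade, dtype=np.float32)
    for _ in range(repeats - 1):
        head = out[:-fade]                              # audio before the seam
        seam = out[-fade:] * (1 - ramp) + x[:fade] * ramp  # overlap region
        out = np.concatenate([head, seam, x[fade:]])
    return out

bed = extend_by_crossfade(sample, repeats=4)
```

Each repeat overlaps the tail of the growing bed with the head of the next copy, so the extended sound is slightly shorter than a plain end-to-end tile but free of discontinuities at the joins.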
► In recording samples it is important to take your time and record them with the
highest fidelity; if a sample has to be rerecorded, it might be difficult to
reconstruct the same sonic and performance conditions.
► The main difference between produced and design effects is that design effects
are created after recording during editing and usually involve deconstructing the
waveform to create an almost entirely new effect.
► Database systems and software programs used to manage prerecorded sound
libraries facilitate searching, locating, and auditioning a sound effect in seconds.
► Ripping, also referred to as digital audio extraction, is the process of copying
audio (or video) data from one medium, such as CD or DVD, to a hard disk. To
conserve storage space, copied data are usually encoded in a compressed
format.
► Organizing and naming sound-effect files in a production requires a system
whereby a file can be identified in any of a combination of ways: by category,
name of effect, take number, length, source, sampling rate, bit depth, channels,
and any remarks. This requires a software management program to organize the
data and make it easily retrievable.
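The identification scheme above maps naturally onto a simple record type. The sketch below is minimal and hypothetical, not any particular commercial management program; its field names mirror the categories listed in the bullet, and the library entries are invented examples.

```python
from dataclasses import dataclass

@dataclass
class SfxFile:
    """One sound-effect file and its identifying metadata."""
    category: str
    name: str
    take: int
    length_s: float
    source: str          # e.g. "library", "Foley", "field"
    sample_rate: int     # in Hz
    bit_depth: int
    channels: int
    remarks: str = ""

# Invented example entries standing in for a real effects library.
library = [
    SfxFile("ambience", "rain_on_tin_roof", 1, 120.0, "field", 48000, 24, 2),
    SfxFile("hard", "door_slam_oak", 3, 1.4, "Foley", 48000, 24, 1),
    SfxFile("hard", "door_creak", 1, 2.8, "library", 44100, 16, 1),
]

def search(files, **criteria):
    """Return every file whose attributes match all given criteria."""
    return [f for f in files
            if all(getattr(f, k) == v for k, v in criteria.items())]

hard_48k = search(library, category="hard", sample_rate=48000)
```

A real management program adds indexing and audition playback on top of exactly this kind of record, which is what makes locating an effect a matter of seconds rather than minutes.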
► Spotting sound effects involves going through the script or edited work print and
specifically noting on a spotting sheet each primary and background effect that is
called for. A spotting sheet indicates not only the sound effect but also whether it
is synchronous or nonsynchronous, its in- and out-times, and its description. If
the sounds have been recorded, notes about their dynamics, ambience, the way
they begin and end, and any problems that require attention are also noted.
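A spotting sheet likewise maps onto a simple data structure. The entries below are invented examples; the fields mirror those described above (effect, synchronous or nonsynchronous, in- and out-times, description, and notes on recorded sounds), and the SMPTE-style timecodes are illustrative only.

```python
# Hypothetical spotting-sheet entries for two cues.
spotting_sheet = [
    {
        "effect": "thunder rumble",
        "synchronous": False,          # background, not tied to picture
        "in_time": "00:01:12:10",      # hh:mm:ss:ff timecode
        "out_time": "00:01:30:00",
        "description": "low, distant roll under dialogue",
        "notes": "recorded; fades in slowly, check for hum at tail",
    },
    {
        "effect": "car door close",
        "synchronous": True,           # must sync to on-screen action
        "in_time": "00:02:05:03",
        "out_time": "00:02:05:15",
        "description": "solid, heavy sedan door",
        "notes": "",
    },
]

# Pull out the cues that must be placed in sync with picture.
sync_cues = [e for e in spotting_sheet if e["synchronous"]]
```

Separating synchronous from nonsynchronous cues this way reflects how the work divides in practice: sync effects are placed frame-accurately against picture, while backgrounds are laid in and trimmed to the scene.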