Lindenwood
Sound


As a follow-up to the May 4, 2010 Sound Workshop, the text and links compiled below will take you deeper into the wonders of sound.


SOUND DESIGNERS

• WALTER MURCH http://www.filmsound.org/murch/murch.htm

Walter Murch has been editing sound in Hollywood since starting on Francis Ford Coppola's film The Rain People (1969). He edited sound on American Graffiti (1973) and The Godfather: Part II (1974), earned his first Academy Award nomination for The Conversation (1974), won his first Oscar for Apocalypse Now (1979), and won an unprecedented double Oscar for sound and film editing for his work on The English Patient (1996). More recently he helped reconstruct Touch of Evil (1958) to Orson Welles' original notes, and edited The Talented Mr. Ripley (1999). Mr. Murch was, along with George Lucas and Francis Coppola, a founding member of northern California cinema. Mr. Murch has directed -- Return to Oz (1985) -- and longs to do so again, but as an editor and sound man he is one of the few universally acknowledged masters in his field. For his work on Apocalypse Now, Murch coined the term "Sound Designer," and along with colleagues such as Ben Burtt helped elevate the art and impact of film sound to a new level.

He believes in editing while standing up. Murch is a beekeeper and makes his own honey. He introduced Hollywood to the idea of editing films on the non-linear, Mac-based program Final Cut Pro rather than the Avid. To date (2009), he is the only artist ever to win Oscars for both film editing and sound engineering on a single film (The English Patient, 1996).

• RANDY THOM 

Sound Design articles by Randy Thom http://filmsound.org/randythom/

• GARY RYDSTROM (Skywalker Sound) 

http://mixonline.com/recording/interviews/audio_gary_rydstrom/

At the ripe old age of 44, Gary Rydstrom was arguably the finest sound designer and re-recording mixer of his generation, with seven Oscars (out of 12 nominations), a slew of BAFTA, Golden Reel and C.A.S. Awards, and a 20-year filmography remarkable for its range and quality. But he left Skywalker Sound for the director's chair at Pixar, a company he has been associated with since creating the "voices" for Luxo Jr. back in 1986.

"The sound process is a big funnel, narrowing tracks and making choices as it goes forward."

(On the musicality of effects and the interplay between music and effects:) "I don't think an audience is going to care which parts of the soundtrack are coming from an orchestra and which parts are sound effects. Alan Splet was the best at using sound effects in an overtly psychological and musical way. His ambiences were stunning: applying rhythms and pitches of sounds as evocatively as a composer would. Some people think that there should be a hard line between realistic sound effects and emotional music. But I don't heed that delineation."

• BEN BURTT http://en.wikipedia.org/wiki/

http://filmsound.org/starwars/burtt-interview.htm

Benjamin "Ben" Burtt, Jr. (born July 12, 1948) is an American sound designer whose credits include Star Wars (1977), Raiders of the Lost Ark (1981), E.T. the Extra-Terrestrial (1982) and Indiana Jones and the Last Crusade (1989). He is also a film editor, director, screenwriter, and voice actor.

He is most notable for creating many of the iconic sound effects heard in the Star Wars film franchise, including the "voice" of R2-D2, the lightsaber hum and the heavy-breathing sound of Darth Vader.

Kurt Andersen talks to Burtt about the movie and the legendary sounds of his career.
http://www.studio360.org/episodes/2008/06/27

 

AESTHETICS

When creating sound effects for films, sound recordists and editors do not generally concern themselves with the verisimilitude or accuracy of the sounds they present. The actual sound of a bullet entering a person at close range may sound nothing like the effect a designer builds for it, but since very few people are aware of how such a thing actually sounds, the job of designing the effect is mainly an issue of creating a conjectural sound which feeds the audience's expectations while still suspending disbelief.

A phased 'whoosh' accompanying a victim's fall has no analogue in real-life experience, but it is emotionally immediate. If a sound editor uses such sounds in the context of an emotional climax or a character's subjective experience, they can add to the drama of a situation in a way visuals simply cannot. If a visual effects artist were to attempt something similar to the 'whooshing fall', it would probably look ridiculous, or at least excessively melodramatic.

The "Conjectural Sound" principle applies even to happenstance sounds, such as tires squealing, doorknobs turning or people walking. If the sound editor wants to communicate that a driver is in a hurry to leave, he will cut the sound of tires squealing when the car accelerates from a stop; even if the car is on a dirt road, the effect will work if the audience is dramatically engaged. If a character is afraid of someone on the other side of a door, the turning of the doorknob can take a second or more, and the mechanism of the knob can possess dozens of clicking parts. A skillful Foley artist can make someone walking calmly across the screen seem terrified simply by giving the actor a different gait.

SOURCE SOUND vs. UNDERSCORE

Source sound (aka 'diegetic') occurs in the story time and space inhabited by the main character(s). It can be sound effects, ambience, and music that the character(s) can hear.

Underscore (aka 'extra-diegetic') and sound textures occur outside of the main story's time and space -- sound that the character(s) can't hear, usually suggesting an emotional context.

DIEGESIS vs. MIMESIS
http://en.wikipedia.org/wiki/Diegesis

Diegesis (Greek 'to narrate') and mimesis (Greek 'imitation' or 'to copy') have been contrasted since Plato's and Aristotle's times. Mimesis shows rather than tells, by means of action that is enacted. Diegesis, however, is the telling of the story by a narrator. The narrator may speak as a particular character or may be the invisible narrator or even the all-knowing narrator who speaks from above in the form of commenting on the action or the characters.

Diegesis may concern elements, such as characters, events and things within the main or primary narrative. However, the author may include elements which are not intended for the primary narrative, such as stories within stories; characters and events that may be referred to elsewhere or in historical contexts and that are therefore outside the main story and are thus presented in an extradiegetic situation.

In Film sound and music

Sound in films is termed diegetic if it is part of the narrative sphere of the film. For instance, if a character in the film is playing a piano, or turns on a CD player, the resulting sound is "diegetic."

If, on the other hand, music plays in the background but cannot be heard by the film's characters, it is termed non-diegetic or, more accurately, extra-diegetic. The score of a film is "non-diegetic" sound. Some examples:

* In The Truman Show, a sequence shows the characters at night, when most of them are sleeping. Soft, soothing music plays, as is common in such scenes, but we assume that it does not exist in the fictional world of the film. However, when the camera cuts to the control room of Truman's artificial world, we see that the mood music is being played by Philip Glass standing at a bank of keyboards. The ways in which the Truman Show experiment blurs the line between diegesis and non-diegesis are a central theme to the film.

Example: X-Files clip, where the leitmotif music of a record playing, which represents the mental torment of the central character, becomes the underscore.


PRODUCTION PROCESS

FILMMAKING •http://en.wikipedia.org/wiki/Filmmaking

Film production occurs in five stages:

  • Development—The script is written and drafted into a workable blueprint for a film.

  • Pre-production—Preparations are made for the shoot, in which cast and crew are hired, locations are selected, and sets are built.

  • Production—The raw elements for the finished film are recorded.

  • Post-Production—The film is edited; production sound (dialogue) is concurrently (but separately) edited; music tracks (and songs) are composed, performed, and recorded if the film is to have a score; sound effects are designed and recorded; any computer-graphic 'visual' effects are digitally added; all sound elements are mixed into "stems"; the stems are then mixed and married to picture; and the film is fully completed ("locked").

  • Sales and distribution—The film is screened for potential buyers (distributors), is picked up by a distributor and reaches its cinema and/or home media audience.

TELEVISION •http://en.wikipedia.org/wiki/Television_program

CREW

PRODUCTION SOUND

Production Sound Mixer •http://en.wikipedia.org/wiki/Production_sound_mixer

What does a Production Sound Mixer do? A motion picture Production Sound Mixer is responsible for recording production dialogue and effects. He commands a crew consisting of one or more boom operators, a cable person, and sometimes an equipment technician. The Sound Mixer determines what mikes are used for every scene (or assigns that responsibility to the boom operator), operates the sound recorder, maintains the Sound Report, notifies the director (or AD) of any sound problems, keeps sound levels consistent, avoids distortion from excessive levels, watches for boom shadows, determines sound perspective after discussion with the director, records "room tone", and, most important of all, provides a sound track with clean, intelligible, first-rate audio quality.

Boom Operator http://en.wikipedia.org/wiki/Boom_operator_(media)

POST PRODUCTION SOUND

• Sound Designer http://en.wikipedia.org/wiki/Sound_designer

• Dialog Editor

• Sound Editor http://en.wikipedia.org/wiki/Sound_editor

• Music Editor

• Composer

• Mixing (Re-Recording) Engineer(s)

 

SOUND EFFECTS

HARD EFFECTS - common sounds that appear on screen, such as door slams, weapons firing, and cars driving by.

FOLEY http://www.filmsound.org/terminology/foley.htm

What the Heck is "Foley"? http://www.sound-ideas.com/foleymavart.html

Foley effects are sound effects added to the film during post production (after the shooting stops). They include sounds such as footsteps, clothes rustling, crockery clinking, paper folding, doors opening and slamming, punches hitting, glass breaking, etc. etc. In other words, many of the sounds that the sound recordists on set did their best to avoid recording during the shoot.

The boom operator's job is to clearly record the dialogue, and only the dialogue. At first glance it may seem odd that we add back to the soundtrack the very sounds the sound recordists tried to exclude. But the key word here is control. By excluding these sounds during filming and adding them in post, we have complete control over the timing, quality, and relative volume of the sounds.

For example, an introductory shot of a biker wearing a leather jacket might be enhanced if we hear his jacket creak as he enters the shot - but do we really want to hear it every time he moves? By adding the sound in post, we can control its intensity, and fade it down once the dialogue begins. Even something as simple as boots on gravel can interfere with our comprehension of the dialogue if it is recorded too loudly. Far better for the actor to wear sneakers or socks (assuming their feet are off screen!) and for the boot-crunching to be added during Foley.

Foley http://en.wikipedia.org/wiki/Foley_(filmmaking)

Foley describes the process of live recording of sound effects that are created by a foley artist, which are added in post production to enhance the quality of audio for films, television, video, video games and radio. The term "foley" is also used to describe a place, such as foley-stage or foley-studio, where the foley process takes place. "Foley" gets its name from Jack Donovan Foley (1891-1967), a sound editor at Universal Studios in the 1950s who became famous for his advancements in synchronized sound effects.

Footsteps - of the main characters
The "moves" - accompanying the character, such as clothing noises
Spot or sync effects - onscreen chair moving, object pick up/put down, door open/close

WALLA •http://en.wikipedia.org/wiki/Walla

In American radio, film, and television, walla is a sound effect imitating the murmur of a crowd in the background. A walla group of actors creates this murmur of indistinct crowd chatter. Nowadays, walla actors make use of real words and conversations, often improvised, tailored to the languages, speech patterns, and accents that might be expected of the crowd being mimicked.

Walla is called rhubarb in the UK, rhabarber in Germany and rabarber in the Netherlands and Flanders (Belgium) as well as Sweden, perhaps in part reflecting the varying textures of crowd noise in the different countries. Similar phrases are "carrots and peas," "watermelon cantaloupe, watermelon cantaloupe" and "natter natter" (to which the response is "grommish grommish"). The TV show South Park often parodies this concept by having angry mobs mutter "rabble rabble rabble," sometimes clearly and distinctly. In an episode of Harvey Birdman, Attorney at Law, a distraught courtroom audience distinctly and repeatedly shouts "rutabaga", an obvious reference to the use of the term "rhubarb".

AMBIENCE / Room Tone (aka Background, Nat Sound or Atmospheres)

MIXING

Stem Mixing (aka separation mastering or submixing)

http://www.digido.com/mixing-tips-and-tricks.html

http://www.digido.com/delivery-ftp.html

Erik Aadahl Special: Editing for the 'Transformers' Mix
http://designingsound.org/2010/03/erik-aadahl-special-editing-for-the-mix/

Comment by Randy Thom

Great stuff, Erik!

Almost all of my projects in recent years have been structured as “in the box” premixes, meaning that all or virtually all of the editing and premixing has been done in ProTools, keeping it all virtual. We record predubs only as a delivery requirement, but don’t use the recorded predubs in the final mix. In the final each individual sound is funneled through virtual six channel premixes coming out of ProTools that then go through a DFC. Typically there are two ProTools systems carrying effects, backgrounds, and foley. Often all the effects are on one system, and the backgrounds and foley are on the other system. My personal preference is to not use a ProTools “mixing console” control surface, but I know I’m in the minority. I like to make adjustments within ProTools with a mouse rather than knobs and faders. I do use the knobs and faders on the DFC, but most of the work is being done in ProTools.

We can get away with this approach, given the limited number of sounds ProTools can play at one time, because we are very disciplined about making editorial decisions before the final mix. In other words, we come to the final with fewer sounds than would be typical on a more traditional mix, where it’s assumed lots of alts will be needed. One reason this approach works is that on all these projects we spend a lot of time presenting sounds to the director before the final. That way we are pretty sure we know what’s going to make everybody happy before the final starts.

Best, Randy

GLOSSARY

EDL (Edit Decision List) •http://en.wikipedia.org/wiki/Edit_decision_list

ADR (Automated Dialog Replacement, or 'Dubbing') •http://en.wikipedia.org/wiki/Dubbing_(filmmaking)

• Dialog Replacement 101
http://mixonline.com/sound4picture/film_tv/audio_dialog_replacement/

• ADR for King Kong (video)
http://filmsound.org/terminology/adr.htm

Field Recording •http://en.wikipedia.org/wiki/Field_recording
Field recording is the term used for any recording produced outside of a recording studio.

GLOSSARY of TERMS
http://www.dilettantesdictionary.org/index.php?let=q

http://web.archive.org/web/20010807063153/http://dolby.com/movies/dfsglos.html

SOUND EFFECTS Libraries

Sound Ideas library •http://www.sound-ideas.com/demos.html
http://www.sound-ideas.com/podcasts.html

HISTORY & TRIVIA

Wilhelm Scream •http://www.hollywoodlostandfound.net/wilhelm/

FINANCE

Budgeting for Sound Post Pro
http://socialsounddesign.com/questions/796/indie-feature-production-and-post-production-audio-budgets


THEATRICAL SURROUND FORMATS

DOLBY DIGITAL •http://web.archive.org/web/20030401230537/dolby.com/tech/

Dolby Digital (aka DD, AC-3, ATSC A/52, 5.1), Dolby Digital EX 6.1 & 7.1 and Surround EX

What is "5.1-channel" Dolby Digital? At the option of their producers, Dolby Digital programs can deliver surround sound with five discrete full-range channels (left, center, right, left surround, and right surround) plus a sixth channel for those powerful low-frequency effects (LFE) that are felt more than heard in movie theaters. As it needs only about one-tenth the bandwidth of the others, the LFE channel is referred to as a ".1" channel (and sometimes erroneously as the "subwoofer" channel).

• Does 5.1-channel Dolby Digital make Dolby Surround obsolete? No, Dolby Surround will be with us for as long as stereo is with us. This is why all Dolby Digital decoder units also incorporate a digitally-implemented Dolby Surround Pro Logic decoder. Dolby Surround encodes four sound channels (left, center, right, surround) onto the two tracks of any conventional stereo program source. Dolby Digital soundtracks, on the other hand, can be carried only by Laserdiscs and new formats such as DVD and DTV.

• Can I hear 5.1-channel Dolby Digital programs over a regular stereo or Dolby Surround Pro Logic system? Yes. All Dolby Digital decoders, whether 5.1-channel or two-channel, have a unique feature called "downmixing" that assures full compatibility with any playback system.
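The arithmetic behind a stereo downmix can be sketched in a few lines. The -3 dB (0.707) weights below for the centre and surround channels are the conventional values; an actual Dolby Digital decoder applies mix levels carried in the bitstream metadata, so treat this as illustrative only:

```python
import numpy as np

def downmix_51_to_stereo(L, R, C, LFE, Ls, Rs):
    """Fold a 5.1 mix down to stereo using the conventional -3 dB weights.

    The LFE channel is normally discarded on downmix; real decoders use
    metadata-controlled mix levels rather than these fixed coefficients.
    """
    g = 1.0 / np.sqrt(2.0)      # -3 dB
    Lo = L + g * C + g * Ls     # left total
    Ro = R + g * C + g * Rs     # right total
    return Lo, Ro
```

Note how centre-channel content (typically dialogue) lands equally in both stereo outputs, which is why dialogue stays anchored in the middle of the image after a downmix.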

Dolby E (up to 8 channels can be distributed via a 2-channel chain)
http://www.dolby.com/professional/technology/broadcast/dolby-e.htm

Dolby Surround (4 ch) - matrixed for 4-channel surround
http://en.wikipedia.org/wiki/Dolby_Digital

Surround for cell phones?

DTS • http://www.dts.com/ 7.1 Surround | Neural Upmix
Timecode embedded in the film edge syncs to disc playback of the surround channels

 

SDDS •http://www.sdds.com/ 8 channel encode/decode for 35mm film
SDDS enables filmmakers and theatre owners to fill auditoriums with 6 or 8 channels of discrete digital sound through 5 screen loudspeakers, 2 stereo surround channels and a full-frequency sub-woofer channel. The usual gaps between the Left, Centre and Right speakers are filled, creating a far more uniform sound field for all patrons in the cinema, regardless of cinema size.


QUAD TRACK

A photo of a 35 mm film print featuring all four audio formats - from left to right:

SDDS (blue area to the left of the sprocket holes)

Dolby Digital (grey area between the sprocket holes labeled with the Dolby "Double-D" logo in the middle)

Analog optical sound (the two white lines to the right of the sprocket holes)

DTS time code (the dashed line to the far right.)

http://en.wikipedia.org/wiki/Surround_sound


DIALOG NORMALIZATION

Dialnorm - the Dolby metadata parameter indicating a program's average dialogue level, which the decoder uses to keep playback loudness consistent from program to program.

 

HEADROOM
http://en.wikipedia.org/wiki/Headroom_(audio_signal_processing)
Headroom is the margin between a signal's nominal operating level and the maximum level the system can pass without distortion.


SIGNAL PROCESSING TECHNIQUES

http://en.wikipedia.org/wiki/Sound_effect#Processing_effects

In music and film/television production, typical effects used in recording and amplified performances are:

ECHO - to simulate the effect of reverberation in a large hall or cavern, one or several delayed signals are added to the original signal. To be perceived as echo, the delay has to be on the order of 50 milliseconds or above. Short of actually playing a sound in the desired environment, the effect of echo can be implemented using either digital or analog methods. Analog echo effects are implemented using tape delays and/or spring reverbs. When large numbers of delayed signals are mixed over several seconds, the resulting sound has the effect of being presented in a large room, and it is more commonly called reverberation, or reverb for short.
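A single-tap digital echo is simple enough to sketch directly (a hypothetical NumPy helper for illustration; a real reverb sums many such taps with decreasing gain):

```python
import numpy as np

def add_echo(signal, sr, delay_s=0.25, decay=0.5):
    """Mix one delayed, attenuated copy of the signal back into itself."""
    d = int(sr * delay_s)              # delay in samples
    out = signal.astype(float).copy()
    if 0 < d < len(signal):
        out[d:] += decay * signal[:-d]  # add the delayed copy
    return out
```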

FLANGER - to create an unusual sound, a delayed signal is added to the original signal with a continuously-variable delay (usually smaller than 10 ms). This effect is now done electronically using DSP, but originally the effect was created by playing the same recording on two synchronized tape players, and then mixing the signals together. As long as the machines were synchronized, the mix would sound more-or-less normal, but if the operator placed his finger on the flange of one of the players (hence "flanger"), that machine would slow down and its signal would fall out-of-phase with its partner, producing a phasing effect. Once the operator took his finger off, the player would speed up until its tachometer was back in phase with the master, and as this happened, the phasing effect would appear to slide up the frequency spectrum. This phasing up-and-down the register can be performed rhythmically.
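The tape trick above can be approximated digitally by sweeping a short fractional delay with a low-frequency oscillator and mixing the swept copy with the dry signal. This NumPy sketch is illustrative, not production DSP:

```python
import numpy as np

def flanger(signal, sr, max_delay_ms=5.0, rate_hz=0.5, mix=0.5):
    """Sweep a short delay (0 to max_delay_ms) with a sine LFO,
    then mix the delayed copy with the dry signal."""
    n = np.arange(len(signal))
    max_d = max_delay_ms * 1e-3 * sr                       # max delay in samples
    delay = 0.5 * max_d * (1 + np.sin(2 * np.pi * rate_hz * n / sr))
    idx = np.clip(n - delay, 0, len(signal) - 1)           # fractional read position
    lo = np.floor(idx).astype(int)                         # linear interpolation
    hi = np.minimum(lo + 1, len(signal) - 1)
    frac = idx - lo
    delayed = (1 - frac) * signal[lo] + frac * signal[hi]
    return (1 - mix) * signal + mix * delayed
```

The characteristic "jet plane" sweep comes from the comb-filter notches moving up and down the spectrum as the delay varies.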

PHASER - another way of creating an unusual sound; the signal is split, a portion is filtered with an all-pass filter to produce a phase-shift, and then the unfiltered and filtered signals are mixed. The phaser effect was originally a simpler implementation of the flanger effect since delays were difficult to implement with analog equipment. Phasers are often used to give a "synthesized" or electronic effect to natural sounds, such as human speech. The voice of C-3PO from Star Wars was created by taking the actor's voice and treating it with a phaser.

CHORUS - a delayed signal is added to the original signal with a constant delay. The delay has to be short in order not to be perceived as echo, but above 5 ms to be audible. If the delay is too short, it will destructively interfere with the un-delayed signal and create a flanging effect. Often, the delayed signals will be slightly pitch shifted to more realistically convey the effect of multiple voices.

EQUALIZATION (EQ) - different frequency bands are attenuated or boosted to produce desired spectral characteristics. Moderate use of equalization (often abbreviated as "EQ") can be used to "fine-tune" the tone quality of a recording; extreme use of equalization, such as heavily cutting a certain frequency can create more unusual effects.

FILTERING - Equalization is a form of filtering. In the general sense, frequency ranges can be emphasized or attenuated using low-pass, high-pass, band-pass or band-stop filters. Band-pass filtering of voice can simulate the effect of a telephone because telephones use band-pass filters.
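The telephone simulation mentioned above can be sketched with a crude FFT band-pass. A real telephone filter is analog with a gentle rolloff, so this brick-wall version is for illustration only; 300-3400 Hz is the traditional voice passband:

```python
import numpy as np

def telephone_eq(signal, sr, low_hz=300.0, high_hz=3400.0):
    """Crude brick-wall band-pass: zero every FFT bin outside the
    band traditionally passed by an analog telephone line."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```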

OVERDRIVE effects such as the use of a fuzz box can be used to produce distorted sounds, such as for imitating robotic voices or to simulate distorted radiotelephone traffic (e.g., the radio chatter between starfighter pilots in the science fiction film Star Wars). The most basic overdrive effect involves clipping the signal when its absolute value exceeds a certain threshold.
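The "most basic overdrive" described above, clipping the signal beyond a threshold, fits in one line:

```python
import numpy as np

def overdrive(signal, threshold=0.3, gain=4.0):
    """Basic hard-clipping overdrive: boost the signal, then clamp any
    sample whose absolute value exceeds the threshold."""
    return np.clip(gain * np.asarray(signal, dtype=float),
                   -threshold, threshold)
```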

PITCH SHIFT - similar to pitch correction, this effect shifts a signal up or down in pitch. For example, a signal may be shifted an octave up or down. This is usually applied to the entire signal, and not to each note separately. One application of pitch shifting is pitch correction. Here a musical signal is tuned to the correct pitch using digital signal processing techniques. This effect is ubiquitous in karaoke machines and is often used to assist pop singers who sing out of tune. It is also used intentionally for aesthetic effect in such pop songs as Cher's Believe and Madonna's Die Another Day.

TIME STRETCH - the opposite of pitch shift, that is, the process of changing the speed of an audio signal without affecting its pitch.

RESONATORS - emphasize harmonic frequency content on specified frequencies.

ROBOTIC VOICE effects are used to make an actor's voice sound like a synthesized human voice.

SYNTHESIZER - generate artificially almost any sound by either imitating natural sounds or creating completely new sounds.

MODULATION - to change the frequency or amplitude of a carrier signal in relation to a predefined signal. Ring modulation, also known as amplitude modulation, is an effect made famous by Doctor Who's Daleks and commonly used throughout sci-fi.

COMPRESSION - the reduction of the dynamic range of a sound to avoid unintentional fluctuation in the dynamics. Level compression is not to be confused with audio data compression, where the amount of data is reduced without affecting the amplitude of the sound it represents.
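A minimal static compressor shows the core idea, reducing gain above a threshold by a fixed ratio. Real compressors add attack/release smoothing and work on an envelope of the signal rather than per-sample, so this is a sketch of the math only:

```python
import numpy as np

def compress(signal, threshold=0.5, ratio=4.0):
    """Static compressor: any sample whose magnitude exceeds the
    threshold has its excess divided by the ratio (no attack/release)."""
    mag = np.abs(signal)
    over = mag > threshold
    out = signal.astype(float).copy()
    squeezed = threshold + (mag - threshold) / ratio
    out[over] = np.sign(signal[over]) * squeezed[over]
    return out
```

With a threshold of 0.5 and a 4:1 ratio, a full-scale sample of 1.0 comes out at 0.5 + 0.5/4 = 0.625, while anything below the threshold passes unchanged.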

3D AUDIO EFFECTS - place sounds outside the stereo basis

REVERSE ECHO - a swelling effect created by reversing an audio signal and recording echo and/or delay while the signal runs in reverse. When played back forward, the last echoes are heard before the sound itself, creating a rush-like swell preceding and during playback.


WEB COLLECTIONS

WikiRECORDING.org • http://www.wikirecording.org

FILM SOUND.ORG • http://filmsound.org

RADIO PODCASTS

RADIOLAB - On a Curiosity Bender  http://www.wnyc.org/shows/radiolab/
Radiolab believes your ears are a portal to another world. Where sound illuminates ideas, and the boundaries blur between science, philosophy, and human experience. Big questions are investigated, tinkered with, and encouraged to grow. Bring your curiosity, and we'll feed it with possibility. Radiolab is heard around the country on over 200 stations. Check your local station for airtimes. Podcasts are free.

THIRD COAST AUDIO FESTIVAL http://www.thirdcoastfestival.org/
Celebrating the best audio stories produced worldwide for radio and the internet

THIS AMERICAN LIFE  http://www.thisamericanlife.org/  NPR radio stories built around a weekly theme

This American Life is a weekly public radio show broadcast on more than 500 stations to about 1.7 million listeners. It is produced by Chicago Public Radio, distributed by Public Radio International, and has won all of the major broadcasting awards. It is also often the most popular podcast in the country, with more than a half million people downloading each week.
[Sample pitches http://www.thisamericanlife.org/about/submissions/sample-pitches]

 

TRADE MAGAZINES

AUDIOMEDIA - Audio Technology Magazine
http://www.audiomedia.com/

MIX MAGAZINE - Recording and Productions News
http://mixonline.com/
Mix Archive http://web.archive.org/web/20000815092730/www.mixonline.com/

POST MAGAZINE - Magazine for Post Production
http://www.postmagazine.com/ME2/Default.asp

MILLIMETER - Resource for Production and Post Production
http://digitalcontentproducer.com/

AUDIOMEDIA - For Audio Post Production Professionals
http://www.audiomedia.com/

RECORDING - For the Recording Musician
http://www.recordingmag.com/

FILMMAKER MAGAZINE - The Magazine of Independent Film
http://www.filmmakermagazine.com/

VIDEOMAKER - Your Guide to Creating and Publishing Great Video
http://www.videomaker.com/

SOUND groups and blogs

DAW Mac •http://groups.yahoo.com/group/daw-mac/

Skywalker Sound Blog •http://skywalkersound.blogspot.com/

The Workbook Project •http://workbookproject.com/about/

Social Sound Design •http://socialsounddesign.com/questions

LOCAL USER GROUPS

St. Louis Final Cut User Group •http://www.stlfcpug.org/

Webster Student AES •http://www.webster.edu/aes/
http://www.webster.edu/depts/comm/audioprod/audioprod.html#

StL MCA-I (St. Louis Media Communications Assoc) •http://www.stlmca.com/
MCA-I is an international organization that supports communications professionals and promotes the industry as a profession through networking, members-only benefits and forums for education. The St. Louis chapter of MCA-I meets monthly to allow members the opportunity to network, share ideas, and promote themselves and the organization. Everyone is welcome but there is a charge for non-members.

Cinema St Louis •http://www.cinemastlouis.org/
St Louis Filmmakers Showcase, July 17-22, 2010
http://www.cinemastlouis.org/showcase.html

True/False Film Festival (Columbia, MO) •http://www.truefalse.org
The best documentary festival ever!

 
