
Riccardo Percacci

Audio and MIDI Production for the Screen Composer



Assignment 3: Essay


Possibilities in sound experimentation for film scoring in the DAW: an overview


Introduction

Between the 1970s and 1980s, with the incorporation of electronic instruments into soundtracks, film
sound experienced a fresh enrichment. With a virtually unlimited sound palette at their disposal, films
were gifted new sonorities that lent themselves (and arguably contributed) to the popularity of genres
such as science fiction, action and thriller movies.

Among the great pioneers of this new synthetic film sound are Vangelis, whose soundtrack for
Blade Runner (Blade Runner, 1982) still constitutes a landmark for electronic music in general,
Giorgio Moroder with his iconic Midnight Express (Midnight Express, 1978) and Scarface
(Scarface, 1983) soundtracks, and German band Tangerine Dream, which scored over sixty films.

Alongside the excitement for this new “synthetic” soundtrack, however, the classic big-orchestra
sound of Hollywood’s golden age remained firmly present in movie culture, mainly thanks to the
work of now legendary composer John Williams and his iconic Star Wars score (Star Wars, 1977).

In this dual sonic reality, composers were inspired to mix instrumental and electronic sounds to
evoke a wider range of emotions and feelings. Composers started successfully incorporating electronic
sound into the orchestra, most prominently Jerry Goldsmith, Howard Shore in his long collaboration
with Cronenberg, and many others.


However, technological limitations at the time meant that these elements often had to be recorded
separately, with a significant impact on workflow and production costs. It was only with the arrival of
the DAW (Digital Audio Workstation) that things changed. With software-based sequencing,
composers were finally able to combine any type of sound seamlessly, within the closed context of a
single computer program.

This opened the door to endless creative opportunities, and represents to me the most important and
revolutionary aspect of modern film scoring. Nowadays, without having to spend large amounts of
money on hiring and recording live orchestras or buying expensive hardware gear, one can access
high quality instrumental sample libraries as well as software synthesisers, and freely combine and
manipulate these in a practical environment.

The creative significance of this is that we are able to experiment with sound and timbre in real
time, with immediate feedback, so that we can more effectively achieve the desired sound quality
for any project. It sidesteps many logistical issues, for instance the imaginative effort of having to
picture how an electronic sound will blend with particular instruments. This can be tricky and
counter-intuitive, yield poor results, and possibly force a round of rearranging and re-recording. In
the DAW, however, we can easily combine sounds, hear how they work together, and recombine and
manipulate them through countless types of audio effects, in the search for an appropriate sound
palette.


Broadly speaking, in the context of the DAW, there are two main types of sound elements, or
instruments: samplers and synthesisers. I am going to consider these two individually and explore
some of the possibilities they offer, as well as their limitations. I will then discuss how the use of
these instruments influences and enhances the practice of musical composition for soundtracks.

Sample libraries

A sampler is a virtual instrument that allows audio samples to be loaded and triggered from a
keyboard. In the simplest case of a single sample, that sample’s pitch is shifted across the
keyboard to create a playable instrument. Multi-sampled instruments, on the other hand, store many
samples (ideally one per key) and play them back with more realistic results.
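The basic mechanism behind the single-sample case can be sketched in a few lines: to transpose by n semitones, a sampler reads the sample back at a rate of 2^(n/12). The following Python/NumPy sketch is a naive illustration, not any particular sampler’s engine; it also shows why a repitched sample changes duration, one reason the sound grows unrealistic far from its root key:

```python
import numpy as np

def repitch(sample, semitones):
    """Repitch a mono sample by resampling: playback rate = 2**(semitones/12).
    Naive single-sample approach: transposition also changes the duration,
    which is one reason it sounds unrealistic far from the root key."""
    rate = 2.0 ** (semitones / 12.0)
    # Positions in the source that the new playback rate visits
    idx = np.arange(0, len(sample) - 1, rate)
    lo = idx.astype(int)
    frac = idx - lo
    # Linear interpolation between neighbouring source samples
    return sample[lo] * (1 - frac) + sample[lo + 1] * frac

sr = 44100
t = np.arange(sr) / sr
a4 = np.sin(2 * np.pi * 440 * t)     # a one-second 440 Hz "sample"
a5 = repitch(a4, 12)                 # one octave up: 880 Hz, half as long
```

A multi-sampled instrument avoids this artefact by storing a recording per key (or per small key range), so each note is repitched only slightly, if at all.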

In recent decades there has been a great proliferation of high quality sample libraries that try to
recreate the sound of real instruments. This has had a huge impact in making film scoring more
accessible to musicians with little or no formal training in orchestration. While it’s still important (if
not simply convenient) to know some of the fundamental “rules” of orchestration, users can simply
play around with sampled instruments and work out what works best and what does not work at all.

Recently, these libraries have become so realistic in sonic detail and in the breadth of instrumental
articulations offered, that orchestral mockups are very often used to some degree in the final product
(especially in lower budget productions). There are countless examples for this, and in many cases
an orchestral score is simply enriched and its sound “enlarged” by the subtle layering of live
recording with samples. James Newton Howard describes a typical situation:

Sometimes I will take […] sample marcato strings from my library and double the orchestral
marcato strings because people always expect the strings to be louder than they are. (Karlin
& Wright 2004, p. 371)

In the case of Hans Zimmer’s work on the BBC documentary series Blue Planet II (Blue Planet II,
2017), a specific orchestral library was produced to create the soundtrack (Bley Griffiths 2017).

There are several problems that have to be considered when using orchestral libraries. For instance,
unrealistic effects can arise when using the same sample for parts with varying numbers of voices
(for example, a four-trumpet patch will sound like eight trumpets when playing a two-note chord).
Another frequent issue is that woodwinds tend not to blend as well as strings and brass in the
orchestra, a problem amplified in the case of samples. This sometimes leads to woodwinds being
neglected in orchestral writing, because their use and their potential aren’t understood. This reflects a
more general tendency of “composing for samples”, where a less experienced composer’s practice
is shaped by the somewhat artificial, mechanical way in which samples operate.


However, granted that such limitations are understood and sensibly handled, I think that the
advantages offered by sample libraries outweigh the disadvantages. Simply having at one’s disposal an
arsenal of realistic samples to combine and manipulate at will can be an amazing creative launch
pad. Even though there’s a tendency for different productions to sound somewhat similar because of
the use of the same sample libraries, I find this to be more a creative challenge than a hindrance.
Starting composers especially should welcome the quality of sample libraries and strive to use their
instruments in subtler ways. While these libraries provide good standard orchestration tools, they
should also be taken as a stimulus to develop sound design skills, in trying to give these sounds a
personal touch.

Soft synthesisers 


The second main sound element that a DAW features is the synthesiser. This instrument generates a
steady pitch through one or more oscillators, whose timbre is then shaped by filters, envelopes and
low-frequency oscillators (LFOs). While there are different types of sound synthesis (additive,
subtractive, wavetable, frequency modulation etc.), synthesisers shape sounds in more or less the
same way, and are able to produce an enormous range of different sounds. Depending on the
starting waveforms, one can produce short plucked sounds, soft breathy flute-like sounds, string
and brass pads, all the way to the most dissonant, noisy and evolving sound effects.
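As a rough illustration of this signal path (oscillator, filter, envelope, LFO), here is a minimal subtractive-synthesis sketch in Python/NumPy; the waveform, cutoff range and envelope times are arbitrary choices for demonstration, not a recipe from any particular instrument:

```python
import numpy as np

sr = 44100
dur = 1.0
t = np.arange(int(sr * dur)) / sr

# Oscillator: a bright sawtooth at 110 Hz, rich in harmonics to carve away
saw = 2.0 * ((110 * t) % 1.0) - 1.0

# LFO: a slow sine sweeping the filter cutoff between 200 Hz and 2000 Hz
lfo = 0.5 * (1 + np.sin(2 * np.pi * 0.5 * t))
cutoff = 200 + 1800 * lfo

# Filter: a one-pole low-pass whose coefficient follows the moving cutoff
alpha = 1 - np.exp(-2 * np.pi * cutoff / sr)
out = np.zeros_like(saw)
y = 0.0
for n in range(len(saw)):
    y += alpha[n] * (saw[n] - y)
    out[n] = y

# Envelope: linear attack (50 ms) and release (300 ms) on the amplitude
env = np.minimum(t / 0.05, 1.0) * np.clip((dur - t) / 0.3, 0.0, 1.0)
out *= env
```

Swapping the sawtooth for a sine, slowing the attack, or letting the LFO run faster already moves the result from a string-like pad towards the breathy or evolving sounds mentioned above.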

In the context of the DAW, we are mainly considering software synthesisers (also known as soft
synths), as opposed to hardware synthesisers. However, even when working with hardware
synthesisers, the DAW is arguably the best way to record, layer and mix all the desired sounds. This
is made even easier by routing MIDI data from the DAW to the hardware and feeding the audio
back through an audio track.
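Under the hood, the MIDI data routed to the hardware is simply a stream of small byte messages defined by the MIDI 1.0 specification; pitch, channel and dynamics travel as three bytes per note event. A small sketch of the note-on/note-off byte layout (the helper functions are my own, not a DAW’s API):

```python
def note_on(channel, note, velocity):
    """MIDI note-on: status byte 0x90 ORed with the channel (0-15),
    followed by note number and velocity (0-127 each)."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    # Note-off uses status 0x80; a release velocity of 0 is conventional
    return bytes([0x80 | channel, note, 0])

msg = note_on(0, 60, 100)    # middle C on channel 1, moderate velocity
```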

Soft synths are often emulations of hardware synthesisers, and try to recreate their sound through
software synthesis. There is much debate regarding the use of such simulations, their sound often
falling short of the real, hard-wired feel of the originals. However, for those whose ears are less
attuned to the original sounds, soft synthesisers nevertheless provide a great pool of valuable sonic
material to pick from. Just as importantly, one can come close to an authentic sound without having to
pay hundreds or thousands for hardware gear, which often doesn’t even allow presets to be saved.

On the other hand, there exist many soft synthesisers which aren’t simulations and have their own,
often unique, engineering and sound. These software synthesisers are often used by composers
because of the greater processing power that computers provide for sound synthesis and
manipulation. Hans Zimmer, a longtime synth devotee, describes using various soft synths and
effects on films such as Blade Runner 2049 (Lobenfeld 2017).

The way soft synthesisers are integrated in DAWs allows for endless manipulability and a fast,
efficient workflow. On top of the effects and parameters featured in the actual synthesiser, one can
directly feed the output through a custom effect rack to colour and personalise the sound even
further. Similarly, one could record an audio clip from the synthesiser and load it into a
sampler, using it more as a “sound object” than a musical instrument.

The bottom line is, again, that all these elements are so well integrated into a single interface that
they can be manipulated endlessly, with instant feedback.

Common hybrid scoring devices


Let’s suppose now that one sets out to write a score in the DAW that combines orchestral
elements with synthesised sounds (a so-called “hybrid” score). The first phase in this case is the
search for the right sound palette. Composer Jerry Goldsmith testifies to setting aside an extra week
on feature film projects to gather electronic sounds (Karlin & Wright 2004, p. 374). As Karlin & Wright state,

…the search for the appropriate sounds to express the musical material a composer has
envisioned […] can be a very stimulating and productive phase of the creative process,
leading the composer to musical solutions that would not have occurred to him without the
availability of these particular colors. (Karlin & Wright 2004, p. 374)

With this extended sound palette, it follows that standard orchestration techniques must be
enhanced and slightly modified to better embrace these artificial sounds. Decades of practice have
established a few typical ways in which synthesised sounds are fitted into an orchestral context.

Synth sounds are often conceived as imitations of real instruments. For example, square wave
sounds can be associated with reed woodwinds, sine and triangle waves with flutes, sawtooth
waves with strings, and so on. It follows naturally to layer these “imitations” with the real
instruments to achieve a richer, fuller sound. So we see that very often (sometimes hardly
noticeably) synth pads are layered on top of string sections playing vertical chordal parts,
resulting in a denser sound. In these cases, however, one must be careful not to overload the sound.
By filtering out the unnecessary frequencies from the synthesiser, we can aim directly at the desired
effect (for example brightening or darkening the sound), without risking taking up too much space
in the spectrum.
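These associations have a basis in the harmonic content of the waveforms: a square wave contains only odd harmonics, much like a clarinet, while a sawtooth contains every harmonic, closer to a bowed string. A short Python/NumPy check (illustrative frequency and sample rate only) makes this visible:

```python
import numpy as np

sr, f0 = 44100, 100                      # illustrative sample rate and fundamental
t = np.arange(sr) / sr                   # one second of signal
square = np.sign(np.sin(2 * np.pi * f0 * t))
saw = 2.0 * ((f0 * t) % 1.0) - 1.0

def harmonic_levels(x, n=6):
    """Magnitudes of harmonics 1..n, normalised to the fundamental."""
    spec = np.abs(np.fft.rfft(x))
    bins = [round(k * f0 * len(x) / sr) for k in range(1, n + 1)]
    levels = spec[bins]
    return levels / levels[0]

sq = harmonic_levels(square)             # even harmonics near zero
sw = harmonic_levels(saw)                # all harmonics, falling off as 1/k
```

The same reasoning suggests why filtering matters when layering: the sawtooth’s dense harmonic series overlaps heavily with the strings it doubles, so cutting its unneeded frequencies frees spectral space.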

In a similar way, it’s common practice to add depth to a double bass line by layering it with a synth
bass, often a sine wave. An example of this is Alexandre Desplat’s score for Godzilla (Godzilla,
2014), where synth basses are often used to double bass parts, especially ostinato-type parts, adding
rhythmic power and definition.

One of the trademarks of film-maker and composer John Carpenter is the pulsing synth bass. With
the influence of Hans Zimmer scores such as The Dark Knight Rises (The Dark Knight Rises,
2012), this device has featured in countless soundtracks. It’s one of those tropes that, once
proven successful, gets used and abused by others, but it’s nevertheless an efficient way of setting
the pace of a musical passage, and it also provides a grid for synchronising music to picture when
needed. This sort of device provides an underlying rhythmic mood on top of which musical material
can be developed to follow the emotional arc of the scene.
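The arithmetic behind such a grid is simple: at a given tempo, one beat lasts 60/BPM seconds, so each pulse onset can be computed directly and mapped to the nearest film frame when spotting hits to picture. A small illustrative sketch (the helper names are my own, not a standard API):

```python
def pulse_times(bpm, subdivisions_per_beat, beats):
    """Onset times (seconds) of a steady synth pulse: one beat lasts
    60/bpm s, so each subdivision lasts 60/(bpm * subdivisions)."""
    step = 60.0 / (bpm * subdivisions_per_beat)
    return [i * step for i in range(beats * subdivisions_per_beat)]

def to_frames(seconds, fps=24):
    # Nearest film frame for each pulse onset
    return [round(s * fps) for s in seconds]

# Eighth-note pulse at 120 BPM: one pulse every 0.25 s
times = pulse_times(120, 2, 4)           # one bar of 4/4
frames = to_frames(times)                # at 24 fps: a pulse every 6 frames
```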

Drones function in a similar way, allowing the composer to state a connecting musical mood upon
which to develop melodic or harmonic material. Though drones are often played by real instruments,
synthesiser drones offer a greater variety of sound. Through different combinations of effects such
as flangers or LFO-controlled filters, a drone can be given an evolving life of its own.

Both the pulsing bass line and the drone are devices that lend themselves well to orchestral mixing:
their quite restricted spectral range leaves room for instrumental development on top (or beneath, in
the case of a high drone). 


All these techniques, combining orchestral and synthesised sounds, require thoughtful
experimentation and careful mixing for effective results. The DAW represents an ideal environment
for this type of work.

In my case, having started out writing short pieces for guitar and then messing around with
electronic music, familiarising myself with DAWs helped connect my classical training with more
modern composing techniques. Apart from providing a great tool for learning standard orchestration,
the DAW’s versatility as a sound factory represents to me an invaluable source of inspiration for
different types of musical concepts and products.


Reference List

Bley Griffiths, E. (2017) How Claude Monet inspired Hans Zimmer’s soundtrack for Blue
Planet II [Online]. Available at: https://www.radiotimes.com/news/2018-01-26/blue-planet-2-soundtrack/
[Accessed: 30 April 2019].

Scarface, [feature film] Dir. Brian De Palma. Universal Pictures, United States, 1983.
170mins.

Godzilla, [feature film] Dir. Gareth Edwards. Legendary Pictures, United States, 2014.
123mins.

Blue Planet II, [documentary series] Prod. James Honeyborne & Mark Brownlow. BBC
Natural History Unit/BBC Studios et al., United Kingdom, 29/10/2017 - 10/12/2017, BBC
One/BBC Earth. 60mins.

Karlin, F. & Wright, R. (2004) On The Track: A Guide to Contemporary Film Scoring. 2nd
ed. New York: Routledge.

Lobenfeld, C. (2017) Blade Runner 2049: How Hans Zimmer and Benjamin Wallfisch
followed up the most influential sci-fi score of all time [Online]. Available at:
https://www.factmag.com/2017/10/20/hans-zimmer-wallfisch-blade-runner-2049-interview/
[Accessed: 30 April 2019].

Star Wars, [feature film] Dir. George Lucas. Lucasfilm, United States, 1977. 121mins.

The Dark Knight Rises, [feature film] Dir. Christopher Nolan. Legendary Pictures/Syncopy,
United States/United Kingdom, 2012. 165mins.

Midnight Express, [feature film] Dir. Alan Parker. Casablanca FilmWorks, United States,
1978. 121mins.

Blade Runner, [feature film] Dir. Ridley Scott. The Ladd Company/Shaw Brothers, United
States/Hong Kong, 1982. 117mins.










