
Editing is the process of selecting and preparing written, visual, audible, and film media used to convey information through the processes of correction, condensation, organization, and other modifications performed with the intention of producing a correct, consistent, accurate, and complete work. The editing process often begins with the author's idea for the work itself, continuing as a collaboration between the author and the editor as the work is created. As such, editing is a practice that includes creative skills, human relations, and a precise set of methods.

Linear video editing is a post-production process of selecting, arranging and modifying images and sound in a predetermined, ordered sequence. Regardless of whether the content was captured by a video camera or tapeless camcorder, or recorded in a television studio on a video tape recorder (VTR), it must be accessed sequentially.

Non-linear editing is a method that allows you to access any frame in a digital video clip regardless of its sequence in the clip. The freedom to access any frame and use a cut-and-paste method, similar to the ease of cutting and pasting text in a word processor, allows you to easily include fades, transitions, and other effects that cannot be achieved with linear editing.

Linear Editing consists of three main categories:

1. In-Camera Editing: Video shots are structured in such a way that they are shot in order and at the correct length. This process does not require any equipment other than the camcorder itself, but requires good shooting and organizational skills at the time of the shoot.

2. Assemble Editing: Video shots are not structured in a specific order during shooting but are rearranged, and unneeded shots deleted, at the time of transferring (copying). This process requires, at the least, a camcorder and a VCR. The original footage remains intact, and the rearranged footage is transferred to a new tape. Each scene or cut is "assembled" on a blank tape either one at a time or in a sequence.

3. Insert Editing: New material is recorded over existing footage. This technique can be used during the original shooting process or during a later editing process. Since the inserted footage is placed over the unwanted footage, some of the original footage is erased. There are two types of assemble editing:

A Roll - Editing from a single source, with the option of adding an effect such as titles, or transitioning from a frozen image to the start of the next cut or scene.

A/B Roll - Editing from a minimum of two source VCRs or camcorders and recording to a third VCR. This technique requires a video mixer and/or edit controller to provide smooth transitions between the sources. The sources must also be electronically "sync'd" together so that the record signals are stable. The use of a time base corrector or digital frame synchronizer is necessary for the success of this technique.

Technically this isn't video editing, it's film editing. It was the first way to edit moving pictures, and conceptually it forms the basis of all video editing. Traditionally, film is edited by cutting sections of the film and rearranging or discarding them. The process is very straightforward and mechanical. In theory a film could be edited with a pair of scissors and some splicing tape, although in reality a splicing machine is the only practical solution. A splicing machine allows film footage to be lined up and held in place while it is cut or spliced together.

A new shot instinctively raises questions: what is new, what does it tell, how does it relate to the previous image? Editing moves the scene or story along in both screen and real time. Action is shortened to the key moments in most editing, so cutting a journey can reduce years to seconds, while a chase can expand seconds to minutes. Flashbacks and flash-forwards are a rare exception to the usually chronological (and sometimes parallel) passage of film time. Some shots appear timeless, beyond any local context of time. As ever, playing with time should be motivated.

Cuts in a scene normally start wide and move closer; the reason is to establish a spatial sense of the scene's context, since we instinctively ask these questions when confronted with a new space. Cuts should also respect this spatial sense, and the 180-degree rule of dialogue editing is part of this respect. Of course, you can break the spatial sense to create confusion or ambiguity, and you can reveal a space progressively, rather than establishing it initially, to build suspense.

The editor is the real storyteller in video; everything is subject to the editor's judicious hand. Editing is about the who, with whom, what, where, why, when and how that we are constantly compelled to ask, and good editing reveals answers to these questions in interesting, thoughtful and deliberate ways. Your questions as an editor (which shot to reveal first, how long to hold a shot, etc.) should be answered by considering the process of storytelling and the experience of the viewer. Normally, for instance, every scene has a principal character whose point of view the audience will feel dominates; the editing should support this. Simple examples: Close-up - who is the story about? Long shot (establishing shot) - where is the story taking place? Extreme close-up and sound - why is the character behaving this way?

Editor's cut - There are several editing stages, and the editor's cut is the first. The editor continues to refine the cut while shooting continues, and on some films the editing process carries on throughout production.

Director's cut - When shooting is finished, the director can turn his full attention to collaborating with the editor and further refining the cut of the film. Scenes and shots are re-ordered, removed, shortened and otherwise tweaked according to the director's vision.
Final cut - Often, after the director has had his chance to oversee a cut, the subsequent cuts are supervised by one or more producers.

The basic idea is that a cut should provide some new explication or development of the story, no matter how abstract that may be. It shouldn't be there merely to hide bad filming or performance. The timing of a cut is the means by which you can ensure the smooth movement of the point of attention between one scene and another. For example, if the point of attention moves offscreen, or becomes undefined, that usually means it is a good time for a cut! Cinema has a language that most films adhere to; this language helps define which shots are required for a scene, and can guide their arrangement. For example, the typical shots required for editing dialogue are: an establishing shot; the main dialogue interaction (over-the-shoulder shots of the characters and direct facial shots of each character); and incidental cutaways.

Dissolve - A transition between two shots during which the first image gradually disappears while the second image gradually appears.

Wipe - A transition between shots in which a line passes across the screen, eliminating the first shot as it goes and replacing it with the next one.

Iris - A round, moving mask that can close down to end a scene (iris-out) or emphasize a detail, or open to begin a scene (iris-in) or reveal more space around a detail.

Fade In/Out - A cinematographic technique causing the picture to darken and disappear (fade-out) or the reverse (fade-in).
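Digitally, a dissolve is just a per-pixel weighted average of the outgoing and incoming shots, and a fade is the same operation against a black frame. A minimal sketch in Python (the function name and the list-of-tuples frame representation are assumptions for illustration, not any real editing API):

```python
def dissolve(frame_a, frame_b, t):
    """Blend two frames: t=0.0 shows only frame_a, t=1.0 only frame_b.
    A fade-out is simply a dissolve toward an all-black frame.
    Frames are equal-length lists of (r, g, b) tuples."""
    return [
        tuple(round(a * (1 - t) + b * t) for a, b in zip(pa, pb))
        for pa, pb in zip(frame_a, frame_b)
    ]

# Two-pixel "frames": halfway through, black has dissolved to mid-grey.
black = [(0, 0, 0)] * 2
white = [(255, 255, 255)] * 2
halfway = dissolve(black, white, 0.5)
```

Stepping t from 0.0 to 1.0 over, say, 25 frames would produce a one-second dissolve at 25 fps.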

Establishing shot - A shot, usually involving a distant framing, that shows the spatial relations among the important figures, objects, and setting in a scene. Usually, the first few shots in a scene are establishing shots, as they introduce us to a location and the spatial relationships inside it.

Shot/reverse shot - Two or more shots edited together that alternate characters, typically in a conversation situation. In continuity editing, characters in one framing usually look left and in the other framing right. Over-the-shoulder framings are common in shot/reverse-shot editing. Shot/reverse shots are one of the most firmly established conventions in cinema, and they are usually linked through the equally persuasive eyeline matches.

Parallel editing (crosscutting) - Editing that alternates shots of two or more lines of action occurring in different places, usually simultaneously. The two actions are thereby linked, associating the characters from both lines of action. Parallel editing across space and time can, for example, suggest that history repeats itself, generation after generation.

Cut-in / cut-away - An instantaneous shift from a distant framing to a closer view of some portion of the same space, and vice versa. In Lars von Trier's Dancer in the Dark (Denmark, 2000), Selma and Bill have a dramatic conversation in Bill's car that is framed by a cut-in and a cut-away.

Jump cut - An elliptical cut that appears to be an interruption of a single shot. Either the figures seem to change instantly against a constant background, or the background changes instantly while the figures remain constant. Jump cuts are used expressively, to suggest the ruminations or ambivalences of a character or of his/her everyday life, but they are also a clear signifier of rupture with mainstream film storytelling.

Superimposition - The exposure of more than one image on the same film strip. Unlike a dissolve, a superimposition does not signify a transition from one scene to another. The technique was often used to express subjective or intoxicated vision, or simply to introduce a narrative element from another part of the diegetic world into the scene. Editing matches refer to those techniques that join, as well as divide, two shots by making some form of connection between them. That connection can be inferred from the situation portrayed in the scene (for example, the eyeline match) or can be of a purely optical nature (the graphic match).

Eyeline match - A cut obeying the axis-of-action principle, in which the first shot shows a person looking off in one direction and the second shows a nearby space containing what he or she sees. If the person looks left, the following shot should imply that the looker is offscreen right. Eyeline matches can be a very persuasive tool for constructing space in a film, real or imagined.

Graphic match - Two successive shots joined so as to create a strong similarity of compositional elements (e.g., color, shape). Used in transparent continuity styles to smooth the transition between two shots. Graphic matches can also be used to make metaphorical associations, as in the Soviet Montage style.

Match on action - A cut which splices two different views of the same action together at the same moment in the movement, making it seem to continue uninterrupted. A match on action adds variety and dynamism to a scene, since it conveys two movements: the one that actually takes place on screen, and one implied for the viewer, whose position is shifted.

The decision to extend a shot can be as significant as the decision to cut it. Editing can affect the experience of time in the cinema by creating a gap between screen time and diegetic time (Montage and overlapping editing) or by establishing a fast or slow rhythm for the scene.

Long take - A shot that continues for an unusually lengthy time before the transition to the next shot. The average length per shot differs greatly across times and places, but most contemporary films tend to have faster editing rates. As a general guideline, any shot above one minute can be considered a long take. Aside from the challenge of shooting in real time, long takes decisively influence a film's rhythm.

Overlapping editing - Cuts that repeat part or all of an action, thus expanding its viewing time and plot duration. Most commonly associated with experimental filmmaking, due to its temporally disconcerting and purely graphic nature, it is also featured in films in which action and movement take precedence over plot and dialogue: sports documentaries, musicals, and martial arts films.

Rhythm - The perceived rate and regularity of sounds, series of shots, and movements within the shots. Rhythmic factors include beat (or pulse), accent (or stress), and tempo (or pace). Rhythm is one of the essential features of a film, for it decisively contributes to its mood and overall impression on the spectator. It is also one of the most complex to analyze, since it is achieved through the combination of mise-en-scene, cinematography, sound and editing. Indeed, rhythm can be understood as the final balance of all the elements of a film.

What are special effects?


Special effects (or SFX) are used in the film and entertainment industry to create effects that cannot be achieved by normal means, such as travel to other star systems. They are also used when creating an effect by normal means would be prohibitively expensive, such as an enormous explosion, and to enhance normal visual effects. Many different visual special effects techniques exist, ranging from traditional theatre effects, through classic film techniques invented in the early 20th century, to modern computer graphics (CGI). Often several different techniques are used together in a single scene or shot to achieve the desired effect. Special effects are often "invisible": that is, the audience is unaware that what they are seeing is a special effect. This is often the case in historical movies, where the architecture and other surroundings of previous eras are created using special effects.

Special effects animation


Also known as effects animation, special effects animation is a specialization of the traditional animation and computer animation processes. Anything that moves in an animated film and is not a character (handled by character animators) is considered a special effect, and is left to the special effects animators to create. Effects animation tasks can include animating cars, trains, rain, snow, fire, magic, shadows, or other non-character entities, objects, and phenomena. Classic cases are the lightsabers and laser bolts in the original Star Wars, or the Monster from the Id in Forbidden Planet, both of which were created by rotoscoping. Sometimes, special processes are used to produce effects animation instead of drawing or rendering. Rain, for example, has been created in Walt Disney Feature Animation films since the late 1930s by filming slow-motion footage of water in front of a black background, with the resulting film superimposed over the animation.

Among the most notable effects animators in history are A.C. Gamer from Termite Terrace/Warner Bros.; and Joshua Meador, Cy Young, Mark Dindal, and Randy Fullmer from the Walt Disney animation studio.
Special effects animation is also common in live-action films to create certain images that cannot be traditionally filmed. In that respect, special effects animation is more commonplace than character animation, since special effects of many different types and varieties have been used in film for a century.

TYPES OF SFX

Special effects are traditionally divided into the categories of optical effects and mechanical effects. With the emergence of digital film-making tools a greater distinction between special effects and visual effects has been recognized, with "visual effects" referring to digital post-production and "special effects" referring to on-set mechanical effects and in-camera optical effects. Optical effects (also called photographic effects) are techniques in which images or film frames are created photographically, either "in-camera" using multiple exposure or mattes, or in post-production using an optical printer. An optical effect might be used to place actors or sets against a different background.

Mechanical effects (also called practical or physical effects), are usually accomplished during the live-action shooting. This includes the use of mechanized props, scenery, scale models, animatronics, pyrotechnics and Atmospheric Effects: creating physical wind, rain, fog, snow, clouds etc. Making a car appear to drive by itself, or blowing up a building are examples of mechanical effects. Mechanical effects are often incorporated into set design and makeup. For example, a set may be built with break-away doors or walls to enhance a fight scene, or prosthetic makeup can be used to make an actor look like a monster.

Most common techniques of mechanical effects:

Rain: soak the actors' clothes in water and superimpose rain from a video tape recording, or set up long plastic tubes with tiny holes in the bottom, attached to a water hose at the top of the setting.
Snow: spray commercial snow from aerosol cans onto a piece of glass covering the lens, or sprinkle plastic snow from above (e.g., F.R.I.E.N.D.S).
Fog: place dry ice in hot water or use a fog machine (as used in villain scenes in 1980s Bollywood films).
Wind: use large electric fans to simulate wind.
Smoke: commercial smoke machines can be used; an older technique was mineral oil poured on a hot plate.
Fire: for fire reflections, staple large strips of silk or nylon cloth to a small batten and project the shadows on the set.
Lightning: place two large photo flash units about 10 ft apart and trigger them one after the other.

Computer Generated Imagery (CGI)


A recent and profound innovation in special effects has been the development of computer-generated imagery, or CGI, which has changed nearly every aspect of motion picture special effects. Digital compositing allows far more control and creative freedom than optical compositing, and does not degrade the image like analogue (optical) processes. Digital imagery has enabled technicians to create detailed models, matte "paintings," and even fully realized characters with the malleability of computer software.

The most spectacular use of CGI has been the creation of photographically realistic images of fantasy creations. Images could be created in a computer using the techniques of animated cartoons or model animation. In 1993, stop-motion animators working on the realistic dinosaurs of Steven Spielberg's Jurassic Park were retrained in the use of computer input devices. By 1995, films such as Toy Story underscored that the distinction between live-action films and animated films was no longer clear. Other landmark examples include a character made up of broken pieces of a stained-glass window in Young Sherlock Holmes, a shape shifting character in Willow, a tentacle of water in The Abyss, the T-1000 Terminator in Terminator 2: Judgment Day, hordes of armies of robots and fantastic creatures in the Star Wars prequel trilogy and The Lord of the Rings trilogy and the planet Pandora in Avatar.

Landmark movies
The Day After Tomorrow (prolonged digital shots, playing with "weather effects")
Independence Day (digital effects combined with small-scale models)
Jurassic Park (large animatronics, creating creatures from scratch)
King Kong (2005) (motion capture)
The Lord of the Rings film trilogy (creation of the Massive software, prosthetic work, digital effects, motion capture)
The Matrix trilogy ("bullet time")
Pirates of the Caribbean: Dead Man's Chest (motion capture from a 2D image)
Star Wars (original practical effects, "destruction" effects, pioneering spaceship models)
Superman (human flight)
Terminator 2: Judgment Day (3D morphing and a 3D human body)
Titanic (model work, computer-generated water, motion capture)
Toy Story (complete computer animation)
Tron (computer animation, virtual sets)

Special effects in Bollywood

Bollywood filmmakers are now taking things a step further. Otherwise, why would a film shot in Philadelphia go under the name of New York and become the biggest hit so far this year? Or one shot in Jaipur go out as Delhi-6, while almost 50% of Karan Johar's genre-changing Dostana, with its Miami-based storyline, was shot in Mumbai's Film City? Though sets re-creating bygone eras with sketched backdrops have existed ever since storytellers started weaving their magic on the silver screen, today modern technology has given an entirely new dimension to the art of filmmaking. While Hollywood has its obvious special effects (VFX) blockbuster franchises like Star Wars, Superman, Spiderman, Batman, The Matrix, The Lord of the Rings or Indiana Jones, Bollywood has only recently gone heavy on VFX with films like Drona and Love Story 2050, though both of them failed to make any impression at the box office.

However, there were others like Dhoom which also used a lot of VFX and broke records. The difference is that all these movies used what movie lovers describe as 'visible special effects', while movies such as New York, Delhi-6, Chak De, Dostana, Ghajini, and Om Shanti Om (OSO) among many others are increasingly using 'invisible' special effects, which a viewer would never know, unless told.

Ra.One, according to industry buzz, could outshine Enthiran in visual effects. The movie is expected to have 3,500 VFX shots, with around 750 people from five countries (India, Canada, France, Thailand and the US) working on them. If the buzz is true, this will be no mean feat. Rajinikanth's Enthiran (Robot was the Hindi version) was a box-office hit, raking in Rs 350 crore in worldwide receipts. The movie dazzled the audience with special effects, including a Matrix-like sequence, but it had only 2,000 VFX shots.

USE OF VFX RECENTLY IN INDIAN CINEMA

GUZAARISH: Hrithik Roshan trying to flick a fly off his nose; the fly was a VFX creation. The grand house that appears was also computer-generated, as were the candle tricks in the film.
ZINDAGI NA MILEGI DOBARA: VFX was used to recreate the skydiving scene and the race between the car and the horse.
ENTHIRAN: The last fighting scene in the movie, which had over 500 shots.
BLUE: The underwater fight sequence; the shark was shot live along with Akshay. However, in one shot of the same scene, Sanjay Dutt puts his hand in the water and a shark emerges, almost biting his hand off. This was achieved with a computer-generated shark.

FILMING LOCATIONS
Types of Filming locations
There are two main types of filming locations:
1. Location shooting: the practice of filming in an actual setting.
2. Studio shooting: filming on either a sound stage or back lot.

Substitute locations
It is common for films to be set in one place, but filmed in another, usually for reasons of economy or convenience, but sometimes because the substitute location looks more historically appropriate.

SUBSTITUTE LOCATION
1. The song New York Nagaram claims to be set in New York but was shot in Europe.

2. In the movie Roja, the pulsating scene where Aravind Swamy is nabbed by the terrorists in front of Madhu Bala's eyes as she walks out of the temple was shot in Manali, Himachal Pradesh, though in the movie it is set in Jammu and Kashmir.

LOCATION SHOOTING
Location shooting is the practice of filming in an actual setting rather than on a sound stage or back lot.


Most films do a bit of both location shooting and studio shooting, although low-budget films usually do more location shooting than bigger-budget films, because shooting at a place that already exists is much cheaper than creating it from scratch. In certain situations it may be cheaper to shoot in a studio, and in those situations lower-budget films often shoot more in a studio. Before filming on location, it's generally wise to conduct a thorough check on all your resources and the possible loopholes.

PROS AND CONS


ADVANTAGES
Location shooting has several advantages over filming on a studio set:
1. It can be cheaper than constructing large sets.
2. The illusion of reality can be stronger: it is hard to replicate real-world wear-and-tear and architectural details.
3. It sometimes allows the use of cheaper non-union labor, or a way to bypass a work stoppage in the US; Canadian locations such as Vancouver and Toronto are known for this.
4. It sometimes allows "frozen" currency to be used. The 1968 movie Kelly's Heroes was filmed in Yugoslavia using profits that had been made on movie exhibitions in that country but could not be exported.

DISADVANTAGES
Its disadvantages include:
1. A lack of control over the environment: lighting, passing aircraft, traffic, pedestrians, bad weather, city regulations, etc.
2. Finding a real-world location which exactly matches the requirements of the script.
3. Members of the audience may be familiar with a real-world location used to double as a fictional location (such as Rumble in the Bronx inexplicably showing the mountains outside Vancouver in the background of an urban Bronx-set scene).
4. Taking a whole film crew to film on location can be extremely expensive.

SETS
Set construction is the process by which a construction manager undertakes to build full-scale scenery suitable for viewing by camera, as specified by a production designer or art director working in collaboration with the director, to create a set for a theatrical, film or television production. The set designer produces a scale model, scale drawings, paint elevations (a scale painting supplied to the scenic painter of each element that requires painting), and research about props, textures, and so on. Scale drawings typically include a ground plan, elevation, and section of the complete set, as well as more detailed drawings of individual scenic elements which, in theatrical productions, may be static, flown, or built onto scenery wagons. Models and paint elevations are frequently hand-produced.

SETS
Construction of a film set is mainly done on studio stages or back lots, often within a studio complex, and several studio stages may be allocated purely as workshop space during the construction process. Many disciplines are employed under construction managers, but craftsmen tend not to multi-task, so there is a range of job titles: carpenter, rigger, plasterer, stage hand, poly waller, scenic painter, standby painter and standby carpenter are among them. A prop-making workshop is set up on a similar stage and may be paid for out of a construction or art department budget, depending on the nature and size of the props in question. Studio complexes tend to have support services such as drape shops, general stores, timber stores and plaster shops, as well as special effects companies, on site to support construction and other departments.

THOTTA THARANI SETS
Thotta Tharani's sets range from the realistic, such as the sets of the movie Nayagan, to the fictional.

BLUE MATTE OR CHROMA KEY


Compositing is the combining of visual elements from separate sources into single images, often to create the illusion that all those elements are parts of the same scene. Live-action shooting for compositing is variously called "chroma key", "blue screen", "green screen" and other names. All compositing involves the replacement of selected parts of an image with other material, usually, but not always, from another image. In the digital method of compositing, software commands designate a narrowly defined color as the part of an image to be replaced. Then every pixel within the designated color range is replaced by the software with a pixel from another image, aligned to appear as part of the original. For example, a TV weather person is recorded in front of a plain blue or green screen, while compositing software replaces only the designated blue or green color with weather maps.
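The per-pixel replacement just described can be sketched in a few lines of Python. This is a toy illustration of the principle (the function name, key colour, and per-channel tolerance are assumptions made for the example; production keyers work in more sophisticated colour spaces and produce soft edges):

```python
def chroma_key(foreground, background, key=(0, 255, 0), tolerance=60):
    """Composite two equal-length lists of (r, g, b) pixels: every
    foreground pixel within `tolerance` of the key colour on all three
    channels is replaced by the corresponding background pixel."""
    def is_key(pixel):
        return all(abs(c - k) <= tolerance for c, k in zip(pixel, key))
    return [bg if is_key(fg) else fg for fg, bg in zip(foreground, background)]

fg = [(10, 250, 20), (200, 140, 120)]   # a green-screen pixel, then a skin tone
bg = [(60, 60, 180), (60, 60, 180)]     # the weather map behind the presenter
composite = chroma_key(fg, bg)          # keeps the skin tone, swaps the green
```

Narrowing the tolerance narrows the replaced colour range, which is exactly why even lighting of the screen matters, as discussed below.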

The chroma keying technique is commonly used in video production and post-production. It is also referred to as color keying, colour-separation overlay (CSO), or by various terms for specific color-related variants such as green screen and blue screen. Chroma keying can be done with backgrounds of any color that are uniform and distinct, but green and blue backgrounds are more commonly used because they differ most distinctly in hue from most human skin colors. No part of the subject being filmed or photographed may duplicate a color used in the background.

The most important factor for a key is the color separation of the foreground (the subject) and background (the screen): a bluescreen will be used if the subject is predominantly green (for example, plants), despite the camera being more sensitive to green light.

BLUE MATTE OR CHROMA KEY - For Locations
Examples of background compositing: 300, Alice in Wonderland, The Lord of the Rings.

MAJOR FACTORS IN CHROMA KEYING


Clothing
A chroma key subject must avoid wearing clothes which are similar in color to the chroma key color(s) (unless intentional), because the clothing may be replaced with the background video. An example of intentional use is when an actor wears a blue covering over part of his body to make it invisible in the final shot. This technique can be used to achieve an effect similar to that used in the Harry Potter films to create an invisibility cloak. The actor can also be filmed against a chroma-key background and inserted into the background shot with a distortion effect, in order to create a cloak that is only marginally detectable.

Even lighting
The biggest challenge when setting up a bluescreen or greenscreen is even lighting and the avoidance of shadow, because it is best to have as narrow a color range as possible being replaced. A shadow would present itself as a darker color to the camera and might not register for replacement. This can sometimes be seen in low-budget or live broadcasts where the errors cannot be manually repaired. The material being used affects the quality and ease of having it evenly lit. Shiny materials will be far less successful than matte ones: a shiny surface will have areas that reflect the lights, making them appear pale, while other areas may be darkened. A matte surface will diffuse the reflected light and give a more even color range.


Background
Blue is generally used for both weather maps and special effects because it is complementary to human skin tone. The use of blue is also tied to the fact that the blue emulsion layer of film has the finest crystals, and thus good detail and minimal grain. However, green has become the favored color because digital cameras retain more detail in the green channel and it requires less light than blue. Green not only has a higher luminance value than blue, but in early digital formats the green channel was also sampled twice as often as the blue, making it easier to work with.

The choice of color is up to the effects artists and the needs of the specific shot. In the past decade, the use of green has become dominant in film special effects. The green background is also favored over blue for outdoor filming, where the blue sky might appear in the frame and could accidentally be replaced in the process.


In order to get the cleanest key when shooting greenscreen, it is necessary to create a value difference between the subject and the greenscreen. To differentiate the subject from the screen, a two-stop difference can be used, either by making the greenscreen two stops brighter than the subject, or vice versa.

Sometimes a shadow can be used to create a special effect. Areas of the bluescreen or greenscreen with a shadow on them can be replaced with a darker version of the desired background video image, making it look like the person casting the shadow is actually casting a shadow on the background image instead.

Another challenge for bluescreen or greenscreen is proper camera exposure. Underexposing or overexposing a colored backdrop can lead to poor saturation levels. In the case of video and digital-cinema cameras, underexposed images can also contain high amounts of noise. The background must be bright enough to allow the camera to create a bright and saturated image.

CHROMA KEYING TECHNIQUES


Chroma Key Suits
Chroma key screen suits allow a person to be practically invisible against a green screen. These suits are usually close-fitting, to eliminate wrinkles and shadows, and are especially useful for filming special effects or puppet shows. All someone needs is a little imagination to use these suits.

USING CHROMA KEY SUITS


1. The famous Prabhu Deva dance sequence in the song Mukkaalaa Mukkabalaa was done using chroma key suits.

2. The movie Hollow Man was one of the first films in which chroma key suits and techniques were incorporated.

SOUND MIXING

What is Audio Mixing?


Audio mixing is the process by which multiple recorded sounds are combined into one or more channels, most commonly 2-channel stereo. In the process, the source signals' level, frequency content, dynamics, and panoramic position are manipulated and effects such as reverb may be added. Sound editing, design, and mixing comprise a series of activities that are geared toward polishing the audio of your program to enhance the final presentation. Never underestimate the power of a good mix.

Mixers
Mixers offer three main functionalities:

1. Mixing: summing signals together, normally done by a dedicated summing amplifier or, in the digital domain, by a simple algorithm.

2. Routing: allows the routing of source signals to internal buses or to external processing units and effects.

3. Processing: many mixers also offer on-board processors, such as equalizers and compressors.

Mixers also commonly provide effects. Reverbs are used to simulate the boundary reflections created in a real room, adding a sense of space and depth to otherwise 'dry' recordings. Delays are most commonly used to add distinct echoes as a creative effect.
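The summing stage described above can be sketched in a few lines. This is a simplified illustration, not any particular mixer's implementation; the `mix` function, its per-track gains, and its hard clipping to [-1, 1] are assumptions made for the example:

```python
def mix(tracks, gains):
    """Sum several mono tracks (lists of float samples in [-1, 1]),
    each scaled by its own gain, then clip the result to [-1, 1].
    A minimal sketch of the summing stage a mixer performs."""
    length = max(len(t) for t in tracks)
    out = []
    for i in range(length):
        # Sum every track that still has a sample at this position.
        s = sum(g * t[i] for t, g in zip(tracks, gains) if i < len(t))
        # Hard-clip the sum so it stays in the legal sample range.
        out.append(max(-1.0, min(1.0, s)))
    return out

mixed = mix([[0.5, 0.5], [0.25, -0.25]], gains=[1.0, 1.0])
# mixed == [0.75, 0.25]
```

Real mixers avoid hard clipping by leaving headroom or applying limiting, but the core operation is exactly this weighted sum.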

Mixing in Surround
Mixing in surround is very similar to mixing in stereo except that there are more speakers, placed to "surround" the listener. The same mixing domains mentioned above are involved, but instead of only stereo's horizontal left-right panorama and its front-back sense of depth, mixing in surround lets the mix engineer pan sources within a genuinely two-dimensional environment. In a surround mix, sounds can appear to originate from any direction.

There are two common ways to approach mixing in surround. Expanded Stereo: With this approach, the mix will still sound very much like an ordinary stereo mix. Most of the sources such as the instruments of a band, the vocals, and so on, will still be panned between the left and right speakers, but lower levels might also be sent to the rear speakers in order to create a wider stereo image, while lead sources such as the main vocal might be sent to the center speaker. Additionally, reverb and delay effects will often be sent to the rear speakers to create a more realistic sense of space.

Complete Surround: All speakers are treated equally. Instead of following the traditional ways of mixing in stereo, this much more liberal approach lets the mix engineer do anything he or she wants. Instruments can appear to originate from anywhere, or even spin around the listener. When done correctly, interesting sonic experiences can be achieved.
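Panning a source between any pair of speakers, whether in stereo or between adjacent surround channels, is commonly done with constant-power gain curves so perceived loudness does not dip as the source moves across the field. A sketch for a single speaker pair; the `pan` helper and its -1..+1 position convention are assumptions for this example:

```python
import math

def pan(sample, position):
    """Constant-power pan of a mono sample between two speakers.

    position: -1.0 = hard left, 0.0 = center, +1.0 = hard right.
    Returns (left, right) sample values; the squared gains always
    sum to 1, so perceived loudness stays constant across the field.
    """
    angle = (position + 1) * math.pi / 4  # map [-1, 1] to [0, pi/2]
    return sample * math.cos(angle), sample * math.sin(angle)

left, right = pan(1.0, 0.0)  # centered: both gains ≈ 0.707
```

A "spinning" source in a complete-surround mix is just this idea applied around the whole speaker ring, with the position animated over time.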

Audio Post-Production
Dialogue editing: Fine-tuning lines spoken by actors and other onscreen speakers, and fixing bad pronunciation, stumbled words, and other slight defects of speech.

Automated dialogue replacement (ADR, looping, or dubbing): The process of completely rerecording lines that were originally recorded in unsalvageable situations.

Voiceover recording: Recording narration pristinely, in such a way as to best capture the qualities of a speaker's voice.

Sound design: Enhancing the original audio with additional sound effects and filters, such as adding car-crash or door-slam sound effects to a scene to replace sound that was too difficult or unimpressive to record cleanly in the field.

Foley recording and editing: Recording and editing custom sound effects that are tightly synchronized to picture, such as footsteps on different surfaces, clothes rustling, fight sounds, and the handling of various noisy objects.

Music editing: Whether you're using prerecorded tracks or custom-composed music, the audio needs to be edited into and synchronized with events in your program, which is the music editor's job.

Mixing: Finely adjusting the levels, stereo (or surround) panning, equalization, and dynamics of all the tracks in a program to keep the audience's attention on important audio cues and dialogue, and to make the other sound effects, ambience, and music tracks blend together into a seamless and harmonious whole.
