Editing is the process of selecting and preparing material through the processes of correction, condensation, organization, and other modifications, performed with the intention of producing a correct, consistent, accurate, and complete work. The editing process often begins with the author's idea for the work itself, continuing as a collaboration between the author and the editor as the work is created. As such, editing is a practice that combines creative skills, human relations, and a precise set of methods.
Linear video editing is a post-production process of selecting, arranging, and modifying images and sound in a predetermined, ordered sequence. Regardless of whether the content was captured by a video camera or tapeless camcorder, or recorded in a television studio on a video tape recorder (VTR), it must be accessed sequentially.
Non-linear editing is a method that allows you to access any frame in a digital video clip regardless of its position in the clip. This freedom to access any frame, and to use a cut-and-paste method similar to cutting and pasting text in a word processor, also allows you to easily include fades, transitions, and other effects that cannot be achieved with linear editing.
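The cut-and-paste freedom of non-linear editing can be sketched in a few lines of Python. The clip, frame names, and edit below are hypothetical: frames are modeled as simple list items, whereas real editors work with references to frames on disk.

```python
# Non-linear editing sketch: a clip is a sequence of frames, so any
# frame can be accessed directly and rearranged with cut-and-paste
# operations, much like text in a word processor.

def cut(clip, start, end):
    """Remove frames [start, end) and return (remaining, cut_section)."""
    return clip[:start] + clip[end:], clip[start:end]

def paste(clip, section, at):
    """Insert a cut section at an arbitrary point in the clip."""
    return clip[:at] + section + clip[at:]

clip = [f"frame{i}" for i in range(10)]

# Cut frames 2-4 and paste them at the end: no sequential playback needed.
remaining, section = cut(clip, 2, 5)
edited = paste(remaining, section, len(remaining))

print(edited)  # the frames play back in the new order
```

Linear (tape-based) editing cannot do this: reaching frame 8 means winding past frames 0 through 7 first, which is exactly the limitation random access removes.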
3. Insert Editing: New material is recorded over existing footage. This technique can be used during the original shooting process or during a later editing process. Since the inserted footage is placed over the unwanted footage, some of the original footage is erased.
Technically this isn't video editing, it's film editing. It was the first way to edit moving pictures and conceptually it forms the basis of all video editing. Traditionally, film is edited by cutting sections of the film and rearranging or discarding them. The process is very straightforward and mechanical. In theory a film could be edited with a pair of scissors and some splicing tape, although in reality a splicing machine is the only practical solution. A splicing machine allows film footage to be lined up and held in place while it is cut or spliced together.
A new shot instinctively asks what is new, what it tells, and how it relates to the previous image. Editing moves the scene or story along in both screen and real time. Action is shortened to the key moments in most editing, so cutting a journey can reduce years to seconds, while a chase can expand seconds to minutes. Flashbacks and flash-forwards are a rare exception to the usually chronological (and sometimes parallel) passage of film time. Some shots appear timeless, beyond any local context of time. As ever, playing with time should be motivated.
Cuts in a scene normally start wide and move closer; the reason is to establish a spatial sense of the scene's context, since we instinctively ask these questions when confronted with a new space. Cuts should also respect this spatial sense, and the 180-degree rule of dialogue editing is part of this respect. Of course, you can break the spatial sense to create confusion or ambiguity, and you can reveal a space progressively, rather than establishing it initially, to build suspense.
The editor is the real storyteller in video; everything is subject to the editor's judicious hand. Editing is about the who, with whom, what, where, why, when and how that we are constantly compelled to ask, and good editing reveals answers to these questions in interesting, thoughtful and deliberate ways. Your questions as an editor (which shot to reveal first, how long to hold a shot, etc.) should be answered by considering the process of storytelling and the experience of the viewer. Normally, for instance, every scene has a principal character that the audience will feel dominates the point of view; the editing should support this. Simple examples: Close-up - who is the story about? Long shot (establishing shot) - where is the story taking place? Extreme close-up and sound - why is the character behaving this way?
Editor's cut - There are several editing stages and the editor's cut is the first. The editor continues to refine the cut while shooting continues, and the editing process often carries on throughout production, depending on the film.
Director's cut - When shooting is finished, the director can then turn his full attention to collaborating with the editor and further refining the cut of the film. Scenes and shots are re-ordered, removed, shortened and otherwise tweaked according to the director's vision.
Final cut - Often after the director has had his chance to oversee a cut, the subsequent cuts are supervised by one or more producers.
The basic idea is that a cut should provide some new explication or development of the story, no matter how abstract that may be. It shouldn't be there merely to hide bad filming or performance. The timing of a cut is the means by which you can ensure the smooth movement of the point of attention between one scene and another. For example, if the point of attention moves offscreen, or becomes undefined, that usually means it is a good time for a cut. Cinema has a language that most films adhere to; this language helps define which shots are required for a scene and can guide their arrangement. For example, the typical shots required for editing dialogue are: an establishing shot; the main dialogue interaction (over-the-shoulder shots of the characters and direct facial shots of each character); and incidental cutaways.
Iris - A round, moving mask that can close down to end a scene (iris-out) or emphasize a detail, or it can open to begin a scene (iris-in) or to reveal more space around a detail.
Fade In/Out - A cinematographic technique causing the picture to darken and disappear (fade-out) or the reverse (fade-in).
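A fade-out is, at its simplest, a progressive scaling of brightness toward black over a run of frames. A minimal sketch under the assumption that each frame is just a flat list of luminance values; the `fade_out` helper and the six-frame clip are illustrative, not part of a real video pipeline:

```python
def fade_out(frames, duration):
    """Fade the last `duration` frames linearly to black.

    Each frame is a flat list of pixel luminance values (0-255);
    a fade-out simply scales brightness toward zero, frame by frame.
    """
    total = len(frames)
    faded = []
    for i, frame in enumerate(frames):
        remaining = total - i                     # frames left, including this one
        if remaining <= duration:
            factor = (remaining - 1) / duration   # reaches 0.0 on the final frame
            frame = [int(p * factor) for p in frame]
        faded.append(frame)
    return faded

clip = [[200, 180, 160] for _ in range(6)]  # six identical bright frames
out = fade_out(clip, duration=3)
print(out[-1])  # the final frame is fully black
```

A fade-in is the same ramp run in reverse, and a dissolve is a fade-out of one shot summed with a simultaneous fade-in of the next.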
A shot, usually involving a distant framing, that shows the spatial relations among the important figures, objects, and setting in a scene. Usually, the first few shots in a scene are establishing shots, as they introduce us to a location and the spatial relationships inside it.
Two or more shots edited together that alternate characters, typically in a conversation situation. In continuity editing, characters in one framing usually look left, in the other framing, right. Over-the-shoulder framings are common in shot/reverse-shot editing. Shot / reverse shots are one of the most firmly established conventions in cinema, and they are usually linked through the equally persuasive eyeline matches.
Editing that alternates shots of two or more lines of action occurring in different places, usually simultaneously. The two actions are therefore linked, associating the characters from both lines of action. Parallel editing across space and time can for example suggest that history repeats itself, generation after generation
An instantaneous shift from a distant framing to a closer view of some portion of the same space, and vice versa. In Lars von Trier's Dancer in the Dark (Denmark, 2000), Selma and Bill have a dramatic conversation in Bill's car that is framed by a cut-in and a cut-away.
An elliptical cut that appears to be an interruption of a single shot. Either the figures seem to change instantly against a constant background, or the background changes instantly while the figures remain constant. Jump cuts are used expressively, to suggest the ruminations or ambivalences of a character, or of his/her everyday life, but they are also a clear signifier of rupture with mainstream film storytelling.
The exposure of more than one image on the same film strip. Unlike a dissolve, a superimposition does not signify a transition from one scene to another. The technique was often used to express subjective or intoxicated vision, or simply to introduce a narrative element from another part of the diegetic world into the scene. Editing matches refer to those techniques that join as well as divide two shots by making some form of connection between them. That connection can be inferred from the situation portrayed in the scene (for example, eyeline match) or can be of a purely optical nature (graphic match).
A cut obeying the axis of action principle, in which the first shot shows a person looking off in one direction and the second shows a nearby space containing what he or she sees. If the person looks left, the following shot should imply that the looker is offscreen right. Eyeline matches can be a very persuasive tool to construct space in a film, real or imagined.
Two successive shots joined so as to create a strong similarity of compositional elements (e.g., color, shape). Used in transparent continuity styles to smooth the transition between two shots. Graphic matches can also be used to make metaphorical associations, as in the Soviet Montage style.
A cut which splices two different views of the same action together at the same moment in the movement, making it seem to continue uninterrupted. A match on action adds variety and dynamism to a scene, since it conveys two movements: the one that actually takes place on screen, and an implied one by the viewer, since her/his position is shifted.
The decision to extend a shot can be as significant as the decision to cut it. Editing can affect the experience of time in the cinema by creating a gap between screen time and diegetic time (Montage and overlapping editing) or by establishing a fast or slow rhythm for the scene.
A shot that continues for an unusually lengthy time before the transition to the next shot. The average length per shot differs greatly across different times and places, but most contemporary films tend to have faster editing rates. As a general rule, any shot above one minute can be considered a long take. Aside from the challenge of shooting in real time, long takes decisively influence a film's rhythm.
Cuts that repeat part or all of an action, thus expanding its viewing time and plot duration. Most commonly associated with experimental filmmaking, due to its temporally disconcerting and purely graphic nature, it is also featured in films in which action and movement take precedence over plot and dialogue: sports documentaries, musicals, and martial arts films.
The perceived rate and regularity of sounds, series of shots, and movements within the shots. Rhythmic factors include beat (or pulse), accent (or stress), and tempo (or pace). Rhythm is one of the essential features of a film, for it decisively contributes to its mood and overall impression on the spectator. It is also one of the most complex to analyze, since it is achieved through the combination of mise-en-scene, cinematography, sound and editing. Indeed, rhythm can be understood as the final balance of all the elements of a film.
Among the most notable effects animators in history are A.C. Gamer from Termite Terrace/Warner Bros.; and Joshua Meador, Cy Young, Mark Dindal, and Randy Fullmer from the Walt Disney animation studio.
Special effects animation is also common in live-action films to create certain images that cannot be traditionally filmed. In that respect, special effects animation is more commonplace than character animation, since special effects of many different types and varieties have been used in film for a century.
TYPES OF SFX
Special effects are traditionally divided into two categories: optical effects and mechanical effects. With the emergence of digital film-making tools a greater distinction between special effects and visual effects has been recognized, with "visual effects" referring to digital post-production and "special effects" referring to on-set mechanical effects and in-camera optical effects. Optical effects (also called photographic effects) are techniques in which images or film frames are created photographically, either "in-camera" using multiple exposure or mattes, or in post-production processes using an optical printer. An optical effect might be used to place actors or sets against a different background.
Mechanical effects (also called practical or physical effects) are usually accomplished during the live-action shooting. This includes the use of mechanized props, scenery, scale models, animatronics, pyrotechnics and atmospheric effects: creating physical wind, rain, fog, snow, clouds, etc. Making a car appear to drive by itself, or blowing up a building, are examples of mechanical effects. Mechanical effects are often incorporated into set design and makeup. For example, a set may be built with break-away doors or walls to enhance a fight scene, or prosthetic makeup can be used to make an actor look like a monster.
Rain: soak the actors' clothes in water and superimpose rain from a video tape recording, or set up long plastic tubes with tiny holes in the bottom, attached to a water hose at the top of the set.
Snow: spray commercial snow from aerosol cans onto a piece of glass covering the lens, or sprinkle plastic snow from above (e.g., F.R.I.E.N.D.S.).
Fog: place dry ice in hot water or use a fog machine (used in 1980s Bollywood films in villain scenes).
Wind: use large electric fans to simulate wind.
Smoke: commercial smoke machines can be used; an older technique was mineral oil poured on a hot plate.
Fire: for fire reflections, staple large strips of silk or nylon cloth to a small batten and project the shadows onto the set.
Lightning: place two large photo flash units about 10 ft apart and trigger them one after the other.
The most spectacular use of CGI has been the creation of photographically realistic images of fantasy creations. Images could be created in a computer using the techniques of animated cartoons or model animation. In 1993, stop-motion animators working on the realistic dinosaurs of Steven Spielberg's Jurassic Park were retrained in the use of computer input devices. By 1995, films such as Toy Story underscored that the distinction between live-action films and animated films was no longer clear. Other landmark examples include a character made up of broken pieces of a stained-glass window in Young Sherlock Holmes, a shape shifting character in Willow, a tentacle of water in The Abyss, the T-1000 Terminator in Terminator 2: Judgment Day, hordes of armies of robots and fantastic creatures in the Star Wars prequel trilogy and The Lord of the Rings trilogy and the planet Pandora in Avatar.
Landmark movies
The Day After Tomorrow (prolonged digital shots, playing with "weather effects")
Independence Day (digital effects combined with small-scale models)
Jurassic Park (large animatronics, creating creatures from scratch)
King Kong (2005) (motion capture)
The Lord of the Rings film trilogy (creation of the Massive crowd-simulation software, prosthetic work, digital effects, motion capture)
The Matrix trilogy (bullet time)
Pirates of the Caribbean: Dead Man's Chest (motion capture from a 2D image)
Star Wars (creation of original practical effects, "destruction" effects, pioneer in spaceship models)
Superman (human flight)
Terminator 2: Judgment Day (3-dimensional morphing and 3D human body)
Titanic (model work, computer-generated water, motion capture)
Toy Story (complete computer animation)
Tron (computer animation, virtual sets)
Bollywood filmmakers are now taking it a step further. Otherwise, why would a film shot in Philadelphia go under the name of New York and become the biggest hit so far this year? Or one shot in Jaipur go as Delhi-6, while almost 50% of Karan Johar's genre-changing Dostana, with its Miami-based storyline, was shot in Mumbai's Film City? Though sets re-creating bygone eras with sketched backdrops have existed ever since story-tellers started weaving their magic on the silver screen, modern technology has given an entirely new dimension to the art of filmmaking. While there are the obvious special effects (VFX) blockbuster franchises in Hollywood like Star Wars, Superman, Spiderman, Batman, The Matrix, The Lord of the Rings or Indiana Jones, Bollywood has only recently gone heavy on VFX with films like Drona and Love Story 2050, though both failed to make any impression at the box office.
However, there were others like Dhoom which also used a lot of VFX and broke records. The difference is that all these movies used what movie lovers describe as 'visible special effects', while movies such as New York, Delhi-6, Chak De, Dostana, Ghajini, and Om Shanti Om (OSO) among many others are increasingly using 'invisible' special effects, which a viewer would never know, unless told.
Ra.One, according to industry buzz, could outshine Enthiran in visual effects. The movie is expected to have 3,500 VFX shots, with around 750 people from five countries (India, Canada, France, Thailand and the US) working on them. If the buzz is true, this will be no mean feat. Rajnikanth's Enthiran (Robot was the Hindi version) was a box-office hit, raking in Rs 350 crore in worldwide receipts. The movie dazzled the audience with special effects, including a Matrix-like sequence, but it had only 2,000 VFX shots.
GUZAARISH: Hrithik Roshan tries to flick a fly off his nose; the fly was a VFX creation. The grand house that appears was also computer-generated, as were the candle tricks in the film.
ZINDAGI NA MILEGI DOBARA: VFX was used to recreate the sky-diving scene and the race between the car and the horse.
ENTHIRAN: The last fighting scene in the movie, which had over 500 shots.
BLUE: The underwater fight sequence; the shark was shot live along with Akshay. However, in one shot of the same scene, Sanjay Dutt puts his hand in the water and a shark emerges, almost biting his hand off. This was achieved with a computer-generated shark.
FILMING LOCATIONS
Types of Filming locations
There are two main types of filming locations:
1. Location shooting: the practice of filming in an actual setting.
2. Studio shooting: filming on either a sound stage or a back lot.
Substitute locations
It is common for films to be set in one place, but filmed in another, usually for reasons of economy or convenience, but sometimes because the substitute location looks more historically appropriate.
1. The song New York Nagaram claims to be set in New York but was shot in Europe.
2. In the movie Roja, the pulsating scene where Aravind Swamy is nabbed by the terrorists in front of Madhu Bala's eyes as she walks out of the temple was shot in Manali, Himachal Pradesh, though in the movie it is set in Jammu and Kashmir.
LOCATION SHOOTING
Location shooting is the practice of filming in an actual setting rather than on a studio sound stage or back lot.
SETS
Set construction is the process by which a construction manager builds full-scale scenery suitable for viewing by camera, as specified by a production designer or art director working in collaboration with the director to create a set for a theatrical, film or television production. The set designer produces a scale model, scale drawings, paint elevations (a scale painting supplied to the scenic painter of each element that requires painting), and research about props, textures, and so on. Scale drawings typically include a ground plan, elevation, and section of the complete set, as well as more detailed drawings of individual scenic elements which, in theatrical productions, may be static, flown, or built onto scenery wagons. Models and paint elevations are frequently hand-produced.
Construction of a film set is mainly done on studio stages or back lots, often within a studio complex, and several studio stages may be allocated purely as workshop space during the construction process. Many disciplines are employed under construction managers, but craftsmen tend not to multi-task, so there is a range of job titles: carpenter, rigger, plasterer, stage hand, poly waller, scenic painter, standby painter and standby carpenter, among others. A prop-making workshop is set up in a similar stage and may be paid for out of a construction or art department budget, depending on the nature and size of the props in question. Studio complexes tend to have support services such as drape shops, general stores, timber stores and plaster shops, as well as special effects companies, on site to support construction and other departments.
The most important factor for a key is the color separation of the foreground (the subject) and the background (the screen). A bluescreen will be used if the subject is predominantly green (for example, plants), despite the camera being more sensitive to green light.
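The color-separation idea behind keying can be sketched as a per-pixel test: replace a pixel with the background when it sits close enough to the key color. The `chroma_key` helper, the key color, and the threshold below are illustrative assumptions, not a production keyer:

```python
def chroma_key(pixel, key=(0, 255, 0), threshold=150):
    """Return True if the pixel is close enough to the key color to replace.

    A simple Euclidean color-distance test: the better the separation
    between subject and screen colors, the cleaner the resulting matte.
    """
    dist = sum((c - k) ** 2 for c, k in zip(pixel, key)) ** 0.5
    return dist < threshold

def composite(foreground, background, key=(0, 255, 0)):
    """Replace keyed pixels of the foreground with the background."""
    return [bg if chroma_key(fg, key) else fg
            for fg, bg in zip(foreground, background)]

# A frame as a flat list of RGB pixels: one 'subject' pixel
# followed by two greenscreen pixels of slightly different shades.
fg = [(220, 40, 30), (10, 240, 20), (30, 250, 40)]
bg = [(0, 0, 255)] * 3

keyed = composite(fg, bg)
print(keyed)  # subject pixel kept, screen pixels replaced by the background
```

This also shows why even lighting matters: a shadowed screen pixel drifts away from the key color, falls outside the threshold, and survives into the composite as an error.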
Even lighting
The biggest challenge when setting up a bluescreen or greenscreen is even lighting and the avoidance of shadow, because it is best to have as narrow a color range as possible being replaced. A shadow would present itself as a darker color to the camera and might not register for replacement. This can sometimes be seen in low-budget or live broadcasts
where the errors cannot be manually repaired. The material being used affects the quality
and ease of having it evenly lit. Materials which are shiny will be far less successful than those that are not. A shiny surface will have areas that reflect the lights making them appear pale, while other areas may be darkened. A matte surface will diffuse the reflected light and have a more even color range.
Sometimes a shadow can be used to create a special effect. Areas of the bluescreen or greenscreen with a shadow on them can be replaced with a darker version of the desired background video image, making it look like the person casting the shadow is actually casting a shadow on the background image instead.
Another challenge for bluescreen or greenscreen is proper camera exposure. Underexposing or overexposing a colored backdrop can lead to poor saturation levels. In the case of video and digital-cinema cameras, underexposed images can contain high amounts of noise as well. The background must be bright enough to produce a clean, well-saturated key.
Mixers
Mixers offer three main functionalities:
Mixing: summing signals together, which is normally done by a dedicated summing amplifier or, in the digital case, by a simple algorithm.
Routing: allows the routing of source signals to internal buses or external processing units and effects.
Processing: many mixers also offer on-board processors, like equalizers and compressors.
Effects: reverbs are used to simulate the boundary reflections created in a real room, adding a sense of space and depth to otherwise 'dry' recordings; delays are most commonly used to add distinct echoes as a creative effect.
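The "simple algorithm" behind digital mixing is sample-by-sample summation with a per-track gain. A minimal sketch, assuming 16-bit integer samples; the `drums` and `vocals` tracks are hypothetical data:

```python
def mix(tracks, gains):
    """Sum several digital audio tracks sample by sample.

    Applies a per-track gain, then clamps to the 16-bit sample range,
    which is essentially what a digital mixer's summing stage does.
    """
    length = max(len(t) for t in tracks)
    mixed = []
    for i in range(length):
        total = sum(gain * track[i]
                    for track, gain in zip(tracks, gains)
                    if i < len(track))
        # Clamp to the signed 16-bit range to avoid integer wrap-around.
        mixed.append(max(-32768, min(32767, int(total))))
    return mixed

drums  = [1000, -2000, 3000, -4000]
vocals = [500, 500, 500, 500]
print(mix([drums, vocals], gains=[1.0, 0.5]))
```

The clamp stands in for the headroom management a real mixer provides; in practice engineers lower gains rather than let the sum clip.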
Mixing in Surround
Mixing in surround is very similar to mixing in stereo except that there are more speakers, placed to "surround" the listener. The same mixing domains mentioned above are involved, but instead of stereo's horizontal panoramic aspects, and depth's front-back aspects, mixing in surround lets the mix engineer pan sources within a much more two-dimensional environment. In a surround mix, sounds can appear to originate from any direction.
There are two common ways to approach mixing in surround:
Expanded stereo: With this approach, the mix will still sound very much like an ordinary stereo mix. Most of the sources, such as the instruments of a band, the vocals, and so on, will still be panned between the left and right speakers, but lower levels might also be sent to the rear speakers in order to create a wider stereo image, while lead sources such as the main vocal might be sent to the center speaker. Additionally, reverb and delay effects will often be sent to the rear speakers to create a more realistic sense of space.
Complete surround (all speakers treated equally): Instead of following the traditional ways of mixing in stereo, this much more liberal approach lets the mix engineer do anything he or she wants. Instruments can appear to originate from anywhere, or even spin around the listener. When done correctly, interesting sonic experiences can be achieved.
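Panning a source between two speakers is commonly done with a constant-power law, so perceived loudness stays steady as the source moves; the same idea extends from a stereo pair to adjacent speakers in a surround layout. A minimal sketch, with the `pan` helper and its values chosen purely for illustration:

```python
import math

def pan(sample, position):
    """Pan a mono sample across a speaker pair with a constant-power law.

    position runs from 0.0 (hard left) to 1.0 (hard right); the
    cosine/sine gains keep left^2 + right^2 constant, so the source
    does not dip in loudness as it moves between the speakers.
    """
    angle = position * math.pi / 2
    return sample * math.cos(angle), sample * math.sin(angle)

left, right = pan(1.0, 0.5)              # a source panned dead center
print(round(left, 3), round(right, 3))   # both gains equal cos(45 degrees)
```

Spinning a source around the listener, as described above, is just this pan applied successively between each adjacent speaker pair in the surround ring.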
Audio Post-Production
Dialogue editing: Editing dialogue involves fine-tuning lines spoken by actors and other onscreen speakers and fixing bad pronunciation, stumbled words, and other slight defects of speech.
Automated dialogue replacement (ADR, looping, or dubbing): This is the process of completely rerecording lines that were originally recorded in unsalvageable situations.
Voiceover recording: This involves pristinely recording narration in such a way as to best capture the qualities of a speaker's voice.
Sound design: This is the process of enhancing the original audio with additional sound effects and filters, such as adding car-crash or door-slam sound effects to a scene to replace sound that was too difficult or unimpressive to record cleanly in the field.
Foley recording and editing: This is the process of recording and editing custom sound effects that are closely synchronized to picture, such as footsteps on different surfaces, clothes rustling, fight sounds, and the handling of various noisy objects.
Music editing: Whether you're using prerecorded tracks or custom-composed music, the audio needs to be edited into and synchronized to events in your program, which is the music editor's job.
Mixing: This is the process of finely adjusting the levels, stereo (or surround) panning, equalization, and dynamics of all the tracks in a program to keep the audience's attention on important audio cues and dialogue, and to make the other sound effects, ambience, and music tracks blend together in a seamless and harmonious whole.