The DSLR Filmmaker's Handbook: Real-World Production Techniques
Second Edition
Barry Andersson
Photos courtesy of Carrie Vines
But to achieve good results from your project and save yourself time, money, and frustration, you need to understand what all your options are before you dive into a project. Whether it was Walt Disney creating the early Alice Comedies in the 1920s, with cartoons inked over film footage, or Willis O'Brien combining stop-motion miniatures with live footage for King Kong in 1933, the quest to bring the worlds of reality and fantasy together continues to evolve.
With computer technology pushing the envelope more every year, filmmakers are constantly attempting to outdo their predecessors with more realism and fantastic visual effects. Often misrepresented as chroma keying (a process, relegated to a video switcher, that turns off a specific color value in a video channel), the matting, or traveling matte, process uses a sophisticated series of elements that allow you to make complex extractions and composites.
They were developed by Frank Williams, who used a black-backing matting process that he patented. The process required the foreground actor to be evenly lit in front of a black background and then copied to high-contrast films, back and forth, until a clear background and a black silhouette were all that was left on the film. Using a contact print with the silhouette matte film and the intended background footage together, a composite could be created.
This process was used in many silent action films and continued to be used through the 1940s for the series of The Invisible Man features. The Invisible Man (1933) was such a success that several sequels were created in the years following the original; they used the same process, even though more sophisticated techniques had been developed. John P. Fulton of Universal created effects for The Invisible Man that awed audiences for generations as being technologically far ahead of their time.
Walt Disney set out in the 1920s to do a series of cartoons called simply the Alice Comedies. These were short films that used footage of a live actress shot against a white background. Some of the scenes were done frame by frame from a series of stills to get closer interaction between the live actress and the animated characters. His Alice Comedies continued, with various actresses playing the Alice role in these silent films.
Ub Iwerks and Disney parted ways for a time due to a dispute over a third-party contract, and Iwerks ventured out on his own. In 1944, Disney and Iwerks developed new ways of mixing animation and live action in color with the feature film The Three Caballeros. Using the optical printers of the time, this footage was combined with animation cels and color overlays to create some fantastic effects never before seen. Linwood Dunn worked on developing the first commercial production model of the optical printer, which was used by the armed forces during WWII; he won an Academy Technical Award for his design.
In this film, many of the location scenes were performed and shot in front of a rear-projection screen. This process originally used sodium vapor lighting on a set with the actors well in front of it.
Keep in mind that this was well before technology and information were as rapidly and globally accessible as they are today. Iwerks then went on to perfect the camera used to shoot the traveling matte shots for several Disney feature films. Alfred Hitchcock borrowed Iwerks from Disney to supervise the special effects in his production of The Birds. Widmere and others tried using ultraviolet lighting to create mattes, with acceptable but still limited results.
Petro Vlahos knew there was a select band in the color spectrum where blue could be split off with good results. He won an Academy Technical Award for this technology, which made the shooting and compositing of Ben-Hur possible.
Vlahos has won several awards for his technical achievements, including Oscars, Emmys, and more, and has altered the course of motion picture history through his many accomplishments. In 1976, Vlahos founded the Ultimatte Corporation (www.ultimatte.com). Petro was joined by his son Paul Vlahos, and Ultimatte has led the industry in digital keying and compositing. Their technology in live broadcast keying hardware remains the de facto standard today and is also licensed by their largest competitor, Grass Valley, for use in its hardware systems.
Paul has been instrumental in the development of much of the technology at Ultimatte over the years, holds several patents of his own, and has received awards for his accomplishments. Ultimatte has been used extensively in broadcast television production since the late 1970s.
The composited footage was ahead of its time and raised the bar for television production quality. You can find several examples of Cosmos on YouTube. Different ways to light and shoot the screens have been explored, including front projection, rear projection, interior and exterior lighting, film, and hardware and digital compositing; each has made its place in special effects history.
A fixed matte stays constant throughout a sequence, as with basic garbage matting or the traditional hand-painted matte paintings on glass, which the camera shoots through to the static scene behind the glass.

The color film records this portion of the scene as a dark bronze color, since the prism splits off the narrow bandwidth of the yellow sodium vapor light.
There are two film carriers in the camera.
The lens in front captures the image, a prism deflects the sodium light frequency off at an angle onto the black-and-white film, and the color information continues straight on through to the color film. The original scene shows the sodium vapor lighting on the background. The color footage of the actors is captured along with the black-and-white matte footage, which is then processed and inverted to create a void in the background plate.
This matte is then masked to incorporate the animated character with the background plate and the animated characters and rotoscoped shadows. The final result may be several duplications of film later.
You can see why this technique was time-consuming and was eventually abandoned for the newer blue screen process and, eventually, the digital compositing that we take for granted today. The blue or green screen production process is primarily made up of three elements: the foreground subject, the colored screen background, and the target background that the subject is composited into. Instead of a separate film stock processed at the same time as the original footage to create the traveling matte, the matte is generated from the background color on the original film or digital video footage and composited digitally through hardware or a software application, as shown in the diagram in Figure 1.
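The extract-and-composite step described above can be sketched in a few lines of plain Python. This is a deliberately crude illustration, not how production keyers work: real keying software produces soft-edged, spill-suppressed mattes, while this sketch pulls a hard binary matte wherever green strongly dominates red and blue. The function names, threshold, and pixel values are all invented for the example.

```python
def pull_matte(frame, threshold=40):
    """Crude green-screen matte: a pixel counts as 'screen' where its green
    value exceeds the larger of red and blue by more than `threshold`.
    Returns one alpha per pixel: 1.0 = keep foreground, 0.0 = show background."""
    alpha = []
    for (r, g, b) in frame:
        green_dominance = g - max(r, b)
        alpha.append(0.0 if green_dominance >= threshold else 1.0)
    return alpha

def composite(foreground, background, alpha):
    """Layer the foreground over the background plate using the matte."""
    out = []
    for (fg_px, bg_px, a) in zip(foreground, background, alpha):
        out.append(tuple(round(f * a + b * (1 - a)) for f, b in zip(fg_px, bg_px)))
    return out

# Two pixels: one green-screen pixel, one foreground (skin-toned) pixel.
fg = [(20, 200, 30), (180, 120, 90)]
bg = [(0, 0, 255), (0, 0, 255)]      # target background plate
alpha = pull_matte(fg)               # [0.0, 1.0]
result = composite(fg, bg, alpha)    # [(0, 0, 255), (180, 120, 90)]
```

The screen pixel is replaced by the background plate; the foreground pixel passes through unchanged, which is exactly the role the matte plays in the diagram.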
Professional and amateur filmmakers alike can now shoot, extract, and composite scenes with ease, thanks to the technology available for every budget. The only limitation is your imagination and how much time you want to put into the planning, production, and post-production of your project.
What is the best solution for your particular production? Should you use green or blue? Will your production be live or composited in post? Should you use a solid-color background or a reflective screen? Planning ahead and developing your workflow will dictate what your needs are and guide you through your available options. This chapter will help you decide between using a green or blue screen and will explore your options for hardware and software.
Blue Screen Finally Defined!

The original process for blue screen compositing, called the color difference traveling matte composite, utilized a series of steps in layering and exposing individual frames of film to create a composite, as discussed in Chapter 1.
The next logical step for digital compositing production was to simplify and expedite the process of compositing blue screen shots with a combination of hardware and software.
The term blue screen was the industry standard until more video production started taking off in the late 1990s. The green channel in HD digital video has the most samples of the three available channels, so it gives you more data to work with, with the least amount of noise. You can read more about the different camera sensors and digital camcorder technology later in this book. In addition, green is easier to light with tungsten artificial lighting, and it requires less light to illuminate it fully, which costs less in production for lighting and setup.
Combining these three color channels provides an RGB composite and gives a full-color image. When the channels are separated out independently, you can see the image data contained in each channel.
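As a toy illustration of this channel separation (the pixel values here are invented for the example), a frame stored as (R, G, B) triples can be split into three single-channel planes and recombined losslessly:

```python
# A 2x2 image stored as rows of (R, G, B) triples.
img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (128, 128, 128)]]

# Separate the three channels into independent planes.
red   = [[px[0] for px in row] for row in img]
green = [[px[1] for px in row] for row in img]
blue  = [[px[2] for px in row] for row in img]

# Recombining the planes reproduces the original RGB image exactly.
recombined = [[(r, g, b) for r, g, b in zip(rr, gr, br)]
              for rr, gr, br in zip(red, green, blue)]
assert recombined == img
```

Keying software works on exactly these separated planes: the matte is pulled from the channel (green, in HD video) that carries the most data.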
The original footage was shot with an HD camera.
This series of images is only for illustration purposes; the software used to generate a matte works much more comprehensively. A lot of work in matching the foreground color correction to the background composite will be required. A well-lit blue screen background will serve better in this situation.
In compositing, blue spill is less offensive on hair and skin tones, especially in night shots or composites where the scene includes a lot of cool-colored light. At times, a magenta or red screen is used for difficult-to-light night shots with a lot of foliage and cool color lighting on the foreground subjects and objects; in those cases, you need a contrasting color to pull a tight matte.
In high-resolution still photography, an evenly lit blue screen is the preferred background. The many subtleties in the color range give you a cleaner matte in Adobe Photoshop (www.adobe.com). On this stage, Mr. Erland can test for the purest background and lighting combinations while extracting a broad range of foreground colors with various cameras and devices. Erland describes the various color chips and focus targets placed around the frame of the stage with the green screen background, in addition to the blue and red backgrounds used for testing. With all spill completely removed from the front face of the stage frame, as well as an isolated and separately lit Esmeralda mannequin, the background is approximately 15 feet behind, with changeable color backgrounds and filtered lights for complete illumination.
As Jonathan points out, the key to a great composite is all about the lighting and getting clean information from the background into the compositing software or hardware.
Difference Matte vs. Chroma Key

The term chroma key is often used loosely to mean anything that refers to pulling a color-difference matte from film or video footage. Even several manufacturers of matte compositing hardware and software still refer to the process as chroma keying.
Depth of Field

Depth of field changes linearly with f-number and circle of confusion, but it changes in proportion to the square of the distance to the subject and inversely with the square of the focal length. As a result, photos taken at extremely close range have a proportionally much smaller depth of field. Sensor size affects DOF only in that changing the sensor size on a camera requires changing the focal length to get the same picture.
It is the change in focal length that then affects the DOF. Custom-shaped apertures and bright points of light can be used artistically to produce distinctive out-of-focus highlights. For a given subject framing and camera position, the DOF is controlled by the lens aperture diameter, which is usually specified as the f-number (the ratio of lens focal length to aperture diameter).
Reducing the aperture diameter (increasing the f-number) increases the DOF because only the light travelling at shallower angles passes through the aperture. Because the angles are shallow, the light rays stay within the acceptable circle of confusion over a greater distance.
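The effect of stopping down can be checked numerically with the standard thin-lens depth-of-field approximations: hyperfocal distance H = f²/(N·c) + f, near limit s(H − f)/(H + s − 2f), and far limit s(H − f)/(H − s) for a subject at distance s < H. The specific lens and circle-of-confusion values below are illustrative assumptions, not values from the text:

```python
def dof_limits(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Near and far limits of acceptable sharpness (classic thin-lens formulas)."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm   # hyperfocal distance
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    if subject_mm >= h:
        return near, float("inf")                        # far limit reaches infinity
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return near, far

# 50 mm lens, subject at 3 m: stopping down from f/2.8 to f/8 deepens the DOF.
near_28, far_28 = dof_limits(50, 2.8, 3000)
near_8, far_8 = dof_limits(50, 8, 3000)
assert (far_8 - near_8) > (far_28 - near_28)
```

Raising the f-number pushes the near limit closer and the far limit further, exactly as the shallow-angle argument above predicts.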
Aperture settings are adjusted more frequently in still photography, where variations in depth of field are used to produce a variety of special effects.
Focus point is on the first column of blocks.

Otherwise, a point object will produce a blur spot shaped like the aperture, typically and approximately a circle.
When this circular spot is sufficiently small, it is visually indistinguishable from a point and appears to be in focus. The diameter of the largest circle that is indistinguishable from a point is known as the acceptable circle of confusion, or informally, simply as the circle of confusion. Points that produce a blur spot smaller than this acceptable circle of confusion are considered acceptably sharp. The acceptable circle of confusion depends on how the final image will be used.
It is generally accepted to be 0.25 mm for an image viewed from 25 cm away.
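A common rule of thumb (an assumption added here, not stated in the text above) sets the acceptable circle of confusion at the sensor as the format's diagonal divided by 1500, which for a full-frame 36 x 24 mm sensor gives roughly the often-quoted 0.03 mm:

```python
import math

def coc_mm(sensor_w_mm, sensor_h_mm, divisor=1500):
    """Circle of confusion from the 'diagonal / 1500' rule of thumb."""
    return math.hypot(sensor_w_mm, sensor_h_mm) / divisor

full_frame = coc_mm(36, 24)   # about 0.029 mm, usually rounded to 0.03 mm
assert 0.028 < full_frame < 0.030
```

Smaller sensors get a proportionally smaller circle of confusion because the image must be enlarged more to reach the same print size.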
The limit of tolerable error was traditionally set at 0.05 mm (0.002 in) diameter. The plane of focus (POF) is normally parallel to the image plane. However, moving the lens relative to the sensor can rotate the plane of focus. When the POF is rotated, the near and far limits of DOF are no longer parallel; the DOF becomes wedge-shaped, with the apex of the wedge nearest the camera. Alternatively, rotating the POF, in combination with a small f-number, can minimize the part of an image that is within the DOF.
Object Field Calculation Methods

Traditional depth-of-field formulas can be hard to use in practice. As an alternative, the same effective calculation can be done without regard to the focal length and f-number. Merklinger suggested that distant objects often need to be much sharper to be clearly recognizable, whereas closer objects, being larger on the film, do not need to be so sharp.
Achieving this additional sharpness in distant objects usually requires focusing beyond the hyperfocal distance , sometimes almost at infinity.
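Focusing exactly at the hyperfocal distance makes the far limit infinite and the near limit exactly half the hyperfocal distance, which can be confirmed with the standard hyperfocal formulas. The lens and circle-of-confusion values below are illustrative assumptions:

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance: H = f^2 / (N * c) + f."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

def near_limit_mm(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Near limit of DOF for a subject at distance s: s(H - f) / (H + s - 2f)."""
    h = hyperfocal_mm(focal_mm, f_number, coc_mm)
    return subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)

# 35 mm lens at f/8: focusing at H keeps everything from H/2 to infinity
# acceptably sharp.
h = hyperfocal_mm(35, 8)
assert abs(near_limit_mm(35, 8, h) - h / 2) < 1e-6
```

This is why landscape photographers focus near, but not quite at, infinity: it spends the available sharpness on the distant objects Merklinger argues need it most.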
For example, if photographing a cityscape with a traffic bollard in the foreground, this approach, termed the object field method by Merklinger, would recommend focusing very close to infinity and stopping down to make the bollard sharp enough.

There is also an alpha cleaner that automates the process of reducing noise while retaining edge details such as hair and transparency.
The index was well organized, the table of contents made sense, and the format of the book was a good progression and pace.
The DSLR Filmmaker's Handbook: Real-World Production Techniques. Barry Andersson, Janie L. Geyen
Shooting HD video with a DSLR has many benefits, along with a few tricky drawbacks, but this guide gives you the insight and training you need to overcome these challenges as you learn what to anticipate, how to work around it, and how to fix imperfections in post-production. These capabilities, combined with incredible low-light performance, shallow depth of field, and a relatively low price point, make these cameras an extremely attractive entry point for would-be independent filmmakers.
The key light is your main light.
Ultimatte Hardware Compositors

The first and still the best hardware compositor on the market is the Ultimatte system (www.ultimatte.com).