I've not edited this thread much even though I think some answers are far too obsessed with the technology and theory rather than the pictures and practice. Don't give up, read past them! GB
I've been thinking about the philosophy of exposure. In the film days exposure was fairly simple (after you spent the time learning and practicing, of course): you used the incident meter and/or the spot meter and deciding on the stop was (most of the time) pretty straightforward. I'm an old Zone System guy so when shooting film I have a definite philosophy of Place and Fall.
Shooting Log/Raw is a different matter, especially if you are viewing log on the monitors or a Rec 709 conversion. In digital, one philosophy is Expose to the Right.
Another one is to place something pure black or (nearly so) in the scene at 0% (0 IRE) and let the rest fall where they may. Or place something middle gray at 40% (or 39%, or
41% or whatever that camera's log encoding calls for). I don't know of a short name for this approach.
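The "place middle gray" approach boils down to arithmetic on a log curve. As a rough sketch, here is a pure-log encoding with made-up parameters (the 41% anchor and per-stop spacing are illustrative only; each camera maker publishes its own curve):

```python
import math

MID_GRAY = 0.18        # 18% reflectance, linear scene value
MID_CODE = 0.41        # hypothetical: mid gray encodes to 41% of full scale
CODE_PER_STOP = 0.075  # hypothetical: each stop moves the code by 7.5%

def log_encode(linear):
    """Map a linear scene value to a log code on this sketch curve."""
    stops_from_mid = math.log2(linear / MID_GRAY)
    return MID_CODE + CODE_PER_STOP * stops_from_mid

# Mid gray sits at the chosen anchor; a 90% white card lands
# about 2.3 stops above it.
print(round(log_encode(0.18), 3))  # 0.41
print(round(log_encode(0.90), 3))  # 0.584
```

The useful property is that equal stop changes in the scene become equal code offsets in the recording, which is why waveform "landmarks" for gray and white cards are stable numbers per curve.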
I was wondering what other exposure philosophies people use when shooting Log/Raw and how it's different from when they are shooting Rec 709 HD. Light meters? Histograms?
Zebras? The Waveform (in which case what are the "landmarks" they use?), Red's Stop Lights and Goal Posts? Or maybe just view Rec 709 and judge from there? Would love to
hear how folks are doing this.
I read this on the Alexa Forum "With ALEXA you can use a light meter just like on a film camera and it will allow more room for errors than any other camera." I was wondering
if folks agree with that statement?
I'm a bit troubled by the "room for errors" phrase; partly just because I'm very old school: I started out shooting 4x5 and 8x10 transparency film for print (and this was before Photoshop; magazines back then were very particular about these things). In that case, I bracketed by 1/4 stops, and when you laid them out on the light table there was a clear difference between something that was 1/4 stop over- or underexposed when viewed beside the "correct" exposure. Clearly, it's not that critical anymore, but I still like to expose "properly" as much as possible and not rely on "room for error," or at least rely on it as little as possible, or at the very least pretend to myself that I'm not relying on it!
Which makes me think that the real "big picture" philosophical question is — what is "correct" exposure with modern Raw/Log cameras? What type of exposure is going to yield the ideal image? Exposure wise, what is the "ideal image?"
I understand Art's previous post about sometimes shooting Rec 709 because the reality is that the footage on some kinds of jobs is not going to get extensive grading. Long ago, I used to shoot a lot of Betacam for the Japanese market and when I popped the tape out of the camera, I knew that I would never see it again. In that case, obviously I tried to control the image in camera as much as I could. But in this message, I'm specifically talking about shooting Raw/Log and monitoring it "as is" or with a LUT applied;
in which case what LUT? Rec 709?
Thanks Blain Brown DP LA
Thanks for asking this question, Blain; it's getting some interesting responses.
I understand the expose to the right camp and I have done this at times in the past but not anymore.
As has been pointed out consistency is the key to a happy life as a DP!
The images flow together, the dailies are faster/cheaper, the final grade is faster/cheaper and on and on.
It's the first one that is most important, keeping skin tones consistent.
That's why I've gone back totally to using my incident meter together with my old Pentax digital spot calibrated with the zone system.
I decide where the skin will be and keep it there.
Overall with the incident reading, spot checks of highlights, shadows and skin, hang on! That's how I used to shoot film.
I have expanded the zone system range by a stop at either end, i.e. where I knew I would get pure white or black I now know I will have detail.
For each camera I use an ISO that I've established from testing works for the way that I work; that sounds familiar as well.
Yes, I will use a waveform on GS work (it's a great tool there), or diamonds, but otherwise it's a rare thing for me.
For me it's a question of dynamic range, once it gets to 12 stops or more I can relax and get into the pictures rather than the tech.
Geoff Boyle FBKS EU based cinematographer
Quote: [I was wondering what other exposure philosophies people use when shooting Log/Raw]
When shooting on a true raw recording camera, and grading from the true raw recording, two things matter:
1) Are any important things in the subject matter pushed over the ADC clip level (maximum sensor code value) in any of the four channels (R, G1, G2, B in a Bayer sensor)?
2) How much noise can you stand in your workflow in the parts of the image under the mid-tones? If the camera is locked down you can de-noise better and maybe shoot as much as two stops lower to gain highlight range; if it's a rapidly moving camera shot then you may need more fill light, and to frame with bright things like sky out of the frame.
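The first check, per-channel clipping in the Bayer mosaic, can be sketched in a few lines. The 12-bit ADC maximum and RGGB layout here are assumptions for illustration; real cameras vary in both:

```python
ADC_MAX = 4095  # assumed 12-bit ADC; real cameras differ

def clipped_fraction(raw):
    """raw: rows of sensor codes in an assumed RGGB mosaic.
    Returns the fraction of clipped photosites per Bayer channel."""
    names = [["R", "G1"], ["G2", "B"]]          # channel layout per 2x2 cell
    counts = {n: [0, 0] for n in ("R", "G1", "G2", "B")}
    for y, row in enumerate(raw):
        for x, code in enumerate(row):
            name = names[y % 2][x % 2]
            counts[name][0] += code >= ADC_MAX  # count clipped photosites
            counts[name][1] += 1               # count total photosites
    return {n: clipped / total for n, (clipped, total) in counts.items()}

# Toy frame whose red photosites are all blown, nothing else:
frame = [[ADC_MAX if (y % 2 == 0 and x % 2 == 0) else 1000
          for x in range(4)] for y in range(4)]
print(clipped_fraction(frame))  # {'R': 1.0, 'G1': 0.0, 'G2': 0.0, 'B': 0.0}
```

The point of checking channels separately is that a saturated red neon sign or blue sky can clip one channel while the luma-weighted monitoring image still looks safe.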
If you monitor using a Rec.709 or monitor gamma table applied for the K value and ISO you are shooting at, you will be able to focus better, since the mid-tones will be at maximum contrast. You cannot judge the image exposure using a small LCD monitor under various ambient lighting conditions; if you try, you may overexpose outdoor shots and underexpose night shots, because your eyes adjust to the ambient light but the monitor brightness does not compensate equally for the wide range of ambient lighting around it.
Looking at the waveforms and histograms is a help, more so if you can monitor those at sensor raw levels rather than monitoring path levels because you can see the actual data levels and so you can expose higher when the subject contrast allows.
What I have found most useful is to set the monitoring at about 1.5 stops of headroom above 90% white, set the zebras to 90% absolute sensor data (not monitoring data, because with the highlight crush in the monitoring S-curve there is not enough room to tell much if you zebra based on monitor levels), and use the zebras to tell what in the frame is close to the top. Then I underexpose as needed to avoid zebras, or re-frame to reduce them, knowing that I can pull the shots up to 3 stops in grading; so if I monitor at EI ISO 200, the light meter exposure would range between maybe 160 and 1600 or so.
That gives the highest signal possible in the recording for each shot based on the absolute raw data, and enough contrast on the monitor to focus well.
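The EI arithmetic in that last sentence is just powers of two. A small sketch, where the EI values are Dan's example numbers and the helper function is my own naming:

```python
import math

def grade_pull_up_stops(ei_metered, ei_monitor):
    """Stops the shot must be pulled up in the grade when the light meter
    is set to ei_metered but the monitoring/processing assumes ei_monitor.
    Negative means the shot was recorded hotter than the monitoring EI."""
    return math.log2(ei_metered / ei_monitor)

# Monitoring at EI 200: metering at 1600 records the scene 3 stops down
# (pull up 3 in the grade); metering at 160 records about 1/3 stop hot.
print(grade_pull_up_stops(1600, 200))           # 3.0
print(round(grade_pull_up_stops(160, 200), 2))  # -0.32
```

This is why the meter range 160 to 1600 against an EI 200 monitoring baseline corresponds to roughly "1/3 stop over" through "3 stops under, pulled up later."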
If you need "one light ready" footage, then you have to change the monitoring tables shot to shot so the metadata will force the grade to match the floating mid-tone level in the recorded data. For run-and-gun shooting it's just simpler not to change anything shot to shot other than the exposure, keeping the exposure as high as the sensor raw zebras allow.
If you don't have sensor raw zebras, waveforms and histograms in your camera, then you cannot know what the recorded true raw levels are, and that makes the exposure decisions much more vague even if you know the transforms used in the monitor tables, because you don't have the instant confirmation that you are safe with regard to highlight detail and getting into the noise levels.
What exactly the mid-tone sensor code level is matters much less for shooting real subjects, if you are going to grade the footage from the raw data. (The KineRAW (tm) cameras do have 18% gray and 90% white tick marks on the histograms, so you can hold up a white and/or gray card to check whether you are on the calibrated sensor output level for the monitoring selected, in case you need "one light" processing or otherwise want the EI ISO to match for two-camera shooting, etc.)
If you are going to grade from the DPX or other RGB made from the raw data with default processing, then the placement of mid-tone does matter; if you need to adjust mid-tone by several stops, you will get fewer histogram gaps grading from the raw data.
Shooting raw is different from shooting film in that film had only one good exposure level, and so you made compensations for the subject before you exposed the film to avoid corrections later. Sometimes with color negative one would expose higher than rated then grade down, but not much more than a stop to reduce grain.
Digital has no such prescribed exposure that is "best" for all subjects. Rather, you just expose as high as you can while avoiding blown-out highlights on something important, when that can be avoided, and do so to reduce the noise as much as you can, since compressed distribution works best when the footage has minimum noise levels. So the optimum exposure is, for the most part, the one that gives the highest ADC code levels without bloom or clipping, and that varies shot to shot depending on each shot's framing, lighting, and subject matter.
It's best to work with a camera that records the linear ADC output 100% and actually lossless, since that gives you everything to draw on in the grading, direct from the actual sensor data raw recording. Anything else starts an avalanche of compromises, starting with histogram gaps in the highlights for log encoding, noise blur from wavelet compression in the deep shadows, up to block artifacts and massive histogram gaps in 8-bit H.264 when restored to 16-bit linear data for grading. If you record all the ADC data verbatim, you can decide later what sensor code value ends up where in the end result, and can keep your mind on the clipping and noise when shooting: the two sensor signal matters that are the important issues for exposure decisions while shooting digital.
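The "histogram gaps" here are ordinary quantization. A toy sketch (the bit depths are illustrative): quantize to 8-bit codes, then push the result two stops (multiply by 4) in a linear grade, and every surviving level is 4 codes apart, leaving empty bins between them.

```python
# 8-bit codes pushed 2 stops (x4) in a linear "grade": the surviving
# levels are spaced 4 codes apart, i.e. the histogram has 3-code gaps
# wherever intermediate values should have been.
codes_8bit = range(256)                  # codes that survived encoding
graded = sorted({v * 4 for v in codes_8bit})
gaps = {b - a for a, b in zip(graded, graded[1:])}
print(len(graded), gaps)  # 256 {4}
```

Grading from the raw data instead means the stretch happens before quantization to the delivery format, so the intermediate levels are real samples rather than gaps.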
Some camera makers may be trying to produce monitoring tables and say "shoot like you would with film, by ISO and K," but all that gets you is uniform noise at mid-tone for all the shots, which may not give the best result overall. Some shots will needlessly be exposed very low (because the monitoring is set to ISO 800 etc.) despite their low subject contrast ratio, and other shots will show blown-out highlights caused by forcing the exposure to match the monitoring, since most people are reluctant to set ISO 3200 for a high-contrast, bright noon daylight backlit shot and then pile on ND filters.
Dan Hudgins firstname.lastname@example.org http://www.DANCAD3D.com San Francisco, CA, USA
I think for most of us who shoot narrative work, besides dealing with the exposure per set-up, there is also the continuity of exposure across all the set-ups of the scene -- you can't treat each shot like a still photo that stands alone; there needs to be some consistency of noise level, gamma, etc. within the scene to some degree.
So I think a lot of us expose for the subject in terms of how we want it to look in brightness, and then we may tweak the exposure if something in the extreme ends of the exposure range is important. This is once we have settled on an ASA and look in terms of gamma & color that works for us. We don't expose each set-up individually for either
the brightest or the darkest thing in the frame and therefore have each set-up potentially vary quite a bit in exposure once cut together. This is why the dynamic range of the camera system matters so much, so we can expose creatively and quickly and then move on to the next set-up -- we can't treat every set-up like some complicated science experiment in exposure.
David Mullen, ASC Los Angeles
Quote: [We don't expose each set-up individually for either the brightest or the darkest thing in the frame and therefore have each set-up potentially vary quite a bit in
exposure once cut together. This is why the dynamic range of the camera system matters so much, so we can expose creatively and quickly and then move on to the next set-up --
we can't treat every set-up like some complicated science experiment in exposure.]
In the days of film, lights and reflectors were used to equalize the subject for the film's range.
With lower budgets, shooting by natural lighting gives a wider subject contrast range to cover for some shots.
Using a stable high-ISO sensor exposure for mid-tone, without regard to the highlights or shadows, is not going to provide the lowest noise levels for compressed distribution with today's sensors. The usable ISO range for a single analog gain value, say 1x, is more limited than the ISO range when making adjustments to the analog gain, and so you end up with more noise by not balancing the subject contrast to the other sensor settings.
It's good, for those who can afford it, to shoot with a constant-headroom ISO, but not everyone can, because they no longer have the budget for lights and generators. One thing that sold the digital changeover was being free of the limited range of film for natural light shooting. (I'm not saying that makes better-looking movies, but the goal can be to get the best result from today's cameras' sensors under less than ideal shooting conditions, not to say "I hope someday they will make a digital camera that is really better than film was," in part because film never was as good as it looks, since people used lights and reflectors with it as well.)
My point was: given any particular lighting on the subject, how do you get the "best" result? After de-graining, the intrinsic noise level is not so important, because post de-noise needs to be applied for compressed release formats; try looking at a Blu-ray made from a "raw" film scan without de-noise applied at least in the compression stage. If you are shooting at some constant mid-tone sensor ADC count, then you will need to equalize the subject contrast ratio to some extent. If someone has a camera you can leave at ISO 1280, and can compare that result to ISO 160 and not see an improvement when shooting a low subject contrast at the lower ISO, I have not seen it; do you have an example full-size DPX at the two speeds? Cameras that have built-in de-noise may look closer than raw shooting cameras. Much of what limits the shooting is not just the camera's dynamic range, but also the post workflow and what can be extracted and conditioned from the image data captured; just comparing two cameras' "first stage" output tells you nothing about the ultimate quality that can be extracted after heavy post processing.
I would rather shoot at 160 with fill lights than 1280 without them (because of the higher contrast ratio involved in not using fill lighting), but until there is a camera that can give flawless results at ISOs 1280 to 2560, is it really better to degrade all the shots for consistency?
Quote: [One of Lowry's most recent projects, director David Fincher's The Curious Case of Benjamin Button, has just been nominated for 13 Academy Awards - including, crucially, a nod for Claudio Miranda's work as cinematographer.]
Dan Hudgins email@example.com http://www.DANCAD3D.com San Francisco, CA, USA
> In the days of film, lights and reflectors were used to equalize the subject for the film's range.
Not for all kinds of filmmaking! Those of us shooting non-fiction films were never allowed to use lights, or even turn existing lights on and off. Plus, we liked emulsions with very limited latitude for low light shooting, like 7250 pushed to EI 800.
That required careful (incident) metering and a good feel for the stock and processing. There were some situations where the right exposure had to be within 1/4 stop, or you would lose something critical.
Wide dynamic range is a nice thing for capture. But of course grading is needed to make it look like it should!
Jeff Kreines Kinetta
Quote: [Plus, we liked emulsions with very limited latitude for low light shooting, like 7250 pushed to EI 800.]
Yes, Kubrick pushed ALL the footage in BARRY LYNDON for consistency, but today consistency means grainless for compressed end-use formats, so getting the noise as low as you can for all shots gives better results when you can keep the lighting contrast ratio low.
Even in the documentary days, did you push the bright daylight shots to 800? So that every shot no matter what the available light was to be as grainy as the night "coal mine" shots?
Quote: [Kubrick went ahead and push-developed the entire film one stop - outdoor and indoor scenes alike.]
That's basically what you are doing if you set your Digital Cinema Camera to 800 to 2560 for all the shots.
Dan Hudgins firstname.lastname@example.org http://www.DANCAD3D.com San Francisco, CA, USA
> Even in the documentary days, did you push the bright daylight shots to 800? So that every shot no matter what the available light was to be as grainy as the night "coal
We always pushed 7250 to EI 800 no matter where we were shooting, because you never knew where you would end up. We did not want to have to switch between mags of pushed and unpushed film.
Plus, there's a rule: there is an inverse correlation between the ambient light level and how interesting something is. Phrased simply, all the good stuff happens in the dark.
Jeff Kreines Kinetta
> I question what the value of me or you knowing these fun little facts is relative to our accomplishing our work. Truly there is nothing we could do with the knowledge, no way that we could apply it to our advantage in any way.
So… just stop asking questions? Not me, sorry. I love learning this stuff. Does it have practical application? I have no idea. I never know what I need to know until I need to know it.
Just knowing that crosstalk on a circuit board is potentially a major contributor to noise just blows my mind, and someday there may be a need to know that. Even if there isn't a need to know, I just love knowing it.
I'm fascinated by the idea that someone thinks I shouldn't know about stuff at this level. And slightly horrified, too.
-- Art Adams Director of Photography San Francisco Bay Area | CA | USA
> So… just stop asking questions? Not me, sorry. I love learning this stuff. Does it have practical application? I have no idea. I never know what I need to know until I need
to know it.
Mitch, I've got to agree with Art. Your vast contributions to my personal knowledge (and I'm sure many, many others') seem at odds with your statements. By all means please continue to "curate" your own contributions to whatever you perceive is our ability to absorb it, but don't fault deeper dives. Because really, you never know.
Martin Baumgaertner Shooter, editor, wonderer Angle Park, Chicago
> Mitch, I've got to agree with Art.
Just understand that a little bit of knowledge can be quite dangerous, as it can be attributed to the wrong conclusion far too easily. I see it happen all the time. But point taken.
It's like the old Steve Martin joke about getting a Philosophy degree in college. It teaches you just enough to fuck you up for the rest of your life.
Mitch Gross Applications Specialist AbelCine NY
> Just understand that a little bit of knowledge can be quite dangerous, as it can be attributed to the wrong conclusion far too easily. I see it happen all the time. But point taken.
Understood. Your job must involve a lot of parsing. Fortunately cml >= reduser
Martin Baumgaertner Shooter, editor, onetime coder Angle Park, Chicago
> Which makes me think that the real "big picture" philosophical question is — what is "correct" exposure with modern Raw/Log cameras? What type of exposure is going to yield
the ideal image? Exposure wise, what is the "ideal image?"
Great question … we just spent the weekend down at Del Mar race track with two EPICs, a C500, an F55 and a 1DC, all shooting their own particular flavor of 4K RAW/Log (plus a whole bunch of GoPros shooting their version of Log) … we had a couple of experienced ASC chaps with us, so once we've finished processing the RAW/LOG rushes, I'll ask them that question while we evaluate the footage. From what I've seen of the rushes so far, these guys know how to nail it.
BTW, thoroughbred race horses in 4K, shot with any of these cameras with good glass and great DPs, are a sight to behold … totally stunning watching those magnificent beasts at 120fps in 4K … if 4K/UHDTV is to thrive in the home and not go the way of 3D, sports in 4K will be a major contributing factor.
Neil Smith CEO LumaForge LLC
I go for accurate middle gray, or place middle gray where I want it, and see how the rest falls.
Exposing to the right results in inconsistent noise from shot to shot, which can be jarring, and also tends to result in less consistency of exposure, such that every shot needs its own grade. I'd rather have the colorist set up the first shot of the scene and be in the ballpark for everything that comes afterwards. It's not only easier for me to get the look that I want, because my exposures aren't a moving target, but it saves money as the grade goes quicker.
Putting something black at the black point is similarly pointless because there's not a lot of real black out there unless it's a dark shadow. Also, if a camera exhibits wide dynamic range but it isn't equally split between highlights and shadows you'll always have more stops in the shadows, which means putting something black at black results in pushing the exposure way down. Also, if what's black in one frame isn't black in another you end up with the same problems as exposing to the right: inconsistent noise and overall exposure.
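The uneven split Art describes can be put in rough numbers. The stop counts below are hypothetical, not any particular camera's published figures:

```python
STOPS_BELOW_MID = 8.5  # hypothetical usable range below 18% gray

def exposure_shift_if_black_anchored(subject_stops_below_mid):
    """Stops the whole scene moves when a shadow that naturally sits
    subject_stops_below_mid under mid gray is pinned to the sensor floor.
    Negative means the exposure is pushed down."""
    return subject_stops_below_mid - STOPS_BELOW_MID

# A "black" object only 5 stops under mid gray, anchored at the floor,
# drags the entire exposure down 3.5 stops.
print(exposure_shift_if_black_anchored(5.0))  # -3.5
```

Since most real "blacks" sit nowhere near the sensor floor, anchoring them there systematically underexposes the mid-tones, which is the failure mode being described.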
Exposing to the right or to the left is great for one shot that stands alone… which almost never happens. When your images sit adjacent to others why would you base their exposures on the things that change the most between them?
Mid-tones are where our eyes are the most sensitive. We naturally roll off the extreme shadows and highlights, just as cameras have been designed to do, so it makes little or no sense to me to base a scene's exposure on the part of the image that our brains naturally compress anyway. I generally expose raw and log based around middle gray using a meter… unless I'm shooting doc-style, and quickly.
In that case I tend to look at the mid-tones and the highlights together on a waveform. In certain cameras, like any of the Canon C family, throwing the camera into cine lock mode puts the waveform in that mode as well. I know that in the C500 the waveform continues to look only at a log representation of the raw signal even if view assist is turned on. Canon's implementation of log pushes blacks fairly far up the scale to the point where I really don't worry about them at all. I find a mid-tone exposure on the waveform that works for me while trying not to clip highlights. It's as much trusting the image by eye as it is judging the waveform monitor with my gut.
If I have a choice I work strictly by meter. If not I work off the waveform. In both cases I look primarily at mid-tones.
In the F5/F55, where I don't believe waveforms have been implemented, I look at an external waveform when possible. If I'm running around then I look at zebras and judge mid-tones in the viewfinder by eye. It's not ideal, but I'm old school enough that I can do it well.
The big change for me is that I used to use nothing but a spot meter but gamma curves are so variable that trying to nail an exposure based on a reflected reading is the true moving target. I can use a spot meter to find out how far exposed a window will be on a scout but it's tough to light using nothing but a spot meter in HD, the way I could in film. Film stocks had different gammas but we only had to know a couple of them; every HD camera has at least 7 or 8 basic variations, plus lots of other variables that come into play.
I find myself using an incident meter much more often now. I rough in the lighting and then check it on a monitor and waveform, and then use the incident meter for consistency when moving around within the scene. I still pull out my spot meter occasionally but I'm becoming less reliant on it. The incident meter helps me get my mid-tones right and keep them consistent, and the monitor and waveform tell me about everything else.
What I really want is a great yet affordable onboard multi-display monitor that shows me an image, a waveform and a vectorscope all at the same time. I'm hoping Convergent Design will pull that off with the Odyssey. I tried an Ikan monitor that had a mode with this layout that looked perfect for me, but the waveforms were incorrectly implemented and the image changed color when a meter was overlaid, which made them professionally useless. (That was a big lesson for me: I thought waveforms were waveforms and were hard to screw up, but it turns out it's not that hard to get the underlying math wrong. Or, rather, it's hard for some to get it right.)
I think the film method of centering your exposure around middle gray makes perfect sense. That way you're exposing for the most important part of the image instead of the bits that will be compressed and discarded anyway.
-- Art Adams Director of Photography San Francisco Bay Area | CA | USA
> We don't expose each set-up individually for either the brightest or the darkest thing in the frame and therefore have each set-up potentially vary quite a bit in exposure once cut together.
Absolutely, David, consistency throughout a scene (and over the whole film) has always been extremely important but what I'm curious about is how you judge exposure when shooting Raw/Log. Do you rely more on meters or on the waveform monitor? Do you monitor in log or in Rec 709, for example? You state the aims of dramatic long form cinematography very well but I wonder if you could say a bit more about methodology.
What interests me is how methods of exposure have changed since the film days and also how they have changed since the days of shooting HD.
"This is why the dynamic range of the camera system matters so much, so we can expose creatively and quickly and then move on to the next set-up -- we can't treat every set-up like some complicated science experiment in exposure." Absolutely, when shooting negative film, setting exposure is relatively straightforward and quick (with the occasional exception, of course). My example of shooting large chromes was meant to illustrate the historical difference between the hyper-crucial world of exposing reversal film as opposed to shooting negative with its substantially greater latitude and the chance to make some (hopefully minimal) adjustments in the print stage. On motion picture negative, I doubt that a 1/4 stop difference could even be measured -- I would guess it probably falls within the variability of processing. Even from when I started shooting features on film until the present, this has changed dramatically as Kodak (and Fuji) steadily improved their stocks. I'm sure we all remember the introduction of Vision stock and what a huge leap forward that was.
It has often been said that shooting HD was like shooting reversal film, but of course, that has changed with the new cameras -- this is the interesting part, I think. The tremendous dynamic range of these cameras, plus the benefits of log encoding and recording Raw have made it a whole new ballgame. I have my own methods when shooting with these cameras and I've heard different approaches from other DPs and I'm just trying to gauge if there is a rough consensus or if there is a range of methods which work for different cinematographers and different shooting conditions.
But my question is this, when you say "the dynamic range of the camera matters;" I'm just wondering if you could possibly expand a bit on what you mean? Would love to hear more of your thoughts on this topic. I think I know what you intend by this, but I certainly don't want to put words in your mouth.
Thanks Blain DP LA
Someone mentioned "in the days of film" -- I'm watching "Breaking Bad" right now, Kodak/Arricam/Cookes, and of course it looks fantastic. Lots of other shows/features still shooting film. Looks better than anything else, happiest crews on earth (laughs). Watched "Anna Karenina" in the theater and now on cable, really beautiful.
The F55 has no exposure tools right now, so you need a light meter, or as Art mentioned, a waveform on your monitor, or the guess-o-meter. I checked the new firmware update (1.15 as of yesterday?) and I think exposure tools haven't been included yet.
From my experience bringing things into Assimilate Scratch, underexposed footage was noisier in the blacks. Back when the R1 was 320 it was even worse, and working with limited latitude/dynamic range is what it is: limited. Blown highlights are ugly, noisy or milky blacks are ugly. But who's paying for Scratch or another grading/color correcting path?
So much just started going direct to FCP, maybe Color, some people butchering footage in AE after compositing...so lucky are the ones who have a proper post path.
On projects where people just take away your footage, who knows what they're doing with it. Can you work a deal to get paid to run your footage through Resolve, like photographers run their RAW pictures through Photoshop? Pretty much impossible, unless it's your project, you're Director/DP, control the budget, etc.
J. Babl Miami
I tend to think of Rec.709 as the print and log / raw as the negative, figuring if I can make it look good in Rec.709 but record log or raw, I know I'll have more information to play with in post. That's my general approach. I usually start out by checking my meter readings against how it looks on the monitor but after that, mainly expose by monitor except perhaps outdoors where your monitor is often not in a dark enough environment and your eyes are used to a higher ambient level.
To a lesser extent, the same problem exists at the other end, in extremely low-light environments where an LCD monitor starts to be the brightest thing in the room and you may tend to underexpose the monitor image to make it feel dark.
So to some degree there is a mental calculation that takes place if you're relying on monitors in variable viewing environments. I can usually adjust my exposure tendencies to compensate, but I've had second unit work come back overexposed in day exterior scenes because they made the mistake of exposing until the image looked bright enough to see the action in a bright viewing environment (and this was after I warned them that this would happen). This is where metering & exposure tools become more important.
It would be nice to see a waveform display of the log signal while monitoring a Rec.709 image to know exactly how much clipping is going on... but most camera set-ups seem to involve sending Rec.709 from the camera to the monitors so any waveform on the cart would be reading Rec.709.
You could send log to video village but then you'd need a Rec.709 conversion happening between the waveform and the monitor (I know, some monitors have Rec.709 LUT generators,
just not the ones I can afford to rent.) Plus if I am using something like an ARRI Look File in the Alexa applied to the Rec.709 monitor output, then I can't really do the conversion to Rec.709 at the monitor unless I want to load special LUTs into a monitor that can take them. I could ask the camera assistant to switch the camera's monitor output back and forth between log and Rec.709 while I stand at the village, but that's a bit time-consuming and awkward.
I don't find much value in seeing the waveform reading of the Rec.709 monitor signal if I am recording log or raw -- I can see clipping in Rec.709 with my own eyes on the monitor, so the waveform just tends to tell me what I can already see. I'd rather see a waveform reading of the log signal getting recorded.
I can't recall if the histogram in the Red cameras can be sent to video village and switched off and on easily at the village, since I generally don't operate... so having those exposure tools in the eyepiece or on the onboard doesn't always help me.
Anyway, once I get a sense of the how the monitor image tends to look relative to the exposure I should be using, I can rely less on my meter, which is one of the positive things about digital cinematography, to get away from being tied to the meter.
So I would say that my method is rather convoluted, I mainly come to rely on the monitor with occasional double-checks using various devices -- meters, waveforms, histograms -- just not on every shot. The main point is the feedback loop over time (if this is a long-form project), you check the footage again in dailies, in the editing room, etc. to see if your exposure technique is giving you good results, and sometimes you make adjustments when you find that your day work is a bit too hot or your night work is a bit too dark, etc. I also shoot tests before I begin to see how things look on the set versus later in a D.I. theater.
But again, this method of trying to work fast and not get bogged down thinking about exposure is one reason why latitude and wide dynamic range is important. When shooting film, I am pretty good at exposing so that I am printing within a narrow set of printer lights consistently... but I also know that within a range, the quality of the final image will turn out fine because of the latitude of film negative. With the newer digital cameras, the dynamic range is broad enough to give you similar flexibility.
Finally, the old saying "Know Thyself" can be applied to exposure technique, we know if we have a tendency to overexpose or underexpose in general so we can develop a technique to take that into account.
David Mullen, ASC Los Angeles
I am back to using a light meter again (with the F55), finding that it is landing us right in the sweet spot, and as Art says, everything else is falling into place to the penny. As a recovering film guy, this is a real treat.
Rick Thompson Midwest dp
> Lots of other shows/features still shooting film.
No, there really aren't, at least if you're talking about television production. At the moment I can count:
True Blood (might switch next season), Grey's Anatomy, CSI, Boardwalk Empire, Walking Dead, Breaking Bad (wrapped), 2 1/2 Men
...and that's about it. There are a number of shows that were on film and switched to digital capture in the last season or two, or are switching for the coming season (Mad
Men, The Mentalist, Glee, and, assuming I've been told correctly, Castle among them). But your statement ceased to be true almost two years ago, and is completely untrue today.
As for features, the use of digital cameras has accelerated greatly, and the demise of labs in many locations throughout the world has made it much more difficult to justify.
Mike Most On Location Services Director Technicolor Hollywood Los Angeles, CA.
© copyright CML, all rights reserved