On Sunday I attended a demo of the Panavision/Element Technica system. It was brought up that monitoring 3D on set is pretty meaningless and rarely reflects how a shot will end up looking. The comment was made, "Any 3D setup can be made to impress the producer, director and cinematographer, but that rarely resembles how it will look on screen. 3D is almost always corrected in post." If this is true, what does monitoring 3D live on set accomplish?
If 3D is becoming "post-centric," is there a "shoot it flat" attitude, with the 3D perspective created from both cameras at the DI stage? Another question: how are productions handling the enormous amount of data from two camera sources?
Cinematographer / Camera Operator
Los Angeles, CA
New Orleans, LA
What they mean is that on-set monitoring is invariably done on a smaller monitor than the screen the film will ultimately be seen on at release, so in a sense, monitoring on set is meaningless if you're hoping for it to look the same as it will in the cinema.
It's quite possible to get 3D that looks great on set, but will tear your eyes out when projected on the big screen.
Part of the stereographer's job is to control the amount of stereo (in spite of producers and directors calls for it to be 'more 3D'!) so that it will be comfortable to look at when projected later on.
Some say that you can get the same effect by placing a smaller monitor closer so it takes up the same angular field of view, but it doesn't really work like that. The only thing that works like projection is, well... projection.
To address the other part of the statement, "3D is almost always corrected in post": if you're doing a live shoot, there is no post, so you need perfect alignment. Typically you'd have enough time to set it up perfectly for what is essentially one long take. On a drama you have multiple setups, lenses coming on and off, and the camera going on multiple supports. Much as you try to keep things pixel perfect, there isn't always time between shots. As long as one camera is not actually higher than the other, the post fix is a relatively easy sliding of one image over the other to bring them back into alignment.
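That sliding fix is simple enough to sketch in code. A minimal illustration of a horizontal-only re-alignment (the function name, the NumPy-array framing, and the zero-fill edge handling are my own choices for the sketch, not anything described in the thread):

```python
import numpy as np

def realign_horizontal(right_eye, shift_px):
    """Slide the right-eye frame horizontally by shift_px pixels
    (positive = right) to restore L/R alignment. right_eye is an
    H x W (or H x W x C) array; vacated edge columns are filled
    with zeros rather than wrapping around."""
    out = np.zeros_like(right_eye)
    if shift_px > 0:
        out[:, shift_px:] = right_eye[:, :-shift_px]
    elif shift_px < 0:
        out[:, :shift_px] = right_eye[:, -shift_px:]
    else:
        out = right_eye.copy()
    return out
```

In practice the shift amount would come from measuring the residual disparity on a matched feature in both eyes; the point is only that a pure horizontal translation is a cheap, lossless-enough fix compared to rebuilding geometry.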
Good-quality on-set monitoring is ESSENTIAL to creating good 3D. The better your stereographer can see what s/he's doing, the better your 3D will be, because they won't have to be conservative to make sure they don't overcook it. Your post costs will be much lower because any fixes will be simple 2D fixes like repositioning one image relative to the other.
Also you can't make an image MORE 3D in post (at least not without expensive software tools). You can push everything further away or pull it closer, but you can’t make it more 3D.
There's not really any way around having a stereographer who knows what they're doing on set, capturing the data right FIRST time.
head of production
>> Also you can't make an image MORE 3D in post (at least not without expensive software tools).
As I was demonstrating at NAB the reverse is also true, too much 3D and you're stuck.
I think that this is a more serious error because this is the one that will hurt!!!
Geoff Boyle FBKS
mobile: +44 (0)7831 562877 www.gboyle.co.uk
>> I think that this is a more serious error because this is the one that will hurt!!
Also, since the L-R disparities are inherently larger, it's harder to fix in post without resorting to a full rebuild. To sort of answer Tom's question, the differences between live-to-air and feature film work in 3D are as large as, if not larger than, they are for other properties. Three impulses have been driving planned post conversion, which one can think of as 100% 3D adjustment:
1/. not wanting to shoot with a 3D rig,
2/. wanting full control in post, and
3/. hard out on the talent, so nothing can impede the schedule.
Santa Monica, CA
>> It was brought up that monitoring 3D onset is pretty meaningless...
I would characterize on-set monitoring this way:
If you are shooting for home display or "point of sale" display, and you are using a similar sized monitor on set, the on-set viewing experience can be very close to the "real thing."
If you are shooting for theatrical release and viewing on-set on a forty-inch display, for instance, the viewing experience will not be the same, but the images you view will be indicative of what you will end up with in many ways, so on-set viewing is still VERY valuable.
As mentioned before, live broadcast is a totally different kettle of fish from production for future editing and release. I will address production for future editing and release for the purposes of this ramble.
There are a number of issues that differentiate the "small display" viewing experience and the "Large screen" viewing experience.
The passive polarized displays can show you some false ghosting, and if viewed off axis they can be not so great.
That said, with your glasses on, they give you a good idea of what your scene looks like, and with your glasses off, you can easily see how much parallax you have in the background.
My credo with 3D: "FIRST, DO NO HARM."
The four easiest ways to cause harm in 3D (ignoring issues of story) are:
1/. Causing excessive divergence in viewers' eyes. This is generally talked about as a percentage of screen width for convenience, but in fact it is expressed as an angle - the angle you are asking viewers' eyes to diverge when viewing.
It is figured out by taking the nominal human interocular distance of 2.5" and working out what percentage of the width of the screen will not cause eyes to diverge more than, say, half a degree or a third of a degree, or whatever value you have chosen as your "safe" value. Some people will tell you that NO divergence is acceptable, but studies suggest that half a degree is tolerable. Like many things that affect muscle strain, it would be a time-weighted average; some people use a third of a degree as their acceptable divergence.
Once you know your percentage, you can literally measure on your screen and see if you are exceeding it... even if your on-set display is much smaller than the real screen.
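As a rough sketch of that arithmetic: the angle depends on viewing distance as well as screen width, so the geometry below (divergence begins once on-screen parallax exceeds the 2.5" interocular, with the excess subtending the divergence angle at the viewing distance), the 60 ft viewing distance, and the function name are all my assumptions for illustration, not figures from the post:

```python
import math

def max_positive_parallax_pct(screen_width_in, viewing_distance_in,
                              max_divergence_deg=0.5, interocular_in=2.5):
    """Largest positive (background) parallax, as a percentage of
    screen width, that keeps eye divergence at or below the chosen
    angle. Simplified geometry: parallax up to the interocular
    distance needs no divergence; any excess is seen at the
    viewing distance."""
    excess_in = viewing_distance_in * math.tan(math.radians(max_divergence_deg))
    return 100.0 * (interocular_in + excess_in) / screen_width_in

# 40 ft wide screen, viewer 60 ft back, half a degree allowed:
pct = max_positive_parallax_pct(40 * 12, 60 * 12, 0.5)
# with zero divergence allowed, the limit is just 2.5" / screen width:
zero_div = max_positive_parallax_pct(40 * 12, 60 * 12, 0.0)
```

With zero divergence allowed, the limit collapses to interocular over screen width (roughly half a percent on a 40 ft screen), which is the conservative figure behind the "NO divergence" position.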
2/. Excessive total difference between the negative parallax of the nearest point and the positive parallax of the furthest point - that is to say, the total amount you are asking people's eyes to shift their convergence while looking at different objects within the shot. Your on-set monitor can help you judge this, though it can only be a guide from which you draw conclusions based on knowledge and experience.
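Tracking that depth budget on set amounts to logging the nearest and furthest parallax values and comparing their span against whatever limit you've chosen. A trivial sketch (the 3% default below is purely an illustrative placeholder, not a figure from this thread):

```python
def depth_budget(nearest_pct, furthest_pct, budget_pct=3.0):
    """Total parallax span, as % of screen width, from the nearest
    point (negative parallax) to the furthest point (positive
    parallax), and whether it fits the chosen budget. The 3.0%
    default is an illustrative placeholder only."""
    span = abs(nearest_pct) + abs(furthest_pct)
    return span, span <= budget_pct

span, ok = depth_budget(-1.0, 1.5)  # 2.5% span, within a 3% budget
```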
3/. Excessive mismatch of convergence point between successive cuts. Likewise, your on-set monitor can help you examine this.
4/. Vertical misalignment. Our human vision system is not used to accommodating a vertical mismatch. You can see such a misalignment on a stereo monitor by taking off the glasses.
In addition to these harmful things, there are other things about your composition that you can judge with a stereo monitor on set, including but not limited to edge violations, some disparities between polarized and non-polarized reflections, occlusions in one eye that do not occur in the other eye, etc.

To conclude: while on-set stereo monitors do not equate to the viewing experience on the big screen, they are nevertheless valuable tools for understanding what you are doing. Even with lots of post work, the closer you get it on set, the better for later.
Even if shooting raw and doing colour correction later, we like looking at a colour monitor on set.... similar situation here.
>>Please also add retinal rivalry, edge violation, polarization and shading problems.
Read the whole missive...
The issues that I list are ones that cause physical pain in short order... I describe the other issues down at the bottom, but with regard to understanding baked-in problems that are hard or expensive to fix in post (thus affording you your lavish lifestyle - meat three times a week, heat, etc.), these are the issues that I would prioritize as fundamental to avoid.