Cinematography Mailing List - CML

3D Monitoring And Other Needs

 

Hi all,

I'm drawing up a 3D assist system for the future; I feel that I have to step up soon. Would you all please tell me what is needed/expected in a system today? Cost is a big factor, but I pride myself on putting complete systems together that are designed from the operator's point of view rather than the rental agency's. So not-really-expensive-but-helps-a-lot details are especially welcome.


My main questions:

-What is the standard monitoring for 3D nowadays on a location (not always stage) shoot? HD features are settled on 24" Cinetal/TVLogic and 17" Panny setups. I found quite a few 24" 3D screens that look okay, but I'm worried about the size. Is it big enough to accurately see depth? Can their settings be dialled up enough to more closely match a cinema screen (I actually know better than that...)? Are they good for a single seat (stereographer/DoP) only, or do directors watch them? Or do you all move a 42" Hyundai around for the director? Do directors watch 3D at all? Or is it the stereographer's privilege?
-Do stereographers usually bring their own monitoring/analyzer rig, or do they expect the production to rent something? If they bring their own, what signal/format do they require, and if not, what do you need from production? Are there any gadgets, software or other bits of kit you are grateful to have? (For example, I usually offer a wave panel if the DIT asks for a SpeedGrade package and the production is not really cheap, or I give them a Mac mini on the cart for their use.)
-Do you like to be seated right up with the DoP (or director), or would you rather set up separately?
-Correct me if I'm wrong, but the stereographer would rather watch a LUT-corrected image than a raw one?

Any help appreciated.

Balazs Rozgonyi
Rental Manager / HD Video Assist Op

Video Assist Hungary - Technology for your vision
The only video rental place in Hungary!
www.videoassist.hu


That's a lot of questions for a single post, Balazs. Truth is, I don't know if there can be such a thing as what you're looking for.

Rather than focusing on the gear, it might be more appropriate to focus on the process. Who are you really building this system for? Remember when prosumer video cameras came out and all of a sudden everyone was a DP? Well, it's a lot like that now with overnight stereographers - they need a lot of help with visualization.

To create good stereo you need to have a solid grasp of both cinematography and visual effects. Successful people in both professions are generally endowed with above average powers of visualization and just don't need all the gizmos to "see" the indicators of a good stereo image. Clients, producers and some directors on the other hand, need all the help you can give them. Understanding that difference will go a long way toward designing your system.

To a working crew on location or on a busy set, viewing the image pairs in stereo is little more than a waste of time. You're going to depth grade it in post to work with a number of distribution systems and screen sizes so anything you view on set or on location is simply not that relevant. All you really want to know is if your math is right.

If you go onto the set of proficient stereo craftsmen you'll almost never see them with glasses on. Chances are they got over it long ago. What you will see are mechanisms and systems for determining the stereo offsets and alignments. Dual-feed monitors that let you overlay, wipe and switch back and forth between the two cameras will tell you much, much more than a "stereo" display.

I was on a location test shoot last week with one of the best stereo crews in the business. They've got two huge projects coming at them and the pressure is on. Without divulging their secret sauce, nearly everything they need to know they get from mono images. The one rather small stereo monitor that they do carry is set to anaglyph - not so they can walk around with red and blue lenses on, but so that they can "see" the offsets.

RED on RIGHT in REAR - BLUE on RIGHT in FRONT. This tells you that you've got your cameras in basic alignment, you are shooting converged and your cables are run correctly. The amount of R/B offset in the positive and negative screen-space tells you how much dimensional offset you are building into your scene. The object or character that doesn't have any Red/Blue offset is at the point of convergence. What could be simpler?
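For what it's worth, the same reading can be put in a few lines of code (a rough sketch of my own, not anyone's secret sauce; the sign convention, and which eye feeds the red channel, vary from rig to rig):

# Minimal sketch: classify depth from the horizontal offset between the two
# camera views of the same feature. Assumed convention: positive parallax
# (right-eye image to the right of the left-eye image) = behind the
# convergence/screen plane.
def classify_parallax(x_left_eye, x_right_eye):
    p = x_right_eye - x_left_eye
    if p > 0:
        return p, "behind convergence (positive parallax)"
    if p < 0:
        return p, "in front of convergence (negative parallax)"
    return 0, "at the convergence plane (no R/B fringe)"

print(classify_parallax(960, 972))  # (12, 'behind convergence (positive parallax)')
print(classify_parallax(500, 500))  # (0, 'at the convergence plane (no R/B fringe)')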

You can always throw on a pair of anaglyph glasses to double check - but seriously, it hardly ever happens.

Of course everyone loves a little SpeedGrade package on-set - what's not to love about the best depth and colour grading tools on the market - but from the operator's perspective, there are more important things ..... like where's the closest Starbucks.

Scott Billups
PixelMonger


>> The one rather small stereo monitor that they do carry is set to anaglyph - not so they can walk around with red and blue lenses on, but so that they can "see" the offsets.

Well put, and coincidentally, in defence of the humble Anaglyph...

http://realvision.ae/blog/2010/11/in-defense-of-the-humble-anaglyph-method-for-stereo-3d/

Regards,
Clyde DeSouza
RealVision
Dubai, UAE


>> Well put, and coincidentally, in defence of the humble Anaglyph...

In the U.S., as long as anaglyph for TV continues to be promoted for "Very Special Events", and the cardboard glasses are glued into magazines and given away at convenience stores and gas stations, the public will continue to regard it (and home 3D) as a cheap and tacky promotional gimmick, which is, of course, precisely what it is.

Bob Kertesz
BlueScreen LLC
Hollywood, California


>> In the U.S., as long as anaglyph for TV continues to be promoted for "Very Special Events", and the cardboard glasses are glued into magazines and given away

True, but the discussion is mainly about anaglyph for professional use, and about reminding stereographers of that.

Regards
Clyde DeSouza
RealVision
Dubai,


"Info Realvision" wrote:

>> the discussion is mainly about anaglyph for professional use, and about reminding stereographers of that.

No offense, but as a professional I have always found something lacking in anaglyph as a monitor control medium, with or without glasses. Anaglyph is a primitive and inaccurate method of monitoring stereo on or off set. Its theatrical debut was at the intermission of a musical version of Moby Dick in the late 1800s, and it should have ended there.

Max Penner
Stereographer
Paradisefx.com
310 864 5124 Cell
07999288849 EU Cell
Maxp3d skype


Obviously, anaglyph is a hideous viewing format for so many reasons - that was never the issue.

The point is that when used as an offset reference, in addition to all the other tools in the production arsenal, it delivers a dependable amount of highly relevant information. The context of the original post was with regard to low cost tools that an operator can use.

Quantitatively anaglyph is a useful tool, qualitatively it sucks.

There, is that better?

Scott Billups
PixelMonger


Scott Billups wrote:

>> Quantitatively anaglyph is a useful tool, qualitatively it sucks.
>> There, is that better?

On the teaching side, whether teaching a stereo class or trying to help a 2D director or DP get their heads around it, I find that anaglyph provides an elegant way of illustrating positive and negative parallax in a scene - better in a way than the luma difference displays, because the "which side is the colour fringe on" assessment is very easy for anybody who is not colour-blind to make, and the amplitude of the parallax as one moves away from the convergence plane is easy to grasp.

Would I watch a movie with anaglyph glasses.... um, not unless I really had to.

Is it a useful tool (glasses off)

yup

Mark Weingartner
LA-based DP / VFX / stereo


Mark Weingartner writes:

>> Is it a useful tool (glasses off)

We use anaglyph a lot, glasses on, for judging depth on standard monitors. One could use active glasses, I suppose, but it wouldn't add much more information - the artists aren't looking at colour, and the need to closely examine small sections of a frame eliminates some other techniques. Anaglyph makes it easy to count pixels offset, as long as one isn't playing back a chroma subsampling format like ProRes422.

Tim Sassoon
SFD
Santa Monica, CA


Anaglyph is a useful tool if that's all you have. But other tools that are available today, such as a difference-matte view, usually give you more information in a clearer form.

As for whether the crew or the tools are more important - that's a trick question. We are still working on that "Make it look good" button. Yes, there are some really good stereographers in this business who can properly set up an S3D movie without additional tools.

But it makes your life considerably easier, and lets you experiment more, if you have a quick way to review things instantly on set. Plus, we all still have a lot to learn about stereo. So, after you have learned the rules, get out and try things, then look at them.

I think good S3D monitoring and review helps you make informed creative choices and usually quickly pays for itself.

If you have to choose between a good stereographer and good tools, pick the stereographer. But it's a bit like asking: do you want a DP or a viewfinder?

You really need both for good results.

[Full disclosure: we create some of these tools, so I have a commercial agenda.]

Lin Sebastian Kayser
CEO, IRIDAS
www.iridas.com


>>"which side is the colour fringe on" assessment is very easy for anybody who is not colour-blind to >>make

This is correct, Mark. Just wanted to add that in a scene that has a lighter-coloured background (sky, for instance) with high-contrast objects against it, the fringes will be reversed. Click on the buildings image in the article for a bigger version to see that.

http://realvision.ae/blog/2010/11/in-defense-of-the-humble-anaglyph-method-for-stereo-3d/

Regards
Clyde DeSouza
RealVision, Dubai


>> "the original post was with regard to low cost tools that an operator can use.
>> Quantitatively anaglyph is a useful tool"

I'm sorry... I still have to disagree. I can comp L&R images, or build a BS box around existing monitors, for the same price as producing anaglyphic imagery, and qualitatively have a cleaner image for judging image disparity.

Professionally, there is much more than disparity to consider - X, Y & Z and all the rest of it - when monitoring stereo live-action capture. I don't believe anaglyph is the right tool to use.

Please forgive me for speaking in general, but I'm at work.

Respectfully from iPhone,

Max Penner
Stereographer
Paradisefx.com
310 864 5124 Cell
07999288849 EU Cell
Maxp3d skype


>> Anaglyph makes it easy to count pixels offset, as long as one isn't playing back a chroma subsampling format like ProRes422.

Would you define this a bit more please?

Best

Argyris Theos
DoP
Athens Greece
+306944725315
Skype: Argyris. Theos


>"Anaglyph makes it easy to count pixels offset, as long as one isn't playing back a chroma

> subsampling format like ProRes422."

If one is trying to stick a CGI object to a floor in a stereo scene, the easiest way to do that is to count the number of pixels offset at the point where you want to stick it, and then match that same offset in the object. Looking at it in stereo may not give you the right answer all of the time, just as in colour correction, where oftentimes you'll do better to look at the scope than at the picture.
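Purely as an illustration of the idea (a sketch of my own, not a specific pipeline; simple block matching stands in for however one actually measures it), counting that offset at a point might look like this:

import numpy as np

def disparity_at(left, right, x, y, patch=8, search=64):
    # left/right: 2D grayscale arrays of equal size.
    # Returns the horizontal shift (in pixels) that best matches the
    # left-eye patch centred on (x, y) within the right-eye frame.
    ref = left[y - patch:y + patch, x - patch:x + patch].astype(np.float32)
    best_d, best_err = 0, np.inf
    for d in range(-search, search + 1):
        x0, x1 = x - patch + d, x + patch + d
        if x0 < 0 or x1 > right.shape[1]:
            continue                      # candidate window falls off the frame
        cand = right[y - patch:y + patch, x0:x1].astype(np.float32)
        err = np.abs(ref - cand).sum()    # sum of absolute differences
        if err < best_err:
            best_err, best_d = err, d
    return best_d

Give the CG element that same horizontal offset between its left- and right-eye renders and it will sit on the floor at that point.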

Tim Sassoon
SFD
Santa Monica, CA


Don't worry, Apple's got you covered

http://www.fastcompany.com/1706347/apple-wins-patent-for-sci-fi-like-glasses-free-3d-tv

(with head tracking too)

Alan Lasky
PROC


Yes, I clearly understand that. The question was about ProRes422. I am a Windows man and I do not have any experience with this codec. But if it stops you from using anaglyph for at least part of a stereoscopic workflow, this might be a bit frustrating. Hence the question.

Best

Argyris Theos
DoP
Athens Greece
+306944725315
Skype: Argyris. Theos


>> The question was about ProRes422

Well, if one's determining things by hue, and one is using a codec that subsamples chroma, then one is looking at fractional resolution. Fair enough for an OBM (on-board monitor), but in terms of compositing, it's often fairly easy to see a single-pixel error in a 2K image projected.
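To put rough numbers on that (a back-of-the-envelope sketch of my own, not measured data):

# Chroma sample grid left by common subsampling schemes, which bounds how
# precisely a hue-only (anaglyph) edge can be located.
def chroma_samples(width, height, scheme):
    factors = {
        "4:4:4": (1, 1),  # full-resolution chroma
        "4:2:2": (2, 1),  # half horizontal chroma (e.g. ProRes 422)
        "4:2:0": (2, 2),  # half chroma in both axes
    }
    fx, fy = factors[scheme]
    return width // fx, height // fy

print(chroma_samples(2048, 1080, "4:2:2"))  # (1024, 1080): hue edges land on ~2-pixel steps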

I'm not sure how this is useful information, but for some reason I stuck it in there, perhaps ill-advisedly.

Tim Sassoon
SFD
Santa Monica, CA


>> But if it stops you from using anaglyph for at least part of a stereoscopic workflow, this might be a bit frustrating. Hence the question.

Even I don't know how this fits in as advice.. but here goes:

The thing with anaglyph is that you have to have it rendered at the last mile, i.e. only at the time of display, not rendered to a codec. Most codecs will subsample colour, and this will result in emboss-like artifacts that show on the edges of anaglyphs. This will throw all the pixel counting off. Try it by saving a Photoshop-created anaglyph as JPEG: only at quality settings of 10 or higher (in Photoshop) do the emboss artifacts disappear.

Any commercial anaglyph DVD also shows this artifact if you freeze-frame a reasonably contrasty scene. Codecs such as CineForm are good because they mix to anaglyph at the last mile, so on an NLE timeline the anaglyph will not show these artifacts, but there's no proper way to apply effects to the left/right views when using a mixed CineForm file (at least this has been my experience).

However, for quick edit work in real time and "online" with the need for transcoding/proxy etc., I recommend a good PC with Nvidia and Premiere CS5. The days of "offline" are numbered. Depth grading, keyframed HIT and "depth ramps" or depth blending are best done in real time in an online environment.

Throwing two RED 4K layers with anaglyph channel filters onto a reasonably powered PC (an i7 with good RAM and an Nvidia Quadro) should let you preview anaglyph on the fly. What I also learned from a webinar today is that since September, Premiere even has RED Rocket support built in natively. Simply flip a switch if you insist on full-resolution decoding.

Regards.
Clyde DeSouza
RealVision Dubai.


>> However, for quick edit work in real time and "online" with the need for transcoding/proxy etc., I recommend a good PC with Nvidia and Premiere CS5.

I meant without the need for transcoding/proxy, not with.

Regards
Clyde DeSouza
RealVision, Dubai


Lin Sebastian Kayser wrote:

>> Anaglyph is a useful tool if that's all you have. But other tools that are available today, such as a difference-matte view, usually give you more information in a clearer form.

Just to beat this to death - I would never claim that anaglyph is a superior tool, rather that it is a useful one in teaching and seeing what is going on. It is easier for me to show a neophyte what is going on with anaglyph than with some of the other tools, but I would much rather look at the output of a 3ality SIP set in luma difference than look at anything else in order to set a shot... and that includes a stereo monitor, for that matter.

For me the perfect situation on set is a SIP feeding a monitor plus a stereo monitor (I favour the Cinetronics beamsplitter monitor for so many reasons that I now rent them to people), but if we have to go up a long spiral staircase and I can only take one thing, it will be a monitor that shows me the output of a SIP.

As Lin was too modest to point out in detail, IRIDAS has some cool tools for us - not just to see it but to fix it too!

Mark H. Weingartner
LA-based DP & VFX Supervisor


Hi all,

thank you for the responses.

First off, let me clarify, as some might have misunderstood me: I'm not a stereographer, nor do I believe I am one, or will become one in a year or two. I know about the learning curve and, frankly, I enjoy it. I mostly use and rent VA and DIT packages, film and HD. That took a few years too, and I'm far from happy with what I know. With 3D, I'm reading all I can, trying to get the info, and currently I may be able to ask questions. Or not...

So I know most stereographers don't use the polarized 3D monitors, or only watch 2D. Or anaglyph, or difference displays. I guess they bring their own stuff or have the production get something from the provider of the rig. That's why I asked: what kind of input signal do you guys need from me? 2D, 3D side-by-side, dual camera feeds? I have to design this into this provisional 3D VA rig.

Also, I am hearing different opinions on what the _other_ crew members can watch. Does the director or DP usually request to watch 3D, and is s/he told off by the stereographer? If they can watch 3D... now that's where my question really starts. I know the 3D effect is directly proportional to the viewer's distance from the screen. Which would practically mean that for the three guys in front of the monitor, they need a 42" screen. But that has its limitations on location (I should know - I hauled around a 50" plasma for 130 days for del Toro). And then you'll need two.


So practically, I'd design the rig around two 24" Panasonics. But if that minimizes the apparent depth, as my limited knowledge tells me it will, I'll put the stereographer in a very bad position where the director will always ask for more depth. So from this point of view it's better to tell them to watch 2D. But if they ask, I should be able to answer.
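For reference, here is the rough geometry as I understand it (my own back-of-the-envelope numbers, nothing from a real setup):

EYE_SEP_MM = 65.0  # assumed average adult interocular distance

def perceived_depth_mm(parallax_mm, viewing_distance_mm, eye_sep_mm=EYE_SEP_MM):
    # Depth of a point relative to the screen plane; positive parallax
    # (right-eye image to the right of the left-eye image) puts it behind.
    return viewing_distance_mm * parallax_mm / (eye_sep_mm - parallax_mm)

# The same 0.2%-of-screen-width positive parallax on a ~530 mm wide 24" monitor
# viewed from 1 m, versus a 10 m wide cinema screen viewed from 10 m:
for width_mm, dist_mm in [(530, 1000), (10000, 10000)]:
    p = 0.002 * width_mm
    print(width_mm, round(perceived_depth_mm(p, dist_mm)))  # ~17 mm vs ~4444 mm

If that arithmetic is right, the small monitor really does understate the depth the audience will eventually see.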

So again, what is, or will be, the de facto standard for the director/DP/producers to monitor 3D on set?

Thank you

Balazs Rozgonyi
Rental Manager / HD video operator
www.videoassist.hu


Argyris Theos wrote:

>> The question was about ProRes422. I am a Windows man and I do not have any experience with this codec. But if it stops you from using anaglyph for at least part of a stereoscopic workflow, this might be a bit frustrating

Argyris,

By your reasoning, those same restrictions would cover ALL 4:2:2 subsampled codecs and recordings, and by this reckoning it would disallow any and all 4:2:2 codecs over single-link HD-SDI? What about other recording processes?

Would this also exclude HDCAM at 3:1:1, or Sony's EX or Panasonic's AVCHD, which are long-GOP 4:2:0?

I doubt that even looking at a live 4:2:0 signal would degrade the on-screen colour enough to affect viewing of anaglyph imagery for critical alignment use on set.

/Rant

Gary Adcock

Studio37
HD & Film Consultation
Chicago, USA


>> But if it stops you from using anaglyph for at least part of a stereoscopic workflow, this might be a bit frustrating. Hence the question.

The thing to take into consideration is what part of the workflow you want to use anaglyph for.

On location, taking the left/right views into a Blackmagic 3D monitoring box and feeding it to a display should not cause any compression/subsampling artifacts. On an NLE timeline, using two discrete videos, no matter what the original compression is, and throwing on two channel filters and 50% opacity will also not give you any anaglyph subsampling artifacts.

Where you will get anaglyph subsampling artifacts (and the key phrase here is "anaglyph subsampling") is if you save out a burned-in anaglyph video to any of these codecs. If you later play back the resulting anaglyph, it will show the artifacts.

The usual way is to save discrete left/right, or OpenEXR multiview, or even simple left/right files, and let the compositing/NLE program or hardware do the last-mile mixing (basically removing channel info from the clips).
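As a minimal sketch of that last-mile mix (my own illustration, not any particular product's pipeline):

import numpy as np

def last_mile_anaglyph(left_rgb, right_rgb):
    # left_rgb/right_rgb: HxWx3 uint8 arrays. Red from one eye, green and
    # blue from the other (which eye feeds red is a convention that varies).
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]  # swap in the other eye's red channel
    return out

Because the mix only ever exists on the way to the display, nothing subsampled gets baked into a file, and the pixel counting stays honest.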

Regards
Clyde DeSouza
Realvision Dubai


Here's a limited perspective from a guy who's only done one 3D project, with a director who had limited 3D experience.

I see 3 issues here.

The first is what the technical crew - DP and stereographer - need to make stereo decisions. Here it's true that anaglyph is plenty, and probably even just an overlay will be fine for some people.

The second issue is aesthetic: do the director, and possibly the DP, need a 3D monitor to evaluate what they're doing? Here I think a 3D monitor is immensely valuable, because composition and even lighting feel different in 3D than in 2D. A great 2D shot may suck in 3D, and a 3D composition that works very well may be boring in 2D. Anaglyph is better than nothing for evaluating this, but a colour 3D monitor would certainly be better.

The third is just logistical and economic. A 42" monitor is just a giant pain in the butt on set, so it's generally impractical. Until recently there didn't seem to be any affordable smaller monitors, but now the Panasonic and Sony 24" models are available, so I would think they would be great to have on set. Why not?

Re: is it too small to make intelligent decisions? I don't think so. You just need to keep in mind the caveat that a large screen will magnify issues. We learned plenty from a 17" anaglyph, so a 24" colour monitor would have been heaven. At some point the director needs to trust his stereographer on that stuff. Somebody on this forum once said it was essential that any 3D project check its rushes nightly on a theatrical screen.

Well... As for gadgets etc.: we used a Cinetal Davio box (there are other similar products) to run the SDI camera signals into. From there the box could be switched to give us:
- straight video for colour and single-camera monitoring
- anaglyph to make stereo decisions, and to feed a small monitor by camera for the operator/stereographer to set convergence and alignment
- stereo HDMI if you have a colour 3D monitor

We were recording directly on camera so didn't also need to feed recorders.

In a more complicated scenario you could use a DA to split the signal into different paths so you could do multiple tasks at the same time - i.e. recorders, stereographer/DP, DIT/DP and director. But someone with much more experience than me could answer that.

Leonard Levy, DP & baby stereographer
San Rafael, CA


>> Do the director, and possibly the DP, need a 3D monitor to evaluate what they're doing? Here I think a 3D monitor is immensely valuable

On any set there are at least the OBM and director/DP levels of monitoring, possibly client as well. I don't think anyone would even for a second argue for sending the director or clients anaglyph, except as a prank. And I think the main reason it (and other meta-stereo techniques) works in OBM is that one can get a relatively objective assessment from any angle, quickly, and without having to remember where one left one's glasses. As soon as someone comes up with a really reliable autostereoscopic display, no one will use anything else. I can't imagine that it would be otherwise.

Tim Sassoon
SFD
Santa Monica, CA


>> I don't think anyone would even for a second argue for sending the director or clients anaglyph, except as a prank.

Many paths to the kingdom and many levels of production.

If I am at the top of a mountain with a second unit director, we probably don't have a 42" monitor with us... an on-board monitor that makes anaglyph may be all that is available.

Let's not forget that not all production happens within easy access of humongous videocarts....

Weingartner LA DP VFX S3D
boats, barges, mountains... whatever


>> If I am at the top of a mountain with a second unit director

But a second unit director's not going to give you a load of crap about how unprofessional it is to only have an anaglyph display for them.

Tim Sassoon
SFD
Santa Monica, CA




Copyright © CML. All rights reserved.