Yikes,
I plan to shoot a computer monitor. It doesn't have to be in sync - just a quick shot of someone looking at a gay porn internet site while at work (yes, it is a comedy). I suppose the solution is as simple as renting a Cinematography Electronics synchronizer, placing it behind the monitor, and away we go?
I realize that computers run at different speeds than monitors - Macs run at 75 Hz ... does this mean the camera speed will be about 37.5 fps?
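My back-of-the-envelope attempt, in case it helps frame the question (just a sketch, assuming the camera must expose a whole number of refreshes per film frame - the 75 Hz figure is only what I've read about Macs):

```python
# Candidate camera speeds for a monitor refreshing at R Hz, assuming
# each film frame exposes a whole number of video refreshes
# (camera speed = R / n for integer n).
def candidate_speeds(refresh_hz, max_divisor=4):
    return [(n, refresh_hz / n) for n in range(1, max_divisor + 1)]

for n, fps in candidate_speeds(75.0):
    print(f"{n} refresh(es) per frame -> {fps:.2f} fps")
# 75.00, 37.50, 25.00, 18.75 fps - so 37.5 fps is one option, but
# 25 fps (three refreshes per frame) looks far more practical.
```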
The camera is an Aaton XTR.
Thanks so much in advance for any help,
Duraid Munajim
>I plan to shoot a computer monitor...I suppose the solution is as simple as renting a Cinematography Electronics synchronizer, placing it behind the monitor, and away we go?
That is basically correct. You then need to move the phase bar off the screen: look at the monitor full frame in the camera viewfinder while rolling the camera with the magazine removed, and turn the phase control on the sync box until the bar moves off the top or bottom of the video screen while the camera is running. Then you can load the camera and shoot, knowing that the camera will run in sync and the phase bar will not be visible on the film.
There is one pitfall that you must be careful of: if you place the sync control's inductive pickup near the power transformer of the monitor, you might sync to the line frequency, not the monitor refresh rate. Bummer! Beware if the frame rate suggested by the sync box is exactly half of the local line frequency. Move the pickup around until you get the monitor refresh rate. This only gets weird if the refresh rate is close to the line frequency; then you have to analyze what you see in the phase test to be certain that the phase bar is stationary relative to the edge of the monitor before you move it off the screen with the phase knob on the sync box.
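To make that check concrete, here's a minimal sketch of the test Bill describes, assuming the sync box simply reports a suggested camera speed from whatever the pickup happens to be reading:

```python
# If the suggested speed is exactly half the local mains frequency,
# suspect that the inductive pickup is reading the monitor's power
# transformer rather than its refresh rate.
def looks_like_mains_pickup(suggested_fps, line_hz, tol=0.01):
    return abs(suggested_fps - line_hz / 2.0) < tol

print(looks_like_mains_pickup(30.0, 60.0))  # True  - suspect mains (60 Hz country)
print(looks_like_mains_pickup(25.0, 50.0))  # True  - suspect mains (50 Hz country)
print(looks_like_mains_pickup(37.5, 60.0))  # False - plausibly a 75 Hz monitor
```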
Bill Bennett Los Angeles
Using the Cinematography Electronics video sync units, or the Arriflex unit for the 435, which slave the camera to the monitor frequency - does this spell trouble when using HMIs? Can you get flicker even with flicker-free HMIs, since the camera speed is not exactly crystal and follows the monitor if the monitor drifts?
Just curious - or will flicker-free HMIs work fine with what is essentially no longer a crystal-controlled camera?
Steven Gladstone
http://home.earthlink.net/~veenotph
Strictly speaking, no - make that "I think" - Macs and PCs refresh at the rates set by the graphics card, and these can usually be varied by going into the display settings.
If your monitor is refreshing at 75 Hz then you've got a damned good graphics card. I always set the refresh as high as the monitor will stand, as I'm very aware of flicker - one VERY famous cameraman I know says he couldn't see the flicker on his monitor, but it drove me crazy the time I was in front of it! You know who you are ;-)
Monitor resolution, on the other hand, tends to be constant (well, sort of): Macs resolve 72 pixels per inch while PCs resolve 96 ppi.
Hope that's confused you further....
Regards,
Shangara Singh Lighting Cameraman
London Based Adobe Certified Expert(ACE) Photoshop 5
>one VERY famous cameraman I know says he couldn't see the flicker on his monitor, but it drove me crazy the time I was in front of it!
I have noticed that different people have different thresholds for perceiving or not perceiving flicker. I am right on the edge at 50 Hz, so that watching PAL or sitting under most 50 Hz fluoros in Europe I am not bothered EXCEPT at the edges of vision... my peripheral vision is more flicker-sensitive than the center. Watching NTSC video (at almost 60 fields per second) I do not perceive flicker at all... but the color sucks :-) Doug Trumbull and Richard Yuricich did a lot of research on flicker rates when they were cooking up Showscan, and they settled on 60 fps / one flick per frame as being safely ABOVE the threshold for the viewing audience.
Mark H. Weingartner
Lighting and VFX for Motion Pictures
The 1/60th-of-a-second rate for the flicker threshold is also very consistent with the perceptual research.
Jessica Gallant
Director of Photography and Listmum Studio City, CA USA
Surely the perception of flicker is dependent upon luminance as well as frequency. Everyone will have noticed how much more a film projection screen appears to flicker if the film runs out and you are left viewing pure white. The whole question of screen brightness standards revolves around this.
One of the claimed advantages of 30 fps theatrical projection (i.e. 60 Hz flicker on a two-blade shutter) was the ability not only to resolve faster action, but also to project a brighter image.
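The interrupt arithmetic behind that claim is simple enough to sketch (nothing here beyond frame rate times blade count):

```python
# Projection flicker rate = frame rate x shutter-blade count; a
# two-blade shutter interrupts the light twice per frame.
def flicker_hz(fps, blades):
    return fps * blades

print(flicker_hz(24, 2))  # 48 Hz - conventional projection, near the threshold
print(flicker_hz(24, 3))  # 72 Hz - the usual remedy, a three-blade shutter
print(flicker_hz(30, 2))  # 60 Hz - the 30 fps scheme described above
```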
Dominic Case
Group Technology & Services Manager
Atlab Australia
I'm not Mac-literate, but PCs can be set to run at a number of frequencies - mine's at 85. When I've shot computer commercials I've generally re-set all the monitors in shot to run at around 60 (I say around 60 because they're not that accurate) and locked the camera to that.
Flicker-free HMIs are fine used like this.
If you need to see a lot of computer monitors in shot at the same time, and can live with the same picture on them all, then the safest thing is to feed all the monitors from a separate computer with a DA (distribution amplifier) attached and sync the camera to that.
Geoff Boyle
Dominic wrote:
>Surely the perception of flicker is dependent upon luminance as well as frequency...
>...The whole question of screen brightness standards revolves around this.
True enough... in fact, working on the Linear Loop projectors as we were developing them for RP, we had to scrim the lamphouses way down to look at focus, otherwise the flicker drove us mad... I guess the percentage change from light to dark is the real issue, but there may well be physiological issues that have to do with rods' flicker sensitivity vs. cones' flicker sensitivity, and/or response times vs. luminance levels in the eye.
>One of the claimed advantages of 30 fps theatrical projection (i.e. 60 Hz flicker on a two-blade shutter) was the ability not only to resolve faster action, but also to project a brighter image.
This part doesn't make sense to me... a shutter that cuts out a given percentage of the circle will make the same brightness at any frame rate... it may be that at 24 fps the dark portions of the cycle are more apparent than they are at 30 fps. This would yield an APPARENTLY brighter image. Just guessing here.
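A back-of-the-envelope sketch of that duty-cycle point (illustrative numbers only - real projector shutter geometries vary):

```python
# Average light on screen = lamp output x fraction of the cycle the
# shutter is open; the frame rate cancels out of the average.
def average_brightness(lamp_output, open_fraction=0.5):
    return lamp_output * open_fraction

# The same shutter geometry gives the same average at 24 or 30 fps:
print(average_brightness(32.0))  # 16.0 either way
# What frame rate changes is how NOTICEABLE the dark intervals are -
# which turns out to be Dominic's point, clarified below.
```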
I hear rumors, by the way, that we may have won a Technical Achievement award from the Academy of Motion Picture Arts and Sciences for our projectors as used for RP... not yet confirmed. (No one ever tells me anything :-|)
Mark H. Weingartner
Lighting and VFX for Motion Pictures
>when I've shot computer commercials I've generally re-set all the monitors in shot to run at around 60 (I say around 60 because they're not that accurate) and locked the camera to that.
I suppose that as long as they can stay put at 60 for the length of the shot, it would suffice. When filming a lot of monitors, this would seem to be the way to go.
Time to take another look at the PC. If I can make it run at 60, all I would have to do is run the camera at 29.97.
Hmmm...
Maybe I'll rent the synchronizer, just in case.
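Running the numbers leans me that way (a rough sketch - this assumes the monitor really sits at a true 60.00 Hz, which nobody has promised me):

```python
# The beat between the monitor's refresh rate and twice the camera
# speed is how fast the phase bar drifts through the frame.
def bar_drift_period_s(refresh_hz, camera_fps, refreshes_per_frame=2):
    beat = abs(refresh_hz - camera_fps * refreshes_per_frame)
    return float('inf') if beat == 0 else 1.0 / beat

print(bar_drift_period_s(60.00, 29.97))  # ~16.7 s - the bar creeps through
print(bar_drift_period_s(59.94, 29.97))  # inf - locked (NTSC-rate monitor)
print(bar_drift_period_s(60.00, 30.00))  # inf - locked, if I could run a true 30
```

So at a true 60 Hz a 29.97 camera gives a 0.06 Hz beat - a bar crawling through the frame every 17 seconds or so. Hence the synchronizer.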
BTW, thanks for all the answers - I feel less foggy about the situation.
Duraid Munajim, Montreal (-24 Celsius), Canada
Sorry, Mark, I didn't make my point very clear.
At 24 fps (48 interrupts per second), 16 ft-lamberts is about as bright as you can get without flicker becoming apparent. If you have 60 interrupts per second (30 fps with the same two-bladed shutter and pull-down mechanism), then you can increase the lamp brightness to get a brighter image (more visual impact) well above 16 ft-L before the same visual flicker appears.
But you do need to increase the lamp: you don't get something for nothing!
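To put an (illustrative) number on that trade-off: the textbook Ferry-Porter relation says the critical flicker frequency rises with the log of luminance. A sketch with ballpark constants - the real values depend heavily on viewing conditions:

```python
# Ferry-Porter law: CFF = a + b * log10(L). The constants below are
# purely illustrative ballpark values, not measured projection data.
A_HZ, B_HZ_PER_DECADE = 37.0, 12.5

def max_luminance(interrupts_hz):
    """Highest luminance (arbitrary units) before flicker shows at this rate."""
    return 10 ** ((interrupts_hz - A_HZ) / B_HZ_PER_DECADE)

headroom = max_luminance(60) / max_luminance(48)
print(f"60 vs 48 interrupts/s -> roughly {headroom:.0f}x luminance headroom")
```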
Dominic Case
Group Technology & Services Manager
Atlab Australia