Home of Professional Cinematography since 1996

Defining 6K Film Stocks

Published: 25th March 2004


I noticed recently that Geoff suggested (in CML Basics) that the latest stocks may be 6K. If this is the case, what is the estimate from people 'in the know' regarding when our friends in the post community will be able to work at this resolution? At the moment, it seems to me that film technology (driven by competition from HD) is outstripping post technology. 6K is a term I've yet to hear from anyone in the post world; it all seems to be 2K and 4K at the moment.

Mark Wiggins
DP/Operator/London
www.productionbase.co.uk/cv/scare



Mark Wiggins wrote :

>If this is the case, what is the estimate from people 'in the know' regarding when our friends in the post community will be able to work at this resolution?

We are currently in post on the ASC-DCI (Digital Cinema Initiatives) test, where the original film elements were scanned at 6K and 16 bits per component on a Northlight scanner. File sizes were 170MB each (one-half reel is about 2.5TB).

It has been painful to move these around, but doable. It is going to be an expensive way to work for quite a while because I/O is not increasing with the same leaps and bounds as CPU, memory, and disk.
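Jim's figures line up with simple arithmetic. A minimal sketch in Python — the 6K frame dimensions and the reel length below are assumptions for illustration; the post itself only gives the ~170MB and ~2.5TB figures:

```python
# Rough storage arithmetic for a 6K, 16-bit-per-component RGB scan.
# ASSUMPTION: 6K full-aperture dimensions of 6144 x 4668 pixels.

width, height = 6144, 4668         # assumed scan dimensions
components = 3                     # R, G, B
bytes_per_component = 2            # 16 bits per component

frame_bytes = width * height * components * bytes_per_component
frame_mb = frame_bytes / 1e6       # ~172 MB, close to the quoted 170MB

# ASSUMPTION: a 2000-ft 35mm reel at 16 frames/ft, so ~16,000 frames per half reel.
frames_half_reel = 2000 * 16 // 2
total_tb = frames_half_reel * frame_bytes / 1e12

print(f"{frame_mb:.0f} MB/frame, {total_tb:.1f} TB per half reel")
```

With those assumed dimensions the result lands within a few percent of the quoted numbers, which suggests the scans were essentially uncompressed RGB.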

Jim Houston
Pacific Title Imaging



Jim Houston wrote :

>the original film elements were scanned at 6K and 16-bits per component on a Northlight scanner. File sizes were 170MB each.

What does this mean in terms of pixels? It has got me wondering: if you shot a feature on a 6K film stock and went via DI (with its 4K and 2K limitations), would you end up with a print that has less resolution than if you had gone the optical route (on the basis that the higher the resolution you start with (6K), the higher the resolution you finish with)?

Mark Wiggins
DP/Operator/London



Mark Wiggins wrote:

>It has got me wondering, if you shot a feature with a 6K film stock, if you went via DI . . .

There is typically a greater loss from the two generations (IP, DN) done photochemically than digitally.

Jeff Kreines



Are the imaging characteristics of current 35mm lens technologies able to resolve that much information? If not, then I'd think the only thing you'd be gaining from a higher resolution scan is better resolving of film grain, not necessarily more actual image detail.

Jason Rodriguez
Post Production Artist
Virginia Beach, VA



Jason Rodriguez wrote:

>Are the imaging characteristics of current 35mm lens technologies able to resolve that much information?

Yes, high-quality lenses can reach that resolution; it is only about 150 line pairs per mm. In this particular case, it was a Nikkor printing lens. Measurements on this type of lens showed that up to about 280 lp/mm was possible, with about 80% falloff in the MTF. Camera lenses can have more of a problem, and are perhaps the most significant limiter on resolution. When shooting this test, the ASC hand-selected the best possible lenses for sharpness.

I've also looked at some difference images to see what detail is pulled out, and there is a definite crispening effect on edges.

The subjective appearance feels like about 1/3 of the difference in sharpness between 2K and 4K. So going from 4K to 6K gives less bang for the buck, but it does show subtle image improvements.

Most of the improvement from a 6K image comes from using it to supersample the original element. Downsizing to, say, a 3K image improves both edge detail and noise characteristics compared to an original 3K scan.
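The noise benefit of supersampling can be sketched numerically: averaging 2x2 blocks of a grain-like random field roughly halves its standard deviation. This is a minimal illustration with synthetic Gaussian noise, not a model of real scanner output:

```python
import numpy as np

# A flat field with grain-like Gaussian noise standing in for a high-res scan.
rng = np.random.default_rng(0)
hi_res = rng.normal(loc=0.5, scale=0.1, size=(1024, 1024))

# Block-average 2x2 neighbourhoods: a crude stand-in for Nyquist downsampling.
lo_res = hi_res.reshape(512, 2, 512, 2).mean(axis=(1, 3))

# Averaging 4 independent grain samples divides the noise std by sqrt(4) = 2.
print(hi_res.std(), lo_res.std())
```

The same averaging also smooths edges toward their true positions, which is the "improved edge detail" half of Jim's observation.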

Lastly, there is a paper by Roger Morton of Kodak in last year's SMPTE Journal which shows that current T-grain films carry some image detail at greater than 5.5K. It is a small amount, but it is nevertheless there.

Jim Houston
Pacific Title Imaging



Mark Wiggins wrote :

>[With 2K 4K DI] does this mean that you would end up with a print that has less resolution than it would have if you had gone the optical route (on the basis that the higher the resolution you start with (6K), the higher the resolution you finish with)?

Yes. The best result is usually from contact printing from the original negative. There is a small loss in resolution at the answer-print stage, then a medium loss at the IP/IN stage, and yet another loss at the release-print stage. DIs often maintain a better appearance for the final by generating the IP directly (saving one generation) or sometimes, for short films, by generating the IN directly.

In the end, the differences in resolution from the original source material get squeezed in the duplication process so that the end results look pretty close to each other.

Jim Houston
Pacific Title Imaging



Thanks for the information. I look forward to seeing the results of Geoff's latest tests with interest.

Mark Wiggins
DP/Operator/London



Jim Houston wrote :

>Lastly, there is a paper by Roger Morton from Kodak in last year's SMPTE Journal which shows that current t-grain films have some image detail at greater than 5.5K. It is a small amount, but nevertheless there.

SMPTE journal Volume 111 Number 2, February/March 2002

http://www.kodak.com/US/c/enorp/researchDevelopment/productFeatures/dCinema.shtml
also has some links to Roger's work.

In general we've also found you can see a benefit from 6K on some charts, even on some of the older stocks, but if you go all the way down to 2K you are going to lose most of the difference.

Kevin Wheatley
Senior Do-er of Technical Things
Cinesite (Europe) Ltd



This 6K thread is very interesting as resolution is one of those issues that has a number of answers, depending on the reason for the question.

I have been lucky enough to have been involved in digital film for some time, both with its initial VFX shot use and today's DI applications.

What I have found is that the business model often has more sway than any technical argument - specifically because the resolution of a film print as seen in a cinema has a real-world figure of around 1.2K pixels. This has been proven time and again via any number of tests, both scientific (measuring MTF losses) and through empirical/visual testing of the OCN/IP/IN/print route.

Although there are much better OCNs available today their impact on the final distribution print is an interesting question. I have yet to see figures for such capture to projection via the optochemical path but I'm very interested to see if there is an improvement and by how much.

However, I was lucky enough to be contracted to work on the first feature shot with Vision2 OCN (LD50 in the UK - posted at MPC and shot by Robin Vigeons BSC) and was very impressed with the stock for 2K DI use. It was a much easier process than with previous stocks, as its ability to hold detail was very impressive and therefore improved the DI MTF throughput. Obviously, there are far fewer losses via this route than via optochemical, especially if you digitally project the final image - not that this was done this time.

For film distribution the best route is to 'film out' multiple INs from which the projection prints can be made. The resultant MTF is visibly better, but not always cost-effective - film recorders are still too slow and expensive.

Obviously this quality is also available (better!) via OCN printed directly, but few productions will risk multiple prints from the OCN, or even multiple IPs to INs for obvious reasons.

For these reasons I tend to measure the requirements for digital cinema from the delivery stage and not from capture. 1.2K print vs. 4K or even 6K OCN.

In tests I have been able to perform, I have projected answer prints (from older OCN stocks, it has to be said) in parallel with both the newer 2K DMD projectors and the older 1280-pixel ones. The digital projectors have always matched the print quality, with the 2K ones bettering it by some margin.

What is very interesting is that I have just been involved in a test that pitted 35mm OCN against Thomson's Viper camera in a carefully controlled test using the exact same lighting and a motion control rig. This was performed at Motion FX in the UK and the results were very interesting to say the least.

To be fair, the 35mm OCN stock was 500 EI, but the test was deliberately done in limited illumination conditions.

The film was scanned at 10-bit log 2K via a Spirit, which obviously won't capture the full MTF of the film, but it is a fair example of what was on the film, as the test later showed - the smallest component of the image was film grain or grain clumps, not individual digital pixels.

With both image sequences graded only the minimum needed to balance them against each other (the film not at all, actually), a split-screen comparison showed the Viper image to be every bit as pleasing as the film and containing just as much detail in resolution, dynamic range and colour, although the focus point on the two takes was unfortunately not exactly identical.

What was even more impressive was when the images were expanded (zoomed) by 400%: the film degraded first, due to the grain size, while the Viper images still looked acceptable.

If possible, I will get example stills on the Digital Praxis website by the week's end.

What this test seems to suggest, and what is backed up by other tests I have performed, is that when scanning film it is best to use Nyquist downsampling from a larger resolution (4K to 2K, for example), as the film grain acts as a 'pre-filter' on the original scene. The same is true when printing, as the film grains never line up. When capturing via a digital camera there is nothing between the camera and the scene, and so a better MTF for the available sensor resolution is possible.
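The pre-filter point can be sketched with a toy signal: decimating high-frequency detail without filtering folds it back as a full-strength alias, while even a crude 2-tap average suppresses it. The test signal and filter here are illustrative assumptions, not a model of grain:

```python
import numpy as np

x = np.arange(256)
fine = np.cos(2 * np.pi * 0.45 * x)           # detail just below the original Nyquist limit

naive = fine[::2]                              # 2:1 decimation with no pre-filter
filtered = 0.5 * (fine[0::2] + fine[1::2])     # crude 2-tap average, then decimate

# The naive path keeps nearly full amplitude at a false (aliased) frequency;
# the averaged path attenuates that energy strongly.
print(naive.std(), filtered.std())
```

This is why a 4K scan filtered down to 2K can look cleaner than a native 2K scan of the same negative.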

At the end of the day I go back to my original comment. The cost of producing a theatrical 'film' release above the quality presently enjoyed in cinemas for traditional film releases is not viable as a business. Joe Public will not pay extra to sit in the same seat in the same theatre unless the difference is stunning - 3D or IMAX, possibly.

Everything I have seen suggests that 2K DI for digital or 'film' projection offers production and post-production benefit combined with a growing and acceptable business model.

Steve Shaw
Digital Praxis Ltd.
www.digitalpraxis.net



>At the end of the day I go back to my original comment. The cost of producing a theatrical 'film' release above that of the quality presently enjoyed in cinemas for traditional film releases is not viable as a business.

I'm sorry, I'm kind of lost on the point of this post. Most theatres are currently limited to 1.2K resolution, so we should limit film scans to 2K resolution? That seems awfully short-sighted, even from a business perspective.

Industries, from a capitalistic perspective, are supposed to thrive on competition - by competing for consumers' money, individual businesses are supposed to offer better-quality goods for the same or lesser amounts of money. We have seen this, to some extent, in the computer industry and in other areas of telecommunications.

Why is the motion picture distribution industry immune to this? We should theoretically be seeing theatres competing against one another by offering better quality projection, sound, seating, etc.

And many theatres have improved their seating, their sound systems, etc, but not their image quality - it seems to be decreasing.

Jessica Gallant
Los Angeles based Director of Photography
West Coast Systems Administrator, Cinematography Mailing List
http://www.cinematography.net



>Why is the motion picture distribution industry immune to this?

Because the real competition is to stay at home and watch TV.

Noel Sterrett
Baytech Cinema
www.baytechcinema.com



>I'm sorry, I'm kind of lost on the point of this post. Most theatres are currently limited to 1.2K resolution, so we should limit film scans to 2K resolution? That seems awfully short sighted, even from a business perspective.

Also, even if the process of printing and projection lowers resolution, it doesn't mean that the digital negative can therefore be of lower resolution compared to a camera negative. You'll want to start at a higher resolution just to survive some of the loss of the later steps of post and exhibition.

This is why I think 4K has to become the norm for D.I. work eventually, regardless of what the resolution of the projected image finally becomes.

You start with a lower-resolution image and it becomes even LOWER by the time it hits the screen, so it doesn't make sense to set the resolution level for digital work below that of current 35mm camera negative. I mean, 2K currently makes sense financially and technically because of 4K's far greater data requirements, but hopefully that hurdle can soon be overcome. It would be a shame to lock in 2K as the standard for all digital negatives for years to come.

For D.I.'s to really become indispensable, it has to become an invisible step -- i.e. the final digital negative retains all the information on the original camera negative. Otherwise, we're just talking about a compromise, which is fine for the current situation but hopefully won't be necessary soon.

Otherwise, what's the point of getting beyond the resolution limits of current HD cameras? We're all hoping those get better soon, right?

David Mullen
Cinematographer / L.A.



Steve Shaw wrote :

>I have just been involved in a test that pitted 35mm OCN against Thomson's Viper

Your findings are not a surprise to me, as Viper truly represents the state of the art in terms of usable, deployable HD-based Digital Cinematography today. The cleanest path from live image to data.

Martin Euredjian
eCinema Systems, Inc.
www.ecinemasys.com



>At the end of the day I go back to my original comment. The cost of producing a theatrical 'film' release above that of the quality presently enjoyed in cinemas for traditional film releases is not viable as a business.

It is interesting that DCI is leaning toward 6K scanning at 12-bit log color depth, with 4K projection as their ultimate goal. There must be some serious reason for that.

Steven Poster ASC



>It is interesting that DCI is leaning toward 6k Scanning at 12bit Log color depth with 4k projection as their ultimate goal. There must be some serious reason for that.

Steve - have you or Jessica (I know you're both very interested in the science of perception), or anyone else, done a study that could tell us just how much the human eye can resolve? Or, for that matter, the limits of what we can measure visually?

Dale Launer
Writer/Filmmaker
Santa Monica



David Mullen wrote :

>You'll want to start at a higher resolution just to survive some of the loss of the later steps of post and exhibition.

This thought is what prompted my original query. That the higher the resolution you start with, the higher the resolution you finish with.

Therefore, if stocks are going to be 6K or higher, the telecine and post worlds have to catch up. By the sound of some previous posts, this seems to be the case.

Mark Wiggins
DP/Operator/London



Dale Launer wrote :

>Steve - do you or Jessica (I know you're both very interested in the science of perception) - has anyone done a study that could tell us just how much the human eye can resolve?

Dale :

I wrote a white paper addressing this in a theatrical context, available on our website at :

http://www.etconsult.com/papers/Technical%20Issues%20in%20Cinema%20Resolution.pdf

or, if that URL is too long, go to www.etconsult.com and click on "white papers".

The paper is 2 years old, references some older film data, and predates 2K digital projectors - one day I will update it - but the perceptual considerations are still valid.

Matt Cowan
Technical consultant, imaging and perception



>Steve - have you or Jessica (I know you're both very interested in the science of perception) - or has anyone done a study that could tell us just how much the human eye can resolve?

I'm a little sleep deprived right now and may be a bit off, but I believe human visual acuity (assuming normal vision under favourable circumstances) is the equivalent of somewhere around 6K.

In more traditional measurements, it's about 1/60th of a degree. So screen size, distance from the screen and brightness would also need to be taken into account.

It may be higher for people who have had corrective laser surgery.
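Jessica's 1/60th-of-a-degree figure can be turned into a rough pixel count for a given seat. The screen width and viewing distance below are assumed values for illustration, not measurements from any particular theatre:

```python
import math

# ASSUMPTIONS: a 40-ft-wide screen viewed from 40 ft (roughly one screen
# width back), with acuity of one arc-minute per resolvable element.
acuity_deg = 1 / 60
screen_width_ft = 40.0
viewing_distance_ft = 40.0

# Horizontal angle the screen subtends at the eye, in degrees.
subtended = 2 * math.degrees(math.atan((screen_width_ft / 2) / viewing_distance_ft))

resolvable_pixels = subtended / acuity_deg
print(f"~{resolvable_pixels:.0f} resolvable elements across the screen")
```

At one screen-width back this comes out at roughly 3K across; sit closer and the number climbs toward the higher figures quoted in the thread, which is why seat position matters as much as projector resolution.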

Jessica Gallant
Los Angeles based Director of Photography
West Coast Systems Administrator, Cinematography Mailing List
http://www.cinematography.net



It seems to me the goal isn't just to create some large number of lines or pixels, but to resolve the widest range of contrasts - to get an optimal MTF.

So in more prosaic terms the best case would be capturing every nuance in very contrasty material, capturing the smallest variations in brightness (avoiding the 'paved over' look in more-or-less-even areas that we see in some digital images) and obviously, everything in between.

So is this a question of scanning at the highest possible number of lines, or something else ?

Sam Wells



Matt Cowan wrote:

>I wrote a white paper addressing this in a theatrical context

Matt, can you please share with us what "normal" lens was used and the exposure information referenced in the white paper charts?

Thanks!

Illya Friedman
Senior Camera Rental Agent
Moviola
Hollywood, CA
www.moviola.com



I would agree that working at higher than 2K is a good goal to strive for in some of the initial stages of DI, such as capture, to avoid MTF losses via Nyquist and to reduce aliasing and moiré patterning. However, a 2K 10-bit log frame requires over 12MB of data, a 4K frame 50MB, and a 6K frame over 70MB. Go to 12-bit log and the data requirements are even greater.
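The per-frame figures follow from the bit arithmetic, and they translate into the sustained bandwidth that real-time, interactive playback would demand. The scan dimensions below are assumptions (full-aperture-style); with these, the 6K figure lands well above the "over 70MB" quoted:

```python
# ASSUMED scan dimensions for each resolution.
sizes = {"2K": (2048, 1556), "4K": (4096, 3112), "6K": (6144, 4668)}

for name, (w, h) in sizes.items():
    frame_mb = w * h * 3 * 10 / 8 / 1e6    # 3 components at 10 bits each
    rate_mb_s = frame_mb * 24              # sustained rate for 24fps playback
    print(f"{name}: {frame_mb:.0f} MB/frame, {rate_mb_s:.0f} MB/s at 24fps")
```

The 24fps rates — hundreds of MB/s at 2K, gigabytes per second at 6K — are what makes Steve's point about interactivity bite: storage capacity is one problem, but sustained I/O is the harder one.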

At the moment, with today's technology, such figures are just not feasible within a creative environment where interactive involvement with the image is important.

Additionally, the qualitative gains in going from 1K to 2K are great; from 2K to 4K, visible; from 4K to 6K, almost invisible. And this is from evaluating the image under a microscope, not projected under standard viewing conditions.

We were recently able to perform a test where an image was generated with exactly 2K lines (digital projection of an uncompressed frame via a 2K DMD projector) with a screen pixel size of 1/6 of an inch square. What was interesting was that the viewer could tell there was a 'line' structure to the image, but was unable to resolve an individual line. It seems to have something to do with the eye seeing a lower harmonic within the image, as it was a regular frequency pattern.

The same-resolution image was then generated as a checkerboard of alternate pixels on alternate lines. The result, from the viewer's perspective, is a uniform grey frame with no ability to register any detail at all.

I'm not exactly sure that this proves anything. It was hardly a scientific evaluation, but it suggests that a real-world image that has random detail cannot be resolved beyond 2K in a normal theatre viewing environment. If others have done similar, or hopefully more scientific, tests I would love to hear about the results. Matt's paper on the ETC website seems to suggest similar findings.

This test obviously only dealt with the final projection resolution, and as has been said throughout, it is important to ensure that no procedure undergone during the 'film' process reduces the image to a lower quality. Hence the need to use 4K Nyquist scanning when working from film, to generate a 2K image with a high enough MTF to take full benefit of the resolution-carrying capabilities of a 2K frame. This is explained further in the DI document on the Digital Praxis website.

A while back we projected a consumer DVD via an upconverter and a 1280-pixel digital projector onto a 30-ft screen. The DVD had been mastered via a high-quality telecine using Nyquist, and the images were highly impressive. The film was also a great story, and even though the projection was a test for an industry audience and started and finished partway through the film, we had more complaints about stopping the projection (the audience was enjoying the film) than comments about the image quality.

The biggest problem I have seen in digital projection is the subject material being treated as video, with no thought as to the function played by print film's D-logE curve in maintaining shadow and highlight detail without crushing or clipping - the two things that give 'video' away.

At the end of the day digital film is going to happen, both for projection and capture, the same way vinyl was replaced by CD. Even today a good record deck can outperform a CD, but the benefits of CD tend to negate the potential of vinyl.

To be attempting to specify a digital film standard above what is actually necessary is rather like attempting to lay a road in front of a speeding car - one with off road capability. It can make life easier for the car and driver, but if it's not laid quickly, and in the right place, it will get left behind.

Steve Shaw
Digital Praxis Ltd.
www.digitalpraxis.net



>At the moment, with today's technology, such figures are just not feasible within a creative environment where interactive involvement with the image is important.

We're not that far away from being able to do 4K 16-bit and even 32-bit work interactively - standard computing performance has made rapid strides over the past few years.

Of course 2K will always be faster with any given amount of processing power, but then again 2K makes for a very high-quality proxy for 4K workflows.

Maurice Patel



Steve Shaw wrote :

>We were recently able to perform a test where an image was generated >with exactly 2K lines

Viewer tests we've run show that the difference between 1440 (recorded HDCAM) and true 1920 (Viper, F900/950 direct out, ARRI D20, etc.) can be seen by most viewers (given the right tools, of course). The next challenge is to ensure that the entire (post-production) process retains image data. There's a distinct possibility of running images through so many filters that information loss is part and parcel of the process.

This is where digital is not unlike the losses seen from OCN to release print. Digital processing -- particularly YCbCr/4:2:2 -- is not necessarily transparent or lossless.

Martin Euredjian
eCinema Systems, Inc.



>The next challenge is to ensure that the entire (post-production) process retains image data. There's a distinct possibility of running images through so many filters that information loss is part and parcel of the process.

For all I hear about the various tape formats (D-5, HDCAM, etc.), and for all I constantly hear about the difficulty in workflow, I am becoming more convinced by the second that it is necessary to go with DATA all the way, despite the limitations currently imposed by hard drives, etc.

I realize that I'm a newbie, but it is still apparent even to me, and I know that the technology will most likely adapt and expand to accommodate the demands.

If anyone has some nuggets on the latest developments in this vein I am all ears....

Jeffery J. Haas
freelance editor, camera operator
Dallas, Texas



Jeffery J. Haas wrote:

>I am becoming more convinced by the second that it is necessary to go with DATA all the way, despite the current limitations now imposed by current hard drives, etc.

Yep.

Tape has its uses, of course.

Jeff Kreines



Jeffery J. Haas wrote:

>I am becoming more convinced by the second that it is necessary to go with DATA all the way, despite the current limitations now imposed by current hard drives, etc.

Just as important, it is critical that folks understand --once and for all -- that "digital" does not equate to "perfect" or even "right".

There are lots of ways to screw things up in digital processing and lots of ways to take shortcuts. To the uninitiated it would seem that a digital pipeline is guaranteed to deliver all pixels in the best possible way. So, if you are doing HD, 2K, 4K or 6K and run your data through one, just one, process that compromises it ... that's it, you are done, you'll never recover the original.

Food for thought.

Martin Euredjian
eCinema Systems, Inc.



Steve Shaw wrote :

>challenge is to ensure that the entire (post-production) process retains image data.

There's no question in my mind that 2K is too low for post. Supervising at Imageworks, I see every single spatial process that's run as image degradation.

Take the everyday case of stabilising an image to remove jitter - we're very concerned with motion trackers that can capture sub-pixel information, but largely take the required re-sampling on the chin. With 2K as input it's almost impossible to provide a 2K output that will cut with untreated footage.

4K/16-bit should be the standard for 2K post work currently (and occasionally is for certain shots and sequences - vista being the current choice at 3K/10-bit), but when the non-FX footage is captured at 4K, we need something higher still to have a chance of seamless integration without the audience reading a 'processed' shot.

Jake Morrison



Jake Morrison wrote:

>There's no question in my mind that 2k is too low for post. Take the everyday case of stabilising an image to remove jitter

Good example. This is the sort of thing I was thinking of. Re-sizing, rotation, cropping, panning, perspective changes (sign on the side of a moving truck), frame-rate changes, etc.

At the recent HPA meet I asked the camera manufacturers if they had made any allocation for swapping imager modules in their current or future cameras --an idea I've been pushing for at least five years. Only ARRI had the architecture to consider this.

This is a bit of an unfair characterization because the D20 is an engineering test bed and, as such, would have this sort of capability. However, regardless of that, it is interesting to see how Digital Cinematography is trying to be (as in "exist") using HD while the world is waking up to the fact that more might be required. Or, perhaps more accurately, that for some work more is almost an absolute necessity.

Martin Euredjian
eCinema Systems, Inc.



Martin Euredjian wrote :

>Only ARRI had the architecture to consider this.

Of the manufacturers at HPA, you mean. Not everyone was there...

Look for a sensor-agnostic and resolution-agnostic camera at NAB... designed to easily upgrade to newer and higher-resolution sensors -- CCD or CMOS or whatever else comes along.

Admittedly, this isn't as big a deal for single-sensor cameras as it is for prism cameras. But it makes great sense to incorporate it, and the requisite bandwidth, from the start.

Jeff "believes in the separation of, er, church and sensor" Kreines



Jeff Kreines wrote:

> Of the manufacturers at HPA, you mean. Not everyone was there...

There's so much going on that everyone could not possibly be there. After all, this was the Hollywood POST Alliance, not "Cinematography". The most interesting work is happening far, far away from this industry. And, building a camera is only 5% of the equation. There's also the unavoidable reality that some interesting efforts out there are going to run right-smack into patent issues.

>Look for a sensor-agnostic and resolution-agnostic camera at NAB...

Good luck. I mean this in a good way.

Martin Euredjian
eCinema Systems, Inc.



Steve Shaw wrote :

>The same resolution image (2K) was then generated as a checkerboard of alternate pixels on alternate lines.

Do you think that this suggests that imagers that are neither laid out nor scanned in rows would create more realistic images? Progressive scan might look better than interlace but perhaps random scan is best of all?

Jim Iacona
DP
San Francisco



Martin Euredjian wrote:

>There's also the unavoidable reality that some interesting efforts out there are going to run right-smack into patent issues.

Any specific ones you have in mind?

Given how the US Patent Office hands out many meaningless patents, I can imagine this might be the case.

But there's lots of interesting prior art out there, too.

Jeff Kreines



Jim Iacona :

>Do you think that this suggests that imagers that are neither laid out nor scanned in rows would create more realistic images?

Definitely so. A properly randomly laid-out set of pixels eliminates moiré patterns and other aliasing artefacts. This is why the photoreceptors in the extrafoveal region of mammalian retinas are arranged in a random distribution that is mathematically modelled with a Poisson disk, and why sophisticated renderers use a jittered supersampling pattern.
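The jittered-supersampling pattern Andreas mentions can be sketched simply: one uniformly random sample inside each cell of a regular grid, so coverage stays even while the fixed periodicity that causes moiré is broken. The grid size here is an illustrative choice:

```python
import numpy as np

def jittered_samples(grid, rng):
    """One uniformly random sample inside each cell of a grid x grid lattice."""
    ij = np.stack(np.meshgrid(np.arange(grid), np.arange(grid)), axis=-1)
    offsets = rng.random((grid, grid, 2))     # uniform jitter within each unit cell
    return ((ij + offsets) / grid).reshape(-1, 2)

pts = jittered_samples(16, np.random.default_rng(1))
print(pts.shape)   # one sample per cell, positions in [0, 1)^2
```

Unlike a purely random scatter, jittering guarantees every cell is sampled exactly once, which is what keeps noise low while still randomising the phase of any regular scene detail.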

Andreas Wittenstein
Founder, BitJazz Inc.



Jeff Kreines wrote :

>Given how the US Patent Office hands out many meaningless patents, I can imagine this might be the case.

What I've learned is that none of that matters. Possession pretty much seals the deal, unless you have bushels of money to try to have a patent annulled. Any competent attorney will tell you that fighting a patent that's been issued, prior art or not, is not a smart thing to attempt outside certain circles.

Re. what patents are out there and might be brewing: too many. It's a minefield. Run a patent search with relevant terms and you'll see. I can't give you references; my IP attorney has stacks of them from patent searches related to our work, and I don't have that info with me. But I do remember reading through some of them in utter disbelief.

Martin Euredjian
eCinema Systems, Inc.



I. Friedman wrote :

>Matt, can you please share with us what "normal" lens was used and the exposure information referenced in the white paper charts?

Following is from the ITU report re: cameras and lenses.

"Two cameras (a Panavision Panaflex Millennium camera, serial number PFX-127M and an Arriflex 435 S camera, serial number 435ES-140) were used to shoot the test chart. The Arriflex camera had been modified by Panavision to accept Panavision lenses.

The performance of both cameras was measured before the shooting and found to be within specifications.

Two prime lenses (Panavision Primo-L lens, serial number SL 50-78 and anamorphic[1] Panavision Primo Auto Panatar lens, serial number AL 75-23 CF) were used to photograph the test chart. The two lenses were used on both cameras since they could be interchangeably mounted on both. The axial MTF of both lenses was measured after the shooting at the focusing distance and T-stop used during the shooting. The MTF of the Primo-L lens dropped to about 75% of its maximum value at a resolution of 50 cycles/mm, and the MTF of the Primo Auto Panatar lens dropped to about 68% of its maximum value at a resolution of 50 cycles/mm[2].

[1] The use of an anamorphic lens had not been foreseen in the test plan. The Primo Auto Panatar lens was added to the test on recommendation of the cinematographer, who correctly pointed out that many feature films are photographed with anamorphic lenses.

[2] A spatial resolution of 50 cycles/mm corresponds to 1133 lines per picture height on release prints with an aspect ratio of 1:1.85. With an anamorphic lens 50 cycles/mm corresponds to about 1778 lines per picture height vertically and 890 lines per picture height horizontally."
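Footnote [2]'s conversion from cycles/mm to lines per picture height can be reproduced directly. The aperture heights below are assumptions back-derived from the footnote's own numbers, not values stated in the report:

```python
lp_per_mm = 50                  # spatial frequency at which the MTF was quoted
lines_per_mm = 2 * lp_per_mm    # each cycle (line pair) is two lines

# ASSUMED picture heights on the print.
flat_height_mm = 11.33          # 1.85:1 projected picture height
scope_height_mm = 17.78         # anamorphic (full) frame height

flat_lines = lines_per_mm * flat_height_mm       # ~1133 lines per picture height
scope_v_lines = lines_per_mm * scope_height_mm   # ~1778 lines vertically
scope_h_lines = scope_v_lines / 2                # the 2x squeeze halves horizontal
print(flat_lines, scope_v_lines, scope_h_lines)
```

The factor of two between the anamorphic vertical and horizontal figures is just the unsqueeze: 50 cycles/mm on the negative covers twice the scene width after a 2x anamorphic lens.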

Matt Cowan