Tuesday, August 25, 2009

Theatrical Successes

I've been missing from much of my usual online forum and Twitter activity over the last 10 days; I've been flat out with the 48 Hour Film Project, my annual step into the creative side of the business. Lots of new things to report this year. First non-tech item: CineForm staff members Jake, Tim, Craig and I formed two teams rather than just one this year. While this saddened me at first, the results show nothing negative from our divided efforts; both films received more awards than the sum of our previous 5-year history with this competition. My team drew the genre Detective/Cop and made "The Case of the Wild Hare" in a comic Film Noir style, winning an audience award and two jury prizes. Tim and Jake's team drew the genre Film de Femme, shot a thriller, and earned a jury prize and runner-up for best of San Diego (out of 47 submitted films).

Check both films out on Vimeo in HD, or just watch them embedded below.

The Case of the Wild Hare from David Newman on Vimeo.



Touch from Jake Segraves on Vimeo.



Now for the technical: one film used an SI-2K (running beta 2.0 software -- nice), a Kenyon gyro stabilizer (interesting), and a range of 30+ year-old C-mount primes and zooms (classic); the other used an HV30 with a lens adapter and Canon still lenses. Can you tell which is which? Shooting technology is clearly only a small factor in making an enjoyable film, as both films won against very polished projects shot on everything from 5Ds to Red Ones. While the shooting technology could hardly be more different, the post workflows were very alike. Editing was done in Premiere Pro CS3 (Windows) on i7 or dual quad-core Core2 Intel systems (OSes ranging from XP64 to Win7). CS4 wasn't used, purely for HDSDI monitoring reasons, as we haven't got that working yet. CS4 would have helped, as we always use the beta features of our own tools, and one of the new features only worked in CS4 (we didn't realize until mid-post). Both films were shot for 2.35:1 presentation, as 16:9 has become so TV-like these days. The new feature that neither team got to use was an addition to First Light enabling 2.35:1 crop and centering as Active Metadata -- it would have saved some time in post for positioning and rendering. First Light was used extensively on both projects; all grading was done as a real-time operation in First Light, particularly aided by the new auto-sync feature which keeps First Light connected to the NLE's timeline. No color correction was needed within Premiere itself.
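If you're curious what that crop-and-center feature actually has to do, the math is simple. Here's a minimal sketch of letterboxing a 16:9 frame to 2.35:1 -- my illustration, not CineForm's implementation:

```python
# Hypothetical sketch of the 2.35:1 crop-and-center math -- my
# illustration, not CineForm's implementation.

def crop_to_scope(width, height, target_aspect=2.35, center_offset=0):
    """Return an (x, y, w, h) crop rectangle that letterboxes a frame
    to the target aspect ratio, optionally nudged vertically."""
    crop_h = int(round(width / target_aspect))   # active picture height
    if crop_h > height:
        raise ValueError("frame is already narrower than the target aspect")
    y = (height - crop_h) // 2 + center_offset   # center, then reposition
    y = max(0, min(y, height - crop_h))          # keep the crop inside the frame
    return (0, y, width, crop_h)

# A 1920x1080 (16:9) frame keeps only ~817 lines for 2.35:1 presentation.
print(crop_to_scope(1920, 1080))   # -> (0, 131, 1920, 817)
```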

More on First Light. One thing that helped us was a range of 3D LUTs (look files) that we have been preparing for some time. These LUTs are now available for download to use with Neo HD/4K and Prospect under Windows (Mac versions soon). On such a compressed schedule you get very little time to work on your look; I put about 30-40 minutes into color correcting Wild Hare, about twice the time I had for color work on last year's project, but still not much, so the prepared LUTs helped greatly. The Active Metadata LUT system works on the final output of the image, with all the linear-light processing for white balance, color matrixing (saturation), channel gains, lift, and gamma being applied to the input of the LUT. This makes it pretty easy to mix and match a range of sources to produce one common look (stylized or not). As I was working with two co-directors, one of whom had never worked in film before (only stage work), I prepared different look profiles as switchable color databases, so the entire timeline could have its look/style switched dynamically. This helped showcase possible finishing styles without impacting the editing session, which went into the 47th hour.
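To make that processing order concrete, here is a rough sketch of linear-light corrections feeding the input of a 3D look LUT -- again my illustration, not the CineForm SDK:

```python
import numpy as np

def sample_lut(lut, img, size=17):
    """Nearest-neighbor 3D LUT lookup (real implementations interpolate)."""
    idx = np.clip((img * (size - 1)).round().astype(int), 0, size - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

def develop_frame(linear_rgb, wb_gains, color_matrix, gain, lift, gamma, lut3d):
    """Illustrative Active Metadata-style decode ordering: the linear-light
    corrections are applied first, and the 3D look LUT is sampled last."""
    img = linear_rgb * wb_gains              # white balance (linear light)
    img = img @ color_matrix.T               # color matrixing / saturation
    img = np.clip(img * gain + lift, 0, 1)   # channel gains and lift
    img = img ** (1.0 / gamma)               # gamma encode
    return sample_lut(lut3d, img)            # the "look" 3D LUT goes last
```

Because the look LUT sits at the end of the chain, swapping LUTs (or whole color databases) restyles every source in the timeline without touching the underlying corrections -- which is what made switching looks for my co-directors so quick.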

Here is a before-and-after example of First Light processing.

Source:




Final:



After our two teams had submitted, the remainder of the week was spent preparing the other 45 films for presentation in CineForm format on the Christie 2K projector at the local UltraStar theater. As CineForm is a San Diego 48 Hour sponsor, we requested that 1080p CineForm AVI/MOV be the default HD submission format (we gave all teams a 30-day Neo HD license). Fortunately more than half of the films were submitted this way, making our life easier. It was the remaining SD submissions, with their many pixel aspect ratios, letter-boxing, pillar-boxing, cropping, and pulldown variations, that were a time-consuming headache (upres'ing them to look decent).

For theatrical presentation we were prepared to use a Wafian F-1 as a playback server, which has worked flawlessly in previous years and is not rarely used for this purpose (it works really well -- F-1s were used to present HD at Comicon). Unfortunately, an hour before the first theatrical presentation, the drive sled that goes into the Wafian crashed. This piece of bad luck put us in a panic, as it was going to take more than the remaining hour to prep a new drive with the 23 films that were screening that evening. Fortunately Tim had been experimenting with scheduled playback using CineForm AVIs, a Blackmagic card and the open-source Media Player Classic, and his system already had all the films on it. Basically, a cheap PC with a $300 HDSDI card hooked to a theater projector becomes an awesome 10-bit 1080p presentation system. This was not flawless; it stopped about 7 films into the second screening group (resulting in 45 seconds of black as Tim ran up into the projection booth to give MPC a kick). While a Wafian would never have done that, the cheapness of this presentation solution made us pursue the same setup the next night and for the best of San Diego screening. There is still work to do to tweak this solution, but something like it is needed, as it is crazy that San Diego is still the only city to project in HD for this international festival, this being San Diego's third HD year.

Extending our experience from the local festival, we want every festival to stop presenting DVDs or Beta (still so common) when most sources are film or HD originated. The solution has got to be cheap and simple, allowing for last-minute playlist changes for pre-rolls, differing frame rates (24p, 30p and 60i), audio level adjustments, skipping bars and tone, etc. -- i.e., BluRay will not do. We had one beautifully mastered film this year that had the wrong black levels; we put it through First Light and fixed it in seconds without any rendering. Duane Trammel, San Diego 48 Hour producer, used the playlist flexibility to inter-cut interviews with several of the filmmakers into the short program -- it was a pretty cool touch. More of this style of flexibility is what is needed, and we are hoping to help.

Tuesday, July 28, 2009

Upcoming speaking engagements

This year will be the first time I'll make it to IBC, thanks to the persistence of Phil Streather in having me present the editing workflow section of his "3D at the Movies" super panel. The full panel will cover many elements such as pre-visualization, shooting and post, all presented in a large Real-D equipped movie theater. It will be a first for me to demonstrate 3D convergence manipulation live on a screen of that size (seating over 800 people, apparently). So if you're in Amsterdam the morning of September 14th, it might be worth dropping by early to get good seating (for the better 3D experience).

Something closer to home: this weekend I will be presenting part of the editing panel, covering post for the 48 Hour Film Project. If you have read here at all, you know I'm a big fan of this competition. I will be at the SaVille Theater at San Diego City College from around 1pm through 5pm this Saturday. Primarily I will be showing local 48 Hour teams how to produce HD masters that are ready for projection. If you are putting a team in this year, and many already are (almost at capacity for teams now), and you can't come to this session, please read up on the new HD submission guidelines -- and practice before the competition weekend. The seminars on filmmaking for this competition (and others) start at 10am; you can see them all for only $15 -- more info here.

Thursday, July 09, 2009

Sponsoring 48 Hour Film Project for San Diego 2009

For the last 5 years CineForm has put a team into this touring filmmaking competition. For the last two we have sponsored the event by providing HD playback and HD up-res services for those submissions that need it (everything is converted to CineForm AVI/MOVs for playback). San Diego's 48 Hour is still the only city that allows HD submissions for 1080p projection. This year we are expanding: we are helping with the Filmmaker Seminars in the weeks before, and we are putting two teams into the competition -- this is not part of the sponsorship; we pay like everyone else, and we don't even use the company name for these teams, never have -- always just for fun.

Come and join the craziness 48hourfilm.com/sandiego/

Wednesday, May 06, 2009

No Problem With 3-D

The title of this post is a comeback to Daniel Engber's article in Slate, "The Problem With 3-D", with the sub-title "It hurts your eyes. Always has, always will." As CineForm is entering the 3D post-production world, I was curious whether his claims are valid. I'm personally one who doesn't find modern 3D film difficult to watch, and I very much prefer looking at recent films like Coraline in 3D over their 2D presentation, but I spend much of my day staring at images for quality analysis, so I'm not the best test subject.

Engber states that the visual fatigue that "plague[s] flight simulators, head-mounted virtual-reality displays, and many other applications of 3-D technology" is directly connected to 3D movie eye strain, and that "no one yet knows exactly what causes this." Engber then goes on to propose a reasonable-sounding theory: our eyes want to refocus on objects that are not really closer or further away than the physical screen plane, and this is a likely cause of strain. This seems like a logical explanation for those experiencing eye fatigue, and he offers no other. The article then goes on to suggest, "if 3-D becomes as widespread," the possible blindness (well, "permanently cross-eyed") of your children -- wow! Now, I was initially going to accept the earlier claim of convergence without refocus being a potential cause of eye strain for some, but now that my kids' eyesight is involved I had to dig deeper.

I dug, and I now believe he is wrong, at least for most correctly presented theatrical screenings. I'm also proposing a theory without rigorous test data (just like Engber), but one focused on the optical characteristics of the human eye. I wondered about hyperfocal distance: the focus distance such that everything from some number of feet to infinity appears in focus. While a typical lens has a single point of focus, there is a range in which focus is still considered sharp; whenever depth of field is discussed, it is this same range of acceptable focus. So from Wikipedia: "hyperfocal distance is a distance beyond which all objects can be brought into an 'acceptable' focus." If the screen is beyond the hyperfocal distance of the human eye, all 3D images behind the screen plane still appear in focus, and a certain amount in front will still be in focus, following some simple rules. With all images in the theatrical 3D space appearing in focus, it doesn't matter if your eyes do change focus, so Engber's claim does not hold up, and your children are safe.

Basically, the problem described in the article only happens in close screen-viewing conditions or with extreme "coming at ya!" 3D, which has been losing favour as 3D projection becomes more common. In a typical movie theater the viewing distances are such that the eye can do its natural convergence and refocusing without losing focus on the presented 3D world.

Now, to calculate the acceptable distances for 3D, we need the human eye's hyperfocal distance. With some online research I was able to determine the eye is approximately a 22mm lens system (seems about right), with a maximum f-stop of 2.4 (a darkened theater would do the trick). There is a great article on The Photographic Eye from which I gathered the numbers I used (they agreed with many sources). Now we can plug these numbers into a lens calculator and get a number for 35mm cameras -- 22.3 feet hyperfocal, focus range 11.1 feet to infinity. So if eyes were 35mm cameras, as long as the 3D object remains more than 11 feet away from us, we can comfortably and safely view it and everything behind it in the 3D world. But of course our eyes are not 35mm cameras and are more complex to model; the heart of all this is the Circle of Confusion (CoC -- the amount of allowable blur). So instead of guessing the camera system that models the human eye, let's calculate an acceptable blur for the typical theater viewing environment.
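For those who want to check the numbers, here is a minimal sketch using the standard hyperfocal formula; the 0.03mm CoC is the common 35mm-format assumption, and it lands right around the figures above.

```python
MM_PER_FOOT = 304.8

def hyperfocal_mm(f_mm, f_stop, coc_mm):
    """Standard hyperfocal distance: H = f^2 / (N * c) + f."""
    return f_mm ** 2 / (f_stop * coc_mm) + f_mm

# The eye modeled as a 22mm f/2.4 lens, with the common 35mm-format
# CoC of 0.03mm (an assumption; lens calculators use the same default).
H = hyperfocal_mm(22, 2.4, 0.03)
print(H / MM_PER_FOOT)       # ~22.1 ft hyperfocal (the lens calculator said 22.3)
print(H / 2 / MM_PER_FOOT)   # ~11.1 ft near limit when focused at H
```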

For our theater model, we have a nice 40-foot-wide movie screen at a viewing range of one and a half screen widths, i.e. 60 feet away, using a common 2K projector (99% of all digital projection is either 2K (2048) or 1920-line HD). The amount of allowable blur is related to the pixel size: as we don't see a lot of chunky pixels, the resolution is high enough that it fuses into a continuous image for the audience. So let's estimate that a half-pixel blur is OK and is still perceived as sharp. For the approximately 2000 pixels across a 40' screen, 0.5 pixels will be 0.5/2000*40 = 0.01 feet, or a blur of around 1/10th of an inch. The viewing angle for our blur at 60' works out to 0.01 degrees. As Circle of Confusion (CoC) is calculated at 25cm, the 0.01 degrees results in a CoC of 0.04mm. Now, using that CoC number in our lens calculator, we get these results: when viewing the screen 60' away, all objects from 13.1' to infinity will appear in focus. If an object jumps 75% off the screen and is perceived as 15' away, and you focus on it at 15', it and the screen plane are still in focus, so there is no source of eye strain. We now have the safe/enjoyable range in which to present a 3D image. You might be thinking the allowable blur of 0.5 pixels was overly generous, and it was, in 3D's favor: Wikipedia and other sites place the average acceptable CoC at 0.2mm, yet the numbers above are five times sharper than that (so there is plenty of headroom for the average viewer).
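Here's the whole chain as a sketch: derive the CoC from a half-pixel blur on the screen, then feed it through the standard depth-of-field formula (the exact near-focus figure wobbles slightly with rounding of the CoC).

```python
import math

MM_PER_FOOT = 304.8
EYE_F_MM, EYE_F_STOP = 22, 2.4   # the eye model from above

def near_focus_mm(s_mm, f_mm, f_stop, coc_mm):
    """Near limit of acceptable focus when focused at distance s:
    D_near = s * (H - f) / (H + s - 2f), with H the hyperfocal distance."""
    H = f_mm ** 2 / (f_stop * coc_mm) + f_mm
    return s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)

# Half-pixel blur on a 2000-pixel, 40-foot-wide screen viewed from 60 feet.
blur_ft = 0.5 / 2000 * 40              # 0.01 ft of on-screen blur
angle_rad = math.atan(blur_ft / 60)    # ~0.01 degrees of visual angle
coc_mm = 250 * math.tan(angle_rad)     # CoC referenced at 25cm: ~0.04mm

near = near_focus_mm(60 * MM_PER_FOOT, EYE_F_MM, EYE_F_STOP, coc_mm)
print(round(coc_mm, 3))                # ~0.042 mm
print(round(near / MM_PER_FOOT, 1))    # ~12.6 ft; with the CoC rounded to
                                       # 0.04mm this comes out near the 13.1'
                                       # the lens calculator reported
```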

This potentially points to the home screening environment having issues, with the screen being so much closer. Yet using the same human-eye lens modelling, a 3D depth range can be created such that the eye can focus at will without causing blur issues at the screen plane or introducing eye strain. Plus, as the average home environment is not as dark as the theater experience, we can use a different lens f-stop in our calculations. If our eye is more typically at f/4 for home viewing (totally guessing -- I'd love help here), then for a screen at 12' distance, 3D images can be placed from 6' (half out of the screen) to infinity (still using the same very sharp 0.04mm CoC). So there is a reformatting required between theatrical and home release, but this was already an accepted factor in adjusting for the smaller home screen.
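The same sketch covers the home case; the f/4 figure is the guess from the paragraph above.

```python
MM_PER_FOOT = 304.8

def near_focus_mm(s_mm, f_mm, f_stop, coc_mm):
    """Near limit of acceptable focus when focused at distance s."""
    H = f_mm ** 2 / (f_stop * coc_mm) + f_mm   # hyperfocal distance
    return s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)

# 22mm eye at the guessed f/4 (brighter room), same 0.04mm CoC, screen 12' away.
near = near_focus_mm(12 * MM_PER_FOOT, 22, 4.0, 0.04)
print(round(near / MM_PER_FOOT, 1))   # ~5.5 ft: objects can come about halfway out
```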

This is not to say that there aren't other factors contributing to eye strain in today's 3D technologies, such as imperfect filtering causing left/right cross-talk, and poor-quality glasses introducing other optical distortions. Yet the biggest factor in eye strain is more likely the inexperience of 3D filmmakers, such that there are good and bad presentations, which has nothing to do with the technology. The filmmaking and the tech can only improve, and there is no inherent cause of eye strain in today's 3D presentation technology.

-------

Here is a fun aside. For those wanting to experience the 2D and 3D versions of a film without going to see the film twice (or closing one eye for prolonged periods of time), make yourself a pair of 2D glasses. These will also come in handy if you happen to see one of those bad 3D movies. Get two pairs of standard RealD glasses, pop out the left-eye lens from one and the right-eye lens from the other, and swap them. With a pair of scissors to cut the lenses down a bit, you can fit each lens back into the frame in the opposite position (be careful not to flip the front and back sides of the lens), so that you have one set of glasses that is left-only and another that is right-only. At any time during a 3D film, put on your 2D glasses to experience that retro 2D world.

Wednesday, April 29, 2009

NAB Coverage of CineForm Neo 3D

We are now all back and working very hard after our huge NAB. We picked up several (all?) of the major show awards, winning best of show from both Videography Magazine with its Vidy award (recognizing outstanding achievement in the art and science of video technology) and TV Technology with its 2009 STAR award (Superior Technology Award Recipient).

In addition to the Filmmaking Central video in the last blog post, we've had great written coverage in Adam Wilt's NAB wrap-up, plus a long two-part interview with Fresh DV covering 3D from production and post through distribution.

Part 1: 22 minutes -- 3D concepts and production issues



Part 2: 12 minutes -- 3D post

Tuesday, April 21, 2009

At NAB talking up First Light and NEO 3D

While the camera is way too close up on my head (thanks to Dave Basulto of Filmmaking Central), you can see some of what CineForm is showing at NAB this year in this video interview.

Tuesday, March 31, 2009

An Early Glimpse at First Light

When CineForm developed the first RAW video compression (yes, ages before those other guys), we developed a related feature called Active Metadata. You see, the problem with RAW imaging is that the more RAW it is, the more boring it looks; it's frustrating to constantly explain to your film's investors, "yes, it is supposed to look flat, with low contrast and green." Active Metadata came to the rescue, allowing the cinematographer to specify how the image should be developed upon decode, while preserving the internal flat, low-contrast image for the most flexibility in downstream finishing. Users of the SI-2K have been loving this feature for years, as the camera has a lite version of Iridas SpeedGrade OnSet built in for cool color-development controls, but with that camera at less than 1% of the whole market, and Active Metadata support limited to RAW sources, this feature wasn't getting the attention it deserved.

A while back we added Active Metadata support for 4:2:2 and 4:4:4 CineForm encodes, but still many CineForm users didn't take advantage of it, as the controls were limited within Prospect 4K -- they were always intended to be replaced by a standalone tool: First Light. First Light is only weeks away, arriving in time for NAB -- coincidence? The press release went out today so you will all visit us at NAB, but it only talks about the renderless color workflow, which is only scratching the surface of what First Light will be doing in the future; some of those future abilities will even be shown at NAB.

First Light will be available to all version 4.x users of Prospect HD/4K and Neo HD/4K on Mac and PC platforms. Version 4.x will start shipping before NAB.