This might be beating a dead horse, but QuickTime truly
sucks.
For those using 16-bit (deep color) applications, always use the Force 16-bit encoding option: it is the highest quality and, surprisingly, often the lowest data rate.
Now for the weird reason.
QuickTime loves 8-bit; it greatly prefers it, and its support for deep color is difficult at best. Over the years we tried to make 16-bit the preferred mode for our codec within QuickTime, yet many video tools broke when we did. The compromise was to add the Force 16-bit option to the QuickTime compression settings, letting the user control the codec's pixel type preference: applications that can handle 16-bit benefit, and applications that can't still work.
Using After Effects as my test environment (the same applies to other QuickTime-enabled deep color applications), I created a smooth 16-bit gradient image, then encoded it three ways: at 8-bit using an 8-bit composite, at 16-bit using a 16-bit composite, and at 16-bit using a 16-bit composite with Force mode enabled (pictured above).
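For anyone who wants to reproduce a comparable source image without After Effects, here is a minimal NumPy sketch (an assumed stand-in for the test pattern, not the actual AE project; the 1920x1080 size is my assumption):

    # Stand-in for the test source: a smooth horizontal gradient held as
    # 16-bit data, plus its 8-bit counterpart for the 8-bit composite test.
    import numpy as np

    width, height = 1920, 1080
    row = np.linspace(0, 65535, width)                # full 16-bit range, left to right
    gradient16 = np.tile(row, (height, 1)).astype(np.uint16)

    # The same source reduced to 8 bits, as an 8-bit composite would hold it:
    gradient8 = (gradient16 >> 8).astype(np.uint8)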
Without post color correction, all three encodes looked pretty much the same*, yet the data rates are very different.
* Note: QuickTime screws up the gamma for the middle option, so with the image gamma corrected to compensate, they looked the same.
The resulting data rates for 1080p 4:4:4 CineForm encodes at Filmscan quality:
8-bit: 13.4 Mbytes/s
16-bit: 28.4 Mbytes/s
16-bit Forced: 5.3 Mbytes/s
Our instinct that higher bit rate means higher quality leads us astray in this case.
Under color correction you can see the difference, so I went extreme, using this curve (pictured) to output this 16-bit Forced result (pictured).
The results are beautiful, really a great demo for wavelets. Zooming in, the results are still great; nothing was lost with the smallest of the output files.
When 8-bit values are stored into a 12-bit encoder, every step is scaled up by 16: a gradient whose 8-bit deltas run 1,1,1,1,2,2,2,2 (in 8-bit, gradients are clipped, producing these flat spots) is encoded as deltas of 16,16,16,16,32,32,32,32, and the larger steps take more bits to encode, all with the aim of delivering higher quality. Most compression likes continuous tones and gradients; edges are harder. Here the 8-bit source breaks the smooth gradient into contours, and contours have edges. The clean 16-bit Forced encode above is all gradients with no edges, resulting in a smaller, smooth, beautiful image.
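To make the step structure concrete, here is a small NumPy sketch (illustrative only, not CineForm code) comparing a ramp quantized natively at 12 bits with the same ramp quantized at 8 bits and then scaled into the 12-bit range:

    # Illustrative only, not CineForm code: compare the per-pixel steps of a
    # native 12-bit ramp against an 8-bit ramp placed in a 12-bit encoder.
    import numpy as np

    width = 1920
    ramp_12bit = np.round(np.linspace(0, 4095, width)).astype(np.int32)
    ramp_8bit = np.round(np.linspace(0, 255, width)).astype(np.int32)
    ramp_8_in_12 = ramp_8bit * 16                     # 8-bit values scaled up by 16

    print(np.unique(np.diff(ramp_12bit)))             # small, smooth steps: [2 3]
    print(np.unique(np.diff(ramp_8_in_12)))           # flat spots and jumps: [0 16]

The flat spots and 16-level jumps in the second output are exactly the contour edges described above.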
Now for the QuickTime craziness: 16-bit without forcing 16-bit.
The image is dithered.
This is the “magic” of QuickTime: I didn't ask for dithering, and I didn't want dithering. Dithering is why the file is so big when compressed. QuickTime is handed a 16-bit format for a codec that can accept 16-bit, but it sees the codec can also accept 8-bit, so it dithers down to 8-bit, screws up the gamma, then gives that to the encoder. Now nearly every pixel has an edge, and therefore there is a lot more information to encode.
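As a rough illustration of why the dithered frame balloons, here is a sketch using zlib as a stand-in compressor (a real wavelet codec behaves differently, but the direction of the effect is the same):

    # Rough illustration with zlib standing in for a video codec: dithering
    # before 8-bit quantization puts noise on nearly every pixel, which
    # compresses far worse than the cleanly quantized ramp.
    import numpy as np, zlib

    rng = np.random.default_rng(0)
    width, height = 1920, 1080
    frame = np.tile(np.linspace(0, 255, width), (height, 1))

    clean = np.round(frame).astype(np.uint8)          # plain 8-bit quantization
    noisy = frame + rng.uniform(-0.5, 0.5, frame.shape)
    dithered = np.round(noisy).astype(np.uint8)       # dithered 8-bit quantization

    print(len(zlib.compress(clean.tobytes())))        # small: long flat runs
    print(len(zlib.compress(dithered.tobytes())))     # many times larger

With the clean ramp every row is identical and full of flat runs; with dithering almost every pixel differs from its neighbors, just like the dithered frames QuickTime hands to the encoder.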
CineForm still encodes dithered images successfully with good results, yet this is not what you expect. If you want noise, you can add it as needed; you don't want your video interface (QuickTime) adding noise for you.
If anyone can explain why QuickTime does this, I would love to spare users from having to manually select “Force 16-bit encoding”.
P.S. Real-world deep 10/12-bit sources pretty much always produce smaller files than 8-bit. This was an extreme example to show why this happens.
Hi David,
Thanks for sharing this. That automatic dithering is annoying, to say the least. What the heck are they doing?
Cineform just rocks :-)
Cheers,
Axel
Hi,
I am very interested in this article.
I am using a GoPro Black Edition and I am facing this gamma-altering problem with Premiere CC.
I am not sure I understood everything properly.
I have the following questions and I would be really happy to get some help/explanation:
1) I tried Premiere CC with GoPro export, but apparently the free GoPro version has no option to force 16-bit, so the result is a video with altered gamma. I tried to compensate by adjusting the gamma in Premiere CC, but I was not able to reproduce the color quality of the original file in the output.
I see that there are different gamma corrections (Gamma, Blue Gamma, etc.); which one is the most appropriate to compensate for this gamma shift?
Of course, this solution is not the one I like. Ideally (if everything worked as it should), there would be no change in the color, gamma, etc. of the output video unless I changed it in a video editing program (which is not the case here), so I'd like to encode the video directly without losing anything, obtaining the same quality as the original.
2) If I understand properly, this problem comes from QuickTime. I tried to export the file using H.264 directly, not through QuickTime, but I got the same problem. I am wondering why....
I hope to get some explanation and help to solve this very annoying gamma-shift problem. It is really a pity to have high-quality video distorted without asking for it.
Thank you for the help and for sharing this info.
Max
Max, you never mentioned the platform you are using, which is relevant to the nightmare of QuickTime gamma issues.
Also, are you converting in Studio or using native MP4s?
Hi,
You're right.
I am using a MacBook Retina 15" with Mavericks (10.9.3).
For your second question, I am not sure I understand it, but if I do, the answer is that in the Premiere CC export settings I used:
Format: QuickTime
Video Codec: GoPro CineForm
Thanks for answering.
So your CC project is using camera-original MP4s? Exporting to CineForm, the gamma should be correct. However, QuickTime Player will preview the resulting files wrong (particularly on Mavericks, which will unnecessarily transcode them). Are they wrong when imported back into CC, or into Studio, or elsewhere? The gamma error is a presentation failure of QuickTime; the encoded files are correct.
Yes, I am importing original MP4s in CC.
I export with CineForm without forcing the 4:4:4 16-bit, because in the free CineForm version this option is not available. Then I open it with QuickTime 10 and 7 and the colors are changed. I reimport it into CC and the colors stay changed (not the same as the original MP4 file).
Then I installed the CineForm Premium trial, where I do have this 4:4:4 forced 16-bit option, but when I try to export I get an error saying the export failed for unknown reasons, so I can't check whether this option would work for me.
I also want to point out that if I export the file with H.264 in the format settings (usually the format is QuickTime and the codec is H.264 or x264), the colors have the same problem....
I will ask at the office, as there are some confusing details there, and I'm not a Mac user.
Ok, thanks.
I am not an expert, so I might not have answered the questions properly.
I hope this can be fixed, because it is really annoying.
BTW, thanks for the great support.
Thanks! Given the new firmware and my license for Premium, what settings and workflow do you recommend to decrease flicker for a night-into-day sunrise timelapse video?
You seem to have posted your comment on the wrong subject. For all video-sourced timelapses: Protune On, Flat, ISO Limit 1600, CAM_RAW (or 5000K fixed WB). Most of the flicker is the changing white balance, so simply turn that off.
Thanks for this great insight, but one question: I'm using CC 2014.1, and I don't have the option you are mentioning when choosing "GoPro CineForm" as the QuickTime codec. Is this option only available when you have the full Studio/Premium software installed?
ReplyDeleteVery nice interested blog very nice blog full information. Please Visit:
ReplyDeletesccm implementation
16 bit applications
application packaging
This comment has been removed by the author.
ReplyDelete