This might be beating a dead horse, but QuickTime truly sucks.
For those using 16-bit (deep color) applications, always use the Force 16-bit encoding option: it is the highest quality and, surprisingly, often the lowest data rate.
Now for the weird reason.
QuickTime loves 8-bit and greatly prefers it; its support for deep color is difficult at best. Over the years we tried to make 16-bit the preferred mode for our codec within QuickTime, yet many video tools broke when we did. The compromise was to add Force 16-bit to the QuickTime compression options, letting the user control the codec's pixel-type preference: applications that can handle 16-bit benefit, and applications that can't still work.
Using After Effects as my test environment (the same applies to other QuickTime-enabled deep color applications), I created a smooth-gradient 16-bit image, then encoded it three ways: at 8-bit from an 8-bit composite, at 16-bit from a 16-bit composite, and at 16-bit from a 16-bit composite with Force mode enabled (pictured above).
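To see why the composite depth matters at all, here is a minimal sketch (using NumPy, with a hypothetical 1920-pixel-wide ramp standing in for the test image) of what storing the same smooth gradient at 8-bit versus 16-bit does to the number of distinct levels:

```python
import numpy as np

# Hypothetical stand-in for the test source: a smooth horizontal gradient,
# 1920 pixels wide, stored as 16-bit code values (0..65535).
width = 1920
grad16 = np.linspace(0, 65535, width).astype(np.uint16)

# The same ramp after passing through an 8-bit pipeline: only the top
# 8 bits of each value survive.
grad8 = (grad16 >> 8).astype(np.uint8)

print(len(np.unique(grad16)))  # every sample is a distinct level
print(len(np.unique(grad8)))   # collapsed to at most 256 levels
```

The 16-bit ramp keeps a unique code value per pixel, while the 8-bit version collapses to 256 steps; without heavy grading, both still *look* identical, which is exactly what the three encodes above showed.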
Without post color correction, all three encodes looked pretty much the same*, yet the data rates are very different.
* Note: QuickTime screws up the gamma for the middle option, so with the image gamma corrected to compensate, they looked the same.
The resulting data rates for 1080p 4:4:4 CineForm encodes at Filmscan quality:
8-bit – 13.4Mbytes/s
16-bit – 28.4Mbytes/s
16-bit Forced – 5.3Mbytes/s
Our instinct that higher bit rate means higher quality leads us astray in this case.
Under color correction you can see the difference, so I went to an extreme using this curve:
To output this (16-bit Forced)
Zooming in, the results are still great: nothing was lost with the smallest of the output files.
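The effect of that kind of extreme curve can be sketched numerically. Assuming a steep gain around mid-gray as a rough stand-in for the curve pictured above (the `gain` value and ramp width are illustrative, not taken from the actual grade), the deep ramp keeps many distinct levels in the stretched region while the 8-bit ramp collapses into a few visible bands:

```python
import numpy as np

def extreme_curve(x, gain=40.0):
    # Rough model of an aggressive grading curve: a steep linear gain
    # around mid-gray (x normalized to 0..1), clipped to legal range.
    return np.clip((x - 0.5) * gain + 0.5, 0.0, 1.0)

ramp16 = np.linspace(0.0, 1.0, 1920)      # smooth deep-color ramp
ramp8 = np.round(ramp16 * 255) / 255      # same ramp after 8-bit storage

graded16 = extreme_curve(ramp16)
graded8 = extreme_curve(ramp8)

# Compare distinct output levels in the non-clipped, stretched mid-range.
mid = (graded16 > 0.0) & (graded16 < 1.0)
print(len(np.unique(graded16[mid])))  # many levels: smooth after grading
print(len(np.unique(graded8[mid])))   # a handful of levels: banding
```

The 8-bit source only ever had a few code values inside the narrow range the curve stretches across the whole output, which is why the 8-bit encode bands under grading while the forced 16-bit encode stays smooth.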
Now for the QuickTime craziness: 16-bit without forcing 16-bit.
If anyone can explain why QuickTime does this, I would love to spare users from having to manually select "Force 16-bit encoding".
P.S. Real-world deep 10/12-bit sources pretty much always produce smaller files than 8-bit. This was an extreme example to show why this happens.