HDR10 vs. Dolby Vision: The New TV Format War
The biggest thing in TVs right now is "HDR," or High Dynamic Range. Movies that are HDR-compatible have been mastered to push TVs to their limits, with brighter highlights, deeper blacks, and a wider array of colors. Combine that with the sharper picture you get from 4K, and these new movies look absolutely stunning.
Of course, while HDR is the future, getting there is going to be messy. Currently, HDR is available in two formats: HDR10 and Dolby Vision. HDR10 is an open, royalty-free standard backed by players like Samsung, Sony, LG, Panasonic, and Hisense.
Dolby Vision is more ambitious, but it is proprietary, and the only compatible TVs right now are LG's OLEDs, a few TVs from TCL and Philips, and the ultra-rare and expensive Vizio Reference Series.
Confusing, right? We're here to help.
What sets HDR10 and Dolby Vision apart?
There are a number of key similarities and differences between the HDR10 and Dolby Vision formats. Right now, they have very similar baseline requirements (a quick back-of-the-envelope comparison follows the list):
Both formats require TVs to have a minimum 4K (3,840 x 2,160) resolution
Both formats call for "wide color gamut" displays capable of ~90% of the DCI-P3 color gamut
Both formats require TV panels and components capable of at least 10-bit color depth
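For a rough sense of what that baseline adds up to, here is a quick back-of-the-envelope comparison against a typical 1080p, 8-bit HD set. This is an illustrative Python sketch doing simple arithmetic, not figures from either spec or any particular TV:

```python
# Back-of-the-envelope comparison: a typical HD set vs. the HDR baseline above.
# The figures are simple arithmetic, not measurements from any particular TV.

hd_pixels  = 1920 * 1080             # 1080p panel
uhd_pixels = 3840 * 2160             # 4K / UHD panel required by both formats

hd_colors  = (2 ** 8)  ** 3          # 8 bits per channel: ~16.7 million colors
hdr_colors = (2 ** 10) ** 3          # 10 bits per channel: ~1.07 billion colors

print(f"Pixels: {uhd_pixels:,} vs. {hd_pixels:,} ({uhd_pixels / hd_pixels:.0f}x)")
print(f"Colors: {hdr_colors:,} vs. {hd_colors:,} ({hdr_colors / hd_colors:.0f}x)")
```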
That's a big leap over current HD TVs, but it's about as far as the HDR10 standard goes. Dolby Vision is theoretically designed to go much, much further, well beyond the capabilities of today's TVs. That creates some major differences between the two (the sketch after this list gives a sense of why the extra headroom matters):
Dolby Vision mastering supports up to 10,000 nits of peak brightness, with a current 4,000-nit target
HDR10 mastering supports up to 4,000 nits of peak brightness, with a current 1,000-nit target
Dolby Vision mastering supports up to 12-bit color depth, while HDR10 is mastered at 10 bits
Dolby Vision mastering supports up to the BT.2020 color space, while HDR10 is mastered for DCI-P3
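Why does the extra mastering headroom matter? Both formats carry brightness using the SMPTE ST 2084 "PQ" transfer function, which maps digital code values to nits. The sketch below is a rough illustration (not part of either spec) of how much coarser the brightness steps between adjacent code values become when only 10 bits are stretched across the full 0-10,000-nit range, which is where 12-bit Dolby Vision masters have an edge:

```python
# Illustrative sketch: brightness step between adjacent code values under the
# SMPTE ST 2084 (PQ) curve, for 10-bit vs. 12-bit encoding of a 0-10,000 nit
# range. Numbers are math only, not claims about any particular TV or master.

def pq_eotf(code, bits):
    """Map an integer code value to luminance (nits) via the ST 2084 EOTF."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    n = code / (2 ** bits - 1)                      # normalize to [0, 1]
    p = n ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

def step_near(target_nits, bits):
    """Luminance jump between the two code values closest to a target level."""
    best = min(range(2 ** bits - 1),
               key=lambda c: abs(pq_eotf(c, bits) - target_nits))
    return pq_eotf(best + 1, bits) - pq_eotf(best, bits)

for bits in (10, 12):
    print(f"{bits}-bit: ~{step_near(1000, bits):.2f}-nit step near 1,000 nits, "
          f"~{step_near(4000, bits):.2f}-nit step near 4,000 nits")
```

The coarser the step, the more likely smooth gradients like skies and sunsets are to show visible banding, and the gap only grows as panels get brighter.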