TV manufacturers are committed to delivering the best available technology to their customers. Hence, you frequently find one innovation or another introduced on the latest television models.
HDR10+ is the latest HDR (High Dynamic Range) format available today. It is an improved version of HDR10. We shall discuss HDR10 and HDR10+ to understand their differences and why HDR10+ is the better technology. But first, it is pertinent to know a bit of the history of the evolution of HDR.
In the days of CRT picture tubes, color and contrast standards were based on Standard Dynamic Range (SDR). SDR offers less than half of the dynamic range that the human eye can perceive, which led to the invention of High Dynamic Range (HDR) technology.
HDR-enabled TVs display an expanded color and contrast range to offer more realistic and natural images closer to what the human eye perceives.
Just as HDR started becoming popular, TV manufacturers introduced advanced versions like HDR10, Dolby Vision, Hybrid Log-Gamma (HLG), and Advanced HDR. Now, you have another advancement in the form of HDR10+ to add to the existing confusion.
Let us see how HDR10+ is different from its basic version, HDR10.
What is HDR10+?
HDR10+ is different from HDR10 in that it brings the same advanced functionality as Dolby Vision to an open standard that content makers can use without paying Dolby's licensing fees.
HDR10+ has 10-bit color, compared to Dolby Vision's 12-bit color. Nevertheless, it strikes a better balance between dark and light scenes than HDR10.
HDR10+ was in the news as early as April 2017, when Samsung announced its partnership with Amazon Prime Video to support this new advanced format.
Now, 20th Century Fox and Panasonic have joined hands with Samsung to develop the HDR10+ format further.
Besides Samsung TVs, Vizio TVs and Google Play also stream the HDR10+ standard. It is available on Amazon Prime Video as well, and Netflix has announced that it might support the format in the future.
The Need for HDR10+
HDR10 is already available on most modern TVs. HDR10, announced by the Consumer Technology Association, a US-based consumer electronics association, improves picture quality by attaching static metadata to the video stream.
HDR10+ is an enhanced version of HDR10 in that it attaches dynamic metadata to the video stream, allowing TVs to set color and brightness levels frame by frame.
Dolby Vision works on the same principle, but TV content producers have to pay hefty licensing fees to Dolby for continuous Dolby Vision support. HDR10+ does away with these fees, making it a royalty-free HDR standard.
While Dolby Vision requires scene-by-scene color correction, HDR10+ takes HDR10 content and brings it on par with Dolby Vision without that additional labor. Content creators prefer HDR10+ primarily for this reason.
HDR10 Vs. HDR10+
Dolby Vision is a proprietary HDR standard developed by Dolby, whereas HDR10 and HDR10+ are both open, royalty-free standards. The prime difference between HDR10 and HDR10+ lies in how they handle metadata, as detailed below.
Here are the prime areas of differentiation between HDR10 and HDR10+.
|Feature|HDR10|HDR10+|
|---|---|---|
|Peak brightness levels|Great|Great|
|Tone mapping|Depends on the manufacturer|Better than HDR10|
|Metadata|Static metadata|Dynamic metadata|
|TV support|Widespread|Limited at present|
Let us dwell on each of these aspects in detail to understand them better.
Color Depth

Both HDR10 and HDR10+ carry 10-bit content, as opposed to SDR content, which is mastered at 8-bit. Hence, HDR10 and HDR10+ can display up to 1.07 billion colors.
The more colors a TV shows, the more realistic the image is. There is less banding, and transition in areas of similar colors is more subtle.
Note that Dolby Vision technically allows up to 12-bit color and can thus display up to 68.7 billion colors. The difference is there for everyone to see.
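The color counts quoted above follow directly from bit depth: an N-bit panel offers 2^N levels per channel, cubed across red, green, and blue. A quick sketch in Python (the function name is just for illustration):

```python
# Each channel (R, G, B) has 2**N levels at N bits per channel;
# the total palette is that count cubed.
def total_colors(bits_per_channel):
    levels = 2 ** bits_per_channel
    return levels ** 3

print(f"{total_colors(8):,}")   # 8-bit SDR:           16,777,216 (~16.8 million)
print(f"{total_colors(10):,}")  # 10-bit HDR10/HDR10+: 1,073,741,824 (~1.07 billion)
print(f"{total_colors(12):,}")  # 12-bit Dolby Vision: 68,719,476,736 (~68.7 billion)
```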
Peak Brightness Levels (Nits)
Both HDR10 and HDR10+ can master content at 4,000 cd/m². This should not matter much because most TVs on the market do not go above 1,000 cd/m². The latest TVs, especially QLED TVs, are capable of supporting better peak brightness levels.

Dolby Vision can support up to 10,000 cd/m².
Tone Mapping

Not all TVs support peak brightness levels of up to 4,000 nits. If your TV has a peak brightness of 1,500 nits, it can be challenging to deal with film content mastered at 4,000 nits.
One way of dealing with this is clipping, where the display clips everything between 1,500 and 4,000 nits. The TV therefore shows no detail in that brightness range. Though older TVs use this technique, the latest ones, thankfully, do not. They use a different method known as tone mapping, based on the Perceptual Quantizer (PQ).
In tone mapping, highlights between 1,500 and 4,000 nits are remapped to fall below 1,500 nits, so you observe a gentle roll-off of the highlights starting around 1,000 nits. This means some details might appear dimmer than on a TV that uses clipping. However, a tone-mapped picture preserves more detail than a clipped one.
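The contrast between clipping and tone mapping described above can be sketched as follows. This uses a simplified linear roll-off purely for illustration (real PQ-based curves are non-linear), with the 1,500-nit display peak and roughly 1,000-nit knee from the example:

```python
MASTER_PEAK = 4000.0   # nits the content was mastered at
DISPLAY_PEAK = 1500.0  # nits the example TV can actually show
KNEE = 1000.0          # point where the roll-off begins

def clip(nits):
    """Older approach: everything above the display's peak is flattened."""
    return min(nits, DISPLAY_PEAK)

def tone_map(nits):
    """Newer approach: compress the KNEE..MASTER_PEAK range into
    KNEE..DISPLAY_PEAK, so bright details stay distinguishable."""
    if nits <= KNEE:
        return nits
    headroom = DISPLAY_PEAK - KNEE                   # 500 nits of output room
    fraction = (nits - KNEE) / (MASTER_PEAK - KNEE)  # 0..1 over the range
    return KNEE + headroom * fraction

# Clipping makes 2,500-nit and 4,000-nit highlights look identical;
# tone mapping keeps them apart (1,250 vs 1,500 nits).
print(clip(2500), clip(4000))          # 1500.0 1500.0
print(tone_map(2500), tone_map(4000))  # 1250.0 1500.0
```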
With HDR10 and HDR10+, the TV manufacturers decide on tone mapping. Compared to HDR10, TVs supporting HDR10+ are better placed in this regard.
Metadata

HDR10+ differs from HDR10 the most in this aspect. Metadata is akin to an instruction manual that describes the different facets of the content, helping the TV display it effectively.
HDR10 uses static metadata, whereby the display sets the brightness boundaries once for the entire movie or show, determined by the brightness range of the brightest scene in the content. This means some scenes might be displayed sub-optimally.
HDR10+ improves on HDR10 by using dynamic metadata that allows tone mapping on a scene-by-scene or even frame-by-frame basis. This simplifies authoring video content and enables the most optimal visual reproduction for each scene.
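The static-versus-dynamic distinction can be made concrete with a small sketch. The field names here are purely illustrative, not the actual SMPTE metadata syntax used by HDR10 and HDR10+:

```python
# HDR10: one static record for the whole title, sized for its brightest scene.
static_metadata = {"max_content_light_level": 4000}  # nits, entire movie

# HDR10+: a record per scene (it can even be per frame), so a dim scene
# gets its own, much tighter tone-mapping target.
dynamic_metadata = [
    {"scene": 1, "max_luminance": 350},   # e.g. a night interior
    {"scene": 2, "max_luminance": 4000},  # e.g. a sunlit exterior
]

def targets_for(scene):
    """Return the (static, dynamic) tone-mapping targets for a scene."""
    dyn = next(m["max_luminance"] for m in dynamic_metadata
               if m["scene"] == scene)
    return static_metadata["max_content_light_level"], dyn

# For the dark scene, static metadata makes the TV budget for 4,000 nits,
# while dynamic metadata lets it optimize for just 350 nits.
print(targets_for(1))  # (4000, 350)
```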
TV Support

As of now, HDR10+ is not as widespread as HDR10, so more devices support HDR10 than HDR10+. This table shows device-wise compatibility for HDR10 vs. HDR10+:
|Device|HDR10|HDR10+|
|---|---|---|
|Amazon Prime Video|Yes|Yes|
|Amazon Fire TV Stick 4K|Yes|Yes|
|Apple TV 4K|Yes|No|
|NVidia GTX 900 and upwards|Yes|No|
|AMD Radeon RX and upwards|Yes|No|
As of now, most of these devices and platforms do not support HDR10+. However, things can improve in the future.
As far as TVs are concerned, Samsung TVs support HDR10+, whereas the other major manufacturers like Sony, LG, TCL, etc., support HDR10 and Dolby Vision.
Some Panasonic TVs support all three formats outside the US. Apple TV will most likely never support HDR10+, as the format was developed and is championed by Samsung, a direct competitor of Apple. Brands like Toshiba, Xiaomi, Oppo, TCL, OnePlus, Hisense, and Vivo support HDR10+ in some of their top-end models.
Gaming devices, like PS4, PS4 Pro, Xbox One, and PC, support HDR10 but not HDR10+. Xbox One also supports Dolby Vision. Nintendo Switch does not endorse any of the three formats.
PC monitors have been slow to adopt HDR. However, some of the latest monitors support HDR10. To date, no PC monitor supports HDR10+ or Dolby Vision. A few of the latest laptops offer Dolby Vision support.
HDR10+ presents a viable alternative to Dolby Vision as it does away with the licensing fees. However, it will not replace it because Dolby Vision still offers features much beyond what HDR10+ is capable of. Thus, we can expect a scenario where HDR10+ and Dolby Vision can peacefully co-exist.
When compared to HDR10, HDR10+ is a better standard. It will take some time for complete HDR10+ adoption. However, many high-end TV brands support Dolby Vision, whereas the top-end Samsung TVs are compatible with HDR10+.