Today's televisions come wrapped in a multitude of terms related to image quality. 4K, UHD, Dolby Vision, HDR, HDR10+, 10-bit… the number of acronyms can overwhelm any ordinary user and become downright confusing. While 4K and UHD refer to the screen's resolution (3,840 x 2,160 pixels), the rest of the acronyms have to do with third-party certifications, certifications that guarantee a minimum level of quality in compatible content. But what do they actually mean when you sit down to watch something? We have gathered some of the most popular terms of 2020 to explain them once and for all.
HDR10 vs HDR10+: what are the differences and why do they matter on your TV?
HDR stands for High Dynamic Range. Dynamic range refers, broadly speaking, to the range of tones and colors an image is capable of representing. An image with poor dynamic range loses detail where contrast is high, such as in bright highlights and deep shadows.
HDR10 is the evolution of this concept applied to image reproduction. There are three main differences from HDR10+. The first concerns peak brightness: while HDR10 metadata describes content mastered at up to 1,000 nits, HDR10+ supports content mastered at up to 4,000 nits. In other words, HDR10+ content can describe highlights up to four times brighter than HDR10 content.
The other big difference is how the image's metadata is handled. Broadly speaking, metadata contains basic information about the image, such as brightness and color. The metadata used by HDR10 is static: it does not change during playback of HDR10-compatible content. In contrast, HDR10+ metadata is dynamic and can vary from scene to scene, even frame by frame.
For example, when playing dark scenes, an HDR10+ TV can lower the brightness to give priority to the colors displayed on screen. On HDR10 televisions, the brightness target remains fixed at all times, regardless of the scene being shown.
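The static-versus-dynamic distinction above can be sketched in a few lines of Python. This is purely illustrative: the field names and values are hypothetical, not the real SMPTE/HDR10+ metadata format.

```python
# Hypothetical sketch: HDR10 carries one static metadata block for the whole
# stream, while HDR10+ can carry a different block per scene or per frame.

STATIC_METADATA = {"max_luminance_nits": 1000}  # HDR10: fixed for the stream

# Hypothetical per-scene dynamic metadata, as HDR10+ allows.
DYNAMIC_METADATA = [
    {"scene": "sunlit exterior", "target_nits": 1000},
    {"scene": "dark interior", "target_nits": 120},
]

def brightness_target(scene_index, dynamic=None):
    """Return the brightness level the TV should aim for in a given scene."""
    if dynamic is None:                          # HDR10: same value always
        return STATIC_METADATA["max_luminance_nits"]
    return dynamic[scene_index]["target_nits"]   # HDR10+: varies per scene

print(brightness_target(1))                     # HDR10  -> 1000
print(brightness_target(1, DYNAMIC_METADATA))   # HDR10+ -> 120
```

With static metadata the dark interior is treated exactly like the sunlit exterior; with dynamic metadata the TV can adapt its target to each scene.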
Some streaming services support HDR10.
The last difference between HDR10 and HDR10+ concerns its ties to the x265 HEVC codec. This codec promises to halve the bandwidth needed to stream content while maintaining or even improving final image quality. In practice, the TV's internet consumption when watching HDR10+ content in applications like Netflix can be roughly half that of older codecs.
Dolby Vision: the paid license that promises better features
It is very important not to confuse this term with its counterpart Dolby Atmos, which is the sound certification from the same company. That aside, there is an essential difference between the HDR standard in its various flavors and Dolby Vision: the former is royalty-free and the latter requires a paid license. This difference has a direct effect on the final price of the product: Dolby Vision-compatible TVs cost more than HDR10 or HDR10+ ones.
Beyond licensing, the Dolby Vision standard represents a significant leap in image quality compared to the other standards. The first improvement has to do with color depth: 12-bit versus the 10-bit of HDR10 and HDR10+. We will explain both terms in a later section.
Another difference between Dolby Vision and its counterparts is the maximum brightness level. TVs with this certification can theoretically reach 10,000 nits. Yes, you read that right. In practice, however, commercial televisions tend to top out at around 4,000 nits, a figure reserved for some 8K models costing several thousand euros.
The last difference in Dolby Vision is related to tone mapping. This is a system that analyzes the colorimetry of the images in real time in order to display, at every moment, the color and tone closest to the source material. Because a TV's color range is limited, it tries to reproduce a tone as close as possible to that of the broadcast source. The key to this system in Dolby Vision is that the mapping is done much more consistently than in HDR10, by relying exclusively on hardware. In other words, the picture on Dolby Vision-compatible televisions will look very similar from one set to another, while the picture on HDR10 televisions may differ from model to model. For video editors, this consistency lets them fine-tune image colors against the Dolby standard, which is why it is the most popular system in the film industry.
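To make the idea of tone mapping concrete, here is a minimal sketch using the well-known Reinhard operator. Real Dolby Vision tone mapping is proprietary and hardware-based; this only illustrates the general principle of compressing the brightness of the source into the range a panel can actually display.

```python
# Minimal tone-mapping sketch (Reinhard operator), purely illustrative.
# Assumption: content mastered at 4,000 nits, shown on a 1,000-nit panel.

def reinhard_tone_map(source_nits, display_peak=1000.0, source_peak=4000.0):
    """Compress a scene luminance value into the display's capabilities."""
    x = source_nits / source_peak        # normalize source to 0..1
    mapped = x / (1.0 + x)               # Reinhard curve: compresses highlights
    peak = 1.0 / (1.0 + 1.0)             # where the source peak lands (0.5)
    return mapped / peak * display_peak  # rescale so peak maps to panel peak

# A 4,000-nit highlight lands exactly on the 1,000-nit panel peak,
# while mid-range values are compressed far more gently.
print(round(reinhard_tone_map(4000.0)))  # 1000
print(round(reinhard_tone_map(400.0)))   # 182
```

The point of the curve is that highlights are squeezed hard while darker tones keep most of their detail, which is exactly the trade-off tone mapping has to make on panels dimmer than the mastering display.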
10-bit and 12-bit: when the color depth determines the quality of the TV picture
First it was 8 bits, then 10 bits, and now 12 bits. These terms describe the color depth of an image, that is, the number of colors a television is capable of reproducing. Leaving the technicalities aside, the key idea is that a 12-bit television can reproduce a much wider range of colors than a 10-bit one.
This bit count is generally tied to each standard. For example, the HDR10 and HDR10+ standards necessarily use a 10-bit color depth, while Dolby Vision-certified TVs use 12 bits. Hence the price difference between televisions carrying different standards.
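The jump from 10 to 12 bits is bigger than it sounds, because the bits apply to each of the three color channels. A quick calculation shows the scale:

```python
# Color depth arithmetic: bits per channel -> total displayable colors.
# Three channels (R, G, B), each with 2**bits distinct levels.

def total_colors(bits_per_channel):
    levels = 2 ** bits_per_channel
    return levels ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit: {2**bits:,} levels/channel, {total_colors(bits):,} colors")
# 8-bit:  256 levels/channel,        16,777,216 colors (~16.7 million)
# 10-bit: 1,024 levels/channel,   1,073,741,824 colors (~1.07 billion)
# 12-bit: 4,096 levels/channel,  68,719,476,736 colors (~68.7 billion)
```

So a 12-bit Dolby Vision panel can in principle address 64 times as many colors as a 10-bit HDR10 one, which is why banding in smooth gradients is less of a problem at higher depths.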
Conclusion: HDR10 for cheap TVs and Dolby Vision for expensive TVs
You don't have to take my word for it: it is plain to see in the manufacturers' TV catalogs. Just take a look at the stickers printed on promotional TV posters.
Samsung 8K TV.
If the TV does not carry a sticker mentioning picture quality, it will be compatible with the HDR10 or HDR10+ standard at most. Models compatible with Dolby Vision must display a sticker certifying that integration. Not coincidentally, they are usually models that cost more than 1,000 euros.