HDR stands for “High Dynamic Range”. The goal is to reproduce, in a photo or a video, as faithfully as possible what the human eye perceives when a scene is highly contrasted. To do this, the technique adjusts the light intensity levels, from black to white, in the final image.
HDR technology thus aims to modulate contrast in a balanced, natural way, so that deep blacks do not end up simply grayed out and whites are not blown out.
What is HDR?
High dynamic range imaging (referred to simply as HDR below) is therefore a technology that displays many more levels of light intensity in an image, in order to avoid crushed blacks in images that are too dark and blown-out whites in those that are too bright.
HDR vs SDR
As opposed to SDR (Standard Dynamic Range), i.e. the kind of images we have been watching since the early days of television, HDR (High Dynamic Range) extends the dynamic range of the picture. It is all about brightness, or rather luminance, which is expressed in nits, or candelas per square meter (cd/m²).
HDR TVs are thus able to offer a wider range of brightness than their SDR counterparts, reaching much higher peak luminance while going much deeper into the dark. And since every manufacturer has its own recommendations, the simplest approach is to rely on those of the UHD Alliance to see things more clearly.
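To put figures on that gap, dynamic range is often counted in stops, each stop being a doubling of luminance. Here is a minimal sketch using purely illustrative values (roughly 0.1–100 nits for a typical SDR panel, 0.005–1,000 nits for an HDR one; real sets vary widely by technology and model):

```python
import math

def dynamic_range_stops(black_nits: float, peak_nits: float) -> float:
    """Dynamic range in stops: each stop is a doubling of luminance."""
    return math.log2(peak_nits / black_nits)

# Illustrative values only; real panels differ by technology and model.
print(f"SDR: {dynamic_range_stops(0.1, 100):.1f} stops")     # ~10 stops
print(f"HDR: {dynamic_range_stops(0.005, 1000):.1f} stops")  # ~17.6 stops
```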
How did it start?
The HDR function appeared only recently on cameras, smartphones and camcorders, but the work behind it began in the 1850s.
The French photographer Gustave Le Gray wanted to accurately render a landscape combining sky and sea. By playing with several exposure values, he was the first to create what amounts to an HDR photo.
HDR standards and types
Several HDR standards currently coexist, in a way that remains quite blurry to newcomers. Some manufacturers promise support for one, two or even more standards, and some TVs or gaming monitors go further still, combining support for HDR10, Dolby Vision IQ and HLG, certifications to which they add the HGIG stamp.
HDR10
HDR10 is backed by the UHD Alliance, which awards its Ultra HD Premium trademark, whose logo is affixed to TV boxes when HDR10 is supported. It is not always labeled as such on TV packaging, and is sometimes boiled down to a simple “HDR” mention.
This standard, defined by the Consumer Electronics Association, notably requires support for the Rec. 2020 color space and 10-bit encoding per primary color (1,024 shades), with chroma subsampling performed in YCbCr 4:2:0.
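The shade count follows directly from the bit depth: n bits per primary color give 2^n code values. A quick check (the same arithmetic yields the 4,096 shades of 12-bit Dolby Vision discussed further down):

```python
def shades_per_primary(bits: int) -> int:
    """Number of distinct code values per primary color for a given bit depth."""
    return 2 ** bits

print(shades_per_primary(8))   # 256  (SDR)
print(shades_per_primary(10))  # 1024 (HDR10)
print(shades_per_primary(12))  # 4096 (Dolby Vision)
```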
HDR10+
Three years after HDR10, the consortium formed by 20th Century Fox, Panasonic and Samsung launched its evolution, HDR10+. Since 2018, the standard has taken over the basics of HDR10, enriched specifically on the metadata front.
While the metadata accompanying HDR10 videos is said to be static, that is, set once and for all for a given piece of content (based on its brightest scene), HDR10+ comes with dynamic metadata: the tone-mapping curve is re-evaluated for each scene, for much more accurate rendering.
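To illustrate the difference, here is a deliberately naive Python sketch, a simple linear scaling toward a hypothetical 600-nit display rather than the real HDR10/HDR10+ tone-mapping curves: with a single static peak taken from the brightest scene, dark scenes get dimmed far more than necessary, whereas a per-scene peak leaves them untouched.

```python
def tone_map(scene_nits, scene_peak, display_peak=600):
    """Naive tone mapping: scale a scene so its peak fits the display's peak.
    Real HDR tone mapping uses perceptual curves, not a linear scale."""
    scale = min(1.0, display_peak / scene_peak)
    return [min(display_peak, nit * scale) for nit in scene_nits]

dark_scene = [0.05, 2, 40]       # scene luminances in nits
bright_scene = [50, 800, 4000]

# Static metadata: one peak (the brightest scene, 4,000 nits) drives every scene.
static = [tone_map(s, scene_peak=4000) for s in (dark_scene, bright_scene)]

# Dynamic metadata: each scene is mapped using its own peak.
dynamic = [tone_map(s, scene_peak=max(s)) for s in (dark_scene, bright_scene)]

print(static)   # the dark scene is crushed toward black
print(dynamic)  # the dark scene is left untouched
```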
HDR10+ Adaptive
You guessed it: three years later, another evolution was to be expected, and the consortium in charge of HDR10+ delivered in early 2021 by announcing HDR10+ Adaptive. Backed more than ever by Samsung, the standard is above all a response to Dolby's own, Dolby Vision IQ, to which we will return below.
Available on the Samsung TVs announced in early 2021, HDR10+ Adaptive builds on the basics of HDR10+, namely its dynamic metadata, to which it adds “real-time information on ambient light”. The idea is that the brightness of the image should be optimized not only scene by scene, but also according to the viewing conditions.
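Neither Samsung nor the HDR10+ consortium publishes the exact algorithm, so the following is only a hypothetical Python sketch of the principle: the tone-mapping target for a scene is nudged upward as the room gets brighter, so that shadow detail does not drown in ambient light. The function name, numbers and capping are invented for illustration.

```python
def ambient_adjusted_target(scene_target_nits: float, ambient_lux: float) -> float:
    """Hypothetical sketch: raise the tone-mapping target in a bright room so
    shadow detail stays visible; leave it untouched in a dark room."""
    boost = 1.0 + min(ambient_lux, 500) / 1000   # capped at +50%, arbitrary
    return scene_target_nits * boost

print(ambient_adjusted_target(400, ambient_lux=5))    # dark room   -> ~400 nits
print(ambient_adjusted_target(400, ambient_lux=300))  # bright room -> ~520 nits
```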
Dolby Vision
Unlike HDR10, Dolby Vision HDR uses 12-bit encoding, which yields 4,096 shades per primary color. Dolby Vision also goes further than HDR10 by attaching metadata to each frame, which helps ensure that videos are reproduced faithfully, in line with what the director intended to show.
On the peak brightness side, Dolby Vision can handle a luminance of up to 4,000 nits and is designed to reach 10,000 nits in the future… even though, in practice, some Dolby Vision certified televisions do not exceed 1,000 nits and use 10-bit panels.
The standard, more demanding in that it must be respected from the production to the delivery of the images, but also more precise than HDR10, nevertheless has one drawback: it is more expensive, since Dolby charges for its license. It was fairly rare a few years ago but, pushed in particular by video streaming platforms, it is now widespread. Compatible TVs are available from Sony, LG, TCL and Panasonic… but not from Samsung.
Dolby Vision IQ
While HDR10+ Adaptive dates from CES 2021, it was at the previous edition of the American show that Dolby Vision IQ was formally introduced.
Developed in partnership with LG and Panasonic, which integrate the technology into some of their TV ranges, it “uses Dolby Vision dynamic metadata and the light sensors located inside the TVs to detect the level of brightness in the room, intelligently displaying every detail of the content so it looks its best”. Here too, manufacturers adopting the technology must therefore adapt their TVs at the hardware level.
HLG
HLG, for Hybrid Log-Gamma, is a much less common standard. Developed by the BBC and the Japanese broadcaster NHK, it aims to simplify the distribution of live HDR content, which the metadata transmission required by HDR10 and Dolby Vision makes more complicated.
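The “hybrid” in the name refers to its transfer function, standardized in ITU-R BT.2100: the lower part of the signal follows a conventional square-root (gamma-like) curve, so the picture remains presentable on SDR screens, while the upper part switches to a logarithmic curve for highlights, without requiring metadata. A small Python sketch of the BT.2100 HLG OETF (normalized scene light in, normalized signal out):

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 0.28466892          # 1 - 4*A
C = 0.55991073          # 0.5 - A * ln(4*A)

def hlg_oetf(e: float) -> float:
    """HLG OETF: normalized scene linear light E (0..1) -> signal E' (0..1)."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)          # SDR-like square-root segment
    return A * math.log(12.0 * e - B) + C  # logarithmic segment for highlights

for e in (0.0, 1.0 / 12.0, 0.5, 1.0):
    print(f"E = {e:.4f} -> E' = {hlg_oetf(e):.4f}")
```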