HDR is an abbreviation for high dynamic range, and the easiest way to explain it is as a wider range of contrast and color than usual. Of course, the whole story is more complex, because HDR delivers better pixels: bright whites are brighter, dark colors are darker, and 10-bit panels can finally display a range of over a billion colors. It is important to understand that HDR comes in many shapes, just as HD covers a large set of resolutions.
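The "over a billion colors" figure follows directly from the arithmetic of bit depth. A short sketch (the function name is ours, chosen for illustration):

```python
# Color depth determines how many distinct shades a panel can show.
# With b bits per channel and three channels (R, G, B),
# the total is (2**b) ** 3.
def total_colors(bits_per_channel: int) -> int:
    shades_per_channel = 2 ** bits_per_channel
    return shades_per_channel ** 3

print(total_colors(8))   # 8-bit SDR panel: 16,777,216 colors
print(total_colors(10))  # 10-bit HDR panel: 1,073,741,824 colors
```

Moving from 8 to 10 bits per channel multiplies the palette 64-fold, which is why banding in smooth gradients (skies, shadows) is far less visible on a 10-bit panel.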
The type of high dynamic range your TV can display should not be confused with the HDR photo mode that has recently appeared on smartphones. Both produce images with higher contrast between light and dark, but they achieve it differently. With video, you still get enhanced color and contrast, but not by combining multiple exposures. It all comes down to how the image is displayed on the screen, and to the original content being shown.
On a standard screen, everything below a certain brightness is displayed in the same shade of black. An HDR screen's range goes beyond that, letting you distinguish what is truly black from what is merely dark. That is why it is increasingly used by filmmakers and designers in Hollywood, as well as in other major audio-visual centers. HDR brings the picture closer to what the human eye sees, and thus creates more realistic images, from scenes flooded with sunlight to night shots on city streets.
So, get used to seeing detail in both the shadows and the brightest highlights, and then notice how standard pictures suddenly look a bit flat and washed out. HDR has several advantages over a standard display, adding a new level of dynamics to the image. When we moved from standard definition to HD, screens were much smaller, and adding pixels was enough to transform the picture. Now screens are so big that adding more pixels is no longer enough, but other aspects can still be improved. This is where HDR comes in: extra colors, better clarity in shadows, and a subtle emphasis on detail add up to a far more visually pleasing picture than extra pixels alone can deliver.
How do you get a high-quality HDR image?
HDR is an end-to-end technology: every step, from creation to delivery to your TV screen, must be HDR-compliant, which means older TVs cannot display it. When buying a TV, you may notice the Ultra HD Premium logo. It means the display offers a level of performance guaranteed to make the most of an HDR source. The panel must have 3840x2160 pixels (no different from any other UHD display in that respect) and must be able to display a large number of distinct shades of color.
The second key figure is contrast: at least 1,000 nits of peak brightness and less than 0.05 nits in black. The nit is the unit the TV industry has adopted to express screen brightness. One nit is roughly the light emitted by a single candle, making it comparable to the more familiar unit, the candela. Most TV screens today offer between 300 and 500 nits, which gives a good sense of how much light HDR requires. The numbers above apply to LCD and LED displays; for OLED displays (which have lower peak brightness but a significantly deeper black level than LCDs), the requirement is a peak of 540 nits and a black level below 0.0005 nits.
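The two certification tiers described above can be summarized in a few lines of code. This is only a sketch of the thresholds quoted in the text; the function name and structure are our own, not part of any official test suite:

```python
# Sketch of the two Ultra HD Premium contrast tiers described in the text.
# LCD/LED: very bright peak, modest black level.
# OLED: lower peak allowed, but far deeper blacks required.
def is_uhd_premium(panel_type: str, peak_nits: float, black_nits: float) -> bool:
    if panel_type == "lcd":
        return peak_nits >= 1000 and black_nits < 0.05
    if panel_type == "oled":
        return peak_nits >= 540 and black_nits < 0.0005
    raise ValueError("unknown panel type")

print(is_uhd_premium("lcd", peak_nits=1100, black_nits=0.04))    # True
print(is_uhd_premium("lcd", peak_nits=450, black_nits=0.04))     # False
print(is_uhd_premium("oled", peak_nits=600, black_nits=0.0003))  # True
```

Note how a typical 300-500 nit LCD fails the check outright, which is exactly why most of today's sets cannot carry the Premium badge.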
Most LCDs cannot earn the Premium badge, and since manufacturers are not required to publish nit figures or contrast ratios, many choose to produce non-Premium displays that still have enough brightness and contrast to deliver a respectable HDR image. However, that HDR image will not be what its creator intended, which means end users will not see the full range of colors.
The Ultra HD Premium mark can also appear on Ultra HD Blu-ray players and discs, because HDR is part of the Ultra HD Blu-ray specification. UHD displays without HDR compatibility will still show 4K images from an Ultra HD Blu-ray in SDR (Standard Dynamic Range), but they will not be able to use the HDR metadata embedded in the image. With a transfer rate of up to 100 Mb/s (roughly five times the bitrate of typical streaming video), Ultra HD Blu-ray is currently the best way to deliver HDR content.
All HDR displays can show what is called HDR10, which shares its specification with the UHD Premium standard. HDR10 is mandatory on all Ultra HD Blu-rays, and it is also used for HDR content streamed by Netflix and Amazon. But there is another reason to pay attention when choosing a specific HDR TV brand: content creators use several types of HDR.
Dolby has a rival to HDR10 called Dolby Vision, which supports greater color depth and brightness of up to 10,000 nits, far beyond what any current TV set can reproduce. It also includes a feature known as "dynamic tone mapping", which adjusts the brightness and contrast of each scene depending on how light or dark its tones need to be. Only screens and players equipped with a Dolby Vision decoder can display this "enhanced" version of HDR, and LG is the only major manufacturer currently offering it.
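Dolby's actual tone-mapping algorithm is proprietary, but the underlying idea can be illustrated with a classic Reinhard-style curve: scene brightness mastered for a huge range is compressed smoothly so it fits a real display's peak, instead of clipping hard to white. This is a minimal sketch of the concept, not Dolby's method:

```python
# Illustration of tone mapping: remap scene brightness (in nits) so that
# content mastered for up to 10,000 nits fits a display's real peak.
# A Reinhard-style curve compresses highlights smoothly rather than clipping.
def tone_map(scene_nits: float, display_peak: float) -> float:
    x = scene_nits / display_peak
    return display_peak * x / (1.0 + x)

# On a 540-nit OLED, a 10,000-nit highlight lands just under the panel's peak,
# while midtones are only mildly darkened.
for nits in (100, 1000, 10000):
    print(nits, "->", round(tone_map(nits, display_peak=540), 1))
```

"Dynamic" tone mapping goes one step further: instead of one fixed curve for the whole film, the curve is re-derived per scene (or per frame) from metadata, so a dark night scene and a bright beach scene each get an optimal mapping.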
Netflix and Amazon use Dolby Vision, and several studios offer UHD Blu-rays with the standard built in. The good news is that most Dolby Vision content also contains HDR10 metadata, so if your TV is labeled HDR, you are guaranteed an HDR experience. If and when other manufacturers offer Dolby Vision, it will not arrive as a software upgrade for existing models, so if you buy a TV that does not support Dolby Vision, keep in mind that it never will.
Samsung recently announced an upgraded HDR10 standard called HDR10+, which adds Dolby Vision-style dynamic tone mapping to HDR10. However, HDR10+ is currently limited to Samsung's 2017 TVs and Amazon Prime Instant Video content, so it is not as widespread as the other standards. Technicolor and Philips are working together on a system that creates and delivers both HDR and normal versions simultaneously; the correct version is selected automatically to match the screen's peak brightness and dynamic range. Finally, the BBC and Japan's NHK have jointly developed a version of HDR called Hybrid Log-Gamma, which can be used in conventional TV broadcasts. Interestingly, it is backwards compatible: if an SDR television receives an HLG signal, it will still be able to display the picture, just without the HDR benefits.
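Hybrid Log-Gamma's backwards compatibility comes from its transfer curve, standardized in ITU-R BT.2100: the lower half is a conventional square-root (gamma-like) segment that an SDR set interprets sensibly, while the upper half switches to a logarithm to carry the extra highlight range. A sketch using the constants from the BT.2100 specification:

```python
import math

# HLG opto-electrical transfer function (OETF) per ITU-R BT.2100.
# Below 1/12 of peak scene light, it behaves like ordinary SDR gamma;
# above that, a logarithmic segment encodes the HDR highlights.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # SDR-compatible square-root segment
    return A * math.log(12 * e - B) + C  # logarithmic highlight segment

print(round(hlg_oetf(1 / 12), 3))  # 0.5 -- the crossover point
print(round(hlg_oetf(1.0), 3))     # 1.0 -- peak scene light maps to full signal
```

Everything an SDR broadcast would normally contain lives in the square-root segment, which is why a non-HDR set can show an HLG picture without any metadata at all.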
What to think about HDR?
In short, HDR brings cinema to TV screens. We cannot wait for TV stations to adopt HDR for the same reasons. Imagine a live broadcast of a football match in a stadium split between shade and sunlight, without the sudden jump in exposure (that stunning flash of white) as the ball crosses into the sunny part of the pitch.
Cinematically speaking, it will have a great impact on directors, who no longer need to choose between exposing for shadow or for sunlight. Within a year or two it will probably be fair to say that HDR has finally put the "ultra" in UHD viewing, and that the issues around competing standards have been resolved.
Most hardware manufacturers back HDR10 (the official standard), but Dolby is a powerful force and has content creators on its side. It is not impossible that both standards will continue to coexist (just as Dolby and DTS share the Blu-ray audio market), but the safe choice is a TV with Dolby Vision, because HDR10 can always be displayed on it, while Dolby Vision cannot be added via firmware update. For consumers' sake, we hope the standards question is settled as soon as possible.
In addition, however fascinating it may be at first, some viewers might find it hard to get used to the vibrancy and richness of color that HDR delivers. They may be tempted to go back to something darker and softer, feeling that HDR's colors are aggressive to the eye and tire it more quickly. But every monitor or TV does this to some degree, so it should be seen not as a particular fault of HDR, but as an inherent trait of displays.