HDR, HDR10, HDR10+, HLG and Dolby Vision

I have read several articles on the differences between all these High Dynamic Range (HDR) video formats, but few address them from a technical standpoint.

So let's do a quick, high-level (not so technical) overview of each:

  • WCG: Wide Color Gamut – more colors
  • HDR: an overloaded term; generally the combo of WCG + a larger luminance range
  • HDR10: single sheet of color and light information to set upper and lower bounds
  • HDR10+: a list of sheets for each scene to change those bounds dynamically
  • Dolby Vision: a proprietary list of bigger sheets
  • HLG: Regular color space with magic sprinkles on top

Now into a bit more detail. Keep in mind I will be talking about consumer consumption of these concepts, not the prosumer or creator original formats.[1]

HDR

HDR, aka High Dynamic Range, has been overloaded to mean many things nowadays. Companies like to slap it on, just as they do with “HD”, if any of their products meet the minimum requirements (and sometimes even when they don't).

What HDR actually allows for is simply brighter brights and darker darks.

Gamma Curve

The human eye is very sensitive to small changes in darkness, and the traditional range of brightness in 8-bit video is basically a staircase: each level of light is the same distance from the ones before and after it.

To capture more of the detail in the darks, we want smaller steps between levels at low light, with steps that can grow larger as the picture gets brighter.

That means we need two things: more room to store information (aka 10-bit video) and some kind of maths that can condense the usual steps into a gamma curve. The standard curve for UHD movies is SMPTE ST 2084, but there is also HLG, discussed below.
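
For the curious, the core of ST 2084 is the “PQ” (Perceptual Quantizer) transfer function. Paraphrasing the spec: given a luminance level Y, normalized so that 1.0 represents 10,000 cd/m², the encoded signal value E' is

    E' = \left( \frac{c_1 + c_2 Y^{m_1}}{1 + c_3 Y^{m_1}} \right)^{m_2}

with the constants m1 ≈ 0.1593, m2 = 78.84375, c1 = 0.8359375, c2 = 18.8515625, and c3 = 18.6875. The net effect is a curve that spends far more of its 10-bit code values on the darks than a straight staircase would.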

Wide Color Gamut

Well, now that we have 10-bit video, we also have room for more colors, right? The UHD Alliance thought so too, so they set the standard requirements for UHD videos to be:

  • 3840×2160 resolution
  • 10-bit minimum
  • SMPTE ST 2084 transfer function
  • BT.2020 color gamut

So UHD videos use a 10-bit or 12-bit pixel format and a larger color space (BT.2020) instead of the traditional 8-bit BT.709, encompassing a much larger range of color and light. (You can check what a given file is flagged as with ffprobe; see the sketch below.)

Images from Wikimedia Commons[2][3]
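
Here is a quick sketch of that ffprobe check (the file name is just a placeholder):

    # Show the pixel format and color flags of the first video stream
    ffprobe -hide_banner -select_streams v:0 \
        -show_entries stream=pix_fmt,color_space,color_transfer,color_primaries \
        -of default=noprint_wrappers=1 input.mkv

A UHD HDR release will typically report yuv420p10le, bt2020nc, smpte2084, and bt2020 respectively.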

This new standard is not directly backwards compatible, so you may come across devices that display what should be a beautifully colorful scene as painfully muted and off-color. For example, here is a 10-bit BT.2020 video decoded properly (left) next to a direct conversion to 8-bit (right).

HDR properly decoded vs direct conversion to an 8-bit video
Stills taken from Dolby’s Glass Blowing Demo[4]
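
As an aside, a proper conversion from HDR down to SDR requires tone mapping, not just a pixel format change. Here is a minimal sketch with ffmpeg, assuming a build that includes the zscale (libzimg) and tonemap filters; the file names and the hable algorithm choice are placeholders:

    # Tone map BT.2020/PQ content down to 8-bit BT.709 instead of converting directly
    ffmpeg -i hdr_input.mkv \
        -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
        -c:v libx264 sdr_output.mkv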

This is not much of an issue for movies, as HDR currently only ships on 4K Blu-rays, so the players have to support BT.2020. However, it is an issue for broadcasters, who need to serve both older and newer hardware; that's where HLG comes into play. But first, let's go deeper into the “true” HDR options.

HDR10

HDR10 is built on top of HDR as a small set of static information that tells the player the “Mastering Display Color Volume” for the entire video: the color primaries, white point, and luminance range of the display the content was mastered on. Alongside it ride two static light levels, the Maximum Content Light Level (MaxCLL) and Maximum Frame Average Light Level (MaxFALL). If you extract these metadata elements, stored in the video’s SEI messages, they will look something like this:

"side_data_list": [
                {
                    "side_data_type": "Mastering display metadata",
                    "red_x": "35400/50000",
                    "red_y": "14600/50000",
                    "green_x": "8500/50000",
                    "green_y": "39850/50000",
                    "blue_x": "6550/50000",
                    "blue_y": "2300/50000",
                    "white_point_x": "15635/50000",
                    "white_point_y": "16450/50000",
                    "min_luminance": "50/10000",
                    "max_luminance": "40000000/10000"
                },
                {
                    "side_data_type": "Content light level metadata",
                    "max_content": 0,
                    "max_average": 0
                } 
]
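
One way to dump that side data yourself is ffprobe's frame-level JSON output, reading just the first frame (the file name is a placeholder):

    # Print the HDR10 SEI side data carried by the first video frame
    ffprobe -hide_banner -loglevel warning -select_streams v:0 \
        -print_format json -show_frames -read_intervals "%+#1" \
        -show_entries frame=side_data_list hdr_input.mkv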

Encoders such as x265 (for HEVC / H.265) and rav1e (for AV1) support mastering display metadata and will take the arguments in a simplified format. For example, they take the mastering display as “G(x,y)B(x,y)R(x,y)WP(x,y)L(max,min)”, with MaxCLL and MaxFALL passed as a separate pair:

G(8500,39850)B(6550,2300)R(35400,14600)WP(15635,16450)L(40000000,50)
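
As a sketch of how that reaches the encoder, here is ffmpeg piping into x265 (the file names, and the MaxCLL/MaxFALL pair of 1000 and 400 nits, are placeholder assumptions):

    # Decode with ffmpeg, encode HDR10 with x265, signaling the static metadata
    ffmpeg -i source.mkv -f yuv4mpegpipe -strict -1 - | \
        x265 --y4m --input - --profile main10 \
            --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc \
            --master-display "G(8500,39850)B(6550,2300)R(35400,14600)WP(15635,16450)L(40000000,50)" \
            --max-cll "1000,400" --output hdr10_output.hevc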

HDR10+

Expanding upon HDR10, HDR10+ (also written “HDR10 Plus”) supports that information frame by frame or scene by scene. Videos with HDR10+ support will generally include the static HDR10 metadata as well. HDR10+ is specifically for 10-bit video and supports resolutions up to 8K.

The video then carries a large array of this information; for example, here is a single frame's extra metadata:

    {
      "BezierCurveData": {
        "Anchors": [124, 255, 380, 501, 617, 728, 845, 904, 935],
        "KneePointX": 194,
        "KneePointY": 242
      },
      "LuminanceParameters": {
        "AverageRGB": 354,
        "LuminanceDistributions": {
          "DistributionIndex": [1, 5, 10, 25, 50, 75, 90, 95, 99],
          "DistributionValues": [16, 5312, 98, 123, 315, 551, 707, 805, 4986]
        },
        "MaxScl": [7056, 5801, 4168]
      },
      "NumberOfWindows": 1,
      "TargetedSystemDisplayMaximumLuminance": 400
    }

The HEVC encoder x265 is capable of taking such a metadata JSON file and applying HDR10+ to a video, as sketched below. Right now I know of no AV1 or H.266/VVC encoders that can do so.
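
A sketch of that, assuming an x265 build compiled with HDR10+ support and a metadata JSON extracted beforehand (file names are placeholders):

    # Apply per-scene HDR10+ metadata from a JSON file while encoding
    ffmpeg -i source.mkv -f yuv4mpegpipe -strict -1 - | \
        x265 --y4m --input - --profile main10 \
            --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc \
            --dhdr10-info metadata.json --output hdr10plus_output.hevc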

Dolby Vision

Dolby’s take on dynamic HDR is its proprietary “Dolby Vision”, which, unlike all the other options, carries a royalty cost. However, it does support 12-bit video, for a larger range of color and brightness than HDR10+ is currently capable of supporting.

Right now most 4K Blu-rays and TVs support either Dolby Vision or HDR10+; however, it is possible to support both in the same video.

Also, unlike all the previous options, it is not possible to re-encode a video with Dolby Vision without the original metadata file, as there is currently no way to extract the proprietary information. (The x265 encoder does support encoding with Dolby Vision for content creators, as sketched below.)
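
For completeness, here is roughly what that looks like, assuming you already have the RPU metadata file from the mastering workflow; the profile choice, VBV numbers, and file names are placeholder assumptions (x265 requires VBV settings for Dolby Vision):

    # Encode with Dolby Vision metadata from a pre-existing RPU file
    ffmpeg -i source.mkv -f yuv4mpegpipe -strict -1 - | \
        x265 --y4m --input - --profile main10 \
            --dolby-vision-profile 8.1 --dolby-vision-rpu metadata.rpu \
            --vbv-bufsize 50000 --vbv-maxrate 50000 \
            --output dolby_vision_output.hevc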

HLG

Over in the ever-dying world of direct-to-home TV broadcasting, broadcasters need a way to support both 30-year-old TVs and receivers and the brand-new hotness of HDR. So they applied some black magic (math) and figured out how to send a signal that combines a gamma curve with a logarithmic curve, hence the name “Hybrid Log-Gamma.” Older SDR TVs read the gamma portion and produce a proper picture, while receivers and TVs that support HLG show the HDR video. That way they don't run into the issue shown in the glass-blowing images above.
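
Since HLG is just a different transfer curve rather than extra metadata, an encode only needs to be tagged with the right transfer characteristics. A minimal sketch with ffmpeg and libx265, assuming the source is already HLG material (file names are placeholders):

    # Tag an encode with the ARIB STD-B67 (HLG) transfer curve
    ffmpeg -i hlg_source.mkv -c:v libx265 -pix_fmt yuv420p10le \
        -color_primaries bt2020 -color_trc arib-std-b67 -colorspace bt2020nc \
        -c:a copy hlg_output.mkv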

Learn More

Obviously this was a “getting your feet wet” post, and I wanted to include links to a lot of great resources to learn more about the various aspects of HDR and video formats.

References

  • [1] I will be referring to these formats as they are used in consumer releases, such as 4K Blu-rays and online streaming services. As such, they use a more compressed pixel format (“yuv420p10le”) and are HEVC encoded using standard RECs. Just be aware that these ideas may exist and be represented differently in creator and prosumer products that have full color ranges.
  • [2] https://commons.wikimedia.org/wiki/File:CIExy1931_Rec_2020.svg
  • [3] https://commons.wikimedia.org/wiki/File:CIExy1931_Rec_709.svg
  • [4] https://developer.dolby.com/tools-media/sample-media/video-streams/dolby-vision-streams/