The Rise in New TV Technology to Stimulate Sales
Several years ago, when Ultra HD was beginning to roll out, it was evident that the many innovations that were to carry it would not arrive at the same time. Now all the building blocks are in place, especially on the hardware side, even though broadcast TV will use these features only to a limited extent. But there is a wide variety of sources that do employ them, including streaming video, physical media, games and user-generated content. The TV market is a competitive space that is driving HDR technologies to enhance picture quality. Some of these new technologies aim to deliver contrast performance equivalent to OLED, but at lower cost, in order to address the mass consumer market. The deeper blacks possible from HDR displays are strongly desirable in TVs (for movies, particularly in low lighting) but also in other products such as surface-integrated automotive displays, monitors and notebooks. The alternative LCD technologies try to replicate OLED’s contrast ratio using some form of local dimming, which in many instances compromises the image in ways that viewers tend to overlook.
HDR is expected to change the nature of TV images, perhaps more than any of the other changes such as 4K, 8K or HFR, as it offers higher contrast, more vivid colors and brighter images. Apart from picture quality, price is the other significant difference between TVs with HDR and those without.
Figure 1: HDR – Hisense Dual-Cell TV
Consider 65” 4K TVs with and without HDR. At the lower end of the price spectrum today, a non-HDR LCD TV at this size costs around $500, a function of the capacity growth in TV panel production over the past 2-3 years. Price drops over that span have been dramatic, leading to amazing bargains for consumers whilst display makers have faced margin challenges, though those drops are now being reversed as panel makers raise prices. At the other end of the price spectrum, a 65” OLED HDR TV can cost $3,000, although Amazon Prime Day priced a 65” OLED from Sony at $1,500.
FALD and Mini LED — Recently, premium LCD HDR TVs have adopted several backlight innovations that move LCD performance toward the pixel-level, very high contrast of OLED TVs. Full array local dimming (FALD) LCD TVs have a backlight made up of an array of LEDs divided into tens or several hundreds of zones, each of which is independently dimmable. This reduces light leakage and improves black levels, but can produce halo effects with some content (e.g. if a still image requires an abrupt change from bright to dark at a spatial frequency higher than that of the dimming zones).
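The principle, and the origin of the halo, is easy to see in a few lines of code. The sketch below is a minimal, illustrative model of zone dimming, not any vendor's actual algorithm: each backlight zone is driven at the peak luminance of the pixels it covers, so a single bright pixel in a dark frame lifts its entire zone. The frame size and zone count are arbitrary assumptions.

```python
import numpy as np

def zone_backlight(luma, zones=(8, 12)):
    """Toy full-array local dimming: drive each backlight zone at the
    peak luminance (0.0-1.0) of the pixels it covers."""
    h, w = luma.shape
    zh, zw = zones
    levels = np.zeros(zones)
    for i in range(zh):
        for j in range(zw):
            block = luma[i * h // zh:(i + 1) * h // zh,
                         j * w // zw:(j + 1) * w // zw]
            levels[i, j] = block.max()  # zone must match its brightest pixel
    return levels

# One bright pixel in an otherwise black frame forces its whole zone up,
# which is exactly the 'halo' effect described above.
frame = np.zeros((1080, 1920))
frame[500, 960] = 1.0
levels = zone_backlight(frame)
print(levels.max(), (levels > 0).sum(), "of", levels.size, "zones lit")
```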
A Mini LED backlight is conceptually an extension of the FALD approach (with more advanced technology required), but there are many more LEDs and therefore many more, and smaller, zones. Rather than several hundred LEDs there will be several thousand or tens of thousands. This adds cost through the BOM and process complexity, but produces even better HDR performance, and many TV makers, including Samsung, are expected to adopt the technology. The added costs are exorbitant at this time, however, making the backlight the most expensive component in the whole unit.
Dual-cell LCD technology — Dual-cell LCD uses a stacked-cell structure that allows LCDs to compete with OLED TVs on contrast, particularly black levels. Dual-cell TVs work by inserting a second pixelated cell (the ‘modulation cell’) between the display cell and the backlight, which allows the backlight to be modulated down in brightness before it reaches the display cell. This increases the contrast from 1,000:1 to 1,000,000:1, allowing 12-bit depth whilst carrying over all the existing performance advantages of LCD, such as high luminance, long lifetime and no burn-in. Examples include Panasonic’s 55” MegaCon and the Hisense 65SX Dual Cell TV.

But as with all new technologies there are trade-offs. The most obvious disadvantage of dual-cell displays is their thickness and weight, but optical issues also arise and require compensation. These issues include moiré/mura and reduced brightness at wide angles due to parallax, and, unlike OLED TVs, glass dual-cell models are not capable of true pixel-level dimming. All of these performance trade-offs are a direct result of the very large separation between the modulation cell and the display cell, caused by the thickness of the display glass. This inter-cell separation, a couple of millimeters, is much larger than the pixel pitch (for example, a 55” 4K TV has a pixel pitch of 0.31mm). Having an inter-cell separation of several pixels inevitably causes major parallax issues that must be overcome with a suitable compensation film (a diffusion layer) between the two cells. The compensation film spreads the light from a pixel in the modulation cell over several pixels in the display cell. This is why the TV cannot offer true pixel-level dimming: even if the two cells are of the same resolution, one pixel illuminated on the modulation cell will illuminate several pixels on the display cell – still excellent performance and a huge improvement on FALD, but not quite the same as OLED’s true pixel-level dimming. Secondly, the diffusion layer between the cells destroys the polarization of the light, meaning that four polarizers are needed in total to re-establish it. This extra polarizer further reduces the transmission and therefore the brightness, or power efficiency, of the display.
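The contrast figure quoted above follows directly from stacking two modulating layers: in the darkest state each cell blocks light independently, so their contrast ratios multiply. A quick back-of-the-envelope check, assuming both cells have a typical native LCD contrast of 1,000:1:

```python
# Stacking two light-modulating cells multiplies their contrast ratios,
# because the darkest state transmits (1/1000) * (1/1000) of the backlight.
display_cell_contrast = 1_000      # typical native LCD contrast (per the text)
modulation_cell_contrast = 1_000   # assumed equal for the second cell
combined = display_cell_contrast * modulation_cell_contrast
print(f"{combined:,}:1")           # 1,000,000:1
```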
All these competing HDR technologies face their own specific technical and manufacturing challenges. Some are easier to overcome than others, and it will ultimately be up to display makers and the supply chain to invest in and realize the most promising technologies. Manufacturing costs will play an important role as they will affect product price, as will of course demand and market potential.
The TV industry loves to give the impression of new technology and innovation to convince consumers to buy a new TV. The following table lists the eight categories of technology change used to spur new purchases.
Table 1: UHD Technology Roadmap for Content Distribution
Spatial Resolution
The aspect of screen resolution, which should be obvious, continues to create confusion about the terms. In short, it means more pixels per frame. It’s commonly referred to as 4K, which is also how Sony referred to it when it announced its first UHD products in 2012. Later that year, the CEA – now the Consumer Technology Association or CTA – came up with the term ‘Ultra HD’ to denote a resolution of 3840 × 2160 – not incidentally double the horizontal and vertical resolution of HDTV’s 1920 × 1080, with the same aspect ratio of 16:9 or 1.78:1. Perhaps the CEA chose the Ultra HD name to distinguish it from the 4K definition of the DCI (Digital Cinema Initiatives), which specifies a resolution of 4096 × 2160, corresponding to an aspect ratio of roughly 17:9 or 1.90:1. This DCI 4K resolution is often misunderstood, though. Movies are not typically made at 4096 × 2160; it is a ‘container’ format, and the images held inside it are usually 4096 × 1716 (2.35:1 or CinemaScope) or 3996 × 2160 (1.85:1). Some cropping or letterboxing is needed to fit them to the consumer screen format.
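For reference, the pixel counts and aspect ratios of the formats mentioned here can be worked out in a few lines (a quick sketch; the labels are informal):

```python
from math import gcd

formats = {
    "HD (1080p)":    (1920, 1080),
    "Ultra HD / 4K": (3840, 2160),
    "DCI 4K":        (4096, 2160),
    "8K / UHD2":     (7680, 4320),
}

for name, (w, h) in formats.items():
    g = gcd(w, h)
    print(f"{name:14s} {w} x {h}  {w * h / 1e6:5.1f} Mpixels  "
          f"aspect {w // g}:{h // g} (~{w / h:.2f}:1)")
# Note: DCI 4K reduces to 256:135, i.e. the '17:9' commonly quoted is an approximation.
```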
While Ultra HD initially referred to the resolution, it has come to cover a wider set of features, and so organizations like the Ultra HD Forum have come to refer to the 3840 × 2160 resolution simply as 4K. The other features are the other five pillars mentioned above. But which ones are required to qualify as ‘Ultra HD’? That discussion is ongoing. Until it is settled, the current line of thinking is:
- 4K resolution and higher is Ultra HD
- 1080p HD video with HDR is Ultra HD
- Next-Gen Audio only does not make Ultra HD
Many other combinations are possible. Does 1080p HD video with 100fps HFR make Ultra HD? 8K is also referred to as Ultra HD. The CEA/CTA chose the name for both 4K and 8K resolutions, and the DVB Project refers to 8K as UHD2. In a TV context this means a resolution of 7680 × 4320 – again double the horizontal and vertical resolution of 4K. In cinematic production it would be 8192 × 4320. Some movies are already (partly) shot in this resolution but they are not finished in it. Instead, the ‘Digital Intermediate’ main deliverable will be in 2K or at best 4K, because of the high cost (and lead time) of VFX rendering.
While broadcast content shot in 4K (or higher) is still rare and limited to special-interest channels like Insight TV and Travel XP, in Hollywood 4K production is slowly but steadily becoming more common. For the richest source of 4K content, however, you need to look to video streaming platforms like Netflix, Amazon and Disney+. Disney is in the process of scanning its back catalog, and Fox’s, in 4K to make them available via OTT, while Netflix insists all its originally produced content is delivered in 4K. With HDR, no less.
Ultra HD Blu-ray Discs can come from native 4K productions (shot digitally at that resolution), from 4K scans of chemical film, or upscaled from 2K (HD) when the movie was finished at that resolution – either because it was shot at 2K, because the visual effects were rendered at 2K, or because it was shot on chemical film and scanned at 2K.
Will companies like Netflix soon start producing content in 8K? Perhaps. They’re the most likely candidate, after having pioneered 4K just a few years ago. It’s less likely we’ll see an 8K disc format for reasons explained here.
Figure 2: Pixel Structures
High Dynamic Range
HDR video is something quite different from “HDR” photography. The latter involves ‘exposure synthesis’, where two images captured in quick succession are combined into one. The resulting image is typically meant for display on an SDR screen, or even printing on paper, so it’s easy to see this isn’t proper HDR. The only thing that has high dynamic range in HDR photography is the scene being captured.
HDR in video involves a greater dynamic range between the darkest and brightest parts of an image, expressed in f-stops. Standard Dynamic Range covers 7 stops, while High Dynamic Range can cover about 14 stops. Capturing such dynamic range is also done in still photography, typically in RAW format, which produces files that still need to be graded before they can be displayed.
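Because each stop doubles the amount of light, the jump from 7 to 14 stops is much bigger than it sounds; a one-line calculation makes the point:

```python
# Dynamic range in stops maps to a contrast ratio of 2**stops,
# since each additional stop doubles the light.
for label, stops in (("SDR, ~7 stops", 7), ("HDR, ~14 stops", 14)):
    print(f"{label}: {2 ** stops:,}:1")
# SDR, ~7 stops: 128:1
# HDR, ~14 stops: 16,384:1
```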
One indication that photography and videography may be converging here is that Panasonic’s latest Lumix cameras can shoot still pictures in the HLG HDR format, which can be displayed directly in HDR on Panasonic TVs. The latter feature is still largely unknown, and Panasonic deserves more attention for it. Different companies have come up with a range of solutions for capturing and delivering HDR video. The most important distinction here is between the Hybrid Log Gamma (HLG) format and the Perceptual Quantizer (PQ) family of formats.
HLG, a so-called ‘scene-referred’ HDR format, does not use any metadata. It was developed by broadcasters BBC and NHK with the goal of delivering a single signal that’s compatible with HDR as well as SDR TV sets (the latter do need to work with Wide Color Gamut in order to render a proper picture), and offering a high level of compatibility with existing production workflows and equipment. That they’ve achieved. Not many broadcasts are in HDR yet, but those that are mostly use HLG.
PQ, developed by Dolby and defined by SMPTE in the ST.2084 standard, can use metadata, but it’s optional. Static metadata is specified in the ST.2086 standard. ST.2084 HDR without metadata is referred to as PQ10; with static metadata it’s called HDR10. The distinction is less significant than it seems. While ST.2086 specifies what the metadata should conform to, it doesn’t give strict guidelines on how it should be used, so TV makers are free to do their own thing, and some choose to simply ignore the metadata and apply their own secret sauce, often leading to varying results.
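To make the static metadata more concrete, the sketch below shows the kind of information an HDR10 stream carries: the ST.2086 description of the mastering display (primaries, white point, min/max luminance) plus the MaxCLL/MaxFALL content light levels. The field names and values are illustrative only, not the exact SEI syntax or units used in the bitstream.

```python
# Illustrative HDR10 static metadata (descriptive field names, example values).
hdr10_static_metadata = {
    "mastering_display": {                 # SMPTE ST.2086 mastering display color volume
        "primaries_xy": {                  # example: P3 primaries with a D65 white point
            "red":   (0.680, 0.320),
            "green": (0.265, 0.690),
            "blue":  (0.150, 0.060),
            "white": (0.3127, 0.3290),
        },
        "max_luminance_nits": 1000.0,      # peak of the mastering monitor
        "min_luminance_nits": 0.0001,
    },
    "max_cll_nits": 1000,                  # brightest single pixel anywhere in the content
    "max_fall_nits": 400,                  # brightest frame-average light level
}
```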
In addition to static metadata, HDR can come with dynamic metadata, giving content creators the flexibility to vary the metadata from scene to scene. There are in essence three flavors of this, specified in a set of standards called SMPTE ST.2094:
- Dolby Vision (ST.2094-10)
- Advanced HDR by Technicolor
- HDR10+ (ST.2094-40), developed by Samsung and Panasonic
Figure 3: HDR Families
HDR Formats
Use of Dolby Vision is by far the most widespread among these three. In 2017 I wrote a piece about the prospects of each but surprisingly little has changed in the meantime and I still stand by my views. Technicolor HDR is a dark horse in this race. It’s been selected as the HDR format for Brazil (more about that later), it’s recently been implemented in ATSC 3.0 broadcasts in the U.S. and it’s under consideration by the Chinese authorities as a basis for a China HDR format.
Outside of broadcast, HDR10 rules. All HDR TVs accept it, as do all consumer HDR PC monitors. In a PC/gaming context, when HDR support is claimed, HDR10 is meant. All streaming services that offer HDR support at least HDR10, and it’s the one mandatory HDR format on Ultra HD Blu-ray (meaning that all players must be able to decode it, and if a disc uses HDR it must at least contain HDR10; Dolby Vision, HDR10+ and Philips HDR, ST.2094-20, are optional). For a better view of the support for each of the respective HDR formats throughout the industry, check the HDR Ecosystem Tracker.
Figure 4: Color Spaces: Rec.709, DCI-P3, Rec.2020
Around the same time as 4K resolution became practically feasible, the TV industry agreed on a new color space to address the wider color gamut that flat panels (LCD and OLED alike) were able to generate. In the days of analog SD TV, when we all used CRT picture tubes, the color space (in PAL/SECAM as well as NTSC) was Rec.601. For HD TV, which came on the heels of the transition to digital TV, this color space was slightly expanded to Rec.709. What current displays can render varies a lot but is considerably wider, and generally corresponds to what is known as DCI-P3, a format defined by the Digital Cinema Initiatives, just like the DCI 4K resolution. The Rec.2020 color space defined for UHD TV exceeds the capabilities of current displays – by quite a bit. It’s not a given that full Rec.2020 coverage will ever be achieved in commercial products, or even that it’s desired, but it’s a big enough container that it offers room for growth.
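A rough way to compare these gamuts is to compute the area of each primary triangle on the CIE 1931 xy chromaticity diagram. The sketch below uses the published primaries for each standard; note that triangle area in xy is only a crude proxy for perceived gamut size.

```python
# Shoelace area of each gamut triangle in CIE 1931 xy coordinates.
def area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

gamuts = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

ref = area(gamuts["Rec.709"])
for name, prim in gamuts.items():
    print(f"{name:9s} area {area(prim):.4f}  ({area(prim) / ref:.2f}x Rec.709)")
```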
Virtually all TV sets offered nowadays are UHD TVs (see HDTV is officially dead) and, except for the very first generations of 4K TVs from 2012/2013, pretty much all of them can handle Rec.2020 in the sense that they accept such signals and process them appropriately. None of them offer 100% coverage, and coverage varies with price, though the differences may be less pronounced than with HDR.
The more recent Rec.2100 standard is not a bigger color space than Rec.2020; it is the same color space with added specifications for HDR (both PQ and HLG are permitted), resolution (HD, 4K or 8K) and frame rate (all the usual rates between 24 and 120fps, including fractional frame rates).
Color Depth
Color depth refers to the number of bits used per subpixel. Each pixel is made up of three subpixels: a red, a green and a blue one. So what is called 8-bit video actually uses 24 bits per pixel. The HDTV format (and digital SDTV before it) uses 8 bits, which allows for 2^8 or 256 variations per color (256 shades of red, of green and of blue), and so 24 bits per pixel gives 2^24 = 256^3, or about 16.8 million colors. That seems like quite a lot, but it’s not quite enough anymore.
Why? Because of HDR. It’s not that our eyes have gotten better, but our displays have. They can produce a far greater color volume, and if we continued to use the same number of bits, those bits would have to cover a greater range, making the differences between consecutive colors bigger and more easily noticeable – an effect known as ‘color banding’.
Like most Ultra HD features, when you see this effect explained, the visuals are usually exaggerated. Faked, if you will. That’s because chances are you are reading this article on an SDR (Standard Dynamic Range) display. This may sound odd, but trying to show the difference between SDR and HDR on an SDR display is like trying to show the difference between color and black & white on a black & white TV. That’s why most stories trying to compare SDR and HDR come with visuals where the contrast of one picture has been lowered artificially. Some even go so far as to show three pictures with dumbed-down brightness levels, suggesting they show SDR, HDR with static metadata and HDR with dynamic metadata.
Figure 5: 8-bit vs 10-bit Color, Simulated Poorly
This picture, for instance, in its upper part shows a spectrum of about 32 colors (if you include black and white), which you could construct with 5 bits per pixel (2^5 = 32). So with 2 bits per subpixel you’d already have 64 colors – twice as many as in this graphic. 8-bit color gives you 2^18 = 262,144 times as many colors as that.
The number of colors in the lower part, which in SDR looks like a pretty smooth gradient, is about 1,300 – well below the 2,048 you could encode with 11 bits per pixel. In fact 4 bits per subpixel (12 bits per pixel) gives you 4,096 colors – more than three times the number here.
Back to the real numbers now: 10-bit color gives us 2^10 = 1024 shades per subpixel, so 2^30 = 1024^3 = over 1 billion colors. Unfortunately I cannot show you the difference between 8-bit and 10-bit color on the 8-bit panel you are most likely looking at right now, but you will be able to see it on an HDR TV with a 10-bit panel. It does not take highly trained eyes to see it.
Note that not all HDR TVs use 10-bit panels. 8-bit ones are still all too common. In PC monitors even more so. Here, you often find panels using 8-bit + ‘FRC’ or Frame Rate Control, a method of simulating 10-bit colors.
12-bit color, with its 2 extra bits, gives 4 times as many shades per subpixel, so 4,096 gradations and 2^36 = over 68 billion colors. It’s not clear when we’ll see 12-bit panels in living rooms, but 12-bit color is already used in digital cinema – and Dolby Vision handles 12-bit color, including on the Ultra HD Blu-ray consumer disc format. Japanese state broadcaster NHK, which has been transmitting programs in 8K since November 2018, advocates 12-bit color whereas the 8K Association sticks to 10-bit color. They also propose different frame rates. More about that later in this article.
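The arithmetic in the last few paragraphs can be summarized in a few lines (a quick sanity check, nothing more):

```python
# Shades per subpixel and total displayable colors for common bit depths.
for bits in (8, 10, 12):
    shades = 2 ** bits
    colors = shades ** 3                 # three subpixels: red, green, blue
    print(f"{bits:2d}-bit: {shades:>5,} shades/subpixel, {colors:>14,} colors")
#  8-bit:   256 shades/subpixel,     16,777,216 colors
# 10-bit: 1,024 shades/subpixel,  1,073,741,824 colors
# 12-bit: 4,096 shades/subpixel, 68,719,476,736 colors
```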
At the moment, Ultra HD Blu-ray with Dolby Vision is the only form of consumer media supporting 12-bit color. Ultra HD Blu-ray without Dolby Vision uses 10-bit color and the Rec.2020 color space, while regular 1080p HD Blu-ray Disc uses 8-bit color and the Rec.709 color space.
HDR + WCG + extended color depth
A common perception is that HDR inherently comes with Wide Color Gamut and at least 10-bit color depth. In practice the three commonly go together, and for good reason: when combined, these three technologies offer a great improvement in color. The greater number of bits, for instance, helps prevent the color banding that can otherwise easily occur when the dynamic range is extended.
In reality, the three can exist independently of each other, and each may offer some benefit on its own. 10-bit and 12-bit color were already tried on Blu-ray Disc in Japan: a few years ago, Panasonic launched the ‘Master Grade Video Coding’ (MGVC) Blu-ray format, which contains a sort of enhancement layer with 2 or 4 extra bits to attain 10- or 12-bit color. Only special Panasonic BD players could decode the added bits. A number of Studio Ghibli animation movies were released in this format, but not much after that.
Wide Color Gamut on its own would perhaps stand the best chance, but this occurs only sporadically.
HDR without WCG and without extended color depth is used in Brazil’s proprietary HDR format. Several parties united in the SBTVD (Sistema Brasileiro de Televisão Digital), including Globo, have agreed on an intermediate broadcast format that uses SL-HDR1 (Technicolor Advanced HDR) with the Rec.709 color space, 8-bit color and MPEG-H audio. It can be transmitted in AVC via ISDB-Tb or in HEVC via 5G. We’ll have to wait and see how that works out, but it does show lots of variations are possible.
Since the three technologies go together so well, they’re usually combined, and many probably assume that 10-bit video and WCG are part and parcel of HDR even though they’re not. Perhaps a different name than HDR, say ‘Ultra Color’, would have been better for this feature package. Then you’d have Ultra High Definition and Ultra Color.
Figure 6: Finer Color Resolution, Wider Color Gamut And Higher Dynamic Range Ensure Better Pixels
High Frame Rate
Perhaps confusingly, High Frame Rate means different things in different fields. In film, anything above 24fps is considered HFR. Examples are rare. The movies most people will know are The Hobbit trilogy (shot at 48fps), and Billy Lynn’s Long Halftime Walk and Gemini Man (both shot at 120fps and screened at either 60 or 120fps, but only in 3D – at regular 24fps in 2D). There really isn’t much else, because frame rates that deviate from the 24fps norm we’ve had for almost 100 years tend to be rather polarizing. Some like it, many strongly dislike it, likely because we’ve been conditioned for decades to regard pictures shot at this frame rate as epic.
It’s somewhat like film grain: you could argue it’s an artefact that should be avoided, but for most people it helps with the suspension of disbelief. It tells the brain to engage movie-watching mode. Higher realism is not helpful; on the contrary, it breaks the spell.
There are plenty of questions left unanswered about how these mechanisms work in the brain. Perhaps the norm will not last forever, but it may still take a long time for higher frame rates to become common and gain widespread acceptance.
In TV, meanwhile, 50 and 60Hz frame rates have been the norm since the medium’s introduction in the middle of the last century. In recent years we have come to the point where interlaced video (50i, 60i) is being phased out in favor of progressive video (50p, 60p). This is not considered HFR, though. HFR in TV means frame rates higher than this, mainly 100p and 120p. As you may have guessed, scripted content such as movies is not where you’ll see this used much. Instead, sports is the genre that will benefit most.
1080p50 broadcasts (in Europe) and 1080p60 broadcasts (in North America) are fairly common nowadays, especially for sports, and for 2160p (4K) broadcasts these frame rates are the norm. High Frame Rate has not yet been employed outside a few trial broadcasts. That’s probably because very few TV sets at this moment can handle HFR. LG’s 2020 OLED TV range can, and more TV sets announced this year are prepared for HFR because the new generation of games consoles set to arrive later this year will be able to output HFR video.
Figure 7: HFR Demo (with HLG HDR) by LG / EBU / 4EVER Project at IFA 2016
That brings us to the third application area: videogames. What constitutes HFR here is not clearly defined, but it’s fair to say anything above 60fps qualifies. In PC/gaming monitors something of a refresh-rate race is going on, with common numbers being 144Hz and 165Hz, up to 240Hz.
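One way to appreciate what these numbers mean for a game engine (or a display’s processing pipeline) is to look at the time budget each frame leaves; a quick calculation:

```python
# Per-frame time budget at common film, TV and monitor refresh rates.
for fps in (24, 50, 60, 100, 120, 144, 165, 240):
    print(f"{fps:3d} fps -> {1000 / fps:6.2f} ms per frame")
```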
For deeper exploration of all aspects concerning High Frame Rates check out this article: HFR – the one UHD technology you rarely hear about.
Next-Generation Audio
The one UHD pillar that is not about video is, of course, about audio. Next-Gen Audio or NGA is considered to be audio that goes beyond conventional multichannel formats like Dolby Digital and DTS-HD. More specifically, it’s about surround sound systems that are object-based and that may use height channels, making the sound truly three-dimensional. There are three formats competing in this area: Dolby Atmos, DTS:X and MPEG-H.
Dolby Atmos for the home works a bit differently than it does in cinemas, but aims to achieve the same thing: accurately placing sounds anywhere in the 3D space you watch movies in. Where the movie theatre system can handle up to 128 audio objects and address 64 different channels (speakers), the home system adds a spatially coded substream to a Dolby Digital Plus or Dolby TrueHD signal, or presents the data as metadata in the Dolby MAT (Metadata-enhanced Audio Transmission) 2.0 format. Streaming services use the lossy Dolby Digital Plus variant; Ultra HD Blu-ray discs and the Kaleidescape platform, which lets you download movies in Ultra HD, use the lossless Dolby TrueHD variant. You can read more here.
How it works in more detail goes beyond the scope of this article, but Dolby Atmos has found its way into homes mainly via video streaming platforms, movie downloads, Blu-ray and Ultra HD Blu-ray (the regular 1080p HD disc format also supports it), and to a lesser extent broadcast TV. BT TV in the UK is one of very few operators using Dolby Atmos for live TV broadcasts, mostly of football (soccer) matches. They’ve been doing so since 2017.
Another important source of Dolby Atmos audio is videogames. PS4 and Xbox One already support Atmos in games, and Xbox Series S/X will too. PS5, on the other hand, uses a Sony-proprietary object-based 3D audio system called the ‘Tempest Engine’.
More recently, Dolby has started using Atmos for music (on streaming services like Tidal and Amazon Music) but that’s outside the scope of this article.
On the rendering side, you of course don’t need to install in your living room the 64 speakers a movie theatre may have. You may have a number of regular, floor-standing speakers that make up a 7.1 or 9.1-channel configuration, with two or four height channels added, which would give you what is referred to as, for example, 9.1.4. There are more practical alternatives, including ‘upfiring’ speakers, which make ceiling-mounted speakers unnecessary. And since everything is parameterized – every sound is essentially constructed by means of the number-crunching mathematics known as psycho-acoustics – you can even get Atmos sound from TVs, soundbars and smart speakers.
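To make the object-based idea concrete, here is a deliberately simple sketch: an audio object is a mono signal plus position metadata, and the renderer derives a gain per speaker for whatever layout happens to be present. The speaker coordinates and the inverse-distance panning below are invented for illustration; this is not Dolby’s actual renderer (and the LFE channel is omitted).

```python
import math

# Rough speaker positions (x, y, z) in meters for a 5.1.4 layout, listener at the origin.
SPEAKERS_5_1_4 = {
    "L":   (-2.0,  2.0, 0.0), "R":   (2.0,  2.0, 0.0), "C": (0.0, 2.5, 0.0),
    "Ls":  (-2.0, -2.0, 0.0), "Rs":  (2.0, -2.0, 0.0),
    "TFL": (-1.5,  1.5, 2.0), "TFR": (1.5,  1.5, 2.0),
    "TRL": (-1.5, -1.5, 2.0), "TRR": (1.5, -1.5, 2.0),
}

def object_gains(obj_pos, speakers):
    """Toy renderer: weight each speaker by inverse distance to the object,
    then normalize so the total rendered power stays constant."""
    w = {name: 1.0 / (math.dist(obj_pos, pos) + 1e-6) for name, pos in speakers.items()}
    norm = math.sqrt(sum(g * g for g in w.values()))
    return {name: g / norm for name, g in w.items()}

# A helicopter hovering overhead mostly excites the four height channels:
gains = object_gains((0.0, 0.0, 2.0), SPEAKERS_5_1_4)
print({name: round(g, 2) for name, g in gains.items()})
```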
Figure 8: 5.1.4 DOLBY ATMOS Speaker Configuration
DTS:X is a similar system from Xperi, the company behind DTS. It’s supported by all current AV receivers and increasingly also by TVs and soundbars. Like Atmos, DTS:X is available on Blu-ray Discs as well as Ultra HD Blu-ray. However, you won’t find it on streaming services, and apart from a single trial by PBS it’s not used in broadcast.
Broadcast TV is likely the main application targeted by MPEG-H. The NGA format was developed by the Fraunhofer Institute, which has played a key role in many compression standards going all the way back to MP3. Broadcast is indeed the main area where it’s been adopted: it’s used by various UHD broadcasters in South Korea in combination with ATSC 3.0 video.

From: Dr. Paul Cain, Strategy Director at FlexEnable
Barry Young