PSA: 4K 144 Hz monitors use chroma subsampling for 144 Hz

in life •  7 years ago 

I'm seeing a lot of user reviews for the new 4K 144 Hz monitors, and it seems like everyone mentions that the image looks noticeably worse at 144 Hz. I keep expecting these posts to say "due to the 4:2:2 chroma subsampling", but instead they say "I'm not sure why" or something like that, both on here and on various forums. It seems monitor companies have done their usual good job of "forgetting" to inform people of this limitation; most of the early adopters are apparently unaware that these monitors are not actually capable of full 4K 144 Hz, even though the subsampling was mentioned in the AnandTech article a month or two ago. In any case, I want to make people aware of what chroma subsampling is, and that these first-gen 4K 144 Hz monitors use it.

Chroma Subsampling

Chroma subsampling is a method of reducing bandwidth by partially lowering the resolution of the image.

Imagine you have a 4K image; 3840 × 2160 pixels. Each pixel is composed of a RED value between 0–255, a GREEN value 0–255, and a BLUE value 0–255. You could imagine this 3840 × 2160 full color image as three separate monochrome images; a 3840 × 2160 grid of RED values, one of GREEN values, and another of BLUE values, which are overlaid on each other to make the final image.

Now, imagine that you reduce the resolution of the RED and GREEN images to 1920 × 1080, and when you reconstruct the full image you do it as if you were upscaling a 1080p image on a 4K screen (with nearest neighbor scaling); use each 1080p pixel value for a square of 4 pixels on the 4K screen. This upscaling is only done for the RED and GREEN values; the BLUE image is still at full resolution so BLUE has a unique value for every 4K pixel.

This is the basic principle behind chroma subsampling: reducing resolution on some of the pixel components, but not all of them.

The description above (reducing resolution by half both vertically and horizontally, on 2 of the 3 components) is analogous to 4:2:0 chroma subsampling. This reduces bandwidth by one half (one channel at full resolution plus 2 channels at one-quarter resolution gives the same number of samples as 1.5 out of 3 full-resolution channels).

Full resolution on all components is known as "4:4:4" or non-subsampled. Generally it's best to avoid calling it "4:4:4 subsampling", because it sounds like you're saying "uncompressed compression". 4:4:4 means no subsampling is being used.

4:2:2 subsampling is cutting the resolution in half in only one direction (i.e. 1920 × 2160; horizontal reduction, but full vertical resolution) on 2 out of the 3 components. This reduces the bandwidth by one third.
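To make the arithmetic concrete, here's a quick sample-count check (a pure-Python sketch, nothing monitor-specific) confirming the 1/2 and 1/3 bandwidth savings for 4:2:0 and 4:2:2:

```python
# Count the samples each scheme transmits for one 4K frame.
W, H = 3840, 2160
full = 3 * W * H                        # 4:4:4 / RGB: 3 full-resolution channels

# 4:2:0: 1 full-res channel + 2 channels at half resolution both ways
s420 = W * H + 2 * (W // 2) * (H // 2)

# 4:2:2: 1 full-res channel + 2 channels at half horizontal resolution only
s422 = W * H + 2 * (W // 2) * H

print(s420 / full)   # 0.5   -> half the bandwidth
print(s422 / full)   # ~0.667 -> one third of the bandwidth saved
```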

YCbCr

Above, I used the RGB components only as an example; "RGB subsampling" would look truly horrible and, to my knowledge, has never been used or even considered. In an RGB system, each of the 3 components dictates the brightness of one of the primary colors, so changing one of the RGB values affects both the hue and the brightness of the resulting color. Reusing an R, G, or B value on a neighboring pixel therefore makes a very noticeable change, and subsampling would degrade the image by quite a lot.

Instead, subsampling is only paired with YCbCr. YCbCr is a different method of specifying colors, used as an alternative to RGB for transmission. Of course, physically speaking, every display generates an image using discrete red green and blue elements, so eventually every image will need to be converted to the RGB format in order to be displayed, but for transmission, YCbCr has some useful properties.

What is YCbCr Anyway?

People get confused about what YCbCr actually is; misuse of terminology all over the place adds to the confusion, with people incorrectly calling it a "color space" or "color model" or things like that. Really, it is an offshoot of the RGB system; it is literally just RGB with a different axis system. Imagine a two-dimensional Cartesian (X–Y) coordinate system, then imagine drawing a new set of axes diagonally, at 45° to the standard set, and specifying coordinates using those axes instead. That is basically what YCbCr is, except in 3 dimensions instead of 2.

If you draw the R, G, and B axes as a standard 3D axis set, then draw 3 new axes at 45° angles to the originals, you have your Y, Cb, and Cr axes. It is just a different coordinate system, but it specifies the same thing as the RGB system.
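For the curious, that change of axes is just a fixed linear transform. Here's a sketch using one common set of coefficients (the full-range BT.601 matrix used by JPEG; BT.709 and BT.2020 use slightly different numbers, but the idea is identical):

```python
# RGB <-> YCbCr as a simple change of coordinates (full-range BT.601).
# 8-bit values assumed; chroma is centered on the neutral value 128.

def rgb_to_ycbcr(r, g, b):
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402    * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772    * (cb - 128)
    return r, g, b

# Greys (R=G=B) land right on the Y axis: both chroma values sit at 128.
print(rgb_to_ycbcr(100, 100, 100))    # ≈ (100.0, 128.0, 128.0)
# The round trip recovers the original RGB values (up to float rounding).
print(ycbcr_to_rgb(*rgb_to_ycbcr(30, 120, 200)))
```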

You can see how the YCbCr axes compare to the familiar "RGB cube" formed by the RGB axis set (RGB axes themselves not shown, unfortunately):

Why Even Use YCbCr?

YCbCr is useful because it specifies brightness and color separately. Notice in the image from the previous section, the Y axis (called the "luma" component) goes straight down the path of equal-RGB values (greys), from black to white. The Cb and Cr values (the "chroma" components) specify the position perpendicular to the Y axis, which is a plane of equal-brightness colors. This effectively makes 1 component for brightness, and 2 components for specifying the hue/color relative to that brightness, whereas in RGB the brightness and hue are both intertwined in the values of all 3 color channels.

This means you can do cool things like remove the chroma components entirely and be left with a greyscale version of the image; this is how color television was first rolled out. Any black-and-white televisions could still receive the exact same broadcast; they simply ignored the color components of the signal. (EDIT: Technically, old analog broadcasting used related analog encodings such as YUV/YIQ, and analog component video is called "YPbPr", while "YCbCr" refers to the digital equivalent, but it's all the same concept.)

Chroma subsampling also works much better in YCbCr, because the human eye is much less sensitive to changes in color than to changes in brightness. You can subsample the chroma components without touching the luma component, reducing the color resolution without affecting the brightness of each pixel, which doesn't look much different to our eyes. As a result, YCbCr chroma subsampling affects the image (perceptually) much less than subsampling RGB components directly would. When converted back to RGB, every pixel will of course still have a unique RGB value, but it won't be quite the same as it would have been had the chroma subsampling not been applied.
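Here's a toy sketch of what subsampling a chroma plane actually does to the samples (plain Python on nested lists; real implementations typically filter or average rather than just picking one sample per block, so treat this as illustrative only):

```python
# 4:2:0-style subsampling applied to a chroma plane. The luma (Y) plane
# would be left untouched.

def subsample_420(plane):
    """Keep one sample per 2x2 block, then replicate it back over the
    block (nearest-neighbor), as in the upscaling description above."""
    h, w = len(plane), len(plane[0])
    return [[plane[(y // 2) * 2][(x // 2) * 2] for x in range(w)]
            for y in range(h)]

cb = [[10, 20, 30, 40],
      [11, 21, 31, 41],
      [12, 22, 32, 42],
      [13, 23, 33, 43]]
print(subsample_420(cb))
# Each 2x2 block now shares its top-left value:
# [[10, 10, 30, 30], [10, 10, 30, 30], [12, 12, 32, 32], [12, 12, 32, 32]]
```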

Terminology Notes

Since RGB-format images don't have luma or chroma components, you can't have "chroma subsampling" on an RGB image; there are no chroma values for you to subsample in the first place. Terms like "RGB 4:4:4" are redundant/nonsensical. RGB format is always full resolution in all channels, which is equivalent to or better than YCbCr 4:4:4. You can just call it RGB; RGB is always "4:4:4".

Also, chroma subsampling is not a form of compression, because it doesn't involve any decompression on the receiving side to recover the data. It is simply gone. 4:2:2 removes half the color information from the image, 4:2:0 removes 3/4 of it, and you don't get any of it back. So please don't refer to it as "4:2:2 compression" or "compressed using chroma subsampling" or things like that; it's no more a form of compression than reducing resolution from 4K to 1080p is. That isn't compression, that's just reducing the resolution. By the same token, 4:2:2 isn't compression, it's just subsampling (reducing the resolution on 2 of the 3 components).

Effects of Chroma Subsampling

Chroma subsampling reduces image quality. Since chroma subsampling is, in effect, a partial reduction in resolution, its effects are in line with what you might expect from that. Most notably, fine text can be affected significantly, so chroma subsampling is generally considered unacceptable for desktop use. Hence, it is practically never used for computers; many monitors don't even support it. The reduction in quality tends to be much less noticeable in natural images (i.e. excluding test images specifically designed to exploit subsampling). 4:2:0 is the standard for pretty much all consumer movie and TV content, since it significantly reduces the bandwidth needed for both transmission and disc storage.

Interface Limitations - Why No Support for 4K 144 Hz RGB?

Chroma subsampling has started seeing implementation on computers in situations where bandwidth is insufficient for full resolution. The first notable example of this was NVIDIA adding 4K 60 Hz support to its HDMI 1.4 graphics cards (Kepler and Maxwell 1.0). Normally, HDMI 1.4 is only capable of around 30 Hz at 4K, but with 4:2:0 subsampling (which reduces bandwidth by half), double the refresh rate can be achieved within the same bandwidth constraints, at the cost of image quality.

Now, we're seeing it in these 4K 144 Hz monitors. With full RGB or YCbCr 4:4:4 color, DisplayPort 1.4 provides enough bandwidth for up to 120 Hz at 4K (3840 × 2160) with 8 bpc color depth, or up to around 100 Hz at 4K with 10 bpc color depth (exact limits depend on the timing format, which depends on the specific hardware; in these particular monitors, they apparently cap at 98 Hz at 4K 10 bpc). These monitors claim to support 4K 144 Hz with 10 bpc color depth, so some form of bandwidth reduction must be used, which in this case is YCbCr 4:2:2.
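You can sanity-check those limits with some back-of-the-envelope math. The sketch below counts active pixels only against the DP 1.4 (HBR3) payload rate of 25.92 Gbit/s; real links also spend bandwidth on blanking intervals, so the true limits are somewhat lower (which is why 4K 120 Hz 8 bpc sits right at the edge, and the exact cap depends on the timing format):

```python
# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, with 8b/10b encoding leaving
# 80% of the raw rate for pixel data.
DP14_PAYLOAD = 4 * 8.1e9 * 8 / 10          # 25.92 Gbit/s

def bits_per_second(w, h, hz, bpc, components=3):
    """Bandwidth for the active pixels only (no blanking overhead)."""
    return w * h * hz * bpc * components

for hz, bpc in [(144, 8), (120, 8), (144, 10), (98, 10)]:
    need = bits_per_second(3840, 2160, hz, bpc)
    verdict = "fits" if need < DP14_PAYLOAD else "exceeds"
    print(f"4K {hz} Hz {bpc} bpc: {need / 1e9:6.2f} Gbit/s "
          f"({verdict} 25.92, before blanking)")
```

Even before blanking overhead, 4K 144 Hz exceeds the link's payload capacity at both 8 bpc and 10 bpc, which is why some form of bandwidth reduction is unavoidable on a single DP 1.4 cable.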

Before anyone mentions HDMI 2.1: it's not possible to implement HDMI 2.1 yet. Only the master specification has been released. I know a lot of people seem to think that once the specification is released, we'll start seeing products any day now, but that's not the case at all. The specification is the document that tells you how to build an HDMI 2.1 device; its release is when engineers can start designing silicon capable of HDMI 2.1, let alone displays that use that silicon. The DisplayPort 1.4 standard was released in early 2016, over 2 years ago, and we're only just now starting to see it implemented in monitors (I believe it has been implemented on only 1 monitor prior to this, the Dell UP3218K). Also, there are no graphics cards with HDMI 2.1 yet, so it wouldn't help much on a monitor right now anyway.

The HDMI 2.1 compliance test specification isn't even finished being written yet, so even if you had HDMI 2.1 silicon ready somehow, there's currently no way to have it certified, as the testing procedures haven't been released by the HDMI Forum yet. HDMI 2.1 is still under development from a consumer perspective. The release of the main specification is only a release for engineers.

DSC Compression - The Missed Opportunity

The creators of these monitors could have opted to use Display Stream Compression (DSC), which, unlike subsampling, is a true form of compression; it reduces bandwidth, and the image is reconstructed on the receiving side. DSC is part of the DisplayPort 1.4 standard, but Acer/ASUS chose not to implement it, likely for hardware availability reasons; presumably no one has produced display controllers that support DSC, and Acer/ASUS wanted to rush the product out rather than implement 4K 144 Hz properly. Note that DP 1.4 supports up to 4K 120 Hz uncompressed and non-subsampled; they could have simply released it as a 4K 120 Hz monitor with no tricks, but that sweet 144 Hz number was calling to them, I guess. They probably feel marketing a "120 Hz" monitor would seem outdated, and don't want to be outdone by the competition. Such is life in this industry... Still, these monitors can be run at 120 Hz non-subsampled if you want; no capability has been lost by adding subsampling. It's just that people are not getting what they expected, due to the unfortunate lack of transparency about the limitations of the product.

EDIT: I forgot that these are G-Sync monitors. This is most likely why the monitor manufacturers did not support proper 4K 144 Hz using DSC, dual cables, or some other solution. When you make a G-Sync display, you have no choice but to use the NVIDIA G-Sync module as the main display controller instead of whatever else is available on the market. This means you are forced to support only the features that the G-Sync module has. There are several versions of the G-Sync module (these monitors use a new one, with DisplayPort 1.4 support), but G-Sync has historically always been way behind on interface support and very barebones in feature support, so come to think of it I highly doubt that the new G-Sync module supports DSC, or PbP/MST (for dual cable solutions).

If this is the case, it's more the fault of NVIDIA for providing an inadequate controller to the market, than the monitor manufacturers for "choosing" to use chroma subsampling (it would be the only way of achieving 144 Hz in that case). However it is still on them for not simply releasing it as a 4K 120 Hz display, or being clear about the chroma subsampling used for 144 Hz. Anyway, we'll have to wait and see what they do when they release FreeSync or No-sync 4K 144 Hz monitors, where NVIDIA's limitations don't apply.

DSC Concerns

Before anyone says "meh we don't want DSC anyway", I'll answer the two reservations I anticipate people will have.

DSC is a lossy form of compression. While it is true that DSC is not mathematically lossless, it is much, much better than chroma subsampling, since it recovers almost all of the original image. Considering that in natural images most people don't even notice 4:2:2 subsampling, the image quality reduction with DSC is not going to be noticeable. The only question is how it performs with text, which remains to be seen since no one has implemented it. Presumably it will handle text a lot better than subsampling does.

Latency. "Compression will add tons of lag!" According to VESA, DSC adds no more than 1 raster scan line of latency. Displays are refreshed one line at a time, rather than all at once; on a 4K display, the monitor refreshes 2160 lines of pixels per refresh. At 144 Hz, each full refresh is performed over the course of 6.944 ms, so each individual line takes around 3.2 microseconds (0.0032 ms); actually slightly less than that due to blanking intervals, but that's a whole different topic :P https://www.displayport.org/faq/#tab-display-stream-compression-dsc
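The arithmetic, for anyone who wants to check it (this ignores blanking, so the real line time is slightly shorter, as noted):

```python
# One scan line's worth of latency at 4K 144 Hz, active lines only.
lines = 2160
refresh_hz = 144
frame_time_ms = 1000 / refresh_hz              # ~6.944 ms per full refresh
line_time_us = frame_time_ms * 1000 / lines    # ~3.2 us per scan line
print(f"{frame_time_ms:.3f} ms per frame, {line_time_us:.2f} us per line")
```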

How does VESA’s DSC Standard compare to other image compression standards?

Compared to other image compression standards such as JPEG or AVC, etc., DSC achieves visually lossless compression quality at a low compression ratio by using a much simpler codec (coder/decoder) circuit. The typical compression ratio of DSC range from 1:1 to about 3:1 which offers significant benefit in interface data rate reduction. DSC is designed specifically to compress any content type at low compression with excellent results. The simple decoder (typically less than 100k gates) takes very little chip area, which minimizes implementation cost and device power use, and adds no more than one raster scan line (less than 8 usec in a 4K @ 60Hz system) to the display’s throughput latency, an unnoticeable delay for interactive applications.

Conclusion

I know the internet loves to jump on any chance to rant about corporate deceptions, so I suppose now it's time to sit back and watch the philosophical discussions go... Is converting to YCbCr, reducing the resolution to 1920 × 2160 in 2 out of 3 components, and converting back to RGB really still considered 4K?

Then again, a lot of people are still stuck all the way back at considering anything other than 4096 × 2160 to be "4K" at all :P (hint: the whole "true 4K is 4096 × 2160" was just made up by uninformed consumer journalists when they were scrambling to write the first "4K AND UHD EXPLAINED" article back when 4K TVs were first coming out; in the cinema industry where the term originated from, "4K" is and always has been a generic term referring to any format ≈4000 pixels wide; somehow people have latched onto the "true 4K" notion and defend it like religion though... But anyway, getting off topic :3)

These 4K 144 Hz monitors use YCbCr 4:2:2 chroma subsampling for 4K 144 Hz which is a huge disappointment for me personally. The best you can get with RGB or YCbCr 4:4:4 is 4K 120 Hz with 8 bpc color depth, or 4K 98 Hz with 10 bpc color depth (HDR). Honestly that's fine to me, I'd run it at 120 Hz 8 bpc personally, I'm just disappointed they used subsampling over DSC to get the 144 Hz. Just a friendly PSA though, so hopefully fewer people will be caught off guard by this. I'd wait for second-generation 4K 144 Hz monitors personally.
