Monday, January 23, 2017

PS4 Pro: "2160p - YUV420" or "2160p - RGB"?

Setting up a PS4 Pro to take full advantage of 4K @ 60fps can be difficult (see my previous post). There are multiple options that don't look all that different on casual inspection. What's the difference, and what should you choose?

There are three options in the PS4 Pro menu that can provide 4K support:


• Automatic
• 2160p - YUV420
• 2160p - RGB

If everything's set up correctly, the best choice is actually Automatic. Assuming you have compatible equipment, you'll know everything is set up right when Automatic gives you this:


The big question, though, is what does that really mean? What's the difference between RGB and YUV420? (And what in the world is YUV422??)

First things first, let's start with RGB. I won't spend much time on this since it's pretty simple and easy to find information on the web. TVs and monitors use separate red, green, and blue components to display colors in an image; an RGB signal simply sends that red, green, and blue color information directly.

Original:

RGB:




Historically, though, televisions have rarely used RGB. Instead, the signal is split up into luma and chroma—this is called YUV (or pedantically speaking, Y′CbCr). The luma component is the Y component, and represents the pixel's brightness; it looks like a black and white version of the image. Black and white televisions supported only luma.

Luma:

When television was upgraded to color, broadcasters didn't want to break existing black-and-white sets, so they kept broadcasting the luma information just as before. To add color, they added a chroma component, the UV information, which basically contains the color with the luma/brightness information removed. Chroma data is very hard to parse visually; you can compare it to the original above and see that the colors are basically preserved, but with no bright or dark spots. Our visual system just sees the chroma image as a blur, mostly fuzzy and indistinct.

Chroma:

Here's another image split into luma and chroma.

Original:

Chroma:

Luma:


Compared to RGB, a pure YUV signal could theoretically present the exact same quality image with the same amount of bandwidth. In a normal SDR picture, RGB takes 8 bits per channel, which means 24 bits for a pixel. Pure YUV data takes 8 bits for the luma and 16 bits for the chroma on a per-pixel basis, so that's still 24 bits per pixel. This signal would be called YUV444.
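To make the luma/chroma split concrete, here's a minimal Python sketch for a single 8-bit pixel. It uses the classic BT.601 full-range conversion; HD and UHD video actually use slightly different coefficients, but the idea (and the bit count) is the same:

```python
# Minimal sketch: splitting one 8-bit RGB pixel into luma (Y') and chroma (Cb, Cr).
# Coefficients are the classic BT.601 full-range ones; HD/UHD standards differ slightly.

def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b             # brightness only
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.500 * b  # "blueness" with luma removed
    cr = 128 + 0.500 * r - 0.418688 * g - 0.081312 * b  # "redness" with luma removed
    return round(y), round(cb), round(cr)

print(rgb_to_ycbcr(200, 0, 0))      # a strong red  -> (60, 94, 228)
print(rgb_to_ycbcr(200, 200, 200))  # a light gray  -> (200, 128, 128): neutral chroma

# Either way, a full-resolution pixel costs the same number of bits:
rgb_bits    = 8 + 8 + 8    # R + G + B    = 24 bits per pixel
yuv444_bits = 8 + 8 + 8    # Y' + Cb + Cr = 24 bits per pixel
```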

However, the PS4 does not actually use YUV444—it's more complicated than RGB and technically has no advantage (or disadvantage). In cases where there is sufficient bandwidth, the PS4 will simply switch to RGB mode.

YUV420 mode is a smart compromise that takes advantage of the limitations of our visual system. Remember how the luma component of an image is very distinct, but it's difficult to see details clearly in the chroma component? Well, it turns out that you can shrink the chroma channels down to 1/4 of their original size, and it's extremely difficult to tell the difference. In other words, for a 4K 2160p signal, the luma component is sent in full 4K quality, but the chroma component is sent at 2K 1080p resolution. The luma still takes 8 bits per pixel; the 16 bits of chroma information are sent at 25% size (a 50% reduction on each axis), which averages out to 4 bits per pixel. So YUV420 needs only 12 bits per pixel, meaning it needs half the bandwidth while delivering an image that most people find indistinguishable from pure 4K. Given the choice between RGB and YUV420, RGB is the superior "no compromise" choice, but most viewers will struggle to see a difference in typical usage.
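Here's a rough numpy sketch of that idea, using a simple 2×2 average for the chroma planes (real video hardware uses specific filters and sample positions, but the bit math comes out the same):

```python
import numpy as np

# Rough sketch of 4:2:0 chroma subsampling: keep the luma plane at full
# resolution and average each 2x2 block of the chroma planes into one sample.
H, W = 2160, 3840                                        # one 4K frame
y  = np.random.randint(0, 256, (H, W), dtype=np.uint8)   # full-resolution luma plane
cb = np.random.randint(0, 256, (H, W), dtype=np.uint8)   # full-resolution chroma planes
cr = np.random.randint(0, 256, (H, W), dtype=np.uint8)

def subsample_420(plane):
    """Average each 2x2 block, halving the resolution on both axes."""
    return plane.reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3))

cb420, cr420 = subsample_420(cb), subsample_420(cr)      # now 1080 x 1920 each

bits_444 = 8 * (y.size + cb.size + cr.size)              # full-resolution chroma
bits_420 = 8 * (y.size + cb420.size + cr420.size)        # subsampled chroma

print(bits_444 / y.size)   # 24.0 bits per pixel
print(bits_420 / y.size)   # 12.0 bits per pixel -> half the bandwidth
```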

— more to come —


Monday, January 9, 2017

Is 4K HDR ready for prime time?

I recently upgraded all of my gear—brand new TV with 4K HDR support, new 4K-compatible receiver, shiny new braided HDMI cables. I thought I had done everything by the book—I made sure everything I bought supported the latest standards, was well reviewed, and seemed to work great. When I finally got a PS4 Pro, it was a chance for all that new gear to shine—4K in all its glory!

Unfortunately, things got off to a rocky start when my PS4 Pro immediately told me this:


Only 2K supported? My old PS4 could do 2K HDR already :(

Also, the menus offered two different 2160p modes—YUV420 and RGB. The RGB mode was "unsupported." Was this important? Is RGB better? This seemed to be related to my HDR issues, but Google gave me very mixed messages.



In general, I found a lot of misinformation and uncertainty on the web about screen settings, HDMI compatibility, HDCP, 4K, HDR, RGB versus YUV and other topics. I was baffled that my setup, with a new high-end Sony TV connected to a Sony peripheral, was missing its main selling point. To make matters worse, while playing games, I began to experience picture dropouts and bursts of static that could only be temporarily resolved by switching inputs on the receiver. I had never been so flummoxed by commercial AV gear.


So I rolled up my sleeves and got to debugging. The first screen contained a clue—"Your TV might support a higher quality color format and 4K HDR if you change its settings." I can do that! I love settings. In fact, I thought I knew exactly what to do for this one. For reasons that will become clear later in this post, 4K TVs generally ship by default with their HDMI ports set to "standard" mode, and have an option to switch to "enhanced" mode if you have 4K hardware attached. If you search for this problem online, you'll find many posts telling you that you need to enable enhanced HDMI on your TV. Depending on your TV, the setting looks a lot like this:


Or one of these:




Unfortunately, I already had this mode enabled! I did this right away after getting my TV—even though I lacked any 4K hardware initially—because, why not? It seemed to work fine, and who doesn't want to support enhanced picture quality, right?

Further study revealed that this setting exists not only on TVs but also on receivers. Unfortunately, my Pioneer receiver's menus had no HDMI customization settings at all. Eventually I found that the Pioneer's enhanced HDMI mode requires a hidden options menu, accessed by rebooting the receiver with certain buttons held down:

So, that was one mystery solved. Once this setting was changed, 2160p - RGB became available. Unfortunately, while this allowed 4K HDR content to work on my TV, it also seemed to increase the failure rate—I was getting frequent picture drop-outs. Many websites suggested switching to HDCP 1.4 or disabling HDCP altogether, and this definitely reduced the rate of failure, but I would still occasionally get picture drop-outs.

The picture drop-outs—and the existence of an "enhanced HDMI" mode at all—were actually caused by limitations in my HDMI cables. As it turns out, very few HDMI cables are capable of carrying 4K content at full quality! You can't tell whether a cable is 4K compliant just by looking at it; the closest thing we get is "Premium HDMI" branding on some of the packaging, which looks like this:


Don't let anyone tell you all HDMI cables are the same—different cables are rated for different transmission speeds. In this case, for "enhanced" picture quality, you need a cable rated for 18 gigabits per second. Most HDMI cables in existence cannot reliably transmit at these speeds. The reason most devices disable "enhanced HDMI" by default is that "standard HDMI" mode uses only about 10 Gbps, which most cables can handle. Most consumers will attach their existing HDMI cables to their devices and expect them to work, since the cables look and feel identical. To their credit, Sony included a Premium HDMI cable in the box with the PS4 Pro, but the cable between my receiver and the TV was only rated for 10 Gbps. (The cable in question was high quality, thick and sturdy, less than a year old, and was marketed as "guaranteed 4K compatible," but in the tech specs it was only guaranteed to support a 10 Gbps transmission rate.)
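To put rough numbers on why the cable matters, here's a back-of-the-envelope sketch. It assumes the standard 4K60 video timing (4400 × 2250 total pixels including blanking, i.e. a 594 MHz pixel clock) and HDMI's TMDS encoding, which puts 10 bits on the wire for every 8 bits of video data:

```python
# Back-of-the-envelope sketch: why "enhanced" 4K @ 60 Hz needs an 18 Gbps cable.

pixels_per_second = 4400 * 2250 * 60   # 594,000,000 (the 594 MHz pixel clock)
bits_per_pixel    = 8 * 3              # full 8-bit RGB
tmds_overhead     = 10 / 8             # TMDS sends 10 wire bits per 8 data bits

wire_rate = pixels_per_second * bits_per_pixel * tmds_overhead
print(f"{wire_rate / 1e9:.2f} Gbps")   # ~17.82 Gbps: needs an 18 Gbps cable,
                                       # far beyond a 10.2 Gbps "standard" link
```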

So, what does standard HDMI do in order to save so much bandwidth? Wasn't HDMI supposed to be a pure, uncompressed digital signal? I'll cover that in the next post.