Monday, January 9, 2017

Is 4K HDR ready for prime time?

I recently upgraded all of my gear—a brand-new TV with 4K HDR support, a new 4K-compatible receiver, shiny new braided HDMI cables. I thought I had done everything by the book—I made sure everything I bought supported the latest standards, was well reviewed, and seemed to work great. When I finally got a PS4 Pro, it was a chance for all that new gear to shine—4K in all its glory!

Unfortunately, things got off to a rocky start when my PS4 Pro immediately told me this.


Only 2K supported? My old PS4 could do 2K HDR already :(

Also, the menus offered two different 2160p modes—YUV420 and RGB. The RGB mode was "unsupported." Was this important? Is RGB better? This seemed to be related to my HDR issues, but Google gave me very mixed messages.
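For what it's worth, the gap between those two modes is easy to put into numbers. RGB (like YUV 4:4:4) carries three full-resolution samples for every pixel, while YUV420 carries full-resolution brightness but shares one pair of color samples across each 2x2 block of pixels. Here's a quick back-of-the-envelope sketch; the ratios are just the textbook chroma subsampling definitions, not anything I measured on the PS4:

```python
# Samples per pixel for the two 2160p color formats the PS4 Pro lists.
# These are the standard chroma-subsampling ratios, nothing Sony-specific.

def samples_per_pixel(mode):
    if mode == "RGB":                # same story for YUV 4:4:4
        return 3.0                   # one R, G, and B sample per pixel
    if mode == "YUV420":
        # one luma sample per pixel, plus two chroma samples
        # shared across every 2x2 block of pixels
        return 1.0 + 2.0 / 4.0
    raise ValueError("unknown mode: " + mode)

for mode in ("RGB", "YUV420"):
    print(mode, "->", samples_per_pixel(mode), "samples per pixel")

# RGB    -> 3.0 samples per pixel
# YUV420 -> 1.5 samples per pixel (half the data at the same bit depth)
```

In other words, at the same bit depth, YUV420 only has to push half as much data down the cable as RGB. That turns out to matter a lot in a minute.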



In general, I found a lot of misinformation and uncertainty on the web about screen settings, HDMI compatibility, HDCP, 4K, HDR, RGB versus YUV and other topics. I was baffled that my setup, with a new high-end Sony TV connected to a Sony peripheral, was missing its main selling point. To make matters worse, while playing games, I began to experience picture dropouts and bursts of static that could only be temporarily resolved by switching inputs on the receiver. I had never been so flummoxed by commercial AV gear.


So I rolled up my sleeves and got to debugging. The first screen contained a clue—"Your TV might support a higher quality color format and 4K HDR if you change its settings." I can do that! I love settings. In fact, I thought I knew exactly what to do for this one. For reasons that will become clear later in this post, 4K TVs generally ship by default with their HDMI ports set to "standard" mode, and have an option to switch to "enhanced" mode if you have 4K hardware attached. If you search for this problem online, you'll find many posts telling you that you need to enable enhanced HDMI on your TV. Depending on your TV, the setting looks a lot like this:


Or one of these:




Unfortunately, I already had this mode enabled! I did this right away after getting my TV—even though I lacked any 4K hardware initially—because, why not? It seemed to work fine, and who doesn't want to support enhanced picture quality, right?

Further study revealed that this setting exists not only on TVs but also on receivers. Unfortunately, my Pioneer receiver's menus had no HDMI customization settings at all. Eventually I found that the Pioneer's enhanced HDMI mode is buried in a hidden options menu, accessed by rebooting the receiver with certain buttons held down:

So, that was one mystery solved. Once I changed this setting, 2160p - RGB became available. Unfortunately, while this allowed 4K HDR content to work on my TV, it also seemed to increase the failure rate—I was getting frequent picture dropouts. Many websites suggested switching to HDCP 1.4 or disabling HDCP altogether, and that definitely helped, but I would still occasionally lose the picture.

The picture dropouts—and the very existence of an "enhanced HDMI" mode—come down to the limitations of HDMI cables. As it turns out, very few HDMI cables are capable of carrying 4K content at full quality! You can't tell whether a cable is 4K compliant just by looking at it; the closest thing we get is "Premium HDMI" branding on some of the packaging, which looks like this:


Don't let anyone tell you all HDMI cables are the same—different cables are rated for different transmission speeds. For "enhanced" picture quality, you need a cable rated for 18 gigabits per second, and most HDMI cables in existence cannot reliably transmit at that speed. The reason most devices disable "enhanced HDMI" by default is that "standard HDMI" mode stays within the older 10.2 Gb/s limit, which almost any cable can handle. Most consumers will attach the HDMI cables they already own and expect them to work, since the cables all look and feel identical. To their credit, Sony included a Premium HDMI cable in the box with the PS4 Pro, but the cable between my receiver and the TV was only rated for 10.2 Gb/s. (The cable in question was high quality, thick and sturdy, less than a year old, and marketed as "guaranteed 4K compatible," but its tech specs only guaranteed a 10.2 Gb/s transmission rate.)
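To put actual numbers on that 18 gigabit figure, here's the arithmetic. HDMI carries pixel data on three TMDS channels, every 8 bits of data is expanded to 10 bits on the wire, and the standard 4K60 timing (per CTA-861) runs a 594 MHz pixel clock once the blanking intervals are included. This is just a sanity-check sketch built from published spec constants, not anything I measured off my own cables:

```python
# Rough bandwidth requirement for 2160p RGB at 60 Hz over HDMI.
# Constants are the published CTA-861 4K60 timing and HDMI's TMDS
# encoding overhead; nothing here was measured on my own setup.

TOTAL_WIDTH = 4400           # 3840 active pixels + horizontal blanking
TOTAL_HEIGHT = 2250          # 2160 active lines + vertical blanking
REFRESH_HZ = 60
TMDS_CHANNELS = 3            # HDMI sends pixel data over three channels
WIRE_BITS_PER_CHANNEL = 10   # each 8-bit value becomes 10 bits on the wire

pixel_clock_hz = TOTAL_WIDTH * TOTAL_HEIGHT * REFRESH_HZ   # 594 MHz
wire_rate_gbps = pixel_clock_hz * TMDS_CHANNELS * WIRE_BITS_PER_CHANNEL / 1e9

print("pixel clock: %.0f MHz" % (pixel_clock_hz / 1e6))    # 594 MHz
print("wire rate:   %.2f Gb/s" % wire_rate_gbps)           # 17.82 Gb/s
```

That 17.82 Gb/s is where the 18 Gb/s cable requirement comes from—and it's nearly double what a cable certified for the older 10.2 Gb/s speed was ever asked to carry.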

So, what does standard HDMI do in order to save so much bandwidth? Wasn't HDMI supposed to be a pure, uncompressed digital signal? I'll cover that in the next post.

1 comment:

  1. Hi John, I have used the PS4 pro cable but still get the blackouts. Is it good for rgb or just yuv. Thanks
