Monday, January 23, 2017

PS4 Pro: "2160p - YUV420" or "2160p - RGB"?

Setting up a PS4 Pro to take full advantage of 4K @ 60fps can be difficult (see my previous post). There are multiple options that don't look all that different on casual inspection. What's the difference, and what should you choose?

There are three options in the PS4 Pro menu that can provide 4K support:


• Automatic
• 2160p - YUV420
• 2160p - RGB

If everything's set up correctly, the best choice is actually Automatic. Assuming you have compatible equipment, you'll know everything is set up right when Automatic gives you this:


The big question, though, is what does that really mean? What's the difference between RGB and YUV420? (And what in the world is YUV422??)

First things first, let's start with RGB. I won't spend much time on this since it's pretty simple and easy to find information on the web. TVs and monitors use separate red, green, and blue components to display the colors in an image; RGB simply sends that red, green, and blue color information directly.

Original:

RGB:




Historically, though, televisions have rarely used RGB. Instead, the signal is split up into luma and chroma—this is called YUV (or pedantically speaking, Y′CbCr). The luma component is the Y component, and represents the pixel's brightness; it looks like a black and white version of the image. Black and white televisions supported only luma.

Luma:

When color broadcasts were introduced, broadcasters didn't want to break existing black-and-white sets, so they kept transmitting the luma information just as before. To add color, they added a chroma component, the UV information, which is essentially the color with the luma/brightness information removed. Chroma data is very hard to parse visually; compare it to the original above and you'll see that the colors are basically preserved, but with no bright or dark spots. Our visual system sees the chroma image as a blur, mostly fuzzy and indistinct.

Chroma:

Here's another image split into luma and chroma.

Original:

Chroma:

Luma:


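For the curious, here's roughly what that split looks like in code: a minimal sketch that converts a single 8-bit RGB pixel to Y′CbCr using the common full-range BT.601 coefficients. (Real video signals use slightly different constants and ranges, so treat this as an illustration, not a spec.)

#include <cstdio>

// Convert one 8-bit RGB pixel into luma (Y) plus two chroma values (Cb, Cr).
// Full-range BT.601 coefficients; broadcast video uses slightly different
// constants and "studio swing" ranges.
void rgbToYCbCr(int r, int g, int b, int &y, int &cb, int &cr)
{
    y  = static_cast<int>( 0.299    * r + 0.587    * g + 0.114    * b);       // brightness
    cb = static_cast<int>(-0.168736 * r - 0.331264 * g + 0.5      * b) + 128; // "blueness"
    cr = static_cast<int>( 0.5      * r - 0.418688 * g - 0.081312 * b) + 128; // "redness"
}

int main()
{
    int y, cb, cr;
    rgbToYCbCr(255, 128, 0, y, cb, cr);  // a bright orange pixel
    std::printf("Y=%d Cb=%d Cr=%d\n", y, cb, cr);
    return 0;
}
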
Compared to RGB, a pure YUV signal could theoretically present the exact same quality image with the same amount of bandwidth. In a normal SDR picture, RGB takes 8 bits per channel, which means 24 bits for a pixel. Pure YUV data takes 8 bits for the luma and 16 bits for the chroma on a per-pixel basis, so that's still 24 bits per pixel. This signal would be called YUV444.

However, the PS4 does not actually use YUV444—it's more complicated than RGB and technically has no advantage (or disadvantage). In cases where there is sufficient bandwidth, the PS4 will simply switch to RGB mode.

YUV420 mode is a smart compromise that takes advantage of the limitations of our visual cortex. Remember how the luma component of an image is very distinct, but it's difficult to see details clearly in the chroma component? Well, it turns out that you can shrink down the chroma channel to 1/4 of its original size, and it's extremely difficult to tell the difference. In other words, for a 4K 2160p signal, the luma component is sent in full 4K quality, but the chroma component is sent at 2K 1080p resolution. In this case, on a per-pixel basis, the luma still takes 8 bits per pixel. We send 16 bits of chroma information at 25% size (50% reduction on each axis), so that averages out to 4 bits per pixel. So YUV420 needs only 12 bits per pixel, meaning it only needs half the bandwidth and delivers an image that most people find indistinguishable from pure 4K. Given the choice between RGB and YUV420, RGB is the superior "no compromise" choice, but most viewers will struggle to see a difference in typical usage.
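
To put numbers on that, here's a quick back-of-the-envelope sketch of the raw pixel bandwidth at 4K and 60fps. (This counts pixel data only; a real HDMI link also carries blanking intervals and encoding overhead, so the actual link rates are higher, but the 2:1 ratio is the point.)

#include <cstdio>

int main()
{
    const double width = 3840, height = 2160, fps = 60;

    const double rgbBitsPerPixel    = 24;  // 8 bits each for R, G, B (same as YUV444)
    const double yuv420BitsPerPixel = 12;  // 8 bits luma + 4 bits (average) chroma

    // Raw pixel data only; real HDMI links add blanking and encoding overhead.
    const double rgbGbps    = width * height * fps * rgbBitsPerPixel    / 1e9;
    const double yuv420Gbps = width * height * fps * yuv420BitsPerPixel / 1e9;

    std::printf("RGB / YUV444: %.1f Gb/s of pixel data\n", rgbGbps);    // ~11.9 Gb/s
    std::printf("YUV420:       %.1f Gb/s of pixel data\n", yuv420Gbps); // ~6.0 Gb/s
    return 0;
}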

— more to come —


Monday, January 9, 2017

Is 4K HDR ready for prime time?

I recently upgraded all of my gear—brand new TV with 4K HDR support, new 4K-compatible receiver, shiny new braided HDMI cables. I thought I had done everything by the book—I made sure everything I bought supported the latest standards, was well reviewed, and seemed to work great. When I finally got a PS4 Pro, it was a chance for all that new gear to shine—4K in all its glory!

Unfortunately, things got off to a rocky start when my PS4 Pro immediately told me this.


Only 2K supported? My old PS4 could do 2K HDR already :(

Also, the menus offered two different 2160p modes—YUV420 and RGB. The RGB mode was "unsupported." Was this important? Is RGB better? This seemed to be related to my HDR issues, but Google gave me very mixed messages.



In general, I found a lot of misinformation and uncertainty on the web about screen settings, HDMI compatibility, HDCP, 4K, HDR, RGB versus YUV and other topics. I was baffled that my setup, with a new high-end Sony TV connected to a Sony peripheral, was missing its main selling point. To make matters worse, while playing games, I began to experience picture dropouts and bursts of static that could only be temporarily resolved by switching inputs on the receiver. I had never been so flummoxed by commercial AV gear.


So I rolled up my sleeves and got to debugging. The first screen contained a clue—"Your TV might support a higher quality color format and 4K HDR if you change its settings." I can do that! I love settings. In fact, I thought I knew exactly what to do for this one. For reasons that will become clear later in this post, 4K TVs generally ship by default with their HDMI ports set to "standard" mode, and have an option to switch to "enhanced" mode if you have 4K hardware attached. If you search for this problem online, you'll find many posts telling you that you need to enable enhanced HDMI on your TV. Depending on your TV, the setting looks a lot like this:


Or one of these:




Unfortunately, I already had this mode enabled! I did this right away after getting my TV—even though I lacked any 4K hardware initially—because, why not? It seemed to work fine, and who doesn't want to support enhanced picture quality, right?

Further study revealed that this setting exists not only on TVs, but also on receivers. Unfortunately, my Pioneer receiver's menus had no HDMI customization settings at all. It turns out the Pioneer's enhanced HDMI mode requires a hidden options menu, accessed by rebooting the receiver with certain buttons held down:

So, that was one mystery solved. Once this setting was changed, 2160p - RGB became available. Unfortunately, while this allowed 4K HDR content to work on my TV, it also seemed to increase the failure rate—I was getting frequent picture drop-outs. Many websites suggested switching to HDCP 1.4 or disabling HDCP altogether, and this definitely reduced the rate of failure, but I would still occasionally get picture drop-out.

This picture drop-out was actually caused by limitations in my HDMI cables, and cable limitations are the reason an "enhanced HDMI" mode exists at all. As it turns out, very few HDMI cables are capable of carrying 4K content at full resolution! You can't tell whether an HDMI cable is 4K compliant just by looking at it; the closest thing we get is "Premium HDMI" branding on some of the packaging, which looks like this:


Don't let anyone tell you all HDMI cables are the same—different cables are rated for different transmission speeds. In this case, for "enhanced" picture quality, you need a cable rated for 18 gigabits per second. Most HDMI cables in existence cannot reliably transmit at these speeds. The reason most devices disable "enhanced HDMI" by default is that "standard HDMI" mode uses only 10 Gb/s, which most cables can handle. Most consumers will attach their existing HDMI cables to their devices and expect them to work, since they look and feel identical. To their credit, Sony included a Premium HDMI cable in the box with the PS4 Pro, but the cable between my receiver and the TV was only rated for 10 Gb/s. (The cable in question was high quality, thick and sturdy, less than a year old, and was marketed as "guaranteed 4K compatible," but in the tech specs, it was only guaranteed to support a 10 Gb/s transmission rate.)

So, what does standard HDMI do in order to save so much bandwidth? Wasn't HDMI supposed to be a pure, uncompressed digital signal? I'll cover that in the next post.

Wednesday, March 10, 2010

Image shuffling script

I've written this Perl script twice now, so I might as well immortalize it somewhere where I'll be able to find it again.

This script randomizes the names of a folder full of JPGs. This is useful for showing a looping slideshow on the PS3 with a random order.


#!/usr/bin/perl
use strict;
use warnings;

# Give every JPG in the current folder a random numeric name, so the
# PS3 slideshow plays the images in a shuffled order.
foreach my $file (<*.jpg>)
{
    rename($file, rand() . ".jpg") or warn "Couldn't rename $file: $!\n";
}

Monday, January 18, 2010

sscanf Tips and Tricks

It's rare to see sscanf used for string parsing in production code. I'm not entirely sure why it's shunned, because sscanf can parse a wide variety of strings in a single line of code, and the alternatives are typically extremely verbose and complicated in comparison. Parsing character strings letter-by-letter? Yuck. I think the issue is that its powers are hard to wield, and there aren't a lot of "Beginner's Guide to sscanf Mastery" tutorials the way there are for regular expressions. Also, the language of sscanf is quirkier and less thorough than regular expressions, so lots of users give up on it early. Don't be so hasty.

Before we begin, things to know about sscanf:

• sscanf's return value is the number of matched elements. This differs from sprintf, which returns the number of generated characters. The number of matches doesn't seem too useful on the surface--shouldn't it just match the number of elements in the format string? As it turns out, this value can actually be quite useful. (And if you need to know the number of characters sscanf has consumed, you can use %n for this. Note that Microsoft actually disables %n by default, contrary to the standard, because of security implications, so you need to re-enable it before calling sscanf.) Several of these tricks are pulled together in the example at the end of this list.

• sscanf typically requires your format string body to match the input string exactly. For instance, if your string is "You scored 1000 points!" and your format is "You scored %d points!", your number will be read in without incident. But if your format string is "YOU SCORED %d POINTS!", sscanf will give up before it reads the "1000" value, because the strings up until that point don't match. It's even case-sensitive, and unlike regular expressions, there's no option to disable case sensitivity. To work around this limitation, if you're using Visual C++ you can use _strlwr to make your input string all lowercase. Other platforms will need to make do with a standards-compliant _strlwr equivalent, such as the inelegant but compact:
std::transform(myString, myString + strlen(myString), myString, tolower);

• sscanf treats any and all whitespace as equal. In other words, sscanf will match any whitespace with any other whitespace, in any quantity. So if your input string is "This\n is a \t\t\t test", it will match the format string "This is \n\r\t a test". No problem. If you have a need for precise whitespace matching, you can sometimes use %c and then verify the character manually, but generally sscanf isn't going to be very convenient for you. Fortunately, in most cases, you don't care about the precise type or quantity of whitespace. It's a little odd that the API is completely strict about case and completely non-strict about whitespace, but that's how it works.

• sscanf's %s specifier stops reading the string as soon as it encounters whitespace. This may be useful in some cases, but in many other cases I've found it to be unhelpful.

• One of sscanf's least-known tricks is that it can do character groups, just like regular expressions. For instance, "%[A-Za-z]" is like %s, but will stop reading as soon as it encounters any non-alphabetic character. The caret inverts the effect; "%[^=:]" will keep reading and consuming any character until it finds an equal sign or colon.
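
Here's a small sketch that pulls several of these tricks together; the input strings and field names are made up for illustration:

#include <cstdio>
#include <cstring>

int main()
{
    // Return value, whitespace handling, and %n in one shot. The literal text
    // must match exactly (including case), but each whitespace character in
    // the format matches any run of whitespace in the input.
    const char* line = "You scored 1000 points in    3 minutes!";
    int points = 0, minutes = 0, consumed = 0;
    int matched = std::sscanf(line, "You scored %d points in %d minutes!%n",
                              &points, &minutes, &consumed);
    // matched is 2 -- %n fills in `consumed` but doesn't count toward the return value.
    std::printf("matched=%d points=%d minutes=%d consumed=%d of %d\n",
                matched, points, minutes, consumed, (int)std::strlen(line));

    // Character groups: %[^=] reads everything up to the '=', and %[^\n] grabs
    // the rest of the line, spaces and all (plain %s would stop at the space).
    const char* config = "color=bright red";
    char key[32], value[32];
    if (std::sscanf(config, "%31[^=]=%31[^\n]", key, value) == 2)
        std::printf("key=\"%s\" value=\"%s\"\n", key, value);

    return 0;
}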

(to be continued)

The Costs of Canceling Cable

A few weeks ago Mandi and I decided that we had no need to spend $100 a month to watch a few random TV shows, particularly since they were shows that can be had on iTunes or Hulu. So we bit the bullet and canceled cable. So far I've bought the following gadgets to fill in the gaps.



Scosche component A/V cable for iPod/iPhone: an inexpensive substitute for the AppleTV. 2 x $30 (I bought one cable for each TV in the house). One of the few non-Apple cables that will allow you to play video from an iPhone 3G/3GS on your HDTV. Lots of knockoff cables exist that will work with older models of iPod/iPhone, but the newest models have an encryption chip in them that the counterfeiters can't replicate yet. If the phone doesn't detect the chip, the cable won't work.

Using this cable, I can easily download a TV episode straight from iTunes (either on the computer or on the phone), then plug the phone into the TV and watch. It's not quite as elegant as the official AppleTV, and the picture quality is clearly standard-def, but frankly it's pretty slick that a pocket-sized device can be a decent AppleTV substitute at all.

INTENDED USE: Watching ABC shows like Grey's Anatomy or Private Practice, which only show up on iTunes because ABC ≅ Disney ≅ Pixar ≅ Apple. Also, I've kind of wanted to try this for a while now. It just seems convenient and useful.

QUIRKS: The cable connection to the phone is not as snug as a real Apple dock connector. Even when it's fully engaged, if you push on it, it can twist to the side and lose connection. However, once it's seated and you place the phone down, it seems to stay connected just fine.



PlayOn for Windows: in tandem with an existing DLNA device like a Playstation 3, a pretty effective Roku substitute. $19 (Regular price $40, discounted to $30 if you wait 14 days for the demo to expire, another $11 discount if you sign up for a Gamefly trial and immediately cancel.)

This is a Windows app which connects to Hulu, Netflix, Amazon, and various other websites, transcodes the video into something which your PS3 can decode, then streams the video to the PS3. It shows up in the PS3 menus automatically, lists all your episode queues as a set of hierarchical folders, and is pretty darn intuitive to use. One quirk is that using "rewind" or "fast-forward" over 802.11g is painful. I may end up putting an 802.11n router on the PS3 if I find one for a decent price.

INTENDED USE: Watching Hulu videos, and perhaps occasional Amazon VOD. Hopefully, if we stay current with Hulu, we can get a large percentage of our TV watching done at no cost. I don't think Amazon has much content that iTunes doesn't, but it's kind of nice that Amazon doesn't require you to store the video locally--my backup hard drive is already filling up fast enough--and Amazon also has occasional coupons and specials, whereas iTunes only has discounts once in a blue moon.

QUIRKS: It doesn't support some advanced features, such as closed captioning in Hulu. This will be an issue when my hard-of-hearing family members show up. But it's not the only way to stream Hulu to the TV, just the most convenient option.



Hulu Desktop: In conjunction with Rowmote Pro, a very user-friendly interface for Hulu with a working remote control. $0 for Hulu Desktop, $5 for Rowmote Pro. Unlike PlayOn, it supports closed-captioning. It also has a very stylish user interface with episode previews.

The downside is that you need to have a Mac or PC actually plugged into the TV to make this work. Since we have laptops, this is not a huge downside, but it adds an extra 3-minute impediment to getting the TV show started, and it's just one more hassle. Plus, if your laptop battery is low, better have a charger set up by the TV, since this drains battery fast. (PlayOn also drains juice remarkably quickly, but you can set the laptop wherever you want as long as you're in 802.11n range.)

INTENDED USE: Hulu, obviously, if closed-captioning is needed.



HDTV Antenna: A little fancier than the rabbit ears of old, these antennas plug into the wall to provide a signal boost, and have the juice to receive HD over-the-air content. $19 on eBay. I haven't received this yet.

I've only ordered one antenna for now. I'll probably put it in the master bedroom. If it ends up working great, I'll probably get a second one for the downstairs TV, since antennas just aren't that expensive.

INTENDED USE: Local news, sports, PBS, live events like the Tournament of Roses. I doubt we will use it a lot, but with the rain we're having right now, having access to local news or weather might actually be handy. And PBS in the morning shows a lot of great kids programming like Martha Speaks or Sesame Street that is hard to get elsewhere.




So, in other words, in my first month of jettisoning cable, I have actually spent about $100 on various products to replace it. Which isn't really a savings--it's more or less a wash. The good news is that I don't have to spend another $100 next month to buy it all again. And it's kind of fun to set up all these gadgets, in a geeky way, even though most of them compromise picture quality.

Monday, November 16, 2009

MFC and the disappearing crashes

We've had a strange problem at work with our biggest MFC app. For some reason, crash-worthy bugs (like a write to a NULL pointer) would not show up in the debugger. Instead, whenever something bad would occur, we'd get booted back to the event loop. Even when stepping in the debugger, you'd be going along and suddenly--wham--your stack would be popped and that would be that. The only evidence would be a "First chance exception" message in the Output pane. Quite frustrating when trying to catch certain types of error.

I found that this is because the MFC headers include structured exception handling which magically consumes and hides your errors. Which is the opposite of what we want, but that's how it is. Fortunately, there's a way to make the debugger cooperate. In the Debug menu, choose "Exceptions" and enable every item in "Win32 Exceptions". Voila--the debugger will stop as soon as an error occurs. Why this isn't the default is beyond me.
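
To see the mechanism in isolation, here's a minimal sketch (illustrative only, and not MFC's actual code) of how a structured exception handler can quietly consume an access violation; with the Win32 exceptions left unchecked in the Exceptions dialog, the debugger only logs a first-chance exception when this runs:

#include <windows.h>
#include <cstdio>

// Build with Visual C++. The __except handler swallows the crash, so
// execution simply continues, much like being dumped back at the event loop.
int main()
{
    __try
    {
        int* volatile p = NULL;
        *p = 42;  // access violation: normally a crash
    }
    __except (EXCEPTION_EXECUTE_HANDLER)
    {
        std::printf("Swallowed the access violation, carrying on.\n");
    }
    return 0;
}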

Sunday, October 25, 2009

Speaker Hacks

For simple setups, a receiver is totally overrated. Receivers are big and bulky, draw a lot of power, and complicate a simple wiring setup. But when you want to drive a set of unpowered speakers, what choice do you have? You've gotta drive those speakers somehow.

Today, I was thinking about this a bit. For the playroom speaker setup, I had given up on my amp because it didn't fit in the space I had, so I was just using computer speakers. The computer speakers are built around a subwoofer that plugs into the wall, takes the input via minijack, amplifies the signal, and sends it to two little satellite speakers via another minijack. Finally, I had the a-ha moment! The amp is in the subwoofer! It's itty-bitty compared to a stand-alone amp, and it sits on the floor unobtrusively. So hypothetically I could replace the chintzy satellite speakers with real SPEAKER speakers. The only* catch is that real speakers hook up via speaker wire, and the chintzy satellites hook up via minijack. How hard could it be to fix that?

* Well, and the quality/power probably isn't as good as a real amp, either, but I can compromise.

As it turns out, not hard at all. My first idea was to wire up a new minijack to speaker wire; that'd require a trip to Radio Shack for something like this:



That's a little complicated, though, since you need semi-precise soldering ability. I figured I could probably come up with a simpler plan, and after a little research, my revised idea was this:



Radio Shack asks a mere $9 for this beautiful RCA-to-speaker-wire cable. Not bad, and eliminates the need for soldering. But, I thought, if it's that simple, I have plenty of RCA cables sitting around in a box. PLENTY of them, and not much use for them. So, I figured, what's the worst that could happen? I cut the ends off of a perfectly good six-foot Monster Cable stereo RCA cable, stripped it down, twisted the ends, and voila. I had a high-quality equivalent to Radio Shack's cable.

So at the end of the day, the setup is as follows:

Wall socket → Inexpensive Computer-Speaker Subwoofer → Minijack-to-RCA adapter → Barrel adapter → RCA-to-bare speaker wire → Speakers

I'm having fun with the images in this post, so visually we have: