What HDR means for PC gaming

Everyone's talking about "HDR" these days, and if you pay attention to the video settings in PC games, you might be wondering: what's the big deal? That's been around for years! Well, I have some news for you. All those “HDR” checkboxes and options you’ve seen in games since Half-Life 2? Turns out that’s not really HDR. 

That option marked “HDR” in the camera app on your cellphone? That’s not really HDR either, though it shares the same image-improving goal as those faux-HDR checkboxes in game settings: making every pixel count.

Lost? Don’t feel bad; when it comes to misconceptions about HDR, that’s just the tip of the iceberg. So just what is HDR, and what can it do for PC gaming? The first question is easy to answer. The second, however, is going to take a little time. Read on to find out what HDR is, and what kind of graphics card and monitor you need to support it.

Nope, that's not real HDR.

What is HDR?

HDR, or High Dynamic Range, is an umbrella term for a series of standards designed to expand the color and contrast range of video displays far beyond what most current hardware can produce. Despite what you may have heard during the headlong push to 4K, resolution is pretty far down the list when it comes to image quality. Beyond roughly 110 DPI under current Windows UI scaling limitations, the number of pixels starts to matter much less than what you do with them. 

Contrast, brightness, and vibrant color all become more important to image quality once resolution needs are met, and improving these is what HDR is all about. It’s not an incremental upgrade either; HDR’s radical requirements will mean new hardware for almost everyone and a difference you don’t need a benchmark or trained eyes to perceive.

For starters, HDR specs require a minimum of 1,000 cd/m², or nits, of brightness for LCD screens to adhere to the new “Ultra HD Premium” standard. High-end desktop gaming monitors, which top out around 300 to 400 nits, don’t come close to making the cut. Good laptops don’t fare much better, pushing only a hundred nits or so more. Even cellphones, with their sci-fi sunlight-viewable screen technology, only reach about 800 nits. When it comes to brightness, HDR-compliant screens leave all these displays in the dark.
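To see how stark that gap is, here’s a quick back-of-envelope sketch in Python, using the ballpark nit figures above rather than any measured values:

```python
# Ballpark peak-brightness figures from above, checked against the
# 1,000-nit "Ultra HD Premium" floor for LCD screens. Rough values,
# not measurements.
ULTRA_HD_PREMIUM_LCD_NITS = 1000

displays = {
    "high-end gaming monitor": 400,   # top of the 300-400 nit range
    "good laptop": 500,               # roughly a hundred nits more
    "sunlight-viewable phone": 800,
}

for name, nits in displays.items():
    shortfall = ULTRA_HD_PREMIUM_LCD_NITS - nits
    print(f"A {name} (~{nits} nits) falls {shortfall} nits short of the spec")
```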

Color also gets a makeover, with HDR specs requiring a full 10 or 12 bits per color channel, fully accessible across the OS and managed via a set of active standards. Most PC displays only provide 6- or 8-bit color per channel using a subset of the full color space called sRGB, which covers only about a third of the colors HDR can reproduce. And even when the hardware can do more, software peculiarities make legacy enhanced-color modes cumbersome to use.

sRGB, on the right, provides just a third of the colors available to HDR.
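To put those bit depths in perspective, a little arithmetic shows how quickly the number of addressable colors grows. This only counts raw code values; how wide a gamut those values are spread across is a separate question:

```python
# Distinct addressable colors at each per-channel bit depth. Counting code
# values says nothing about gamut (sRGB versus a wide HDR gamut like
# Rec. 2020); it shows how finely a display can slice whatever gamut it has.
for bits in (6, 8, 10, 12):
    levels = 2 ** bits      # shades per channel
    colors = levels ** 3    # three channels: red, green, blue
    print(f"{bits}-bit: {levels:>5,} levels/channel -> {colors:>15,} colors")
```

At 8 bits you get the familiar 16.7 million colors; at 10 bits, just over a billion, which is why banding-free gradients are one of HDR’s quieter benefits.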

Currently, PC monitors that support wide-gamut color, or WGC, generally reserve compatibility for professional use, such as photo editing or medical research applications. Games and other software simply ignore the extra colors, and often wind up looking oversaturated or otherwise misadjusted when the reduced color space they use is mapped onto a wide-gamut display, unless the hardware takes special steps to emulate that smaller space. HDR standards avoid the confusion by including metadata with the video stream that manages the color space properly, making sure applications look correct and take full advantage of the improved display capabilities.
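For a sense of what that metadata actually carries, here is a rough sketch of HDR-10’s static fields, modeled on the SMPTE ST 2086 mastering-display block plus the MaxCLL/MaxFALL light-level values. The class and field names are illustrative, not any real API:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Illustrative sketch of HDR-10's static metadata, sent once per stream.

    Real streams carry this as SMPTE ST 2086 mastering-display data plus
    content light levels; the names here are hypothetical.
    """
    # Chromaticity (x, y) of the mastering display's primaries and white point
    red_primary: tuple
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    # Luminance range the content was mastered for, in cd/m2 (nits)
    max_mastering_luminance: float
    min_mastering_luminance: float
    # Content light levels, in nits
    max_cll: float   # brightest single pixel in the entire stream
    max_fall: float  # highest frame-average brightness in the stream

# Plausible values for content mastered at 1,000 nits with Rec. 2020 primaries:
meta = HDR10StaticMetadata(
    red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
    max_mastering_luminance=1000.0, min_mastering_luminance=0.0001,
    max_cll=1000.0, max_fall=400.0,
)
```

Because these values are fixed for the whole stream, an HDR-10 display tone-maps once up front; Dolby Vision’s dynamic metadata, by contrast, can revise them scene by scene.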

Wide-gamut color is only part of the HDR equation. Eizo’s ColorEdge professional series, for example, supports WGC but isn’t even close to HDR compatibility, despite its quality, cost, and excellent reputation.

To help handle all the extra data, HDR also ushers in HDMI 2.0a as the minimum display connector requirement, a long-overdue upgrade over the ubiquitous low-bandwidth HDMI 1.4 standard.
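A back-of-envelope calculation shows why the old connector can’t keep up. The numbers below count active pixels only and ignore blanking intervals and link-encoding overhead, so treat them as a rough sketch rather than exact HDMI signaling math:

```python
# Rough video data-rate estimate (active pixels only; real HDMI links also
# spend bandwidth on blanking intervals and encoding overhead).
def data_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K60  8-bit RGB: {data_rate_gbps(3840, 2160, 60, 8):.1f} Gbps")   # ~11.9
print(f"4K60 10-bit RGB: {data_rate_gbps(3840, 2160, 60, 10):.1f} Gbps")  # ~14.9

# HDMI 1.4 tops out at roughly 10.2 Gbps of raw link bandwidth; HDMI 2.0/2.0a
# raises that to 18 Gbps, which is what makes 4K60 with 10-bit color feasible.
```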

Bumps in the road

HDR has plenty of promise, but the road ahead isn’t clear yet. The biggest problems aren’t technical roadblocks but competing, partially incompatible standards that threaten to detour early adopters into expensive dead ends. 

Nope, that's not real HDR, either.

Two main HDR standards currently exist: the proprietary Dolby Vision, which features 12-bit color depth and dynamic metadata, and the open HDR-10 standard, which supports 10-bit color and provides only static metadata, set once at the start of a video stream.

Dolby Vision, with its license fees and extra hardware, is the more expensive implementation, which has limited its adoption even at the high end. Most display manufacturers and content providers have opted to support HDR-10 instead, most significantly Microsoft with the Xbox One S and Sony with the PS4, which indicates the way the gaming industry is leaning. That makes HDR-10 the easy format choice for console gamers.

Proponents of Dolby Vision tout its greater color depth, its more demanding hardware standards, its ability to adjust color and brightness frame by frame, and its compatibility with HDR-10 content, but the market is moving toward the cheaper, good-enough standard of HDR-10 alone.

Since HDR-10 isn’t compatible with Dolby Vision, Dolby’s superior but proprietary and expensive HDR standard likely faces a dead end in a few years, when HDR-10’s open-standard successor arrives. If you’re a video or film enthusiast buying on the basis of quality, though, this may not matter much: content graded for Dolby’s superior standard is already available today, and you’ll enjoy a better picture immediately. Gamers are likely to have different priorities, however.

But enough about the living room. What about HDR on our PCs?

Where can you find HDR today?

PC manufacturers are racing to join the HDR phenomenon, but you don’t have to wait for them to try it out. Hardware HDR has already arrived for consumers in the high-end television market. This is also the best way to see the difference HDR makes in action, as a small but growing library of HDR demos, films, and television makes side-by-side comparison easy. Our recommended TVs for PC gaming all support HDR, making them strong choices as monitor replacements if you value size and image quality over the lower response times typical of dedicated monitors.

TVs like Samsung’s KS8000 and KS9800 series are capable of pulling double duty as impressive computer displays that let you explore first-wave HDR content as it becomes available on the PC. Just keep in mind that the price of admission to the early-adopter club isn’t cheap, and using a TV for PC gaming has a few drawbacks you should read up on.

Graphics cards

One place where PCs are already prepared for HDR is the graphics card market. While monitors lag behind their TV counterparts in HDR implementation, mid- and high-end GPUs have been ready for the revolution for almost a year now, thanks to the healthy rivalry between Nvidia and AMD.

Nvidia has been talking up HDR since its 900-series cards, and currently certifies all Pascal-based cards as HDR-ready as well. AMD came slightly later to the game, with the 390X and the current Polaris lineup as its first HDR-capable cards. If you bought a graphics card recently, there’s an excellent chance that at least part of your system is good to go.

What about games?

The trickier question is what all of this means for PC gamers. While some new games will come out of the box with HDR in mind, older software won’t support the wider color and contrast capabilities without patching. Those older games may play normally on HDR-equipped systems, but you won’t see any benefits without some fresh code added to the mix.

Fortunately, leveraging HDR’s superior technology doesn’t require a ground-floor software rewrite. A fairly straightforward mapping process, one that algorithmically expands SDR color and brightness ranges into HDR ones, can convert SDR titles without massive effort.
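As a rough illustration of the idea, and emphatically not how any particular studio or driver does it, a minimal expansion can linearize the SDR signal and stretch only its highlights toward HDR luminance levels:

```python
# Minimal sketch of SDR-to-HDR expansion (illustrative only; production
# "inverse tone mapping" is considerably more sophisticated).

def srgb_to_linear(v):
    """Undo the sRGB transfer curve: encoded 0..1 -> linear light 0..1."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def expand_to_nits(linear, sdr_white=100.0, hdr_peak=1000.0, knee=0.8):
    """Map linear SDR light to absolute brightness in nits.

    Values below `knee` keep their SDR brightness so midtones aren't blown
    out; only the top highlights stretch toward the display's HDR peak.
    """
    if linear <= knee:
        return linear * sdr_white
    t = (linear - knee) / (1.0 - knee)  # position within the highlight band
    return knee * sdr_white + t * (hdr_peak - knee * sdr_white)

# A bright 8-bit SDR highlight (250/255) lands near 800 nits instead of ~96:
print(expand_to_nits(srgb_to_linear(250 / 255)))
```

In a real pipeline the resulting nit values would then be re-encoded with an HDR transfer function such as SMPTE ST 2084 (PQ) before reaching the display.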

HDR left, Standard right. Nvidia is busy helping to add HDR to Rise of the Tomb Raider on the PC. Expect plenty more patches for popular games once HDR displays arrive in force.

This means popular games may, in the future, get studio remasters or patches adding HDR support, and the mod community is likely to step in where manufacturers won’t with older classics. Unlike the simulated or averaged HDR found in early games or used in photography, hardware HDR packs a visceral visual punch that’s already catching on in the entertainment industry. This bright and colorful HDR nirvana is still a bit down the road, however.

The real problem facing PC gamers interested in HDR today is that it hasn’t arrived on the platform yet. Nvidia is actively working on a Rise of the Tomb Raider patch, and games that support HDR in the console world are likely to see versions that enable it on their PC counterparts, including Forza, Battlefield, Gears of War, and others. But as of this writing, we really don’t know for sure.

Until HDR patches and games start arriving, it’s going to be Blu-ray and streaming content showing off HDR, along with the indignity of watching consoles get enhanced HDR visuals first. So kick back and relax. It’ll be a while before HDR arrives on the PC, so for now just catch up on your Steam back catalog and wait for HDR to come to you.