HDR is the latest and greatest milestone in the pursuit of true immersion. It’s no longer a software sleight-of-hand, as cutting-edge hardware now supports HDR as well. But is HDR gaming worth the extra cost?

What is HDR gaming?

HDR (High Dynamic Range) is a display and rendering technique that better approximates how the human eye perceives light. When used in video games, HDR runs an extra set of lighting calculations to produce a broader contrast ratio. In short, dark spots are darker, bright spots are brighter, and fine detail is preserved in the image instead of being “washed out.”
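The “darker darks, brighter brights” idea can be illustrated with a toy tone-mapping function. The sketch below uses the classic Reinhard operator, a standard textbook technique (not necessarily what any particular game or engine uses), to compress an unbounded HDR luminance value into the 0–1 range a display can actually show:

```python
def reinhard_tonemap(luminance: float) -> float:
    """Compress an HDR luminance value (0 to infinity) into [0, 1).

    Classic Reinhard operator: L / (1 + L). Dim values pass through
    almost unchanged, while very bright values are squeezed toward 1.0
    instead of clipping to pure white ("washing out").
    """
    return luminance / (1.0 + luminance)

# Dim detail is nearly preserved, bright detail is compressed rather than clipped:
for lum in (0.05, 1.0, 10.0, 100.0):
    print(f"{lum:7.2f} -> {reinhard_tonemap(lum):.3f}")
```

Notice that a luminance of 100 still maps to a distinct value below 1.0 instead of blowing out to pure white, which is the essence of preserving detail at both ends of the range.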

What Gaming Hardware Supports HDR?

A screenshot from Destiny 2
Image Credit: Bungie, Inc.

If you want your games to look their best, you have to rock the right hardware. Fortunately, HDR is much more common nowadays (though it’s still not universal). Most modern equipment from the past couple of years should do the trick.

Listed below are the HDR requirements for displays, PCs, and console gaming configurations.

HDR Display Requirements

Your TV or monitor must support the following minimum requirements:

1. HDR10 Format
2. DisplayPort 1.4 or HDMI 2.0a (the revision that added HDR metadata support)


HDR PC Requirements

If you want to game in HDR on PC, your device must support the following minimum requirements:

1. CPU that supports HDR (for integrated graphics)

  • Intel 7th Generation Core (i-series) processors or newer
  • HDR is not officially supported by AMD Ryzen integrated graphics, though it can work under certain conditions

2. Graphics card that supports hardware-accelerated 10-bit video decoding

  • GeForce GTX 950+, 1050+, 1650+, or RTX 2060+
  • Radeon RX 380+, 460+, 550+, Vega 56, Vega 64, 7, or 5700+

3. Required Windows 10 codecs installed for 10-bit video decoding

  • E.g., the HEVC or VP9 codecs (the former is available from the Microsoft Store)
  • (Recommended) The latest WDDM 2.4 drivers, obtainable from Windows Update

HDR Console Requirements

Console gamers have it a little easier. As long as you have the display, all you need is a modern console:

1. Console that supports HDR

  • Xbox One S and Xbox One X
  • PlayStation 4 with system software version 4.0+

PC gaming requires every component in the chain to be HDR compatible. With consoles and most modern TVs, the process is much more convenient.

HDR Displays in Detail

A picture of a mug of hot beverage in front of an HD TV displaying the Netflix logo
Image Credit: John-Mark Smith

Trying to decide whether to spring for an HDR monitor or stick with your TV? As you might imagine, there are a few key differences to keep in mind.

Monitors

Monitors offer lower input lag (latency) by design. They also offer higher refresh rates, provided your framerate is high enough to benefit from them.

However, the adoption of HDR in the monitor space has been hit-and-miss. Unless you’ve upgraded quite recently—and perhaps even if you have—it’s possible your current monitor doesn’t support it. That’s another chunk of change to consider when pricing out an upgrade.

TVs

With TVs, you’ll experience more input lag. But HDR-ready TVs grow on every TV tree at Best Buy or your big-box store of choice. The TV in your living room likely supports it already. And if not, TVs can be much cheaper per square inch than a gaming-ready monitor.

Input Lag

Regardless of your display, input lag is going to be a concern. The extra encoding and decoding involved with HDR add milliseconds, much like vertical sync.

You can minimize the effect more efficiently with a PC and monitor, but it’s nigh impossible to eliminate. Console gamers have grown more accustomed to input lag, so a little extra isn’t likely to impact their experience much (at least not enough to offset the jump in visual quality HDR provides).

Top HDR-Compatible Games

A screenshot from Destiny 2
Image Credit: Bungie, Inc.

So you’re all set up and ready to roll. What do you play? Not every game is HDR-compatible out of the box, but most recent triple-A games will be. Here are the top 5 games on the market ready to look their best:

  1. Destiny 2
  2. Red Dead Redemption 2
  3. Call of Duty: Modern Warfare
  4. Battlefield V
  5. Tom Clancy’s Ghost Recon Breakpoint

Not All HDR Is Created Equal

A picture of LG TVs; Many LG TVs are HDR ready
Image Credit: ETC-USC

But for all this talk about HDR, we haven’t even discussed its different flavors. Yes, there’s more than one, and different setups may not bring the same HDR to the table. Let’s break down the three most common variants so you know your HDR10+ from your Dolby Vision when shopping for screens.

HDR10

The most common variant, HDR10 is the most likely way you’ll experience HDR as a gamer. Both PC and console setups are primarily HDR10, with the 10 referring to its 10-bit color depth. Almost all TVs and monitors that advertise HDR compatibility support HDR10.
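That bit-depth number translates directly into how many shades a display can reproduce. A quick back-of-the-envelope calculation (plain arithmetic, nothing HDR10-specific) shows why the jump from 8-bit SDR to 10-bit, and later 12-bit, matters:

```python
def color_stats(bits_per_channel: int) -> tuple[int, int]:
    """Return (shades per channel, total colors) for a given bit depth."""
    shades = 2 ** bits_per_channel  # distinct levels per R, G, or B channel
    total = shades ** 3             # combinations across all three channels
    return shades, total

for bits in (8, 10, 12):
    shades, total = color_stats(bits)
    print(f"{bits}-bit: {shades:,} shades per channel, {total:,} total colors")
# 8-bit  ->   256 shades, ~16.8 million colors (typical SDR)
# 10-bit -> 1,024 shades, ~1.07 billion colors (HDR10)
# 12-bit -> 4,096 shades, ~68.7 billion colors
```

Those extra shades are what allow smooth gradients in bright skies and dark shadows without visible banding.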

HDR10+

A joint venture between Samsung and Amazon Video, HDR10+ squeezes a bit more out of HDR10 by packing additional metadata into the stream. This metadata can be used for on-the-fly adjustments to keep the visuals as pretty as possible. Adoption of HDR10+ is slowly growing, but it’s most likely to affect only UHD home theater setups.

Dolby Vision

Created by Dolby Laboratories, Dolby Vision is a unique HDR format that boasts increased bit depth (up to 12) for brighter, more vivid colors. Most major TV brands support Dolby Vision, and once again, console gamers are in luck—the Xbox One S and Xbox One X support Dolby Vision output. PlayStation doesn’t yet have support for Dolby Vision.

Is HDR Worth It?

A screenshot from Red Dead Redemption 2
Image Credit: Rockstar Games, Inc.

A single question lingers in the air like a wisp of smoke: “Is HDR worth it?”

Yes. And no.

For the stalwart folks who love to live on the bleeding edge of technology, HDR is a no-brainer. And it provides a noticeable improvement to anything you’re watching. Whether you’re gaming or sitting down to a new Marvel movie, HDR is that extra polish that makes the visuals shine. Deeper blacks, brighter brights, and true-to-life colors. For the utmost in immersion, it’s essential.

But unless you’re an exclusive console gamer with a recent TV, the cost of entry may not be worth that layer of polish. Not to mention the performance hit. For PC users especially, it may be more trouble than it’s worth at the moment. As with all new tech standards, HDR is rolling out slowly. Someday we may see more standardized formats (besides HDR10) supported by all major displays and systems. But not today.

If you’re already most of the way to an HDR-capable setup, why not finish it? Take advantage of the latest and greatest. But if you’d have to overhaul your rig just for HDR, the cost may not be worthwhile. Consider waiting until you have another reason or two before upgrading.

Further Reading