From G-Sync Variable Refresh To G-Sync HDR Gaming Experience

The original FreeSync and G-Sync were solutions to a specific and longstanding problem: fluctuating framerates cause either screen tearing or, with V-Sync enabled, stutter and input lag. The result of VRR has been a considerably smoother experience in the 30 to 60 fps range, and an equally important benefit has been compensating for dips and peaks across the wider ranges introduced by higher refresh rates like 144Hz. So both technologies were very much tied to a single specification that directly described the experience, even if the numbers sometimes didn't do that experience justice.
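
Neither vendor publishes its exact frame-pacing logic, but the core idea behind VRR's smoothness is simple enough to sketch. Below is a minimal, hypothetical Python model (a fixed 60Hz panel, with render back-pressure ignored) of how frame presentation differs with V-Sync versus VRR; none of the numbers come from NVIDIA or AMD:

```python
import math

# Hypothetical model: with V-Sync on a fixed-refresh panel, a finished frame
# waits for the next refresh boundary; with VRR, the panel refreshes as soon
# as the frame is ready (within its supported range, ignored here).
VSYNC_INTERVAL_MS = 1000 / 60  # fixed 60 Hz refresh tick

def present_times(render_times_ms, vrr):
    """Return the timestamp (ms) at which each frame appears on screen."""
    shown, t = [], 0.0
    for rt in render_times_ms:
        t += rt  # the frame finishes rendering at time t
        if vrr:
            shown.append(t)  # VRR: scan out as soon as the frame is ready
        else:
            # V-Sync: hold the frame until the next fixed refresh tick
            shown.append(math.ceil(t / VSYNC_INTERVAL_MS) * VSYNC_INTERVAL_MS)
    return shown

frames = [20, 25, 18, 30, 22]  # render times in ms, i.e. roughly 33-55 fps
print([round(x, 1) for x in present_times(frames, vrr=False)])  # uneven 16.7 ms jumps
print([round(x, 1) for x in present_times(frames, vrr=True)])   # tracks render cadence
```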

Meanwhile, HDR for gaming is a whole suite of capabilities that essentially allows for greater brightness, deeper blacks, and better and more numerous colors. More importantly, it requires developer support in applications and the production of HDR content. The end result is not nearly as static as VRR, as much depends on the game's implementation – or in NVIDIA's case, sometimes on Windows 10's implementation. Done properly, even simply increasing brightness can bring perceived enhancements in colorfulness and contrast, known as the Hunt effect and the Stevens effect, respectively.

So we can see why both AMD and NVIDIA are pushing the idea of a 'better gaming experience', though NVIDIA is explicit about this with G-Sync HDR. The downside is that the required specifications for both FreeSync 2 and G-Sync HDR certification are closed off and only discussed broadly, deferring to VESA's DisplayHDR standards. The two situations, however, are very different. AMD's explanations are a little more open, and outside of the HDR requirements, FreeSync 2 also has a lot to do with standardizing SDR VRR quality, with mandated low framerate compensation (LFC), a wider VRR range, and lower input lag. AMD has also stated that FreeSync 2's color gamut, max brightness, and contrast ratio requirements are broadly comparable to those of DisplayHDR 600, though the HDR requirements do not overlap completely. And with FreeSync/FreeSync 2 support on Xbox One models and upcoming TVs, FreeSync 2 appears to be the more straightforward specification.
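
AMD has not published the exact heuristics behind LFC, but the basic frame-multiplication idea is straightforward to illustrate. Here is a minimal sketch, assuming a hypothetical panel with a 48-144Hz VRR window; the real driver logic is more involved:

```python
# Sketch of Low Framerate Compensation (LFC), not AMD's actual algorithm.
# When the frame rate drops below the panel's minimum VRR rate, each frame
# is scanned out multiple times so the panel stays inside its window. This
# works because the panel's maximum refresh is well above 2x its minimum.
PANEL_MIN_HZ, PANEL_MAX_HZ = 48, 144  # hypothetical VRR window

def lfc_refresh(fps):
    """Repeat each frame enough times to keep the panel in its VRR window."""
    multiple = 1
    while fps * multiple < PANEL_MIN_HZ:
        multiple += 1  # show the same frame one more time
    return fps * multiple, multiple

for fps in (100, 40, 20):
    hz, n = lfc_refresh(fps)
    print(f"{fps} fps -> panel refreshes at {hz} Hz, each frame shown {n}x")
```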

NVIDIA's push is much more general and holistic where feature standards are concerned, focusing instead on the specific products. At the same time, the company discussed the need for consumer education on the spectrum of HDR performance. While there are specific G-Sync HDR standards as part of the G-Sync certification process, those specifications are known only to NVIDIA and the monitor manufacturers. Nor was much detail provided on minimum requirements beyond HDR10 support, 1000 nits peak brightness, and unspecified DCI-P3 coverage for the 4K G-Sync HDR models; NVIDIA cited its certification process and deferred detailed capabilities to whatever other certifications G-Sync HDR monitors may carry – in this case, the UHD Alliance Premium and DisplayHDR 1000 certifications of the Asus PG27UQ. Which is to say that, at least for the moment, the only G-Sync HDR displays are those that adhere to some very stringent standards; there aren't any monitors under this moniker that offer limited color gamuts or subpar dynamic contrast ratios.
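
HDR10, the one hard requirement NVIDIA did name, is built around the SMPTE ST 2084 "PQ" transfer function, which maps code values to absolute luminance. The short sketch below (the constants come from the ST 2084 definition; the specific code points are merely illustrative) shows how a 1000-nit panel relates to the signal's 10,000-nit design range:

```python
# SMPTE ST 2084 (PQ) EOTF, the transfer function behind HDR10: it decodes
# a normalized code value to absolute luminance in nits, with the top of
# the signal range at 10,000 nits, far above a 1000-nit panel's peak.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """Decode an n-bit PQ code value to luminance in cd/m^2 (nits)."""
    e = code / (2 ** bits - 1)  # normalize to 0.0-1.0
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_to_nits(1023)))  # 10000 nits: the very top of the PQ range
print(round(pq_to_nits(769)))   # ~1000 nits: roughly where this panel clips
```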

At least with UHD Alliance Premium, the certification is specific to 4K resolution, so while the announced 65” 4K 120Hz Big Format Gaming Displays almost surely will qualify, the 35” curved 3440 × 1440 200Hz models won't. Practically speaking, all the capabilities of these monitors are tied to the AU Optronics panels inside them, and we know that NVIDIA worked closely with AUO as well as with the monitor manufacturers. As far as we know, those AUO panels are only found in G-Sync HDR displays, and vice versa. No other standardized specification was disclosed; NVIDIA only referred back to its own certification process and the 'ultimate gaming display' ideal.

As much as NVIDIA mentioned consumer education on the HDR performance spectrum, the consumer is hardly any better educated on a monitor's HDR capabilities by the G-Sync HDR branding. Detailed specifications are left to monitor certifications and manufacturers, which is the status quo. Without a dedicated G-Sync HDR page, NVIDIA lists G-Sync HDR features under the G-Sync page, and while those features are labeled as G-Sync HDR, there is no explanation of the full differences between a G-Sync HDR monitor and a standard G-Sync monitor. The NVIDIA G-Sync HDR whitepaper is primarily background on HDR concepts plus a handful of generalized G-Sync HDR details.

For all intents and purposes, G-Sync HDR is presented not as a specification or a technology but as branding for a premium product family, and right now it is most useful for consumers to think of it that way.

91 Comments

  • imaheadcase - Tuesday, October 2, 2018 - link

    3840x1600 is the Dell I mean.
  • Impulses - Tuesday, October 2, 2018 - link

    The Acer Predator 32" has a similar panel to that BenQ and adds G-Sync, tho still at a max 60Hz; it's not as well calibrated out of the box (and has a worse stand and controls), but it has dropped in price a couple times to the same as the BenQ... I've been cross-shopping them for a while, because 2 grand for a display whose features I may or may not be able to leverage in the next 3 years seems dubious.

    I wanted to go 32" too because the 27" 1440p doesn't seem like enough of a jump from my 24" 1920x1200 (being 16:10, it's nearly as tall as the 16:9 27"ers), and I had three of those, which we occasionally used in Eyefinity mode (making a ~40" display). I've looked at 40-43" displays but they're all lacking compared to the smaller stuff (the newer ones are all VA too, mostly Philips and one Dell).

    I use my PC for photo editing as much as PC gaming but I'm not a pro so a decent IPS screen that I can calibrate reasonably well would satisfy my photo needs.
  • Fallen Kell - Tuesday, October 2, 2018 - link

    It is "almost" perfect. It is missing one of the most important things, HDMI 2.1, which has the bandwidth to actually feed the panel with what it is capable of doing (i.e. 4k HDR 4:4:4 120Hz). But we don't have that because this monitor was actually designed 3 years ago and only now finally coming to market, 6 months after HDMI 2.1 was released.
  • lilkwarrior - Monday, October 8, 2018 - link

    HDMI 2.1 certification is still not done; it would not have been able to call itself HDMI 2.1 till probably late this year or next year.
  • imaheadcase - Tuesday, October 2, 2018 - link

    The 35 inch one has been canceled, FYI. An Asus rep told me when I inquired about it just a week ago, unless something has changed since then. The reason being that the panel isn't yet ready for mass production.

    That said, it's not a big loss, even if it's disappointing. HDR is silly tech, so you can skip this generation.
  • EAlbaek - Tuesday, October 2, 2018 - link

    I bought one of these just as they came out. Amazing display performance, but the built-in fan cooling the G-Sync HDR module killed it for me.

    It's one of those noisy 40mm fans that were otherwise banished from PC setups over a decade ago. It made more noise than the entirety of the rest of my 1080 Ti SLI system combined. Like a wasp was loose in my room all the time. Completely unbearable to listen to.

    I tried to return the monitor as an RMA, as I thought that couldn't be right. But it could, said the retailer. At which point I chose to simply return the unit.

    In my case, these things will have to wait till nVidia makes a new G-Sync HDR module that doesn't require active cooling. Plain and simple. I'm sort of guessing that'll fall in line with the availability of micro-LED displays. Which will hopefully also be much cheaper than the ridiculously expensive FALD panels in these monitors.
  • imaheadcase - Tuesday, October 2, 2018 - link

    Can't you just replace the fan yourself? I read around the time of release that someone simply removed the fan and put their own silent version on it.
  • EAlbaek - Tuesday, October 2, 2018 - link

    No idea - I shouldn't have to void the warranty on my $2000 monitor to replace a 40mm fan.
  • madwolfa - Tuesday, October 2, 2018 - link

    Is it the G-Sync HDR module that requires active cooling, or the FALD array?
  • EAlbaek - Tuesday, October 2, 2018 - link

    It's the G-Sync HDR chip, apparently.
