Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much v-sync-related stutter I had simply come to accept. I’d frequently find a scene that stuttered a lot with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered the minimum frame rate I needed to avoid being distracted by stutter. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.

In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing or are OK with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync optimizes for a situation that isn’t necessarily visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios where there’s a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.
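To put rough numbers on that sweet spot, here’s a minimal back-of-the-envelope sketch (my own illustrative frame times, ignoring input latency and the panel’s minimum refresh behavior) of what a fixed 60Hz refresh with v-sync does to frames that take between 1/60 and 1/30 of a second to render, compared to a display that simply waits for the frame:

    import math

    REFRESH_INTERVAL_MS = 1000.0 / 60  # a fixed 60Hz panel scans out every ~16.7 ms

    # Hypothetical render times (ms) for a scene hovering between 30 and 60 fps
    render_times_ms = [18.0, 22.5, 26.0, 19.5, 24.0, 28.5]

    def vsync_interval_ms(render_ms):
        # With v-sync on a fixed-refresh panel, a finished frame waits for the next
        # scanout, so its time on screen rounds up to a whole multiple of ~16.7 ms.
        return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

    def variable_refresh_interval_ms(render_ms):
        # With a variable refresh display, the panel scans out as soon as the frame
        # is ready, bounded only by the panel's maximum refresh rate.
        return max(render_ms, REFRESH_INTERVAL_MS)

    for ms in render_times_ms:
        print(f"render {ms:4.1f} ms -> v-sync {vsync_interval_ms(ms):4.1f} ms, "
              f"variable refresh {variable_refresh_interval_ms(ms):4.1f} ms")

Every frame in that band gets held on screen for 33.3 ms under v-sync, and the occasional frame that finishes under 16.7 ms is what produces the visible back-and-forth judder; with variable refresh, time on screen simply tracks how long the frame took to render.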

If you’re already running games at a fairly constant 60 fps, what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

There's also the question of what happens once game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.
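As a purely hypothetical sketch of what that could look like, here’s a simple frame-time-targeting loop; the quality knob, the step size, and the 22 - 30 ms budget are my own made-up numbers, not anything NVIDIA or any engine has announced:

    # Hypothetical adaptive-quality loop for a variable refresh display. Instead of
    # snapping between locked 30 and 60 fps targets, the engine nudges a detail knob
    # to keep frame times inside a band the panel can display directly.
    TARGET_MIN_MS, TARGET_MAX_MS = 22.0, 30.0   # roughly 33 - 45 fps

    def adapt_quality(quality, last_frame_ms, step=0.05):
        """Nudge a 0..1 detail setting based on the previous frame's render time."""
        if last_frame_ms > TARGET_MAX_MS:
            return max(0.0, quality - step)   # too slow: trade detail for frame time
        if last_frame_ms < TARGET_MIN_MS:
            return min(1.0, quality + step)   # headroom: spend it on image quality
        return quality

    # Stand-in frame times from an imaginary renderer
    quality = 1.0
    for frame_ms in [34.0, 31.5, 29.0, 26.5, 21.0, 23.5, 27.0]:
        quality = adapt_quality(quality, frame_ms)
        print(f"frame {frame_ms:4.1f} ms -> quality {quality:.2f}")

On a fixed 60Hz panel every frame in that band would be held for 33.3 ms and look like 30 fps anyway; on a variable refresh display each small quality adjustment actually shows up as a smoother or sharper frame.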

NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync enabled flavor will apparently be sold for $400. The $120 premium can be a tough pill to swallow. A 40% increase in display cost is steep, which is another reason why I feel like NVIDIA might have a little more success pushing G-Sync as a part of a higher end display. On the flip side NVIDIA could easily get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.

Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.

I feel like NVIDIA is slowly but surely assembling the components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and, honestly, integration into something like Steam. G-Sync just adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to just offer higher resolutions and better looking images than other platforms; it’s just as important to provide a smoother and more consistent experience. G-Sync attempts to do just that, and succeeds.

With G-Sync enabled, I began to expect/demand more visually from my games. Aliasing and other rendering imperfections were far more pronounced now that a big portion of the stuttering was removed. G-Sync isn't the final solution, but rather the first on a long list of things that need improving. There are other use cases for G-Sync outside of gaming as well; streaming video, where bandwidth constraints force a variable frame rate, is one I’ve heard passed around.

Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That’s obviously the biggest issue with what we have here today - it only works with NVIDIA hardware. For die hard NVIDIA fans, I can absolutely see a G-Sync monitor as being a worthy investment. You might just want to wait for some more displays to hit the market first.

Comments

  • dagnamit - Thursday, December 12, 2013 - link

    Agreed. You would think that getting the display and the thing that talks to the display speaking the same language would be close to first on the list.
  • DanNeely - Thursday, December 12, 2013 - link

    Same here. I'm not going to rush out and buy a 1080p G-Sync monitor, but even in a year or two an extra $120 on a 4K monitor isn't going to be a large hit, relatively speaking, and gaming at <60 FPS will be a lot more common there than at 1080p.
  • Black Obsidian - Thursday, December 12, 2013 - link

    4K is a really good point; I hadn't considered the utility of this sort of thing on much higher-resolution monitors.
  • TheHolyLancer - Thursday, December 12, 2013 - link

    Here is the thing, if you have it all. Then a SLI/CF or Titan / 290X setup will mean that you will more than likely able to max out the graphics, and G-sync and V-Sync becomes more or less the same.

    The target market is people on a budget playing with mid-range cards that cannot push 60 / 120 / 144 fps (or 30 fps, if that is your thing...) consistently at 1080p or 4K. Which means price becomes an issue: if you are going to buy a mid-range card, you are likely going to reuse your existing monitor, or maybe get a nice cheap one, unless the G-Sync enabled models (and cards) are cheap enough that the premium wouldn't instead buy you a better card that could run everything nicely at full speed via v-sync.

    So if they can price it so that a new monitor + new NV GPU costs about the same as a new monitor of the same size and speed + new AMD GPU + say 20 dollars, then that is fine. But if they can't do that, then for a mid-range GPU dropping 20 or 30 dollars more can mean a lot more performance for the buck; unless you are already at the upper end of mid-range, since going from upper mid-range to high end is a large jump in cost. And even then, if people want to keep the monitor they have, there will likely be NO way this takes off, because even a cheap 1080p monitor is ~100 dollars, and that money buys a huge jump in GPU quality if you put it toward the card itself rather than the monitor.

    The killer app would be if G-Sync worked with any bog-standard monitor (or if all future monitors shipped with this SoC enabled). Then it would become a nice new feature that is good for many people.
  • Kamus - Friday, December 13, 2013 - link

    "Here is the thing, if you have it all. Then a SLI/CF or Titan / 290X setup will mean that you will more than likely able to max out the graphics, and G-sync and V-Sync becomes more or less the same."

    This is just flat out wrong...

    I play BF4 on a 290X on a 120Hz monitor, and there are very few maps that maintain a consistent framerate. So as soon as the framerate dips below 120 I start seeing stuttering. And that's on the smooth maps. There are maps, like "Siege of Shanghai", where the framerate hovers from 80 all the way down to 30-40 FPS... G-Sync would be a HUGE deal in situations like that.

    TL;DR: G-Sync is a big deal, even for high-end rigs.
  • Da W - Thursday, December 12, 2013 - link

    The gamer that has it all certainly won't invest in a 1080p TN panel.
    Here's the problem right now: a bunch of things that will get implemented later. Isn't the solution in hardware? Will I have to replace my panel next year? And then my panel will be tied to NVIDIA?
    Not just yet. AMD will surely come up with an open source solution next year, as usual.
  • SlyNine - Thursday, December 12, 2013 - link

    Lots of gamers will invest in TN panels because that technology is actually better for games. But it does come with compromises.
  • rarson - Sunday, December 15, 2013 - link

    I just bought a 27" QHD IPS monitor for $285. From the games I've played on it so far, I'd say you're nuts if you buy a 1080p TN panel over a monitor like this.
  • tlbig10 - Tuesday, December 17, 2013 - link

    And I'll counter by saying you're nuts for overlooking 120/144Hz TN panels *if* the main use for your machine is gaming. I have the VG248QE, have enabled LightBoost on it, and I would *never* use my wife's 27" QHD IPS for gaming because I would lose the butter smoothness a 120Hz LB monitor gets me. Yes, her display has better color reproduction, but it is a mess in BF4 with all its ghosting and 60Hz choppiness. Until you've seen what LightBoost and 120Hz are like in a first person shooter, you can't call us "nuts".

    And those of you on 120 or 144Hz monitors who aren't using LightBoost, do yourself a favor and check it out. There is a substantial difference between LB 120Hz and plain 144Hz.
  • ZKriatopherZ - Thursday, December 12, 2013 - link

    I think a lot of this is leftover garbage from the way CRT displays needed to be implemented. It seems like we should be removing hardware here, not adding it. Flat panels, when introduced into that ecosystem, needed to output on a frame-by-frame basis even though the only real limitation seems to be the pixels' color-to-color refresh. Since LCD pixels are more like a switch, wouldn't a video card output and display system that updated on an independent per-pixel basis be more efficient and better suited to modern displays? I understand games have frame buffers you would need to interpret, but that can all be addressed in the video card hardware. If you have a card capable of drawing to the screen in such a way, wouldn't that make this additional hardware unnecessary and eliminate the tearing problem?
