NVIDIA G-Sync Review
by Anand Lal Shimpi on December 12, 2013 9:00 AM EST

Final Words
After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.
In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much of the v-sync-related stutter I had simply come to accept. I’d frequently find a scene that stuttered a lot with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered the minimum frame rate I could tolerate before stuttering became distracting. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.
In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing or are OK with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync optimizes for a problem that isn’t necessarily visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios where there’s a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.
If you’re already running games at a fairly constant 60 fps, what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.
There's also the question of what happens once game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.
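To make the timing argument concrete, here's a minimal sketch (an illustrative model, not NVIDIA's implementation; the frame times are hypothetical) of why a game running between 30 and 60 fps stutters on a fixed 60Hz v-synced panel but not on a variable refresh display:

```python
REFRESH = 1000 / 60  # fixed 60Hz refresh interval, in ms

def vsync_display_times(frame_times_ms):
    """With v-sync, each finished frame waits for the next fixed refresh."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        # round render-complete time up to the next refresh boundary
        shown.append(REFRESH * -(-t // REFRESH))  # ceiling division
    return shown

def variable_refresh_display_times(frame_times_ms):
    """With variable refresh, the panel scans out as soon as a frame is ready."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        shown.append(t)
    return shown

frames = [20.0, 24.0, 21.0, 25.0, 22.0]  # a game hovering around 45 fps
for name, fn in [("v-sync", vsync_display_times),
                 ("variable refresh", variable_refresh_display_times)]:
    times = fn(frames)
    gaps = [round(b - a, 1) for a, b in zip([0.0] + times[:-1], times)]
    print(name, "frame-to-frame gaps (ms):", gaps)
# v-sync alternates between 33.3ms and 16.7ms presentation gaps (judder);
# variable refresh tracks the game's actual ~20-25ms cadence smoothly.
```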
NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync enabled flavor will apparently be sold for $400. The $120 premium can be a tough pill to swallow. A 40% increase in display cost is steep, which is another reason why I feel like NVIDIA might have a little more success pushing G-Sync as a part of a higher end display. On the flip side NVIDIA could easily get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.
Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.
I feel like NVIDIA is slowly but surely assembling a bunch of components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and honestly, integration into something like Steam. G-Sync just adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to just offer higher resolution and a better looking image than what you can get on other platforms, it’s very important to provide a smoother and more consistent experience as well. G-Sync attempts to and succeeds at doing just that.
With G-Sync enabled, I began to expect/demand more visually from my games. Aliasing and other rendering imperfections were far more pronounced now that a big portion of stuttering was removed. G-Sync isn't the final solution, but rather the first on a long list of things that need improving. There are other use cases for G-Sync outside of gaming as well. Streaming video where bandwidth constraints force a variable frame rate is another one I’ve heard passed around.
Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That’s obviously the biggest issue with what we have here today - it only works with NVIDIA hardware. For die hard NVIDIA fans, I can absolutely see a G-Sync monitor as being a worthy investment. You might just want to wait for some more displays to hit the market first.
Comments
repoman27 - Friday, December 13, 2013
The G-Sync module is a daughter card for another custom NVIDIA board which replaces the inputs and scaler (if present) on whatever monitor they decide to build one for. Theoretically, any panel with a suitable LVDS connection for the TCON (i.e. most of them) can be supported. Also, NVIDIA only has to provide the specs for the daughter card socket and/or a reference design for their scaler replacement board in order for any display manufacturer to implement it and create a "G-Sync ready" product.

MarkHunt - Friday, December 13, 2013
What an exaggerated screen shot demonstrating 'tearing' - never seen anything remotely like that whilst playing a video game.

ArmedandDangerous - Sunday, December 15, 2013
I don't see it exaggerated. I see this very often if I turn V-Sync off. 2x7950's at 1080p running anything with V-Sync off introduces tearing.

poohbear - Friday, December 13, 2013
If they make a 1440p IPS display then count me in, otherwise I just don't see this as an option.

web-cyborg - Friday, December 13, 2013
I'll leave it at this for now because I've been posting too much. To review:
- every time you move your FoV at greater than a snail's pace on a sub-100Hz, non-backlight-strobing monitor, you drop to such a low rez that it isn't even definably a solid grid to your eyes and brain. So you get continual bursts/path-flow of more or less the worst resolution possible, the entire viewport dropping all high-detail geometry and textures (including depth via bump mapping) into a blur. So much for high rez.
- 1080p is the same exact scene at 16:9 no matter what in HOR+, and HOR+/virtual cinematography is used in essentially every 1st/3rd-person perspective game and every virtual camera render. All of the on-screen objects and the perspective are the same size on a 27" 1080p and a 27" 1440p, for example (in a 1st/3rd-person game). The difference is the amount of pixels in the scene providing greater detail, obviously. This is a big difference, but a much bigger difference for desktop/app real estate, where the usable area and display object sizes change, especially considering GPU budget limits/fps in regard to games.
- at low Hz and low fps, you get greatly reduced motion definition and control definition: far fewer new action/animation/world-state slices shown, with longer "freeze frame" periods during which a high-Hz + high-fps player is seeing up to several newer updates.
1/2 the motion+control definition and opportunities to initiate actions in response at 60Hz/60fps.
1/3 the motion+control definition and opportunities to initiate actions in response at 40.
1/4 the motion+control definition and opportunities to initiate actions in response at 30. (Worked numbers in the first sketch after this comment.)
- you need at least 100Hz to support backlight strobing for essentially zero blur (120Hz is better).
- you can upscale 1080p x4 fairly cleanly on higher-rez 3840x2160 (aka "quad HD") monitors if you have to; it's not optimal but it can work (so you can game at higher fps/lower rez in demanding games yet still use a high-rez monitor for desktop/apps, for example - see the scaling sketch after this comment).
- the Eizo FG2421 is a high-Hz 1080p VA panel that uses backlight strobing; it isn't TN.
- we know that NVIDIA is still supposed to support a backlight strobing function as part of G-Sync monitors, just that it won't work with the dynamic-Hz function (at least not for now). So "the industry" is still addressing backlight strobing for zero blur in both the Eizo and the G-Sync strobe option (which, again, requires higher Hz to make the strobing viable).
- we know there are higher-rez and likely IPS G-Sync monitors due out, but we do not know if they will have the max Hz bumped up, which is necessary to utilize the backlight strobe function adequately.
There is more to a game than a screenshot's resolution/definition. There is continual FoV-movement blur (an undefinable "non-definition" resolution, unless perhaps you were to equate it to an extremely bad visual acuity number / "out of focus"). There is otherwise essentially zero blur using high Hz and backlight strobing. And there is high or low action updating: motion definition, animation definition, and control definition.
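A quick back-of-the-envelope check of the 1/2, 1/3, 1/4 figures in the list above (illustrative arithmetic only, assuming a 120Hz/120fps baseline):

```python
BASELINE_HZ = 120  # assumed high-Hz/high-fps reference point

for hz in (60, 40, 30):
    interval_ms = 1000 / hz
    print(f"{hz}fps: a new world-state slice every {interval_ms:.1f}ms, "
          f"{hz / BASELINE_HZ:.2f}x the updates of {BASELINE_HZ}fps")
# 60fps: every 16.7ms, 0.50x; 40fps: every 25.0ms, 0.33x;
# 30fps: every 33.3ms, 0.25x
```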
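And a minimal sketch of the upscaling point: 3840x2160 is exactly twice 1920x1080 in each dimension, so each 1080p pixel can map to a clean 2x2 block with no interpolation blur (nearest-neighbor integer scaling; the tiny array below stands in for a real frame):

```python
import numpy as np

frame_1080p = np.array([[10, 20],
                        [30, 40]])  # stand-in for a 1920x1080 frame
# duplicate every pixel 2x along both axes -> exact 2x2 blocks at "2160p"
frame_2160p = np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)
print(frame_2160p)
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]
```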
mdrejhon - Friday, December 13, 2013
"-you need at least 100hz to support backlight strobing for essentially zero blur (120hz better)."Great reply, just minor clarification.
Not necessarily, if you can strobe at rates below 100Hz.
Some strobe backlights (e.g. the BenQ Z-Series, such as the XL2720Z) can strobe lower, like 75Hz or 85Hz.
You only need one strobe per refresh, since framerate = refresh rate on strobed displays leads to zero motion blur. (That's also why a CRT at 60fps@60Hz has less motion blur than a non-strobed LCD at 120fps@120Hz.) The 100Hz figure is simply about less flicker, and about LightBoost's rate limitation. But nothing prevents zero motion blur at 85Hz, if you have an 85Hz strobe backlight (with low persistence, aka brief backlight flash times).
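The persistence point reduces to simple arithmetic. A hedged sketch (not Blur Busters' code; the speed and flash times are example values): for an eye tracking a moving object, perceived smear is roughly pixel speed multiplied by how long each frame stays lit.

```python
def smear_px(speed_px_per_s, persistence_ms):
    """Approximate perceived motion blur for a tracked moving object."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # example: an object crossing a 1920px-wide screen in 2s

# sample-and-hold: each frame stays lit for the whole refresh interval
print(f"60Hz hold:  {smear_px(speed, 1000 / 60):.1f}px")   # ~16px smear
print(f"120Hz hold: {smear_px(speed, 1000 / 120):.1f}px")  # ~8px smear
# strobed: persistence equals the backlight flash length, independent of
# refresh rate -- which is why a 60Hz CRT can show less blur than a
# 120Hz sample-and-hold LCD
print(f"85Hz strobe, 2ms flash: {smear_px(speed, 2.0):.1f}px")  # ~1.9px
```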
Mark Rejhon
Chief Blur Buster
web-cyborg - Saturday, December 14, 2013
Yes, I didn't mean that it was impossible; I meant that for people like me with "fast eyesight" / visual acuity, 100Hz sounds like it would be a good minimum against seeing flicker. I know from your posts on other forums that there are even 60Hz Sony TVs with some form of strobing, but that would drive me crazy personally. Thanks for the input though, so everyone reading knows the rest.

Deepo - Friday, December 13, 2013
27" VA panel and I'm in. Need this!beginner99 - Friday, December 13, 2013 - link
That $120 could also be invested in a better GPU which can easily hit 60 fps. $120 is just way too pricey for this. I never play FPS games with v-sync. Especially in BC2 the effect is terrible and I can't hit anyone. The difference is night and day. However, I never notice tearing...

ArmedandDangerous - Sunday, December 15, 2013
You need a GPU that can do a minimum of 60fps, not an "average" 60fps, because any time your FPS drops below 60, you WILL experience stutter.
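A tiny illustration of that distinction (hypothetical frame times, not measured data): an "average 60fps" trace can hide dips that v-sync turns into multi-refresh holds.

```python
# 58 fast frames plus two slow ones: averages above 60fps, yet dips to 20fps
frame_times_ms = [15.0] * 58 + [40.0, 50.0]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000 / max(frame_times_ms)
print(f"average: {avg_fps:.0f}fps, minimum: {min_fps:.0f}fps")  # 62 / 20
# on a fixed 60Hz v-synced panel each of those slow frames is held on
# screen for 3 refresh intervals -- the stutter you feel despite the
# healthy-looking average
```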