NVIDIA G-Sync Review
by Anand Lal Shimpi on December 12, 2013 9:00 AM EST

How it Plays
The requirements for G-Sync are straightforward. You need a G-Sync enabled display (in this case the modified ASUS VG248QE is the only one “available”, more on this later). You need a GeForce GTX 650 Ti Boost or better with a DisplayPort connector. You need a DP 1.2 cable, a game capable of running in full screen mode (G-Sync reverts to V-Sync if you run in a window) and you need Windows 7 or 8.1.
G-Sync enabled drivers are already available at GeForce.com (R331.93). Once you’ve met all of the requirements you’ll see the appropriate G-Sync toggles in NVIDIA’s control panel. Even with G-Sync on you can still control the display’s refresh rate. To maximize the impact of G-Sync NVIDIA’s reviewer’s guide recommends testing v-sync on/off at 60Hz but G-Sync at 144Hz. For the sake of not being silly I ran all of my comparisons at 60Hz or 144Hz, and never mixed the two, in order to isolate the impact of G-Sync alone.
NVIDIA sampled the same pendulum demo it used in Montreal a couple of months ago to demonstrate G-Sync, but I spent the vast majority of my time with the G-Sync display playing actual games.
I’ve been using Falcon NW’s Tiki system for any experiential testing ever since it showed up with NVIDIA’s Titan earlier this year. Naturally that’s where I started with the G-Sync display. Unfortunately the combination didn’t fare all that well, with the system exhibiting hard locks and very low in-game frame rates with the G-Sync display attached. I didn’t have enough time to further debug the setup and plan on shipping NVIDIA the system as soon as possible to see if they can find the root cause of the problem. Switching to a Z87 testbed with an EVGA GeForce GTX 760 proved to be totally problem-free with the G-Sync display thankfully enough.
At a high level the sweet spot for G-Sync is going to be a situation where you have a frame rate that regularly varies between 30 and 60 fps. Game/hardware/settings combinations that result in frame rates below 30 fps will exhibit stuttering since the G-Sync display will be forced to repeat frames, and similarly if your frame rate is equal to your refresh rate (60, 120 or 144 fps in this case) then you won’t really see any advantages over plain old v-sync.
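To make that sweet spot a little more concrete, here is a minimal sketch (my own simplified model, not NVIDIA's actual implementation) of how long a single frame ends up being held on screen under v-sync versus G-Sync on a 60Hz panel. The constants and the repeat-frame behavior below 30 fps are simplifying assumptions for illustration only:

```python
import math

VSYNC_INTERVAL_MS = 1000.0 / 60.0  # fixed 16.7 ms refresh period at 60Hz
GSYNC_MIN_MS = 1000.0 / 60.0       # the panel can't refresh faster than 60Hz
GSYNC_MAX_MS = 1000.0 / 30.0       # longest the module holds a frame before repeating it

def vsync_hold_ms(render_ms):
    # With v-sync the finished frame waits for the next fixed refresh boundary,
    # so a 17 ms frame is held for a full 33.3 ms - that jump is the visible stutter.
    return math.ceil(render_ms / VSYNC_INTERVAL_MS) * VSYNC_INTERVAL_MS

def gsync_hold_ms(render_ms):
    # With G-Sync the panel refreshes the moment the frame is ready,
    # as long as the frame time falls inside the panel's supported window.
    if render_ms <= GSYNC_MIN_MS:      # above 60 fps: behaves just like v-sync
        return GSYNC_MIN_MS
    if render_ms <= GSYNC_MAX_MS:      # the 30 - 60 fps sweet spot
        return render_ms
    # Below 30 fps the module has to repeat the previous frame, so the new frame
    # once again lands on a scan boundary and judder returns (crude approximation).
    return math.ceil(render_ms / GSYNC_MIN_MS) * GSYNC_MIN_MS

for ms in (15.0, 17.0, 25.0, 33.0, 40.0):
    print(f"{ms:4.1f} ms frame -> v-sync holds it {vsync_hold_ms(ms):4.1f} ms, "
          f"G-Sync holds it {gsync_hold_ms(ms):4.1f} ms")
```

Between roughly 17 ms and 33 ms per frame the G-Sync column simply tracks the render time, which is exactly the range where the improvement is most visible.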
I've put together a quick 4K video showing v-sync off, v-sync on and G-Sync on, all at 60Hz, while running Bioshock Infinite on my GTX 760 testbed. I captured each video at 720p60 and put them all side by side (thus making up the 3840 pixel width of the video). I slowed the video down by 50% in order to better demonstrate the impact of each setting. The biggest differences tend to be at the very beginning of the video. You'll see tons of tearing with v-sync off, some stutter with v-sync on, and a much smoother overall experience with G-Sync on.
While the comparison above does a great job showing off the three different modes we tested at 60Hz, I also put together a 2x1 comparison of v-sync and G-Sync to make things even more clear. Here you're just looking for the stuttering on the v-sync setup, particularly at the very beginning of the video:
Assassin’s Creed IV
I started out playing Assassin’s Creed IV multiplayer with v-sync off. I used GeForce Experience to predetermine the game quality settings, which ended up being maxed out even on my GeForce GTX 760 test hardware. With v-sync off and the display set to 60Hz, there was just tons of tearing everywhere. In AC4 the tearing was arguably even worse as it seemed to take place in the upper 40% of the display, dangerously close to where my eyes were focused most of the time. Playing with v-sync off was clearly not an option for me.
Next was to enable v-sync with the refresh rate left at 60Hz. Much of AC4 renders at 60 fps, although in some scenes, both outdoors and indoors, I saw frame rates drop down into the 40 - 51 fps range. Here with v-sync enabled I started noticing stuttering, especially as I moved the camera around and the complexity of what was being rendered varied. In some scenes the stuttering was pretty noticeable. I played through a bunch of rounds with v-sync enabled before enabling G-Sync.
I enabled G-Sync, once again leaving the refresh rate at 60Hz and dove back into the game. I was shocked; virtually all stuttering vanished. I had to keep FRAPS running to remind me of areas where I should be seeing stuttering. The combination of fast enough hardware to keep the frame rate in the G-Sync sweet spot of 40 - 60 fps and the G-Sync display itself produced a level of smoothness that I hadn’t seen before. I actually realized that I was playing Assassin’s Creed IV with an Xbox 360 controller literally two feet away from my PS4 and having a substantially better experience.
Batman: Arkham Origins
Next up on my list was Batman: Arkham Origins. I hadn’t played the past couple of Batman games but they always seemed interesting to me, so I was glad to spend some time with this one. Having skipped the previous ones, I obviously didn’t share the repetitive/unoriginal criticisms of the game that some others seemed to have had. Instead I enjoyed its pace and thought it was a decent way to kill some time (or in this case, test a G-Sync display).
Once again I started off with v-sync off and the display set to 60Hz. For a while I didn’t see any tearing, at least not until I ended up inside a tower during the second mission of the game. I was panning across a small room and immediately encountered a ridiculous amount of tearing. This was even worse than Assassin’s Creed. What’s interesting about the tearing in Batman is that it felt more limited in frequency than in AC4’s multiplayer, but when it did happen it was substantially worse.
Next up was v-sync on, once again at 60Hz. Here I noticed sharp variations in frame rate resulting in tons of stutter. The stutter was pretty consistent both outdoors (panning across the city) and indoors (while fighting large groups of enemies). I remember seeing the stutter and noting that it was just something I’d come to expect. Traditionally I’d fight this on a 60Hz panel by lowering quality settings to spend more time at 60 fps. With G-Sync enabled, it turns out I wouldn’t have to.
The improvement to Batman was insane. I kept expecting it to somehow not work, but G-Sync really did smooth out the vast majority of stuttering I encountered in the game - all without touching a single quality setting. You can still see some hiccups, but they are the result of other things (CPU limitations, streaming textures, etc…). That brings up another point about G-Sync: once you remove GPU/display synchronization as a source of stutter, all other visual artifacts become even more obvious. Things like aliasing and texture crawl/shimmer become even more distracting. The good news is you can address those things, often with a faster GPU, which all of a sudden makes the G-Sync play an even smarter one on NVIDIA’s part. Playing with G-Sync enabled raises my expectations for literally all other parts of the visual experience.
Sleeping Dogs
I’ve been wanting to play Sleeping Dogs ever since it came out, and the G-Sync review gave me the opportunity to do just that. I like the premise and the change of scenery compared to the sandbox games I’m used to (read: GTA), and at least thus far I can put up with the not-quite-perfect camera and fairly uninspired driving feel. The bigger story here is that running Sleeping Dogs at max quality settings gave my GTX 760 enough of a workout to really showcase the limits of G-Sync.
With v-sync (60Hz) on I typically saw frame rates around 30 - 45 fps, but there were many situations where the frame rate would drop down to 28 fps. I was really curious to see what the impact of G-Sync was here since below 30 fps G-Sync would repeat frames to maintain a 30Hz refresh on the display itself.
The first thing I noticed after enabling G-Sync is my instantaneous frame rate (according to FRAPS) dropped from 27-28 fps down to 25-26 fps. This is that G-Sync polling overhead I mentioned earlier. Now not only did the frame rate drop, but the display had to start repeating frames, which resulted in a substantially worse experience. The only solution here was to decrease quality settings to get frame rates back up again. I was glad I ran into this situation as it shows that while G-Sync may be a great solution to improve playability, you still need a fast enough GPU to drive the whole thing.
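As a rough illustration of why a small fixed cost per frame hurts most when frame times are already long, here is a back-of-the-envelope sketch. The overhead values are hypothetical, chosen only to show the direction of the effect rather than to reproduce NVIDIA's actual polling cost:

```python
def effective_fps(base_fps, overhead_ms):
    # Add a fixed per-frame cost to the GPU's own render time and
    # convert back to frames per second.
    frame_ms = 1000.0 / base_fps
    return 1000.0 / (frame_ms + overhead_ms)

for overhead_ms in (1.0, 2.0, 3.0):
    print(f"28 fps + {overhead_ms:.0f} ms/frame of overhead -> "
          f"{effective_fps(28, overhead_ms):.1f} fps")
```

At 28 fps each frame already takes about 35.7 ms, so even a couple of milliseconds of extra work per frame is enough to pull the observed rate from the high 20s into the mid 20s - the same direction as the FRAPS drop described above.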
Dota 2 & Starcraft II
The impact of G-Sync can also be reduced at the other end of the spectrum. I tried both Dota 2 and Starcraft II with my GTX 760/G-Sync test system and in both cases I didn’t have a substantially better experience than with v-sync alone. Both games ran well enough on my 1080p testbed to almost always be at 60 fps, which made v-sync and G-Sync interchangeable in terms of experience.
Bioshock Infinite @ 144Hz
Up to this point all of my testing kept the refresh rate stuck at 60Hz. I was curious to see what the impact would be of running everything at 144Hz, so I did just that. This time I turned to Bioshock Infinite, whose integrated benchmark mode is a great test as there’s tons of visible tearing or stuttering depending on whether or not you have v-sync enabled.
Increasing the refresh rate to 144Hz definitely reduced the amount of tearing visible with v-sync disabled. I’d call it a substantial improvement, although not quite perfect. Enabling v-sync at 144Hz got rid of the tearing but still kept a substantial amount of stuttering, particularly at the very beginning of the benchmark loop. Finally, enabling G-Sync fixed almost everything. The G-Sync on scenario was just super smooth with only a few hiccups.
What’s interesting to me about this last situation is that if 120/144Hz reduces tearing to the point where you’re OK with it, G-Sync may be a solution to a problem you no longer care about. If you’re hypersensitive to tearing, however, there’s still value in G-Sync even at these high refresh rates.
193 Comments
Jelic - Thursday, December 12, 2013 - link
Hmm, interesting article. I actually currently have the ASUS VG248QE. While gsync sounds intriguing, what I find even more promising is the use of lightboost to give CRT like quality to the panel. With my current setup I have a GTX 680 with the max framerate limited via EVGA's OC tool to 120fps. On a 1080p screen, with 120hz refresh rate, and 2d lightboost enabled you get absolutely no motion blur, very little tearing, and overall an amazing gaming experience. Since you have the hardware already, I'd be interested in hearing your opinion on 2d lightboost + gsync (at 120hz), and if that makes any difference. Also I'd love it if Anandtech did an article on lightboosted monitors as well! My ideal monitor would be something like a 27in 2560x1600 IPS panel with 120hz lightboost supported... of course I'd need something like dual 780s to get the most out of it, but it'd be well worth it to me heh.

DesktopMan - Friday, December 13, 2013 - link
Lightboost doesn't work well on low framerates since you'd see the backlight flicker. If you flicker it more than once per frame you introduce retina blur again. It works best at high, stable framerates. G-Sync would still be useful with lightboost if your framerate hovers between 60 and 120 though.

mdrejhon - Friday, December 13, 2013 - link
Just so all readers know, the great news is there are several different strobe backlights now:
- LightBoost
- Official sequel to LightBoost (coming with G-SYNC monitors), mentioned by John Carmack
- EIZO's FG2421
- BENQ Blur Reduction (behaves like LightBoost, but via monitor menus)
- Sony's Motionflow "Impulse" (GAME MODE strobe backlight, low lag, no interpolation)
Some of them darken a lot, and others darken less. Some have better contrast ratios, and much better colors. Some of them (BENQ Z-series) can strobe at 75Hz and 85Hz, if you want zero motion blur with a bit less GPU horsepower. Some of them are zero-ghost (no double-image effect). But you can't "get it all" simultaneously.
From my experience playing on the EIZO FG2421 (warmed up after 30 mins to reduce VA ghosting on cold panels), it's lovely to have a bright and colorful picture, something that LightBoost has difficulty with. The VA panel ghosts a bit more (until it warms up), but when I sustain 120fps@120Hz (Bioshock Infinite, VSYNC ON on a GeForce Titan), it produces spectacular motion quality, the most CRT-like quality I have ever seen.
Now, if I fall below 100fps a lot, like Battlefield 4, I prefer G-SYNC because it does an amazing job of eliminating stutters during fluctuating framerates.
blackoctagon - Sunday, December 15, 2013 - link
And does G-Sync offer any benefit if you're ALREADY at 120fps@120Hz? Because, if so, surely someone needs to review the VG248QE with both G-Sync and LightBoost enabled at the same time :)

web-cyborg - Thursday, December 12, 2013 - link
All of those articles focus on the variable hz function of g-sync and not the supposed "superior to lightboost" backlight strobing option. The articles say "30 to 40 fps is 'fine'", with 40 being the sweet spot. I would disagree. These same people complain about marginal input lag milliseconds, yet accept long "freeze-frame" milliseconds with open arms in order to get more eye candy. I think people will be cranking up their graphics settings and getting 30 - 40fps. At 30fps you are frozen on the same frame of world action for 33.3ms while the 120hz+120fps user sees 4 game world action update "slices". At 40fps you are seeing the same frozen slice of game world action for 25ms, while the 120hz+120fps user sees 3 action slice updates. This makes you see new action later and gives you fewer opportunities to initiate action (less "dots per dotted line length"), and then you add input lag to the already out-of-date game world state you are acting on. Additionally, the higher hz + higher frame rates provide aesthetically smoother control and higher motion and animation definition. Of course 120hz also cuts the continual FoV movement blur of the entire viewport by 50% (vs 60hz baseline full smearing "outside of the lines" blur) as well, and backlight strobing at high hz essentially eliminates FoV blur (eizo FG2421 now, "superior to lightboost" backlight strobing mode of g-sync monitors in the future supposedly).

web-cyborg - Thursday, December 12, 2013 - link
60hz vs 120hz vs backlight strobing. Note that newer monitors like the eizo FG2421 and the future "superior to lightboost" backlight functionality of g-sync's strobe mode (unfortunately mutually exclusive with the variable hz mode) do not/will not suffer the lowered brightness and muted colors of the lightboost "hack" shown in these examples. However, they will still eliminate the blur which is shown in these examples.

http://www.blurbusters.com/faq/60vs120vslb/
Now remember that in reality it's not just a single simple cell-shaded cartoon object moving across your screen; rather, your entire 1st/3rd person viewport of high detail textures, depth via bump mapping, "geography"/terrain, architectures and creatures is smeared "outside of the lines" or "shadow masks" of everything on screen every time you move your FoV at 60hz, smeared more within the "shadow masks" of onscreen objects at 120hz (but still losing all detail, textures and bump mapping), and essentially not blurred at all when using backlight strobing over 100hz.
web-cyborg - Thursday, December 12, 2013 - link
I'm more interested in high fps and zero blur obviously, even if I have to turn down the ever-higher *arbitrarily set by devs* graphics ceiling "carrot" that people keep chasing (that ceiling could be magnitudes higher if they wanted). I still play some "dated" games too... fps is high.
You are seeing multiple frames skipped and are behind a 120hz+120fps user, watching "freeze-frames" for 25ms to 33.3ms at 40fps and 30fps, and every time you move your FoV you are smearing the entire viewport into what can't even be defined as a solid grid resolution to your eyes/brain. So much for high rez.
I think people are sacrificing a lot, motion-, animation-, and control-wise, as well as sacrificing seeing action sooner and getting more and earlier opportunities to initiate actions - all to reach for higher still-detail eye candy.
You don't play a screen shot :b
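For reference, the frame-time arithmetic in the comment above works out as in this quick sketch, which just converts frame rates into how long each frame persists on screen and how many 120Hz refreshes fit inside that window:

```python
def frame_hold_ms(fps):
    # How long a single rendered frame persists on screen at a given frame rate.
    return 1000.0 / fps

for fps in (30, 40, 60, 120):
    hold = frame_hold_ms(fps)
    slices_at_120hz = hold / frame_hold_ms(120)
    print(f"{fps:3d} fps: each frame persists {hold:5.1f} ms "
          f"(~{slices_at_120hz:.0f} update 'slices' for a 120Hz+120fps player)")
```

This reproduces the 33.3 ms / four slices and 25 ms / three slices figures quoted above.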
mdrejhon - Thursday, December 12, 2013 - link
Hello fellow guys at AnandTech -- I've created a new TestUFO animation (via software interpolation) that simulates the smooth framerate ramping that G-SYNC can do:

http://www.testufo.com/stutter#demo=gsync
It shows off stutter-free frame rate variances as well. I created this unique animation (I think, the only one of its kind on the Internet), for the Blur Busters preview of G-SYNC.
HisDivineOrder - Thursday, December 12, 2013 - link
The "biggest issue with what we have here today" is not that it's nVidia only. That's a big issue, to be sure.The biggest issue is that there are a LOT of us who have fantastic displays that we paid high dollar for and will not go down to 16:9 or TN panels. Hell, a lot of us won't even go and spend the same money we just spent on our incredibly expensive and incredibly hard to resell monitors to get this technology that should have 1) been included from the start in the LCD spec and 2) should have a way of being implemented that involves something other than tossing our old monitor in the bin.
They need to make an adapter box for monitors without built-in scalers that translates what they're doing to DVI-D. Else, there's a LOT of people who won't be seeing this technology have any use until they get around to making 4K monitors that include it with IPS and at an even semi-reasonable price.
Really, the biggest problem is they didn't find a way to adapt it for all monitors.
web-cyborg - Friday, December 13, 2013 - link
In regard to the backlight strobing functionality, the eizo FG2421 is a high hz VA panel whose backlight strobing "zero blur" capability is independent of gpu camps.

We are talking about gaming usage. Practically all 1st/3rd person games use HOR+ / virtual cinematography, which means you see more of a scene in 16:9 mode, even if you have to run 16:9 mode on a 16:10 monitor. 16:10 mode basically cuts the sides off.
http://www.web-cyb.org/images/lcds/HOR-plus_scenes...
Gpu upgrades can run $500 - $1000 now too for high end, and somewhere in between or double that for dual gpus. 16:10 vs 16:9 is really a bigger deal at 1080 vs 1200, even for desktop use. A 16:10 30" is not as much of a real-estate difference vs a 2560x 27" as the size suggests; the 30" pixels are a lot larger. Here is a graphic I made to show three common resolutions compared at the same ppi, or equivalent perceived ppi at viewing distance.
http://www.web-cyb.org/images/lcds/4k_vs_27in_vs_3...
Imo for the time being you are better off using two different monitors, one for gaming and one for desktop/apps, instead of trying to get both in one monitor and ending up with worse performance/greater trade-offs combined in one (e.g. 60hz vs 120hz, lack of backlight strobing or gsync, resolutions too high to maintain high fps at high+/ultra gfx settings relative to your gpu budget, resolutions too low for quality desktop/app usage, lots of tradeoffs, etc).
Upgrades to display and gpu technology are the nature of the beast really. Up until now you would be better off getting a korean knock-off 2560x1440 ips or the american mfg versions for $400 or less, and putting a good 120hz gaming monitor next to it, imo. The Eizo FG2421 24" VA backlight strobing model is around $500, so for $900+ (and a good gpu of course) you could have the better of both worlds pretty much for the time being. Going forward we know g-sync will have backlight strobing functionality, but we don't know if any of the higher resolution monitors due to come out with g-sync will have the 100hz+ required to support strobing adequately. If they don't, we are back to major tradeoffs between gaming and desktop use again (low hz -> low motion+animation definition/far fewer game action updates shown per second/lower control definition, full 60hz baseline smear blurring out all detail and textures during continual FoV movement/motion flow).