Quality time with G-Sync
We spent some time with the new NVIDIA G-Sync prototype monitor and came away just as impressed as we did in Montreal.
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of running with V-Sync off along with the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed. That led up to an explanation of how G-Sync works, including its integration via extended Vblank signals, and detailed how NVIDIA is enabling the graphics card to retake control of the entire display pipeline.
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better. It is over an hour long, but I selfishly believe it is the most concise and well-put-together collection of information about G-Sync for our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays we received this week are modified versions of the 144 Hz ASUS VG248QE gaming panel, the same model that will, in theory, be upgradeable by end users sometime in the future. These are 1920×1080 TN panels and, though they have incredibly high refresh rates, they aren't usually regarded as the highest image quality displays on the market. However, the story of what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.
One more thing worth noting right away is that performance testing with G-Sync displays takes on a wildly different spin. As you know, PC Perspective has adopted our own Frame Rating graphics testing process that uses direct capture from a GPU running a game into a hardware capture system, with an overlay that we then post-process and analyze to get real-world performance data that you cannot get with software like FRAPS. The downside of that method is that it currently requires DVI connectivity, which hasn't been a problem since all graphics cards support DVI today. But G-Sync is a DisplayPort-exclusive feature, meaning that we cannot currently use our Frame Rating capture systems. We are working with some partners to enable DP 1.2 capture and thus performance testing, but there are other hiccups in the way. We are definitely working on those issues and I think we'll have them solved in early 2014.
That being said, performance with G-Sync, in terms of frames per second or frame times, is going to be closely analogous to a current monitor running with V-Sync disabled. Today's monitors display at a fixed refresh rate, and when a new frame is completed the graphics card simply replaces part of the buffer and the data is sent immediately to the screen, resulting in the horizontal tear I am sure everybody here is familiar with.
With G-Sync, as soon as a frame is done, the driver polls the display to check whether it is in the middle of a scan. If it is, the GPU waits; this poll takes about 1 ms to complete. The driver then tells the monitor to prepare for a new frame by sending the Vblank signal it has been holding back. The end result is that the performance of a G-Sync monitor and enabled system will very closely mirror, in terms of frames per second, a standard configuration with V-Sync disabled. The benefit, of course, is that you no longer have any of the distracting, distorting horizontal tearing on the display.
Because of that polling time, NVIDIA did warn us that there is currently a 1-2% performance delta between V-Sync off frame rates and G-Sync enabled frame rates. G-Sync is a little bit slower because of that polling time that Tom Petersen indicated was in the 1ms area. Interestingly though, they did say they were planning to improve that time to basically 0ms with some driver updates once monitor partners begin to ship production units.
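To make those mechanics a little more concrete, here is a minimal Python sketch of the three presentation models as described above. It is only an illustration of the behavior, not NVIDIA's implementation; the render times and the flat 1 ms poll cost are assumptions for the example.

```python
# Illustrative model only: how a finished frame reaches the screen under
# V-Sync on, V-Sync off and G-Sync. The render times and the flat 1 ms poll
# cost are invented for the example, not measured values.
import math

REFRESH_MS = 1000.0 / 60.0   # fixed 60 Hz scan-out interval
POLL_COST_MS = 1.0           # rough stand-in for the poll overhead described above

def vsync_on(render_ms):
    """The frame waits for the next refresh boundary, so frame times snap to 16.7/33.3 ms."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vsync_off(render_ms):
    """The buffer is swapped mid-scan: frame time equals render time, but the image tears."""
    return render_ms

def gsync(render_ms):
    """The display refreshes when the frame is ready; only the poll overhead is added."""
    return render_ms + POLL_COST_MS

for render_ms in (14.0, 20.0, 23.0, 30.0):
    print(f"render {render_ms:5.1f} ms | vsync on {vsync_on(render_ms):5.1f} | "
          f"vsync off {vsync_off(render_ms):5.1f} | g-sync {gsync(render_ms):5.1f}")
```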
So performance results here are going to be very minimal, and in fact we are only going to show you a handful of graphs. We are going to show you V-Sync on versus V-Sync off, where the V-Sync off results emulate the performance of G-Sync, though G-Sync delivers it without the visual anomalies associated with disabling V-Sync. In the graphs below we are using our standard GPU test bed with an SNB-E platform and processor, 16GB of DDR3 memory and an SSD, and we are testing with a single GeForce GTX 760 2GB reference card using the latest NVIDIA 331.93 beta drivers. The two sets of results you see are Frame Rating captured results, one with V-Sync enabled and one with it disabled.
NVIDIA's G-Sync Demo Application
I decided to use the GeForce GTX 760 graphics card as it is a very common, mainstream GPU and it also allows us to find instances in games where G-Sync is very effective. In scenarios where you are gaming on a 60 Hz monitor and you are running enough graphics hardware to keep the frame rate over 60 FPS 100% of the time, it is true that many of the benefits of G-Sync will be lost. However, I will argue that even dropping under that 60 FPS mark for 5% of your game time results in a sub-par experience for the gamer. Take into account that even the mighty GeForce GTX 780 Ti can be brought to its knees at 1920×1080 with the highest quality settings in Crysis 3, and I think that G-Sync technology will be useful for mainstream and enthusiast gamers alike.
Also note that higher resolution displays are likely to be shown at CES for 2014 release.
The results will show you instantaneous frame rates, average frame rates over time and how variance is affected, in an attempt to demonstrate how stutter and frame time variance can affect your actual user experience. This is very similar to how we have tested SLI and CrossFire over the past two years, helping to showcase visual experience differences in a numeric, quantitative fashion. It's difficult to do, no doubt, but we believe that attempting it is required for a solid overview of G-Sync technology.
Based on the first graph, you might think that the experience of playing Crysis 3 (High, High, 4xMSAA settings) would be the same with V-Sync enabled or disabled, but it clearly is not. Even though the average frame rates are nearly the same, the second graph, which shows the instantaneous frame times, tells a different story. The black line representing the V-Sync disabled results shows a rather smooth transition of frame rates from the 0 second mark through the 60 second mark, with a couple of hiccups along the way.
The orange line that shows the V-Sync enabled result is actually oscillating very quickly back and forth between a 16.6 ms and a 33.3 ms frame time, essentially hitting either a 60 FPS or a 30 FPS mark at any given moment. The result is unsmooth animation – the human eye is quite adept at seeing variances in patterns, and the "hitching" or stutter that appears when the game transitions between these two states is explained very well in our interview with Tom Petersen above.
A zoomed-in graph (just the first 3 seconds) shows the back and forth frame times more clearly. The orange line shows a few frames at 16.6ms (great!) and then a spike to 33.3ms (not great), repeating over and over. The black line shows a more regular and consistent frame time of anywhere from 20-23ms.
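For readers who want to run a similar sanity check on their own frame-time logs, a rough sketch of the variance analysis behind these graphs might look like the following. The frametimes.csv file name and its one-column, milliseconds-per-frame format are assumptions for the example, not part of our actual Frame Rating tooling.

```python
# Rough sketch: quantify the back-and-forth stutter visible in the zoomed graph.
# frametimes.csv is a hypothetical one-column log of frame times in milliseconds.
import csv
import statistics

def load_frame_times(path="frametimes.csv"):
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def summarize(times):
    deltas = [abs(b - a) for a, b in zip(times, times[1:])]
    return {
        "avg_ms": statistics.mean(times),
        "stdev_ms": statistics.pstdev(times),
        # large frame-to-frame swings (e.g. 16.7 ms <-> 33.3 ms) read as stutter
        "avg_swing_ms": statistics.mean(deltas) if deltas else 0.0,
        "worst_swing_ms": max(deltas) if deltas else 0.0,
    }

if __name__ == "__main__":
    for name, value in summarize(load_frame_times()).items():
        print(f"{name}: {value:.1f}")
```

A trace that oscillates between 16.7 ms and 33.3 ms will show an average swing near a full refresh interval, while the smoother V-Sync off trace above stays much lower.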
It would seem obvious then that in this case, where performance is not able to stay above the refresh rate of the panel, the black line represents the better, smoother experience. However, on a standard monitor that result meant disabling V-Sync and accepting horrible tearing across the screen. G-Sync offers nearly the same performance levels as the V-Sync off result but without the horizontal tearing.
Another interesting angle on this debate is that with V-Sync off, and by analogy with G-Sync enabled, you are getting the full performance out of your graphics card 100% of the time. Your GPU is not waiting on monitor refresh cycles to begin outputting a frame or to begin rendering the next frame. Gamers have been getting this for years by disabling V-Sync, but tearing was again the result. It was a trade-off between frame rate, responsiveness and tearing versus stutter, input latency and a tear-free image.
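As a back-of-the-envelope illustration of the responsiveness side of that trade-off, the snippet below models the extra wait a fixed 60 Hz refresh can impose on a frame that finishes just after a scan has begun. It is a simplification that ignores driver buffering and render-ahead queues.

```python
# Simplified model of the added latency a fixed 60 Hz refresh can introduce:
# a frame that finishes just after a refresh boundary must wait for the next one.
REFRESH_MS = 1000.0 / 60.0  # 16.7 ms scan-out interval

def wait_for_refresh(finish_offset_ms):
    """Time a completed frame sits idle before the next refresh begins."""
    return (REFRESH_MS - finish_offset_ms % REFRESH_MS) % REFRESH_MS

print(f"frame done 1 ms after a refresh starts: waits {wait_for_refresh(1.0):.1f} ms")
print(f"frame done exactly on a refresh boundary: waits {wait_for_refresh(0.0):.1f} ms")
```

With a variable refresh, the display simply starts a new scan when the frame arrives, so that idle window disappears.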
Is there any possibility that existing monitors could be modified to support G-Sync? I have 3 Dell U2410s, but more importantly a Dell U3014. It would suck to have to change all of these monitors to get G-Sync support.
Not likely, as all monitors are different. If the market for a specific monitor is large there might be a mod kit; otherwise no, you will need to buy a new one that has it built in.
I think really NVIDIA only has plans to allow upgrades to this ONE monitor.
Is there any talk of making this an open standard for AMD/Intel to use? Otherwise I fear this will be another useless, but visually impressive, piece of technology that is doomed to failure. Case in point: Apple and their FireWire.
Not really…that's not usually a direction NVIDIA takes. Which is unfortunate, I agree.
I don't think it will fail. It seems like at some point in the next few years monitor makers will hopefully start to ditch the fixed refresh rate, which is not needed anymore.
I don't know, but I believe Thunderbolt is on life support…
But congrats to those who can get it.
We were promised the DIY kit for December. As G-Sync is currently the only thing on my Christmas wish list, I would be rather disappointed if we don't get the kits by year's end.
Already set with my VG248QE and two 780’s to push out that blistering 144 Hz refresh rate without having to look at stuttering when it dips. Just need my kit!
Trust me, I am trying to get information from them on this. 🙂
So, I’m still a little confused about whether G-Sync matters in my situation.
I have that 144 Hz VG248QE monitor, but when I play games I turn all the settings up, so even with my Titan card, I never find any games that push me over 144 frames per second (if I did get tearing, I’d just turn up the AA even higher).
So I run with V-sync off, and don't get tearing anyway since my monitor has such a high refresh rate to accommodate the frames coming out.
Does G-Sync do anything for me?
Yes, it still will. You may not have MUCH of it, but you have visual tearing even at that high of a refresh because the rates between the GPU and display don't match up perfectly.
I'm actually really curious to see how this compares to Adaptive V-Sync that came out a year or two ago.
Is there a big enough difference to warrant getting G-Sync compatible hardware vs. sticking with current hardware and Adaptive V-Sync?
Adaptive Vsync turns off Vsync under your maximum refresh rate, resulting in the horizontal tearing.
Looks great, but effectively adds the cost of a new monitor or GPU onto a purchase. I’ll wait until the technology is incorporated into the standard. Those that can afford new monitor/GPUs, have fun.
Seconded. If this becomes an open standard I'll gladly buy. Till then NVIDIA can take its closed standard and shove it.
Depends on how you look at it; AMD's stuff is closed as well. AMD knows NVIDIA will never use things like Mantle in their GPUs, so making it open is really nothing more than a PR move, since even if NVIDIA did add it, it would probably require special hardware in new cards to support it. I am sure NVIDIA would be willing to license the tech, but AMD, as with PhysX, won't want to license it. If NVIDIA made it open and AMD used it, no one would bat an eye; if AMD made something open and NVIDIA used it, it would be massive news because they used AMD tech.
Sorry to rain on your parade, but NVIDIA is in the freaking VESA display group; they can propose to add this to DP/HDMI/DVI. The sad truth is they are not, and are using it as a lock-in for their product stack. API wars aside, if Mantle pans out NVIDIA can adopt it and use it, since it's an open standard. As for PhysX and CUDA, they could have run on AMD GPUs, but they are not open standards, so why adopt something proprietary? The same goes for G-Sync: it could be a great help to the gaming world as part of DP/HDMI/DVI, but it is not; it's going to be just another useless thing NVIDIA uses to try to push its product stack over its competitors.
Those of you running the Asus VG248QE (or Benq XL2420’s) must do yourselves a favor and check out LightBoost http://www.blurbusters.com/zero-motion-blur/lightboost/. On lightboost-approved monitors, you can enable buttery smooth motion with a simple free software solution (no extra hardware needed). The best example of LB vs. no-LB is by viewing this moving picture – http://www.testufo.com/#test=photo&photo=toronto-map.png&pps=960&pursuit=0. Don’t stare straight at the monitor, but track the moving map with your eyes. Non-LB monitors will have a very blurry image, and when LB is enabled, the moving image will be crystal clear.
G-Sync sounds like a great option for those whose framerates can vary between 30 and 120 fps, but if you have a proper gaming system and can keep your frames at or above 90 fps, you really (I can't emphasize this enough) need to experience an FPS game on a LightBoost-enabled monitor. Battlefield 4 is a completely different game.
Love the idea behind the tech, and I don’t even mind the price premium, but I will not go back to a TN panel. Until there is an IPS option I’m out.
This is exactly where I stand. I’d gladly grab a pair of new Ultra Sharps with G-Sync
SOON! 😀
Will the G-Sync upgrade module be compatible with Asus VG278H 3D Vision 2 monitors?
I do not know that…
Doesn’t G-Sync require a Kepler-based GPU? In this case the 770 and 760 would not work for G-Sync. Only the 780 and up.
No, the 600-series was Kepler as well.
When they first showed the public what G-Sync is capable of in Montreal, they were using a GTX 760. For the 600 series, some of the low-end cards were based on Fermi, but the GTX 650 and above are Kepler based. The GT 640 has both Fermi and Kepler versions.
Why does the module have on-board memory?
Is it just another buffer “on screen” and if so, what is it buffering?
It uses the memory to store the most recent frame in case the FPS drops below 30 FPS and it needs to re-scan that frame back onto the display.
Why does the monitor have to re-scan a frame if the FPS drops below 30? Sorry if this has been answered elsewhere, but I thought the set-and-hold quality of lcd displays would negate the need for a minimum refresh rate.
Because the image on the LCD does eventually deteriorate if it isn't refreshed after a period of time. NVIDIA found that time to be short enough that the module re-scans the stored frame whenever the frame rate drops below roughly 30 FPS.
My AMD 5850 keeps locking up and crashing my PC.
It happens randomly after working for a long time… I know what brand I'm NOT BUYING to replace it.
So what's up with NVIDIA already having G-Sync on their Quadro cards? Has NVIDIA already been doing this, in software? They even have G-Sync 2… I haven't used a Quadro card so I'm unfamiliar with it firsthand, but them using the same name for this confused me a little bit. Do the two work together, or does one cancel out the other?
Thought I'd throw that out there because I'm curious myself.
The previous G-Sync technology on the Quadro line, though it still exists, is actually used for multiple display synchronization.
"For example, in order to produce quality video recording of animated
graphics, the display graphics must be synchronized with the video camera.
As another example, applications presented on multiple displays must be
synchronized in order to complete the illusion of a larger, virtual canvas."
It's not doing the same thing as the new G-Sync here. Bad naming choice, yes. :/
The module with the G-Sync on looked like stutter to me LOL
Not sure I follow…?
It seems it only works well in some games. Which is a big FAIL!!!
Anandtech points out that at 60 Hz, if the FPS is too low or too high, it has issues and the experience is worse.
Just spend the extra money on a better GPU to play at 60 Hz.
Another letdown from NVIDIA.
This is the expensive hardware version of GeForce Experience.
Wow, you are so full of $hit!
Don't really understand what you are saying. Do you have links and quotes to the parts of Anand's story you are referring to?
Thx for clarification.
The sad thing is that it doesn’t support surround yet I believe?
Not yet, but it's new tech, so I'm sure it will happen when it matures.
It does not yet, but it's more of an issue with the DisplayPort requirement. Do you know of any Kepler graphics cards with 3 DisplayPorts on them? Not yet you don't!
Doesn’t DisplayPort support daisy-chaining for multiple monitors? You should only need 1 DisplayPort.
I agree that switching between 30 and 60 fps when V-Sync is on produces annoying stuttering. Enabling triple buffering lets you have V-Sync and any fps value between 30 and 60. Would it be possible to compare G-Sync with V-Sync plus triple buffering?
Triple buffering doesn't allow frame rates between 30-60, at least not in REALITY, though perhaps in what FRAPS reports.
Take note of the Crysis 3 benchmarks I showed you in my story – it does use triple buffering, and still, with V-Sync enabled, frames only ever appear at 16 ms or 33 ms.
Ryan, thanks for the reply. Technically you are right. The weird thing is that playing Assassin's Creed 4 with triple buffering enabled, compared to disabled, is a very different experience when the fps is between 30 and 60. With triple buffering enabled the gameplay is noticeably smoother. I always thought the increased smoothness had to do with less jumping between 30 and 60 fps.
Anandtech – NVIDIA G-Sync Review
http://anandtech.com/show/7582/nvidia-gsync-review
Assassin’s Creed IV
“The combination of fast enough hardware to keep the frame rate in the G-Sync sweet spot of 40 – 60 fps and the G-Sync display itself produced a level of smoothness that I hadn’t seen before”
If you notice, the G-Sync demo is made within the "sweet spot". Maybe it's limited to only functioning as expected within that window.
Batman: Arkham Origins
“You can still see some hiccups”
“That brings up another point about G-Sync: once you remove GPU/display synchronization as a source of stutter, all other visual artifacts become even more obvious. Things like aliasing and texture crawl/shimmer become even more distracting. The good news is you can address those things, often with a faster GPU”
It doesn't matter if it's smooth if you still need a faster GPU for a certain game out of all the ones you play or would play.
You'd still have to lower quality settings to be within that sweet spot.
Sleeping Dogs
“The first thing I noticed after enabling G-Sync is my instantaneous frame rate (according to FRAPS) dropped from 27-28 fps down to 25-26 fps. This is that G-Sync polling overhead I mentioned earlier. Now not only did the frame rate drop, but the display had to start repeating frames, which resulted in a substantially worse experience. The only solution here was to decrease quality settings to get frame rates back up again. I was glad I ran into this situation as it shows that while G-Sync may be a great solution to improve playability, you still need a fast enough GPU to drive the whole thing.”
Can't say it any better than that. If a GTX 760 isn't handling games with G-Sync properly, you'll be forced to buy a better GPU whether you have the setup or not, because it's dependent on how it behaves with a game. It's not a one-and-done solution.
Dota 2 & Starcraft II
“The impact of G-Sync can also be reduced at the other end of the spectrum. I tried both Dota 2 and Starcraft II with my GTX 760/G-Sync test system and in both cases I didn’t have a substantially better experience than with v-sync alone. Both games ran well enough on my 1080p testbed to almost always be at 60 fps, which made v-sync and G-Sync interchangeable in terms of experience.”
A faster GPU is a better option.
Bioshock Infinite @ 144Hz
“The G-Sync on scenario was just super smooth with only a few hiccups.”
Doesn’t seem too convincing.
The under 30 FPS issue is something to be worried about. But honestly, I am HOPING that if you are playing a game like Sleeping Dogs you are setting your IQ options to run at a higher frame rate than that… Not sure why Anand selected those for that game. It's interesting to see though.
As for Dota 2 and SC2: it's true that if you are locked at 60 FPS with V-Sync, the difference in experience isn't quite as noticeable. But this particular monitor does allow you to go over 60 FPS while still maintaining a smooth, variable refresh rate if you want.
It's a niche of a niche market. SHIELD.
GTX 760 for $210+
G-Sync $150+ premium on old TN panel technology @ 1080p
Enthusiasts buying $400+ GPUs aren't going to want to turn their IQ options down to maintain the sweet spot of 40-60 fps or be stuck at 1080p with newer games.
In which case the average console/pc player is better off getting a PS4 or Xbone for 1080p gaming.
There the games will be better programmed and optimized, and you won't run into the problems Anand pointed to when the rest of your PC isn't up to snuff.
I think this so-called “SWEET SPOT” of 40-60fps is going to be very interesting.
It depends on what is important to the gamer. Those who are into multiplayer FPS games like BF4 *should* worry more about keeping their minimum frames at 60 fps (or 120 fps if you're on a 120 Hz monitor) at the expense of image quality settings. Everyone would agree that in the heat of battle you'd rather have a smooth game than a pretty game chugging along at 30 fps and getting yourself killed.
I think Anand summed up GSync well with this “what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience”. That is pretty cool.
I’ll push the LightBoost propaganda again. 120/144hz monitor users really need to check out the *free* LightBoost program if your system can keep games above 90fps. I’m running a GTX 780 and have my i5-2500k at 4.5Ghz in order to keep BF4 above 90fps (Ultra everything, no AA/HBAO). Every person who has seen this setup is blown away by how smooth it is. While I haven’t seen GSync in person, from what I’ve read about the technology, LightBoost is a better option for high end systems.
I will be getting this right away, but only because it is the right upgrade for me. I got a cheaper 23 inch LG 1080p monitor to keep overall costs down. 1st GPU: R6850, 2nd: GTX 670 (about a year ago).
Right now with V-Sync on (I always use it) I prefer 54+ fps. 50-53 fps is a minimal but still noticeable slowdown, and 50 fps or less means I should turn down a quality setting.
Several reviews so far have said 40+ fps is where G-Sync shines, and my 670 can dish out 40+ fps at ultra settings all day. Plus, with V-Sync off I will get higher fps overall, since I currently choose V-Sync/lower settings over V-Sync off/higher settings, so it's the best of both worlds. For those with the entry-level 650 Ti Boost, you may be better off getting an OC'd 4GB GTX 770, just a thought!