Impressions, Possible Bugs, and Conclusion
Impressions:
My initial impressions closely mirror what was reported by one of our G-Sync contest winners. If you're coming from a 60 Hz panel, there is a *huge* drop in latency and a corresponding increase in fluidity when gaming at 120+ Hz refresh rates, especially if the game's frame rate sits near the top of that range. The higher the refresh rate, the harder it is to see tearing, even with lower FPS coming from the game and with vsync off. In those cases, torn frames are only on-screen for a fraction of the time, quickly replaced with a non-torn version of the new frame on the very next scan (only ~7 msec later when refreshing at 144 Hz). At those speeds, tearing goes from smacking you in the face to something you have to look for.
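That persistence arithmetic is easy to check directly. A quick sketch (the refresh rates are the common ones discussed above):

```python
# How long a torn frame can persist before the next scan replaces it:
# one full refresh interval, i.e. 1/refresh_rate, in milliseconds.
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz: torn frame replaced within {1000 / hz:.1f} ms")
```

At 144 Hz a tear is on screen for under 7 ms, versus nearly 17 ms at 60 Hz, which is why it becomes so much harder to spot at high refresh rates.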
Even though higher refresh rates attempt to solve the tearing issue, they do not correct the aggregate latency that comes from the GPU completing frame renders out-of-sync with display refreshes. If you've experienced the difference in feel going from a 60 to 120+ Hz panel, the best way to describe the transition to G-Sync is as a half-again gain over that 60-120 Hz change in feel – where those perceptions mainly apply to gaming above 120 Hz. If all games ran at 120+ Hz without issue, there would be much less of a need for G-Sync, but that's not reality as game developers tend to target 60 FPS.
With G-Sync eliminating the tearing that is so painfully obvious at 60 Hz, while keeping latency lower than even vsync-off levels, its effect becomes increasingly apparent as you drop towards and even below 60 FPS. The difference is immediately apparent in just about any game with settings adjusted to achieve a 40-60 FPS range. The latency improvement is roughly half again what you gain from simply toggling vsync off on a 60 Hz display, and every frame is drawn as if vsync were on. Games feel just as playable and responsive at ~75% of the FPS goal you'd have to set on a non-G-Sync display.
When tuning game settings for improved image quality, I got a bit overzealous and dialed everything up to eleven in Crysis 3 and Metro: Last Light, and everything was still surprisingly responsive even at ~20 FPS (I'm on a GTX 680). Panning around a scene went from a laggy mess to what I can best describe as a feeling that the display was 'leading the mouse'. I wouldn't recommend gaming at a refresh rate this low, since render times add their own latency, but it remains a definite and demonstrable improvement over the old tech.
The 30 Hz concern:
After my tinkering I read about a supposed issue where G-Sync 'reverted back to vsync-on' at levels below 30 Hz. This surprised me, especially after testing as low as 16 Hz with no noticeable lag issues (beyond that expected from increased render time at such a low FPS). I re-tweaked settings to shoot for a value hovering around 30 FPS and then I suddenly started to notice something odd. I'll start with our theory as to the cause and work towards the effect:
We know from our talks with NVIDIA that the LCD panel must be refreshed at intervals no longer than 33 msec (30 Hz). G-Sync forces an additional redraw of the same frame if the new render does not complete within 33 msec. Assuming the panel has just forced a re-display of the frame, the next frame would have to wait until that redraw was complete (~7 msec at 144 Hz). This creates a 'dead zone' window of time between 33 msec (30 Hz) and 33+7 msec (~25 Hz). Frames rendered at a rate within that 25-30 FPS window would, in theory, be forced to wait for that additional redraw to complete before they could be displayed. Depending on the game, video settings, and GPU speed, a scene may intermittently dip through this dead band and therefore introduce intermittent latency, which would be worst (7 msec) at just below 30 FPS. Once you get all the way down to 25 FPS, the issue should subside completely, which would explain why my testing at 16-20 FPS seemed fine. Bouncing between various GPU-heavy games, I found a batch of settings for Metro: Last Light that dipped through the theoretical 'dead band' several times within the same benchmark session. Here is the picture to go with the thousand words written above:
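The dead-band arithmetic above can be sketched directly. A minimal illustration, assuming the 30 Hz minimum refresh and 144 Hz scan time quoted in the text:

```python
# Theorized G-Sync 'dead band': the panel must refresh at least every
# ~33 ms (30 Hz), and a forced redraw of the previous frame occupies
# one full scan (~7 ms at 144 Hz) before a new frame can be shown.
MIN_INTERVAL_MS = 1000 / 30   # ~33.3 ms forced-redraw deadline
SCAN_MS = 1000 / 144          # ~6.9 ms for one complete scan

upper_fps = 1000 / MIN_INTERVAL_MS               # 30 FPS
lower_fps = 1000 / (MIN_INTERVAL_MS + SCAN_MS)   # ~25 FPS

print(f"dead band: ~{lower_fps:.1f} to {upper_fps:.1f} FPS "
      f"(worst-case added wait: {SCAN_MS:.1f} ms just below 30 FPS)")
```

A future panel with a lower minimum refresh (a longer `MIN_INTERVAL_MS`) would push both ends of this band down, which matches the behavior NVIDIA described for later iterations.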
Note the distinct plateaus across the 25-30 FPS band. The center area of unhappiness feels like intermittent lag and introduces a judder-type effect on smooth pans occurring within that 25-30 FPS range.
I suspected this issue might be correctable by having the driver dynamically shorten the forced-refresh delay when rendering near 30 FPS. After further discussion with NVIDIA, it turns out the 30 FPS plateau was a design decision for this first generation of G-Sync as it applies to this specific ASUS panel. Future iterations may still exhibit this quirk, but with a longer forced-refresh wait time, which would push the dead band to even lower FPS ranges. NVIDIA designed G-Sync around a 40-60 FPS target, and it's worth noting that FPS can even dip into the 30's without issue. The reduced latency of G-Sync may tempt users to crank video settings up further than they normally would, and I believe that is why so many folks are reporting this effect at <30 FPS. For the moment I'd recommend shooting for the 40-60 FPS range, with the hope that future G-Sync panels will be able to force this band down below 20 FPS, which should reduce the possibility of entering it in even the most demanding scenarios.
Conclusion:
I'm extremely impressed with the build quality of the G-Sync installation kit, as well as the product and technology itself. The installation should be possible for any do-it-yourselfer, and the included documentation took away all of the uncertainty of venturing into unknown waters. My biggest gripe / hope is to see this technology applied to larger, more color-accurate panels. I'd pay good money for a 2560×1600 IPS G-Sync panel, even if the max refresh rate was 100 or even 85 Hz.
my dream is to see this on TVs..
My Sony has an Impulse mode for gaming, which is essentially Lightboost. It’s terrific. It kills the intensity of the lighting to accomplish it (so you need to crank the backlight up) and it introduces flicker that some people notice to varying degrees (but seems fine to me – and I’m picky) and only really benefits you at 60fps, so should be disabled for sub 60fps content, but . . . man is it really fantastic.
Keep dreaming. It doesn’t work in non-3D applications. Not to mention below 30fps it’s awful.
Of course it works in non-3D applications. I played with G-Sync at the recent SC2 Tournament in NYC where it was first debuted to the public and there was absolutely no indication that it was “awful” below 30fps. As far as I could tell, FPS had no ill-effect on the manner in which G-Sync worked (at least on the two demo-displays).
Are you calling Allyn a liar?
He has a half-page write up on the last page on it.
I’m not sure this would benefit a television unless the source was a PC’s GPU.
There’s no point to this on a TV – TV content is a fixed framerate and latency doesn’t matter as long as the sound is in sync…
step 1. get 55″ good panel quality tv with sub 1frame input lag
step 2. get 5$ hdmi cable
step 3. …
step 4. profit
Now, if there was G-Sync in there, I wouldn’t mind it. Trust me when I say that playing BF4 on max settings on Sony’s W905 55″ is not too shabby.
Please check your spelling before publishing the article.
Oh be quiet, you obviously have no clue what you’re talking about… Allyn worked very hard to have this article ready for PCper fans before CES. Also, if you knew anything about PCper, you’d know that Allyn is not the Editor in Chief, Ryan is. So if you’d like to direct your comment towards someone, perhaps you should email Ryan.
stfu – gtfo
I’m assuming this kit is only available for this one particular Asus monitor. Are there other kits for different monitors or brands?
Correct, and not at the moment.
Whelp, my original post was blocked by the spam filter. I’ll make this a tad more concise.
Excellent and very thorough review Allyn.
Could you confirm whether or not LightBoost is built in?
Apologies, we are tweaking our spam filtering.
I haven't tested it personally, but the display should handle it with the G-Sync module installed – but not in G-Sync mode. It's either/or.
I want this…but I am not going back from 2560×1440. Not that I complain about IPS 2560×1440 @ 120Hz.
Also, what happened to thin bezels? Thin bezels are better looking than monitor thinness. I’ve taken my monitor apart, and it attaches the panel to the bezel with an L bracket that requires a larger bezel. Why not use an L bracket and bring the bezel attachment BEHIND the panel, reducing the bezel width? Then for multi-monitor users, make the front bezel detachable so they only have the panel + <1mm bracket between monitors. To make it better, attach the brackets to the top and bottom of the panel. For either configuration they can place the OSD controls recessed from the panel and attached to the bracket for an even slicker profile.
AMD FREE-SYNC –> NVIDIA G-SYNC
Let’s wait and see; I’m interested to see whether latency is going to be a major problem with FreeSync, given how AMD attempts to fix the problem in software. Never mind that most monitors don’t support dynamic Vblank intervals yet anyway. If NVIDIA gets its hardware in first…
This is a whole lot of work and way too expensive for what it’s trying to do. The article is well written, if in need of a quick spell check/re-read. It is totally fair of readers to nitpick spelling mistakes in a tech article; it also does not negate the content.
G-sync is a great idea, just save us all the cost and sell monitors that are G-Sync ready. Have a logo on the box or corner of the bezel that says G-Sync Inside or something. In my experience better drivers and tweaking settings can reduce or eliminate tearing almost every time.
I think this is really dangerous for the average person, and if someone gets killed, NVIDIA and PCPer and everyone else pushing this upgrade is liable to get sued by the dead person’s family.
Monitors and especially TVs are very dangerous, and should not be opened unless you are a qualified electrician. There is plenty of charge stored in components well after it’s been switched off; some components hold a charge for days or weeks and could easily kill you.
This is an accident waiting to happen.
Seriously? Apart from the fact that LCD capacitors are significantly smaller and hold less charge than the ones in a CRT, you feel that people should be able to sue a website for providing information that could be used improperly?
Just purchased mine. One question though. Does it come with a display port cable or will I need to purchase one?
It comes with a DisplayPort cable. Installation is easy; just watch the orientation of the ‘Y’ style connector as mentioned in the article.
The posted YouTube video shows a slightly different board than what ships to us.
The ribbon cable is also a pain to install.
Eagerly awaiting my kit! The only thing that worries me a bit is that this kit is, by necessity, late-prototype engineering-sample production-candidate custom hardware and what goes in the new monitors (as Allyn says) will almost certainly be not this, but fully baked production ASICs. But, as we see here, the modules in these kits are FPGAs which makes me wonder (not knowing a whole lot about FPGAs) could these modules be ‘re-flashed’ or somehow patched in case something needs fixing before the retail monitors go live? (I do know this is bleeding-edge stuff and fully accept the risks the kits bring.) Either way, I still think it’s very cool of NVIDIA to let us in on at least a little bit of bleeding edge hardware hackery.
As someone who owns this monitor and GTX 760s in SLI, I can usually push 100-120 fps in most games with max settings at 1080p. Would G-Sync have any benefit for me? Or is it mainly beneficial when you can push 60fps?
Yes, it would. G-Sync even makes low framerates like 40fps feel way more fluid than they are. In fact you are likely to notice it a little less if you’re used to a high framerate with a 120 or 144Hz monitor, but either way it’s going to be better; this should have happened in the monitor world a long time ago.
Does anyone know if the power brick with the kit is 100-240V ?
This is an old thread, but does anyone know where you can actually buy a G-Sync module? I’ve already invested in 3 VG248QE monitors because they were G-Sync upgradable, but I cannot find where to purchase modules.