UPDATE: If you missed our live stream today that covered NVIDIA G-Sync technology, you can watch the replay embedded below. NVIDIA's Tom Petersen stops by to talk about G-Sync in both high level and granular detail while showing off some demonstrations of why G-Sync is so important. Enjoy!!
Last week NVIDIA hosted press and developers in Montreal to discuss a couple of new technologies, the most impressive of which was NVIDIA G-Sync, a new monitor solution that looks to solve the eternal debate of smoothness against latency. If you haven't read about G-Sync and how impressive it was when first tested on Friday, you should check out my initial write-up, NVIDIA G-Sync: Death of the Refresh Rate, which not only covers those first impressions but also dives into the reason the technology shift was necessary in the first place.
G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the vertical front porch, the back porch, and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor: 60 Hz, 120 Hz, etc. What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired, sending it when one of two criteria is met.
- A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time will tell the screen to grab data from the card and display it immediately.
- A substantial amount of time has passed and the currently displayed image needs to be refreshed to avoid brightness variation.
In current display timing setups, the submission of the vBlank signal has been completely independent from the rendering pipeline. The result was varying frame latency and either horizontal tearing or fixed refresh frame rates. With NVIDIA G-Sync creating an intelligent connection between rendering and frame updating, the display of PC games is fundamentally changed.
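To make those two criteria concrete, here is a minimal sketch of the decision loop in Python. This is illustrative pseudocode based only on the description above, not NVIDIA's actual driver or firmware logic; front_buffer_ready() and send_vblank() are hypothetical callbacks, and the timing constants are assumptions.

```python
import time

# Illustrative limits only -- the real floor/ceiling depend on the panel
# and the G-Sync module. (Assumed values, not NVIDIA specifications.)
MAX_REFRESH_INTERVAL = 1 / 30    # assumed self-refresh deadline (~33 ms)
MIN_REFRESH_INTERVAL = 1 / 144   # assumed fastest the panel can scan

def gsync_loop(front_buffer_ready, send_vblank):
    """Fire vBlank when a new frame is ready, or when the panel has
    gone too long without a refresh (the two criteria listed above)."""
    last_refresh = time.monotonic()
    while True:
        elapsed = time.monotonic() - last_refresh
        if front_buffer_ready() and elapsed >= MIN_REFRESH_INTERVAL:
            # Criterion 1: a finished frame is in the front buffer;
            # tell the screen to grab it and display it immediately.
            send_vblank()
            last_refresh = time.monotonic()
        elif elapsed >= MAX_REFRESH_INTERVAL:
            # Criterion 2: too much time has passed; refresh the
            # current image to avoid brightness variation.
            send_vblank()
            last_refresh = time.monotonic()
```

The key point the sketch captures is that vBlank is no longer emitted on a fixed clock; it is driven by the renderer, with the long-interval refresh acting only as a fallback.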
Every person that saw the technology, including other media members and even developers like John Carmack, Johan Andersson and Tim Sweeney, came away knowing that this was the future of PC gaming. (If you didn't see the panel that featured those three developers on stage, you are missing out.)
But it is definitely a complicated technology and I have already seen a lot of confusion about it in our comment threads on PC Perspective. To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Monday afternoon where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we discuss G-Sync, how it was developed, and the various ramifications the technology will have on PC gaming. You'll find it all on our PC Perspective Live! page on Monday, but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA G-Sync Live Stream
11am PT / 2pm ET – October 21st
PC Perspective Live! Page
We also want your questions!! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but often there is a lot of noise to deal with.
So be sure to join us on Monday afternoon!
A couple questions:
Will using the variable refresh rate be similar to Vsync in terms of possibly causing input lag, and other things like degrading performance? Is this something we’re going to only see people with high end hardware using, or can low-mid range cards still benefit from it without sacrificing performance?
Will there still be screen tearing when your fps goes above the max refresh rate?
I saw somewhere that a G-Sync add-on may be possible for some monitors. My question is: How does the add-on integrate with monitors, and will it work on HDTVs?
So far, nVidia has a great track record of providing support in their Linux drivers for current GPUs. Will the Linux driver have support for G-Sync?
Has there been any discussion about licensing this technology for other companies to use (Intel, AMD, etc.), to benefit the industry and gaming experience as a whole?
If not, can this discussion start? Please?
Clearly this would be beneficial for everyone, and would make NVIDIA more money in the long run as well.
I just recently purchased the Asus VG248QE. My question is how difficult will it be for me to get the module itself from retail and install it? Also how much will the module by itself cost?
+1
To answer your last question: it'll cost $175 for the DIY kit.
http://www.geforce.com/hardware/technology/g-sync/faq
So, to play the game the way it's meant to be played.
all we need is
$650 for a 780.
$175 for G-Sync.
$270 for an ASUS VG248QE.
= holy Josh, that's getting ridiculously expensive.
Hi, Tom. Now that G-Sync unshackles us from the 30/60/120 fps v-sync targets and the stuttering effect experienced when dropping from a higher v-sync setting to a lower setting, have you and your team been able to determine a frame-rate “sweet spot” for an optimal G-Sync enhanced gaming experience after which additional fps could be considered diminishing returns?
As a corollary, can you describe how noticeable the fluidity difference is in the subjective gaming experience between a G-Sync’ed ~50fps experience, a G-Sync’ed ~70fps experience and a G-Sync’ed ~100+fps experience?
Thanks Ryan and Tom.
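Editor's note: the diminishing returns in that question are easy to see in raw frame times. A quick back-of-the-envelope calculation, illustrative only and not an NVIDIA figure:

```python
# Frame-time math behind the "sweet spot" question: each additional
# step in fps buys progressively less absolute time per frame.
for fps in (50, 70, 100, 144):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# Output: 50 -> 20.0 ms, 70 -> 14.3 ms, 100 -> 10.0 ms, 144 -> 6.9 ms.
# The 50->70 jump saves ~5.7 ms per frame; 100->144 saves only ~3.1 ms.
```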
Can you please ask about 3D Vision and G-Sync? I love my Asus VG278 LightBoost 120Hz 3D Vision monitor.
Not sure if the module will eventually be available for this model. I think it would be worth the upgrade. I hope the 3D is improved and not hindered; 3D gaming is amazing.
Or for the VG236H. A little older but still 120Hz 3D Vision. Would really like to see the DIY kit for this!
I'm also interested in G-Sync's interaction with 3D Vision. Although I love my 3D Vision Surround setup, one of my complaints is the lag induced by the requirement of vsync in 3D mode. If G-Sync works with 3D Vision, I'll gladly upgrade my monitors.
Just adding I would also like to know more details about G-Sync and 3D Vision compatibility. I’m currently using an Acer HN274H but I’d definitely consider upgrading to a G-Sync, Lightboosted setup if it’s worth it.
Very interesting video. What I gathered is that G-Sync capable displays can also work in 3D Vision mode; however, I am not sure you can run G-Sync and 3D Vision at the same time, as the glasses need to flicker at 60 Hz per eye, hence the 120 Hz requirement for 3D Vision to work. But the way G-Sync works is it slows down the refresh rate of the monitor to match the FPS of the graphics card, and if that's the case, the glasses would also need a variable refresh. I hope I'm wrong, because this technology looks amazing and I would love to see 3D Vision with G-Sync enabled.
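Editor's note: the per-eye math in that comment can be made explicit. This sketch assumes the shutter glasses alternate eyes on every panel refresh, which is how they behave at a fixed 120 Hz but is only an assumption for a variable-rate panel:

```python
# Assumption (not confirmed by NVIDIA): shutter glasses alternate eyes
# on every refresh. A variable panel rate R then leaves each eye at
# R/2 Hz, so anything under 120 fps falls below 60 Hz per eye.
for panel_hz in (120, 100, 80, 60):
    print(f"panel at {panel_hz:>3} Hz -> {panel_hz / 2:>4.0f} Hz per eye")
```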
Tom, what is your favorite color? ;)
In the Nvidia G-SYNC FAQ it says:
“Q: Does NVIDIA G-SYNC work for all games?
A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.”
Can a list of compatible games be provided?
I don't want to be left in the dark as far as compatibility with my favorite games or certain genres.
When can we see an actual in-game action sequence demoed with this technology, like Borderlands 2, Hawken, or Metro: Last Light with PhysX effects & particles flying everywhere, to see how smooth it is?
On the DIY kit, G-SYNC disables all VGA & HDMI ports; it also disables audio through the DisplayPort. Why is that?
Will it be DisplayPort only?
Will monitors not have the ability to pass through audio?
Will using G-Sync on monitors that already have G-Sync disable all other ports, including audio, or will G-Sync monitors have DisplayPort connectors only?
When will IPS panels have this?
Why can't gfx cards less powerful than the GTX 660 Ti use G-Sync?
Will there be any mid-range [less than 1920×1080, 120Hz] G-Sync supported monitors?
Something is wrong either with my connection or with the PCPer commenting system.
Here’s my question in one piece:
Is the image on the screen still being updated from top to bottom or is every pixel being updated at the same time?
Let's say the former is the case, right? (1) The card is rendering a very intensive frame (from top to bottom, I guess). (2) The frame has been rendered completely (from top to bottom). (3) The refresh cycle has begun and the frame is now being drawn onto the screen. We can expect it to be completely drawn in about 16.6 ms.
But wait! As soon as (2) happened, a new frame starts being rendered. Only this time, the frame depicts a light scene and rendering is done in ~8 ms. I would love insight into what new has happened in this virtual world I'm simulating, and this new frame might have some information (enemy position or something similar) that I would be interested in knowing, even if I could see just the bottom half of this most current frame. Not to mention the input-to-output lag would be lower if I were to see this new frame.
So, I want this new frame because I want to see THE most current event in the scene. Am I out of luck, and will I have to wait for the previous frame to be completely drawn onto my screen, or what?
Thank you.
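Editor's note: one rough way to model this question, assuming the panel still scans top to bottom and a full scan always takes the panel's maximum-rate scan time. The numbers are illustrative, not measured G-Sync behavior:

```python
# Sketch of the commenter's scenario: a heavy ~16.6 ms frame followed
# by a light ~8 ms frame, with a fixed full-screen scanout time.
SCANOUT_MS = 16.6           # assumed full top-to-bottom scan time

render_times = [16.6, 8.0]  # heavy frame, then light frame (ms)
t = 0.0                     # when the current frame finishes rendering
scan_free_at = 0.0          # when the panel finishes its current scan
for i, render_ms in enumerate(render_times, start=1):
    t += render_ms                # frame i is ready at time t
    start = max(t, scan_free_at)  # it waits out any scan in flight
    print(f"frame {i}: ready at {t:.1f} ms, "
          f"scanned {start:.1f}-{start + SCANOUT_MS:.1f} ms "
          f"(waited {start - t:.1f} ms)")
    scan_free_at = start + SCANOUT_MS
```

Under those assumptions the light frame sits finished for ~8.6 ms while the previous scanout completes, which is exactly the trade-off being asked about.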
So we all know G-Sync will be great for games, but what about movies? The current standard is 24 fps; can I match the frame rate to the movie? Jen-Hsun Huang mentioned judder on TVs, so your team must have known of this issue.
Also, will G-Sync support TVs in the future, and is that a possible way to kill the NTSC and PAL framerate difference?
Also, when I say support TVs, I also mean the cable companies that show movies, or maybe allowing sports to be broadcast at higher fps?
Finally, my last question: as I understand it, you need hardware in the monitor to support G-Sync/VRR, and you need GPU cards that have the hardware feature as well. This is the reason why only some Kepler GPUs support G-Sync. Why is it that you don't push all the hardware requirements from the GPU side onto the monitor's G-Sync board? Wouldn't this allow older NVIDIA cards to use G-Sync, and maybe give other hardware manufacturers a chance?
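Editor's note: the judder Jen-Hsun mentioned is standard video math and easy to sketch; whether and how G-Sync addresses it on TVs is exactly the open question above.

```python
# 24 fps does not divide a fixed 60 Hz refresh, so frames alternate
# between 3 and 2 refresh cycles (3:2 pulldown) and on-screen durations
# alternate ~50 ms / ~33 ms. A refresh matched to the source could
# instead hold every frame the same ~41.7 ms.
REFRESH_MS = 1000 / 60               # one 60 Hz refresh cycle
for i in range(4):
    cycles = 3 if i % 2 == 0 else 2  # the 3:2 pulldown cadence
    print(f"fixed 60 Hz, frame {i}: {cycles * REFRESH_MS:.1f} ms on screen")
print(f"matched refresh: every frame {1000 / 24:.1f} ms on screen")
```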
Q: Is there any chance that NVIDIA could come out with a fully branded, completely bezel-less 144Hz NVIDIA monitor with G-Sync technology onboard? If so, I would buy three monitors, because lately I have been itching to try out NVIDIA Surround, but I just can't get over those black bezels that are in the way. To push those puppies I would run three 780s in SLI, or maybe three $$ Titans $$.
On behalf of high-end enthusiasts I would like to say thank you, NVIDIA, for pushing out new technologies; up until this point monitors were getting kind of stale.
Will the Quadro K600 support G-Sync? How about other OSes (e.g. Mac OS X, FreeBSD, Solaris)?
According to their website, they've been doing something similar for a couple of years with Quadros, but not at the monitor level.
NVIDIA Quadro Sync and G-Sync
http://www.nvidia.com/object/nvidia-sync-quadro-gsync.html
I just purchased the ASUS VG278HE model and would like to know if NVIDIA has plans to expand the upgrade kit offering to include this model.
I recently purchased a 27″ BenQ gaming monitor; will this have a mod kit as well down the line?
Will G-Sync work with NVIDIA Surround?
Will IPS panels have G-Sync? Any type of IPS?
Any 4K panels with G-Sync in the near future?
Will only a DP connection work? If so, how will you connect an NVIDIA Surround setup in order to use G-Sync?
I want to know how online gaming and timing lag are affected.
If I'm casting a spell or shooting at someone running across pillars, will I have to wait for G-Sync to render the scene smoothly?
Meanwhile, users without G-Sync would have a competitive advantage over me, and I'll probably get owned by the time G-Sync smooths out the animations.
Fractions of a second between being alive or dead.
How much power will G-Sync add to monitor usage?
Since G-Sync provides more "safety" on low- and mid-end gaming hardware when it comes to graphics quality, tearing, etc.:
Are there already plans or confirmations from TV manufacturers to implement G-Sync on their screens?
(My guess: especially for "living room gaming appliances/consoles/PCs/HTPCs…" that are usually not that high-end, it would be great to have a "reserve" of gaming power. In other words: won't these gaming solutions benefit most from G-Sync?)