Potential Benefits for a New Interface
Let’s take the first step in our discussion and remove the idea of a fixed refresh rate on LCD monitors. What does that change?
First, it means we can update the image on an LCD at any rate we want without having to worry about the 60 Hz standard or any multiple thereof. The only limitation now would be the response time of the pixels on the screen. If you have a high-quality panel with a 4 ms grey-to-grey (GtG) response time, then in theory you could display frames at up to 250 Hz (FPS).
Pixel response time. Image source: Xbitlabs
Current DisplayPort capabilities have more than enough bandwidth to support that. If you run a 1920×1080 monitor at 200 Hz, the total bandwidth requirement is just under 10 Gbps, well inside the maximum range of DisplayPort 1.2.
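As a quick sanity check on both of those figures, here is the back-of-the-envelope math. This is my own arithmetic, not NVIDIA's; it assumes 24-bit color and ignores blanking intervals and link encoding overhead, so real link usage runs somewhat higher.

```python
# Back-of-the-envelope check of the panel and link numbers above.
# Assumes 24-bit color; ignores blanking and 8b/10b encoding overhead.

gtg_response_ms = 4            # quoted grey-to-grey response time
width, height = 1920, 1080     # panel resolution
refresh_hz = 200               # hypothetical variable refresh rate
bits_per_pixel = 24            # 8 bits per channel, RGB

# Fastest rate the pixels themselves could plausibly keep up with
panel_limit_hz = 1000 / gtg_response_ms          # 250 Hz

# Raw pixel data rate for 1920x1080 at 200 Hz
raw_gbps = width * height * bits_per_pixel * refresh_hz / 1e9

print(f"Panel-limited refresh: {panel_limit_hz:.0f} Hz")   # 250 Hz
print(f"Raw video bandwidth  : {raw_gbps:.2f} Gbps")       # ~9.95 Gbps
# DisplayPort 1.2 (HBR2, four lanes) carries roughly 17.28 Gbps of video data,
# so a 200 Hz 1080p stream fits comfortably.
```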
We can also eliminate the horizontal tearing of games that occurs when the refresh rate of the monitor does not match the frame rate produced by the graphics card. By only sending complete frames to the monitor and having the panel refresh at that time, you could maximize frame rate without distracting visual anomalies. If you are able to run your game at 40 FPS then your panel will display 40 FPS. If you can run the game at 160 FPS then you can display 160 FPS.
Triple buffering. Image source: Wikipedia
One of the key performance components for enthusiast PC gamers is display latency, and a variable refresh rate would eliminate delays due to buffering. Because the display can output a frame as soon as the GPU is finished rendering it, without the triple-buffering techniques used in today's Vsync implementations, that portion of latency would be sharply reduced or removed entirely. In fact, as Johan Andersson of DICE discussed with me, the experience can beat even games running at a locked 60 FPS, because game developers "have to be conservative" to avoid render times longer than 14-15 ms. Engines that exceed that time could miss the vBlank (refresh) signal and have their frames delayed on screen significantly.
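To make that 14-15 ms budget concrete, here is a small illustration of the cliff a fixed 60 Hz refresh creates. The numbers are my own, not DICE's; the only thing inherent to 60 Hz is the ~16.7 ms refresh interval.

```python
import math

# Illustration of the fixed-refresh "cliff" described above.
# With Vsync on, a frame can only be shown at a vblank boundary,
# so its effective display interval rounds up to whole refreshes.

REFRESH_MS = 1000 / 60   # ~16.7 ms between vblanks at 60 Hz

def displayed_interval_ms(render_ms: float) -> float:
    """How long a frame effectively occupies the screen under 60 Hz Vsync."""
    return max(1, math.ceil(render_ms / REFRESH_MS)) * REFRESH_MS

for render_ms in (14.0, 16.0, 17.0, 20.0):
    shown = displayed_interval_ms(render_ms)
    print(f"{render_ms:4.1f} ms of GPU work -> {shown:4.1f} ms on screen ({1000/shown:.0f} FPS)")

# 17 ms of work already halves the effective frame rate to 30 FPS with Vsync;
# a variable refresh display would simply show that frame a fraction of a
# millisecond later than the 60 Hz vblank would have.
```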
The data transfer rate between the graphics system and the monitor would no longer be limited by a refresh rate (or anything related to it) and instead would be capped by the medium that interconnects them. Images could be moved and displayed as fast as the interconnect or the monitor's response time allows, whichever is slower.
This might also lead to faster connections between the monitor and computer as the response time and paint rate of panels improves. Perhaps a fiber optic link would appeal to you?
NVIDIA G-Sync VRR Technology
This leads us to the announcement today of NVIDIA's G-Sync technology. This new feature combines changes to the graphics driver with changes to the monitor itself to alter the way refresh rates and Vsync have worked for decades.
With standard LCD monitors, gamers are forced to choose between a tear-free experience by enabling Vsync or playing with substantial visual anomalies in order to get the best and most efficient frame rates. G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz or 144 Hz, etc., without the horizontal tearing normally associated with turning off Vsync. Essentially, G-Sync allows a properly equipped monitor to run at a variable refresh rate, which will improve the experience of gaming in interesting ways.
This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work. The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920×1080 display, and as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes. In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design, so you can obviously expect this to function only with NVIDIA GPUs.
DisplayPort is the only input option currently supported.
It turns out NVIDIA will actually be offering retrofit kits for current owners of the VG248QE at a yet-to-be-disclosed cost. The first retail G-Sync units will ship as a monitor plus retrofit kit, as production is running just a bit behind.
Using a monitor with a variable refresh rate allows a game to display 55 FPS on the panel at 55 Hz without any horizontal tearing. It can also display 133 FPS at 133 Hz without tearing. Anything below the 144 Hz maximum refresh rate of this monitor will run at full speed without the tearing associated with the lack of vertical sync.
The technology NVIDIA is showing here is impressive when seen in person, and that is really the only way to understand the difference. High-speed cameras and captures will help, but much like 3D Vision, this is a feature that needs to be seen to be appreciated. How users will react to that roadblock remains to be seen.
Features like G-Sync show the gaming world that, without the restrictions of consoles, there are still revolutionary steps that can be taken to maintain the PC gaming advantage well into the future. 4K displays were a recent example, and now NVIDIA G-Sync adds to the list.
How does it work?
NVIDIA's ability to pull off this feat of magic is built around the fact that while the monitor sends EDID information to the graphics card to request a specific set of timings, it is, in fact, only a request. The graphics card has the ability to adjust these signals in any way it wants, though in the past it has always followed the requested timings in order to guarantee compatibility and not damage system components. NVIDIA had already shown a desire to allow more user flexibility when it announced pixel clock overclocking with the GK110 GPU release.
G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the vertical front and back porches and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor: 60 Hz, 120 Hz, etc. What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired, sending it when one of two criteria is met:
- A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time tells the screen to grab the data from the card and display it immediately.
- A substantial amount of time has passed and the currently displayed image needs to be refreshed to avoid brightness variation.
In current display timing setups, the submission of the vBlank signal has been completely independent from the rendering pipeline. The result was varying frame latency and either horizontal tearing or fixed refresh frame rates. With NVIDIA G-Sync creating an intelligent connection between rendering and frame updating, the display of PC games is fundamentally changed.
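Reduced to pseudocode, the decision logic described above looks something like the sketch below. This is my own simplified model of the two criteria, not NVIDIA's driver or scaler firmware; the function names and the timeout value in particular are assumptions.

```python
import time

# Simplified model of the variable-refresh decision described above.
# Names and thresholds are illustrative assumptions, not NVIDIA's code.

MIN_INTERVAL_S = 1 / 144   # panel's maximum refresh rate (144 Hz VG248QE)
MAX_HOLD_S = 1 / 30        # assumed deadline before the panel must repaint
                           # the old frame to avoid brightness variation

def refresh_loop(frame_ready, send_vblank):
    """frame_ready() reports whether a new frame sits in the front buffer;
    send_vblank() tells the panel to scan out whatever is there now."""
    last_refresh = time.monotonic()
    while True:
        elapsed = time.monotonic() - last_refresh
        new_frame = frame_ready() and elapsed >= MIN_INTERVAL_S   # criterion 1
        stale = elapsed >= MAX_HOLD_S                             # criterion 2
        if new_frame or stale:
            send_vblank()
            last_refresh = time.monotonic()
```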
Closing Thoughts
After finally seeing the demonstrations and side-by-side comparisons in person, I have to say the effect of NVIDIA G-Sync is quite impressive and dramatic. We are working on getting some high-speed video to try to showcase the difference for readers, so check back soon.
It is one thing to know that NVIDIA and the media are impressed by a technology, but when you get three top game developers on stage at once to express their interest, that says a lot. John Carmack, Johan Andersson and Tim Sweeney stood up with NVIDIA CEO Jen-Hsun Huang, all raving about the benefits that G-Sync will bring to PC gaming. Mark Rein was standing next to me during a demonstration and was clearly excited about the potential for developers to increase visual quality without worrying about hitting a 60 FPS cap 100% of the time.
Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we will be joined in-studio by NVIDIA's Tom Petersen to discuss G-Sync, how it was developed and the various ramifications the technology will have in PC gaming. You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
Why do LCD monitors even need that? Why not have the GPU send a done signal instead, where each complete frame carries a 1- or 2-bit done packet telling the monitor that the frame is fully rendered, and in a dumber fashion the display simply shows each complete frame?
All the monitor tells the video card is the maximum frame rate it can handle, and the GPU caps its frame rate to that limit.
Think of it like this: imagine one person showing drawings to a group of people, and next to that person someone drawing pictures quickly. Once the artist finishes a picture they ring a bell, and the person showing the images simply grabs the next one and shows it, not caring about the rate at which the pictures are being drawn.
Razor512 > From my understanding, that is what G-Sync is attempting to resolve: to ignore the monitor's fixed timing and display the images from the GPU when they are ready.
I believe they are focusing on LED/LCD monitors as these are widely available and the prevailing display format. And being the most restricted in clock timing, it gives NVIDIA a great foothold for the future.
So once it is well adopted, they will go down the chain to the next widespread display type.
We will know more in Fall 2014.
All I can say is, it's about time someone this large released something like this. There will be many more advantages from the GPU with these new features.
For me, this may be the last nail in AMD's coffin.
I've had mostly ATI/AMD cards, but it's clear now who is the leader and who is the follower in this technology race.
I wish NVIDIA made cooler-running cards (80°C, ugh :S) and that we didn't need to sell a kidney for a very good card.
AMD/BBY drinking at a bar together very soon.
Well, if this signals the end of AMD GPUs, won't it mean that NVIDIA will monopolize PC graphics cards and we will pay an arm and a leg just to buy a great graphics card?
I have to say the article left me with the feeling that several crucial issues are not even mentioned. I am not an expert and I am not absolutely sure I remember everything correctly. But here are the issues I would like to see explained:
1. As far as I can remember, LCD monitors do not tolerate very low refresh rates well, although those very low frequencies are probably not important for gaming.
2. Flexible refresh probably has more important applications than improved gaming. How about power savings? How about discarding frame rate conversions on some archival videos? These seem more important for the general public.
3. OLED monitors and TVs are probably in the very near future. Other technologies like electrowetting are also a possibility. These should also be considered, not just LCD.
4. It is understandable that NVIDIA wants to make money on proprietary technology. But looking from a wider perspective, only a standardized, non-proprietary technology seems sensible here.
In regards to #2 "Power Savings"
The DisplayPort standards can directly control a display's refresh rate. One of the reasons for that is energy savings.
It’s not just a video out plug. It’s a whole new world.
I wouldn't be surprised if G-Sync works because of the DisplayPort standards and the vast number of possibilities they open up.
Yeah, I'm bummed it's another proprietary push from NVIDIA, but at least it is so basic of a concept they can lock it away in patents. I do have hope this could drive an open standard to be picked up that can be implemented.
Ugh, I meant can’t lock it away.
Been hearing a lot about the train derailment in Canada.
"tasks are limited by the 'infrastructure'"
Perhaps the Government should visit PCper and NVIDIA to discuss the concept of G-Sync and apply it to the real world! 🙂
Besides, computer hardware requires a lot of natural resources for production, and oil is part of that process. *hint hint*
Cheers!
It would be awesome if somebody compared G-Sync with Lucid Virtual Vsync.
It is unfortunate that NVIDIA has not catered this technology to the many of us gaming enthusiasts who have just upgraded to 2560 120 Hz IPS displays.
I doubt many people will jump ship to an inferior display simply to get no tearing (even though it's a significant improvement).
I would have liked to see NVIDIA produce adapters for monitors that use DVI.
For me, this probably means that I'll buy a 290X because I wouldn't be able to take advantage of G-Sync.
Great article and even greater technology.
Does this also work with NVIDIA's 3D glasses for 3D gaming?