Our Legacy's Influence
NVIDIA announces G-Sync and we dive into why it matters to you.
We are often creatures of habit. Change is hard. And oftentimes, legacy systems that have been in place for a very long time can shape the angle at which we attack new problems. This happens in the world of computer technology, but also outside the walls of silicon, and the results can be dangerous inefficiencies that threaten to limit our advancement in those areas. Often our need to adapt new technologies to existing infrastructure can be blamed for stagnant development.
Take the development of the phone as an example. The pulse-based phone system and the rotary dial slowed the adoption of touch-tone phones and forced manufacturers to include switches to select between pulse and tone dialing on phones for decades.
Perhaps a more substantial example is that of the railroad system, which based its track gauge (the width between the rails) on transportation methods that existed before the birth of Christ. Horse-drawn carriages pulled by two horses had an axle gap of about 4 feet 8.5 inches in the 1800s, and thus the first railroads in the US were built with a track gauge of 4 feet 8.5 inches. Today, the standard rail track gauge remains 4 feet 8.5 inches despite the fact that a wider gauge would allow for more stability with larger cargo loads and allow for higher speed vehicles. But updating the existing infrastructure around the world would be so cost prohibitive that it is likely we will remain with that outdated standard.
What does this have to do with PC hardware, and why am I giving you an abbreviated history lesson? There are clearly some examples of legacy infrastructure limiting our advancement in hardware development. Solid state drives are held back by the current SATA-based storage interface, though we are seeing moves to faster interconnects like PCI Express to alleviate this. Some compute tasks are limited by the “infrastructure” of standard x86 processor cores, and the move to GPU compute has changed the direction of those workloads dramatically.
There is another area of technology that could be improved if we could just move past an existing way of doing things: displays.
Development of the CRT and the Refresh Rate
The very first CRT (cathode ray tube) was built in Germany in 1897, but it took a Russian scientist named Boris Rosing, in 1907, to first draw simple shapes on a screen, work that would eventually lead to the creation of the television and the monitor. 1934 saw the first commercially available electronic televisions with cathode ray tubes, built by Telefunken in Germany.
Image source: CircuitsToday
A CRT TV or monitor produces an image on a fluorescent screen with a set of electron guns that accelerate electrons to excite different colored phosphors (most often red, green and blue). The guns move in a fixed, systematic pattern called a raster, from left to right and from top to bottom (as you face the display), in order to draw the image provided by a graphics controller on the screen.
CRT displays were built around a refresh rate, otherwise known as the scan rate, which represents how quickly the electron guns can draw a complete image from top to bottom on the screen and then relocate back to the starting position (top left). All kinds of devices have a refresh rate – CRTs, LCDs, movie projectors, etc. On CRT monitors, a faster refresh rate resulted in reduced screen flicker and reduced eye strain for users. Screen flicker is the result of the eye being able to see the phosphors going dark before the next “scan” activates them again. The faster the monitor could update the image and keep the phosphors illuminated, the less likely you were to see flickering.
Image source: Wikipedia
What may not be common knowledge is why refresh rates were set to the speeds at which displays are most commonly built. In the 1920s, as TVs were first being produced, it became obvious that running vacuum tube hardware at anything other than a multiple of the AC line frequency coming into a home was problematic. AC power lines run at 60 Hz in the US and 50 Hz in Europe – starting to sound familiar? By running a TV at 60 Hz in the US, television manufacturers could prevent the moving horizontal bands of noise caused by power supply ripple.
An entire industry was built around the 60 Hz (and, in Europe, 50 Hz) refresh rate, which has caused numerous other problems in video. Film is recorded at 24 frames per second most of the time, which does not divide evenly into the 60 Hz refresh rate, introducing the need for technologies like telecine and 3:2 pulldown to match up the rates. Many readers will likely remember the uneven, jittering movement of movies being played back on TV – these rate-matching problems are the reason why.
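To make the mismatch concrete, here is a minimal sketch (in Python, with made-up frame labels) of the 3:2 pulldown cadence: every other film frame is held on screen for three fields instead of two, which is exactly the uneven cadence behind that judder.

# Minimal sketch of 3:2 pulldown: spreading 24 fps film frames across the
# 60 fields per second of a 60 Hz interlaced display.
def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # 3, 2, 3, 2, ...
        fields.extend([frame] * repeats)
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# Four film frames (1/6 of a second at 24 fps) become ten fields
# (1/6 of a second at 60 fields per second), but with an uneven cadence.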
Great article. However, will Tom Peterson be joining you in November or on Monday October 21st? As there is no Monday the 21st in November. At any rate (pun intended) I look forward to the live event!
October 21st, sorry!!
Thanks for pointing that out!
Awesome article, nicely done! Is this technology limited in any way to screen resolution, i.e. use in 4k & surround?
It would appear as though a possible limitation with 4k resolutions would be that of the cable’s bandwidth and “the legacy logic that makes them slow and ‘lazy’.”
Great Article!
I think PCper should have a Calendar for sale with the best of the best of their articles and such for all their Pcper Fans!
I do believe that a rewrite of the GPU-to-display interface is required. The downside to this is that backward compatibility will be eliminated.
The major challenge that I can see, from the consumer point of view, is profit, the main driver of technology advancement. Consumers will probably not buy into it as it won't be compatible with their current setup. This is where companies release small devices to help bridge the transition.
Such as the G-Sync device. I believe this is the transition piece. The GPU will probably get rewritten, and only displays that are able to make use of it will require the G-Sync module, until all companies follow through with their own approach and create a new standard. I believe Mantle from AMD has the same concept in mind; however, they are not as aggressive as NVIDIA with their G-Sync.
All-in-all, change is scary and it will happen for the good! 🙂
Great broadcast, by the way!
Sounds great. Looking forward to seeing some video and hopefully checking it out soon.
one quibble: the birth of christ is about as valid a metric as the birth of santa claus.
It's historically accepted that Jesus was born. The only quibble you should have is whether he was a deity or just a prophet of a religion.
http://en.wikipedia.org/wiki/Anno_Domini#Historical_birth_date_of_Jesus
Don’t forget, his birth date is also argued. Which may be what he was talking about. His birth date, from what I have heard, is likely in early summer, and the year is also unknown.
This seems like something FrameRating could help with, by giving them exact data points rather than people trying to explain it in abstract terms.
This does have my curiosity running. Since G-Sync means more communication between the monitor and the GPU, what happens with things like your Frame Rating setup? Will the communication be able to pass through the capture card? Will the capture card have to be G-Sync enabled? I think that will be very interesting.
What about HTS (Home Theater Systems)?
How would the new GPU work when it passes through the Receiver (HDMI 1.4) and then outputs to an HDTV Display (60″ SMART VIERA Full HD 3D Plasma)?
Will a firmware update be available for the HDTV, or will a G-Sync display be required?
Will the receiver affect the signal such that G-Sync won't be able to utilize the data correctly?
Many more questions that should be presented on Monday! 🙂
I see problems in adoption from the monitor side of things if this is NVIDIA only. Will ATI have their own version in the future, requiring monitors to adopt multiple standards just so we can have what is effectively the same feature? Or is this something that other video card manufacturers can adopt reasonably?
I agree, the proprietary hardware makes me leery. I hope that in the future the firmware of LCD display ICs will just be written to accept this new protocol. Then the transport (HDMI, DVI, DisplayPort) will not matter and it will all be controlled by software.
Game changer. Literally.
Can’t wait.
How does this ‘G-Sync’ compare to using LightBoost in 2D mode with a 120Hz monitor and an NVIDIA video card to get great improvements in ‘FPS’-style games?
They're not really comparable, IMO, because these two technologies work on different sides of the smoothness problem.
LightBoost is a way of refreshing the screen in such a way that it eliminates ghosting. G-Sync changes the timing of the refresh so that it's controlled by the GPU.
There’s absolutely no reason there can’t be a lightboosted G-sync display.
What is the maximum refresh rate of this monitor? If the video card is free to dish out any framerate it wants, then what happens when it reaches 1000 frames per second? Will the monitor be capable of an unlimited refresh rate?
OLED monitors will be. But at that point the interconnect will be the real limit…
So $399 for a 24-inch.
That's a 50% price increase just to have G-Sync in a 24-inch monitor.
OUCH!!!!
Is uneven frame pacing a serious concern for G-Sync? If finished frames are sent out as fast as they can be, won't the times between frames being displayed constantly change? I think this tech sounds great, but it seems like the game or G-Sync still needs to intelligently find a refresh rate it can maintain to avoid uneven frame pacing.
They still have a buffer of 768MB inside the monitor.
You can predict whether a frame will take too long to render if you are the GPU, and then you can start to add artificial “exhibition” latency to your “quick” frames.
This decrease in speed would give you time to shift the pace in a seamless way.
Lucid already does that.
It's like F1 racing: the GPU can predict what the corners will look like and start to brake.
And on the exit, apply “traction control” as well.
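A speculative sketch of that idea (in Python; this is not how NVIDIA or Lucid actually implement it) could be as simple as holding back frames that finish much faster than a running average of recent frame times:

import time

class FramePacer:
    def __init__(self, alpha=0.1):
        self.alpha = alpha   # smoothing factor for the moving average
        self.avg = None      # exponential moving average of frame times (seconds)

    def pace(self, frame_time):
        # Return how long to delay a frame that rendered faster than average.
        if self.avg is None:
            self.avg = frame_time
            return 0.0
        delay = max(0.0, self.avg - frame_time)
        self.avg += self.alpha * (frame_time - self.avg)
        return delay

pacer = FramePacer()
for ft in [0.016, 0.017, 0.009, 0.030, 0.012]:   # made-up render times in seconds
    time.sleep(pacer.pace(ft))                    # artificial "exhibition" latency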
Hi Ryan, did NVIDIA give any indication as to a potential upgrade kit for existing owners of monitors from any of the 4 OEMs mentioned?
I ask because I recently purchased a BenQ monitor only a few months ago, and am hoping there would be a kit I could buy from them and install on my existing monitor so I could take advantage of the G-Sync.
Thanks.
sounds like more proprietary crap is required.
It's hard to tell from the story what this ACTUALLY is without seeing it in action or having the nvidia religious types oozing over how they saved the world by buying nvidia.
Plain and simple the ONLY way to do this is to change the actual display logic inside the monitor. That means you need a new monitor or one that has the chip they use.
Hi, sorry, I am a little confused… I currently have a VG248QE and it's amazing, but what is the point of this G-Sync if my monitor already runs at 144Hz? When I game I don't enable V-Sync because I don't have to, thanks to the 144Hz, and I don't see screen tearing because of the 144Hz… so what exactly is a G-Sync VG248QE going to do for me?
Screen tearing is still visible on 144hz monitors. It’s less noticeable but still noticeable.
so basically this just eliminates all possible screen tearing and that’s it?
Ya, and it also eliminates the input lag that V-Sync causes. It may be really good for syncing up with video too, not sure though.
It reduces tearing without introducing any input latency at all, it allows native playback of fixed lower-framerate content without pulldown or other judder-inducing tricks, it improves frame perception dramatically, and it allows variable framerate to be vsynced (also with no lag).
Kind of a big deal.
Vsync without any additional lag has been an unattainable holy grail. But variable-framerate vsync without lag? I know plenty of people willing to get a second job to pay for that if it actually works. Myself included.
I think the tech sounds great, but I'm still worried that uneven frame pacing might be a problem since frame rates can fluctuate a lot.
To be fair, uneven frame pacing would be a problem even if the video card had shared memory with your frontal cortex. Even if the frame were sent directly to your optic nerve in real time with a zero-femtosecond delay, uneven frame pacing would still be a problem.
This just makes all the other problems around it small enough to be unnoticeable. Then we can focus on the REAL frame pacing. See, when we say frame pacing now, we mean frame pacing as seen on screen. What also matters is the frame pacing as seen by the game engine: not just how far apart the frames are, but how far apart the content in the frames is as it exists in the virtual game world, and how variable that temporal distance is. We've got a long way to go, that's for sure.
But hey, this is the first concrete step in a long time, and I can’t wait to experience it.
https://pcper.com/files/imagecache/article_max_width/review/2013-10-18/pixelresponse.GIF
“G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the vertical front and back porch and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor; 60 Hz, 120 Hz, etc.”
Alright. In terms of ACTUALS, this chart is about the best that you can view. It refers to the rise/fall time (Tr and Tf) that it takes for the signal to transition. The capacitors, resistors, etc. all have to do their work, and it isn't instantaneous. That is a limiting factor on refresh rates. Yes, you can overvolt, overclock, or do other things to speed that up, but nothing makes up for actual hardware that takes less time to rise and fall.
So, you need something from the micro scale to the nano scale. Nanotechnology is ever present in the news and there are a variety of options to build circuits on, but there is a matter of cost that goes along with it.
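To put rough numbers on the timing described in the quote above (the values below are assumptions based on a common 1080p60 mode, not anything NVIDIA has published), the effective refresh rate is just the pixel clock divided by the total pixels per frame, blanking included, and holding the vertical blanking interval open longer lowers that rate:

pixel_clock_hz = 148_500_000   # 148.5 MHz, a typical 1080p60 pixel clock (assumed)
h_total = 2200                 # 1920 active pixels + 280 of horizontal blanking
v_active = 1080                # active lines
v_blank = 45                   # front porch + sync + back porch

print(pixel_clock_hz / (h_total * (v_active + v_blank)))   # ~60.0 Hz with fixed vBlank
# Stretching vBlank while waiting for the GPU lowers the effective refresh rate:
print(pixel_clock_hz / (h_total * (v_active + 600)))       # ~40.2 Hz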
So, nvidia is selling an nvidia monitor with an nvidia chip to use nvidia commands with their nvidia processor that works with the nvidia cable on the nvidia-powered PC because “that's the way it's meant to be played”.
I’ll believe it when I see it, and I’m not paying 100000 dollars for an extra 30 hertz when we all know we are content with 60 and haven’t even transitioned to 120-240 hertz yet.
We are content with 60? Who's we? Anybody who has gamed on a 120Hz monitor won't want to go back; it's a huge improvement. There's nothing wrong with being content with technology, but that won't stop it from advancing. When people experience the advances, then they prob won't be so content.
My monitor is 2560*1440 @ 120hz. It’s fantastic. The ‘you can’t see a difference’ bit is a fallacy and I have yet to show it to someone who can’t see the difference. They may write off the difference as trivial or insignificant, but they will admittedly see the difference.
That isn't to say you NOTICE the difference all the time, just like you don't notice the colors aren't calibrated when playing a game. That's the point, though: if you noticed it all the time it would detract from what you are doing, but it is definitely an enhancement. Just like well done 3D, especially something like using an Oculus Rift, you shouldn't be thinking ‘This is 3D, This is 3D, This is 3D’ the entire time you play a game; you should be immersed in the game, with the improvements making it an overall better experience, not constantly reminding you or pulling your attention away so that you are actively aware they exist, detracting from the experience.
There was already a display controller architecture update coming – to implement Panel Self Refresh (to conserve power). Implementing G-Sync (or similar) technology at the same time will be splendid.
I will buy this 🙂
http://www.geforce.com/hardware/technology/g-sync/faq
NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.
What I am worried about is SLI and triple display. I am not sure if I am the only one, but I wasn't able to overclock the pixel clock on my SLI and triple monitor setup. A single display works. Any hints?
Ya Ruski ! Great article. We shout … for Shrout.
NVIDIA FTW !