Our Legacy's Influence
NVIDIA announces G-Sync and we dive into why it matters to you.
We are often creatures of habit. Change is hard, and oftentimes legacy systems that have been in place for a very long time can shape the angle at which we attack new problems. This happens in the world of computer technology but also outside the walls of silicon, and the result can be dangerous inefficiencies that threaten to limit our advancement. Often our need to adapt new technologies to existing infrastructure can be blamed for stagnant development.
Take the development of the phone as an example. The pulse-based phone system and the rotary dial slowed the adoption of touch-tone phones and forced manufacturers to include switches for selecting between pulse and tone dialing for decades.
Perhaps a more substantial example is the railroad system, which based its track gauge (the width between the rails) on transportation methods that existed before the birth of Christ. Horse-drawn carriages pulled by two horses had an axle width of roughly 4 feet 8 inches, and so the first railroads in the US were built with a similar track gauge. Today the standard rail gauge remains 4 feet 8.5 inches, despite the fact that a wider gauge would provide more stability for larger cargo loads and allow for higher speed vehicles. But updating the existing infrastructure around the world would be so cost prohibitive that we will likely remain on that outdated standard.
What does this have to do with PC hardware, and why am I giving you an abbreviated history lesson? There are clearly examples of legacy infrastructure limiting our advancement in hardware development. Solid state drives are held back by the current SATA-based storage interface, though we are seeing a move to faster interconnects like PCI Express to alleviate this. Some compute tasks are limited by the “infrastructure” of standard x86 processor cores, and the move to GPU compute has changed the direction of these workloads dramatically.
There is another area of technology that could be improved if we could just move past an existing way of doing things. Displays.
Development of the CRT and the Refresh Rate
The very first CRT (cathode ray tube) was built in Germany in 1897, but it took a Russian scientist named Boris Rosing in 1907 to first draw simple shapes on a screen, work that would lead to the creation of the television and the monitor. 1934 saw the first commercially available electronic televisions with cathode ray tubes, built by Telefunken in Germany.
Image source: CircuitsToday
A CRT TV or monitor produces an image on a fluorescent screen using a set of electron guns whose beams excite phosphors of different colors (most often red, green, and blue). The guns sweep in a fixed, systematic pattern called a raster, from left to right and from top to bottom (as you face the display), in order to draw the image provided by a graphics controller on the screen.
CRT displays were built around a refresh rate, otherwise known as the scan rate, which represents how quickly the electron guns can draw a complete image from top to bottom and then return to the starting position (top left). All kinds of devices have a refresh rate: CRTs, LCDs, movie projectors, and so on. On CRT monitors, a faster refresh rate resulted in reduced screen flicker and reduced eye strain for users. Screen flicker is the result of the eye being able to see the phosphors going dark before the next scan reactivates them; the faster the monitor could update the image and keep the phosphors illuminated, the less likely you were to see flickering.
Image source: Wikipedia
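To put rough numbers on that (a quick illustrative calculation, not something from the original article), the time the guns have for each complete pass is simply the inverse of the refresh rate:

```python
# Time between complete top-to-bottom scans at a few common refresh rates.
def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds the display has before it must begin the next full redraw."""
    return 1000.0 / refresh_hz

for hz in (50, 60, 75, 85, 120):
    print(f"{hz:>3} Hz -> one full scan every {frame_time_ms(hz):.1f} ms")
```

At 60 Hz each pass comes every 16.7 ms or so; pushing a CRT to 85 Hz shortened that gap to under 12 ms, which is why higher refresh rates made flicker much harder to notice.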
What may not be common knowledge is why refresh rates were set at the speeds they most commonly are. In the 1920s, as TVs were first being produced, it became obvious that running vacuum tube hardware at anything other than a multiple of the AC line frequency coming into a home was problematic. AC power lines run at 60 Hz in the US and 50 Hz in Europe; starting to sound familiar? By running a TV at 60 Hz in the US, television manufacturers could prevent the moving horizontal bands of noise caused by power supply ripple.
An entire industry was built around the 60 Hz (and, in Europe, 50 Hz) refresh rate, which has caused numerous other problems in video. Films are most often recorded at 24 frames per second, which does not divide evenly into a 60 Hz refresh rate, introducing the need for technologies like telecine and 3:2 pulldown to match up the rates. Many readers will likely remember the uneven, juddering movement of movies played back on TV; these rate-matching problems are the reason why.
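As a rough sketch of how 3:2 pulldown bridges that gap (the function name and frame labels here are purely illustrative), each group of four film frames is stretched across ten video fields by holding a frame for three fields, then the next for two, and so on:

```python
# Illustrative 3:2 pulldown cadence: 4 film frames -> 10 video fields,
# so 24 film frames per second become 60 fields per second.
def pulldown_3_2(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # hold for 3 fields, then 2, then 3...
        fields.extend([frame] * repeats)
    return fields

print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Because some frames stay on screen longer than others, motion is no longer perfectly even, and that uneven cadence is exactly the judder described above. (NTSC video actually runs at 59.94 fields per second, which adds a further tiny speed adjustment on top of this.)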
Why do LCD monitors even need that? Why not have the GPU send a "done" signal instead, where each complete frame carries a one- or two-bit done packet telling the monitor that the frame is fully rendered, and the display, in a dumber fashion, simply shows each complete frame?
All the monitor would tell the video card is the maximum frame rate it can handle, and the GPU would cap its frame rate to that limit.
Think of it like this: imagine one person showing drawings to a group of people, and next to that person, someone else drawing pictures quickly. Once he or she finishes a drawing, they ring a bell, and the person showing the images simply grabs the next one and shows it, not caring about the rate at which they are being drawn.
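The "ring a bell" idea above can be sketched in a few lines of Python. This is only a toy model with made-up names and timings, not a description of how G-Sync or any real display link actually works:

```python
# Toy model of a frame-done handshake: the renderer signals when a frame is
# complete, and the display shows each finished frame as it arrives, capped
# only by the panel's maximum refresh rate.
import queue
import random
import threading
import time

PANEL_MAX_HZ = 144                     # the one thing the display advertises
MIN_FRAME_INTERVAL = 1.0 / PANEL_MAX_HZ
FRAME_COUNT = 20

finished_frames = queue.Queue(maxsize=1)

def renderer():
    for frame in range(FRAME_COUNT):
        time.sleep(random.uniform(0.005, 0.030))  # variable render time per frame
        finished_frames.put(frame)                # the "bell": this frame is done

def display():
    last_flip = 0.0
    for _ in range(FRAME_COUNT):
        frame = finished_frames.get()             # wait for a complete frame
        wait = MIN_FRAME_INTERVAL - (time.monotonic() - last_flip)
        if wait > 0:
            time.sleep(wait)                      # never exceed the panel's max rate
        last_flip = time.monotonic()
        print(f"displayed frame {frame}")

threading.Thread(target=renderer, daemon=True).start()
display()
```

Because the queue holds only one finished frame, the renderer stalls whenever it gets ahead of the display, which is roughly the frame-rate cap described in the comment.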
Razor512 > From my understanding, that is what G-Sync is attempting to resolve: to ignore the monitor's fixed timing and display the images from the GPU when they are ready.
I believe they are focusing on LED/LCD monitors, as these are widely available and the dominant display format. Being the most restricted in clock timing, they give Nvidia a great foothold for the future.
Once it is well adopted, they will go down the chain to the next widespread display type.
We will know more in Fall 2014.
All I can say is, it's about time someone this large released something like this. There will be many more advantages from the GPU with these new features.
For me, this may be the last nail in AMD's coffin.
I've had mostly ATI/AMD cards, but it's clear now who is the leader and who is the follower in this technology race.
I just wish Nvidia made cooler running cards (80C, ugh :S) and that we didn't need to sell a kidney for a very good card.
AMD/BBY drinking at a bar together very soon.
Well, if this signals the end of AMD GPUs, won't it mean that NVIDIA will monopolize PC graphics cards and we will pay an arm and a leg just to buy a great graphics card?
I have to say the article left me with the feeling that several crucial issues are not even mentioned. I am not an expert and I am not absolutely sure I remember everything correctly, but here are the issues I would like to see explained:
1. As far as I can remember, LCD monitors do not tolerate very low refresh rates well, although these very low frequencies are probably not important for gaming.
2. Flexible refresh probably has more important applications than improved gaming. How about power savings? How about doing away with frame rate conversions on some archival videos? These seem more important for the general public.
3. OLED in monitors and TVs is probably coming in the very near future, and other technologies like electrowetting are also a possibility. These should be considered as well, not just LCD.
4. It is understandable that Nvidia wants to make money on proprietary technology, but looking from a wider perspective, only a standardized, non-proprietary technology seems sensible here.
In regards to #2 “Power Savings”
The Displayport standards can directly control a display’s refresh rate. One of these reasons IS energy savings.
It’s not just a video out plug. It’s a whole new world.
I wouldn’t be surprised if G-Sync works because of Displayport standards and the vast number of possibilities that Displayport makes possible.
Yeah, I'm bummed it's another proprietary push from Nvidia, but at least it is such a basic concept that they can lock it away in patents. I do have hope this could drive an open standard that gets picked up and implemented.
Ugh, I meant can’t lock it away.
Been hearing a lot about train derailments in Canada lately.
“tasks are limited by the “infrastructure””
Perhaps the Government should visit PCper and nVidia and discuss the concept of G-Sync and apply it to the real world! 🙂
Besides, computer hardware requires a lot of natural resources for production, and oil is part of that process. *hint hint*
Cheers!
It would be awesome if somebody compared G-Sync with Lucid Virtual Vsync.
It is unfortunate that Nvidia has not catered this technology to the many of us gaming enthusiasts who have just upgraded to 2560-wide 120 Hz IPS displays.
I doubt many people will jump ship to an inferior display simply to get rid of tearing (even though that is a significant improvement).
I would have liked to see Nvidia produce adapters for monitors that use DVI.
For me, this probably means I'll buy a 290X, because I wouldn't be able to take advantage of G-Sync.
Great article and even greater technology.
Does this also work with Nvidia's 3D glasses for 3D gaming?