Our Legacy's Influence
NVIDIA announces G-Sync and we dive into why it matters to you.
We are often creatures of habit. Change is hard. And legacy systems that have been in place for a very long time can shift and determine the angle at which we attack new problems. This happens in the world of computer technology, but also outside the walls of silicon, and the results can be dangerous inefficiencies that threaten to limit our advancement in those areas. Often our need to adapt new technologies to existing infrastructure can be blamed for stagnant development.
Take the development of the telephone as an example. The pulse-based phone system and the rotary dial slowed the adoption of touch-tone phones and forced manufacturers to include switches for selecting between pulse and tone dialing on phones for decades.
Perhaps a more substantial example is the railroad system, which based its track gauge (the width between the rails) on transportation methods that existed before the birth of Christ. In the 1800s, horse-drawn carriages pulled by two horses had an axle width of roughly 4 feet 8.5 inches, and thus the first railroads in the US were built with a track gauge of 4 feet 8.5 inches. Today, the standard rail track gauge remains 4 feet 8.5 inches, despite the fact that a wider gauge would provide more stability for larger cargo loads and allow for higher speed vehicles. But updating the existing infrastructure around the world would be so cost prohibitive that we will likely remain with that outdated standard.
What does this have to do with PC hardware, and why am I giving you an abbreviated history lesson? There are clearly some examples of legacy infrastructure limiting our advancement in hardware development. Solid state drives are held back by the current SATA storage interface, though we are seeing a move to faster interconnects like PCI Express to alleviate this. Some compute tasks are limited by the "infrastructure" of standard x86 processor cores, and the move to GPU compute has changed the direction of these workloads dramatically.
There is another area of technology that could be improved if we could just move past an existing way of doing things. Displays.
Development of the CRT and the Refresh Rate
The very first CRT (cathode ray tube) was built in Germany in 1897, but it was the Russian scientist Boris Rosing who, in 1907, first drew simple shapes on a screen, work that would lead to the creation of the television and the monitor. 1934 saw the first commercially available electronic televisions with cathode ray tubes, built by Telefunken in Germany.
Image source: CircuitsToday
A CRT TV or monitor produces an image on a fluorescent screen using a set of electron guns that fire accelerated electron beams to excite phosphors of different colors (most often red, green and blue). The guns move in a fixed and systematic pattern called a raster, from left to right and from top to bottom (as you face the display), in order to draw the image provided by a graphics controller on the screen.
CRT displays were built around a refresh rate, otherwise known as the scan rate, which represents how quickly the electron guns can draw a complete image from top to bottom on the screen and then return to the starting position (top left). All kinds of devices have a refresh rate – CRTs, LCDs, movie projectors, etc. On CRT monitors, a faster refresh rate resulted in reduced screen flicker and reduced eye strain for users. Screen flicker is the result of the eye perceiving the phosphors going dark before the next "scan" re-activates them. The faster the monitor could update the image and keep the phosphors illuminated, the less likely you were to see flickering.
Image source: Wikipedia
What may not be common knowledge is why refresh rates were set at the speeds they are most commonly built at. In the 1920s, as TVs were first being developed, it became obvious that running vacuum tube hardware at anything other than a multiple of the AC line frequency coming into a home was problematic. AC power lines run at 60 Hz in the US and 50 Hz in Europe – starting to sound familiar? By running a TV at 60 Hz in the US, television manufacturers could prevent the moving horizontal bands of noise caused by power supply ripple.
An entire industry was built around the 60 Hz (and 50 Hz in Europe) refresh rate, and this has caused numerous other problems in video. Films are usually recorded at 24 frames per second, which does not divide evenly into the 60 Hz refresh rate, introducing the need for technologies like telecine and 3:2 pulldown to match the rates. Many readers will likely remember the uneven, jittering movement of movies played back on TV – these rate matching problems are the reason why.
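The cadence behind 3:2 pulldown can be sketched in a few lines of Python (the function name and frame labels here are hypothetical, just for illustration): each group of four 24 fps film frames is stretched across ten interlaced 60 Hz video fields by alternately holding a frame for three fields, then two.

```python
# Sketch of 3:2 pulldown: mapping 24 fps film frames onto 60 Hz
# interlaced video fields (two fields per video frame).

def pulldown_3_2(film_frames):
    """Expand film frames into video fields using the 3-2 cadence.

    Every 4 film frames become 10 fields (5 video frames), so
    24 fps film plays back at the 30 fps (60 field) video rate.
    """
    pattern = [3, 2]  # hold for 3 fields, then 2 fields, repeating
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * pattern[i % 2])
    return fields

fields = pulldown_3_2(["A", "B", "C", "D"])
print(fields)       # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(fields))  # 10
```

The uneven hold times are the source of the judder: frame A is on screen for three field periods while frame B gets only two, so motion that was perfectly even on film advances in a lurching 3-2-3-2 rhythm on a 60 Hz display.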