Introduction of LCD Monitors
CRT displays were used on computers from the very earliest days, when they replaced the arrays of light bulbs used for information output. In the 1980s, though, the CRT went through a revolution as capabilities improved from monochrome to 16 colors and resolutions of 640×350 (EGA) were reached.
LCD (liquid crystal display) monitors were first introduced to consumers in the mid-1990s at high prices and with relatively low image quality compared to the CRT displays of the time. LCD manufacturers were forced to integrate a completely new display technology into an existing display controller infrastructure that was dominated by the CRT. Part of that integration was the adoption of a fixed refresh rate, even though from a technical point of view LCDs function very differently.
LCDs produce an image on the screen by combining a persistent backlight (CCFL or LED) with voltage applied directly to each pixel. These pixels can be updated without interference from or with other segments of the display, in any order and at any rate. There is also no need to refresh the pixels to work around the CRT limitation called phosphor persistence, because LCD pixels do not go dark between refreshes. This means there is effectively no minimum refresh rate for LCD panels. (Though in extreme cases, below roughly 15 FPS, a refresh might be necessary to avoid visible brightness variance.)
ASUS VG248QE – a 144 Hz Refresh LCD Monitor
Because LCD monitors don't have a collection of electron beams that "paint" the screen and don't need to reset back to a starting position, they have no technical reliance on a fixed update rate. Instead, LCD monitors are limited by how quickly the liquid crystal and the silicon driving it can change state; today we call that period the response time. There is plenty of debate about exactly how to measure it (grey to grey, etc.), but this rate is completely independent of the refresh rate as it stands today.
Displays at the time, LCD or CRT, took in an analog signal from the computer's graphics system, and an LCD needed an analog-to-digital converter to translate that signal into the digital data that would be painted on the panel. The controllers inside these displays received data from the computer at a fixed rate tied to the refresh rate of the CRT. Early LCDs used very similar electronics for this analog conversion and adopted the refresh rate along with them. It wasn't until 2003 that LCD displays first outsold their CRT competitors, and by that time refresh rates on LCDs were fully entrenched.
Integration of Refresh Rates Today
Those of us old enough to remember CRTs as the dominant display type will remember having different refresh rate options for a monitor. High end monitors could run at 60 Hz, 75 Hz, 100 Hz, etc. to improve visual quality, all while running at different resolutions (CRTs had no "native" resolution). A technology called EDID (Extended Display Identification Data) was created in 1994 to allow a monitor to pass information to the computer indicating the resolutions and refresh rates the monitor supports.
With that information, it is up to the graphics card to set the timing and signal the refresh rate back to the display. (This explains how you can create custom resolutions and timings in a GPU driver that may not be accepted by your monitor.) In fact, the scan rate is controlled by a "vertical blanking" signal sent to the monitor; on CRTs it gave the electron guns time to move from their finishing position (bottom right) back to the starting position (top left), ready to draw another frame on the screen.
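To make the EDID handshake a bit more concrete, here is a minimal Python sketch of how a preferred mode and its refresh rate can be pulled out of the first 18-byte detailed timing descriptor in an EDID block. The field offsets follow the public EDID 1.x layout; the /sys/class/drm path is just an illustrative Linux location for the raw bytes, not something referenced in the article.

```python
# Sketch: read the first detailed timing descriptor from a raw EDID block
# and derive the mode's refresh rate from its pixel clock and blanking.
def preferred_mode(edid: bytes):
    d = edid[54:72]                                  # first 18-byte descriptor
    pixel_clock_hz = int.from_bytes(d[0:2], "little") * 10_000
    h_active = d[2] | ((d[4] & 0xF0) << 4)           # visible pixels per line
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)           # sync + front/back porch
    v_active = d[5] | ((d[7] & 0xF0) << 4)           # visible lines per frame
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    refresh_hz = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, refresh_hz

# Illustrative path only; the location of the raw EDID varies by OS and connector.
with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
    w, h, hz = preferred_mode(f.read(128))
print(f"Preferred mode: {w}x{h} @ {hz:.2f} Hz")
```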
LCDs today do not need this signal as there is no electron gun to reposition. So why have LCD manufacturers continued to implement static refresh rates of 60 Hz, 120 Hz or even 144 Hz?
The answer lies in the connectivity options and controllers powering displays that are available today. As the analog signals of VGA were replaced with the likes of DVI and HDMI, these higher speed, digital connection options had to conform to existing interface patterns to maintain compatibility with older output types. How many DVI-I to VGA passive adapters do you have laying around with your graphics card?
DVI, HDMI and early versions of DisplayPort transmit display data at a rate tied directly to the refresh rate. A pixel clock is simply a timing signal that divides each incoming scan line of video into individual pixels. The simplicity of the math might surprise you:
2560×1600 Resolution
2720 pixels × 1646 lines × 60 Hz = 268.627 MHz pixel clock
Breakdown of current Display Timing Variables
The extra pixels in each dimension (160 horizontal, 46 vertical) are actually part of the legacy CRT timing standard: they make room for the sync width and front/back porch intervals that space out signal and synchronization communications, giving the electron guns time to reposition.
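The same arithmetic in a few lines of Python, as a sanity check. The 160/46 blanking figures are simply the ones quoted above for this particular 2560×1600 mode; in practice they would come from the monitor's EDID or a timing formula such as CVT.

```python
# Pixel clock = total pixels per scan line * total lines per frame * refresh rate.
h_active, v_active = 2560, 1600      # visible resolution
h_blank,  v_blank  = 160, 46         # sync width + front/back porch (from the example above)
refresh_hz = 60

h_total = h_active + h_blank         # 2720 pixels per scan line
v_total = v_active + v_blank         # 1646 lines per frame
pixel_clock_hz = h_total * v_total * refresh_hz

print(f"Pixel clock: {pixel_clock_hz / 1e6:.3f} MHz")   # 268.627 MHz
```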
If you look at DVI and HDMI specifications you will often see performance rated by maximum pixel clock. DVI is limited to 165 MHz per link, or the resolution equivalents of 1920×1200 @ 60 Hz and 1280×1024 @ 85 Hz. The recent HDMI 2.0 update raised the maximum pixel clock to 600 MHz, enough for 3840×2160 @ 60 Hz.
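A quick way to see where those limits bite is to compare a mode's required pixel clock against the link maximum. The sketch below reuses the illustrative 160/46 blanking figures from the example above rather than the official CVT or CEA timings, so the numbers are approximate.

```python
# Does a mode fit under a link's maximum pixel clock?
# Limits from the text: single-link DVI ~165 MHz, HDMI 2.0 ~600 MHz.
def required_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=46):
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

modes = [(1920, 1200, 60), (2560, 1600, 60), (3840, 2160, 60)]
for link, limit in (("single-link DVI", 165.0), ("HDMI 2.0", 600.0)):
    for w, h, hz in modes:
        need = required_mhz(w, h, hz)
        verdict = "fits" if need <= limit else "exceeds the limit"
        print(f"{w}x{h} @ {hz} Hz on {link}: needs {need:.0f} MHz, {verdict}")
```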
DisplayPort introduced a self-clocking link while also supporting HDMI and DVI pixel clocks for backwards compatibility, but readers of PC Perspective are also aware that active adapters are sometimes required for conversion to DVI/HDMI connectivity. These adapters communicate with the graphics controller in the computer to negotiate speeds and convert the protocol and signal.
So even though DisplayPort and HDMI can technically support some impressive bandwidth (17.28 Gbps of effective data rate for DP 1.2), they are still working with legacy timing logic that keeps them slow and "lazy". The raw connectivity is there to push much higher refresh rates than we are accustomed to, without some of the complications that occur between modern graphics cards and monitors.
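For reference, the 17.28 Gbps figure for DP 1.2 is just the raw link rate with the 8b/10b encoding overhead removed: four lanes at the 5.4 Gbps HBR2 rate, of which 80% carries actual data. A one-line sketch confirms the arithmetic.

```python
# DisplayPort 1.2 effective bandwidth: 4 lanes x 5.4 Gbps (HBR2), 8b/10b leaves 80% for data.
lanes, lane_rate_gbps, encoding_efficiency = 4, 5.4, 8 / 10
print(f"{lanes * lane_rate_gbps * encoding_efficiency:.2f} Gbps")   # 17.28 Gbps
```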
The question you should be asking yourself now is: what happens if you completely re-write the logic of communications between a GPU and a monitor?
Why do LCD monitors need even that? Why not have the GPU send a "done" signal instead, where each completed frame ends with a 1 or 2 bit done packet that tells the monitor the frame is fully rendered, and in a more dumb fashion the display simply displays each complete frame?
All the monitor tells the video card is the max frame rate it can handle, and the GPU will cap its frame rate to that limit.
Think of it like this: imagine one person showing drawings to a group of people, and next to that person someone drawing pictures quickly. Once a drawing is finished, the artist rings a bell, and the person showing the images simply grabs the next one and shows it, not caring about the rate at which the pictures are being drawn.
Razor512 > From my understanding, that is what G-Sync is attempting to resolve: to ignore the monitor's timing rate and display the images from the GPU when they are ready.
I believe they are focusing on LED/LCD monitors as these are widely available and the dominant display format. And being the most restricted in clock timing, it gives nVidia a great foothold for the future.
So when they have it well adopted, they will go down the chain to the next widespread display type.
We will know more in Fall 2014.
All I can say is, it's about time someone this large released something like this. There will be many more advantages from the GPU with these new features.
For me, this may be the last nail in AMD's coffin.
I've had mostly ATI/AMD cards, but it's clear now who is the leader and who is the follower in this technology race.
I wish Nvidia made cooler running cards (80°C, ugh :S) and that we didn't need to sell a kidney for a very good card.
AMD/BBY drinking at a bar together very soon.
Well if this signals the end of AMD GPUs, won't it mean that NVIDIA will monopolize PC graphics cards and we will pay an arm and a leg just to buy a great graphics card?
I have to say the article left me with the feeling that several crucial issues are not even mentioned. I am not an expert and I am not absolutely sure I remember everything correctly, but here are the issues I would like to see explained:
1. As far as I can remember, LCD monitors do not tolerate very low refresh rates well, although these very low frequencies are probably not important for gaming.
2. Flexible refresh probably has more important applications than improved gaming. How about power savings? How about discarding frame rate conversions on some archival videos? These seem more important to the general public.
3. OLED in monitors and TVs is probably in the very near future. Other technologies like electrowetting are also a possibility. These should also be considered, not just LCD.
4. It is understandable that Nvidia wants to make money on proprietary technology, but looking from a wider perspective, only a standardized, non-proprietary technology seems sensible here.
In regards to #2 “Power Savings”
The DisplayPort standard can directly control a display's refresh rate, and one of the reasons for that IS energy savings.
It’s not just a video out plug. It’s a whole new world.
I wouldn't be surprised if G-Sync works because of the DisplayPort standard and the vast number of possibilities that DisplayPort opens up.
Yeah, I'm bummed it's another proprietary push from Nvidia, but at least it is such a basic concept that they can lock it away in patents. I do have hope this could drive an open standard that can be picked up and implemented.
Ugh, I meant can’t lock it away.
Been hearing a lot about train derailments in Canada.
“tasks are limited by the “infrastructure””
Perhaps the Government should visit PCper and nVidia and discuss the concept of G-Sync and apply it to the real world! 🙂
Besides, computer hardware requires a lot of natural resources for production, and oil is part of that process. *hint hint*
Cheers!
It would be awesome if somebody compared G-Sync with Lucid Virtual Vsync.
It is unfortunate that nvidia has not catered this technology to many of us gaming enthusiasts who have just upgraded to 2560 120 Hz IPS displays.
I doubt many people will jump ship to an inferior display simply to get no tearing (even though it's a significant improvement).
I would have liked to see Nvidia produce adapters for monitors that use DVI.
For me, this probably means that I’ll buy a 290X because I wouldn’t be able to take advantage of G-sync.
Great article and even greater technology.
Does this also work with Nvidia's 3D glasses for 3D gaming?