DisplayPort Adaptive-Sync is a VESA standard, championed by AMD, that allows the input signal to control when a monitor refreshes. A conventional monitor redraws on a fixed interval because old CRT monitors needed to scan with an electron gun, and that took time; LCDs never needed this constraint, but they inherited it anyway. The monitor draws a frame on that schedule whether the GPU has one ready or not, which leads to tearing, stutter, and other nasty effects if the GPU can't keep up. With Adaptive-Sync, GPUs don't “miss the train”: the train leaves when they board.
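The “train” analogy can be sketched numerically. This is a toy illustration only (the function names and timings are invented for the comparison, not taken from any real driver API):

```python
# Toy comparison: when frames become visible on a fixed-refresh monitor
# versus an adaptive-refresh one. All values are illustrative.

FIXED_HZ = 60
REFRESH_INTERVAL = 1000.0 / FIXED_HZ  # ~16.67 ms between scanouts

def fixed_refresh_display(frame_ready_times_ms):
    """Each frame waits for the next scheduled scanout ('the train')."""
    shown = []
    for t in frame_ready_times_ms:
        # next multiple of REFRESH_INTERVAL at or after t (ceiling division)
        ticks = -(-t // REFRESH_INTERVAL)
        shown.append(ticks * REFRESH_INTERVAL)
    return shown

def adaptive_refresh_display(frame_ready_times_ms):
    """The monitor refreshes the moment each frame is ready."""
    return list(frame_ready_times_ms)

# A GPU rendering slightly slower than 60 FPS (a frame every 20 ms):
ready = [20.0, 40.0, 60.0]
print(fixed_refresh_display(ready))     # every frame waits for the next tick
print(adaptive_refresh_display(ready))  # every frame is shown immediately
```

On the fixed display each frame sits waiting for the next tick (and the wait drifts frame to frame, which is the stutter); on the adaptive one it is displayed as soon as it is rendered.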
Intel has, according to The Tech Report, decided to support Adaptive-Sync — but not necessarily in their current product line. David Blythe of Intel would not comment on specific dates or release windows, just that it is in their plans. This makes sense for Intel because it allows their customers to push settings higher while maintaining a smooth experience, which matters a lot for users of integrated graphics.
While “AMD FreeSync” is a stack of technologies, VESA DisplayPort Adaptive-Sync should be all that is required on the monitor side. This should mean that Intel has access to all of AMD's adaptive-refresh monitors, although the driver and GPU circuitry would be their burden. G-Sync monitors (at least those with NVIDIA-designed modules, which is currently all of them except perhaps one laptop) would be off limits, though.
This is BIG. Intel recently bought Altera, the people who make the G-Sync module for Nvidia. They also cross-license a lot of Nvidia's GPU tech.
They would have more insight into the future viability of G-Sync than anyone aside from Nvidia themselves, and they still decided to go the AMD route.
THAT'S BIG!!!
It’s not the AMD route at all, since it’s only going to work on Intel hardware, much the same as G-Sync only works on nVidia’s hardware. It’s very much the nVidia route.
What do you mean? The whole situation with Adaptive-Sync vs. FreeSync is unclear to me. I have seen several things indicating that FreeSync is part of the VESA standard. For example, the Wikipedia entry for FreeSync indicates that it is part of the VESA standard. If a display supports Adaptive-Sync, does this mean that it can automatically support FreeSync? That is, from the display's perspective, is there any difference between Adaptive-Sync and FreeSync?

If Intel supports the DisplayPort 1.2a or 1.3 standard, then they will probably support Adaptive-Sync. I don’t think it is clear whether they actually implemented variable refresh rate for games, though. AFAIK, the initial reason behind developing Adaptive-Sync was to reduce power consumption on mobile by dropping the screen refresh to a minimum when the image is not changing. Hopefully they will support variable refresh rate for games somehow. I don’t see how they would ever use G-Sync, though.
Intel has been utilizing VBLANK manipulation to reduce power consumption for a long time, since before G-Sync (NVIDIA's adaptive sync) and FreeSync (AMD's adaptive sync) existed. Now Intel needs to take that to the next level and fully embrace Adaptive-Sync.
Technically speaking, ANY current FreeSync monitor could be used for ANY implementation of Adaptive Sync going forward. It requires NVIDIA to ditch the custom module for desktop the same way they ditched it on mobile and for Intel to develop their own software stack.
G-Sync, FreeSync, and I-Sync(?) will probably exist as brands forever, but they will all just be software stacks built around their individual GPU designs – like FreeSync and mobile G-Sync. Instead of a module inside the monitor, a small portion of the GPU will act as the controller. The nice part will be that any Adaptive Sync monitor will work with all three.
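The core of such a software stack is a scheduling decision: hold the blanking period until the next frame is ready, within the panel's supported range. A minimal sketch of that decision, with invented names and an assumed 40–144 Hz panel range:

```python
# Hedged sketch of driver-side adaptive sync: pick the next scanout time
# for one frame, clamped to the panel's refresh range. The range and the
# function are illustrative assumptions, not any vendor's actual API.

MIN_INTERVAL_MS = 1000 / 144  # panel's fastest refresh (~6.9 ms)
MAX_INTERVAL_MS = 1000 / 40   # panel's slowest refresh (25 ms)

def next_scanout(last_scanout_ms, frame_ready_ms):
    """Return the moment the controller should start the next scanout."""
    earliest = last_scanout_ms + MIN_INTERVAL_MS
    latest = last_scanout_ms + MAX_INTERVAL_MS
    if frame_ready_ms <= earliest:
        return earliest        # frame was fast: wait for the panel minimum
    if frame_ready_ms <= latest:
        return frame_ready_ms  # in range: refresh the moment it's ready
    return latest              # too slow: the previous frame must be repeated
```

The last branch is where the implementations differ in practice: below the panel's minimum rate, something has to repeat frames, whether that is a module in the monitor or logic on the GPU side.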
That is a bit misleading. Altera just makes FPGAs. They may not have any actual data on how Nvidia’s G-Sync module works unless Nvidia sent them the design for debugging or something. Using an FPGA is kind of like using a CPU that you then need to write software for: if I buy a CPU, the maker of that CPU doesn’t know what software I run on it. FPGAs are programmed using a hardware description language like Verilog rather than a software programming language. Altera doesn’t necessarily have access to the Verilog that Nvidia uses to program the FPGA.
If Nvidia is confident that there will be a large enough volume of G-Sync modules sold, then they can actually use the Verilog design to create a fixed-function ASIC. This should be much cheaper, if there is sufficient volume. I tried to find out the price of the FPGA Nvidia is using, and it looked like it was around $200 in small volumes, if I had the right part. Nvidia would get a better price for a large number of parts, though. I don’t know who takes the FPGA and mounts it on a board to make the actual G-Sync module; Nvidia probably just contracts this out to some other company.
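The volume argument can be made concrete with a rough break-even calculation. Everything here except the ~$200 FPGA figure from the comment above is an assumed placeholder:

```python
# Back-of-the-envelope FPGA-vs-ASIC break-even. The ASIC NRE (one-time
# engineering/mask cost) and per-unit cost below are made-up assumptions;
# only the ~$200 FPGA price comes from the discussion.

FPGA_UNIT_COST = 200.0    # per-module FPGA price at small volumes
ASIC_NRE = 2_000_000.0    # one-time ASIC design/mask cost (assumed)
ASIC_UNIT_COST = 10.0     # per-chip cost once fabricated (assumed)

def break_even_units(fpga_unit, asic_nre, asic_unit):
    """Volume above which a fixed-function ASIC is cheaper than FPGAs."""
    return asic_nre / (fpga_unit - asic_unit)

print(break_even_units(FPGA_UNIT_COST, ASIC_NRE, ASIC_UNIT_COST))
# → ~10,500 modules under these assumptions
```

The point is only that the crossover comes surprisingly early, which is why an FPGA makes sense for a niche product and an ASIC makes sense once volume is assured.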
This is VESA Adaptive-Sync, and VESA is a standards organization, like the PCI-SIG (PCI) or the other industry bodies that define standards such as USB (USB-IF). So to associate Adaptive-Sync with only AMD is simply not correct. AMD's FreeSync was developed from VESA's eDP laptop display sync standard. Lots of companies contribute to VESA, and Intel is one member of VESA along with AMD, Nvidia, and others. I’d expect that Intel would eventually support any VESA standard. It’s no big deal for an industry to adopt a standard from an industry standards organization like VESA! And no, Altera only makes FPGAs; it’s Nvidia that programs the FPGAs that they purchase from Altera! The G-Sync logic programmed into the FPGA is from Nvidia, and Nvidia shares that with NO ONE! Do you even know what an FPGA is, and what it’s for?
Altera only manufactures the FPGA chip. Nvidia makes the module itself and programs the FPGA.
What? That makes no sense. Altera makes FPGAs, and those are basically empty chips; Altera has no knowledge of what Nvidia has done with them.
It’s like saying that Seagate knows what secrets the NSA has because the NSA stores them on Seagate disks.
oops!! need to remember to refresh the web page before doing a reply, several people had already made the same point.
R.I.P. Gsync.
Now just watch how Adaptive Sync/Freesync will skyrocket.
I would love to see Allyn’s comment. I wonder if he will call FreeSync vaporware again and continue insulting, in the comments, those who support FreeSync/Adaptive-Sync against the proprietary nature of G-Sync.
Now that Intel is going the adaptive sync route, and considering that he shows the same love and enthusiasm for Intel as he shows for Nvidia, he might have a different song to sing this time.
Sure this is a blow to G-Sync, but I doubt Nvidia will really suffer. There is not much of a point to adaptive-sync on an Intel IGP, since almost nobody games on an IGP, and I doubt someone who can’t afford a discrete GPU would pay the small premium there currently is for an adaptive-sync monitor. The real market for those interested in variable refresh rate is now 80% Nvidia.
What nvidia needs to do is find a way to lower the insane price and they will be fine.
No, it just means that Nvidia will eventually have to provide support for VESA Adaptive-Sync alongside its G-Sync if it wants to maintain market share. There will still be people buying Nvidia’s overpriced SKUs, and some will still pay for G-Sync. What will happen is that the display manufacturers will make all their products support VESA’s standards. VESA is the standards organization, with even Nvidia represented on VESA’s committees and BOD. I’m sure the display manufacturers get a slight markup on and above Nvidia’s BOM for the added G-Sync circuitry, so they will continue selling G-Sync-enabled products.

At some point in time all the TV controllers (CPU/APU/SOC based) will have Adaptive-Sync baked into the controllers themselves, or into the controllers’ firmware, so it won’t matter, as there will simply be no controllers for sale to the display industry that are not programmed/enabled for VESA’s Adaptive-Sync, and the supply of TV controllers without Adaptive-Sync capabilities will disappear. Once a feature becomes standard, that feature is usually baked into all the TV controllers by default, and I do not see any display manufacturer not implementing a VESA standard if the capability comes standard with the controller’s hardware.
If you look at the Steam survey and see how many people use Intel GPUs for gaming… I’d say Nvidia’s potential G-Sync market share is below half, tbh.
Nvidia should realize that their window of opportunity to monopolize the Variable Refresh Rate market with a proprietary technology (G-Sync) has passed. They should jump on board the VESA standard or contribute to it.
Why, then, is the Acer Predator X34 FreeSync model only 75 Hz when the G-Sync model will be 100 Hz?
They were never going to get a monopoly on VRR, and I doubt they had ambitions to do so anyway.
G-Sync is a different solution to VRR that still has inherent advantages, with the module being able to maintain VRR even outside of the monitor’s hardware range and also able to predict pixel overdrive (minimizing ghosting). The module also allows for expanded functionality. As it is, they all support ULMB out of the box, and more features are likely to follow in short order.
AMD fanboys may be eager to dance on the grave of G-Sync, but this will hopefully motivate Nvidia to keep making their own solution better and more affordable. I don’t want a race to the lowest common denominator just because it’s an “open standard.”
ULMB might be the only reason to get G-Sync going forward, tho… but not if it’s only on small 16:9 TN garbage all the time.
And they can’t combine G-Sync with ULMB at the same time.
There is nothing else to do on LCD, really.
I really don’t know what you expect they will deliver… some more GeForce Experience bullshit?
The new Acer XB270HU is an IPS with ULMB, and Asus’ successor to the Swift will be the same.
Hardware-wise, the G-Sync module already has added functionality, as it can control pixel overdrive and double frames. That’s just one thing Nvidia can further explore that is off limits to AMD, seeing as they’ve limited themselves to manipulating the monitor’s VBLANK.
In the end, having your own hardware is just going to enable you to innovate more than a company tied to a standard.