During an interview with NVIDIA's Tom Petersen that we streamed live this past Thursday, it was confirmed that NVIDIA is not currently working on, and has no current plans to add, support for the VESA-based and AMD-pushed Adaptive Sync portion of the DisplayPort 1.2a specification. To quote directly:
"There is no truth [to that rumor of NVIDIA Adaptive Sync support] and we have made no official comments about Adaptive Sync. One thing I can say is that NVIDIA as a company is 100% dedicated to G-Sync. We are going to continue to invest in G-Sync and it is a way we can make the gaming experience better. We have no need for Adaptive Sync. We have no intention of [implementing it]."
Discussion of G-Sync begins at 1:27:14 in our interview.
To be clear, Adaptive Sync is an optional portion of the DP 1.2a and 1.3+ VESA specs and is not required for future graphics processors or even future display scaler chips. That means that upcoming graphics cards from NVIDIA could still be DisplayPort 1.3 compliant without implementing support for the Adaptive Sync feature. Based on the comments above, I fully expect that to be the case.
The ASUS ROG Swift PG278Q G-Sync monitor
With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users of Radeon cards. (Where Intel falls into this is up in the air.) Clearly that isn't ideal for a completely open ecosystem, but NVIDIA has made the point, over and over, that what it has developed with G-Sync is a difficult problem, not one that can be solved with the blunt instrument that Adaptive Sync is. NVIDIA has a history of producing technologies and then keeping them in-house, focusing development specifically on GeForce owners and fans. The dream of having a VRR monitor that will run on both vendors' GPUs appears to be dead.
When asked about the possibility of seeing future monitors that can support both NVIDIA G-Sync technology as well as Adaptive Sync technology, Petersen stated that while not impossible, he "would not expect to see such a device."
The future of G-Sync is still in development. Petersen stated:
"Don't think that were done. G-Sync is not done. Think of G-Sync as the start of NVIDIA solving the problems for gamers that are related to displays…G-Sync is our first technology that makes games look better on displays. But you can start looking at displays and make a lot of things better."
Diagram showing how G-Sync affects monitor timings
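To put rough numbers behind that diagram, here is a simplified, illustrative sketch of the timing difference between a fixed-refresh panel running with v-sync and a variable refresh panel. The 60 Hz fixed rate, the hypothetical 30-144 Hz VRR window, the frame times, and the function names are all assumptions chosen for illustration; this is not NVIDIA's G-Sync logic or AMD's FreeSync logic, just the general idea.

```python
# Simplified, illustrative model only -- not NVIDIA's G-Sync or AMD's FreeSync implementation.
# It compares when finished frames can appear on screen for a fixed 60 Hz panel with v-sync
# versus a hypothetical variable refresh panel limited to a 30-144 Hz window.

FIXED_REFRESH_MS = 1000.0 / 60.0       # fixed panel refreshes every ~16.7 ms
VRR_MIN_INTERVAL_MS = 1000.0 / 144.0   # assumed fastest the VRR panel can refresh
VRR_MAX_INTERVAL_MS = 1000.0 / 30.0    # assumed longest it can wait before refreshing anyway

def fixed_refresh_display_times(frame_done_ms):
    """With v-sync on a fixed panel, each frame waits for the next scheduled refresh tick."""
    shown = []
    for done in frame_done_ms:
        next_tick = (int(done // FIXED_REFRESH_MS) + 1) * FIXED_REFRESH_MS
        shown.append(next_tick)
    return shown

def variable_refresh_display_times(frame_done_ms):
    """With VRR, the panel refreshes when the frame is ready, clamped to its refresh window."""
    shown, last_refresh = [], 0.0
    for done in frame_done_ms:
        earliest = last_refresh + VRR_MIN_INTERVAL_MS  # cannot exceed the panel's max rate
        latest = last_refresh + VRR_MAX_INTERVAL_MS    # must refresh before dropping below min rate
        last_refresh = min(max(done, earliest), latest)
        shown.append(last_refresh)
    return shown

if __name__ == "__main__":
    # frames finishing at an uneven ~40-45 fps cadence (milliseconds since start)
    frames = [22.0, 47.0, 69.0, 95.0, 118.0]
    print("fixed 60 Hz:", [round(t, 1) for t in fixed_refresh_display_times(frames)])
    print("variable   :", [round(t, 1) for t in variable_refresh_display_times(frames)])
```

In this toy example the fixed panel shows frames at alternating gaps of one and two refresh intervals (the judder variable refresh is meant to remove), while the variable refresh panel simply tracks the frame times as they come.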
So now we await the first round of prototype FreeSync / Adaptive Sync monitors to hit our labs. AMD has put a lot of pressure on itself for this release by claiming, numerous times, that FreeSync will be just as good an experience as G-Sync, and I am eager to see if it can meet that goal. Despite any ill feelings that some users might have about NVIDIA and some of its policies, it typically does a good job of maintaining a high quality user experience with these custom technologies. AMD will have to prove that what it has developed is on the same level. We should know more about that before we get too much further into fall.
You can check out our stories and reviews covering G-Sync here:
- PCPer Live! NVIDIA Maxwell, GTX 980, GTX 970 Discussion with Tom Petersen, Q&A
- Acer XB280HK 28-in 4K G-Sync Monitor Review
- NVIDIA G-Sync Surround Impressions: Using 3 ASUS ROG Swift Displays
- PCPer Live! Recap – NVIDIA G-Sync Surround Demo and Q&A
- ASUS ROG Swift PG278Q 27-in Monitor Review – NVIDIA G-Sync at 2560×1440
If he thinks that people are going to purchase displays knowing that they are going to be forever locked into a specific GPU vendor to make proper use of them, he is out of touch with reality. I buy my GPU based on what's the best GPU. I buy my displays based on what's the best display. Anything that complicates this gets ignored by me.
After 15 years, for me it's NVIDIA or quit gaming.
Period
So you can have all the VESA SUPER STANDARD Adaptive Sync monitors; I will buy G-SYNC no matter what…actually 3 of them.
Period
Tom Petersen was there, and you idiots 🙂 didn't ask when G-SYNC is switching from FPGA to ASIC.
How very elitist of you.
@Lithium
Who will make these ASICs?
No ASIC manufacturer is going to put some proprietary tech into its hardware without some serious palm greasing.
Actually, NVIDIA could make all the monitor electronics.
Everything that is needed, plus G-Sync.
YEP…no more MediaTek, NoWatECH, reallYtech, whatever tech.
If you want a thing done well, do it yourself.
Tom said that current monitors are actually not that good for viewing…hehe
It almost looks like Nvidia is going back on its words from the Nvidia event last year in Montreal about maybe one day seeing G-Sync becoming a standard in everything from monitors to TVs.
No, that is still what Nvidia wants: G-Sync everywhere. In fact, I'm sure that if Nvidia had everything exactly the way they wanted it, the non-optional VESA standard would include G-Sync for every monitor and Nvidia GPUs would be the only ones compatible, every TV built from now on would be G-Sync compatible, and every console in the world would have a G-Sync compatible Nvidia GPU (and a Tegra CPU) in it. And to be fair, that's OK. It is the duty of a corporation in a capitalist system to attempt to increase wealth for the shareholders, and the surest way to do that is to control more market share than the competitor. Whether we agree with that philosophy or not, it is the way the system works. Such is life in America.
What Nvidia has stated, however, is simply that the adaptive sync portion of the newest VESA standards is optional and they have decided to opt out. It makes sense, seeing as how adaptive sync will be the driving power behind the only competing technology, FreeSync. Tom did claim that building a monitor compatible with both is something for the monitor manufacturers to decide on, if it is even possible (something that no one seems to know the answer to).
Soon I hope we will see FreeSync at work, then we will know if this is actually a competition or if one is truly clearly better.
Ps. I’m fascinated by this topic yet there is very little chance of it concerning me for a long time. This 1080p 27″ I’m using right now will probably not be replaced until it dies. That is normally the way of the monitors for most people, isn’t it?
I guess the monitor makers, who charge a ridiculous premium for G-Sync equipped monitors, and NVIDIA, who lock us into their GPUs, win, and we enthusiasts lose. And to think we are the reason NVIDIA is doing so well financially.
Ngreedia, the way it's meant to be paid.
I don't know, if Nvidia is planning to use G-Sync in TN monitors only, this technology is dead for me. I want a 9-60 Hz IPS 1080p monitor.
Nvidia didn't lock it into TN panels, the monitor makers did. A 60 Hz 1080p/1440p G-Sync monitor is pointless because most GPUs can do 60 on both anyway. IPS needs to get faster first.
They will reverse this stance when G-Sync dies, and it will, along with Mantle from AMD.
Let’s see what Intel does.
Nvidia sticks to its proprietary stuff and doesn't support open standards.
I can't believe this. WHO COULD HAVE THOUGHT THAT?
LOL and many many extra LOLs.
The new Nvidias are still OpenCL 1.1.
OpenCL 1.2 is 3 years old already.
So in 3 years Nvidia will keep producing cards with DisplayPort 1.2 and Nvidia fanboys will be proud of this tactic.
If AMD dies, open standards die in computer graphics.
Remember this EVERYONE.
AMD has been killing themselves even without Nvidia's help. They make claims, then don't live up to them when the product is released.
Even then, FreeSync isn't an open standard; it's based off the Adaptive Sync standard, using its protocol, but (as people love to use this word against Nvidia) FreeSync is proprietary software.
G-Sync is superior to the inferior, AMD & VESA developed Adaptive Sync. There's no reason why Nvidia would adopt inferior technologies like Adaptive Sync.
Adaptive Sync is an OPTIONAL spec even in DisplayPort 1.3.
As for OpenCL, AMD can't even compile large kernels in OpenCL, making it worthless for real work.
You live in a proprietary world: all that medical equipment in hospitals runs on proprietary software, your car runs proprietary software, your TV runs proprietary software. Don't like it? Too bad for you.
Windows is proprietary too, so why do you use Windows if you're such a "champion" of open standards?
spot on!
and if we were all limited to inferior open standards, no one would have a reason to truly innovate.
Yeah, no. You barely know what you are talking about.
Just because you see how monopolies are ripping the US a new asshole doesn't mean the same is happening in the rest of the world.
Anyone can make a processor.
China is making all the same Fords, BMWs and whatever under different names. Good. Cuz BRANDS and NAMES mean NOTHING. It's all just more stuff that our species makes. IT'S NOT SOMETHING SET IN STONE.
It's a fucking shame that Windows has this monopoly going on. But if you have paid any attention to what's been going on the past year, you'll notice that most of the tech and software companies are trying to give Microsoft the middle finger.
And good riddance. Because competition is the ONLY thing that will drive innovation. PROPRIETARY stuff and MONOPOLIES will NEVER DO THAT.
One thing everyone ignores is that adaptive sync and FreeSync are not one and the same: one is the standard, the other is a proprietary implementation of that standard.
true.
amusingly, some people want Nvidia to become followers instead of leaders and invest in some FreeSync-like implementation of an unintentionally capable-to-some-extent standard pushed by AMD in response to the already available and supposedly better solution invented by Nvidia in the first place, just so it is more comfortable for those people to buy AMD products instead.
Also, unless Nvidia does something (in contract or in hardware that we don’t know about) to prevent its implementation, there’s nothing to stop Acer, Samsung, or Asus from having hardware for G-Sync and supporting Adaptive-Sync (thus enabling FreeSync) in the same display.
Trust me, it's business as usual!
When I read the rumors that Nvidia might be supporting adaptive sync, I thought "well, at least for once they decided not to be reactionary jerks, maybe there's still hope for them." And here comes Shrout to kill that sliver of hope. Yes, Nvidia are still the same old reactionary price-gouging #$%&.
They are not going to say anything that will get in the way of you investing in G-Sync right now. Should adaptive sync be just as good or better, you bet your sweet thug ass they'll be all over it. It is potentially a much more cost effective solution.
This is the right answer. Nvidia has to say whatever they can to sell G-Sync. With adaptive sync coming out pretty soon, they can't say they will support it or else it would essentially stop sales of G-Sync in the market. If adaptive sync becomes popular enough, they will support it.
My guess is that it will become a niche feature of monitors and will essentially split gaming displays into two groups: AMD/Intel and Nvidia. Most monitors made and sold will not have support for either.
Adaptive sync soon? Well, since monitor scalers won't be out until the end of the year (aka December), you're looking at mid to late Q1 2015 before they are.
Came here to post this. They have to say they won't support it, and people who don't understand why are beyond hope.
From a business standpoint it makes sense; they want to push their G-Sync and make money off it. But from a consumer standpoint I think this sucks.
2 competing technologies are BAD for the consumer? I don’t think so, friend. We need MORE competition in this space, if anything.
While Nvidia and AMD are fighting it out, the monitor manufacturers could require that any GPU vendor specific video synchronization hardware be included on a plug-in module that the user could optionally purchase and install to give the monitor the vendor specific synchronization ability. This would allow the non-open-standard hardware to be optional, while any open industry standards could be built in, so that GPU-specific monitor hardware, which is unnecessary for open standards to function, does not unfairly burden the consumer. If Nvidia raises too much trouble with display manufacturers, the manufacturers could take Nvidia before the Antitrust Division of the Justice Department, and the appropriate courts.
This would ensure that only Nvidia's customers would have to shoulder the burden, and not force an Nvidia tax on all monitors just for the sake of compatibility with one vendor's proprietary hardware. The same would apply to any GPU manufacture's propriety hardware on third party manufactured monitors.
EDIT: GPU manufacture’s propriety
to: GPU manufacture’s proprietary
"FREE"sync is yet to be seen and tested.
G-SYNC is awesome, it works and it's available.
From all the FAQ releases from AMD, "Free"sync doesn't even work within the same range as G-Sync.
So, for all you entitled little kids living in your parents' basements:
Nothing in this world is free.
You're probably the same people who gripe about the sports car you dream of costing too much money too, right?
I'm all for competition, as long as it's fair competition. Any vendor specific acceleration/synchronization hardware on third party OEM built monitors that can be OPTIONALLY purchased and plug-and-play installed in a monitor's expansion slot is A-OK by me. We all know already that monitors are PCs unto themselves, so why not allow all GPU manufacturers the ability to fairly compete, with whatever gizmos and gadgets can be optionally plugged into the monitor's motherboard to improve the gaming experience. As long as it is at the user's option, go for it.
AMD and Nvidia could offer optional helper SOCs/GPUs/hardware integrated into a PCI based mini plug-and-play module for monitors, and offer all sorts of extra post rendering processing, or just plain extra GPU power, hosted on the monitor's motherboard, to go along with any form of adaptive synchronization algorithms.
AMD and Nvidia could, if they so desire, have manufactured, or make for themselves, branded monitors that include only GPU vendor specific hardware and no open standards, so long as on third party OEM manufactured monitors the rules of fair play require them to offer whatever GPU vendor specific hardware on optional PCI/other modules, for the user to optionally install, plug-and-play style.
It’s not about the money. It’s about open standards that don’t lock people into a single ecosystem. I know you have probably already decided to buy Nvidia cards no matter how good or bad they will be, but for those of us who get whatever is best at the time (and for the industry in general), it is important to avoid such crippling restrictions.
It’s funny that you call people “little kids” after making a post like this:
http://a.pomf.se/avxxun.png
So, you're trying to make kids that still live with their parents and can't afford the premium cost of a G-Sync monitor feel bad, is that it?
Btw, calling you stupid would be an insult to really stupid people.
If the tech produces equivalent results and FreeSync monitors can be manufactured and sold for less, maybe as much as $100 less, then people will buy FreeSync monitors and maybe AMD GPUs. In that case, NVIDIA will be inclined to support FreeSync monitors. I was going to get a new NVIDIA GPU, but I am starting to consider AMD, which is not something I would have done before Tom made their position clear.
Go buy AMD.
GM206 is coming and will be impressive.
Right, let's buy sh*t just because Tom said Nvidia won't support sh*t.
Get your point, no doubt. Just frustrated and wish there was something we could do about it. Push comes to shove, I will probably end up getting another Nvidia GPU, but I would like it to work with both techs so I don't have to pay up for a G-Sync monitor.
The situation we are in right now is ideal, really…
hardcore competition at work.
Just ignore the idiot kiddies in forums and see how in a year the end users will be the winners of all this.
The problem with FreeSync, and a lot of AMD claims over the last 1-2 years: AMD says it works the same as G-Sync, but there has been zero proof from independent reviews to back that up, only AMD's word, which they haven't been living up to for a while now. So as far as we know, yeah, G-Sync costs extra but we know it works; we don't know if FreeSync works as well as they say.
Nvidia doing what they do best: coming up with good ideas, and then completely killing their viability and adoption. Looks like I'll be sticking with my CRTs for a while longer.
“With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users with Radeon cards.”
I've expected this to be the case in the end. Even if Adaptive Sync really is easy to implement, that doesn't mean anything if Nvidia simply refuses to support it. Just look at OpenCL. It is not that Nvidia can't do it better than AMD; most likely they choose to push their CUDA more.
BTW, congrats to Ryan and PC Per for breaking this story and still being the only tech site out there with this news. Shame on the other sites for not picking up your story.
I find this whole situation somewhat ridiculous. Variable refresh is obviously a good idea and it is obvious to me that this should be part of the display standard. Instead of a good standard we get Nvidia’s hack on top of the old display standard and AMD’s hack using features from a slightly new version of the standard. Then a bunch of fan boys argue which hack is better. Both of them will hopefully be killed by a proper display standard update.
I can’t imagine Intel supporting gsync from Nvidia. Both Nvidia and Intel are trying to break into the mobile market so they are direct competitors in that space. Intel also has a strong “not-invented-here” mindset. If history is any guide, Intel will come out with their own version, and both Nvidia and AMD will have to support it. Intel will make a powerful gpu eventually, and if combined with stacked memory, this could reduce Nvidia and AMD to niche players rather quickly anyway. Intel really could use variable refresh rate given how slow their current gpus are. For people gaming at 1080p with a powerful graphics card, it isn’t really that relevant yet.