Introduction
That’s right, we have the alpha version of mobility G-Sync up and running! See how variable refresh works without a G-Sync module!
It has been an abnormal week for us here at PC Perspective. Our typical review schedule has pretty much flown out the window, and the past seven days have been filled with learning, researching, retesting, and publishing. That might sound like the norm, but in these cases the process was initiated by tips from our readers. Last Saturday (24 Jan), a few things were brewing:
- Ryan was informed by NVIDIA that the memory layout of the GTX 970 was different than expected.
- The huge (now 168 page) overclock.net forum thread about the Samsung 840 EVO slowdown was once again gaining traction.
- Someone got G-Sync working on a laptop's integrated display.
We had to do a bit of triage here of course, as we can only research and write so quickly. Ryan worked the GTX 970 piece as it was the hottest item. I began a few days of research and testing on the 840 EVO slowdown issue reappearing on some drives, and we kept tabs on that third thing, which at the time seemed really far-fetched. With those first two items taken care of, Ryan shifted his efforts to GTX 970 SLI testing while I shifted my focus to finding out if there was any credence to this G-Sync laptop thing.
A few weeks ago, an ASUS Nordic Support rep inadvertently leaked an interim build of the NVIDIA driver. This was a mobile driver build (version 346.87) aimed at their G751 line of laptops. One recipient of this driver link posted it to the ROG forum back on the 20th. A fellow by the name of Gamenab, who owns the same laptop cited in that thread, presumably stumbled across this driver, tried it out, and was more than likely greeted by this popup after the installation completed:
Now I know what you’re thinking, and it’s probably the same thing anyone would think. How on earth is this possible? To cut a long story short, while the link to the 346.87 driver was removed shortly after being posted to that forum, we managed to get our hands on a copy of it, installed it on the ASUS G751 that we had in for review, and wouldn’t you know it, we were greeted by the same popup!
OK, so it’s a popup – could it be a bug? We checked the NVIDIA Control Panel and the options were consistent with those of a G-Sync connected system. We fired up the pendulum demo and watched the screen carefully, passing the machine around the office to be inspected by all. We then fired up some graphics benchmarks that were well suited to showing off the technology (Unigine Heaven, Metro: Last Light, etc.), and everything looked great – smooth, steady pans with no juddering or tearing to be seen. Ken Addison, our Video Editor and jack of all trades, researched the panel type and found that it was likely capable of a 100 Hz refresh. We quickly created a custom profile, hit apply, and our 75 Hz G-Sync laptop was instantly transformed into a 100 Hz G-Sync laptop!
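For the curious, the arithmetic behind such a custom profile is simple enough to sketch. Below is a minimal example assuming illustrative CVT-RB-style blanking totals for a 1920x1080 panel – the real timing numbers come from the panel's EDID, so treat these constants as placeholders:

```python
# Pixel clock needed for a custom refresh profile:
# total pixels per frame (active + blanking) times frames per second.
H_TOTAL = 2080   # 1920 active + assumed CVT-RB-style horizontal blanking
V_TOTAL = 1111   # 1080 active + assumed vertical blanking

def pixel_clock_mhz(refresh_hz: float) -> float:
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (60, 75, 100):
    print(f"{hz:>3} Hz -> {pixel_clock_mhz(hz):6.1f} MHz")
# 100 Hz lands around 231 MHz with these assumed totals, comfortably
# within eDP link bandwidth - which is why a 100 Hz profile on this
# panel was plausible at all.
```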
Ryan's Note: I think it is important here to point out that we didn't just look at demos and benchmarks for this evaluation but actually looked at real-world gameplay situations. Playing through Metro: Last Light showed very smooth pans and rotation, Assassin's Creed played smoothly as well, and flying through Unigine Heaven manually was a great experience – the same went for Crysis 3, Battlefield 4, and others. This was NOT just a couple of demos that we ran through – the variable refresh portion of this mobile G-Sync enabled panel was working and working very well.
At this point in our tinkering, we had no idea how or why this was working, but there was no doubt that we were getting a similar experience as we have seen with G-Sync panels. As I digested what was going on, I thought surely this can’t be as good as it seems to be… Let’s find out, shall we?




I was a skeptic but I guess NVidia let the cat out of the bag “accidentally”. Great investigative work as always guys.
All the real investigation was done by people who didn’t call Gamenab a troll, while PC Perspective wrote him off as a liar.
A couple of people with the ASUS G751 and the MSI GT72 (LG IPS) got G-Sync working while others called them liars.
Some of us had to post proof on YouTube multiple times.
We had no issue with Gamenab, and even named him as the first to discover this. What I *do* take issue with, however, is that he tried to pass it off as his invention, making all sorts of claims that the module uses crypto, etc. It was too easy to see straight through his act.
Seriously, that guy needs to buy a lottery ticket if he was that lucky.
Yeah but:
a) those arguments were presented in a way that anyone reading them would think there wasn’t a shred of truth in what he said about the possibility of G-Sync without a module;
b) since when can we trust Nvidia’s unofficial information? Especially after the GTX 970’s “official” information? Why not AGAIN leave a chance that everything here goes somewhat deeper?
I don’t judge radically unless there is hard evidence. Sometimes it’s better to silently analyze things to the ground.
My personal opinion: if not for Gamenab, we would never have gotten a promise from NV about module-less G-Sync notebooks.
Agreed Allyn!
“His first video attempt showed a game running *windowed. Anyone with a shred of adaptive sync experience knows that it can only properly work full screen.
*Seriously, that guy needs to buy a lottery ticket if he was that lucky.”
So… after the newest NVIDIA driver… looks like Gamenab either was correct, or he can buy two tickets in a row and still win two jackpots.
So taking apart a laptop to its bare components and putting it back together, contacting NVIDIA, and testing for hours on end to figure out what is actually happening is not investigative work?
The Gamenab dude could have come out and said it was just this driver package, instead of trying to shove it in our faces as if it were his own creation.
You don’t know the deep story and TIMING, do you?
You’re right, it’s another conspiracy.
Look at this funny stuff 😀
http://youtu.be/spZJrsssPA0
based PC Perspective
This could have some really interesting implications for G-Sync. The fact that G-Sync is largely working without the module is curious indeed. Am I paying a $100+ premium for a module that really just guarantees that the monitor works for G-Sync – and makes it proprietary to Nvidia so it’s not FreeSync compatible?
Sounds like the $100 premium is to guarantee that the display doesn’t shut off/fade to black when the frame rate gets too low or is interrupted. Sounds worth it to me.
Such is the penalty of early adoption… ultimately not that bad.
If a frame rate stall causes the screen to display a black/empty frame, then my first question is why can’t the GPU driver be tweaked to simply repeat the last image from the frame buffer?
Supposedly, the display and system perform a sort of handshake to exchange info on the minimum/maximum supported frame rates during initialization. The GPU therefore knows the rate at which it must spit out frames, and it can either repeat the last frame (which would cause stutter/judder) or try to do some form of predictive/interpolated guesswork to inject an approximated frame (artificial motion enhancement like 120 Hz TVs do, which is not perfect either, a.k.a. the “soap opera” effect).
There is still some potential for variable latency and race conditions if the GPU is actively updating a frame buffer at the same time a frame is needed. Of course, double and triple buffering are well known concepts and should still be just as useful in this scenario.
I believe the SOLUTION to this is part of the reason for the G-Sync module in desktop monitors, which includes fast memory. This memory holds the last frame and can REPEAT it if needed.
Without the G-Sync module we’re talking about a larger DELAY, which of course brings its own issues, especially at lower frame rates.
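To make the repeat-the-last-frame idea these two comments describe concrete, here is a minimal, hypothetical sketch – emphatically not NVIDIA's actual driver logic – assuming a panel that reports a 30 Hz minimum refresh at initialization; `render_frame` and `scan_out` are made-up stand-ins for the real driver internals:

```python
# Hypothetical sketch of driver-side low-frame-rate handling: if the game
# misses the panel's minimum refresh rate, re-send the buffered frame
# instead of letting the panel time out (or fade to black).
PANEL_MIN_HZ = 30.0              # assumed minimum reported by the panel
MAX_HOLD = 1.0 / PANEL_MIN_HZ    # longest one frame may stay on screen

def present_loop(render_frame, scan_out):
    last = None
    while True:
        # render_frame returns None if no new frame arrived in time.
        frame = render_frame(timeout=MAX_HOLD)
        if frame is None and last is not None:
            scan_out(last)       # repeat: the panel never sees a stalled link
        elif frame is not None:
            scan_out(frame)      # normal variable-refresh present
            last = frame
```

The desktop G-Sync module can perform this repeat locally from its onboard memory; without the module, the GPU has to keep the frame around and spend link bandwidth re-sending it, which is where the extra delay mentioned above would come from.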
Wow! Interesting indeed! Methinks the dedicated G-Sync module is not long for this world.
Would a G-Sync module consume that much more power than a typical scaler?
Can’t say that I’m surprised.
meanwhile, I’ll leave you with this gem I found on techreport
https://www.youtube.com/watch?v=spZJrsssPA0
Oh that is absolutely a gem of a video.
i will just leave this here
https://www.youtube.com/watch?v=tNGi06cq_pQ
OMG that’s incredible, I love it.
I still like my 970, but I might try to return it if the 380X comes out soon…
Tough week for Nvidia. I would love to listen in on these phone calls.
I’m curious what kind of protocol they are using to talk to the monitor. Perhaps they discovered it the same way AMD did: many notebooks already have variable-refresh panels to save energy. Looks like Nvidia has found a way to leverage that.
I never thought for a moment that Nvidia wasn’t going to come up with their own way of doing VRR without a proprietary module, it was just a matter of when.
Hell yes! Now this just needs to make its way into VR panels and we’ll be ready to rock!
Nothing new
Didn’t AMD demo the first FreeSync on a Toshiba laptop over a year ago?
AMD Demonstrates “FreeSync”, Free G-Sync Alternative, at CES 2014
http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014
this might be a clue
“Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.”
Adaptive-Sync has been a part of embedded DisplayPort for a while now, so it’s no wonder
so it’s safe to bet that nvidia has done some driver mojo to enable *FreeSync*-like things in their graphics cards
i wonder how old the GPU in this laptop is? if it’s a new core then it’s probably been updated to support it
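On that note, a driver could plausibly discover which panels qualify by reading the refresh range a panel advertises in its EDID. A hedged sketch, using the standard EDID 1.4 base-block layout (a real driver would do considerably more validation than this):

```python
def refresh_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) from an EDID Display Range Limits
    descriptor, or None if the panel does not advertise one."""
    # The EDID base block holds four 18-byte descriptors at these offsets.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display descriptors begin 00 00 00; tag 0xFD = range limits.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]    # min/max vertical refresh rate in Hz
    return None
```

A panel reporting, say, a 40-75 Hz range would be a variable-refresh candidate, while one without a range-limits descriptor would not.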
How much was Nvidia expecting to charge for this (free of hardware costs for Nvidia) laptop G-Sync? Maybe just the cost of printing the sticker that says G-Sync enabled, passed on to the laptops’ buyers at a 9999.9999% markup. First it’s memory arithmetic errors where 3.5 = 4, and now FreeSync becomes G-free-sync (free as in free money for Nvidia): add an extra $150 to the BOM and let the free cash flow. The green team is more about the green these days than ever before.
NGREEDIA LIES AGAIN TO THEIR CUSTOMERS.
Thanks to Freesync and the price difference that will make Gsync monitors NOT competitive with the Freesync ones, Nvidia is getting ready to promote Adaptive Sync under the G-Sync brand.
If you bought a 970 with wrong specs, unfortunately for you the spec that says DP 1.2 and not DP 1.2a is correct. If you feel something penetrating you AGAIN from behind, don’t worry, it’s only NGREEDIA.
Are you guys talking about this
Truth about the G-sync Marketing Module (NVIDIA using VESA Adaptive Sync Technology – Freesync)
http://gamenab.net/2015/01/26/truth-about-the-g-sync-marketing-module-nvidia-using-vesa-adaptive-sync-technology-freesync/
Using a modded driver, this guy got G-Sync to work on selected monitors that don’t have a G-Sync module.
Yes, that is what we are talking about, but that guy just installed a leaked driver on his laptop. That's all he needed to do to 'get it working'.
G-Sync module out the window! (read: this is SPARTA!!!!)
Finally Nvidia found the light and decided to use FreeSync.
But it’s a shame for the Nvidia fanboys who like to be robbed by Nvidia 😛
After reading the article:
“mobile G-Sync” = VESA Adaptive Sync
“Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.”
Source: VESA
Yeah, that was my takeaway as well.
TL;DR: G-Sync is just old eDP tech with some fancy drivers, for which nvidia charges a $100 to $200 premium.
LOL!!!
This is coming full circle.
[PCPerspective] NVIDIA’s take on AMD’s under documented free sync
https://pcper.com/news/General-Tech/NVIDIAs-take-AMDs-under-documented-free-sync
Which sources:
[TheTechReport] Nvidia responds to AMD’s ”free sync” demo
http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo
Which quotes Tom Petersen
Which again goes back to what G-Sync is doing.
[AnandTech] NVIDIA G-Sync Review
http://www.anandtech.com/show/7582/nvidia-gsync-review
Probably another miscommunication between engineering and marketing for sure.
I was about to dig up that TechReport article myself.
If they have implemented this on laptops in alpha drivers that means VESA Adaptive-Sync support is in development. They probably won’t enable it on their GPUs until they sell all of their G-Sync monitor stock but NVidia is clearly developing support for Adaptive-Sync right now.
Tom Petersen suggested otherwise a week ago on the 960 stream.
https://www.youtube.com/watch?v=HHrSe0DtXHk#t=3247
But he was more open to it than NVidia were back in September when they said they had “no plans” to support Adaptive-Sync.
https://pcper.com/news/Graphics-Cards/NVIDIA-Confirms-It-Has-No-Plans-Support-Adaptive-Sync
I don’t get all the hate for this company all of a sudden. Yeah, they screwed up with the 970; yeah, they are profit oriented. So are all companies. I think it’s proven that the module HAS a function, even if the core functionality is possible without it. Can’t we all be happy about the possibility of reaching a standard soon, one which won’t bind our monitor choice to the brand of our graphics chip? Can’t we acknowledge the fact that Nvidia brought VRR to market one year before AMD, and that technology has moved on and will continue to move in a direction that makes things possible which weren’t one year ago, i.e. VRR without a module?
Every day I get sadder about all this hate in the interwebs. 🙁
A big part of it is that they sell folks expensive products as the greatest technology ever, and then a month or so later, something much better comes out and people feel left behind and at a high cost. That’s a big part of it in my opinion. But yes, if you understand technology and technology companies, this is how it is and this effect will continue to snowball (the better tech we have, the faster they can create better tech).
Well, module-free G-Sync is not new OR better, just a new form of implementation – which, BTW, does not have EVERY feature, or even as good a UX, as a G-Sync module enabled setup does.
This is just a natural progression, as DP 1.2a allows for some of the G-Sync features to be enabled, but not ALL! I know I sure as HELL am glad my monitor doesn’t go black from an FPS dip.
And the simple truth remains: with G-Sync, Nvidia brought adaptive-sync capabilities to all of us well over a year before the FIRST “free sync” parts become available – as they still are completely unavailable, IIRC!
I’m starting to hate nvidia too. They make great cards, but damn, it looks like they care more about the bottom line than the gamer community. AMD spent more time finding a way to make VRR free so anyone can use it, while nvidia charged us a couple hundred dollars. I’m pretty sure Nvidia will still not take advantage of FreeSync even though they can, just to keep charging us money. I’m switching to the red team next gen just because I see AMD as a company that cares about gamers, spending time and money on FreeSync and Mantle so we can play better without paying a huge sum of money.
AMD didn’t spend “time” doing JACK SH!T! They sat around flapping til DP 1.2a became a STANDARD (as in most monitors and video cards now use it) – and AMD did NOTHING other than re-brand adaptive sync!
Nvidia at least had the ingenuity to go their OWN way and create adaptive-sync parts for consumers over a YEAR before AMD simply made use of AN EXISTING STANDARD!
And now it seems Nvidia is going to make PART of the G-Sync experience free for DP 1.2a (and even potentially HDMI and DVI eventually) – I see NOTHING but crap from AMD, yet the fanboys STILL find ways to justify the tons of carbon created by their inefficient, power-hungry PC parts (ALL of them!) – while STILL lagging behind a chip Intel made THREE YEARS AGO!
“The DisplayPort™ Adaptive-Sync specification was ported from the Embedded DisplayPort™ specification through a proposal to the VESA group by AMD.”
http://www.guru3d.com/news-story/vesa-adds-adaptive-sync-to-displayport-video-standard.html
They put together a proposal (porting the eDP feature to DP) and submitted it to VESA. They did their research part. They chose to move the industry forward with it.
GG monitor refresh rates. Stutter, tearing…gone are the days you had to pick your poison. Now you really only need to pick red or green. And a new monitor that suits.
oh there’s hate alright, for YEARS
the way it’s meant to be played came when? during the geforce FX series failure, they put a spin on crap
proprietary physx? cuda? gameworks originally without source code so devs aren’t allowed to optimize themselves?
blocking the AA setting in batman if you don’t use nvidia, even though it works perfectly fine when you spoof your other brand as nvidia
blocking gpu physx if your primary card is a different brand, even though there was a period where it was allowed, & then there was a period of people modding the drivers to re-enable gpu physx for your secondary nvidia card
the class action lawsuit about the failing/desoldering mobile gpus
this isn’t about the traditional ‘make a good product, price it as premium, profit’; nvidia consistently tries to sabotage, spin, or hide things (which doesn’t make sense, their hardware & drivers are competitive enough, there’s no need to be a douche)
& after all that… just get what you can use. i just bought a 660 because i need a driver for vista & amd stopped releasing them over a year ago
my laptop is a 570m because don’t touch amd-based laptops
gaming is a luxury anyway… as for the internet (i do not recommend you use the word ‘interwebs’), of course it’s going to be amplified hysteria, so try not to read too much of it, just stick to actual tests, make decisions for yourself, etc
LMFAO! Look at a PhysX enabled game, then compare the SAME game with PhysX off! HUGE difference! AMD fanboy jackoffs love to TRY to bash Intel and Nvidia separately – but they are BOTH smart enough to know their own places: Intel makes CPUs, Nvidia makes GPUs.
The LAST time a Radeon card had an ACTUAL edge over Nvidia was back when they were still ATI! Because ATI were their own company they could afford to make a brand new architecture in the 9700 – and yes, that card and many subsequent variations thereof had Nvidia scrambling in the FX series to try to regain headroom, which they eventually did.
You say PhysX is stupid? Well, where the fuck is Mantle? Oh, 3 games? It’s been a banner year for AMD adoption!
And PhysX + DX12 will have Mantle DOA! AMD is like a drowning puppy, sad as hell to watch, waves battering them as with each paddle towards shore they release another POINTLESS SKU or space-heater GPU (which has similar performance to an equally priced Nvidia card, just takes twice or more the wattage to power!). I used to love AMD, but ever since they swallowed ATI, they have been a sinking ship!
Over three years since Sandy Bridge was released and AMD still has NO consumer CPU that (for all intents and purposes, apart from EXTREMELY multi-threaded workloads – which is more of an admirable trait in a workstation or server CPU) can even get CLOSE to the performance of an i7-2600K. Even at stock speeds it has better per-core speed than the stock 4.7 GHz clocked 9590, and that is without counting that ANY i7-2600K can EASILY hit at least 4.3 GHz with a dinky stock cooler (where the 9590 comes with a fucking water block STOCK, and still cannot overclock!) – and can EASILY hit 5 GHz with just a Cooler Master Hyper 212 EVO!
So go and TRY to bash Nvidia (or Intel); truth is, they are BOTH doing FAR better than AMD for good reason! Ooooo boy, Nvidia got their memory specs wrong for the 970 – that means AMD is DEFINITELY the better choice! You flapping AMD fanboys are HILARIOUS!
dumbass, i bought nvidia, apparently you can’t see that in your blind rage
i said nvidia has good products, yet why do they keep doing anti-competitive tactics when the products are very competitive alone?
wtf do cpus have to do with anything? why don’t you just continue jerking off in your corner
I have to agree with kn00tcn. I’ve always preferred Nvidia because my first laptop had their GPU and I was able to play Far Cry 1 despite being below the minimum specs. Good first experience.
But I take a big issue with their anti-competitive decisions. Currently using a 750 Ti and it is great! I am not sure my next GPU will be from Nvidia though. Depends on how they act. I want to support competition, not a company who lies and bullies.
So in the near term, they say they don’t know (module, no module, different module); I’ll give them that. But I believe it is fair to say that they know (especially with this Adaptive-Sync spec coming) that further out the answer is that NO SPECIAL NVIDIA MODULE will be required to enjoy this variable refresh technology. Unfortunately, I feel they would never admit to knowing that, because there is still money to be made from the “exclusivity of Nvidia modules” in whatever form they come in next generation, if that makes any sense. I just wish companies were honest, that’s all. That’s why we need folks like you, PCPer, to keep them as honest as possible and give us the info we need.
G-Sync is working on the MSI GT72 laptop too, with an LG IPS screen and a 9xxM GPU.
First confirmed two days ago, with many more confirmations after that.
You need to use the nvlddmkm.sys file from the 346.87 driver, or just use Gamenab’s.
Gamenab has said since the very beginning that it’s his algorithm in Nvidia’s driver. Whether to believe him or not is everybody’s choice.
I don’t believe him. I find it hard to believe some random Joe was able to write an algorithm that is not only compatible with the screen in that laptop, but also able to communicate with Nvidia’s drivers without issue. Nvidia makes awesome stuff, but everything is proprietary.