Introduction
That’s right, we have the alpha version of mobility G-Sync up and running! See how variable refresh works without a G-Sync module!
It has been an abnormal week for us here at PC Perspective. Our typical review schedule has pretty much flown out the window, and the past seven days have been filled with learning, researching, retesting, and publishing. That might sound like the norm, but in these cases the process was initiated by tips from our readers. Last Saturday (24 Jan), a few things were brewing:
- Ryan was informed by NVIDIA that the memory layout of the GTX 970 was different than expected.
- The huge (now 168 page) overclock.net forum thread about the Samsung 840 EVO slowdown was once again gaining traction.
- Someone got G-Sync working on a laptop integrated display.
We had to do a bit of triage here of course, as we can only research and write so quickly. Ryan worked the GTX 970 piece as it was the hottest item. I began a few days of research and testing on the 840 EVO slowdown issue reappearing on some drives, and we kept tabs on that third thing, which at the time seemed really farfetched. With those first two items taken care of, Ryan shifted his efforts to GTX 970 SLI testing while I shifted my focus to finding out if there was any credence to this G-Sync laptop thing.
A few weeks ago, an ASUS Nordic Support rep inadvertently leaked an interim build of the NVIDIA driver. This was a mobile driver build (version 346.87) focused on their G751 line of laptops. One recipient of this driver link posted it to the ROG forum back on the 20th. A fellow by the name of Gamenab, owning the same laptop cited in that thread, presumably stumbled across this driver, tried it out, and was more than likely greeted by this popup after the installation completed:
Now I know what you’re thinking, and it’s probably the same thing anyone would think. How on earth is this possible? To cut a long story short, while the link to the 346.87 driver was removed shortly after being posted to that forum, we managed to get our hands on a copy of it, installed it on the ASUS G751 that we had in for review, and wouldn’t you know it, we were greeted by the same popup!
OK, so it’s a popup; could it be a bug? We checked the NVIDIA Control Panel and the options were consistent with those of a G-Sync connected system. We fired up the pendulum demo and watched the screen carefully, passing the machine around the office to be inspected by all. We then fired up some graphics benchmarks that were well suited to show off the technology (Unigine Heaven, Metro: Last Light, etc.), and everything looked great – smooth, steady pans with no juddering or tearing to be seen. Ken Addison, our Video Editor and jack of all trades, researched the panel type and found that it was likely capable of a 100 Hz refresh. We quickly created a custom profile, hit apply, and our 75 Hz G-Sync laptop was instantly transformed into a 100 Hz G-Sync laptop!
Ryan's Note: I think it is important here to point out that we didn't just look at demos and benchmarks for this evaluation, but at real-world gameplay situations. Playing through Metro: Last Light showed very smooth pans and rotation, Assassin's Creed played smoothly as well, and flying through Unigine Heaven manually was a great experience; Crysis 3, Battlefield 4, and others behaved the same. This was NOT just a couple of demos that we ran through – the variable refresh portion of this mobile G-Sync enabled panel was working, and working very well.
At this point in our tinkering, we had no idea how or why this was working, but there was no doubt that we were getting a similar experience as we have seen with G-Sync panels. As I digested what was going on, I thought surely this can’t be as good as it seems to be… Let’s find out, shall we?
Just tested the driver on my G751JY. Yup, G-Sync popped up. The panel itself can be overclocked up to a 100 Hz refresh rate, which is rather exceptional for an IPS panel. I first heard it could do 100 Hz at the ROG forums, and later on, with newer Nvidia drivers, it actually does default to 75 Hz. I wonder if they were actually trying to get it to 120 Hz and it didn’t pan out…
shocking, “journalists” shilling for Nvidia again while a random curious user has to find out that Nvidia is lying to customers again about G-Sync only being available with some overpriced proprietary module, exactly as a random user had to find out that his 970 doesn’t have the advertised VRAM
meanwhile, exactly like it was with the GTX 970’s 3.5 GB of VRAM, said user is being harassed, belittled, insulted and called names by Nvidia astroturfers, fanboys and even “journalists”, while doing unbiased and independent research that will provide benefits to customers
Don’t forget the good old 257 Nvidia driver that was leaked without the PhysX/CUDA lock and was working beautifully with an AMD primary and Nvidia secondary card, without the need for a patch.
This company really needs a boycott to come to its senses. Even Intel, which is almost a monopoly, treats its customers with more respect. Probably even Apple.
Nvidia’s fanboys are having nightmares this week. HAHAHA
Geezus…
Some of the commenters here can’t have read more than a few words of the article…
Protip… Read the whole thing… Understand the whole thing… Then comment…
Some serious stupidity and bad intentions are circulating around here.
For eDP-equipped laptops, a G-Sync module is not needed.
Gamenab/whoever just discovered hot water.
A year ago there wasn’t such tech for DESKTOPS.
The module was made just to make it possible on DESKTOPS at a moment when scalers didn’t support VESA eDP.
The truth is that you all should thank NVIDIA for finally showing this very important technology and making VESA and the monitor scaler chip makers DO SOMETHING.
FreeSync is made of air, with zero work from AMD’s side.
It’s just a few-year-old VESA standard that no one actually uses on PCs.
Thank you for the first line in your post. It summarizes perfectly the rest of it.
…and yet everything he said appears completely accurate from simple observations.
Of course. There’s a lot of very clear bias going on with most folks calling foul on all this. The Tech Report and AnandTech quotes (as well as those from PCPer) from the Anon commenter above make it all the more obvious that this was something they were gonna do anyway, likely sans module.
So the people who comment at PC Perspective are stupid with bad intentions. I think this comment should go directly on the first page and stay there for everyone to see it.
Also, don’t forget to send a letter to AMD telling them that their FreeSync is “made of air and with zero work from their side”. Because you agree with EVERYTHING in his post.
I'm sorry to say this, but without a shipping product, FreeSync is still technically vaporware (made of air).
He didn't mean that, but anyway. Maybe you should also send an email to Samsung and ask them about their vaporware 4Ks that they announced for March.
Don’t let the Nvidia fan in you beat the journalist in you.
That's not a variable refresh panel. Please stop grasping at straws. You're coming across as desperate.
I am the desperate one. OK, if you say so.
you don't get it…
Nvidia said that they will not use, in any form, anything from Adaptive-Sync.
This is why they have the G-Sync module, a premium module designed to steal more money from fanboys.
But that driver reveals something else…
If the monitor is an Adaptive-Sync-ready one and Nvidia is using it under the name of G-Sync, this could lead to some serious Thor's hammer coming down on Nvidia's head from VESA.
And lol @ Nvidia helping VESA while bashing them in the process.
DP 1.2a, whether it is a laptop or not, doesn't change the bare EMC of them…
I'm sure those new DP 1.2a desktop monitors will work just fine when someone ports that driver to them, rendering that overly priced piece of nothing G-Sync obsolete (well, it has been since VESA implemented Adaptive-Sync as a standard, but anyway).
The G-Sync module provides hardware triple buffering (this is the reason for the good amount of RAM on the module), which reduces latency in the panel and improves frame times in games without support for triple buffering (99%?).
The laptop version of G-Sync is equal to FreeSync; the desktop version with the G-Sync module is slightly better than both.
“triple buffering … that reduces the latency”
You couldn’t be … more wrong on that one.
Buffering causes delay. Triple buffering = 3 frames of extra delay (50 ms @ 60 fps).
There is a reason why Oculus VR and competitive gamers use as little buffering as possible.
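For reference, the arithmetic behind that 50 ms figure is straightforward. A minimal sketch follows; the three-frames-of-delay count is the commenter's premise, not a measured value:

```python
# Worst-case added display latency from frames sitting in a buffer queue.
def buffer_latency_ms(fps: float, buffered_frames: int) -> float:
    frame_time_ms = 1000.0 / fps  # duration of one frame at this rate
    return frame_time_ms * buffered_frames

# At 60 fps a frame lasts ~16.7 ms, so three queued frames add ~50 ms.
print(buffer_latency_ms(60, 3))  # 50.0
```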
Many people do not have a proper understanding of triple buffering. It is just not the same thing as 'buffering three frames ahead', which many falsely believe. It can be done without added latency. As a matter of fact, triple buffering results in *reduced* latency and higher FPS when operating with VSYNC – having only two frame buffers forces the GPU to wait for a draw to complete before it can start working on the next frame (because it is already one frame ahead).
You appear to confuse “frame rate” and “latency”.
Frame rate is the number of frames the GPU can render in a given time period, typically one second.
Latency is the time differential between a frame being drawn and that frame being displayed.
Triple buffering increases latency over double buffering or single buffering, but may increase frame rate stability. Its main goal is to improve frame rate (as you state) by “stabilizing” frame rate in scenes where rendering cost varies heavily between frames, i.e. a quickly drawn frame followed by an extremely particle-heavy slow frame. In the case of double buffering, the slow frame may still be rendering by the time the frame buffer is exhausted, and you get a massive momentary reduction in frame rate while the display buffer waits for the slow frame to be drawn before it can be pushed to the display. With triple buffering there is one more buffered frame to push to the display before it has to wait.
So yes, triple buffering can increase FPS. However, it does this at the cost of also increasing latency. That is how triple buffering inherently works in DirectX.
The other implementation, specifically page flipping, which lets the GPU pick which frame to display and does indeed in some rare cases reduce latency, is not available in DirectX and is therefore largely irrelevant for Windows gaming, which is overwhelmingly DirectX based.
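To make the two behaviors being argued over here concrete, a toy model may help. It is purely illustrative, not how any real driver schedules frames: a FIFO render-ahead queue versus page flipping that always shows the newest completed frame.

```python
# Toy swap-chain model. "fifo" shows finished frames oldest-first (the
# DirectX-style render-ahead queue described above); "flip" shows only
# the newest finished frame, discarding stale ones (classic page-flip
# triple buffering). Latency = gap between a frame finishing on the GPU
# and appearing at a vsync.
VSYNC_MS = 1000.0 / 60.0  # 60 Hz display

def display_latencies(completions, mode):
    latencies, last_shown = [], -1
    vsync = VSYNC_MS
    while vsync <= completions[-1] + VSYNC_MS:
        ready = [i for i, c in enumerate(completions)
                 if c <= vsync and i > last_shown]
        if ready:
            # FIFO shows the oldest queued frame; page flipping jumps
            # straight to the newest one and drops the stale frames.
            i = ready[0] if mode == "fifo" else ready[-1]
            latencies.append(vsync - completions[i])
            last_shown = i
        vsync += VSYNC_MS
    return latencies

# GPU rendering faster than the refresh rate (~8 ms per frame): frames
# pile up behind the FIFO queue, while page flipping stays under one
# refresh of latency. (A real render-ahead queue is capped at two or
# three frames, which bounds the FIFO latency rather than growing it.)
completions = [8.0 * (n + 1) for n in range(30)]
for mode in ("fifo", "flip"):
    lat = display_latencies(completions, mode)
    print(f"{mode}: average latency {sum(lat) / len(lat):.1f} ms")
```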
What you are referring to is not 'real' triple buffering. Some games do not implement this correctly, resulting in increased latency (usually to a very noticeable degree). That is not what I was talking about, and those games that do it wrong are the exact reason for the misnomer.
You are arguing against the implementation in DirectX itself, which is the overwhelming industry leader right now. It’s done like this because it reduces the frame rate instability and microstutter that page flipping with triple buffering can cause in scenarios where frames have significantly different rendering costs on the GPU.
Like Don Quixote fighting windmills.
Conclusion:
PhySick : DRM
G-Spot : DRM
FUXR : DRM
TeXAA : DRM
0.5G : DRM
Lol…
“Obviously we went straight to NVIDIA…”
Obviously you go straight to nvidia before you write anything…
Guys, remember all the shills and fanboys stating that Nvidia created a new feature and deserves all the millions in royalties and proprietary hardware? Well, looks like that “new feature” is a 4-year-old VESA Embedded DisplayPort spec after all. Of course, with a cooler-sounding name that surely took a lot of time and money to think up, and a customer-robbing proprietary scaler. 🙂
If you want to find out and confirm what you think you’re seeing, then yes, you do go straight to Nvidia. That doesn’t make them shills, it makes them thorough.
Allyn, Ryan & Co., congrats guys, nice feature.
Also, how dare a company try to make money on a new feature. The “Freesync” reveal was such a blatant “us too! us too!” maneuver that AMD forgot, as is their wont, that they need to make money to survive as a company. They shifted the profits of the adaptive sync technology over to the monitor manufacturers. I don’t blame Nvidia for not wanting to get involved in AMD’s race to the bottom.
Otherwise, I agree that this tech should have been out years ago.
I had my portion of the article (the first two pages and most of page three) written *before* we had our call with Nvidia. That call was the last thing that happened before we published. We needed their statements for the article. It's called journalistic integrity.
I think you are mistaken about the definition of journalistic integrity.
1) Truthfulness – limited to the narrow scope of the article being written
2) Accuracy – again, narrowed to fit the article being written
3) Objectivity – limited to Nvidia sources
4) Impartiality – none; favored Nvidia sources
5) Fairness – none; favored Nvidia sources
6) Public Accountability – none, even when the articles you were shown directly countered the original article.
I suggest you read these.
Journalism ethics and standards
http://en.wikipedia.org/wiki/Journalism_ethics_and_standards
I hope this is a troll. If not, you’re mistaken.
Their “source” is not Nvidia, but their own and others’ work in finding out how this laptop came to be able to run G-Sync. Furthermore, who else are they going to contact?
They were running the story whether or not Nvidia commented on it. It’s a journalistic practice to give someone the opportunity to comment before publishing something, especially if it could be damaging to them. That’s how you get all sides of a story to present to the reader, so they can make their own determination. Nvidia responded with additional information because they were forced to do so or look incredibly foolish.
It appears to me that Nvidia ‘jumped the gun’ with G-Sync, knowing the new VESA standard was in the works, to establish that G-Sync monitors were worth a premium and don’t work with AMD cards. Then, in the future, they can implement their version of free sync on 1.2a-compliant ‘G-Sync’-branded monitors that include a cheap chip blocking AMD cards from accessing the adaptive sync feature, in an attempt to lock those monitor buyers into Nvidia’s GPUs.
Would be a classic JHH~Nvidia dick move – if they think they can get away with it.
so it’s basically going to be Adaptive-Sync, branded as FreeSync by AMD and G-Sync by Nvidia… hopefully. Having both vendors support adaptive refresh rates on the same monitor is the best-case scenario.
Perhaps it is time to get Tom Petersen back for a Q&A about the 970 memory issues and now G-Sync functionality without a G-Sync module.
Let’s see if they want some screen time when there isn’t a product launch. I don’t want to be cynical, but I doubt they’ll oblige you. Though I would have a lot of respect for them if they stepped up in difficult times.
I think some serious questions and answers need to come, and they need to be more upfront about what the G-Sync module’s purpose is over and above adaptive vsync. They need to stop avoiding things and offer a bit more disclosure. They’ve patented the module, I’m sure, so why hide its full functionality, and why not answer outright whether it is required?
JHH to Nvidia’s customers: ┌∩┐(◣_◢)┌∩┐ you, I Have All your cash, so You can’t have ┌∩┐(◣_◢)┌∩┐ from Nvidia, without paying a whole ┌∩┐(◣_◢)┌∩┐ load of cash, and enjoy your 3.5 GB of ┌∩┐(◣_◢)┌∩┐ up memory, and G-Stink!
Dat Gsync module scam ! 😀
This makes it hard to believe the 970 story of accidental miscommunication between marketing and engineering. Maybe Nvidia just likes to talk out of the side of their mouth.
all I see is a lot of hitching in the video
That is because you are watching a 30 FPS video of a display refreshing at 50 FPS. Smooth playback is not possible in the recorded video. Even if we posted this at 60 FPS, there would still be visible judder due to the rates being out of sync.
Could you have played it back at 30fps (in slow motion, of course)?
It would have still juddered. The capture itself has to be in sync with the display, or at an even multiple above it (even multiples matter less once you are several times higher). Also, that high-FPS capture would need to be played back on a high-FPS display (or slowed down on playback). This is why it is so hard to explain the effect of G-Sync / adaptive sync – it's just something you need to see in person.
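To put numbers on that mismatch, here is a quick sketch using the 50 Hz panel and 30 FPS capture figures from this thread:

```python
# A 30 FPS capture samples a 50 Hz display at points that drift through
# its refresh cycle, so some refreshes appear twice and others never do.
display_hz, capture_fps = 50.0, 30.0

for n in range(6):
    t = n / capture_fps          # time of this capture frame (seconds)
    shown = int(t * display_hz)  # which display refresh is on screen
    print(f"capture at {t * 1000:6.1f} ms shows refresh #{shown}")
# Output steps through refreshes 0, 1, 3, 5, 6, 8: uneven advances,
# i.e. duplicated and skipped refreshes, which reads as judder even
# though the panel itself updated smoothly.
```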
All I want is a decent monitor with variable refresh rate tech that’ll work with AMD and Nvidia GPUs and won’t break the bank.
I believe…
Wow, a lot of you people have some serious anger issues. I for one am stoked on mobile G-Sync, especially if it can be added with software only!
A question, and I think it’s rather important, at least to me, since I am in the market for a gaming laptop: are there certain laptop displays I could look out for that are more likely to work with mobile G-Sync?
The laptops this partial solution works on may get just that – partial support. I believe newer TCON revisions may be required for functional (playable) Mobile G-Sync, which means it would be the next round of laptops and not necessarily the current ones.
If laptop makers actually support panel self refresh (eDP 1.3), then it seems the need for a G-Sync module goes away completely, since it performs a very similar function to what a G-Sync module does. This development driver seems to use adaptive sync, but it may expect there to be a G-Sync buffer present. This could be the source of the blanking: when the frame time gets too high, a G-Sync display will refresh from the local buffer, but since there isn't one here, it just blanks the screen. It has essentially lost the video signal.
Also, I am still wondering why they wouldn't want to update the panel asynchronously if they have a buffer in the display controller. If you have a 144 Hz panel, why not scan it out of the buffer at 144 Hz all of the time? Does this add too much latency? You can't update the buffer in the middle of a scan or it will cause tearing, so you could get up to one frame of latency, but is this noticeable at 144 Hz? It may require some extra circuitry to handle updates that come in the middle of a scan (double buffering or something more clever). Unless I am missing something, this is what a G-Sync display does when it has to wait too long for a frame (<30 Hz); it just does the scan out of the local buffer.
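For what it's worth, the latency cost the commenter is asking about is easy to bound on paper. This is a back-of-the-envelope estimate, not a description of how the module actually behaves:

```python
# If the panel always scans out of a local buffer at a fixed 144 Hz, a
# frame that just misses the start of a scan waits at most one refresh.
panel_hz = 144.0
worst_case_ms = 1000.0 / panel_hz
print(f"worst-case added latency at {panel_hz:.0f} Hz: {worst_case_ms:.1f} ms")  # ~6.9 ms
```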
Panel self refresh actually gets in the way of what current G-Sync does. If a panel forced a self refresh, that refresh would have to complete before a new frame could be sent, causing that frame to judder. You need *complete* control of the path to avoid situations like this.
Nvidia has been deleting and hiding every single thread about this.
Eye-opening when customer service is using a site to defend itself and its actions.
https://forums.geforce.com/default/topic/808129/geforce-900-series/sites-that-support-nvidia/
The auto industry is strictly regulated with regard to the information its sales associates have to supply potential customers, and it's about time, after 35+ years, that the reporting of vital information to the consumer in the PC/laptop and mobile device market had the same requirements. Device and component reporting is in such disarray. Where are the required data sheets? Important information is so obfuscated, or just not supplied, that the information vital for a potential customer to make a proper purchasing decision simply is not there. A mandated data sheet, specific to the PC/laptop/mobile device, as well as to the SoCs/CPUs and GPUs (which should have their own data sheet requirement), should be required, and the makers of OEM/ODM parts, as well as of the final PC, laptop, or mobile device, should have to bundle all the appropriate data sheets and provide them to their prospective customers.
This includes brick-and-mortar stores having display copies of all the relevant data sheets; online retailers should provide links to them. This also covers software such as graphics drivers (are the graphics drivers updatable only by the OEM, or can they be updated by the GPU's maker?), as well as a listing of all the bloat/adware bundled with the computer that is not vitally necessary for the operation of the device. It's time for strong reporting regulations, or the consumer will continue to be in the dark, unable to even make an educated guess.
It's time to start writing letters to the FTC and any other federal agencies that oversee these markets, and include your state's attorney general on your list. The PC market, as well as the mobile market, is terrible at providing system-specific information about its products; OEMs and ODMs alike need strong information reporting requirements.
Amazing how many AMD trolls are posting. Would love to know how many actually work at AMD. 😛
And by mentioning AMD specifically (as claimed by you), you are implying that you may be doing likewise for the opposing side. Be sure to note the difference between the fanboi “troll” and the pissed-off consumers who were just venting their anger against the big monopoly/ies abusing their market position, and the relative lack of any enforcement of fair market practices, in this new era of the poorly regulated big monopolies of the past 35 years.