Performance Results and Closing Thoughts
So the image quality results seem to line up with what NVIDIA has claimed, but what about performance? How much can you gain moving from standard 4x MSAA to the new 4x MFAA in games that support it? Obviously we only had time to test a few games (and remember, only 20 games support it at all), but we used our standard GPU test bed: a Sandy Bridge-E Core i7-3960X, 16GB of DDR3, and a reference GeForce GTX 980 4GB card.
Even at 4x MSAA, GRID 2 is able to run at nearly 90 FPS at 2560×1440 with the Ultra quality preset. The move to 2x MSAA results in an average frame rate increase of 6%; MFAA gets you 4-5% of that performance back while improving image quality in many cases.
In BF4, running the game with 2x MSAA results in an average frame rate of 60 FPS at 2560×1440 using the Ultra preset. Moving to 4x MSAA drops that down to 53 FPS on average. The 4x MFAA test results in an average FPS of 58 FPS, a 9% increase over the 4x MSAA result.
Crysis 3 actually sees the biggest gain with MFAA at work! Both the 4x MFAA and 2x MSAA results hit 33-34 FPS on average while the 4x MSAA implementation drops down to 28 FPS: 17% slower. This game was tested at 2560×1440 and the Very High quality preset.
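For readers who want to check the math, the percentage deltas above fall straight out of the rounded average frame rates quoted in this article. A quick sketch:

```python
# Recompute the relative performance deltas quoted above from the
# rounded average frame rates reported in this article.

def pct_change(new, old):
    """Relative change of `new` versus `old`, in percent."""
    return (new - old) / old * 100.0

# Battlefield 4, 2560x1440 Ultra: 4x MSAA = 53 FPS, 4x MFAA = 58 FPS
bf4_gain = pct_change(58, 53)
print(f"BF4: 4x MFAA is {bf4_gain:.1f}% faster than 4x MSAA")

# Crysis 3, 2560x1440 Very High: 2x MSAA / 4x MFAA ~33.5 FPS, 4x MSAA = 28 FPS
c3_delta = pct_change(28, 33.5)
print(f"Crysis 3: 4x MSAA is {abs(c3_delta):.1f}% slower")
```

The BF4 number lands at roughly 9% and the Crysis 3 number at roughly 16-17%, matching the figures in the text; small differences come from rounding the averages to whole FPS.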
Closing Thoughts
NVIDIA's new Multi-Frame Sampled Anti-Aliasing is finally coming out, two full months behind the reveal and release of the GTX 980 and the MFAA technology in general. Despite that delay, the current shipping driver only supports MFAA on twenty PC games and uses a silent whitelist that requires a lot of research on the part of the gamer to determine compatibility. Clearly this isn't what NVIDIA expected or desired, but that is where we are on the launch of the AA method with the baddest name around.
Still, even though we could fairly call this MFAA release small by the expectations NVIDIA placed on the tech, it does appear to work as desired in the games that are supported. In my time with it, the image quality it provided was better than 2x MSAA and nearly matched 4x MSAA, with performance closer to 2x MSAA than 4x MSAA. That alone would give MFAA a spot in our list of favorite features for Maxwell if it just supported more games!
Time will tell if MFAA is a feature that NVIDIA continues to work on and improve or if it will be one of the many graphics technologies from the last 15 years to find its way to the list of also-rans. Even looking at the list of ATI/AMD/NVIDIA-specific AA methods alone will leave you dizzy with acronym confusion. Not having SLI support for MFAA also seems like a glaring omission, considering SLI owners are exactly the type of users willing to enable off-the-beaten-path options like this in the control panel.
For now though, a very limited subset of NVIDIA's gamers (GTX 980/970) will be able to enjoy the benefits of MFAA on a very limited subset of modern PC games. It has potential, but needs a lot of work and attention from the driver team to keep the plates spinning.
“FXAA/MSAA/CFAA/MFAA/DSR – TRASH” (c) Forever proud FSAA/SSAA-god
Of course those AA methods offer superior image quality. A less divine aspect is that they also bring dramatic frametime augmentations. No one is saying MFAA is explicitly superior to FSAA, or even MSAA; they do say, however, that MFAA’s frametime:imagequality ratio is very appealing. If I had had a Titan Black back when I had a GTX 580, I’d be supersampling my heart out in my “Yes I can run Crysis” T-shirt. If you want to clean up a Frostbite 3 frame, or load a Ubisoft game’s title screen, efficient aliasing elimination is lucrative, and therefore worthy of NVIDIA’s attention.
MFAA is trash simply because it’s actually worse than MSAA (which is already a piece of crap to begin with).
MFAA soapens up the image like hell, just like that FXAA garbage did. And for a PC any soap ever is a very big no-no. Leave soap to consoles.
It does NOT matter if MFAA has “better” consumption rates in comparison to MSAA or something else. Because soapy trash is still soapy trash, no matter how much you try to justify its stillborn “existence” or how hard you try to defend it in comparison to truly GODLIKE methods such as FSAA/SSAA.
The turd doesn’t become any less turd if you make it out of gold and cover it with gems – it’s still a turd in the end.
As for DSR…I’ve already said it many times before and I will never get tired of repeating this:
DSR effing sucks, seriously.
It tries to recreate FSAA/SSAA while taking a smaller hit on performance than the actual FSAA/SSAA does, but fails because it soaps up details rendered further away and produces glitches like the ones AMD’s CFAA had. Don’t get me wrong though – it’s utterly useless trash only in comparison to the full-blown FSAA/SSAA; by itself (if you don’t consider that FSAA/SSAA exists out there) it’s not all that bad. It’s just fugly when you compare it to pure FSAA/SSAA, not when you compare it to other methods. To put it simply: there is still no technology better than FSAA/SSAA out there; it is STILL the best method. DSR is completely useless and just outright sucks, in that regard.
To clarify it somewhat easier: it’s fine by itself. It’s only truly bad if there’s full-blown FSAA/SSAA around (like in Witcher 2, for example), because in those cases DSR’s utter ugliness becomes extremely apparent and very easily noticeable. As for the glitches caused by DSR – go Google up “AMD CFAA glitches”, DSR has pretty much same exact problems.
The first 2 Google results for your search terms are your posts about DSR… The rest are not really useful.
Stop producing autism.
SSAA/FSAA doesn’t work properly with deferred lighting. UE3 is the worst offender…
If I use 4x SSAA @ 2560×1080, there are lighting jaggies everywhere and the framerate goes down to ~20.
With 4x DSR the engine literally multi-samples everything, including the deferred lighting passes. While it doesn’t completely eliminate sparkling in UE3, it does heavily reduce it compared to SSAA.
The frame rate is also about 3x better (averaging 58 vs 20 in Mass Effect 3). Also, with MFAA you can now use 2x MSAA on top of DSR for free.
With engines that don’t heavily rely on deferred lighting, SSAA works great too.
Okay, disregard that. It looks like for ages I hadn’t been setting up SGSSAA or TRSSAA properly, and it wasn’t handling the deferred lighting. But it appears that it actually does.
Wtf 😀
Do you even know what DSR technology actually does?
I suppose not, but if you are open-minded – like every rational person – learn it!
Google: “GeDoSaTo Tool”.
Super-sampling is something other than just some shitty API to smooth edges.
And I didn’t even mention before that SSAA & FSAA are JUST “golden shits with diamonds”, because they’re using the supersampling method to begin with.
You have no clue what you are talking about. MFAA does not soften the image at all; it sounds like you are confusing it with DSR.
Again, isn’t “DSR” something I’ve been doing for 2+ years now?
Yes, I have to actually ask that question!
Still can’t for the life of me figure out if they are doing anything unique here..
Consistently for months before the 900’s launched…
I would usually run older/lower-end games at 2x+ my screen’s native resolution by forcing custom desktop resolutions through the NVIDIA CP, then in Windows setting my resolution back to normal; in game the new higher resolutions are then selectable, but returning to the desktop goes back to normal. Really easy.
But since then all I’ve been able to do is 1.5x.
I do have a dual-link DVI cable on its way, to see if the single-link one I found laying around is what’s limiting it to 1.5x.
But if I discover that it’s not the cable locking it to 1.5..
Well.. That’s some sad and shady business right there.
I quite miss 2X+ resolution.
Am I the only person who ever did this or what?
I can use some insight.
I did the same until DSR was enabled. It’s the 13-tap Gaussian filter that makes it superior to Custom Resolutions – Sharpness/Smoothness slider.
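[Editor's note: a Gaussian downsampling filter of that shape is easy to sketch. NVIDIA has not published DSR's exact kernel weights or sigma, so the `sigma` below is a placeholder (in the shipping driver the filter width is effectively controlled by the DSR Smoothness slider); this is a conceptual 1-D sketch, not NVIDIA's implementation:]

```python
import math

def gaussian_kernel(taps=13, sigma=2.0):
    """Normalized 1-D Gaussian kernel. DSR's real sigma is not public;
    this value is an illustrative placeholder."""
    center = taps // 2
    weights = [math.exp(-((i - center) ** 2) / (2 * sigma ** 2))
               for i in range(taps)]
    total = sum(weights)
    return [w / total for w in weights]

def downsample_row(row, factor, kernel):
    """Filter a 1-D row of pixel values with the kernel, then decimate
    by `factor` -- the separable version of filter-then-downscale."""
    half = len(kernel) // 2
    out = []
    for x in range(0, len(row), factor):
        acc = 0.0
        for k, w in enumerate(kernel):
            src = min(max(x + k - half, 0), len(row) - 1)  # clamp at edges
            acc += w * row[src]
        out.append(acc)
    return out

# 2x DSR on one scanline: render at double width, filter, decimate by 2.
hi_res = [0.0] * 8 + [1.0] * 8        # a hard edge at double resolution
print(downsample_row(hi_res, 2, gaussian_kernel()))
```

The wide kernel is exactly why DSR looks smoother than a naive custom-resolution downscale: a plain box or point filter only looks at the pixels it decimates, while the 13-tap Gaussian blends in neighbors, trading a little sharpness for far less shimmer.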
Cool, thanks man. I’ve not had time to go and research it more and when I did I didn’t find any answers.
I’ll look into that.
Though even with 2x/4x and its awesome AA effect, I’d find myself still using 2x/4x AA to really make it pop.
Perhaps that dithering can sort of replace the need for 2x AA.
So NVIDIA doesn’t even add support for their own games from 2014? Watch Dogs and Far Cry 4 not on the list? Assassin’s Creed Unity is the only one I noticed.
You are correct – these are pretty glaring omissions. Hopefully NVIDIA can ramp up the whitelist pretty quickly.
Thanks. I’m currently playing Watch Dogs with either 4x TXAA or 4x MSAA; both are just horrible experiences at 1080p on a new GTX 980 G1 @ 1500/7700 with everything maxed. So I am in need of less taxing anti-aliasing that doesn’t have my frames jumping all over the place (70 to 50 to 70 and back again), since 50 causes screen tearing on my 120Hz monitor.
As well as redux
MFAA shows promise but I do not own a 900 series card so I’ll stick with other methods for now.
Regarding DSR, I was aware of downsampling before, but DSR adds an additional special filter which improves image quality.
Correct. Previous iterations of downsampling have used very basic filters that were not optimized for the feature they were being used for.
I wonder if it’s possible to create our own custom filters and just inject them into DX and, again, use any card.
Or even rip their method entirely.
So how does this work in terms of performance optimization? Why does MFAA in Crysis 3 perform that much better than in GRID 2? I’m guessing none of those games have MFAA optimizations from the actual game developers? So the implementation for each game has been done in the driver only? Does this mean NVIDIA spent more time optimizing it for Crysis 3, and can we expect that performance impact in future games whose developers take the time to optimize MFAA for their specific engine? Who has access to the MFAA API? Is NVIDIA contacting specific game developers to implement the feature in collaboration with them?
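[Editor's note: NVIDIA hasn't published the driver internals, but the broad idea it has described is that Maxwell's programmable sample positions let the driver alternate two different 2-sample patterns on successive frames, with a temporal filter blending them so a static edge resolves as if four samples had been taken. A toy single-pixel model of that idea, with illustrative sample positions that are not NVIDIA's actual patterns:]

```python
# Toy model of the idea behind MFAA as NVIDIA has described it publicly:
# alternate two 2-sample patterns across frames, then temporally blend.
# Sample positions are illustrative, not NVIDIA's real patterns.

def coverage(samples, edge_x):
    """Fraction of sub-pixel samples that fall left of a vertical edge."""
    inside = [s for s in samples if s[0] < edge_x]
    return len(inside) / len(samples)

# Two alternating 2-sample patterns (frame N and frame N+1)...
pattern_a = [(0.3, 0.25), (0.7, 0.75)]
pattern_b = [(0.1, 0.75), (0.9, 0.25)]
# ...versus a single fixed 4-sample pattern (plain 4x MSAA).
pattern_4x = pattern_a + pattern_b

edge = 0.8  # a static vertical edge crossing the pixel at x = 0.8

msaa_2x_a = coverage(pattern_a, edge)    # what frame N sees
msaa_2x_b = coverage(pattern_b, edge)    # what frame N+1 sees
mfaa      = (msaa_2x_a + msaa_2x_b) / 2  # temporal blend of the two
msaa_4x   = coverage(pattern_4x, edge)   # ground-truth 4x result

print(f"2x frame A: {msaa_2x_a}, 2x frame B: {msaa_2x_b}")
print(f"MFAA blend: {mfaa}  vs  4x MSAA: {msaa_4x}")
```

For a static edge, averaging the two 2-sample frames reproduces the 4-sample coverage exactly, which is why MFAA can approach 4x MSAA quality at close to 2x MSAA cost. The hard part, and presumably one reason a per-game whitelist exists, is keeping that temporal blend stable when the camera and geometry move between frames.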
As a gamer, AA is the last option I turn on in any game, after I have every other setting at max. With BF4, for example, I could not stand MSAA. I loved resolution scaling, however. To me quality comes first, then performance. MFAA is a cheap man’s MSAA. If you don’t have the GPU power for 4x, then 2x will not be much better either.
July 2015 – MFAA still not available for SLI users.