FSAA Quality, Anisotropy Quality
This content was originally featured on Amdmb.com and has been converted to PC Perspective’s website. Some color changes and flaws may appear.
“Jaggies” are the result of rendering an edge using the display’s default pixel grid. Anti-aliasing removes these jagged edges by softening them, creating the illusion that the edge was rendered at a higher resolution. As a result, anti-aliasing is generally more important on lower-resolution displays (e.g. 1024×768 on a 19″ monitor), where there are fewer pixels available to render an edge. But many users, regardless of resolution and monitor size, use anti-aliasing simply to improve overall image quality. Here we will test the three major chipsets’ anti-aliasing performance.
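To illustrate what anti-aliasing is actually doing, here is a minimal Python sketch of supersampling (the `coverage` function and the diagonal `edge` are hypothetical examples for illustration, not any card's actual implementation): each pixel is split into a grid of sub-samples, and averaging their coverage turns a hard on/off edge into a soft gradient.

```python
# Minimal sketch of supersampled anti-aliasing: each screen pixel is
# split into a samples x samples grid of sub-samples, and the pixel's
# final shade is the average coverage, which softens a jagged edge.

def coverage(px, py, inside, samples=4):
    """Fraction of the pixel at (px, py) covered by a shape,
    measured on a samples x samples sub-pixel grid."""
    hits = 0
    for sy in range(samples):
        for sx in range(samples):
            # sample at the sub-pixel centre
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            if inside(x, y):
                hits += 1
    return hits / (samples * samples)

# A diagonal edge: everything below the line y = x is "inside".
edge = lambda x, y: y < x

# With a single sample the edge pixel is all-or-nothing; with 4x4
# sub-samples it becomes a partial grey, smoothing the jaggy.
print(coverage(0, 0, edge, samples=1))  # 0.0
print(coverage(0, 0, edge, samples=4))  # 0.375
```

With one sample the pixel flips abruptly between fully lit and fully dark as the edge moves; with sixteen sub-samples it passes through intermediate shades, which is exactly the softening described above.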
Comparing the FX5700U to the Radeon at 2xAA, it’s hard to say which is better: in some places the FX5700U looks better, while in others the Radeon is superior. At 4xAA the two appear evenly matched as well. The biggest difference I can see between them is the brightness of the scene itself: the Radeon renders the scene noticeably more vividly and with brighter tones than the FX5700U. You can easily see this in the thumbnails above by comparing the color of the sky.
Comparing the FX5600U and the FX5700U, there’s very little difference between the two samples in terms of anti-aliasing. What you do notice is how the scene is colored: as in the Radeon comparison, the FX5600U appears brighter and more vivid than the FX5700U. Could it be that the FX5700U renders everything a little darker than its predecessor?
Here we’ll take a look at the different cards’ ability to keep textures sharp as they recede into the distance. We are using Morrowind again (Fort Ebonheart) to compare the effectiveness of each level of filtering. Below are results of anisotropic filtering cropped from the original 1024×768 image.
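As background, the idea behind anisotropic filtering can be sketched with a toy model (a hypothetical simplification, not the actual algorithm of any card tested here): a pixel's texture footprint on a receding floor is a stretched ellipse, and AF spends extra samples along the long axis of that ellipse, up to the user-selected 2x/4x/8x cap.

```python
import math

# Toy model of anisotropic filtering's sample budget: the more
# stretched a pixel's texture footprint, the more samples are taken
# along its long axis, clamped to the selected AF level.

def af_sample_count(len_major, len_minor, max_aniso):
    """Samples taken along the major axis of the footprint,
    clamped to the AF level (2x, 4x, 8x...)."""
    ratio = len_major / max(len_minor, 1e-6)
    # round up so a ratio of 3.2 still gets 4 samples, then clamp
    return min(max_aniso, max(1, math.ceil(ratio)))

# A floor tile viewed nearly edge-on: footprint 7x as long as wide.
# 8xAF can cover it fully; 2xAF must under-sample, so the texture blurs.
print(af_sample_count(7.0, 1.0, 8))  # 7
print(af_sample_count(7.0, 1.0, 2))  # 2
```

This is why the difference between 2xAF and 8xAF shows up mostly on steeply angled surfaces like Morrowind's ground textures, and barely at all on walls facing the camera.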
I know some of you are going to laugh when you see the following chart. Yes, I know it looks like a muddy mess; that’s why these are cropped samples! 🙂 Click on them to see the full image.
GeForce FX5700U | GeForce FX5600U | Radeon 9500
Something must have changed from NVIDIA’s series 40 drivers to the series 50 drivers: their anisotropic filtering quality has improved quite a bit! Comparing the larger 2xAF images, the FX5700U looks slightly better than the Radeon. At 4xAF, the Radeon still lags slightly behind the FX5700U in quality. At 8xAF, however, both cards look about the same.
Comparing the FX5600U to the FX5700U, I cannot see any significant difference between the two. As far as I’m concerned, the FX5700U’s anisotropic filtering hasn’t changed at all from the FX5600U’s.
Even though the FX5700U appears to have slightly better texturing at 2xAF and 4xAF over the ATI-based card, you have to look at the overall performance penalty to determine if the extra texture sharpness is worth it to you.
Editor’s Note: The Detonator 50 drivers do offer a drastically improved AF and AA experience as well as performance in pixel shaders. After our article on the problems with UT2K3, NVIDIA took these words to heart and fixed the problems — hence the improvement we are seeing here today.
By improving the quality of the sampling, NVIDIA knew they would decrease the cards’ performance (framerate), so they also made a solid effort to improve performance by using a new compiler in their Detonator 50 driver set. These techniques are the same ones you would find in a modern optimizing compiler, where reordering instructions can drastically improve performance with no loss in the quality of the output. In several DX8 and DX9 applications, upgrading to the new Detonator 50s will give you a much better gaming experience.
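The instruction-reordering idea can be sketched in a few lines (a toy scheduler with a made-up `(dest, srcs)` instruction format; real shader compilers are far more sophisticated): when one instruction stalls waiting on the result of the previous one, an independent later instruction can safely be hoisted in between to fill the gap.

```python
# Toy illustration of compiler instruction reordering: hoist an
# independent instruction between two dependent ones so the second
# doesn't sit idle waiting for the first's result.
# Instructions are (dest, srcs) tuples -- a made-up format.

def reorder(program):
    out = list(program)
    i = 1
    while i < len(out):
        prev_dest = out[i - 1][0]
        cur_srcs = out[i][1]
        if prev_dest in cur_srcs:  # read-after-write: out[i] stalls on out[i-1]
            for j in range(i + 1, len(out)):
                c_dest, c_srcs = out[j]
                between = out[i:j]
                writes = {d for d, _ in between}
                reads = {s for _, srcs in between for s in srcs}
                # hoisting out[j] to position i is safe only if it breaks
                # no data dependence with the instructions it jumps over,
                # and useful only if it doesn't itself stall on out[i-1]
                if (c_dest not in writes and c_dest not in reads
                        and not set(c_srcs) & writes
                        and prev_dest not in c_srcs):
                    out.insert(i, out.pop(j))
                    break
        i += 1
    return out

prog = [
    ("r0", ("t0",)),  # MUL r0, t0
    ("r1", ("r0",)),  # ADD r1, r0  <- stalls waiting on r0
    ("r2", ("t1",)),  # MUL r2, t1  <- independent, can be hoisted
]
print(reorder(prog))  # the independent MUL moves between the dependent pair
```

The output of the program is unchanged; only the order of independent work moves, which is exactly why this kind of optimization carries no image-quality cost.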