Anti-Aliasing Performance
“Jaggies” are the result of rendering an edge using the display’s native pixel grid. Anti-aliasing removes these jagged edges by softening them so they appear smoother, creating the illusion that the edge was rendered at a higher resolution. As a result, anti-aliasing is generally more important on lower-resolution displays (e.g. 1024×768 on a 19″ monitor), where there are fewer pixels available to render an edge. But many users, regardless of resolution and monitor size, use anti-aliasing simply to improve the way games look.
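To make that “softening” concrete, here is a minimal sketch of supersampling, one common way anti-aliasing can be done: each display pixel averages several sub-pixel samples, so pixels along an edge take on intermediate shades instead of a hard step. The scene and sample counts below are hypothetical, and this is only an illustration of the general idea, not the multisampling hardware paths the Radeon 9700 or FX5900 Ultra actually use.

```python
# Conceptual sketch of supersampling anti-aliasing (SSAA): the scene is
# sampled on a finer grid than the display, then averaged down so edge
# pixels take on intermediate shades instead of hard 0/1 steps.
# Illustration only; not the MSAA implementations of these cards.

def edge(x, y):
    """Toy scene: 1.0 below the diagonal line y = 0.6 * x, else 0.0."""
    return 1.0 if y < 0.6 * x else 0.0

def render(width, height, samples_per_axis=1):
    """Render the toy scene; samples_per_axis=1 is aliased, >1 is SSAA."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            # Average several sub-pixel samples inside this pixel.
            for sy in range(samples_per_axis):
                for sx in range(samples_per_axis):
                    x = px + (sx + 0.5) / samples_per_axis
                    y = py + (sy + 0.5) / samples_per_axis
                    total += edge(x, y)
            row.append(total / samples_per_axis ** 2)
        image.append(row)
    return image

if __name__ == "__main__":
    for label, spa in (("no AA", 1), ("4xAA (2x2 grid)", 2)):
        print(label)
        for row in render(8, 8, spa):
            print(" ".join(f"{v:.2f}" for v in row))
```

Running it prints the same 8×8 scene twice: without anti-aliasing every pixel is 0.00 or 1.00, while with supersampling the pixels along the diagonal fall in between, which is exactly the blending that reads as a smoother edge on screen.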
Here we will test the three major chipsets’ anti-aliasing performance. We will compare both anti-aliasing quality and frame rates, as recorded in the X2: The Threat rolling demo at 1024×768.
Setup:
| Test | Benchmark |
|---|---|
| Anti-aliasing quality | The Elder Scrolls III: Morrowind (town of Seyda Neen) |
| Anti-aliasing performance | X2: The Threat rolling demo |
Results:
All in all, I feel that ATI’s anti-aliasing is superior to NVIDIA’s in both performance and quality. ATI manages to produce a better anti-aliased image at a lower anti-aliasing level. In the case of the Radeon 9700, 2xAA looks and performs like 4xAA on the FX5900 Ultra. The Radeon 9700 has the potential to be a “poor man’s FX5900 Ultra”, but does that hold true? We will explore this in more detail shortly.
A reader wrote to me after the last video card round-up saying that although ATI’s anti-aliasing looks better than NVIDIA’s in a still image, NVIDIA’s anti-aliasing produces fewer artifacts in motion. I put this to the test and honestly couldn’t see a difference between the two. Perhaps my eyes aren’t discerning enough to pick out differences while the screen is moving; it might be an interesting exercise at a later date to compare motion using a tool that captures to video.
2xAA Radeon 9700 vs. 4xAA FX5900 Ultra
When analyzing the anti-aliasing images and the performance numbers, we see something curious when comparing the FX5900 Ultra to the Radeon 9700. At 1024×768, the Radeon 9700 at 2x anti-aliasing looks and performs similarly to the FX5900 Ultra at 4x anti-aliasing. It’s almost like getting a $400 card for $200. But let’s try 1600×1200 just to make sure.
| 1024×768 | Radeon 9700 | FX5900 Ultra |
|---|---|---|
| AA Level | 2x | 4x |
| Freelancer | 56.0 fps | 49.2 fps |
| Morrowind | 32.3 fps | 27.3 fps |
| UT2K3 | 127.3 fps | 118.7 fps |
| X2 Demo | 38.0 fps | 38.4 fps |
| Price | ~$200 USD | ~$400 USD |
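To put numbers on the “$400 card for $200” claim, the short script below (an illustration only, using the fps figures from the table above) works out the percentage gap between the Radeon 9700 at 2xAA and the FX5900 Ultra at 4xAA at this resolution.

```python
# Sanity check on the 1024x768 numbers above: how close is the
# Radeon 9700 at 2xAA to the FX5900 Ultra at 4xAA? Figures are taken
# straight from the table; the script only computes the percentage gap.
results_1024 = {
    "Freelancer": (56.0, 49.2),
    "Morrowind": (32.3, 27.3),
    "UT2K3": (127.3, 118.7),
    "X2 Demo": (38.0, 38.4),
}

for game, (radeon_2x, fx_4x) in results_1024.items():
    gap = (radeon_2x - fx_4x) / fx_4x * 100
    print(f"{game}: Radeon 9700 2xAA is {gap:+.1f}% vs FX5900 Ultra 4xAA")
```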
The story changes at 1600×1200, where the Radeon’s 2xAA is nowhere near the quality of NVIDIA’s 4xAA. NVIDIA’s 4x anti-aliasing appears to hold up better at higher resolutions while performing at similar levels.
| 1600×1200 | Radeon 9700 | FX5900 Ultra |
|---|---|---|
| AA Level | 2x | 4x |
| Freelancer | 47.4 fps | 40.1 fps |
| Morrowind | 33.2 fps | 26.3 fps |
| UT2K3 | 66.2 fps | 97.6 fps |
| X2 Demo | 23.8 fps | 25.1 fps |
| Price | ~$200 USD | ~$400 USD |
So what can we conclude here? I think it’s safe to say that if you’re looking to play games at a lower resolution, a Radeon 9700 is probably the better investment, as its anti-aliasing quality and performance are superior (not to mention the cheaper price tag). However, if you’re playing at higher resolutions, the NVIDIA card has the advantage in both image quality and performance. Consumers in the market for a video card will have to decide whether the extra $200 USD for the FX5900 Ultra is worth the improved anti-aliasing at 1600×1200.