Image Quality Testing
For the final portion of our performance testing, we wanted to compare the differences in image quality between videos outputted using ATI Stream and CUDA. Why is this important? While transcoding times and CPU usage are quite meaningful, both of these metrics are irrelevant if the image quality of the outputted video is less than stellar. So, we took screenshots at the same point in each video clip and present a zoomed-in, side-by-side comparison of the image quality in three of the video clips we used during performance testing.
HD.net “Las Vegas” test clip screenshots
HD.net “Las Vegas” test clip outputted to M2TS using NVidia CUDA (scaled down to 600px width)
HD.net “Las Vegas” test clip outputted to M2TS using ATI Stream (scaled down to 600px width)

Portion of HD.net “Las Vegas” test clip zoomed in 200 percent (CUDA on left, Stream on right)
Our zoomed-in screenshots show a few minute details in the image quality consumers can expect from video outputted by ATI Stream or CUDA enabled applications. The ATI screenshot looks a bit softer than its CUDA counterpart, which can be seen in the model’s face and hair. The background behind the model in NVidia’s version looks a bit brighter too. Everything else looks nearly identical, and both files were similar in size, which is also a consideration when transcoding video for devices like the PSP or iPhone.
ATI Avivo HD test clip screenshots
ATI Avivo HD test clip outputted to MP4 using NVidia CUDA (original size)
ATI Avivo HD test clip outputted to MP4 using ATI Stream (original size)

Portion of ATI Avivo HD test clip zoomed in 200 percent (CUDA on left, Stream on right)
Again we see some small differences between these two zoomed-in screenshots that can barely be seen by the naked eye. There is some softness to the image quality in both screenshots, but since this video is optimized to be viewed on an iPod, it should look much better at that device’s lower resolution. There are some jagged lines in the buildings in the background, but that typically happens when you compress a video file to save space. In this particular comparison, I don’t see many distinguishable differences that consumers should be aware of.
NVidia’s “The Plush Life” test clip screenshots
“The Plush Life” test clip outputted to MP4 using NVidia CUDA (scaled down to 600px width)
“The Plush Life” test clip outputted to MP4 using ATI Stream (scaled down to 600px width)

Portion of “The Plush Life” test clip zoomed in 200 percent (CUDA on left, Stream on right)
For our final image quality test, we used NVidia’s “The Plush Life” to see if we could pinpoint all the small differences in detail, clarity, lighting, and sharpness between the screenshots above. At first glance, the CUDA screenshot on the left looks a bit sharper than the ATI capture. Another item that looks a bit sharper is the seat in the bottom-left corner of each screenshot: in the CUDA version the seat looks clearer and you can make out its detail, whereas the ATI version looks a bit blurry.
First of all, let’s set a few things straight. This was a purely subjective image quality test that might have produced different results if my test bench had different specs, a different display, or if I had used a video player other than VideoLAN media player. I did my best to capture screenshots at the same point in each video and to evaluate each video consistently against the same standards.
That being said, I think our CUDA screenshots slightly edged out ATI Stream in the image quality department. Most of the differences we discussed were barely noticeable, but the last set of screenshots really allowed us to see those minuscule details in the character’s face and car seat. The other Stream screenshots were a bit soft and blurry in some sections, which indicates a lower-quality video. Overall, both platforms produced more than acceptable output video files that could be enjoyed on any consumer device that supports those file formats. While we watched these clips on our Westinghouse 24″ LCD, many people might watch them on their iPhones, Zunes, high-definition TVs, or other consumer devices.



Please change the title.
Your article is not about a comparison of Stream and CUDA performance, it is the difference between two software implementations utilising Stream and CUDA.
These technologies allow you to parallelise your algorithms, to imply that one technology performs ,as you essentially say, ‘better quality maths’ than the other is ignorant.
Please do not misdirect readers like this.
Regards.
Joe Bloggs
Please change you word.
Your comment is not about a reply to the article, it is a quantification of how butthurt you are.
These new breakthrows allow us to see how badly you are spell ,as you essentially try to use ‘larger words’ but not good at English.
Please do not obfuscate readers’ thoughtings like this.
Regards.
Bloe Joggs
damn dude, look at your own English, it’s absolutely dreadful!
Ya dude, you’re an idiot, your article is misleading. For sure!
Peace
Hater Bater Fuck Face
SO MUCH HATE!
You are comparing two cards, one of which is nearly a year older than the other; it’s elementary that the newer one is going to win. This review is biased.
Why are you not comparing the same frame in the outputs? How can you do a comparison of different frames and make a decision on differences in quality?
My personal gaming research team has found nVIDIA’s CUDA technology to be superior, but they compared current GPUs, not GPUs with a manufacturing time gap.
This is a very interesting article to contribute to my PC Hardware class, as I’m currently in a Network Admin program in Vermont. Please keep up the good work, guys; I love your site, and you have been very helpful over the last several semesters.
For Bitcoin miners, AMD GPUs are faster than Nvidia GPUs!
Why?
Firstly, AMD designs GPUs with many simple ALUs/shaders (a VLIW design) that run at a relatively low clock frequency (typically 1120-3200 ALUs at 625-900 MHz), whereas Nvidia’s microarchitecture consists of fewer, more complex ALUs and tries to compensate with a higher shader clock (typically 448-1024 ALUs at 1150-1544 MHz). Because of this VLIW vs. non-VLIW difference, Nvidia uses up more square millimeters of die space per ALU, hence can pack fewer of them per chip, and it hits the frequency wall sooner than AMD, which prevents it from raising the clock high enough to match or surpass AMD’s performance. This translates to a raw ALU performance advantage for AMD:
An old AMD Radeon HD 6990: 3072 ALUs x 830 MHz = 2550 billion 32-bit instructions per second
A new Nvidia GTX 590: 1024 ALUs x 1214 MHz = 1243 billion 32-bit instructions per second
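The throughput figures above are just ALU count times clock. A minimal sketch of that arithmetic (the function name `alu_throughput_gops` is illustrative; the ALU counts and clocks are the ones quoted in the comment):

```python
def alu_throughput_gops(alus: int, clock_mhz: int) -> float:
    """Peak 32-bit integer throughput in billions of ops/sec,
    assuming one op per ALU per clock cycle."""
    return alus * clock_mhz / 1000.0

# AMD Radeon HD 6990: 3072 ALUs at 830 MHz
hd6990 = alu_throughput_gops(3072, 830)
# Nvidia GTX 590: 1024 ALUs at 1214 MHz
gtx590 = alu_throughput_gops(1024, 1214)

print(f"HD 6990: {hd6990:.0f} billion ops/s")   # ~2550
print(f"GTX 590: {gtx590:.0f} billion ops/s")   # ~1243
print(f"ratio:   {hd6990 / gtx590:.1f}x")       # ~2.1x
```

This is a peak number, of course; real workloads only approach it when they are purely ALU-bound, as Bitcoin hashing is.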
This approximate 2x-3x performance difference exists across the entire range of AMD and Nvidia GPUs. It is very visible in all ALU-bound GPGPU workloads such as Bitcoin, password bruteforcers, etc.
Secondly, another difference favoring Bitcoin mining on AMD GPUs instead of Nvidia’s is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs (BIT_ALIGN_INT), but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add). This alone gives AMD another 1.7x performance advantage (~1900 instructions instead of ~3250 to execute the SHA-256 compression function).
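The operation in question is easy to see in code. A minimal sketch of 32-bit right rotate (the name `rotr32` is illustrative): AMD hardware does this in one instruction (BIT_ALIGN_INT), while hardware without a rotate instruction emulates it with two shifts plus a combine, exactly as below (an OR or an add both work here, since the two shifted halves occupy disjoint bits):

```python
MASK32 = 0xFFFFFFFF  # keep results within 32 bits

def rotr32(x: int, n: int) -> int:
    """Rotate the 32-bit value x right by n bits (0 < n < 32),
    emulated with two shifts and a combine."""
    return ((x >> n) | (x << (32 - n))) & MASK32

# The low bit wraps around to the top bit:
assert rotr32(0x00000001, 1) == 0x80000000
```

SHA-256 performs this rotation dozens of times per compression round, which is why a 3-instruction emulation instead of 1 native instruction adds up to a measurable hashrate gap.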
Combined, these two factors make AMD GPUs overall 3x-5x faster when mining Bitcoins!
Fucking plagiarism. Copy/pasted from another source with no citation or credit. Your education should be shredded and flushed down the toilet. Here is where you copied it from, for people who want to read someone with actual knowledge and not just ctrl+c -> ctrl+v.
https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU
You plagiarized me. I complained about someone else who copied something and posted a link. All you did was change the link. You are a loser and the worst scum on the internet.
Why are we bitching about plagiarism? If I wanted to make sure his info was correct, I would’ve looked it up myself. I couldn’t care less if it was “plagiarized” as long as the information was correct.