Let's assume that each level of AA and AF takes double the GPU power of its predecessor: 2x takes twice the power of none at all, and 16x takes twice as much as 8x. If an HD3870 can run a modern game at 1920 * 1200 with 0x AA and 0x AF, then running the same game at 1920 * 1200 with 16x AA and 8x AF will take 2^4 and 2^3 times as much power respectively.
2560 * 1600 has about 1.78 times as many pixels as 1920 * 1200; let's round that up to 2x. Just to give it the most pessimistic outlook possible, let's say a future game will need 4x the horsepower of current titles for miscellaneous things. Between the rez and the ‘other’ category, that is 2^3 times more power needed.
In total, we need 2^10 more power, or about 1024x the power of today’s HD3870. A crazy, insurmountable number, right? That would get us to the holy grail of GPUdom, perfect frame rates with everything turned on, but surely it can’t happen. 1000x is more than we can expect, right?
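For anyone who wants to check the arithmetic, here is the whole thing as a few lines of Python. The doubling counts are just the assumptions laid out above, nothing measured:

```python
# Back-of-envelope check of the multipliers above (a sketch, not a benchmark).
aa_doublings   = 4   # 0x -> 2x -> 4x -> 8x -> 16x AA
af_doublings   = 3   # 0x -> 2x -> 4x -> 8x AF
rez_doublings  = 1   # 1920x1200 -> 2560x1600, rounded up to 2x the pixels
misc_doublings = 2   # the pessimistic 4x "everything else" fudge factor

total_doublings = aa_doublings + af_doublings + rez_doublings + misc_doublings
print(total_doublings, 2 ** total_doublings)   # 10 doublings -> 1024x
```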
In powers of two, that is 10 doubling periods. The first one will happen in January, and the next nine will be spaced about six months apart, meaning the worst-case scenario is about 4.5 years down the line. At that point, GPUs will be good enough for the highest-end monitors out there today.
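The timeline works out the same way. A quick sketch, assuming the first doubling lands in January and the six-month cadence holds after that:

```python
# Rough timeline: one doubling in January, nine more at six-month intervals.
doublings = 10
months_between = 6
months_to_last = (doublings - 1) * months_between   # 54 months after January
print(months_to_last / 12)                           # 4.5 years
```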
Are GPUs a dying breed of logic?
That would appear to be the case according to Charlie at The Inquirer. Many of his points are valid, but lines like “A two generation old ATI X1300 low end card can drive two 30″ monitors” are obtuse. Sure, it can do that with 2D and basic Vista screens, but you won’t be pushing dual 30″ 2560×1600 displays with any gaming involved. Also, much of the debate is based on resolutions, and it doesn’t completely address the fact that even if resolutions don’t increase, shader and image quality still can, without any AA and AF at all; just look at our Crysis tests to see that.
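To put rough numbers on that dual 30″ claim, compare the raw pixel counts against the 1920 * 1200 used above. Drawing a desktop at that resolution is trivial; shading every one of those pixels each frame in a game is another matter. The figures below are just the arithmetic, not a benchmark:

```python
# Why "can drive two 30-inch monitors" is not the same as "can game on them".
dual_30in = 2 * 2560 * 1600   # 8,192,000 pixels across both panels
test_rez  = 1920 * 1200       # 2,304,000 pixels at our test resolution
print(dual_30in / test_rez)    # ~3.6x the pixels to shade every frame
```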