While the current Nintendo console’s internals are severely underpowered compared to the competition from the Xbox 360 and PS3, the company looks to leapfrog those consoles in the graphics department with the upcoming Wii U. According to Engadget, the new Nintendo offering will come equipped with a GPU based on AMD’s Radeon 4800 series. The custom R770 chip supports DirectX 10.1 and multiple displays, allowing the console to output up to four SD video streams.
While the proposed chip is last-generation in terms of PC gaming, on the console front it would be the highest-end GPU available: the Xbox 360 uses a custom ATI X1900 GPU, and the PS3 employs a custom RSX ("Reality Synthesizer") graphics chip based on NVIDIA’s 7800 GTX PC graphics card.
What do you think about Nintendo’s move to employ the AMD GPU?
Wouldn’t a 4800 series chip be two generations ago? The 4800s were replaced with the 5800s, which themselves have been replaced with the 6900s. It’s almost certain that the 6900s themselves will be replaced at least once by the time the Wii U comes out.
Radeon 4800 and Radeon 5800 really are different generations of chips, but the Radeon 6900 is just an update of the 5800.
The 6000 series is a brand new series, and the 6900 series beats the 5800 series in performance, power consumption, and heat. I would have thought Nintendo would pick up on the newer 6000 series, somewhere around a 6850/70, due to its great temps and performance. The 4800 series has been known to have temperature problems, especially during long use (which the Wii U is focusing on, as the console can stream to the tablet controller). If it’s not an enhanced version of the 4800 series that can deliver good FPS while also keeping temperatures low, then it’ll have a greater failure rate than the Xbox 360’s initial launch.
As to the question, “What do you think about Nintendo’s move to employ the AMD GPU?”
AMD and Nintendo seem to have struck a deal; both respect each other pretty well and have been partnered since before the GameCube. The only problem was that the Wii had little to no upgrade over what the GameCube had, so I’m happy to see this heavy jump in hardware for Nintendo.
This is going to be enough for Nintendo to compete over the coming five years. Graphics are becoming less important (though still important) as time rolls on, and Nintendo has designed its machine with that in mind.
People shouldn’t judge 4800 series performance based on a PC, as consoles give developers access to the metal, opening up the full power of the GPU; the custom 7800 and 1900 chips in the current consoles are examples. The Japanese Garden tech demo hints at the power of this GPU with its time-of-day changes, weather effects, changing seasons, and high-quality lighting.
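To illustrate what I mean by metal access, here’s a purely hypothetical toy sketch in C. None of these names correspond to any real console SDK, driver, or engine; it just contrasts a validating, state-tracking draw-call path with one that writes packets straight into a command buffer, which is roughly where console titles save per-call overhead.

#include <stdio.h>
#include <time.h>

#define NUM_DRAWS      100000
#define WORDS_PER_DRAW 4

static unsigned cmd_buf[NUM_DRAWS * WORDS_PER_DRAW];
static size_t cmd_len;

/* "PC-style" path: every call re-validates and shadows state, standing in
   for driver/runtime work that never reaches the GPU. (Made-up names.) */
static void draw_via_api(unsigned mesh_id, unsigned material_id)
{
    unsigned state[64];
    for (unsigned i = 0; i < 64; i++)        /* pretend state validation */
        state[i] = (mesh_id * 31u + material_id * 17u + i) & 0xFFFFu;
    cmd_buf[cmd_len++] = 0xC0DEu;            /* packet header */
    cmd_buf[cmd_len++] = mesh_id;
    cmd_buf[cmd_len++] = material_id;
    cmd_buf[cmd_len++] = state[63];
}

/* "Console-style" path: the engine trusts its own state and writes the
   packet straight into the command buffer. (Made-up names.) */
static void draw_to_metal(unsigned mesh_id, unsigned material_id)
{
    cmd_buf[cmd_len++] = 0xC0DEu;
    cmd_buf[cmd_len++] = mesh_id;
    cmd_buf[cmd_len++] = material_id;
    cmd_buf[cmd_len++] = 0u;
}

static double time_path(void (*draw)(unsigned, unsigned))
{
    cmd_len = 0;
    clock_t start = clock();
    for (unsigned i = 0; i < NUM_DRAWS; i++)
        draw(i, i % 16u);
    return (double)(clock() - start) * 1000.0 / CLOCKS_PER_SEC;
}

int main(void)
{
    printf("validating path: %.2f ms for %d draws\n",
           time_path(draw_via_api), NUM_DRAWS);
    printf("direct path:     %.2f ms for %d draws\n",
           time_path(draw_to_metal), NUM_DRAWS);
    return 0;
}

The actual numbers depend entirely on your compiler and machine; the only point is where the per-call overhead lives, and console developers get to skip most of it.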
Lastly, the custom aspect could provide a significant improvement in performance, as it did with the 1900-series chip the Xbox 360 uses.
I’m convinced that the Wii U will end up where the Dreamcast was graphically, but since technology across the board has evolved to a great extent, the graphical quality comparison will be great (Wii U) to excellent (PS3).
Sorry, I meant PS4 instead of PS3 at the end of the above post.
So no DX11? Either DX11 will fail in developer adoption like DX10 did, or Nintendo is going to miss the boat and have grossly sub-standard hardware AGAIN.
As a Wii owner, I am super butthurt about its low-res output options (you have to pay an assload extra for component cables, and you still only get 480p). And before anyone mentions its innovative peripherals, I find the Wii motion controls (both the original and the newer MotionPlus) to be laggy AND inaccurate.