Who doesn’t love rumor and speculation about unreleased products? (Other than the manufacturers of such products, of course.) Today VideoCardz is reporting, via HardwareBattle, a GPU-Z screenshot reportedly showing specs for an NVIDIA GeForce GTX 980 Ti.
Image credit: HardwareBattle via VideoCardz.com
First off, the HardwareBattle logo conveniently obscures the hardware ID (as well as the ROP/TMU counts). What is visible is the 2816 shader count, which places it between the GTX 980 (2048) and the TITAN X (3072). The 6 GB of GDDR5 memory has a 384-bit interface and runs at 7 Gbps, so bandwidth should be the same 336 GB/s as the TITAN X. As for core clocks on this GPU (which seems likely to be a cut-down GM200), they are identical to the TITAN X’s as well, with 1000 MHz base and 1076 MHz boost clocks shown in the screenshot.
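The bandwidth figure is easy to sanity-check: peak bandwidth is just the bus width in bytes times the per-pin data rate. A minimal sketch (the function name is ours):

```python
# Peak GDDR5 bandwidth: bus width (bits) / 8 bytes, times per-pin rate (Gbps).
def gddr5_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr5_bandwidth(384, 7.0))  # 336.0 -- matches the TITAN X figure
```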
Image credit: VideoCardz.com
We await any official announcement, but from the frequency of the leaks it seems we won’t have to wait too long.
This is the card I am upgrading my 780 to, and I have been waiting for this one. The Witcher 3 is laying a pimp slapping on my 780 at 1440p, and I’m sacrificing a lot of eye candy to keep a playable frame rate. Even with my ROG Swift G-Sync monitor, which at least makes sub-60 fps look butter smooth, it’s time to make the leap. I think this card will be a very good investment for the next 2 years, since it is a cut-down version of the Titan X and not just another 980 with the same architecture as, let’s say, the Asus 20th Anniversary Gold.
I’m hoping there is a decent selection of overclocked versions with non-stock coolers available shortly after the 980 Ti’s release. I assume MSI and EVGA will work their magic.
I’d go $1000 with taxes (Canadian) for this card; any higher and I might as well get a Titan X instead, since I can pick one up for $1350 with taxes.
Crossing my fingers for a release in the next week or two, and then my 780 will find a new home in my office PC.
“I think This card will be a very good investment for the next 2 years”
Currently, GPUs are held back quite severely by the old 28 nm process. Next year, we will see 14 and 16 nm GPUs that should be massively better than current parts, followed by several more years of limited progress due to slow process shrinkage. Get one of those 14 nm parts if you want a long-term investment.
If you are willing to spend $1000 every 2 years, then you can certainly stay close to the bleeding edge if you want. We will be getting a lot of improvements in the next few years. We will finally get the jump to a smaller process tech. Do you think that even a Titan X will perform that well compared to a 14 nm GPU with HBM? The 20 nm planar node seems to have been mostly unworkable for large chips. We may see a massive jump with FinFETs at 20 nm, and we may also see quick progression to 14 nm. I am wondering if the Titan X will be left far behind just a year from now.
That’s if they don’t have any leakage issues going to a much smaller process.
I’m in a similar position: GTX 780 and ROG Swift. The Witcher 3 is merciless. With a few settings turned down and some tweaks, I can get around a 45 fps average at 2560×1440 with the core clock on the 780 at 1250 MHz.
I’ll wait and see on new cards. With new process technology finally available next year, I think waiting a while might be worth it.
And you are using an older driver? At least two of the newest drivers have gimped performance on the 600- and 700-series. Nvidia is working on a fix for that.
“Nvidia is working on a fix for that.”
I doubt it. Why would they fix it and take away their loyal fans’ motivation to step up from 700 and buy a 900?
They did it on purpose.
I tried an older driver revision with The Witcher 3 and performance was around the same. I’ll be curious to see what the new driver brings.
Suffice it to say, when I spent $650 AUD on a graphics card only 18 months ago, I expected proper support for several years.
That ROG Swift wasn’t cheap either, Nvidia.
I’ve found that with the latest Witcher patches and a modest OC on my 780, I can turn HairWorks on and set everything to ultra aside from foliage distance at high, and find it quite playable at 1440p on a non-G-Sync Dell 2711U. There is still some chop in cutscenes, but it’s tolerable. I had considered a 980 Ti upgrade myself, but I think I’ll be able to get by until the Pascal-based 16/14 nm parts next year.
“I think This card will be a very good investment for the next 2 years”
…until next year when Pascal launches and the new drivers gimp everything from Maxwell down, like they did to Kepler.
$750 USD or I am buying AMD this time. And all 6 GB had better be “on the same bus”.
“The 6 GB of GDDR5 memory has a 384-bit interface and 7 Gbps speed, so bandwidth should be the same 336 GB/s as the TITAN X”
We thought the 970 had the same bandwidth as the 980, too. In fact, Nvidia probably still lists it as the same, but it obviously is not.
The Witcher 3 is a broken game, especially HairWorks: the 64x tessellation on it breaks cards. If you edit the .ini file and decrease it to 8x or 16x, it runs with twice the fps and looks the same.
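If you wanted to script that tweak rather than edit by hand, a rough sketch follows. Note the key name `HairWorksTessellationLevel` is hypothetical; the actual setting name in the game’s config files may differ.

```python
import re

# Hedged sketch: clamp an .ini-style tessellation setting to a lower cap.
# The key name "HairWorksTessellationLevel" is a placeholder, not the
# game's verified setting name.
def cap_tessellation(ini_text: str, max_level: int = 16) -> str:
    """Rewrite any 'HairWorksTessellationLevel=N' entries above max_level."""
    def clamp(match: re.Match) -> str:
        level = int(match.group(1))
        return f"HairWorksTessellationLevel={min(level, max_level)}"
    return re.sub(r"HairWorksTessellationLevel=(\d+)", clamp, ini_text)

print(cap_tessellation("HairWorksTessellationLevel=64"))
```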
NV pulling the old trick again: lots of tessellation to cripple AMD cards, even at the price of slowing their own.
Not to mention the latest drivers slow down the Keplers compared to the Maxwells. A 960 beating a 780 Ti… yeah, sure.
I’ll tell ya, I wouldn’t be at all surprised if HairWorks is designed such that when it detects a Maxwell card it runs 8x or 16x tessellation, and when it detects anything else, it defaults to 64x tessellation to deliberately trash performance.
Think about it. Edit the .ini to run 8x or 16x tessellation and HairWorks runs brilliantly on Kepler. Use CCC to override application settings and force 8x tessellation and HairWorks runs brilliantly on AMD.
HairWorks is closed and proprietary; no one but Nvidia can see the source code, so no one could possibly know. Until someone finds it.
It’s entirely speculation, but it’s just like the old Intel compiler trick: detect “GenuineIntel” and use the fastest possible instruction set; detect anything else and use the slowest possible instruction set, thereby making everything but “GenuineIntel” run like crap in comparison.
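That compiler trick boils down to a vendor-keyed dispatch. A toy illustration (not anyone’s actual dispatch code; the path names are made up): the vendor string alone decides the code path, regardless of what features the hardware actually supports.

```python
# Toy model of vendor-keyed dispatch: the branch is on the vendor string,
# not on the CPU's actual capabilities.
def pick_path(vendor: str) -> str:
    if vendor == "GenuineIntel":
        return "fastest instruction set"   # optimized SIMD paths
    return "slowest baseline path"          # everyone else, features ignored

print(pick_path("GenuineIntel"))   # fastest instruction set
print(pick_path("AuthenticAMD"))   # slowest baseline path
```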
They did a bad job of covering up the ROP/TMU count. I zoomed way in and matched up the bottom pixels of the shown numbers to other numbers in the image; it looks like it’s 96/204 or 96/284:
http://i.imgur.com/D0Nf174.png
That would put it higher than the Titan X for TMUs. IDK about this.
The only number that looks conclusive is the 2, though. Perhaps the Titan X isn’t quite a fully enabled part, although it could still be fake. I am thinking Nvidia may need extra performance at 4K to try to compete with the new AMD part.
http://videocardz.com/55739/nvidia-geforce-gtx-980-ti-gpu-configuration-confirmed-gaming-performance-leaked
GPU-Z screenshot updated with version 0.8.3, just released by TPU, which properly supports the GTX 980 Ti.