For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at Epic Games' keynote with Tim Sweeney to hijack it.
The result: the first public showing of the GeForce TITAN X, based on the upcoming Maxwell GM200 GPU.
JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.
Any guesses on performance or price?
Jen-Hsun signs the world's first TITAN X for Tim Sweeney.
Kite Demo running on TITAN X
UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the GPU will only require 6-pin + 8-pin power connections, indicating that NVIDIA is still pushing power efficiency with GM200.
Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridges to support it.
Any guesses on performance or price?
Performance: 50% more than a GTX 980 if it has the rumored 3072 CUDA core count
Price: $1499
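That 50% guess follows directly from the rumored core count; here's a minimal back-of-the-envelope sketch, assuming the GTX 980's 2048 CUDA cores and naive linear scaling with core count (clocks and memory bandwidth ignored).

```python
# Back-of-the-envelope scaling guess (assumption: performance scales linearly
# with CUDA core count; clock speed and memory bandwidth are ignored).
GTX_980_CORES = 2048   # shipping GTX 980
TITAN_X_CORES = 3072   # rumored GM200 configuration

speedup = TITAN_X_CORES / GTX_980_CORES
print(f"Naive core-count scaling: {speedup:.2f}x (~{(speedup - 1) * 100:.0f}% faster)")
# -> Naive core-count scaling: 1.50x (~50% faster)
```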
Price is rumored to be $1,349 if the leak from Jan is to be believed (they did get the 12GB right)
So, technically it only has 11.5GB of VRAM, am I right?
You can say that again!!
And be wrong both times.
LOL
HA
NO
i’ll take it no tax lol
Another niche product. Wonder when they will reveal a 980 Ti.
AMD swoops in with a 390×2 and makes this card irrelevant.
Yeah, let's have an argument comparing one set of unknown performance metrics to another.
Not when playing a UE4 game… Unreal Engine 4 has no multi-GPU support.
Still uses software frame metering so it can't run TF2 or any DX9 games lol
You mean single R390X? Right? Because it will be enough
I'm sorry, but that UE4 kite demo is kind of a joke for running on that graphics card.
If you go on YouTube and look up the game "Black Desert Online" you'll notice that the graphics in that MMO are just as good as the demo video, if not a lot better. And BDO doesn't even use the UE or Crytek game engines.
I looked up that game you mentioned and you’re an idiot.
I for real lol’d
lmao yep he an idiot
you really comparing that horrible looking thing to the UE4 kite demo?
So in other words, nobody buy this card 'til the 390X is released to give that damn card a serious price cut. It looks like it's going to be another Titan Z incident again.
Titan Z was silly, but not the single-GPU version of Titan. Gamers will find Titan stupid, but for a small-scale professional that needs DP, Titan is actually a STEAL. Imagine you can get something that performs as fast as the fastest Tesla but costs 3-4 times less. And the original Titan never dropped in price despite being slower than the 290X; only the GTX 700 series did.
It still is cheaper than a typical workstation card.
Hey everyone,
Some of you are disappointed that we did not clearly describe the segmented memory of Titan X when we launched it. I can see why, so let me address it.
We invented a new memory architecture in Maxwell. This new capability was created so that reduced configurations of Maxwell can have a larger framebuffer, i.e., so that Titan X is not limited to 11GB and can have an additional 1GB.
Titan X is a 12GB card. However, the upper 512MB of the additional 1GB is segmented and has reduced bandwidth.
This is a good design because we were able to add an additional 1GB for Titan X, and our software engineers can keep less frequently used data in the 512MB segment.
Unfortunately, we failed to communicate this internally to our marketing team, and externally to reviewers at launch.
Since then, Jonah Alben, our senior vice president of hardware engineering, provided a technical description of the design, which was captured well by several editors. Here’s one example from The Tech Report.
Instead of being excited that we invented a way to increase memory of the Titan X from 11GB to 12GB, some were disappointed that we did not better describe the segmented nature of the architecture for that last 1GB of memory.
This is understandable. But let me be clear: our only intention was to create the best GPU for you. We wanted Titan X to have 12GB of memory, as games are using more memory than ever.
The 12GB of memory on Titan X is used and useful to achieve the performance you are enjoying. And as ever, our engineers will continue to enhance game performance that you can regularly download using GeForce Experience.
This new feature of Maxwell should have been clearly detailed from the beginning.
We will not let this happen again. We’ll do a better job next time.
Jen-Hsun
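As a side note, the segmented-framebuffer scheme the letter above describes can be pictured with a toy allocator. This is a hypothetical sketch only (the class and numbers below are illustrative, not NVIDIA driver code): hot data goes to the full-speed region, less frequently used data spills into the slower 512MB segment.

```python
# Toy illustration of a segmented framebuffer: a fast 11.5GB region plus a
# slower 512MB segment. Hypothetical allocator, not actual driver behavior.
FAST_REGION_MB = 11_776   # 11.5GB at full bandwidth
SLOW_SEGMENT_MB = 512     # upper segment with reduced bandwidth

class SegmentedFramebuffer:
    def __init__(self):
        self.fast_free = FAST_REGION_MB
        self.slow_free = SLOW_SEGMENT_MB

    def allocate(self, size_mb, frequently_used=True):
        """Place hot data in the fast region; push cold data to the slow segment."""
        if frequently_used and self.fast_free >= size_mb:
            self.fast_free -= size_mb
            return "fast"
        if self.slow_free >= size_mb:
            self.slow_free -= size_mb
            return "slow"
        if self.fast_free >= size_mb:   # slow segment full: fall back to fast
            self.fast_free -= size_mb
            return "fast"
        raise MemoryError("out of VRAM")

fb = SegmentedFramebuffer()
print(fb.allocate(4096, frequently_used=True))    # -> fast
print(fb.allocate(256, frequently_used=False))    # -> slow
```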
While funny, this is just a copy-paste-edit of the Jen-Hsun post at nvidia blogs: http://blogs.nvidia.com/blog/2015/02/24/gtx-970/
Sorry to spoil the joke
Now, now. Don't go spoiling their version of reality with facts, it only upsets them.
Titan Z – Titan Y – Titan X – Titan W – Titan V ??? backward till Titan A 😛
Look at Jen's face, unhappy?
Finally some real news about progress in the GPU space. I want more details!
My guesses are 35% faster than the 980 at 1080/1440, 50% faster than 980 at 4K, $1250.
You forgot about being 10% slower than the 295X2 at 4K… 🙂
Probably draws half the power to do it.
edit: Ryan tweeted, 8+6-pin power, so the card is around 200-225 watts, maybe a little more. You are comparing a 2-GPU card vs. 1.
How do you get 225 watts?
PCI-E = 75 watts
6-pin = 75 watts
8-pin = 150 watts
Total = 300 watts
If it was 225 watts it'd be 2×6-pin. It needs an 8-pin because, just like all Maxwells, once they run a GPU-intensive application they suck just as much power as their Kepler counterparts.
Canned gaming benchmarks are where Maxwell power consumption shines.
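The connector math above checks out; here's a quick sketch of the same PCI Express power budget arithmetic (75 W slot, 75 W 6-pin, and 150 W 8-pin are the per-source spec limits).

```python
# PCI Express power budget for a 6+8-pin card (spec limit per power source).
power_sources_w = {
    "PCIe slot": 75,
    "6-pin connector": 75,
    "8-pin connector": 150,
}

total_w = sum(power_sources_w.values())
for name, watts in power_sources_w.items():
    print(f"{name:>15}: {watts:>3} W")
print(f"{'Total':>15}: {total_w:>3} W")   # -> 300 W
```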
Software frame metering on a garbage dual gpu card lol
Lol, he's signing it like he's some sort of celebrity. NVIDIA makes some great cards with great engineers, but I really dislike Jen.
The guy is insulting his customers in his blog while presenting his new rip-off.
Cute.
You say that, but according to market share, you are in a pretty small minority that thinks that.
Says the one with the FUD/Terfer green merit badge and shiny gold doubloons! That's the problem when the market share is so out of whack, but market share can change, and in the mobile market the share is more evenly balanced. For sure there needs to be a third discrete GPU market player, and a few mobile-only players' SKUs are getting up there in the SP/other counts, enough to maybe make a jump at some mobile discrete GPU offerings. Those SOC SKUs may just take over the low-end discrete graphics market needs entirely by themselves, but for sure there can be scaled-up discrete graphics products made by players formerly in the SOC-only graphics IP market; 2 of them come to mind already.
So, are they going to try and market this card as a gaming card at the beginning like they did last time? Because this "surprise" appearance sure makes it look that way. Wonder how long these cards won't sell before they say it's a semi-professional/pro-user card like they did with the first Titan. I think this "surprise" is just to test the waters and see how people react before releasing more marketing on this card, instead of calling it a gaming card right out of the gate.
Nvidia punishes everybody with high prices cause nobody buys the Tegra stuff 🙁
The hair, not very Tressy, and still, how long was the render time (per frame) of this short vignette? How about a few close-up, not-so-dynamic shots of a plant, or thick foliage, with lots of shadows cast through layers of gently swaying leaves. Was this a single GPU, or a whole rack of GPUs on a grid server? And again, into Jen-Hsun Huang's backstage dressing room with the star on the door, the quick change back into the polyester and spiked shoes, and back in the stretch limo, and off to the links with Money Bags, Bottom Jaw the Third, and the other country clubbers. It's pay through the nose, or kidney sales time, for any of JHH's metal! Marketing and big production values, but that math may not add up. Well, time to second mortgage the double-wide again.
This demo was running at 30 FPS locked. Shadows were dynamic; you could see the ones being cast by the trees moving while the trees blew in the wind. I would assume this was done on one workstation from how Epic presented it, but no guess if it was single or multi-GPU. There were a few close-up shots; I could tell they were using their distance field ambient occlusion because there were some artifacts. I'm curious if they were using their distance field shadows and GI as well.
All this hate nowadays for new more powerful graphics cards. Fucking shit generation of kids and hipsters….
Don't you go blaming the hipsters, they are all using Apple or other thin-and-light overpriced SKUs with Intel "Ultrabook" crappy underpowered specifications. It's more the hate for that phony, leather-jacketed JHH and his shyster tactics! Nvidia is not doing so well in the phone SKU department, or in the tablet market, and is hamstringing the tablet market with more Android closed-app-ecosystem junk instead of some full Linux distro based tablet computers. For sure some Steam OS based SKUs have been announced, and hopefully some Steam OS based tablets are in the planning stage! More powerful ways of milking more money and such from customers have not added up for the green brand, and a lot of that hate is coming from Nvidia's own customer base! What's that math again, 3.5 = 4! The older folks are not falling for JHH's huckster song and dance/dog and pony show, but everyone is a little tired of the dishonest crap.
The hate is more along the lines of: Nvidia is becoming the Intel of GPUs (look at the past 4 years of x86 *snore*). Lack of competition is never a good thing. Nvidia is upcharging each product line even more than was normal when they had competition, etc, etc.
None of that is a good thing. Some of it is fanboyism, but when you create a card that's meant to sell at $500 with high profit margins and then add another $150+ because you can, that's not a good thing for the end-user.
People should watch the last PCPer podcast, where Ryan went over how drastically prices have changed in the past 5 years without abnormal performance increases per generation to warrant the change. Seriously, watch it. The proof is in the pudding.
It's because their dollars are worthless and they want things that are less powerful and worse than what they had before.
It has to be:
Newly packaged
Poor performance compared to what came before it
Made in China
Cheap because their money is worthless
*Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridges to support it.*
And what you don’t expect is Unreal Engine 4 not supporting SLI
https://answers.unrealengine.com/questions/21746/does-the-ue4-engine-support-sli.html
He only said it's not SLI friendly, not outright saying that UE4 doesn't support SLI at all. I think this is also the case with CrossFire, since both SLI and CrossFire rely on AFR. Also, Daylight, which is a UE4 based game, works on my 660 SLI.
Daylight is not on UE4, it's using Chrome Engine.
edit: sorry, just woke up. I was thinking of Dying Light 😛
BAH! to their focus on power efficiency.