A new leak has sprung from the green team, with a 2080 Ti purportedly showing up in some Final Fantasy XV benchmarks. The cards are in reviewers' hands, so it is possible someone slipped up on their NDA and these results accurately depict performance, though this being the internet it is just as likely someone is trolling. If true, the new card is almost 25% faster than the mighty Titan Xp, at least in a Final Fantasy XV benchmark. Unfortunately, it will also cost more than a Titan Xp when it does finally arrive.
Drop by The Inquirer for a peek.
"At least that's according to results that popped up in a leaked database of Final Fantasy XV benchmarks, hat tip to TechRadar, in which the RTX 2080 Ti racked up a score of 5,897 compared to the 4,756 achieved by the Titan Xp."
Here is some more Tech News from around the web:
- Google Home Max @ The Inquirer
- Game streaming’s latency problems will be over in a few years, CEO says @ Ars Technica
- Amazon reportedly preparing to jump the shark with Alexa-powered microwave @ The Inquirer
- Top-5 notebook brands and top-3 ODMs see increases in August shipments @ Digitimes Research
- iOS 12, thoroughly reviewed @ Ars Technica
- ‘Vaporized’ electrons in graphene boost signals into the terahertz range @ Physics World
- Microsoft pulls plug on IPv6-only Wi-Fi network over borked VPN fears @ The Register
- Litter-Robot III Open Air @ The Tech Report
So The Inquirer points to a TechRadar article, which points to a Wccftech article, which looks at the Final Fantasy XV benchmark database!
And Page hits all around for that one!
And the Wccftech primate house daily feces flinging contest begins again with gaming folk speculating on AMD’s demise!
Really, gamers, AMD will make more money off of its Zen/Epyc sales than any of its GPU sales, even if AMD could actually compete with Nvidia at the flagship GPU level. AMD does not have to care about flagship gaming with the x86 64-bit IP (AMD created the x86 64-bit ISA extensions that even Intel uses) and the x86 32-bit cross-license that AMD has from Intel. And Intel cross-licenses the x86 64-bit ISA from AMD.
Nvidia has to win the GPU race or Nvidia will be the one to go out of business, as AMD has its CPU market revenue stream, which has more potential for revenues/profits than any GPU market revenues alone. AMD can continue to humor the gamers and focus on consoles and mainstream GPUs because AMD has its Epyc/server revenue potential, which makes any consumer GPU market sales a non-issue for AMD's continued prosperity.
If AMD's CEO continues to play the wise business manager game, then AMD should let Nvidia drastically increase discrete GPU MSRPs/ASPs over the next few years, giving AMD sufficient time to create some RTX-type GPU competition of its own and then field a line of competitive mainstream discrete GPUs with Tensor-core and ray-tracing hardware of their own.
AMD should let both Microsoft and Sony fund most of AMD's Tensor core and ray tracing R&D costs while Nvidia continues to drive up the GPU market's average discrete GPU MSRP/ASP figures. That way AMD can undercut Nvidia's already very high average MSRPs/ASPs for Turing and still make a handsome profit from consumer GPU sales.
Nvidia driving up the ASPs on discrete GPUs can also benefit AMD's GPU revenues if AMD plays the business game properly. AMD has what Nvidia cannot match with AMD's x86-based Epyc CPU SKUs, and no discrete GPU can run on its own without some help from a CPU!
I say this to AMD: continue focusing on Epyc and professional Vega (Vega 20 for the professional GPU AI/compute market) and let Nvidia push its GPU ASPs so high that Kareem Abdul-Jabbar could not reach them with a skyhook. Then AMD can come in and undercut Nvidia's sky-high ASPs just a little, just enough for AMD to also profit from those same sky-high GPU MSRPs/ASPs.
Look at AMD's share prices lately: they are going up on Epyc news alone, and AMD is no longer dependent on the consumer market for its continued prosperity! Watch those Epyc CPU and Pro Radeon WX/Instinct GPU revenue streams grow; consumer/gaming sales do not matter as much for AMD's continued survival anymore!
TL;DR
As always.
… and someone comes in with a random rant about AMD which has nothing to do with the article 😀
Wonder how many tech reviewers will ignore the lack of ray tracing games at launch and the performance increase relative to the price increase.
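A minimal sketch of that performance-versus-price comparison, in Python; the scores are the leaked FFXV numbers from the article, while the prices are placeholder values only, not confirmed retail pricing:

```python
# Compare the relative performance gain against the relative price increase.
def gain_vs_cost(old_score, new_score, old_price, new_price):
    perf_gain = (new_score / old_score - 1) * 100    # % faster
    price_gain = (new_price / old_price - 1) * 100   # % more expensive
    return perf_gain, price_gain

# Leaked FFXV scores; prices are hypothetical placeholders for illustration.
perf, price = gain_vs_cost(old_score=4756, new_score=5897,
                           old_price=1200, new_price=1400)
print(f"+{perf:.0f}% performance for +{price:.0f}% price")
```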
I want to see legacy games tested on Turing so readers can see if there are any substantial improvements in Turing's shader cores, TMUs, and ROPs over the Pascal microarchitecture's shader cores, TMUs, and ROPs. It looks like the Turing SKUs have the same numbers of ROPs as the Pascal-generation variants.
So those improved Turing shader cores, cache subsystems, ROPs, and TMUs should also show some performance gains in legacy games, if one goes by the Turing whitepaper! Clock speed improvements will help Turing as well, along with power management, but old legacy games that do not make use of Turing's RT cores and AI/Tensor core IP will be a great way to test Turing's non-RTX/AI hardware too.
Do not discount proper GPU power management features either, as those can help with higher average clock speeds for gaming! So that will also be interesting to see for Turing over Pascal!
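If reviewers do run those legacy titles, one crude way to separate architectural gains from clock-speed and unit-count gains is to normalize the score per shader per MHz. A rough sketch; the shader counts and boost clocks are the published reference specs, but the scores here are made up:

```python
# Normalize a legacy-game score by shader count and boost clock; whatever
# uplift remains hints at per-unit architectural improvements rather than
# extra units or higher clocks.
def per_unit_per_mhz(score, shaders, boost_mhz):
    return score / (shaders * boost_mhz)

pascal = per_unit_per_mhz(score=100, shaders=3584, boost_mhz=1582)  # GTX 1080 Ti (score made up)
turing = per_unit_per_mhz(score=130, shaders=4352, boost_mhz=1545)  # RTX 2080 Ti (score made up)
print(f"Per-shader, per-MHz uplift: {(turing / pascal - 1) * 100:.1f}%")
```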
While it's good to compare it dollar-to-dollar, the chip they made this time around is absolutely huge. NVIDIA's own costs skyrocketed with Turing, not just their prices. It's not like they're just pocketing the extra $500.
But yes, I would definitely like to see reviewers include conventional titles, dollar performance metrics, etc. in their reporting. That's the point of a review — to see all relevant (to the reader) aspects of a product.
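On the cost point, TU102 is commonly reported at roughly 754 mm² versus about 471 mm² for GP102, and a standard dies-per-wafer approximation (ignoring yield entirely) shows how much that alone cuts into supply per wafer:

```python
import math

# Rough dies-per-300mm-wafer estimate; ignores yield, scribe lines, and edge exclusion.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print("GP102 (~471 mm^2):", dies_per_wafer(471))  # ~119 die candidates per wafer
print("TU102 (~754 mm^2):", dies_per_wafer(754))  # ~69 die candidates per wafer
```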
Scott, how long do you think it will take for Blender 3D and other 3D graphics software to begin to take advantage of Turing's in-hardware ray tracing and in-hardware AI/Tensor core IP?
Could you ask Nvidia about Blender 3D and other 3D software support that may be incoming for RTX Turing, and get some timeframe from the Blender Foundation and other 3D graphics software makers if possible?
I know that Adobe and others are already doing AI-related processing, such as edge/content detection, on older-generation Nvidia and AMD GPUs. I'm also seeing more AI-related image filtering being done with AI/TensorFlow libraries running on GPUs or even CPUs.
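For what it's worth, here is a minimal sketch of the sort of GPU-or-CPU image filtering that comment describes, written as TensorFlow 2-style code; it is just a plain Sobel edge-detection convolution on a random image, not any particular vendor's feature, and TensorFlow places it on a GPU when one is available, otherwise the CPU:

```python
import numpy as np
import tensorflow as tf

# Fake 256x256 single-channel "image" in NHWC layout
image = tf.constant(np.random.rand(1, 256, 256, 1), dtype=tf.float32)

# Sobel kernel for horizontal edges, reshaped to conv2d's HWIO filter layout
sobel_x = tf.constant([[-1.0, 0.0, 1.0],
                       [-2.0, 0.0, 2.0],
                       [-1.0, 0.0, 1.0]])
kernel = tf.reshape(sobel_x, [3, 3, 1, 1])

# Runs on the GPU when present, falls back to the CPU otherwise
edges = tf.nn.conv2d(image, kernel, strides=1, padding="SAME")
print("edge map shape:", edges.shape)
```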
Forbes – New RTX 2080 Benchmarks: ‘Final Fantasy XV’ Results Reveal Pricing Problem
https://www.forbes.com/sites/jasonevangelho/2018/09/17/new-rtx-2080-benchmarks-final-fantasy-xv-results-reveal-pricing-problem/#506d31651240
My 1080 Ti at 2139/5800 does 5717 at standard and 4788 at high; the 2080 just looks like a 1080 Ti with ray tracing, and the Ti is just too expensive to be called a gaming card.
I see the 2080 Ti is said to be a 1080 Ti replacement when it's clearly a consumer version of the Titan with RT and Tensor cores. The 2080 should be compared to the 1080 Ti, and when you do you see much less improvement, showing what a waste of money these cards are. By the time we have a decent number of games supporting ray tracing we'll be on next-gen cards anyway. AMD is in the box seat if they price their 7nm cards right.