The wait for in-stock NVIDIA graphics cards without inflated price tags seems to be over. Yes, after months of crypto-fueled disappointment for gamers, the much-anticipated, long-awaited return of graphics cards at (gasp) MSRP is at hand. NVIDIA has now listed most of its GTX lineup as in stock (with a limit of two per customer) at standard MSRPs, the only exception being the GTX 1080 Ti (still out of stock). The lead time from NVIDIA is one week, but worth it for those interested in the lower prices and 'Founders Edition' coolers.
Many other GTX 10 Series options can be found online at near-MSRP pricing, though as before many of the aftermarket designs command a premium, with factory overclocks and proprietary cooler designs to help justify the added cost. Even Amazon – previously home to some of the most outrageous price gouging from third-party sellers in months past – has cards at list pricing, which seems to solidify a return to GPU normalcy.
The GTX 1080 inches closer to standard pricing once again on Amazon
Some of the current offers include:
MSI Gaming GeForce GTX 1080 ARMOR 8G – $549.99 @ Amazon.com
EVGA GeForce GTX 1070 SC GAMING ACX 3.0 – $469.99 @ Amazon.com
EVGA GeForce GTX 1060 SC GAMING 6GB – $299.99 @ Newegg.com
GTX 1070 cards continue to have the highest premium outside of NVIDIA's store, with the lowest current pricing on Newegg or Amazon at $469.99. Still, the overall return to near-MSRP pricing around the web is good news for gamers who have been forced to play second (or third) fiddle to cryptomining "entrepreneurs" for several months now; a disturbing era in which pre-built gaming systems from Alienware and others actually presented a better value than DIY builds.
That’s great news!
Crypto miners will still want AMD’s GPUs for a while longer, as they offer more compute/shaders than Nvidia’s Pascal SKUs. But at least gamers will be happy that Nvidia’s Pascal GPUs are back in the MSRP range now.
Vega 56 has the exact same number of shader cores as the GTX 1080 Ti, with the Vega 64 way ahead at 4096 shader cores for the coin hashing. Vega 64 is still selling in the $700 to $750 range even with the current price drops, and Vega 56 is down in the $450 to $650 range, so that’s not so good for gamers but still allows AMD to sell all the Vega SKUs that they are producing.
Nvidia dropping to MSRP is great for gamers, and AMD is not much affected, with AMD’s gaming GPU market share being as small as it currently is. Nvidia sure cannot be very motivated to bring on its next generation with its current Pascal offerings able to be sold at MSRP, so Nvidia will be in no hurry to move on from Pascal anytime soon.
Core count between AMD and Nvidia is irrelevant, as they are not actually cores. AMD has long had a higher advertised “core” count, but it’s not translated into better performance.
The way the AMD architecture processes the information is why it’s better at mining, something that doesn’t translate to gaming. But with HBM2 being 4 times more expensive than it was when AMD first started working on Vega, the Vega pricing and stock is unlikely to ever recover. HBM2 modules were quoted to AMD in the $30 range when they started on Vega, but currently suppliers are charging $120-$140.
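Taking the quoted figures above at face value (they are unconfirmed supplier quotes, so treat the numbers as illustrative only), the bill-of-materials swing for Vega’s two HBM2 stacks is easy to work out:

```python
# Hypothetical BOM delta for Vega's 8GB of HBM2 (two stacks),
# using the per-module prices quoted in the comment above.
stacks = 2
quoted_then = 30         # USD per module when Vega was designed (claimed)
quoted_now = (120, 140)  # USD per module currently (claimed range)

then_cost = stacks * quoted_then
now_cost = tuple(stacks * p for p in quoted_now)
print(then_cost, now_cost)  # 60 (240, 280)
```

In other words, if those quotes are accurate, the memory alone would have gone from roughly $60 to somewhere between $240 and $280 per card.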
As for Nvidia, they have already moved on from Pascal. They are simply waiting for GDDR6, the speeds for which are not 100% finalized between manufacturers. All indications are we will see the GTX 1180 by the end of the year. https://www.techpowerup.com/gpudb/3224/geforce-gtx-1180
A GPU shader core is a core as much as a CPU core is a core, and shader cores come with integer units, FP units and such. So AMD’s Vega SKUs have many more shader cores, which are where the hashing calculations are done.
Really, not actually cores? Where did you get your degree, from the back of a matchbook?
And that TechPowerUp GPU database, on its GTX 1180 entry, says: “This graphics card is not released yet. Data on this page may change in the future.”
Really, a GTX 1180 with only 64 ROPs is not going to get above the pixel fill rate of the GTX 1080 (64 ROPs), and it’s sure as hell not going to beat the GTX 1080 Ti with its 88 ROPs and the highest pixel fill rates of all consumer/gaming cards.
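The ROP argument above can be sanity-checked with back-of-envelope arithmetic: peak pixel fill rate is roughly ROP count times clock speed, one pixel per ROP per cycle. A minimal sketch, using the reference/Founders Edition boost clocks for these cards (real boards clock differently, so treat the clocks as approximate):

```python
def peak_fill_rate(rops, boost_ghz):
    """Peak pixel fill rate in GPixels/s: one pixel per ROP per clock."""
    return rops * boost_ghz

# Reference boost clocks (GHz) -- approximate, per published spec sheets.
print(peak_fill_rate(64, 1.733))  # GTX 1080:    ~110.9 GPixels/s
print(peak_fill_rate(88, 1.582))  # GTX 1080 Ti: ~139.2 GPixels/s
print(peak_fill_rate(64, 1.546))  # Vega 64:     ~99.0 GPixels/s
```

Note that a 64-ROP part can still close some of that gap by clocking higher, which is why ROP count alone doesn’t settle the comparison.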
Looking at that TechPowerUp entry on the supposed GTX 1180, it looks like it has the same shader core/TMU counts as the GTX 1080 Ti but only 64 ROPs! So that’s pure speculation on TechPowerUp’s part. The Vega 56 has the exact same number of shader cores and TMUs as the GTX 1080 Ti, and if AMD wants to beef up its competition to the GTX 1080 Ti before Navi gets here, all AMD has to do is tape out a new Vega micro-arch based base die with 88 or more available ROPs.
The GTX 1080 Ti is based on Nvidia’s GP102 base die tapeout, and that comes with 96 total ROPs, so Nvidia can still create a stronger GP102-based gaming variant. GP102 was never intended for gaming, as GP102 is used mostly for Nvidia’s Quadro line of SKUs, but once the Vega 56/64 design became known to Nvidia, Nvidia had to use its GP102 base die for its consumer/gaming oriented GTX 1080 Ti, with 88 of GP102’s 96 ROPs enabled.
Nvidia beats AMD in gaming with raw billions of dollars more in GPU investments that have Nvidia creating 5 base die tapeouts each new generation, with, for example, Pascal coming in the GP100, GP102, GP104, GP106, and GP108 base die tapeouts, each with different complements of shaders/TMUs/ROPs for different market segments. AMD at the time of Vega’s release could only afford one base die tapeout, Vega 10, which had to do double duty as a professional compute/AI market base die and a flagship gaming base die for Vega 56/64 consumer gaming usage.
If you look at the markups on AMD’s Radeon Pro WX and Radeon Instinct MI25 professional SKUs based on that Vega 10 base die tapeout, you will see where Raja did what AMD’s management told him to do and created a compute/AI focused base die tapeout/design that could also be used for gaming. But AMD’s CEO knows where the real money is, and that’s never with any consumer gaming-only GPU variants, as the professional market has the better markups and revenue potential for AMD.
Nvidia’s discrete gaming GPU market share leaves AMD with not much discrete consumer market share at the moment, so AMD can’t really justify the cost of creating a base die only for the gaming-focused GPU market! And the professional compute/AI market is where AMD is focusing: just look at that Vega 20 based SKU at 7nm, which is specifically for AMD’s Radeon Instinct/WX professional branding at release, not gaming!
Gamers do not appear to realise it, but it’s mostly ROPs and that pixel fill rate that’s giving the FPS rates, and not so much a GPU’s compute at the moment. And Nvidia has base die tapeouts that provide up to 96 ROPs to fling out the FPS metrics, and all AMD needs to do is beef up the ROP counts on a new Vega base die tapeout to compete with that. But AMD is really only interested in mainstream discrete gaming, and AMD will be getting more GPU/graphics market share from its APUs and that semi-custom Vega die on that Intel EMIB/MCM module. So AMD does not care to invest in funding some flagship gaming-only GPU unless that can be done with a binned professional variant, the same as Nvidia did with the GTX 1080 Ti and its being based on the professional GP102 base die tapeout.
So if gamers want a gaming flagship GPU from AMD, then they had better hope that there will be some Vega 20 base die bins that do not make the grade to become Radeon Pro WX 9100s/Radeon Instinct MI25s. Because for AMD the consumer market does not produce enough markup for AMD to even care about winning any flagship GPU pissing contest. The real money is in the professional compute/AI markets, and the best consumer market segment, revenue-wise, is the mainstream market for both AMD and Nvidia.
If it can’t be said and read in one or two sentences, it’s not worth reading or saying. Your comment is longer than the damn article. If you’ve got so much to say, why not set up your own blog or something so you don’t keep turning the comments section into versions of War and Peace.
There are plenty of words in those Walls-O-Text specifically there to piss you off, and you grab at that bait every time.
It’s all about informing and setting your trigger switch off at the same time! And it’s oh so sweet to watch that single gray cell in that vast sea of lipids as it begins to twitch at such a rate as to set off a gigaton-level release of pure energy as that cell’s motion starts a chain reaction. WhyMe’s head has gone all Scanners to the lipid atoms’ mc² power and has taken out about a thousand square miles.
The thing is, it seems to be pissing you off more than anything, what with the vast bilious clouds of vapid hatred spewing forth from your keyboard.
I mean, seriously, how many times from how many different people before you get the picture? People are glazing over when they see your vast walls of text; you’re wasting your time, as no one is reading what you’re saying.
And just to add, as I was bored and actually bothered reading the first few lines of your diatribe: jmaster299 got his information from a discussion with David Kanter, who has probably forgotten more than you’ll ever know. His degrees (Bachelor of Science) are in Mathematics with a specialization in Computer Science, and Economics, from the University of Chicago.
So where did you get yours, other than copying and pasting from articles you find on the internet?
And here is that discussion: GN interviewed David Kanter and made a video titled “Why CUDA ‘Cores’ Aren’t Actually Cores, ft. David Kanter”.
To the guy who keeps posting these TL;DR great-wall-o-texts, you are wrong.
No one cares about CUDA cores, as the miners love those extra shader cores on AMD’s Polaris and Vega GPUs. Just look at AMD’s TFLOPS metrics and one can easily tell why the miners bought up all the Polaris and Vega cards that they could, and only went over to Nvidia’s low shader core count GPUs because there were no AMD GPUs remaining in the supply chains in any large numbers.
David Kanter can say whatever he wants, but all GPUs make use of floating point units, integer units, and other units, plus some form of scheduler that manages groups of GPU cores in lock step to do the math. Just go over to the TechPowerUp GPU database and look at those core counts for AMD compared to Nvidia; the miners sure as hell do not care about TMUs/ROPs or FPS metrics, miners only care about hashes/sec sorts of metrics.
You can split hairs with Kanter’s statements all day long and call Nvidia’s cores whatever you like, but at the end of the day AMD’s consumer GPUs have more FP/FLOPS and other compute metrics, which is why the miners love AMD’s GPUs for mining.
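The TFLOPS gap being pointed to here can be estimated with the usual peak-throughput formula: FP32 TFLOPS ≈ 2 × shader count × clock (one fused multiply-add, i.e. two ops, per shader ALU per cycle). A rough sketch using reference boost clocks (approximate, from published spec sheets):

```python
def peak_tflops_fp32(shaders, boost_ghz):
    """Peak FP32 throughput in TFLOPS: 2 ops (one FMA) per shader per clock."""
    return 2 * shaders * boost_ghz / 1000

print(peak_tflops_fp32(4096, 1.546))  # Vega 64:  ~12.7 TFLOPS
print(peak_tflops_fp32(2560, 1.733))  # GTX 1080: ~8.9 TFLOPS
```

These are theoretical peaks, not sustained hash rates, but they illustrate why compute-bound mining workloads favored AMD’s wider shader arrays at similar prices.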
Whether you care about CUDA cores or AMD’s equivalent, you are still wrong about what they are, meaning they are not actual cores, just parts of one.
But go ahead and post another wall-o-diarrhea discounting someone who actually knows better than you.
Shader cores are shader cores and CPU cores are CPU cores, but they all make use of floating point units and integer units, and you are a damn fool for getting so offended by any amount of text. Shader cores are where the FPUs, Int units, and ALUs are on GPUs! So you cannot get over your CUDA core obsession, and CUDA is only a marketing term that Nvidia uses for its brand of GPU shader cores.
You keep on with your little naming argument, but I’ll not be influenced by any single online journalist’s definitions when there are plenty of whitepapers authored by PhDs that carry more weight than some online sources.
That “technology” journalist’s definitions mean very little compared to the ACM and the academic journals, and I see that the “technology” journalist is no longer employed at that Linley Group publication; MR was a much better trade journal before it was acquired by the Linley Group!
I remember paying $240 for my 8GB RX 480 in 2016. So now, in 2018, I’d have to pay about $300 to get similar performance?
You would have to be insane to pay anywhere near MSRP for a 2-year-old card that should have been discontinued 3 to 4 months ago.
Why in the hell should it have been discontinued 3-4 months ago when its replacement is waiting on GDDR6?
Why do gamers need a new GPU micro-arch each and every year when the GPU makers can now sell to miners as well as gamers? And don’t forget that Nvidia was the one that spearheaded the GPU for the professional compute/AI market, which gives both Nvidia and AMD so much better markups and margins than any gaming market ever could!
Let’s be realistic: Nvidia could continue with its Pascal micro-arch and a new GP102-based SKU with, say, GP102’s full complement of 96 ROPs, and that’s more FPS right there.
The only reason that Nvidia needs to move on to the next GPU micro-arch is efficiency and process node shrinks with its Volta micro-arch for the professional markets. And the professional market demands those efficiency gains on a yearly basis and will pay the proper markups to get them. Gamers would be satisfied with higher FPS, and Nvidia can still give them that using GP102’s remaining ROPs.
Nvidia is still selling GP102-based Quadro products, so Nvidia currently does not want to enable all of GP102’s 96 ROPs on non-professional SKUs, with the GP102-based GTX 1080 Ti only using 88 of GP102’s 96 available ROPs. But as soon as the GV102-based Quadros begin to replace the GP102-based Quadro SKUs, then Nvidia can afford to make an even more powerful Pascal gaming SKU.
Nvidia’s GV104 (or other) based GTX 1180 had better have more than 64 available ROPs, or it will not be able to best the GP104-based GTX 1080, which also has only 64 ROPs, the same as Vega 64/56 with their respective 64 ROPs.
Gamers do not really need new GPU micro-archs as much as they need new base die tapeouts with more ROPs for higher pixel fill rates, more TMUs for higher texel fill rates, and enough shader cores to assist the latest graphics APIs in doing their jobs.
With that new Microsoft DXR API for ray tracing, which is going to want as many shader cores as possible to accelerate ray tracing interaction calculations on the GPU, Nvidia will be forced to begin increasing the complement of shader cores on its gaming-oriented GPU offerings.
AMD’s Vega will be ahead there with DXR acceleration, and Vulkan will be getting its own ray tracing additions to that API as well. So really, most gamers are getting the latest GPU micro-archs from Nvidia as a result of what the professional market demands, not what the gaming market demands. Ditto for AMD, as the professional market is what’s going to be paying all the R&D bills that the consumer gaming market cannot/will not pay in the form of proper GPU price markups.
Gamers are not the ones with the deep pockets that both AMD and Nvidia pay the real attention to, and money talks! Volta was designed for the professional markets first and foremost, just like Vega really was also for professional usage. And gaming is where the less performant Vega dies go from AMD, with Nvidia able to afford those 5 different GPU base die tapeouts, with Pascal having GP100 (Tesla top end), GP102 (professional, and gaming as an afterthought), GP104 (consumer gaming), and GP106 and GP108 (low-end consumer gaming).
Nvidia still makes more than half of its revenues from gamers, but JHH is working on fixing that problem and getting on a more profitable, non-gaming-only focused track. AMD does not have a large enough share of the discrete gaming GPU market to justify the added expense of a gaming-only flagship SKU, and the real money in consumer/gaming for both Nvidia and AMD is in the mainstream GPU market.
Gamers need not be concerned with the GPU’s generation as much as with the generation’s ability to game at higher resolution. And both Nvidia and AMD could engineer new Pascal and Vega micro-arch based die tapeouts with more ROPs and TMUs and enough shader cores without having to design a new GPU micro-arch at all. Gamers are only getting the new GPU micro-archs because of the efficiency gain needs of the professional markets that have to be met, and those professional markets are paying the bills, not the consumer/gaming markets, as most gamers who are deluding themselves think.
Get over yourselves, gamers: do you think that Nvidia and AMD can afford to stay in business tending to your low markup/margin consumer gaming needs where there is so little profit to be had?
The professional compute/AI market’s money talks and gamers can walk, and the only thing really driving the need for new GPU micro-archs is the efficiency gains that the professional markets need on a yearly basis. Flagship gaming is not even as profitable as mainstream gaming in both Nvidia’s and AMD’s minds, and both those companies are surely more focused on the pro markets with the proper markups than on any gamers who cannot afford to pay much above break-even prices.
Valid question for PCPer: have you guys noticed any increase in the amount of storage used by the extremely long and numerous text walls? If so, is it costing you more?
You are so Very Very Much TRIGGERED, so Very TRIGGERED, by those Walls-O-Text about ROPs and TMUs and shader core counts. Oh, the total Butthurt way that you complain! Oh, you so very much cannot understand the world outside the confines of that dark and very dank-smelling basement world in which you live. Ha ha ha Haw! Haw haw haw HAW!