News from NPD Research today shows a sharp decline in discrete graphics shipments from all major vendors. Not great news for the PC industry, but not all that surprising, either.
These numbers don’t indicate a lack of discrete GPU interest in the PC enthusiast community, of course, but they certainly show how the mainstream market has changed. OEM laptop and (more recently) desktop makers predominantly use processor graphics from Intel CPUs and AMD APUs, though the decrease of over 7% for Intel GPUs suggests a decline in PC shipments overall.
Here are the highlights, quoted directly from NPD Research:
- AMD's overall unit shipments decreased 25.82% quarter-to-quarter, Intel's total shipments decreased 7.39% from last quarter, and Nvidia's decreased 16.19%.
- The attach rate of GPUs (including integrated and discrete GPUs) to PCs for the quarter was 137%, down 10.82% from last quarter, and 26.43% of PCs had discrete GPUs, down 4.15%.
- The overall PC market decreased 4.05% quarter-to-quarter, and decreased 10.40% year-to-year.
- Desktop graphics add-in boards (AIBs) that use discrete GPUs decreased 16.81% from last quarter.
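The attach rate can exceed 100% because a PC that ships with both integrated and discrete graphics counts twice. A minimal sketch of how these percentages are derived, using made-up shipment totals (NPD's raw unit counts aren't given in the excerpt):

```python
# Illustrative only: hypothetical unit counts, not NPD's actual data.
pcs_shipped = 100_000_000      # total PCs shipped in the quarter
gpus_shipped = 137_000_000     # integrated + discrete GPUs (a PC can have both)
pcs_with_discrete = 26_430_000 # PCs that include a discrete GPU

attach_rate = gpus_shipped / pcs_shipped * 100       # can exceed 100%
discrete_share = pcs_with_discrete / pcs_shipped * 100

print(f"GPU attach rate: {attach_rate:.0f}%")        # 137%
print(f"Discrete GPU share: {discrete_share:.2f}%")  # 26.43%
```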
An overall decrease of 10.4% year-to-year indicates what I'll call the continuing evolution of the PC (rather than a decline, per se), and shows how many people have come to depend on smartphones for the basic computing tasks (email, web browsing) that once required a PC. Tablets didn’t replace the PC in the way that was predicted only 5 years ago, and it’s almost become essential to pair a PC with a smartphone for a complete personal computing experience (sorry, tablets – we just don’t NEED you as much).
I would guess anyone reading this on a PC enthusiast site is not only using a PC, but probably one with discrete graphics, too. Or maybe you exclusively view our site on a tablet or smartphone? I for one won’t stop buying PC components until they just aren’t available anymore, and that dark day is probably still many years off.
Lot of things happening here.
1) GPU prices are rising.
2) Integrated graphics are getting better.
3) CPU performance increases are grinding to a halt.
4) Everyone who needs one has one, and it probably already performs better than “good enough.”
5) Die shrinks. We’ve been on 28nm for how long?
Honestly, though, it’s the infinite-growth paradigm that’s the problem here. If you can’t grow year over year, the impression is that you’re dying, and god forbid you contract a bit.
All that being said, the population explosion combined with the mechanization of jobs means this system is going to have to go or we will eventually destroy ourselves fighting over the bread crumbs.
Grow or die. Or. Grow until you die.
Only one thing grows forever and that’s cancer.
This guy gets it.
Why cant more people get it?
The current generation doesn’t really offer better performance/dollar than the previous generation, so no wonder people don’t upgrade as fast. Maybe the next generation’s die shrink will bring better performance at an affordable price point.
Notice how it is tracking with 2009. My bet: we have fewer people with the discretionary capital to run out and buy a new card.
For me it comes down to 4k gaming. I’m running a 7970 right now and doing just fine with it even on new games.
But I game at 1080p. I want to go 4K, but to get a decent experience you almost need two high-end cards. So now I’m looking at over $1000 in cards, AND I still have to buy the 4K monitor, and good ones are over $500.
So that’s why I haven’t been buying new stuff.
I also thought about buying a 4K monitor for non-gaming use and turning my games down to 2560 x 1440. I’m not even sure that will work correctly, though.
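For what it's worth, whether a lower resolution looks clean on a 4K panel comes down to simple arithmetic: 1080p divides evenly into 4K, while 1440p does not, so the latter gets interpolated. A quick sketch (plain math, no vendor-specific scaling assumed):

```python
# Per-axis scale factor when displaying a lower resolution
# full-screen on a 3840-wide (4K UHD) panel.
native_w = 3840

for w, h in [(2560, 1440), (1920, 1080)]:
    scale = native_w / w
    kind = "integer (clean)" if scale.is_integer() else "non-integer (interpolated)"
    print(f"{w}x{h} on 4K: {scale:.2f}x per axis, {kind}")
```

1080p maps each source pixel to an exact 2x2 block; 1440p lands at 1.5x, which is why it can look softer than either native resolution.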
I’m not surprised by AMD’s numbers, too many rebrands and not enough of the new stuff to go around.
+1 on integrated graphics getting better. I was shocked by what a newer i3 was able to run (low-end games, at least), let alone an AMD APU.
I really wish they would include a listing of just discrete graphics cards. Or does the integrated count only include people who use just the on-chip GPU? Otherwise the data gets skewed by people who buy a new Intel chip (nearly all of which have a GPU now) but still use a discrete graphics card.
I game @ 1080p with a GTX 660 Ti. It plays all my games on ultra just fine. What is the benefit for me in buying a new card?
Well, even if you are just looking for a good-performing laptop, it’s hard! With all the Ultrabook/Apple-style design ethos going around from Intel’s copycat initiative, that right there is why many people are forgoing new laptops.
Whatever happened to the laptop as a desktop replacement, and the regular-form-factor laptop with its beefier cooling solution? How many have complained that the discrete mobile GPU in these newer laptop form factors was being thermally throttled, and how many of the laptops currently for sale are made with thin, thermally constrained cases left over from too much emphasis on the Ultrabook/Apple (style over performance) SKUs?
The OEM/retail and OEM parts supply chains are full of these Ultrabook parts, and the OEMs are getting them at lower cost and trying to pawn them off on consumers. Look at the current Carrizo offerings: loads of 15-watt-constrained thin-and-lights, and no full 35-watt Carrizo parts, or parts capable of 35 watts that are restricted to a thin-and-light case with a weaker cooling solution, thermally constrained to a 15-watt laptop SKU.
I’m looking for a laptop combining a Carrizo FX-8800P APU run at its full 35 watts with a discrete AMD mobile GPU running at whatever maximum thermal wattage avoids throttling. Not much luck so far, and more lost sales.
Add to this Windows 10 and its EULA, which is unappealing on more than just the privacy front, and that makes for more lost sales. I hope the laptop OEMs are not foolish enough to omit a Windows Secure Boot off switch in the UEFI/BIOS on their new laptops that ship with Windows 10 factory installed!! There go even more sales from people who buy the device to re-image it with the OS of their choice, so that Secure Boot off switch had better be in the UEFI/BIOS, or it’s RMA time for the laptop.
Uh, the same thing happened with smartphone sales… everyone got them… now they have to subsidize upgrades.
Remember TVs? Big huge LCD upgrade time… mega sales… whoops, everyone’s got one now… OK, LED push… um, how do we make them keep buying TVs every year forever? 3D!!!!!!!!!!! CURVE!!!!!!!!!!!! UMMMMMMMMMM UMMMM crap. Market flattens.
I guess we are in a POST TV WORLD NOW!!!
Pretty much everyone has a giant flat screen. The 3D feature wasn’t enough to get people to upgrade. 4k plus HDR might get some people to upgrade, but with the current economic situation in the US, I think a lot of people will not be buying this year.
Keeping my 780 until the 16nm GP100 is out, and Rec. 2020 4K monitors too.
also note, PEOPLE ARE BUYING, they just aren’t out there slapping huge money at the OEMs.. they are buying smart, and on a budget.
If you sell good value, good priced machines, you will have buyers lined up to give you money. *me*
Also, the 2nd-hand market is H-U-G-E right now for that reason as well… people are buying, believe me. But they are being smart about it.
OEMs are losing out because they don’t get *it*. They’ve had it too easy for a long time.
Some of this may just be due to general economic decline. A lot of people just don’t have the money for new components. Also, I don’t think better graphics at 1080p is really that interesting; it seems like we have reached a point of diminishing returns to some extent. When a new game comes out, developers have to target a relatively low system spec, because people do not seem willing to upgrade for an increase in image quality. And since developers have to support a relatively low spec (by current standards) anyway for the Xbox One and PS4, why can’t they also provide settings that scale up to whatever card you are running?
Some people may upgrade for 4K, but most people are not that picky when it comes to displays. Most of the displays people are using look terrible to me, but I suppose it is what you are used to. I don’t know how cheap 4K displays need to get before people really start making the switch from 1080p. For a lot of people, the resolution increase really isn’t that compelling (over-used word lately?). It would be nice to get not only 4K but also HDR. I have seen some demonstrations of high dynamic range video that look spectacular, but this will probably take a lot of time to come to video games. We also need the OS to support 4K properly, without making UI elements tiny/unusable. I haven’t read much on whether Windows 10 supports this well or not.
The push for VR could drive new hardware, since you need to drive essentially two displays at high resolution and low latency. I don’t think this will be that big of a factor for quite a while yet, though. The current systems would certainly make me violently ill: I can’t even ride in a car without getting sick (it doesn’t bother me when driving, though), and I have gotten nauseated just from playing a 3D-rendered game on a regular display in a dark room. They could make a much better device if screens were designed from the start to be part of a VR headset, with a much smaller, higher-resolution display, but I don’t know if the economics of that are feasible; current headsets just use smartphone parts. The VR thing could actually go the other way, though: with eye tracking and very low-latency rendering, they may not actually have to render very much for it to look good, thanks to foveated vision.
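The foveated-rendering point can be made concrete with rough numbers. This is a back-of-the-envelope sketch in which the field of view, full-resolution region, and peripheral resolution fraction are all illustrative guesses, not specs from any real headset:

```python
# Rough, assumption-laden estimate of foveated-rendering savings:
# render full resolution only in a small central region, and a
# quarter of the pixel density everywhere else.
fov_deg = 100        # assumed per-eye field of view
fovea_deg = 20       # assumed full-res region (fovea + eye-tracking margin)
peripheral_cost = 0.25  # assumed relative pixel work outside the fovea

full_frac = (fovea_deg / fov_deg) ** 2  # areal fraction rendered at full res
savings = 1 - (full_frac + (1 - full_frac) * peripheral_cost)

print(f"full-res area: {full_frac:.0%}, pixel work saved: ~{savings:.0%}")
```

Even with generous margins for the tracked region, most of the frame is periphery, which is why eye tracking plus low-latency rendering could cut the GPU load so dramatically.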
Anyway, I do not see much of a killer app. Due to the slowdown from process-tech stagnation over the last few years, people have become unwilling to upgrade without a very compelling improvement, and I don’t think resolution is going to be enough. Even with a massive increase in GPU power coming with the next generation, does that matter if most people are still running 1080p? I think a lot of people are still running early 28 nm GPUs (Radeon HD 7000 series and GeForce 600 series parts) without much issue. That obviously depends on what you are playing, though; World of Warcraft doesn’t take a high-end card.
I bought my Radeon 5870 Eyefinity 6 about 5 years ago and it is still a very good card. I’ve had no games it cannot run at maximum detail level + AA at full HD. Why should I buy a new one?
yep, games are FUN, not frames + res.
That isn’t what makes a game fun: a pretty game can suck, and a low-res game can be great.
“1) GPU prices are rising.” yes!
And thanks to the added tax in the EU, it’s now freaking hard to buy a new GPU there, and computer hardware in general too 😛
I’m happy to ship you cards under the radar 🙂
Don’t forget the mining craze has died down.
I think cryptocurrency mining might also be a factor.
I think the next big compelling feature that will get people to upgrade their GPU is VRR, once it becomes more mainstream. I wasn’t really serious about upgrading until I learned about VRR, and that is THE main reason I want to upgrade.