Even with the difficulties the PC market encountered over 2015, Intel still managed to make a good-sized profit. Compared to Q4 of 2014, their profits shrank a mere 1% to $8.76bn, a feat unequalled by other silicon slingers as the entire market shrank by about 10%. Their data centre group provided the most impressive results, with a 5% increase in revenue likely spurred by the growth of hosting providers for the various clouds which formed or grew over the past year. The Inquirer also points out that the release of the sixth generation of the Core family of processors certainly didn't hurt either.
"INTEL HAS POSTED strong quarterly profits in its fourth quarter earnings, revealing results that were higher than Wall Street was expecting despite a tough year for the PC market."
Here is some more Tech News from around the web:
- Server retired after 18 years and ten months – beat that, readers! @ The Register
- The Day Netflix Blocked My VPN is the world's new most-hated show @ The Register
- Android Banking Malware SlemBunk Part of Well-Organized Campaign @ Slashdot
What about revenues? That is where the money to keep things going comes from, and Intel is not experiencing much growth. It already has most of the PC market share, so it's still not good news.
“the company’s results were affected more significantly by its larger data center business, which posted revenue of $4.3 billion in the fourth quarter and accounted for nearly half of Intel’s operating profit.
In all, Intel reported that profit declined 1% to $3.61 billion, or 74 cents a share, compared with $3.66 billion, or 74 cents a share, a year earlier. Total revenue rose 1% to $14.9 billion, showing growth after two quarters of declines. Analysts polled by Thomson Reuters had expected earnings of 63 cents a share on revenue of $14.8 billion.
Gross margin fell to 64.3% from 65.4%, but surpassed the company’s forecast of 62%.
For the current quarter, the company forecast revenue of about $14 billion, or $14.1 billion when adjusted for some acquisition-related items. Analysts polled by Thomson Reuters had expected revenue of $13.89 billion.”(1)
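For anyone who wants to sanity-check those quoted figures, here is a minimal sketch in Python. The inputs are taken straight from the WSJ excerpt above; treating the "1%" as a rounded year-over-year change, and deriving share count and cost of sales from the quoted numbers, are my own assumptions.

```python
# Quick sanity check of the Q4 figures quoted above (inputs from the WSJ excerpt).

q4_2015_profit = 3.61e9   # net income, dollars
q4_2014_profit = 3.66e9
eps = 0.74                # earnings per share, dollars (both years)

# Year-over-year profit change; comes out around -1.4%, reported as a 1% decline.
profit_change = (q4_2015_profit - q4_2014_profit) / q4_2014_profit
print(f"Profit change: {profit_change:+.1%}")

# Implied share count: net income divided by earnings per share (~4.9bn shares).
shares = q4_2015_profit / eps
print(f"Implied shares outstanding: {shares / 1e9:.2f}bn")

# Gross margin is (revenue - cost of sales) / revenue; with $14.9bn revenue and
# a 64.3% margin, the implied cost of sales is roughly $5.3bn.
revenue = 14.9e9
gross_margin = 0.643
cost_of_sales = revenue * (1 - gross_margin)
print(f"Implied cost of sales: ${cost_of_sales / 1e9:.1f}bn")
```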
Those gross margins are falling, but PC sales are better. Let's see how Zen affects this in late 2016 into 2017; a high-performing Zen will make those margins on the PC and laptop SKUs come down if it can get close to Intel in performance at a much more affordable price point!
“Intel’s PC business fared considerably better in the fourth quarter, in large part because buyers chose more powerful processors that command higher price tags. The company said Thursday that the average selling price of its PC chips rose 17% compared with the fourth quarter of 2014.”(1)
(1)”Intel’s Data-Center Revenue Disappoints”
“Chip maker’s shares fall 5% in late trading after fourth-quarter results”
http://www.wsj.com/articles/intel-reports-drop-in-earnings-1452806538
Intel is going to lose the high margins in the CPU business eventually. The bottlenecks have shifted strongly in favor of the GPU. Going forward, many different companies will be able to make a good enough CPU. For most games, AMD already makes a good enough CPU with their old FX Excavator CPUs.
GPUs are actually much lower margin than CPUs. Intel currently gets a high price for a small die at 14 nm. With GPUs, the margins are much lower. For the amount of silicon in a GPU, the price is significantly lower than a similarly sized CPU. There is still a lot of money for HPC compute products, but it is going to be artificial market segmentation. The price for a similar sized die with 64-bit compute resources will be a lot more expensive than a GPU with mostly 32-bit compute. I haven’t seen much comparing Intel’s Xeon Phi with GPU compute. I have the impression that GPUs are living up to expectations better than Xeon Phi though.
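To put rough numbers behind that dollars-per-silicon argument, here is a small sketch. The die areas and launch prices are approximate figures I'm assuming for illustration (a GTX 980 class consumer GPU versus a top-end 18-core Haswell-EP Xeon), not BOM data or anything from the article.

```python
# Rough dollars-per-square-millimetre comparison between a consumer GPU and a
# big server CPU. Die sizes and launch prices are approximate, for illustration only.

parts = {
    # name: (approx. die area in mm^2, approx. launch price in USD)
    "GeForce GTX 980 (GM204 GPU)":      (398,  549),
    "Xeon E5-2699 v3 (Haswell-EP CPU)": (662, 4115),
}

for name, (area_mm2, price_usd) in parts.items():
    print(f"{name:34s} ~{price_usd / area_mm2:5.2f} $/mm^2")

# The Xeon comes out several times higher per unit of silicon area, which is
# the margin gap the comment above is pointing at.
```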
These market shifts will force Intel into the consumer GPU market, even if it is via integrated devices. They will have to sell large die parts for much lower margins than their Xeon parts. While Intel has been talking up the performance of their integrated GPUs, they are comparing a 14 nm integrated GPU to 28 nm low-end discrete parts. They still seem to be having issues with 14 nm, so I don’t expect another process shrink soon. They will have competing 14 nm GPUs from AMD and Nvidia this year. I don’t think Intel’s GPU will compare that favorably.
Also, Intel doesn’t seem to have a real answer to silicon interposer technology. I have seen a little bit about their EMIB tech, which embeds a chip upside down in the package substrate to allow for interconnect somewhat similar to silicon interposers. This looks like a response to silicon interposer tech, and it may not be available soon, or may not be as good as silicon interposer technology. Silicon interposers could be a disruptive technology, and it is not coming from Intel. Intel obviously has a lot of money to spend on pushing their GPU technology, but that seems somewhat similar to their attempts to break into the mobile market.
you should post this bs a few more times, fuck off
Up yours Game-neck, those green teeth in your mouth need some attention, and maybe your SMOM can help you with her stripper tips! Enjoy your gimped, overpriced Intel graphics!
I can’t express how strongly I hope you get IP banned for your constant, unwarranted personal and familial insults. Believe it or not, you’re one of the people who makes the comments here suck sometimes. You’re as bad as the Intel and Nvidia fanboys.
You holding a bunch of Intel stock? Markets change and companies need to change with them. Intel didn’t want to integrate the memory controller into the CPU, and AMD grabbed a lot of server market share when the Opteron came out. Intel eventually followed suit, and took that market share back though. Same thing might happen here eventually with silicon interposers, but Intel customers might get stuck with underperforming systems for a while. A lot of people got stuck with Pentium 4 systems with SDRAM when Intel was trying to kill off DDR in favor of Rambus memory (and block AMD at the same time). Those P4 systems really were pitiful, and Intel didn’t get anywhere near the amount of bad press that they deserved.
The fall of the CPU as the most important component is a separate issue. It is obvious that once “good enough” is reached, there is no advantage in paying high prices for supposed premium components. You don’t see many people paying extra for 10,000 RPM hard drives anymore because they really don’t make much difference in performance. The GPU and the memory attached to it are much more important for most consumers. In fact, the x86 lock-in is one of the main things keeping Intel from needing to compete with anyone except AMD. If not for that, they would need to compete with a large number of “good enough” ARM processors. For general use machines, most current cell phone processors are powerful enough. I know a lot of people who don’t even have a desktop or laptop anymore since a tablet allows them to do all of the things they need. I typed these comments on an iPad, which still seems to have a double post issue, but I don’t really care enough about that to go boot up the desktop.
I probably made a similar post before, I don’t know. I don’t always go back to see if anyone has replied. If you have a valid argument on why the CPU would be more important than the GPU, then feel free to give it. Also, feel free to explain why you don’t think silicon interposers are a revolutionary technology. In my opinion, we haven’t even started to see what can be done with them.
CPUs and GPUs have an insanely high profit margin. The physical chip will usually have a BOM that is under $5. The crazy high prices come from a lack of competition, and from trying to cover the R&D costs and turn a profit before the part becomes outdated.
Sites like electronics360 will sometimes show their CPU BOM estimates in their videos. The same applies to GPUs: insane profit margins, though their product life cycle is shorter than that of a CPU.
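As a back-of-the-envelope illustration of that point, here is a small sketch. The sub-$5 BOM figure comes from the comment above; the $350 selling price is an assumed example, not a real teardown or list price.

```python
# Back-of-the-envelope gross margin on a hypothetical chip, using the sub-$5
# silicon BOM estimate mentioned above and an assumed selling price.

bom_cost = 5.00        # estimated cost of the physical die/package (from the comment)
selling_price = 350.00 # assumed retail price of a mid/high-end CPU, for illustration

gross_margin = (selling_price - bom_cost) / selling_price
print(f"Gross margin on the part itself: {gross_margin:.1%}")  # ~98.6%

# The corporate gross margin (~64% in the quarter discussed above) is far lower
# because it also has to absorb R&D, fabs, packaging, validation and the rest,
# which is exactly the point about covering R&D before the part goes stale.
```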
Intel has enjoyed really high quarterly profits because they have not had any real competition. With AMD’s misguided decision to abandon the high end market, they lost market share across the board, because they also gave up their reputation for making high performance products; thus the general customer who is not in the know will simply assume that the Intel part will be faster, even if AMD has a better offering for their price range.
When do you predict all this will happen? You’ve been touting Intel’s demise for years now.
hahaha Zen, just a mound for Intel’s cock to crow on
Zen does not have to beat Intel at the top end; Zen only has to beat the Core i5 quad cores and get close to Intel’s Core i7 SKUs! AMD can always beat Intel on price, with even better graphics for the money spent. Those that try to compare the graphics on Intel’s top-end SKUs, which are rarely available even at Intel’s much higher prices, will never mention the cost ($$$$) of that “Graphics”. Intel will continue to trot out its highest-end SOCs with Intel’s “highest-end” graphics and try to compare those SKUs to everything for sale in the SOC/APU marketplace, but when the price/performance metric is weighed, AMD will still come out on top!
It’s only when CPU-core-only performance, or high-end gaming rigs built around non-Intel discrete graphics, are compared that AMD has any troubles! And that’s only comparing CPU core IPC to CPU core IPC and not graphics, because Intel always needs a little help from AMD’s or Nvidia’s discrete GPUs paired with its cores for Intel to have any claim to desirability for high-end discrete graphics gaming. Intel’s graphics are still not wanted for gaming; gamers only want the cores, because they will never use Intel’s graphics for high-end gaming. Even at the middle level of gaming, it’s still the discrete GPU SKUs!
And when Zen gets paired with a larger GPU on the interposer, with the separately fabricated Zen cores paired with a good midrange/high-end separately fabricated GPU die, all on the interposer with the Zen cores wired up to the GPU’s ACE units directly over a connection much faster than any PCI link, then all bets are off. If AMD makes a laptop Zen-based APU on an interposer, and starts putting many more ACE units on the GPU die, and wires it up to the CPU cores (Zen) via the interposer with direct pathways between CPU and GPU, then Intel will have some serious problems, especially with the DX12/Vulkan APIs doing more of the gaming compute on the ACE units so the CPU cores will have less gaming work to do anyway.
AMD only needs to make some Zen CPU-core-only dies in 4, 8, or larger core clusters, plus some of its separately fabricated GPU dies, to create a line of APUs on an interposer (for laptops up to server SKUs), with the CPU die, GPU die, and HBM/other dies all wired up interposer style, and Intel could not compete with that for graphics integrated alongside CPU cores and HBM. The APU on an interposer, with CPU, GPU, and HBM all made separately and wired up directly via the interposer, is what is giving Intel sleepless nights.
hahahaha
Ha ha ha ha, you have green teeth!
When you charge 9999999999.99 for your chips and people still blindly buy them because “Intel, duh”, of course you will turn a profit.
First of all, I personally bought an Intel CPU because there was no competing product from AMD. I also thought it worth the price.
I’m not sure how many people are “blindly” buying their CPUs.
Also, Intel has a lot of competition in mobile and has spent billions on research so making a profit isn’t a sure thing.
(Having said that, to understand Intel’s current profit state we need to pay careful attention to the merging of the mobile part of Intel, as they had a loss there but managed to come out ahead by merging it a year or so ago. We’d have to look at about TEN YEARS of research, sales, etc. to determine a trend.)
I remember a rumor from years ago, Pentium times, that MS had empty loops inside Windows code, forcing people to buy new CPUs. Then there was the story about an MS developer fired for writing some Windows code in assembly (or very optimized)…
Now MS is forcing Win 10 and is dropping older Windows support for Skylake and newer CPUs. Accidentally?
Intel can not exist without MS and vice versa. They both missed the mobile train, MS especially. Things are changing, and Windows is only in the PC domain. The mobile domain is getting stronger and stronger, leaving fewer and fewer reasons for most people to even think about a PC. Even a lot of commercial apps can run without any problem on mobile. Games still can’t, but soon. And the mobile domain is Linux. A little push, a little more organized Linux community, could mess up the PC a lot. I’m a Win guy, but forcing Win 10 and new CPUs might push me onto the other side!
Intel must have a serious talk with MS or there is no reason for new CPUs. They must start making better mobile CPUs and, as a lot of you pointed out, better GPUs.
Forcing W10 is for security reasons as the new OS has features optimized for that.
It also only affects people buying new CPUs who want to use W8 or older, so I’m not sure there will be a huge outcry. In the business world I believe they said the situation may be different (longer support), but I’m not clear on that.
And MS isn’t “only on PC”, so I’m not sure why you said that. Yes, they’re a little late to the party, but they are currently making W10 phones, laptops, and desktop PCs work together under the Windows 10 umbrella.
I’ve seen demos of phones using TWO separate processors in which it was simultaneously running a regular copy of Windows 10 on the x86 Intel CPU and the phone OS on an ARM SoC.
(As for MS intentionally making “empty loops” that makes no sense.)