The heavy-hitting partnership of IBM, Samsung and GLOBALFOUNDRIES has designed and created the first chip built on a 7nm process using silicon-germanium channel transistors and EUV lithography. Even more impressive is their claim of 50% area scaling improvements over 10nm, a very large step at such small process sizes. IBM told PC World that it will be able to fit 20 billion transistors on a 7nm chip, a tenfold increase over Braswell as an example of current technology. The Inquirer reports that this project also cements the deal between GLOFO and IBM; GLOFO will be the exclusive provider of chips for IBM for the next decade.
"IBM'S RESEARCH TEAM has manufactured functional test chips using a 7nm production process, making it the first in the industry to produce chips with working transistors of this size."
Here is some more Tech News from around the web:
- BlackBerry updates BES12 with Samsung Knox and Android for Work support @ The Inquirer
- Android 5.1.1 starts rolling out to Galaxy Note 4 owners @ The Inquirer
- Microsoft SLASHES 7,800 bods, BURNS $7.6bn off books in Nokia adjustment @ The Register
- Office 365 prices 'to rise by up to 13 per cent' @ The Register
- PAPAGO! GoSafe 520 Dashcam @ Bjorn3d
- How to Market Your Linux SysAdmin Skills @ Linux.com
- Show Us Your Human Interface; Win Laser Cutting Time @ Hack a Day
And where do you think Samsung got help with that 14nm process? None other than its long-standing research partnership with IBM, and IBM is why GlobalFoundries (GF) got access to the Samsung 14nm process node: GF has taken over the chip fabrication side of the business for IBM. Both Samsung and GF will be supplying those OpenPower Power8 and Power9 licensees with plenty of available fabrication capacity, so IBM need not worry about its in-house Power8/Power9 fabrication needs in the future, and IBM will not have to worry about the headaches of maintaining a commodity business such as the chip fabrication business is.
Both Samsung and GF will have an entire market of customers across which to amortize those expensive chip costs, and they can keep those fabs operating as close to 100% capacity as possible. IBM made a smart move getting out of the fab business, as it was a big cash drain for just IBM's internal needs. Now IBM has possession of most of the really valuable CPU IP, and it can license some of that IP, as well as the IP it has created with GF and Samsung, to give IBM long-term priority for fab space: with GF contractually, and with Samsung more so as a technology partner. The really valuable IP remains in IBM's control, namely the microarchitectural IP around Power/Power8 and its successors, and IBM, through its creation (with others) of the OpenPower Foundation, has a way of licensing the Power8/Power IP to a wider market, ARM Holdings style. Now we will see many licensed Power8s from third parties like Tyan and others! Hopefully AMD will grab some of that third-party action and offer up its services and GPU integration experience for some custom business, and not let Nvidia get a bigger lead in the GPU/HPC acceleration market around the third-party licensed Power8s. There is more than just x86 in the computing market, and that 7nm could hold a bunch more GPU cores on some AMD APUs, be they x86, ARM, or Power8 based.
Big Blue still has some damn fine research labs, and more IP patents than even your most worshiped deities!
Edit: as the fabrication is
To: as the chip fabrication business is
Damn, I’m going blind.
remove double post, posting system acting up again!
Better yet remove all his posts. He’s pretty excited about 7nm technology as if it’s going to be produced anytime soon. Just like that 9nm tech IBM developed in 2012 using nanotubes that are common in all our devices today.
LOL!
The silicon industry is in trouble and nobody wants to admit it. The same way nobody wants to admit that it took until DX12 to finally make proper use of our multi-core systems.
Intel struggling with 14nm (delayed a year on the high end) and now with 10nm (delayed another year) makes me less hopeful that 7nm and below will be viable on the normal cadence. I'm willing to bet this is the beginning of a 10-year stalemate with silicon before we can pick up progress any further.
Physics, EUV, quantum tunneling, etc. The limitations are stacking up, and still no solid contender exists outside of "theory".
You are kidding, right? The silicon industry is the single most consistent and rapidly progressing industry out there. Please don't tell me that all those cars on the streets are properly making use of their consumed resources… I find silicon production to be the most innovative, without a doubt. Sorry it's not moving fast enough for you; if only it lived up to your expectations, we could all be getting 10,000fps in Quake 3 by now and rendering Pixar movies at home in real time. You should know that GPUs cannot compete with CPUs in performance per thread, amongst a hefty number of other differences. A quick Google search found this site for more information: http://gamedev.stackexchange.com/questions/17074/are-there-any-benefits-for-using-the-cpu-instead-of-the-gpu
GPUs are designed to have massive numbers of parallel units that outnumber CPU cores by around 1,000 to one on high-end GPUs. A GPU is ranks and files, rows and columns, of massively parallel vector units, plus units specialized for graphics: tessellation units, ROPs, SPs, etc. So the GPU's execution units do not need to be as power-hungry on a per-unit basis as a CPU's cores, which could never be crammed into the same die area that holds a GPU's thousands of cores. CPUs cannot even hope to complete the number of FP, DP, and graphics-specialized computations that a GPU can, and GPUs run circles around CPUs for many tasks; the GPU hardware is changing faster than the software can catch up with it, the CPU cores to a lesser degree. As AMD's Mantle has shown, along with the ability of gaming engines to make use of all the CPU cores offered, consumers have been getting the short end of the stick as far as hardware support in the operating system goes, and it's been years since the CPU went multicore, and years since the GPU was introduced. What excuse does M$ have for not offering multi-adapter support in its OS years ago? Why has it taken so many years for the OS, albeit via DX12 and hopefully Vulkan, to be able to use both the integrated graphics and the discrete GPU at the same time?
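To make that throughput-versus-per-thread point concrete, here is a toy Python sketch (my own illustration, not anything from the article): one wide vectorized operation stands in for a GPU's thousands of parallel lanes, while an element-at-a-time loop stands in for purely serial, per-thread execution.

```python
# Toy illustration of throughput versus per-thread performance.
# The vectorized line mimics a GPU's wide array of parallel lanes;
# the element-by-element loop mimics serial execution. The array
# size and timings are illustrative only, not a hardware benchmark.
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

start = time.perf_counter()
c_wide = a * 2.0 + b                # one wide, data-parallel operation
t_wide = time.perf_counter() - start

start = time.perf_counter()
c_serial = np.empty(n)
for i in range(n):                  # the same math, one element at a time
    c_serial[i] = a[i] * 2.0 + b[i]
t_serial = time.perf_counter() - start

print(f"data-parallel: {t_wide:.4f}s  serial loop: {t_serial:.4f}s")
```

The absolute times do not matter; the point is the gap between doing the work as one wide data-parallel operation and grinding through it one element at a time.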
And the silicon industry is slowing to a crawl compared to a few years back, and it's mostly due to competition, or rather the lack of competition, and the lack of government enforcement of the laws on the books.
Chip fabrication technology and microarchitectural technology are two different types of technology, and both have begun to slow down. In the microarchitectural segment, the only bright spot is the custom ARM SoCs made to execute the ARMv8-A ISA, and there is some hope on the x86 microarchitectural side that AMD will at least get close enough to Intel's IPC rates to put AMD's APU graphics in more laptop SKUs. AMD is also doing more on the HSA-aware microarchitectural side of the equation to further and more seamlessly integrate the CPU cores with the GPU cores, while Intel is happy to sit and milk its market share and offer up/force on users its not-so-good graphics at Intel's not-so-affordable price points.
The chip fab technological improvements are hitting the walls of economics before they hit the limits of physics. It's no wonder that many companies are going fabless and contracting with specialized fab partners in order to spread the R&D cost across more than one company. IBM is doing that, and Intel has had to mothball a new chip fab facility because it does not have enough sales volume to justify the cost of running a chip plant at anything less than as close to full capacity as possible, or it would lose too much money on the fab part of the business.
OSs are seriously in need of the under-the-hood types of innovations that improve performance, and at the very least allow all of the CPU/GPU hardware to be utilized all the time, instead of the past practice of only allowing one GPU, integrated or discrete, to be operational at a time. It looks like the one big proprietary OS maker is more interested in uselessly redesigning the UI and trying to make money by reselling, via its app store, the same software that used to come standard with the OS. Seriously, the industry has been sick for a few years now, with the lack of fair competition being one of the major sicknesses for even more than the past few years.
It's 14nm coming up from Samsung, helped along by IBM, and a lot more competition from non-IBM licensed Power8s; 7nm is just more icing on the cake. So no one expects your bosses at Intel to tell the truth, the spin doctors are always around, and the tech sites are not going to point out the obvious new market for fear of little or no review samples and some lost advertising sales! That 14nm is going to be there from Samsung and GF, maybe even with a smaller layout pitch between the 14nm transistors; there are some things, technology-wise, about that IBM announcement that can be used to bring the distance between transistors much closer even on the 14nm node. I fully expect that AMD's custom business may see AMD adding its GPU cores as accelerators to some third-party Power8 licensee's Power8-based product, along with AMD providing some workstation-oriented x86 Zen-based APU products to the market. There is going to be a big market in China for licensed third-party Power8s, and Nvidia is not the only one with a GPU product to use as an accelerator, and AMD has the DP performance! Nvidia is doing some OpenPower business with IBM, for some of IBM's internal contracts with the government and supercomputer contracts.
IBM has a large patent portfolio, and it has been tops in getting new patents for two decades. This new market, along with IBM's exit from the chip fab business, will mean more for the market as a whole than any new fab technology that Intel keeps cloistered behind its crumbling market-stranglehold walls. IBM is a technology licensor for the entire market via its partnership with GlobalFoundries and Samsung.
Maybe we won't have to have Intel's crappy integrated graphics shoved down our throats for much longer when we go to purchase laptops. Zen is on the way, and it just has to be in the ballpark IPC-wise for AMD graphics to make the case for the switch. And don't go praising Intel's expensive/not-there graphics that only wind up in the most outrageously overpriced SKUs; AMD has much better value-for-the-dollar graphics! AMD will be benefiting from that IBM research, both for CPU cores and GPU cores.
How is that Contra Revenue working out for you, Chipzilla!
Yeah, AMD is going to be rolling out tons of new products while they're hemorrhaging money and their stock price is $1.98. The only competition AMD presents is to NVIDIA, and they're barely doing that. How's that fantasy world you live in working out for you, Douchebag!
$1.98 is a good price for a private equity firm to take AMD private and let AMD's engineers do their job, and AMD's GPUs have more SP and DP performance than Nvidia's overpriced SKUs, which only the gamenecks love, 'cause Billy Bob needs his ego stroked and pockets emptied by the Green Goblin. AMD offers up its good share of competition, forcing your green-toothed self's favorite brand obsession to lower its prices and let you get your ego fix at a lesser price; one wonders whether you get more enjoyment out of the bragging than you do actually gaming with your overpriced kit.
I'll take AMD's better SP and DP performance to go along with its competitive showing, as I use AMD GPUs for more than just gaming. S/A has just released some OpenGL benchmarks on the Fury, and they are not too shabby, but AMD has always offered a more well-rounded GPU usage experience for its customers; the Bitcoin miners sure used AMD before the Bitcoin ASICs came online, and some people do more than game with their GPUs.
So what is your excuse? What, do you put your GPU on display and invite the local yokels over to see your new and shiny? Mine is down in the case doing its job, and I do not give a tenth of a rat's shiny red A$$ what it looks like! It does the job for the amount I paid for it, better than that overpriced, stripped-down (of SP and DP performance) green slime.
AMD is more than just GPUs, although AMD's GPUs will be even better integrated with its Zen CPUs in the future, so expect AMD to still be around long after the toothless yokels are no longer impressed with your new and shiny, you old green tooth!