Rumours have reached the sensitive ears of DigiTimes about the inclusion of USB 3.1 and WiFi controllers in Intel's upcoming 300-series chipsets. This move continues the pattern of absorbing secondary systems onto single chips, just as we saw with the extinction of the Northbridge after AMD and Intel rolled the graphics and memory controller hubs into their APUs. This will have an adverse effect on demand from Broadcom, Realtek and ASMedia, which previously supplied Intel with chips to control these features. On the other hand, it could lower the price AMD will have to pay for those components when we finally see their new motherboards arrive on the market. Do not expect to see these boards soon, though; the 300-series motherboards are still predicted to arrive around 12 months from now.
"Intel reportedly is planning to add USB 3.1 and Wi-Fi functions into its motherboard chipsets and the new design may be implemented in its upcoming 300-series scheduled to be released at the end of 2017, according to sources from motherboard makers."
Here is some more Tech News from around the web:
- Android 7.0 Nougat beta available now for Galaxy S7 and S7 Edge owners @ The Inquirer
- Google rejects EU's Android antitrust charges @ The Inquirer
- You mean Office 365 deployments don't secure themselves? @ The Register
And so we march ever closer to the day when all computers will be SoCs.
Damn, I am too impatient for that day to arrive. We’ll probably have cooling problems, though hopefully by the time PC components turn into SoCs we’ll have much better cooling equipment and thermal dissipation.
SoCs? Not so much for all computing, but at the consumer level it will be most systems except the very high-end gaming rigs, which will still be making use of PCIe-based graphics. It will be more of the system-on-an-interposer / system-on-a-module (organic/other) variety that wins the laptop market business, with HBM2 or newer taking over from the DIMM-based DRAM offerings on most consumer SKUs! At first HBM2 will be used alongside DIMM-based DRAM, with the HBM2 acting like a cache of faster, higher-bandwidth memory that hides any slower, lower-bandwidth memory accesses from the interposer-based GPU/graphics. So the interposer-based CPU cores and the larger separate GPU die on the interposer will feed mostly from HBM2, with any accesses to slower DIMM-based DRAM staged into the HBM2 for faster consumption by the GPU, which needs the extra bandwidth that HBM2 provides to operate more efficiently.
When the cost of HBM2 gets nearer to the cost of DIMM-based DRAM there may be systems with all-HBM2 memory, mostly laptops; PCs and workstations will probably keep at least two channels of off-interposer DIMM-based DRAM above and beyond even 32GB of HBM2, as some PC systems will still need 64GB and more of DIMM-based DRAM to complement that 32GB of HBM2.
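For a rough sense of how the "HBM2 as a cache in front of DIMM DRAM" arrangement described in that comment would pay off, here is a minimal back-of-the-envelope sketch. The bandwidth figures and hit rates are illustrative assumptions, not vendor specifications.

```python
# Rough sketch of HBM2 acting as a bandwidth cache in front of DIMM DRAM.
# All figures below are illustrative assumptions, not vendor specs.

HBM2_BW_GBS = 256.0   # assumed bandwidth of a single HBM2 stack
DDR4_BW_GBS = 38.4    # assumed dual-channel DDR4-2400 bandwidth

def effective_bandwidth(hit_rate: float) -> float:
    """Weighted-harmonic-mean bandwidth when a fraction `hit_rate` of
    accesses is served from HBM2 and the rest falls through to DRAM."""
    time_per_byte = hit_rate / HBM2_BW_GBS + (1.0 - hit_rate) / DDR4_BW_GBS
    return 1.0 / time_per_byte

for h in (0.5, 0.9, 0.99):
    print(f"hit rate {h:.2f} -> ~{effective_bandwidth(h):.0f} GB/s effective")
```

The harmonic weighting is the pessimistic (fully serialized) case; it still shows that the GPU only sees something close to HBM2-class bandwidth when the vast majority of accesses stay on the interposer.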
HBM2 is not suitable for use as main memory; its latency is too high.
Is it worse than GDDR5? The PS4 at least uses that as main memory already.
Actually it’s more like “System On 2 Chips”, but it makes sense for Intel to do so, kicking some industry competition off the motherboard at the same time.
The USB-IF has done a piss-poor job of educating the public on its USB Type-C plug form factor and electrical standard, and on how it is used with the appropriate USB 3.0 and USB 3.1 controller chips attached to the motherboard on most computers. The USB-IF and Intel/VESA/other standards bodies have also done a piss-poor job of describing to the public and the press the various USB Type-C alternate mode configurations for TB3, HDMI and DisplayPort over the USB Type-C plug form factor and electrical standard. Very little educational/primer material was provided to the press in a form the press could readily understand, so not even the technical press got a total picture of how the various standards (TB3 and the other protocols) were to be properly integrated with the USB-IF’s USB Type-C plug form factor and electrical standard.
Not included were any actual plug pin-out diagrams/graphics or wire and chip layouts, so the technical press could not make heads or tails of just which protocols (PCIe and others) were to be tunneled over the TB3 wires/protocol, if available through the USB Type-C plug, and which pins provided backwards compatibility with the native HDMI, DP and USB wiring for any non-TB3-enabled plugs/cables that only provided their respective non-TB3 HDMI/USB/DP/other connectivity and pins.
That problem with the TI peripheral/cable controller chip (power delivery negotiation for the USB Type-C power standard) and TB3/USB Type-C on the Apple MacBook Pro SKUs is just one example. So is the continued improper use of the USB-IF naming guidance with respect to USB Type-C Gen 1 and USB Type-C Gen 2, the proper naming that the USB-IF told device manufacturers to use when describing any USB Type-C plug paired with a USB 3.0 or USB 3.1 controller chip on the device’s motherboard.
To this day there are still reporters, and plenty of marketing copy, using the confusing “USB 3.1 Type-C” misnaming, which only leads to confusion since the USB Type-C plug can be paired with either a USB 3.0 or a USB 3.1 controller chip, not to mention the alternate mode usage confusion [pulls out hair until bald]. So having a USB Type-C plug does not by default guarantee any USB 3.1 connectivity, as that requires a USB 3.1 controller chip on the motherboard, or, now for Intel, the USB 3.1 controller functionality included in Intel’s new chipsets.
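To make the naming point above concrete, here is a minimal sketch of the distinction the comment describes: the Type-C connector is just a plug shape, and the signaling speed depends on whichever controller sits behind it. The Port class and the example ports are hypothetical; only the 5 Gb/s (Gen 1, i.e. USB 3.0-class) and 10 Gb/s (Gen 2) signaling rates come from the USB-IF naming scheme being discussed.

```python
# Sketch of "connector shape" vs "controller generation": two Type-C ports,
# two different controllers, two different speeds. Port and the examples
# below are hypothetical, purely for illustration.

from dataclasses import dataclass

SIGNALING_GBPS = {
    "USB 3.1 Gen 1": 5,    # same 5 Gb/s signaling as USB 3.0
    "USB 3.1 Gen 2": 10,
}

@dataclass
class Port:
    connector: str   # "Type-A" or "Type-C" -- just the plug shape
    controller: str  # which controller chip the port is wired to

    def describe(self) -> str:
        speed = SIGNALING_GBPS[self.controller]
        return f"{self.connector} connector, {self.controller} ({speed} Gb/s)"

print(Port("Type-C", "USB 3.1 Gen 1").describe())
print(Port("Type-C", "USB 3.1 Gen 2").describe())
```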
You are absolutely right and it can be a nightmare trying to put up a quick post about a USB thingy without needing a full paragraph to describe what type of USB it is … especially if the manufacturer doesn't provide full specs either.
This one does specifically say USB 3.1.
The full paragraph, and not for the first time, is in the hope that eventually the Type-C USB-IF naming specifications will stick in some readers’ minds! But yes, you are correct: this one concerns the USB 3.1 controller chip and that chip’s functionality being integrated into Intel’s latest chipsets, late from Intel as always but at least it’s there now as opposed to not at all! USB 3.0 chipset integration took a good while for Intel also, and ditto for AMD and their USB 3.0/3.1 support integrated into their new chipsets. Now for some PCIe 4.0 compatibility and plenty of bandwidth for TB3, the newer SSDs, and the other 4K/higher ports using the latest HDMI and DP standards, all hung off the USB-IF’s Type-C plug/electrical ports. Adapting sure is hard when the technology is changing and the port adapter costs are breaking the bank for even the non-Apple users.
It’s very much not your fault; the lack of understanding and confusion still going on regarding the USB Type-C plug and power delivery standard is more the USB-IF’s doing, and mostly the fault of those who write the ad copy for PCs/laptops and other devices. Technical writers are not of the same calibre as they were before the Internet days, when devices came with both technical manuals (replete with the device’s full electrical schematics) and user manuals! There were no questions that could not be answered with a full set of the device’s schematics to look over, including the pinouts on the device’s ports. There was not much confusion with those complete manuals, written in an organized and coherent manner rather than scattered across the interwebs in the random/incomplete way things are currently.
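On the bandwidth point raised above, here is a small arithmetic sketch of why the PCIe generation behind these ports matters. The per-lane rates are the published PCIe 3.0/4.0 signaling figures; the x4 lane counts are assumptions for illustration.

```python
# Back-of-the-envelope link bandwidth in GB/s. PCIe 3.0 signals at 8 GT/s and
# PCIe 4.0 at 16 GT/s with 128b/130b encoding; Thunderbolt 3 advertises
# 40 Gb/s. The x4 uplinks below are assumed widths, not a specific product.

PCIE3_LANE_GBS = 8 * 128 / 130 / 8    # ~0.985 GB/s per PCIe 3.0 lane
PCIE4_LANE_GBS = 16 * 128 / 130 / 8   # ~1.969 GB/s per PCIe 4.0 lane
TB3_GBS = 40 / 8                      # 5 GB/s of Thunderbolt 3 signaling

print(f"PCIe 3.0 x4 uplink: {4 * PCIE3_LANE_GBS:.2f} GB/s")   # ~3.94 GB/s
print(f"PCIe 4.0 x4 uplink: {4 * PCIE4_LANE_GBS:.2f} GB/s")   # ~7.88 GB/s
print(f"Thunderbolt 3 link: {TB3_GBS:.2f} GB/s")
```

Under these assumptions a PCIe 3.0 x4 uplink cannot even fill a single TB3 link, which is the sort of headroom argument the comment is making for newer PCIe revisions.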
“This continues the absorption of secondary systems into single chips, just as we saw with the extinction of the Northbridge after AMD and Intel rolled memory the graphics and memory controller hubs into their APUs.”
Are there some words missing or a few extra words in this sentence?
Ya, I could clean that up a bit I suppose
Wasn’t USB 3.1b supposed to already be natively supported on the 200-series chipsets?