A new era for computing? Or just a bit of catching up?
A glimpse into Intel’s self-esteem?
Early Tuesday, at 2am for viewers in eastern North America, Intel delivered their Computex 2013 keynote to officially kick off Haswell. Unlike ASUS the night prior, Intel did not announce a barrage of new products; the purpose was to promote future technologies and the new products of their OEM and ODM partners. In all, a pretty wide variety of topics was discussed.
Intel carried on with the computational era analogy: the '80s were dominated by mainframes, the '90s were predominantly client-server, and the 2000s brought the internet to the forefront. While true, they did not explicitly mention how each era never actually died but rather just bled through: we still use mainframes, especially with cloud infrastructure; we still use client-server; and just about no one would argue that the internet has been displaced, despite its struggle against semi-native apps.
Intel believes that we are currently in the two-in-one era, by which they probably mean "multiple-in-one" given devices such as the ASUS Transformer Book Trio. They created a tagline, almost a mantra, illustrating their vision:
"It's a laptop when you need it; it's a tablet when you want it."
But before elaborating, they wanted to discuss their position in mobile. They believe they are becoming a major player in that market, with key design wins and performance that beats some incumbent systems-on-a-chip (SoCs). The upcoming Silvermont architecture aims to fill in the gaps below Haswell, driving smartphones and tablets and stretching upward to include entry-level notebooks and all-in-one PCs. The architecture promises to scale from three times the performance of its predecessor at one end to a fifth of the power for equivalent performance at the other.
Ryan discussed Silvermont last month; be sure to give his thoughts a browse for more depth.
Merrifield occupies the lower-performance, lowest-power end of the Silvermont spectrum. Intended for smartphones, it is expected to be shipping in devices in time for Barcelona (Mobile World Congress) in early 2014. Intel claims a 50% increase in performance over current-generation Atom processors alongside better battery life. Merrifield will also be the first entry with an integrated LTE modem, reducing the number of chips (and associated wattage) required to make your mobile device function in any mobile capacity. Important stuff.
The next Silvermont tier is Bay Trail, a quad-core SoC for everything up to (and including) entry-level laptops and all-in-one desktops. Devices with these processors are expected to ship late in the back-to-school season or early in the holiday season.
With sample Bay Trail devices, Intel demonstrated the ability to play Android games directly from the Google Play store, alongside another device playing Torchlight 2 from Steam through Windows 8. Sure, the graphics are not Battlefield 3, but native access to a gigantic catalog of software has been the major benefit of the x86 architecture since hand-coding assembly fell out of fashion.
4K video streamed over "an experimental" LTE network.
In other words, they probably were the only ones on it, but still.
One of the last Bay Trail demos streamed and decoded a 4K video, a feat in itself, over the integrated LTE modem. The Gigabyte-branded video was streamed over Far EasTone's trial LTE network. You may recall the difficulty recent Atom processors had simply streaming a YouTube video without stuttering: 4K is impressive for an Atom.
Above Bay Trail, we leave Silvermont and enter the realm of the Core architecture: Haswell. Sadly, the announcement did not even acknowledge the higher-performance enthusiast tier, Ivy Bridge-E. Intel did discuss their development process, and a lot more candidly than I would have expected. They outright stated that Haswell was two years into development by the time they refocused on battery life and graphics performance. It was not a gentle nudge.
About the most interesting insight into Intel's research and development was a small comment near the end: Intel spends $12 billion in capital per year. "Moore's Law is expensive." The main way that Intel competes as a company is by writing R&D cheques bigger than their competitors can possibly afford and outright beating whatever products oppose them. For contrast with the $12 billion Intel spends each year, AMD's total annual revenue rests between $6 and $7 billion.
Basically, Intel spends nearly twice as much money as AMD earns in total revenue.
A fanless Windows 8 tablet with a full Core processor.
The fruit of this investment: fanless Haswell devices, lengthened battery life, reduced manufacturing cost, and increased functionality. The end of the keynote highlighted Intel's research into alternative input paradigms. A Creative Labs depth-sensing camera was briefly mentioned in a slide, and a laptop webcam was able to measure pulse (and apparently blood pressure) on stage by monitoring the veins in a subject's forehead.
And, as Google announced at I/O a couple of weeks ago and Newegg has offered in its search box for a long time now, you are able to voice-search through Nuance Dragon. The examples given were controlling video playback by voice and searching for a drug whose name you are completely unable to spell. In case you were curious, the wake word (like "Ok Google" or "Xbox") was "Dragon".
I don't believe she's a doctor, but she plays one on keynotes.
It would have been nice to see some acknowledgment of Ivy Bridge-E, but that is not where Intel feels uncomfortable. Addressing that discomfort was ultimately the theme of the keynote. Intel spent an obscene amount of money and changed plans two years in motion to pivot toward the areas where they are not yet succeeding. Raw performance is where Intel is already comfortable.
I do not get the impression they want to turn their back on the high end, but rather that they just want to chase the bus down before it gets too far from the station. They suffered a great agony, most obviously with Larrabee, when they neglected the GPU market for a decade; given how clearly mobile threatens their dominance of computing silicon, would you expect them to make the same mistake?
Hopefully it is just a passing insecurity which, once resolved, ceases to exclude the enthusiast.