Dan Luu was asked what new tricks silicon has learned since the early eighties, and his answer covers the gamut of tools now available to those who work on low-level code such as drivers and UEFI/BIOS. It is far more than the jump from 8-bit to 64-bit or clock frequencies undreamed of back then; he delves into newer features such as out-of-order execution and single instruction, multiple data (SIMD) instructions. If you are not familiar with how CPUs and GPGPUs operate at this low level, it is a great jumping-off point for learning what the features are called and getting a rough idea of what they do. If you know your silicon through and through, it is a nice look back at what has been added in the last 25 years and a reminder of what you had to work without back when flashing a BIOS was a literal thing. You can also check the comments below the links at Slashdot, as they are uncharacteristically on topic.
"An article by Dan Luu answers this question and provides a good overview of various cool tricks modern CPUs can perform. The slightly older presentation Compiler++ by Jim Radigan also gives some insight on how C++ translates to modern instruction sets."
Here is some more Tech News from around the web:
- CES 2015: Dell, Lenovo and HP showcase potential of Intel’s 5th-gen Core chips @ The Inquirer
- Insert 'Skeleton Key', unlock Microsoft Active Directory. Simples – hackers @ The Register
- Lego Avengers Assemble to the Helicarrier! @ Hack a Day
- TechwareLabs CES 2015 Event Coverage: Thermaltake
- Toshiba tosses out uber-slim THREE TERABYTE HDD @ The Register
- BlackBerry adopts the iPhone for promotional Twitter campaign @ The Inquirer
- The BenQ W1080ST+ & W1070+ Home Cinema Projector Launch Event @ TechARP
I never knew silicon had an intrinsic intelligence or the ability to learn; I just thought it was another element on the periodic table. If you really know your computers, you know that high-level programming languages were created long before the "days of the eighties," and not much has changed since, except a lot of repackaging of the old as the new, with some minor improvements that the marketing types would have you believe are revolutionary.
I’ll stick with the academic texts and the professional trade journals, and forgo Slashdot and the other online “schools” of “knowledge.” It’s not simply a matter of knowing the element silicon; it’s knowing the logic, and the other elements that are diffused, sputtered, or otherwise applied to silicon to make working circuitry.
High-level languages have been around longer than the ’80s, as have compilers, linkers, and the other tools that handle the object code, executables, assemblies, DLLs, and the rest that compilers and linkers create. Online it’s mostly about the ads and the clicks and much less about the learning, more so now than even in the ’80s, ’90s, or ’00s; the real knowledge and understanding is buried under piles of ad links and false assumptions.
It’s better to get down to the local junior college or university library and look over the old bound copies of Microprocessor Report, EE Times, the ACM journals, the Hot Chips symposium proceedings, and others; online, maybe LexisNexis Academic (from a library’s terminal/IP address).
Too many people cannot separate a microarchitecture from the particular fabrication node it was fabricated on, and cannot tell a stack-based microarchitecture from a register-based one, or from the other microarchitectures that do not fit the mold of what is used today. Some stack-based microarchitectures that are no longer in widespread use had stack-pointer protection built into the hardware, more secure than anything in today’s general-purpose microarchitectures; but because of current market forces they are not utilized for internet-facing assets, and the current CPU/security business interests would frown on any solution that does not involve IP/ISA that they control.
Good luck with any reasonable data mining online, Google-based or otherwise, as even the simplest generic term or phrase has been co-opted into a business name or a rap star’s/rock star’s name; even hits for technical/scientific terms mostly point to ads and false leads to web pages loaded with flashing ad banners. Far be it from Google to provide disambiguation or “see also” headings at the top of their search listings; that’s not in Google’s business model.
The Internet can NEVER replace a proper library, or the proper classification of knowledge; the net is just a big game of 52-card pickup, only the randomly flung cards have been replaced with ads, and not much content.
R.I.P. Dr. Dobb’s; your articles will live on in many archives and bound volumes. Maybe the necromancers will revive you, because even undead you would be more alive than Slashdot and those other pretenders.
What is your point? Should we abandon new instructions that enable complex algorithms to be done in fewer cycles?
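One concrete case of "fewer cycles" (my illustration, not from either commenter): counting set bits used to require a loop, while modern x86 CPUs do it in a single POPCNT instruction, which GCC and Clang expose as the `__builtin_popcount` builtin:

```c
#include <stdint.h>

/* Classic software bit count: Kernighan's method, which clears the
   lowest set bit on each pass, so it loops once per set bit. */
int popcount_loop(uint32_t x)
{
    int n = 0;
    while (x) {
        x &= x - 1;  /* drop the lowest set bit */
        n++;
    }
    return n;
}

/* With -mpopcnt (or -march=native) on a POPCNT-capable target,
   GCC and Clang compile __builtin_popcount(x) down to the single
   hardware instruction instead of a loop. */
```

Both forms give the same answer; the new instruction simply replaces up to 32 loop iterations with one operation.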