Greatest computing inventions of all time?
The 25th anniversary of Invention & Technology (from American Heritage) is marked by a list of the “top twenty-five revolutionary inventions in the United States.” At least that is how it’s reported by IT economist (and sometime historian) Shane Greenstein on his blog, Virulent Word of Mouse. (I was unable to find the article for free on the website.)
According to Shane, the top 25 list includes:
- For communications, they list two: voice over radio and FM.
- …
- For electronics, the magazine lists the transistor, the laser, flat-panel TV, and charge-coupled devices.
- For computers, the editors list COBOL, video games, and social networking.
He wants to replace the communications entries with cellphones and the commercial Internet. For computers, he suggests the IBM S/360 trumps COBOL, or at the very least social networking.
These debates are inherently unwinnable, but let me offer my two cents. Certainly some higher-level language belongs on the list: COBOL has the advantage of being one of the first major ones, and the one that effectively invented the idea of software.
While I like the idea of the S/360, I don’t see how a list could omit the microprocessor. Perhaps there’s a little bias here: Shane has spent part of his research career studying the S/360 and the Internet, while mine has focused more on the PC (as did my industry career).
Still, the microprocessor shifted computers from multi-million-dollar, room-sized behemoths to ubiquitous devices embedded in cars, TVs, and even light switches. When I learned Fortran, I had to beg time every other week on an aerospace company’s IBM 1130. Today our household owns several dozen devices that are more powerful than the 32 KB 1130: a dozen PCs (most of them unused), a variety of cellphones, PDAs, and of course the computers in our autos.
In another 20 years, the S/360 and COBOL will be footnotes in history, but we’ll still have some form of microprocessor. I’m guessing the processor in my refrigerator, or perhaps my light switch, will outclass today’s smartphone, but it will still be a microprocessor.
PS: Thanks to Jeffery Stein and Paul Ceruzzi for letting me join the blogging team here at the IT History Society.