A few historians of computing among lots of historians of science

When I was recently contacted about the opportunity to contribute to this blog, I thought that for a first post I would report on the history-of-computing panels at the 6th Three Societies Meeting. This joint meeting of the British Society for the History of Science, the History of Science Society and the Canadian Society for the History...

The First What?

The first business computer. The first Systems Analyst. As a curator, I always demur when asked "what was the first...?" There's no end to it, and technology does not proceed that way. A new technology does not suddenly appear in fully functional form; it "eases up" to functionality. At some point you say "it's ready." But it probably isn't ready,...

SAGE and the Origins of Modern Computing

An old, rare IBM film about SAGE recently surfaced on YouTube -- what a fantastic resource that web site is. The film brought back many discussions I've had with my colleagues about the place of SAGE in the history of computing. Paul Edwards saw SAGE as the centerpiece of the "Closed World" of computing. IBM historians have discussed its role...

Two Dispatches from the U.K.

Despite my dissertation research on Konrad Zuse, I've been accused of a bias toward the American side of computer history. Here are a couple of news items from the U.K. that may offset that. The first concerns what may be the first recording of music generated by a computer--the Manchester "Baby," in 1951! That is 6 years before the famous...

Moore's Law Again, and a (Possibly) Naked Emperor

In an earlier post (March 20), I discussed Moore’s Law and its relation to the history of computing. Once again I feel compelled to return to the topic—this time, to discuss its impact, not on computer science and technology, but on its historians. Put simply, historians of technology, including me, find Moore’s Law unnerving. The existence of an exponential growth...

History of Computing--the View from Montana

In an earlier post I mentioned the American Computer Museum of Bozeman, Montana. You can look at its web site for details. Now that the weather is getting warm, it is time for all of us who are interested in computing history to figure out a way to get to Bozeman and see it. You don't really need an excuse...

Science Fiction, Science Fact, and the Future of Computing

Last February I had the privilege of attending a conference on “Imagining Outer Space,” held in Bielefeld, Germany. I have been to many conferences on the history of rocketry and space travel, and on the social and cultural implications of the Space Age, but none of them were as stimulating as this conference was. A theme present in almost...

What we don't know

An obituary in a recent Washington Post brought back a flood of memories for me, and reminded me of a topic I had been meaning to discuss but had put aside. Samuel S. Snyder is a name that should be familiar to many historians of computing—he authored an article on “Computer Advances Pioneered by the Cryptologic Organizations” for one of...

"Cybernetics is the Universal Solvent of Technology"

Those words were spoken by the late Professor W. David Lewis, of Auburn University, discussing a talk I had given about the relationship of computing to aerospace. We all know the corollary: if you discover a universal solvent, in what container can you hold it? For myself, working at the National Air and Space Museum, this paradox came home forcefully when...

Moore's Law, Steve Case, and YouTube

Moore’s Law is an empirical observation: the density of computer memory chips doubles about every 18 months, and it has been doing so for the past four decades. Magnetic storage capacity and, to a less regular extent, processor speeds and telecommunications bandwidth have also been increasing exponentially in a complementary fashion. We all know the results, not just in consumer products...
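The arithmetic behind that observation is easy to check for yourself. A minimal back-of-envelope sketch, assuming only the 18-month doubling period quoted above (the function name and the 40-year span are illustrative, not from the post):

```python
def moore_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Cumulative growth factor if capacity doubles every `doubling_period_years`."""
    return 2.0 ** (years / doubling_period_years)

# Over the four decades mentioned above: roughly a hundred-million-fold increase.
growth = moore_factor(40)
print(f"~{growth:.2e}x denser")
```

Sustained over forty years, an 18-month doubling compounds to a factor on the order of 10^8, which is why exponential growth of this kind has no real precedent in the history of technology.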
