IT History Society Blog

Archive for the ‘Uncategorized’ Category

Approved IEEE Milestone: Birth of the 1st PC Operating System (CP/M)

Wednesday, January 22nd, 2014
Gary A. Kildall, PhD (1942 – 1994),  developed and then demonstrated the first working prototype of CP/M (Control Program for Microcomputers) in Pacific Grove in 1974. Together with his invention of the BIOS (Basic Input Output System), Kildall’s operating system allowed a microprocessor-based computer to communicate with a disk drive storage unit and provided an important foundation for the personal computer revolution.   This article reviews the history of CP/M and why it was so important.
A plaque to commemorate this IEEE Milestone will be installed in Pacific Grove, CA on April 25, 2014, with a brief ceremony scheduled to start at 2 pm. The intended site for the plaque, a two-story Victorian residence at the corner of Lighthouse Avenue and Willow Street in Pacific Grove, served as the DRI headquarters building from 1978 to 1991. The leader of this milestone initiative was Dick Ahrons. Brian Berg, Tom Rolander, and David Laws are working on the April 25th dedication program for this epic event.
Details on the  April 25th CP/M Milestone program will be posted as a comment when that event is formally announced.
History of CP/M:
In 1976, Gary Kildall incorporated Digital Research, Inc. (DRI) to commercialize the program and released version 1.3 of CP/M and BIOS.
CP/M was the first commercial operating system to allow a microprocessor-based computer to interface to a disk drive storage unit. CP/M played an important role in stimulating the hobbyist personal computer movement of the 1970s.  Its ability to support software programs on a wide variety of hardware configurations enabled early use of microcomputer systems from many different manufacturers in business and scientific applications. Microsoft DOS, as licensed to IBM for the original PC, was written to emulate the “look and feel” of CP/M. Thus CP/M was the forerunner of the operating systems that now power the majority of the world’s computers and led to the personal computing revolution.
The major challenge that Kildall had to overcome in the development of CP/M was the design and debugging of the “complex electronics … to make the diskette drive find certain locations and transfer data back and forth.”
BIOS, which is included in this IEEE milestone, was the first software run by a microprocessor-based PC when powered on. The BIOS software was “built into” the PC (stored in a Read Only Memory). Its fundamental purposes were to initialize and test the system hardware components, and to load an Operating System from a disk drive (or other mass storage) into the primary memory (core or semiconductor) accessed by the processor. The BIOS provided an abstraction layer for the hardware, i.e. a consistent way for application programs and operating systems to interact with the keyboard, display, and other input/output devices. Variations in the system hardware were hidden by the BIOS from programs that used BIOS services.
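The abstraction-layer role of the BIOS can be sketched in a few lines of Python. This is a toy model only (all names here are hypothetical, not CP/M’s actual interfaces): it shows how one portable program can run unchanged on two different “machines” because it only calls a uniform BIOS service, never the hardware directly.

```python
# Toy model of the BIOS idea: each machine ships a hardware-specific BIOS,
# and portable software above it calls only uniform services like conout().
# All names are invented for illustration.

class SerialBios:
    """BIOS for a machine whose console is a serial terminal."""
    def __init__(self):
        self.sent = []                 # stand-in for bytes written to a port
    def conout(self, ch):
        self.sent.append(ch)

class VideoBios:
    """BIOS for a machine with memory-mapped video instead."""
    def __init__(self):
        self.vram = bytearray(80)
        self.cursor = 0
    def conout(self, ch):
        self.vram[self.cursor] = ord(ch)
        self.cursor += 1

def print_string(bios, text):
    """'Portable' code: it works on any machine because it only uses
    the uniform BIOS service, never the hardware directly."""
    for ch in text:
        bios.conout(ch)

# The same program runs unchanged on two different "machines".
serial = SerialBios()
print_string(serial, "CP/M")
video = VideoBios()
print_string(video, "CP/M")
```

This is the essence of why CP/M was portable: only the small machine-specific BIOS had to be rewritten for each manufacturer’s hardware, while CP/M and the applications above it stayed the same.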


The following recollections are abstracted from pages 53 – 55 of “Computer Connections“, an unpublished autobiography that Kildall wrote and distributed to friends and family in 1994. “Memorex … had come up with the new “floppy disk” to replace IBM punched cards. I stared at that damn diskette drive for hours on end … trying to figure a way to make it fly. I tried to build a diskette controller … but I, being mainly hardware inept … couldn’t get my controller to work. So I built an operting (sic) system program … I called it CP/M [but] I just couldn’t figure out how to make that damn disk drive work. Out of frustration, I called my good friend from the University of Washington, John Torode. He designed a neat little microcontroller and after a few months of testing that microcontroller started to work. We loaded my CP/M program from paper tape to the diskette and “booted” CP/M from the diskette, and up came the prompt *. This may have been one of the most exciting days of my life.”
Before Kildall’s development of CP/M, microcomputer manufacturers provided proprietary applications software that worked only on their own hardware. All programs had to be written from scratch to operate on each unique machine configuration. CP/M was initially designed to work on the Intel 8080 microprocessor and allowed computer systems built by any manufacturer who used that chip to run applications programs written by third-party suppliers. CP/M introduced a new element of competition into the computer marketplace that stimulated rapid growth in the use of low-cost systems in business, industry and academia, and eventually in the home. According to Kildall, “CP/M was an instant success. By 1980, DRI had sold millions of copies of CP/M to manufacturers and end-users.”
Kildall’s own public account of the history of CP/M was published in Dr Dobbs Journal in 1980: THE EVOLUTION OF AN INDUSTRY: ONE PERSON’S VIEWPOINT, “Dr. Dobb’s Journal of Computer Calisthenics & Orthodontia”, Vol.5, No.1, (January 1980) (number 41), page 6-7.
Numerous popular accounts of the history of CP/M have been published in newspaper and magazine articles and in books, as well as online. Most of them focus on the fictitious story that DRI lost out to Microsoft on the IBM PC operating system decision in the summer of 1980, because Kildall had taken the day off to go flying. Kildall refutes this story in “Computer Connections” but it is probably most eloquently recounted in Harold Evans’ book on U.S. pioneers and innovators  “They Made America: Two Centuries of Innovators from the Steam Engine to the Search Engine” (2004) ISBN 0-316-27766-5.
According to that chapter (paraphrased here):
After Kildall’s wife Dorothy refused to sign IBM’s “ludicrously far reaching non disclosure agreement (NDA)” on the morning of IBM’s visit to his home, Kildall and Tom Rolander met with IBM that same afternoon. Once the NDA was agreed to and signed by Kildall, IBM revealed its plan. Rolander demonstrated DRI’s MP/M-86, the new multi-tasking Operating System for Intel’s 8086 microprocessor. Rolander and Kildall wanted that OS to be the new de facto standard for microprocessor-based operating systems, as they believed multi-tasking was the wave of the future.
Negotiations began on how much IBM would pay DRI. Kildall believed they could strike a deal.  Kildall wrote, “We broke from the discussions, but nevertheless handshaking in general agreement of making a deal.”
That night, Kildall and his family went to the Caribbean for vacation.  When they returned home one week later, Gary called IBM in Boca Raton several times, but “they had gone off the air.”  IBM had gone back to Microsoft to make a deal on the latter’s PC operating system (PC-DOS which was later renamed MS-DOS).   Bill Gates allegedly told IBM that Kildall had not yet finished designing CP/M to run on a 16 bit microprocessor and that Microsoft could by itself meet IBM’s requirements.
Kildall and others claimed that PC-DOS had copied (and was therefore a clone of) CP/M.  DRI threatened to sue IBM for copyright infringement if they proceeded to use PC-DOS in the first IBM PC which was scheduled to be announced in four months (August 1981).
IBM offered to market CP/M-86 along with Microsoft’s PC-DOS (for the first IBM PC) on the condition that Kildall (DRI) would not sue IBM for infringement of CP/M copyrights. IBM also agreed to pay DRI a standard royalty rate (presumably for each copy of CP/M sold).
But Kildall did not know that IBM would undercut DRI by making CP/M-86 prohibitively more expensive than PC-DOS, giving the latter a 6 to 1 price advantage: $40 vs. $240. IBM evidently had no intention of selling CP/M-86. Kildall called IBM to request that they reduce the price of CP/M-86, but no one called back.
Kildall wrote, “The pricing difference set by IBM killed CP/M-86. I believe to this day that the entire (pricing) scenario was contrived by IBM to garner the existing standard at almost no cost. Fundamental to this conspiracy was the plan to obtain the waiver for their own PC-DOS produced by Microsoft.”
CP/M rapidly lost market share as the microcomputing market moved to the PC platform, and it never regained its former popularity. Byte magazine, at the time one of the leading industry magazines for microcomputers, essentially ceased covering CP/M products within a few years of the introduction of the IBM PC.
Before CP/M there was PL/M:
Prior to the work on CP/M, Kildall consulted for Intel and National Semiconductor to help them adapt his PL/M+ compiler to their microprocessor development systems.  Unlike other contemporary languages such as Pascal, C or BASIC, PL/M had no standard input or output routines. It included features targeted at the low-level (microprocessor based) hardware, and could support direct access to any location in memory, I/O ports and microprocessor interrupt flags, in a very efficient manner. PL/M was the first high level programming language for microprocessor based computers and the original implementation language for the CP/M operating system.
+ PL/M was the first high level programming language for microprocessors. It was created by Kildall in 1972.
From the Gary Kildall chapter of the earlier referenced They Made America book:
“Intel was abuzz in 1973 with the triumph of the Intel 8008 chip, which doubled the power of its first microprocessor, and Kildall was drawn to spend more and more time there. After his “eyeballs gave way”, he would spend the night sleeping in his Volkswagen van in the parking lot. He became a trader in an electronic bazaar, swapping his software skills for Intel’s development hardware. One morning, he knocked on the door of Hank Smith, the manager of Intel’s little software group, and told him he could make a compiler for the Intel 8008 microprocessor, so that his customers would not need to go through the drag of low-level assembly language. Smith did not know what Kildall meant.
Kildall showed how a compiler would enable an 8008 user to write the simple equation x = y + z instead of several lines of low-level assembly language. The manager called a customer he was courting, put the phone down and, with a big smile, uttered three words of great significance for the development of the Personal Computer: “Go for it!”
The new programming language, which Kildall called PL/M, or Programming Language for Microcomputers, was immensely fruitful. Intel adopted it, and Kildall used it to write his own microprocessor applications, such as Operating Systems and utility programs.
It was the instrument for developing the PL/I-80 compiler that he worked on with Dan Davis for three years. “Gary was very visual”, Davis told me. “He would design things more or less graphically, and then transfer his design into code. He even had an aesthetic about his drawings. He was very thorough, patient and persistent in ensuring his solutions were not only correct, but elegant.” Kildall’s reward was Intel’s small new computer system, the Intellec-8.”
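As a toy illustration of what Kildall’s compiler pitch meant, the following Python sketch turns a statement like x = y + z into a sequence of assembly-like steps. The mnemonics are invented for illustration and are not real Intel 8008 instructions.

```python
# A toy "compiler" for statements of the form  dest = a + b,
# emitting pseudo-assembly. The mnemonics are invented, not real
# 8008 instructions; this only illustrates the idea of a compiler.

def compile_assignment(stmt):
    """Turn 'dest = a + b' into a list of pseudo-assembly instructions."""
    dest, expr = [s.strip() for s in stmt.split("=")]
    a, b = [s.strip() for s in expr.split("+")]
    return [
        f"LOAD  A, {a}",    # fetch the first operand into the accumulator
        f"ADD   A, {b}",    # add the second operand
        f"STORE {dest}, A"  # write the result back to memory
    ]

print("\n".join(compile_assignment("x = y + z")))
```

The point of the demonstration to Hank Smith was exactly this: the programmer writes one readable line, and the compiler produces the several low-level instructions the chip actually executes.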
Closing Comment:
This author knew Gary Kildall from 1973 to 1976 from his consulting work at National Semiconductor and the (1975 and 76) Asilomar MicroComputer Workshops. Contrary to accounts of his being arrogant or aloof, he was just the opposite: gracious, polite, considerate and well behaved. He did not come across as a hippie, but rather as a trusting academic. At the 1975 Asilomar workshop, Kildall acknowledged and praised my work developing and demonstrating real-time microprocessor applications at National Semiconductor. He said I was doing great work there. I felt very proud of myself that day and thanked Gary for his kind words. IMHO, Kildall was a gentleman and a scholar, but evidently not a shrewd, cut-throat businessman.
Primary References:

The IEEE Global History Network webpage for this milestone has a complete summary with links and videos: Milestones: The CP/M Microcomputer Operating System, 1974

IEEE Milestone Plaque citation summarizing the achievement and its significance:

“Dr. Gary A. Kildall demonstrated the first working prototype of CP/M (Control Program for Microcomputers) in Pacific Grove in 1974. Together with his invention of the BIOS (Basic Input Output System), Kildall’s operating system allowed a microprocessor-based computer to communicate with a disk drive storage unit and provided an important foundation for the personal computer revolution.”

Additional References:
  • CP/M and Digital Research Inc. (DRI) Web pages
  • Gary Kildall Special (Video)
  • A Short History of CP/M
  • Gordon Eubanks Oral History (Computerworld 2000)
Supporting Materials:
Numerous original documents, images, personal reminiscences, and videos contributed by employees are posted on the Digital Research Inc. page of the IT Corporate Histories Collection website hosted by the Computer History Museum at:


Why Care Who Invented the First Computer?

Tuesday, January 21st, 2014

During January some of you might have noticed a running dialogue among historians and other interested parties about who invented the “first” computer. There was no agreement reached on the correct answer to that question. Discussions about “firsts” pop up about every five years, almost like short-lived brush fires on the side of the road as historians travel on to do their serious work. Over the past forty years I have seen articles and books, even one lawsuit on patents, over the question of “firsts.”

The interest in “firsts” is not limited just to computers; it seems to be all over the place regarding any modern technology, and it is not limited to historians, but includes those interested in patent protections, copyrights, or just plain old curiosity. It seems for many reasons “firsts” are interesting and important. In the movies it is the “aha” moment that we see when Madame Curie discovered radioactivity, or earlier when Alexander Graham Bell made his first phone call. In recent years there have been discussions about the first PC, with suggestions that it may have been made in Eastern Europe, or that some guy did it in California in the early 1960s and didn’t tell anyone. Failure to exploit technological innovations is another source of conversation, with Xerox PARC the perennial favorite for letting too many cool widgets slip by.

Many of these discussions, indeed most of them, are only interesting, not terribly relevant. While it would be great if we could identify the first person who made the first tooth brush and nail down the date of that glorious event, it is never going to happen. We cannot get agreement on when the first computer was built, although this year Colossus is ahead in the race; past contenders for that recognition have included MIT’s differential analyzer, ENIAC, EDVAC, and UNIVAC, each a champion in its own time.

The reason we cannot agree on which was the first computer, or even the first e-mail sent, or the first PC ever built is that computing and all other inventions normally emerge in an evolutionary manner. People take an existing device and modify it slightly; then someone else does the same thing to that modified device, and so forth, until such a point that (a) someone gives it a name and it takes off as the “first” of something or (b) when connected to something else it becomes so useful that it now comes out of the shadows as if a brand new item. There were PCs in the late 1960s and early 1970s, but not until Apple and IBM were able to sell desktop computers in quantity did the notion of the PC finally take hold, with Apple doing its magic in the late 1970s, IBM in the early 1980s. Those are examples of the first type of innovation. That may be going on today with the humble light bulb as it evolves into something other than an incandescent bulb.

As for (b), linking things together: connecting computers to telephone lines made possible online services and telecommunications. Did that happen in 1940? In 1969? Or later, with the development of the World Wide Web? The answer is yes, each time. So, what are we to make of all this?

The takeaway is simple. All IT is really quite complex to invent, to make, and to use, so much so that when you study specific examples in any detail, you realize no one individual could create all the elements involved. It was and remains an iterative process. Historians who have examined IT’s history, such as Paul Ceruzzi, Tom Haigh, and Jeff Yost, to mention a few, tell that story continuously. They keep finding that “firsts” turn out to be “combinations,” “iterations,” “evolutions,” not one-time spectacular events. And that is a core lesson about how computers came about and why the technology continues to evolve. Literally millions of people in over 100 countries are tinkering with hardware, software, components, computer science, uses, and ideas about information management. Bottom line, it is nearly impossible to be the “first” to invent anything as complex as a computer. Even developers of cell phone “apps” are building on each other’s prior work.

But, it still is fun to argue who invented the first computer or the first toothbrush.

Old Software and Games….They’re Alive!

Friday, January 10th, 2014

Ever get the urge to mess with VisiCalc or WordStar again? Play the original Donkey Kong or Adventure on your computer? Now you can!

The Internet Archive, in a Christmas gift to the world, has unleashed the Historical Software Archive, a collection of prominent and historically notable pieces of software that you can run in your browser. They range from pioneering applications to obscure forgotten utilities, and from peak-of-perfection designs to industry-crashing classics. And if you get the urge to play the videogames you grew up with from Coleco, Atari, and Magnavox (maker of the Odyssey), you can head to the Console Living Room, a collection of console video games from the 1970s and 1980s.

These come by way of JSMESS, a JavaScript port of the MESS emulator, a computer and console emulator that has been in development for over a decade and a half by hundreds of volunteers. The MESS emulator runs on a large variety of platforms, but is now able to run embedded in most modern browsers, including Firefox, Chrome, Safari and Internet Explorer.

Stop wasting time – or more likely, start wasting time – and see what the Internet Archive has brought us.

More about the Historical Software Archive

More about the Console Living Room

Computer Pioneer Alan Turing Pardoned by UK for “crime” he didn’t commit

Wednesday, December 25th, 2013

The United Kingdom has finally pardoned Alan Turing for a gay sex conviction which tarnished the brilliant career of the code breaker credited with helping win the war against Nazi Germany and laying the foundation for the computer age.

Turing’s contributions to science spanned from computer science to biology, but he’s perhaps best remembered as the architect of the effort to crack the Enigma code, the cipher used by Nazi Germany to secure its military communications. Turing’s groundbreaking work, combined with the effort of cryptanalysts at Bletchley Park north of London and the capture of several Nazi code books, gave the Allies the edge across half the globe, helping them defeat the Italians in the Mediterranean, beat back the Germans in Africa and escape enemy submarines in the Atlantic.

“It could be argued and it has been argued that he shortened the war, and that possibly without him the Allies might not have won the war,” said David Leavitt, the author of a book on Turing’s life and work. “That’s highly speculative, but I don’t think his contribution can be underestimated. It was immense.”

Turing also pioneered the field of computer science, theorizing the existence of a “universal machine” that could be programmed to carry out different tasks, years before the creation of the world’s first fully functional electronic computers. Turing’s ideas matured into a fascination with artificial intelligence and the notion that machines would someday challenge the minds of man. When the war ended, Turing went to work programming the world’s early computers, drawing up, among other things, one of the first computer chess games.


Personal Perspective:

When this author took his first computer science class in the Fall of 1966, the instructor described “the Turing machine”: a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. It was invented in 1936 by Alan Turing and is used to help computer scientists understand the limits of mechanical computation. In particular, the Turing machine gave rise to the concepts of “algorithm” and “computation” within the model of a general-purpose computer (stored program machine).
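As a sketch of that 1936 model, here is a minimal Turing-machine simulator in Python: a tape of symbols, a head, and a table of rules mapping (state, symbol) to (symbol to write, move, next state). The rule-table format and names are invented for illustration; Turing’s paper specified the machines differently.

```python
# A minimal Turing-machine simulator: a tape, a head, and a rule table
# (state, symbol) -> (symbol to write, head move, next state).
# Blank cells read as "_". Format and names invented for illustration.

def run_turing_machine(rules, tape, state="start", accept="halt", max_steps=1000):
    tape = dict(enumerate(tape))       # sparse tape: index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example rule table: invert every bit of a binary string, halt on blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(invert, "1011"))  # -> 0100
```

Even this toy version shows the key idea behind the limits-of-computation results: the whole “machine” is just data (a rule table), so one machine can simulate another.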

Turing is widely considered to be the father of computer science and artificial intelligence.  His pardon was long overdue!


Testimonials to Doug Engelbart: Dec 9, 2013 @CHM

Sunday, December 15th, 2013

Computer visionary Doug Engelbart was posthumously honored on December 9th at the Computer History Museum (CHM) in Mountain View, CA. The date of this event was significant, because December 9 was the 45th Anniversary of the “Mother of All Demos.^” Doug’s wife, daughter, and several people who worked with Doug or knew of his work made brief speeches to honor him. The speakers included: Bill English, Chief Engineer at SRI, who built the 1st mouse based on Engelbart’s notes; Stewart Brand, President of the Long Now Foundation and Whole Earth Catalog publisher; futurist Paul Saffo; Guerrino De Luca, Chairman of Logitech; Curtis Carlson, CEO of SRI; Adam Cheyer, Co-founder of Siri; and Elizabeth “Jake” Feinler, former Director at SRI.

Funding for this event was provided by SRI International and Logitech.


^ “The Mother of All Demos” is a name given retrospectively to Douglas Engelbart’s December 9, 1968, computer demonstration at the Fall Joint Computer Conference in San Francisco. The live demonstration featured the introduction of a complete computer hardware and software system called the oN-Line System or, more commonly, NLS. The 90-minute presentation essentially demonstrated almost all the fundamental elements of modern personal computing: multiple windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a collaborative real-time editor (collaborative work). Engelbart’s presentation was the first to publicly demonstrate all these elements in a single system. The demonstration was highly influential and spawned similar projects at Xerox PARC in the early 1970s. The underlying technologies influenced both the Apple Macintosh and Microsoft Windows graphical user interface operating systems in the 1980s and 1990s.


Marc Weber, Web historian and founding curator of the CHM’s Internet History Program, said: “When it comes to the kind of knowledge navigation and collaboration tools that were the heart of Engelbart’s system, we’ve climbed only the first rung of the ladder. The same is true when it comes to the daunting goal that drove him to build all of his technology — to augment human intellect so that we might better address the world’s big problems.”


It was said that Engelbart’s great insight was making computers interactive and easier to use. That in turn would result in a “flowering of humanity.” He foresaw computer technologies as augmenting people’s abilities and intellect rather than replacing them. Many speakers acclaimed Doug as a visionary, but did not articulate his accomplishments or what his vision actually was. For that reason, the talks didn’t live up to the expectations of many in the audience, including this author. The SRI Augmentation Research Center was frequently mentioned by speakers, but its mission, funding, and accomplishments were not described.


From Wikipedia:

Under Engelbart’s guidance, the Augmentation Research Center developed, with funding primarily from DARPA, the NLS to demonstrate numerous technologies, most of which are in modern widespread use; these included the computer mouse, bitmapped screens, and hypertext, all of which were displayed at The Mother of All Demos in 1968. The lab was transferred from SRI to Tymshare in the late 1970s, which was acquired by McDonnell Douglas in 1984, and NLS was renamed Augment. At both Tymshare and McDonnell Douglas, Engelbart was limited by a lack of interest in his ideas and funding to pursue them, and retired in 1986.


John Markoff wrote in the December 16, 2013  NY Times:
“During the 1960s in Menlo Park, Calif., at the Stanford Research Institute, Dr. Engelbart created a research group to design what he described as the oN Line System, or N.L.S. It was intended to augment small groups of knowledge workers. Along the way, he pioneered computer interfaces by inventing the computer mouse, hypertext and many of the other components of modern computing.”


Comments on the event:

Here’s an anonymous comment, received via email, from someone who knew and worked with Engelbart and flew in from out of state to attend this event:

“I enjoyed being at the Engelbart event, but was disappointed that some of the speakers didn’t look towards the future and speculate what might lie ahead based upon Doug’s work. I would have also liked to hear from some people who worked on the social side of what Doug had envisioned.”

CHM Chairman of the Board Len Shustek wrote in an email: “I thought the evening was terrific. It really gave a sense of the man, and of the disappointment that he couldn’t make more progress on his agenda.”


Doug Engelbart was elected as a CHM Fellow in 2005: “For advancing the study of human-computer interaction, developing the mouse input device, and for the application of computers to improving organizational efficiency.”

Doug’s bio is at:,Engelbart/

Guide to the SRI ARC/NIC records is at: