LEO, more formally known as the Lyons Electronic Office, was the world’s first business computer, developed by the British company J. Lyons & Co. Ltd. between 1947 and 1954. John Simmons was the genius behind this adventure into business process re-engineering. His papers are archived at the Modern Records Centre at the University of Warwick in Coventry, England, and many of them contain valuable information about the ideas behind the LEO development. The Centre has recently selected some of the LEO-related papers and set them up as a separate digitized archive, which includes a short introduction to the LEO story by Frank Land. These can be accessed at:
The event took the form of an interview/conversation led by CHM CEO and moderator John Hollar. Mr. Rattner spent much of the time discussing his early life at Hollywood High School, how he got involved in electronics, and his years as an EE student at Cornell University; it was less about Intel’s early history. However, here are a few quick takes on Intel from Mr. Rattner:
On Gordon Moore: “His ability to pick the technology that was going to win in the end. Gordon managed all the twists and turns of semiconductor technology, from P channel to N channel to CMOS.” In the mid-1970s, Moore allocated 7% of Intel’s R&D budget to CAD, recognizing that LSIs would be too complicated to design using a pocket calculator.
On Bob Noyce: “Very approachable; couldn’t spend more than 5 minutes with him without getting excited about whatever he was thinking about.”
On High Performance Computing: “The early supercomputers were experimental parallel machines. We barely knew how to program them. There was a constant sense of discovery of what was going on. We were immersed in a research environment, even though our original mission was to provide tools for researchers.” Rattner founded the computer server labs at Intel.
“We first had to convince people with money that lashing up a bunch of microprocessors was THE path to high performance computing.” Rattner was the only one who spoke of the “attack of the killer micros” at a conference in upstate New York. “The tide turned in 1991 with the big Delta machines, which beat the Cray Y-MP supercomputer in a performance test. A microprocessor machine never lost its title as fastest supercomputer, with the possible exception of a Japanese-built machine,” he said.
“We had to tackle all manner of fundamental arithmetic, algorithms, scheduling, communications, etc. There wasn’t much off-the-shelf knowledge available at that time. So Intel put together a first-rate team of numerical analysts, computational scientists and algorithm experts who worked closely with customers like Argonne National Labs.”
“Intel had a 5 to 7 year view and could practically tell you what the performance would be.” In the early 1990s, the big parallel machines eclipsed vector machines in performance. However, there was criticism that the parallel machines were too hard to program and that the algorithms themselves weren’t that efficient (compared to those running on vector machines). It was difficult to get Federal government funding for supercomputer research, he said.
On Moore’s Law and how it’s going to play out: If it is independent of the underlying technology, it will last for many decades into the future. If it pertains to a specific technology, it has already ended, e.g. Intel’s silicon gate semiconductor process, which was replaced by several others over the last few decades. CMOS is pretty good at efficiently pushing charge around, but Intel is now looking at “non-charge effect” silicon devices and other alternatives that exploit quantum effects; Rattner thinks the opportunity is there for many new device architectures based on them. Will we still call it Moore’s Law in 2020, when transistors don’t operate on the same fundamental physics as they do today? Molecular and nanotube transistors are examples of future material-science structures that will be manufactured in many different ways. Silicon will still play a role as a substrate, but other materials may be overlaid on top of it. You will probably see new semiconductor technologies in memory devices, such as flash memory replacements, he said.
Intel Labs has been working on Digital Radio for quite some time. Rattner said analog radios don’t scale, i.e. Moore’s Law doesn’t apply to them. Digital Radio returns to the fundamental mathematics of communications (i.e. information theory). Intel is treating Digital Radio, both reception and transmission, as a “computational problem” to be solved with high-end microprocessors. Sadly, Rattner didn’t give a progress report on Intel’s research in this area, which is still not a commercial technology or product (see Reference below from 2009).
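To make the “radio as a computational problem” idea concrete, here is a minimal sketch (not Intel’s design, and no real radio hardware is involved): once a carrier has been sampled, demodulation reduces to arithmetic on the samples, namely multiplying by quadrature (I/Q) reference tones and low-pass filtering. All rates and amplitudes below are made-up illustrative values.

```python
import math

FS = 1_000_000   # sample rate in Hz (hypothetical)
FC = 100_000     # carrier frequency in Hz (hypothetical)
AMP = 0.75       # carrier amplitude we want to recover digitally

# Sampled received signal: an unmodulated carrier of amplitude AMP
N = 1000
samples = [AMP * math.cos(2 * math.pi * FC * t / FS) for t in range(N)]

# Digital downconversion: multiply by quadrature reference tones
i = [s * math.cos(2 * math.pi * FC * t / FS) for t, s in enumerate(samples)]
q = [s * math.sin(2 * math.pi * FC * t / FS) for t, s in enumerate(samples)]

# Crude low-pass filter: average over the block, leaving the DC term
i_avg = sum(i) / N
q_avg = sum(q) / N

# The product terms leave AMP/2 in the I channel, so the recovered
# amplitude is 2 * sqrt(I^2 + Q^2) ~= AMP
recovered = 2 * math.sqrt(i_avg**2 + q_avg**2)
```

The point of the sketch is that every step after sampling is ordinary number-crunching, which is why such a receiver scales with processor performance in a way an analog front end does not.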
Photography via image processors is another interesting research area. “Bio” (biology?) is also on Rattner’s list of future digital research projects. Image processors in smart phones are getting more powerful with each successive generation, he said.
Unlike conventional video transmitted over 3G/4G wireless networks, which operate “open loop,” video-aware wireless networks would operate “closed loop,” providing a better user experience. The basic concept is for the receiver to monitor and measure image quality in real time and send status information back to the transmitter over the communications channel, so that at each stage of the channel the system can select the proper video resolution and a higher or lower frame rate.
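The closed-loop concept can be sketched in a few lines of Python. This is purely illustrative, not any real Intel protocol: the resolution/frame-rate ladder, the quality thresholds, and the `adapt` function are all hypothetical names and values chosen to show the feedback loop.

```python
# Hypothetical (width, frames/sec) rungs the sender can switch between
LADDER = [(640, 15), (1280, 30), (1920, 60)]

def adapt(rung: int, reported_quality: float) -> int:
    """Pick the next ladder rung from receiver-reported quality in [0, 1]."""
    if reported_quality < 0.5 and rung > 0:
        return rung - 1            # channel degraded: drop resolution/rate
    if reported_quality > 0.9 and rung < len(LADDER) - 1:
        return rung + 1            # channel healthy: step up
    return rung                    # otherwise hold steady

# Simulated quality reports fed back from the receiver in real time
rung = 1
for quality in (0.95, 0.95, 0.4, 0.7):
    rung = adapt(rung, quality)
print(LADDER[rung])  # -> (1280, 30): stepped up, backed off, then held
```

An open-loop sender would have no `reported_quality` input at all; the feedback path is exactly what makes the network “video aware.”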
Rattner also described Intel’s work with Stephen Hawking on a new user interface to enable him to communicate better. Hawking is paralyzed due to a degenerative disease called amyotrophic lateral sclerosis (ALS). He uses small muscle twitches in his face to select words on a custom computer system so he can communicate. Sadly, his condition has progressed to the point where he can only manage roughly one word per minute. After meeting with Hawking himself, Rattner is spearheading a project to improve Hawking’s computer system and allow for an increase in words per minute.
Note: CHM CEO/moderator John Hollar emailed this author, stating that Intel’s work with Hawking was probably the most interesting part of his conversation with Rattner.
Event Video at http://www.youtube.com/watch?v=C8HbjTACgp0&feature=c4-overview&list=UUHDr4RtxwA1KqKGwxgdK4Vg
Could it really be true: All Digital Radios from Intel?
In November 1943, Tommy Flowers, an electrical engineer working in the telecommunications department of Britain’s General Post Office, designed and built “Colossus,” the world’s first programmable electronic computer. The thermionic tube-based machine successfully broke the supposedly unbreakable Lorenz cipher used by Hitler and the German High Command during the Second World War. Afterward, Flowers had a long, successful career that included the development of the first all-electronic telephone exchange.
To honor Flowers, a memorial bust created by sculptor James Butler MBE will be unveiled in December at Adastral Park, BT’s global research and development headquarters at Martlesham, Suffolk, England. In addition, the Tommy Flowers Computing Science Scholarship, in association with BT, is offering academic mentoring, financial support and professional experience to students starting Year 12, and two Tommy Flowers Awards for Commitment to Computing have been launched by BT to celebrate inspirational teaching of Computer Science at Key Stages 2 and 3.
- Conceiving separate read/write lines for Intel’s 1103 1K DRAM, in lieu of the required 1.5 volt signal to restore the memory cells that had just been read from or written to the chip. The success of the 1103 established Intel as a brand-name start-up, provided revenue to fund improvements in its semiconductor process, and generated cash to pay bills and salaries.
- Generation of a computer model/iterative simulator to help refine Intel’s semiconductor process.
- Creation and specification of design tools and a development system for the 4004 microprocessor chip set.
- Design and development of the first LSI codec/filter that could be used for any time slot within the DS1 (T1) or E1 lines used to interconnect telco central offices.
- Herculean efforts to get the 1103 DRAM to work in the field. Dave visited all of Intel’s computer customers to gather feedback that enabled Intel to correct deficiencies in the part so that it would work in a memory system.
- Establishing Intel’s “marketing machine” from 1976 to 1978: seminars, a systems approach to microprocessors, FAEs (field application engineers), etc.
- Telling Andy Grove that marketing was important and that Intel’s microprocessor chips would not sell themselves without a decent marketing budget.
- Very skillfully running Intel’s Microprocessor Division as its GM for 13 years.
- Leading the team that convinced IBM to design in the 8088, with 8-bit peripherals and I/O bus, rather than microprocessor chip sets from Motorola or Zilog (which had better microprocessor architectures).
- Coining the term “Intel Inside,” and, more importantly, enticing IBM and other computer customers to advertise that Intel was inside their PCs by writing them a check to do so.
- Getting Intel to make microprocessors its main business, displacing semiconductor memories, in 1983. Incredibly, Dave revealed that throughout the 1970s Intel made more money on microprocessor development systems and in-circuit emulators than it did on microprocessor chip set sales.
Group photo taken after close of panel session: Ted, Dave and Alan (from left to right)
Alan Weissberger introduces panelists and Silicon Valley legends Ted Hoff and Dave House. As Weissberger points out, Hoff and House were instrumental in building the foundation in the 1970s and 1980s for Intel to become an enduring brand and a symbol of Silicon Valley. In the videos that follow, Hoff and House share their inside stories on Intel, the process of invention that led to the microprocessor, and the strategy that turned Intel from an invisible component maker into a consumer-demanded brand.
Ted Hoff, PhD EE Stanford University