This history of computers was written as background for a talk on the future of computers at the Carlyle Gardens Computer Club. When I started working out the structure of my talk on what the history of computers tells us about future computers, I had far too much detail in mind. What I needed was to pinpoint the really significant events, and place them in context. Alas, that task is not done well on this page.
The history of computers until 1949 is mostly about mathematical advances, and better ideas.
Thank goodness we do not use Roman numerals for arithmetic. Roman numerals were replaced by the ten Arabic numerals and the decimal system starting in the 9th Century. Only remnants of Roman numerals remain, on old clock faces.
Computers today work with a binary system, consisting of the numerals zero and one. This works just like decimal numbers, if you are missing nine fingers. Binary numbers were known in India and China thousands of years ago. An 11th Century commentary on the I Ching, or Book of Changes, made much of the Yin Yang nature of the world. In Europe, Gottfried Leibniz wrote On the Art of Combinations in 1666, but binary was ignored, even in his own early mechanical calculator. Computers work well with binary, because electrical switches are best designed to be either on or off, a binary choice.
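As a quick illustration (my own, in Python, not anything from the original talk), binary and decimal are just two ways of writing the same numbers:

    # Binary place values work like decimal ones, but use powers of 2
    # instead of powers of 10. A few values in both systems:
    for n in [5, 42, 255]:
        print(f"{n} in decimal is {n:b} in binary")

    # 101 in binary = 1*4 + 0*2 + 1*1 = 5 in decimal
    assert int("101", 2) == 1*4 + 0*2 + 1*1 == 5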
In 1847, George Boole developed Boolean algebra. This proved fundamental to the AND, OR, XOR and NOT logic gates in computers.
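To make those gates concrete, here is a small Python sketch of my own that prints their truth tables:

    # The four gates mentioned above, written as Python functions.
    # Each input is 0 or 1, a binary choice.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def XOR(a, b): return a ^ b
    def NOT(a):    return 1 - a

    # Print a truth table for every combination of two inputs.
    print("a b | AND OR XOR NOT(a)")
    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} {b} |  {AND(a, b)}   {OR(a, b)}   {XOR(a, b)}    {NOT(a)}")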
Claude Shannon's 1937 MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits, was the first work to apply Boolean algebra and binary numbers to the design of machinery.
Sequential logic was another advance: outputs depend not just on the current inputs but also on the history of the inputs, which is what allows a computer to remember. A clock signal keeps these circuits in step, much as a metronome keeps a music student in time.
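Here is a minimal Python sketch of my own of the idea, modelling a clocked D flip-flop, the classic one-bit memory:

    # A sketch of sequential logic: a clocked D flip-flop. Unlike the
    # gates above, its output depends on past inputs: the stored bit
    # changes only on a clock tick, so the circuit "remembers".
    class DFlipFlop:
        def __init__(self):
            self.q = 0  # the remembered bit

        def tick(self, d):
            # On each clock tick, capture input d; q holds it until the next tick.
            self.q = d
            return self.q

    ff = DFlipFlop()
    for d in [1, 1, 0, 1]:
        print(f"input {d} -> stored {ff.tick(d)}")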
There were also thought experiments by Alan Turing and John von Neumann. World War II sped the development of electromechanical calculators, and led directly to the use of valves (vacuum tubes) as computer switches.
Numerous mechanical devices contributed to the development of early computers. An important one was Joseph Jacquard's early 19th Century invention of punched cards for looms, to weave patterned silk automatically. This led to Herman Hollerith's punched cards in the 1890 US census, and to accounting and totalising machines from companies that eventually became IBM.
In the mid 19th Century, Charles Babbage designed a mechanical difference engine for calculating logarithm tables. Babbage spent the rest of his life working on (and changing) a design for a mechanical analytical engine that included the fundamentals of modern computers, even down to plans to use punched cards for programs.
The first person to design programs for the Babbage engine was Augusta Ada Byron, Countess of Lovelace, the only legitimate child of the poet Lord Byron. In 1843, she considered how to program the analytical engine, and it was probably Ada who suggested using the Jacquard punched cards. Her name is honoured in the Ada programming language.
ENIAC in 1946 was the first computer to use valves (and, with later modifications by John von Neumann, a stored program). It had 17,468 valves and half a million hand-soldered connections.
Designed in 1943, ENIAC was obsolete before it was completed (some things do not change). It was 24 metres long, occupied 63 square metres of floor space, and weighed around 30 tonnes. That was a whole building. It also drew 150 kW of electricity. It operated at a peak of 5,000 operations per second. Various tricks dropped the failure rate to one tube failure in two days, with a record time between failures of 116 hours in 1954.
ENIAC calculated with decimal numbers, not binary numbers. It used patch cords for programming, and decade switches to input numbers, not punched cards. Just shows you should never buy the first model of a computer.
Those patch cord problems were solved by punched cards, and all later mainframe computers used binary arithmetic. Here is a neat photo gallery of early computers.
The transistor was invented in 1947, and was destined to replace power hungry valves. Popular Mechanics predicted that computers in the future might weigh no more than 1.5 tons. By 1953, there were 100 computers in the world.
Jack Kilby at Texas Instruments invented the integrated circuit (IC) in 1958; it was also independently invented by Robert Noyce. The integrated circuit combined multiple transistors and other components on a single chip, and became the foundation of modern digital circuits. The history of mainframe computers shows continuous development in speed and capacity ever since.
It did not take long for computers to start wasting time: in 1962, MIT student Steve Russell wrote the first computer game, Spacewar! IBM produced their very successful System/360 range between 1964 and 1978. Seymour Cray at Control Data produced the first supercomputer, the CDC 6600, in 1964.
In 1969, Data General shipped its low cost Nova minicomputer at $8,000 each; 50,000 were eventually sold. It was a 16 bit computer, and used large scale integrated circuits to fit the entire central processor on a single 38 cm circuit board. Computers were now desk sized, even filing cabinet sized, not room sized.
In 1965, Gordon Moore suggested Moore's Law: that processor complexity doubled every year. A decade later, he revised Moore's Law to a doubling every two years. So far this observation remains true.
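To see how fast that doubling compounds, here is a rough Python sketch of my own; the 2,300 transistor starting point is the Intel 4004, described below:

    # Moore's Law as arithmetic: doubling every two years is a factor
    # of 2 ** (years / 2). Starting from the 4004's 2,300 transistors
    # in 1971, the prediction runs roughly:
    transistors = 2300
    for year in range(1971, 2012, 10):
        print(f"{year}: about {transistors:,.0f} transistors")
        transistors *= 2 ** (10 / 2)  # ten years = five doublings = 32x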
Intel was founded by Robert Noyce, Gordon Moore and others in 1968. Their original market was memory chips, not computer processors. Intel had been contracted by a company wanting to build a calculator, and Intel's designer decided to make a general purpose computer processor integrated circuit, rather than a specific set of chips for the calculator. The accidental result was the first commercial microprocessor, the Intel 4004, designed by Federico Faggin and released on 15 November 1971. It was 4 bit, had 2,300 transistors, and handled 60,000 instructions a second at a clock speed of 740 kHz. This put the entire central processor of a computer into a single integrated circuit.
As an aside, I mention that a secret military microprocessor beat Intel.
Intel's next accidental microprocessor, the 8 bit 8008, was released on 1 April 1972. Again, it was built to fill a specific contract. Two years later, Intel released the 8080, the first microprocessor that was easy to build a complete personal computer around. It had a 2 MHz clock. It was also expensive, so competition started: Motorola introduced the simpler 6800, and MOS Technology the much cheaper 6502. All these chips were 8 bit.
Intel introduced the 16 bit 8086 in June 1978. It had 29,000 transistors. Intel introduced the 134,000 transistor 80286 processor in February 1982. Intel and AMD both released 1 GHz processors in 2000. The Intel Pentium 4, released in 2000, had a very fast clock speed, but suffered performance issues with older programs. The planned clock increases to 10 GHz never happened, with clock speeds peaking around 3.8 GHz. Increased heat generated in the processors essentially stopped further clock speed development by around 2005.
Although Intel's great rival AMD briefly held sales parity and had superior chips in the mid to late 1990s and early 2000s, this is no longer the case. AMD has lost serious money during the past three years; their only profitable quarter came when they won a billion dollar lawsuit against Intel. A sales strategy of hoping your better funded rival will make a mistake is not viable in the long term.
The first personal computers were built around 1972 using integrated circuits, and most were unusable. For example, the Kenbak had front panel switches and lights. No keyboard, no display output. Personal computers needed the microprocessor and peripherals.
The first commercial personal computer kits based on microprocessors appeared late in 1974. MITS (Micro Instrumentation and Telemetry Systems) of Albuquerque, New Mexico, announced the MITS Altair 8800 on the cover of the 1st January 1975 issue of Popular Electronics. The first MITS computer kit had 256 bytes of memory (yes, bytes, not kilobytes). 10,000 kits were sold at US$397.
Paul Allen read the Popular Electronics article and, with Bill Gates, started adapting the BASIC computer language to run on the MITS Altair, using a simulator they wrote for a DEC PDP-10 minicomputer. Early in 1975, Gates and Allen completed BASIC and took it to MITS, and Paul Allen joined MITS as Director of Software. Gates followed him later that year, forming an informal partnership called Micro-soft, complete with hyphen.
In 1975 and 1976, many brands of personal computer became available, and several (Commodore, Tandy) became popular over the following years. These mostly had a keyboard, an audio cassette interface for loading programs, and used a TV or the new green screen monitors for a display. Some expensive models had eight inch or five and a quarter inch floppy disks, but most did not. Most used Microsoft BASIC as a programming language.
Almost all the numerous computer companies of the 1970s were killed off by the IBM Personal Computer.
IBM started working on the IBM PC in 1979. They asked Bill Gates about an operating system; he sold them BASIC, and suggested they consider CP/M as an operating system. IBM tried to contact Gary Kildall, the former Intel employee who was the author of CP/M, but Kildall was off flying. IBM returned to Bill Gates, and again asked him about an operating system. This time Gates agreed to provide one. Microsoft bought the rights to use QDOS, also known as 86-DOS, as the basis for PC-DOS. Microsoft's MS-DOS was one of several operating systems the IBM PC could use (including a version of CP/M). The IBM PC was released in August 1981. It had a cassette port, just like all the rest. At this point, there was only one directory per drive (sub-directories appeared in March 1983).
Because IBM did not have an exclusive right to use MS-DOS, many companies built IBM clone computers, able to run Microsoft's MS-DOS. By 1994, 100 million computers used MS-DOS. Computers compatible with the IBM PC killed off almost every other personal computer design within a decade of its introduction.
With all the competition making cheap computers, IBM was eventually unable to make a decent profit on personal computers. IBM quit the personal computer business in late 2004, selling their business to Lenovo of China.
Apple was a latecomer, founded 1 April 1976. It was not until mid 1977 that the Apple II appeared, with an 8 bit 6502 processor at 1 MHz, and 4 KB of RAM. In July 1978, a floppy disk drive (finally) became available for the Apple II.
In January 1984, Apple released the Macintosh, with only 128 KB of memory, and a 512x342 pixel monochrome display. This was the first commercially successful computer with a graphical interface, windows, icons and a mouse. Critics asked why a new thing like a mouse was needed.
In mid 2005, Apple announced the Macintosh would move from IBM PowerPC chips to Intel chips in future. These are the same chips used in Windows computers. Various people pronounced the death of Apple. Instead, it grew to the second most valuable company in the USA, exceeding even Microsoft.
The Osborne 1 was the first true portable computer, despite being the size of a sewing machine. It appeared in 1981, and was designed to fit under an airline seat. All I can think is that airlines must have had large seats in those days. This picture of the components shows why it was so large, and why the display was a tiny five inches.
Luckily GRiD made a true portable the next year, with a plasma display, but it was expensive and not compatible with MS-DOS. Only NASA used the GRiD Compass. It took until 1986 for IBM to bring out a portable.
The Apple PowerBook of 1991 is interesting as it was the first portable computer to have a palm rest and a pointing device (a track ball). Most other portable computers changed their design to imitate it.
Basically, almost anything you can do with a desktop computer can also be done with a notebook computer. Plus, you can use a notebook computer anywhere. There are a few high power exceptions, mostly for game playing and video graphics.
Once a notebook computer gets as small as a keyboard, and has a really thin display, shrinking the design much further is a waste of time. Something different is probably on the horizon.
A typical Pentium 4 desktop computer from say 2004, running at 1.7 GHz, would use close to 110 watts at startup, around 60 watts when idle with no power management, and about 35 watts in full power saving, with the hard disk spun down and the machine in sleep mode. A faster desktop computer will be closer to 200-300 watts. This article tells how to find computer power consumption via Google. The monitor takes additional power, usually at least 50 watts, but 100 watts is also likely. You can only use a desktop at a desk, with a power connection.
You can fine-tune your Windows 7 power consumption using tips from Microsoft. Download the prototype Microsoft Joulemeter to calculate Windows 7 power use (caution, this is a research alpha product).
Laptops typically use around 20 watts. Read this user account of how notebook computers use less power to get an idea of the difference.
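To put rough numbers on that difference, here is a back-of-envelope Python sketch of my own; the eight hours a day of use and the 25 cent per kWh tariff are my assumptions for illustration, not figures from this page:

    # A back-of-envelope comparison using the wattages above. The usage
    # pattern (8 hours a day) and the electricity tariff (25 cents per
    # kWh) are assumptions for illustration only.
    HOURS_PER_DAY = 8
    PRICE_PER_KWH = 0.25  # assumed tariff, in dollars

    for name, watts in [("desktop plus monitor", 110 + 50), ("laptop", 20)]:
        kwh_per_year = watts * HOURS_PER_DAY * 365 / 1000
        cost = kwh_per_year * PRICE_PER_KWH
        print(f"{name}: {kwh_per_year:.0f} kWh a year, about ${cost:.0f}")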
Even the lightest current model Apple notebook computers run at least five hours on batteries. The two most popular (but heavier) MacBook and MacBook Pro models run nine to ten hours on batteries. New models automatically suspend, yet wake instantly, even after up to a month in storage.
In contrast, phones work for a day or more on battery power alone, as do music players.