History of computing hardware
Computing hardware has been an essential component of calculation and data storage since it became necessary for data to be processed and shared. The first recorded computing hardware was literally hard: the Phoenicians stored clay shapes representing such things as livestock and grain in containers, which were used not only by merchants but also by accountants and government officials of the time. Even today, an experienced abacus user working a design thousands of years old can complete basic calculations more quickly than the average person using a standard four-function hand calculator.
This narrative presents the major developments in the history of computing hardware and attempts to put them into perspective. For a detailed timeline of events, see the computing timeline. The history of computing article is a broader overview that also treats methods intended for pen and paper, with or without the aid of tables.
Earliest devices for facilitating human calculation
Humanity has used devices to aid in computation for millennia. One example is a device for establishing equality by weight: the classic scales, later used to symbolize equality in justice. Another is simple enumeration: the checkered cloths of the counting houses served as simple data structures for enumerating stacks of coins, by weight. A more arithmetic-oriented machine is the abacus. One of the earliest machines of this type was the Roman abacus.
First mechanical calculators
In 1623 Wilhelm Schickard built the first mechanical calculator and thus became the father of the computing era. Since his machine used techniques such as cogs and gears first developed for clocks, it was also called a 'calculating clock'. It was put to practical use by his friend Johannes Kepler, who revolutionized astronomy.
Gottfried Wilhelm Leibniz, who built a mechanical calculator of his own in the 1670s, also described the binary system, a central ingredient of all modern computers. However, up to the 1940s, many subsequent designs (including Charles Babbage's machines of the 1800s and even ENIAC of 1945) were based on the harder-to-implement decimal system.
John Napier noted that multiplication and division of numbers can be performed by addition and subtraction, respectively, of the logarithms of those numbers. Since real numbers can be represented as distances or intervals on a line, the simple translation, or sliding, of two lengths of wood suitably inscribed with linear or logarithmic intervals was used as the slide rule by generations of engineers and other mathematically inclined professionals, until the invention of the pocket calculator. Thus the engineers in the Apollo program to send a man to the moon made their calculations on slide rules, which were accurate to three or four significant figures.
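Napier's principle can be sketched in a few lines of Python (an illustrative model, not a historical artifact): a slide rule multiplies by adding logarithmic lengths and divides by subtracting them, and its readout is limited to three or four significant figures.

```python
import math

def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule does:
    add their logarithms, then take the antilogarithm."""
    return math.exp(math.log(a) + math.log(b))

def slide_rule_divide(a, b):
    """Divide two positive numbers by subtracting logarithms."""
    return math.exp(math.log(a) - math.log(b))

# A physical slide rule would read out only 3-4 significant figures:
print(round(slide_rule_multiply(3.14, 2.72), 2))  # 8.54
print(round(slide_rule_divide(10.0, 4.0), 2))     # 2.5
```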
Punched card technology 1801–
In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark point in programmability. Electricity was later applied to calculating and sorting machines.
In 1890 the United States Census Bureau used punch cards and sorting machines designed by Herman Hollerith to handle the flood of data from the decennial census mandated by the Constitution. Hollerith's company eventually became the core of IBM. IBM developed punch card technology into a powerful tool for business data processing and produced an extensive line of specialized unit record equipment. By 1950 the IBM card had become ubiquitous in industry and government. The warning printed on most cards, "Do not fold, staple or mutilate," became a motto for the post-World War II era.
Leslie Comrie's articles on punch card methods and W. J. Eckert's 1940 publication Punched Card Methods in Scientific Computation described techniques sufficiently advanced to solve differential equations and to perform multiplication and division using floating-point representations, all on punched cards and plugboards similar to those used by telephone operators. The Thomas J. Watson Astronomical Computing Bureau at Columbia University performed astronomical calculations representing the state of the art in computing.
In many computer installations, punched cards were used until (and even after) the end of the 1970s. For example, science and engineering students at many universities around the world would submit their programming assignments to the local computer center in the form of a stack of cards, one card per program line, and then had to wait for the program to be queued for processing, compiled, and executed. In due course a printout of any results, marked with the submitter's identification, would be placed in an output tray outside the computer center. In many cases these results would comprise solely a printout of error messages regarding program syntax, etc., necessitating another edit-compile-run cycle.
Punched cards are still used and manufactured in the current century, and their distinctive dimensions (and 80-column capacity) can still be recognised in forms, records, and programs around the world.
The defining feature of a "universal computer" is programmability, which allows the computer to emulate any other calculating machine by changing a stored sequence of instructions.
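This idea can be illustrated with a toy sketch (the instruction names are hypothetical, not those of any historical machine): a single interpreter loop behaves as a different calculating machine depending solely on the stored sequence of instructions it is given.

```python
def run(program, value):
    """A toy stored-program interpreter: the 'hardware' (this loop)
    never changes; only the stored instruction sequence does."""
    for op, operand in program:
        if op == "ADD":
            value += operand
        elif op == "MUL":
            value *= operand
        elif op == "HALT":
            break
    return value

# The same machine emulates two different calculators:
doubler   = [("MUL", 2), ("HALT", None)]
add_seven = [("ADD", 7), ("HALT", None)]

print(run(doubler, 21))    # 42
print(run(add_seven, 35))  # 42
```

Changing the stored sequence, rather than the hardware, changes what the machine computes; that is the essence of programmability.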
In 1835 Charles Babbage described his analytical engine. It was the design for a general-purpose programmable computer, employing punched cards for input and a steam engine for power. One crucial invention was the use of gears for the function served by the beads of an abacus. In a real sense, all computers contain automatic abaci (technically called the arithmetic logic unit, or ALU, and the floating-point unit).
While the plans were probably correct, or at least debuggable, the project was slowed by disputes with the artisan who built the parts, and it ended when government funding ceased.
Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by L. F. Menabrea. She has become closely associated with Babbage. Some claim she was the world's first computer programmer; however, this claim and the value of her other contributions are disputed by many.
A reconstruction of the Difference Engine II, an earlier, more limited design, has been operational since 1991 at the London Science Museum. With a few trivial changes, it works as Babbage designed it and shows that Babbage was right in theory.
The museum used computer-operated machine tools to construct the necessary parts. Authorities dispute whether the machine tools of Babbage's time could have manufactured parts of the required precision. Properly led and funded, the project might itself have advanced the machining techniques of the day. Many authorities believe Babbage failed because his designs were overly ambitious, because he had problems with labor relations, and because he was politically inept.
By the 1900s earlier mechanical calculators, cash registers, accounting machines, and so on had been redesigned to use electric motors, with gear position representing the state of a variable. "Computer" was a job title held by people who used calculators to evaluate expressions. During the Manhattan Project, future Nobel laureate Richard Feynman supervised a roomful of human computers, many of them women mathematicians, who understood the differential equations being solved for the war effort. Even the renowned Stanislaw Marcin Ulam was pressed into service, after the war, to translate the mathematics into computable approximations for the hydrogen bomb.
Analog computers, pre-1940
Before World War II, mechanical and electrical analog computers were considered the 'state of the art', and many thought they were the future of computing. Analog computers use continuously varying physical quantities, such as voltages, currents, or the rotational speed of shafts, to represent the quantities being processed. An ingenious example of such a machine was the water integrator built in 1936. Unlike modern digital computers, analog computers are not very flexible and must be reconfigured (i.e., reprogrammed) manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems while the earliest attempts at digital computers were quite limited. Since computer programs were not yet widely popular concepts in this era (despite Babbage's pioneering work), the solutions were often hard-coded into forms such as graphs and nomograms, which could represent analog solutions to problems such as the distribution of pressures and temperatures in a heating system. But as digital computers became faster and gained larger memories (e.g., RAM or internal store), they almost entirely displaced analog computers, and programming, or coding, arose as another human profession.
First generation of modern digital computers 1940s
The era of modern computing began with a flurry of development before and during World War II, as electronic circuits, relays, capacitors, and vacuum tubes replaced mechanical equivalents and digital calculations replaced analog calculations. The computers designed and constructed then have sometimes been called 'first generation' computers. First generation computers were usually built by hand using circuits containing relays or vacuum valves (tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium. Temporary, or working, storage was provided by acoustic delay lines (which use the propagation time of sound in a medium such as wire to store data) or by Williams tubes (which use the ability of a television picture tube to store and retrieve data). By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, and it dominated the field through the mid-1970s.
In 1936 Konrad Zuse started construction of the first of his Z-series machines, calculators featuring memory and (initially limited) programmability. Zuse's purely mechanical but already binary Z1, finished in 1938, never worked reliably due to problems with the precision of parts.
In 1937, Claude Shannon produced his master's thesis at MIT, which implemented Boolean algebra using relays and switches for the first time in history. Entitled A Symbolic Analysis of Relay and Switching Circuits, Shannon's thesis essentially founded practical digital circuit design.
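Shannon's central observation was that switches in series compute Boolean AND and switches in parallel compute OR. The following sketch (purely illustrative, in Python) models relay contacts as booleans and assembles a more complex circuit from those primitives.

```python
def series(a, b):
    # Two switches in series conduct only if both are closed: AND.
    return a and b

def parallel(a, b):
    # Two switches in parallel conduct if either is closed: OR.
    return a or b

def normally_closed(a):
    # A normally-closed relay contact inverts its control signal: NOT.
    return not a

def xor(a, b):
    # Exclusive-or assembled entirely from relay-style primitives.
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

# Truth table for the relay-built XOR:
for a in (False, True):
    for b in (False, True):
        print(a, b, xor(a, b))
```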
In November 1937, George Stibitz, then working at Bell Labs, completed a relay-based computer he dubbed the "Model K" (for "kitchen", where he had assembled it), which calculated using binary addition. Bell Labs subsequently initiated a full research program in April 1939 with Stibitz at the helm. Their Complex Number Calculator, completed January 8, 1940, could perform calculations on complex numbers. In a demonstration at the American Mathematical Society conference at Dartmouth College on September 11, 1940, Stibitz was able to send the Complex Number Calculator remote commands over telephone lines by teletype. It was the first computing machine ever used remotely over a phone line.
In 1938 John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed the Atanasoff Berry Computer (ABC), a special-purpose computer for solving systems of linear equations that employed capacitors fixed in a mechanically rotating drum for memory. The ABC machine was not programmable, though it was a computer in the modern sense in several other respects.
In 1939, development began at IBM's Endicott laboratories on the Harvard Mark I. Known officially as the Automatic Sequence Controlled Calculator, the Mark I was a general purpose electro-mechanical computer built with IBM financing and with assistance from some IBM personnel under the direction of Harvard mathematician Howard Aiken. Its design was influenced by the Analytical Engine. It was a decimal machine which used storage wheels and rotary switches in addition to electromagnetic relays. It was programmable by punched paper tape, and contained several calculators working in parallel. Later models contained several paper tape readers and the machine could switch between readers based on a condition. Nevertheless, this does not quite make the machine Turing-complete. The Mark I was moved to Harvard University to begin operation in May 1944.
Zuse's subsequent machine, the Z3, was finished in 1941. It was based on telephone relays and did work satisfactorily. The Z3 thus became the first functional program-controlled computer. In many ways it was quite similar to modern machines, pioneering numerous advances such as floating-point numbers. Replacement of the hard-to-implement decimal system (used in Charles Babbage's earlier design) by the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at the time. This is sometimes viewed as the main reason why Zuse succeeded where Babbage failed. Even so, most of today's general-purpose machines continue to have decimal adjustment instructions; decimal arithmetic is still essential for commercial and financial applications, and decimal floating-point hardware is being added to some new machines (binary continues to be used for addressing in almost all machines).
Programs were fed into the Z3 on punched film. Conditional jumps were missing, but since the 1990s theoretical purists have pointed out that the Z3 was still a universal computer (ignoring its physical storage limitations). In two 1937 patent applications, Konrad Zuse also anticipated that machine instructions could be stored in the same storage used for data, the key insight of what became known as the von Neumann architecture, first implemented in the later British EDSAC design (1949). Zuse also claimed to have designed the first higher-level programming language, Plankalkül, in 1945, although it was not formally published until 1971 and was implemented for the first time in 2000 by the University of Berlin, five years after Zuse died.
Zuse suffered dramatic setbacks and lost many years during World War II when Allied bombing destroyed his first machines. Apparently his work remained largely unknown to engineers in the UK and US until much later, although IBM at least was aware of it, financing his post-war startup company in 1946 in return for an option on Zuse's patents.
During World War II, the British made significant efforts at Bletchley Park to break German military communications. The main German cypher system (the Enigma, in several variants) was attacked with the help of electro-mechanical machines called bombes. A bombe ruled out possible Enigma keys by performing chains of logical deductions implemented electrically; most possibilities led to a contradiction, and the few remaining could be tested by hand.
The Germans also developed a series of cypher systems, quite different from the Enigma, which the British codenamed Fish cyphers; the best known was produced by the Lorenz machine. As part of an attack against these cyphers, Professor Max Newman and his colleagues (including Alan Turing) helped specify the Colossus. The Mk I Colossus was built in very short order by Tommy Flowers at the Post Office Research Station at Dollis Hill in London and then shipped to Bletchley Park.
Colossus was the first totally electronic computing device. It used only vacuum tubes and had no relays. It had paper-tape input and was capable of conditional branching. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill is said to have personally issued an order for their destruction into pieces no larger than a man's hand. Due to this secrecy the Colossi were not included in many histories of computing. A reconstructed copy of one of the Colossus machines is now on display at Bletchley Park.
Turing's pre-war work was a major influence on theoretical computer science, and after the war he went on to design, build, and program some of the earliest computers at the National Physical Laboratory and at the University of Manchester. His 1936 paper included a reformulation of Kurt Gödel's 1931 results as well as a description of what is now called the Turing machine, a purely theoretical device invented to formalize the notion of algorithm execution, replacing Gödel's more cumbersome universal language based on arithmetic. Modern computers are Turing-complete (i.e., they have algorithm-execution capability equivalent to a universal Turing machine), except for their finite memory. This limited type of Turing completeness is sometimes viewed as a threshold capability separating general-purpose computers from their special-purpose predecessors.
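The Turing machine is simple enough to sketch directly. The simulator below is illustrative (the state names and the bit-flipping rule table are invented for this example, not taken from Turing's paper): a finite table of (state, symbol) → (write, move, next state) rules acts on an unbounded tape.

```python
from collections import defaultdict

def run_turing_machine(rules, tape, state, halt_state, max_steps=10_000):
    """Simulate a one-tape Turing machine.
    rules maps (state, symbol) to (write_symbol, move, next_state),
    where move is -1 (left) or +1 (right); '_' is the blank symbol."""
    cells = defaultdict(lambda: "_", enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == halt_state:
            break
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example machine: flip every bit, then halt at the first blank.
flip = {
    ("scan", "0"): ("1", +1, "scan"),
    ("scan", "1"): ("0", +1, "scan"),
    ("scan", "_"): ("_", +1, "done"),
}
print(run_turing_machine(flip, "10110", "scan", "done"))  # 01001
```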
The US-built ENIAC (Electronic Numerical Integrator and Computer), often called the first electronic general-purpose computer, publicly validated the use of electronics for large-scale computing. This was crucial for the development of modern computing, initially because of the enormous speed advantage, but ultimately because of the potential for miniaturization. Built under the direction of John Mauchly and J. Presper Eckert, it was 1,000 times faster than its contemporaries. ENIAC's development and construction lasted from 1941 to full operation at the end of 1945. When its design was proposed, many researchers believed that the thousands of delicate valves (i.e., vacuum tubes) would burn out often enough that the ENIAC would be so frequently down for repairs as to be useless. It was, however, capable of up to 100,000 simple calculations a second for hours at a time between valve failures.
To 'program' ENIAC, however, meant to rewire it; some argue that this does not even qualify as programming, since otherwise any rebuilding of a limited machine could equally be viewed as programming. At the time, however, unaided calculation was seen as enough of a triumph that the solution of a single problem was viewed as the object of a program. Several years later it also became possible to execute stored programs set in function table memory, which made programming less of a one-off effort and more systematic.
All machines at that date still lacked what became known as the von Neumann architecture: their programs were not stored in the same memory 'space' as the data and hence programs could not be manipulated as data.
The first working von Neumann machine was the Manchester "Baby" or Small-Scale Experimental Machine, built at the University of Manchester in 1948; it was followed in 1949 by the Manchester Mark I computer, which functioned as a complete system using the Williams tube for memory and also introduced index registers. The other contender for the title "first digital stored-program computer" was EDSAC, designed and constructed at the University of Cambridge. Operational less than one year after the Manchester "Baby", it was capable of tackling real problems. EDSAC was actually inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer), the successor of ENIAC; these plans were already in place by the time ENIAC was successfully operational. Unlike ENIAC, which used parallel processing, EDVAC used a single processing unit. This simpler design was the first to be implemented in each succeeding wave of miniaturization, with increased reliability. Some view Manchester Mark I / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture.
The first universal programmable computer in continental Europe was created by a team of scientists under the direction of Sergei Alekseyevich Lebedev at the Kiev Institute of Electrotechnology in the Soviet Union (now Ukraine). The computer MESM (МЭСМ, Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was CSIRAC, an Australian design that ran its first test program in 1949.
Manchester University's machine became the prototype for the Ferranti Mark I. The first Ferranti Mark I machine was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.
In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Although manufactured by Remington Rand, the machine was often mistakenly referred to as the "IBM UNIVAC". Remington Rand eventually sold 46 machines at more than $1 million each. UNIVAC was the first 'mass-produced' computer; all predecessors had been one-off units. It used 5,200 vacuum tubes and consumed 125 kW of power. For memory it used a mercury delay line capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). Unlike earlier machines it did not use a punched card system but a metal tape input.
In November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business application to go live on a stored program computer.
Also in July 1951, Remington Rand demonstrated the first prototype of the 409, a plugboard-programmed punch card calculator. It was first installed at the Internal Revenue Service facility in Baltimore in 1952. See the Rowayton Historical Society's timeline (and other documents at that site) for details. The 409 evolved into the Univac 60 and 120 computers in 1953.
The next major step in the history of computing was the invention of the transistor in 1947. This replaced the fragile and power-hungry valves with a much smaller and more reliable component. Transistorised computers are normally referred to as 'Second Generation' and dominated the late 1950s and early 1960s. Despite using transistors and printed circuits, these computers were still large and used primarily by universities, governments, and large corporations. For comparison, the vacuum-tube-based IBM 650 of 1954 weighed over 900 kg, the attached power supply weighed around 1,350 kg, and both were housed in separate cabinets of roughly 1.5 by 0.9 by 1.8 meters. It cost $500,000, or could be leased for $3,500 a month. Its drum memory originally held only 2,000 ten-digit words, a limitation that forced arcane programming to achieve responsive computing. This type of hardware limitation was to dominate programming for decades afterward, until the evolution of a programming model more sympathetic to software development.
In 1951, Maurice Wilkes invented microprogramming, which is widely used in the CPUs and floating-point units of mainframe and other computers, such as the IBM 360 series. Microprogramming allows the base instruction set to be defined or extended by built-in programs (now sometimes called firmware, microcode, or millicode).
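Microprogramming can be sketched as follows (the instruction and micro-operation names here are hypothetical, not the IBM 360's actual microcode): each visible machine instruction expands into a stored sequence of primitive micro-operations, so the instruction set can be extended by adding microcode rather than new hardware.

```python
# Micro-operations: the only things the 'hardware' can do directly.
def load_a(regs, n): regs["A"] = n
def add_b(regs, _): regs["A"] += regs["B"]
def move_a_to_b(regs, _): regs["B"] = regs["A"]

# Microcode store: each visible machine instruction is a stored
# sequence of micro-operations.
MICROCODE = {
    "LOAD": [load_a],
    "DOUBLE": [move_a_to_b, add_b],          # A = A + A
    "TRIPLE": [move_a_to_b, add_b, add_b],   # added via microcode alone
}

def execute(program):
    """Run machine instructions by expanding each into its microcode."""
    regs = {"A": 0, "B": 0}
    for instr, operand in program:
        for micro_op in MICROCODE[instr]:
            micro_op(regs, operand)
    return regs["A"]

print(execute([("LOAD", 7), ("DOUBLE", None), ("TRIPLE", None)]))  # 42
```

Here TRIPLE is defined purely by writing a new microcode sequence; the micro-operations standing in for the hardware are untouched.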
In 1956, IBM sold its first magnetic disk system, RAMAC (Random Access Method of Accounting and Control). It used fifty 24-inch metal disks, with 100 tracks per side. It could store 5 megabytes of data and cost $10,000 per megabyte.
The first implemented high-level general purpose programming language, FORTRAN, was also being developed at IBM around this time. (Konrad Zuse's 1945 design of the high-level language Plankalkül was not implemented at that time.)
In 1959 IBM shipped the transistor-based IBM 1401 mainframe, which used punched cards. It proved a popular general-purpose computer; some 12,000 were shipped, making it the most successful computer of its era. It used a magnetic core memory of 4,000 characters (later expanded to 16,000 characters). Many aspects of its design were based on the desire to replace the punched card machines that had been in wide use from the 1920s through the early 1970s.
In 1960 IBM shipped the transistor-based IBM 1620 mainframe, originally with only punched paper tape, but soon upgraded to punch cards. It proved a popular scientific computer and about 2,000 were shipped. It used a magnetic core memory of up to 60,000 decimal digits.
In 1964 IBM announced the S/360 series, the first family of computers that could run the same software with different combinations of speed, capacity, and price. It also pioneered the commercial use of microprograms, and an extended instruction set designed for processing many types of data, not just arithmetic. In addition, it unified IBM's product line, which prior to that time had included both a "commercial" product line and a separate "scientific" line. The software provided with System/360 also included major advances, including commercially available multi-programming, new programming languages, and independence of programs from input/output devices. Over 14,000 System/360 systems were shipped by 1968.
Also in 1964, DEC launched the PDP-8, a much smaller machine intended for use by technical staff in laboratories and for research.
Third generation and beyond, post-1958
- Main article: History of computing hardware (1960s-present)
The explosion in the use of computers began with 'Third Generation' computers. These relied on Jack St. Clair Kilby's and Robert Noyce's independent invention of the integrated circuit (or microchip), which later led to Ted Hoff's invention of the microprocessor at Intel.
The history of computer hardware in communist countries developed somewhat differently.
By the late 1950s, researchers like George Gamow noted that the long sequences of nucleotides in DNA formed a genetic code, yet another form of coding or programming, this time of genetic expressions. By the 1960s, they had identified analogs of the halt instruction, for example.
By the turn of the millennium, researchers noted that physical systems described by quantum mechanics could be viewed as probabilistic computational elements, with potential computing power far exceeding any of the hardware mentioned above. On June 17, 2004, quantum teleportation at the scale of atoms, achieved by two research teams on two continents, was published in Nature, demonstrating that quantum information processing was indeed coming into being, step by step.
See also
- Computing timeline
- CPU design – includes a history of design concepts
- History of operating systems
- History of the Internet
- History of the graphical user interface
- Programming language timeline
- Computer architecture – how computers are designed
- Computers in fiction
External links
- Famous Names in the History of Computing, a free source of history-of-computing biographies
- Stephen White's excellent computer history site (the above article is a modified version of his work, used with permission)
- Yahoo Computers and History
- Paul Pierce's computer collection
- IEEE computer history timeline
- Konrad Zuse, inventor of first working programmable digital computer
- The story of the Manchester Mark I, 50th Anniversary web site at the University of Manchester
- The Moore School Lectures and the British Lead in Stored Program Computer Development (1946–1953), article from Virtual Travelog
- Logarithmic timeline of greatest breakthroughs since start of computing era in 1623
- Rowayton Historical Society's Birthplace of the World's First Business Computer
- OLD-COMPUTERS.COM, extensive collection of information and pictures about old computers