Yet, isn’t it true that all new ideas arise out of a milieu when ripe, rather than from any one individual? (Busa, 1980, p. 84)
Reading through the immense literature available about the early history of modern computing, it is tempting to consider the historiography of modern computing as a battle of firsts. Often, the chronicles of technical evolution are used to put a forgotten pioneer in the limelight or to contradict or corroborate someone's claims of originality and invention. It is indeed true that each design or machine, whether analog or digital, mechanical or electronic, can always claim to be the first of its kind or the first at something which is now generally considered essential to the design of the modern computer. But since reinvention and redefinition are at the heart of the history of modern computing, real firsts hardly exist. This doesn't mean that the alleged firsts can't be truly influential pioneers, or, as Maurice Wilkes put it: 'People who change the world stand facing two ways in history. Half their work is seen as the culmination of efforts of the past; the other half sets a new direction for the future.' (Wilkes, 1995, p. 21)

Additionally, history shows that the same technology can develop independently in different places at the same time. This is true for ideas and hardware as well as for applications. Charles Babbage's (1791-1871) revolutionary idea to use punched cards both for numerical storage and for control, for instance, is generally acknowledged as an essential step in the evolution of stored-program computers, and Babbage is fully credited for it. However, Babbage adopted the idea from the silk weaving industry, where punched-card programmed weaving is attributed to Joseph Marie Jacquard (1752-1834) around 1801. (Menabrea, 1961 [1842], p. 232; Lovelace, 1961 [1843], p. 264-265) But Jacquard had predecessors in France as early as the 1720s, and research by the Austrian computer pioneer Adolf Adam (1918-2004) has shown that programmed weaving was also invented in Austria between 1680 and 1690. (Zemanek, 1980, p. 589) In the nineteenth century, Herman Hollerith (1860-1929) rediscovered, patented, and popularized punched cards as data storage media when he built his tabulators for the US Census of 1890. Babbage's original double use of the perforated media for both data storage and operation control, however, was rediscovered by Howard H. Aiken (1900-1973) for application in his Automatic Sequence Controlled Calculator (ASCC), later to be referred to as the Harvard Mark I – not to be confused with the Manchester Mark I. Programs were read from paper tape, and data was read from IBM punched cards. The ASCC was built by IBM (International Business Machines), completed in 1943, and publicly announced and demonstrated in 1944. The wide news coverage marks that event as the beginning of the computer age for some historians of technology and computing. (Ceruzzi, 1983, p. 43) But in Germany, Konrad Zuse (1910-1995) had managed to build a working automatic and programmable calculating machine independently of any British or American project by December 1941. Isolated by the political reality of that time, Zuse had conceived and built his machine – later to be called the Z3 – on his own and with his private money. Data was input directly via a numerical keyboard, and the calculator could run programs from perforated 35mm movie film. But the direct and demonstrable influence in the use of perforated media, running from Babbage through Hollerith to Aiken, is absent in the work of Zuse, who always claimed that he had no knowledge of Babbage or of his work at that time (Zuse, 1980, p. 611).
For other historians, the invention and description of the stored-program principle marks the beginning of the history of modern computing. But that beginning can't be defined precisely either. The stored-program principle states that the 'operating instructions and function tables would be stored exactly in the same sort of memory device as that used for numbers' (Eckert and Mauchly, 1945, cited in Ceruzzi, 2003, p. 21) and is generally called the von Neumann architecture. John von Neumann (1903-1957) described this principle in his famous First Draft of a Report on the EDVAC in 1945. (von Neumann, 1993 [1945]) Von Neumann had joined the University of Pennsylvania's Moore School of Electrical Engineering, where a team led by J. Presper Eckert (1919-1995) and John Mauchly (1907-1980) was designing and building the ENIAC (Electronic Numerical Integrator And Computer) from 1942 to 1946 and the EDVAC (Electronic Discrete Variable Automatic Calculator) from 1945 to 1951. The report undoubtedly reflected their joint ideas, but because of its draft status it lacked the acknowledgements and cross-references that a final version would have contained, and it bore von Neumann's name only. This has been interpreted as a deliberate claim by von Neumann of the stored-program principle as his original idea. When von Neumann tried to file an Army War Patent for himself on the basis of this draft, the application was declined because of possible conflicting claims from Eckert and Mauchly: the patent office ruled that the draft report had been circulated widely before any patent claim on the EDVAC and that the material was therefore in the public domain and thus unpatentable. (McCartney, 2001, p. 146-147) So officially, von Neumann never made his claim, and the draft was never turned into a final report. Historians of modern computing nowadays agree that Eckert and Mauchly deserve equal credit for the von Neumann architecture. (See e.g. Wilkes, 1995, p. 21-22; McCartney, 2001, p. 177-128; Ceruzzi, 2003, p. 20-24) However, it was undoubtedly von Neumann's international reputation as a mathematician that gave the idea its wide reach. Arthur W. Burks (b. 1915), who was on the team that developed, designed, and built the ENIAC, has claimed that Eckert and Mauchly 'invented the circulating mercury delay line store, with enough capacity to store program information as well as data', that von Neumann 'worked out the logical design of an electronic computer' (Burks, 1980, p. 312), and that they did not know of Konrad Zuse's work at that time. (Burks, 1980, p. 315) Konrad Zuse's 1936 patent application (Z23139/GMD Nr. 005/021) stated, after a description of data storage in the memory of the Z1, that 'Auch der Rechenplan läßt sich speichern, wobei die Befehle in Takt der Rechnung den Steuervorrichtungen zugeführt werden' [the computing plan, too, can be stored, with the instructions being fed to the control devices in step with the calculation] (cited in Zuse, 1999), which can be interpreted as the description of a von Neumann architecture with program and data modifiable in storage. In 1947, in the former USSR, Sergei Alekseevich Lebedev (1902-1974) probably arrived at the stored-program architecture he used in building the MESM (Small Electronic Calculating Machine) independently of any Western effort (Crowe and Goodman, 1994, p. 11-13), and the M-1, designed by a team led by I.S. Brouk at the Soviet Institute of Energy in 1950-1951, stored programs in its main memory without any knowledge of the EDSAC report.
However, the role of Soviet computer developments in the history of modern computing has generally been overlooked by Western historians. In Australia, Maston Beard (d. 2000) and Trevor Pearcey (1919-1998) also claimed to have laid down the basic ideas of their vacuum tube stored-program computer (CSIRAC) in 1946-1947 independently of the early computer research projects in Britain and the US, except for the ENIAC. (Beard and Pearcey, 1984, p. 106-108)
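Whatever the question of priority, the principle these teams converged on is easy to state in modern terms. The minimal Python sketch below is purely didactic: its opcodes, memory layout, and three-instruction program are invented for this illustration and correspond to no historical machine, but it shows what it means for instructions and data to sit in one and the same memory.

```python
# Toy illustration of the stored-program idea: the program and its data
# share a single memory, addressed uniformly. (Didactic sketch only.)
memory = [
    ("LOAD", 8),     # 0: put the value stored at address 8 into the accumulator
    ("ADD", 9),      # 1: add the value stored at address 9
    ("STORE", 10),   # 2: write the result to address 10
    ("HALT", None),  # 3: stop
    None, None, None, None,   # 4-7: unused cells
    2, 3,            # 8-9: data, kept in the same memory as the instructions
    None,            # 10: the result ends up here
]

accumulator = 0
pc = 0  # program counter
while True:
    op, addr = memory[pc]
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break
    pc += 1

print(memory[10])  # prints 5
```

Because the program occupies the same store as its operands, an instruction can in principle read or even overwrite another instruction – the property that the EDVAC draft spells out, that Zuse's 1936 patent sentence hints at, and that the Soviet and Australian designers arrived at on their own.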
At least one other characteristic of the Z3 is interesting in the light of the alleged battle of firsts. As early as 1935, Zuse had decided to use binary floating-point number representation instead of the decimal system that was later used by, for instance, the ENIAC. The Z3 design relied on binary relay circuits instead of the faster but much more expensive vacuum tubes. (Ceruzzi, 1990, p. 205) Binary relay circuits were also used in the ASCC design and in the designs of George R. Stibitz's (1904-1995) Complex Number Computer (CNC) at Bell Telephone Laboratories from 1935 onwards. Suekane (1980, p. 576) reports on a similar invention of the relay binary circuit by Mr. Shiokawa in Japan in 1938 and on Dr. Ono's idea to use a binary system in his statistical computer in 1939. It is highly unlikely that the Japanese researchers knew of the developments in Germany and the US, and Zuse and Stibitz definitely did not know of each other's work. Because of this isolation, Zuse's calculators cannot be considered direct ancestors of the modern digital computer, and his work has not had much influence on the world of computers. However, as Paul Ceruzzi pointed out, 'their overall design and their use of binary numbers and floating-point arithmetic make them resemble the modern computer far more than an ancestor like the ENIAC does' (Ceruzzi, 1983, p. 40) and 'it remained for others to rediscover his fundamental concepts of binary, floating-point number representation and separation of memory, arithmetic, and control units.' (Ceruzzi, 1990, p. 206-207)
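For readers unfamiliar with the term, binary floating-point representation stores a number as a sign, a binary fraction (the mantissa), and a power-of-two exponent, rather than as a fixed string of decimal digits. The short Python sketch below is only a present-day illustration of that idea, built on today's standard library; it says nothing about the particular word format Zuse actually chose for the Z3.

```python
import math

def show_float(x):
    """Decompose x so that x == mantissa * 2 ** exponent (for positive x, mantissa lies in [0.5, 1))."""
    mantissa, exponent = math.frexp(x)
    print(f"{x} = {mantissa} * 2**{exponent}")

show_float(6.25)    # prints: 6.25 = 0.78125 * 2**3
show_float(1536.0)  # prints: 1536.0 = 0.75 * 2**11
```

The attraction of such a representation, in 1935 as today, is that a machine with a fixed word length can handle very large and very small quantities alike, since the exponent automatically scales the fractional part.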
A fourth and last example of the simultaneity of invention that is so characteristic of the history of modern computing is the integrated circuit, better known by its popular name 'the silicon chip'. On February 6, 1959, Jack St. Clair Kilby (1923-2005) filed a patent on his invention of 'miniaturized electronic circuits'. Working on microminiaturization at Texas Instruments in the summer of 1958, Kilby had realized that if all the individual components of an electronic circuit, such as transistors, resistors, and capacitors, were made out of germanium or silicon, an entire circuit could be created out of a single block of semiconductor material. At another company, Fairchild Camera and Instrument, Robert S. Noyce (1927-1990) came to the same conclusion in January 1959 and filed a patent on his 'semiconductor device-and-lead structure' on July 30, 1959. Noyce used a piece of silicon instead of germanium as semiconductor material. Noyce was granted his patent on April 25, 1961 (US Patent 2,981,877), whereas Kilby received his only on June 23, 1964 (US Patent 3,138,743). Both companies started court cases over the inventions which lasted for a decade; in the meantime they had agreed to grant each other licences and to share the credit for the invention equally between Kilby and Noyce. However, Geoffrey W.A. Dummer (b. 1909), an engineer at the UK Royal Radar Establishment, had already come up with the idea of a monolithic circuit in May 1952, when he wrote: 'With the advent of the transistor and the work in semiconductors generally, it seems now possible to envisage electronic equipment in a solid block with no connecting wires.' That block could be made, according to Dummer, 'of layers of insulating, conducting, rectifying and amplifying materials'. (cited in Wolff, 1976, p. 45) In 1957, Dummer himself is reported to have demonstrated a metal model of a semiconductor integrated circuit using silicon crystals, but his efforts were a dead end because the technology was simply not there yet. (Idem) Looking back on his invention of the integrated circuit against the background of both Dummer's and Kilby's work, Noyce once commented: 'There is no doubt in my mind that if the invention hadn't arisen at Fairchild, it would have arisen elsewhere in the very near future. It was an idea whose time had come, where the technology had developed to the point where it was viable.' (cited in Wolff, 1976, p. 51)
This is perhaps true for all technological inventions and developments, as the four examples above clearly illustrate. Instead of trying to answer the question of who really invented the electronic digital computer, it is probably more relevant to try to gain insight into the period and the forces that enabled the development of the modern electronic digital computer and hence offered attentive scholars the opportunity to start thinking about the use of the computer in non-mathematical applications such as humanities research. Early Humanities Computing efforts were highly dependent on the availability of computing devices and computing time offered to them by private companies or non-humanities academic departments. Nowadays, scholars select the devices and configurations that serve their project best and have virtually non-stop access to several computer processing units at the same time. In the pioneering years, scholars had to adapt their projects to the specificity, availability, limits, and operation modes of the machine and had to accept that their work was made possible by the three underlying factors that were quintessential to the shaping of the emerging computer industry, and by their interaction with each other. These three factors were technology, the customers, and the suppliers. (Pugh and Aspray, 1996) Among both the customers and the suppliers we see a clear distinction between government and government-funded organizations on the one hand and commercial businesses on the other. Both groups built and used computers for either scientific or administrative purposes. Among the main customers were national security services, space, engineering, computer science, and meteorological departments, and commercial businesses such as insurance companies. The suppliers were universities, government agencies, academic spin-offs, start-up companies, or new departments of existing companies. The technology was spread via academic journals, monographs, professional conference reports, newsletters, and visits, or via patents, cross-licensing, and sales, and was in constant flux depending on the supplier and on the intellectual and investment capital available to the research, design, and manufacturing organisations.

It has been argued that the necessities of the Second World War – calculations of trajectories, radar development and improvement, cryptology and cryptanalysis (Snyder, 1980), and research into war technology such as the atomic bomb – boosted the development of early computers in war-waging nations like the US, the UK, and Germany, whereas scientific research was directly and indirectly halted in the occupied countries (Van Oost, 1999, p. 199), where a 'war-weariness prevented exploitation of electronic applications'. (Dummer cited in Wolff, 1976, p. 53) The development of computers in the Netherlands, for instance, happened completely independently of any military funding, use, or purpose, and was targeted specifically at scientific research, business administration, organisation, and development. (Van Oost, 1999, p. 128) The application of computing to automate operations and manufacturing processes in commercial business, introduced by the Jacquards of their time, was named automation in the 1950s – a lexically incorrect abbreviation of automatization – and popularized in 1952, for instance, by John Diebold's book Automation: The Advent of the Automatic Factory. (Diebold, 1952)
Nevertheless, the importance of the individual efforts of brilliant engineers and scientists in the interaction of technology, customers, and suppliers can hardly be overestimated, and historiography is always looking for the person behind the machine. Many of them have been adorned with the epithet 'Mother' or 'Father of the (modern) computer'. A quick, unscholarly search on the World Wide Web using the Google search engine named Grace Murray Hopper and Augusta Ada King, Countess of Lovelace, as mothers of the computer and returned the following list of pioneering fathers: Howard H. Aiken, John Vincent Atanasoff, Charles Babbage, John von Neumann, George R. Stibitz, Alan Turing, and Konrad Zuse.
4 comments:
You find the same thing with digital cameras. The first digital camera was invented by a Kodak engineer in the 1970s, but they didn't see a conceivable market for it, so it was one of those inventions that didn't really see the light of day, until... someone else re-invented it. There is, indeed, nothing new under the sun.
and I win first blog comment :)
Agreed, priority in invention is not always clear, and does not always align with historical influence. Over-emphasis on priority can certainly distort a history, and priority disputes do have a history of leading to acrimony. But do histories of computing really over-emphasize priority? Perhaps we remember Jacquard not (wrongly) because he was the first — if you are right, he wasn't — but (rightly) because his technique was more widely influential than that of his predecessors?
And even if historians do as a class tend to obsess over priority issues, does not our cultural emphasis on priority in invention as a way of earning kudos have beneficial side effects? At the least, it encourages openness rather than secrecy. It is perhaps culturally relative of me, but where I come from, that's worth something.
I think most historians would argue that it isn't so much a matter of deciding once and for all who "really" was first to invent a technology as a matter of understanding why, at various times, someone has been regarded as such. There's a mythology of the great inventor running from Hero of Alexandria down to Edison and beyond, which needs to have its heroes. And there are social/cultural forces that make us want to push individuals into that position more on the grounds of their nationality or ethnicity than on the basis of their actual achievement -- which we, mostly, couldn't assess anyway.
Thanks for the nice article. I appreciate your efforts. Great blog.