First Interregnum--A Short History of Computing

People have long recognized the competitive advantages to be gained from more efficient data storage and computation. From counting on the fingers, to making marks on the walls of caves, to the invention of picture numbers, to the modern check or banknote, there has been a steady progression away from directly manipulating the objects that computations describe and toward the use of abstractions for the originals. Mechanical devices have played an important part in this sequence. More than one culture came up with the idea of placing beads on a string (the abacus). In some places, these are still the preferred calculating device after several thousand years. A skilled operator can calculate the cost of a large number of purchases on an abacus much faster than most people could enter them into a calculator.

Some who have studied the ancient British monument known as Stonehenge have come to the conclusion that it was an enormous calculating device for making astronomical predictions. Other monuments left by the Babylonians, South and Central American Indians, and South Sea Islanders may have had similar purposes. The Scottish mathematician John Napier (1550-1617) devised Napier's bones and published tables of logarithms intended to simplify tedious arithmetic computations. These led directly to the wooden or bamboo slide rule, known and loved by many student generations prior to the development of inexpensive electronic calculators.

To the French mathematician and theologian Blaise Pascal (1623-1662) goes the honor of inventing the first mechanical adding machine (1642). It was based on a system of gears similar to those in a modern automobile odometer and was used for computing taxes. However, parts for the device could not be manufactured with sufficient precision to make it practical, and it never became widely used. About thirty years later, the famous German mathematician and co-inventor (with Newton) of calculus, Gottfried Wilhelm von Leibniz (1646-1716), made a similar but more reliable machine that could not only add and subtract but also multiply, divide, and calculate square roots. Many people improved calculating machines over the next century, and by 1900 such machines had an important place in government and commerce. As late as the mid-1960s, electromechanical versions of these calculators could do only basic four-function arithmetic, weighed thirty pounds, and took up half a desktop.

Meanwhile, another idea important to the modern computer was emerging--that of the stored program or instruction sequence. This idea arose in connection with the development of automatic looms by the French inventor Joseph Marie Jacquard (1752-1834). First shown at the 1801 Paris Exhibition, these looms used a collection of punched metal cards to control the weaving process. The machine, with some variations, is still used today, though it is now controlled by punched paper cards or tapes, or by direct connection to a microcomputer.

The first computer--a machine combining computational ability with stored programs--was designed by the British mathematician Charles Babbage (1792-1871). He worked on his "Difference Engine" for about eleven years before abandoning the project. Later, he designed a much more ambitious "Analytical Engine," which was intended to be an algebraic analogue of Jacquard's loom. Although Babbage even had a programmer for the engine (Lord Byron's daughter Ada Augusta, the Countess of Lovelace), this machine was never constructed in his lifetime. Its concepts were not realized until 1944, when the Mark I computer was developed in the United States.

By this time, the punched paper medium had become standardized through the work of Herman Hollerith. He devised a card data storage and sorting system for the U.S. Census Bureau, which was first employed in the 1890 census. Hollerith left the bureau six years later to form his own company, the name of which was changed to International Business Machines in 1924.

Meanwhile, vacuum-tube technology had developed to the point where an electronic computer could be manufactured. The first of these were the British code-breaking devices Colossus Mark I and Colossus Mark II built in 1943 and 1944 for the British intelligence service at Bletchley Park. The latter attained speeds not matched by other computers for a decade. When the war was over, these machines were dismantled and their parts sold for surplus.

At about the same time, the groundwork laid by a number of researchers in the United States came to fruition in the construction of the Electronic Numerical Integrator and Computer (ENIAC) by J. P. Eckert and J. W. Mauchly at the University of Pennsylvania. This machine, which contained over 18,000 vacuum tubes, filled a room six meters by twelve meters and was used principally by military ordnance engineers to compute shell trajectories. In subsequent years, many similar computers were developed in various research facilities in the United States and Britain. Such devices, which generally were limited to basic arithmetic, required a large staff to operate, occupied vast areas of floor space, and consumed enormous quantities of electricity.

Eckert and Mauchly were also responsible for the first commercial computer, the Universal Automatic Computer (UNIVAC), which they manufactured after leaving the university. Their company was eventually incorporated into Sperry (now merged with Burroughs to become UNISYS), which still manufactures large industrial computers. Today, those early vacuum-tube monsters are referred to as "first-generation computers," and the machines that are their successors are called "mainframes."

The transistor, developed at Bell Labs in late 1947 and improved during the early 1950s, replaced the vacuum tube, reducing both electrical consumption and heat production. This led to the miniaturization of many electronic devices, and the size of typical computers shrank considerably, even as their power increased. The transistorized machines built between 1959 and 1965 formed the second generation of computers.

The price was still in the hundreds of thousands to millions of dollars, however, and such machines were at first generally seen only in the headquarters of large research and government organizations. Even by the mid-1960s, not all universities had even one computer, and those that did often regarded them as exclusive toys for the mathematicians and research scientists. There were occasional courses at the fourth-year level, but freshman introductions to computer science had not yet become popular.

The invention of the integrated circuit dramatically changed things in the computing world. The first result was another, even more significant size reduction, for what once took up several floors of a large building now occupied a small box. The first of these third-generation computers was the IBM System/360, which was introduced in 1964 and quickly became popular among large businesses and universities. This size reduction also resulted in the first "pocket" calculators, which appeared on the market in the early 1970s. Even at the initial price of several hundred dollars, these put into the hands of the average person more computing power than the first UNIVAC had possessed. New models proliferated so rapidly, and so many new features were incorporated into the pocket calculator, that one company decided to have a chip designed whose functions could be programmed, so as to cut down the time needed to bring a new model to market.

That chip, the 4004, gave way to the 8008, and then to the 8080 and 8080A. The latter became the backbone of the new small-computer industry, as numerous companies developed kits and fully assembled computers around it. In its reincarnation by Zilog as the Z-80, and in Intel's own descendants such as the 8085, 8088, 8086, and now the 80186, 80286, 80386, 80486, Pentium, and P6, this invention lives on in millions of microcomputers. Not long after the 8080 became a commercial reality, Motorola developed the 6800 chip, which was cheaper and, for programmers, somewhat easier to work with than the 8080. It, too, became popular for a time, but soon gave way to other designs.

At about the same time the Z-80 was developed, the 6501 and 6502 chips were derived from the 6800 as low-cost industrial process controllers. In 1976, the 6502 was also used to build a small computer, this one contained entirely on a single board. It was called the Apple I, and Apple Computer went on to sell millions of the Apple ][ and its descendants, the ][+, //e, //c, and //GS, surpassing all other manufacturers of small computers in the process.

In 1977, Radio Shack joined the competition with its Z-80-based machines. In Europe, the equivalent popularizing role was played by Commodore (a Canadian company) and by Sinclair (a British firm). A few years later, IBM came into this market with the 8088-based PC. The mere presence of the giant changed the whole market for a time, with most other manufacturers seeking to make machines compatible with those of IBM. Eventually some of these "clone" makers, such as Compaq, became a larger presence in the market than IBM itself.

By the mid-1990s, the machines generating the most attention could store more data and manipulate larger numbers than anything previously seen in the microcomputer market. They were also capable of handling the processing requirements of the graphical user interface (GUI), first realized in the Xerox Star, the Apple Lisa and Macintosh, then in Commodore's Amiga and Atari's machines, and by then demanded by most computer users. The integration of circuits had by this time reached the point where millions of components were being crammed into a single chip. Between 1987 and 1991, major new commitments were made by Apple with the Motorola 68030-based Macintosh IIx, IIcx, and IIci models and by IBM with its PS/2 machines and the OS/2 operating system. With the latter, IBM also followed Apple's lead into graphics-oriented software, helping to ensure this style of interface a continuing acceptance in the marketplace. Graphical user interfaces were also adopted by the makers of scientific workstations, such as those made by Sun Microsystems, and were being attached to other machines running the UNIX operating system.

In the early 1990s, Microsoft, already the dominant maker of operating systems for the Intel 80x86 chips and of applications for both these and the Macintosh platforms, had begun to market a GUI called Windows that was a rough imitation of the Macintosh operating system. The courts ruled, however, that the imitation was not close enough to constitute copyright infringement, and Windows gradually became dominant on Intel-based machines.

By 1995, Apple had formed partnerships with Motorola and IBM to develop new microprocessor technology and was already marketing machines based on the new PowerPC RISC chip, while IBM was porting its operating systems to the new chip as well. The two were readying new operating systems and preparing specifications for a common hardware platform on which to run them. Apple had licensed its operating system and the first Macintosh clones were appearing on the market. Microcomputers had become powerful enough that the minicomputer category had been all but crowded out of the market on price/performance considerations.

The late 1990s and early 2000s saw more changes. By this time, Apple had switched to the PowerPC chips from Motorola (and later to derivative chips made by IBM) and had rewritten its operating system as OS X, layered on top of a free BSD UNIX foundation for added stability, more security, and better multitasking. IBM's OS/2 had all but vanished from the marketplace, and, while newer variants of Microsoft Windows were dominant in consumer machines, many hobbyists were experimenting with (and most servers had adopted) Linux, an open-source UNIX-like operating system, also out of concerns for stability and security.

While much of the marketing activity and most of the headlines focused on the microcomputer segment of the industry, the larger machines had undergone some startling changes as well. The fourth generation of supercomputers can be used in situations where the complexity of the calculations or the quantity of data is so great as to be beyond the ability of even an ordinary mainframe. These machines are used by governments, the military, and academic research institutions. Still newer generations of computers are on the drawing boards in the United States and Japan, and many of the new developments will undoubtedly filter down into more consumer-oriented devices in the future.

At the opposite end of the scale, pocket-sized computing devices had also become important to some people. These ranged from DOS-based miniaturized versions of their desktop siblings to specialized personal time and communications organizers (Personal Digital Assistants, or PDAs). Also called Personal Intelligence Enhancement Appliances (PIEAs), these devices boast handwriting recognition, wireless communications, cellular telephone circuitry, a digital camera, and sophisticated time management functions.

For most applications in the near future, however, microprocessor-based computing devices will have sufficient power to suit the majority of individual, academic, and business uses. They are inexpensive, easy to link (network) together for sharing resources such as storage devices and printers, and they run languages and other programs similar to those found on mainframe computers. Much of the new development work (particularly in programming and publishing) is being done with microcomputers in view, and it is safe to predict that the descendants of these machines are the ones most people will be referring to when they speak about computers in the future.

The larger machines, however, will also continue to grow and change, as will the organizations depending on them. Moreover, the computers of the future will be as different from those of today as these are from the ones of the late 1940s. They will be smaller (down to pocket size), faster, and will have greater storage capacity. They will be integrated with video and communications technology to give immediate access to worldwide databases. They will undoubtedly become easy to use, and at some point the need to offer university-level courses in their operation will cease, for they will have become common technical appliances.

The Internet, especially the portion known as the World Wide Web (WWW), has become a kind of prototype for the universal distributed library (the Metalibrary) of the future, and most organizations have connections for e-mail, if for nothing else.

Computers have already profoundly changed many of society's institutions (business, banking, education, libraries). They will have an even greater effect on institutions in the future. They have also raised or caused new ethical issues, and these will need to be addressed in the interests of social stability. In addition, developments in computing have affected or given rise to other new products and methods in a variety of fields, further demonstrating the interdependence of ideas, society, and technology.

There are microprocessors in stereos, televisions, automobiles, toys, and games. The entertainment and telecommunications industries are heavily dependent on new electronic technologies. Computers themselves are directly attached to research instruments that gather and interpret data in basic physics, chemistry, and biology experiments. The resulting changes and advances in scientific research have in turn had profound effects on society and its institutions. They have raised new social and ethical questions, ones whose very asking could not have been anticipated in the industrial age. These include issues relating to software copyright, data integrity, genetic engineering, artificial intelligence, displacement of human workers by robots, and how to live in and manage an information-based society.

Some of the technical trends and the possible social and ethical consequences will be examined and extrapolated in more detail in later sections of the book. It is at least possible to conclude at this point that the advent of the fourth civilization (aka "the information age") is owed more to the modern computer than to any other single invention of the late industrial period.

-- from the first edition; revised 1995 06 13, 2002 06 10, and 2004 01 08

Assignments

1. Write a full biography (minimum 1000 words) of one of the individuals important in the history of computing.

2. Select a specific time or technology important to the history of computing and write a fuller (minimum 1000 words) exposition than that contained here.

3. Write your own "future history" of computing.
