INDUSTRIAL REVOLUTIONS: FROM CANAL SYSTEMS TO COMPUTER NETWORKS

By Thomas P. Hughes

William R. and Erlyn J. Gould Distinguished Lecture at the University of Utah, September 13, 2000.

http://www.lib.utah.edu/gould/1999/lecture99.html

Today's so-called "Information Revolution" is often compared to past industrial revolutions, especially a British Industrial Revolution which took place between 1750 and 1830 and a Second Industrial Revolution which is believed to have occurred in the United States between 1880 and 1940. The comparison, however, remains vague, because we are not informed about the nature of industrial revolutions to which the Information Revolution is compared.

In the Gould Distinguished Lecture, I shall describe not only the essential technical core, but also the managerial, economic, social, and cultural characteristics of the British and Second Industrial Revolutions. Assuming that the Information Revolution is comparable, I shall then suggest by analogy with earlier revolutions what events and trends we might experience in this time of rapid change.

Historians have written countless articles and books about the British Industrial Revolution, a number about the Second Industrial Revolution, and a few essays about the nature of industrial revolutions in general. Yet their descriptions and analyses are often inaccessible to non-professionals. Not all historians even agree that there have been industrial revolutions. Analyzing quantitative economic data, some find slow cumulative change instead of the discontinuity associated with a revolution.1

In general, however, the concept "industrial revolution" has become a commonplace among historians. They define the British Industrial Revolution with some precision in both quantitative and qualitative terms. A lesser number of historians also write about a Second Industrial Revolution, but with less agreement and precision. In this essay, I shall draw upon their work as well as historical sources in an effort to discover whether comparing the Information Revolution with past industrial revolutions is a reasonable, justifiable, and informative venture.

Initially we should acknowledge that the British Industrial Revolution and the Second Industrial Revolution do not exist as objective phenomena. Historians imaginatively construct concepts of industrial revolutions in order to impose intelligible patterns on the chaos of past events. Such intellectual constructs keep history from being, as one historian remarked years ago, "just one damned thing after another."

First, we shall discuss the concept of a British Industrial Revolution. After this, we shall turn to explore the concept of a Second Industrial Revolution. Then we shall proceed by analogy to ascertain if the supposed Information Revolution resembles these multifaceted industrial revolutions.

 

BRITISH INDUSTRIAL REVOLUTION

History textbooks now provide a condensed account of a British Industrial Revolution occurring between roughly 1750 and 1830. With ease and familiarity, they tell of inventors of humble background and limited means introducing a wave of inventions. Autodidactic engineers often gained hands-on experience as apprentices before presiding over canal and other projects. Great Britain did not then have engineering colleges. Industrial research and development as we know it today did not exist. The notable achievements of these independent inventors and civil engineers left British industrial and government leaders long doubting that engineers needed formal education or that industry should fund laboratories.

Rarely do textbook accounts reveal that technical innovations interacted with one another in a systematic way. This synergistic, feedback interaction is a major explanation for the rapid technical changes associated with industrial revolutions. Newly invented steam engines, for example, drove recently mechanized textile factories; furnaces produced iron using abundant coke instead of scarce charcoal; iron machine tools turned out steam engine components; and canals made possible the flow of coal, iron, cotton, and other commodities throughout the evolving system. When deciding whether there is an Information Revolution comparable to the British Industrial Revolution, we shall look for such systematic interactions.

Narrowly focused accounts of industrial revolutions ignore the economic, social, political, and demographic changes accompanying technical ones, and in so doing fail to capture the essence of an industrial revolution. For instance, when 18th-century British landowners seeking increased yields enclosed with fences and hedges fields once worked in common by their tenants, the displaced tenants sought work in rapidly industrializing Birmingham, Manchester, and other inland cities. Engineers laid out canals that opened these and other interior cities to industry; formerly, commerce and industry had concentrated along rivers and at coastal ports. We shall look for such demographic changes in connection with the Information Revolution.

On the political level, technical change stimulated the rise of an urban middle class whose wealth came from industry rather than from the land. Leading industrialists and their banking and commercial associates cultivated laissez-faire liberalism, advocated free trade, opposed the Corn Laws, or tariffs, which had raised the price of imported grains, and demanded that suffrage be extended to middle-class voters in the industrial cities. Political spin-off from the Information Revolution has yet to be deeply explored and analyzed.

 

SECOND INDUSTRIAL REVOLUTION

Some historians also contend that the pace of technical change accelerated in the United States in the late 19th century. The resulting transformation over several decades is seen by some as a Second Industrial Revolution comparable in its essentials to the British Industrial Revolution. Like the earlier revolution, the latter involves a wave of inventions bringing new interlocking means of transportation, new power sources, new materials, mass production of consumer goods, advances in industrial chemistry, and innovative modes of production. Like the first revolution, the second is seen as involving economic and social changes. In describing the Second Industrial Revolution, however, historians give more attention to new modes of management, organizational forms, and cultural changes than they do in characterizing the British Industrial Revolution.

Professional inventor-entrepreneurs, including Thomas Edison, Elmer Sperry, Nikola Tesla, and Lee De Forest, created a wave of inventive activity sparking the Second Industrial Revolution. In 1896 a writer in the Scientific American, referring to the remarkable outpouring of U.S. patents since the Civil War, exuberantly insisted that his was "an epoch of invention and progress unique in the history of the world. It has been," he observed:

    a gigantic tidal wave of human ingenuity and resource, so stupendous in its magnitude, so complex in its diversity, so profound in its thought, so fruitful in its wealth, so beneficent in its results, that the mind is strained and embarrassed in its effort to expand to a full appreciation of it.2

Professional inventor-entrepreneurs gave full time to presiding over the innovation process from invention, through development, and into deployment. In the period from about 1880 to 1910, they launched major technological systems. Edison is best remembered for his electric lighting systems; Sperry introduced automated feedback controls for machinery, aircraft, and ships; Tesla invented a widely used electric motor; and De Forest pioneered in radio. Others devoting part of their time to invention also introduced major technological systems. Alexander Graham Bell, of telephone fame, and Orville and Wilbur Wright, known for pioneering in aviation, come immediately to mind. When turning to the Information Revolution, we shall look for a similar wave of innovation driven by comparable inventive types.

By World War I, industrial research laboratories staffed by physicists and chemists were displacing the independent inventors as sources of patents and innovations. The industrial laboratories, General Electric's and Bell Telephone's among them, tended to focus upon improvements in existing systems, especially those being manufactured by their parent companies. As a result, the innovations from the industrial laboratories appear cumulative and conservative, while those of the independents have a radical cast because corporate ties do not tether them. Will Silicon Valley innovations also become less radical with the passage of time? Will research scientists displace legendary computer hardware and software inventors working out of garages?

The industrial research laboratory is but one of many new organizational forms spawned during the Second Industrial Revolution. Four-year engineering colleges and engineering departments in universities proliferated. An electrical engineering curriculum energized the engineering profession. In 1882 about seventy U.S. colleges and universities offered a professional degree in engineering, but none gave a degree in electrical engineering. By 1899 there were 89 institutions granting engineering degrees, and almost all offered a four-year course in electrical engineering.3 We shall also ask about the role of higher education during the Information Revolution.

While factories with centralized power sources and coordinated machinery spread during the British Industrial Revolution, more complicated forms of industrial organization arose during the Second Industrial Revolution. The availability of electric motors allowed a flexible reorganization of factory layout. On the organizational level, giant manufacturing firms like General Motors rationalized management by introducing semi-autonomous divisions, each specializing in a particular product. Middle managers presided over the divisions, and top management coordinated the divisions and allocated resources among them.4

Scientific management became a characteristic of the Second Industrial Revolution. The gifted and eccentric Frederick W. Taylor gave up an opportunity to attend Harvard University and became a laborer and then a foreman in Philadelphia's Midvale Steel Works. Subsequently, as a scientific management consultant, Taylor rationalized plant layout and human labor, which he treated as a machine component. He is remembered as the father of the scientific management that is deeply embedded in managerial practice today.

Taylorism and Fordism are hallmarks of the Second Industrial Revolution. Henry Ford and his engineers organized material and energy flows so that raw materials entered his automobile factories in an unending stream and finished Model T automobiles issued forth from moving assembly lines. Through mass production, he sharply reduced overhead costs and the price of Ford automobiles. Taylorism and Fordism spread from the United States to the rest of the industrializing world, including the Soviet Union. By analogy, we shall ask about managerial innovations associated with the Information Revolution.

Among organizational innovations, utility companies increased greatly in number, especially in the field of electricity supply. Several large utilities created complex physical and organizational forms called "interconnections." On the physical level, these exchanged electricity through transmission lines. On the organizational level, committees representing the participating utilities managed the interconnections. Holding companies and interconnections anticipated even more complex organizational forms mushrooming during the Information Revolution. Electrical transmission networks, or grids, anticipate computer networks. We shall ask if technical and organizational changes interact during the Information Revolution.

As with the British Industrial Revolution, new power sources constitute a technical core of the Second Industrial Revolution. Because so many technical artifacts require a power source, the introduction of a new one, such as electrical transmission or the internal combustion engine, causes a cascade of technical innovations which, taken together, contribute to the temporal and spatial changes associated with an industrial revolution. Because the production and distribution of information are pervasive in industrialized societies, we shall observe a cascade of information-driven changes in our day.

During the Second Industrial Revolution large technological systems increasingly structured the industrialized world. Transportation, communication, and energy systems superimposed grids and networks upon the landscape. A highway system obviously shapes where we live, work, and play, as do airline routes and electric transmission and distribution lines. Because older systems persist despite the appearance of ones designed to replace them, history becomes stratigraphic, displayable as a "geological section."5 Layering diagrams of these systems on a map of a heavily industrialized region would reveal a virtually impenetrable web of long-lived lines of force shaping social life. Now the Internet web increases the density of layered systems. This technoscape becomes a setting for historical change.

To create these systems, system builders integrate technical components and organizational ones. In the case of an electric power system, for example, technical components are obviously required, but manufacturers and investment institutions are needed as well. Individuals and groups function as system builders creating and coordinating the heterogeneous components of a sociotechnical system. Thomas Edison and his associates, for example, built a system including utilities, manufacturers and investment banks. System builders populate the information domain today.

The Second Industrial Revolution brought not only new sociotechnical systems, but also dramatic cultural changes. Leading architects and artists sought to express the spirit of the Revolution. Walter Gropius, an eminent German architect, brought a machine-inspired, international style of architecture to the United States. Charles Sheeler and Louis Lozowick were among the prominent American artists who expressed in their works the spirit of the modern technological era. Artists and architects expressed the order, system, and control inherent in the Second Industrial Revolution. They believed that there was a modern style comparable to the Medieval, Renaissance, and Baroque. Lozowick caught their spirit when writing:

 

    The history of America is a history of gigantic engineering feats and colossal mechanical construction.
    The skyscrapers of New York, the grain elevators of Minneapolis, the steel mills of Pittsburgh, the oil wells of Oklahoma, the copper mines of Butte, the lumber yards of Seattle give the American industrial epic in its diapason....6

We will ask if the Information Revolution has similarly inspired leading American architects and artists to attempt to express its essence.

 

INFORMATION REVOLUTION

Like the British and the Second Industrial Revolutions, the putative Information Revolution began with a flurry of inventions. Not initially interrelated, breakthroughs in semiconductor electronics, computers and software in time converged through contingency. The interactions, as in the earlier revolutions, result in a synergistic and systematic heightening of the impact of the individual innovations.

In tracing these developments and interactions, we first turn to semiconductor electronics. In 1947 Walter Brattain, John Bardeen, and William Shockley at Bell Laboratories patented their invention of a semiconductor transistor. As with so many inventions, a long sequence of anticipatory ones, especially in the wartime field of semiconductors for radar detection, provided a foundation upon which they created a device that could be developed technically and economically. Again, as in the case of so many inventions, Shockley, who headed the Bell Labs semiconductor team, used analogy to invent. He envisioned a solid-state junction transistor that would function like a three-element vacuum tube.7

Because tubes provide rectification and amplification of electrical currents in countless radio, telephone, and control systems, including those used by the military, a reliable, smaller, less heat-generating transistor replacement, like the electric motor during the Second Industrial Revolution, found an enormous market. Bell Laboratories licensed the patented device, so its manufacture spread.

Shockley left Bell Labs in 1955 to start a company to develop, manufacture and market transistors. Shockley said he wanted to make a million dollars and see his name in the Wall Street Journal, not only in the Physical Review. In establishing a small research and development startup, he not only anticipated an Information Revolution pattern, but he also reinforced a trend by locating in Stanford University's research park in Palo Alto, California, soon to be the heart of Silicon Valley. Inventor-entrepreneurs of earlier revolutions also established startups to manufacture and market their inventions, but not in association with universities.

Known for the invention of the transistor, Shockley attracted to Shockley Semiconductor "an unusually gifted group of engineers and scientists," including Robert Noyce and Gordon Moore, who would become legendary Silicon Valley figures.8 They found Shockley's managerial tactics eccentric, however, so in 1957 they left with six colleagues to establish Fairchild Semiconductor. A few months after Jack Kilby of Texas Instruments patented an integrated circuit in 1959, Noyce independently invented a silicon integrated circuit and introduced the planar process for manufacturing it.

The introduction of integrated circuits became a milestone in semiconductor history, for they allowed resistors and capacitors to be combined with transistors in an electronic circuit on a single silicon wafer. This miniaturization responded to the Air Force's need for reliable electronic components for guidance and control aboard missiles, especially the "Minuteman" Intercontinental Ballistic Missile of the 1960s.9 Military procurement had similarly stimulated the early development of aviation, feedback controls, and radio decades earlier.

Another major breakthrough came in 1971 with the invention of the microprocessor by Marcian (Ted) Hoff, an engineer at the Intel Corporation, which Noyce and Moore had founded in 1968 after leaving Fairchild. The microprocessor went beyond the integrated circuit in complexity by incorporating thousands of circuit components dedicated to the logic of calculation or control, thus becoming a computer on a chip. Memory chips evolved along with logic chips.

Production of transistors, integrated circuits, and microprocessors skyrocketed, and Silicon Valley witnessed a proliferation of startup companies making not only chips but computers as well. By 1970 semiconductor, computer, and software developments had converged, and the Information Revolution was underway. This positive feedback interaction resembles the steam, iron, and factory feedback loop of the British Industrial Revolution.

Turning to modern computer history, we note that it began about the same time as the transistor was invented. In 1946 at the University of Pennsylvania, a team of engineers and scientists headed by physicist John Mauchly and engineer J. Presper Eckert introduced ENIAC (Electronic Numerical Integrator and Computer). A giant, general-purpose digital mainframe, the ENIAC weighed 30 tons, used nearly 18,000 vacuum tubes, and occupied the area of a small gymnasium. The Army funded the three-year project in the hope of finding a quicker way of calculating the trajectories of shells and bombs. Hundreds of women using hand-operated mechanical calculators could not keep up with the demand.

After the war, Mauchly and Eckert established a firm to build large mainframe digital computers for massive data processing by government and business. Their UNIVAC I computer of 1951 processed the U.S. Census. John von Neumann, a brilliant mathematician, rationalized the design of large mainframe computers around the stored-program concept, and in the 1950s and 60s several universities and research centers handcrafted mainframe computers based on his design architecture. These scientific computers did massive calculations such as those required for the analysis of nuclear explosions.

In the 1950s at MIT's military-funded Lincoln Laboratory, a team of engineers headed by Jay Forrester developed the Whirlwind computer, which took information from radar about attacking aircraft, processed the information, and then directed the interceptor aircraft or missiles. Calculators were becoming information processors. A disproportionate number of computer pioneers, including Ken Olsen, who founded the Digital Equipment Corporation, learned their trade using Whirlwind-spawned computers at Lincoln Laboratory. Learning by doing was playing a role comparable to formal education.

Information-processing and control computers such as the Whirlwind and its successors opened new vistas. Forrester realized that computers made possible the simulation, or modeling, not only of military but also of business and urban systems. Soon software and hardware designers transformed information-laden simulations into graphic ones, and the world of virtual reality was born.

After IBM, long a producer of mechanical calculators, entered the mainframe market, its System/360 computer, introduced in 1964, found widespread use in the business world. Preparing the software for the System/360 proved to be a massively frustrating project that called the attention of hardware manufacturers to the critical role software programmers played in the evolution of the computer.10

Use of integrated circuits and microprocessors allowed designers in the 1970s to introduce computers smaller than mainframes, first "minicomputers" and later desktop, personal computers. Larger than desktop computers, minicomputers cost many thousands of dollars, but they allowed skilled individuals to interact directly with the machines rather than through a professional staff of mainframe operators.11 C. Gordon Bell, who helped design Digital Equipment Corporation's pathbreaking PDP-11 minicomputer, remarked that "the semiconductor density has really been the driving force, and as you reach different density levels, different machines pop out of that in time."12

Historian Paul Ceruzzi calls 1974 the annus mirabilis of personal computing, the year in which Edward Roberts designed and marketed the "Altair" computer. A personal, desktop computer selling as a kit for $400, the Altair used an Intel microprocessor and advanced computer architecture. Its performance compared with that of minicomputers costing ten times more. Within a few years, other inventors supplied peripheral equipment, including the floppy disk for memory storage. William Gates III, then a student at Harvard University, and Paul Allen designed its software.13

Personal computing took another giant step in the 1970s when engineers and scientists at the Palo Alto Research Center of the Xerox Corporation designed and built the Alto computer. It featured icons, pull-down menus, and the mouse invented by Douglas Engelbart, as well as a laser printer. For complex reasons, Xerox's commercial version of this innovative computer was not successful. Steve Jobs and Steve Wozniak, working in Silicon Valley, introduced the Apple II computer in 1977 and later the Macintosh, which had Alto features, a floppy drive, and an elegant, simple architecture. After IBM introduced a personal computer with spreadsheet software and word processing, Time magazine in 1983 named the computer "Machine of the Year." The IBM machine used Gates's MS-DOS operating system, propelling his Microsoft company into the lead as a software producer.

Designers of minicomputers and personal computers did not intend them to be interconnected, but computer engineers trained as electrical engineers knew about the economic advantages that arose from linking consumers, electrical machines, and utilities in networks, or grids, during the Second Industrial Revolution. In 1963 the Advanced Research Projects Agency (ARPA) of the U.S. Defense Department funded Project MAC, a multimillion-dollar MIT project, to develop time sharing, which connected a number of user terminals to a single computer. J. C. R. Licklider, a visionary ARPA program manager, championed time sharing and prepared the ground for interconnecting time-sharing computers in a network.

ARPA also funded the ARPANET. Developed by several university computer centers and Bolt Beranek and Newman, a small high-technology firm in Cambridge, Massachusetts, with close connections to Harvard University and MIT, the ARPANET, an interconnection of mainframe computers, became after 1971 the core of the Internet. Initially used by the military and research universities to share software, transfer files, and log in remotely, the Internet soon became the carrier for e-mail, news, and discussion groups.

Use of the Internet expanded dramatically after Tim Berners-Lee, a scientist at CERN, the European particle physics laboratory, wrote the prototype for what has become known as the World Wide Web. It provides access to an enormous array of documents located at computer sites throughout the world. The usefulness of the Web increased greatly when Marc Andreessen, a student at the University of Illinois, and Eric Bina composed a web browser program that allows Web users to find and view specific documents with ease. At the turn of the century, the Web has become not only a way to search for information, but also a means to shop for goods and services.

As semiconductor devices, computers, and software such as compilers interacted to increase the means of processing information, telecommunication innovations concurrently provided the means of rapidly transmitting and distributing large masses of information. Digital semiconductor switches are installed at the core of telephone systems; large-capacity glass-fiber, or optical, cables replace copper wire for transmission and distribution; and satellite communication supplements ground transmission.

Not only does the interaction of technologies characterize the Information Revolution, but, as in the case of earlier industrial revolutions, the interaction of technology and management distinguishes it as well. Early in the 20th century, scientific management, with its emphasis upon the integration of worker and machine, and Fordism, with its stress upon the ceaseless flow of materials and finished products, became the innovative modes of management. Now another innovative style of management spreads in the information sector.

Hierarchy, specialization, standardization, centralization, expertise, and bureaucracy characterize Second Industrial Revolution management. In the advanced information sector today, flatness, interdisciplinarity, heterogeneity, distributed control, meritocracy, and nimble flexibility characterize management. The new management vocabulary and style resonate interestingly with the values widely held by persons under thirty during the counterculture 1960s. Many engineers, scientists, and managers presiding over the Information Revolution spreading so rapidly after 1970 had formative experiences during the counterculture years. Computer enthusiasts meeting at the Homebrew Computer Club in Palo Alto and those having coffee at the Wagon Wheel nearby exhibited the "laid back" hacker style, as did the student members of MIT's Tech Model Railroad Club. The culture of Silicon Valley has been described as information-sharing, collective in learning, informal in communication, fast moving, flexible in adjustment, entrepreneurial, startup-inclined, and thoroughly networked.14

Contractual networks, systems engineering, and a project approach to innovation also distinguish the management of the Information Revolution. These characteristics stem in part from the formative experiences of thousands of engineers and managers who designed, developed, and deployed large weapon and space systems during and after World War II, including Intercontinental Ballistic Missiles (ICBMs). Military agencies and the National Aeronautics and Space Administration funded and presided over these projects, but industrial and university contractors provided the engineering, science, and field management. In order to schedule and coordinate the thousands of contractors involved, the military, in conjunction with experienced industrial managers and engineers such as those at the Ramo-Wooldridge Corporation, introduced a systems-engineering type of management.

Both systems engineering and projects are commonplace not only in the military-industrial-university sector, but also in the information sector. Projects are of limited duration and usually involve a team of managers, engineers, scientists, and skilled workpeople who gather for the duration of the project to create a particular product and then disband.15 Projects are dedicated to innovation. They are analogous to the open casting associated with the production of plays: actors gather for a particular play, develop the play through rehearsals, put on performances, and then disband.16 Projects differ from the cumulative, product-improvement style characteristic of the large manufacturing corporations with industrial research laboratories that flourished during the Second Industrial Revolution.

The nimble, virtual corporation common today in the information industry is a variation upon the contractual network used by the U.S. military to develop weapons systems. A virtual corporation maintains a core competency to make a few components for a computer or telecommunication system and contracts out to other companies the manufacture of the remaining components in the system. Sometimes a virtual corporation will assemble as well as market the system. The Internet allows a virtual corporation to function as a systems engineer, scheduling and coordinating the contractors over which it presides during a project. Not invested in facilities to manufacture the components made by its contractors, a virtual corporation is nimble: it can shed its contractors and move to another domain. In contrast, Henry Ford attempted to own and manage the entire Ford system from raw materials to finished automobiles.

The important role played by the government-industrial-university complex differentiates the Information Revolution from earlier industrial revolutions. From World War II to about 1970, government funding, especially from the Department of Defense and the National Science Foundation, launched and gave momentum to the computer revolution.17 As we have noted, the government funded the development of the ENIAC and of the early mainframe computers used for scientific research. The military funded development of the Whirlwind computer and its successors at the Lincoln Laboratory. Government procurement provided a market for early computer manufacturers, including IBM. ARPA funded time sharing and the ARPANET, as well as computer use in artificial intelligence and graphics at university computer centers. The federal government continues to be a major source of funds for computer science and electrical engineering departments in universities. Recently computer and software manufacturers have invested heavily in research and development.

University scientists and engineers are responsible for a disproportionate number of the major computer hardware and software inventions. ENIAC at the University of Pennsylvania; von Neumann computer architecture at the Institute for Advanced Study in Princeton; time sharing at MIT; graphics at the University of Utah; the ARPANET at several universities, including the University of California, Los Angeles; minicomputer workstations at Stanford University; the Berkeley UNIX operating system at the University of California, Berkeley; reduced instruction set computing (RISC) at UC Berkeley; and the web browser at the University of Illinois, Urbana, are a few of the inventions nurtured in university environments. University researchers' relative freedom of problem choice, as compared with that of engineers and scientists in industrial research laboratories, partially explains the radical, breakthrough nature of university inventive activity. University researchers now play a role comparable to the independent inventors who inaugurated the Second Industrial Revolution.

The Information Revolution, like the British Industrial Revolution and the American Second Industrial Revolution, centers upon particular places. In the case of the industrial revolutions, Birmingham, Manchester, and Detroit became "milieux of innovation."18 Silicon Valley, California, and Boston-Cambridge, Massachusetts, became such milieux during the Information Revolution. Much as engineers and industrial workers congregated at the sites of the industrial revolutions, skilled craftspersons, engineers, scientists, and managers gravitate to the innovation and entrepreneurial sites of the Information Revolution.

Turning to the cultural realm, we ask: has the Information Revolution influenced art and architecture? Prominent architects and artists during the Second Industrial Revolution expressed in their works positive attitudes toward technology.19 After World War II, however, many artists and architects reacted strongly against technology. Nuclear weaponry, environmental degradation, and other negative effects of large-scale, post-World War II technological activities made "the values of technology less interesting and even distasteful."20 Technology connoted to them repressive system, order, and control.

Denise Scott Brown, an influential urban planner and partner in the architectural firm of Venturi, Scott Brown and Associates, expresses views representative of a generation of urban planners and architects influenced by counterculture values.21 Referring to the controlled accidents cultivated by 1950s action painters, or abstract expressionists, she asks her peers to learn from the comparable spontaneity expressed in the vernacular architecture of Los Angeles, Las Vegas, and urban sprawl. She contrasts this with the rigid order and control imposed by traditional planners and architects.

In their paintings, Robert Motherwell, Willem de Kooning, Barnett Newman, Jackson Pollock, Robert Rauschenberg, and other post-World War II Abstract Expressionists reacted against the military-industrial system in general and bureaucratic order and control in particular.22 In fact, they were responding negatively to technological values and events associated with the Second Industrial Revolution of earlier decades. They were not yet aware of the "laid back" counterculture values that emerged as the Information Revolution spread, values such as informality, interdisciplinarity, heterogeneity, distributed control, meritocracy, and nimble flexibility.

The Information Revolution and these values nurture imaginative art forms, among them computer graphics, including movie animation and virtual reality.23 Within several decades, artists may find computer-using graphic designers and programmers providing tools, especially software, capable of shaping art in ways comparable to the manner in which linear perspective influenced painting during the early European Renaissance.

In the same vein, Herbert Muschamp, architectural critic for the New York Times, has written about the future of architecture:

 

    It would be odd if the computer did not set off a spatial revision comparable to that sparked by Brunelleschi's development of linear perspective in 15th-century Florence. Indeed, blob architecture undermines the enduring authority of perspectival space.24

Architects now use software originally intended for automobile styling and movie animation to design buildings that Wes Jones, a Los Angeles architect, calls blob architecture. Blob refers to designing spaces using three-dimensional computer models, which allow architects to manipulate a model as if it were soft putty: a change in one dimension reverberates throughout the entire model. The noted architect Frank Gehry, who designed the highly praised art museum in Bilbao, Spain, takes a slightly different approach. He models his buildings initially in a soft material and then turns the model over to computer experts to develop the computer models and plans for the engineers and craftsmen to follow.25 Persons trained in architecture and fine art also use rendering software to create spaces, or environments, for computer games, spaces in which people (or creatures, in the case of games) live, work, and play.26
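
The propagation at the heart of such modeling can be sketched in a few lines of code. The following toy parametric model is a hypothetical illustration, not drawn from the lecture or from any actual design software: every dimension of an imaginary tower is derived from a single editable one, so that one change reverberates through the whole model, as described above.

    # Toy "parametric model" (hypothetical illustration): all dimensions
    # derive from one editable parameter, so a single edit propagates
    # through the entire model, as in blob-architecture software.
    from dataclasses import dataclass

    @dataclass
    class ParametricTower:
        base_width: float  # meters; the one dimension the designer edits

        @property
        def height(self) -> float:
            return self.base_width * 6.0        # assumed slenderness ratio

        @property
        def floor_count(self) -> int:
            return int(self.height // 3.5)      # assumed floor-to-floor height

        @property
        def facade_area(self) -> float:
            return 4 * self.base_width * self.height  # four rectangular faces

    tower = ParametricTower(base_width=20.0)
    print(tower.height, tower.floor_count, tower.facade_area)  # 120.0 34 9600.0

    tower.base_width = 25.0  # one change...
    print(tower.height, tower.floor_count, tower.facade_area)  # ...and every derived dimension updates

Real modeling software generalizes this idea to thousands of mutually constrained surfaces, but the principle of derived, automatically updated dimensions is the same.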

A synergistic combination of government funding with industry and university research and development brought enhanced computer graphics and virtual reality to the marketplace.27 Professors David Evans and Ivan Sutherland, the latter the inventor in 1963 of Sketchpad, a seminal interactive graphics system, created at the University of Utah in the 1960s the leading academic center for work in interactive graphics. They also established Evans & Sutherland, which produced graphical display systems, especially for the military. Among the alumni of Utah's program in computer graphics in the 1960s and 70s are Alan Kay, who developed the graphical user interface at Xerox PARC; John Warnock, who founded Adobe Systems; Nolan Bushnell, founder of the Atari Company; James Clark, founder of Silicon Graphics and head of Netscape; and Edwin Catmull, a founder of Pixar Animation Studios, which helped produce the animated film Toy Story. Catmull at Pixar and Alvy Ray Smith of Lucasfilm, along with a number of talented young graphics specialists, developed the computer program RenderMan, used in making Toy Story, the first entirely computer-generated feature film, the dinosaurs in Jurassic Park, and the cyborg in Terminator 2.

 

CONCLUSION

If we reflect upon the history of the British and the American industrial revolutions, we find that the hallmark of an industrial revolution is the many fields of human activity involved. A cluster of inventions, especially ones affecting numerous technical systems, may be seen as the core of a revolution, but related and notable changes also occur in the organizational, managerial, demographic, social, and cultural realms. We found that today's Information Revolution also involves such broad changes, so a comparison with earlier industrial revolutions seems justifiable. Because the Information Revolution has probably not yet run its course, we can predict by analogy that it may bring even more diverse changes, and not simply technical ones.28 We should shape these changes so that we can, in the spirit of the Gould Lectures, enhance the quality of life.

 



ENDNOTES

1 To accept the occurrence of a revolution, economic historians look for discontinuities in such economic criteria as the rate of growth of the national income, especially per capita, and they search for rapid changes in the shares of the agricultural, industrial, and service sectors in the national economy. Failing to find quantitative evidence of such rapid changes, they conclude that there are no valid measurements with which to define or recognize the phenomenon of the Industrial Revolution. S. Pollard, "The Concept of the Industrial Revolution," p. 13, in a paper read at Terni, October 1987, on the occasion of an ASSI (Associazione di Storia e Studi sull'Impresa) conference.

2 Edward W. Byrn, "The Progress of Invention During the Past Fifty Years," Scientific American, 75 (25 July 1896): pp. 82-3. I have written on invention and independent inventors in "The Era of Independent Inventors," Science in Reflection: The Israel Colloquium: Studies in History, Philosophy, and Sociology of Science, ed. Edna Ullmann-Margalit, III (1998): pp. 151-68.

3 I. O. Baker, "Engineering Education in the United States at the End of the Century," Science (2 November 1900): pp. 666-674, and Charles R. Mann, A Study of Engineering Education (New York: Joint Committee on Engineering Education of the National Engineering Societies, 1918): pp. 18-24.

4 Alfred D. Chandler, Jr., The Visible Hand: The Managerial Revolution in American Business (Cambridge, Mass.: The Belknap Press, 1977).

5 Emmanuel Le Roy Ladurie, "Rural Civilization," The Territory of the Historian, ed. Emmanuel Le Roy Ladurie, trans. Ben and Sian Reynolds (Chicago: University of Chicago Press, 1979): p. 79.

6 Louis Lozowick, "The Americanization of Art," in Machine-Age Exposition Catalogue: The Little Review, XII (1926-29): p. 18.

7 Michael Riordan and Lillian Hoddeson, Crystal Fire: The Birth of the Information Age (New York: W. W. Norton, 1997): pp. 143-4. I have relied upon Riordan and Hoddeson in constructing my semiconductor narrative.

8 On Noyce, see Robert Slater, Portraits in Silicon (Cambridge, Mass.: MIT Press, 1987): pp. 153-61.

9 Paul E. Ceruzzi, A History of Modern Computing (Cambridge, Mass.: MIT Press, 1998): pp. 181-2. I have relied upon Ceruzzi for my basic narrative about computers.

10 Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: Basic Books, 1996): pp. 196-204. System/360 computers used transistors in place of tubes and stimulated the interaction of computer and semiconductor technology. See also Emerson Pugh, Rebuilding IBM: Shaping an Industry and Its Technology (Cambridge, Mass.: The MIT Press, 1995): pp. 263-77.

11 The best remembered of the early minicomputers was the Digital Equipment Corporation's PDP series introduced in the 1960s.

12 Bell quoted in Ceruzzi, A History of Modern Computing: p. 211.

13 Ceruzzi, A History of Modern Computing: pp. 226-41.

14 AnnaLee Saxenian, Regional Advantage: Culture and Competition in Silicon Valley and Route 128 (Cambridge, Mass.: Harvard University Press, 1996): pp. x-xi and 2-3; Steven Levy, Hackers: Heroes of the Computer Revolution (Garden City, N.Y.: Anchor Press/Doubleday, 1984): pp. 20-4.

15 Thomas P. Hughes, Rescuing Prometheus (New York: Pantheon Books, 1998): pp. 3-14.

16 Susannah Wilson, drama teacher at Philadelphia Community College, suggested the analogy.

17 National Research Council, Committee on Innovations in Computing and Communications: Lessons from History, Funding a Revolution: Government Support for Computing Research (Washington, D.C.: National Academy Press, 1999).

18 Manuel Castells uses and discusses the origin of the concept in The Rise of the Network Society (Oxford: Blackwell, 1996): p. 36.

19 Thomas P. Hughes, American Genesis: A Century of Invention and Technological Enthusiasm, 1870-1970 (New York: Penguin Books, 1990): pp. 312-52.

20 Meyer Schapiro, "The Liberating Quality of Avant-Garde Art," Art News 56 (Summer 1957): p. 39.

21 Denise Scott Brown, "On Pop Art, Permissiveness, and Planning," Journal of the American Institute of Planners, XXXV (May 1969): pp. 184-6.

22 (no note available)

23 The largest professional society for computer graphics is the Association for Computing Machinery's Special Interest Group on Graphics (SIGGRAPH), which draws about 40,000 people to its annual conference and includes a large number of artists in its constituency. Many people attribute SIGGRAPH's success to its mixing of technologists and artists. It hosts an electronic film festival every year that demonstrates computer-generated animations. Jerry Sheehan of the National Research Council called SIGGRAPH to my attention.

24 Herbert Muschamp, "A Queens Factory Is Born Again, as Church," New York Times (5 September 1999): Arts Section, p. 30. I encountered the Muschamp article several days after writing above that computer graphics might influence art as perspective did in the Renaissance.

25 I am indebted to Professor Patricia Conway of the School of Fine Arts at the University of Pennsylvania for this information.

26 Stephen Miller, "The Wizard Turns Out to Be an Artist," New York Times, 5 September 1999, Business Section, p. 10.

27 Funding a Revolution: Government Support for Computing Research: p. 242. I have drawn upon many conversations with Professor Timothy Lenoir of Stanford University in preparing this section on graphics and virtual reality. I have also used his essay in progress entitled "All but War Is Simulation: The Military-Entertainment Complex."

28 Some scholars, for instance, believe the intensity and scope of the Information Revolution will increase as developments in molecular biology and genetic engineering interact with those in computing. Castells, Network Society: pp. 47-50.

 


This lecture was given in the Gould Auditorium of the Marriott Library at the University of Utah on September 13, 2000.



 
