A Brief History of the U.S. Federal Government and Innovation (Part III: 1945 and Beyond)

By History Center Staff

It is only through “learning by doing” that organizations create the competence to bring embryonic technologies to commercially viable products and services. The experimentation that accompanies learning by doing can be costly and entail considerable risk. If private enterprise can imagine commercial promise in a radically new technical idea, it will gamble. But there are limits to a firm’s willingness to gamble. If the idea is too speculative, with the outcome of R&D completely unpredictable, or if the scale of the R&D is too large, far exceeding the perceived commercial value of the idea, then firms will be extremely reluctant to take the risk. Perhaps surprisingly, public enterprise has often been willing to underwrite big-stakes gambles when private industry has been unwilling or unable to do so. The visible hand of government has made mistakes, but it has also played a strong role in laying the foundation for America’s technological and industrial leadership. Spanning the period from the American Revolution to the end of World War II, parts one and two of this series looked at the roles of the U.S. federal government as an actor in, and director of, the innovation process. Through six examples, this third and final part of the series illustrates the hands-on role of government in shaping the direction, rhythm, and timing of innovation during the post-World War II period. The six examples are 1) the aeronautical and space industries, 2) the Federally Funded Research and Development Centers (FFRDCs), 3) computers, 4) semiconductors, 5) the Global Positioning System, and 6) the lithium battery.

Aeronautical and Space Industries

In the 1980s, Nathan Rosenberg and David Mowery, two eminent historians of technology, economics, and business, examined in considerable detail the role of the government in the U.S. aviation industry. Their work underscores the considerable investment, first through the National Advisory Committee for Aeronautics (NACA, 1915-1958) and then through the National Aeronautics and Space Administration (NASA, 1958-present), that the U.S. federal government made in America’s producers of aircraft. These investments helped set the stage for America’s global pre-eminence in aeronautical and space technology. In addition to directly helping to mitigate the high costs and risks of R&D, the government, through its power of procurement, allowed industry to take bigger gambles on R&D. Many of the significant innovations in commercial aircraft design were first nurtured in military applications; jet engines and airframes are good examples.

It is difficult to imagine the U.S. aircraft industry of the early 1940s developing the jet engine on its own initiative. The R&D costs were enormous, the technical uncertainty was high, and the prospect of a commercial market was far from self-evident. For military enterprise, however, the life-and-death struggle of WW II and the ensuing Cold War made the jet engine a gamble worth taking. The bulk of postwar airframe technology also emanated from government-sponsored research. Boeing illustrates this military-to-civilian shift quite well. In 1957, only 2 percent of Boeing’s sales were nonmilitary. By 1966, Boeing’s civilian sales had leapt to 52 percent, and by 1971 they had climbed to 77 percent.

The considerable technological achievements of NASA over the many decades of the nation’s space program also underscore the central role of government in creating and sustaining the technological capacity for humanity to venture out into the solar system and beyond. The exploration of space is reminiscent of humanity’s exploration of another inhospitable environment: the open ocean. When the distances were great, the risks high, and the investments considerable, as was the case in transoceanic exploration, the state became the prime mover and sponsor behind the journeys of exploration. One should not forget that, for their time, the ships used in exploration during the 15th through 18th centuries were advanced, large, and complex technological systems, with propulsion, navigation, and life support among their subsystems. As financial risks became manageable and technical uncertainties were reduced, private enterprise took on an increasingly larger role in maritime transportation. One can imagine a similar process unfolding with space travel. Several companies are already talking about taking tourists into space, and NASA laid the foundation upon which they can build.

Federally Funded Research and Development Centers (FFRDC)

At the start of WWII, virtually all of America’s scientific and technological talent resided in universities and industry. In the total-war environment of WW II, the U.S. military had to harness the scientific and engineering know-how of its entire population. As detailed in the previous article, the federal government established a number of ad hoc, university-based R&D organizations around specific technological programs such as radar, the computer, and the atomic bomb. A great deal of very advanced technical expertise had been created within this collective of government-sponsored labs. With the war’s end, there was concern among defense policy makers that most of this know-how would be dissipated, because most of the scientists wanted to return to research within the traditional academic environment. Even before the war ended, the United States had concluded that the geopolitical ambitions of the U.S.S.R. would come into direct conflict with its own. Military preparedness therefore became the cornerstone of American postwar military policy, and preserving and expanding the wartime network of research centers into the postwar period was an essential element in this preparedness. From this effort eventually emerged the Federal Contract Research Centers, which were later transformed into the Federally Funded Research and Development Centers (FFRDCs).

Since their inception, the nature and purpose of these FFRDCs have evolved. The federal agencies funding the FFRDCs have expanded beyond the Department of Defense (DOD) to include the Department of Energy (DOE), the National Aeronautics and Space Administration (NASA), the National Science Foundation, the National Institutes of Health (NIH), the Federal Aviation Administration (FAA), and others. DOD and DOE, however, support the lion’s share of the FFRDCs. Initially, FFRDCs were R&D laboratories, but they soon expanded their roles to include study and analysis centers, or “think tanks,” as well as systems engineering and technical direction centers. The number of centers has changed over the years: while new centers were created, others were dissolved. In 1950 there were 23 FFRDCs. The number increased to a peak of 74 in 1969 and then declined to the mid-30s during the 1980s. In 2010, there were 40. The most successful centers had a function that could not be effectively carried out by a federal agency or a for-profit company.

In the FFRDCs, the federal government created, and continues to maintain, an arm’s-length organizational infrastructure that helps strengthen America’s capacity for technological innovation. In a 1995 study of the FFRDCs, the congressional Office of Technology Assessment concluded that these centers “are able to address long-term problems of considerable complexity and to analyze technical questions with a high degree of objectivity borne of having renounced any possibility of selling products to the federal government or forming partnerships with those who do, while remaining outside of the federal government itself.” Even a partial list of the 40 present-day FFRDCs reveals an impressive collection of scientific and technical capabilities: the Los Alamos National Laboratory; the Lawrence Livermore National Laboratory; the Lincoln Laboratory; the Jet Propulsion Laboratory; the think tanks overseen by RAND; the Software Engineering Institute; the Center for Communications and Computing; the Homeland Security Systems Engineering and Development Institute; the National Renewable Energy Laboratory; the National Cancer Institute at Frederick; and the Center for Advanced Aviation System Development. The FFRDCs are not a replacement for innovation in the private sector, nor do they undermine it. Rather, in nurturing a national pool of scientific and technical expertise that can take on high-risk technical challenges, the FFRDCs complement the private sector’s market-driven approach to innovation. The enduring FFRDCs created a body of scientific and technical expertise that could not have been recruited, sustained, and managed within the civil service. Whether FFRDC efforts have led to long-term economic benefits needs to be examined on a case-by-case basis, but in the case of computers their influence has surely been beneficial.

Computation and the Electronic Digital Computer

The electronic digital computer is perhaps the single most important technological development of the last 60 years. Digital processing has become embedded in just about every facet of human life in technologically advanced societies. Ubiquitous computational technologies enable modern economic growth and shape human interactions in profound ways. From a scan of the products and services that define digital technology in the 21st century, one might think that America’s global leadership in computer technology rests, and has always rested, squarely on the shoulders of private enterprise. One cannot deny that this leadership in the digital economy illustrates the creativity, vitality, and daring of America’s private sector, but to conclude that private enterprise was the prime mover of the computer revolution is to ignore history. The visible hand of government was there to nurture the computer’s development when there were few civilian incentives and little market rationale to support this embryonic industry.

The urgent need to produce ballistic firing tables during World War II, as discussed in part II of this series, prompted the U.S. government to fund the development of a new and revolutionary computational technology: the electronic digital computer. Like the Manhattan Project, this computer, named ENIAC, was a top-secret project. On the other side of the Atlantic, the British government was funding its own top-secret project to design and build an electronic digital computer called Colossus. While ENIAC was designed to do high-speed numerical computation, the goal of Colossus was high-speed code breaking. Had the war not ended with the Soviet Union emerging as a new world military power and a real threat to the United States and Western Europe, the electronic digital computer might have languished for some time. Instead, postwar tensions with the Soviet Union turned the computer into an essential tool for national security. The development of thermonuclear weapons created computational needs that would have seemed unimaginable to physicists and engineers a decade earlier. And as secrecy in technological, economic, and diplomatic matters became paramount, so did the need to develop the most advanced computational technology for cryptology and for the capture and analysis of foreign intelligence.

Immediately following World War II, the U.S. government funded and guided the creation of a national competence in digital computer technology. The list of machines that ensued is a who’s who of the early years of computer technology. Funded by the U.S. Army and Navy and the Atomic Energy Commission, John von Neumann embarked on a project at the Institute for Advanced Study in Princeton, N.J., to design and build one of the first of the post-ENIAC generation of computers. The IAS computer, as it was called, was replicated in the national laboratories as well as in other defense-related R&D centers: MANIAC at Los Alamos, ORDVAC at Aberdeen, AVIDAC at Argonne, ORACLE at Oak Ridge, JOHNNIAC at RAND, and ILLIAC I at the University of Illinois. While much military-related research was kept secret, the work on the IAS computer was circulated widely, which aided the development of computer design know-how in key universities and industries. The National Bureau of Standards, with support from the U.S. Army, also became an active player in the design and building of computers; its SEAC, launched in the spring of 1950, was perhaps the first operational stored-program electronic computer in the U.S.

There are many stories of the U.S. federal government directly nurturing the creation and expansion of technological competence within the computer industry, but in the interest of brevity only one will be highlighted here. Started immediately after WW II and paid for by the Navy, the Whirlwind computer project at MIT achieved the first breakthrough in random-access memory, magnetic core memory, which later became critical for the commercialization of computers. Also from Whirlwind came the development of graphic displays using cathode ray tubes (CRTs) to visualize the movement of airplanes in real time. The Whirlwind team and its know-how quickly became integrated into a much larger project, the Semi-Automatic Ground Environment (SAGE) air defense system. SAGE was an ambitious project that merged communications and computer technology into an air defense system against Soviet attack. The cost of the project was enormous; perhaps as much as $10 billion had been spent by the time it was completed in the early 1960s. IBM built the computer, the biggest of its day, and the Burroughs Corporation handled the communications technology. SAGE gave IBM and Burroughs the opportunity to move up a learning curve that the civilian marketplace could never have provided. From its experience with SAGE, IBM in the early 1960s developed SABRE for American Airlines, one of the world’s first computerized airline reservation systems. Integrating digital communications, data display, and information processing, SABRE made IBM a leader in real-time transaction technology. The SABRE system continues to this day, albeit in highly transformed form.

The Early Years of Semiconductor Miniaturization

The technical advances and economic benefits of computers and semiconductors are inextricably linked. Captured in Moore’s Law, the virtuous circle of this computer-semiconductor feedback bears witness to the creativity and vitality of private enterprise in innovating and marketing new products and services for the civilian marketplace. But as with the computer, history once again reveals that the early growth of semiconductor technology depended, in part, on the active participation of government in the innovation process. In today’s world, the great value of semiconductors to human existence appears self-evident. And yet, in the decade following the transistor’s invention, there was still real doubt about the transistor’s broader market potential. It was during this period of market uncertainty that government support helped nurture the still embryonic semiconductor industry.

The transistor, invented in 1947 at Bell Labs, offered a radically new device for electronic amplification based on very different scientific principles from those of the vacuum tube, but it would take some time before this new electronic component would generate business to rival the vacuum tube. At the start of World War II, vacuum tube sales were nearly $250 million; by 1951, vacuum tubes were a $4 billion industry. Radio and phonograph manufacturers generated most of the vacuum tube purchases. In the immediate post-WW II years, a relatively new technology, television, exploded into a huge mass market, and the demand for vacuum tubes soared. Through the 1950s, vacuum tube sales still dominated the consumer electronics market. Ten years after the invention of the transistor, vacuum tubes were outselling transistors by more than 13 to 1. With transistors being far more expensive than vacuum tubes, the manufacturers of radios and televisions had little incentive to switch to solid-state devices. The early makers of computers faced a similar cost constraint: replacing thousands of tubes with transistors would have been a very expensive proposition.

For the military, however, the transistor was essential to the progress of its weapons and mobile communications systems. Increasing complexity characterized the design of new weapons systems in the 1950s. More and more components were being jammed into circuits to do ever more sophisticated tasks. Increasing complexity translated into physically larger systems, and it also meant correspondingly higher energy demands and heat dissipation problems. There were limits to the number of electronic components that one could stuff into an airplane or missile, so miniaturization was essential. Increased complexity also brought problems of reliability: as the number of components (particularly vacuum tubes) increased, the “mean time between failures” (MTBF) of the entire system got shorter. The reliability problem was compounded by the less than ideal conditions in which these systems had to operate. The more sophisticated the system, the more likely it was to fail. To the military mind, the implications were truly frightening. In 1953, one senior U.S. Navy officer, referring to the lessons of Mary Shelley’s novel, captured this fear in the following words: “like the creator of Frankenstein we have produced devices which in the hands of the operating forces, are so unreliable that they could lead to our ultimate destruction.” The complexity, miniaturization, and reliability difficulties were further accentuated by the military’s obsession with ultimate performance. Because of the relative simplicity of radios, phonographs, and televisions, complexity was never such an issue for the consumer electronics industry.
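A back-of-the-envelope sketch shows why rising component counts alarmed military engineers. Under the standard simplifying assumption of independent components with constant failure rates λ_i arranged in series, the failure rates add, so the system-level mean time between failures is

\mathrm{MTBF}_{\mathrm{system}} \;=\; \frac{1}{\lambda_1 + \lambda_2 + \cdots + \lambda_n} \;=\; \frac{1}{\sum_{i=1}^{n} 1/\mathrm{MTBF}_i}.

With purely illustrative numbers, a system built from 1,000 vacuum tubes, each with an individual MTBF of 10,000 hours, would on average fail every 10 hours, and doubling the component count would cut that figure in half.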

The military devoted a great deal of money both to miniaturizing the vacuum tube and to looking for alternatives. Government money poured into transistor R&D at Bell Labs and elsewhere. Through its willingness to pay for expensive components, the military provided a much-needed revenue stream to transistor companies. Digital technology was the ultimate expression of the increasing complexity of post-WW II electronic technology. Government helped both the computer and semiconductor industries get through the early, high-risk part of the technology curve; the U.S. private sector then took over and turned digital electronics into a global, mass-market revolution.

A 1968 report by the OECD on the nature and origins of America’s global dominance in semiconductors concluded that U.S. leadership had been made possible by the active participation of the U.S. federal government. In those early, uncertain years of the new technology, government agencies, particularly those with an interest in defense, were willing to support solid-state R&D, to subsidize the engineering effort required to install production capacity, and to pay premium prices to procure new devices when the civilian market was incapable of doing so. With its deep pockets, the U.S. military thus helped sustain a young semiconductor industry as it “learned by doing.”

Global Positioning System (GPS)

It is becoming difficult to imagine everyday life without the Global Positioning System (GPS). Embedded in most smartphones, and even in automobiles, GPS is changing the way products and services are marketed to consumers. Surprisingly, however, the origins of this technology had absolutely nothing to do with consumer demand or the marketing vision of private enterprise.

GPS had its origins in Transit, a satellite system deployed by the U.S. Navy in 1964 to determine the positions of its ships and ballistic missile submarines across the world’s oceans. The U.S. Air Force was also interested in Transit as a real-time navigational tool for its aircraft, but Transit was inadequate for the Air Force’s needs: ships move over a two-dimensional surface, while planes fly in three dimensions. So the Air Force started on a system of its own. Prompted by cost and the need for inter-service interoperability, the Department of Defense (DOD) called for a coordinated approach. In 1968, DOD established a tri-service steering committee called the Navigation Satellite Executive Group (NAVSEG). NAVSEG spent several years hammering out a set of specifications (the number of satellites, type of orbit, signal protocols, and modulation techniques) and developing cost estimates that all of the services could accept. By 1973, the group had reached a compromise, and in 1974 DOD started the long-term project to build the NAVSTAR GPS. By 1994, the 24th and final GPS satellite was in orbit, and by 1995 DOD stated that it had spent about $8 billion to develop and deploy GPS. To this day, DOD owns and operates the system, whose 24 satellites occupy six orbital planes and circle 20,200 km above the Earth in 12-hour orbits.
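As a rough consistency check on those figures, Kepler’s third law relates the orbital period T to the size of the orbit. Taking the Earth’s standard gravitational parameter μ (about 3.986 × 10^14 m³/s²) and a semi-major axis a equal to the Earth’s radius (about 6,378 km) plus the 20,200 km altitude gives

T \;=\; 2\pi\sqrt{\frac{a^{3}}{\mu}} \;=\; 2\pi\sqrt{\frac{(2.66\times 10^{7}\,\mathrm{m})^{3}}{3.986\times 10^{14}\,\mathrm{m^{3}/s^{2}}}} \;\approx\; 4.3\times 10^{4}\,\mathrm{s} \;\approx\; 12\ \mathrm{hours},

in agreement with the roughly 12-hour (half a sidereal day) orbits flown by the GPS constellation.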

Even before NAVSTAR GPS was fully operational, civilian markets, predominantly the maritime and aviation industries, were knocking on DOD’s door asking for access to the system. No private firm could have justified building such a system: how could one ever get a reasonable return on such a massive investment, and where was the market to justify the great financial risks? Yet once the system existed, supply created its own demand.

In 1983, prompted by the downing of Korean Air Lines Flight 007, President Reagan allowed civilian aviation to share GPS technology. In 1991, the United States made GPS available on a continuous basis for civilian use around the world. However, because of security concerns, the Department of Defense deliberately reduced the accuracy of the GPS signal available to the civilian sector. In 2000, when President Clinton ordered this “intentional degradation” discontinued, GPS became a true mass-market technology, finding its way into automobiles, recreational vessels, and the countless “apps” written for the new generation of smart devices, to name but a few examples. Few could have imagined the diverse applications GPS could accommodate and the mass-market appeal it could generate, and it is hard to imagine any profit-driven institution committing to spend billions of dollars to find out.

The Early Years of Lithium Battery Technology

By the 1960s, progress in battery technology was running up against the constraints of Faraday’s Law of Electrolysis. Simply put, with energy densities having reached a plateau, any increase in battery output required a corresponding increase in the battery’s size: double the output, double the size. Lithium chemistry offered higher energy densities, and it had first been suggested for pacemaker batteries in the 1960s. But only in the 1970s did a serious effort begin, at Exxon, to produce a lithium battery. Industrial R&D on the battery, however, soon ended. Lithium was a difficult metal to work with because of its extreme reactivity, and much expensive basic research was needed before lithium could become a viable technology. For battery manufacturers, the technical and market uncertainties associated with lithium were too high to justify large R&D investments and the great expense of retooling production for this new battery technology. But the U.S. Army took a great interest in lithium technology.
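Faraday’s law makes that size constraint, and lithium’s appeal, concrete. The mass m of active electrode material needed to deliver a charge Q is fixed by the chemistry:

m \;=\; \frac{Q\,M}{z\,F},

where M is the material’s molar mass, z the number of electrons transferred per ion, and F ≈ 96,485 C/mol is the Faraday constant. For a given chemistry, capacity therefore scales in direct proportion to the mass of active material: double the output, double the material. Lithium’s very low molar mass (about 6.94 g/mol) and single-electron transfer translate into a theoretical specific capacity for lithium metal of roughly 3,860 mAh per gram, far higher than that of the heavier electrode metals used in earlier batteries.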

With the ever-growing number of electronic technologies used by the Army, the development of lighter-weight, cost-effective power sources with higher energy densities had become essential to the deployment of new battlefield devices. As a result, in the 1980s the U.S. military started to underwrite much of the basic R&D needed to develop the lithium battery. Progress was made, although the reader may recall that as late as 2003 a shortage of lithium batteries affected the scheduling of operations in the second Gulf War.

Throughout the 1980s, millions of lithium batteries were sold, but only as an expensive technology for a specialized market: the military. The consumer market opened up when, in the early 1990s, Sony, in collaboration with the Asahi Chemical Co., pioneered the lithium-ion battery. Although the availability of lithium batteries for consumers was a private-sector accomplishment, it rested on some 15 years of R&D, overseen mainly by the military, on lithium approaches to the battery. And although the lithium-ion battery would in time become the heart of consumer electronics, in the early years the big user remained the military. In 1995, the U.S. Army introduced the rechargeable BB-2847 lithium-ion battery for night vision equipment. While the U.S. military led the way in different lithium technologies, America’s private sector missed the opportunity to grab the baton and lead in lithium-ion battery technology. As the Japanese were scaling up lithium-ion battery production in the 1990s, a Joint Battery Industry Sector Study, led by U.S. and Canadian military services, expressed concern over the slow diversification of North American industry from military to civilian applications. The legacy of this issue can still be seen today: technical competence in battery technology will be crucial to the future of the electric car industry and may determine which nation becomes the technological leader in this field.

Conclusion

Over the course of these three articles, we have demonstrated the long, broad, and deep role of the U.S. government in fostering technological innovation. Initially the focus was on creating a level playing field for innovators (patent protection is explicitly provided for in the Constitution) and on investment in the development of military technology, as at the Springfield Armory. Ultimately such investment spread to transportation, telecommunications, and public health. After the Civil War, investment continued in the form of land-grant colleges and the beginnings of the federal laboratory system. The two World Wars and the Cold War sparked ever greater investment in research and development, both directly in the federal laboratories and in contract laboratories, often partnered with universities. The pattern has been one of the federal government funding, and sometimes directing, research in areas where there is a long-term potential benefit for the nation but where such research is too expensive, or its outcome too unclear, for a private entity to risk the investment.

The improvements to society brought about by this federal involvement are too numerous to cover in three short articles; we have just scratched the surface here. We could just as well have discussed the ARPANET, a computer communications network developed under the stewardship of the Department of Defense’s Advanced Research Projects Agency (ARPA). ARPANET was the paradigm and foundation for the subsequent development of the Internet. For more information on this and other important stories, the reader is urged to visit the IEEE Global History Network.

While reasonable people can disagree about the appropriate levels of federal research and development activity for the 21st century, we hope that any discussions on the matter will take this rich history into account.

References

Here are some of the works consulted for this series of articles, which the reader may want to reference for further detail.

Ernst Braun and Stuart MacDonald, Revolution in Miniature, (Cambridge: Cambridge University Press, 1982)

Robert Buderi, The Invention that Changed the World: How a Small Group of Radar Pioneers Won the Second World War and Launched a Technological Revolution (New York: Simon & Schuster, 1996)

H.A. Christopher, S. Gilman, and R. P. Hamlen, “U.S. Army Research Laboratory Power Sources R&D Programs”, IEEE AES Systems Magazine, May 1993, 7-10.

Lewis Coe, The Telegraph: A History of Morse’s Telegraph and its Predecessors in the United States, (Jefferson, NC: McFarland & Co., 1993)

A. Hunter Dupree, Science in the Federal Government: A History of Policies and Activities to 1940, (Cambridge, MA: Belknap Press of Harvard University Press, 1957)

Kenneth Flamm, Creating the Computer: Government, Industry, and High Technology, (Washington D.C.: The Brookings Institution, 1988)

Victoria A. Harden, “A Short History of the National Institutes of Health,” National Institutes of Health website, https://history.nih.gov/exhibits/history/index.html, accessed 11 July 2011.

David Mowery and Nathan Rosenberg, “The Commercial Aircraft Industry”, in Government and Technical Progress, Richard Nelson (ed.), (New York: Pergamon Press, 1982), 101-162.

Office of Technology Assessment, Congress of the United States, A History of the Department of Defense Federally Funded Research and Development Centers, (Washington, D.C.: Government Printing Office, June 1995)

Emerson W. Pugh and Lars Heide, “IEEE STARS: Punched Card Equipment,” IEEE Global History Network website, https://ethw.org/STARS:Punched_Card_Equipment, accessed on 11 July 2011.

Richard Rhodes, The Making of the Atomic Bomb, (New York: Simon & Schuster, 1986)

Merritt Roe Smith, Harper’s Ferry Armory and the New Technology: The Challenge of Change, (Ithaca, NY: Cornell University Press, 1977)

Roger L. Williams, The Origins of Federal Support for Higher Education: George W. Atherton and the Land-Grant College Movement, (University Park, PA: Pennsylvania State University Press, 1991)

G. Pascal Zachary, Endless Frontier: Vannevar Bush, Engineer of the American Century, (New York: The Free Press, 1997)
