Suppose a new element is discovered. How soon can any likely dangerous or questionable uses of this element be determined? And can designers be constrained from developing innovative products and systems because of their possible unsafe, unethical, or other antisocial uses? These have been, and continue to be, open questions.
One attempt to address them has been through technology assessment (TA). Another more recent refinement of TA is termed “responsible research and innovation” (RRI).
The notion that research and innovation are the sole responsibility of the engineer or scientist is largely obsolete. Except for isolated cases of individual entrepreneurs, the development of innovative designs is guided (and limited) by financial resources, corporate objectives, public and customer interest, legal restrictions, and government regulations. Thus efforts to guarantee successful and “responsible” outcomes of research and innovation must necessarily involve all of these factors. This is particularly true for innovations involving large systems that affect the general populace, such as energy, transportation, and information networks, which raise major concerns of health, safety, privacy, and national security.
Perhaps the first major effort to identify, support, and promote desirable technical developments at the national level dates from 1972, when the Office of Technology Assessment (OTA) was created by the U.S. Congress. During its 24-year existence OTA conducted studies that yielded some 750 reports for the edification of and use by Congressional members and committees.
Among the OTA reports related to national and global security were Nuclear Proliferation and Safeguards (1977, 1995), Arms Control in Space (1984), Arming U.S. Allies (1990), Anti-terrorism Technology (1991, 1992), and Proliferation of Weapons of Mass Destruction (1999).
Transportation studies included Automated Guideway Transit Systems (1975), Community Planning for Mass Transit (case studies of ten major cities from Boston to San Francisco) (1976), Automatic Train Control (1976), Air Traffic Alert and Collision Avoidance (1989), and Tiltrotor Aircraft and Maglev Vehicles (1991).
OTA reports related to energy usage included Applications of Solar Technology (1978) and New Electric Power Technologies (1985).
Among hundreds of other OTA studies were these: Cancer Risks from the Environment (1981), Preparing for Science and Engineering Careers (1987), HDTV (1990), High-Temperature Superconductivity (1990), and Remotely-Sensed Data Technology (1994).
The OTA was shut down in 1995, when Congress, despite much criticism of the decision, withdrew its funding, then $21.9 million annually.
TA Lives On
The U.S. Government Accountability Office subsequently assumed some of the duties of the defunct OTA.
In 1994, the non-profit, bipartisan International Center for Technology Assessment was founded “to provide the public with full assessment and analysis of technological impacts on society.”
In 2010, the Woodrow Wilson International Center for Scholars, in a report on “Reinventing Technology Assessment,” proposed the creation of a “nationwide network of non-partisan policy research organizations, universities, and science museums” that would conduct “both expert and participatory technology assessments for Congress and other clients.”
With the advent of the European Union, technology assessment took on a new role, with the designation Responsible Research and Innovation (RRI) and with the added consideration of ethical matters and an emphasis on science, technology, and society issues.
The EU program for research and innovation has been designated “Horizon 2020.” Its objective is to “better align” the process and outcomes of research and innovation (R&I) with the “values, needs, and expectations of European society.” Its stated methodology is to (1) engage society more broadly in its R&I activities, (2) increase access to scientific results (open access), (3) ensure gender equality in both the research process and content, (4) account for the ethical dimension, and (5) promote both formal and informal science education.
In one definition of RRI, researchers and innovators are urged to anticipate risks and “design socially robust agendas for risk research and risk management” and to involve potential users and other stakeholders in their research.
In a philosophical argument over the term “responsible research,” some note that research itself cannot be “irresponsible.” Rather, irresponsibility can arise only in poorly controlled applications or misapplications of new technology, as, for example, in the case of nuclear weaponry or cyber-security issues. Thus “responsible application of technology” might be a more apt designation (its acronym notwithstanding). Unfortunately, many OTA and RRI studies suggest that needed intervention is often not undertaken, or even recognized as necessary, until a system has been expensively and widely deployed. And as existing systems age or expand in usage, unpredictable and sometimes devious applications are often devised.
The substantive question is whether the misuse or overuse of a technological innovation can be predicted at a sufficiently early stage, then avoided or limited through appropriate intervention (including national or global policies) prompted or aided by studies such as those of the OTA and RRI programs.
- Horizon 2020, the EU Framework Programme for Research and Innovation, https://ec.europa.eu/programmes/horizon2020/ (retrieved Aug. 20, 2015)
- International Center for Technology Assessment, https://www.icta.org/ (retrieved Aug. 18, 2015)
- Owen, R., Bessant, J., and Heintz, M., Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society, Wiley, 2013.
- RRI Trends (monitoring RRI in 16 European Countries) https://rritrends.res-agora.eu (retrieved Aug. 21, 2015)
- Bimber, B., The Politics of Expertise in Congress: The Rise and Fall of the Office of Technology Assessment, State University of New York Press, 1996.
- Mooney, C., “Requiem for an Office,” Bulletin of the Atomic Scientists, Sept. 2005, Vol. 61, No. 5.
- The Office of Technology Assessment Archive https://ota.fas.org/ (retrieved Aug. 21, 2015)
- OTA Publications Archive https://www.princeton.edu/~ota/ns20/alpha_f.html (retrieved Aug. 23, 2015)
- Christiansen, D., “When Designers Should Say ‘No’,” IEEE-USA Today’s Engineer, June 2010.
Donald Christiansen is the former editor and publisher of IEEE Spectrum and an independent publishing consultant. He is a Fellow of the IEEE. Comments may be e-mailed directly to the author at email@example.com.