It is not always easy to envision how a new technology will be used, or misused. To what extent might its developers (we engineers, usually) be expected to predict unforeseen uses, particularly those that may prove harmful or dangerous? As engineers, we are noted for our ability to develop new, technologically sophisticated products, but not necessarily for being good at, or even interested in, predicting how they may be misused.
I am intentionally omitting from this discussion those cases in which the developers should have been aware of some key weakness (e.g., a particular failure mode) and must therefore take responsibility for harm to any user who has not violated the specified conditions of use.
In any particular case, the engineering community is hesitant to posit and publicize applications of a new product that have not been tested, even if they might seem to pose no negative consequences. And we are not likely to publicize uses that may be harmful or even criminal.
Yet when a new product or system becomes widely available to the general public, its government, or to foreign nations (friendly or otherwise), its uses are no longer controllable by its developers, and thus the ability to limit misuse resides with the public and/or its appointed or elected representatives.
Some analysts thus propose that fewer unsafe products, or products with otherwise unwanted consequences, might find their way to market if both the general public and our elected representatives were more technically literate. They argue that pertinent laws and intelligent regulations could then rein in harmful applications, or even prohibit a potentially dangerous product from reaching the market. But opponents of this notion view it as a pipe dream, citing the following issues.
First, they note that the technical competency of U.S. students, notably neglected in the 1960s and beyond, has not since improved, nor has the technical literacy of the U.S. Congress. Second, today's youngsters and their parents (and even some grandparents) consider themselves as technically literate as they need be, since they are skilled at using all the new technology that comes their way, even though they may not know what's in it or how it works.
What users really need to be more concerned about are the effects of using these high-tech products, say the critics. Much of today’s high tech is computer-based, and the potential downside of embedding oneself online is well known, including its addictive nature, the avoidance of face-to-face socializing, cyberbullying, invasion of privacy, phishing, loss of security, and financial crime.
Might these problems be ameliorated in any way if the users of the burgeoning crop of computer-based tools and toys were made more familiar with how they actually work? The critics of the “technical literacy will solve it” theory are skeptical. They note that the engineering community itself is not particularly comfortable in providing advice to lawmakers or the general public on ways to curb the ill-advised uses of high-tech devices. Indeed, we are notably ill at ease in dealing with the social or institutional aspects of an issue because the facts are often hard to fathom and the application situations are never static.
And finally, a few extreme skeptics have also floated the notion that if by some unlikely means members of the general public could be made (nearly) as technically literate as scientists and engineers, a significant number of them might then find even more ways to use new high-tech devices for devious purposes.
The subject of how military technology is developed and deployed is a topic unto itself, a companion to the broader issue of the methodology and ethics of warfare. Military technology, at whatever level of sophistication, can be and always has been "misused." Even weaponry or technology that had been ostensibly developed for peaceful or defensive purposes only has inevitably been used offensively, nuclear weaponry being the classic example.
To say it is a sensitive topic is hardly necessary. In a 1984 survey of IEEE members residing in the United States, 86 percent listed the nuclear arms race and arms control as a top concern. Two years earlier, 72 percent of respondents to another IEEE survey said that methods of test-ban verification should be a concern of the IEEE itself. Yet among the minority who disagreed, one said “Nuclear arms as an issue is far too large and complex for the IEEE to become involved.” And another: “We have no business interfering when we have no competence.”
No Easy Solutions
However desirable they might be, it seems doubtful that methodologies can be developed to reduce how often we deploy new high-tech products or systems that may invite or enable unexpected consequences.
And although engineers are concerned about the societal problems that could result from the uses of "our" technology, it appears that we are better conditioned to develop countermeasures, including technology that renders the offending products obsolete and thus lessens their ill effects.
My neighbor summed up the seemingly never-ending challenge: “You guys do some good stuff. But the tendency of your technologies to run amok must require your attention 24-7. I mean: Nukes, bad-boy computers, planes that disappear in mid-flight, and now self-driving cars and drones.”
I had no immediate response.
Donald Christiansen is the former editor and publisher of IEEE Spectrum and an independent publishing consultant. He is a Fellow of the IEEE. You can write to him at firstname.lastname@example.org.