A couple of months ago, we checked in for a flight coming home from the holidays. A flight change (no surprise given the time of year) affected the second leg of our trip, between Denver and Manchester, N.H. To make sure we had the right boarding passes, we printed new ones from a kiosk, which printed our luggage tags at the same time. It didn’t dawn on me that our luggage would not be routed to the same destination we were. Apparently, this problem was already known to the airline involved, or at least to its lost-luggage personnel. Note that multiple software systems had to fail here: minutes after the kiosk printed the inconsistent luggage tags and boarding passes, the system “accepting” our luggage (a human agent, but with computer scans of the tags and passes) approved the tags as part of our ID and ticket confirmation.
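To make the point concrete, a cross-check as simple as the following sketch could have flagged the mismatch at the bag drop, before the suitcase left my hands. This is only an illustration; the field names and document structure are hypothetical, and real airline systems are certainly more involved.

```python
# Illustrative sketch only: a minimal consistency check between a boarding pass
# and a bag tag at acceptance time. Field names are hypothetical.

def bags_match_itinerary(boarding_pass: dict, bag_tag: dict) -> bool:
    """Return True only if the bag tag is on the same booking and routes to the
    passenger's final destination."""
    same_booking = boarding_pass["record_locator"] == bag_tag["record_locator"]
    same_destination = boarding_pass["final_destination"] == bag_tag["destination"]
    return same_booking and same_destination

# At the bag-drop scan, a mismatch should block acceptance and trigger a reprint,
# rather than silently approving the tag.
boarding_pass = {"record_locator": "ABC123", "final_destination": "MHT"}
bag_tag = {"record_locator": "ABC123", "destination": "DEN"}  # stale routing

if not bags_match_itinerary(boarding_pass, bag_tag):
    print("Routing mismatch: reprint bag tag before accepting luggage.")
```

The point is not the ten lines of code; it is that at least two systems had the data needed to perform this check, and neither did.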
Licensed “Professional Engineers” (PEs) exist in most engineering disciplines. In many states you cannot practice under the title “Engineer” unless you are a PE, although most states also have a “corporate exemption” under which company job titles may include “engineer” for non-PEs. In some states a company name cannot include “Engineering” if no PE is on staff, and a PE must sign off on engineering designs and plans, even if they were not produced by a PE or if software is critically involved. Typically, this is required where health and safety are involved. Will the design of a hospital device electrocute or over-radiate[i] the staff or patient? Will the bridge fail under a given load, or in high winds? Collapsing bridges and buildings, which still occur, are examples of engineering failures, or of failures to follow engineering plans and standards.
My suitcase is not a real health and safety issue (we are advised to carry on any critical medications), but it is an example of a software engineering failure in a country where software PE licensing has been all but ignored, and then abandoned. In 2019, after a few years of actually licensing software engineers, the NCEES[ii], the body that develops and administers the exams for PE licensing, dropped software engineering because too few people were taking the exam. Imagine for a moment deciding not to bother licensing surgeons because too few were taking the exam; just let anybody with a knife proceed as they wish.
Currently, I will guess that 90% of software, from my personal web page to phone apps that play solitaire, doesn’t affect health or safety or create significant corporate liability. So far, the airline that misrouted my luggage has provided more in ticket vouchers than I paid for my flights. Perhaps with ten more flights they will give me my own plane. Of course, they know that customer satisfaction has a price, even if they can’t fix known software bugs in a reasonable period of time. But airframe software, autonomous-vehicle software, and air traffic control software are dead center in the health and safety domain, with an emphasis on “dead.” Modern airframes, and even cars, have forward-looking radar (or should have it). So how can an airplane with this technology and altimeters actually fly into the ground? Pilot training, or misinformation from vendors or, unfortunately, engineers, may create a risk situation, but in a very real sense the plane had the information that the trajectory would be fatal and did not take action. A few high-profile semi-autonomous car accidents reflect the same failure.
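To illustrate what “the plane had the information” means, consider a toy terrain-closure check. This is emphatically not how certified ground-proximity or collision-avoidance systems are built; it is a sketch, assuming only two inputs the vehicle already has (height above terrain and descent or closure rate), showing that a warn-or-intervene decision can be computed from data already on board.

```python
# Toy illustration, not avionics: given height above terrain and descent rate,
# estimate time to impact and decide whether to warn or intervene.

def time_to_impact_s(altitude_m: float, descent_rate_mps: float) -> float:
    """Seconds until ground contact at the current descent rate (infinite if level or climbing)."""
    if descent_rate_mps <= 0:
        return float("inf")
    return altitude_m / descent_rate_mps

def terrain_alert(altitude_m: float, descent_rate_mps: float,
                  warn_threshold_s: float = 30.0,
                  intervene_threshold_s: float = 10.0) -> str:
    tti = time_to_impact_s(altitude_m, descent_rate_mps)
    if tti <= intervene_threshold_s:
        return "INTERVENE"   # e.g., automated pull-up or braking
    if tti <= warn_threshold_s:
        return "WARN"        # e.g., an audible "terrain" alert
    return "OK"

print(terrain_alert(altitude_m=200.0, descent_rate_mps=25.0))  # -> "INTERVENE"
```

The hard engineering is in validating the sensors, thresholds, and failure modes behind such a check, which is exactly where disciplined practice matters.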
I was aware of a failure at the Palmdale, California, air traffic control center some years back, where a failure to detect and properly respond to a system exception condition left a number of planes in the L.A. airspace without guidance; fortunately, they were saved by their collision-detection and avoidance software. In a September 2004 article, the L.A. Times reported on Palmdale software failures that occurred in 1999, 2000, 2001 and 2004[iii]. With the explosion (hopefully an exaggeration) of the Internet of Things, the risks of software failure are expanding rapidly, along with user frustration and likely liability for the corporations involved. Recently, Alexa suggested that a 10-year-old girl short an electrical outlet[iv], after years of reports that this “outlet challenge” was dangerous. While Alexa’s software engineering is not directly involved in selecting responses to “give me a challenge,” the failure to allow for, and take into account, feedback on the online posts it draws from is likely to become a liability that goes beyond a public relations problem.
In her 2021 book, This Is How They Tell Me the World Ends: The Cyber-Weapons Arms Race, Nicole Perlroth notes that one of the key steps toward more secure systems is having key software developers certified in cybersecurity concepts. Without good software engineering practices, however, that is only half of the solution. The challenge, of course, is for industry to recognize when PE-level capabilities are needed and to ensure that the right professionals, with the right remit, are assigned to these tasks. Industry demand for licensed software engineers, along with support for professionals to get the right training and complete the licensing process, is essential, as is NCEES restoring the exam.
At some point, liability for software failures will become painful enough to reach a tipping point. The IEEE Computer Society has developed the underlying international standards and Body of Knowledge[v] that form the basis of its Certified Professional Software Developer[vi] program; these also served as the basis for the NCEES licensing exam.
Nathaniel Borenstein is quoted as saying: “The most likely way for the world to be destroyed, most experts agree, is by accident. That’s where we come in; we’re computer professionals. We cause accidents.”[vii] It would be easier to face this intended humor if we had embraced the discipline, and even the licensing, that might minimize these risks.
[i] Leveson, Nancy, “The Therac-25: 30 Years Later,” Computer, vol. 50, no. 11, November 2017
[ii] The National Council of Examiners for Engineering and Surveying, https://ncees.org/
[iii] Malnic, Eric and Blankstein, Andrew, “System Failure Snarls Air Traffic in the Southland,” L.A. Times, https://www.latimes.com/archives/la-xpm-2004-sep-15-me-faa15-story.html (accessed 9 Jan 2022)
[iv] Hanson, Kait, “Amazon Alexa told 10-year-old to do dangerous ‘outlet challenge,’ mom says,” Today.com, https://www.today.com/parents/parents/amazon-alexa-device-challenges-10-year-old-outlet-challenge-rcna10250 (accessed 9 Jan 2022)
[v] Software Engineering Body of Knowledge (SWEBOK), IEEE Computer Society, https://www.computer.org/education/bodies-of-knowledge/software-engineering (accessed 9 Jan 2022)
[vi] Software Professional Certification Program, IEEE Computer Society, https://www.computer.org/education/certifications (accessed 9 Jan 2022)
[vii] Nathaniel S. Borenstein quotes, Goodreads, https://www.goodreads.com/author/quotes/604194.Nathaniel_S_Borenstein (accessed 9 Jan 2022)
Jim Isaak is President Emeritus of the IEEE Computer Society. He led IEEE standards efforts on web site engineering best practices and on operating system interface standards (POSIX/UNIX/Linux…), and is a volunteer with IEEE-USA policy committees addressing privacy, AI, cybersecurity and related issues.
Opinions expressed in this article are not necessarily those of IEEE or IEEE-USA.