Photo Credit: iStock.com/the_guitar_mann
The recent Boeing 737 MAX crashes, in both of which all passengers and crew were lost, brought two issues to the fore: how today’s pilots can best interface with planes that can “fly themselves,” and how, and by whom, a complex technology is approved for public use when the public itself has little or no input or control.
The 737 MAX was a product of Boeing’s decision to upgrade its highly successful “Baby Boeing” 737, introduced in the 1960s, more than 10,000 of which had been purchased by airlines worldwide. The MAX was intended to compete with its rival, the recently introduced Airbus A320neo. Even before its certification in 2017 by the Federal Aviation Administration (FAA), Boeing had orders for more than 5,000 MAXes from more than 100 carriers. Deliveries began in May 2017.
The 737 MAX employs larger, more powerful engines, mounted significantly farther forward than on conventional 737s. To help prevent stalls on the MAX, Boeing developed a new flight-control system, the Maneuvering Characteristics Augmentation System (MCAS). It is intended to swivel the tail’s horizontal stabilizer to push the plane’s nose down if it rises too swiftly, threatening a stall.
The Seattle Times reported that, in contrast to the Boeing tradition of giving the pilot complete control of the aircraft, the MCAS flight control system was designed to act independently, without pilot input. Possibly as a result, pilot training exercises for the MAX may not even refer to the MCAS, and the 737 MAX pilot manuals provide no information about it.
The MCAS responds to input from an angle-of-attack sensor. Two such sensors are mounted on the forward exterior of the MAX, one on either side. Boeing made it optional for purchasers of the MAX to acquire an angle-of-attack indicator, which displays the readings of the two sensors to the pilots, and a “disagree light,” which is activated when the two readings differ. The FAA has mandated installation of neither the indicator nor the disagree light, though both may well be required on the 737 MAX in the future. With them in place, a serious disagreement between the readings of the two sensors can automatically disable the MCAS so the pilot can take control. American Airlines and Southwest Airlines had both safety features installed in the 737 MAXes already in their fleets, while United Airlines refrained because its pilots “use other data to fly the planes,” according to a United spokesperson.
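The cross-check described above, in which a large disagreement between the two angle-of-attack sensors lights the disagree indicator and disables the MCAS, can be sketched in a few lines. This is a hypothetical simplification for illustration only: the threshold value and function name are invented here and are not Boeing’s actual parameters or logic.

```python
# Hypothetical sketch of the sensor cross-check described above.
# The threshold and names are invented for illustration; they are
# not Boeing's actual design values.

DISAGREE_THRESHOLD_DEG = 5.5  # illustrative value only


def check_aoa_sensors(left_deg: float, right_deg: float):
    """Compare the two angle-of-attack readings.

    Returns (disagree_light_on, mcas_enabled). If the readings differ
    by more than the threshold, the disagree light comes on and the
    MCAS is disabled so the pilot retains control.
    """
    disagree = abs(left_deg - right_deg) > DISAGREE_THRESHOLD_DEG
    return disagree, not disagree


# Agreeing sensors: no light, MCAS stays active.
assert check_aoa_sensors(4.0, 4.2) == (False, True)
# Large disagreement (e.g., one faulty vane): light on, MCAS disabled.
assert check_aoa_sensors(4.0, 22.0) == (True, False)
```

The point of the sketch is only that the cross-check is trivially cheap to compute; what the crashes exposed was that, without the optional indicator and light, no such comparison reached the crew.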
There proved to be discrepancies between Boeing’s specifications for the operational functioning of the MCAS and its actual behavior in service. Boeing’s report to the FAA stated that the MCAS could move the stabilizer a maximum of 0.6 degrees, but in service it could move it 2.5 degrees. Boeing also did not account for the way the MCAS reset itself each time a pilot reacted, enabling the system to push the plane’s nose down repeatedly. And finally, the input required to activate the MCAS was specified to be based on the readings of both sensors, but in fact was based on just one.
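The consequence of the reset behavior described above can be made concrete with a small calculation: if the MCAS re-arms after each pilot correction and commands its full increment again, the commanded deflection accumulates. The 0.6- and 2.5-degree figures come from the article; the little function itself is an invented illustration, not the actual control law.

```python
# Hypothetical illustration of the reset behavior described above:
# each time the pilot counters the nose-down command, MCAS re-arms
# and can command its full increment again. The degree figures are
# from the article; the accumulation model is illustrative only.

SPEC_LIMIT_DEG = 0.6    # maximum movement per Boeing's report to the FAA
ACTUAL_LIMIT_DEG = 2.5  # movement observed in service


def cumulative_nose_down(pilot_corrections: int,
                         per_activation_deg: float) -> float:
    """Total stabilizer deflection commanded if the MCAS resets after
    each pilot correction and then activates once more."""
    activations = pilot_corrections + 1  # initial activation plus re-arms
    return activations * per_activation_deg


# After three pilot corrections, a resetting MCAS commanding the
# in-service 2.5 degrees has demanded 10.0 degrees in total, far
# beyond what the 0.6-degree specification would suggest.
assert cumulative_nose_down(3, ACTUAL_LIMIT_DEG) == 10.0
assert round(cumulative_nose_down(3, SPEC_LIMIT_DEG), 6) == 2.4
```

Under this simple model, the combination of the larger-than-specified increment and the reset behavior multiplies, which is consistent with the repeated nose-down excursions recorded in the accident flight data.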
In 2016, the U.S. Transportation Department reported that the FAA had not ensured that airlines adequately trained pilots in hands-on flying, or even in monitoring a plane on autopilot. It was feared that newly trained pilots were becoming button-pushers who might not be able to fly the planes when automation failed.
Even though the training of airline pilots may include the FAA-specified 1,500 hours of flight time and up to 30 days in a simulator, once in the cockpit of an advanced jet they quickly come to rely on the autopilot. To compensate, some airlines urge pilots to turn the automation off on “beautiful” days to keep their manual skills intact.
With the continuing growth of air travel, young pilots often do not accumulate the 1,500 hours of flight time specified by the FAA; in the recent Ethiopian crash, one of the pilots had only 200 hours. Today, pilot time is largely devoted to programming and monitoring equipment, inputting data, and checking operation. Young Asian and Middle Eastern pilots have limited or no experience in military planes or in other local flying opportunities, and so must rely heavily on automation. In an emergency they must decide quickly which button to push, and may be too busy to notice what the plane is actually doing—or not doing.
As flight technology grows in complexity, who best defines safety procedures, and who should monitor their application?
In 2005, the FAA’s Organization Designation Authorization (ODA) program enabled in-house inspectors at Boeing to oversee testing and design review; the program was fully implemented by 2009. In 2018, Congress mandated that the FAA fully delegate safety functions to industry.
It made some sense. Who knows the strengths and weaknesses of a product better than its designers, engineers, and manufacturers? Or, in some cases, its users.
Yet engineering professor Henry Petroski, in his book Design Paradigms, warned that once a design becomes accepted, incremental extrapolation in succeeding generations tends to be the norm and first principles tend to be overlooked. Also, design decisions that may prove to be critical are often relegated to less experienced engineers.
It may be unwise to exclude third parties (e.g., government agencies) completely, or, conversely, to grant them excess power to certify the safety of completely new products.
Following the 2010 Deepwater Horizon oil rig disaster, an industry-financed study concluded that a mentality existed among rig operators that “I don’t want to find problems; I want to do the minimum necessary to obtain a good test.” The study concluded that companies cut corners on federally mandated tests of blowout preventers.
The recent airliner accidents no doubt send a foreboding message to developers of autonomous vehicles (AVs). One major difference: in an AV there will be no on-board pilot to help avoid a serious accident—or to share the blame for failing to do so.
Your thoughts are welcome.
- Gates, D., “Flawed analysis, failed oversight: How Boeing, FAA certified 737 MAX flight control system,” seattletimes.com, retrieved Mar. 18, 2019.
- Nicas, J., and Z. Wichter, “As Pilots Rely on More Automation, Their Skills and Confidence Erode,” The New York Times, 15, 2019, p. A13.
- Gebrekidan, S., J. Glanz, and J. Nicas, “Questions Rise As Flight Data Links Crashes: Similarities Are Found In Crashes’ Data,” The New York Times, 18, 2019, p. A1.
- “Difference Engine: Crash Program? Cockpit Automation,” https://www.economist.com/babbage/2013/08/26/difference-engine-crash-program retrieved Mar. 17, 2019
- Fitzpatrick, A., “Boeing, the FAA and newly nervous flyers,” Time, Mar. 25, 2019, p. 8.
- Casner, S., “Dumbing It Down in the Cockpit: As automation gets sharper, pilots’ thinking skills are getting duller,” https://slate.com/technology/2014/12/automation-in-the-cockpit-ismaking-pilots-thinking-skills-duller retrieved Mar. 17, 2019
- “Cockpit Automation: Advantages and Safety Challenges,” https://skybrary.aero/index.php/Cockpit_Automation_Advantages_and_Safety_Challenges retrieved Mar. 17, 2019
- OIG Audit Report: Enhanced FAA Oversight Could Reduce Hazards Associated with Increased Use of Flight-Deck Automation 2016 http://www.skybrary.aero/bookshelf/books/4077.pdf
- Gelles, D., and T. Kaplan, “How Boeing’s Jet Was Deemed Safe Gets Closer Look: Was a Revision Missed?,” The New York Times, 20, 2019, p. 2.
- Lavin, A., and H. Suhartono, “Pilot Who Hitched a Ride Saved Lion Air 737 Day Before Deadly Crash” retrieved Mar. 21, 2019
- Dahrul, F., H. Suhartono, and K. Park, “Indonesia is Interviewing the Off-duty Pilot Aboard Lion Air Flight” retrieved Mar. 21, 2019
- Tabuchi, H., and D. Gelles, “Doomed Jets Lacked 2 Key Safety Features that Boeing Sold as Extras,” The New York Times retrieved Mar. 21, 2019
- Hennigan, W. J., “Second-Hand Safety: Investigation probes the FAA and Boeing after the deadly crashes,” Time, 1, 2019.
- Christiansen, D., “The ‘Inconceivable’ Consequences of Failure,” The Best of Backscatter, Vol. 5, IEEE-USA, 2016.
- Christiansen, D., “When Engineers Should Say ‘No’,” The Best of Backscatter, Vol. 5, IEEE-USA, 2016.
- Christiansen, D., “Accidents Waiting to Happen,” The Best of Backscatter, Vol. 1, IEEE-USA, 2008.
- Petroski, H., Design Paradigms, Case Histories of Error and Judgment in Engineering, Cambridge University Press, 1994.
- Layton, E., The Revolt of the Engineers, Case Western Reserve University Press, 1971; reprinted with a new introduction by the author in 1986, The Johns Hopkins University Press.
- Christiansen, D., “Who’s in Charge Here?,” The Best of Backscatter, Vol. 1, IEEE-USA, 2000.
- Kitroeff, N., D. Gelles, J. Glanz, and H. Beech, “Pilots Followed Boeing Checklist Before Crashing: An Inquiry in Ethiopia,” The New York Times, Apr. 5, 2019, p. 1.
- Gelles, D., “Boeing Curbs Production of 737 MAX, Citing Safety,” The New York Times, 6, 2019, p. B1.
- Smithsonian Channel, “Air Disasters” series.
- Edwards, E., “Automation in Civil Transport Aircraft,” Applied Ergonomics, Dec. 1977, p. 194.