Upgraded Boeing 737 MAX commercial jets have returned to the skies. The blame game, however, goes on.
Can we find a principle that cuts through the spin?
Accidents Happen; Catastrophes Take Effort
A total of 346 people died in two separate crashes of Boeing 737 MAX jets in October 2018 and March 2019. The crashes triggered worldwide grounding of 737 MAXes pending investigations and fixes.
A September 2020 U.S. House Transportation Committee investigative report, penned by the Committee’s Democratic staff, blamed both Boeing and the Federal Aviation Administration (“FAA”) for “repeated and serious failures.” These included “a horrific culmination of a series of faulty technical assumptions by Boeing’s engineers, a lack of transparency on the part of Boeing’s management, and grossly insufficient oversight by the FAA.”
A High-Stakes Blame Game
We’ve seen this kind of story before.
Boeing and the FAA face a “prisoner’s dilemma.” Should they claim to have acted properly in consultation? Or will they each blame the other? Boeing may claim exoneration through FAA review and approval; the FAA may claim that Boeing misled or failed to make accurate and adequate disclosures.
Boeing has already taken multibillion-dollar hits to its stock price, 737 MAX contracts, and long-term reputation. Company CEO Dennis Muilenburg resigned in December 2019.
Going forward, billions of dollars in judgments and fines are at stake, as well as criminal charges against the company and its personnel. Boeing must also heal its wounded reputation.
Sequence of Events
Hindsight is of course 20/20. Here is the sequence of key decisions that not only initially put the 737 MAX into the air, but put two of them into the ground:
- Boeing decides to develop a more fuel-efficient version of its popular 737 aircraft
- Fuel efficiency requires engine selection and placement that change the plane’s handling characteristics, introducing in particular a tendency to stall
- To avoid costly simulator retraining of 737 pilots, Boeing develops an automated flight-control system (Maneuvering Characteristics Augmentation System, or “MCAS”). In theory, MCAS will prevent or react to a stall by pushing down the jet’s nose through incremental adjustments of up to 0.6 degrees each to the rear stabilizer. The system relies on input from a single sensor, which is industry practice, assuming the risk of a faulty reading is classified accurately. Some 737 MAX versions, including the two planes that crashed, lack alerts that would signal a malfunctioning sensor
- System designers and regulators also assume that pilots will diagnose and react to an overcorrection or faulty sensor within four seconds. Moreover, certain corrective actions by a pilot do not switch off the MCAS system, unlike some other automated systems. In this regard, MCAS resembles an automotive cruise-control that might, depending on how the driver tapped the brakes, maintain speed — even when the driver wanted to slow down or stop
- In response to test-pilot concerns over low-speed stalls, Boeing quadruples the amount the system can repeatedly move the stabilizer, to increments of 2.5 degrees. Reportedly, a top FAA pilot knows about the change, and Boeing mentions it in a letter and in a number of presentations to FAA officials. The change apparently does not require Boeing to file an updated safety assessment with the FAA. Senior FAA officials claim to have been left in the dark, some arguing that an update might have made a difference
- Boeing decides not to tell cockpit crews about MCAS or how it works. Draft-manual references describing MCAS and its purpose are deleted from the final manual delivered to customers. Reportedly, the company reasons that pilots have trained for years to deal with a comparable problem which requires the same response as an MCAS malfunction. Boeing and regulators apparently agree that pilots don’t need to know what causes a nose-down problem so long as they know how to respond
- Neither Southwest (the 737 MAX’s first and biggest customer) nor any other airline informs its pilots about MCAS. Southwest also deletes references to MCAS and related emergency procedures from its own pilot manuals
- The FAA signs off on 737 MAX training via laptop or tablet, waiving time-consuming and expensive simulator training
The cumulative effect of the above decisions is a system that might, in the event of a single-sensor malfunction, steer the airplane into the ground unless, within a four-second window, pilots take specific-but-uncommunicated corrective actions.
In response to the crashes, Boeing installed a second sensor. MCAS now activates only if both sensors agree, fires only once, and no longer overrides pilot use of the control column.
The company has provided customers and pilots with documentation on MCAS and related procedures. Simulator training has also been developed and rolled out.
Finally, Boeing has revised parameters for pilot response times and behaviors.
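The before-and-after behavior described above can be sketched in a few lines of pseudologic. This is a minimal illustration of the article’s description, not actual avionics code: the function names, the angle-of-attack threshold, and the trim values are invented for the example.

```python
# Illustrative sketch only (not Boeing's software): contrasts the original
# and revised MCAS activation logic as described in this article.
# STALL_AOA is a hypothetical angle-of-attack threshold, in degrees.
STALL_AOA = 15.0

def mcas_original(sensor_a, pilot_input):
    """Original design: a single sensor can trigger repeated 2.5-degree
    nose-down stabilizer increments; certain pilot inputs (pilot_input)
    do not switch the system off, so it is ignored here."""
    if sensor_a > STALL_AOA:   # one faulty sensor suffices
        return 2.5             # degrees of nose-down trim, repeatable
    return 0.0

def mcas_revised(sensor_a, sensor_b, already_fired, pilot_input):
    """Revised design: both sensors must agree, the system activates
    only once, and pilot control-column input overrides it."""
    if pilot_input or already_fired:
        return 0.0
    if sensor_a > STALL_AOA and sensor_b > STALL_AOA:
        return 2.5
    return 0.0

# A single stuck-high sensor commands nose-down trim under the original
# logic, but not under the revised dual-sensor logic.
faulty, good = 20.0, 5.0
print(mcas_original(faulty, pilot_input=True))                  # 2.5
print(mcas_revised(faulty, good, already_fired=False,
                   pilot_input=False))                          # 0.0
```

The sketch makes the article’s point concrete: under the original logic a lone faulty reading keeps commanding nose-down trim regardless of pilot input, while the revised logic requires sensor agreement, fires once, and defers to the pilot.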
Cutting Through The Spin
Boeing’s fixes pin down, from a technical perspective, the causes of the two 737 MAX crashes.
For the dead passengers and their families, these fixes represent too little, too late.
As noted above, hindsight is 20/20. Human nature also leads us to reason backward, to light upon causes that match our preconceptions or preferred explanations, to cherry-pick the facts, and to interpret the grey areas in light of our prejudices.
The Democratic staff of the House Committee cites corporate mismanagement and regulatory capture. Some people were evil or reckless.
On the other hand, a Wall Street Journal article points the finger at unreasonable expectations over pilot response and response time. Some people were negligent.
Concurrently, Boeing maintains that it built an aircraft that met regulatory standards, with proper disclosure of technical trade-offs and assumptions. In other words, the company built a plane that was not defective, but could be improved upon. Some people were dutiful but unlucky.
Grey areas limit our ability to cut through spin. There are ranges of reasonable actions and decisions. Here, we have opinions, but not certainties. We wrestle with problems and suggest improvements around the edges, but cannot hope to solve the underlying challenge.
For the 737 MAX, two such areas stand out. The first is Boeing’s safety culture. An ethical safety culture understands that building a safe plane and satisfying regulators are related but not the same. People face temptations to game the system. Did that happen here, consciously or unconsciously? It may be hard to say for sure.
Similar doubts arise with regard to regulatory capture. We don’t want regulators over-identifying with the companies they oversee, or yielding to political pressure. But neither can society afford an unyielding, adversarial, and completely risk-averse bureaucracy that kills industries in the name of safeguarding them.
Acting with unwarranted certainty in grey areas, we fool ourselves into thinking that every failure stems from a failing. This leads us to over-punish and overcorrect.
The Bright Line Of Principle
Cutting through the spin requires us to find a bright line of principle. In this regard, a long-standing rule of flying is telling a pilot the condition of his or her aircraft. This means not just its current state, but its operational characteristics and limits, flying procedures, and emergency protocols.
Boeing ostensibly put 737 MAX pilots at the center of operational safety, but withheld from them information on key features of the aircraft they were asked to fly.
Had the pilots known about MCAS, they might have refused to fly the plane, or first demanded simulator training and/or changes to MCAS. They certainly would have been better prepared to respond, and better positioned to share knowledge and feedback with other pilots, airlines, Boeing, and regulators.
The clear fault of the 737 MAX lies in Boeing, the FAA, and the airlines letting pilots risk their lives in the cockpit without letting them act as pilots. As aviation hero Sully Sullenberger testified before the House Committee: “[Not] even the existence of MCAS, much less its operation, was…communicated to the pilots who were responsible for safely operating the aircraft….”
Sullenberger also pointed out design flaws in the MCAS system. But those insights followed from his being aware of the system in the first place, and only then applying his real-world pilot’s judgment and experience to design issues.
In not trusting pilots with information on the 737 MAX’s MCAS system, what Boeing, the FAA, and airlines really communicated was their distrust of the plane.
End of spin. End of story.
(This blog first appeared in Forbes. Reprinted with permission.)