There is no small complexity in the task of transporting hundreds of people through the sky at hundreds of kilometers an hour. More than 100,000 flights take off and land every day, but two fatal crashes in six months have shocked passengers, regulators, and the industry.
The crashes of Boeing 737 Max jets in Indonesia and Ethiopia offer a window into all that complexity. Boeing and its CEO Dennis Muilenburg want the story to be simple: a software problem that can be solved with a quick patch. But that account does not capture the mistakes made by Boeing and US aviation authorities in certifying the aircraft to carry passengers.
And all that pales beside what is likely to be the focus of the investigations into the incidents: the training and user experience of the people in the cockpits. Pilots did not receive enough training to understand how MCAS worked, and two important safety features – a display showing what the sensor had detected, and a clear warning when the sensors disagreed – were sold as optional extras.
Minimizing training and cockpit changes was an economic decision: the upgraded aircraft would be more attractive to potential buyers if they would not have to spend expensive hours retraining their pilots. The Federal Aviation Administration decided Boeing's training and safety plans were fine. Now investigators want to know why. The answers could be costly for Boeing, and for America's reputation as a leader in the safe deployment of aviation technology.
Software is easy to blame; for many people, computer science is a mystery. But these crashes stem from an experience we are all familiar with: the pressure to deliver on a tight schedule, the temptation to cut corners, and the hope that a small patch will not bring down an entire program in a large, complex system.
Read more of Quartz's coverage of the Boeing 737 Max crisis. This post was originally published in the weekend edition of the Quartz Daily Brief newsletter. Sign up here.