As airlines and safety regulators worldwide scramble to understand why two Boeing 737 Max 8 jets crashed in chillingly similar accidents, more evidence points to an automated anti-stall system as a link to the model's unusually deadly debut. The safety feature—the Maneuvering Characteristics Augmentation System (MCAS)—appears to have sent both planes into their fatal dives as pilots struggled to keep them aloft. The 737 Max 8 and 9 were grounded by regulators around the world last week. Here are key details that have been reported—most significantly by the Seattle Times—about the series of engineering, regulatory, and political missteps that preceded the software being installed on a widely used plane, apparently without pilots fully understanding its risks.
The origin of the expression is as follows. It was said that a group of fishermen caught a large number of turtles. After cooking them, they found out at the communal meal that these sea animals were much less edible than they had thought: not many members of the group were willing to eat them. But Mercury happened to be passing by. Mercury was the most multitasking of the gods, as he was the boss of commerce, abundance, messengers, and the underworld, as well as the patron of thieves and brigands and, not surprisingly, luck. The group invited him to join them and offered him the turtles to eat. Detecting that he had been invited only to relieve them of the unwanted food, he forced them all to eat the turtles, thus establishing the principle that you need to eat what you feed others.
Often, we need fast answers with limited resources. We have to make judgements in a world full of uncertainty. We can't measure everything. We can't run all the experiments we'd like. You may not have the resources to model a product or the impact of a decision. How do you balance fast answers against correct ones? How do you minimize uncertainty with limited resources?
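One way to make that tradeoff concrete is sampling: a rough estimate from a handful of observations is fast, and its uncertainty shrinks only with the square root of the sample size, so precision gets expensive. Here is a minimal Monte Carlo sketch in Python; the quick_estimate helper and the revenue numbers are hypothetical, for illustration only, not something from the excerpt above.

    import random
    import statistics

    def quick_estimate(sample_fn, n):
        """Estimate the mean of an uncertain quantity from n samples.
        Returns (point estimate, approximate 95% confidence half-width)."""
        samples = [sample_fn() for _ in range(n)]
        mean = statistics.mean(samples)
        # The standard error shrinks as 1/sqrt(n): quadrupling the samples
        # only halves the uncertainty, so rough answers are cheap and
        # precision gets expensive.
        se = statistics.stdev(samples) / n ** 0.5
        return mean, 1.96 * se

    # Hypothetical stand-in for an expensive measurement or experiment:
    # a noisy daily-revenue figure we can only afford to sample a few times.
    def daily_revenue():
        return random.gauss(1000, 250)

    for n in (10, 100, 1000):
        mean, half_width = quick_estimate(daily_revenue, n)
        print(f"n={n:4d}: estimate {mean:7.1f} +/- {half_width:5.1f}")

Under these assumptions, going from 10 to 1,000 samples buys a roughly tenfold narrower interval for a hundredfold more work, which is exactly the resource-versus-uncertainty tradeoff the questions above are pointing at.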
Some of this fear results from imperfect risk perception. We're bad at accurately assessing risk; we tend to exaggerate spectacular, strange, and rare events, and downplay ordinary, familiar, and common ones. This leads us to believe that violence against police, school shootings, and terrorist attacks are more common and more deadly than they actually are – and that the costs, dangers, and risks of a militarized police force, a school system without flexibility, and a surveillance state without privacy are less than they really are.
There are nine or so principles for working in a world like this. Resilience instead of strength, which means you yield and allow failure and bounce back instead of trying to resist failure. Pull instead of push, which means you pull resources from the network as you need them, as opposed to centrally stocking and controlling them. You want to take risk instead of focusing on safety. You want to focus on the system instead of objects. You want to have good compasses, not maps. You want to work on practice instead of theory, because sometimes you don't know why it works; what is important is that it is working, not that you have some theory around it. It's disobedience instead of compliance: you don't get a Nobel Prize for doing what you are told. Too much of school is about obedience; we should really be celebrating disobedience. It's the crowd instead of experts. It's a focus on learning instead of education.