FAILURE IS ALWAYS AN OPTION!
Technically speaking, nothing. Werner has no background in skating. But I believe he is one of us.
This suspicion started a couple of years ago when I stumbled upon Werner Herzog – A Guide for the Perplexed. Not knowing much about him, I skimmed the back cover, which eerily read like a skater’s manifesto. Werner preaches maxims like getting the shot by any means necessary, carrying bolt cutters everywhere, and thwarting institutional cowardice with guerrilla tactics. His film school teaches lock picking and forgery, and his entire career has been built on a DIY approach to life, his craft banged into existence through decades of trial and failure.
Because Werner’s approach to life and filmmaking mirrors the ethos of skating in so many ways, I decided to track him down to chat about the similarities and differences between our two worlds.
As airlines and safety regulators worldwide scramble to understand why two Boeing 737 Max 8 jets crashed in chillingly similar accidents, more indications are pointing to how an automated anti-stalling system may be linked to the model’s unusually deadly debut. The safety feature—the Maneuvering Characteristics Augmentation System (MCAS)—appears to have sent both planes into their fatal dives as pilots struggled to keep aloft. The 737 Max 8 and 9 were grounded by regulators around the world last week. Here are key details that have been reported—most significantly by the Seattle Times—about a series of engineering, regulatory, and political missteps that preceded software being installed on a widely used plane without pilots apparently fully understanding its risks.
via https://qz.com/1575509/what-went-wrong-with-the-boeing-737-max-8/
Some people are calling the 737MAX tragedies a #software failure. Here’s my response: It’s not a software problem…
via https://threadreaderapp.com/thread/1106934362531155974.html
Any human with above room temperature IQ can design a utopia. The reason our current system isn’t a utopia is that it wasn’t designed by humans. Just as you can look at an arid terrain and determine what shape a river will one day take by assuming water will obey gravity, so you can look at a civilization and determine what shape its institutions will one day take by assuming people will obey incentives. But that means that just as the shapes of rivers are not designed for beauty or navigation, but rather an artifact of randomly determined terrain, so institutions will not be designed for prosperity or justice, but rather an artifact of randomly determined initial conditions. Just as people can level terrain and build canals, so people can alter the incentive landscape in order to build better institutions. But they can only do so when they are incentivized to do so, which is not always. As a result, some pretty wild tributaries and rapids form in some very strange places.
via http://slatestarcodex.com/2014/07/30/meditations-on-moloch/
[John MacWilliams’s] more general point was that managing risks was an act of the imagination. And the human imagination is a poor tool for judging risk. People are really good at responding to the crisis that just happened, as they naturally imagine that whatever just happened is most likely to happen again. They are less good at imagining a crisis before it happens—and taking action to prevent it. For just this reason the D.O.E. under Secretary Moniz had set out to imagine disasters that had never happened before. One scenario was a massive attack on the grid on the Eastern Seaboard that forced millions of Americans to be relocated to the Midwest. Another was a Category Three hurricane hitting Galveston, Texas; a third was a major earthquake in the Pacific Northwest that, among other things, shut off the power. Yet, even then, the disasters they imagined were the sort of disasters that a Hollywood screenwriter might imagine: vivid, dramatic events. MacWilliams thought that, while such things did happen, they were not the sole or even the usual source of catastrophe. What was most easily imagined was not what was most probable. It wasn’t the things you think of when you try to think of bad things happening that got you killed, he said. “It is the less detectable, systemic risks.” Another way of putting this is: The risk we should most fear is not the risk we easily imagine. It is the risk that we don’t.
via http://www.vanityfair.com/news/2017/07/department-of-energy-risks-michael-lewis
Perhaps the most dramatic example of a massive scandal that cannot seem to be reversed involves Annie Dookhan, a chemist who worked at a Massachusetts state lab drug analysis unit. Dookhan was sentenced in 2013 to at least three years in prison, after pleading guilty in 2012 to having falsified thousands of drug tests. Among her extracurricular crime lab activities, Dookhan failed to properly test drug samples before declaring them positive, mixed up samples to create positive tests, forged signatures, and lied about her own credentials. Over her nine-year career, Dookhan tested about 60,000 samples involved in roughly 34,000 criminal cases. Three years later, the state of Massachusetts still can’t figure out how to repair the damage she wrought almost single-handedly.
This is the richest and wildest park in Bucharest – but this wasn’t supposed to happen at all. Under communism, the plan was for Vacaresti to be filled with water; in the idiotic capitalist years of the mid-2000s, it was meant to be a concert venue. Both projects collapsed – and the zone became a metaphor for the failures of Romania’s development under both communist and free market principles. Yet its teeming wildlife, lawless beauty and inhabitants of drifters, scraping a living by harvesting the wild, lay the foundations for how this troubled country could prosper.
via http://www.citymetric.com/skylines/hole-bucharest-s-become-nature-reserve-398
Product failure is deceptively difficult to understand. It depends not just on how customers use a product but on the intrinsic properties of each part—what it’s made of and how those materials respond to wildly varying conditions. Estimating a product’s lifespan is an art that even the most sophisticated manufacturers still struggle with. And it’s getting harder. In our Moore’s-law-driven age, we expect devices to keep getting smaller, lighter, more powerful, and more efficient. This thinking has seeped into our expectations about lots of product categories: Cars must get better gas mileage. Bicycles must get lighter. Washing machines need to get clothes cleaner with less water. Almost every industry is expected to make major advances every year. To do this, manufacturers are constantly reaching for new materials and design techniques. All this is great for innovation, but it’s terrible for reliability.
via http://www.wired.com/design/2012/10/ff-why-products-fail/all/
There are nine or so principles for working in a world like this: Resilience instead of strength, which means you want to yield and allow failure and bounce back instead of trying to resist failure. You pull instead of push. That means you pull the resources from the network as you need them, as opposed to centrally stocking and controlling them. You want to take risk instead of focusing on safety. You want to focus on the system instead of objects. You want to have good compasses, not maps. You want to work on practice instead of theory, because sometimes you don’t know why it works; what is important is that it is working, not that you have some theory around it. It’s disobedience instead of compliance. You don’t get a Nobel Prize for doing what you are told. Too much of school is about obedience; we should really be celebrating disobedience. It’s the crowd instead of experts. It’s a focus on learning instead of education.