
The short interval before thinking as usual resumes

Surprises can be valuable. Even the Trumpites were surprised by the election victory. CNN, as I recall, quoted a “senior Trump campaign source” early on election night who said it would “take a miracle” to win. Meanwhile, liberals are in shock or despair. Everyone is a bit flummoxed. So what happens next?

How people react to surprise is an interesting and important thing to watch. There are two main paths people take.

1) Try to work out why you were surprised. What did you miss? What didn't you pay attention to? What could you do differently next time?

2) Struggle to reconcile events with your previous view, so that you retain as much of the preexisting narrative or perspective as possible.


People generally go the second route, and so they learn little or nothing from events. As Weick and Sutcliffe put it in their book Managing the Unexpected,

The moral is that perceptions of the unexpected are fleeting. When people are interrupted, they tend to be candid about what happened for a short period of time, and then they get their stories straight in ways that justify their actions and protect their reputations. And when official stories get straightened out, learning stops… In that brief interval between surprise and successful normalizing lies one of your few opportunities to discover what you don't know.

As you read the reactions to the election, all the analysis and journalism and commentary, decide for yourself whether each one is taking route 1 or route 2.


November 10, 2016 | High-Reliability Organizations, Situation Awareness

Who gave the order to shoot down a civil airliner?

The loss of flight MH17 over Ukraine, with debris, bodies and dead children's stuffed animals strewn over the remote steppe, is unspeakably tragic. Major Western countries have been swift to accuse Russian rebel groups, and by extension Putin, of a repugnant crime.

It's unlikely, however, that someone identified a Malaysian airliner overhead and deliberately chose to shoot it down. It's more probable Russian rebels didn't have the skill or backup to know they were firing at a civilian airliner.

That might not change the moral blame attached to the incident; at best it would be awful negligence. Nor does it lessen the desire to hold leaders accountable.

But it ought to make people stop and think about how decisions get made as well. The near-automatic default in most public and market discussion is to think in rational actor terms. Someone weighed the costs and benefits of alternatives. They chose to shoot down the airliner. So find the person who made that horrible choice.

So how do you deal with a world in which that isn't what happens most of the time? Where people shoot down airliners without intending to? Where the financial system crashes, recessions happen, or the Fed finds it hard to communicate with the market? Where people ignore major alternatives, or rely on faulty theories and data? Where they fail to grasp the situation and fail to anticipate side-effects?

There's actually a deeper and more important answer to these questions.

Who was to blame for Challenger?

Let's go back to the example of the Challenger Shuttle Disaster I mentioned in the last post, because it is one of the classic studies of failed decision-making in recent times. Here was an organization – NASA – which was clearly vastly more skilled, disciplined and experienced than the Russian rebels. But it still made a catastrophic misjudgment. Seven crew members died. Who was to blame?

The initial public explanation of the shuttle disaster, according to the author Diane Vaughan, was that middle management at NASA deliberately chose to run the risk in order to keep to the launch schedule. As in so many corporations, production pressure meant safety was ignored. Managers broke rules and failed to pass crucial information to higher levels.

In fact, after trawling through thousands of pages later released to the National Archives and interviewing hundreds of people, she concluded that no one specifically broke the rules or did anything they considered wrong at the time.

On the one hand, this is good news – genuinely amoral, stupid, malevolent people may be rarer than you'd think from reading the press. On the other hand, though, it is actually much more frightening.

NASA, after all, were the original rocket scientists – dazzlingly able people who had sent Apollo to the moon some years before. NASA engineers understood the physical issues they were dealing with far better than we are ever likely to be able to understand the economy or market behavior.

NASA had exceptionally thorough procedures and documentation. They made extensive efforts to share information. They were rigorous and quantitative. Ironically, that rigor was part of the problem, because observational data and photographic evidence of O-ring seal penetration were discounted as too tacit and vague.

So what was the underlying explanation of the catastrophe? It wasn't simply a technical mistake.

Possibly the most significant lesson from the Challenger case is how environmental and organizational contingencies create prerational forces that shape worldview, normalizing signals of potential danger, resulting in mistakes with harmful human consequences. The explanation of the Challenger launch is a story of how people who worked together developed patterns that blinded them to the consequences of their actions. It is not only about the development of norms but about the incremental expansion of normative boundaries: how small changes – new behaviors that were slight deviations from the normal course of events – gradually became the norm, providing a basis for accepting additional deviance. (p. 409)

Conformity to norms, precedent, organizational structure and environmental conditions, she says,

congeal in a process that can create a change-resistant worldview that neutralizes deviant events, making them acceptable and non-deviant.

Organizations have an amazing ability to ignore signals that something is wrong; she points to the history of US involvement in Vietnam as another example.

The upshot? Individuals and corporations do sometimes carry out stupid and shortsighted activities (often because they ignore trade-offs). But more often they have an extraordinary ability to ignore contrary signals, especially ones that accumulate slowly over time, and to convince themselves they are doing the right thing.

People develop “patterns that blind them to the consequences of their actions” and change-resistant worldviews. That's why I look for blind spots: research shows they are key to understanding decisions and breakdowns. You can look for those patterns of behavior. One sign, for example, is the slow, incremental redefinition, normalization and acceptance of risk that Vaughan describes.

I'm going to look much more at systems in coming posts.