It may turn into one of the most spectacular corporate disasters in history. What were Volkswagen thinking? Even after it became apparent that outsiders had noticed a discrepancy in emissions performance in on-the-road tests, the company still kept stonewalling and continued to sell cars with the shady software routines.
We won't know the murky, pathological details for a while. But understanding how this happens is urgent: ignore this kind of insidious problem and billion-dollar losses and criminal prosecutions can follow.
In fact, it's usually not just one or two “bad apples,” unethical criminals who actively choose stupid courses of action, although it often suits politicians and media to believe so. It's a system phenomenon, according to some of the classic studies (often Scandinavian) like Rasmussen and Svedung.
… court reports from several accidents such as Bhopal, Flixborough, Zeebrugge, and Chernobyl demonstrate that they have not been caused by a coincidence of independent failures and human errors. They were the effects of a systematic migration of organizational behavior toward accident under the influence of pressure toward cost-effectiveness in an aggressive, competitive environment.
It's not likely anyone formally sat down and did an expected utility calculation, weighting financial and other benefits from installing cheat software, versus chances of being found out times consequent losses. So the usual way of thinking formally about decisions doesn't easily apply.
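For concreteness, here is a minimal sketch of what such an expected-utility calculation would look like. Every number is hypothetical and invented purely for illustration; nothing here comes from the actual case.

```python
# Hypothetical expected-utility comparison for installing cheat software.
# All figures below are invented for illustration only.

benefit = 2e9            # hypothetical immediate gain from passing tests cheaply ($)
p_caught = 0.3           # hypothetical probability the cheat is eventually discovered
loss_if_caught = 30e9    # hypothetical fines, recalls, reputational damage ($)

# Expected value = certain benefit minus probability-weighted downside.
expected_value = benefit - p_caught * loss_if_caught

print(f"Expected value of cheating: ${expected_value / 1e9:+.1f}B")
# prints: Expected value of cheating: $-7.0B
```

Even under generous assumptions, a modest chance of discovery swamps the benefit, which is exactly why the failure to run any such calculation, formally or informally, is the interesting part.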
It's much more likely that it didn't occur to anyone in the company to step back and think it through. They didn't see the full dimensions of the problem. They denied there was a problem. They had blind spots.
It can often be hard to even find any point at which decisions were formally made. They just … happen. Rasmussen & co again:
In traditional decision research ‘decisions’ have been perceived as discrete processes that can be separated from the context and studied as an isolated phenomenon. However, in a familiar work environment actors are immersed in the work context for extended periods; they know by heart the normal ﬂow of activities and the action alternatives available. During familiar situations, therefore, knowledge-based, analytical reasoning and planning are replaced by a simple skill- and rule-based choice among familiar action alternatives, that is, on practice and know-how.
Instead, the problem is likely to be a combination of the following:
- Ignoring trade-offs at the top. Major accidents happen all the time in corporations because the benefits of cutting corners are tangible, quantifiable and immediate, while the costs are longer-term, diffuse and less directly accountable. They will be someone else's problem. The result is that longer-term, more important goals get ignored in practice. Indeed, defining something as a purely technical problem, or setting strict metrics, often builds ignoring a set of trade-offs into the system. So people never think about it and don't see problems coming.
- Trade-offs can also happen because general orders come from the top – make it better, faster, cheaper and also cut costs – and reality has to be confronted lower down the line, without formally acknowledging choices have to be made. Subordinates have to break the formal rules to make it work. Violating policies in some way is a de facto requirement to keep your job, and then it is deemed “human error” when something goes wrong. The top decision-maker perhaps didn't formally order a deviation: but he made it inevitable. The system migrates to the boundaries of acceptable performance as lots of local, contextual decisions and non-decisions accumulate.
- People make faulty assumptions, usually without realizing it. For example, did anyone think through how easy it was to conduct independent on-the-road tests? That was a critical assumption bearing on whether they would be found out.
- If problems occur, it can become taboo to even mention them, particularly when bosses are implicated. Organizations are extremely good at not discussing things and avoiding obviously contrary information. People lack the courage to speak up. There is no feedback loop.
- Finally, if things do go wrong, leaders have a tendency to escalate, to go double-or-quits. And lose.
There scarcely seems to be a profession or industry or country without problems like this. The Pope was just in New York apologizing for years of Church neglect of the child abuse problem, for example.
But that does not mean that people are not culpable and accountable and liable for things they should have seen and dealt with. Nor is it confined to ethics or regulation. It is also a matter of seeing opportunity. You should see things. But how? That's what I'm interested in.
It's essential for organizational survival to confront these problems of misperceptions and myopia. They're system problems. And they are everywhere. Who blows up next?