
How to avoid disasters? Decision-making skill is more important than knowledge

How can you avoid destructive mistakes?

My point is that wisdom in decision-making is vastly more important – not just practically, but philosophically – than knowledge.

So says Nassim Nicholas Taleb in his latest book, Antifragile: Things That Gain from Disorder. It is a kaleidoscope of irritable, over-the-top, brilliantly cantankerous thought – and definitely worth a read. Decision-making is about consequences and payoffs, he says. Knowledge is about whether something is true or false in general. They are two different things. People become suckers by confusing the two.

“Wisdom” in decision-making is definitely not the classic 1960s expected-utility decision theory (in the tradition of Howard Raiffa), however, or the game theory taught in business schools. Nor is it academic economics or risk management. In fact, the successful former options trader is deeply skeptical about academic knowledge, forecasting, and formal expertise in general. Practitioners learn to see accurately because they have to if they want to survive.

People with too much smoke and complicated tricks and methods in their brain start missing elementary, very elementary things. Persons in the real world can’t afford to miss these things; otherwise they crash the plane. Unlike researchers, they were selected for survival, not complications. So I saw the less is more in action: the more studies, the less obvious elementary but fundamental things become; activity, on the other hand, strips things to their simplest possible model.

What matters is consequences and exposure.

In real life…exposure is more important than knowledge; decision effects supersede logic. Textbook “knowledge” misses a dimension, the hidden asymmetry of benefits – just like the notion of average. The need to focus on the payoff from your actions instead of studying the structure of the world (or understanding “True” and the “False”) has been largely missed in intellectual history.

This barely skims the surface of the book, which could be summed up as “decide based on fragility, not probability.” Don’t look at predictions of the future, which invariably tend to be wrong, he says; look instead at your exposure to upside and downside.
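To make that idea concrete, here is a minimal sketch in Python – my own illustration, not code from the book. The point is that you can test an exposure for fragility directly, without any forecast, by checking whether equal-sized shocks up and down hurt more on net than they help. The payoff functions below are hypothetical examples.

```python
# A minimal sketch (an illustration, not Taleb's method): a simple
# convexity test for "fragility" of a payoff. If equal shocks up and
# down hurt more on average than they help, the exposure is concave
# to volatility -- fragile. No prediction of the future is needed.

def fragility_gap(payoff, x, shock):
    """Average payoff under +/- shocks, minus the unshocked payoff.

    Negative => concave (fragile): volatility hurts on net.
    Positive => convex (antifragile): volatility helps on net.
    """
    return 0.5 * (payoff(x + shock) + payoff(x - shock)) - payoff(x)

# Hypothetical exposures: a leveraged position with a blow-up
# threshold (fragile) versus a long option with capped downside
# and open upside (convex).
def leveraged_position(price):
    return 10 * (price - 100) if price > 80 else -1000  # margin-call cliff

def long_option(price):
    return max(price - 100, 0) - 5  # premium paid, unlimited upside

for name, payoff in [("leveraged", leveraged_position),
                     ("option", long_option)]:
    print(name, fragility_gap(payoff, x=100, shock=25))
# leveraged -> -375.0 (fragile); option -> 12.5 (convex)
```

The asymmetry is the whole story: both positions look similar under small moves, but only the concave one is destroyed by a large one.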

Taleb, much to his own annoyance, is of course becoming a major public intellectual as a result of his previous books, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets and The Black Swan: The Impact of the Highly Improbable, which attack many of the central assumptions of economists and the risk management community.

At times he verges on sounding populist, even anti-intellectual altogether, and he has a knack for blistering attacks that upset people. I’ll come back to the issue of when you can and can’t rely on expertise (which I talked about before here, and here, for example).

But he says something here that I very much agree with. What matters to practical people – in markets, in corporations, in daily life – is decisions. Not predictions or forecasts or data or information. And practitioners have to recognize situations accurately if they want to survive. The problem is that people almost always have serious blind spots that make survival more difficult.


September 12, 2013