People usually resist changing their views in response to evidence for a very long time, even when the delay costs them catastrophically. It can take decades for essential ideas to filter through to the point where people actually use them. It took almost forty years, for example, for Kahneman and Tversky's initial papers from the early 1970s to filter through to wide public acceptance and bestselling books.
No wonder it often takes months for Fed communications to sink in with the market. It’s essential to focus on how people frame issues and how long it takes them to change their mind.
I’ve looked at some other perspectives recently which have still not reached even the degree of awareness that Kahneman currently enjoys. Charles Lindblom argued that in practice most significant policy decisions are made by incremental “muddling through,” not the rational choice approach taught in economics and business courses and mostly believed by markets. Herbert Simon examined the boundary conditions of rational action and, though a Nobel Prize-winner in Economics, disputed much of the way the profession saw the world. Henry Mintzberg (re)discovered that successful managers rarely rely on formalized decision or planning systems, and indeed attempts at rigorous planning have most often led to disaster in corporations and government. It adds up to a whole zoo of blind spots waiting to entrap decision-makers.
And I haven’t even touched yet on perhaps the single most important reason why policymakers and markets frequently make major errors: a lack of systems thinking. Instead of reductively breaking things down into component parts, systems thinking focuses on the connections, relationships, and feedback loops of a living system as a whole. It looks at stocks and flows, lags and delays, adaptation and complexity.
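The core ideas of stocks, flows, and delays can be sketched in a few lines of code. Here is a minimal, hypothetical example (all parameter values invented for illustration): a single stock steered toward a target, where the controller sees the stock level only after a perception lag. Without the lag the stock settles smoothly; with it, the very same corrective rule overshoots and oscillates.

```python
def simulate(steps, target=100.0, gain=0.5, delay=0, start=50.0):
    """Adjust a stock toward `target` each step, but the controller
    only sees the stock as it was `delay` steps ago (a perception lag)."""
    history = [start] * (delay + 1)  # history[0] is the oldest, "perceived" value
    levels = []
    for _ in range(steps):
        perceived = history[0]                 # stale reading of the stock
        current = history[-1]                  # actual current stock
        current += gain * (target - perceived) # corrective flow based on stale info
        history = history[1:] + [current]
        levels.append(current)
    return levels

no_lag = simulate(40)            # converges smoothly to the target
lagged = simulate(40, delay=4)   # same rule, but it overshoots and oscillates
```

The point of the sketch is that nothing nonlinear or exotic is needed: one loop plus one delay is already enough to make behavior counterintuitive, which is why intuitions trained on simple, immediate cause-and-effect fail.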
The original impetus came from MIT mathematician Norbert Wiener and Austrian biologist Ludwig von Bertalanffy in the 1930s and 1940s. But one of the most important contributions came from an MIT engineer and management expert, Jay Forrester. In a 1971 paper, “The Counterintuitive Behavior of Social Systems,” he argued:
The human mind is not adapted to interpreting how social systems behave. Social systems belong to the class called multi-loop nonlinear feedback systems. In the long history of evolution it has not been necessary until very recent historical times for people to understand complex feedback systems. Evolutionary processes have not given us the mental ability to interpret properly the dynamic behavior of those complex systems in which we are now imbedded.
Indeed, current economics almost completely ignores nonlinear dynamic behavior in favor of linear comparative statics. One reason is that the math is much harder and does not produce neat closed-form solutions.
But treating complex systems, which include almost all political, market, and business systems, as if they were simple linear systems can produce painful errors and blowback. Says Forrester:
…. social systems are inherently insensitive to most policy changes that people choose in an effort to alter the behavior of systems. In fact, social systems draw attention to the very points at which an attempt to intervene will fail. Human intuition develops from exposure to simple systems. In simple systems, the cause of a trouble is close in both time and space to symptoms of the trouble. If one touches a hot stove, the burn occurs here and now; the cause is obvious. However, in complex dynamic systems, causes are often far removed in both time and space from the symptoms.
One result is policy resistance: systems react in unanticipated and often opposite ways to simplistic interventions or expectations.
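Policy resistance can also be sketched numerically. In this hypothetical toy model (all numbers invented), an actor inside the system steers an outcome toward its own goal. An outside policy push arrives at step 20 and produces an immediate jump, but the internal compensating loop quietly undoes it, and the long-run outcome drifts back to where it started.

```python
def intervene(steps=60, goal=50.0, gain=0.3, policy=20.0, start_at=20):
    """An internal actor adjusts its own effort to keep the outcome at
    `goal`; an external `policy` push begins at step `start_at`.
    The compensating feedback loop absorbs the intervention."""
    effort = goal  # the actor's contribution; outcome = effort + push
    trajectory = []
    for t in range(steps):
        push = policy if t >= start_at else 0.0
        outcome = effort + push
        effort += gain * (goal - outcome)  # compensating feedback
        trajectory.append(outcome)
    return trajectory

path = intervene()
# The intervention causes a visible transient spike at step 20,
# but the system returns close to the actor's original goal.
```

The cause (the policy push) and the symptom (the eventual non-effect) are separated in time, exactly the pattern Forrester describes: the point that invites intervention is the point where intervention fails.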