The midterm elections today will likely just produce the usual cyclical swing against the party in power. The national debate has been particularly arid this year, largely focused on targeted messages to mobilize the base instead of changing people’s minds.
But much of the difference between people and points of view is not about the direct or immediate effects of particular policies anyway. It’s not about immediate facts, or even always about immediate interests. As Robert Jervis puts it in System Effects: Complexity in Political and Social Life,
At the heart of many arguments lie differing beliefs – or intuitions – about the feedbacks that are operating. (my bold)
It’s because, as we saw before, most people find it very hard to think in systems terms. Politicians are aware of indirect effects, to be sure, and often present that awareness as subtlety or nuance. But they usually seize on one particular story about one particular indirect feedback loop, instead of recognizing that in any complex system there are multiple positive and negative loops. Some of those loops improve a situation. Some make it worse, or offset other effects. Feedback effects operate on different timescales and through different channels. Any particular decision is likely to have both positive and negative effects.
The question is not whether one particular story is plausible, but how you net it all together.
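The idea of netting loops together can be made concrete with a small toy simulation (a sketch only: the gains, lag, and step count are invented for illustration, not estimated from any real system):

```python
# A toy sketch of "netting out" feedback loops, with invented parameters.
# One state variable x is pushed by a reinforcing (positive) loop with gain r
# and pulled by a balancing (negative) loop with gain b that acts on a delay.
# Neither loop's story alone tells you the outcome; the net behavior does.

def simulate(r, b, lag, steps=60, x0=1.0):
    """Iterate x; the balancing loop responds to the state `lag` steps back."""
    xs = [x0]
    for t in range(steps):
        reinforcing = r * xs[-1]                  # growth feeds further growth
        delayed = xs[-lag] if t >= lag else 0.0   # balancing loop sees old state
        xs.append(xs[-1] + reinforcing - b * delayed)
    return xs

# The same reinforcing story, opposite net outcomes once the balancing
# loop's strength is counted:
runaway = simulate(r=0.10, b=0.02, lag=2)  # positive loop dominates: x grows
checked = simulate(r=0.05, b=0.20, lag=2)  # balancing loop dominates: x decays
```

Change the relative gains and the identical reinforcing narrative flips from runaway growth to a damped outcome, which is the point: the plausibility of any one loop tells you little about the net result.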
Take the example of Ebola again. The core of the administration’s case was that instituting stricter visa controls or quarantine in the US might have the indirect effect of making it harder to send personnel and supplies to Africa, and containing the disease in Africa was essential.
That is likely true. It is a story that seems coherent and plausible. But there is generally no attempt to identify, let alone quantify or measure, the other loops that might operate as well, including ones with a longer lag time. Airlines may stop flying to West Africa in any case if their crews fear infection, for example. Reducing the chance of transmission outside West Africa might allow resources and experienced personnel to be concentrated on the region. More mistakes in handling US cases (as apparently happened in Dallas) might significantly undermine public trust in medical and political authorities. You can imagine many other potential indirect effects.
The underlying point is this: identifying just one narrative, one loop, is usually incomplete.
Here’s another example, at the expense of conservatives this time. Much US foreign policy debate effectively revolves around “domino theory”, and infamously so in the case of the Vietnam war. The argument from hawks in the 1960s was that if South Vietnam fell, other developing countries would also fall like dominoes. So even though Vietnam was in itself of little or no strategic interest or value to the United States, it was nonetheless essential to fight communism in jungles near the Laos border – or before long one would be fighting communism in San Francisco. Jervis again:
More central to international politics is the dispute between balance of power and the domino theory: whether (or under what conditions) states balance against a threat rather than climbing on the bandwagon of the stronger and growing side.
You can tell a story either way: a narrative about positive feedback (one victory propels your enemy to even more aggression) or about balancing feedback (enemies become overconfident and overstretch, provoke coalitions against themselves, and alienate allies and supporters; or, if we act forcefully, it will produce rage and desperation and become a “recruiting agent for terrorism”).
The same applies to the current state of the Middle East, where I have a lively debate going with some conservative friends who believe that the US should commit massive ground forces to contain ISIS, or “small wars will turn into big wars.” It’s in essence a belief that positive feedback will dominate negative/balancing feedback, domino-style.
But you can’t just assume such a narrative will play out in reality. South Vietnam did fall, after all. But what happened next was that the Soviet Union overreached in adventures like the invasion of Afghanistan. The other side collapsed.
For many people, however, the lure of a particular narrative, of focusing on one loop in a system, is almost overwhelming. It is related to the tendency to seize on one obvious alternative in decisions, with little or no search for better, more complete, or more relevant alternatives.
The answer is not to cherry-pick particular narratives about feedback loops and indirect effects that happen to correspond with your prior preferences. That usually turns into wishful thinking and confirmation bias. Instead, you need to get a feel for the system as a whole, and have a way to observe, measure, and test all (or most of) the loops in operation.