Home/Situation Awareness

The short interval before thinking as usual resumes

Surprises can be valuable. Even the Trumpites were surprised by the election victory. CNN, as I recall, quoted a “senior Trump campaign source” early on election night who said it would “take a miracle” to win. Meanwhile, liberals are in shock or despair. Everyone is a bit flummoxed. So what happens next?

The reaction to surprises is an interesting and important thing to watch. There are two main paths people take:

1) Try to work out why you were surprised. What did you miss? What didn't you pay attention to? What could you do differently next time?

2) Struggle to reconcile events with your previous view, so that you retain as much of the preexisting narrative or perspective as possible.


People generally go the second route, and so they learn little or nothing from events. As Weick and Sutcliffe put it in their book Managing the Unexpected,

The moral is that perceptions of the unexpected are fleeting. When people are interrupted, they tend to be candid about what happened for a short period of time, and then they get their stories straight in ways that justify their actions and protect their reputations. And when official stories get straightened out, learning stops… In that brief interval between surprise and successful normalizing lies one of your few opportunities to discover what you don't know.

As you read the reactions to the election, all the analysis and journalism and commentary, decide for yourself whether each one is taking route 1 or route 2.


November 10, 2016 | High-Reliability Organizations, Situation Awareness

Primary sources versus primary evidence

People often make disastrous mistakes by relying on primary witness testimony. But if primary sources are often deeply unreliable, as we've seen, why do we take the trouble to put witnesses in the box in a courtroom?

I spent a few weeks on a jury in New York City a year ago. The judge warned us not to speculate, not to rely on gossip or hearsay, and to judge only on the evidence we heard.

However, a judge also specifically instructs the jury that a major part of its task is to judge the credibility of the witnesses. Jurors have to weigh the evidence they hear, not meekly accept any testimony without question. A jury is supposed to use its common sense and experience to decide whether witnesses appear evasive or hesitant or confused.

This is not analysis, in theoretical terms. It is not abstract. It is a matter of common sense and experience. Primary testimony without judgment of credibility is useless.

A jury is also expected to be alert for inconsistencies between witnesses. Indeed, the most important aspect of primary sources is often not what any one witness says in isolation, but where anomalies or inconsistencies point to problems in the evidence.

Added to that, the most persuasive and important evidence in most trials is not testimony from particular witnesses or sources. It is hard, objective evidence, less subject to distortion. That means sound recordings, video camera footage, medical reports, or DNA evidence. It means e-mails, internal documents, receipts, tickets, and fingerprints. It means records of interviews transcribed at the time, calendars, and measurements of skid marks or damage.

Primary sources are important. But only if you have a layer of judgment to figure out the value of what they say.



January 21, 2014 | Communication, Confirmation bias, Decisions, Mindfulness, Perception, Situation Awareness

Trusting your lying eyes

Often the biggest source of problems is taking something that is true to a limited extent and mistakenly overextending it. I've called it the Tylenol Test before. Take two pills: you cure your headache and feel better. Take the entire bottle at once: you end up laid out cold and dead on a mortuary slab.

This applies to the notion of “primary sources.” Our culture has a lot of respect for direct witness testimony. Journalists like to emphasize it. ABC's local stations in most major American cities call their newscasts “Eyewitness News,” for example. There is a kind of folk wisdom that surrounds access to primary information. “I saw it with my own eyes.” “I heard it for myself.”

The trouble is that more and more evidence is accumulating about just how unreliable primary-source witness information can be. One of the most famous recent experiments in psychology, which I looked at here, demonstrated that most people quite literally cannot see the gorilla in the room even when it walks right in front of them. Witnesses frequently cannot see things they do not expect to see.

The authors of the experiment conclude in a recent book:

We all believe that we are capable of seeing what’s in front of us, of accurately remembering important events from our past, of understanding the limits of our knowledge, of properly determining cause and effect. But these intuitive beliefs are often mistaken ones that mask critically important limitations on our cognitive abilities.


In essence, we know how vividly we see some aspects of our world, but we are completely unaware of those aspects of our world that fall outside of that current focus of attention. Our vivid visual experience masks a striking mental blindness—we assume that visually distinctive or unusual objects will draw our attention, but in reality they often go completely unnoticed.

And that is before you introduce the distortions of memory.

Although we believe that our memories contain precise accounts of what we see and hear, in reality these records can be remarkably scanty. What we retrieve often is filled in based on gist, inference, and other influences; it is more like an improvised riff on a familiar melody than a digital recording of an original performance.

And that is before you introduce the possibility of deception, selective leaking of information, or outright manipulation.

Attention is an unstable and unreliable thing. People are easily influenced by distractions. They become disengaged by routine tasks. They lose focus. They see things with an emotional hue, and look for the particular things they expect to see in a scene. And when people recall things they have seen or thought, they tend to remember only the vivid details. They even make up details altogether to make a memory more coherent. This is what goes into primary witness information.

The result is that primary information is highly unreliable. It often constitutes the noise, rather than the signal. What matters is being able to sift and judge the credibility of primary information. Without that, primary information will actually leave you worse off.




January 19, 2014 | Communication, Confirmation bias, Decisions, Mindfulness, Situation Awareness

Raise the champagne (but not until you learn the lessons of 2013)

We’re all one year older as we reach the end of 2013, but are we one year wiser? The end of the year is a time to look back and see if there are ways to do things better next year.

For the hedge fund industry, there just have to be new approaches if it is to survive long term. The industry had a very bad year in 2013, at least measured by investment returns. The average “smart money” hedge fund made 8.2% and charged 200bp for that positive performance. But if you put your money in the dumbest, cheapest global equity index fund, you made almost 21%, and were charged less than a tenth of the fees for it.
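The fee math is worth making explicit. A rough sketch of the comparison, assuming (purely for illustration) that the 8.2% figure is gross of the 200bp fee and that the index fund charges about 20bp:

```python
# Illustrative fee arithmetic for the hedge-fund-vs-index comparison above.
# The assumption that 8.2% is gross of fees, and the 20bp index fee, are
# simplifications for the example, not figures for any actual fund.

def net_return(gross_return, fee):
    """Net annual return after deducting an annual fee (both as fractions)."""
    return gross_return - fee

hedge_net = net_return(0.082, 0.02)    # hedge fund: 8.2% gross, 200bp fee
index_net = net_return(0.21, 0.002)    # index fund: 21% gross, 20bp fee

print(f"Hedge fund net: {hedge_net:.1%}")
print(f"Index fund net: {index_net:.1%}")
print(f"Gap in favor of the index: {index_net - hedge_net:.1%}")
```

Even before performance fees, which would widen the gap further, the net difference is over fourteen percentage points for a single year.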

So it is essential to learn from the experience of a rough year for the industry. Clearly, it’s difficult to improve or turn things round without trying to learn from outcomes and mistakes.

The problem is that it is often extremely difficult to learn from experience, for a range of reasons. There are multiple serious blind spots in individual and organizational learning.

One bedrock theme for me is that insight about macro policy, like the Fed, or about market behavior and opportunities for outperformance, is basically about sensitivity to evidence. People’s assimilation of, and “stretchiness” in response to, evidence and events is actually more important than the objective underlying evidence itself. So paying very close attention to how and when and why people learn from experience is essential.

The most obvious issue is that people often just prefer to move on to the next thing. They are reluctant to review outcomes or to try to learn from them at all. Raise the champagne anyway. 2013 is already history. At least next year might be better. (It ought to be a little bit better, just because of regression to the mean.)
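That parenthetical point can be shown with a toy simulation. The model below assumes, purely for illustration, that every fund has the same true expected return and that yearly results differ only by luck. The funds with the worst first year drew unlucky noise, so their second year pulls back toward the average with no change in skill at all:

```python
# Minimal illustration of regression to the mean, under the (illustrative)
# assumption that a fund's annual return is a fixed skill level plus
# independent year-to-year luck.
import random

random.seed(42)
SKILL = 0.08          # assumed true expected return for every fund
NOISE_SD = 0.10       # assumed standard deviation of yearly luck

funds = [(SKILL + random.gauss(0, NOISE_SD),    # year 1 return
          SKILL + random.gauss(0, NOISE_SD))    # year 2 return, new luck
         for _ in range(10_000)]

# Take the bottom decile of funds by year-1 performance.
worst_year1 = sorted(funds, key=lambda f: f[0])[:1000]
avg_y1 = sum(f[0] for f in worst_year1) / len(worst_year1)
avg_y2 = sum(f[1] for f in worst_year1) / len(worst_year1)

print(f"Worst decile, year 1 average: {avg_y1:.1%}")   # far below 8%
print(f"Same funds,   year 2 average: {avg_y2:.1%}")   # back near 8%
```

The recovery here is pure statistics, which is exactly why a slightly better 2014 would prove nothing by itself about lessons learned.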

Markets have remarkably short memories. Policymakers have remarkably selective ones.

The Person and the Situation

But here’s one deeper problem. Who or what do you blame for bad outcomes? If something goes wrong (like horrendous investment returns or policy errors), do you blame the person or the situation? There are some frequent deep distortions here, so much so that social psychologists call it the “fundamental attribution error” (see the classic analysis by Nisbett and Ross in Human Inference: Strategies and Shortcomings of Social Judgment).

People pay too much attention to the constraints and headwinds in their situation when it comes to explaining their own failures.

So if a fund manager has a bad year, it’s because of the situation. Blame the Fed and Congress and the tough challenges in the markets. You can expect to read variations on that in a lot of fund manager reports to their clients on 2013. This also means you don’t have to think much about potential mistakes or errors you might have made.  If you don’t recognize mistakes, you can’t correct them.

But if you had a good year, it’s all because of your own talent and skill and effort. This also means you don’t have to think much about the drivers of the situation or the prior likelihood of success.

In sharp contrast, people also believe that other people’s success or failure is almost entirely due to their personal qualities or dispositions, and not their situation. We pay too little attention to situational factors in other people’s decisions. You will read plenty of variations on that in comments on Bernanke’s tenure at the Fed when he steps down in a few weeks. It will be all about Bernanke’s skills and judgments, with less attention to the situation in which he found himself. Hedge fund clients will be much less sympathetic to claims that poor performance was because of a tough general situation.

It pays to be aware of when and why people invoke the person or the situation as explanations for outcomes. And it is critical to watch for signs of how they learn and adapt.





Awareness can’t be modeled

One of the fundamental business needs is awareness: an ability to look at a situation in a fresh and perceptive way, to retest assumptions, and to look for anomalies. Models and forecasts often trap people in elaborate mechanisms that grow remote from reality. You can have the best analytics in the world, but if you’re asking the wrong question you are asking for disaster.

Alucidate looks for crucial information. But we are much more about using it to ask the right questions, by comparing different perspectives.

Here’s the former head of market risk at Merrill, interviewed in the NYT. A physics PhD, he was a leading early figure in the quant influx onto Wall Street. He eventually learned to look for the human factors.

But the numbers more often disguise risk than reveal it. “I went down the statistical path,” he said. He built one of the first value-at-risk models, or VaR, a mathematical formula that is supposed to distill how much risk a firm is running at any given point. …

Instead of fixating on models, risk managers need to develop what spies call “humint” — human intelligence from flesh and blood sources. They need to build networks of people who will trust them enough to report when things seem off, before they become spectacular problems.

Like Emmanuel Derman, the head quant at Goldman we looked at here, he thinks people trust too much in the math in isolation. As for the VaR measures he helped pioneer?

In Mr. Breit’s view, Wall Street firms, encouraged by regulators, are on a fool’s mission to enhance their models to more reliably detect risky trades. Mr. Breit finds VaR, a commonly used measure, useful only as a contrary indicator. If VaR isn’t flashing a warning signal for a profitable trade, that may well mean there is a hidden bomb.
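For readers who have not met the measure, a historical-simulation VaR can be sketched in a few lines. This is a generic illustration of the technique, not Breit's model or any firm's: the portfolio returns and the 95% level are invented for the example.

```python
# A minimal historical-simulation sketch of value-at-risk (VaR).
# Convention used here: the 95% one-day VaR is the k-th worst observed
# daily loss, where k is the number of points in the 5% tail.

def historical_var(daily_returns, confidence=0.95):
    """Loss threshold that historical daily returns exceeded only about
    (1 - confidence) of the time, returned as a positive fraction."""
    losses = sorted((-r for r in daily_returns), reverse=True)  # worst first
    k = max(1, round((1 - confidence) * len(losses)))  # points in the tail
    return losses[k - 1]

# Toy history: mostly small moves, a couple of bad days.
returns = [0.001, -0.002, 0.003, -0.015, 0.002, -0.001,
           0.004, -0.030, 0.002, 0.001, -0.004, 0.002,
           0.003, -0.006, 0.001, 0.002, -0.002, 0.005,
           -0.001, 0.002]

var_95 = historical_var(returns, 0.95)
print(f"1-day 95% VaR: {var_95:.1%} of portfolio value")
```

The construction makes Breit's contrarian reading easy to see: the single number says nothing about how bad losses beyond the cutoff can get, or about risks that never showed up in the sample window at all.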

The best way to get awareness is to talk to someone with an informed but outside, different view. In real decisions, human factors like discerning someone else’s motivations and intentions, and how they will react in a crisis, are the essentials.

The President is without a doubt sitting in the White House this morning wishing he had more intel on what the North Korean leaders’ intentions are. Hundreds of billions of dollars of satellites and radar are of only limited help in that.



Why we need checklists

How do you prevent complexity from leading to a fiery crash? There is a fine story about the origin of pilots' checklists in Atul Gawande's The Checklist Manifesto: How to Get Things Right. Boeing's first major four-engine plane was a disaster when it first flew in 1935. People thought it was too complicated ever to fly.

A small crowd of army brass and manufacturing executives watched as the Model 299 test plane taxied onto the runway. It was sleek and impressive, with a 103-foot wingspan and four engines jutting out from the wings, rather than the usual two. The plane roared down the tarmac, lifted off smoothly, and climbed sharply to three hundred feet.

Then it stalled, turned on one wing, and crashed in a fiery explosion. Two of the five crew members died, including the pilot, Major P. Hill.  An investigation revealed that nothing mechanical had gone wrong. The crash had been due to “pilot error,” the report said. Substantially more complex than previous aircraft, the new plane required the pilot to attend to the four engines, each with its own oil-fuel mix, the retractable landing gear, the wing flaps, electric trim tabs that needed adjustment to maintain stability at different airspeeds, and constant-speed propellers whose pitch had to be regulated with hydraulic controls, among other features. While doing all this, Hill had forgotten to release a new locking mechanism on the elevator and rudder controls.

The Boeing model was deemed, as a newspaper put it, “too much airplane for one man to fly.” The army air corps declared Douglas’s smaller design the winner. Boeing nearly went bankrupt.

Still, the army purchased a few aircraft from Boeing as test planes, and some insiders remained convinced that the aircraft was flyable. So a group of test pilots got together and considered what to do. What they decided not to do was almost as interesting as what they actually did. They did not require Model 299 pilots to undergo longer training. It was hard to imagine having more experience and expertise than Major Hill, who had been the air corps’ chief of flight testing. Instead, they came up with an ingeniously simple approach: they created a pilot’s checklist.

The military ended up ordering 13,000 of the planes, which became the B-17. Pilots still methodically walk around your Boeing or Airbus plane today, kicking the tires and checking off the take-off procedure. Even if they do it on iPads.

February 18, 2013 | Books, Mindfulness, Situation Awareness

Attention is the crucial scarce factor in decisions, not information

Herbert Simon won the Nobel Prize in Economics essentially for a single book, Administrative Behavior, published three decades earlier in 1947, long before the current fashion for behavioral finance was invented.

He said:

… The critical scarce factor in decision-making is not information, but attention. What we attend to, by plan or by chance, is a major determinant of our decisions. (p. 124)

It pays to be alert to what you pay attention to and what you can perceive. You need checklists.


February 11, 2013 | Books, Decisions, Mindfulness, Perception, Situation Awareness