
Primary sources versus primary evidence

People often make disastrous mistakes by relying on primary witness testimony. But if primary sources are often deeply unreliable, as we've seen, why do we take the trouble to put witnesses in the box in a courtroom?

I spent a few weeks on a jury in New York City a year ago. The judge warned us not to speculate or rely on gossip or hearsay, and to judge only on the evidence we heard.

However, the judge also specifically instructs a jury that a major part of its task is to judge the credibility of the witnesses. Jurors have to weigh the evidence they hear, not meekly accept any testimony without question. A jury is supposed to use common sense and experience to decide whether witnesses appear evasive, hesitant or confused.

This is not analysis in theoretical terms. It is not abstract. It is a matter of common sense and experience. Primary testimony without a judgment of credibility is useless.

A jury is also expected to be alert for inconsistencies between witnesses. Indeed, the most important aspect of primary sources is not what any one of them says in isolation, but where anomalies or inconsistencies between them point to problems in the evidence.

Added to that, the most persuasive and important evidence in most trials is not testimony from particular witnesses or sources. It is hard, objective evidence, less subject to distortion. That means sound recordings, video footage, medical reports, or DNA evidence. It means e-mails, internal documents, receipts, tickets or fingerprints. It means records of interviews transcribed at the time, calendars, and measurements of skid marks or damage.

Primary sources are important. But only if you have a layer of judgment to figure out the value of what they say.


January 21, 2014 | Communication, Confirmation bias, Decisions, Mindfulness, Perception, Situation Awareness

Trusting your lying eyes

Often the biggest source of problems is taking something which is true to a limited extent, and mistakenly overextending it. I've called it the Tylenol Test before. Take two pills: you cure your headache and feel better. Take the entire bottle at once: you end up laid out cold and dead on a mortuary slab.

This applies to the notion of “primary sources.” Our culture has a lot of respect for direct witness testimony. Journalists like to emphasize it. ABC calls its main news broadcasts “Eyewitness News” in many major American cities, for example. There is a kind of folk wisdom that surrounds access to primary information. “I saw it with my own eyes.” “I heard it for myself.”

The trouble is that more and more evidence is accumulating about just how unreliable primary-source witness information can be. One of the most famous recent experiments in psychology, which I looked at here, demonstrated that most people quite literally cannot see the gorilla in the room even when it walks right in front of them. Witnesses frequently cannot see things they do not expect to see.

The authors of the experiment conclude in a recent book:

We all believe that we are capable of seeing what’s in front of us, of accurately remembering important events from our past, of understanding the limits of our knowledge, of properly determining cause and effect. But these intuitive beliefs are often mistaken ones that mask critically important limitations on our cognitive abilities.

[..]

In essence, we know how vividly we see some aspects of our world, but we are completely unaware of those aspects of our world that fall outside of that current focus of attention. Our vivid visual experience masks a striking mental blindness—we assume that visually distinctive or unusual objects will draw our attention, but in reality they often go completely unnoticed.

And that is before you introduce the distortions of memory.

Although we believe that our memories contain precise accounts of what we see and hear, in reality these records can be remarkably scanty. What we retrieve often is filled in based on gist, inference, and other influences; it is more like an improvised riff on a familiar melody than a digital recording of an original performance.

And that is before you introduce the possibility of deception, or selective leaking of information, or manipulation.

Attention is an unstable and unreliable thing. People are easily influenced by distractions. They become disengaged during routine tasks. They lose focus. They see things with an emotional hue, and look for the particular things they expect to see in a scene. And when people recall what they have seen or thought, they are prone to remember only the vivid details. They even make up details altogether to make a memory more coherent. This is what goes into primary witness information.

The result is that primary information is highly unreliable. It often constitutes the noise, rather than the signal. What matters is being able to sift and judge the credibility of primary information. Without that, primary information will actually leave you worse off.


January 19, 2014 | Communication, Confirmation bias, Decisions, Mindfulness, Situation Awareness

The thousand different ways organizations deny problems and resist change

I’ve been talking about how organizational culture can conceal problems. Culture is also one major reason why organizations fail to notice change, and resist it when they do. People’s assumptions give them some equilibrium and stability, and that stability gives people and organizations an identity.

At least three factors have to be present for an organization to generate enough motivation to change (continuing with the classic analysis by Edgar Schein). They are:

.. 1) enough disconfirming data to cause serious discomfort and disequilibrium; 2) the connection of the disconfirming data to important goals and ideals concerning anxiety and/or guilt; and 3) enough psychological safety, in the sense of seeing a possibility of solving the problem without loss of identity or integrity, thereby allowing members of the organization to admit the disconfirming data rather than defensively denying it.

The trouble is that organizations are capable of ignoring data that contradict their preferred view for a very long time.

It is not an uncommon situation, therefore, that disconfirming data have existed for a long time but because of a lack of psychological safety, the organization has avoided anxiety or guilt by repressing it or by denying the relevance, validity or even existence of the data. The essence of psychological safety, then, is that we can imagine a needed change without feeling a loss of integrity or identity. .. The identity that the organization has built up and that has been the source of its success must now be preserved, even if that means ultimate failure to adapt successfully to a changing environment.

It often takes a long period of mounting, suppressed anxiety before an organization is ready to pay attention. Without that, even the most visionary leaders may fail to reach people.

The importance of visionary leadership can be understood in this context, in that the vision sometimes serves the function of providing the psychological safety that permits the organization to move forward. Without a period of prior disconfirmation it is not clear that a visionary leader would be given much attention.

Most prophets tend to be cast out, and sometimes stoned.

Some people might find this talk of “psychological safety” and “visionary leadership” unsatisfyingly subjective. But we know that most organizations do not survive very long, and their lifespan has been plunging. That falling survival span is a matter of hard, objective fact. It is also the personal experience of just about everyone that organizations are capable of denying the most obvious problems for a very long time.

The question is what they can do about it so they survive just a little longer. For one thing, they have to be aware of their blind spots. If they are not, all the detailed forecasts and sales projections and new product lines in the world may fail them.

How good people end up in jail

Another day, another management life destroyed by foolishness. Following yet another conviction, this time of a former head of structured credit at Credit Suisse, this NYT article muses on why apparently good people end up committing white-collar crimes.

Perhaps misconduct by some groups can be ascribed to the belief that so long as everyone else seems to be doing something, it cannot actually be wrong. Continuing investigations into global banks’ manipulation of the London interbank offered rate, or Libor, as well as foreign currency exchange rates are replete with examples of traders exchanging information and boasting of their ability to artificially raise or lower a benchmark rate. These are not isolated instances, but part of a pattern of conduct over months and even years. So it cannot be chalked up to the heat of the moment.

What is so puzzling about people who have led otherwise good lives is that they are unlikely to have engaged in the misconduct if it is presented to them in stark terms. Ask a Wall Street trader, for example, whether he or she would trade on material nonpublic information received from a corporate insider, and the answer from most would be “no” — at least if there was a reasonable chance of being caught.

But under pressure to produce profits for a hedge fund or a bank, traders are often on the lookout for an “edge” on the market that can slowly take them closer to crossing the line into illegality. Add to that the vagueness of the insider trading laws in determining when information is “material,” and it can be easy to cross into illegality without necessarily noticing it.

There is a whole range of ethical blind spots. I talked about them before here. Incidentally, if you haven't seen Orange Is the New Black on Netflix yet – an upper-middle-class New York woman ends up in a correctional facility – it's intense but great.


November 26, 2013 | Decisions, Industry Trends, Mindfulness

People can’t see problems coming

Why do companies like SAC get themselves into such (alleged) trouble? It’s an important question for anyone who wants to make sure their own organization isn’t crippled by bad behavior. Just think how many incidents of foolish behavior we’ve seen in recent years, from LIBOR manipulation to Madoff’s theft and the exploitation of clients at some investment banks. HSBC narrowly avoided criminal indictment for money laundering, but got hit by a nearly $2 billion fine.

It’s not just finance, either, despite finger-pointing by activists. The biggest-selling British newspaper was vaporized by phone-hacking journalists. Detroit has just gone bankrupt in large part because of a generation-long looting of the city by corrupt “progressive” politicians. And let’s not mention the church.

Sometimes there is clearly pure venality, and every barrel has a few rotten apples.

Just as often, I think, people simply can’t see the serious risks they are running. Or certain kinds of behavior seem normal because “everyone is doing it”, and it is convenient not to ask questions. Groupthink takes over. People lose perspective. They see the immediate gain and deny the existence of longer-term costs. They focus on one goal to the exclusion of all others, and of common sense as well. Indictments and billion-dollar fines follow.

As Dan Ariely says in his book Predictably Irrational: The Hidden Forces That Shape Our Decisions:

We can hope to surround ourselves with good, moral people, but we have to be realistic. Even good people are not immune to being partially blinded by their own minds. This blindness allows them to take actions that bypass their own moral standards on the road to financial rewards. In essence, motivation can play tricks on us whether or not we are good, moral people. As the author and journalist Upton Sinclair once noted, “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”

People find it convenient to be in denial right up to the point everything collapses. Decision-makers often have an astonishing capacity not to see things right in front of them. Blind spots destroy organizations.


Criticism is useless without curiosity

I looked at advice by Buffett and Dalio earlier this week. Seek out criticism, they say.

It isn’t criticism for its own sake which is valuable, however. You don’t necessarily gain much from someone yelling at you or telling you that you are doing everything wrong. Neither do you necessarily gain by triumphantly refuting someone else’s objections.

Instead, the trick is to be able to step outside your own perspective and see how facts could fit another explanation. It’s understanding the difference between perspectives which is the key, rather than just arguing loudly from different positions.

There’s an interesting case study in Gary Klein’s book Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making about the limits of feedback, including the ability to make sense of it or to shift mental models. Klein specializes in “naturalistic” decision-making – how skilled people actually make urgent decisions in the field under pressure, rather than at leisure with spreadsheets. I mentioned one of his previous books in Alucidate’s conceptual framework here.

Doug Harrington was a highly skilled pilot who had landed F-4 aircraft on carriers hundreds of times. But he kept failing to qualify to land the A-6, despite continual feedback from the landing signal officers (LSOs) on the ship. “Veer right,” they told him repeatedly on every approach. But the feedback didn’t help him work out what was wrong. He faced the immediate end of his naval flying career or, worse, a crash into the back of a ship.

The chief LSO eventually asked Harrington how he was lining up the plane. It turned out that the A-6 cockpit seats the pilot and navigator side by side, rather than the navigator sitting behind the pilot as in the F-4. That slight difference in perspective threw off Harrington’s habitual way of lining up the nose of the plane against the carrier. Feedback and criticism alone didn’t help him figure out what was wrong. A small shift in perspective did.

The LSO was not a coach or a trainer. He didn’t give any lectures or offer any advice. He didn’t have to add any more feedback. What he brought was curiosity. He wanted to know why a pilot as good as Harrington was having so much trouble. He used Harrington’s response to diagnose the flaw in Harrington’s mental model. Then the LSO took the interaction another step. Instead of just telling Harrington what was wrong, the LSO found an easy way for Harrington to experience it. Harrington already suspected something might be wrong with his approach. The simple thumb demonstration was enough for Harrington to form a new mental model about how to land an A-6.

Mental models, or mindsets, are more important than criticism or argument in isolation.

It’s not just a matter of criticism, but curiosity. I’ve always found the most successful decision-makers and traders are the ones who want to know how other people think.


May 8, 2013 | Books, Decisions, Mindfulness, Perception, Psychology

Why we need checklists

How do you prevent complexity from leading to a fiery crash? There is a fine story about the origin of pilots’ checklists in Atul Gawande’s The Checklist Manifesto: How to Get Things Right. Boeing’s first major four-engine plane crashed in a demonstration flight for the army in 1935. People thought it was too complicated ever to fly.

A small crowd of army brass and manufacturing executives watched as the Model 299 test plane taxied onto the runway. It was sleek and impressive, with a 103-foot wingspan and four engines jutting out from the wings, rather than the usual two. The plane roared down the tarmac, lifted off smoothly, and climbed sharply to three hundred feet.

Then it stalled, turned on one wing, and crashed in a fiery explosion. Two of the five crew members died, including the pilot, Major Ployer P. Hill. An investigation revealed that nothing mechanical had gone wrong. The crash had been due to “pilot error,” the report said. Substantially more complex than previous aircraft, the new plane required the pilot to attend to the four engines, each with its own oil-fuel mix, the retractable landing gear, the wing flaps, electric trim tabs that needed adjustment to maintain stability at different airspeeds, and constant-speed propellers whose pitch had to be regulated with hydraulic controls, among other features. While doing all this, Hill had forgotten to release a new locking mechanism on the elevator and rudder controls.

The Boeing model was deemed, as a newspaper put it, “too much airplane for one man to fly.” The army air corps declared Douglas’s smaller design the winner. Boeing nearly went bankrupt.

Still, the army purchased a few aircraft from Boeing as test planes, and some insiders remained convinced that the aircraft was flyable. So a group of test pilots got together and considered what to do. What they decided not to do was almost as interesting as what they actually did. They did not require Model 299 pilots to undergo longer training. It was hard to imagine having more experience and expertise than Major Hill, who had been the air corps’ chief of flight testing. Instead, they came up with an ingeniously simple approach: they created a pilot’s checklist.

The military ended up ordering almost 13,000 of the planes, which became the B-17. Pilots still methodically walk around your Boeing or Airbus plane today, kicking the tires and checking off the take-off procedure. Even if they do it on iPads.
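The logic of a checklist is simple enough to sketch in a few lines of code. Here is a toy illustration in Python – the items are invented for the example, not any real take-off procedure – of the property that does the work: execution halts at the first unconfirmed item, so nothing can be silently skipped in the heat of the moment.

```python
# Toy checklist runner. The items are invented for illustration,
# not an actual aircraft procedure.
PRE_FLIGHT = [
    "Controls unlocked",
    "Trim tabs set for take-off",
    "Flaps set",
    "Fuel mixture set",
]

def run_checklist(items, confirm):
    """Walk the items in order and stop at the first unconfirmed one.

    `confirm` is any callable taking an item and returning True/False
    - in practice, a prompt to the pilot or a sensor reading.
    """
    for item in items:
        if not confirm(item):
            raise RuntimeError(f"Checklist halted: {item!r} not confirmed")
        print(f"[x] {item}")
    print("Checklist complete - cleared for take-off.")

# Example run, confirming every item automatically:
run_checklist(PRE_FLIGHT, confirm=lambda item: True)
```

The discipline is the point, not the medium: the same structure works on a laminated card, an iPad, or in software.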

February 18, 2013 | Books, Mindfulness, Situation Awareness

Attention is the crucial scarce factor in decisions, not information

Herbert Simon won the Nobel Prize for Economics essentially for a single book, Administrative Behavior, published three decades earlier in 1947, long before the current fashion for behavioral finance was invented.

He said:

.. The critical scarce factor in decision-making is not information, but attention. What we attend to, by plan or by chance, is a major determinant of our decisions. (p124).

It pays to be alert to what you pay attention to and what you can perceive. You need checklists.


February 11, 2013 | Books, Decisions, Mindfulness, Perception, Situation Awareness

The Flop of L’Enfant Plaza

I want to use the blog to point to broader problems of perception and perspective, rather than the specific discussion of global macro issues I reserve for clients.

Here’s an example of how much context and attention shapes what we see.

If you saw a hundred-dollar bill on the sidewalk, you’d pick it up, right? What about a musical performance that had sold out at a hundred dollars a ticket a few days before? The Washington Post conducted a famous experiment at a Metro stop in 2007.

No one knew it, but the fiddler standing against a bare wall outside the Metro in an indoor arcade at the top of the escalators was one of the finest classical musicians in the world, playing some of the most elegant music ever written on one of the most valuable violins ever made. His performance was arranged by The Washington Post as an experiment in context, perception and priorities — as well as an unblinking assessment of public taste: In a banal setting at an inconvenient time, would beauty transcend?

The answer was no. The Post had worried about crowd control, assuming at least some sophisticated Washingtonians would stop even on a busy commuter trip.

In the three-quarters of an hour that Joshua Bell played, seven people stopped what they were doing to hang around and take in the performance, at least for a minute. Twenty-seven gave money, most of them on the run — for a total of $32 and change. That leaves the 1,070 people who hurried by, oblivious, many only three feet away, few even turning to look.

Bell’s usual concert fee was equivalent to a thousand dollars a minute. At that rate, the three-quarters of an hour at L’Enfant Plaza would have been worth some $45,000; it brought in $32 and change. It’s very hard to see even superlative opportunities if they are not in a familiar context and if we are distracted by routine.

Bell headed off on a concert tour of European capitals. But he is back in the States this week. He has to be. On Tuesday, he will be accepting the Avery Fisher Prize, recognizing the Flop of L’Enfant Plaza as the best classical musician in America.

February 1, 2013 | Mindfulness, Perception

Smart Choices

Making good decisions requires serious skill. One of the major experts in the field over many decades is Howard Raiffa, at Harvard Business School. He has written a distillation of his more formal teaching together with two other researchers, John Hammond and Ralph Keeney: Smart Choices: A Practical Guide to Making Better Decisions.

It’s a very good little book, setting out a procedure they call PROACT. It does not tell you what to decide, they say, but there is a lot to be said about how to decide.

PROACT stands for Problem, Objectives, Alternatives, Consequences and Tradeoffs. They also devote chapters to Uncertainty, Risk Tolerance and Linked Decisions – cases where a decision made today will have knock-on effects on future decisions.

The very first element shows how hard it can be to get decisions right, however. Figuring out what the problem actually is, and how you frame the issue, “can make all the difference”.

To choose well, you need to state your decision problems carefully, acknowledging their complexity and avoiding unwarranted assumptions and option-limiting prejudices..The way you state your problem frames your decision. Posing the right problem drives everything else. .. To make sure you get the problem right, you need to get out of the box and think creatively.

Easily said, but much harder to do. You can question constraints, identify key elements, or look for triggers which explain why the problem has arisen, they say. But in practice one of the main ways to get the problem definition right is to:

Gain fresh insights by asking others how they see the situation… Their ideas will help you see your problem in a new light, perhaps revealing new opportunities or exposing unnecessary, self-imposed constraints.

After that, you need to think about what your objectives are – what you want and need. The trouble is that decision-makers often fail to spell this out – and fail as a result.

Why? Often decision-makers take too narrow a focus. Their list of objectives remains brief and cursory, omitting important considerations that become apparent only after they have made a decision. They concentrate on the tangible and the quantitative (cost, availability) over the intangible and subjective (features, ease of use). “Hard” concerns drive out the “soft”. In addition they tend to stress the short term (enjoy life today) over the long term (have a comfortable retirement). .. Easily measurable objectives won’t always illuminate what really matters. Watch out for this trap!

The next step is to generate alternatives. But, the authors say, “you can never choose an alternative you haven’t considered… Thus the payoff from seeking good, new, creative alternatives can be extremely high.”

You need to do your own thinking first, in case you are overinfluenced by advisors before you have thought it through. But after that,

you should then seek the input of others to get additional perspectives.. Keep an open mind during these conversations. The primary benefit may not be the specific ideas that others provide, but simply the stimulation you get from talking about your decision.

Once that is done, you examine the consequences of each alternative with as much accuracy, completeness and precision as you can, perhaps setting them out in a consequences table. The next step is to think about trade-offs between alternatives which match some objectives better than others. They outline a method of making “even swaps” between objectives until some options are dominated by others and can be eliminated.
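The dominance step at the heart of that elimination is concrete enough to sketch in code. Here is a minimal illustration in Python, with invented alternatives and scores (higher is better on every objective). It implements only the pruning of dominated options; the book’s even-swap method is the harder, judgment-laden part that trades off objectives until such dominance appears.

```python
# Toy consequences table: each alternative scored against the same
# objectives, higher is better. Names and numbers are invented.
consequences = {
    "Office A": {"cost": 7, "commute": 5, "space": 6},
    "Office B": {"cost": 4, "commute": 5, "space": 6},
    "Office C": {"cost": 5, "commute": 8, "space": 4},
}

def dominates(a, b):
    """True if a scores at least as well as b on every objective,
    and strictly better on at least one."""
    return all(a[k] >= b[k] for k in a) and any(a[k] > b[k] for k in a)

def prune_dominated(table):
    """Drop every alternative that some other alternative dominates."""
    return {
        name: scores
        for name, scores in table.items()
        if not any(dominates(other, scores)
                   for other_name, other in table.items()
                   if other_name != name)
    }

print(prune_dominated(consequences))
# Office B drops out: Office A matches it on commute and space and
# beats it on cost. A and C survive for the real trade-offs.
```

Real decisions rarely reduce to tidy numbers like these, which is one reason the authors put so much weight on framing and objectives before any of this machinery comes into play.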

They discuss handling uncertainty, including different outcomes which are mutually exclusive and collectively exhaustive (which has become a mantra among McKinsey alumni) and using decision trees. Linked decisions can be structured to learn about the underlying problem over time. Finally, they go over a list of decision traps and biases, such as anchoring on first thoughts, overconfidence and base-rate mistakes.
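The decision-tree arithmetic is equally simple once the branches are laid out – the hard part is making sure they really are mutually exclusive and collectively exhaustive. A toy sketch, with invented probabilities and payoffs (note the probabilities on each set of branches sum to one):

```python
# Toy decision tree: compare a sure option with a risky one by
# expected value. Probabilities and payoffs are invented.
risky = [(0.6, 120), (0.4, -30)]   # (probability, payoff) branches
sure = [(1.0, 50)]

def expected_value(branches):
    """Sum of probability-weighted payoffs over MECE branches."""
    return sum(p * payoff for p, payoff in branches)

print(expected_value(risky))  # 0.6*120 + 0.4*(-30) = 60.0
print(expected_value(sure))   # 50.0
```

Expected value alone is not the whole story, of course – which is exactly what the chapter on risk tolerance is for.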

One constant theme is that it is easy to be led astray by misperception:

At every stage of the decision-making process, misperceptions, biases and other tricks of the mind can distort the choices we make. Highly complex and highly important decisions are the most prone to distortion because they tend to involve the most assumptions and the most estimates. The higher the stakes, the higher the risks. .. The best protection against all psychological traps is awareness. Even if you can’t eradicate the distortions ingrained in the way your mind works, you can build tests and disciplines into your decision-making process that can uncover and counter errors in thinking before they become errors in judgement.

Of course, I’ve started Alucidate precisely to provide outside, independent, systematic advice that helps improve decisions, so I find all this persuasive at any rate.

It is striking that the authors cover many of the standard decision techniques taught in business schools. But two consistent themes run through the whole book that limit purely formal techniques. Framing is essential: you need to make sure you see the right problem and the right objectives, as no technique will help you find the right solution to the wrong problem. And it helps to talk to people, especially those who can help you understand the problem and the alternatives.

In general, I think systematic approaches and checklists to make sure you don’t forget important elements are very valuable, especially in high pressure/high stakes environments when it is easy to get swept up in the crisis or the mood of the moment. I’ll look at other books about the use of checklists in due course.


January 23, 2013 | Books, Decisions, Mindfulness, Perception