Random Excerpt: The Logic of Failure

Every week I crack open The Local Economy Revolution: What’s Changed and How You Can Help to a random chapter and copy what I find there into this post. If you like what you see here, you will probably like the book. And if you don’t, you might like the rest of the book anyway. Ya never know until you try, right?


The Logic of Failure: Making better plans

When our plans fail us, it’s often because our blind spots, our limited assumptions and our overlooked misinterpretations left us with a faulty strategy. We set ourselves up for that failure because we couldn’t see all the things we were missing.

One of the books that has been most influential on my thinking over the past few years is a 20-year-old volume with the catchy title The Logic of Failure: Recognizing and Avoiding Error in Complex Situations by Dietrich Dorner. The book details the results of a series of studies examining how people made decisions in complex and ambiguous environments.

Complex and ambiguous… sounds nothing like the communities we work with, right?

Add to that the fact that the participants were typically given economic development and public policy scenarios, and it starts to hit uncomfortably close to home.

In some respects, it’s a depressing read. Participants in Dorner’s studies make far more mistakes than correct decisions, and much of the time they fail miserably. By studying the participants’ choices and assumptions closely, and doing that a mind-numbing number of times, Dorner draws a pretty reliable distinction between those who made consistently good decisions and those who set themselves up for disaster again and again.

Dorner illustrates a large number of differences in how successful and unsuccessful participants approach and manage the tasks.  Here is one that particularly stood out for me:

Both the good and the bad participants proposed with the same frequency hypotheses on what effect higher taxes, say, or an advertising campaign to promote tourism in Greenvale [an imaginary city] would have.  The good participants differed from the bad ones, however, in how often they tested their hypotheses.  The bad participants failed to do this.  For them, to propose a hypothesis was to understand reality; testing that hypothesis was unnecessary.  Instead of generating hypotheses, they generated “truths.”[i] [emphasis mine]

How often do we test our hypotheses?  How often do we assume that a project will have a certain impact without taking a hard look at whether those assumptions are sound?

How often do we go back and re-examine the basic assumptions that we built our last plan on?

How often have we generated our own “truth,” expended enormous resources on that truth, and then acted surprised when something hits us that we didn’t see coming?

Admitting that we might not have the Truth takes bravery. Taking apart and examining the foundations of the structures we have built feels dicey, and rightly so. But the termites work silently until the structure falls down.

Since we know that even our best ideas can create unintended consequences, one of the most important things we can do is test our hypotheses regularly: not just during the plan development phase, but before and after it as well. We are perfectly capable of that. We just need to do it.

___

A follow-on piece of guidance comes from Dr. Michael Roberto of Bryant University, in The Art of Critical Decision Making, a Teaching Company lecture series. During the series, Dr. Roberto walks through two critical decision points of the John F. Kennedy administration. During the first, the failed Bay of Pigs invasion in 1961, Kennedy made the decision to invade Cuba on the basis of advice from a small, relatively ad hoc group of public policy advisors: a group with so much “expertise” on the topic that they missed key information that fell outside their expectations… and set the invasion up for disaster.

When the Cuban Missile Crisis came along in 1962, Kennedy learned from that mistake, and he set up a completely different process for building his advisory team, establishing their objectives and enabling them to work through to a conclusion. More specifically, a conclusion that didn’t end in nuclear war.

Dr. Roberto provides this summary of a key lesson from the Kennedy experience:

Many leaders fail because they think of decisions as events, not processes… We think of the decision maker sitting alone at a moment in time, pondering what choice to make.  However, most decisions involve a series of events and interactions that unfold over time.  Decisions involve processes that take place inside the minds of individuals, within groups, and across units of complex organizations.

When confronted with a tough issue, we focus on the question, “what decision should I make?”  We should first ask, “how should I go about making this decision?” [emphasis mine]

In most cases, the source of what happens probably lies in how we decided to decide.


[i] Dietrich Dorner, The Logic of Failure: Recognizing and Avoiding Error in Complex Situations. Basic Books, 1996, p. 26.
