27 December 2008

Required Reading for the Obama Cabinet

Dietrich Dörner's The Logic of Failure makes points that escape some policy makers, lessons Dörner draws from systems simulations. Taken to heart, the following lessons might be enough to keep the Obama administration from repeating the mistakes of the Bush years. Dörner's book - or at least these very succinct excerpts from it - ought to be required reading for Obama's Cabinet.

Wouldn't it be wonderful if we could move beyond the "policy as an article of faith" phase of politics?
Both the good and the bad participants proposed hypotheses with the same frequency on what effects high taxes, say, or an advertising campaign to promote tourism in Greenvale would have. The good participants differed from the bad ones, however, in how often they tested their hypotheses. The bad participants failed to do this. For them, to propose a hypothesis was to understand reality; testing that hypothesis was unnecessary. Instead of generating hypotheses, they generated truths.

Catastrophes of just the last few years include Katrina and the 2008 credit crisis.
One basic error accounts for all the catastrophes: none of the participants realized that they were dealing with a system in which, though not every element interacted with every other, many elements interacted with many others. They conceived of their task as dealing with a sequence of problems that had to be solved one at a time. They did not take into account the side effects and repercussions of certain measures. They dealt with the entire system, not as a system but as a bundle of independent minisystems. And dealing with systems this way breeds trouble: if we do not concern ourselves with the problems we do not have, we soon have them.
"Catastrophes" seem to hit suddenly, but in reality the way has been prepared for them. Unperceived forces gradually eat away at the supports necessary for favorable development until the system is finally unable to resist any longer and collapses.

A leader ought to provide a narrative that is bigger than the moment, that provides more context than the urgency or false calm of the immediate situation.
The information a newspaper-reading citizen receives about economic developments or the spread of epidemics, for example, lacks both continuity and constant correctives. Information comes in isolated fragments. We can assume that those conditions make it considerably more difficult to develop an adequate picture of developments over time.

No amount of central planning works as well as dispersing control among subordinates who share a goal. Bush's discipline in keeping all staff on message was probably just one more thing that worked against him.
We would do better to follow the advice of Napoleon, whose motto for such situations was, "On s'engage et puis on voit!" (Freely translated, "One jumps into the fray, then figures out what to do next.")
In very complex and quickly changing situations the most reasonable strategy is to plan only in rough outlines and to delegate as many decisions as possible to subordinates. These subordinates will need considerable independence and a thorough understanding of the overall plan. Such strategies require a "redundancy of potential command," that is, many individuals who are all capable of carrying out leadership tasks within the context of the general directives.

It is not enough to state standard platitudes as goals. Meaningless language might help in political speeches, but it makes policy formulation difficult.
Bad participants simply compiled laundry lists of goals without providing operative principles for achieving them, and they made no distinction between primary and secondary measures. The bad participants provided a "jumble," not an "organized set," of measures.

Remember how much grief the neo-cons managed to create for Kerry because he simply used nuanced language that suggested that he understood the complexity of a situation?
Thomas Roth used results from a planning game to identify characteristics of good problem solvers. Roth studied the language that good and bad participants used while engaged in a simulation game called "Tailor Shop." The task was to manage a small plant that manufactured clothing. Using records of what the participants said as they thought out loud during the experiment, Roth studied the nature of their problem-solving language.
Roth found that the bad problem solvers tended to use unqualified expressions: constantly, every time, all, without exception, absolutely, entirely, completely, totally, unequivocally, undeniably, without question, certainly, solely, nothing, nothing further, only, neither ... nor, must, and have to.
The good problem solvers, on the other hand, tended more toward qualified expressions: now and then, in general, sometimes, ordinarily, often, a bit, in particular, somewhat, specifically, especially, to some degree, perhaps, conceivable, questionable, among other things, on the other hand, also, moreover, may, can, and be in a position to.
It is obvious from these two lists that the good problem solvers favored expressions that take circumstances and exceptions into account, that stress main points but don't ignore subordinate ones, and that suggest possibilities. By contrast, the bad problem solvers used "absolute" concepts that do not admit of other possibilities or circumstances.

"History will be my judge" is a wonderful punt on self-reflection and on trying to understand the consequences of one's choices.
The desire to check on efficacy seems to increase but never assumes major proportions. That is odd because we would expect that rational people faced with a system they cannot fully understand would seize every chance to learn more about it and would therefore behave "nonballistically." For the most part, however, the experiment participants did not do that. They shot off their decisions like cannonballs and gave hardly a second thought to where those cannonballs might land.
Strange, we think. But an explanation is readily available. If we never look at the consequences of our behavior, we can always maintain the illusion of our competence. If we make a decision to correct a deficiency and then never check on the consequences of that decision, we can believe that the deficiency has been corrected. We can turn to new problems. Ballistic behavior has the great advantage of relieving us of all accountability.
The less clear a situation is, the more likely we are to prop up our illusion of competence with ballistic behavior. Such behavior reduces our sense of confusion and increases our faith in our own capabilities.


Lifehiker said...

Very interesting. Anyone who has had long experience in leading real world projects would relate to these observations. There is a lot of subtlety and uncertainty in every major challenge, and accepting this fact enables the flexibility that makes for progress.

Regarding Obama's cabinet, I hope it includes enough people who are real "doers", people who have both the intellect to think through policy and the experience to implement policy effectively.

Bush's failure may be attributable largely to his poor staff choices. If you don't know what makes good leaders, you probably won't choose such people.

Dave said...

Is the opposite of this mindset gridlock from constantly re-evaluating information and not acting?

I sometimes find myself doing that.