10 April 2017

What a Manager Should Know: Deming's Four Elements of Profound Knowledge

Deming argued that there are four elements of profound knowledge that define what managers should know.
1. Appreciation for a system
2. Understanding of variation
3. Psychology, and
4. A theory of knowledge

To effectively manage or understand an organization, you don't need a deep understanding of any one of these, but you do need some understanding of all of them. Also, each of these elements makes more sense within the context of the other three; the four form a system.

"A bad system will beat a good person every time."
- Deming 

1. Appreciation for a system and 2. Understanding of variation
Deming used tools like control charts, which track data over time, to determine which variation was common cause and which was special cause. To understand variation is to understand the difference between what comes from the system and what does not. Americans in 2000 were about six times more productive than Americans in 1900. It wasn't because they worked harder (in fact, the average work week dropped from about 60 hours to about 38 hours over that time). It's because they had better systems. You'll never get as far trying to make people work harder in an old system as you will by improving that system.

Here is a set of 100 data points representing rework (imagine an auto assembly line that creates 1,000 cars a day, say) that mostly varies from about 10 to 40 cars needing rework each day. That much variation yields a control chart suggesting that normal variation falls within a range of 1 to 47 cars. (Normal variation is what we can expect from the system.)

Only one data point in the above chart - the one on day 16 that hits 57 - appears to come from special cause. All the rest of the variation is just a normal part of the day-to-day variation.
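
If you want to see the arithmetic behind those limits, here is a minimal sketch in Python (mine, not Deming's; the data are simulated stand-ins for the chart above). It uses the simplified rule that normal variation spans the mean plus or minus three standard deviations; a real individuals chart would usually estimate that spread from the moving range instead.

    import random

    random.seed(1)

    # Simulated daily rework counts: a stable system centered near 24 cars
    # a day, spread so that most days land between about 10 and 40.
    rework = [round(random.gauss(24, 7.7)) for _ in range(100)]
    rework[15] = 57  # day 16: the one special-cause point

    mean = sum(rework) / len(rework)
    sd = (sum((x - mean) ** 2 for x in rework) / len(rework)) ** 0.5
    # Control limits at mean +/- 3 sigma; rework counts can't go below zero.
    lcl, ucl = max(0.0, mean - 3 * sd), mean + 3 * sd

    print(f"normal variation: about {lcl:.0f} to {ucl:.0f} cars a day")
    for day, count in enumerate(rework, start=1):
        if not lcl <= count <= ucl:
            print(f"day {day}: {count} reworked cars - special cause, worth a story")

Everything inside those limits is the system talking; only the points outside them deserve a hunt for a cause.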

Normal variation can still be explained as special by people who don't understand it. It often is. "Orlando was not paying attention and we had 6 cars in a row assembled with the brake pads swapped. That's what happened." There is always a story to go with the data. And there is often a person we can name in that story. Normal variation can be explained, but those explanations are themselves randomly associated with the outcomes of a stable system. (This does not just happen on assembly lines. Each day, regardless of whether the market goes up or down, moves a lot or moves a little, analysts say things like, "Investors were skittish today because of ..." What would actually be remarkable would be a day in which the major indices finished exactly where they started. Variation is normal. It is only over longer periods of time that you can spot a general direction.)

The only data point in the above graph whose story deserves explanation is day 16. That day is unusual, and the explanation for it will likely tell you something. It is special, meaning that what happened that day isn't explained by the normal rise and fall of our system; it is variation that lies outside the normal bounds of daily variation.

Meanwhile, if you don't like it when you have to rework more than, say, 25 cars in a day, you need to look at the system. Is the process you're using dependent on guys like Orlando performing four different assembly steps every 5 minutes for 2 hours in a row before getting a break? Is there any data suggesting that the average person can sustain focus and accuracy for that long without attention wandering? It's easy to say that Orlando should focus, but do you have any data suggesting the average person hired for this role can? If that part of the process is consistently contributing, say, 4 to 15 of the rework events each day, then we know that changing that process has the potential to reduce rework by about 10 units a day. (Note that this target of ten is not some arbitrary goal born of the fact that we have ten fingers; it comes from examining the data, which suggests we get an average of 10 errors a day from the process Orlando works.)
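
Here is a sketch of how a target like ten falls out of the data rather than out of the air. The station names and counts below are hypothetical; the point is just to tally rework events by the process step that produced them and let the averages size the opportunity.

    # Hypothetical daily rework logs, each event tagged with the process
    # step it came from. All names and numbers are invented for illustration.
    daily_logs = [
        {"orlando_station": 12, "paint": 9, "wiring": 6},
        {"orlando_station": 8, "paint": 11, "wiring": 5},
        {"orlando_station": 15, "paint": 7, "wiring": 9},
        {"orlando_station": 4, "paint": 10, "wiring": 7},
    ]

    days = len(daily_logs)
    totals = {}
    for log in daily_logs:
        for station, count in log.items():
            totals[station] = totals.get(station, 0) + count

    # The biggest average contributor is the most promising place to
    # change the process.
    for station, total in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{station}: {total / days:.1f} rework events per day on average")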

Once we know what is wrong with the system, rather than blame Orlando for the errors, we can brainstorm solutions. What if we gave Orlando breaks every 90 minutes instead of every 120? What if we rotated the person responsible for this really demanding process step so that no one had to do this task more than 2 hours a day? What if we changed the process so that Orlando has to do just 3 steps every 5 minutes instead of 4? And so on. If we find a plausible theory for improvement, we can implement it for, say, another 30 to 100 days to see whether it brings down errors. If it does, we have made progress and can turn to some other issue within the system.
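
And here is a sketch of the check at the end of such an experiment, again with invented numbers: compare the rework series from before and after the change on both its average and its spread, since a lower average with wilder swings is not obviously a better system.

    import random

    random.seed(2)

    # Hypothetical rework counts: 100 days before the change, 60 days after.
    before = [round(random.gauss(24, 7.7)) for _ in range(100)]
    after = [round(random.gauss(15, 7.0)) for _ in range(60)]

    def summarize(series):
        mean = sum(series) / len(series)
        sd = (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5
        return mean, sd

    mean_b, sd_b = summarize(before)
    mean_a, sd_a = summarize(after)
    print(f"before: {mean_b:.1f} cars/day (sd {sd_b:.1f})")
    print(f"after:  {mean_a:.1f} cars/day (sd {sd_a:.1f})")

    # If the new average and limits hold up over 30 to 100 days, the change
    # stays; if not, it was likely normal variation dressed up as progress.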

The behavior of the system is typically stable even as we change who we hire. (And the hiring process is part of the system. If we make a real change in what we screen for when we interview candidates, that too, might improve our system.) Systems define most outcomes. Changing teachers or politicians, employees or bankers is often like changing the cast in your play in the hopes that Romeo & Juliet will end happily.

Also, systems can behave in unexpected ways. A system has emergent properties that none of its parts have. For instance, an engine cannot get you across town, nor can a steering wheel nor tires nor an axle. But when these parts are brought together in a system like a car, they can. Organizations are made up of knowledge workers who have to coordinate in order to create value. The person who designs a new product is worthless unless there is a person who can make it. Even those two are worthless if someone can't sell what they make, and so on. Just like the parts of the car cannot get you across town, the parts of an organization can't create value; through coordination, though, these people can create enormous value that emerges from their interactions. As a manager, you need to understand their current output as something that has lots of normal variation and you need to appreciate that the efforts in one part of the process can create problems in another step. (For instance, optimizing each part of the car could result in 87 different sizes of bolt. This complexity could make it more difficult to keep all your parts stocked and could even cause errors in assembly as you raise the risk of someone using the wrong bolt, one that is just fractionally off in size. Doing what is best for the system - changing the design so that it relies on just 3 different bolt sizes, for instance - might mean doing what is less than optimal for a specific part.) The point is to optimize the system, and that depends on people within it cooperating rather than competing.
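
A toy model can make that trade-off concrete. Everything here is invented: assume each part saves a little material cost when it gets its own ideal bolt, while every distinct bolt size the line must stock adds system-wide overhead and error risk.

    PARTS = 87
    SAVING_PER_IDEAL_BOLT = 0.02  # dollars saved per car per ideally bolted part (hypothetical)
    COST_PER_BOLT_SIZE = 0.50     # dollars per car of stocking/error overhead per size (hypothetical)

    def extra_cost_per_car(bolt_sizes):
        # Parts that must share a non-ideal bolt forgo the small material
        # saving, while each distinct size carried adds system-wide overhead.
        forgone_savings = (PARTS - bolt_sizes) * SAVING_PER_IDEAL_BOLT
        overhead = bolt_sizes * COST_PER_BOLT_SIZE
        return forgone_savings + overhead

    for sizes in (87, 10, 3):
        print(f"{sizes:>2} bolt sizes: ${extra_cost_per_car(sizes):.2f} extra per car")

With these made-up numbers, the locally optimal design (87 sizes) is the system-level worst, and the standardized design wins even though no individual part gets its ideal bolt.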

Which brings us to psychology. 

If you have people within a system compete for promotions and raises rather than cooperate to create a fabulous product, you lay land mines for future problems. If you have people work towards local goals rather than cooperate to create a success for the whole organization, you can easily encourage sub-optimization. Worse, you can disengage people through the use of extrinsic motivation.

The worst kind of motivation focuses people so much on the rewards that they don’t pay much attention to the task itself. One study took four-year-old children, an age at which kids tend to love a drum, and broke them into three groups. One group was told that the box in front of them held a special gift for playing the drum; they stared at it distractedly the whole time they were pounding. Another group was told, almost in passing, that they’d get a prize for playing the drum. The third group was told nothing but was turned loose in the same toy room that included a drum. The second group was most likely to later identify the drum as their favorite toy, which gave rise to the notion of the minimal sufficiency principle, in Mark Lepper’s words: “using rewards or threats that are minimally sufficient to get kids to do the desired behaviors, but not so strong that the kids view the threats or rewards as the reason they are acting that way.”

As with systems or variation, psychology is rich with much more than the simple considerations I’ve mentioned. Deming felt that so much of what we do in school and work undermines the intrinsic motivation of people to learn, engage, cooperate, and create. He often showed this chart (video to follow).

This psychological question of how the system you have designed engages or disengages people might be the most important question of all.

"90% of what matters cannot be measured."
- Deming 

Finally, the fourth element of profound knowledge is the theory of knowledge. How do you know what you know? Your data and people’s behavior might be stable, but what if the environment changes? How do you know that customers like your product? What about it do they like? The advances in UX since the time of Deming (he died in 1993) have taken this question seriously. It’s worth remembering that he made his name as a management consultant but first went to Japan to help with the census (“How do we know how many people live in Nara?” “How do we count people staying in a hotel on the night of the census? Are they counted as residents of the city of the hotel or the city they claim as home?”). And he got into a position to help define this after earning a PhD in physics. He studied phenomena and tried to understand how we knew what we knew, and he carried that basic inquiry into the question of how to count the population of an entire nation and how to measure quality in a product or service.

Evidence for what you know comes from data but data comes after you’ve formulated a theory. If you change what you are trying to measure or what you believe about the phenomenon, the data may suddenly be made obsolete or you’ll need to collect it differently. Your theory of knowledge is bound up in how you measure variation and how you define and understand the system you expect people to engage in.

Theory of knowledge, psychology, variation, and systems. You can start anywhere and go everywhere, but the real goal is to understand what you are dealing with in terms of a system and how that enables or disables people from realizing their potential within that system. This means understanding the difference between common cause and special cause variation, and even developing a deeper understanding of how you know anything at all. All of it is humbling, but it also leads to continuous learning and improvement as you continue to inquire on all of those fronts. Systems evolve with the people within them and the environment around them … or they become obsolete.

Ultimately, a successful social inventor or entrepreneur creates a system that outlasts them. The US didn’t collapse when Thomas Jefferson and John Adams died hours apart on the country’s 50th anniversary. Apple’s stock didn’t fall to zero when Steve Jobs died. The real value is less about your efforts within a system than your ability to improve or create the system. It’s true that some people run much faster than others but no one outruns a jet; what you want to do in improving or creating a system is to create something that performs much better than the people within it could hope to on their own. A great manager does the same and I’m not sure how you’d do any of it without at least some intuitive or learned understanding of Deming’s profound knowledge.
