31 July 2012

What Deming Could Have Told Ballmer - How Rankings Destroyed Billions at Microsoft

Fascinating article in Vanity Fair on Microsoft's lost decade. In the midst of stories and stats illustrating Microsoft's fall is this simple fact: "The iPhone brings in more revenue than the entirety of Microsoft."


Management genius W. Edwards Deming used to excoriate executives and school officials who relied on grades, ratings, and rankings. He would have had fun with Microsoft CEO Steve Ballmer.

One of Deming's deadly diseases was the annual performance review and the ranking of employees. Deming thought that ranking showed a lack of understanding of systems and variation. Employees will invariably vary in their performance (whether the entire employee group is made up of Nobel Prize winners or recovering brain trauma patients freshly awoken from comas, there will be top-ranked and bottom-ranked people). And a business like Microsoft in 1995 will produce far more impressive results than a business like General Electric in 1905, regardless of rankings. Rankings simply distract people and undermine teamwork and innovation. Deming could have told CEO Ballmer this, but Ballmer apparently knew better. It is sad what his ranking of employees did to the potential of what was once one of the most amazing companies in the world.
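Deming's point lends itself to a small illustration. Here is a minimal, hypothetical sketch (the names, scores, and the 20/70/10 split are all made up, not Microsoft's actual formula): force-rank any group and someone always lands in the bottom bucket, no matter how strong the group is in absolute terms.

# A minimal, hypothetical sketch of forced ranking. The names, scores, and
# the 20/70/10 bucket split below are invented for illustration only.

def stack_rank(scores, buckets=(0.2, 0.7, 0.1)):
    """Split a team into top / middle / bottom buckets by relative score."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    n = len(ranked)
    n_top = round(buckets[0] * n)
    n_bottom = round(buckets[2] * n)
    return ranked[:n_top], ranked[n_top:n - n_bottom], ranked[n - n_bottom:]

# Ten hypothetical employees, every one of them scoring between 94 and 99:
team = {f"employee_{i}": 94 + (i % 6) for i in range(10)}
top, middle, bottom = stack_rank(team)
print("great reviews:   ", [name for name, _ in top])
print("terrible review: ", [name for name, _ in bottom])  # someone gets it, even on an excellent team

Run it and someone on this uniformly excellent team still gets the terrible review; the ranking measures position, not performance.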

From the article:


At the center of the cultural problems was a management system called “stack ranking.” Every current and former Microsoft employee I interviewed—every one—cited stack ranking as the most destructive process inside of Microsoft, something that drove out untold numbers of employees. The system—also referred to as “the performance model,” “the bell curve,” or just “the employee review”—has, with certain variations over the years, worked like this: every unit was forced to declare a certain percentage of employees as top performers, then good performers, then average, then below average, then poor.

“If you were on a team of 10 people, you walked in the first day knowing that, no matter how good everyone was, two people were going to get a great review, seven were going to get mediocre reviews, and one was going to get a terrible review,” said a former software developer. “It leads to employees focusing on competing with each other rather than competing with other companies.”

Supposing Microsoft had managed to hire technology’s top players into a single unit before they made their names elsewhere—Steve Jobs of Apple, Mark Zuckerberg of Facebook, Larry Page of Google, Larry Ellison of Oracle, and Jeff Bezos of Amazon—regardless of performance, under one of the iterations of stack ranking, two of them would have to be rated as below average, with one deemed disastrous.

For that reason, executives said, a lot of Microsoft superstars did everything they could to avoid working alongside other top-notch developers, out of fear that they would be hurt in the rankings. And the reviews had real-world consequences: those at the top received bonuses and promotions; those at the bottom usually received no cash or were shown the door.

“The behavior this engenders, people do everything they can to stay out of the bottom bucket,” one Microsoft engineer said. “People responsible for features will openly sabotage other people’s efforts. One of the most valuable things I learned was to give the appearance of being courteous while withholding just enough information from colleagues to ensure they didn’t get ahead of me on the rankings.”

Worse, because the reviews came every six months, employees and their supervisors—who were also ranked—focused on their short-term performance, rather than on longer efforts to innovate.

“The six-month reviews forced a lot of bad decision-making,” one software designer said. “People planned their days and their years around the review, rather than around products. You really had to focus on the six-month performance, rather than on doing what was right for the company.”

...
In the end, the stack-ranking system crippled the ability to innovate at Microsoft, executives said. “I wanted to build a team of people who would work together and whose only focus would be on making great software,” said Bill Hill, the former manager. “But you can’t do that at Microsoft.”
Sad that a company that created so much wealth through an understanding of, and ability to create, computer operating systems showed such a lack of understanding of the dynamics of social systems.

29 July 2012

Rilke on Future Possibility and Past Habits

An oddly devastating quote:


"Now I come to you full of future. And from habit we begin to live our past."
- Rilke

Meditate on this for a while. It may define you more than you'd ever care to admit.

Courtesy of Maria Popova, arguably the web's most intriguing curator.

28 July 2012

London's Drunken Olympic Marathon

The 1908 Marathon in the London Olympics is notable for a few reasons. (Warning: historical trivia follows.)

1. The first runner to cross the line, Italy's Dorando Pietri, was eventually disqualified because he collapsed several times before the finish and British officials essentially carried him over the line. With Pietri disqualified, an American was declared the winner.


2. The American thought he would have won outright except that he drank champagne a couple of miles from the finish line. Champagne was then considered an energy drink, akin to Red Bull. The champagne, he claimed, gave him cramps. (Although obviously he was not in as bad a shape as poor Pietri, who couldn't even stand.)

3. This was the first marathon run at 26.2 miles, a distance that later became the standard. The distance was not chosen because of history or precedent but simply because it was the distance between the window from which Princess Mary wanted to watch the runners begin the race and the Royal Box at the stadium where the race ended.


25 July 2012

Romney's Vision of America - Church & State Separate but Equal? When Tax Rates = Tithing



I'm reading Mormon America, a fascinating book, made more so by the fact that we've got about a 50% chance of electing a Mormon in November.

Among other things, it reports that Mormons are expected to tithe 10% of their income, a percentage that generates an estimated $5 billion a year in church revenue. This means that the average Mormon tithes about 3X what members of other churches do. (And might be reason enough for the average Mormon to feel like government taxes are simply too much additional burden.)

Curiously, the basis for tithing was established in the Old Testament, when the 10% was essentially a tax to cover not just the temple and priesthood but any or all costs we'd normally associate with government. 10% was not a religious tax; it was a kingdom tax.

Meanwhile, federal tax revenue as a percentage of GDP is below 15% for the first time since Romney or Obama was born. And Romney wants to cut taxes further. He doesn't say what his target is, but it has to be closer to 10% of GDP than 20%.

And if that were to happen, we'd have a terribly curious situation.

The president would be supporting and participating in a 10% tax to pay for religion: church buildings, educational materials, and some charity and missionary work (although the bulk of missionary work is funded by families beyond the 10% tithe).

He'd also be supporting and participating in a 10% tax to support all federal activities, from Social Security and Medicare to defense, health and human services, transportation (for instance, interstate freeways), education, homeland security, national parks, immigration and naturalization, the Army Corps of Engineers (think of the levees that protect New Orleans), agriculture programs, energy programs, the National Science Foundation (think basic research of the kind that might discover the Higgs boson or fusion energy or a cure for cancer), veterans benefits, the Department of Justice and the federal courts, the Small Business Administration, the Department of State (think foreign embassies and formulating and executing policy for our dealings with about 200 countries), NASA, natural disasters, TARP and other federally insured bank programs, etc.


That hardly seems balanced to me. I wonder if Mitt is also active in a church group pushing for lower tithing rates in the same way that he's part of a group pushing for lower taxes. As it now stands, separate but (nearly) equal funding for sectors of such differing scope doesn't make sense. 

Pry My Gun From My Cold, Dead Fingers - Explaining America's Gun Culture - repost

36 mass murders in 30 years and still no chance that Congress would pass laws to do so much as ban assault weapons or armor-piercing bullets. Along the 2,000 miles of border between San Diego and Corpus Christi, there are about 6,700 gun dealers. All on the American side, of course: guns are illegal in bullet-ridden Mexico.

Perhaps the reason that Americans love their guns has its roots deep in the collective history of Western Civilization.

The conservative impulse is always a desire for what once was, a pull towards some idealized past. This is not to say that it is at odds with progress: Renaissance thinkers sought to return to what they saw as the glorious past of the Greeks and Romans. They were not deliberately trying to create something new but were, instead, trying to revive the classical world that had died centuries before.

Conservatives try to retain what is – or even return to what they imagine once was. Religious conservatives feel a pull towards a past in which the church could define society. Social conservatives want to return to an imagined past in which social mores were more clear, were simpler, when mores were more moral.

Without understanding what came before the nation-state, one can’t really understand the love of guns in this country. It seems to have its roots in feudal Europe, a pull towards an idealized world that came before the rule of the state.

Few realize that feudal Europe was a world in which disputes were settled by, well, feuds. There was no recognized central government and if two families or regions or people had a difference, they settled it through negotiation, trickery, or force.

Although clergy were prohibited from carrying arms, everyone else was expected to be armed and prepared to defend himself. Rights were not guaranteed but had to be fought for and “war” regularly broke out between families, groups, and even individuals. Although this seems to us like anarchy, these feuds were one characteristic of feudalism.

Medieval society was closer to anarchy than anything we know of as a modern nation-state. Force was used to settle issues and anyone could wage war against anyone else. If this seems primitive, American readers might do well to remember what happened to the chief author of the Federalist Papers, which did so much to define the United States. Even for him, law did not offer an honorable resolution to disagreement: Hamilton, the first American Treasury Secretary, was killed in a duel because Aaron Burr thought that Hamilton's criticism had ruined his candidacy. The Bill of Rights that included a right to bear arms drew from this earlier time of feuds, when the individual distrusted government's monopoly on power. Even this constitutional genius had more regard for honor than law; culture changes slowly and involves more than just reason. Although they didn't use pistols, many a medieval man died as Hamilton did, although for them there would have been no alternative of resorting to law rather than a personal duel or feud.

Conservatives today have a love of guns because theirs is still an uneasy trust in the machinations of justice as administered by the state.

Now for us normal people, the thought of justice administered by force rather than law or reason seems unreasonable. And perhaps many conservatives would articulate the same thing. But in some deeper recess of their coerced social evolution, they still pull back from the idea of ceding power to faceless bureaucrats. The right to bear arms becomes sacred because it represents, at a real and visceral level, the last bastion of resistance against the monopoly on power that the state uses as the basis for everything else it imposes upon its citizens – from laws and regulations to taxes.

23 July 2012

Why Mormons are so Politically Active in Opposition to Same-Sex Marriage

I got a revelation, to borrow a religious term, about why Mormons are so active politically in their opposition to same-sex marriage. The LDS church helped to fund and to organize a great deal of the campaign against same-sex marriage in California's recent proposition fight, and Mitt Romney promises to push for a constitutional amendment to ban it. This might have something to do with their history.

Mormons wanted Utah to become a state. Their practice of polygamy made the rest of America resistant to that idea. Polygamy had to end for Utah to become a state.

When LDS President Woodruff found a way to end polygamy, he had to both show Mormon willingness to be good, law-abiding Americans and show respect for early church revelations that had made polygamy one of their defining practices. Even while ending polygamy, he didn't want to show disrespect for earlier teachings of church founders, particularly of the prophet Joseph Smith and his immediate successor Brigham Young.

So Woodruff wrote the following in what is - like the Pope's official pronouncements in the Catholic Church - considered to be as binding as scripture.

"nothing in my teachings to the Church or in those of my associates,during the time specified ... can be reasonably construed to inculcate or encourage polygamy," and so church members should follow his example in submitting to the law. He concluded, "I now publicly declare that my advice to the Latter-day Saints is to refrain from contracting any marriage forbidden by the law of the land."

By not explicitly refuting early church teachings, Woodruff's carefully worded revelation essentially turned over the definition of marriage to the state - a curious situation, subcontracting the definition of marriage to the government. And in that context, it makes sense that Mormons would be so eager to resist a change in the law on which they're now reliant. They aren't just ordinary political activists; they're protecting their church teachings.

The 2nd Amendment Settles It - Just Like the 6th Commandment

"Thou shalt not kill."

This 6th commandment seems to be terribly clear. It comes with no caveats, no clarifications, no exceptions, no parenthetical asides. But “Thou shalt not kill” quickly breaks down in a world where people argue about whether the definition of killing should include self-defense, warfare, abortion, capital punishment, or turning animals into meat. If you want to be sure not to offend God, it is probably safest to assume the most extreme interpretation and outlaw all of the above. Or you could use some judgment.

"A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed."

The NRA focuses on the last clause of this 2nd amendment, the right of the people to keep and bear arms. Their interpretation is akin to that of the pacifist, vegetarian, anti-abortion, anti-capital-punishment advocates who argue that there are no exceptions to the 6th commandment. For them the amendment is simple.

But of course there is an exception, even beyond the normal caveats that thinking people are likely to add. The exception is articulated in the first clause, linking this right to bear arms to the security of a free state and a well regulated militia. But somehow the NRA has rewritten that first clause in the public imagination to make the second amendment read, 

"An unregulated individual being more important than the security of the state, the right of any person to keep and bear arms shall not be infringed."  

And they've gotten away with it. Each year, more than 30,000 Americans are killed by guns but still legislators are afraid to regulate America's free-lance militias of one.


San Diego Daily Paper Makes it Official: Obama Worst President Ever

In an editorial published in yesterday's paper, the Union-Tribune's editorial board ranked the five worst presidents of all time: Barack Obama, they assure us, tops the list.

The reason he's so bad? He's been unable to repair the damage done by George W. Bush who, as it turns out, is nowhere on the list. Hmm. Maybe George got bonus points for creating a mess that a Democrat couldn't repair in a single term.


22 July 2012

The Cold Comfort of Conspiracy Theories

Colorado Batman shooting shows obvious signs of being staged

Mike Adams, author of the above piece, goes on to argue that it made no sense that James Holmes surrendered (it made sense that he shot dozens of people?), that a grad student could not afford a $1,000 assault rifle (and so obviously a $1,500 laptop is definitely out of the question for any grad student), and that the booby-trapped apartment showed signs of sophistication and planning beyond the scope of a grad student majoring in neuroscience. The obvious conclusion to be drawn is that the FBI brainwashed him and sent him off to accomplish this mission.

Conspiracy theories are inevitably convoluted: they fly in the face of simpler explanations, ignore inconvenient facts, and require the construction of hidden "facts." It's easy to laugh at conspiracy theories, but they serve a purpose.

Conspiracy theories assure us that the world is not subject to random events, that in fact things happen for a reason. And while conspiracies suggest that evil forces shape events, they at least assure us that someone is shaping events. 

19 July 2012

Would You Like to Hear My Collection of Famous Songs?

I don't mean to brag, but I own "Hey Jude" by the Beatles.

I wonder what the song Sgt. Pepper's Lonely Hearts Club Band would sell for if you could own it in the same way that you could own Munch's The Scream ($119.9 million) or Cezanne's The Card Players ($254 million) or da Vinci's Mona Lisa (priceless).

Owning a painting and owning a song are very different things. Maybe that's because music is meant to be communal. If you could dance to paintings, maybe they'd be affordable and owned by millions of people as well.

So you have to wonder if in the future, the price of the Mona Lisa will be 99 cents once we learn how to copy art with as much fidelity as we now copy music. Will the iPad do for art what the radio did for music?

When We Can Make Manufactured Memories Our Own

Eventually, the natural progression from the spoken word to the virtual reality of novels and movies might take us to the point at which we're actually able to download memories. A memory of bungee jumping or running the 110-meter hurdles fast enough to win an Olympic gold medal or making love to an exotic beauty will be available to anyone in the same way that videos now are.

This could complicate issues of identity. If the memory feels like your own and memory defines us, don't such memories constitute an erosion of self? Worse, what if someone smuggles a foreign memory into our heads, making us the unwitting culprit in a crime or responsible for a debt or for raising a child?

But these shared memories could also create a new basis for community. Now, punks or goths or sports fans share an identity based on shared experiences like albums or football seasons. Memories would accentuate this. Think about the ability to convince a group that they'd all shared an experience like war, all shared the experience of setback and fear, courage and victory, warfare and peace. Perhaps the most alluring thing about the shared memories will be their ability to take us past the point of individuation that threatens to leave us feeling isolated and lonely.

Manufactured memories could become the basis for extremely cooperative communities where everyone feels as though they understand one another, even if it is at the threat of losing one's own sense of self.

I guess this is far too cryptic to qualify as science fiction. Just call it blog fiction. And for now we'll share this memory of a curiously ill-formed notion about future technology. And a reminder that any technology can change our sense of identity, as car enthusiasts, gamers, bakers, and bankers can all attest.

18 July 2012

Intel and a New Management Technology

It's Intel's 44th birthday today. Here's an excerpt from my book, The Fourth Economy, about Intel's importance.


The Invention of the Computer and the Spread of Knowledge

As the world was recovering from World War Two in 1948, Bell Labs announced a product that it thought might “have far-reaching significance in electronics and electrical communication.”[1] Three of its employees would eventually share a Nobel Prize for inventing this transistor, a small piece of technology that replaced bulky vacuum tubes and laid the foundation for the world’s most rapid increase in technological ability. Perhaps no single invention did more to overcome the limit of information than the transistor.
Of course, turning information from something that limited the economy into something that the economy seemingly slid atop like dress shoes on ice was to take more than this single invention. Most obviously, this transistor was to become a part of a computer chip, where transistors were bundled into many on-off switches that could be used to represent numbers, letters, and, eventually, even pictures and videos.
Furthermore, this hardware was of little consequence without programming - a means to manipulate the transistors into an encoded array that could be used to represent, store, and process information.
In that same year, 1948, Claude Shannon coined the word “bit” to begin quantifying information as prelude to it being processed by a machine. 
By the 1960s, the transistor was part of a computer chip. The chip was embedded in a sea of information theory that was advancing its design and use in myriad ways. Computer programming was changing how people thought and what it was possible to do with this new chip. And it was this set of technologies – more than typewriters and telephones – that would come to define the information economy for most. Yet even with the advent of this new technology, something was missing. Technological invention alone is rarely enough; to make real gains from the computer chip required social invention, a change in corporate culture.
One of the three co-inventors of the transistor began a company to exploit this new technology. He didn’t just understand the transistor; he had, after all, helped to create it.
William Shockley was co-inventor of the solid-state transistor and literally wrote the book on semiconductors that would be used by the first generation of inventors and engineers to advance this new technology. He had graduated from the best technical schools in the nation (BS from Caltech and PhD from MIT), and was the epitome of the modern knowledge worker.
Shockley hired the best and brightest university graduates to staff his Shockley Semiconductor Laboratory. Yet things were not quite right. It wasn’t technology, intelligence, or money that his company was lacking. Something else was missing.
To answer what it was leads us to the question of why information has so much value.
One of the beliefs of pragmatism is that knowledge has meaning only in its consequences. This suggests that information has value only if it is acted upon. Information that is stored in secret has no consequences. By contrast, information that informs action - knowledge or belief, as a pragmatist would see it - needs to be both known and acted upon. This is information that needs to be distributed. Imagine a stock market with lots of information about companies and the economy but only a handful of people who buy and sell stocks. As long as only the elites act upon the information, that information doesn’t have much value to anyone else. Put another way, distributed information has value only when it drives distributed decision making or action.
So, in order for information to become important, it has to inform important actions. Furthermore, the more people who act on this information, the more valuable it will be.
What was missing from Shockley’s approach to this brand new technology was a management style or culture that would make information processing valuable. Shockley was apparently a fairly autocratic manager. He was creating information technology that got more value as it drove more decisions but he was, in style, not the kind of person to give up much control. Largely because of this, it was not Shockley who would become a billionaire from computer chips, but instead, a few of his employees.
The year 1968 was the kind of year that would have made even today’s 24-7 news coverage seem insufficient. Even then it was obvious that it was a historic year. In January, the North Vietnamese launched the Tet offensive, making it all the way to the U.S. Embassy in Saigon; this might have been the first indication that those unbeatable Americans could be beaten. In the United States, civil rights demonstrations that devolved into deadly riots were the backdrop for Lyndon Johnson’s signing of the Civil Rights Act. Martin Luther King, Jr. and Robert Kennedy - iconic figures even in life - were assassinated within months of each other. The musical Hair opened on Broadway and Yale announced that it would begin to admit women. For the first time in history, someone saw the earth from space: astronauts Frank Borman, Jim Lovell, and William Anders became the first humans to see the far side of the moon and the earth as a whole, an image that dissolved differences of borders and even continents. Any one of these stories could have been enough to change modern society. Yet in the midst of all these incredible events, two entrepreneurs quietly began a company that would transform technology and business, a company that would do as much to define Silicon Valley as any other.
Gordon Moore and Robert Noyce founded Intel in July 1968. Moore gave his name to “Moore’s Law,” a prediction that the power of computer chips would double every eighteen months. Here was something akin to the magic of compound interest applied to technology or, more specifically, information processing.
Moore and Noyce had originally worked for Shockley, but they left his laboratory because they didn’t like his tyrannical management. They then went to work for Fairchild Semiconductor, but left again, because, “Fairchild was steeped in an East Coast, old-fashioned, hierarchical business structure,” Noyce said in a 1988 interview. "I never wanted to be a part of a company like that."[2]
It is worth noting that Moore and Noyce didn’t leave their former employers because of technology or funding issues. They left because of differences in management philosophy.
Once when I was at Intel, one of the employees asked if I wanted to see the CEO’s cubicle. Note that this was an invitation to see his cubicle, not his office. We walked over to a wall that was - like every other wall on the floor - about five feet high, and I was able to look over the wall into an office area complete with pictures of the CEO with important people like President Clinton. In most companies, one can tell pretty quickly who is higher up in the organization than others; the level of deference and the ease of winning arguments are pretty clear indicators of who is where in the organizational chart. By contrast, I’ve never been inside a company where it was more difficult to discern rank than Intel. Depending on the topic, completely different people could be assertive, deferential, or argumentative. One of Intel’s values is something like “constructive confrontation,” and this certainly played out in more than one meeting I attended. When a company makes investments in the billions, it can’t afford to make a mistake simply because people have quaint notions about respect for authority. Intel’s culture seems to do everything to drive facts and reasons ahead of position and formal authority. This might have partly been a reaction of its founders to the management style of their former employer, Shockley.
The first company they left, Shockley Semiconductor Laboratory, no longer exists. The other, Fairchild Semiconductor, lost money last year and has a market cap of less than $2 billion, just a fraction of Intel’s market cap of more than $135 billion.[3] Intel’s net profit in 2010 was over $12 billion, and it employs more than 100,000 people worldwide. Moore and Noyce’s open culture seems to have made a difference.
Information technology has little value in a culture that hoards information. Information technology makes sense as a means to store, distribute, and give access to information and has value as tool for problem solving and decision making.
The pioneers of information technology, like Moore and Noyce, understood this and realized - at some level - that it made little or no sense to create hierarchies where information was held and decisions were made at one level and people were merely instructed at another. The knowledge worker needed information technology as a basis for decisions and action. Before 1830, up until the time of the railroad, the information sector of the American workforce was less than 1 percent.[4] By the close of the 20th century, nearly everyone seemed to need technology for storing and processing information.
By paying double typical wages, Henry Ford created a new generation of consumers for his car. Moore and Noyce didn’t just create information technology; they helped to popularize a management culture that distributed information and decision making, helping to create even more demand for their amazing new technology.


[1] James Gleick, The Information (New York: Pantheon Books, 2011), 3.
[2] Daniel Gross, ed., Forbes: Greatest Business Stories of All Time (New York: John Wiley & Sons, 1996), 251.
[3] Based on stock prices as of June 2012.
[4] Beniger, The Control Revolution, 23.

Birthers in Support of Obama

Presidential biographer David Maraniss is a tad incredulous at the birther conspiracy theorists claiming that Obama was born overseas. At the time Obama was born, his father was being closely followed by the INS, which was considering deporting him back to Kenya. Because of this, there are actually official government documents verifying that Barack's parents were both in Hawaii when he was born. More than just the birth certificate, that is.

Still, that story is not nearly as interesting as the claim of a sheriff with his own posse. Arizona Sheriff Joe Arpaio is using his considerable authority to denounce the president's birth certificate as a fake.

Oddly, I prefer Sheriff Joe's story to Maraniss's. It takes a clever man to fake his own death; it would take real genius to fake your own birth. The thought that our president is clever enough to do that actually consoles me.


16 July 2012

Stephen Covey Dead, His 7 Habits Alive

I loved teaching Stephen Covey's 7 Habits. I got to work for the Covey Leadership Center for about 3 years and I was sometimes amazed that they paid me to do something that was so gratifying. He was a good man who wove together and created an array of useful tools for better understanding and changing the self. One of the questions we asked in the Principle-Centered Leadership seminar was, "How many seeds in an apple?" The guesses ranged from a couple to a dozen. The next question was, "How many apples in a seed?" Nobody knows. Not yet anyway. And the same could be said about the influence of Stephen's thinking.

For me, teaching the 7 Habits was an opportunity to have a 3 day conversation about how we got to be who we are and how we might change that.

The 7 Habits of Highly Effective People was a carefully chosen name. Effective rather than successful because success connoted living one's life by someone else's yardstick, measuring your life by fame, wealth, or waist size. By contrast, effective suggested that you were able to be who you wanted to be, able to live according to whatever standards mattered to you.

Habit 1: Be Proactive 
We all react. That's a given. Some people are proactive some of the time, and they are the ones who create lives, relationships, companies, and communities.
The obvious reaction is an eye for an eye, responding in anger to someone who is rude or thoughtless. The less obvious reaction is grasping an opportunity as it comes along, making the best of something we perceive. Proactive suggests that you are, instead of idly scanning the radio dial of life and possibility, seeking out particular opportunities, perhaps even making them.

Habit 2: Begin with the End in Mind
Attendees of the 7 Habits seminar got feedback from themselves, family, friends, co-workers, supervisors, direct reports, etc. Based on this, they were able to see how they scored on the 7 Habits, and inevitably Habit 3 - Put First Things First - was the lowest score. Most people thought that this showed a lack of discipline and vowed to work harder; they might have been right. I think, though, that our procrastination and poor priorities often result from taking shortcuts on this habit. Tony Gwynn's college coach once said of Gwynn's dedication to hitting, "I sometimes think that we confuse work ethic with love for what we're doing."

What do you want to accomplish? What matters to you? What unique intersection of others' needs and your abilities is going to guide you to your own contribution? What matters so much that you would need work hours to tell you when you have to stop rather than when you get to stop? What end do you have in mind that tells you which otherwise good opportunities to ignore, and what does and doesn't deserve your time, meditation, or effort? If you're not clear on this, the next 5 habits won't do much for you.

Habit 3: Put First Things First
Covey popularized the notion of big rocks. You'll never have enough time, so the trick is to make sure that you carve out time for a few really big accomplishments and let the little things fit in around them, like sand and pebbles fitting in around the big rocks. Curiously, if you try to accomplish a lot, you might find yourself less clear about what you actually accomplished. This goes back to your end in mind; what few big accomplishments do you want to look back on? What are you doing this week to get closer to that?

Habit 4: Think Win-Win
The first three habits will get you to a point of independence, beyond defining your life in reaction to what is going on around you or in reaction to what just happens. The next three habits get you to a point of interdependence, focusing on the relationships that define your personal and work life.

You might be a win-lose personality, wanting to beat the people around you in terms of arguments, possessions, market share, etc. Or you might just be a win personality, not caring whether the other people around you win or lose but merely focusing on getting your own win. Worse, you might be a lose-lose or lose-win personality, someone destructive or a martyr.

You have to have courage to go for a win. You have to have consideration to want the other person to win. With both, you can get into the space of win-win, starting with the notion that a relationship, a transaction, a situation can result in a win for each person in it.

Habit 5: Seek First to Understand and Then to be Understood
Before you can state what you want - what your win is - you must first be able to clearly state what their win is. To do that might require lots of work or simply a few minutes of listening. Before you can be heard, you need to hear them. Hear their emotion, hear their position, hear their context.

When you are done listening, you can then state your win in the context of theirs. A few things are possible. Your wins may have nothing to do with each other, in which case you might have no basis for a relationship or a transaction. Or your wins are in conflict, in which case you may need to compromise or even design some larger solution that allows your wins to be jointly met, which brings us to the next habit.

Habit 6: Synergize
Synergy means that we've got system properties at work. Two lonely people can be combined into one happy couple. Sometimes wins can be better met jointly than in isolation. And most of society is testament to the power of synergy, from families to nation-states and trading partners. If you have an end in mind, chances are pretty good that this will involve other people; if you want any chance of achieving that end in mind, you'll likely need to tailor that end to match the visions that other people are carrying around in their head. Seeking first to understand might give you the opportunity to influence your own vision as well as theirs, shaping it into some workable plan.

Habit 7: Sharpen the Saw
Finally, we don't indefinitely stay at the same level of commitment after one revival meeting, the same level of fitness after one workout, or the same level of freshness after one nap. The various dimensions of being human - the physical, emotional, social, spiritual, and intellectual - all need continual regeneration and exercise. To create a proactive life suggests that you've got the resources for doing that; without pausing to sharpen the saw, you'll exhaust your resources prematurely.

Stephen didn't invent the idea of an intentional life but he did provide us with a wonderful framework and tools for becoming more effective at creating one. Intentionality aside, though, I just felt lucky to have worked for him. He was a good man who made the world better for thousands of people he had never even met. Now that's a life.

15 July 2012

The Not So New Problem of Immigration

In 2009, about 12.5% of America's population was foreign born.
That is considerably down from 1901, when it was an incredible 52%.

13 July 2012

War and Suicide

Fritz Haber was a German war hero for developing the chlorine gas used to kill French soldiers in the First World War. "His wife of thirty-two years, Clara, had long condemned his work as inhumane and immoral and demanded he stop, but to such concerns he gave a stock reply: death was death, no matter the cause." Nine days after the first gas attack, in which he proved the efficacy of his poison and the tools for delivering it, Clara committed suicide.
[story from In the Garden of Beasts by Erik Larson.]

Meanwhile, this year, American soldiers are killing themselves at a rate of 1 per day.

12 July 2012

Job Market Inflection Point?

"Teen hiring  jumped 22% in June over the same month last year in what is shaping up to be the best summer since 2007 for 16- to 19-year-olds to get jobs"



"US jobless claims plunge to lowest in 4 years"


Click through the above to find lots of caveats and clarifications but still, that's not a bad pair of headlines for one day. 


What's notable is not so much that the American jobs recovery is slow. What's notable is that it continues in spite of a slowdown in Asia, a slow-motion, euro-induced financial crisis, and continual public sector layoffs at the state and local levels.


Total private sector employment has returned to its pre-Great Recession levels. By contrast, total public sector employment is roughly 3% lower than when Obama took office. At some point, this steady rise becomes self-sustaining rather than teetering on the brink of collapse; we won't know when that point was for years, but I'm still betting that we'll look back to 2012 as that point.

Your Mama Invented Baseball

I think baseball was designed by mothers.

First, we've all come to accept the wisdom of wearing helmets while riding motorcycles or even bikes. Only baseball players wear helmets just for running. And curiously, many of them don't even run that fast.

Second, while football players get excited about just doing their job (watch a guy who has just tackled someone strut), baseball players are obligated to restrain their emotions. Unless it's a bottom of the ninth situation, even a grand slam gets just a tight-lipped smile and a restrained trot.

Finally, in football or basketball, if a player is a great scorer, his teammates keep giving him the ball. Baseball? It doesn't matter how well you hit, you have to wait your turn. Even a pitcher who might have about a 5% chance of getting on base gets his turn - and often the guy with the highest probability of getting on base has to wait until after this pitcher makes his obligatory out.

I could be wrong about this, but it kind of makes sense. "Put your helmet on before you run!" "Don't act all excited. Are you trying to make those guys who got out feel bad?" "Wait your turn! It doesn't matter that you hit better than Jeremy. You let him hit first." It's hard to imagine anyone other than a mom saying that.

11 July 2012

Where Bad Actors Go to Die - The GOP Political Theater

Let's see if I have this straight. Today the House Republicans are voting to repeal the Affordable Care Act. For the 33rd time.

As legislation, this is irrelevant. Their bill - like its 32 previous versions - will never pass the Senate.

As a clarification of their position, this is redundant. There is probably no political fact more obvious to Americans than Republicans' opposition to health care coverage.


This from the party that explains that they'll cut the deficit by eliminating government waste.

And yet the Republican Party wins on two counts by blowing off actual accomplishment in favor of political theater. One, the worse the economy, the better the prospects for Romney. Two, the more disgusted people are with government, the more likely they are to vote for them, the anti-government party. It's evil genius but it's still genius.

09 July 2012

Haidt: Morality Didn't Evolve for Truth Seeking

Jonathan Haidt's The Righteous Mind is a delightfully written and thought-provoking book. He covers quite a lot about morality before he moves to the punch line that you may have seen in his TED talk or in various interviews he gives. (Essentially, he explains how the morality of conservatives and liberals differs and why.)

He argues that morality evolved as a means to win approval, not to find the truth. You may think it is "true" that it's bad to deny rights to (fill in the blank with everything from Tutsis to Jews to women to Jewish women Tutsis), but discovering or arguing for something that feels "true" to you, curiously enough, is not the role of morality. Morality does show up as an obviously right or obviously wrong thing to us, but those judgments that feel so real to us turn out not to be universal truths. So while what is "true" may evolve over time (Philistines should be killed, debt is sinful, women should not vote, gays should be imprisoned, animals are food, work need not be fulfilling ...), your tribal identity depends on a clear statement of morality that "feels" true at a particular time.

Science and skepticism are for the discovery of truth. And that is uncomfortable, requiring questioning, living in doubt, and hard work. Very few people are scientists. Morality is for winning group approval, for proving that you can be trusted. It's much easier than science, and nearly everyone is moral (although not necessarily in a way that your group might recognize).

Perhaps that's obvious to everyone else. To me, it felt like a remarkable kind of revelation.

06 July 2012

Comparing Obama to Presidents Nixon thru Bush

Life's not fair. Obama inherited a mess and if he tries to point that out, he gets labeled irresponsible. The Romney campaign likes to point out that under Obama unemployment has stayed persistently above 8%. And they're right that this is awful. But it's worth pointing out that Obama is the first president since World War II to inherit an economy with an unemployment rate over 8%.


So, I'm going to compare the performance of the economy under Obama with that under presidents Nixon, Carter, Reagan, Bush Sr., Clinton, and Bush Jr., assuming that they had all inherited the same unemployment rate as Obama. The result is the unemployment rate each would have had at the time of his re-election, assuming he changed the rate by as much as he actually did during his first term. For example, under Nixon unemployment rose from 3.4% to 5.6%, or 2.2 percentage points. Had Nixon added those same 2.2 percentage points to Obama's starting point of 8.3%, he'd have had an unemployment rate of 10.5%. Clinton lowered unemployment by 2.2 percentage points, so he'd have had an unemployment rate of 6.1%. It's wonderfully superficial as far as economic analysis goes, but it does offer a curious comparison from the same baseline.
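For anyone who wants to reproduce the comparison, here is a minimal sketch of the arithmetic. Only the two first-term changes quoted above (Nixon's +2.2 points and Clinton's -2.2 points) are filled in; the other presidents' changes would come from BLS data and are left out here rather than guessed at.

# A minimal sketch of the comparison described above: apply each president's
# first-term change in the unemployment rate to Obama's 8.3% starting point.
# Only the two deltas quoted in the post are filled in; the rest are placeholders.

OBAMA_BASELINE = 8.3  # unemployment rate (%) in Obama's first full month

first_term_change = {
    "Nixon": +2.2,    # 3.4% -> 5.6%, per the post
    "Clinton": -2.2,  # per the post
    # "Carter": ..., "Reagan": ..., "Bush Sr.": ..., "Bush Jr.": ... from BLS data
}

for president, delta in first_term_change.items():
    print(f"{president}: {OBAMA_BASELINE + delta:.1f}% at re-election time")

Running it gives Nixon 10.5% and Clinton 6.1%, the two figures cited above.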



Apparently it's tough to lower unemployment in one's first term. On this list, only under Clinton did unemployment significantly drop during a first term. (Carter's and Obama's (so far) drops of a tenth of a percentage point seem fairly token, hardly better than Reagan's inability to budge unemployment at all.)



Job Report: Obama Nearly as Bad as Bush or Nearly as Good as Reagan?

Last month, the economy created just 80,000 jobs. That's just not good enough; it works out to fewer than a million jobs a year, when we would need about that many jobs per month to get back to full employment by year's end.

Counting from his first full month as president, the economy under Obama has performed poorly. Only GW Bush, with an average monthly job loss of 25,000 at the same point in his presidency, did worse than Obama's average loss of 12,000. See rankings below.



But if we discount the first full year of each presidency, assuming that inherited conditions like the post-2000 market bubble bursting or a global recession had little to do with their policies, Obama fares better in comparison. With job creation of 131k per month, the economy under him rates just below Reagan's 135k.
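As a rough sketch of the two ways of averaging, here is what that calculation might look like. The monthly figures below are made-up placeholders, not BLS numbers; the point is only the difference between counting from the first full month and discounting the first year.

# Sketch of the two averages discussed above, given a list of month-over-month
# payroll changes (in thousands) starting with a president's first full month
# in office. The series below is a hypothetical placeholder, not BLS data.

monthly_changes = [-700, -650, -400, -200, -50, 50, 100, 150, 175, 160, 140, 130,
                   120, 150, 160, 140, 130, 145]  # hypothetical, in thousands

def average_monthly_change(changes, skip_first_year=False):
    """Average monthly payroll change, optionally discounting the first 12 months."""
    window = changes[12:] if skip_first_year else changes
    return sum(window) / len(window)

print(average_monthly_change(monthly_changes))                        # from month one
print(average_monthly_change(monthly_changes, skip_first_year=True))  # first year discounted

The same series produces a deeply negative average from month one and a healthy positive one once the inherited first year is set aside, which is the whole argument over which number to use.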


It's worth noting that Republicans are far more adept at escaping dismal job creation records than are Democrats. Carter lost re-election and GW Bush won. It's an interesting set of comparisons. Will voters conclude that Obama is as bad as GW Bush (and barely re-elect him), nearly as good as Reagan (and decisively re-elect him), or more like Carter (well intentioned but unable to shake the label of ineffectual in spite of the actual numbers)?

Packaging matters and Republicans seem to understand that best. It's not clear, though, that Obama's campaign knows how to package his mediocre performance in the wake of such exciting promise. It still seems to me that if job creation under Obama starts trending upwards - particularly if it is enough to push unemployment below 8% - he'll easily win re-election. If the numbers continue like this, though, the November election seems like it could be decided by the flip of a coin.


04 July 2012

The Hopelessness of Reconciling Ideological Differences


 "Debating Republicans is like playing chess with a pigeon. You could be the world's best chess player, but the pigeon is still going to knock over all the pieces, crap on the board and walk around triumphantly."


I've just begun Jonathan Haidt's book, The Righteous Mind. His objective is to explain why conservatives and liberals reach such different conclusions, with the hope that he can build a bridge between the two groups. Out of the gate, though, he shares cultural studies that suggest to me the hopelessness of the task.

A study asked Indians and Americans to judge the morality of different people in a story.

One story asks you to judge a husband. A young married woman went alone to see a movie without informing her husband. When she returned home her husband said, "If you do it again, I will beat you black and blue." She did it again; he beat her black and blue.

Another story asks you to judge a widow in your community who eats fish two or three times a week.

A person from India thought it wrong for the widow to eat fish but saw nothing wrong with the husband keeping his promise. The Americans concluded the opposite. Neither saw their judgment as having anything to do with culture; both saw, instead, that it had everything to do with morality.

And in this perhaps we have the condition of conservatives and liberals. You're unlikely to ever convince the Indian that it is okay for the widow to eat fish. (Fish is apparently considered a "hot" food with aphrodisiac effects, making it potentially harmful to the community for this widow to be firing up her sex drive with no socially acceptable outlet. Apparently.) Or to convince the American (most Americans) that it's fine to beat your wife black and blue. There is no way to argue the other person out of his belief.

I'll stay tuned. It's nice to think that people from different cultures could persuade one another. Sadly, however, there seems to be little evidence of it.


03 July 2012

Europe's Point of Political Bifurcation: Independence Day or Interdependence Day?


Europe seems to be at a point of bifurcation. It can either rise to a higher level of complexity and political interdependence or spin a few countries out of the euro, into simpler political and economic relationships.

Bifurcation is a point at which a system can either collapse into something simpler or cohere into something with greater complexity.

One route that lies before Europe is less economic cohesion. Certain countries might stop using the euro, reverting to their own currencies as a means to allow devaluation and economic adjustment. This would give those countries sovereignty but would represent, for Europe, a simpler, less interdependent state.

Another route that lies before Europe is more political cohesion. Germany might actually send more funds to Greece and Italy in ways akin to New York sending funds to Louisiana. Not just budget targets but actual budgets for countries would begin to merge and become similar. This suggests greater political interdependence, a higher level of complexity.

In the words of UK Prime Minister David Cameron, Europe has a choice between sovereign nation-states and a European super-state. Put in terms of the 4th of July, they have a choice between independence day and interdependence day.

At the point of bifurcation, you face great uncertainty. One thing that is certain, though, is that the status quo has become untenable. Soon, Europe will have to choose either greater political interdependence or less economic interdependence.