24 February 2015

Jeb Bush as Republican IQ Test

There is a very real possibility that a father and his two sons will have held the presidency of this great country for two full decades (George H.'s 4 years plus George W.'s 8 plus Jeb's 8). Three men from one family reaching the top position in their field, out of a nation of 300 million. It would be an unprecedented achievement for any family in any field, from entrepreneurship and pop music to literature and science.

A Martian would be convinced that the Bush family must have some extraordinary mix of speaking ability, empathy, and policy genius to have so consistently won the hearts of the American people. It would be harder for mere earthlings to reach that conclusion.

You can hear Jeb Bush in his biggest speech yet, speaking to foreign policy here. I encourage you to listen for the full 35 seconds.

There is not a single utterance - or tongue stumble - that Jeb makes that is any different from those made by his brother. He's talking about foreign policy, about responding to the threat of terrorism. This is obviously a narrative about how to conduct ourselves in a post-9/11 world.

His conclusion? "We have no reason to apologize for our leadership." This is probably the most frightening thing he uttered, suggesting as it does that family pride, happy ignorance of actual events, or stubborn denial has convinced him that there is nothing he would have done any differently than his brother did.

Now you might believe that George W. simply had a string of bad luck. Possibly. It is also possible that the man who rightfully characterized himself as "the decider" made the worst series of decisions of any president in history.

His office was receiving a stream of intelligence in the summer of 2001 that Condi Rice - his National Security Adviser - categorized as "Osama bin Laden determined to strike in US." He decided to vacation at his ranch for the whole of August before 9/11.

15 of the 19 9/11 hijackers were from Saudi Arabia. Not one was from Afghanistan or Iraq. Bush decided to invade Afghanistan and Iraq.

Lawrence Lindsey was director of the National Economic Council in the run-up to the invasion of Iraq. Donald Rumsfeld estimated the Iraq war and occupation would cost $50 billion. Lindsey suggested it might cost $200 billion. Bush decided that Lindsey should be "forced to resign" and Rumsfeld should be trusted. So far, the attack and occupation of Iraq have cost about $2 trillion, 10X Lindsey's offensive estimate.

Never in the history of this amazing country had any president decided to finance a war with a tax cut. Well, not until George W. made that decision. Dubya inherited a budget surplus of $236 billion. The year he decided to invade Iraq with a tax cut, he ran a deficit of $378 billion, a swing of $614 billion from just three years earlier. By his last year in office, his annual deficit had grown by another trillion dollars. From Clinton's last year to Bush's last year, the deficit grew by as much as Canada's entire GDP. Only 10 of the world's 200-some countries have a GDP larger than Canada's.

Well it may have been expensive, but at least Iraq was worth it, right? It's hard to imagine how you'd argue that.

2,977 people were killed on 9/11. In the invasions and occupations of Afghanistan and Iraq, 6,717 U.S. service members have died, and even that tragic number is only about 4% of the 150,000 Iraqis who've been killed since our invasion.

We pumped enormous amounts of money, armaments, and military expertise into that area. ISIS is now well armed and has been able to tap into outrage to make that region a parody of Bush's vision of Iraq as a "beacon of hope." I'm not sure what data would support the claim that Iraq was a success.

Putting aside the fact that randomly pulling a professor from any American university's public policy department would yield someone more grounded in facts and sound theory than Jeb, it is alarming enough that this man sees no reason to apologize for American (read: his brother's) foreign policy. All of these atrocious outcomes and THERE IS NOTHING TO LEARN HERE is the conclusion the man is hoping we will reach. Given that he can't admit his big brother made a single mistake, he can't learn a thing from Dubya's disastrous administration. With no learning, there is no progress.

The Republican Party has been given an IQ test. If they choose Jeb, they will prove that they've slipped from any anchoring in facts and theory into the realm of ideology. Ideologues don't deal in theories. They deal in unchanging principles. They don't learn and adapt. They persevere. They don't just reject evolution in theory. They reject it in practice.

The Republican Party emerged in the mid-1800s when the Whig Party died. If Republicans decide they are ideologues rather than pragmatists, if they nominate another Bush or someone like him, they will follow the path of the Whigs, dying out as the last of the baby boomers who so fondly remember Reagan die out. Curiously, this party that is home to so many who resist science has a simple choice: evolve or die. That evolution starts with, if not an apology for past policies, then at least a radical change in them. It is harder to imagine Jeb Bush leading this change than it is to imagine Tito Jackson becoming more famous than his siblings Michael and Janet.


20 February 2015

Confusing STEM Education, Innovation and Entrepreneurship

Innovation is a form of technological invention. Human genome sequencing from a swab of the inside of your cheek, for instance. Or lightweight cameras mounted on helmets or drones that capture a ski run from the skier's perspective or a surfer from a seagull's perspective.

Entrepreneurship is a form of social invention. 23andMe, the company that sells a personal genome test, for instance. Or GoPro, the company that sells cameras for use in odd places like drones or dashboards.

The skills for innovation and entrepreneurship can overlap but they don't have to. By 1900, Andrew Carnegie was probably the richest man in the world, yet his early success depended on a steel-making process patented by Henry Bessemer. Carnegie was a businessperson who knew how to solve new problems with strong, affordable steel that could be used to build skyscrapers and railroads; he did not invent that steel. Steve Jobs did not invent the computer (although his partner Steve Wozniak did make some advances in circuit board design that helped it evolve).

You can be innovative but not entrepreneurial. R&D labs are full of brilliant scientists who could not tell you the commercial application of their efforts. And of course you can be entrepreneurial without being innovative. Starbucks' original founders did not invent coffee roasting. McDonald's Dick and Mac (fortunately for us, their most famous burger was named after Mac) did not invent the hamburger. The original founders of these companies were not even the ones who made the brands wildly successful. Ray Kroc and Howard Schultz have their names forever associated with McDonald's and Starbucks because they made the companies an international success and took them public.

There is a great deal of emphasis on STEM education - science, technology, engineering, and math majors. This is nice. It helps with innovation and we can always use more STEM graduates. They're probably the group least likely to be unemployed.

But even STEM graduates might not become innovators. They might not develop new technologies and products. That depends on particular sorts of institutions as varied as good grad schools, a strong financial community willing to bet on long-shots (venture capitalists, for instance) and strong R&D labs within universities and corporations.

And of course, those innovators may never become entrepreneurs. The skills required for product invention do not automatically lend themselves to entrepreneurship.

On that note, here is a fascinating chart that shows the gap between STEM and innovation. The gap between innovation and entrepreneurship is even wider. The policies that encourage each are very different but it's not obvious that policy makers make that distinction.


30 January 2015

Why Corporations Could Get Higher Returns Than VCs

Daniel Kahneman is the rare psychology professor to have won a Nobel Prize in Economics. His studies with Amos Tversky on how people value things are among the reasons. They suggest something counterintuitive about risk.

In one study, Kahneman ran a few scenarios with coffee mugs from a person’s alma mater. In one scenario, he gave people the mug and let them “take ownership” before offering to buy it back from them. In the other scenario, he offered to sell people a mug that was not yet in their possession. On the surface, you would think that they would value the mug the same way in both instances, but they didn’t. He had to pay, on average, $7 and some change to buy the mugs back from the people who owned them. By contrast, he could charge only about $3 and change to get people to buy a mug they did not already own. Kahneman’s conclusion was that people put a higher price on loss than they do on gain. It is more painful to lose what you have than to never get something comparable. He had to pay people roughly twice as much to give up their mugs as they were willing to pay to buy them.

This makes sense. You might feel the pang of a relationship that fails to materialize, but that is nowhere near as devastating as a divorce. Not getting the job is rarely as painful as being laid off. Loss is painful and we put a premium on avoiding the loss of valuable things.

This is one reason that Warren Buffett is worth so much. He sells insurance, and people pay a premium to avoid loss. Richard Thaler found that people would not pay more than $200 to eliminate a 1 in 1,000 chance of immediate death, which suggests they value that sliver of probability about the same as a smartphone. But if you offer to pay someone to accept 1 in 1,000 odds of immediate death, they refuse it even for $50,000.[1]

Just to be clear on this oddly paradoxical approach to risk, here are the scenarios.

Scenario 1: You have a choice between two pills to cure what ails you. The $1 pill has a one in 1,000 chance of killing you immediately. The premium pill eliminates this chance. How much extra will you pay for the premium pill?
Research suggests that the average person will not pay more than $200 for the premium pill.

Scenario 2: You are offered payment to take part in a research study testing the $1 pill, which carries a one in 1,000 chance of killing you immediately. How much will you demand to take this risk?
Research suggests that the average person will not do it even for $50,000.

Oddly, this difference does not come from a difference in the probability of death: in both cases, it is one in 1,000. The two scenarios carry the exact same risk, the exact same chance of death. One measures how much you are willing to pay to avoid the risk; the other measures how much you must be paid to accept it. The pain of loss is much greater than the allure of gain. We do not feel so bad about losing out on big gains, but we desperately try to avoid even small losses.
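If it helps to see both ratios side by side, here is the arithmetic in a few lines of Python. The dollar figures are the rounded numbers cited in this post, not the exact values from the original studies, so treat the output as an illustration of loss aversion rather than a replication.

# A minimal sketch of the loss-aversion arithmetic described above.
# Dollar figures are rounded from this post, not exact study values.

def loss_aversion_ratio(willing_to_accept, willing_to_pay):
    # How much more people demand to give something up than they would pay to acquire it.
    return willing_to_accept / willing_to_pay

# Kahneman's mugs: roughly $7 to buy a mug back from an owner vs. roughly $3.50 to sell one.
mug_ratio = loss_aversion_ratio(7.0, 3.5)
print(f"Mug study: owners demand about {mug_ratio:.0f}x what buyers will pay")

# Thaler's pills: no more than $200 to remove a 1-in-1,000 risk of death,
# but not even $50,000 to accept that same risk.
pill_ratio = loss_aversion_ratio(50_000, 200)
print(f"Pill scenarios: the ask to accept the risk is at least {pill_ratio:.0f}x the bid to remove it")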

These differences in how we value risk help to explain why the derivatives, junk bond, and futures markets are worth trillions. Smart investors will buy risk from people at a discount. Now, that would just be interesting if it were not for something really fascinating it suggests about how corporations could use behavioral psychology and the popularization of entrepreneurship to earn returns that venture capitalists would envy.

The prime candidates for entrepreneurial ventures are actually people in their 30s and 40s. Their startups are less likely to fail, and reasonably so: they have more experience than people in their 20s and more drive than people in their 50s. They have learned about processes, products, and people and typically know at least one industry reasonably well. But they have one major disadvantage in comparison to the twenty-something crowd: they have so much to lose.

Imagine a 40-year-old who has been in the industry – any industry – long enough to have a potentially lucrative idea. He knows a cheaper way to make an old product, or has an idea for an innovative new product, or sees how to create a new market. He also knows that executing this idea will require capital. And leaving his job. And working at risk for at least a year – more often 3 to 7 years. It is not hard to imagine that at 40, he has been married for 10 or 12 years. His children are 9 and 7 and he has a small amount saved for their college. He is 8 years into his 30-year mortgage. He is 10 years into his job, now gets 4 weeks of vacation, and is fully vested in the 401(k), which is just starting to seem sizable – but still not enough for retirement. The man has a lot to lose. Most importantly, he puts more weight on the cost of losing all that than he does on the potential gain from his entrepreneurial venture.

The twenty-five-year-old, by contrast, has almost nothing to lose. For this reason alone, she might be the better candidate for entrepreneurship.

If Kahneman’s studies are right, our 40-year-old values what he has now over what he could have at a rate of about 2 to 1. If Thaler is right, he values it at a rate of at least 250 to 1. In any case, the emotional cost of losing what he already has is great. He would be sick to wake up at 47 with no 401(k), no business, no money for college for his 16- and 14-year-old children, and no equity in his home. The prospect of this is more terrifying than the hope of waking up at 47 to a net worth of $5 or $10 million and the expectation of doubling that every 2 to 5 years. His preference for the second scenario is not as great as his desire to avoid the first. Loss is more sharply felt than gain.

This suggests that corporations have a great deal to gain by offering entrepreneurial opportunities to their employees. Countless employees who could be great entrepreneurs shy away from the prospect because they have so much to lose. What a corporation would have to offer as a share of returns would be less – perhaps considerably less – than what a venture capitalist or traditional banker would have to offer. A successful venture could return considerably more to the corporation than it might to the venture capitalists, simply because the employee-as-entrepreneur risks so much less than the employee who leaves his job to become an entrepreneur, and will accept a smaller share of the upside in exchange.





[1] Peter L. Bernstein, Capital Ideas Evolving (Hoboken, NJ: John Wiley & Sons, 2007), p. 15.

Stay tuned. This is an excerpt from an update to The Fourth Economy: Inventing Western Civilization, soon to be released with revisions. In the 20th century, we popularized knowledge work. In this generation we will popularize entrepreneurship. This is going to be big.

20 January 2015

Why Financial Bubbles Are Becoming More Common (and Dangerous)


“Between June 1998 and June 2008 the notional[1] value of outstanding over-the-counter derivatives exploded from $72 trillion to $673 trillion … just under eleven times global gross product.”[2]
“The gross debt of the US financial system grew from 20 percent of GDP in 1979 to 120 percent in 2008. In the UK, the gross debt of the financial system reached two and a half times GDP in 2007.”[3]

How is it possible to have so much capital?

One theory offered by people like former Fed Chairman Ben Bernanke and Europe’s most respected financial commentator, Martin Wolf, is that we have a savings glut. There is no way to distinguish between a savings glut and an investment dearth but in either case, the supply of capital is great enough that interest rates hit zero. (When something is scarce, its price goes up. When there is a glut of it, the price goes down. Interest rates are the price of capital and interest rates have been at or near zero for most of this new century.)

A glut of capital would have seemed like a fantasy to people living in the second economy (1700 to 1900) when capital was the limit in the same way that an all-you-can-eat buffet would have seemed like a dream come true to medieval peasants. But excess can bring its own problems and the financial crisis of 2008 that triggered the Great Recession was a reminder that a financial system can be weakened by either too little or too much capital. 

Capital was the limit to progress in the second economy. Capital made the biggest difference and was the source of the most new jobs and wealth. In most – but not every – instance, more or better capital was the answer to the question of how to make progress. If we imagine the economy as requiring four factors of production, the second economy looked like this.



During the second economy, to make more capital was to make more progress. Limits define the output of a system. The two-lane stretch of highway determines how many cars pass through in rush hour, not the three-lane stretch beyond it. In 1700, there were hardly any machines able to do work. Instead, work was manual. More machines, more factories, and more stocks and bonds to finance them all increased total economic output.

But with an investment dearth, capital is no longer the limit. In the fourth economy, the new limit is entrepreneurship. Our economy looks more like this now.


To create more credit, to generate more capital, is like building up a glut of half-finished products between the second and fourth machines in a factory. It is the equivalent of increasing the capacity of capital without a corresponding increase in knowledge workers or entrepreneurs. In the bubble leading up to the financial crisis of 2008, financial markets were wildly productive. That would have been a good thing had entrepreneurs been hiring workers at record levels, but they were not. The result was a huge bump in capital that simply bid up the price of existing assets (like homes and stocks) rather than creating new ones.


This is akin to an increase in the rate at which we build up inventory (the product that passes through machine two) without increasing the rate of products that leave the factory (product that passes all the way through machine four). Inventory can only stack up for so long before the second machine has to halt production. This becomes the cycle of booms and busts when the second machine is capital. More capital at the dawn of the fourth economy created bubbles. Once we popularize entrepreneurship, this rise in capital output will be hugely beneficial.
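To make the analogy concrete, here is a toy simulation in Python. The rates and the capacity limit are invented for illustration; the point is only that when the upstream machine outruns the downstream one, work piles up until the upstream machine is forced to halt, then resumes once the pile drains, in a boom-and-bust rhythm.

# Toy illustration of the factory analogy (all numbers invented).
# Machine two (think: capital markets) produces faster than machine four
# (think: entrepreneurs putting capital to work), so work-in-progress piles up
# until machine two is forced to halt, then resumes once the pile drains a bit.

UPSTREAM_RATE = 10     # units per period from machine two
DOWNSTREAM_RATE = 6    # units per period absorbed by machine four
MAX_PILE = 40          # how much work-in-progress can accumulate before machine two must stop

pile = 0
for period in range(1, 13):
    producing = pile < MAX_PILE          # machine two halts when the pile hits its limit
    if producing:
        pile += UPSTREAM_RATE            # boom: machine two keeps producing
    pile -= min(DOWNSTREAM_RATE, pile)   # machine four drains what it can
    state = "producing" if producing else "HALTED"
    print(f"period {period:2d}: machine two {state:9s}, work-in-progress = {pile}")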

For now, though, we have trillions in capital sloshing about the globe in search of returns; like waves in a bathtub, it can only travel so far in one direction before it reverses. In the late 1990s, it flowed into the stock market and drove up prices there, first creating a bubble and then a bust in 2000 when it left. In the mid-2000s, it flowed into the housing market, driving up prices there, first creating a bubble and then a bust in 2008 when it left. Capital has become dangerous, driving up the price of any asset class it settles on as the key to returns and then driving the price back down once it realizes the gains were the result of a surplus of capital bidding up a scarcity of assets.

That dynamic was at work in the 2008 financial meltdown. When markets fell, assets as varied as real estate and stocks lost tens of trillions of dollars in value. About 30 million people around the world lost their jobs and millions lost their homes.

Until we get better at developing entrepreneurs and making employees more entrepreneurial, we will have an investment dearth. We continue to get better at creating capital, but capital is no longer the limit. Once we get better at making employees more entrepreneurial, we will have a good use for such huge amounts of capital; we might even shift from an investment dearth to an investment glut (or savings shortage), with an abundance of opportunities for capital. Until then, an excess of capital will continue to create a series of bubbles and busts.





[1] Notional value inevitably exceeds the market value of a derivative. You might have a derivative based on a $100 stock price but would reasonably expect its value to move only 5 to 20% of that, a fraction that would set the market value. “Still, the market value of derivatives rose from $2.6 trillion to $35.3 trillion in December 2008, itself more than half global output,” Wolf writes.
[2] Martin Wolf, The Shifts and the Shocks: What We’ve Learned – and Have Still Yet to Learn – From the Financial Crisis (Penguin Press, Kindle edition, 2014), p. 128.
[3] Wolf, The Shifts and the Shocks, p. 129.

09 January 2015

This Decade's Extraordinarily Good Job Creation Performance

December's job report completes the fifth year of this decade. So far, this has been an extraordinarily good decade.

First, a simple fact. The economy has now created more jobs in the first half of this decade than it did in all of the 1950s.

Second, measured by jobs created, 2014 was the best year yet for the decade. This recovery is not slowing. Not yet.


The unemployment rate, too, fell at an impressive rate last year, the second year in a row in which it dropped by more than a full percentage point. At 5.6%, it is back to where it was in the summer before Lehman Brothers' bankruptcy in 2008. Remember that in December of 2009, it was 9.9%.


So how does this compare to past decades? It stands out.

Not only has the American economy created more jobs in the first half of this decade than it did in all of the 1950s, it has easily outperformed the first five years of every decade since. 10.7 million jobs is nearly double what the economy created in the first half of the 1980s.

I wonder if the pundits will change their tune about this terrible economy before the decade is over or if they're just going to leave that to historians.


22 December 2014

Why You Should Be Blown Away by Median Wage Growth of 2.1%

Wages rose 2.1% year over year through June 2014. That's been met with a yawn but it is a phenomenal number.

2.1% growth doubles wages every 33 years. Sustained, 2.1% wage growth increases wages by 8X in a century, enough to raise the median wage of $32,154 in 2000 to $256,927 by 2100.
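For anyone who wants to check the compounding, the arithmetic fits in a few lines of Python. Expect the 2100 figure to land within a few hundred dollars of the one above, depending on rounding.

# Compound growth at 2.1% per year: doubling time and the century-long projection.
import math

growth = 0.021
median_wage_2000 = 32_154

doubling_time = math.log(2) / math.log(1 + growth)   # years to double at 2.1%
century_multiple = (1 + growth) ** 100               # growth factor over 100 years
wage_2100 = median_wage_2000 * century_multiple

print(f"Doubling time at 2.1%: {doubling_time:.0f} years")      # about 33 years
print(f"Growth over a century: {century_multiple:.1f}x")        # about 8x
print(f"Projected median wage in 2100: ${wage_2100:,.0f}")      # roughly $257,000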

Does that sound outlandish? It's not. Wages grew about 8X in the 20th century. But more impressive than the increase in wages was the change in what you were able to buy over the course of the last century. At any price. I'm not sure how you would factor that into a calculation of wage growth. And the addition of outrageously cool things to buy is likely to continue. This new century has already given you new products like personal genetic analysis and a ticket for space travel. 

Here is a partial list of things that you could buy in 2000 that you could not buy in 1900. A surprising number of them are quite affordable. Any one of these would be deserving of a post that explores just what it means. (One day of travel would get you to the other side of the globe? For the average person in 1900 - very, very few had cars - a day's travel meant going into town and back.) Imagine what extraordinary things a person in 2100 with an income of a quarter of a million will be able to buy for Christmas presents.

Radio
A photocopy
Ticket to a movie
A video
TV
Airplane ticket. To anywhere
Airplane
Helicopter
Rocket
Anything plastic
Air conditioner
Teabag
Microwave oven
Electric refrigerator
Safety razor
Crossword puzzle
Bra
Vibrator
Penicillin
Antibiotics
Hepatitis-B vaccine
Polio vaccine
Insulin
The Pill
LSD
Bubble gum
Nylons
Laser
Velcro
Credit card
Mutual fund share
A pacemaker
Valium
Viagra
Prozac
Computer
Personal computer
Smartphone
Video game
Email account
Website
Video conference. With anyone. From anywhere
GPS

18 December 2014

Anniversary of Verdun - Not Just Worse Than Anything We've Experienced But Worse Than We Can Imagine

Try to wrap your mind around the scale of this tragedy. 9/11 happens. 2,996 lives tragically ended on a Tuesday. The country shocked. And then on Wednesday, 2,996 more lives are taken in a similar attack. Twice in a row now, the country is incredulous. And then on Thursday, another 2,996 lives are taken. Friday, 2,996 more. Every day this happens for 100 days, from 9/11 to 12/21.

Think of how many families would be shattered, the inconceivable number of young lives cut short. The baffled and heartbroken parents and siblings. Imagine how shell shocked people would become. Think about how hard it would be for the nation to absorb a loss of that magnitude, to even make sense of it. Imagine how much it would change us.

The Battle of Verdun ended on this day 98 years ago. In this single battle, 300,000 were killed.

By the time that World War 1 was over, 16 million people had been killed, 7 million of them civilians. 16 million is the equivalent of a 9/11 every day for nearly 15 years. This in a world where total population was about one-fourth of today's population.

It's not just that we really have no equivalent experience in our lifetime. We can scarcely imagine it.

Thank you Thomas for the reminder.

13 December 2014

Halfway Point of the Decade that Will Transform the Middle East

The Arab Spring seems to have begun with Mohammed Bouazizi, a 26-year-old Tunisian with 6 kids. Police had confiscated his vegetable cart – his livelihood – in December of 2010. When officials refused to hear his complaint, he protested by setting himself on fire. Within a month, protesters had ousted Tunisia’s government.[1]

This was the start of the Arab Spring, a contagion of protest and revolution that swept through the Middle East. By 2013, the heads of state in Tunisia, Egypt, Libya, and Yemen had been chased from office, civil wars had started in two countries, and protests had broken out in half a dozen others. These were protests orchestrated through Twitter and Facebook and played out in the streets with guns and tear gas. Secularists and religious groups could agree on ousting dictators, but now they have to agree on how to govern.

For me, this picture of a camel with a Google camera mounted on it seems to capture the Middle East's current, odd juxtaposition of old world and new better than anything I could imagine.

Evolution is hugely defined by environment. The Middle East is going through democratic revolution and reform in an environment so different from the one the West went through that it will be fascinating to see what it produces. Hopefully, among the things it produces will be at least a few more pictures like this.




[1] Jim Clifton, The Coming Jobs War (New York: Gallup Press, 2011), Kindle location 558.
Picture from Slate, video here.

10 December 2014

Post-9/11 Interrogations

This week's release of information about the CIA's torture program glosses over something fundamental about the interrogations.

Kurt Eichenwald, in 500 Days: Decisions and Deceptions in the Shadow of 9/11, recounts how American interrogators worked at Guantanamo Bay. Reading it, I was struck by how odd their assumptions were about the men they'd captured. The prisoners at Guantanamo were - most of them - soldiers of some kind. Many were from Afghanistan, a place where 92% of the people don't even know about 9/11. (This is not a place where the average person watches the evening news or surfs the Internet for investigative reporting. And it is a place where - between the Russians and the Taliban - people have had their own atrocities to deal with.)

Imagine that al-Qaeda soldiers captured some American soldiers, took them to an island in the Middle East, and interrogated them about Bush's secret plans for fighting Islam or details about Cheney's underground bunker. It wouldn't even matter if these al-Qaeda interrogators were polite when they asked questions of the American soldiers. It would still be stupid. It wouldn't matter if the al-Qaeda interrogators offered the Americans access to American football games and BBQ or tortured them. No kindness or threat would change the fact that these men were ignorant of what al-Qaeda wanted to know.

These prisoners at Guantanamo Bay were largely low-level soldiers who were nonetheless interrogated daily, as if they had great secrets to reveal. They didn't. The men working at Guantanamo were under pressure to produce results, and they tried everything from kindness to threats to torture. Nothing worked, for the simple reason that the men they'd captured didn't know anything useful.

As with so much in the wake of the Bush Administration's reaction to 9/11, the folks executing policy were struggling to build a skyscraper on a foundation of reinforced cardboard. Even on a good day, when you're given an impossible task, what you try may be unreasonable. And on really bad days, what you try may be immoral.

08 December 2014

Needs No Further Explanation

Or perhaps more precisely, I couldn't provide any explanation. This just seems to capture something perfectly.

The Arguments About Keynesian Economics Are Almost Never About Keynesian Economics

Keynes had a major insight into economics that people still seem to miss. It has to do with equilibrium and it plays out something like this.

1. Capital markets are essential to a thriving economy. Without investors willing to fund new schools and factories, new businesses and non-governmental organizations, we'd have no innovation, no growth, no creation of wealth or jobs. As an economy becomes more dynamic, this steady infusion of investment becomes more important just to sustain normal growth and employment.

2. During a period of healthy growth, the equilibria for capital markets and labor markets coincide. Letting investment markets do their natural thing will result in new jobs and economic expansion.

3. The equilibrium for capital markets can shift. One day the price of capital can induce savers to invest, and the next day it can convince them to hold their money in cash rather than make loans or fund new businesses.

4. This new equilibrium in capital markets shifts the equilibrium in labor markets. Unemployment can spike as capital sits idle.

5. This shift in the equilibrium for capital and labor markets will drive down GDP. Sales will fall.

6. At this point, it's perfectly rational for every sector to sustain the recession. Households without jobs won't buy. Businesses without sales won't hire. Investors without a sufficient number of employed households or expanding businesses to lend to will sit on their money.

7. In other words, at this point every "localized" market - labor, goods, and capital - will be in equilibrium. The good news is that the economy is stable. The bad news is that it has reached stability at recessionary levels. This general equilibrium needs to shift before it will make sense for investors, households, and businesses to change their behavior. There is nothing happening in the other two markets to drive a change in the third.

8. Government can make changes that will shift equilibrium in all the "localized" markets. By spending more, by cutting taxes, by putting more money into circulation or lowering interest rates, it can engineer temporary growth, helping to put an economy back on track for steady expansion. It can disrupt the equilibrium in the localized markets to get things moving again. The simple multiplier sketch below illustrates the idea.
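To make point 8 concrete, here is a minimal sketch of the textbook Keynesian-cross multiplier in Python. The numbers for autonomous consumption, investment, and the marginal propensity to consume are invented for illustration; the point is only that a change in government spending shifts the equilibrium level of output for the whole economy, not just for the capital market.

# Textbook Keynesian cross, with invented numbers (illustration only).
# Output Y = C + I + G, where consumption C = autonomous + mpc * Y.
# Solving for Y gives the equilibrium level of output for a given G.

def equilibrium_output(autonomous_consumption, investment, government, mpc):
    # Solve Y = autonomous_consumption + mpc * Y + investment + government for Y.
    return (autonomous_consumption + investment + government) / (1 - mpc)

mpc = 0.6   # assume households spend 60 cents of each extra dollar of income

before = equilibrium_output(autonomous_consumption=2.0, investment=1.0, government=2.0, mpc=mpc)
after = equilibrium_output(autonomous_consumption=2.0, investment=1.0, government=2.5, mpc=mpc)

print(f"Equilibrium output before stimulus: {before:.2f}")         # 12.50
print(f"Equilibrium output after 0.5 more spending: {after:.2f}")  # 13.75
print(f"Implied multiplier: {(after - before) / 0.5:.2f}")         # 2.50 = 1 / (1 - mpc)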

Valid rebuttals to Keynesian economics include criticism of a government's ability to time stimulus, debates about the level of downturn that deserves Keynesian stimulus, and questions about the degree to which a Keynesian-style intervention needs to be coordinated with other countries to be effective. You could even argue about whether expectations might mitigate the effect of such government policies.

Invalid rebuttals include "I don't like big government." Keynesian policy recommendations are independent of the size of a government. Countries like Denmark (where government spending is 58% of GDP), Sweden (51%), Germany (45%), the US (41%), Mexico (27%), Singapore (17%), or the Philippines (16%) can all engage in Keynesian policies. Whether you want to emulate Southeast Asia or Northern Europe is irrelevant. Whether your long-term target for government spending is 50% of GDP or 15% of GDP, government spending can still be increased in the short term in response to recessions and downturns. You can induce capital markets to engage in ways that stimulate the whole economy.

For me, the real genius of Keynes was his realization that reaching equilibrium in capital markets didn't automatically ensure equilibrium in the general economy. Before Keynes, we only had economics. After Keynes, we had microeconomics and macroeconomics. His policies were key to creating conditions that allowed the limit to shift from capital to knowledge workers.