In analysis, you have two kinds of errors: false positives and false negatives. [This blog post was inspired by Phil Rosenzweig's piece for Edge.org.]
You will make errors. You'd like to think that you put only guilty people in prison or buy only stocks that will steadily increase your wealth, but the fact is that we're bad at understanding the past, much less predicting the future.
So, given that you will make errors, it is important to ask in which direction you'd rather err. Would you rather put 10 people in prison who deserve to go free or put 10 people on the streets who deserve prison? Would you rather repeatedly pass on buying stocks that double in value or repeatedly buy stocks whose value collapses? When you adopt an approach to investing, criminal trials, or policy making, in which direction would you rather err?
Rosenzweig points out that the type of error we would prefer depends on the context. When it comes to science, we want to avoid a false positive. The data has to be very clear before a majority of scientists will say, "We have a new theory." You don't rewrite textbooks to accommodate one study. The same is true in a trial: we need to be very certain that a person is guilty before we put them in prison for a decade.
By contrast, in business you may be more accepting of a false positive. "We think this new market has great promise," your marketing team says. An engineer on your team tells you that he thinks he can solve the design problem that has been creating big warranty costs for the company. Even if the likelihood of their being right is low, it could be perfectly rational to invest in that new market or technology. If you lose one million dollars in bad investments 9 times during the year but on the 10th time you crack open a billion-dollar market, your company wins. If you wait for proof that the market is profitable ... well, it will be your competition who provides that proof, not you. The data that proves it is a good investment comes too late to inform your decision about whether or not to invest.
You don't want 9 good guys in prison in order to make sure that you keep one bad guy off the streets; you do want to lose $10,000 in 9 bad investments if it means that you make $1,000,000,000 in 1 good investment. Sometimes you want to avoid false positives and sometimes you want to avoid false negatives.
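The arithmetic behind that asymmetry can be sketched in a few lines of code. The numbers below are the illustrative figures from the paragraph above, not real data, and the helper function is hypothetical:

```python
def net_outcome(num_bets, loss_per_failure, win_payoff, wins):
    """Net result when `wins` of `num_bets` bets pay off and the rest fail."""
    failures = num_bets - wins
    return wins * win_payoff - failures * loss_per_failure

# 9 failed investments at $10,000 each, 1 success worth $1,000,000,000
net = net_outcome(num_bets=10, loss_per_failure=10_000,
                  win_payoff=1_000_000_000, wins=1)
print(net)  # 999910000: the single success dwarfs the combined losses
```

The point is that when the payoff from one success is large enough, a strategy that fails 9 times out of 10 still comes out far ahead.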
So where does government policy fall? Do we prefer false positives or false negatives?
Dee Hock was the founding CEO of VISA, the man who dreamed up the modern credit card. He said that every policy has intended effects and unintended effects; you always get the unintended effects. Whatever you try in government policy has a very good chance of leading to some behavior you hoped to avoid. If you make heroin illegal, you drive up its price, which can drive up the crime rate as users steal to pay for it. If you force companies to file more paperwork and conduct more studies before allowing them to go public, your community ends up with fewer startups. Even knowing this, you may decide it is better to make heroin illegal or to save investors from scams masquerading as legitimate startups, but you should anticipate such consequences from your policies. Expecting to get policy right on the first attempt is like thinking you can write code without debugging. I think that argues for treating policy as something more like investments than criminal trials; we want to try 9 things that fail in order to find the one thing that will make our communities better.
Progress does not mean getting rid of problems. Progress means solving better problems. In one century parents face the problem of getting their kids enough to eat to stay alive. In the next century parents face the problem of getting their kids to eat healthy. They are both problems but the second is so very, very much better than the first. Good policy isn't necessarily predictable and it certainly isn't flawless; it can, though, bring us to the point of dealing with a better set of problems. If you don't try new things your problems never get better.
One other reason to treat policy as a place where it is better to have false positives is that, ultimately, our standard of living is a product of our knowledge. In David Deutsch's fascinating book The Beginning of Infinity, he makes the argument that it simply isn't true that the earth, alone among the places we know in this universe, is the one place that is safe for us to inhabit. Or, rather, he points out that the claim is not that simple. As it turns out, most of us live in places where we'd die without the knowledge of how to make clothes, build shelter, and provide heating or air conditioning. We don't actually live on a particularly friendly planet, he points out, so much as we now have knowledge that allows us to live in hostile places like Fairbanks, Alaska or Phoenix, Arizona. With even more knowledge we can figure out how to live in even more hostile places like the moon or Mars. Where we can live, and even how well we live, is a product of knowledge. We know how to combat infection now, so our lives are longer. We know how to provide potable (that is, safe to drink) water straight out of a tap to anyone in the country; that simple knowledge may have done more to extend life expectancy than all the very cool drugs discovered in the last century.
One big reason to try things that may not work is that we will create more knowledge. Curiously, the very fact that policies have unintended effects is one more reason to try new things: every test or trial has the potential to reveal new knowledge. Again, it is the accumulation of knowledge that will make our lives better. Experiments that result in new knowledge fuel progress.
One last thing. Our condition has changed in recent centuries in ways that suggest we should be more willing to make mistakes. If all you have is $1,000, you should probably invest that $1,000 cautiously; if you have $10,000,000, you should take some risks with $1,000 increments. When a bad move is likely to result in starvation, it's a pretty good idea to avoid false positives, to insist on certainty, and to err toward clinging to traditions. You can't blame people from 1500 for wanting certainty. But as we gain prosperity and margin for error ... well, errors become easier to afford. Our condition is better than it was centuries ago; that should make us bolder about trying new things.
A huge percentage of scientific research does not result in anything cool or useful. A majority of business startups fail. Yet a prosperous community that is making progress will have lots of both kinds of activity going on: innovation and entrepreneurship that is more likely to fail than succeed. Government policy should be treated more like these domains than like the courtroom, where we look for overwhelming proof before moving forward. When it comes to progress, it is less about looking at existing evidence than it is about creating new evidence, less about what we have proven than what we might prove.