12 September 2014

When Our Work Becomes as Quirky and Unique as We Are

Ernest Hemingway woke up early with the goal of writing 500 words. He wrote standing up and said he was "done by noon, drunk by three." He built his own boxing ring inside his home to spar with guests and friends.

Unlike Hemingway, Agatha Christie wrote when she felt like it and her favorite place to think through mysteries was sitting in the bathtub while eating apples.

Charles Dickens was an insomniac who walked for miles at night, hoping that being lost would inspire his creativity.

Maya Angelou checked into a hotel room stripped bare of paintings or other distractions to write each day. Until 2 PM, her only companions in the room were a Bible, thesaurus, a bottle of sherry, a deck of cards, and some crossword puzzles. After 2 she would head home to edit.

Truman Capote always wrote lying down.

[These - and a few other - delightfully quirky personality traits and working styles of famous authors can be found here in a post written by Ginni Chen.]

These writers were some of the most productive people in history. (Think about how many dollars in revenue have been generated by Dickens' writing alone.) And their writing styles were unorthodox, tailored to their own peculiarities and productivity rather than a social convention embodied in standard work hours and spaces, 8 to 5 and cubicles.

There was a time in history when only the rich and powerful were assured enough food or were free to express individual religious beliefs. Maybe we'll eventually progress to the point that everyone has the freedom to work in a style that expresses who they are.

09 September 2014

Why iPay May Be Apple's Most Lucrative Product Yet

iPay may prove to be Apple's most lucrative product.

Today, Apple announced the release of two new iPhones, an iWatch, and iPay, which will work like a mobile wallet. They've teamed with various credit card companies like VISA, Mastercard and AMEX to enable iPhone users to simply pay with their phones as if their favorite device was a credit card.

On the surface that might sound fairly innocuous. They are certainly not the first to offer the ability to make a digital purchase, as reported by Molly Wood here. But it's worth remembering that Apple wasn't the first company to make a digital music player. They just made it wildly popular.

First, some background. During the 20th century, a quiet revolution transformed finance. One of the reasons it might have been so quiet is that it has the oddly eye-glazing name of banking disintermediation. But disintermediation gets to the heart of how the Information Economy transformed finance, which plays right into Apple's new market.

Once upon a time, bankers sat atop a wall that separated the folks saving money from those who wanted to borrow it. They could take money from the savers, paying them 1% for their money, and then loan it to the borrowers at 10% (more or less). From their perch on the wall, they could see each party, parties who could not see each other. Savers and borrowers didn't know each other, so the banker played intermediary. This is a pretty lucrative position to be in. Still. (Last year Citigroup's revenues were $76 billion.)

But information technology has made it easier for borrowers and savers to find each other without the bank playing intermediary. As the cost of information has dropped, this wall separating borrowers and savers has slowly lowered, and with it the bankers' lofty perch. This disintermediation has a long history, one I explore in my book. The most recent instance of disintermediation is peer-to-peer lending. Lending Club, a San Francisco-based company founded in 2007, has facilitated $4 billion in loans. They are to lending what eHarmony is to romance. Why pay the bank the 9% difference between what a saver earns and what a borrower pays when the two of them can split the difference? There are billions - trillions - that can be retained within households by cutting out the bank. But of course far fewer people know and trust Lending Club than Apple.
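The split-the-difference arithmetic can be sketched in a few lines. This is a minimal illustration using the 1% and 10% rates from the example above; real platforms like Lending Club take fees, so the parties never capture the whole spread.

```python
# Illustrative arithmetic for the bank's spread vs. peer-to-peer lending,
# using the rates from the example above (1% to savers, 10% to borrowers).
saver_rate = 0.01      # what the bank pays savers
borrower_rate = 0.10   # what the bank charges borrowers

spread = borrower_rate - saver_rate          # the bank's 9-point cut
p2p_rate = (saver_rate + borrower_rate) / 2  # saver and borrower meet in the middle

saver_gain = p2p_rate - saver_rate            # extra yield for the saver
borrower_savings = borrower_rate - p2p_rate   # interest saved by the borrower

print(f"Bank spread: {spread:.1%}")   # 9.0%
print(f"P2P rate: {p2p_rate:.1%}")    # 5.5%
print(f"Saver gains {saver_gain:.1%}, borrower saves {borrower_savings:.1%}")
```

Both sides come out 4.5 points ahead; the bank's cut is what disappears.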

Now Apple will get millions of people comfortable with the natural extension of what Dee Hock, VISA's founding CEO, realized years ago: money is just information. As millions of Apple users become comfortable with the idea of using their phones for purchases, it won't be long before they become comfortable using their phones - and the extensive networks they represent - for loans. People hate banks and love Apple. It's perfectly plausible that Apple's foray into finance will do to banks what their popularization of the iPod did to record companies.

And there is a lot of money to be had in finance. More, even, than in music.

Cubicle-Roots Funding for R&D (One Approach to the Popularization of Entrepreneurship)

I'm once again inside a company working with a team of technical experts who are planning the development of a fairly complex system. Already it looks like senior managers' expectations are out of line with the team's perception. That's dangerous because senior managers invest the money.

This problem has its roots in organizational design, the allocation of power. Product development is inherently complex and there is no good way for just a few people in positions of power to fully understand what they're investing in. Employees who might blow the whistle on a key problem that could sink the project may think twice about such honesty if the result is the cancellation of the project and the loss of their jobs.

Today, senior managers approve a project, agreeing to invest millions to get a new product to the point that they can sell it for profit. But product development is rife with risk. Technology can fail to work as predicted, forcing management to scrap it. And given that a product is dependent on so many different technologies, it's worth remembering that it takes an unexpected failure in just one technology to drive serious delays, compromises or overruns. Debugging critical software can take longer than planned, resulting in a product offering that is largely obsolete by the time it is released. A key supplier can change terms, driving up costs to the point that the cost of goods sold wipes out projected profits. And, of course, the internal dynamics of the team itself can mask dysfunction until the project blows up.

You can rely on a model in which elites look down on this complexity and judge it. Or you might consider a model that actually depends on the perspective of the people who live within it. It's a bit like the difference between reliance on central planning and reliance on markets.

Readers of my blog and book know that I'm arguing for the popularization of entrepreneurship. Among other things, this means nudging - in some cases radically shifting - the role of employee to something more akin to entrepreneur. A different model for product development could illustrate what that might look like.

Imagine that rather than having senior managers make funding decisions about which products to pursue, you relied on the wisdom of the crowd. More specifically, had organizations take their lead from employees whose willingness to invest - or not - would signal the new product's potential.

Imagine that anyone in the organization - from a charismatic CEO like Steve Jobs to an introverted programmer or designer - could make presentations to the organization proposing a development project. (And yes, this very process would drive education in NPV, market analysis, technology risk, etc. Properly supporting it would lead to more widespread business education for employees.)

Imagine that a portion (10%? 33%?) of every employee's 401(k) fund had to be invested in either a low-risk, low-return fixed annuity (say, 1 or 2% above inflation) or the company's R&D projects.

Imagine further that employees would be able to investigate any potential project that individuals are proposing, able to do due diligence on this investment possibility. Given that some portion of their wealth would be a function of the success of these internal projects, they could use personal relationships and company data to determine who had the right personality to lead a team, which technologies had brilliant potential, and which had obvious flaws.

Imagine that whenever employees encountered a proposal they were excited about they could invest some portion of their 401(k) internal allotment, taking a stake in its future success.

Imagine that only when employee-led investments hit some critical mass would the company match (1 to 1? 100 to 1?) employee investments and fund a new project. A business plan might come from a confident project manager who could make part of the plan a tripling of his salary - or a significant portion of the future value of this new product. Key technical people might be able to propose similar raises or equity-sharing plans. And remember, if the employees strongly disliked any part of the plan, they could simply refuse to invest. A form of negotiation might emerge in the form of iterative proposals that would finally result in a plan that attracted investors.
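The threshold-and-match mechanics could be sketched as follows. Everything here - the names, the $250,000 critical mass, the 1-to-1 match - is a hypothetical illustration of the idea, not a description of any real plan.

```python
# Hypothetical sketch of employee-led R&D funding with a company match.
# The critical-mass threshold and match ratio are illustrative assumptions.

CRITICAL_MASS = 250_000   # employee dollars needed before the company matches
COMPANY_MATCH = 1.0       # company matches employee investment 1 to 1

def project_funding(pledges):
    """pledges: dict mapping employee -> dollars from their 401(k) R&D allotment."""
    employee_total = sum(pledges.values())
    if employee_total < CRITICAL_MASS:
        # Below critical mass: no match, project stays unfunded for now.
        return {"funded": False, "employee_total": employee_total, "total": employee_total}
    total = employee_total * (1 + COMPANY_MATCH)
    return {"funded": True, "employee_total": employee_total, "total": total}

# Three hypothetical employees pledge a combined $270,000.
pledges = {"ana": 120_000, "raj": 90_000, "mei": 60_000}
result = project_funding(pledges)
print(result)  # crosses the threshold, so the company doubles the pot
```

The interesting design question is the match ratio: a 1-to-1 match makes employees true co-investors, while a 100-to-1 match makes their pledges more of a signaling vote than a stake.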

Imagine that the result would be that R&D funds were more strategically allocated, based on richer and more nuanced understandings than any senior managers might have. And imagine, too, that such proposals would occasionally make certain teams or team members rich. Perhaps even give some intrapreneurs more money than the CEO.

Imagine that such mechanisms would help to popularize entrepreneurship, help to distribute income and wealth more broadly throughout the organization and - at the same time - create more total wealth and income.

Whether it would make employees the equivalent of venture capitalists or make R&D funding more like a Kickstarter campaign would likely depend on the culture and specifics of the process. In either case, it would promise a less centralized, more market-driven model than what we have now. That approach seems to have worked for nation-states: in a developed country, some percentage of citizens are likely to make more than the chief executive. (About 6 million Americans make more than we pay Obama.) It might be worth trying within the corporation.

06 September 2014

If Only We Could Find the Government Fact Factory

On more than one occasion, Republican friends have simply denied the accuracy of the data that now suggests the economy is improving. Instead of arguing about it, I've decided that the simplest way to deal with this is to quickly agree and then say, "Even worse, they made up all the data about the Great Recession. Not only is the economy not recovering but it, in fact, never went into a recession. None of it even happened. It's like the moon landing."

For folks with theories about the world that they hold fondly, it is indeed as David Byrne sang: "Facts just twist the truth around."

05 September 2014

How the August Job Numbers Could Boost Your Stock Portfolio

Economic forecasters were expecting 225,000 jobs last month but today the government reported just 142,000. If recent trends hold, you could use this news to strengthen your portfolio.

First, the consensus forecast of 225,000 was about what we'd averaged for the year. That's not much of a forecast. One can almost imagine consensus forecasters saying, "What's the average monthly gain for 2014? Let's use that for our estimate for next month."

The problem with that approach is that July and August have been the worst months for job creation during this recovery. During the months of summer, job creation averages about two-thirds of what it is during the rest of the year. If you really thought that the average of 225,000 was going to hold for the year, you would adjust that downwards by about a third for August to arrive at a number more like 153,000 - which isn't far off from the 142,000 reported. (And Paul Davidson in USA Today reports that the August numbers have routinely been adjusted upwards in subsequent months.)

A little dip in August seems more aligned with seasonal trends than an indicator that the recovery has changed. But the appearance of a dip might trigger a weaker market, particularly once September numbers come in.

September has also been weak for job gains, averaging 70-some percent of the typical month. When those numbers get reported next month, people are liable to make it mean that the economy is weakening. One weak month they can brush off, but two looks like the start of something bad. It could be one of those ugly Octobers when the market falls. If that happens, wait for panic to set in and then buy those stocks you wished you owned a couple of months ago. A little panic lowers prices.

But if trends hold, October job gains - which won't be reported until November 7 - are likely to turn things around. October has run about 30% higher than the average month, meaning that it could be roughly double what we see for August and September. Also, positive job gains through October would break the all-time record for consecutive months of job gains. Shocked at this sign of dramatic improvement, panic will dissolve into a sigh of relief and then a loud hallelujah. And you could make a little money on the stocks you bought during that time in October when the financial shows gave their microphones to the analysts who always see signs of a coming financial apocalypse.
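The seasonal arithmetic running through this post amounts to a back-of-the-envelope calculation. The factors below are rough approximations of the "two-thirds," "70-some percent," and "30% higher" figures in the text, not official seasonal adjustments.

```python
# Back-of-the-envelope seasonal adjustment of the naive jobs forecast.
# Factors are rough approximations taken from the discussion above.

consensus_avg = 225_000  # 2014's average monthly gain, used as the naive forecast

august = consensus_avg * (2 / 3)   # ≈ 150,000; close to the 142,000 reported
september = consensus_avg * 0.70   # ≈ 158,000; likely to look like more weakness
october = consensus_avg * 1.30     # ≈ 293,000; roughly double August/September

print(f"August:    {august:,.0f}")
print(f"September: {september:,.0f}")
print(f"October:   {october:,.0f}")
print(f"October vs August: about {october / august:.1f}x")
```

On these assumptions, the "weak" summer prints and the "shocking" October rebound are both just the same 225,000 trend wearing different seasonal clothes.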

An August number of 142,000 doesn't really challenge the prospect that we are on track for creating more jobs than any year since 1999. It's actually just a reminder that a month when most people take vacation is never a great month for finding a new job. But if people insist on making it mean something, it could mean something (good) for your portfolio. Stay tuned.

This jobs report marks an important milestone. Per Jason Furman's blog, above average unemployment is now largely a function of long-term unemployment. Short-term rates have returned to their pre-Recession levels.

04 September 2014

Count the Cost Twice, Cut the Check Once: Foreign Policy in a World of Endless Problems

Frank Bruni has criticized President Obama for his "Messy Words." Obama recently said that "If you watch the nightly news it feels as though the world is falling apart," and then went on to say, "The world has always been messy," seemingly dismissing today's outrages with a nonchalance that offends Bruni. Bruni knows how horrific conditions are (he's one of our best, most conscientious reporters) and he wants something done.

McCain would like us to bomb all the bad guys. Bruni would like us to save all the innocents. Emotionally, I think they have the high ground. Practically, I think these responses do at least two things: they get us started on projects we've no hope of completing, and they promise us an endless source of outrage and upset.

Woody Allen once quipped, "I can't enjoy a meal as long as I know someone, somewhere is starving."

There are a variety of topics that lend themselves to outrage. Right now, South Sudan is still a failed nation-state. It seems to me that there is no worse curse for a people than to find themselves in such a state: institutions collapsed, crime rampant, scant provision for basic needs, much less the ability to create jobs or hope. Gaza. Still. Ebola is spreading. In Libya and Syria innocent people are being slaughtered. Iraq is devolving into a mess. Putin seems to be toying with the idea of rebuilding the Soviet Union. And of course there are still plenty of obnoxious - even criminal - examples of sexism and racism in this country.

And I'm old enough to know something about this list. It changes but it never ends. You could maybe start all the projects suggested by such a list but you can pretty much guarantee that you won't finish them. Not during any one term, anyway.

I'm glad that we have people like Bruni to remind us that these sources of outrage and tragedy involve real people who we should really care about. I'm even glad for McCain's frequent reminders that we might bring justice to the bad guys. We do have a moral obligation to help. But quite honestly, I'm even more glad that we have a president so different in character from George W. Bush.

I like a little hesitancy and thought as prelude to projects as messy and uncertain as intervention in a foreign country. Whether it's marriage, having a child or intervening in a foreign country, it seems wise to hesitate before starting a project that might never end.

29 August 2014

Fields, Markets, Dialogue and the Empty Space Between Things That Defines Them

Systemic realities emerge out of interaction. To understand them, you look not to the parts but to the interaction between those parts. The space between parts that defines those parts could be referred to as fields, markets, or dialogue.

Here's something Einstein wrote that I think can be adapted for use in multiple spaces. 

"A new concept appeared in physics, the most important invention since Newton's time: the field. It needed great scientific imagination to realize that it is not the charges nor the particles but the field in the space between the charges and the particles that is essential for the description of physical phenomena."*

I wonder if this insight could be simply changed to describe economics, like this.
"A new concept appeared in economics: the market. It needed great imagination to realize that it is not the institutions nor the individuals in them but the market in the space between institutions and individuals that is essential for the description of economic phenomena."

Or perhaps society could be defined like this.
“A new concept appeared in sociology: dialogue. It required great imagination to realize that it is not society or the individuals in it but the dialogue in the space between society and individuals that is essential for description of social phenomena.”

Individuals define their society and the way they interact but of course society defines individuals. You’ll never understand either without understanding both. Or perhaps more importantly, without understanding the relationship between the two.

This is one reason that it’s so important for people to be heard. You can’t define a relationship without a dialogue. Monologues simply broadcast the status quo; dialogues involve both parties and allow for a change in the relationship. Given society can’t change without change from individuals and individuals can’t change without change from society, the first step to changing this relationship is a dialogue about it. Shut people up, shut them down, and nothing changes. Silence is a loud vote for the status quo.

*Albert Einstein, quoted p. 92 of Walter Isaacson's Einstein: His Life and Universe, Simon & Schuster, New York, NY 2007.

23 August 2014

Dow Tops 17,000. Are Stock Market Returns Steadily Improving?

The Dow closed above 17,000 for the week. In the 1999 bull market, it peaked at about 11,500, and in the pre-Great Recession market of 2007, it peaked at nearly 14,000. It is at a new high. I may have to re-think taking advice from the Tea Party. If they are right that Obama is a socialist, it is terribly confusing that capital markets have performed so well during his administration.

Here's an interesting list ranking stock market performance by president (using the Dow's close at the end of the calendar year in which each left office - typically 9 to 11 months after the successor is sworn in).

Dow Jones, Avg. Annual Returns

Calvin Coolidge (R) 32%
William Clinton  (D) 26%
Ronald Reagan (R) 24%
Dwight Eisenhower  (R) 19%
George H. Bush  (R) 17%
Franklin Roosevelt  (D) 15%
Warren Harding  (R) 13%
Harry Truman  (D) 10%
John Kennedy  (D) 8%
Gerald Ford  (R) 4%
Lyndon Johnson  (D) 3%
Teddy Roosevelt  (R) 1%
George W. Bush  (R) 0%
William Taft  (R) 0%
Woodrow Wilson  (D) -1%
James Carter  (D) -2%
Richard Nixon  (R) -6%
Herbert Hoover  (R) -17%

During Obama's administration the market is up an average of 21% per year, which would put him between those other socialists, Reagan and Eisenhower. It would also mean that 4 of the last 5 presidents are among the top 6 administrations in terms of stock market performance. That suggests that market returns have been going up in the last 30-some years.
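One caveat worth noting: "average annual return" can mean either an arithmetic mean of yearly changes or a geometric (compounded) mean, and the two differ. The list above doesn't say which it uses, so the yearly returns below are made-up numbers purely to show the gap.

```python
# Two ways to compute an "average annual return" over an administration.
# The yearly returns here are hypothetical, for illustration only.

yearly = [0.19, 0.11, 0.055, 0.073, 0.265]  # made-up annual Dow changes

# Arithmetic mean: simple average of yearly percentage changes.
arithmetic = sum(yearly) / len(yearly)

# Geometric mean: the constant rate that compounds to the same end value.
growth = 1.0
for r in yearly:
    growth *= 1 + r
geometric = growth ** (1 / len(yearly)) - 1

print(f"arithmetic mean: {arithmetic:.1%}")
print(f"geometric mean:  {geometric:.1%}")  # never exceeds the arithmetic mean
```

The gap widens with volatility, which is one reason rankings like the one above can shift depending on which average the compiler chose.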

This chart, however, suggests that it is less a matter of positive years getting better than it is negative years becoming less severe. Perhaps this has nothing to do with presidential policy but instead reflects the fact that the Federal Reserve is getting better at mitigating risk in the market. Market contractions are becoming shorter in duration and less severe. That's enough to help anyone's performance.

Why They're Finally Legalizing Pot - Or How They're Preparing Us for When Robots Take Over

I had come to my favorite deli to dine with my octogenarian friend Bernard. He had his 20-something grandnephew Delbert with him. My initial discomfort with Delbert had melted once I realized he was a fount of odd conspiracy theories, which I have a weakness for collecting.

"Delbert," I exclaimed as I sat down. "It's been a long time. You've been keeping busy?"

"Dude," he exclaimed back. "You have no idea."

"It's certainly true," Bernard interjected, "that I have no idea. This kid has been going on about programmable synapticons and the obsolescence of humanity."

"Programmable neurons and synapses, Bernie. There is no such thing as a synapticon."

"What are we talking about," I inquired.

"Robots! They are no longer on the other side of the globe. They are on the horizon. We can see them, dude, and they are going to make us obsolete."

"This sounds more like a 1970s sort of sci-fi than a 2010s reality," I countered. "People used to talk about robots but that is so passe."

"That's like a half century ago," Delbert pointed out, his math only slightly off. "That's like forever in robotics evolution. And that's the problem. They aren't just about to be smarter than us. They will be just picking up speed as they pass us."

"When will that be," I asked, expecting some vague answer from Delbert. I was surprised by his precise response.

"2042, dude. I'll be like your age."

"How do you get 2042?"

"This guy James Barrat has studied artificial intelligence and he says that the average of the experts' estimate of when we'll finally have a computer as smart as, say, a recent college grad is 2042. And of course then they'll be able to re-design themselves, intentionally evolve in ways that we can't. But everybody is working on this computer. It'll be like having smart college grads who don't get distracted by hunger or relationships or Facebook or bathroom breaks. These things will really be able to work 24/7. IBM, DARPA, Israel, Google ... everybody is racing to create this first because they'll make monster money from it."

"So these are real experts. Not sci-fi?"

"Dude, that's just it. It was sci-fi in 1865 when Jules Verne wrote about going to the moon. 100 years later, in 1965, it was actual science that NASA was just a few years away from making real. Now, you think robots smarter than us are sci-fi in the 1970s and again, about a century later, it will be actual science."

"Wow." It was an inane response, one that would probably take all of two lines of code to turn over to a robot. Still, it was all I could think to say. I suddenly felt like it would take a pull-string doll and not a robot to replace me.

"Which is why they're finally legalizing pot, dude."

"Wait. What?"

"Think about it. They subsidize drugs that make us productive, right? We have prescription drugs that help us to feel less anxious, help us to focus at work. We have drugs that help us to sleep at night so we're productive the next day at work. Those drugs aren't just legal. They're subsidized. They want us to have them. Because they need us to be productive for the economy to work. But drugs that just make you happy without forcing you to first be productive? They've been totally illegal dude. And now they're changing that."

"Sorry to be so slow, Delbert, but what's the connection?"

"The robots are going to do all the work. We won't need to. So we don't need to be productive anymore. And it'll make it easier for them to take over because if we're happily stoned, we won't care so much."

"So we won't have a role anymore?"

"Well, maybe they'll keep us to buy the stuff they make. Robots are just as happy to work. What do they know about recreation, right? Or going to the mall," he chuckled at the vision of robots with shopping baskets on their arms as they shopped for shoes or ripe pears. "So maybe our role will be consumers. You know," he shrugged, "to keep the whole cycle going."

"Or they'll keep us as pets," Bernard interjected caustically.

"Dude," Delbert turned excitedly. "Exactly. There's a guy whose only goal in life is to teach robots empathy so that when they take over they'll be kind to us and keep us as, like, pets or maybe servants. Because this guy - and he's an expert - he's convinced they're taking over in just a couple more decades."

"So what does a person do in this brave new future if he doesn't care for pot," I asked.

"I don't know, dude. Maybe you could play video games?"

Oh, and lest you think that Delbert is making everything up, here's an intriguing video that my friend Damon sent me, suggesting that we will, indeed, be made obsolete about the time today's babies would enter the workforce. If indeed it takes that long.

19 August 2014

Poverty in the Former Confederacy

The US Conference of Mayors just released a report that includes this table:

Richard Florida uses this as another example of the increasing wage gap in the US. People at the top of the list are twice as likely to be getting by on incomes of less than $35,000 as people at the bottom. That makes a huge difference to a community. I think it's just a further example of how policies that have their roots in the Civil War continue to cost us. Every single metropolitan area in the top ten of this list - from the ones in Texas to Georgia to Louisiana to Arkansas to West Virginia - is in the former confederacy. Every single metropolitan area at the bottom of this list - from Manchester, NH, to Washington, DC - is outside of it.

There are really just two ways to explain why some people are poor and some are not. One is to look for differences in their situation, using something like a Pareto chart to understand the impact of parents' income, community investments in education, private investments in capital, etc. In this approach, generational wealth and income are subject to forces that can be manipulated and changed through policy and private action. The other is to say, "Well, some people are just like that." Under this umbrella lie belief in aristocracy, celebrity-worship, racism, and a shrug of the shoulders about differences, a weird sort of acceptance of life's inevitability (or even God's will). It's no coincidence that the former Confederacy is - to this day - a place where poverty and poor education are more rampant than anywhere else in this country. Racism is, of course, the ultimate belief in one's helplessness to change realities like income disparity. This region has been very reluctant to let go of its racist beliefs. (It wasn't until February of 2013 that Mississippi officially abolished slavery.) It's still a place that rejects policies that would change incomes as the intrusion of "big government." A goodly portion of the South is still accepting of poverty and shrugs its shoulders about why some people are poor. In their minds, little can be done, and so little is done.

I think it would be worth tracking two economies as we seek to understand what's happening in the US. I suspect that tracking the former confederacy and the rest of the country separately would yield some interesting data that would help to make policies on both sides of this (still big) cultural divide. 

12 August 2014

A Big Day for Women in Math and Science

On this day, in 1942, movie star Hedy Lamarr patented spread spectrum communications and frequency hopping, a technology that is the basis for modern WiFi and Bluetooth.

Today, exactly 72 years later, Iranian-born Maryam Mirzakhani became the first woman in history to win the Fields Medal - mathematics' equivalent of the Nobel Prize.

So, it only takes a half century or so, but women are eventually getting recognition for achievements in math and science without having to be talented enough to co-star with the likes of Clark Gable and Spencer Tracy, Lana Turner and Judy Garland. Or, perhaps, Stanford's Mirzakhani simply hasn't yet taken the time to show that side of her talents.

10 August 2014

"Corporate entrepreneurship and innovation will be the next big thing for the next 10 years" - Steve Blank

Steve Blank recently said,

Corporate entrepreneurship and innovation will be the next big thing for the next 10 years, and the business school that sets up a program for that will be printing money from executive education and graduating a cadre of MBAs who will be snapped up by large companies that are desperate to reintroduce innovation inside their corporations.
Are any business schools making strides in corporate innovation? Corporate innovation is something that’s coming down the pipe. But I haven’t seen a business school that has understood that this is a big idea. The first couple that do will own the space. It’s wide open. We’re going to have a great time in the next five to 10 years.
This is a huge opportunity and unmet need – business schools haven’t pivoted yet. Now they’re getting the startup innovation courses right, but corporate innovation is a lot more complicated, and the startup techniques and classes don’t apply. There has been very little literature and research on the subject. 
I find this affirming because Blank has been ahead of the curve for probably a decade and the fact that he's seeing the need to make corporations more entrepreneurial aligns with the ideas I've been advocating for decades.

I think that the popularization of entrepreneurship is going to transform the corporation because so many of us are employees now. We couldn't leave knowledge work to autodidacts last century and we can't leave entrepreneurship to the innovative few outside of corporations in this.

Some people are beginning to understand that entrepreneurship will lead development now the way that capital did in the 19th century or the way that knowledge work did in the last. I'm still not hearing much about how that will transform business, but if people aren't talking about that, they still don't fully understand what we're facing. Popularizing entrepreneurship within the corporation will transform it as much as mutual funds and credit cards transformed finance and as much as democracy transformed the nation-state.

Market Economy            Big Social Transformation    Where Power is Dispersed
First, Agricultural       1300 to 1700
Second, Industrial        1700 to 1900
Third, Information        1900 to 2000                 Bank & Financial Markets
Fourth, Entrepreneurial   2000 ~

Why the Resolution of a Centuries-Old Debate Between Spinoza and Descartes Calls Free Speech Into Question

"I can't believe that!" said Alice.
"Can't you?" the Queen said in a pitying tone. "Try again: draw a long breath and shut your eyes."
Alice laughed. "There's no use trying," she said: "One can't believe impossible things."
"I daresay you haven't had much practice," said the Queen. "When I was your age I always did it for half-an-hour a day. Why sometimes I've believed as many as six impossible things before breakfast."
- Lewis Carroll

An intriguing paper from three University of Texas researchers tests Spinoza's and Descartes's notions of how we come to believe. Descartes thought that we comprehend something and then choose whether or not to believe it. For him, belief was separate from understanding and came later. Spinoza did not agree. He thought that belief was integral to understanding, that we accept a statement in order to comprehend it. For him, belief came automatically with comprehension; it was un-believing that came after and required effort. Belief did not have to be done but instead had to be undone.

Gilbert, Tafarodi, and Malone's studies give weight to Spinoza's view: believing simply happens through the act of comprehension. The act of unbelieving relies on logical capacity, correct information, or cognitive resources. Not everybody has these. You can reject the notion that Smith is pregnant if you know Smith is a man (logical capacity), but you can't reject it if you have wrong information (you thought Smith was a woman). Obviously, quite a few people lack the logic, correct information, or cognitive resources needed to refute lies. For these people, exposure to an idea will mean belief in it. This is obviously troubling in a world in which freedom of speech is protected.

The authors, knowing that bad ideas in circulation are likely to be embraced (by at least a few) once we've been exposed to them, still vote for freedom of speech. Their reasoning is that conversation has the potential to help people take a skeptical approach to something false they've heard, but conversation between people can't help them to believe in true statements they haven't heard. In their words,
In short, people can potentially repair their beliefs in stupid ideas, but they cannot generate all the smart ideas that they have failed to encounter. Prior restraint [e.g., censoring political dissent or pornography] is probably a more effective form of belief control than is unbelieving, but its stunning effectiveness is its most troublesome cost. 
What does this mean? Apparently we don't adopt beliefs; they adopt us. It's worth examining your beliefs from time to time to decide which ones to reject, knowing that, barring this act of intentional skepticism, we're likely to protect ideas that don't deserve it. If anything, their studies suggest that you're more likely biased towards protecting beliefs that don't deserve it than towards refuting beliefs that do.

The paper: Daniel T. Gilbert, Romin W. Tafarodi, and Patrick S. Malone, "You Can't Not Believe Everything You Read," Department of Psychology, University of Texas at Austin.