The Invention of the Computer and the Spread of Knowledge
As the world was recovering from World War Two in 1948, Bell
Labs announced a product that it thought might “have far-reaching significance
in electronics and electrical communication.”[1]
Three of its employees would eventually share a Nobel Prize for inventing that
product: the transistor, a small piece of technology that replaced bulky vacuum
tubes and laid the foundation for the most rapid increase in technological
capability the world had ever seen. Perhaps no single invention did more to
overcome the limit of information than the transistor.
Of course, turning information from something that limited
the economy into something that the economy seemingly slid atop like dress
shoes on ice would take more than this single invention. Most obviously, the
transistor would become part of the computer chip, where many transistors were
bundled together as on-off switches that could be used to represent numbers,
letters, and, eventually, even pictures and videos.
Furthermore, this hardware was of little consequence without
programming - a means to manipulate the transistors into an encoded array that
could be used to represent, store, and process information.
In that same year, 1948, Claude Shannon introduced the word
“bit” as a way to quantify information, a prelude to its being processed by
machines.
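To make those two ideas slightly more concrete - on-off switches standing in for letters, and the bit as a unit for counting information - here is a minimal sketch in Python. It is only an illustration of the general idea, not anything drawn from Shannon’s paper or from Bell Labs; the eight-switch encoding and the twenty-six-letter alphabet are assumptions chosen for simplicity.

```python
from math import log2

# Eight on-off switches (bits) are enough to encode one character of text.
letter = "A"
switches = format(ord(letter), "08b")   # 'A' -> '01000001'
print(letter, "as eight on-off switches:", switches)

# Shannon's measure: picking one item out of N equally likely possibilities
# carries log2(N) bits of information.
possibilities = 26                      # e.g., one letter of the alphabet
print("bits of information in one letter:", round(log2(possibilities), 2))  # ~4.7
```

The point is simply that once information can be counted in bits, a machine built from enough on-off switches can store it and manipulate it.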
By the 1960s, the transistor was part of a computer chip.
The chip was embedded in a sea of information theory that was advancing its
design and use in myriad ways. Computer programming was changing how people
thought and what it was possible to do with this new chip. And it was this set
of technologies – more than typewriters and telephones – that would come to
define the information economy for most people. Yet even with the advent of this
new technology, something was missing. Technological invention alone is rarely
enough; making real gains from the computer chip would require social invention,
a change in corporate culture.
One of the three co-inventors of the transistor founded a
company to exploit this new technology. He didn’t just understand the
transistor; he had, after all, helped to create it.
William Shockley was a co-inventor of the solid-state
transistor and literally wrote the book on semiconductors that would be used by
the first generation of inventors and engineers to advance this new technology.
He had graduated from the best technical schools in the nation (a BS from
Caltech and a PhD from MIT) and was the epitome of the modern knowledge worker.
Shockley hired the best and brightest university graduates
to staff his Shockley Semiconductor Laboratory. Yet things were not quite
right. It wasn’t technology, intelligence, or money that his company was
lacking. Something else was missing.
Answering what that something was leads us to the question of why
information has so much value.
One of the beliefs of pragmatism is that knowledge has
meaning only in its consequences. This suggests that information has value only
if it is acted upon. Information that is stored in secret has no consequences.
By contrast, information that informs action - knowledge or belief, as a
pragmatist would see it - needs to be both known and acted upon. This is
information that needs to be distributed. Imagine a stock market with lots of
information about companies and the economy but only a handful of people who
buy and sell stocks. As long as only the elites act upon the information, that
information doesn’t have much value to anyone else. Put another way,
distributed information has value only when it drives distributed decision
making or action.
So, in order for information to become important, it has to
inform important actions. Furthermore, the more people who act on this
information, the more valuable it will be.
What was missing from Shockley’s approach to this brand-new
technology was a management style or culture that would make information
processing valuable. Shockley was, by most accounts, a fairly autocratic manager.
He was creating information technology that gained value as it drove more
decisions, but he was, in style, not the kind of person to give up much control.
Largely because of this, it was not Shockley who would become a billionaire
from computer chips but a few of his employees.
The year 1968 was the kind of year that would have made even
today’s 24/7 news coverage seem insufficient. Even at the time, it was obviously
a historic year. In January, the North Vietnamese launched the Tet
offensive, making it all the way to the U.S. Embassy in Saigon; this might have
been the first indication that those unbeatable Americans could be beaten. In
the United States, civil rights demonstrations that devolved into deadly riots
were the backdrop for Lyndon Johnson’s signing of the Civil Rights Act. Martin
Luther King, Jr. and Robert Kennedy - iconic figures even in life - were
assassinated within months of each other. The musical Hair opened on
Broadway, and Yale announced that it would begin to admit women. For the first
time in history, humans saw the whole earth from space: astronauts Frank Borman,
Jim Lovell, and William Anders became the first people to see the far side of the
moon and the earth as a whole, an image that dissolved differences of borders
and even continents. Any one of these stories could have been enough to change
modern society. Yet in the midst of all these incredible events, two
entrepreneurs quietly began a company that would transform technology and
business, a company that would do as much to define Silicon Valley as any
other.
Gordon Moore and Robert Noyce founded Intel in July 1968.
Moore gave his name to “Moore’s Law,” a prediction that the processing power of
computer chips would double roughly every eighteen months. Here was something
akin to the magic of compound interest applied to technology or, more
specifically, to information processing.
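To see why that feels like compound interest, here is a small sketch of the arithmetic in Python. It assumes the popular eighteen-month doubling period quoted above (Moore’s own estimates varied between one and two years) and is only an illustration, not a model of any particular chip.

```python
def moores_law_factor(years, doubling_period=1.5):
    """How many times more capable a chip becomes after `years`,
    assuming capability doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for years in (1.5, 5, 10, 20):
    print(f"after {years:4} years: about {moores_law_factor(years):,.0f}x")
# after  1.5 years: about 2x
# after    5 years: about 10x
# after   10 years: about 102x
# after   20 years: about 10,321x
```

Doubling every eighteen months works out to roughly a tenfold gain every five years and about a hundredfold gain every decade - which is why a curve that starts slowly ends up dwarfing everything that came before it.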
Moore and Noyce had originally worked for Shockley, but they
left his laboratory because they didn’t like his tyrannical management. They
then helped to found Fairchild Semiconductor, but left that company too because
“Fairchild was steeped in an East Coast, old-fashioned, hierarchical business
structure,” as Noyce said in a 1988 interview. “I never wanted to be a part of a
company like that.”[2]
It is worth noting that Moore and Noyce didn’t leave their
former employers because of technology or funding issues. They left because of
differences in management philosophy.
Once when I was at Intel, one of the employees asked if I
wanted to see the CEO’s cubicle. Note that this was an invitation to see his cubicle,
not his office. We walked over to a wall that was - like every other
wall on the floor - about five feet high, and I was able to look over the wall
into an office area complete with pictures of the CEO with important people
like President Clinton. In most companies, one can tell pretty quickly who is
higher up in the organization than others; the level of deference and the ease
of winning arguments are pretty clear indicators of who is where in the
organizational chart. By contrast, I’ve never been inside a company where it was
more difficult to discern rank than at Intel. Depending on the topic, completely
different people could be assertive, deferential, or argumentative. One of
Intel’s values is something like “constructive confrontation,” and this
certainly played out in more than one meeting I attended. When a company makes
investments in the billions, it can’t afford to make a mistake simply because
people have quaint notions about respect for authority. Intel’s culture seems
to do everything to drive facts and reasons ahead of position and formal
authority. This might have partly been a reaction of its founders to the
management style of their former employer, Shockley.
The first company they left, Shockley Semiconductor Laboratory, no longer
exists. The other, Fairchild Semiconductor, lost money last year and has a
market cap of less than $2 billion, just a fraction of Intel’s market cap of
more than $135 billion.[3] Intel’s net profit in 2010 was over $12
billion, and it employs more than 100,000 people worldwide. Moore and Noyce’s
open culture seems to have made a difference.
Information technology has little value in a culture that hoards
information. It makes sense as a means to store,
distribute, and give access to information, and it has value as a tool for
problem solving and decision making.
The pioneers of information technology, like Moore and Noyce,
understood this and realized - at some level - that it made little or no sense
to create hierarchies where information was held and decisions were made at one
level and people were merely instructed at another. The knowledge worker needed
information technology as a basis for decisions and action. Before 1830, up
until the time of the railroad, the information sector made up less than
1 percent of the American workforce.[4]
By the close of the 20th century, nearly everyone seemed to need
technology for storing and processing information.
By paying double the typical wage, Henry Ford created a new
generation of consumers for his cars. Moore and Noyce didn’t just create
information technology; they helped to popularize a management culture that
distributed information and decision making, creating even more demand
for their amazing new technology.