The Commoditization of Disruptive Information Technologies
It is important to understand the diffusion of disruptive innovations. This diffusion occurs when “innovators” buy and then, through an influence process, encourage others to buy. It always “starts with the first sale,” and the innovation can “diffuse” through society very quickly.
As we take our first steps into the twenty-first century we can see that technology is central to the economic growth of nations, large companies, and individuals. Technological change plays a major role in industry structural change, as well as in creating new industries. Virtually all contemporary accounts of how technological change proceeds in capitalist economies are based on Schumpeter’s work.
Schumpeter relied on the works of Karl Marx, a German economist, to develop the concept of “creative destruction,” and on the work of Nikolai D. Kondratieff, a Russian economist. In 1925 Kondratieff published The Long Waves in Economic Life, identifying “long waves” of economic growth, some lasting more than fifty years, from the end of the 1780s to the 1920s. These “Kondratieff Waves” were based on what Schumpeter called “clusters of technical innovations.” There is much to be learned from understanding the dynamics of technological change. As the saying often attributed to Mark Twain goes, “History doesn’t repeat itself, but it does rhyme.”
Consider how many years it took each innovation to reach 10 million users:
– Radio: twenty years
– Television: ten years
– The Internet: three years
– Netscape: twenty-eight months
– Hotmail: seven months
By 2002 Hotmail had some 150 million users receiving 2 billion messages each day.
Some influential business thinkers have written about this “rate of adoption.” Everett Rogers, best known for his classic Diffusion of Innovations, found that customers adopt an innovation at different times after it becomes available. The resulting adoption curve stretches out because important innovations and new technologies typically diffuse through society over many years rather than reaching everybody at once.
Rogers found that there are five sets of customers:
1. Innovators. The first 2.5 to 5 percent of those who adopt the innovative product. They are willing to try new ideas and to accept the risk that comes with them.
2. Early Adopters. The next 13 to 15 percent of adopters. They are considered opinion leaders in their community and domain space. They adopt innovations early, but carefully.
3. Early Majority. Perhaps the next 30 to 34 percent. They are deliberate in their decision making. They adopt innovations before the average person, but rarely are they known as leaders.
4. Late Majority. Perhaps 34 percent or less. They are skeptical and adopt an innovation only after the early majority has proved it.
5. Laggards. The remaining 16 to 20 percent. They are bound by tradition, suspicious of changes, and adopt the innovation only after it becomes a commodity.
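To make the five segments concrete, here is a minimal sketch (not from the text) that maps a buyer’s position in the cumulative adoption order to a Rogers category, using Rogers’ standard 2.5 / 13.5 / 34 / 34 / 16 percent breakdown; the function name and boundaries are illustrative assumptions.

```python
# Illustrative sketch: Rogers' adopter categories by cumulative adoption share.
# Boundaries use his standard breakdown: 2.5% / 13.5% / 34% / 34% / 16%.

ROGERS_SEGMENTS = [
    (2.5, "Innovators"),       # first 2.5%
    (16.0, "Early Adopters"),  # 2.5 + 13.5
    (50.0, "Early Majority"),  # 16 + 34
    (84.0, "Late Majority"),   # 50 + 34
    (100.0, "Laggards"),       # 84 + 16
]

def adopter_category(cumulative_percent: float) -> str:
    """Return the Rogers category for a buyer at the given cumulative
    percentage (0-100) of all eventual adopters."""
    for upper_bound, name in ROGERS_SEGMENTS:
        if cumulative_percent <= upper_bound:
            return name
    raise ValueError("cumulative_percent must be between 0 and 100")

print(adopter_category(1))    # Innovators
print(adopter_category(40))   # Early Majority
print(adopter_category(95))   # Laggards
```

The fifth buyer in a hundred is still an innovator; the fortieth is already in the early majority, which is why mainstream demand, not the first sale, determines whether an innovation becomes a commodity.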
This diffusion process can be characterized as responding to both “technology-push” and “demand-pull” influences over time. Technology-push occurs when changes in scientific and engineering knowledge make new products or processes feasible or reduce their costs. Demand-pull occurs when the market for an innovation expands, causing the benefits realizable through innovation to exceed costs. Another great business thinker, Harvard Business School professor Clayton Christensen, released The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail in 1997.
Now a classic, the book explains how disruptive products or systems create entirely new markets or even industries.
Here are some famous examples of disruptive inventions:
– Johann Gutenberg’s printing press
– the Watt-Boulton steam engine
– Gottlieb Daimler’s internal combustion engine
– Edison’s electric light bulb
– Marconi’s “ship-to-shore” wireless radio
– the Wright brothers’ first flight and their studies on aerodynamics
– IBM’s disk drive
– Boeing’s first trans-Atlantic jet, the 707
– Sony’s Betamax
– even the shipping containers found at thousands of ports today
In Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers, Geoffrey A. Moore describes how some innovations make the “leap” from early adopters and innovators into the mainstream consumer marketplace through incredible demand-pull. The pull is so strong that the innovation quickly becomes a “commodity.” Commodity products are based on widely available standard components and typically have little differentiation.
As Clayton Christensen explains, “A product becomes a commodity when product differentiation loses its meaning, when the features and functionality have exceeded what the market demands.” The introduction of HP’s DeskJet in 1988 is a great example. It was “the death knell of the dot-matrix printer” industry. The DeskJet was “commoditized” as its price fell from just under $1,000 to $365 by 1994, and by 2003 we found one at Staples for $49.
We now circle back to Schumpeter and Kondratieff. Recall how clusters of innovations build up into a huge economic wave, or a revolution. The high technology revolution—or more accurately defined as the convergence of computer and telecommunication technology—began when Marconi formed his Wireless Telegraph Company in London in 1897. In 1906 Lee de Forest, an American electrical engineer, invented the audion (vacuum) tube and created the world of “electronics,” leading up to television in the 1930s and the military applications of electronics for detection and navigation.
Between Marconi and Google, there were thousands of innovations grouped into hundreds of clusters. We condense the discussion of the high-tech wave into four layers. From the bottom up they are the computer, the Internet, the microprocessor, and the Web browser.
1. Commoditization of the Computer (1940s–1985)
Computers did not originate in Silicon Valley. Work on the first general-purpose electronic computer, the ENIAC (Electronic Numerical Integrator and Computer), began under the leadership of Drs. J. Presper Eckert and John Mauchly at the University of Pennsylvania in 1943. Designed to produce firing and bombing tables for the Army, it was 100 feet long, weighed 30 tons, and “contained no less than 18,000 vacuum tubes.” For years after the ENIAC, a computer was a sort of “shrine to which scientists and engineers made pilgrimage.” Its use was strictly rationed, and only those in white lab coats got anywhere near it.
In 1957 Kenneth Olsen and Harlan Anderson exploited their experiences at MIT’s Lincoln Labs to start the Digital Equipment Corporation (DEC). They put the computer into the hands of users, who could interact with it directly via a keyboard and see what was entered on a monitor. In 1965 their PDP-8 “minicomputer” cost around $18,000. Ten years later, Dr. H. Edward Roberts introduced the Altair 8800 computer kit for about $400. In 1981 IBM introduced the “personal computer,” which sold for under $1,600, and Time magazine’s 1982 “Machine of the Year” award went to the PC. The one-billionth PC was shipped in April 2002, and PC number two billion will ship sometime in 2007.
2. Commoditization of the Internet (1965–1999)
The Internet is a global network of computer networks that creates value by reducing the costs of transmitting information. The seeds of the Internet were planted by the U.S. government after the Soviet Union launched Sputnik, the world’s first artificial satellite, on October 4, 1957. Its simple beeping back to Earth brought science to everyone’s attention around the globe. In direct response, on February 7, 1958, President Eisenhower formed the Advanced Research Projects Agency (ARPA) to coordinate U.S. defense-related technological research. One day in 1966, Robert Taylor at the Pentagon wondered why the three computer terminals in his office, each wired to a different ARPA-funded computer, could not be connected. He proposed the “ARPAnet” to provide interactive access between ARPA-funded computer resources around the country, and to save money that ARPA would otherwise have to spend on buying more computers.
The first successful ARPAnet experiment took place on October 29, 1969, when Leonard Kleinrock’s team at UCLA in Los Angeles transmitted data across a 400-mile line to the Stanford Research Institute (SRI) in Menlo Park, California. In the years that followed, the ARPAnet grew into a network for the research community, and an Internet “backbone network” for academic institutions was funded by the National Science Foundation (NSF) in 1986. In 1990 the NSF established a “commercial use policy” that cleared the way for applications beyond academic research and development. The networking of the research community was no longer confined to computer scientists.
By 1992 there were about 1 million homes in the United States connected to the Internet. In the fall of 1993 America Online (AOL) started providing “easy-to-use” access to home subscribers. Earthlink offered “all-you-can-eat” access in 1995, and NetZero was offering free access in 1999. Researchers at International Data Corporation (IDC) reported that there were about 550 million Internet users around the world in 2001 and predicted that more than 100 million new users would be added each year through 2006, reaching just over 1 billion.
3. Commoditization of the Microprocessor (1971–1999)
The invention of the transistor is often called one of the greatest inventions of the twentieth century. John Bardeen, Walter Brattain, and William Shockley demonstrated their first transistor at AT&T’s Bell Labs in New Jersey on December 23, 1947, and would later win the Nobel Prize for it in 1956. Their transistor was desperately needed: the nation’s growing hunger for telephone services was eating AT&T alive. The phone system relied on vacuum tubes and electromechanical switches, and AT&T realized that if demand continued to grow, it would have to hire half the nation as operators. The invention of the transistor reduced the size of computers considerably. But it was the invention of the microprocessor, a single piece of silicon etched with thousands of transistors, that launched the introduction of reliable, low-cost electronic computers into the economy.
In 1965, Gordon Moore, who would co-found Intel three years later, observed in a paper that the number of transistors on a chip had been doubling about every year; the popularized version of “Moore’s Law” puts the doubling of computing power at roughly every eighteen months to two years. At a doubling per year, a computer twenty years from now will be able to calculate in thirty seconds what the average computer today would take an entire year to complete. In forty years, a computer would be able to calculate in thirty seconds what today’s computer would take a million years to complete.
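The twenty- and forty-year illustrations check out arithmetically if we assume one doubling per year, the pace Moore originally reported in 1965. A quick calculation, offered only as a sanity check on the numbers above:

```python
# Sanity check on the Moore's Law illustrations, assuming one doubling per year.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000 seconds

# After 20 annual doublings a machine is 2**20 (about 1.05 million) times
# faster, so a job that takes a year today shrinks to about 30 seconds.
print(SECONDS_PER_YEAR / 2**20)  # ≈ 30 seconds

# After 40 doublings, a million-year job also shrinks to about 30 seconds.
print(1_000_000 * SECONDS_PER_YEAR / 2**40)  # ≈ 29 seconds
```

The coincidence that drives both claims is simply that 2^20 ≈ 1.05 million, almost exactly the number of 30-second intervals in a year.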
Employee number 12 arrived at Intel in 1968. His name was Marcian Edward “Ted” Hoff. One of his first jobs was to design a set of chips that would go into a calculator for Busicom, a Japanese company. But he wanted to put all the circuitry on a single integrated circuit and then program it. Hoff’s device was launched as the Intel 4004, the first commercial microprocessor chip, in 1971. It proved to be “a Magna Carta moment in the world of technology.” Intel’s Pentium, launched in 1993, had 3.1 million transistors, and designers can now routinely place 300 million on a chip. Intel’s first chip with a billion transistors will arrive in 2007.
Likewise, the price of computing power has fallen by 99.9 percent in a single generation, and computers are delivering 66,000 times more power per dollar spent than the computers of 1975 did. Moore put it another way: “If the automotive industry paralleled those same advances in value and efficiency, the cars we drive today would cruise at a million miles per hour, cost about five dollars, and we’d be getting close to 250,000 miles to the gallon.” As a historical note, the public announcement of the transistor in 1948 was buried deep in The New York Times in a weekly column called “The News of Radio,” which suggested that the device “might be used to develop better hearing aids for the deaf, but nothing more.” In 1997 Time magazine named Intel’s Andy Grove its “Man of the Year.”
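The 66,000-fold figure is itself a Moore’s Law data point. A short calculation shows the implied doubling period, assuming (as the book’s other figures suggest) that the comparison runs from 1975 to about 2003; both dates are assumptions for illustration.

```python
import math

# 66,000x more computing power per dollar is about 16 doublings,
# since 2**16 = 65,536.
doublings = math.log2(66_000)
print(round(doublings, 1))  # ≈ 16.0

# Spread over the ~28 years from 1975 to 2003 (an assumed endpoint), that is
# one doubling roughly every 21 months, in line with the popular
# eighteen-months-to-two-years formulation of Moore's Law.
months_per_doubling = (2003 - 1975) / doublings * 12
print(round(months_per_doubling))  # ≈ 21 months
```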
4. Commoditization of the Web Browser (1993–early 2000s)
The Web is not a “physical” thing that exists in a certain “place.” It is a “space” in which information exists. J. Neil Weintraut called it “the Rosetta stone of the Internet for the masses.” In Robert Reid’s Architects of the Web, Weintraut says, “The Internet was a massive library of some of the most advanced information and discussion forums in the world from leading research institutions, but locating and getting the information was obtrusively difficult. It was akin to walking down each aisle of a library, scanning each book just to figure out what is there, but doing all this in the dark!”
In 1989 Tim Berners-Lee sat down in the European Particle Physics Laboratory (CERN) in Geneva to invent the World Wide Web. When Berners-Lee started working on his Web project, there were about 800 different computer networks plugged into the Internet and about 160,000 computers filled with information. He invented a “Web client that allows a human to read information on the Web.” It solved incompatibility among all the different servers, computing systems, and infrastructures.
In the winter of 1993, Marc Andreessen and Eric Bina posted the first version of Mosaic, a Web browser they developed for the National Center for Supercomputing Applications (NCSA) at the University of Illinois. They eventually made two key changes. They added graphics to what was otherwise boring text-based software, and more importantly, they translated the software from so-called UNIX computers to the Microsoft Windows operating system. Within a few weeks, tens of thousands of people had downloaded copies of it. By spring 1995 more than 6 million copies were in use on 85 percent of the computers surfing the net around the world.
Before Mosaic “you had to be a UNIX hacker” or a “computer nerd” to access the Internet. Andreessen’s intent was “to make all the resources on the Internet available with one click.” In December 1993 The New York Times described Mosaic as “an application program so different and so obviously useful that it can create a new industry from scratch.”
But it took an Act of Congress to open “the floodgates to digital commerce.” Up until the early 1990s, free enterprise over the Internet was effectively forbidden because the Internet was created for the military, research institutions, and academia. The “Boucher Bill” amendment to the NSF Act of 1950, introduced by Representative Rick Boucher of Virginia’s Ninth District, was signed into law by President George H. W. Bush on November 23, 1992. In 1994, an investment banker quit his job, headed west to Seattle, and launched a new venture, whose website opened for business in July 1995. At the end of 2002, sales for his venture, Amazon.com, reached $4 billion, and he was leading a company worth nearly $9 billion. In all, “e-commerce exploded from $12.4 billion in 1997 to $425.7 billion by 2002, a 3,332 percent increase.”
Tim Berners-Lee’s invention of the World Wide Web has forever changed the shape of modern life, altering the way people do business, entertain, and inform themselves. It is often compared to Gutenberg’s printing press, Bell’s telephone, and Marconi’s radio. Time magazine hailed him as one of the 100 greatest minds of the twentieth century, saying, “He took a powerful communications system that only the elite could use and turned it into a mass medium.” More than 375 million queries are made each day on the Internet; Google alone responds to more than 200 million search queries per day in 74 languages from 100 different countries. Berners-Lee chose not to profit from his invention: in April 1993 CERN declared that it would not charge a royalty on the Web protocol and code he created. It was his gift to the world.