Only the Paranoid Survive
In another example of denial, IBM’s management steadfastly blamed weakness in the worldwide economy as the cause of trouble at IBM in the late 1980s and early 1990s, and continued to do that year after year as PCs progressively transformed the face of computing.
Why would computer executives who had proven themselves to be brilliant and entrepreneurial managers throughout their careers have such a hard time facing the reality of a technologically driven strategic inflection point? Was it because they were sheltered from news from the periphery? Or was it because they had such enormous confidence that the skills that had helped them succeed in the past would help them succeed in the face of whatever the new technology should bring? Or was it because the objectively calculable consequences of facing up to the new world of computing, like the monumental cutbacks in staff that would be necessary, were so painful as to be inconceivable? It’s hard to know but the reaction is all too common. I think all these factors played a part, but the last one—the resistance to facing a painful new world—was the most important.
Perhaps the best analogy to Charlie Chaplin’s late conversion to the new medium is recent reports that Steve Chen, the former key designer of the immensely successful Cray supercomputers, started a company of his own based on high-performance, industry-standard microprocessor chips. Chen’s previous company, which attempted to create the world’s fastest supercomputer, was one of the last holdouts of the old computing paradigm. But as Chen described his switch to a technological approach he once eschewed, he did so with a slight understatement: “I took a different approach this time.”
“10X” Change: Customers
Customers drifting away from their former buying habits may provide the most subtle and insidious cause of a strategic inflection point—subtle and insidious because it takes place slowly. In an analysis of the history of business failures, Harvard Business School Professor Richard Tedlow came to the conclusion that businesses fail either because they leave their customers, i.e., they arbitrarily change a strategy that worked for them in the past (the obvious change), or because their customers leave them (the subtle one).
Think about it: Right now, a whole generation of young people in the United States has been brought up to take computers for granted. Pointing with a mouse is no more mysterious to them than hitting the “on” button on the television is to their parents. They feel utterly comfortable with using computers and are no more affected by their computer crashing than their parents are when their car stalls on a cold morning: They just shrug, mumble something and start up again. When they go to college, these young people get their homework assignments on the college’s networked computers, do their research on the Internet and arrange their weekend activities by e-mail.
Consumer companies that are counting on these young people as future customers need to be concerned with the pervasive change in how they get and generate information, transact their business and live their lives, or else those companies may lose their customers’ attention. Doesn’t this represent a demographic time bomb that is ticking away?
Changing tastes in cars
None of this is new. During the 1920s the market for automobiles changed slowly and subtly. Henry Ford’s slogan for the Model T—”It takes you there and brings you back”—epitomized the original attraction of the car as a mode of basic transportation. In 1921, more than half of all cars sold in the United States were Fords. But in a post-World War I world in which style and leisure had become important considerations in people’s lives, Alfred Sloan at General Motors saw a market for “a car for every purse and purpose.” Thanks to GM’s introduction of a varied product line and annual model changes, by the end of the decade General Motors had taken the lead in both profits and market share, and would continue to outperform Ford in profit for more than sixty years. General Motors saw the market changing and went with the change.
Attitude shifts
Sometimes a change in the customer base represents a subtle change of attitude, yet one so inexorable that it can have a “10X” force. In hindsight, the consumer reaction to the Pentium processor’s floating point flaw in 1994 represented such a change. The center of gravity of Intel’s customer base shifted over time from the computer manufacturers to the computer users. The “Intel Inside” program begun in 1991 established a mindset in computer users that they were, in fact, Intel’s customers, even though they didn’t actually buy anything from us. It was an attitude change, a change we actually stimulated, but one whose impact we at Intel did not fully comprehend.
Is the Pentium processor floating point incident a stand-alone incident, a bump in the road or, to use electronic parlance, “noise”? Or is it a “signal,” a fundamental change in whom we sell to and whom we service? I think it is the latter. The computer industry has largely become one that services consumers who use their own discretionary spending to purchase a product, and who apply the same expectations to those products that they have for other household goods. Intel has had to start adjusting to this new reality, and so have other players in this industry. The environment has changed for all of us. The good news is, we all have a much larger market. The bad news is, it is a much tougher market than we were accustomed to servicing.
The point is, what is a demographic time bomb for consumer companies represents good news for us in the computer business. Millions of young people grow up computer-savvy, taking our products for granted as a part of their lives. But (and there is always a but!) they’re going to be a lot more demanding of a product, a lot more discerning of weaknesses in it. Are all of us in this industry getting ready for this subtle shift? I’m not so sure.
The double whammy in supercomputers
Sometimes more than one of the six competitive forces changes in a big way. The combination of factors results in a strategic inflection point that can be even more dramatic than a strategic inflection point caused by just one force. The supercomputer industry, the part of the computer industry that supplies the most powerful of all computers, provides a good case in point. Supercomputers are used to study everything from nuclear energy to weather patterns. The industry’s approach was similar to the old vertical computer industry. Its customer base was heavily dependent on government spending, defense projects and other types of “Big Research.”
Both changed in approximately the same time frame. Technology moved to a microprocessor base, and government spending dried up when the end of the Cold War increased the pressure for defense-spending reductions. The result is that a $1 billion industry that had been the pride and joy of U.S. technology and a mainstay of the defense posture of this country is suddenly in trouble. Nothing signifies this more than the fact that Cray Computer Corporation, a company founded by the icon of the supercomputer age, Seymour Cray, was unable to maintain operations due to lack of funds. It’s yet another example illustrating that the person who is the star of a previous era is often the last one to adapt to change, the last one to yield to the logic of a strategic inflection point, and tends to fall harder than most.
“10X” Change: Suppliers
Businesses often take their suppliers for granted. They are there to serve us and, if we don’t like what they do, we tend to think we can always replace them with someone who better fills our needs. But sometimes, whether because of a change in technology or a change in industry structure, suppliers can become very powerful—so powerful, in fact, that they can affect the way the rest of the industry does business.
Airlines flex their muscles
Recently, the supplier base in the travel industry has attempted to flex its muscles. Here, the principal supplier is the airlines, which used to grant travel agents a 10 percent commission on every ticket sold. Even though travel-agent commissions were the airline industry’s third largest cost (after labor and fuel), airlines had avoided changing the commission rates because travel agents sell about 85 percent of all tickets and they did not want to antagonize them. However, rising prices and industry cutbacks finally forced the airlines to place a cap on commissions.
Can travel agencies continue as before in the face of a significant loss of income? Within days of the airlines’ decision, two of the country’s largest agencies instituted a policy of charging customers for low-cost purchases. Will such a charge stick? What should the travel agencies do if the caps on commissions remain a fact of life and if their customers won’t absorb any of their charges? One industry association predicted that 40 percent of all agencies might go out of business. It is possible that this single act by the suppliers can precipitate a strategic inflection point that might in time alter the entire travel industry.
The end of second sourcing
Intel, in its capacity as a supplier of microprocessors, accelerated the morphing of the computer industry when we changed our practice of second sourcing.
Second sourcing, once common in our industry, refers to a practice in which a supplier, in order to make sure that his product is widely accepted, turns to his competitors and offers them technical know-how, so that they, too, can supply this product.
In theory, this unnatural competitive act works out as a win for all parties: the developer of the product benefits by a wider customer acceptance of the product as a result of a broader supplier base; the second-source supplier, who is a recipient of the technology, clearly benefits by getting valuable technology while giving little in return. And the customer for the product in question benefits by having a larger number of suppliers who will compete for his business.
In practice, however, things don’t often work out that well. When the product needs help in the marketplace, the second source usually is not yet producing, so the primary source and the customers don’t have the benefit of the extended supply. Once the product is fully in production and supply catches up with demand, the second source is in production too, so multiple companies now compete for the same business. This may please the customer but certainly hurts the wallet of the prime source. And so it goes.
By the mid-eighties, we found that the disadvantages of this practice outweighed its advantages for us. So we changed. Our resolve hardened by tough business conditions (more about this in the next chapter), we decided to demand tangible compensation for our technology.
Our competitors were reluctant to pay for technology that we used to give away practically for free. Consequently, in the transition to the next microprocessor generation, we ended up with no second source and became the only source of microprocessors to our customers. Eventually our competition stopped waiting for our largesse and developed similar products on their own, but this took a number of years.
The impact this relatively minor change had on the entire PC industry was enormous. A key commodity, the standard microprocessor on which most personal computers were built, became available only from its developer—us. This, in turn, had two consequences. First, our influence on our customers increased. From their standpoint, this might have appeared as a “10X” force. Second, since most PCs increasingly were built on microprocessors from one supplier, they became more alike. This, in turn, had an impact on software developers, who could now concentrate their efforts on developing software for fundamentally similar computers built by many manufacturers. The morphing of the computer industry, i.e., the emergence of the computer as a practically interchangeable commodity, was greatly aided by the common microprocessor on which those computers were built.
“10X” Change: Complementors
Changes in technology affecting the business of your complementors, companies whose products you depend on, can also have a profound effect on your business. The personal computer industry and Intel have had a mutual dependence on personal computing software companies. Should major technological changes affect the software business, through the complementary relationship these changes might affect our business as well.
For example, there is a school of thought that suggests that software generated for the Internet will grow in importance and eventually prevail in personal computing. If this were to happen, it would indirectly affect our business too. I’ll examine this in more depth in Chapter 9.
“10X” Change: Regulation
Until now, we have followed the possible changes that can take place when one of the six forces affecting the competitive well-being of a business changes by a “10X” factor. This framework of forces describes the workings of a free market—unregulated by any external agency or government. But in real business life, such regulations—their appearance or disappearance—can bring about changes just as profound as any that we have discussed.
The demise of patent medicines
The history of the American drug industry provides a dramatic example of how the environment can change with the onset of regulation. At the start of the twentieth century, patent medicines made up of alcohol and narcotics were peddled freely without any labels to warn consumers of the dangerous and addictive nature of their contents. The uncontrolled proliferation of patent medicines finally triggered the government into the business of regulating what was put into the bottles, and led to the passage of a law requiring manufacturers of all medicines to label the ingredients of their elixirs. In 1906, the Food and Drugs Act was passed by Congress.
The drug industry changed overnight. The introduction of the labeling requirement exposed the fact that patent medicines were spiked with everything from alcohol to morphine to cannabis to cocaine, and forced their manufacturers to reformulate their products or take them off the shelves. The competitive landscape changed in the wake of the passage of the Food and Drugs Act. Now a company that wanted to be in the drug business needed to develop knowledge and skills that were substantially different from before. Some companies navigated through this strategic inflection point; many others disappeared.
The reordering of telecommunications
Regulatory changes have been instrumental in changing the nature of other very large industries. Consider the American telecommunications industry.
Prior to 1968, the U.S. telecommunications industry was practically a nationwide monopoly. AT&T—”the telephone company”—designed and manufactured its own equipment, ranging from telephone handsets to switching systems, and provided all connections between phone calls, both local and long distance. Then, in 1968, the Federal Communications Commission ruled that the phone company could not require the use of its own equipment at the customer’s location.
This decision changed the landscape for telephone handsets and switching systems. It opened the business up to foreign equipment manufacturers, including the major Japanese telecommunications companies. The business that had been the domain of the slow-moving, well-oiled monopoly of the benign “Ma Bell” became rife with competition from the likes of Northern Telecom of Canada, NEC and Fujitsu from Japan, and Silicon Valley startups like ROLM. Telephone handsets, which the customer used to receive as part of the service from the old AT&T, now became commodities to be purchased at the corner electronics store. They were largely made at low labor cost in countries in Asia and they came in all sorts of shapes, sizes and functions, competing aggressively in price. And the familiar ringing of the telephone was supplanted by a cacophony of buzzes.
But all this was only a prelude to even bigger events.
In the early 1970s the U.S. Government, following a private antitrust suit by AT&T competitor MCI, brought suit, demanding the breakup of the Bell system and asking for the separation of long-distance services from local-access services. The story goes that after years of wrangling in the federal courts, a struggle which promised to go on for many more years, Charles Brown, then chairman of AT&T, one morning called his staff one at a time and told them that instead of putting the company through many years of litigation with an uncertain outcome, he would voluntarily go along with the breakup of the company. By 1984 this decision became the basis for what is known as the Modified Final Judgment, supervised by Federal Judge Harold Greene, that prescribed the way long-distance companies were to conduct business with the seven regional telephone companies. The telephone service monopoly crumbled, practically overnight.
I called on AT&T locations in those days to sell Intel microprocessors to their switching systems divisions. I still remember the profound state of bewilderment of AT&T managers. They had been in the same business for most of their professional lives and simply had no clue as to how things would go now that the customary financial, personal and social rules by which they had conducted themselves, division to division, manager to manager, were broken.
The impact of those events on the entire communications industry was equally dramatic. A competitive long-distance industry was created. Over the subsequent decade, AT&T lost 40 percent of the long-distance market to a number of competitors, some of whom, like MCI and Sprint, became multibillion-dollar companies themselves. A new set of independent companies operating regional telephone systems, often called the Baby Bells, were created. Each of these, with revenues in the $10 billion range, was left the task of connecting individuals and companies in their area to each other and to a competitive long-distance network. The Modified Final Judgment let them operate as monopolies in their own areas, subject to a variety of restrictions in terms of the businesses they might or might not participate in.