Only the Paranoid Survive


by Andrew S. Grove


  IBM considered these developments a very major danger and decided to invest in x-ray equipment in a big way. Our people took this news very seriously. IBM technologists were extremely competent and their perception of the threat was ominous. Nor were they alone in this view. Nevertheless, after studying the issue, the Intel technologists decided that x-ray techniques were fraught with problems and that they were not production-worthy. Most importantly, they felt that our current technology could evolve to achieve ever finer features well into the future.

  The way IBM and Intel responded to the x-ray technology threat showed that one company deemed it “signal,” while the other classified it “noise.” We decided not to pursue the x-ray approach. (Ten years later, it appears that we were right. As of this time of writing, to my knowledge, neither IBM nor the Japanese manufacturers are planning to use x-ray technology in manufacturing any time soon.)

  In this case, competent and serious-minded people came to a different set of conclusions about a given set of facts. This is not at all uncommon. There simply is no surefire formula by which you can decide if something is signal or noise. But because there is no surefire formula, every decision you make should be carefully scrutinized and reexamined as time passes. Ten years ago, we decided that x-ray technology was not a “10X” factor. However, we continued to watch it, looking to see if the threat grew, waned or stayed the same.

  Think of the change in your environment, technological or otherwise, as a blip on your radar screen. You can’t tell what that blip represents at first but you keep watching radar scan after radar scan, looking to see if the object is approaching, what its speed is and what shape it takes as it comes closer. Even if it lingers on your periphery, you still keep an eye on it because its course and speed may change.

  So it is with x-ray technology. It is on our radar screen and has been for years. Today, we still don’t think we need to invest in it. But a year from now, three years from now, five years from now, as we exhaust other means that are—for now—more cost-effective, the balance might shift and what we once correctly determined was noise might well emerge as a signal we had better pay heed to. These things are not cut and dried, and even if they were, things change. Therefore, you have to pay eternal attention to developments that could become a “10X” factor in your business.

  RISC versus CISC

  As potential “10X” factors go, the x-ray technology issue was relatively simple. Technologists at IBM had one opinion, their counterparts at Intel had another. We did what our collective judgment indicated we should do.

  Things get a lot more complicated when the differences of opinion are not just between ourselves and others but when we argue inside the company as well. The story of the ferocious “RISC” versus “CISC” debates (which continue to this day) provides a good example of such a situation. RISC and CISC are acronyms for arcane computer terms—Reduced Instruction Set Computing and Complex Instruction Set Computing. For our purposes, it’s enough to know that they describe two different ways of designing computers and, therefore, microprocessors.

  The debate over their merits divided the computing industry and almost tore it apart. CISC was the older approach; RISC was a newer technique. CISC designs require many more transistors to achieve the same result that RISC chips can accomplish with fewer.

  Intel’s chips are based on the older CISC scheme. By the time other companies started to pursue RISC techniques in the late eighties, the then current Intel microprocessor, the 386, was on the market, and the next generation Intel microprocessor, the 486, was in development. The 486 was a higher-performance, more advanced version of the same architecture that we used in the 386; it ran the same software but it ran it better. This was an extremely important consideration at Intel; we were (and are) determined that all our new microprocessors would be compatible with the software our customers bought for their earlier microprocessors.

  Some of our people took the position that the RISC approach represented a “10X” improvement, a level of improvement that in the hands of others could threaten our core business. So, to hedge our bets, we put a big effort into developing a high-performance microprocessor based on RISC technology.

  This project had a major drawback, however. Even though the new RISC chip was faster and cheaper, it would have been incompatible with most of the software that was available in the marketplace. Compatibility of a product was—and still is—a big factor in making it popular; therefore the idea that we would come up with an incompatible chip was not an appealing one. To get under the management radar screen that guarded our compatibility dogma, the engineers and technical managers who believed RISC would be a better approach camouflaged their efforts and advocated developing their chip as an auxiliary one that would work with the 486. All along, of course, they were hoping that the power of their technology would propel their chip into a far more central role. In any event, the project proceeded and eventually gave birth to a new and very powerful microprocessor, the i860.

  We now had two very powerful chips that we were introducing at just about the same time: the 486, largely based on CISC technology and compatible with all the PC software, and the i860, based on RISC technology, which was very fast but compatible with nothing. We didn’t know what to do. So we introduced both, figuring we’d let the marketplace decide.

  However, things were not that simple. Supporting a microprocessor architecture with all the necessary computer-related products—software, sales and technical support—takes enormous resources. Even a company like Intel had to strain to do an adequate job with just one architecture. And now we had two different and competing efforts, each demanding more and more internal resources. Development projects have a tendency to want to grow like the proverbial mustard seed. The fight for resources and for marketing attention (for example, when meeting with the customer, which processor should we highlight?) led to internal debates that were fierce enough to tear apart our microprocessor organization. Meanwhile, our equivocation caused our customers to wonder what Intel really stood for, the 486 or the i860?

  I was watching these developments with growing unease. The issue concerned the heart of our company, the microprocessor business that we had put our faith in and repositioned the company around when we abandoned the memory business just a few years earlier. It didn’t involve factors that might or might not arise a decade from now, like x-ray technology; it demanded a decision immediately, and the decision was crucial. On the one hand, if the RISC trend represented a strategic inflection point and we didn’t take appropriate action, our life as a microprocessor leader would be very short. On the other hand, the 386’s fantastic momentum seemed sure to extend into the 486 and perhaps even to future generations of microprocessors. Should we abandon a good thing, which for now at least was a sure thing, and lower ourselves back down into a competitive battle with the other RISC architectures, a battle in which we had no particular advantage?

  Although I have a technical background, it is not in computer science and I was not that comfortable with the architectural issues involved. To be sure, we had lots of people who had the right background but they had all split into warring camps, each camp 100 percent convinced of its own chip’s supremacy.

  Meanwhile, our customers and other industry partners were not of one mind either. On the one hand, the CEO of Compaq, a major and very technically savvy customer of ours, leaned on us—on me, in particular—and encouraged us to put all our efforts into improving the performance of our older CISC line of microprocessors. He was convinced that the architecture had enough power in it to last the rest of the decade, and he was unhappy seeing us split our resources and spend lots of time and money on something that was of no use to Compaq. On the other hand, the key technology manager at Microsoft, the company that provided most of the software that our customers used in conjunction with our microprocessors, was encouraging us to move toward an “i860 PC.” As the head of one of our European customers told me, “Andy, this is like the fashion business. We need something new.”

  When the 486 was formally introduced, the reaction of the customer community was extremely positive. I remember sitting at the product introduction in Chicago with a virtual Who’s Who of the computer manufacturing world, all of whom showed up to announce their readiness to build 486-based computers, and thinking, “RISC or no RISC, how could we possibly not put all our efforts into supporting this momentum?” After this event, the debates were over and we refocused our efforts on the 486 and its successors.

  Looking back at these debates six years after the fact, I shake my head about how I could have even considered walking away from our traditional technology that then had, and still has, phenomenal headroom and momentum. Today, in fact, the advantages of RISC technology over CISC technology are much smaller than they appeared then. Yet at that time we were seriously considering a major shift of resources.

  Is It or Isn’t It?

  Sometimes the event that signals a strategic inflection point is dramatically clear. I doubt that it required a lot of study to conclude that the Modified Final Judgment that led to the breakup of the old AT&T was a monumental event. It probably was also pretty clear that when the FDA was formed and the truth-in-labeling act was passed, the world of patent medicine changed once and for all. There was no question that these events represented key changes in the environment of the businesses that operated under their influence.

  Most of the time it’s not like that. Most strategic inflection points, instead of coming in with a bang, approach on little cat feet. They are often not clear until you can look at the events in retrospect. Later, when you ask yourself when you first had an inkling that you were facing a strategic inflection point, your recollections are about a trivial sign hinting that the competitive dynamics had changed. In the earlier story about memories, Intel visitors to Japan came back with the report that Japanese businessmen who had previously been very respectful of us now seemed to look at us with a newfound derision. “Something changed, it’s different now,” people said when they returned from Japan. And this comment and observations like it heightened our awareness that a real change was upon us.

  So how do you know whether a change signals a strategic inflection point?

  Ask these questions to attempt to distinguish signal from noise:

  Is your key competitor about to change? First, figure out who your key competitor is by asking a hypothetical question that I call the “silver bullet” test. It works like this: if you had just one bullet in a figurative pistol, whom among your many competitors would you save it for? Asked point-blank, this question usually provokes a visceral response and I find that people can normally give an answer without much hesitation. When the answer to this question stops being as crystal clear as it used to be and some of your people direct the silver bullet to competitors who didn’t merit this kind of attention previously, it’s time to sit up and pay special attention. When the importance of your competitors shifts, it is often a sign that something significant is going on.

  In an analogous fashion, you should ask, is your key complementer about to change? Does the company that in past years mattered the most to you and your business seem less important today? Does it look like another company is about to eclipse them? If so, it may be a sign of shifting industry dynamics.

  Do people seem to be “losing it” around you? Does it seem that people who for years had been very competent have suddenly gotten decoupled from what really matters? Think about it. You and your management have both been selected by the evolutionary forces of your business to be at the top of your organization. Your genes were right for the original business. But if key aspects of the business shift around you, the very process of genetic selection that got you and your associates where you are might retard your ability to recognize the new trends. A sign of this might be that all of a sudden some people “don’t seem to get it.” Conversely, it may be that you yourself are often inclined to shake your head in confusion. When they don’t get it or you don’t get it, it may not be because of encroaching age; it may be because the “it” has changed around you.

  Helpful Cassandras

  The Cassandras in your organization are a consistently helpful element in recognizing strategic inflection points. As you might remember, Cassandra was the priestess who foretold the fall of Troy. Likewise, there are people who are quick to recognize impending change and cry out an early warning.

  Although they can come from anywhere in the company, Cassandras are usually in middle management; often they work in the sales organization. They usually know more about upcoming change than the senior management because they spend so much time “outdoors” where the winds of the real world blow in their faces. In other words, their genes have not been selected to achieve perfection in the old way.

  Because they are on the front lines of the company, the Cassandras also feel more vulnerable to danger than do senior managers in their more or less bolstered corporate headquarters. Bad news has a much more immediate impact on them personally. Lost sales affect a salesperson’s commission, technology that never makes it to the marketplace disrupts an engineer’s career. Therefore, they take the warning signs more seriously.

  The other night, I checked my electronic mailbox and found a message from our sales manager in charge of the Asia-Pacific region. He passed on some breaking news from his area that had to do with a potential competitive element. His story was a familiar enough scenario—and yet as he began to talk about the new item, his tone was quite concerned, almost scared. “I don’t mean to be an alarmist and I know that situations like this come up all the time but this one really concerns me …,” he wrote. He was in no position to suggest a course of action; he was just asking me to pay attention to this development and urging me to take it seriously.

  My immediate reaction was to shrug off his news. I feel much safer back here in California than he does in “enemy territory.” But is my perspective the right one? Or is his? After all, being there doesn’t automatically make him right in his assessment. I could claim to have a better overall perspective on things. Yet I have learned to respect changes in the tone of messages from people in the field. I will watch further developments of this news item more carefully than I would have otherwise and, in fact, I have since decided to initiate a broader study of its potential implications.

  You don’t have to seek these Cassandras out; if you are in management, they will find you. Like somebody who sells a product that he is passionate about, they will “sell” their concern to you with a passion. Don’t argue with them; even though it’s time-consuming, do your best to hear them out, to learn what they know and to understand why it affects them the way it does.

  Classify the time you spend listening to them as an investment in learning what goes on at the distant periphery of your business, whether you think of distances in geographical or technological terms. Think of it this way: when spring comes, snow melts first at the periphery, because that’s where it’s most exposed. Factoring in the news from the periphery is an important contribution to the process of sorting out signal from noise.

  There is a fine distinction here. When I say, “Learn what goes on at the periphery of your business,” it means something different than if I had said, “Learn what goes on in your business.” In the ordinary course of business, I talk with the general manager, with the sales manager, with the manufacturing manager. I learn from them what goes on in the business. But they will give me a perspective from a position that is not terribly far from my own. When I absorb news or information coming from people who are geographically distant or who are several levels below me in the organization, I will triangulate on business issues with their view, which comes from a completely different perspective. This will bring insights that I would not likely get from my ordinary contacts.

  Of course, you can’t spend all of your time listening to random inputs. But you should be open to them. As you keep doing it, you will develop a feel for whose views are apt to contain gems of information and a sense of who will take advantage of your openness to clutter you with noise. Over time, then, you can adjust your receptivity accordingly.

  Sometimes a Cassandra brings not tidings of a disaster but a new way of looking at things. During the height of Intel’s RISC versus CISC debate, when I was most confused, our chief technologist asked to see me. He sat down and methodically took me through his point of view while at the same time representing the other side’s argument in the most objective way that I heard. His knowledge and insights made up for my own lack of self-confidence and expertise in this area and helped me listen to the ongoing debates with a better grasp of what I was hearing. Although this encounter did not lead me to a firm position, it helped me form a framework in which to better evaluate everyone else’s arguments.

  In the case of Intel’s exit from the memory business, how did Intel, the memory company, get to where only one factory out of eight was producing memory chips by the mid-1980s, making our exit from memories less cataclysmic? It got there by the autonomous actions of the finance and production planning people, who sat around painstakingly allocating wafer production capacity month by month, moving silicon wafers from products where they seemed wasteful—memories were the prime example of this—to other products which seemed to generate better margins, such as microprocessors. These people didn’t have the authority to get us out of memories but they had the authority to fine-tune the production allocation process by lots of little steps. Over the course of many months, their actions made it easier to eventually pull the plug on our memory participation.
