The “technostructure” was highly populated by Harvard MBAs. In 1976, the headhunting firm Heidrick & Struggles analyzed the profiles of the CEOs of the 500 largest industrial corporations and 300 largest financial corporations in the country. Twenty percent of them had MBAs; half of those were Harvard MBAs. In other words, 10 percent of the country’s top CEOs came from HBS.2 The School’s own surveys showed that as of 1977, more than 10,000 graduates held the title of chairman, president, owner, partner, or general manager of the firm they worked for. In 1979, the School dug even deeper, and found that 20 percent of the top three officers of the Fortune 500 companies were HBS graduates. In some companies, the density was off the charts: At Ford Motor Company, 14 percent of vice presidents and general managers had degrees from HBS.3
American executives, particularly the MBAs among them, had come to think of themselves as exemplars of modern management as the 1970s got started. And why wouldn’t they? They had three decades of trade surpluses to point to, and strong profit growth interrupted only by periodic but mercifully brief recessions; the country’s low-cost, standardized-product system of mass production had propelled its citizenry to a standard of living unprecedented in history. Thanks to HBS, Wharton, Stanford, GSIA, and others, the codification of managerial precepts of strategic planning, marketing, operations, organizational behavior, and finance was so far ahead of any other country’s that the question of whether adopting American methods was the right choice was answered by, “Well, what else is there?” Half of the fifty largest manufacturing firms in the country in 1972 had already been in the top 50 in 1947, and only five hadn’t been in the top 200.4
And then everything went wrong at once, and America’s belief in itself—seemingly impregnable just a few years before—was badly shaken. The Vietnam War was still dragging on interminably when the United States abandoned the gold standard and devalued the dollar; then came the oil embargo, runaway inflation, high interest rates, and a brutal recession. The country had failed to keep up in terms of capital investment: the average age of the nation’s plant and equipment had been allowed to reach twenty years, twice that of Japan. Investment in research and development was in decline, too. What was up? Dividends. The ratio of dividends to operating cash flow was 11 percent higher in the late 1970s than in the late 1960s, and 30 percent higher in 1980.5 You’re supposed to increase dividends when the going is good, not when your factories are creaking with age. Throw in the revival of both European and Asian industry, and suddenly American executives were back on their heels.
In 1969, the United States was home to 40 of the world’s top 50 industrial firms. By 1974, that number had dwindled to fewer than 30. America’s management was indicted—justifiably so—for its failure to prepare for such seismic shifts in the global economy. The auto industry was hit hardest. In 1950, 85 percent of all cars were made in the United States. By 1980, Japan had overtaken the United States as the world’s largest producer of cars.6 In 1965, just 6 percent of the cars sold in the United States were imported; in 1980, 28 percent were.7 In 1960, imported goods accounted for 1.8 percent of U.S. sales; in 1985, they accounted for 15 percent. America was in the throes of a corporate Pearl Harbor.
The managerial corps, happy to take all the credit for the good times, suddenly lacked enough fingers to point at everyone they were trying to blame for the loss of preeminence. It was the unions, addicted to their cost-of-living increases, that were sapping American industry of its competitive strength. It was Congress, which was trying to regulate the free market out of existence. It was an administration that didn’t understand it was giving away the farm with free trade agreements.
But as to the question of quality, there was nowhere to point but in the mirror. Somehow, during America’s long orgy of self-congratulation, product quality had slipped so far down the list of priorities that the country pretty much forgot about it. Or at least that’s how the charitable explanation goes. Len Caust, one of the ’49ers, sounded confessional in 1986 when he admitted that quality hadn’t been forgotten, but rather ignored: “Your average MBA sees the consumer as a mark, a factor to be got around. He pays lip service to the notion of providing what the customer wants, because that’s what he’s supposed to talk about—publicly.”
One man had seen the light, and his name was W. Edwards Deming. An engineer, statistician, and later management consultant, Deming had helped develop the sampling techniques used by the U.S. Census Bureau and the Bureau of Labor Statistics. When he later developed a system for the statistical control of quality, one that promised to reduce expenses while increasing productivity and market share, he found no takers in the United States. He did, however, find eager converts in Japan, and the spread of his thinking is widely credited with laying the foundation for Japan’s postwar economic resurgence. American managers? According to Fortune, their reaction was, “Go away, Deming, we’re making money.”8
H. Edward Wrapp, a professor of business policy at the University of Chicago, told Dun’s Review in 1980 that “we have created a monster . . . the business schools have done more to insure the success of the Japanese and West German invasion of America than any one thing I can think of [by] producing a horde of managers with . . . talents that are not in the mainstream of enterprise . . . [and] the tragedy is that these talents mask real deficiencies in overall management capabilities. These talented performers run for cover when grubby operating decisions must be made and often fail miserably.”9
The New York Times Magazine ran a cover story in January 1981, “Overhauling America’s Business Management,” which documented the sudden reversal in the reputation of American managers. The list of contributing factors wasn’t short. There was the adversarial relationship between management and labor. There was an excessive focus on short-term profit. There was a lack of entrepreneurial verve, evidenced by the mad rush to find safety in the bosom of a large-company job: Whereas just 44.5 percent of HBS graduates had taken jobs at large companies in 1969, by 1975 that percentage had risen by more than half, to 68 percent. There was the takeover of top corporate positions by the financial types, who knew little about the fundamentals of the businesses they ran.
And then there was this: “[Some] financial yardsticks that managers rely upon so much in deducing whether to make investments may yield results that are badly distorted in the current period of high inflation. The validity of some of these yardsticks, like ‘discounted cash flow’ or virtually indecipherable formulas for figuring ‘return on investment,’ is being called into question to some extent.” Meaning: Not only were business schools churning out too many numbers people; they were telling them to look at the wrong numbers to boot. “It may be that some of the basic tools we’ve been teaching in business schools for 20 years are inordinately biased toward the short term, the sure payoff,” Lee J. Seidler, a Wall Street analyst and professor at New York University’s business school, told the magazine.10
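The bias Seidler described is visible in the arithmetic of discounted cash flow itself. As a minimal worked illustration (the 15 percent discount rate is an assumption, roughly in line with late-1970s interest and inflation rates, not a figure from the article), the present value of a cash flow $CF_t$ arriving $t$ years out at discount rate $r$ is

\[
\mathrm{PV} = \frac{CF_t}{(1+r)^t}.
\]

At $r = 15\%$, a \$1 million payoff one year away is worth about \$870,000 today, but the same \$1 million ten years away is worth only about \$247,000, since $1.15^{10} \approx 4.05$. A long-horizon investment therefore had to promise roughly four times the nominal payoff of a short-term one merely to look equivalent on paper, which is exactly the tilt toward “the sure payoff” that the critics had in mind.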
“[It] is a widely held view that the current M.B.A. might be part of the current problem,” the story continued. “The charge is that even the leading business schools, like at Harvard or Stanford, have been teaching how not to manage a modern American company; that they have simply taught business as business has been practiced, and not helped lead the way to necessary change.”11
Or worse: In 1974, ’49er Lester Crown agreed to turn state’s evidence in exchange for immunity from prosecution when it emerged that executives at Material Service, the company his father had founded in 1919 and merged into General Dynamics in 1959 (which would go on to become the largest defense contractor in the country), had bribed Illinois state legislators in exchange for supporting a law that would allow heavier cement trucks on the state’s highways. Along with a group of other Chicago-area construction firms, the company had established a $50,000 slush fund to help move the legislation along; Crown had given $8,000 of his own money while also asking seven company employees to pad their expense reports in order to reimburse him.
Crown, who admitted he knew about the money flows, insisted that he had thought the money was destined for legitimate campaign contributions. And he was unrepentant. “There isn’t a cultural, ethical change required in this company,” he told the New York Times. Others weren’t so sure. “Bribery is a major felony involving serious moral turpitude,” Representative John D. Dingell wrote in a letter to Secretary of Defense Caspar W. Weinberger regarding the episode. “The election to, and the retention on, the Board of Directors of an individual who admittedly was involved in the commission of a major crime is a statement of the integrity of the management of our nation’s largest defense contractor.”12
John McArthur, then dean of HBS, preferred to point the finger at change. “[Bear] in mind that the economic world changed drastically and irrevocably in the 1970’s,” he told the reporter. “American management and business schools are now in transition, struggling to respond to the changes. We do have a serious problem in this period of transition, but this nation also has enormous resources. I think the shift is under way, both out there in the corporate world and at business schools like this one.”13
One of the most scathing critiques of American managers’ focus on short-term gain at the expense of long-term competitiveness stood out not so much for its main point—it was true, but it wasn’t exactly news—as for the fact that the point had been made in the pages of the Harvard Business Review. Titled “Managing Our Way to Economic Decline,” it was written by William Abernathy and Robert Hayes, both members of the operations management department at HBS, and both with backgrounds in industry—IBM for Hayes and DuPont and General Dynamics for Abernathy. And what they were doing was taking a shot across the bow of their colleagues in finance, whose status had eclipsed that of the operations faculty in the very same way that finance had overtaken production and operations in industry itself.
The article took American managers to task for their laundry list of excuses for their failure to compete—government regulation, inflation, monetary policy, tax laws, labor costs, the price of oil. “A German executive . . . will not be convinced by these explanations,” wrote the authors. “Germany imports 95% of its oil (we import 50%), its government’s share of gross domestic product is about 37% (ours is about 30%), and workers must be consulted on most major decisions. Yet Germany’s rate of productivity growth has actually increased since 1970 and recently rose to more than four times ours. . . . No modern industrial nation is immune to the problems and pressures besetting U.S. business. Why then do we find a disproportionate loss of competitive vigor by U.S. companies?”14
Their answer? The “new managerial gospel” that preferred analytic detachment to actual experience and short-term cost reduction to investment in long-term technological competitiveness. American spending on R&D in critical research-intensive industries such as machinery, chemicals, and aircraft, they pointed out, had dropped by the mid-1970s to about half its level of the early 1960s. American managers, they argued, had “abdicated their strategic responsibilities.” The authors also deplored the overreliance on marketing surveys as the primary consideration in new product development, which, they argued, constrained the imagination and produced imitative rather than innovative products.
But what stood out most of all was the fact that they seemed to be criticizing the very types of managers that HBS was producing: “What has developed, in the business community as in academia, is a preoccupation with a false and shallow concept of the professional manager, a ‘pseudo-professional’ really—an individual having no special expertise in any particular industry or technology who nevertheless can step into an unfamiliar company and run it successfully through strict application of financial controls, portfolio concepts, and a market-driven strategy.” They attacked head-on the new “corporate religion” in which a facility with data analysis was an acceptable substitute for actual expertise. “Its first doctrine, appropriately enough, is that neither industry experience nor hands-on technological expertise counts for very much. At one level, of course, this doctrine helps to salve the conscience of those who lack them. At another, more disturbing, level it encourages the faithful to make decisions about technological matters simply as if they were adjuncts to finance or marketing decisions. . . . More disturbing still, true believers keep the faith on a day-to-day basis by insisting that as issues rise up the managerial hierarchy for decision they be progressively distilled into easily quantifiable terms.”
In The Age of Heretics, Art Kleiner reveals that publication of the piece was delayed by about a year because of a controversy among members of the Review’s editorial board about whether to publish it at all. And no wonder—it effectively divided the faculty in two.15 Hayes even said so himself: “My own problem was really with my colleagues. One could read into the articles that I was being critical of finance. I was being critical of control measurement systems, and of strategy. And a lot of my good friends on the faculty were teaching those things.” For the next fifteen years, the article was the most requested reprint from HBR.
Fans of cutthroat capitalism love to quote Joseph Schumpeter on creative destruction. Less well known are the concerns he voiced in his 1942 work, Capitalism, Socialism, and Democracy, in which he worried that handing control of the modern corporation to salaried managers would put at the economy’s steering wheel people with no incentive to innovate and generate new wealth (that is, to practice creative destruction), only to minimize risk and thereby maximize their personal job security.16
And when they weren’t overly focused on their own money, they were overly focused on the corporation’s money, at the expense of paying attention to product management. At General Motors, Alfred Sloan had installed systems of financial reporting based primarily on managerial accounting data rather than on measures of product quality. Sloan famously said that GM was in the business of making money, not automobiles.
“When executive suites are dominated by people with financial and legal skills,” wrote Abernathy and Hayes, “it is not surprising that top management should increasingly allocate time and energy to such concerns as cash management and the whole process of corporate acquisitions and mergers. This is indeed what has happened. In 1978 alone there were some 80 mergers involving companies with assets in excess of $100 million each; in 1979 there were almost 100. This represents roughly $20 billion in transfers of large companies from one owner to another—two-thirds of the total amount spent on R&D by American industry. In 1978, BusinessWeek ran a cover story on cash management in which it stated that, ‘the 400 largest U.S. companies together have more than $60 billion in cash—almost triple the amount they had at the beginning of the 1970s.’ The article also described the increasing attention devoted to—and the sophisticated and exotic techniques used for—managing this cash hoard.
“Worse,” they continued, “the great bulk of this merger activity appears to have been absolutely wasted in terms of generating economic benefits for stockholders. Acquisition experts do not necessarily make good managers. Nor can they increase the value of their shares by merging two companies any better than their shareholders could do individually by buying shares of the acquired company on the open market (at a price usually below that required for a takeover attempt).”
Would it all have gone so wrong had there been less patriotic triumphalism during the glory years? Surely not. By refusing to take other countries, particularly non-European countries like Japan, seriously, American managers pretty much set themselves up for an eventual defeat.
“Not once, while I was there, did I hear anyone even bother to question the relative merits of our capitalist system versus others,” says one alumnus from the 1960s. “There were occasional discussions of greed overcoming other considerations, but we never dwelled on them. There was some talk of ethics, but everyone assumed that since we were coming from high moral ground, those things would take care of themselves, and we might as well go straight for the money. And there was never a discussion of too much money for too few people. It never came up once. And the cliché is indeed true—I came out of HBS conservative, anti-labor, and anti-government. That changed very quickly, although if I’d followed the traditional path into finance, I’m not sure if the real me would have emerged.”
Even worse, the prosperity of the postwar boom had given them the “psychic luxury of feeling like benefactors as well as bosses,”17 even as they sat idly by while the quality of many American products sank well below that of their German and Japanese counterparts. Fast-forward thirty years, and the glossing over continues in In Their Time: The Greatest Business Leaders of the Twentieth Century, written in 2005 by HBS professors Anthony Mayo and Nitin Nohria: “The lack of focus on product quality would eventually become a major liability for U.S. manufacturers but that was hard to see in the general prosperity of the 1950s and 1960s as corporate profits continued to rise.”18
Hard to see for business school professors, perhaps. But for everybody else? U.S. manufacturers, who by that point had come to specialize in planned obsolescence, plastic, synthetics, and other assorted crap, certainly knew they were stinting on quality in favor of profits. Consider the television hit Mad Men, which, while fictional, did a fantastic job of revealing the alienation and ruthlessness underlying America’s high-gloss exterior at the time. David Riesman wrote The Lonely Crowd in 1950 and William Whyte wrote The Organization Man in 1956; fifteen years of warnings about the hollowing out of America’s corporate soul passed before the country came to terms with its transformation into a nation of paper-pushers.
Of course, it was surely difficult for the faculty at HBS to see it coming, because to have done so would have been to admit that the School was selling an outdated product. Recall that when HBS was founded, it needed to create a “product” that would “sell.” It wisely chose to target large corporations, which in the post–World War I years were emerging as the dominant economic institutions of the country. What HBS sold those “customers” was not freethinking, idiosyncratic graduates who might be able to see a “lack of focus on product quality” for the liability that it was, but rather fine-tuned bureaucrats, men who would help make the complex machinery of the large firm run as smoothly as possible. They could read financial statements, but few of them were inclined to create or even engage in the actual businesses those financials described.