Cubed


by Nikil Saval


  The design and opening of the AT&T Building were a media event, the likes of which the public hadn’t seen since the opening of the GM research facilities in the 1950s. Time magazine put Johnson on its cover: unsmiling, a gray wool coat draped over his shoulders and trailing like a cape, he clasped a miniature AT&T Building–shaped slab, like Moses bearing the tablets of the law: Johnson leading his children on an exodus from the tyranny of modernism into the promised land of plentiful building commissions. Michael Graves, one of the regulars at the Four Seasons table, got a commission to design the Portland Municipal Services Building for Portland, Oregon, in 1982, thanks chiefly to Johnson’s intervention. In the same city where Pietro Belluschi’s Equitable Building, the first aluminum-and-glass-skinned building of the International Style, had appeared, Graves’s Municipal Services Building was a garish fever dream of terra-cotta and navy blue, an even more daring version of postmodernism than Johnson’s corporate tower. Making the building squat and boxy where AT&T was thin and soaring, Graves pushed the absurdity of the design to its limits. He studded his facade with deliberately undersized windows, some as small as four feet square, crowded around a central glass curtain wall framed by two seven-story beams, which had the inadvertent appearance of a liturgical cross. Seven-story fake maroon columns culminated on two faces in keystone figures, on two others in blue-and-gold ribbons that were styled to look like prizes granted at a county fair. Belluschi himself, then eighty years old, called it “an enlarged jukebox” and “oversized beribboned Christmas package” that would have been better on the Las Vegas strip than in sober Portland.26 It was the kind of attack the postmodernists loved.

  Philip Johnson, bearing the AT&T Building. Time & Life Pictures

  How did office workers deal with the architects’ pretensions? In Graves’s building, not so well. The deliberate wackiness that led him to produce such small windows meant that the dwellers of the central open-plan, cubicle-ridden offices enjoyed little in the way of natural light. In the AT&T Building, however, Johnson was aware that he had to attract employees who largely commuted from the New York suburbs, and so he had successfully imitated all the benefits of the suburban corporate campus. Besides having its own medical office, gym, and multiple dining rooms, the office building had an impressive two-story-tall “sky lobby,” an entrance five stories above the ground floor, where workers got off one set of elevators to enter another set, surrounded by walls of gleaming white marble. The cubicled work floors, surrounded by corridors of private offices, were ten feet high, two and a half feet more than the standard, and the core of the building was only thirty feet away from the windowed periphery, meaning that workers got a significant amount of natural light. In a nice touch, they were also given adjustable task lighting in addition to the usual fluorescent ceiling panels. Of course, given the imperial leanings of AT&T, the company had a grand staircase leading to a three-story executive floor, filled with faux-Georgian panels and moldings.27

  Soon after it appeared, the AT&T Building began to symbolize a changing American workplace in a very different way than its owners and designers had imagined. An antitrust case launched against the company in 1974 came to a conclusion in 1982. AT&T, which had held a monopoly on U.S. telecommunications for generations, had lost. Its new building opened in 1984, just as it was tasked with carrying out its divestiture order. AT&T sold off two-thirds of its assets; in the first two years of divestiture, it laid off 56,000 people. Between 1984 and 1992, 107,291 unionized employees were discharged—one of the biggest business discharges ever, in a decade that would see many more such mass layoffs.28 Johnson had made AT&T’s office spaces flexible, with ceiling grooves designed for easy slotting and removing of office walls. Now the kind of reorganization AT&T was engaged in meant getting rid of offices—and the people in them—altogether. “Flexibility,” that sacred word in business, came to have a sinister meaning. By the end of the decade, when the country was sliding into yet another recession, AT&T was questioning the need for a fifteen-hundred-person corporate headquarters at all. In 1992, it repaid New York City the $14.5 million tax abatement for which it no longer qualified. Many of the employees were shifted to an old building; still more were told that they should “work from home”—an unusual phrase that most office workers had not heard before. A number of unassigned cubicles were available to anyone who needed to come in. Otherwise they sat empty.

  “Stress is high in my life right now,” an AT&T manager wrote in his diary in 1983. “Principally because of the job. The problem is that I see myself standing alone … It’s damn near impossible to keep from going crazy … Sometimes I feel that this stress is self-induced, because of my conscientiousness … In this era of ambiguity, uncertainty and inordinate turf battles, the manager who really cares may well kill himself with anxiety and worry and what those emotions generate—stress.”29 This was a former organization man who now confronted the total collapse of his world. Just ten years prior, office workers still believed that they were essential to their companies—so essential that many of them stayed where they started until they retired, having moved steadily up the ladder. But a new breed of executives, threatened by waves of global competition from Germany and Japan and seeking more and more profits to deliver to shareholders, sliced through their ranks. All the old certainties seemed to have dissolved in an instant.

  In the early 1980s, things were looking so desperate that people of all stripes, especially managers and managers-to-be, began to buy business books. Hitherto these were objects so shameful that people might, or should, have covered them in brown paper, but in 1982, the worst year of the U.S. recession, people bought Tom Peters and Robert H. Waterman’s study of high-performing companies, In Search of Excellence, in such massive quantities that they kept it on the best-seller list for the entire year. This was despite the fact that in writing the book, as Peters himself later admitted, they “had no idea what [they] were doing.”30 A year before, readers had similarly devoured William Ouchi’s Theory Z—a sequel, of course, to Douglas McGregor’s Theories X and Y that had so influenced Propst—a glimpse into the world of Japanese management, whose secrets appeared to be the reason the United States was getting its ass kicked, economically speaking. These books and ideas were so pervasive in the 1980s that they were all encapsulated in the Mike Nichols film Working Girl (1988). Melanie Griffith’s Staten Island secretary, Tess McGill, arrives at her new job at the opening of the film and unloads her books at her desk, among which is In Search of Excellence, a sign of her entrepreneurial spirit and appetite for creative destruction. Later in the film she surprises a potential investor by confronting him at his daughter’s wedding, and tries to seduce him into doing a deal by flattering his foresight in breathy tones of excitement, enumerating the high points of 1980s business thought: “You’re the man who … applied Japanese management principles while the others were still kowtowing to the unions, the man who saw the Ma Bell breakup coming from miles away.”

  And yet the theories were often antithetical to the message that American corporations appeared to take from them. For example, Japanese management theory, as Ouchi had it, was not antiunion. Indeed, Ouchi argued that any attempt by management to drive out a union would give employees “further proof of the duplicity of management.”31 He pointed out (however duplicitously himself) that Japanese companies had developed a cooperative rather than adversarial relationship with labor. Moreover, Ouchi spoke passionately of the lifetime employment policies of the Japanese corporation. (These in fact had been influenced by American thinking developed by the business theorist W. Edwards Deming, who had taught management to the Japanese during their period of postwar reconstruction.)32 Overarching trust and symbolic gestures toward egalitarianism were the keys to Japanese management. “Theory Z” was a model for making corporations more clannish and the hierarchy less authoritarian. Ouchi even advocated an open-plan layout, over private offices and partitions, to capture more fully the trust that higher-ups were supposed to place in their subordinates. For Peters and Waterman, too, the problem was definitely not in the fact that Americans had to deal with unions or tough regulations; they noted that the Germans had tougher unions, and both the Germans and the Japanese had stricter regulations.33 They agreed with Ouchi, and most office design theorists, that looseness and openness in office design were the keys to superior management. All of these made for the success of the books; they were also conspicuously ignored.

  There was one message that Americans obviously took from the business books: the need to cut staff, to achieve—in the argot of In Search of Excellence—“lean form.” Ouchi had noted that the Japanese were light on managerial ranks, and so did Peters and Waterman. Somehow, the Americans had to get there. And the only way to get there was to cut. “The numbers in many companies—both levels and employees—are staggering,” Peters and Waterman wrote, in one of the book’s few somber moments. “Ford over the last twenty-four months, in an effort to become more competitive with the Japanese, has cut more than 26 percent of its middle management staff; President Donald Petersen believes this is only the beginning. Reductions in the neighborhood of 50 percent, or even 75 percent, in levels and bodies are not uncommon targets when businessmen discuss what they could honestly do without.”34

  The targets proved about right. The 1980s became known as one of the meanest decades for corporate America in many generations; the 1990s, which were perhaps even meaner (statistically at least), somehow avoided the designation—perhaps because by then people had gotten inured to the practice. In those two decades, the generous benefits and stable wage increases that had defined a generation would vanish. Largely through brutal mass layoffs, American manufacturing workers would decline from a peak of 19.4 million in 1979 to 14.3 million in 2005. Of the country’s five hundred largest manufacturers in 1980, one in three would disappear by 1990.35 A spree of mergers and acquisitions and corporate raids, fueled by soon-to-be-infamous junk bonds, became regular headlines. The union movement had its spine split by newly emboldened corporations; it declined from a peak of about 35 percent of the labor force in the 1950s to 12.7 percent of the workforce today, hovering around 6 percent of the private-sector labor force. The aggressiveness of the new era was signaled by one government action, unrivaled in spectacle: Ronald Reagan’s decision in 1981 to fire 11,345 striking air-traffic controllers, whose union had endorsed him for president.

  Many job losses were in blue-collar sectors, due to deregulation of industry, plant closings, and offshoring. As for the office itself, insecurity had crept in by at least the mid-1980s; for instance, in 1985 BusinessWeek reported that at least one million white-collar or “non-production” jobs had been lost since 1979, because of the even heavier losses in what it called “smokestack America.” (In a particularly grim irony that year, the New York Times reported that companies were purging their ranks of in-house business economists. That same year, the World Design Congress declared Propst’s Action Office to be the most influential design of the previous quarter century.) But the fact remained that the office side of America was still safer than the factory side of it. Office workers were certainly told—constantly—that it was. The economy—as Propst, Drucker, and so many others had promised and continued to promise—would become perpetually more “postindustrial” and knowledge oriented. Machines and machine-made goods could be produced anywhere. Knowledge, however, unique to the character of the ever more individualized white-collar worker, was best produced at home. Perhaps the office worker labored in a cubicle, not a cozy corner office, but he might ascend to the corner office someday, and in the meantime his semipermanent walls were better than the laborer’s open shop floor, which seemed more and more like a dangerous no-man’s-land.

  By the end of the decade, whatever remained of that fantasy would at last be punctured. On or around October 19, 1987, everything changed. The Dow shed 23 percent of its value in a day, and in the recessions that followed, white-collar workers—particularly managers and mid-level executives—began to recognize themselves as the targets of mass downsizing. Between 1990 and 1992, 1.1 million office workers would be laid off, exceeding blue-collar layoffs for the first time. In the ten days following the 1992 election of Bill Clinton, the pace of white-collar layoffs quickened (General Motors, 11,000 jobs; BellSouth, 8,000; Travelers, 1,500; Chevron, 1,500; DuPont, 1,243). The rate of layoffs in the early 1990s ended up being much higher than that of the mean years of the 1980s.

  The chief victims of the cuts were middle management—the organization men (by the 1980s, about a third of them were organization women as well) who had defined American business in the preceding decades. Despite the critiques of organization man ideology, their ranks had swelled in the 1970s, growing at twice the rate of the lower rungs (clerks, typists, secretaries) of the white-collar workforce. Managers had in fact grown at twice the rate of the rest of the workforce: 43.1 percent. Meanwhile, thanks in large part to automation, which ended up creating more jobs for technical workers than production workers, manufacturing employees fell considerably. As a result, the ratio between workers in management and those in production had flipped. After World War II, around three-quarters of corporate employees did production work, while only a quarter did administration. By 1980, these numbers had changed places.36 Executives looking around at Germany and Japan, which appeared to be outcompeting the United States in the early 1980s, saw much lower rates of managers to manufacturing. American companies seemed fat—and so the phrase “trimming the fat” became one of the millions of gross euphemisms (“downsizing,” “restructuring,” even “dehiring”) executives used for mass layoffs.

  But the cost of shedding middle management would prove high, for middle managers had been the basis of the American middle class itself. The promise of stability, clean work, and relatively high pay, all tied to company loyalty, had lent stability to American politics and work for two generations. The fact that Tom Rath—of Gray Flannel Suit fame—could turn down an executive position so that he could work less and spend time with his family (but still earn enough money to maintain a large home in suburban Connecticut) was part of the dream of middle classness that the United States had genuinely extended to thousands of men and, increasingly in later years, women. In the 1980s and 1990s, the U.S. corporate world broke that silent contract. Of course, when contracts are unspoken, they’re so much easier to break. The cushy, boring world of the office described so well in Life in the Crystal Palace became a more frightening place, ruled by a deep-seated psychological fear of being fired. By the 1990s, fear was not only the incidental effect of workplace reorganization; it was the goal. The Intel CEO Andy Grove, in his classic book of management theory Only the Paranoid Survive, put it succinctly, slyly contrasting his philosophy with that of the original practitioner of Japanese management:

  The quality guru W. Edwards Deming advocated stamping out fear in corporations. I have trouble with the simple-mindedness of this dictum. The most important role of managers is to create an environment in which people are passionately dedicated to winning in the marketplace. Fear plays a major role in creating and maintaining such passion. Fear of competition, fear of bankruptcy, fear of being wrong, and fear of losing can all be powerful motivators. How do we cultivate fear of losing in our employees? We can only do that if we feel it ourselves.37

  What had been a supposedly unintended by-product of corporate restructuring gradually became a business principle. Succumbing to one of their many fits of italics, Peters and Waterman had written that, as workers, “we simultaneously seek self-determination and security.”38 As it happened, the new office would provide neither. And nothing symbolized this transfigured world better than its furniture.

 
