American Experiment


by James MacGregor Burns


  An empire? Americans hardly thought in such grandiose terms. Yet their influence reached to the Caribbean, to Alaska and Hawaii, both of which became states in 1959, to old and to newly reconquered bases in the Pacific, to their military protectorate Taiwan. Military, economic, diplomatic ties intertwined American power with Latin American nations through the Inter-American Treaty of Reciprocal Assistance of 1947, with European and Middle Eastern nations through the North Atlantic Treaty of 1949, with Australia, New Zealand, Pakistan, Thailand, and other nations through the Southeast Asia Collective Defense Treaty of 1954, with Japan and South Korea through mutual defense treaties. Dulles spent much of his time rushing from capital to capital repairing and refurbishing these ties. On the vast chessboard of Atlantic and Pacific power few doubted which nation held queen, castles, and knights.

  European allies viewed American military and technological prowess with awe and fear. The Yanks’ advanced aircraft, their H-bombs and capacity to deliver them, their magnificent flattops, along with their fancy big automobiles, refrigerators, computers, and other gadgetry, were the talk of European capitals. But foreigners feared that Americans with their awesome military power were like children playing with dangerous toys. European pundits wrote scorchingly of American pretensions of leading the “free world,” of American soldiers and other visitors who scattered their dollars, Cokes, slang words, and bastard children across the Continent. French intellectuals attacked the new barbarism, while nourishing the consoling thought that Europe could serve, in Max Lerner’s words, “the role of a cultural Greece to the American Rome—a Greece which, while conquered, takes the conqueror captive.” Still, in the aftermath of the Marshall Plan most people in countries benefiting by the Plan admired America’s “free elections” and other democratic institutions, saw the “real issue” as “communism and dictatorship versus democracy and freedom,” and had faith in the Plan itself as aiding European economic recovery.

  The Kremlin responded to Western militancy with its own provocations and interventions, threats and bluster, combined with occasional essays at détente. Behind its actions lay deep fears—of an increasingly independent and hostile China, a turbulent and unpredictable Middle East, unrest in its satellites that might lead to outbreaks of Titoism, a resurgent Germany. Above all Moscow feared renewal of the old threat of Western—now “capitalist”—encirclement. The view from Moscow was of American power stretching from the Aleutians through Korea and Japan to the Philippines, from Southeast Asia to Turkey and Greece up through Europe to Scandinavia. Acutely aware of his strategic nuclear inferiority, Khrushchev resorted even to bluff to conceal it—most notably when he flew the same squadron of his Bison bombers in circles around a reviewing stand at a 1955 Aviation Day ceremony.

  Yet even uneasier and more mistrustful than the Russians in the late 1950s were most Americans. If a leader as levelheaded as Eisenhower could have been “obsessed” by fear of a communist Iran on the borders of the Soviet Union, it was hardly strange that many Americans, saturated from school days by talk of Soviet power and Bolshevik evil, should match fear with fear, anger with anger. Thus Khrushchev’s paper-fort deception with his Bisons triggered in Washington a sharp “bomber gap” scare that in turn produced a quick boost in B-52 bomber-building. Visiting the United States, Europeans who viewed themselves as occupying the nuclear front lines were amused to find Americans huddling—intellectually if not physically—in bomb shelters. Americans during the 1950s found their image feared around the globe even as they themselves were fearful of the future.

  Not that most Americans worried about their survival as a nation. Having appointed themselves guardians of liberty, however, they feared for the survival of freedom in the Western world. As in the past, they were far more effective in saluting freedom than in defining it. But definition was crucial. FDR’s Four Freedoms—of speech and religion, from fear and want—were only a starting point. What kind of freedom—individual, civil, economic, religious, ethnic? Freedom for whom—minorities, blacks, women, artists, intellectuals, censors of textbooks, extremists, pornographers, noncitizens? Freedom when—after World War II, after the cold war? Freedom from whom? The reds abroad? The “commies” at home? The “feds”? Corporation chiefs? Foremen? Deans? Religious zealots? Group and community pressures? It would become clear in the 1950s and 1960s that threats to these freedoms emanated from far more complex and numerous sources than the Politburos of Moscow and Peking.

  The Technology of Freedom

  The most striking aspect of freedom in America in the 1950s was its grounding in the nation’s technological might and economic abundance. During the half century past, Americans had proved their capacity to outproduce their rivals in automobiles, domestic appliances, and a host of other manufactures; during the 1940s they had astonished the world with their feats in building ships and weapons. By the early 1950s Americans could—and did—boast that their per capita income of approximately $1,500 was roughly double that of the British and the Swedes, more than four times that of the Russians. With 7 percent of the world population, the United States had over 40 percent of the world’s income.

  Would freedom flower in America during the fifties amid such plenty? The prospects were highly mixed. Historically freedom had flourished not where life was nasty, brutish, and short but in expanding economies that fostered equality of opportunity, more sharing, vertical and horizontal mobility. Still horrified by revelations about Nazi mass slaughter, shocked by new revelations about the monstrous “crimes of the Stalin era,” many Americans valued their own liberties all the more. Yet the 1950s, at the height of cold war anxieties, turned out to be a decade of intolerance of other Americans’ ideas. Individualism in the economic marketplace was not matched by individual liberty in the political and intellectual marketplace.

  Advocates of the free market were happy with their economic freedoms during the Age of Eisenhower, however, and even more with their economic successes. “We have entered a period of accelerating bigness in all aspects of American life,” proclaimed Eric Johnston confidently in 1957. “We have big business, big labor, big farming and big government.” The former head of the United States Chamber of Commerce even mused whether this was the start of an age of “socialized capitalism.” Certainly American business, if not “socializing,” was consolidating, bureaucratizing, innovating, and proliferating at home and overseas. Over 4,000 mergers and acquisitions of manufacturing and mining concerns occurred during the 1950s—a dramatic number though almost 3,000 fewer than in the 1920s. Large firms took over smaller ones less to run risks than to minimize them; despite Joseph Schumpeter’s warning of the “perennial gale of creative destruction,” the survival rate of large firms during the decade was almost 100 percent. Elaborate systems of recruitment, personnel, information, and leadership training expanded in the big corporations, with the help of complex office machines and business school graduates.

  The power of the American economy, however, lay far less in bigness and organization than in technological and scientific advances stemming from a century of experimentation and invention and later propelled by the imperative demands of two world wars and the cold war. And just as nineteenth-century army ordnance needs had promoted such important innovations as interchangeable parts, so world war needs fueled such varied practical achievements as penicillin, jet propulsion, and radar. Massive federal spending for invention and development carried on through the cold war years; by the late 1950s Washington was financing nearly 60 percent of the nation’s total research and development budget. In one major respect, however, twentieth-century technology was more than simply a wider and more varied activity than that of the nineteenth. In Nathan Rosenberg’s words, “an increasing proportion of technological changes” were now “dependent upon prior advances in systematized knowledge.” Innovators were more dependent than in Edison’s day on scientific disciplines such as physics and chemistry.

  Some of the postwar advances in specific fields were spectacular. In October 1947, Captain Charles E. Yeager burst through the invisible barrier that had seemed to fix a limit to the speed of flight by flying the experimental rocket-powered X-1 faster than the speed of sound. In September 1948, an air force Sabre set a world speed record for jet fighters at 670 miles an hour; five years after that a Super Sabre became the first jet fighter to fly faster than sound in level flight, hitting 755, and in 1957 a Voodoo jet topped 1,200. The nuclear submarine Nautilus was reported to have used 8.3 pounds of uranium fuel to travel 60,000 miles. After the inglorious “Kaputnik” of the first Vanguard, the American space program made steady progress. And by 1960 the X-15 rocket plane was flying almost twice as fast as the Voodoo.

  More down-to-earth, but crucial to a wider technology, were advances in the sector in which Yankee tinkerers had pioneered a century earlier with their milling and grinding machines. This was the machine tool industry. Severely depressed after its World War II expansion—300,000 machine tools were dumped onto the market after the war—the industry burgeoned during the cold war. Aircraft manufacture, a voracious consumer of machine tools, also became increasingly interlinked with the electronics industry, which was now producing its own miracles. Though eventually developing its own huge domestic market, for years electronics reflected wartime need for miniaturization of electrical circuits in proximity fuses for bombs, gunfire control mechanisms, radar and sonar. As late as the mid-1960s the federal government was still providing two-thirds of the “R&D” costs of the electrical equipment industry, which included such giants as General Electric and American Telephone and Telegraph.

  Earthiest of all—and perhaps most important of all for its worldwide implications—was innovation in farming. Improved harvesters and other machines, combined with better fertilizers and sprays and new plant strains, produced higher output per acre, a vast increase in production, and a steep decrease in the total work hours in the United States devoted to agriculture. By 1960, 8 percent of the labor force was occupied with farming, compared with 63 percent a century before. Hybrid corn, a systematic crossing of selected inbred lines, resulted in an increase in the average yield of corn per acre from 23 bushels in 1933 to 62 bushels in the mid-1960s. Thus hybrid corn research paid off handsomely, returning, it was estimated, seven times its cost by the mid-1950s. The lion’s share of the boost in farm yield came from—and profited—huge family farms, commercial farms, and other components of “agribusiness” that controlled the production and marketing of key foods and fibers through vertical integration, while millions of small farmers and migrant farm workers clung to a precarious livelihood.

  Out of the “Enormous Laboratory,” as Max Lerner called it, poured not only new machines and gadgets but the makings of wholly new or immensely enlarged industries—television, antibiotics, electronics, jet aircraft, rocketry. But the actual laboratories that produced this cornucopia of hardware were also the scenes of quiet encounters in one of the oldest intellectual conflicts in America—between the ideal of pure science and the practices of applied science.

  Many Americans still venerated the ideal of committed, disinterested science, of free, undirected research, of idle speculation and inspired hunch, of lack of pressure for immediate “practical” results, of a clear separation from the cash nexus—all the more because they could claim only one American in the past century who was comparable to such luminaries of European science as Darwin, Mendel, and Faraday. This was Josiah Willard Gibbs, the Yale mathematician whose work in thermodynamics, vector analysis, and statistical mechanics had belatedly won him an international reputation and whose laws of chemical energetics had enormous impact on processes as varied as the refining of oil, the synthesizing of rubber, and the separation of metals from their ores.

  Of scientific eminences the postwar United States had its share—Isidor Isaac Rabi and J. Robert Oppenheimer in physics, Hermann Joseph Muller in genetics, George Gaylord Simpson in evolutionary biology, Harlow Shapley in astrophysics, and scores of others. Yet most of these scientists largely depended on the theoretical work of Europeans. Most notably, it was the transformation of theoretical physics undertaken by Einstein, Heisenberg, and others in Germany that had laid the groundwork for atomic fission. Now, as the United States basked in its world economic supremacy, had the time and occasion come for Americans to make great theoretical contributions to pure science?

  A century before, Karl Marx had warned that science could not for long be autonomous, that it was a social activity, that the nature of the demand for science was even more important than the quality of its supply. In America, science had to pay the piper. Giant corporations were eager to put vast sums of money into research, but of a special kind, really research and development. While the firms varied in their toleration of free research, sooner or later they expected a payoff in new inventions, patents, profits. The R&D departments emphasized team research, committee decisions, pooled facilities, narrowly focused investigation. There was little encouragement of idle curiosity, messing around, just looking out the window. “The underlying principle, rarely formulated precisely but ever present,” a study concluded, “has been that originality can be organized; that, provided more people can be equipped with technical knowledge and brought together in larger groups, more new ideas must emerge; that mass production will produce originality just as it can produce sausages.” Military needs created even heavier demands for scientific group-think and the organization man.

  Politicians and scientists alike attacked the restrictions on Soviet science, but Americans could hardly be complacent. Aside from confronting seductive commercial and military demands on R&D, scientists had to contend with a popular double impulse to worship them and to fear them—the worship leading to unduly high popular expectations followed by disappointments, the fear leading to suspicion of their unorthodoxy and associations, as witness the classification of Robert Oppenheimer as a “security risk.” Pleased by statements such as that of Harvard’s president, James Conant—subsidies should go to persons, not projects—some scientists sought to protect their freedom of inquiry and communication by remaining in the universities. But scholars in the groves of academe were not free from political and press attacks, outside pressures for directed research, the temptations to undertake team projects and group authorship, the enticements of big corporate and military money.

  Perhaps the major obstacle to “free science,” however, was the empirical tradition in American scientific thought. The heroes of American popular science were the Thomas Edisons who disdained formal abstract knowledge or theorizing and preferred to tinker “by guess and by God” in their labs. It was this feet-on-the-ground compulsion that had channeled American genius into technology and engineering. If the nation were now to make a truly substantial contribution to scientific progress as well, greater freedom to reflect and to brood, and freer play for the creative imagination, would be crucial.

  Possibly some of the applied scientists, ensconced in their big laboratories and snug in their teams, recalled the lot of Professor Gibbs. He had worked at Yale almost alone and undisturbed. He had no team. He had few close friends and few students. He had no wife or children. He had no pay from Yale for a decade or so, until Johns Hopkins in 1880 offered him a professorship with salary, at $3,000 a year. Only then did Yale put him on its payroll, at $2,000, “with prospects of an early increase.”

  One controversial application of “science” related to the men and women who in turn related to machines. Initially called “scientific management,” it was first popularized by Frederick W. Taylor. After brilliant inventions of automatic grinding, forging, and tool-feeding mechanisms, Taylor had moved on at the turn of the century to time-and-motion studies designed to fit workers more closely to the imperatives of the machines and thereby increase industrial efficiency. The production process was functionalized and standardized by dividing it into measurable and controllable units of time and motion. Under Taylor’s leadership the idea was picked up by a host of large corporations, including American Locomotive, Brighton Mills, and Yale and Towne Lock. Machines, however, proved more easily manageable than men. Most workers preferred to follow their own motivations, rhythms, craft routines, group standards. A strike of molders in 1911 at the huge Watertown arsenal near Boston led to a government investigation and later a ban on Taylorism in government arsenals. A young assistant secretary, Franklin D. Roosevelt, imposed the ban in navy yards.

  Turning away from Taylorism as a system of managerial dictation—Taylor himself declared each worker must become “one of a train of gearwheels”—some “industrial scientists” tried to civilize the production process by “human engineering” or “human relations.” Psychologists and other social scientists were enlisted in this cause. Often benign in intent while manipulative in technique, “humanizing” turned out to be an effort to motivate workers through their own psychological processes rather than through managerial controls. Advocates of the method said that it promoted better communication, involved workers in at least minor decisions, enhanced “group feeling” and a sense of teamwork, fostered “leadership” as opposed to “control.” During and after World War II, the idea of human relations in industry flourished.

  Still the workers resisted. When Henry Ford II said that solving “the problem of human relations” would immensely speed up “progress toward lower costs,” men and women on the line could wonder whether their welfare or lower costs and higher profits were the goal. Union heads spoke sarcastically of foremen receiving training in the art of convincing workers “that they really are deeply beloved by the boss,” of employers “trooping to the special classes at Harvard” to learn that while the bosses were in business for a fast buck, workers reported to the plant each morning “for love, affection, and small friendly attentions.”

 
