Day of Empire


by Amy Chua


  There is no doubt that racism caused the United States considerable international embarrassment. In one notorious case, when Haiti's secretary of agriculture arrived in Biloxi, Mississippi, in 1947 for a conference, the hotel (not expecting the secretary to be black) refused for “reasons of color” to let him stay with the other conference attendees. After the incident, a Haitian newspaper published an outraged editorial declaring, “The Negro of Haiti understands that the word democracy in the United States has no meaning.”

  In part, the U.S. government's postwar receptivity to civil rights reform reflected American interests in bolstering the country's international stature. In a 1948 New York Times Magazine article, Robert E. Cushman, a member of President Truman's Committee on Civil Rights, argued: “[T]he nation finds itself the most powerful spokesman for the democratic way of life, as opposed to the principles of a totalitarian state. It is unpleasant to have the Russians publicize our continuing lynchings, our Jim Crow statutes and customs, our anti-Semitic discriminations and our witchhunts; but is it undeserved?” Cushman concluded, “[Americans] are becoming aware that we do not practice the civil liberty we preach; and this realization is a wholesome thing.”24

  As the twentieth century unfolded, the oppressiveness of the Soviet regime became increasingly manifest, and its claims of equality increasingly bankrupt. Corruption, patronage, and ossification spread throughout the Soviet Union. Even its supposed ethnic tolerance proved hollow. Russian hegemony and chauvinism vis-à-vis non-Russian peoples—not to mention occasional brutal military interventions—generated intense resentment throughout central Asia, the Baltic Republics, and Eastern Europe. Meanwhile, as the U.S.S.R. grew ever more closed and stagnant, the United States went in a very different direction.

  America's civil rights revolution in many ways began with the 1954 landmark case of Brown v. Board of Education. In Brown, the Supreme Court struck down race-based school segregation, rejecting the doctrine of “separate but equal” in public education. In the early 1960s, President John F. Kennedy put his presidency squarely behind the cause of civil rights, passionately arguing in a nationwide television address:

  We preach freedom around the world, and we mean it, and we cherish our freedom here at home, but are we to say to the world, and much more importantly, to each other that this is a land of the free except for the Negroes; that we have no second-class citizens except Negroes; that we have no class or [caste] system, no ghettoes, no master race except with respect to Negroes?25

  Kennedy also summoned to Washington the leaders of America's most prestigious universities and implored them to diversify their student bodies, telling the group, “I want you to make a difference…Until you do, who will?”

  President Kennedy was assassinated in 1963. A year after his death, Congress passed the 1964 Civil Rights Act, which enacted sweeping voting reforms, required employers to provide equal employment opportunities, and made it illegal to discriminate on the basis of race in public places such as hotels, restaurants, and theaters. Around the same time, Yale University president Kingman Brewster embarked on unprecedented institutional reforms, with Harvard shortly following suit. Brewster hired R. Inslee (“Inky”) Clark to be Yale's new admissions director, with the mandate of building a more pluralistic student body. Brewster and Clark eliminated geographical factors for admission—which had been a way to limit Jewish students—and reduced preferences for alumni legacies and prep school students. The result was a spike in the percentage of Jewish students in the freshman class, from 16 percent in 1965 to about 30 percent in 1966. Clark's first class contained 58 percent public school students, more financial aid applicants than non-financial aid applicants, more minorities of every kind—and the highest SAT scores in Yale's history.

  Clark's new admission policies came under direct fire from members of the Yale Corporation and alumni contributors. Summoned before the Yale Corporation in 1966 to discuss the changes, Clark explained that in a changing country, leaders might come from nontraditional places, including in the future minorities, women, Jews, and public school graduates. A Yale Corporation member retorted, “You're talking about Jews and public school graduates as leaders. Look around you at this table. These are America's leaders. There are no Jews here. There are no public school graduates here.”

  But Brewster and Clark, as well as their counterparts at other institutions, persisted. The number of black and other minority students accepted to Ivy League schools rose dramatically during the sixties. In 1960, the “Big Three” had collectively just 15 African American freshmen; in 1970, there were 284 (83 at Yale, 103 at Princeton, and 98 at Harvard). Overall, between 1970 and 1980, the number of African American college graduates increased by 91 percent.26

  The changing face of U.S. higher education was part of a much more radical transformation of American society. The sixties and their aftermath did not end the primacy of white Anglo-Protestant men in the corporate world or in Washington, but women, blacks, and other minorities made impressive inroads in American business, politics, and culture. At the same time, new immigration policies dramatically changed the demographics of American society.

  The 1965 Immigration Act abolished the racially and ethnically discriminatory national-origin quota system instituted in the 1920s. Immigration rates exploded, from roughly 70,000 a year during the quota years to about 400,000 a year by the early 1970s, 600,000 a year by the early 1980s, and over 1 million in 1989. Between 1990 and 2000, approximately 9 million immigrants arrived in the United States, more than in any other decade except the heyday of Ellis Island at the turn of the century. The sources of immigration changed as well. Whereas before 1965 the vast majority of immigrants to the United States hailed from Europe, after 1965 they came overwhelmingly from Asia and Latin America. The rise in legal migration was accompanied by an increase in illegal entries. In 1960, foreign-born residents of the United States were distributed principally as follows:

  Italy 1,257,000

  Germany 990,000

  Canada 953,000

  United Kingdom 833,000

  Poland 748,000

  In 2000, the distribution was as follows:

  Mexico 7,841,000

  China 1,391,000

  Philippines 1,222,000

  India 1,007,000

  Cuba 952,00027

  AMERICAN WORLD DOMINANCE

  In January 1991, during the First Gulf War, viewers around the world watched, rapt, as the world's most powerful bombs and smartest missiles, fired from history's first stealth aircraft and guided by the world's most sophisticated satellite navigation system, took out target after target—bunkers, bridges, air defense towers, Scud missile launchers—with laser precision. For the next five weeks, U.S. Apaches, Pave Lows, Hornets, and Nighthawks pounded enemy territory, inflicting maximum destruction with a staggeringly low American fatality rate. Then, it was over: “the most awesome and well-coordinated mass raid in the history of air power.” If there was any doubt before, the breathtaking precision of Operation Desert Storm made it crystal clear: The U.S. military was light-years ahead of any other military force on the planet.28

  It was not only in military might that the United States had achieved a stunning global preeminence. In the 1980s, the productive capacity that America added to what it already possessed exceeded the entire productive capacity of West Germany—Europe's largest economy. After a relatively mild recession in 1990-91, the U.S. economy exploded yet again, reaping massive gains from the microprocessing revolution and yielding “the greatest period of wealth creation in the history of the world.” While only a decade before, doubters had wondered whether U.S. business could remain competitive with Japan and a uniting Europe, by the 1990s America's economy had opened up a staggering lead over all other nations of the world. At the opening of the twenty-first century, America's gross domestic product, calculated in current dollars, represented an astonishing one-third of total world output, twice the size of Japan's and China's economies combined, and more than three times Great Britain's share of gross world output at its imperial height.

  America was the country that benefited most from globalization. In the words of George Soros, an immigrant who had built a multibillion-dollar fortune in the United States from scratch, “The trend of globalization is that surplus capital is moving from the periphery countries to the center, which is the United States.” Throughout the 1990s, American corporations like Wal-Mart, Nike, McDonald's, ExxonMobil, Coca-Cola, and Disney continued to dominate the world economy, despite anti-American sentiments. The dollar was the world's dominant currency, English the dominant language, and America's the most emulated culture. As the twentieth century came to a close, with Russia in chaos, Europe stagnant, and Japan mired in recession, the United States of America had no real competition—militarily, economically, even culturally. The world had a new hyperpower.29

  There were many reasons behind the United States’ sudden vault to world dominance, the most spectacular being the collapse of the former Soviet Union. Had the U.S.S.R. not imploded, we might still live in a bipolar world today. On the other hand, all the same factors that had steadily brought the United States to superpower status also underlay its achievement of world dominance.

  It is well known that the United States won the race for the atomic bomb because of the contributions of Albert Einstein and other refugee physicists. Less well known is the similar role that immigrant scientists played in America's stunning triumph in the “information technology” race, which has transformed the world in the last quarter century. The boom America enjoyed during the 1980s and 1990s was directly fueled by two revolutionary developments, one technological and one financial: the discovery of the microchip and the creation of venture capitalism. The former gave birth to the computer age, and the latter to Silicon Valley, which in turn allowed new “information technology” to be exploited at lightning speed. The origins of these two developments are closely connected, and, once again, both were the fruit of American openness to immigrant talent and enterprise.

  Eugene Kleiner arrived in the United States in 1941 at the age of eighteen, having fled Vienna just before the Nazi takeover. Although lacking a high school diploma, Kleiner later graduated from Brooklyn Polytechnic with an engineering degree. In the early 1950s, Kleiner was recruited to California by the controversial physicist William Shockley, who, a few years earlier at Bell Labs, had participated in an unexpected invention. Using a bent paper clip, strips of foil, and a small piece of semiconducting material, Shockley's team produced a tiny device that, to their astonishment, amplified electric current. They called the device a transistor.

  Shockley left Bell Labs to start his own company with the idea of developing a multiple-transistor semiconductor. Shockley insisted on using germanium as the semiconducting material. Kleiner and others on the team believed silicon would be superior, but the difficult and increasingly paranoid Shockley brooked no disagreement. Soon, Kleiner and seven colleagues broke away, scraping together $3,500 of their own money to pursue their silicon-based research. But even in the 1950s, $3,500 was woefully inadequate, and it was virtually impossible to secure investment funding to back an untried scientific idea in its germinal stages. Nevertheless, after writing a now famous letter to a New York stockbroker, Kleiner managed to get his group funded. As a result, Kleiner and his colleagues became the officers of their own company, Fairchild Semiconductor.

  Shockley won the 1956 Nobel Prize for his role in discovering the transistor. He also went on to gain considerable attention as a professor at Stanford, particularly for his racist eugenic beliefs. (He often publicly warned that “intellectually inferior” blacks were procreating at a dangerously high rate.) But his company, Shockley Semiconductor, was a commercial failure.

  By contrast, Kleiner and his colleagues succeeded in producing the world's first commercially practical integrated circuit—out of silicon. Within a short time, Fairchild Semiconductor grew from twelve employees to 12,000, with revenues of $130 million a year. Santa Clara Valley, previously known mainly for its plums and walnuts, would never be the same again.

  Now wealthy, Kleiner decided to try something new. No doubt reflecting on his own difficulties starting Fairchild Semiconductor, Kleiner had the idea of creating an investment fund for breakthrough scientific innovations. Although venture capital is a familiar concept today, it was not in the early 1970s. Virtually unique in its time, the investment firm that Kleiner cofounded—which eventually became the now legendary Kleiner, Perkins, Caufield & Byers—adopted the strategy of aggressively searching out and betting big on untried technology while allowing (indeed encouraging) the inventors to retain a large ownership stake in the new companies. The formula succeeded: The companies that Kleiner, Perkins helped launch include AOL, Genentech, Compaq, Lotus Development, Netscape, Quantum, Sun Microsystems, Amazon.com, and Google.

  Kleiner, who died in 2003, is often credited with both “starting Silicon Valley” and “virtually inventing venture capital.” The Kleiner, Perkins business model transformed American finance, fueling an explosion of venture capitalism in the last quarter of the twentieth century. It is no coincidence that the rise of venture capitalism owed so much to a refugee from Nazi Europe or that it played so large a role in America's world leadership in the computer age. Venture capitalism was nothing less than a late-twentieth-century incarnation of strategic tolerance. Just as in Rome or the Great Mongol Empire, America's global dominance has depended on its ability to bring in and mobilize the world's cutting-edge talents and intellectual capital. In the 1980s and 1990s, American venture capitalism was phenomenally successful in doing just that, offering enormous inducements to young scientists, inventors, and entrepreneurs of all backgrounds, rich or poor, white or minority, native or immigrant, to pursue their ideas in America.

  Andrew Grove, born András Gróf in Budapest, Hungary, was one of those entrepreneurs. In 1956, the twenty-year-old Grove and his family fled the turmoil of the Hungarian Revolution, arriving in New York City aboard a rusty ship the following year. Like Kleiner, Grove did not attend a fancy school. He graduated at the top of his class from the City College of New York, waiting tables to cover tuition. Hating the cold Northeast winters, Grove then made his way to the University of California, Berkeley, where he received his Ph.D. in chemical engineering in 1963.

  For Grove, America was truly a land of tolerance and opportunity. As a boy in Hungary, he successfully hid from the Nazis with his family, only to be humiliated after the war by a childhood friend who told Grove that his father had forbidden him from playing with Jews. Later, when Hungary became a puppet state of the U.S.S.R. and the Soviet tanks rolled in, Grove's prospects seemed only bleaker.

  Sunny California could not have been more different. After Berkeley, Grove got a job at Fairchild Semiconductor, the firm Eugene Kleiner had cofounded. There, Grove impressed everyone with not just his energy and brilliance but his extraordinary attention to detail. In 1968, when Robert Noyce and Gordon Moore, two of Fairchild's other original founders, left the company to strike out on their own, they invited Grove to be their director of operations. The decision was a surprise to many: Grove's thick Hungarian accent and impaired hearing did not make him the likeliest of choices. But Noyce and Moore had only one employment criterion: they wanted the best talent available.

  Noyce was one of the inventors of the integrated circuit. Moore was arguably the best pure engineer at Fairchild. Their plan in founding a new company was to turn the multiple-transistor integrated circuit into a memory device. In 1968, computer memory storage was still being handled through magnetic-core technology. Noyce and Moore believed they could pack more transistors onto their silicon chips and turn them into memory devices smaller, cheaper, and more powerful than magnetic-core memory. In short, Noyce and Moore set out to build what the world would soon call microprocessors, also known as microchips. They called their new company Integrated Electronics—later shortened to Intel.

  Interestingly, the man who came to be widely regarded as the driving force behind Intel was neither Noyce nor Moore but Andy Grove. Before Intel could mass-produce its microprocessors, there were a thousand problems to overcome—technical, administrative, strategic, and commercial. It was Grove, more than anyone else, who solved these problems. Described in company pamphlets as one of Intel's three cofounders, Grove became Intel's president in 1979 and its CEO in 1987. When Time magazine named Grove as its 1997 Man of the Year, it described him as “the person most responsible” for the microchip and hence the Digital Revolution, which, in Time's words, transformed the end of the twentieth century “the way the Industrial Revolution transformed the end of the [previous] one.”

  Under Grove's stewardship, Intel by the late 1990s was worth $115 billion, more than IBM. It produced almost 90 percent of the world's PC microprocessors—churning out a quadrillion transistors every month, with seven million of them etched onto silicon microchips smaller than a dime. Among the foreign giants that Intel towered over in the 1990s were Samsung, Toshiba, Hitachi, Fujitsu, NEC, and Siemens. Today, despite fierce competition and sporadic crises, Intel remains the world's largest producer of microprocessors.30

  Like the printing press and the steam engine in their eras, the microchip was the core invention of the computer age. It underlay all the new software and hardware that would give us CDs, DVDs, VCRs, iPods, iTunes, TiVo, digital cameras, cell phones, BlackBerries, and other products that would forever change the way human beings live, think, and communicate. It drove the explosion of a new Internet-connected global economy and what Thomas Friedman has called the “new talent era.”

 
