
The Bell Curve: Intelligence and Class Structure in American Life


by Richard J. Herrnstein and Charles Murray


  This does not mean that a member of the cognitive elite never crosses paths with a person with a low IQ, but the encounters that matter tend to be limited. The more intimate or more enduring the human relationship is, the more likely it is to be among people similar in intellectual level. That the brightest are identified has its benefits. That they become so isolated and inbred has its costs. Some of these costs are already visible in American society, while others lie over the horizon.

  Human society has always had some measure of cognitive stratification. The best hunters among the Bushmen of the Kalahari tend to score above the average of their tribe on modern intelligence tests and so, doubtless, would have the chief ministers in Cheops’s Egypt.1 The Mandarins who ran China for centuries were chosen by examinations that tested for understanding of the Confucian classics and, in so doing, screened for intelligence. The priests and monks of medieval Europe, recruited and self-selected for reasons correlated with cognitive ability, must have been brighter than average.

  This differentiation by cognitive ability did not coalesce into cognitive classes in premodern societies for various reasons. Clerical celibacy was one. Another was that the people who rose to the top on their brains were co-opted by aristocratic systems that depleted their descendants’ talent, mainly through the mechanism known as primogeniture. Because parents could not pick the brightest of their progeny to inherit the title and land, aristocracies fell victim to regression to the mean: children of parents with above-average IQs tend to have lower IQs than their parents, and their children’s IQs are lower still. Over the course of a few generations, the average intelligence in an aristocratic family fell toward the population average, hastened by marriages that matched bride and groom by lineage, not ability.
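
  The mechanism invoked here can be sketched with the standard regression-toward-the-mean relation; the particular numbers below are illustrative assumptions, not figures given in the text:

$$
\mathbb{E}[\mathrm{IQ}_{\text{child}}] \;=\; \mu + \rho\,\bigl(\mathrm{IQ}_{\text{parent}} - \mu\bigr), \qquad 0 < \rho < 1.
$$

  With a population mean of $\mu = 100$ and, say, $\rho = 0.6$, parents averaging IQ 130 would be expected to have children near 118 and grandchildren near 111, drifting toward the mean with each generation. That is the pattern described above for titled families whose heirs were chosen by birth order rather than ability.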

  On the other hand, aristocratic societies were not as impermeable to social mobility as they tried to be. They allowed at least some avenues for ability to rise toward the top, whereupon the brains of the newcomer were swapped in marriage for family connections and titles. England was notably sagacious in this regard, steadily infusing new talent into the aristocracy by creating peerages for its most successful commoners. The traditional occupations for the younger sons of British peers—army, navy, church, and the administration of the empire—gave the ablest younger sons in the aristocracy a good chance to rise to the top and help sustain the system. Indeed, the success of some English families in sustaining their distinction over several generations was one of the factors that prompted Francis Galton to hypothesize that intelligence was inherited. But only a minority of aristocratic families managed this trick. It remained true even in England that, after a few generations, the holder of any given aristocratic title was unlikely to be smarter than anyone else. When one observer wrote of the aristocracy in Queen Victoria’s day that “all the social talk is stupid and insipid,” he was being more accurate than perhaps he realized.2

  Even in less rigidly stratified societies, stratification by cognitive ability was weak and inconsistent until this century because the number of very bright people so greatly exceeded the number of specialized jobs for which high intelligence is indispensable. A true cognitive elite requires a technological society. This raises a distinction that is so important, and forgetting it can so easily lead to needless misunderstanding, that it is worth emphasizing: To say that most of the people in the cognitively demanding positions of a society have a high IQ is not the same as saying that most of the people with high IQs are in such positions. It is possible to have cognitive screening without having cognitive classes. Mathematical necessity tells us that a large majority of the smart people in Cheops’s Egypt, dynastic China, Elizabethan England, and Teddy Roosevelt’s America were engaged in ordinary pursuits, mingling, working, and living with everyone else. Many were housewives. Most of the rest were farmers, smiths, millers, bakers, carpenters, and shopkeepers. Social and economic stratification was extreme, but cognitive stratification was minor.

  So it has been from the beginning of history into this century. Then, comparatively rapidly, a new class structure emerged in which it became much more consistently and universally advantageous to be smart. In the next four chapters, we examine that process and its meaning.

  Chapter 1

  Cognitive Class and Education, 1900-1990

  In the course of the twentieth century, America opened the doors of its colleges wider than any previous generation of Americans, or any other society in history, could have imagined possible. This democratization of higher education has raised new barriers between people that may prove to be more divisive and intractable than the old ones.

  The growth in the proportion of people getting college degrees is the most obvious result, with a fifteen-fold increase from 1900 to 1990. Even more important, the students going to college were being selected ever more efficiently for their high IQ. The crucial decade was the 1950s, when the percentage of top students who went to college rose by more than it had in the preceding three decades. By the beginning of the 1990s, about 80 percent of all students in the top quartile of ability continued to college after high school. Among the high school graduates in the top few percentiles of cognitive ability, the chances of going to college already exceeded 90 percent.

  Perhaps the most important of all the changes was the transformation of America’s elite colleges. As more bright youngsters went off to college, the colleges themselves began to sort themselves out. Starting in the 1950s, a handful of institutions became magnets for the very brightest of each year’s new class. In these schools, the cognitive level of the students rose far above the rest of the college population.

  Taken together, these trends have stratified America according to cognitive ability.

  A perusal of Harvard’s Freshman Register for 1952 shows a class looking very much as Harvard freshman classes had always looked. Under the photographs of the well-scrubbed, mostly East Coast, overwhelmingly white and Christian young men were home addresses from places like Philadelphia’s Main Line, the Upper East Side of New York, and Boston’s Beacon Hill. A large proportion of the class came from a handful of America’s most exclusive boarding schools; Phillips Exeter and Phillips Andover alone contributed almost 10 percent of the freshmen that year.

  And yet for all its apparent exclusivity, Harvard was not so hard to get into in the fall of 1952. An applicant’s chances of being admitted were about two out of three, and close to 90 percent if his father had gone to Harvard.1 With this modest level of competition, it is not surprising to learn that the Harvard student body was not uniformly brilliant. In fact, the mean SAT-Verbal score of the incoming freshman class was only 583, well above the national mean but nothing to brag about.2 Harvard men came from a range of ability that could be duplicated in the top half of many state universities.

  Let us advance the scene to 1960. Wilbur J. Bender, Harvard’s dean of admissions, was about to leave his post and trying to sum up for the board of overseers what had happened in the eight years of his tenure. “The figures,” he wrote, “report the greatest change in Harvard admissions, and thus in the Harvard student body, in a short time—two college generations—in our recorded history.”3 Unquestionably and suddenly, though for no obvious reason, Harvard had become a different kind of place. The proportion of the incoming students from New England had dropped by a third. Public school graduates now outnumbered private school graduates. Instead of rejecting a third of its applicants, Harvard was rejecting more than two-thirds—and the quality of those applicants had increased as well, so that many students who would have been admitted in 1952 were not even bothering to apply in 1960.

  The SAT scores at Harvard had skyrocketed. In the fall of 1960, the average verbal score was 678 and the average math score was 695, an increase of almost a hundred points for each test. The average Harvard freshman in 1952 would have placed in the bottom 10 percent of the incoming class by 1960. In eight years, Harvard had been transformed from a school primarily for the northeastern socioeconomic elite into a school populated by the brightest of the bright, drawn from all over the country.

  The story of higher education in the United States during the twentieth century is generally taken to be one of the great American success stories, and with good reason. The record was not without blemishes, but the United States led the rest of the world in opening college to a mass population of young people of ability, regardless of race, color, creed, gender, and financial resources.

  But this success story also has a paradoxically shadowy side, for education is a powerful divider and classifier. Education affects income, and income divides. Education affects occupation, and occupations divide. Education affects tastes and interests, grammar and accent, all of which divide. When access to higher education is restricted by class, race, or religion, these divisions cut across cognitive levels. But school itself, more immediately and directly than any other institution, is the place where people of high cognitive ability excel and people of low cognitive ability fail. As America opened access to higher education, it also opened up a revolution in the way the American population sorted and divided itself. Three successively more efficient sorting processes were at work: the college population grew, it was recruited by cognitive ability more efficiently, and then it was further sorted among the colleges.

  THE COLLEGE POPULATION GROWS

  A social and economic gap separated high school graduates from college graduates in 1900 as in 1990; that much is not new. But the social and economic gap was not accompanied by much of a cognitive gap, because the vast majority of the brightest people in the United States had not gone to college. We may make that statement despite the lack of IQ scores from 1900 for the same reason that we can make such statements about Elizabethan England: It is true by mathematical necessity. In 1900, only about 2 percent of 23-year-olds got college degrees. Even if all of the 2 percent who went to college had IQs of 115 and above (and they did not), seven out of eight of the brightest 23-year-olds in the America of 1900 would have been without college degrees. This situation barely changed for the first two decades of the new century. Then, at the close of World War I, the role of college for American youths began an expansion that would last until 1974, interrupted only by the Great Depression and World War II.
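
  As a rough check on that arithmetic, assume (the text does not say so explicitly) that an IQ of 115 marks roughly the top 16 percent of the distribution. Then even if every degree holder cleared that bar,

$$
\frac{\text{graduates with IQ} \ge 115}{\text{all 23-year-olds with IQ} \ge 115} \;\le\; \frac{0.02}{0.16} \;=\; \frac{1}{8},
$$

  so at least seven out of eight of the brightest 23-year-olds of 1900 held no college degree.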

  The three lines in the figure show trends established in 1920-1929, 1935-1940, and 1954-1973, then extrapolated. They are there to highlight the three features of the figure worth noting. First, the long perspective serves as a counterweight to the common belief that the college population exploded suddenly after World War II. It certainly exploded in the sense that the number of college students went from a wartime trough to record highs, but this is because two generations of college students were crowded onto campuses at one time. In terms of trendlines, World War II and its aftermath were a blip, albeit a large one. When this anomalous turmoil ended in the mid-1950s, the proportion of people getting college degrees was no higher than would have been predicted from the trends established in the 1920s or the last half of the 1930s (which are actually a single trend interrupted by the worst years of the depression).

  In the twentieth century, the prevalence of the college degree goes from one in fifty to a third of the population

  Sources: 1900-1959: U.S. Bureau of the Census 1975, H751-765. 1960-1992: DES, 1992, Table 229.

  The second notable feature of the figure is the large upward tilt in the trendline from the mid-1950s until 1974. That it began when it did—the Eisenhower years—comes as a surprise. The GI Bill’s impact had faded and the postwar baby boom had not yet reached college age. Presumably postwar prosperity had something to do with it, but the explanation cannot be simple. The slope remained steep in periods as different as Eisenhower’s late 1950s, LBJ’s mid-1960s, and Nixon’s early 1970s.

  The third notable feature is the peculiar plunge in college degrees after 1974, lasting until 1981—peculiar because it occurred when the generosity of scholarships and loans, from colleges, foundations, and government alike, was at its peak. This period of declining graduates was then followed by a steep increase from 1981 to 1990—also peculiar, in that college was becoming harder to afford for middle-class Americans during those years. As of 1990, the proportion of students getting college degrees had more than made up for the losses during the 1970s and had established a new record, with B.A.s and B.S.s being awarded in such profusion that they amounted to 30 percent of the 23-year-old population.

  MAKING GOOD ON THE IDEAL OF OPPORTUNITY

  At first glance, we are telling a story of increasing democracy and intermingling, not of stratification. Once upon a time, the college degree was the preserve of a tiny minority; now almost a third of each new cohort of youths earns it. Surely, it would seem, this must mean that a broader range of people is going to college—including people with a broader, not narrower, range of cognitive ability. Not so. At the same time that many more young people were going to college, they were also being selected ever more efficiently by cognitive ability.

  A compilation of the studies conducted over the course of the century suggests that the crucial decade was the 1950s. The next figure shows the data for the students in the top quartile (the top 25 percent) in ability and is based on the proportion of students entering college (though not necessarily finishing) in the year following graduation from high school.

  Again, the lines highlight trends set in particular periods, here 1925-1950 and 1950-1960. From one period to the next, the proportion of bright students getting to college leaped to new heights. There are two qualifications regarding this figure. First, it is based on high school graduates—the only data available over this time period—and therefore drastically understates the magnitude of the real change from the 1920s to the 1960s and thereafter, because so many of the top quartile in ability never made it through high school early in the century (see Chapter 6). It is impossible to be more precise with the available data, but a reasonable estimate is that as of the mid-1920s, only about 15 percent of all of the nation’s youth in the top IQ quartile were going on to college.4 It is further the case that almost all of those moving on to college in the 1920s were going to four-year colleges, and this leads to the second qualification to keep in mind: By the 1970s and 1980s, substantial numbers of those shown as continuing to college were going to junior colleges, which are on average less demanding than four-year colleges. Taking all the available data together, it appears that the proportion of all American youth in the top IQ quartile who went directly to four-year colleges rose from roughly one youth in seven in 1925 to about two out of seven in 1950 to more than four out of seven in the early 1960s, where it has remained, with perhaps a shallow upward trend, ever since.5

  At midcentury America abruptly becomes more efficient in getting the top students to college

  Sources: Eagle 1988b; Taubman and Wales 1972; authors’ analysis of the National Longitudinal Survey of Youth (NLSY). See below and the introduction to Part II.

  But it is not just that the top quartile of talent has been more efficiently tapped for college. At every level of cognitive ability, the links between IQ and the probability of going to college became tighter and more regular. The next figure summarizes three studies that permit us to calculate the probability of going to college throughout the ability range over the last seventy years. Once again we are restricted to high school graduates for the 1925 data, which overstates the probability of going to college during this period. Even for the fortunate few who got a high school degree in 1925, high cognitive ability improved their chances of getting to college—but not by much.6 The brightest high school graduates had almost a 60 percent chance of going to college, which means that they had more than a 40 percent chance of not going, despite having graduated from high school and being very bright. The chances of college for someone merely in the 80th percentile in ability were no greater than those of classmates at the 50th percentile, and only slightly greater than those of classmates in the bottom third of the class.

  Between the 1920s and the 1960s, college attendance becomes much more closely pegged to IQ

  Source: Taubman and Wales 1972, Figures 3, 4; and authors’ analysis of NLSY students who graduated from high school in 1980-1982.

  Between the 1920s and the 1960s, the largest change in the probability of going to college was at the top of the cognitive ability distribution. By 1960, a student who was really smart—at or near the 100th percentile in IQ—had a chance of going to college of nearly 100 percent.7 Furthermore, as the figure shows, going to college had gotten more dependent on intelligence at the bottom of the distribution, too.8 A student at the 30th percentile had only about a 25 percent chance of going to college—lower than it had been for high school graduates in the 1920s. But a student in the 80th percentile had a 70 percent chance of going to college, well above the proportion in the 1920s.

  The line for the early 1980s is based on students who graduated from high school between 1980 and 1982. The data are taken from the National Longitudinal Survey of Youth (NLSY), which will figure prominently in the chapters ahead. Briefly, the NLSY is a very large (originally 12,686 persons), nationally representative sample of American youths who were aged 14 to 22 in 1979, when the study began, and have been followed ever since. (The NLSY is discussed more fully in the introduction to Part II.) The curve is virtually identical to that from the early 1960s, which is in itself a finding of some significance in the light of the many upheavals that occurred in American education in the 1960s and 1970s.

 
