The Coming of Post-Industrial Society

by Daniel Bell


  All this makes the question of the relation of intelligence to genetic inheritance very touchy. Is intelligence largely inherited? Can one raise intelligence by nurture? How does one separate native ability and drive from improvements in skill acquired through education? The average IQ of college graduates is 120, while that of high-school graduates is only 107. As Fritz Machlup, the Princeton economist, has commented: “The greater earning capacity of college graduates, compared with high-school graduates, is, no doubt, to a large extent [the figure is about 40 percent] the result of superior native intelligence and greater ambition; it would be quite wrong to attribute all of the incremental earnings to the investment in college education.” 48

  The logic of the argument has been pushed further by the Harvard psychologist Richard Herrnstein. Using data assembled by Arthur Jensen of Berkeley—that 80 percent of a person’s IQ is inherited, while environmental factors account for only 20 percent—Herrnstein then proceeds to extend the implication:

  1. If differences in mental abilities are inherited, and

  2. if success in society requires those abilities, and

  3. if the environment is “equalized,”

  4. then social standing will be based to some extent on inherited differences.49

  Herrnstein’s argument mixes up two different situations: the assertion that in American society today occupational position is largely a function of IQ, and the model of a meritocracy, whose stratification system would be determined by IQ. Herrnstein concludes, however, that if all persons are given an equal start, and equality of opportunity is fully realized, then heredity will become the decisive factor, since the social environment would be the same for all. And he draws a dismal picture of the new poor:

  ... there will be precipitated out of the mass of humanity a low-capacity (intellectual and otherwise) residue that may be unable to master the common occupations, cannot compete for success and achievement and are most likely to be born to parents who have similarly failed.50

  The relation of genetics to intelligence to social-class position involves five different kinds of disputed questions. The first is the question whether one can ever fix with any exactness the proportions of genetic inheritance and environment to intelligence. (This is possible only if one assumes they are causally independent, i.e. that biological endowment does not influence the environment; but this is highly unlikely.) Second is the question of what the IQ tests actually measure, whether only particular skills or some more general and unified underlying intelligence. Third is the question whether IQ tests are “culture-bound,” including even the self-styled “culture-fair” tests which do not deal with school-taught knowledge but ask the child to deduce relations and correlates within simple non-representational drawings. Fourth, the question whether the social class of the parent is more important than IQ in determining entry into college or occupational position in the society. Finally, the crucial question whether these relationships—between intelligence, social class background and other factors—have changed over time at all and, to that extent, whether the society is becoming more meritocratic.51

  What the parties to these disputes mix up, however, are two very different kinds of issues. One, whether the society—because of either social-class privilege or cultural advantage (e.g. the selective biases of IQ tests)—does or does not provide genuine equality of opportunity, or a fair start for all; and two, whether a society in which genuine equality of opportunity did prevail, but a new form of income and status inequality based on merit resulted, would be desirable? In other words, is it a more genuine equality of opportunity that is wanted, or an equality of result? It is the shuttling from one to another of these positions that has marked the populist argument in recent years and created a confusion in the political demands raised in its wake.

  Initially, equality of opportunity was the main preoccupation. The explicit fear created by a post-industrial society is that the failure to get on the educational escalator means the exclusion from the privileged places in society. A meritocratic society is a “credentials society” in which certification of achievement—through the college degree, the professional examination, the license—becomes a condition of higher employment. Education thus becomes a defensive necessity. As Lester Thurow has observed:

  As the supply of educated labor increases, individuals find that they must improve their educational level simply to defend their current income positions. If they don’t, others will, and they will find their current job no longer open to them. Education becomes a good investment, not because it would raise people’s incomes above what they would have been if no one had increased his education, but rather because it raises their income above what it will be if others acquire an education and they do not. In effect, education becomes a defensive expenditure necessary to protect one’s “market share.” The larger the class of educated labor and the more rapidly it grows, the more such defensive expenditures become imperative.52

  The logical outcome of these fears on the part of disadvantaged groups is a demand for “open admissions” to universities. The underlying rationale of the demand has been the argument that social-class origin of the parent was the primary factor skewing selection in the occupational system, and that open admission to colleges, despite low grades, would enable minority groups to compete more fairly in the society. To that extent, open admissions is no more than the historic American principle that everyone should have a chance to better himself, no matter where he starts. It is also the optimistic American belief that giving any student more education will do him more good. This was the logic behind the land-grant college acts; it was the long-standing practice of the public universities, outside the East, even before World War II.53

  But for some, the extension of this demand has become an attack on the meritocratic principle itself. As one proponent writes: “As long as open admissions remains limited to a few institutions, it poses no threat to the meritocracy. Recruitment into the elite will be based not on whether one went to college, but on where one went to college. Universal open admissions, however, would destroy the close articulation between the meritocracy and the system of higher education; further, by the very act of abolishing hierarchy in admissions, it would cast doubt on hierarchy in the larger society.” 54

  That argument, however, if pushed to its logical conclusion, would mean that admission to all higher schools in the country, from Parsons College to Harvard, should be by lot. And the further conclusion, since elite schools would still be defined by their faculty, would be to make teaching assignment in the national university system a matter of lot as well.

  Open admissions is a means of widening equality of opportunity for disadvantaged students by broadening access to the university. But there is also the question of place in the university structure itself—in the faculty, staff, and administration. In their comprehensive study of the American occupational structure, Peter Blau and Otis Dudley Duncan have shown that almost all the different minority groups have been able to achieve commensurate status, power, and economic rewards—with the exception of women and blacks. Clearly, if there is discrimination—on the basis of sex, or color, or religion, or any criterion extraneous to the stated one of professional qualification—there is no genuine equality of opportunity. The second effort to widen equality has been the effort to expand the number of places for minorities in the system.

  In the 1960s, the government declared it a matter of public policy that “affirmative action” had to be taken to rectify the discrimination against minorities. The policy of affirmative action was first proclaimed by President Johnson in an executive order of 1965. It stated that on all federal projects, or in any employment situation that used federal money, employers had to prove they had sought out qualified applicants from disadvantaged groups; had to provide special training where necessary, if qualified applicants could not be found immediately; and had to hire preferentially from among minority groups when their qualifications were roughly equal to those of other applicants. This program, combined with others such as Head Start and compensatory education programs, was designed to redress a historic cultural disadvantage and, quite deliberately, to give minority-group members, especially blacks, an edge in the competition for place.

  In the first years of the government affirmative-action program, the efforts were directed primarily within the skilled trades—especially the building trades, where there had been a deliberate policy of racial exclusion. In the early 1970s, the Nixon administration, acting through the Department of Health, Education and Welfare (HEW), extended the program to universities, and each school with federal contracts was asked to provide data on the number of minority persons in each position, academic and non-academic, and to set specific goals for increasing the number of minority-group persons in each classification. As Edward Shils summarized the order:

  Universities were informed that for each category of employee in the university it would be necessary to specify rates of remuneration and number in each category by “racial breakdown, i.e. Negro, Oriental, American Indian, Spanish-surnamed Americans....” This had to be accompanied by an “Affirmative Action Program which specifically and succinctly identif[ies] problem areas by division, department location and job classification, and includes more specific recommendations and plans for overcoming them.” The “Affirmative Action Program” must “include specific goals and objectives by division, department and job classification, including target completion dates on both long and short ranges as the particular case may indicate. Analytical provision should be made for evaluating recruitment methods and sources; the total number of candidates interviewed, job offers made, the numbers hired with the number of minority group persons interviewed, made job offers and hired.”...55

  The initial intention of the Executive Order was to eliminate discrimination. But discrimination is difficult to prove, especially when the qualifications required for a job are highly specific. And the government’s test became: Are the members of the minority groups to be found in employment, at every level, in numbers equal to their proportion in the population? Or, if women earned 30 percent of the Ph.D.s, are 30 percent of the faculty women? What this meant, in theory, was to set “target” figures for women and blacks. In practice, this has meant quotas, or priorities in hiring, for persons from these groups.

  What is extraordinary about this change is that, without public debate, an entirely new principle of rights has been introduced into the polity. In the nature of the practice, the principle has changed from discrimination to representation. Women, blacks, and Chicanos are to be employed, as a matter of right, in proportion to their number, and the principle of professional qualification or individual achievement is subordinated to the new ascriptive principle of corporate identity.56

  The implications of this new principle are far-reaching. One can, “logically,” insist on quotas where the skill is homogeneous, where one person can readily substitute for another. But by focusing on group identity rather than the person, by making the mechanical equation of number of women Ph.D.s to number of positions they should hold, the government assumes that “educated labor” is “homogeneous”—that individual talent or achievement is less important than the possession of the credential. This may be true in many occupations, but not in university teaching and research, where individual merit is the singular test. Putting someone in a tenure position, which is capitalized at three-quarters of a million dollars, is very different from hiring a black rather than a white plumber; simply having the degree is not necessarily the qualification for the high position.

  Furthermore, quotas and preferential hiring mean that standards are bent or broken. The inescapable assumption of the ascriptive criterion as regards tenured university positions is that minority persons are less qualified and could not compete with others, even if given a sufficient margin. What effect does this have on the self-esteem of a person hired on “second-class” grounds? And what effect does it have on the quality of a university, its teaching and research and morale, if its faculties are filled on the basis of quotas?

  But quotas themselves are no simple matter. If “representation” is to be the criterion of position, then what is the logic of extending the principle only to women, blacks, Mexicans, Puerto Ricans, American Indians, Filipinos, Chinese, and Japanese—which are the categories in the HEW guideline? Why not to Irish, Italians, Poles, and other ethnic groups? And if representation is the criterion, what is the base of representation? At one California state college, as John Bunzel reports, the Mexican-Americans asked that 20 percent of the total work force be Chicanos, because the surrounding community is 20 percent Mexican-American. The black students rejected this argument and said that the proper base should be the state of California, which would provide a different mix of blacks and Chicanos. Would the University of Mississippi be expected to hire 37 percent black faculty because that is the proportion of blacks in the population of Mississippi? And would the number of Jews in most faculties of the country be reduced because the Jews are clearly overrepresented in proportion to their number?

  And if ethnic and minority tests, why not religious or political beliefs as the criteria of balanced representation? Governor Reagan of California has said that conservatives are highly underrepresented in the faculties of the state universities, a fact evident when the political coloration of those faculties is compared with voting results in California; should conservatives therefore be given preference in hiring? And should particular communities be asked to support the teaching of certain subjects (or the presence of certain books in school libraries) which are repugnant to the beliefs of that community?—a question first raised in the Virginia House of Burgesses in 1779 and a principle restated by the Tennessee legislature in the 1920s in barring the teaching of evolution in that Fundamentalist state.

  The historic irony in the demand for representation on the basis of an ascriptive principle is its complete reversal of radical and humanist values. The liberal and radical attack on discrimination was based on its denial of a justly earned place to a person on the basis of an unjust group attribute. That person was not judged as an individual, but was judged—and excluded—because he was a member of a particular group. But now it is being demanded that one must have a place primarily because one possesses a particular group attribute. The person himself has disappeared. Only attributes remain. The further irony is that according to the radical critique of contemporary society, an individual is treated not as a person but as a multiple of roles that divide and fragment him and reduce him to a single dominant attribute of the major role or function he or she plays in society. Yet in the reversal of principle we now find that a person is to be given preference by virtue of a role, his group membership, and the person is once again “reduced” to a single overriding attribute as the prerequisite for a place in the society. That is the logic of the demand for quotas.

  DE-SCHOOLING

  From a different direction there has come another attack on the idea of meritocracy: the argument that all schooling is being subordinated to the demands of technocratic thinking and that the school is assuming a disproportionate influence in the society. The argument is made most sharply by Ivan Illich:

  The hidden curriculum teaches all children that economically valuable knowledge is the result of professional teaching and that social entitlements depend on the rank achieved in a bureaucratic process. The hidden curriculum transforms the explicit curriculum into a commodity and makes its acquisition the securest form of wealth. Knowledge certificates—unlike property rights, corporate stock or family inheritance—are free from challenge.... school is universally accepted as the avenue to greater power, to increased legitimacy as a producer, and to further learning resources.57

  For Illich—whose mysterious role as both Catholic heresiarch and prowler in the corridors of power has made him a figure of cultural curiosity58—there is a distinction between schooling and education. Schooling is an instrument that enables a person to accumulate a “knowledge stock,” just as business once allowed individuals to accumulate a “capital stock.” 59 Education is the “free determination by each learner of his own reason for living and learning—the part that his knowledge is to play in his life.” Since schooling has become completely instrumental, and a barrier to education, one must eliminate schools and create a process whereby each person can pursue the education he wants and needs.

  For Illich, schooling creates a new hierarchy in which the hierophants of knowledge maintain their position by arcane and technical knowledge that is closed off from the rest of society.60 “Effective access” to education requires “a radical denial of the right of facts and complexity of tools on which contemporary technocracies found their privilege, which they, in turn, render immune by interpreting its use as a service to the majority.”

  In place of institutions—which only develop vested interests to maintain the privileges of their administrators—Illich would substitute “learning webs” made up of skill-exchanges, peer-matching and Educators-at-Large, intellectual sadhus or gurus, wandering scholars, available at call. There would be no compulsory attendance, no credentials, just education pour le goût in the street bazaars of learning.61 And all of it financed by the tax money hitherto spent on the schools.

  The distinction between education and schooling is a relevant one. At one time, the two were joined. We then lived, as James Coleman has put it, in an “information-poor” society.62 The degree of direct experience on a farm or in the small town may have been large, but the range of vicarious experience—the acquaintance with the world of art, or cultures or politics outside the immediate region—was limited to books and school. School was the central organizer of experience and the codifier of values. Today the situation has changed enormously. Whether the amount of direct experience of the child has shrunk is moot; it is perhaps a romantic fallacy to believe that the child today, with the increased mobility of travel and the variety of urban stimuli available to him, has fewer direct experiences than before. But the range of vicarious experience, with the spread of communication and the wider windows onto the world offered by television, diverse magazines, picture books and the like, has broadened enormously. Education takes place outside the school, in the multifarious influence of media and peer group, while schools, because of their gatekeeper roles, have become more vocational and specialized.
