Slouching Towards Gomorrah


by Robert H. Bork


  For egalitarians, there is always lurking the nightmare that there may be genetic differences between ethnic groups that result in different average levels of performance in different activities. Only that fear can explain the explosive rage with which some commentators received The Bell Curve by the late Richard Herrnstein and Charles Murray, which, as a small part of a much larger thesis, concluded that there are heritable differences in cognitive ability among the races.26 Some comments expressed respectful and thoughtful disagreement, some asked for careful reexamination of the data and arguments, but some did little more than shout “Nazi.” Herrnstein and Murray are not racists but serious scholars. They may be right or they may not, but the episode indicates the degree to which the ideology of egalitarianism censors expression and thought in sensitive areas.

  Then, too, science may be offensive to egalitarians because it is a difficult enterprise, reserved in its most important spheres for people of high intelligence who have put in years of arduous study and work. Scientific knowledge is increasingly arcane, beyond the understanding of most of us. This exposes scientists to envy and hence to the inevitable charge of elitism. Radical egalitarians enjoy nothing more than lowering or destroying the prestige of an elite class. This suggests that the scientific temper’s role in destroying the intellectual authoritarianisms of the past—religious and political—was not valued entirely, or perhaps even primarily, for the good supposedly done humanity but rather for the simple reason that elite classes were being brought down. Now it is the scientists’ turn.

  This development can be seen in any number of academic, previously intellectual, fields. Sometimes called post-modernism or post-structuralism, this denial of truth is, as Gertrude Himmelfarb says, “best known as a school of literary theory. But it is becoming increasingly prominent in such other disciplines as history, philosophy, anthropology, law, and theology…”27 It is also becoming increasingly difficult to call some of those subjects “disciplines.” In every case—the attack on reason, on the concept of truth, and on the idea that there is an objective reality to which we must attempt to make our words and theories correspond—the impetus behind such assaults comes from the political left. Himmelfarb demonstrates that fact about history, Searle about curricular reform, Gross and Levitt about science, David Lehman about literary studies,28 and I have attempted to do so about academic constitutional theory.29 Nonsense these attacks may be, but, as the history of our century teaches, there is no guarantee that nonsense will not prevail, with dire results. In law, philosophy, literary studies, and history, among other subjects, we are raising generations of students who are taught by the “cutting edge” professors that traditional respect for logic, evidence, intellectual honesty, and the other requirements of discipline are not merely passé but totalitarian and repressive, sustaining existing social, political, and economic arrangements to the benefit of white, heterosexual males. To change society in radical directions, it is said, it is necessary to be rid of the old apparatus.

  The nonsensical denial of objective truth reached its apogee with physicist Alan Sokal’s hilarious unmasking of the social constructionists’ approach to science. He wrote an article entitled “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity,” which was accepted and published by the magazine Social Text in a special issue about “Science Wars.” Sokal took care to appeal to the editors’ ideological preconceptions, asserting that

  deep conceptual shifts within twentieth century science have undermined [the] Cartesian-Newtonian metaphysics …; revisionist studies in the history and philosophy of science have cast further doubt on its credibility…; and, most recently, feminist and poststructuralist critiques have demystified the substantive content of mainstream Western scientific practice, revealing the ideology of domination concealed behind the facade of “objectivity”…. It has thus become increasingly apparent that physical “reality,” no less than social “reality,” is at bottom a social and linguistic construct…30

  When he revealed his coup, Sokal said, “Anyone who believes that the laws of physics are mere social conventions is invited to try transgressing those conventions from the windows of my apartment. (I live on the twenty-first floor.)”31 He said that any competent undergraduate physics or math major should have spotted the article as a spoof that had no logical sequence of thought, but relied upon strained analogies and bold assertions.32

  The arrogant relativism that Sokal exposed as intellectual nonsense is a peculiar semi-nihilism. While it professes to be rid of logic and principles, it also has a fierce left-wing political and cultural agenda, which it could not have without accepting some principles. Thus, this component of the academic left preaches nihilism only to attack its opponents’ certitudes. Its own left-wing certitudes may not be attacked, precisely because the attack must rest on rationality, the possibility of which it has denied.

  Concerned about the breadth of the attacks on rationality, some 200 scientists, doctors, philosophers, educators, and thinkers met at the New York Academy of Sciences. “Defenders of scientific methodology were urged to counterattack against faith healing, astrology, religious fundamentalism and paranormal charlatanism. But beyond these threats to rational behavior, participants at the meeting aimed their barbs at ‘post-modernist’ critics of science who contend that truth in science depends on one’s point of view, not on any absolute content.”33

  The conferees deplored the distortion of scientific ideas, such as the physics of relativity and quantum mechanics (pillars of twentieth-century thought), into arguments that nothing in science is certain and that mystery and magic have an equal claim to belief. At risk was not only science but every subject dependent on disciplined, rational thought. Dr. Paul Kurtz, a professor of philosophy at the State University of New York at Buffalo, argued that post-modernists of both the political left and right denied that scientific knowledge was possible. This causes an “erosion of the cognitive process which may undermine democracy.”34

  The conferees were quite right to be concerned about the decline in rationality, but the connection between the cognitive process and democracy is not a simple one. No one who observes the state of democratic public discourse can believe that it relies heavily on either the cognitive process or rationality. Our quadrennial presidential debates are not the displays of reasoning and sustained argument that the Lincoln-Douglas debates were. If, contrary to the evidence, the candidates were up to such a discourse, the electorate would not want it. If there were a demand for politicians who engage in rational exploration of the issues, we would have such politicians. Instead, we have proposals that are demonstrably irrational—the balanced budget amendment to the Constitution (with no discussion of how it could possibly be enforced), reviving the Tenth Amendment to confine the federal government to the enumerated powers (an idea Americans would not tolerate if anybody explained that it meant, among other things, the end of Social Security and Medicare), and raising the minimum wage.

  The balanced budget amendment came within a single vote in the Senate of being proposed for ratification by the states. A rise in the minimum wage is approved by a large majority of Americans, yet its pernicious effects are plain and indisputable. A simple and fundamental principle of economics is that if the price of anything is raised, less of it will be purchased. Any minimum wage that raises wages above the market price will inevitably mean that fewer workers are hired or retained in their jobs. If anybody could show that this was not so, he would achieve immortality in history by destroying a centuries-old foundational principle of economics. Yet the policy rolls on, propelled not by any cognitive process but by irrational demagoguery. The case for democracy is not that it produces policy by deep thought, but that it is the safest form of government for its citizens.

  The professor is right to the extent that an increase in irrationality will lead to even worse democratic results. Erosion of the cognitive process will also produce a society that is less competent in fields from economic activity to scholarship. A less competent society is a less affluent one and a less happy one. These are reasons enough to resist the decline of intellect in all of its manifestations.

  14

  The Trouble in Religion

  Some of the most acute observers have thought that religion is essential to the health of American culture and, perhaps, to the survival of our democratic institutions. Most of these commentators viewed religion as the basis of morality, which is fundamental to all else. It is significant, then, that religion was seen as secure and central to American life in the nineteenth century but has appeared increasingly problematic and peripheral in the twentieth.

  Alexis de Tocqueville thought it important that America’s women were supremely religious because women are the protectors of morals. Christianity reigns without obstacle, by universal consent, he said, and the consequence is that every principle of the moral world is fixed and determinate. While the law permits Americans to do as they please, religion prevents them from contemplating, and forbids them to commit, what is rash or unjust. Americans hold religion to be indispensable to the maintenance of republican institutions. Despotism may govern without faith, but liberty cannot. These observations, contained in the first volume of Democracy in America,1 were written in 1834. They could hardly have expressed greater confidence about religion’s beneficial role in our national life.

  In volume two, published in 1840, Tocqueville’s remarks about religion’s prospects were less optimistic. Perhaps further reflection had suggested the presence of a worm in the American apple. He noted that Christianity had felt, to some degree, the influence that social and political conditions exercise on religious opinions. He saw the struggle of religion with that spirit of individual independence which is her most dangerous opponent. In discussing the progress of Roman Catholicism in the United States, Tocqueville said: “One of the most ordinary weaknesses of the human intellect is to seek to reconcile contrary principles and to purchase peace at the expense of logic. There have ever been and will ever be men who, after having submitted some portion of their religious belief to the principle of authority, will seek to exempt several other parts of their faith from it and to keep their minds floating at random between liberty and obedience.” Finally, he remarked that democratic nations incline to pantheism.2

  That was a remarkable performance. In a few pages Tocqueville not only recognized that the Zeitgeist was capable of changing the substance of religion but anticipated the ravages that radical individualism would inflict upon religion. Thus, he also foresaw the emergence of “cafeteria Catholics,” who obey only those teachings of the Church they find congenial, and the coming of pantheistic New Age religions. Those observations may certainly be read as predictions of trouble in religion in America, and, given the centrality Tocqueville accorded religion, as predictions of trouble in morals, culture, and self-government. If so, the predictions have been lavishly borne out.

  Whether or not the link between religion and morality can be demonstrated conclusively (as I have come to believe it can), it is true that the coming of trouble in our culture coincided with a decline in the influence of religion. “In the mid-nineteenth century England and America reacted to the consequences of industrialization, urbanization, immigration, and affluence by asserting an ethos of self-control,” James Q. Wilson writes, “whereas in the late twentieth century they reacted to many of the same forces by asserting an ethos of self-expression.”3 Religion and the voluntary associations inspired by religious life were the source of the ethos of self-control, working through the processes of habituation in the family, the schools, the neighborhood, and the workplace. The secular ethos of self-expression led to excesses, according to Wilson, because of the unwillingness of certain elites to support those processes of habituation. He does not draw a conclusion about the importance of religion, but his observations do more than merely suggest that importance.

  The late Christopher Lasch, who was by no means a conservative, asked “what accounts for [our society’s] wholesale defection from the standards of personal conduct—civility, industry, self-restraint—that were once considered indispensable to democracy?” He answered that a major reason is the “gradual decay of religion.” Our liberal elites, whose “attitude to religion,” Lasch said, “ranges from indifference to active hostility,” have succeeded in removing religion from public recognition and debate.4

  There is a very considerable additional danger in removing religion from public recognition and debate, as Richard John Neuhaus persuasively argued in The Naked Public Square.5 There is in many people a need for a belief in the transcendent to give meaning to their lives. By removing religion from the public space, we marginalize it, we deny its importance to society and relegate it to the private sphere. But if men need a transcendence that can be brought to bear on public affairs, and if religion is denied that role, other forms of transcendence, some of them quite ugly and threatening, may move in to occupy the empty space. In part, that has already happened. Many of the causes of the day—from environmentalism to animal rights—are pressed with an enthusiasm, a zealotry, that can only be called religious, and sometimes violence has resulted. There is also a splintering of morality when religion no longer provides a common set of moral assumptions.

  It is by no means universally conceded that morality flows from religion. A denial that religion is essential to morality comes from a surprising source. C. S. Lewis, a devout Anglican, wrote: “Men say, ‘How are we to act, what are we to teach our children, now that we are no longer Christians?’ You see, gentlemen, how I would answer that question. You are deceived in thinking that the morality of your father was based on Christianity. On the contrary, Christianity presupposed it. That morality stands exactly where it did; its basis has not been withdrawn for, in a sense, it never had a basis. The ultimate ethical injunctions have always been premises, never conclusions…. Unless the ethical is assumed from the outset, no argument will bring you to it.”6

  Lewis seems to make morality the basis for religion rather than the other way round. But morals or ethics, as he says, cannot be reached by reason, so the question becomes: where does morality come from? What can cause humans to assume the ethical from the outset? If morality can be created and maintained independently of religion, if it is prior to religion, then the decline of religion need not be a matter of overwhelming social concern; religion becomes a matter of individual salvation after death, of overwhelming importance to the individual but of little social concern. Yet it is observable that religion and morality have declined together.

  James Q. Wilson argues in The Moral Sense that people have a natural moral sense that is in part biological and in part derives from family life and natural human sociability. He does not deny religion a role but does not discuss it. In On Character, however, he refers to “processes of habituation that even in the absence of religious commitment lead to temperance, fidelity, moderation, and the acceptance of personal responsibility.”7 The question is, of course, whether secular habituation can sustain itself over generations. I was inclined at one time to think that it could, that each generation would teach its children virtues that they in turn would pass to their offspring. We all know persons without religious belief who nevertheless display all the virtues we associate with religious teaching. That might seem to suggest that religion is unnecessary to morality, but the counterargument is that such people are living on the moral capital of prior religious generations. Since secular habituation is grounded only in tradition, that moral capital will eventually be used up, with nothing to replenish it, and we will see a culture such as the one we are entering.

  This is not to dismiss Wilson’s persuasive showing that humans have a natural moral sense, but the evidence so far suggests rather strongly that the natural moral sense is not of itself adequate to provide the level of morality necessary to save a culture. Wilson himself suggests that conclusion: “Having thought about the matter for many years, I can find no complete explanation for the worldwide increase in crime rates that does not assign an important role to a profound cultural shift in the strength of either social constraints or internal conscience or both, and I can find no complete explanation of that cultural shift that does not implicate to some important degree our convictions about the sources and importance of moral sentiments.”8 The natural sources of moral sentiments that he discusses are, presumably, what they always have been. Thus, something additional must be found that accounts not only for rising crime rates but for the more general cultural degeneration that is the subject of this book. I find it difficult to imagine what that something else might be, for America, other than the ebbing of religious faith.

  There are, of course, countries with high ethical standards—low rates of divorce and illegitimacy, for example—that are not only not Christian but are not religious in any Western sense. Japan appears to be such a country. Japan, however, is also not a Western culture. Its religion, Shinto, features ancestor worship, which is a way of revering traditional virtues and thus enforcing morality. The homogeneity of the Japanese population also makes it possible, for the time being at least, to maintain morality through tradition.

 
