The Mother Tongue
In English, in short, we possess a language in which the parts of speech are almost entirely notional. A noun is a noun and a verb is a verb largely because the grammarians say they are. In the sentence “I am suffering terribly” suffering is a verb, but in “My suffering is terrible,” it is a noun. Yet both sentences use precisely the same word to express precisely the same idea. Quickly and sleepily are adverbs but sickly and deadly are adjectives. Breaking is a present participle, but as often as not it is used in a past tense sense (“He was breaking the window when I saw him”). Broken, on the other hand, is a past participle but as often as not it is employed in a present tense sense (“I think I’ve just broken my toe”) or even future tense sense (“If he wins the next race, he’ll have broken the school record”). To deal with all the anomalies, the parts of speech must be so broadly defined as to be almost meaningless. A noun, for example, is generally said to be a word that denotes a person, place, thing, action, or quality. That would seem to cover almost everything, yet clearly most actions are verbs and many words that denote qualities—brave, foolish, good—are adjectives.
The complexities of English are such that the authorities themselves often stumble. Each of the following, penned by an expert, contains a usage that at least some of his colleagues would consider quite wrong.
“Prestige is one of the few words that has had an experience opposite to that described in ‘Worsened Words.’ ” (H. W. Fowler, A Dictionary of Modern English Usage, second edition) It should be “one of the few words that have had.”
“Each of the variants indicated in boldface type count as an entry.” (The Harper Dictionary of Contemporary Usage) It should be “each . . . counts.”
“It is of interest to speculate about the amount of dislocation to the spelling system that would occur if English dictionaries were either proscribed or (as when Malory or Sir Philip Sidney were writing) did not exist.” (Robert Burchfield, The English Language) Make it “was writing.”
“A range of sentences forming statements, commands, questions and exclamations cause us to draw on a more sophisticated battery of orderings and arrangements.” (Robert Burchfield, The English Language) It should be “causes.”
“The prevalence of incorrect instances of the use of the apostrophe . . . together with the abandonment of it by many business firms . . . suggest that the time is close at hand when this moderately useful device should be abandoned.” (Robert Burchfield, The English Language) The verb should be suggests.
“If a lot of the available dialect data is obsolete or almost so, a lot more of it is far too sparse to support any sort of reliable conclusion.” (Robert Claiborne, Our Marvelous Native Tongue) Data is a plural.
“His system of citing examples of the best authorities, of indicating etymology, and pronunciation, are still followed by lexicographers.” (Philip Howard, The State of the Language) His system are?
“When his fellowship expired he was offered a rectorship at Boxworth . . . on condition that he married the deceased rector’s daughter.” (Robert McCrum, et al., The Story of English) A misuse of the subjunctive: It should be “on condition that he marry.”
English grammar is so complex and confusing for the one very simple reason that its rules and terminology are based on Latin—a language with which it has precious little in common. In Latin, to take one example, it is not possible to split an infinitive. So in English, the early authorities decided, it should not be possible to split an infinitive either. But there is no reason why we shouldn’t, any more than we should forsake instant coffee and air travel because they weren’t available to the Romans. Making English grammar conform to Latin rules is like asking people to play baseball using the rules of football. It is a patent absurdity. But once this insane notion became established, grammarians found themselves having to draw up ever more complicated and circular arguments to accommodate the inconsistencies. As Burchfield notes in The English Language, one authority, F. Th. Visser, found it necessary to devote 200 pages to discussing just one aspect of the present participle. That is as crazy as it is amazing.
The early authorities not only used Latin grammar as their model, but actually went to the almost farcical length of writing English grammars in that language, as with Sir Thomas Smith’s De Recta et Emendata Linguae Anglicae Scriptione Dialogus (1568), Alexander Gil’s Logonomia Anglica (1619), and John Wallis’s Grammatica Linguae Anglicanae of 1653 (though even he accepted that the grammar of Latin was ill-suited to English). For the longest time it was taken entirely for granted that the classical languages must serve as models. Dryden spoke for an age when he boasted that he often translated his sentences into Latin to help him decide how best to express them in English.
In 1660, Dryden complained that English had “not so much as a tolerable dictionary or a grammar; so our language is in a manner barbarous.” He believed there should be an academy to regulate English usage, and for the next two centuries many others would echo his view. In 1664, the Royal Society for the Advancement of Experimental Philosophy formed a committee “to improve the English tongue,” though nothing lasting seems to have come of it. Thirty-three years later in his Essay Upon Projects, Daniel Defoe was calling for an academy to oversee the language. In 1712, Jonathan Swift joined the chorus with a Proposal for Correcting, Improving and Ascertaining the English Tongue. Some indication of the strength of feeling attached to these matters is given by the fact that in 1780, in the midst of the American Revolution, John Adams wrote to the president of Congress appealing to him to set up an academy for the purpose of “refining, correcting, improving and ascertaining the English language” (a title that closely echoes, not to say plagiarizes, Swift’s pamphlet of sixty-eight years before). In 1806, the American Congress considered a bill to institute a national academy and in 1820 an American Academy of Language and Belles Lettres, presided over by John Quincy Adams, was formed, though again without any resounding perpetual benefits to users of the language. And there were many other such proposals and assemblies.
The model for all these was the Académie Française, founded by Cardinal Richelieu in 1635. In its youth, the academy was an ambitious motivator of change. In 1762, after many years of work, it published a dictionary that regularized the spellings of some 5,000 words—almost a quarter of the words then in common use. It took the s out of words like estre and fenestre, making them être and fenêtre, and it turned roy and loy into roi and loi. In recent decades, however, the academy has been associated with an almost ayatollah-like conservatism. When in December 1988 over 90 percent of French schoolteachers voted in favor of a proposal to introduce the sort of spelling reforms the academy itself had introduced 200 years earlier, the forty venerable members of the academy were, to quote the London Sunday Times, “up in apoplectic arms” at the thought of tampering with something as sacred as French spelling. Such is the way of the world. Among the changes the teachers wanted and the academicians did not were the removal of the circumflex on être, fenêtre, and other such words, and taking the -x off plurals such as bureaux, chevaux, and châteaux and replacing it with an -s.
Such actions underline the one almost inevitable shortcoming of national academies. However progressive and far-seeing they may be to begin with, they almost always exert over time a depressive effect on change. So it is probably fortunate that the English-speaking world never saddled itself with such a body, largely because as many influential users of English were opposed to academies as favored them. Samuel Johnson doubted the prospects of arresting change and Thomas Jefferson thought it in any case undesirable. In declining an offer to be the first honorary president of the Academy of Language and Belles Lettres, he noted that had such a body been formed in the days of the Anglo-Saxons English would now be unable to describe the modern world. Joseph Priestley, the English scientist, grammarian, and theologian, spoke perhaps most eloquently against the formation of an academy when he said in 1761 that it was “unsuitable to the genius of a free nation. . . . We need make no doubt but that the best forms of speech will, in time, establish themselves by their own superior excellence: and in all controversies, it is better to wait the decisions of time, which are slow and sure, than to take those of synods, which are often hasty and injudicious” [quoted by Baugh and Cable, page 269].
English is often commended by outsiders for its lack of a stultifying authority. Otto Jespersen as long ago as 1905 was praising English for its lack of rigidity, its happy air of casualness. Likening French to the severe and formal gardens of Louis XIV, he contrasted it with English, which he said was “laid out seemingly without any definite plan, and in which you are allowed to walk everywhere according to your own fancy without having to fear a stern keeper enforcing rigorous regulations” [Growth and Structure of the English Language, page 16].
Without an official academy to guide us, the English-speaking world has long relied on self-appointed authorities such as the brothers H. W. and F. G. Fowler and Sir Ernest Gowers in Britain and Theodore Bernstein and William Safire in America, and of course countless others. These figures write books, give lectures, and otherwise do what they can (i.e., next to nothing) to try to stanch (not staunch) the perceived decline of the language. They point out that there is a useful distinction to be observed between uninterested and disinterested, between imply and infer, flaunt and flout, fortunate and fortuitous, forgo and forego, and discomfort and discomfit (not forgetting stanch and staunch). They point out that fulsome, properly used, is a term of abuse, not praise, that peruse actually means to read thoroughly, not glance through, that data and media are plurals. And from the highest offices in the land they are ignored.
In the late 1970s, President Jimmy Carter betrayed a flaw in his linguistic armory when he said: “The government of Iran must realize that it cannot flaunt, with impunity, the expressed will and law of the world community.” Flaunt means to show off; he meant flout. The day after he was elected president in 1988, George Bush told a television reporter he couldn’t believe the enormity of what had happened. Had President-elect Bush known that the primary meaning of enormity is wickedness or evilness, he would doubtless have selected a more apt term.
When this process of change can be seen happening in our lifetimes, it is almost always greeted with cries of despair and alarm. Yet such change is both continuous and inevitable. Few acts are more salutary than looking at the writings of language authorities from recent decades and seeing the usages that raised their hackles. In 1931, H. W. Fowler was tutting over racial, which he called “an ugly word, the strangeness of which is due to our instinctive feeling that the termination -al has no business at the end of a word that is not obviously Latin.” (For similar reasons he disliked television and speedometer.) Other authorities have variously—and sometimes hotly—attacked enthuse, commentate, emote, prestigious, contact as a verb, chair as a verb, and scores of others. But of course these are nothing more than opinions, and, as is the way with other people’s opinions, they are generally ignored.
So if there are no officially appointed guardians for the English language, who sets down all those rules that we all know about from childhood—the idea that we must never end a sentence with a preposition or begin one with a conjunction, that we must use each other for two things and one another for more than two, and that we must never use hopefully in an absolute sense, such as “Hopefully it will not rain tomorrow”? The answer, surprisingly often, is that no one does, that when you look into the background of these “rules” there is often little basis for them.
Consider the curiously persistent notion that sentences should not end with a preposition. The source of this stricture, and several other equally dubious ones, was one Robert Lowth, an eighteenth-century clergyman and amateur grammarian whose A Short Introduction to English Grammar, published in 1762, enjoyed a long and distressingly influential life both in his native England and abroad. It is to Lowth we can trace many a pedant’s most treasured notions: the belief that you must say different from rather than different to or different than, the idea that two negatives make a positive, the rule that you must not say “the heaviest of the two objects,” but rather “the heavier,” the distinction between shall and will, and the clearly nonsensical belief that between can apply only to two things and among to more than two. (By this reasoning, it would not be possible to say that St. Louis is between New York, Los Angeles, and Chicago, but rather that it is among them, which would impart a quite different sense.) Perhaps the most remarkable and curiously enduring of Lowth’s many beliefs was the conviction that sentences ought not to end with a preposition. But even he was not dogmatic about it. He recognized that ending a sentence with a preposition was idiomatic and common in both speech and informal writing. He suggested only that he thought it generally better and more graceful, not crucial, to place the preposition before its relative “in solemn and elevated” writing. Within a hundred years this had been converted from a piece of questionable advice into an immutable rule. In a remarkable outburst of literal-mindedness, nineteenth-century academics took it as read that the very name pre-position meant it must come before something—anything.
But then this was a period of the most resplendent silliness, when grammarians and scholars seemed to be climbing over one another (or each other; it doesn’t really matter) in a mad scramble to come up with fresh absurdities. This was the age when, it was gravely insisted, Shakespeare’s laughable ought to be changed to laugh-at-able and reliable should be made into relionable. Dozens of seemingly unexceptionable words—lengthy, standpoint, international, colonial, brash—were attacked with venom because of some supposed etymological deficiency or other. Thomas de Quincey, in between bouts of opium taking, found time to attack the expression what on earth. Some people wrote mooned for lunatic and foresayer for prophet on the grounds that the new words were Anglo-Saxon and thus somehow more pure. They roundly castigated those ignoramuses who impurely combined Greek and Latin roots into new words like petroleum (Greek petra + Latin oleum). In doing so, they failed to note that the very word with which they described themselves, grammarians, is itself a hybrid made of Greek and Latin roots, as are many other words that have lived unexceptionably in English for centuries. They even attacked handbook as an ugly Germanic compound when it dared to show its face in the nineteenth century, failing to notice that it was a good Old English word that had simply fallen out of use. It is one of the felicities of English that we can take pieces of words from all over and fuse them into new constructions—like trusteeship, which consists of a Nordic stem (trust), combined with a French affix (ee), married to an Old English suffix (ship). Few other languages can do this so freely. We should be proud of ourselves for our ingenuity and yet even now authorities commonly attack almost any new construction as ugly or barbaric.
Today in England you can still find authorities attacking the construction different than as a regrettable Americanism, insisting that a sentence such as “How different things appear in Washington than in London” is ungrammatical and should be changed to “How different things appear in Washington from how they appear in London.” Yet different than has been common in England for centuries and used by such exalted writers as Defoe, Addison, Steele, Dickens, Coleridge, and Thackeray, among others. Other authorities, in both Britain and America, continue to deride the absolute use of hopefully. The New York Times Manual of Style and Usage flatly forbids it. Its writers must not say, “Hopefully the sun will come out soon,” but rather are instructed to resort to a clumsy and periphrastic construction such as “It is to be hoped that the sun will come out soon.” The reason? The authorities maintain that hopefully in the first sentence is an unattached adverb—that it doesn’t modify any other part of the sentence. Yet they raise no objection to dozens of other words being used in precisely the same unattached way—admittedly, mercifully, happily, curiously, and so on. No doubt the reason hopefully is not allowed is that somebody at The New York Times once had a boss who wouldn’t allow it because his professor had forbidden it, because his father thought it was ugly and inelegant, because he had been told so by his uncle who was a man of great learning . . . and so on.
Considerations of what makes for good English or bad English are to an uncomfortably large extent matters of prejudice and conditioning. Until the eighteenth century it was correct to say “you was” if you were referring to one person. It sounds odd today, but the logic is impeccable. Was is a singular verb and were a plural one. Why should you take a plural verb when the sense is clearly singular? The answer—surprise, surprise—is that Robert Lowth didn’t like it. “I’m hurrying, are I not?” is hopelessly ungrammatical, but “I’m hurrying, aren’t I?”—merely a contraction of the same words—is perfect English. Many is almost always a plural (as in “Many people were there”), but not when it is followed by a, as in “Many a man was there.” There’s no inherent reason why these things should be so. They are not defensible in terms of grammar. They are because they are.