The Story of Ain't


by David Skinner


  In a letter to the president of Yale he protested against compulsory chapel and complained about the previous Sunday’s sermon, unloosing a quick and brutal spray of invective that must have stunned the Honorable James Rowland Angell. The sermonizer, said Dwight, was guilty of “puerile, stupid twaddle” and seemed to have “a remarkable power of hypnotizing himself with magniloquent platitudes.”

  And so on: “Not one intelligent remark did he make.” Lest the university president think the letter a joke of some kind, the writer vouched for every word: “All I have said here I am quite sincere in. Be assured.”

  He was like a faucet of words that could not be turned off. His mother, always worried about his social and professional prospects, advised that perhaps he should not have so caustically reviewed a faculty member’s book in the student newspaper.

  He reviewed people, just as he reviewed books and sermons. In 1925, to a girlfriend of Dinsmore’s, he addressed a particularly vicious diatribe made all the worse by the smug vanity of its literary style: “I missed in you a certain dignity, an aloofness and sense of personal pride that I fancy is a sign of the lady. . . . Humility is one of the Christian virtues, but as G. K. Chesterton paradoxically points out, humility is merely pride carried to a splendid extreme.”

  Speaking of religion: “And then too there was the fact that you are a Jewess, and are rather obviously one, to make me react unfavorably. For I dislike rather violently the Jews as a race.”

  If he sounds villainous on the subject of Jews, he is little better on the subject of blacks. After seeing The Birth of a Nation, D. W. Griffith’s pioneering silent film of 1915, known even then for its retrograde politics, Macdonald—a future Today show film critic—raved, “Its emotional kick is tremendous. You want to tear in pieces the cocky insolent niggers and carpetbaggers of ‘after-the-war’ days, and when the good old Ku Klux Klan comes sweeping down on horseback and rescues the besieged whites, you want to cheer.”

  Taking a step back from his initial visceral response, Macdonald writes, “In spite of my prejudice against the modern K.K.K., that was the way I felt.”

  Such words were, of course, less shocking in the 1920s, though even then the N-word was a “substandard” term, as the 1934 Webster’s Second put it, “used familiarly, now chiefly contemptuously.” Macdonald did sometimes use it unmaliciously, as when he raved to Dinsmore about Fletcher Henderson’s jazz band: “Sassy, boy, that’s some band . . . those niggers played like men possessed.” The air of condescension, though, is quickly gone as he characteristically looks to name the precious qualities that elevated the musical performance: “The disciplined passion they exhibit is the very essence of the greatest art.”

  The semiprivate mental grapplings of youth, viewed in the harsh light of retrospect, deserve better than snide dismissal. Overall, Macdonald was a riveting if morally dubious specimen: Rare is the person of any age so joyfully committed to the tricky work of self-report.

  Oddly, he did not immediately seek work as a writer upon graduation from Yale. He made the rounds of advertising firms in New York but accepted a position in the manager training program at Macy’s, thinking a year of business experience would make him a more desirable applicant when he did go into advertising. “Up to this time, you know,” he told Dinsmore, “literature was my end-all and be-all and my greatest ambition was to one day create it. Well, right now, I don’t care much whether I ever set pen to paper again.”

  He frankly admired the businessmen he’d met, and felt an intemperate curiosity about them. “These men were so cold, so keen, so absolutely sure of themselves and wrapped up in business that I felt like a child before them.” If this was the real world, it made campus life seem trivial. “I thought of profs here [at Yale] and back at Exeter who represented culture: they made a poor showing besides these men.”

  And the larger world of art and letters—whence recently came Eliot’s The Waste Land and Hemingway’s The Sun Also Rises, among other masterpieces—looked shrunken and useless. “I tell you, Dinsmore, that American art, letters, music, culture is done. There are hundreds of businessmen, thousands of them, who are better in their line of work than the best poet or painter we have today.” Before the letter was sent, however, his excitement died down. “This is all too exaggerated,” he said, adding that he had not forgotten his and Dinsmore’s dream to live on a farm together. Nor was he shedding his literary ambitions. But business had caught his eye and he wanted to see what it was all about.

  The year was 1928, and an infatuation with capitalism was not at all ridiculous. A boom market was on, and stock prices had more than tripled in five years. The American businessman not only seemed awesome to a young man deciding on a career; he seemed awesome to the increasing rolls of middle-class investors. He seemed awesome in general.

  But by October the freshly minted Yale grad was growing sardonic. He was working the floor of Macy’s, telling customers about the wonderful new fabric called rayon. Coming up was an exhibition from the Rayon Institute of America, intended, he told Dinsmore, “to educate the public about rayon, that is, in order to delude people into believing that rayon is twice as cheap a material as silk, which is true, and also twice as good, which . . . is also true.” (Sarcastic ellipses in original.) For his part in the exhibition, trainee Macdonald would stand around and explain how this new wonder fabric was made and tell anyone “who is sap enough to question me that at eleven and three each and every day there will be shown at no charge a motion picture with the title The Romance of Rayon.”

  Come the new year, he was fully contrite. “Now I realize what a fool I was to go to Macy’s and from now on I freely accept that I am an intellectual-artist-man of ideas.” In March 1929 Macdonald began a new job in the editorial trenches of Henry Luce’s growing media empire.

  Chapter 6

  “Surely goodness and mercy shall follow me all the days of my life; and I will dwell in the house of the Lord forever.” So sayeth the King James Bible—wrongly, according to Lindley Murray, writing in the late eighteenth century. The passage was “not translated according to the distinct and proper meaning of the words shall and will,” said Murray. Instead, “it ought to be, will follow me, and, I shall dwell.”1

  Murray’s textbook was the standard reference on grammar in the United States and was popular well into the nineteenth century, when few such books departed from the rule first set down by John Wallis in 1653 that shall is to be used in the first person and will in the second and third persons.2 From this came Murray’s complaint about Psalm 23, as well as the rule that no one should ever use will to ask a question in the first person.

  Will I? Never.

  In the 1920s, this accumulated wisdom was examined by a professor named Charles Carpenter Fries, originally from Reading, Pennsylvania. He had taught ancient Greek for several years, until the subject was no longer required for admission to elite colleges, and then switched to teaching literature and composition before heading to the University of Michigan to study rhetoric and the history of English.3

  At Michigan he encountered the scientific view of language. It was, for him, a “new world” that eventually changed his “whole view of language and grammar.” His intellectual assumptions were completely overturned. “It seemed to me as revolutionary as the Copernican system in astronomy, the germ theory of disease in medicine, or the study of molecular structure in physics.”4

  Fries (pronounced like freeze) sought to observe the barest facts of language: not what might be said in an interpretive spirit, but what could be noticed and taken as physical evidence. Equipped, like many of his breed, with a vast patience for the minutest details, he once wrote a study of punctuation in Shakespeare. The other question that fascinated him concerned how language should be taught.

  An enthusiast in temperament, he occasionally delivered sermons at First Baptist Church in Ann Arbor (he had gone to divinity school for two years and almost become a minister), and he was prone to moral exhortation at work, telling younger colleagues to consider the benefits of exercise, swimming in particular.5 A very clear writer, Fries was uniquely suited to popularizing the findings of his adopted field, linguistics, and applying its lessons to those points of grammar and usage that make elementary students and even professional editors anxious. Full of liberal confidence, he didn’t mind reducing the miasma of scholarship to a clear set of truisms the layman could understand.

  And it bothered him that the layman didn’t understand the lessons of linguistics. For over a hundred years the objective facts of language had been studied comparatively and historically, yet “the modern scientific view of language . . . and the results of scholarly investigations in the English language have not reached the schools.”6 In 1927, he became the president of the National Council of Teachers of English, where he sought to evangelize on behalf of the modern scientific view of grammar.

  Teachers didn’t seem to realize that pronunciation was enormously variable and there was no such thing as a single correct standard. Some even believed that spelling could be used as a guide to “correct” pronunciation. One teacher he’d encountered was so insistent on this point that she told her students that since laughter was spelled just like daughter and slaughter, it should be pronounced the same way—a ridiculous notion since, in modern English, spelling often provides little or no guidance on pronunciation.

  The popular view of language was a primitive view, Fries noticed. People confused words with the things they represented and invented euphemisms to escape harsh realities, such as when they said passing to avoid saying death. They ascribed mystical powers to words. John Ruskin had said that wife truly meant “weaver,” telling the womenfolk, “In the deep sense, you either must weave men’s fortunes and embroider them, or feed upon them and bring them to decay.” Thomas Carlyle saw in king a connection to canning (an obsolete word meaning ability), adding that the truly “able man” indeed “has a divine right over me.”

  These moralizing flourishes issued from the assumption that common words could be unmasked to reveal their secret selves and yield insights into the fundamental natures of wives and kings and so on. The next step was for such mistaken beliefs to limit the use of certain words. It had been said that metropolis, on the basis of its Greek roots in the words for mother and city, should only be used to refer to cathedral cities, thus “Canterbury is the metropolis of England, but London is not.” Aggravate, according to Richard Grant White, “is misused . . . in the sense of provoke, irritate, anger.” The supposedly correct word in such cases was irritate, while the true meaning of aggravate only covered situations where something bad was made worse. Awful, according to yet another master of the king’s English quoted by Fries, could only mean awe-inspiring.

  But it was not so. “The real meaning of any word,” argued Fries, “must be finally determined, not by its original meaning, its source or etymology, but by the content given the word in actual present usage. . . . Even a hardy purist would scarcely dare pronounce a painter’s masterpiece awful, without explanations.”7

  Language was far more complicated than laymen realized. After he became a father, Fries would illustrate this hidden complexity using the word chair. The routine began with Fries asking for a definition. One of his children might answer, “A piece of furniture with four legs for sitting on.” Then the questioning began. Do all chairs have four legs? Is a chair always for sitting on? Can a chair not be a thing at all but a word to describe something else? Can a person be a chair? Can a chair be an action?

  Of course, not all chairs have four legs and not all chairs are for sitting on. And in chair cushion, the word only describes something that is definitely not a chair. And there are, of course, people called chairs, frequently a chairman or a chairwoman who, indeed, is said to chair (verb) a committee or some other body. How, then, can the idea of a chair be stated so as to unify all these permutations? Is it even possible? Can chair really be defined?

  The lesson of the exercise was that no supposedly fixed element of what a chair really is could survive interrogation. Everything that seemed to belong to one example of chair could be shown to be completely absent from some other chair.8

  When the language proved too elusive for the rules and definitions and principles we devised for it, the only recourse was to evidence. To test the meaning and usage of words, it was necessary to develop extensive empirical records of their appearance in speech and writing. To get to the heart of shall and will, Fries did just that.

  He assembled fifty plays of the English theater, stretching back from the present day to 1550. The choice of literature type was a deft touch, as it offered “the best compromise between the living spoken English and the written English of literature.”9 He identified plays for each decade, and then (with help from his devoted wife, Agnes) tallied nearly twenty thousand usages. Finally, he examined each instance of will and shall to see how they compared with the rules promulgated by Lindley Murray and others.
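
  Fries and Agnes did that tallying by hand, reading each play and classifying every instance by grammatical person and by meaning. As a rough modern illustration only, the raw count itself is the kind of thing a few lines of Python can reproduce; the sketch below assumes a folder of plain-text play files (the folder name and file layout are hypothetical), and it cannot do the hard part Fries did, telling the auxiliary will from the noun or sorting first person from second and third.

    import re
    from collections import Counter
    from pathlib import Path

    # Raw tally of "shall" vs. "will" across a folder of plain-text plays.
    # The folder name "plays/" is illustrative. Fries's actual count also
    # classified each hit by person and by intended meaning, by hand.
    AUXILIARIES = re.compile(r"\b(shall|will)\b", re.IGNORECASE)

    totals = Counter()
    for play in Path("plays").glob("*.txt"):
        text = play.read_text(encoding="utf-8")
        totals.update(m.group(1).lower() for m in AUXILIARIES.finditer(text))

    n = totals["shall"] + totals["will"]
    if n:
        print(f"shall: {totals['shall']:6d} ({100 * totals['shall'] / n:.1f}%)")
        print(f"will:  {totals['will']:6d} ({100 * totals['will'] / n:.1f}%)")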

  Now, as far as Fries was concerned, a rule could only be verified by being widely followed. “There can thus never be in grammar an error that is both very bad and very common. The more common it is, the nearer it comes to being the best of grammar.”

  A rule of language that was not widely followed was, ipso facto, meaningless, an absurdity, without standing in reality. It was like the “true” meaning of wife or metropolis or awful, something made up by one of those mercurial writers whose opinions were based on a very partial understanding of language history and a most selective interest in the language as used by native speakers.

  In first-person usage, Fries found that the traditional rules for shall and will held up, but examples from the first decades of the twentieth century showed will taking over an increasing share of shall’s business. And outside of the first person, the traditional rules bore less and less of a resemblance to actual usage. The Oxford English Dictionary said that in second-person questions, shall is the “normal” word to use. This was not evidenced by Fries’s research: “Of the 512 questions in the second person but 7 or 1.3 percent use shall; all the rest employ will.”

  Fries looked at shall and will in British and American plays published since 1900. In several categories, shall was perceptibly vanishing. This was especially awkward for the textbooks: in the United States, almost none acknowledged the existence of any forms of the future tense other than those using shall and will as auxiliary verbs. Going to, plan to, desire to, intend to—there were many in common use, all ignored to reserve room for shall, the language’s supposed equal partner in the future tense.

  Modern usage also called into question the words’ standard definitions. Characters in the plays Fries examined frequently used shall to express determination or intention and will to express simple futurity—the opposite of what many traditional rules prescribed.

  In the 1920s, Shall I? was still a common usage, but it would not be for long. A few years after Fries’s study was released, Bell Telephone allowed a researcher to listen in and count words spoken in the phone conversations of its customers. In the course of 1,900 conversations, will as an auxiliary term was used 1,305 times. Shall appeared only six times.10 The long-term trend was obvious. Twenty-five years hence, Alfred Hitchcock fans would leave the theater with the voice of a very proper Doris Day singing in their heads, using will in the first person to ask, “Will I be pretty? Will I be rich?”

  Fries showed that the roles of shall and will had changed a great deal since their usage was first described and then encoded in rules—rules that could not be relied on to predict or describe modern usage. Moreover, the conventional understanding of the future tense in English was hopelessly inadequate.

  His paper, “The Periphrastic Future with Shall and Will in Modern English,” technical in nature and scientific in method, was one of many flares across the bow of classroom grammar in the 1920s and ’30s. The infamous It’s me, the crime of dangling modifiers, the prohibition on split infinitives, the traditional understanding of subject-verb agreement, the use of double negatives, slang, and many other notions were being subjected to empirical examination to determine their actual status in literature and speech. A rebellion was under way against the rule of rules, and Fries was leading it.

  Chapter 7

  In 1927, the American editor Henry Seidel Canby attended a conference in London for the Society for Pure English. The society had been founded in 1913 by a band of prominent traditionalists that included the poet Robert Bridges, the essayist Sir Walter Raleigh, Logan Pearsall Smith (an American-born writer living in England whom Dwight Macdonald admired and corresponded with),1 and Henry Bradley, the second editor of the Oxford English Dictionary. Its prospectus called for “preserving all the richness in differentiation in our vocabulary” and holding on to “nice grammatical usages.” Assimilationist toward foreign words (wanting them to, please, surrender their funny accents and bad spelling at the border), the society “opposed whatever is slip-slop and careless, and all blurring of hard-won distinctions.”2

  The appearance of an OED editor, Bradley, among the society’s founding members might seem surprising. No group of scholars had done more than the dictionary’s editors and volunteer contributors to win respect for the historical study of language change, an achievement at odds with the society’s own view of English as all the more perfect the less it changed.

  But Bradley had his bugaboos. Swashbuckling he thought a terrible addition to the English language, and, for all his historical perspective, he wondered, When did people start calling the upper part of the human face a for-hed? Everyone knew it was supposed to be pronounced for-id.

 
