TEMPLATES
Simplifying somewhat for the sake of brevity and clarity, the two most crucial kinds of words are, as you might expect, nouns and verbs. And accordingly, the two templates (roughly, phrases and clauses) are headed, respectively, by nouns and verbs.
The noun template looks something like this:
(Modifierⁿ) Noun (Modifierⁿ) → Nmax
This means that a noun may have an indefinite number of modifiers either before it, or after it, or occasionally a mix of these. Each language determines for itself on which side of the noun modifiers can go. (As in “The tall blond man with one left shoe”—English is pretty loose with its modifiers, as languages go.) The modifiers can include other phrases—[with [one left shoe]]—provided there’s a preposition or something to link them with the rest of the phrase. They can even include clauses—“The tall blond man [you saw yesterday].” The total phrase we’ll call Nmax, the maximal expression of a noun.
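To see the noun template in action, here is a minimal sketch in Python (my illustration only; the function and its arguments are invented for the purpose):

```python
# Toy rendering of the noun template: (Modifier^n) Noun (Modifier^n) -> Nmax.
# Which side modifiers go on is a per-language setting; English allows both.

def nmax(noun, before=(), after=()):
    """Build a maximal noun expression: any number of modifiers,
    then the noun, then any number more."""
    return list(before) + [noun] + list(after)

phrase = nmax("man", before=("the", "tall", "blond"),
              after=("with", "one", "left", "shoe"))
print(" ".join(phrase))  # -> the tall blond man with one left shoe
```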
The verb template looks something like this:
(Nmaxⁿ) Verb (Nmaxⁿ) → Vmax
As before, this indicates that an indeterminate number of Nmaxes can either precede or follow the verb or, again as in English, you can have it both ways. But although the number must remain indefinite in a general formula, it is limited in any individual case by the number of possible arguments of the verb concerned.
What that means is that to attach to a verb, an Nmax has to fulfill a specific role relative to that verb. It has to be its Agent (whatever performs the action of the verb) or its Theme (whatever undergoes the action) or its Goal (whatever or whoever the action is directed toward). (There are other less important thematic roles, but they needn’t concern us here.) Not every verb assigns the same arguments. “Fall” takes only a Theme (“Bill fell”). “Melt” can take one (“The ice [Theme] melted”) or two (“Bill [Agent] melted the ice [Theme]”). A verb like “tell” can take all three (“Mary [Agent] told Bill [Goal] the time [Theme]”).
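The way a verb’s meaning caps the number of its arguments can be sketched the same way; the frames below simply hand-code the three examples just given (nothing here comes from any real lexicon):

```python
# Invented illustration: each verb lists the thematic-role frames it permits.
FRAMES = {
    "fall": [("Theme",)],
    "melt": [("Theme",), ("Agent", "Theme")],
    "tell": [("Agent", "Goal", "Theme")],
}

def licensed(verb, n_args):
    """Can this verb combine with exactly n_args Nmax arguments?"""
    return any(len(frame) == n_args for frame in FRAMES[verb])

print(licensed("fall", 1))  # True:  "Bill fell"
print(licensed("fall", 2))  # False: no *"Bill fell the ice"
print(licensed("melt", 2))  # True:  "Bill melted the ice"
print(licensed("tell", 3))  # True:  "Mary told Bill the time"
```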
Where do these templates come from? The noun template must have started to emerge the very first time two things had to be distinguished from each other: “The big one, not the small one.” Soon there would be cases in which that had to be refined: “The big red one, not the small red one.” The verb template is implicit in the meanings of verbs. “Fall” affects only who or what falls, so it can have only one of the three main thematic roles. “Melt” can be something that happens or something you make happen, so it may have either one or two of those roles. But if you “tell,” you have to do it to somebody and you have to have something to tell them, so all three roles have to be represented somehow.
With all this apparatus, you might think you would have enough to run a language on, to compose it more or less automatically and understand it the same way, even if it was a good deal less complex than most languages are today. But wait. Where in all this is the process singled out by Chomsky as the one central and uniquely linguistic capacity—recursion?
BROUHAHA ABOUT PIRAHÃ
It’s not often that a heavy-duty linguistic argument finds its way into the pages of The New Yorker.
However, that’s exactly what happened in the spring of 2007, when that magazine featured an article on the work of Dan Everett, a linguist who for many years had been studying a language called Pirahã, spoken by an indigenous tribe in the Amazon basin.
Why would your typical New Yorker reader give a flying you-know-what about a language spoken by a few hundred jungle-dwelling pre-literates that many even in the professional linguistic community had never heard of? In twenty-first-century terms, there’s only one possible answer. The language provided data that seemed to challenge Chomsky. Challenging Chomsky is, as I mentioned in chapter 9, a continuing obsession, one that taps into the great divide between those who think culture determines the bulk of human behavior and those who think biology does it.
So the first lot thought they’d discovered a smoking gun, and for a few heady weeks the chattering classes found themselves grappling with a strange new concept: recursion.
Recursion, we are told, is the rat that ate the malt that lay in the house that Jack built. It’s what enables us to expand sentences indefinitely, to infinity if need be, by inserting phrases within phrases, clauses within clauses—just like those Russian dolls that have smaller but otherwise identical dolls nesting inside them. It’s what, as we saw in chapter 9, Chomsky and his colleagues regard as not only the most central part of language, but possibly the sole content of FLN, the only part that’s unique to humans. Consequently, it must be a universal of human language, determined by our biological makeup—mustn’t it?
But Dan Everett was claiming that Pirahã had no recursion.
Chomskyan linguists launched a massive counterattack on Everett’s analysis, saying he’d gotten it all wrong, that some of his own examples disproved his claims. Arguments quickly spiraled into a technical stratosphere where few New Yorker readers could follow them. What hardly anyone seemed to notice was that it didn’t make the slightest difference whether Everett was right or wrong.
Suppose he was right. Then the only question was, could a Pirahã baby learn a language that did have recursion? If it could—the most probable outcome—then the absence of recursion from Pirahã grammar might be rarer, but was no more remarkable, than the absence of sounds such as clicks or prenasalized consonants from English. Recursion, clicks, and prenasalized consonants are all things that human biology makes available to us. But biology doesn’t mean we have to use them—once again, we’re up against the myth of the gene as an unalterable, inescapable force determining our behavior down to the wire. Recursion is a more useful language component than clicks, so few if any languages manage without it, but if one human language chooses to do just that, it tells us nothing at all about the human language capacity.
But ironically, if the Pirahã baby couldn’t learn a recursive language, that would form one of the clearest proofs of the biological nature of language anyone could ask for. It would be puzzling, because it would mean that at some stage of evolution the language capacity had branched, and consequently some things possible for folk on the major branch would be impossible for those on the minor one. But that would be the only possible explanation, because if language was cultural as some still claim, the baby (once raised in the culture of a recursive language) would surely have learned recursion anyway.
But is there really such a thing as recursion?
Even to ask the question approaches blasphemy. For half a century, everyone, whether they accepted Chomsky’s theories or not, has agreed that recursion exists—that language is capable of embedding a linguistic object, a phrase or a clause, inside another linguistic object of the same kind. Whether people believed that recursion was innate or not, nobody questioned that it was there, a force that had to be reckoned with.
Yet in fact, as I shall now show, it was an artifact of analysis.
Who created it? Chomsky did.
Who destroyed it? Chomsky did, only he didn’t realize he had.
It’s a fascinating story, and here it is.
THE STRANGE HISTORY OF RECURSION
In 1957 Chomsky published his seminal, groundbreaking work, Syntactic Structures. At that stage his kind of grammar was officially known as transformational-generative grammar, although, since this title was cumbersome and transformations were the most novel thing about it, most people back then called it simply transformational grammar.
Among other things, transformations took two simple sentences and made them into one complex one. Take for instance a sentence like “The girl you met yesterday speaks French.” This was originally assumed to be produced by first constructing the two simple sentences, “The girl speaks French” and “You met the girl yesterday.” The transformation then simply inserted the second sentence into the first, a process that came to be termed “embedding.” This gave you “The girl you met the girl yesterday speaks French.” The second occurrence of “the girl” was then “deleted under identity,” and voilà, you had your complex sentence. All complex sentences were assumed to be constructed like this, out of simple ones.
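As a rough sketch of that early procedure (flattened to plain word strings here; real transformations operated on abstract structures, as the next paragraph explains), embedding plus deletion under identity might look like this:

```python
# Toy version of the 1957-style derivation described above.
matrix = ["the", "girl", "speaks", "French"]
embedded = ["you", "met", "the", "girl", "yesterday"]

# Step 1: embed the second sentence right after the shared NP "the girl".
i = matrix.index("girl") + 1
combined = matrix[:i] + embedded + matrix[i:]
print(" ".join(combined))
# -> the girl you met the girl yesterday speaks French

# Step 2: delete the second occurrence of "the girl" under identity.
j = combined.index("the", i)        # the duplicated NP inside the clause
del combined[j:j + 2]
print(" ".join(combined))
# -> the girl you met yesterday speaks French
```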
But wait. While for heuristic or didactic purposes, transformations might be shown as operating on actual strings of words, they weren’t really supposed to do this. They were actually much more abstract. Words were merely objects in “surface structure,” while transformations took place at the level of “deep structure.” Deep structure consisted of abstract forms, word classes and types of structure that underlay the superficial level of actual sentences. These forms were allotted symbols that were used in the instructions for transformations: S for sentence, N for noun, NP for noun phrase (since every noun was capable of expansion into a phrase), V for verb, VP for verb phrase, and so on. Sentences, down to the final transformation, were built with these terms, and words were inserted as the very last step in the sentence-forming process.
In order to construct the necessary deep structures, you needed a set of what were known as “rewrite rules.” Rewrite rules broke down deep-structure labels into their constituents, as follows:
S → NP VP
NP → (Det) N (PP)
VP → V (NP) (NP) (PP)
PP → P NP
“Det” stands for determiner—things like “the” or “this”—and PP stands for prepositional phrase, while the presence of parentheses indicates that a constituent is optional; NP and VP need include no more than an N and a V, respectively. Adjectives were not included at this stage; sentences with adjectives, even simple sentences, were “generated” by a transformation that in order to produce “The angry man left” first had to produce “The man left” and “The man was angry,” then proceed by insertion—“The man the man was angry left”—then deletion—“The man angry left”—and finally transposition.
And, as these rewrite rules showed, a unit could be included in another unit of the same kind: the NP that finished up inside a PP could subsequently be inserted inside another NP. That’s recursion.
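You can watch the recursion these rules allow by running them as a tiny random generator, a standard textbook exercise rather than anything from Syntactic Structures; the words and probabilities below are made up:

```python
import random

# The rewrite rules above, with the optional constituents chosen
# probabilistically so that expansion eventually stops.
RULES = {
    "S":  lambda: ["NP", "VP"],
    "NP": lambda: ["Det", "N"] + (["PP"] if random.random() < 0.3 else []),
    "VP": lambda: ["V"] + (["NP"] if random.random() < 0.7 else []),
    "PP": lambda: ["P", "NP"],
}
WORDS = {"Det": ["the"], "N": ["man", "girl", "shoe"],
         "V": ["saw", "left"], "P": ["with", "near"]}

def expand(symbol):
    """Rewrite a symbol until only words remain; an NP can recur
    inside a PP inside an NP, indefinitely."""
    if symbol in WORDS:
        return [random.choice(WORDS[symbol])]
    return [word for part in RULES[symbol]() for word in expand(part)]

print(" ".join(expand("S")))  # e.g. "the girl saw the man with the shoe"
```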
This was Chomsky’s original formulation. However, it soon ran into problems that caused the theory to be revised and re-revised. First, derivations like that of “The angry man left” were abandoned, and then so were all derivations of complex sentences from simple ones. Complex sentences were now produced by building a “generalized phrase-marker,” a string of rewrite symbols that followed (more or less) the whole outline of a complex sentence. Transformations were taken out of the sentence-building process and reserved for things like changing sentences of one type into another type—for example, active into passive, or statement into question—or moving things around in sentences, such as bringing question words to the beginning. It followed from this that clauses, allotted the same symbol as sentences (since a single clause often constitutes a sentence), had to be included in the rewrite rules, which now read:
S → NP VP
NP → (Det) N (PP) (S)
VP → V (NP) (NP) (PP) (S)
So now either or both of the constituents of sentences, noun phrases and verb phrases, could themselves contain sentences, and the picture of recursion (phrases within phrases, sentences within sentences) was complete.
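One way to see what adding (S) buys is to check, mechanically, whether each category can reappear inside its own expansion; here is a small sketch of that check (my own illustration, with the option lists simplified):

```python
# Each category's expansion options, as plain lists of lists.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"], ["Det", "N", "PP"], ["Det", "N", "S"]],
    "VP": [["V"], ["V", "NP"], ["V", "NP", "PP"], ["V", "NP", "S"]],
    "PP": [["P", "NP"]],
}

def reachable(start):
    """Every category that can appear somewhere inside an expansion of start."""
    seen, frontier = set(), {start}
    while frontier:
        for child in sum(RULES.get(frontier.pop(), []), []):
            if child not in seen:
                seen.add(child)
                frontier.add(child)
    return seen

for cat in RULES:
    print(f"{cat} can contain another {cat}: {cat in reachable(cat)}")
# All four come out True once (S) joins the NP and VP rules.
```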
However, things went on changing. The first word in the theory’s title disappeared: “transformational-generative grammar” became “generative grammar” tout court. That was because transformations had grown fewer and fewer, until a quarter century after Syntactic Structures there was only one: “move alpha,” which can be roughly translated as “move anything anywhere.” (This might strike you as rather unhelpful, but there was by that time a series of carefully devised principles that between them sharply constrained what could move where.) But in all the time these changes were taking place, nobody stopped to look closely at any effect they might be having on the original set of assumptions that generative grammar had started out with. Terms like “recursion,” “embedding,” “embedded sentence,” and the like now formed part of the jargon of the trade, and continued to be used by all, without anyone asking whether the newest version of the theory still licensed those terms.
By the 1990s, when Chomsky introduced his minimalist program, the original theory had changed beyond recognition. The deep structure/surface structure distinction had vanished, along with all the category labels, the NPs and VPs of the rewrite rules; those labels might sometimes still be used descriptively, as a matter of convenience, to refer to particular chunks of structure, but they no longer played any significant role in the theory. Of the elaborate network of rules and/or principles that had characterized the first versions, all that was left was the single process mentioned earlier: Merge.
Merge dealt straightforwardly with words, not category labels, and all it did was take two words and join them into a single unit. Take “Bill” and merge it with “left” and you got the sentence “Bill left.” What was to stop you taking “Bill” and “right” and making the nonsentence “Bill right”? Well, the lexical properties of the words themselves told you what they needed to be attached to. “Bill” has to be attached to a verb before you can get a sentence, and a verb requires a subject like “Bill”; “left” can be an intransitive verb as well as an adjective, but “right” can’t.
Suppose you want a longer sentence, “Bill left Mary.” Do you again start by joining “Bill” and “left,” then add “Mary” to give [[Bill left] Mary]? No way; sentences are built incrementally, but not that way. As we saw above, the link between verb and object, here “left” and “Mary,” is much tighter than that between “Bill” and “left,” so the former pair merges first.
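A bare-bones rendering of Merge, as I read it (this is my sketch, not Chomsky’s formalism), is just pairing, with the verb and its object paired before the subject comes in:

```python
# Merge as minimal pairing: each application joins exactly two objects.
def merge(a, b):
    return (a, b)

# "Bill left Mary": the object merges with the verb first, the subject last.
vp = merge("left", "Mary")   # ('left', 'Mary')
s  = merge("Bill", vp)       # ('Bill', ('left', 'Mary'))
print(s)                     # not [[Bill left] Mary]
```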
Although this doesn’t seem to have been his motive, what Chomsky has done here is provide a pretty plausible model for how brains may actually put words together to form sentences, in the real world, in real time. One reason to suppose he wasn’t thinking about process is that Chomsky has always conceptualized things in terms of states rather than processes—for him, as we saw in chapter 9, almost any kind of process might just as well be instantaneous. Another is his apparent lack of interest in anything that actually happens inside brains. This is the only explanation I can think of for the all-but-incredible fact that he apparently hasn’t noticed—or at least hasn’t publicly admitted—what he’s actually done.
By proposing Merge, he’s assassinated recursion.
WHAT REALLY HAPPENS
Let’s go back to the sentence with which we began the last section:
The girl you met yesterday speaks French.
Even though the elaborate machinery that inserted “You met the girl yesterday” inside “The girl speaks French”—a classic case of a recursive process—has long since vanished, everybody still sees this as one sentence embedded within another sentence, or to be more precise, within a noun phrase that’s a constituent of the full complex sentence:
S1[NP[The girl S2[you met yesterday]S2]NP speaks French]S1
But what happens when you simply produce the sentence by successive applications of Merge?
Stage one (the two merges probably happening in parallel, since the brain is a parallel processor): merge each verb with its complement.
[met yesterday] [speaks French]
Stage two: merge [met yesterday] with its subject.
[you [met yesterday]] [speaks French]
Stage three: merge “girl” with the product of the last merge.
[girl [you [met yesterday]]] [speaks French]
Stage four: close off the product of the last merge by merging the determiner.
[the [girl [you [met yesterday]]]] [speaks French]
Stage five: merge the two constituents.
[[the [girl [you [met yesterday]]]] [speaks French]]
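The five stages can be run mechanically too; in the sketch below (self-contained, using the same toy pairing as before), no step ever opens an existing object to put something inside it, each step only joins two finished pieces:

```python
def merge(a, b):
    """Join two finished objects; nothing is ever inserted inside one."""
    return (a, b)

def show(x):
    """Render a merged object with brackets."""
    if isinstance(x, tuple):
        return "[" + " ".join(show(part) for part in x) + "]"
    return x

# Stage one: each verb with its complement (possibly in parallel).
met_y  = merge("met", "yesterday")
speaks = merge("speaks", "French")
# Stages two to four: grow the subject from the inside out.
clause  = merge("you", met_y)
noun    = merge("girl", clause)
subject = merge("the", noun)
# Stage five: merge the two constituents.
sentence = merge(subject, speaks)
print(show(sentence))
# -> [[the [girl [you [met yesterday]]]] [speaks French]]
```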
Where was anything inserted or embedded in anything else? Nowhere. Words were simply merged with other words in a purely additive process. It is only when we look at the finished product and start to add category labels to it that in retrospect it looks as if one constituent has been placed inside another. But in the actual process of sentence formation, nothing you could call recursion has taken place.*
Contra what Chomsky has claimed and most people have assumed, there’s no special ability to deal with recursive processing that has somehow evolved in the human species. There is, to the contrary, a complete absence of rules and restrictions. You can merge anything to form a sentence—words, phrases, clauses, you name it—provided that all the lexical requirements of all the words you merge are satisfied and none of them remain unsatisfied. Because what you are merging are actual words, or mergers of words, not abstract categories or chunks of labeled structure. It’s precisely the absence of any restriction on what type of object can be merged that allows the illusion of recursive processing to exist.
Please note, I’m not taking any credit for any kind of great new discovery here. All I’ve done is follow through on the logic of what was actually done by Chomsky himself—il miglior fabbro, “the better craftsman,” as T. S. Eliot famously said of Ezra Pound. The process of Merge, when applied, rules out any necessity for supposing that language requires recursion. Yet, as we saw in chapter 9, it was Chomsky, along with coauthors Hauser and Fitch, who tried to send language evolutionists wild-goose-chasing off through all the highways and byways of evolutionary biology in search of the Holy Grail, the mysterious capacity perhaps somehow used by some other species for numbering, or navigation, or social interaction, or . . .
THE FULL FLOWERING
Of course, what I’ve covered here is very far from a full account of language, even of grammar. We’ve reached the stage where it became possible to build merged, hierarchical sentence structures. A lot more would follow. There’s inflection, and agreement, and case marking, and empty categories, and anaphoric relationships, and much, much more. But with the ingredients I’ve described, you can get at least the skeleton of a full human language up and running.