
The Language Wars


by Henry Hitchings


  In the 1970s the BBC began to move away from a rigid uniformity. Local colour became increasingly common in its radio and TV broadcasts. Today the BBC is one of the English-speaking world’s most conspicuous platforms for arguments about language. It is responsible for many programmes that take obvious pleasure in the variety of English; in 2000 a series of Radio 4’s The Routes of English included episodes looking at the dialects spoken in Cornwall, coastal Northumberland and Brixton. The BBC also continues to stress its role as a source of accurate, impartial journalism, and this is expected to manifest high standards of English usage. The remit of its international arm, the World Service, emphasizes the promotion of the English language. A large section of the British public counts on the BBC to act as a custodian of the English language, and is disappointed if it fails to do so. Among its senior broadcasters, John Humphrys is especially caustic about the linguistic spirit of the age. One of his pet hates is the historic present. According to Humphrys, if you are talking about social conditions in Victorian England, you need to use the past tense. You cannot make your portrait vivid by saying ‘The streets teem with sweeps, musicians, shoeblacks and cabmen’, because this may confuse some people, leaving them uncertain whether you are talking about then or now.

  William Cobbett noted that, when thinking of time and tenses, ‘we perplex ourselves with a multitude of artificial distinctions’. These distinctions have come about because ‘those who have written English Grammars, have been taught Latin; and, either unable to divest themselves of their Latin rules, or unwilling to treat with simplicity that, which, if made somewhat of a mystery, would make them appear more learned than the mass of the people, they have endeavoured to make our simple language turn and twist itself so as to become as complex in its principles as the Latin language is’. As far as Cobbett was concerned, there were three ‘times’: past, present, and future. Anyone deviating from this view was being ‘fanciful’.19

  Yet we all have some experience of the plasticity of the English tense system. If I pick up a newspaper and read the headline ‘Queen Mother Dies’, I know that what is being referred to in the present is actually in the past – she isn’t dying right now. But the headline writer uses the present tense to make the information seem more urgent and impressive. ‘Queen Mother Is Dead’ might be considered tactless; ‘Queen Mother Has Died’ sounds lumpen and banal. Alternatively, consider this: ‘I am going away tomorrow.’ Though cast in the present, the sentence refers to something in the future. ‘When do you start your new school?’ ‘You are to report to reception on arrival.’ ‘The train departs at 6.30.’ As these phrases suggest, it is a nonsense to pretend that the present tense always refers to present time.

  The same applies to the future tense. Take ‘You will insist on criticizing my driving’. This clearly refers to something that has already happened – several times, we gather. Meanwhile, ‘Hydrogen will burn with a squeaky pop’ refers to something that always happens. Such examples confirm that verbs alone do not convey the sense of time. Other words will be involved. To quote David Crystal, ‘the linguistic expression of time spreads itself throughout the whole of a sentence’.20 When we read the words ‘Radio broadcasting begins in 1920’ we don’t see the verb ‘begins’ and think ‘My God, it’s 1920!’ The words ‘in 1920’ tell us to read ‘begins’ in a special way. The author who chooses to write ‘begins’ rather than ‘began’ is guilty of nothing worse than exploiting our ability to see this. He or she is striving for immediacy, not trying to mislead us.

  Not just our purposes but also our circumstances affect our choice of language. This may seem uncontroversial, but before the last century it was rare to hear the matter put so explicitly. We adapt the way we express ourselves to suit our audience’s expectations, or, more accurately, our expectations of their expectations. Most of us occasionally suffer from what the linguist Einar Haugen called ‘schizoglossia’, an anxiety about which is the right form of our language to use at a particular moment. We are guilty sometimes of strange linguistic posturing. We use one sort of language when we are in the presence of people whose status – and whose opinion of our status – we consider important: we use another with our intimates. An extreme version of this is Charles V’s addressing himself to ladies in Italian and to God in Spanish. A less extreme one is my choosing to write this book in a style that from time to time includes quite technical terms; were I talking to a ten-year-old about its subject matter, I would express myself in a simpler fashion. When someone loses sight of which level of language he or she should be using, it can feel like an affront, even a violation. I would not be impressed if the vet wrote in his notes, ‘Poor little kitty’s botty is going “ow”.’ Neither would I be impressed if he said to my young daughter, ‘Your cat has squamous cell carcinoma’; if he said to me, ‘Your moggie’s gonna snuff it, pal’; or if he said more expansively, ‘Damn – I’ve just got cat all over the floor.’

  Now and then a single word violates our sense of appropriateness. When did you last hear someone say ‘That’s not a proper word’? A search on the internet quickly provides me with numerous examples of terms smeared in this way: joys, irregardless, unvalidated, aluminium, politeful, email, gotten, prioritize, extensible, cohabiting and splurge. Some of these doubtless seem inoffensive to you (as some do to me), but I am confident that others will have set your teeth on edge. The ‘not-proper’ word achieves this in two ways: it signifies something we dislike, perhaps a substance or a behaviour, and, more than this, it brings to mind an image of a person or type of person we disfavour, whom we can recall using the word or whom we can imagine using it. You may protest, ‘No, that’s not it at all. I object to irregardless because it’s illogical.’ But catch yourself in the act of disapproval: you’re likely to find that, rather than taking a stand on matters of etymology or logic, you have a reaction that is pointedly social. What’s striking is how long many of these words have been causing irritation. Splurge is attested with the sense ‘a sudden extravagant indulgence’ as early as 1928, the much-reviled irregardless appears in a dictionary of American dialects dating from 1912, and the verb to cohabit has been around for half a millennium. Gotten, widely heard in America, used to be quite acceptable in Britain and survives in ‘ill-gotten gains’, but is mostly regarded as an American eccentricity. There is actually a distinction in American English between got and gotten: got signals current possession (‘I’ve got ten dollars in my pocket’), whereas gotten relates to the process of acquisition (‘I’ve gotten us tickets to the ballet’). Those who sneer at gotten seem oblivious to this.

  There are several definitions of what a ‘word’ is. For something to be a word, it need not have existed a long time nor be in wide use. It may be defined as a unit of speech that can be uttered on its own to meaningful effect, or as a written sequence representing such a unit of speech. When we think about English words on the page, we think of each word having white space on either side of it but no space within. Commonly, what we call words are lexemes – abstract units of the sort that we can find in a dictionary, free forms that when spoken or written have some semantic content. To put it nakedly, words are the smallest units of language that can always stand alone as independent utterances. There are smaller meaningful units, morphemes, but they are not necessarily able to stand on their own. Pleasant is a lexeme, and so is unpleasantness, but the un- and -ness that we add to the first to make the second are morphemes.

  When we use a word that is unusual, perhaps when playing Scrabble or reporting a strange thing that has happened to us, a familiar rejoinder is ‘I bet it’s not in the dictionary’. The first problem here concerns what is meant by ‘the dictionary’. If I look in the OED, I shall find rastaquouère meaning ‘a person … regarded as a social interloper’ and nudiustertian meaning ‘of or relating to the day before yesterday’. It also includes puh-leeze and achy-breaky. But if I look in a pocket dictionary, it is possible that a word I really use will be absent. Besides, the word under scrutiny may be new, and even in the digital age dictionaries take a while to catch up with novelty.

  A dictionary is ostensibly a record of usage. This means that on the one hand it provides explanations of words we may consider undesirable, debased or spurious, and that on the other it does not cover everything, because it is impossible to keep abreast of all usage. In any case, the almost religious recourse to dictionaries is naïve. Dictionaries manifest bias. Some are pretty candid about this. For instance, the Chambers Twentieth Century Dictionary (1972) defines jaywalker as ‘a careless pedestrian whom motorists are expected to avoid running down’. No dictionary, however clinically produced, is entirely untouched by human prejudice.

  Change in vocabulary is easily registered. It involves not just the arrival of new words, but also the disappearance of words that were once in everyday use. On the whole we are less aware of changes in grammar. Where grammar is concerned, we are more likely to suspect that ‘something is going on’ than to be sure of it. I have touched already on the way they is increasingly used in the singular. Whom seems to be receding. The passive appears increasingly to be formed using the verb ‘to get’ and a past participle – not ‘we were caught’, but ‘we got caught’. Shall as a modal auxiliary to denote future tense, particularly in the first person (‘I shall see you later’), is in decline. Less is used where traditionally fewer was preferred.

  With regard to the last of these, there is always someone who is irked by the sign in the supermarket that says ‘Five items or less’. Shouldn’t it be ‘Five items or fewer’? One way round this, adopted by a supermarket where I shop, is for the sign to read ‘Up to five items’. The rule that I can recall being taught is that less is used of bulk, but not of countable nouns: ‘I do less entertaining than you because I have fewer friends.’ One of the reasons for the blurriness of the distinction between less and fewer is the way more behaves. We use more with countable nouns and with non-countable ones: ‘I do more entertaining than you because I have more friends.’ However, in the Middle English period, more was used of quantities that were not being counted and the now obsolete mo was used where numbers were specified: one spoke of ‘more butter’ and of ‘mo loaves’, and, were I to revive the distinction, I would say, ‘I do more entertaining than you because I have mo friends.’ As mo disappeared, more took over both roles, and less copied this extension. But there were objections. The conventional distinction seems to begin in 1770 with Robert Baker’s Reflections on the English Language. Baker was different from most of his contemporary writers on language, informing his audience that he ‘quitted the School at fifteen’, knew no Greek and not much Latin, and owned no books. His Reflections contains some statements that will have sounded odd to his peers and continue to seem so now; for instance, ‘There are … Places, even in Prose, where for the sake of Sound, Whom may be used in the Nominative’. But regarding less he has proved influential. He claimed that the word was ‘most commonly used in speaking of a Number; where I should think Fewer would do better. No fewer than a Hundred appears to me not only more elegant than No less than a Hundred, but more strictly proper.’ ‘I should think’, ‘appears to me’: this is mere opinion, but Baker’s view caught on.21

  Amid all of this, we should be aware of the ease with which one can succumb to illusions. First of all, there is the illusion of frequency: when something happens a little, we believe we have seen the tip of an iceberg, and consequently we start to believe that it happens a lot. Then there is the illusion created by our selective attention. We are likely to notice and condemn things that are done outside our own social group, and to home in on the vices of a particular group to which we do not belong (often it’s teenagers), without considering whether (a) people behave differently when we’re looking at them, (b) our own social group is guilty of the misdemeanours we are so quick to scold, and (c) the corruptions within this group are rather less than the epidemic we suppose. Finally, there is what Arnold Zwicky has called the ‘recency illusion’ – a theme on which I have touched a few times already. We believe that something we have just noticed has only just started happening, although it may in fact be long-established.

  This illusion pervades discussion of disputed pronunciations. Some words can be pronounced in two noticeably distinct ways. Spotting this, many of us claim, without supporting evidence, that one pronunciation is edging the other out, as though this is strange and new. Examples of words that receive this attention include exquisite, economic, tissue, romance, scallop, controversy, finance, envelope, praline, either, respite, transport and zebra. (The list could be much, much longer.) While the difference is in some cases one of emphasis, mostly it is to do with the sounds of vowels. It hardly seems sensible to draw strong conclusions from the fact that I say zebra with a short e. In fact, self-observation reveals that I am not consistent in the way I pronounce several of these words. Despite reflecting on it, I have not been able to make out an absolutely clear pattern in my behaviour. This much is sure, though: while in English the location of the stress in a word is not completely free, it is not fixed as it is in, say, Polish or French, and the rhythm of a sentence – the locations of our articulatory force – will affect the way we treat the words within it.

  Finally, it is striking that the editors of The American Heritage Dictionary are responsible for a book with the title 100 Words Almost Everyone Mispronounces. Examples include acumen, chimera and niche. If ‘almost everyone’ mispronounces them, it follows that almost no one pronounces them ‘correctly’, so perhaps the supposedly correct pronunciations are close to becoming obsolete.22 As so often, the guardians of English are defending positions that seem already to have been lost.

  24

  Technology says ‘whatever’

  Wired life … wireless … lifeless?

  It is common to blame what we might call The New Incorrectness on technology. In his book The Gutenberg Elegies, which investigates the impact of technology on the experience of reading, Sven Birkerts characterizes the negative effects of the information age as ‘mediation, saturation, and fragmentation’. He suggests that the modern, ‘connected’ human is becoming weightless; ‘human gravity’ is being lost.1 Birkerts decries the increasing shallowness of culture – and what he calls ‘the gradual but steady erosion of human presence’ and ‘the ersatz security of a vast lateral connectedness’.2 Countless others are saying the same thing: technology is changing every department of experience, and too little attention is being paid to the consequences.

  When we hear the word technology, we probably think of computers or of what they make possible. But technology is of course rather older: examples include the wheel, the plough, and the hand axes used by our primitive ancestors more than a million years ago. The arrival of personal computers in our homes and places of work is really quite recent. I was born in 1974; I first used a computer in 1982, and first owned one soon afterwards, but did not start to work at a computer on a daily basis until about 1995. The concept of ‘desktop publishing’ is a mere twenty-five years old. The system of interconnected computer networks we now call the internet is older. The technologies that laid its foundation were originally developed as an aid to formal communication within the American military; the internet’s precursor, the ARPANET, took its name from the US Department of Defense’s Advanced Research Projects Agency and grew briskly in the early 1970s. Thanks to the development by Tim Berners-Lee of the World Wide Web, a system enabling connections between documents and services, the internet has, of course, become something very different. Berners-Lee’s invention, set out in a proposal in 1989, was publicized in 1991 and began to capture real public interest two or three years later. An interesting sidelight: his original name for it was Enquire, which he explains was ‘short for Enquire Within Upon Everything, a musty old book of Victorian advice I noticed as a child in my parents’ house … With its title suggestive of magic, the book served as a portal to a world of information.’ However, ‘unlike Enquire Within Upon Everything, the Web that I have tried to foster is not merely a vein of information’. The vision Berners-Lee sets out is ‘a society that could advance with intercreativity and group intuition rather than conflict as the basic mechanism’; the ideal Web he imagines will ‘help me reorganize the links in my own brain so I can understand those in another person’s’.3

  I am writing these words on a computer, rather than in longhand. This has implications for the way I write. I use a piece of software called Scrivener, which intrudes less on my writing process than other word processing software I have owned. At least, that’s how it feels. If you use a word processing program such as Microsoft Word you will be familiar with the ways in which your writing is – or can be – mediated and moderated by it. For instance, there is the spelling checker, which flags words that may be incorrectly spelled, and, according to which settings you have programmed, may even ‘auto-correct’ some apparent mistakes as you write. The first spelling checker on a home computer was marketed in 1980. A grammar checker was part of a package called Writer’s Workbench in the 1970s; the first grammar checker for a home computer appeared in 1981, and in 1992 Microsoft for the first time included one as part of Word. People often point out the deficiencies of these utilities, but for many they have become a crutch, and users expect the software to catch their errors.
