In Montaigne’s day you could get into terminal trouble for taking scepticism too far, which is probably one of the reasons why not even he pushed it on the subject of religion. Since then, a sceptical attitude has been less likely to get you burned at the stake, but it’s notable how the issue of man-made global warming has lately been giving rise to a use of language hard to distinguish from heresy-hunting in the fine old style by which the cost of voicing a doubt was to fry in your own fat. Whether or not you believe that the earth might have been getting warmer lately, if you are sceptical about whether mankind is the cause of it, the scepticism can be enough to get you called a denialist.
It’s a nasty word to be called, denialist, because it conjures up the spectacle of a fanatic denying the Holocaust. In my homeland, Australia, there are some prominent intellectuals who are quite ready to say that any sceptic about man-made global warming is doing even worse than denying the Holocaust, because this time the whole of the human race stands to be obliterated. Really they should know better, because the two events are not remotely comparable. The Holocaust actually happened. The destruction of the earth by man-made global warming hasn’t happened yet, and there are plenty of highly qualified scientists ready to say that the whole idea is a case of too many of their colleagues relying on models provided by the same computers that can’t even predict what will happen to the weather next week.
In fact the number of scientists who voice scepticism has lately been increasing. But there were always some, and that’s the only thing I know about the subject. I know next to nothing about climate science. All I know is that many of the commentators in newspapers who are busy predicting catastrophe don’t know much about it either, because they keep saying that the science is settled, and it isn’t.
There is no scientific consensus. There are those for, and those against. Either side might well be right, but what should be clear is that if you have a division on that scale, you can’t have a consensus. Nobody can meaningfully say that ‘the science is in’, yet this has been said constantly by many commentators in the press until very lately, and now that there are a few fewer saying it there is a tendency, on the part of those who do say it, to raise their voices even higher, and harden their language against any sceptic, as if they were protecting their faith.
Sceptics, say the believers, don’t care about the future of the human race. But being sceptical has always been one of the best ways of caring about the future of the human race. For example, it was from scepticism that modern medicine emerged, questioning the common belief that diseases were caused by magic, or could be cured by it. A conjecture can be dressed up as a dead certainty with enough rhetoric, and protected against dissent with enough threatening language, but finally it has to meet the only test of science, which is that any theory must fit the facts, and the facts can’t be altered to suit the theory.
The golf-ball crisp might look like a crisp, and in a moment of delusion it might taste like a crisp, and you might even swallow it, rather proud of the strength it took to chew. But if there is a weird aftertaste, it might be time to ask yourself if you have not put too much value on your own opinion. The other way of saying ‘What do I know?’ is ‘What do I know?’, with the stress falling on the ‘I’. That shade of different meaning wasn’t there in Montaigne’s original language, but it is in ours.
Postscript
If you’ll allow a metaphor so horridly mixed, the golf-ball potato crisp was a red herring. By such means, I hoped, I would be able to sneak up on the forbidden topic of Catastrophic Anthropogenic Global Warming, sometimes referred to as CAGW by writers who supposed their readers were insufficiently bored already. Because my brand of scepticism about the claims of the alarmists was still widely regarded as criminal indifference to the future of the Planet, it seemed wise to avoid tipping off the audience at the top of the show. Lull them first and needle them later. But it was a mistake, I discovered, to say that one knew next to nothing about climate science, because people who knew less than nothing, but who nevertheless had intuitive powers to predict the coming catastrophe, took one’s confession not as a sign of modesty, but as proof of the malevolence behind any attempt one might make to express an opinion differing from the opinion they supposed to be prevalent in the scientific world.
In actual fact, the scientific world had been divided on the subject from the beginning, and the division had by this time become apparent to anyone with a wider source of information than that provided by the mainstream media. For reasons of its own, however, the BBC, although it didn’t have to, had decided to copy some of the more upmarket outlets of printed news and opinion – in Britain, these were most conspicuously the Guardian and the Independent – in handing over the whole field of science to a science correspondent. If it was editorial humility that led to such delegation of responsibility, the result was orthodoxy in each case, and nowhere was the orthodoxy more rigidly imposed than at the BBC. This tiny broadcast was the very first case of the BBC letting someone on the air alone to put a differing view, and it was certainly not the prelude to a flood. Until the so-called (dumbly called) Climategate scandal broke at the University of East Anglia in November, cases of heresy expressed unfettered on the airwaves remained very rare, and after Climategate they became only slightly less rare, because the BBC, though forced, like the mainstream media as a whole, to report a news event, was slow to admit the implications.
Slowness was understandable, since one of the implications was that in their coverage of Climate Change (to give a poltergeist the dignity of capitalized initials) they had been wasting their time, and everybody else’s, for years on end. Nevertheless, even though the script for this broadcast aroused consternation when I submitted it, I was allowed on the air. In this instance as in so many others, Mark Damazer, controller of Radio 4, was prepared to back the contributor against the full weight of the building he was sitting in. But the message did come filtering down the stairs that I might do better to back off for a while, and talk of other matters. Meanwhile, in the outside world, things were starting to boil. I was only one of the tiniest bubbles, but the reaction of the Guardian’s Climate Change pundit George Monbiot was indicative. He said that the only reason I could hold such opinions was that I was an old man who didn’t care what happened to the Planet. Well, he was right about the first part of the description, but the day will come when he himself realizes that the second part proved he had little insight into how an old man feels about the world when the time draws near that he must leave it.
There was no point, though, in fighting back on the level of personality, because one could only be cooperating in a conspiracy to bore the public. Quite early on, the climatologists had made their eventually decisive mistake: they had turned the supposed difference of views – or difference of supposed views – into a plebiscite. This counting of heads was a competition they were bound, in the long run, to lose, because the matter would eventually turn on the exercise of critical reason; their opponents, under no obligation to prove a negative, had only to go on asking for proof.
The sceptics made just as big a mistake in supposing that when the position of the alarmists collapsed, everyone would suddenly turn sane; that the newspapers and television channels would automatically resume their erstwhile positions as arenas for debate; and that governments would stop spending the public’s money on hopelessly expensive and inefficient alternatives to the cheap power we already had. Eventually that became the next debate, and once again it was almost impossible to hear it happen. But just because it was so dauntingly clear that people living near a wind-farm in the making would soon be reduced to re-inventing smoke signals and the talking drum was no reason to think that common reason would soon prevail. There was just too much money in building the wrong thing. The money amounted to a tax on the poor, a fact which should have put the left on the alert; but the left, no longer worthy of its name, had long ago fallen silent.
ON STRIKE
Dates of show: 30 October and 1 November 2009
Nowadays a strike is usually called industrial action but I’ve never much liked the term, because any proposed industrial action aims to produce industrial inaction, and usually it’s better to have a word for something that evokes the something instead of its opposite. Besides, the word ‘strike’ is short if not sweet, and it sounds like a blow, which is what it is meant to be.
At the time I write this script, the postal strike, after a brief lull, has once again hurtled into action, or inaction, and the chance is getting low that your Christmas cards will make it through to your maiden aunt in that little town where the train has been replaced by a bus, the local shops by a supermarket she can’t walk to, her hip by a stainless-steel gadget, and that nice man Nicholas Parsons’s smiling face on the telly by Russell Brand’s petulant snarl.
In fact she, you and I might already be hoping that somebody will reinvent the pony express. A few days ago I got a letter from someone I had sent a letter to months ago. She said she had only just received my letter. But her letter itself was dated weeks earlier. For at least part of the total period, the Royal Mail was theoretically not on strike.
All too often, the Royal Mail feels to me as if it is on strike even when it isn’t. Whose fault is this? All I can suggest is that in matters of industrial relations, often a way of saying lack of industrial relations, we should be slow to point the finger. Not necessarily as slow as it takes a letter to get there, of course, but still slow. Maybe the fault goes deeper and further back than we think.
Last Sunday I happened to be on Andrew Marr’s television show when the chief executive of the Royal Mail, Adam Crozier, was one of the guests. For a man in his position, he seemed refreshingly normal. Some of his predecessors in the post, however, might as well have been wearing flying helmets and flippers. You might remember that the Royal Mail’s top management once took the inspired decision to change the name of the Post Office to Consignia. They might not have realized – or, even worse, they might have realized – that their new word Consignia, meant to be equally unintelligible but universally awe-inspiring to people of all nationalities, sounded very like the Spanish word consigna, meaning ‘left luggage’.
But they certainly realized soon afterwards that the British public disliked the new appellation, so they thought hard and changed it again, at huge expense. They didn’t change it back to Post Office, they changed it to Royal Mail plc. To do this, they had to ask the Queen. Kindly she said yes, instead of saying that on the whole, where organizations whose names were prefaced by the word ‘Royal’ were concerned, she would prefer it if the management could restrain itself from faffing about, because she had her own brand name to consider.
I might say here that Mr Crozier struck me as someone who might be rather better than some of his predecessors at listening to other voices. But the damage may already have been done, over the course of years. When industrial relations go sour, they tend not to be fixed without a blow being struck, and what you think about that tends to determine your politics.
My own politics, in this matter, remain where they always were, on the old-style left. I think it’s up to management, and always has been. If the managers can’t manage to sort it out, preferably in advance, then they ought not to be managing. But quite often they haven’t been. They’ve just been sitting there, failing to notice that the workers have begun to arrive at work facing backwards, ready to walk out.
When there is dignity in labour, workers usually want to work, even if the task is a drudge. They should beware of any outrage expressed on their behalf by false friends on the playtime left who have never done a hand’s turn. While it is a fine thing to be an artist, it is an even finer thing to be a doctor or a nurse, and can be just as fine a thing to stack shelves or clean lavatories. One of the few virtues of the old Soviet Union was that it respected the dignity of the workers. It also slaughtered them by the million, but that was an effect of totalitarian rule, not a sign of any innate conflict between management and labour. To the extent that there is such an innate conflict, modern history has consisted largely of the long process of resolving it.
Back in the nineteenth century, the future prosperity of my homeland, Australia, was ensured partly by the energies of people who had been transported to the colony because they were machine-breakers. Those victims of progress were some of our first trade unionists, having discovered the hard way that a free market, though necessary, will never produce justice by itself. In the twentieth century, it wasn’t just the Soviet Union that responded with force to any signs of independence from labour. In America in the 1930s, Detroit auto workers were beaten up for going on strike, and some of them were shot. Unions in the free countries had to battle every inch of the way for workers’ rights.
Admittedly it was very easy for unions, once they had consolidated their power, to become corrupted. Jimmy Hoffa of the Teamsters union was unusual only in being such a silk-suited hoodlum. Less spectacularly dressed, in Britain after World War II, there were honest union leaders who led their members into a Luddite cul-de-sac and the country into stagnation. In the time of Harold Wilson, trade union leaders like Jack Jones and Hugh Scanlon were practically in residence in Downing Street, and later on the grief was by no means universal when Mrs Thatcher broke the power of Arthur Scargill.
She could never have done it if the nation had been behind him, but in truth he never even had all the miners behind him. The idea was ripe by then that there had to be a balance. If the managers couldn’t manage, there was even less hope in the unions doing the managing instead. I myself can well remember when the print unions ruled Fleet Street through what were called Spanish practices, and phantom workers drew real salaries. Strikes were endemic. Too often writing a column on Friday for a paper that failed to come out on Sunday, I found myself in the uncomfortable position of being grateful to Rupert Murdoch, when he broke the grip of the union bosses.
Not that he or any other boss is an attractive prospect if his workers have no choice but to obey him. There has to be a concord of management and labour, and the lesson was taught most sharply by what happened when the Nazis brought Germany to ruin. As the great German historian Golo Mann pointed out, the division between management and labour was the crack through which Hitler had got in. And when the war was over, those few labour leaders who survived the concentration camps emerged convinced that for industrial harmony the workers needed more than their rights and conditions, they needed a seat on the board.
The workers must feel that they are in on the planning for how the job is done. When Japan was being rebuilt after that same war, the workers on the production lines were rewarded for their ideas about efficiency. The idea that they should be rewarded came from American advisers who took the chance to transplant the hopes of the New Deal, free from the inflexible old capitalist orthodoxies that had hampered them at home.
A labour–management concord was the solution in Germany and Japan and one way or another it will be the solution here: it’s just slow to come. Making the slowness slower, alas, is the still lingering twin effect – weaker now but not dead yet – of a conservatism that thinks the workers are out to wreck the nation and a radicalism that would like to see the nation wrecked, as if some kind of purity could ensue if people no longer had to work for a living.
But everyone has to work for a living, except those who contrive to get paid for preaching otherwise. The trick is to support the true and essential human feeling that work, any work, if well done and properly managed, has dignity. And if it doesn’t feel like that, then the managers should be fired first, before the workers are. When new technology comes in, some workers are bound to lose their jobs, but if they have no new job to go to, then the highest managerial layer of all, which is the government, is at fault.
In the liberal democracies, and precisely because they are so productive, this conflict in the centre, about how to manage work as the nature of the work changes faster and faster because of its own success, is the main theme of all the domestic politics that matters. And like it or not, at the centre of it all, at the centre of the centre, is the worker’s right to stop work if the work has been dehumanized to the point where it is not worth doing: the right to strike.
Ideally it shouldn’t need to be exercised, and there must always be some people, of course, who are never free to do so. One of those is the Queen, but she must sometimes wish she were. You can imagine her getting a phone call from the managers of Hellosailor.com, wanting to change their name back to the Royal Navy. ‘Couldn’t you have put all that in a letter?’ she says. ‘Well, no, come to think of it, one supposes not.’
Postscript
Two hundred years after the industrial revolution began to transform the world, the relationship between management and labour remains one of the permanent points of dispute in a developed society. In fact if it doesn’t have that point of dispute, it probably isn’t developed. (In undeveloped societies, it isn’t a dispute, it’s a one-sided battle.) As someone born into the industrial proletariat, I have never forsaken my solidarity with the workers, and still count myself as left wing in politics, however conservative I might be in matters of culture. But the solidarity is mainly notional, because although I work quite hard when it suits me, I have no capacity at all for working when it doesn’t suit me, which is practically the first requirement in the ability to hold down a job.
People who enjoy their work, and will therefore work night and day unless somebody stops them, have a bad tendency to look down on those who don’t enjoy their work. But on the part of those who fancy themselves to be imaginative, it’s a failure of imagination not to realize that only a few people are blessed with an all-consuming purpose that they would pursue even if they were not paid. Most people have to clock on in the morning, and live with the knowledge that their time is being used up. To make them feel that their efforts are worthwhile is the whole art of management. In the years running up to World War II, management in Britain had not been very brilliant. Strikes were frequent, and when the war started the strikes did not stop. Even in the vital aircraft factories, the workers would down tools for more pay. Churchill was outraged when he heard about it, but it was a bad failure of imagination on his part. He thought that assembling the structure of a Wellington bomber’s left aileron ought to be as satisfying for those doing it as reading vital documents half the night was for him. But they were two different kinds of activity.