by Tony Judt
Thus we should not be surprised to learn that an early English social democrat like Beatrice Webb took it for granted that the ‘socialism’ she sought could be parsed as public education, the public provision of health services and medical insurance, public parks and playgrounds, collective provision for the aged, infirm and unemployed and so on. The unity of the pre-modern world, its ‘moral economy’ as E.P. Thompson called it, was thus very much on her mind: people should cooperate, they should work together for the common good and no one should be left out.
Welfare states were not necessarily socialist in origin or purpose. They were the product of another sea change in public affairs that overtook the West between the ’30s and the ’60s: one that drew experts and scholars, intellectuals and technocrats into the business of administration. The result, at its best, was the American Social Security system, or Britain’s National Health Service. Both were extraordinarily expensive innovations, breaking with the piecemeal reforms and repairs of the past.
The importance of such welfare undertakings did not lie in the ideas themselves—the thought that it would be good to guarantee all Americans a secure old age, or to make available to every British citizen first-class medical treatment at no point-of-service cost, was hardly original. But the thought that such things were best done by the government and that therefore they should be done by the government: this was unprecedented.
Precisely how such services and resources should be made available was always a contentious issue. Universalists, influential in Britain, favored high across-the-board taxation to pay for services and resources to which all would have equal access. Selectivists preferred to calibrate costs and benefits according to the needs and capacities of each citizen. These were practical choices, but they also reflected deeply held social and moral theories.
The Scandinavian model followed a more selective but also more ambitious program: its goal, as articulated by the influential Swedish sociologist Gunnar Myrdal, was to institutionalize the state’s responsibility to “protect people against themselves.”10 Neither Americans nor British had any such ambitions. The idea that it was the state’s business to know what was good for people—while we accept it uncomplainingly in school curriculums and hospital practices—smacked of eugenics and perhaps euthanasia.
Even at their height, the Scandinavian welfare states left the economy to the private sector—which was then taxed at very high rates to pay for social, cultural and other services. What Swedes, Finns, Danes and Norwegians offered themselves was not collective ownership but the guarantee of collective protection. With the exception of Finland, Scandinavians all had private pension schemes—something that would have seemed very odd to the English or even most Americans in those days. But they looked to the state for almost everything else, and freely accepted the heavy hand of moral intrusion that this entailed.
The welfare states of continental Europe—what the French call the Etat providence, or providential state—followed yet a third model. Here, the emphasis was primarily on protecting the employed citizen against the ravages of the market economy. It should be noted that ‘employed’ here is no casual adjective. In France, Italy and West Germany it was the maintenance of jobs and incomes in the face of economic misfortune that preoccupied the welfare state.
To the American or even the modern English eye, this must seem peculiar indeed. Why protect a man or woman against the loss of a job which no longer produces anything people want? Surely it is better to acknowledge capitalism’s ‘creative destruction’ and wait for better jobs to come along? But from the continental European perspective, the political implications of throwing large numbers of people onto the street at times of economic downturn were far more consequential than the hypothetical efficiency loss in maintaining ‘unnecessary’ jobs. Like the 18th century guilds, French or German labor unions learned how to protect ‘insiders’—men and women who already have secure employment—against ‘outsiders’: the young, the unskilled and others in search of work.
The effect of this sort of social protection state was and is to keep insecurity at bay—at the price of distorting the supposedly neutral workings of the labor market. The remarkable stability of continental societies which had experienced bloody turbulence and civil war only a few years before casts a favorable light upon the European model. Moreover, whereas the British and American economies have been ravaged by the financial crisis of 2008, with well over 16% of the American work force officially unemployed or no longer seeking work at the time of writing (February 2010), Germany and France have weathered the storm with a far lower level of human suffering and economic exclusion.
By protecting ‘good’ jobs at the price of failing to create more low-paying ones, France, Germany and other continental welfare states have made a deliberate choice. In the US and the UK, beginning in the 1970s, low-wage and insecure jobs began to replace the more stable employment of the boom years. Today, a young person might consider himself fortunate to find employment, at minimum wage and with no benefits, in Pizza Hut, Tesco or Walmart. Such openings are less easily come by in France or Germany. But who is to say, and on precisely what grounds, that someone is better off working for low wages at Walmart than she is taking unemployment pay on the European model? Most people would rather have work, to be sure. But at any price?
The priorities of the traditional state were defense, public order, the prevention of epidemics and the avoidance of mass discontent. But following World War II, and peaking around 1980, social expenditure became the main budgetary responsibility for modern states. By 1988, with the notable exception of the United States, all the major developed countries were devoting more resources to welfare, broadly conceived, than to anything else. Understandably enough, taxes also rose sharply in these years.
For anyone old enough to remember what had gone before, this crescendo of social expenditure and welfare provision must have seemed little short of miraculous. The late Ralf Dahrendorf, an Anglo-German political scientist well placed to appreciate the scale of the changes he had seen in his lifetime, wrote of those optimistic years that “[i]n many respects the social democratic consensus signifies the greatest progress which history has seen so far. Never before have so many people had so many life chances.”11
He was not mistaken. Not only did social democrats and welfare governments sustain full employment for nearly three decades, they also maintained growth rates more than competitive with those of the untrammeled market economies of the past. And on the back of these economic successes they introduced radically disjunctive social changes that came to seem, within a short span of years, quite normal. When Lyndon Johnson spoke of building a ‘great society’ on the basis of massive public expenditure on a variety of government-supported programs and agencies, few objected and fewer still thought the proposition odd.
By the early ’70s it would have appeared unthinkable to contemplate unraveling the social services, welfare provisions, state-funded cultural and educational resources and much else that people had come to take for granted. To be sure, there were those who pointed to a likely imbalance between public income and expenditure as the pension bills grew and the baby boom generation aged. The institutional costs of legislating social justice in so many spheres of human activity were inevitably considerable: access to higher education, public provision of legal aid to the indigent and cultural subsidies for the arts did not come free. Moreover, as the postwar boom wound down and endemic unemployment once again became a serious concern, the tax base of the welfare states appeared more fragile.
These were all legitimate reasons for anxiety in the waning years of the ‘great society’ era. But while they account for a certain loss of confidence on the part of the administrative elite, they don’t explain the radical transition in attitudes and expectations which has marked our own age. It is one thing to fear that a good system may not be able to maintain itself; it is quite another to lose faith in that system altogether.
CHAPTER THREE
The Unbearable Lightness of Politics
“A study of the history of opinion is a necessary preliminary to the emancipation of the mind.”
—JOHN MAYNARD KEYNES
Nothing, of course, is ever quite as good as we remember. The social democratic consensus and the welfare institutions of the postwar decades coincided with some of the worst town planning and public housing of modern times. From Communist Poland through social democratic Sweden and Labour Britain into Gaullist France and the South Bronx, over-confident and insensitive planners plastered cities and suburbs with unlivable and unsightly housing estates. Some of these are still with us—Sarcelles, a suburb of Paris, bears witness to the haughty indifference of bureaucratic mandarins to the daily life of their subjects. Ronan Point, a peculiarly ugly high-rise in east London, had the good taste to fall down of its own accord but most of the buildings of that era are still with us.
The indifference of local and national authorities to the harm wrought by their decisions can stand for a troubling aspect of postwar planning and renewal. The idea that those in authority know best—that they are engaged in social engineering on behalf of people who do not understand what is good for them—was not born in 1945, but it flourished in the decades that followed. This was the age of Le Corbusier: how the masses felt about their new apartments, the new towns to which they had been moved, the ‘quality of life’ to which they had been assigned, was all too often a matter of indifference.
By the late 1960s, the idea that “nanny knows best” was already starting to produce a backlash. Middle class voluntary organizations began to protest at the wholesale and abusive clearing not just of ‘ugly’ slums but also of prized buildings and townscapes: the wanton destruction of New York’s Pennsylvania Station and London’s Euston Station, the elevation of a monstrous office tower at the heart of Paris’s ancient Montparnasse quartier, the unimaginative redistricting of whole cities. Rather than an exercise in socially responsible modernization on behalf of the community, these began to appear as symptoms of uncontrolled and insensitive power.
Even in Sweden, where the Social Democrats’ grip on office remained as firm as ever, the relentless uniformity of even the best housing projects, social services or public health policies began to grate on a younger generation. Had more people known about the eugenicist practices of some Scandinavian governments in the postwar years, encouraging and even enforcing selective sterilization for the greater benefit of all, the sense of oppressive dependence upon a panoptic state might have been greater still. In Scotland, where the municipally owned tower blocks of working-class Glasgow housed upwards of 90% of the city’s population, their air of dilapidation bore witness to the indifference of municipal (socialist) councils to the condition of their proletarian constituents.
The sense, widespread by the ’70s, that the ‘responsible’ state was unresponsive to the needs and desires of those for whom it spoke contributed to a widening social gulf. On the one hand, there stood an older generation of planners and social theorists. Heirs to the managerialist confidence of the Edwardians, these men and women remained proud of their achievements. Middle-class themselves, they were especially pleased at their success in binding the old elites into a new social order.
But the beneficiaries of that order—whether Swedish shopkeepers, Scottish shipworkers, inner-city African-Americans or bored French suburbanites—were increasingly resentful of their dependence upon administrators, local councilors and bureaucratic regulations. Ironically, it was precisely the middle classes who were most contented with their lot—in large measure because they came into contact with the providential state as a source of popular services rather than as a restriction upon autonomy and initiative.
But the greatest gulf was now the one separating generations. For anyone born after 1945, the welfare state and its institutions were not a solution to earlier dilemmas: they were simply the normal conditions of life—and more than a little dull. The baby boomers, entering university in the mid-’60s, had only ever known a world of improving life chances, generous medical and educational services, optimistic prospects of upward social mobility and—perhaps above all—an indefinable but ubiquitous sense of security. The goals of an earlier generation of reformers were no longer of interest to their successors. On the contrary, they were increasingly perceived as restrictions upon the self-expression and freedom of the individual.
THE IRONIC LEGACY OF THE ’60S
“My generation of the Sixties, with all our great ideals, destroyed liberalism, because of our excesses.”
—CAMILLE PAGLIA
It was a curiosity of the age that the generational split transcended class as well as national experience. The rhetorical expression of youthful revolt was, of course, confined to a tiny minority: even in the US in those days, most young people did not attend university and college protests did not necessarily represent youth at large. But the broader symptoms of generational dissidence—music, clothing, language—were unusually widespread thanks to television, transistor radios and the internationalization of popular culture. By the late ’60s, the culture gap separating young people from their parents was perhaps greater than at any point since the early 19th century.
This breach in continuity echoed another tectonic shift. For an older generation of left-leaning politicians and voters, the relationship between ‘workers’ and socialism—between ‘the poor’ and the welfare state—had been self-evident. The ‘Left’ had long been associated with—and largely dependent upon—the urban industrial proletariat. Whatever their pragmatic attraction to the middle classes, the reforms of the New Deal, the Scandinavian social democracies and Britain’s welfare state had rested upon the presumptive support of a mass of blue collar workers and their rural allies.
But in the course of the 1950s, this blue collar proletariat was fragmenting and shrinking. Hard graft in traditional factories, mines and transport industries was giving way to automation, the rise of service industries and an increasingly feminized labor force. Even in Sweden, the social democrats could no longer hope to win elections simply by securing a majority of the traditional labor vote. The old Left, with its roots in working class communities and union organizations, could count on the instinctive collectivism and communal discipline (and subservience) of a corralled industrial work force. But that was a shrinking percentage of the population.
The new Left, as it began to call itself in those years, was something very different. To a younger generation, ‘change’ was not to be brought about by disciplined mass action defined and led by authorized spokesmen. Change itself appeared to have moved on from the industrial West into the developing or ‘third’ world. Communism and capitalism alike were charged with stagnation and ‘repression’. The initiative for radical innovation and action now lay either with distant peasants or else with a new set of revolutionary constituents. In place of the male proletariat there were now posited the candidacies of ‘blacks’, ‘students’, ‘women’ and, a little later, homosexuals.
Since none of these constituents, at home or abroad, was separately represented in the institutions of welfare societies, the new Left presented itself quite consciously as opposing not merely the injustices of the capitalist order but above all the ‘repressive tolerance’ of its most advanced forms: precisely those benevolent overseers responsible for liberalizing old constraints or providing for the betterment of all.
Above all, the new Left—and its overwhelmingly youthful constituency—rejected the inherited collectivism of its predecessor. To an earlier generation of reformers from Washington to Stockholm, it had been self-evident that ‘justice’, ‘equal opportunity’ or ‘economic security’ were shared objectives that could only be attained by common action. Whatever the shortcomings of over-intrusive top-down regulation and control, these were the price of social justice—and a price well worth paying.
A younger cohort saw things very differently. Social justice no longer preoccupied radicals. What united the ’60s generation was not the interest of all, but the needs and rights of each. ‘Individualism’—the assertion of every person’s claim to maximized private freedom and the unrestrained liberty to express autonomous desires and have them respected and institutionalized by society at large—became the left-wing watchword of the hour. Doing ‘your own thing’, ‘letting it all hang out’, ‘making love, not war’: these are not inherently unappealing goals, but they are of their essence private objectives, not public goods. Unsurprisingly, they led to the widespread assertion that ‘the personal is political’.
The politics of the ’60s thus devolved into an aggregation of individual claims upon society and the state. ‘Identity’ began to colonize public discourse: private identity, sexual identity, cultural identity. From here it was but a short step to the fragmentation of radical politics, its metamorphosis into multiculturalism. Curiously, the new Left remained exquisitely sensitive to the collective attributes of humans in distant lands, where they could be gathered up into anonymous social categories like ‘peasant’, ‘post-colonial’, ‘subaltern’ and the like. But back home, the individual reigned supreme.
However legitimate the claims of individuals and the importance of their rights, emphasizing these carries an unavoidable cost: the decline of a shared sense of purpose. Once upon a time one looked to society—or class, or community—for one’s normative vocabulary: what was good for everyone was by definition good for anyone. But the converse does not hold. What is good for one person may or may not be of value or interest to another. Conservative philosophers of an earlier age understood this well, which was why they resorted to religious language and imagery to justify traditional authority and its claims upon each individual.