Tales of a New America


by Robert B. Reich


  4

  Rather than view public expenditures on the health, education, and well-being of America’s poor children as a form of “welfare” motivated by magnanimity or a prudent fear of the resentful and idle young, they are more accurately cast as investments in our collective future. A number of studies have suggested that certain kinds of expenditures on the health and education of children offer substantial public returns. Participants in one preschool program for poor children who were tracked through their teenage years and then compared with a set of otherwise similar children were found to have remained longer in high school and enjoyed significantly better job prospects. (They were also less often involved in crime or teenage pregnancies.) The program had cost $4,000 per child; the return on the investment was calculated to be many multiples of that sum.4 Another study found that young people who had been enrolled in the Job Corps, a program designed to socialize disadvantaged youth into the world of work, subsequently earned higher wages and needed fewer unemployment and welfare benefits than those who had not been.5 Title I of the Elementary and Secondary School Act of 1965, providing extra aid to poor school districts, was found to have had a positive effect on the achievement of disadvantaged children, particularly in their earlier years.6 Public funds spent on diet supplements and checkups for poor pregnant and nursing women and their children were demonstrated to cut the rate of premature births, lower infant mortality, reduce long-term child disability, and lead to larger head (and brain) sizes among babies.7 Community health centers were found to have reduced infant mortality and childhood disabilities.8

  Not all such programs tested out so well as collective investments, of course, since so many were inspired by the narrow ethos of charity. And, at least as importantly, the success of those that did work represented only potential benefits. It was still up to the individual to realize them. Our complex and rapidly evolving society offers many opportunities to use intelligence to beat the system—exploiting public benefits, circumventing rules and laws, taking advantage of others’ gullibilities, turning to crime. To be motivated to contribute rather than exploit, a young person needs more than good health and a solid education; he also must feel that he is a member of a society that respects him, and whose respect is worth retaining. Thus investment in American youth has meant not only efforts to guarantee their health and improve their technical competence, but also measures to initiate them into a culture of shared responsibility and mutual benefit.

  These notions of investment are far removed from how we normally use the term. Most economists regard investment as an activity undertaken exclusively in the private sector of the economy, a matter of accumulating artifacts. In the United States national accounts, where the economy is officially weighed and measured, no public expenditures whatsoever are tallied in the “investment” column—not even those for health and education. They are all lumped in with consumption.

  This peculiarity in the way we do the accounting had profound implications for one of the central policy preoccupations of the 1980s: the perceived shortage of investment capital. Orthodox economics attributed the chronic slowdown in economic growth to inadequate conventional investment—too little money devoted to buildings and equipment. The prescription was to reconfigure economic rules to encourage more of everything entered on the books as investment and less of everything entered on the books as consumption. Measures meant to achieve this result included across-the-board tax cuts coupled with extra-low tax rates on savings and capital investment, and corresponding reductions in government spending. The imperative was to cut back the “welfare state” and expand the productive sector of the society. The rationale could be easily summarized: Charity was a fine sentiment, but it must be balanced against the more compelling goals of growth and productivity. Expenditures on all manner of public goods—schools, day-care centers, parks, recreational facilities, clean air and water, museums, libraries, health facilities, prenatal care—were moral luxuries to be afforded when and to the extent we could. Beginning in 1981, the federal government withdrew support from all these areas.

  Consider the implications of redefining investment to include not just bricks, mortar, and machines, but also spending on the health and education (both technical and moral) of our nation’s children. First, cutting taxes solely to increase the pool of savings available for investment is a dubious strategy in a world of global capital markets—domestic savings are no longer the key determinant of the rate of investment. Second, reducing government spending on the capacity of the next generation to produce wealth means cutting a crucial investment that is nowhere reflected in national economic accounts. Third, under this broader definition of investment, a larger part of the public debt should be counted not as an excess of current consumption over current receipts, but rather as capital spending. Because the U.S. government, unlike corporations, makes no distinction between current expenses and investment, the assault on public spending indiscriminately cuts both categories. Far from constituting luxuries to be traded off against growth and productivity, investments in human capital ought to be viewed as a central means of achieving prosperity.

  5

  In the early part of the nineteenth century, when Britain set out to reform its Poor Law, the debates were both furious and fundamentally inconclusive. According to historian Gertrude Himmelfarb,

  Even as the ideological battles were being most seriously waged … there were significant respects in which most of the parties in the dispute were in agreement, sharing the same moral and intellectual assumptions about poverty, making the same distinctions among the poor, focusing on the same group of poor as “the social problem” and using the same vocabulary to describe that group and that problem.9

  The debates over social benevolence that raged in America during the 1980s could be characterized in much the same terms. Casting the issue as how tough or compassionate we should be toward “the poor” obscured more central questions. Both conservatives and liberals focused the argument on the appropriate level and means of transferring income from the majority of Americans to a separate and distinguishable minority in need. Both endorsed the ultimate objective of making them independent of us, either by eliminating their need or our obligation. It was an argument merely over the extent and instruments of public charity.

  Again, the mythic view has distracted us from the heart of the matter. Our Benevolent Community in fact faces a set of moral and intellectual challenges more complex and universal than how to guard “them” against hardship while also helping “them” gain independence. Pooling risks and investments can offer common benefits only when such policies are founded on a shared sense of social obligation. Thus the challenges: How to spread the risks of hardship without inviting laxness? How to pool our investments in the next generation while also inspiring them to fulfill their potential?

  A society premised solely upon the principle of selfish interest, even of the enlightened variety, cannot summon the shared responsibility upon which any scheme of social insurance or social investment must depend. But it is equally true that a society premised upon altruism and compassion toward others cannot sustain these noble sentiments when the going gets tough. The former arrangement asks too little of its citizens; the latter, too much. A truly Benevolent Community must both inculcate mutual responsibility and simultaneously celebrate the resulting mutual gain.

  CHAPTER 17

  THE CYCLES OF RIGHTEOUS FULMINATION

  1

  A readiness to suspect Rot at the Top—a conviction that our major institutions are prone to corruption and irresponsibility—is an enduring aspect of the American character. Like the other American myths, this one changes over time, and its varying versions have manifested both the best and the worst features that distinguish our culture, from a healthy vigilance against abuse of authority to occasional hysteria over fiendish plotting by the devils of the day. Previous chapters have examined how prevailing versions of the other myths have drifted away from current realities, so that they are less faithful guides to the challenges we now confront. In emphasizing either toughness or magnanimity, they have ignored the central challenge of seeking joint benefits and avoiding joint losses. So too with the tale of Rot at the Top. This mythology, however, has a special feature. The other myths generally evolve in response (albeit often delayed or distorted) to our collective experience; today’s version of the Triumphant Individual is quite unlike that which prevailed fifty years ago, which was in turn different from that of one hundred years ago. But the myth of Rot at the Top tends to cycle, alternating its indictment between the two major realms of authority: political power and economic power.

  The liberal version of this tale typically concerns itself with the depredations of business; the conservative, with the bloat and meddling of government. As with the other stories, this one clearly has roots in reality. Both public and private bureaucracies have on occasion been impervious to the wants and needs of those whom they were established to serve. Neither has been immune from corruption, and both political and economic elites are often cushioned against the consequences of incompetence. At the highest reaches of either realm of authority—here as elsewhere—arrogance is often endemic.

  Yet our readiness to lay the blame for any ills that beset us at the door of the most visibly powerful among us—while it spares America the passivity and subservience that plague other cultures—risks obscuring the responsibility of citizens at large and our collective power and obligation to make choices. As we lament a pattern of failing farms, shuttered factories, or rising unemployment, our mythology comfortingly but insidiously blinds us to the fact that the fault lies not solely with “them”—with corporate or governmental leaders—but with us, in our failure to come to grips with the choices we face about what rules will govern our economic system. As the world becomes more complex and integrated, these choices become at once more daunting and more urgent. The temptation to avoid them increases, as does the cost. In excoriating in turn the demons of corporate or government control, we have fabricated a false dichotomy between economic and political leadership. We have paid scant attention to how the “market” should be organized and maintained to engender a productive working relationship between public and private spheres. As the global economy shrinks, this oversight has constrained our capacity to adapt.

  2

  By some accounts, public suspicions of Rot at the Top have deepened in recent years. In 1966, 42 percent of Americans surveyed expressed a “great deal of confidence” in congressional leaders, 41 percent were equally confident about the president and his cabinet, 55 percent expressed great confidence in corporate executives, and fully 62 percent felt the same way about top-level military officers. Even labor leaders (22 percent) and the press (29 percent) summoned a fair degree of confidence. But by 1981 a different picture emerged: Confidence in government and business leaders had fallen by about 20 percentage points; leaders in the military, labor unions, and the press suffered comparable declines.1 The drop was even more precipitous when the public was asked about the relationship between government and business. In 1958 more than three out of four Americans believed that their government was run for “the benefit of the people”; only 18 percent thought it was run “by a few big interests looking out for themselves.” By the end of 1972 opinions had reversed: Now more than half believed that the “big interests” were in control.2

  One could look upon this decline in confidence as a unique event in American history, attributable to the Vietnam War, the Watergate scandal, and economic stagnation. Left-leaning liberals, no less than conservatives, recoiled from the corruption and incompetence they saw at the top of American government. Daniel Ellsberg’s jeremiad to a crowd of students in the fall of 1971 exemplified nonpartisan scorn: “If there is one message I have gotten from the Pentagon papers, it is to distrust authority, distrust the president, distrust the men in power, because power does corrupt, even in America.”3 Modern liberals may defend government, and conservatives decry it, but when discussion turns to the latest adventure of the Central Intelligence Agency or the Pentagon their positions are often reversed.

  The recent decline in confidence is not unique to American history, however; what is more unusual, perhaps, is the high degree of confidence Americans felt for their leaders in the decades immediately following World War II. The subsequent decline has been more like a return to the normal level of suspicion that had been there from the start—against the imperial power of the British Parliament and Crown in the late eighteenth century; against the fragile central government established in Philadelphia and then in Washington; against the Bank of the United States, chartered corporations, and the caucus system in the 1830s; against Eastern plutocrats and bankers in the 1880s; against the urban machines and the trusts at the turn of the century; against the “economic royalists” of the 1930s. Dwight Eisenhower conjured up a specter of Rot at the Top that would haunt subsequent decades: “the military-industrial complex.”

  What precisely is the source of the Rot? Not great wealth alone. America’s rich have always claimed a large portion of the nation’s productive resources.4 But this has by no means always inspired general resentment or alarm. Opulence in America has provoked more ambition than hostility. In this, too, we are different from older cultures with feudal origins and histories of class conflict. For most of us, the rich are not “them”; they are what we aspire to become.5 We worry only when private wealth exercises political power. It was here that Theodore Roosevelt and Woodrow Wilson drew the line on the trusts, and Franklin D. Roosevelt damned the “economic royalists.” Private wealth applied to ostentatious consumption is perfectly appropriate; applied to the purchase of political power, it becomes diabolic.

  Nor is the Rot attributable to our political-economic system. In neither Marxism nor the fantasies of the far right have Americans found a plausible diagnosis of what ails the nation. For the most part, we have been proud of our system of government and we like capitalism. The Constitution, checks and balances, free speech, the right to vote, autonomous corporations, the free flow of capital, the profit motive, the free market—all of these features of democratic capitalism are held in deep regard. We may seek to improve them from time to time, but the basic principles remain sacrosanct. We worry only when the system is exploited by individuals or groups bent on accumulating power—when authority is misused. Nothing is held in lower contempt in America than “the interests”; nothing more reviled than monopolists, influence peddlers, and political cabals. Other nations, with parliamentary systems and deep social and economic cleavages, explicitly organize their politics around blocs of economic interests, whose leaders openly bargain on behalf of their constituents. In the United States, such corporatist negotiations are anathema. Pluralism—the play of competing groups for political influence—may provide an apt description for how policy actually gets made in America; but it is not how we choose to see ourselves. No one trusts the big boys negotiating in the back rooms.

  The cause of the Rot is assumed to be concentrated power. Our enduring crusade has been to check power wherever it occurs, whether in public or in private spheres. We have been less certain, however, about how we should go about the task, and what structures of leadership and organization should remain after we harness this danger.

  3

  When the myth takes up the depredations of business, the moral is that we must contain corporate malfeasance through strong government action. When the tale turns to the encroachments of government, the lesson is that we must liberate ourselves from meddling bureaucrats. Occasionally the two themes have been merged in a cautionary tale about collusion between business and government, but usually one or the other is seen as the source of the perversion. Casting issues of organization and authority as essentially moral struggles, though perhaps invigorating, tends to leave murky the question of how we are to arrange our common affairs once the miscreants have been routed. Thus each purging has appeared to invite a surge of unconstrained power on the other side.

  For most of this century, conservative Republicans have sympathized with business; liberals and Democrats, with government. These affiliations have rarely rendered the parties or their candidates durably attractive to the majority of the electorate. Rather, political campaigns most commonly have been waged against one or the other of these demons; American politicians do not bid for power, they beg the electorate’s aid in ejecting the rascals that currently hold power. Republicans and conservatives have spoken darkly of the threats to liberty posed by government and labor; Democrats and liberals, of the threats to democracy and equality posed by business.

  The pendulum of righteous fulmination has tended to swing back and forth as business, then government, alternate in ascendancy. It was business’s turn in the 1880s, with the advent of large-scale enterprise. Progressivist activism followed in the early decades of this century with the first large wave of government regulation, culminating in the establishment of the Federal Trade Commission, laws governing hours and working conditions, health standards for medicines and meat, and antitrust legislation. The pendulum swung back again in the 1920s, as business regained its preeminence, and government duly receded.6 Efforts to enact welfare and labor legislation, or to expand farm supports, failed or were vetoed. The courts found social reform legislation to be unconstitutional. The Harding, Coolidge, and Hoover administrations busied themselves dismantling what was left of the previous decades’ economic controls.

 
