We Can All Do Better


by Bill Bradley


  Obama had a window of opportunity to change our politics. As his first presidential initiative, he could have sought voluntary public financing of congressional and senatorial campaigns. In the early months of his presidency, he had a nearly 70-percent approval rating and a Democratic House and Senate. He could have said that “changing Washington” meant you had to end the money culture. The people had elected him to change Washington, he could have said, and so he wanted hearings on campaign finance reform in the first two weeks of the congressional session and a bill on his desk by March. I believe he would have gotten it, but his political advisors were said to have argued against pushing that reform, because it did not rank high in polls of what people really cared about. Of course it didn’t. How could you expect a population in economic shell shock to say anything other than “Deal with the economy, and protect my job and my 401(k), and give me access to health care”? The irony is that while reform wasn’t high on people’s lists, it was absolutely necessary in order to pass laws in the areas that people did list as their top five issues. Without reform, a battle to pass bold legislation on health care or energy or jobs would be a lobbyist’s feast, and large-scale success would be diminished—which is what happened.

  To anyone who practices politics, the corrupting influence of money seems obvious, but not to the Supreme Court. In 1976, in Buckley vs. Valeo, the Court said that the money spent by an individual on his or her own political campaign was political speech, protected under the free-speech clause of the Constitution, and therefore could not be limited. This opened the floodgates for rich people to finance their own efforts. The first thing both parties’ campaign committees want to know about prospective candidates is not their biography or whether they have leadership or communication skills but whether they can raise money. By that measure, the best candidate is the one who finances his or her own campaign. According to the Center for Responsive Politics, 250 of our 535 representatives and senators are millionaires.2 That’s 47 percent of Congress. Just 9 percent of Americans are millionaires.3

  The problem of money in politics is not new, of course. As the vice-presidential candidate in 1900 with William McKinley, Theodore Roosevelt saw firsthand how Mark Hanna, McKinley’s campaign manager, required corporations to contribute a percentage of the profits they stood to make from favorable Republican legislation. The practice revolted Roosevelt, who was a true progressive, good-government type. So in 1905, as president, he pressed Congress to ban corporate contributions to political campaigns, which it did in the Tillman Act of 1907. For more than a hundred years it was the law of the land. In 2010, the Supreme Court ruled, in Citizens United vs. Federal Election Commission, that the prohibition of corporate contributions for campaign ads was unconstitutional because it limited the free speech of corporations. (One wondered if the Second Amendment, the right to bear arms, would be the next corporate right.) It seems to me that only speech is speech—not money or T-shirts or tents in public parks or whatever. Once you go beyond speech, you are on the slippery slope that leads to arbitrary decisions influenced more by ideology than by common sense. The Court decisions of 1976 and 2010 have made any comprehensive effort at campaign finance reform very difficult. All proposals must be voluntary to pass constitutional muster with the Roberts Court.

  The reality of politics is so far removed from the Court’s decision in Citizens United that you wonder if the Court even understands how destructive the decision has been to our democracy. It makes you yearn for justices like President William Howard Taft, Senator Hugo Black, Senator Sherman Minton, Senator Harold Burton, and Governor Earl Warren, who actually practiced politics professionally before being elevated to the Court. They had a feel for people. They had made mistakes and suffered the electoral consequences. They knew the Court had to work with the world that existed and nudge it in particular directions. They didn’t force their opinions about what the Founders wanted two hundred years earlier on the America of their day. They knew what it was to compromise and build a coalition. It is, for example, quite conceivable that only a former politician like Warren—working with three former senators, Black, Minton, and Burton—could have gotten a unanimous decision on Brown vs. Board of Education, the landmark case that desegregated schools in America.

  The Roberts Court is now part of our political money problem, frequently choosing ideology over common sense, as if its judgments were divorced from the world around it. Its 2010 decision sits at the center of the selling of American democracy to the highest bidder. Year by year, election by election, decision by decision, power concentrates in fewer and fewer hands. The interests of the vast majority of Americans don’t seem to be as important to the Roberts Court as judicial purity. In Buckley, the Supreme Court said in effect that it was just fine that the candidate with little money has only a megaphone while the candidate with a lot of money has a microphone. In Citizens United, the Supreme Court approved unlimited contributions by super PACs that can steal elections through widely broadcast lies. Instead of being constructive about a real problem, as Teddy Roosevelt was, the Roberts Court congratulated itself for adhering to the narrowest interpretation of the Constitution. Today’s strict-constructionist justices would do well to heed the words of one of our key Founders, Thomas Jefferson, which are in plain view on the walls of his memorial:

  I am not an advocate for frequent changes in laws and constitutions. But laws and institutions must go hand in hand with the progress of the human mind. As that becomes more developed, more enlightened, as new discoveries are made, new truths discovered and manners and opinions change, with the change of circumstances, institutions must advance also to keep pace with the times. We might as well require a man to wear still the coat which fitted him when a boy as civilized society to remain ever under the regimen of their barbarous ancestors.

  Congress is for sale thanks to the Supreme Court’s actions, but the situation is even worse than that. Many corporations avoid the disclosure required under Citizens United, opting instead to establish 501(c)(4) and 501(c)(6) nonprofits (often the arms of super PACs), which can spend money on political advocacy without revealing the names of their donors. Some proponents of this scheme go so far as to argue that requiring disclosure would be a violation of First Amendment rights.

  In 1998, there were 10,406 registered lobbyists. In the peak year of 2007, there were 14,861.4 They don’t work because of charitable impulses. They serve people who want more money from government, arguing for the passage of laws, regulations, or policy favorable to their clients’ enterprises. The economist Mancur Olson argued that the influence of narrow interests can immobilize a democracy and prevent it from addressing the broad public interest.5 Such influence, like some cancers, grows slowly, but it can be fatal to the efforts by elected representatives to do what the vast majority of Americans need them to do.

  There is no place in public policy where the juxtaposition of narrow interest versus general interest is clearer than in tax policy. An ideal income tax system should have the lowest rates for the greatest number of taxpayers and assure that equal incomes are taxed equally and that those who have more pay more. Such a system would have lower rates than the current system and fewer loopholes. We’re a long way from that today. Every time regular taxpayers hear about one group or individual getting special treatment, it tells them that government belongs to the few. They’re not far from right. Worse, if the special interest gets the tax cut, money that might otherwise have gone toward alleviating the problems of all taxpayers—in big areas such as education, health care, or pensions—isn’t there. And because the appetite of the special interest is usually unquenchable, there will be requests for bigger breaks in the future.

  Citizens have reason to ask how our democracy works for them. In 1998 the amount spent on lobbying was $1.44 billion; twelve years later the figure was $3.51 billion.6 In 2009–2010, the financial industry made political contributions of $318 million at the federal level. Healthcare companies contributed $145.7 million. The energy industry gave $75.5 million.7 Is it any wonder that financial reform was watered down, health care reform had no public option to private insurance, and no energy program became law? Government unions block accountability for the performance of government workers. Congress blocks presidential appointments for the narrowest of reasons, trumping the popular will and rendering government more and more ineffective. To hide the cost of government spending, legislation is rife with special definitions, delayed effective dates, tax breaks extended only for short periods, and rosy economic projections. Laws are passed only to have their effects muted in the regulatory process or their implementation delayed for decades by legal challenges from plaintiffs who hope for repeal by a future administration. All of Washington and most state capitals are in the grip of an insidious kind of corruption—one that uses money and the law to further narrow interests at the expense of the interests of all of us. “It might be unethical, even immoral, but it’s not illegal” is the new watchword.

  Fannie Mae is the poster child of this kind of legal corruption. It is private in the sense that it issues stock to the public and pays its executives gigantic salaries. It is public in the sense that it was created by Congress in 1938. By purchasing mortgages from banks, which encouraged banks to make long-term mortgage loans, it fulfilled a public purpose. Because the market assumed that the government would back any losses, Fannie Mae was able to raise money at a lower interest rate than private financial institutions. It didn’t have to file its financial statements with the SEC, it paid no state or local taxes, it had access to a $2.5 billion line of credit at the Treasury, and it was subject only to congressional regulation. These subsidies were giant gifts from the Congress and were ferociously protected by Fannie Mae’s lobbying arm. Fannie asserted that all of the cost savings from these government subsidies were used to provide homeowners with lower mortgage rates. The truth was that Fannie kept one-third of the savings (in the billions) for itself and its highly compensated executives.8 Whenever a congressman talked of privatizing this behemoth, Fannie rallied its friends in Congress to beat back the attempt. Money was a crucial weapon in its arsenal; between 1989 and 2009 it devoted roughly $100 million to lobbying and political contributions.

  Sometimes the most cost-effective investment a company makes is in its Washington office. From 1998 to 2010, when the financial industry bestowed $2 billion on its various champions in Washington, the industry was transformed: Regulation disappeared, leverage increased, risk skyrocketed, and institutions became gargantuan. Each of these steps was made possible by changes in public policy. While the industry’s political contributions may be obscenely large, they pale in comparison to the profits made possible by the changes in law and regulation. When the financial industry almost collapsed at the end of that profligate decade, endangering the economic health of hundreds of millions of Americans, lobbyists for the industry went to work on the Hill to make sure that nothing fundamental would change. I’m reminded of a well-meant pep talk I delivered long ago to a high-school student working as a Senate page, which I concluded (perhaps rather patronizingly) with, “Learn how to write an English sentence, know the history and literature of the country. And then with a little luck, you can become a U.S. Senator, too.” The page looked at me, puzzled, and replied, “I want to be a lobbyist.”

  Finance is a little like religion: No one really knows why certain things happen, but we are all deeply convinced of the correctness of our own views. And so here are mine: The financial crisis of 2008–2009 resulted from the interplay of human greed and bad government policy. The former is nothing new; it has been around for as long as we’ve been on this Earth, and it’s what moves markets. But public policy must channel that greed; unchanneled, it leads to disaster. These are what I consider the five public-policy mistakes:

  1. The 1999 repeal of provisions of the Glass–Steagall Act. In the 1920s, banks took big leveraged risks and lost. The resultant financial calamity set the stage for mass unemployment, home foreclosures, and a decade of human suffering. Policymakers of that era were determined that such a crash should never occur again. They held extensive hearings to understand what had happened, and they discovered that banks were taking deposits and then speculating with their depositors’ money. The Congress decided to prohibit a bank from doing both; it split the old institution into two separate entities. Banks that took depositors’ money were called commercial banks, and the government insured their deposits (no more bank runs), but their investments were limited. Banks that didn’t take depositors’ money became known as investment banks, which in those days were mostly partnerships (each partner had joint and several liability for the firm’s finances), and they could speculate as much as they wanted with their own money. The system worked well. Investment banks, because they were investing their own capital, were careful investors. Commercial banks didn’t generate large returns, but they were stable. They took in deposits and loaned money out prudently to people who started new businesses, bought homes or cars, bought seed and tractors for the new planting season. The interest these banks paid to depositors was less than the interest they received on their loans, and their profit was this spread. These bankers (my father was one of them) became essential to their local communities.

  The arrangement flourished for many years. But by the 1980s, banks had begun to chafe at the limitations of the law. Hadn’t floating interest rates on loans and deregulated interest rates on deposits guaranteed the spread for banks and reduced their worry about quality in loans? Why shouldn’t commercial banks own insurance companies? Why shouldn’t commercial banks make investments for their own benefit and use leverage to turbocharge their returns? Why shouldn’t investment banks take deposits, too, and get the federal insurance? How could U.S. banks compete with the big Japanese, German, British, and Swiss banks without these new powers? Bankers maintained that it was ultimately a question of competitiveness. Their lips said it was all about the national interest, but their eyes blazed with dollar signs. When the Glass–Steagall provisions that had reined them in were repealed in 1999, huge integrated financial institutions arose to speculate with depositors’ money in relatively unaccountable ways. The share of financial assets held by the ten largest financial institutions went from 10 percent in 1990 to 75 percent twenty years later.9 All the upside was with the bank. All the downside was with the taxpayer. Heads I win. Tails you lose. The stage had been set for disaster.
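
  To put rough numbers on the contrast just described, here is a minimal sketch in Python; every rate, balance, and leverage ratio in it is a hypothetical illustration chosen for this passage, not a figure from the book.

```python
# Hypothetical illustration of the two banking models described above.
# All numbers are assumptions chosen only to make the contrast visible.

deposits = 100_000_000      # insured deposits at an old-style commercial bank
deposit_rate = 0.03         # interest paid to depositors
loan_rate = 0.06            # interest earned on prudent loans

# Old model: profit is the spread between loan interest earned and deposit interest paid.
spread_profit = deposits * (loan_rate - deposit_rate)
print(f"Profit from the spread: ${spread_profit:,.0f}")    # $3,000,000

# After repeal: the same deposits can back leveraged speculation.
leverage = 30               # assumed leverage ratio, for illustration only
position = deposits * leverage
adverse_move = 0.05         # a 5 percent loss on the speculative position

loss = position * adverse_move
print(f"Loss on the leveraged bet: ${loss:,.0f}")           # $150,000,000
# The loss exceeds the deposits themselves; the upside belonged to the bank,
# and the downside fell to the insurer of those deposits, the taxpayer.
```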

  2. The failure to regulate derivatives. In the 1990s, executives at the new megabanks found ways to bundle together real assets, like mortgages, slicing and dicing them into new financial instruments that symbolized a real thing but were just numbers on a screen. The daisy chain began with a tangible asset—a house, say, and someone who borrowed money to buy it. The buyer got a mortgage from a bank, which put a thousand such obligations, of varying quality, together into one security and sold it to the public, including other financial institutions. The bank no longer owned these mortgages or bore any responsibility for the loan payments or the mortgages’ quality. The investor who bought the security held the risk; the bank became simply a conduit. The derivatives team of the purchasing financial institution then took a thousand of these so-called mortgage-backed securities and packaged them into a financial instrument called a CDO (for “collateralized debt obligation”), which they in turn proceeded to sell to the public and other institutions. This was the second sale of the same asset. But it didn’t stop there. A thousand of these CDOs would then be packaged into a CDO-Squared, which was also sold by the latest institutional buyer. The same real asset—a house and its mortgage—had now been sold three times. The financial institutions that packaged each level of derivatives had essentially created money, and the suckers were on the other side of the trade.

  Many policymakers and way too many bank CEOs had not the foggiest idea of what was going on. Those who did lacked the strength or vision to call off the party. As Charles O. Prince, the former CEO of Citigroup, said when asked why his corporation bought and sold derivatives, “It bears emphasis that Citi was by no means alone in the view and that everyone, including our risk managers, government regulators and other banks all believed that these securities held virtually no risk.”10

  Finally, to take this process to an absurdity, the daisy chain of selling the same asset over and over was insured by something called a CDS (for “credit default swap”), a kind of insurance that promised to pay someone who bought the CDO if the entities involved couldn’t pay them. But this kind of insurance wasn’t backed by reserves to assure payment in a crunch. The CDS was nothing more than a fourth sale—all resting on the initial sale of one house. With so much money at stake, the only thing that could rain on the parade was government regulation. So in the last year of his administration, President Bill Clinton signed a law, passed by a Republican Congress, which specifically prohibited the government from regulating the trading of derivatives.
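
  To see how quickly this layering compounds, here is a minimal sketch in Python that simply walks the chain described above using the passage’s stylized round numbers; the bundle size, the labels, and the resulting totals are illustrations, not market data.

```python
# Illustrative walk through the derivatives chain described in the passage.
# The bundle size and layer names follow the text; none of this is market data.

BUNDLE = 1_000  # instruments packaged into each new layer, per the passage

resales = [
    ("mortgage-backed security", f"{BUNDLE} mortgages bundled into one security and sold"),
    ("CDO",                      f"{BUNDLE} mortgage-backed securities bundled and sold again"),
    ("CDO-squared",              f"{BUNDLE} CDOs bundled and sold again"),
    ("credit default swap",      "insurance written on a CDO, with no reserves behind it"),
]

# The same underlying house is, in effect, sold once at every layer.
for n, (name, what) in enumerate(resales, start=1):
    print(f"Sale {n}: {name}: {what}")

# If each layer really did bundle 1,000 distinct pieces of the layer below
# (the passage's stylized arithmetic), one CDO-squared ultimately rests on:
print(f"{BUNDLE ** 3:,} underlying mortgages")   # 1,000,000,000
```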

  3. The loosening of regulations for Fannie Mae and Freddie Mac. It used to be that if you were a president who wanted low-income people to have homes, you had two choices: You could either subsidize their purchase of a home or build housing for them. But in the 1990s, there emerged a third alternative. You could instruct Fannie Mae and Freddie Mac to take on greater risk by investing in subprime mortgages given to people who, by traditional standards, would never have qualified for a loan. The relaxed underwriting standards quickly spread to private lenders, and housing speculation exploded. Prudent capital requirements were jettisoned, and profits soared. Mortgages were given by banks that never checked the income of applicants. These loans were quickly packaged and sold, and Fannie and Freddie were the biggest purchasers. Their breakdown was only a matter of time.

 
