The European Dream

by Jeremy Rifkin


  The new procedures represent an about-face from the way the chemical industry is regulated in the United States. In America, newly introduced chemicals are generally presumed to be safe, and the burden falls primarily on the consumer, the public at large, or the government to prove that they cause harm. The European Union has reversed the burden of proof. Margot Wallstrom makes the point that “no longer do public authorities need to prove they [the products] are dangerous. The onus is now on industry” to prove that the products are safe.13

  The new EU policy represents a sea change in the handling of risks. In the United States, regulation is designed, for the most part, to address environmental problems once they occur. The Toxic Substances Control Act (TOSCA), passed in 1976, is America’s primary governmental tool for regulating toxic chemicals but is generally regarded as “being weak and too deferential to industry.”14 The vast majority of non-pesticide chemicals are not screened or tested at all before introduction into the marketplace. Even though the National Environmental Policy Act (NEPA) requires environmental-impact statements in advance of some scientific experiments and technological applications, it has been narrowly applied by the federal courts and restricted in its use. Even when it has been used, the threshold criteria for fulfilling NEPA requirements are so weak as to render the act largely ineffective in most instances. The European Union’s regulatory approach, in stark contrast, is designed to prevent harm before it occurs.

  Making companies prove that their chemical products are safe before they are sold is a revolutionary change. It’s hard to conceive of the U.S. entertaining the kind of risk-prevention regulatory regime that the EU has rolled out. In a country where corporate lobbyists spend literally billions of dollars influencing congressional legislation, enacting a regulatory regime similar to the one being implemented in Europe would be nigh on impossible.

  What makes the new risk-prevention regime even more impressive is that the European Union is the largest chemical producer in the world, accounting for 28 percent of total world output of chemical products.15 The industry, the third largest in the European manufacturing sector, has annual sales of €519 billion, employs 1.7 million people, and is responsible for an additional 3 million related jobs.16 Even so, the European Commission drove the regulatory process forward.

  The U.S. government and chemical industry—as well as European chemical companies and associations—have fought the new regulations. The U.S. says the EU chemical regulations threaten the export of more than $20 billion in chemicals that the U.S. sells to Europe each year.17 Undeterred, the European Commission endorsed the proposed regulations in October 2003. It is estimated that implementing REACH will cost the chemical industry about €2.3 billion over the next eleven years.18 The cost to downstream users (manufacturers who use chemical substances in their products) is expected to be around €2.8 to €3.6 billion over a similar period.19 Some environmental organizations complain that the final regulations were watered down and need to be strengthened. Even so, the very fact that the European Union has become the first political unit in the world to transfer risk to companies, making them responsible for proving the safety of their products, represents a new departure in addressing how best to regulate the environmental and health risks that accompany new scientific and technological pursuits. The new proposals still have to be acted upon by the European Parliament and the European Council.

  GMOs and chemical products represent just part of the new “risk-prevention” agenda taking shape in Brussels. In early 2003, the European Union adopted a new rule prohibiting electronics manufacturers from selling products in the EU that contain mercury, lead, or other heavy metals.20 Another new regulation requires the manufacturers of all consumer electronics and household appliances to cover the costs of recycling their products. American companies complain that compliance with the new regulations will cost them hundreds of millions of dollars a year.21

  All of these strict new rules governing risk prevention would come as a shock to most Americans, who have long believed that the United States maintains the most vigilant regulatory oversight regime in the world for governing risks to the environment and public health. Although that was the case thirty years ago, it no longer is today.

  The new attention to risk prevention in Europe reflects a new sensibility about sustainable development and global stewardship of the Earth’s resources and environment. Some observers note that at least some of the impetus for strengthening regulatory oversight is the result of recent failures of Europe’s regulatory procedures in handling the BSE outbreaks in cattle in the U.K. and other countries, the contamination of the blood supply with HIV in France, the Perrier benzene scare, and other environmental and health calamities. While these incidents contributed to the heightened concern for better regulatory oversight, larger forces at work, well before these recent events, helped shape a new risk-prevention approach across the continent.

  The long-term effect of acid rain on the Black Forest in Germany; the release and spread of a deadly radioactive cloud over much of Europe in the wake of the Chernobyl nuclear-power-plant meltdown; the heightened fears over violent weather-pattern changes, including floods in Central and Eastern Europe, which many attribute to the impacts of global warming; and the proliferation of chemical and biological weapons have all sensitized Europeans to the growing global environmental and health risks attendant to the new era. Europe’s new sensitivity to global risks has led it to champion the Kyoto Protocol on climate change, the Biodiversity Treaty, the Chemical Weapons Convention, and many other treaties and accords designed to reduce global environmental and health risks. As mentioned in chapter 14, the U.S. government has refused, to date, to ratify any of the above agreements.

  The European Union is the first governing institution in history to emphasize human responsibilities to the global environment as a centerpiece of its political vision. Nation-states have a very different mission. Their aim has always been to expand territorial reach, exploit the Earth’s largesse, and advance material wealth. The Earth, in the nation-state era, has been viewed primarily as a resource. Science and technology, in turn, have been the tools used to probe nature’s secrets and harness her potential wealth. The goal was—and still is—economic growth and accumulation of property.

  While the member states of the EU are still very much wedded to the older nation-state mission, with its emphasis on the right to exploit nature’s resources, the people of Europe find themselves, at the same time, inexorably pulled toward a new global center of gravity where obligations to preserve the integrity of the Earth itself are of equal priority. The new crosscutting loyalties to both material self-interests and global environmental responsibilities represent the emergence of a new frame of mind for which there is no historical precedent. That’s not to say that others, elsewhere, don’t feel a similar tug. But in the U.S., for example, my sense is that global environmental concerns have somewhat less resonance among the public at large—although it’s hard to quantify—and far less attraction to political elites and policymakers.

  In Europe, intellectuals are increasingly debating the great shift from a risk-taking age to a risk-prevention era. That debate is virtually nonexistent among American intellectuals. The new European intellectuals argue that vulnerability is the underbelly of risk. To the extent that individuals, and society as a whole, perceive greater opportunities than negative consequences in taking risks, they are “risk takers.” Americans, we’ve already noted, are a risk-taking people. Europeans, on the other hand, are far more risk-sensitive. Much of their outlook is conditioned by a checkered history in which risk-taking resulted in significant negative consequences for society and posterity. Risk-sensitivity, however, has a silver lining. A sense of vulnerability can motivate people to band together in common cause. The European Union stands as a testimonial to collective political engagement arising from a sense of risk and shared vulnerability. A sense of vulnerability can also lead to greater empathy for others, although it can also generate fear and retaliation toward outsiders, especially if they are perceived to be somehow to blame for one’s compromised circumstances.

  The severing of the individual from the collective in the industrial era created a new sense of risk exposure and vulnerability. Private and public insurance were ways of pooling risks to provide for one another. Insurance became a means of reducing vulnerability in an otherwise atomized, autonomous world. Although many Americans enjoy private insurance and the government provides public insurance in the form of Social Security, the idea of insurance—especially of a public nature—is much more developed in Europe. This is due, in part, to Europeans’ never fully accepting the Enlightenment notion of the autonomous individual responsible, in toto, for his or her fate. Europeans have continued to maintain a balance—at times uncomfortable—between individual autonomy and a collective risk-sharing responsibility. It’s the legacy of Catholic doctrine, feudal arrangements, and walled cities. Even the Protestant Reformation, with its near obsession with the individual, couldn’t totally pry Europeans from an older and deeper communal affiliation.

  What’s changed qualitatively in the last half century since the dropping of the atomic bombs on Hiroshima and Nagasaki is that risks of all kinds are now global in scale, open-ended in duration, incalculable in their consequences, and impossible to compensate for. Their impact is universal, which means that no one can escape their potential effects. Risks have now become truly democratized, making everyone vulnerable. When everyone is vulnerable, and all can be lost, traditional notions of calculating and pooling risks become virtually meaningless. This is what European academics call a “risk society.”

  Americans aren’t there yet. While some academics speak to global risks and vulnerabilities and a significant minority of Americans express their concerns about global risks, from climate change to loss of biodiversity, the sense of utter vulnerability just isn’t as strong on this side of the Atlantic. Europeans say we have blinders on. In reality, it’s more nuanced than that. Most Americans still hold firm to the underlying pillar of the American Dream—that each person is ultimately the captain of his or her own fate. Call it delusional, but the sense of personal empowerment is so firmly embedded in the American mind that even when pitted against growing evidence of potentially overwhelming global threats, most Americans shrug such notions off as overly pessimistic and defeatist. Individuals can move mountains. Most Americans believe that. Fewer Europeans do.

  Can one effectively build a dream on a sense of shared global risks and vulnerabilities? European elites think so. The European public is less sure, although the anecdotal evidence suggests they are more likely than any other people in the world to give it a try. Here in America, however, where 293 million individuals have been weaned on eternal optimism, each socialized to believe that he or she can make his or her own way against all external odds, the possibility that a collective risk-prevention approach to scientific and technological pursuits might find a responsive audience is doubtful.

  The European Union has already institutionalized a litmus test that cuts to the core of the differences that separate the new European view of shared risks and vulnerabilities from the older American view of unlimited personal opportunities and individual prowess. It’s called “the precautionary principle,” and it has become the centerpiece of EU regulatory policy governing science and technology in a globalizing world. Most European political elites, and the public at large, favor it. Far fewer American politicians and citizens would likely countenance it.

  The Precautionary Principle

  In February 2000, the European Commission adopted a communication on the use of the precautionary principle in the regulatory oversight of science and technology innovations and the introduction of new products into the marketplace, society, and environment. According to the commission, a proposed experiment, technology application, or product introduction is subject to review and even suspension in “cases where scientific evidence is insufficient, inconclusive or uncertain and preliminary scientific evaluation indicates that there are reasonable grounds for concern that the potentially dangerous effects on the environment, human, animal or plant health, may be inconsistent with the high level of protection chosen by the EU.”22 The key term in the directive is “uncertain.” When there is sufficient evidence to suggest a potential deleterious impact but not enough evidence to know for sure, the precautionary principle kicks in, allowing regulatory authorities to err on the side of safety by suspending the activity altogether, modifying it, employing alternative scenarios, monitoring the activity to assess causal impacts, or creating experimental protocols to better understand its effects. The architects of the commission directive are quick to point out that the precautionary principle is to be invoked in a reasoned and nonarbitrary manner to ensure that it isn’t used as a political or economic hammer to advance other objectives. The directive states,

  Where action is deemed necessary, measures should be proportionate to the chosen level of protection, non-discriminatory in their application and consistent with similar measures already taken. They should also be based on an examination of the potential benefits and costs of action or lack of action and subject to review in the light of new scientific data and should thus be maintained as long as the scientific data remain incomplete, imprecise or inconclusive and as long as the risk is considered too high to be imposed on society.23

  The first known instance in which the precautionary principle was put into effect occurred in September 1854 in the parish of St. James in central London. A London physician, John Snow, was investigating the source of a cholera outbreak that had taken five hundred lives in a ten-day period. Snow had published an earlier study comparing two water companies—one whose water was clean, the other whose water was contaminated by sewage. He theorized that the unclean water was linked to cholera. The study was already producing data to support his thesis at the time of the cholera outbreak. A quick investigation showed that all of the eighty-three people who had died in the Golden Square area between August 31 and September 5 had drunk water from the contaminated Broad Street pump rather than water supplied by the cleaner company. He recommended to authorities that the handle of the Broad Street pump be removed. The action averted a further cholera outbreak. It should be emphasized that most scientists at the time did not share Snow’s view; they believed that cholera was carried by airborne contamination. The scientific link between polluted water and cholera wasn’t established until thirty years later.24

  The decision to follow Snow’s advice was a classic example of the precautionary principle at work—that is, taking action in a situation where there is reason to believe that there is a causal connection between an activity and deleterious consequences without yet having sufficient scientific proof to back up the claim.

  The first use of the precautionary principle in public policy came in the 1970s in Germany. German scientists and public officials were voicing increasing concern over “forest death” in Germany. They suspected that acid rain caused by air pollution was the cause but did not yet have ironclad scientific proof. Nonetheless, the German government made the decision to cut power-plant emissions with the passage of the German Clean Air Act of 1974, citing the principle of Vorsorge, or “forecaring.”25 The “precautionary principle” soon became a canon of German environmental law. The precautionary principle was “to be used in situations of potentially serious or irreversible threats to health or the environment, where there is a need to act to reduce potential hazards before there is strong proof of harm, taking into account the likely costs and benefits of action and inaction.”26

  The precautionary principle is designed to allow government authorities to respond pre-emptively, as well as after damage is inflicted, with a lower threshold of scientific certainty than has normally been the rule of thumb in the past. “Scientific certainty” has been tempered by the notion of “reasonable grounds for concern.” The precautionary principle gives authorities the maneuverability and flexibility to respond to events in real time, either before they unfold or while they are unfolding, so that potential adverse impacts can be forestalled or reduced while the suspected causes of the harm are being analyzed and evaluated.

  Advocates of the precautionary principle argue that had it been invoked in the past, many of the adverse effects of new scientific and technological introductions might have been prevented, or at least mitigated. They cite the introduction of halocarbons and the resulting hole in the ozone layer of the Earth’s upper atmosphere, the outbreak of BSE in cattle, the growth of antibiotic-resistant strains of bacteria caused by the over-administering of antibiotics to farm animals, and the widespread deaths caused by asbestos, benzene, and PCBs.27

  In these and other instances, there were telltale signs of potentially harmful effects, often right from the time of their introduction. The warning signals were ignored for a variety of reasons, including conflicts of interest among the researchers responsible for overseeing possible threats. For example, in the United States, the Animal and Plant Health Inspection Service (APHIS) of the United States Department of Agriculture (USDA) is responsible for monitoring health problems in the nation’s farm animals and plants. But the USDA is also charged with promoting American agricultural products. In countless instances, the department has been less than rigorous in pursuing potential adverse environmental and health effects caused by existing agricultural practices if those practices might, in any way, threaten the welfare of the agricultural interests it also serves.

 
