The whole doomsday case boils down to claiming that if something isn't done to curb CFCs, ultraviolet radiation will increase by 10 percent over the next twenty years. But from the poles to the equator it increases naturally by a whopping factor of fifty, or 5,000 percent, anyway!—equivalent to 1 percent for every six miles. A family moving house from New York to Philadelphia would experience the same increase as is predicted by the worst-case depletion scenarios. Alternatively, they could live 1,500 feet higher in elevation—say, by moving to their summer cabin in the Catskills.
Superposed on this is a minimum 25 percent swing from summer to winter, and on top of that a ten- to twelve-year pattern that follows the sunspot cycle. Finally there are irregular fluctuations caused by the effects of volcanic eruptions, electrical storms, and the like on atmospheric chemistry. Expecting to find some "natural" level that shouldn't be deviated from in all this is like trying to define sea level in a typhoon.
Skin cancer is increasing, nevertheless. Something must be causing it.
An increasing rate of UV-induced skin cancers means that more people are receiving more exposure than they ought to. It doesn't follow that the intensity of ultraviolet is increasing, as it would if ozone were being depleted (in fact it's decreasing). Other considerations explain the facts better, such as that sun worship has become a fad among light-skinned people only in the last couple of generations; or the migrations in comparatively recent times of peoples into habitats for which they are not adapted, for instance the white population of Australia. (Native Australians have experienced no skin cancer increase.)
Deaths from drowning increase as you get nearer the equator—not because the water becomes more lethal, but because human behavior changes: not many people go swimming in the Arctic. Nevertheless, when it comes to skin cancer the National Academy of Sciences has decided that only the variation of UV matters and, from the measured ozone thinning from poles to equator and the change in the Sun's zenith angle, determined that a 1 percent decrease in ozone equates to a 2 percent rise in skin cancer. 179
How you make a disaster scenario out of this is to ignore the decline in surface UV actually measured over the last fifteen years, ignore the reversal that shows ozone to have been increasing again since 1986, and extend the 1979–86 downward slope as if it were going to continue for the next forty years. Then, take the above formula as established fact and apply it to the entire United States population. Witness: According to the NAS report (1975), approximately 600,000 new cases of skin cancer occur annually. So, by the above, a 1 percent ozone decrease gives 12,000 more skin cancers. Projecting the 5 percent ozone swing from the early eighties through the next four decades gives 25 percent, hence a 50 percent rise in skin cancer, which works out at 300,000 additional cases in the year A.D. 2030, or 7.5 million over the full period. Since the mortality rate is around 2.5 percent, this gives the EPA's "200,000 extra deaths in the United States alone." Voilà: instant catastrophe.
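The arithmetic of that projection can be sketched in a few lines. The inputs are the figures quoted above (600,000 annual cases, 2 percent more cancer per 1 percent less ozone, 2.5 percent mortality); the fifty-year linear ramp is my own assumption, chosen because it reproduces the quoted totals:

```python
# Sketch of the EPA-style projection described in the text. All inputs are
# the figures quoted above; the 50-year linear ramp is an assumption.
baseline_cases = 600_000                      # annual U.S. skin cancers (NAS, 1975)
extra_per_pct_ozone = 0.02 * baseline_cases   # 2% more cancer per 1% ozone loss

ozone_loss_2030 = 25                          # percent, extrapolated from the 5% dip
extra_cases_2030 = ozone_loss_2030 * extra_per_pct_ozone   # 300,000 extra cases

years = 50                                    # assumed ramp, roughly 1980-2030
total_extra = extra_cases_2030 / 2 * years    # a linear ramp averages half the endpoint
extra_deaths = 0.025 * total_extra            # 2.5 percent mortality

print(extra_cases_2030, total_extra, extra_deaths)
# extra_deaths comes out at 187,500 -- the EPA's rounded "200,000"
```

The whole chain stands or falls on the extrapolated 25 percent figure; the per-percent formula and the mortality rate are simply taken as given.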
As if this weren't flaky enough, it is known that the lethal variety of skin cancer has little to do with UV exposure, anyway. The cancers that are caused by radiation are recognizable by their correlation with latitude and length of exposure to the Sun, and are relatively easily treated. The malignant melanoma form, which does kill, affects places like the soles of the feet and underarm as well as exposed areas, and there is more of it in Sweden than in Spain. It is increasing significantly, and as far as I'm aware the reasons why are not known.
A Few Coincidences
So, what's going on? What are publicly funded institutions that claim to be speaking science doing, waving readings known to be worthless, faking data, pushing a cancer scare that contradicts fact, and force-feeding the public a line that basic physics says doesn't make sense? The only thing that comes through at all clearly is a determination to eliminate CFCs at any cost, whatever the facts, regardless of what scientists have to say.
Would it come as a complete surprise to learn that some very influential concerns stand to make a lot of money out of this? The patents on CFCs have recently run out, so anybody can now manufacture them without having to pay royalties. Sixty percent of the world market is controlled by four companies: DuPont Chemical and Allied Chemical in the United States, Imperial Chemical Industries in Britain, and Atochem in France. These companies are already losing revenues and market share to rapidly growing chemicals industries in the Third World, notably Brazil, South Korea, and Taiwan, which threatens their entire pricing structure. They also hold, in a sharing arrangement, the patents on the only substitutes in sight, which will restore monopoly privileges once again if CFCs are outlawed.
Ultraviolet light has many beneficial effects as well as detrimental. For all anyone knows, an increase such as the one that's being talked about could result in more overall good than harm. But research proposals to explore that side of things are turned down, while doomsayers line up for grants running into hundreds of millions. United Nations departments that nobody ever heard of and activists with social-engineering ambitions could end up wielding the power of global police. The race is on between chemicals manufacturers to come up with a better CFC substitute, while equipment suppliers will be busy for years. Politicians are posturing as champions to save the world, and the media are having a ball. Bob Holzknecht, who runs an automobile air-conditioning company in Florida, and who has been involved with the CFC industry for over twenty years, observed when I talked to him: "Nobody's interested in reality. Everyone who knows anything stands to gain. The public will end up paying through the nose, as always. But the public is unorganized and uninformed."
Good science will be the victim too, of course. For a while, anyway. But truth has a way of winning in the end. Today's superstitions can spread a million times faster than anything dreamed of by the doom prophets in days of old. But the same technologies which make that possible can prove equally effective in putting them speedily to rest, too.
Well, that's the way I wrote it originally. Nowadays I'm not so sure about the "equally effective" in that last sentence. It can take a long time. Yet I still find grounds for optimism in the long term. Wrecking is so much easier than building. But machines and cities are constructed; works of art and thought are accomplished. Overall, the human propensity to create seems vastly to outweigh the destructiveness.
To round things off, the following is taken from "Fact-Free Science," which appeared two years later, in Analog Science Fiction and Fact, April 1995.
The bottom-line test, after all the modeling and arguments over atmospheric chemistry are said and done with, is the amount of ultraviolet light reaching the Earth's surface. If stratospheric ozone were under relentless chemical attack in the way that we're told, the measured UV ought to be increasing. People who have measured it say it isn't.
In 1988, Joseph Scotto of the National Cancer Institute published data from eight U.S. ground stations showing that UV-B (the wavelength band affected by ozone) decreased by amounts ranging from 2 to 7 percent during the period 1974–1985. 180 A similar politically wrong trend was recorded over fifteen years by the Fraunhofer Institute of Atmospheric Sciences in Bavaria, Germany. 181
The response? Scotto's study was ignored by the international news media. He was denied funding to attend international conferences to present his findings, and the ground stations were closed down. The costs of accepting the depletion theory as true will run into billions of dollars, but apparently we can't afford a few thousand to collect the data most fundamental to testing it. In Washington, D.C., scientists who objected were attacked by environmentalist pressure groups, and former Princeton physics professor William Happer, who opposed the (1995) administration and wanted to set up an extended instrumentation network, was dismissed from his post as research director at the Department of Energy. The retiring head of the German program was replaced by a depletionist who refused to publish the institute's accumulated data and terminated further measurements, apparently on the grounds that future policy would be to rely on computer models instead. 182
Critics jeered, and the depletion lobby was not happy. Then, after a lengthy silence, a paper appeared in Science, claiming that upward trends in UV-B had been shown to be linked to ozone depletion. 183 So, suddenly, all of the foregoing was wrong. The party line had been right all along. Depletion was real after all.
The study showed plots of ozone above Toronto declining steadily through 1989–1993, and UV increasing in step over the same period. But Dr. Arthur Robinson, the same whom we met before in the discussion on Global Warming, noticed something curious: Although the whole point was supposed to be the discovery of a correlation between decreasing ozone and increasing UV-B, nowhere in the study was there a graph relating these two quantities one to the other. 184 Neither were there any numbers that would enable such a graph to be constructed. Robinson enlarged the published plots and performed his own analysis. And the reason why no consequential trend line was shown, he discovered, was that there was no trend.
For the first four years, the ozone and UV-B rose and fell together: completely opposite to what the paper claimed to show. The result wouldn't have surprised depletion skeptics, however, who never accepted in the first place that UV has to go up as ozone goes down. Rather, since UV creates ozone out of oxygen in the upper atmosphere, more UV getting through to the surface means a higher incoming UV flux, and so more ozone is being made up there. Hence, all else being equal, both quantities should change together with the seasonal variations and fluctuations of the sun. And the 1989–1992 pattern shows just that.
But all else isn't always equal. Ozone worldwide fell through the second half of 1992 to reach an extraordinarily low level in 1993. Satellite maps for this period show the diffusion through the stratosphere of huge plumes of sulfur dioxide from the Mount Pinatubo volcano eruption in 1991. This would extend to global dimensions the depletion chemistry usually restricted to polar ice clouds and responsible for the notorious Antarctic "hole" (replacement can't occur in the winter months because there's no sun).
So the low 1993 ozone was not caused by unusually low solar activity. Solar activity was normal, which would be expected to result in above-normal UV intensity because of the chemically thinned ozone cover. This one-time event was then stretched out to create an illusory trend beginning in 1989. In fact, it was produced from just four high readings out of more than 300 data points. 185
Logically, this would be like proving to the landlord that there's damp rising in the house by waiting for a flood and then averaging the effect back over four years. If the lawyers catch on to this one, it could open up a whole new world of liability actions.
The May 27, 1994, issue of Science carried a letter from Professors Patrick J. Michaels, Office of Climatology of the University of Virginia, and S. Fred Singer, director of the Science and Environmental Policy Project, Maryland, and research associate Paul C. Knappenberger, also from the University of Virginia, stating that the study was "so flawed as to require a formal withdrawal from the scientific literature."
Saving The Mosquitoes:
The War On DDT
When all its work is done the lie shall rot;
The truth is great, and shall prevail,
When none cares whether it prevail or not.
— Coventry Patmore, from the poem "Magna est Veritas"
The DDT controversy takes us back over thirty years and might have slipped from the memories of some. Others may never have been cognizant of it in the first place. And that's a good reason for selecting it for inclusion, for it constitutes the original, model environmental catastrophe scenario and protest movement, setting the pattern for just about all of the major issues that have become news since.
Some Background Intelligence: Malaria
The biggest single killer of human beings through history has been malaria. Before the 1940s, 300 million new cases were contracted annually worldwide, and of those stricken, 3 million died. Six to seven million cases occurred every year in the United States, primarily in the South and parts of California.
Malaria is caused by a genus of protozoan—the simplest, single-cell animal form—called Plasmodium, which comes in four species. In the human bloodstream they take a form known as merozoites, which burrow into the red blood cells and reproduce asexually, each one producing 6 to 26 new individuals which burst out to infect new blood cells on a cycle that repeats every forty-eight hours. When the number of merozoites exceeds about 50 per cubic millimeter of blood (a small drop), the victim suffers a malaria attack every forty-eight hours. In a heavily infected person, the number of plasmodia present can be as high as 2 million per cubic millimeter.
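To put that forty-eight-hour cycle into perspective, here is a toy calculation. Only the 6-to-26 offspring range and the 50-per-cubic-millimeter threshold come from the text; the starting count, blood volume, and mid-range multiplier are illustrative assumptions:

```python
# Toy model of blood-stage parasite growth under the ~48-hour asexual cycle
# described above. Starting count and blood volume are assumptions.
blood_volume_mm3 = 5_000_000       # ~5 liters of blood, in cubic millimeters
attack_threshold = 50              # merozoites per mm^3 before attacks begin
offspring_per_cycle = 16           # mid-range of the 6-to-26 figure

merozoites = 10                    # assumed initial inoculum
days = 0
while merozoites / blood_volume_mm3 < attack_threshold:
    merozoites *= offspring_per_cycle   # each parasite bursts its blood cell
    days += 2                           # one cycle every forty-eight hours
print(days)   # reaches the attack threshold in exactly two weeks
```

The point is only the shape of the curve: multiplying by an order of magnitude every two days leaves very little time between a subclinical infection and a heavy one.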
The severity of the symptoms depends on the species involved, but a typical attack consists of severe frontal headache and pain in the neck, lower back, and limbs, dizziness and general malaise, accompanied by waves of chill and seizures alternating with fever temperatures of up to 104°F and profuse sweating, acute thirst and vomiting being not uncommon. The falciparum variety can kill up to 40 percent of those affected. Deaths occur mainly among children under five years old. For those who survive, the pattern continues for several months, and then gives way to symptom-free periods punctuated by relapses that occur over anywhere from a year to ten years. The effects can be sufficiently debilitating to incapacitate 80 percent of a workforce, with such consequences as preventing harvesting of a food crop, thus rendering a population vulnerable to all of the opportunistic threats that come with malnutrition and an impaired immune system, such as hepatitis, tuberculosis, dysentery, and typhoid fever. Transmission from person to person takes place through the ingestion of blood by females of the Anopheles mosquito, and re-injection of Plasmodium into a new victim via the saliva after undergoing another part of its life cycle within the mosquito's stomach.
Since, through most of history, eliminating the mosquito was never feasible, attempts at checking the spread of the disease were directed at destruction of the breeding grounds. The two main methods were draining of swamps and marshy areas, which dates back to the Romans, and covering lakes and open areas of water with oil from early spring to fall, to prevent the mosquito larvae from breathing. Where irrigation channels were needed for agriculture, a common practice was to introduce the "mosquito fish" Gambusia, a typically arduous and expensive undertaking, since it was usually necessary to first eradicate such predatory types as catfish, which were partial to Gambusia. These measures were partially successful at best, and confined to the more developed countries. Only Italy achieved what seemed to be eradication, after a fifteen-year program of intensive effort under Mussolini, but the victory turned out to be temporary.
Then, in 1939, Paul Mueller, a chemist working for J. R. Geigy S.A. in Switzerland, developed a compound, dichloro-diphenyl-trichloroethane—DDT—by combining chlorals with hydrocarbons and phenols, that was cheap, easy to produce and use, nontoxic to mammals and plants, but extremely toxic on contact to insects and various other arthropods. The Allies quickly recognized its value for wartime use and found it 100 percent effective as a fumigant against the ticks and body lice that transmit typhus, which in World War I had killed millions of soldiers and civilians in Europe. In early 1944 an incipient typhus epidemic in Naples was halted with no adverse side effects apart from a few cases of very minor skin irritation, after efforts with more conventional agents achieved only limited results. A plague epidemic in Dakar, West Africa, was stopped by using DDT to eliminate the carrier fleas, and it was mobilized with great success against malaria in the Pacific theater, Southeast Asia, and Africa. After the war, DDT became widely available not only for the reduction of insect-transmitted human diseases but also of a wide range of agricultural, timber, and animal pests. The results from around the world seemed to bear out its promise as the perfect insecticide.
For combating malaria, it was sufficient to spray the walls and ceiling of dwellings once or twice a year. Malaria mosquitoes rested in these places when inactive, and the DDT penetrated via their feet. Incidence in India in the 1940s was over 100 million cases annually, of which 2.5 million died. By 1962 these numbers were down to 5 million and 150,000, while life expectancy had risen from thirty-two to forty-seven. 186 A 1.5-ounce shot glass of DDT solution covered twelve by twelve feet of wall. The cost per human life saved worked out at about twenty cents per year. In the same period, India's wheat production increased from less than 25 million tons to over 100 million tons per year due to a combination of pest reduction and a healthier workforce. Ceylon—now Sri Lanka—reduced its malaria figures from 3 million cases and 12,000 deaths per year in the early fifties to 31 cases total in 1962, and 17 cases the year after, with zero deaths. Pakistan reported 7 million cases of malaria in 1961, which after the introduction of an aggressive spraying program had fallen to 9,500 by 1967. 187
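A quick way to see the scale of those before-and-after figures is to express each as a percentage reduction. The numbers below are exactly the ones quoted above; only the grouping is mine:

```python
# Percentage reductions implied by the malaria case figures quoted in the text.
figures = {
    "India, annual cases (1940s -> 1962)":   (100_000_000, 5_000_000),
    "Ceylon, annual cases (1950s -> 1962)":  (3_000_000, 31),
    "Pakistan, annual cases (1961 -> 1967)": (7_000_000, 9_500),
}
for label, (before, after) in figures.items():
    reduction = 100.0 * (1 - after / before)   # percent of cases eliminated
    print(f"{label}: {reduction:.3f}% reduction")
```

India's drop is 95 percent; Ceylon's and Pakistan's both exceed 99.8 percent, which is why the Ceylon figure of 31 residual cases is so often cited.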
In Africa, in what is considered to be its second most important medical benefit after reducing malaria, DDT proved effective in a program to control the bloodsucking tsetse fly, which transmits the protozoan responsible for deadly sleeping sickness and also fatal cattle diseases. According to the World Health Organization, 4 million square miles of land that had been rendered uninhabitable for humans because of tsetse fly infestation became available.
Another serious menace in parts of Africa and Central America is the blackfly that transmits roundworms causing "river blindness" in humans. Before DDT was introduced, more than 20,000 victims of this affliction in Africa were blind, with incidences as high as 30 percent of the populations of some villages. The larvae of the flies live in fast-flowing streams and had proved impossible to control until the occurrence of a fortunate accident in the 1950s in the Volta River basin, when a mule carrying DDT powder to a spraying project slipped while fording a stream and spilled its load into the water. Blackfly larvae were killed for a mile downstream without ill effects on other forms of aquatic life, and a river treatment program was implemented subsequently, greatly reducing the number of river-blindness sufferers. No masks or protective clothing were required for the operatives. In this entire period no instance of DDT-induced illness was reported among the estimated 130,000 spraying personnel employed, or the millions of people whose dwellings were treated. S. W. Simmons, chief of the technology branch of the Communicable Disease Center of the U.S. Public Health Service, said in 1959:
Kicking the Sacred Cow Page 29