The military put the park in order, gratifying those, like John Muir, who were interested in safeguarding the nation’s natural wonders. “Uncle Sam’s soldiers,” exclaimed Muir, were “the most effective forest police.” The military restricted entry into the park, forcing visitors to use one of four main entrances instead of entering willy-nilly along the many surreptitious trails that Indians and rural whites had bushwhacked. It suppressed fires, which Indians in the area had long set intentionally to kill game and manage the landscape to their liking. It erected fences to prevent cattle and other stray animals owned by whites from venturing within park boundaries. And it sought to prevent the poaching of game, which Congress, in 1894, had elevated into a federal offense. Conservation, as it played out in the national parks, essentially transformed such ingrained and acceptable behaviors as hunting, collecting, and fire setting into crimes like trespassing, poaching, and arson.30
The motivation behind this strand of conservation thinking again stemmed, in part, from a concern with lawless behavior. Just as Gifford Pinchot feared the chaos and threat to property rights posed by fire setting, wildlife advocates like William Hornaday voiced a similar concern over the perils of poaching. Hornaday, who was born in Plainfield, Indiana, in 1854, moved with his family to Wapello County, Iowa, as a young child. Living on the edge of an extensive and sparsely settled stretch of prairie, Hornaday often saw huge flocks of passenger pigeons and other birds, which evidently made quite an impression on him. He went on to become a taxidermist and, in 1896, was chosen to head the New York Zoological Park, familiarly known as the Bronx Zoo. As director of the zoo, he spoke out in favor of wildlife conservation and against the reckless slaughter of game. He was particularly rankled by those, mainly immigrants and blacks, who killed wildlife for food. “The Italian is a born pot-hunter, and he has grown up in the fixed belief that killing song-birds for food is right!” he wrote in 1913. In the West especially, he noted, violators of game laws were often set free by sympathetic juries on the pretext that the suspect needed the meat to survive. Hornaday could not have disagreed more with such reasoning. “Any community which tolerates contempt for law, and law-defying judges, is in a degenerate state, bordering on barbarism; and in the United States there are literally thousands of such communities!” Whatever his love for animals, there is no denying Hornaday’s abiding concern with law and order, or his anti-immigrant rhetoric, both of which probably owed much to the flood of foreigners arriving on America’s shores at the turn of the century.31
For their part, rural whites took game within the park for several reasons. First, the market in elk teeth boomed after the founding in 1868 of the Elks Club, a New York City fraternal order, which used them for everything from rings to cuff links. Whites also sold hides for cash or traded them for items such as coffee and sugar that were not easily obtained in the Yellowstone area. Second, these locals depended on game as a source of food, especially during economic downturns. One unemployed worker arrested in 1914 for poaching game in the park claimed that being “broke all the time” was what drove him to crime. As one park official noted in 1912, elk wandering out of Yellowstone were often killed by “families that otherwise might have had a slim meat ration for the winter due to dull times for workingmen in this section of country.” And third, poaching offered an alternative to the discipline of wage work. As one newspaper put it, “Some men would rather spend a month or more time in trapping a beaver or two, or killing an elk at the risk of fine and imprisonment, than earn a few honest dollars by manual labor.”32
If it was hard to get rural whites to obey the law, it was even harder to force the park’s animals to cooperate with the authorities. The problem had several sources. To begin with, the boundaries of the park did not conform to a discrete ecosystem. The park was huge, to be sure, but not big enough to support and protect all the species of wildlife that roamed it, a fact recognized by none other than Gen. Philip Sheridan. Surprisingly, after years spent trying to annihilate the buffalo (and the Indians who depended on it), Sheridan had a second career as a conservationist. Following a tour of Yellowstone in 1881, he suggested that the park be doubled in size in order to encompass the full range of migrating species of wildlife. That never happened, mainly because much of the land Sheridan had in mind wound up as part of a national forest instead.
As it turned out, the arbitrary confines of the park proved considerably troublesome when it came to preserving the tourist-friendly elk. Elk stayed in the park’s higher elevations during the summer and early fall, but when winter hit, with its snow and severe cold, the animals drifted into the lower-elevation river valleys farther north, their so-called winter range. There was only one problem: much of that range lay outside of Yellowstone’s boundaries. When the erection of fences and the establishment of communities north of Yellowstone forced the elk to winter inside the park, the animals gobbled up much of the available plant cover and caused the park’s habitat to decline.
Climate also shaped the prospects for the park managers’ beloved big game. Winter temperatures plunged sharply between 1885 and 1900, and the years from 1877 to 1890, meanwhile, brought the heaviest winter precipitation on record. Thus the harshest winters in Yellowstone’s recorded history occurred during the latter part of the 1880s, with snow and cold limiting the access of such large herbivores as elk to forage. Climate, combined with unregulated hunting, depressed wildlife populations and may have spurred the calls for military intervention in the park’s affairs.33
In the century following 1900, winters in Yellowstone grew increasingly mild, improving forage prospects and creating more favorable conditions for herbivore populations to expand. This trend toward milder weather coincided, as it happened, with the advent of predator control in the park. Federal hunters were sent to the park in 1915 after an official from the Bureau of Biological Survey visited and recommended the extermination of all coyotes and wolves before they devoured all of Yellowstone’s elk. A huge debate has swirled over the number of elk present in the park, but one estimate placed 1,500 animals there in the late 1870s, rising to 20,000 to 30,000 by 1919. In the latter year, drought gripped the area in the summer, followed by a severe winter. By early 1920, very little forage remained and, according to park service estimates, 6,000 animals starved to death. “The range was in deplorable condition when we first saw it,” reported biologists who visited in 1929, “and its deterioration has been progressing steadily since then.”34
If a debate rages over the exact elk count, changes in vegetation seem indisputable. Communities of willow shrubs once graced Yellowstone’s northern range. Over the course of the twentieth century, however, the willows vanished. The available evidence strongly suggests that the elk were to blame, although climate change and fire suppression may also have played a role. However the willows disappeared, the change had effects that extended up and down the food chain. Beavers relied on willow for building dams and for food. Thought to be common in the park in the early nineteenth century, beavers were vanishing by the 1930s. Change rippled through Yellowstone. Wetland habitat declined with the beavers no longer around to maintain it, making the park drier overall. As river habitats dried up, white-tailed deer, a species that depended on these environments, vanished from the park entirely by 1930. Birds and even grizzly bears may have suffered as well.35
Despite Yellowstone’s vast expanse of land, separating the region from the larger ecosystem and packaging it for sale to tourists—chiefly by encouraging the proliferation of big-game species—had unintended consequences. Elk and bison came to rule this world at the expense of beavers, deer, and other animals, as well as the wolves and coyotes that were killed off intentionally. We have here conservation of the few at the expense of the many.
CONCLUSION
“The natural resources of the Nation,” Gifford Pinchot wrote in 1910, “exist not for any small group, not for any individual, but for all the people.”36 Pinchot called attention to one of the conservation movement’s greatest legacies. For most of the nineteenth century, the federal government occupied itself with disposing of the nation’s natural wealth, often to railroad, mining, and timber groups, which then claimed the land as private property and exploited it for all it was worth. With the birth of conservation, however, the federal government shifted roles from gift giver to expert overseer, assuming control over large sections of the continent and seeking to manage them, as Pinchot noted, in the interests of the American public.
That was an important achievement, especially in a nation wedded from birth to the concept of private ownership of land. Yet it must be borne in mind that conservation—whatever its merits over the earlier government policy of stimulating development at any cost—did not function in the interests of all Americans. At the core of the movement stood clashes over class and race.
“The national parks must be maintained in absolutely unimpaired form for the use of future generations,” said Secretary of the Interior Franklin Lane in 1918.37 What he really meant was that national parks and their animal populations had to be administered to suit the needs of the middle-class tourists streaming into them. Compared with unprotected lands, the parks were certainly less subject to human intervention. But to view them as untouched or “unimpaired” is to deny the many attempts by government officials to shape what went on there, especially the efforts by game managers to conserve those animals that appealed to the parks’ mainly better-off white tourist clientele.
It is impossible not to be struck by the contradictions that surround the history of the conservation movement. In getting back to nature in the national parks, to take one glaring example, tourists were actually bearing witness to an engineered environment, and a fragile one at that. By the late nineteenth century, Yellowstone was one of only two places in North America where small numbers of wild buffalo survived (the other was the region of northern Canada later set aside as Wood Buffalo National Park). Market hunters, however, decimated this remaining herd between 1889 and 1894. A concessionaire by the name of E. C. Waters then imported a handful of bison from ranching magnate Charles Goodnight and shipped them off to Dot Island in Yellowstone Lake, where a steamboat shuttled tourists out to see them. Meanwhile, the U.S. military launched its own effort to restore bison, setting up an enclosure and dragging in bales of hay to lure the animals in for a dose of domestication. The buffalo failed to show. The hay the military cut destroyed bison habitat in the Hayden valley, spurring the animals to move on rather than accept the cavalry’s offer.38
Far more lasting success came beginning in 1902, when President Roosevelt hired Charles “Buffalo” Jones, a bison expert, to maintain Yellowstone’s herd. Jones purchased animals from private ranchers and set up a corral near one of the park’s main entrances; the operation later moved to the Lamar valley, where the herd flourished for half a century. The bison gave the tourists something to see and the railroads a new selling angle. A 1904 advertisement trumpeted: “BISON once roamed the country now traversed by the North Pacific. The remnant of these Noble Beasts is now found in Yellowstone Park reached directly only by this line.”39
If park boosters were shameless in using bison to lure tourists, they bordered on deceitful when it came to employing Indians to entice people into the park. In his Dot Island venture, E. C. Waters tried to find a few Native Americans to complement the imported bison in the exhibit. But the Crow Indians he asked refused his offer. That did not stop the Northern Pacific, however, from using a small group of Blackfeet Indians to advertise the scenic beauty of Glacier National Park, another destination served by the line. In 1912, the railroad’s president, Louis Hill, arranged to have 10 Indians set up tepees on the roof of a New York City hotel to attract publicity for this new western attraction. Congressmen had pronounced the national parks virgin territory, devoid of Indians, when they set them up. Now the railroads wanted the Native Americans back.40
Bison and Indians were the two icons that, more than anything else, symbolized the destruction of both nature and culture in the American West. Employing them to sell the American people on the need to visit the newly conserved and “unspoiled” parks amounted to one huge exercise in cultural self-deception. Could anything be more paradoxical than using contrived groups of animals and people, annihilated in the so-called winning of the West, to lure tourists to supposedly “untouched” wilderness? If conservation broke the pattern of unrestrained economic development of the natural world, it put in its place a subtler political agenda shot through with irony.
10
DEATH OF THE ORGANIC CITY
Before 1880 it was not the least bit unusual to walk out into the streets of Atlanta and find cows. In 1881, however, the city’s political leaders decided that bovines were no longer welcome. The cows could come home, as the saying goes, but not to the streets of Atlanta—not after the city council passed a law making it illegal for cattle to roam the town. Apparently a large number of working people, who depended on the animals as a source of milk and meat, objected to the ordinance. A man named J. D. Garrison denounced the law as little more than a thinly veiled attempt at class warfare. “I speak the feelings of every man when I say this is the dictation of a codfish aristocracy,” he said. “It is an issue between flowers and milk—between the front yard of the rich man and the sustenance of the poor family.”1
The large animals that once wandered the streets of urban America are of course long gone. Finding out how and why they disappeared means taking a journey back to the late nineteenth century, to the Progressive Era, when cities across the nation underwent a major cleanup.
While conservationists put the countryside in order, another group of reformers trained their sights on urban areas. In 1860, only nine cities had populations exceeding 100,000; by 1890, 28 had reached that mark. A third of all Americans now lived in cities. New York, Chicago, Philadelphia, St. Louis, Boston, and Baltimore, in that order, were the nation’s largest population centers, filled with immigrants who worked producing apparel, forging steel, and packing meat. Crammed with people and factories, in addition to the pigs, horses, mules, cattle, and goats used for food and transport, the city emerged by the late nineteenth century as a dark and filthy place. Muckraker Upton Sinclair, in his novel The Jungle, captured the scale of the problem, describing the “strange, pungent odor” people smelled as they approached Chicago’s stockyards—a stench “you could literally taste”—and the chimneys belching out smoke that was “thick, oily, and black as night.”2
Enter such Progressive Era reformers as Jane Addams, Robert Woods, and Florence Kelley, people who fervently believed that ignorance and criminality stemmed not from genetic predisposition, as had formerly been assumed, but from one’s surroundings, and that a clean and healthful environment could therefore defend against both. Taking their cue from the British, they formed settlement houses, most famously Hull House, established by Addams in Chicago in 1889—the model for some 400 such community institutions nationwide. The settlement houses engaged in a host of efforts to improve the lives of slum dwellers, setting up kindergartens and sponsoring everything from health clinics to music studios to playgrounds. In 1894, Addams herself, in an effort to lower the mortality rate in one ward, led a group of immigrant women from Hull House on nightly inspections designed to check up on the work of the city’s garbage collectors.
MORTON STREET, NEW YORK CITY
Progressive Era reformers such as New York’s street-cleaning commissioner George E. Waring, Jr., modernized sanitation services. Waring increased both pay and morale among the city’s street-cleaning crew with excellent results, as demonstrated in these two photographs, one taken in 1893, the other in 1895, after Waring assumed his post. (George Waring, Street-Cleaning and the Disposal of a City’s Wastes [New York: Doubleday and McClure, 1897])
As important as the sanitary reforms were—especially the improved water and sewer systems that led to a decline in disease—they came at a cost. Not all dirt is equally bad, and cleaner cities are not necessarily better for everyone. Indeed, there were some virtues to the filth. Life in the “organic city,” a place swarming with pigs and horses and steeped in mountains of manure, was dirty, but it also had a certain social and environmental logic.
Today we rarely associate big animals with urban areas, with the possible exception of the horse-and-buggy tour. In the nineteenth century, however, it would have been impossible to imagine such places without the creatures that roamed the streets, not to mention the stinking piles of excrement they left behind. In those days before municipal trash collection, working-class women fed their families with pigs that fattened on city garbage. Horses carried people and goods, hauled pumps to help put out fires, and even produced power for manufacturing such things as bricks and lumber. Horse manure, meanwhile, did not go to waste. It streamed into surrounding vegetable farms, where it bolstered soil fertility. Even a good deal of human waste, which piled up in privies and cesspools in the days before sanitary sewers, found its way to the rural hinterlands. City dwellers and their animals were largely integrated into the regional soil cycle, supplying it with the nutrients for growing food that was then trucked back into town. Thus did life proceed in the organic city, with vegetables and hay flowing one way and waste the other.
In the late nineteenth century, reformers bent on sanitation put an end to the city in its down-to-earth form. They drove the pigs out, forcing the working class to rely more on the cash economy for food, and substituted municipal garbage collectors for the hogs. They replaced horses with electric streetcars, ending the city’s role as a manure factory and pushing nearby vegetable farmers to turn to artificial fertilizers. They built sewerage systems that carried waste off into lakes, rivers, and harbors, eliminating the privies and the “night soil” men who had delivered the excrement to the countryside—distancing urbanites from the land they had once helped to enrich. Overall, public health improved. But the poor were left to fend for themselves in the wage economy as the urban commons—akin to the open range in the South and West—vanished. Sanitary reform indeed had its virtues, but it also entailed some important social and ecological tradeoffs.