
Why Trust Science?


by Naomi Oreskes


  Values inevitably play a role in shaping science, Oreskes insists. In looking back on eugenics, scientists may say that science was distorted by values, but values were also central to opposing eugenics, as they were to opposing the Limited Energy Theory. Because values play an inevitable role, diverse scientific communities are more likely to be able to detect unexamined assumptions, blind spots, and inherited biases: “A community with diverse values is more likely to identify and challenge prejudicial beliefs embedded in, or masquerading as, scientific theory.” She also allows that there can be legitimate non-scientific objections—including ones based on religious or moral values—to policies that are justified partly by science but also by particular value claims.

  And humility is important. Diverse scientific communities can correct for the blind spots of arrogant scientists, but the history of science counsels humility: the greatest scientists (and, one might add, philosophers) have sometimes become fetishists about method, drawn false conclusions from evidence, and fallen prey to the prejudices and biases of their times.4 Even the best of scientists should remember that a complete grasp of the whole truth is yet far beyond us.

  So, when should we trust science? Concluding chapter 2, Oreskes summarizes: when an expert consensus emerges in a scientific community that is diverse and characterized by ample opportunities for peer review and openness to criticism. Of course, any particular scientific claim may be false, so she reminds us of Pascal’s Wager: consider the stakes of error. It may not be certain that flossing is good for your teeth, but it is cheap and easy. It may not be certain that human actions and policy changes can reverse the dire effects of climate change, but consider the calamities that await our children and grandchildren if we ignore scientific predictions that turn out to be correct.

  In a coda to her two lectures, Professor Oreskes returns to the issue of scientists’ values. In theory, scientific findings are one thing and the question of what if anything to do about them is another. So one might suppose that whereas the practical question of “what is to be done” inevitably implicates values, the question of what scientific evidence shows need not. Ideally, science should be able to leave political and moral controversies to others.

  Things are not so neat and simple, however. Professor Oreskes observes that people equate science with what they think are its implications. Fundamentalist and evangelical Christians from William Jennings Bryan to Rick Santorum have worried that evolutionary accounts of human origins undermine human dignity and morality, by making humans, in Santorum’s words, “mistakes of nature.” Skepticism about climate science, on the other hand, is fed by the suspicion that environmentalists seek to undermine the “American way of life”: big cars, motorboats, and high consumption.

  In the face of such suspicions, Oreskes argues, it is profoundly mistaken for scientists to retreat to value neutrality. Faced with the question of why ordinary people should trust science and take it seriously, it cannot be effective to reply that scientists lack values! That is precisely what worries people. Moreover, it is perfectly obvious that scientists do have values—everyone does—and that those values influence their work. To hide your values, Oreskes observes, is to hide your humanity.

  So, scientists should be honest about their values. Many people will share those values, and on that basis trust can be built. The Creation revered by Christians is the biodiversity cherished by scientists, says Oreskes, and the evidence is overwhelming that these are now gravely threatened.

  In concluding, Professor Oreskes offers an eloquent summary of her own credo: her guiding values as a scientist and environmentalist. “If we fail to act on our scientific knowledge and it turns out to be right, people will suffer and the world will be diminished.”

  In the next section of this volume, four distinguished commentators expand upon, elaborate, or criticize central features of Professor Oreskes’s lectures.

  Professor Susan Lindee is the Janice and Julian Bers Professor of History and Sociology of Science at the University of Pennsylvania, where she also holds a variety of administrative posts. Lindee argues that in responding to scientific skepticism we should draw attention to the science that we encounter and rely upon constantly in our everyday lives. We should “work our way up, from the toaster,” to the frozen peas, the smart phones, and the other miracles of modern science and technology that enhance our lives.

  Of course, science’s contributions are not always so positive. Professor Lindee reminds us of the twentieth century’s brutal history of technology-enhanced warfare. She suggests that historians of science have sought to distance pure science from technological applications because of technology’s profoundly mixed legacy. Atomic scientists sought to maintain their moral purity by attributing the design of the bomb to mere engineers.

  Marc Lange is the Theda Perdue Distinguished Professor and department chair in philosophy at the University of North Carolina, where he specializes in the philosophy of science. Lange notes that the question of why we should trust science seems to lead into a vicious circularity: isn’t peer review just experts vouching for other experts?

  Professor Lange suggests that asking for an external vindication of science as a whole may be unreasonable: science is self-correcting in that it can subject any particular scientific claim to critical scrutiny, “But science cannot reasonably be expected to put all its theories in jeopardy at once.”

  Lange also raises the issue of what Thomas Kuhn described as revolutionary challenges to entire worldviews or paradigms, in which methods and theories “interpenetrate.” Using the example of Galileo, he suggests that even across paradigm shifts there is typically at least “sparse common ground,” which scientists can use to build an argument for one of the rival theories against the others. Lange closes by urging philosophers and others to stop overemphasizing “incommensurability and under-determination” and to devote more attention to positive accounts “of the logic underlying scientific reasoning.”

  Ottmar Edenhofer is deputy director and chief economist at the Potsdam Institute for Climate Impact Research, as well as a professor at the Technical University Berlin. He offered a comment in Princeton, and is joined here by Martin Kowarsch, who is head of the working group on Scientific Assessments, Ethics, and Public Policy at the Mercator Research Institute. They begin by suggesting that the Trump administration accepts much climate science but opposes ambitious climate change mitigation efforts, partly because it heavily discounts the costs of climate change outside the United States. Thus, scientific consensus does not equal policy consensus, and so they ask how Oreskes’s account of trust in science may need to be extended or amended for science-based policy assessments. They advise experimentation aimed at incremental learning about alternative policy pathways, and argue that costly mistakes have been made due to insufficient awareness of the complexity of the policy alternatives.

  Edenhofer and Kowarsch agree with Oreskes that value neutrality is impossible. They build on Deweyan pragmatism to propose that all socially important values—“equality, liberty, purity, nationalism, etc.”—should be included in policy assessments: this may open the door to new and creative proposals.

  Finally, Jon Krosnick offers some thoughts, inspired by Professor Oreskes’s lectures, on the current state and future of science. Krosnick is Frederick O. Glover Professor in Humanities and Social Sciences and professor of communication, political science, and psychology at Stanford University, where he also directs the Political Psychology Research Group.

  Professor Krosnick describes a number of famous (now infamous) and influential scientific findings—in biomedicine, psychology, and elsewhere—whose results scientists have been unable to replicate. In some cases the data were fabricated; in other cases investigators admitted to repeating an experiment until the desired result was produced.

  Flawed research results partly from faulty methods, argues Krosnick, and partly from the desire for career advancement. Academic departments and professions place a premium on publishing surprising and counterintuitive findings. Is it any wonder that many of these prove unfounded on closer inspection? Journals rarely publish negative results, so the refutation of bad research is slowed. He insists that scientists must face up to these problems and address the counterproductive motivations that are now rampant.

  In her wide-ranging Reply to Critics, Professor Oreskes deepens and enriches her argument.

  She praises Susan Lindee for her brilliant historical account of scientists’ attempts to distance themselves from the technological applications of their work, yet expresses doubt that becoming clearer-eyed about the science embodied in frozen peas and smart phones will have much effect on people’s attitudes to climate science. Americans do not reject science in general but rather particular “scientific claims and conclusions that clash with their economic interests or cherished beliefs.”

  In response to Marc Lange, Professor Oreskes expresses doubt that trust in scientific experts is viciously circular. The “social markers of expertise are evident to non-experts,” she argues, and it is relatively easy to figure out that climate science deniers are non-experts and that the American Enterprise Institute is precommitted to certain policy outcomes. Expert scientific consensus does tend to be reliable.

  In response to Edenhofer and Kowarsch, Professor Oreskes agrees that more work is needed on how to move from science to policy. Yet she insists that when powerful actors seek to undermine public trust in the science associated with progressive climate policy, the roots of their skepticism typically lie not in distrust of science but in economic self-interest and ideological commitments. Oreskes reiterates that if scientists are honest about their values, as she recommends, they will often find considerable overlap in the values underlying climate policy disagreements, and this may help build greater trust.

  Professor Oreskes turns, finally, to Jon Krosnick’s assertion that science faces a “replication crisis.” While allowing that there have been notable examples, often involving the misuse of statistics, she points out that the rate of retractions—that is, retractions as a percentage of published articles—is tiny: perhaps less than 0.01%. If the rate has risen, that may reflect a salutary increase in critical scrutiny of findings, rather than a higher incidence of faulty research. Or it may reflect unwarranted media coverage of flashy single-paper results in psychology and biomedicine.

  Oreskes pushes back against Krosnick’s wider suggestions about a crisis in science. His examples furnish no evidence that fraud is more common in science than elsewhere. Moreover, in some of Krosnick’s examples fraud was discovered and punished expeditiously. Refutation and retraction are paths to progress. She reminds us that her argument has been that we should trust scientific consensus, not the single studies to which Krosnick draws attention, and reiterates that motivated industry funding of research is a serious problem.

  In an afterword penned just before this book went to press, Professor Oreskes notes that the problem of trust in science—and in news and information more generally—has exploded since she delivered the Princeton Tanner Lectures in the fall of 2016. Many more Americans believe in the reality of climate change than once did, but America is led by a science- and fact-denying chief executive who is reversing hard-won progress on climate policy. It remains the case that much doubt about consensus findings in science is manufactured by those with financial or ideological interests in derailing science-based policies, just as she and Erik Conway argued in Merchants of Doubt.

  Professor Oreskes closes by reiterating that science merits our trust when scientific results achieve consensus among the expert members of diverse and self-critical scientific communities. And she offers a final example—controversies over the use of sunscreen—to illustrate this book’s core theme.

  Like all excellent books, this one addresses many questions and also raises some. While Professor Oreskes argues that progress and reliability in science depend more on the qualities of scientific communities than on the character of individual scientists, she also argues that scientists inevitably have values and that they should be honest about them. Do not well-working scientific communities depend on the predominance of good values—of intellectual honesty and truth seeking—among scientists? And if diversity is important in scientific communities, diversity of what kinds? The inclusion of women and members of racial, ethnic, religious, and other minority populations has obviously been very good for all of the sciences, and for scholarship generally. Are there social sciences (and perhaps other fields of inquiry) in which greater ideological diversity would be helpful?

  Readers will come away from this volume armed with a far better understanding of the vitally important enterprise of modern science and the reasons why we should trust scientific consensus. All who care about the future of humanity on this fragile earth should hope that this timely and important book gains a wide audience, before it is too late.

  Chapter 1

  WHY TRUST SCIENCE?

  Perspectives from the History and Philosophy of Science

  The Problem1

  Many people are confused about the risks involved in vaccination, the causes of climate change, what to do to stay healthy, and other matters that fall within the domain of science. Immunologists tell us that vaccines are generally safe for most people, have protected millions of people from deadly and disfiguring diseases, and do not cause autism. Atmospheric physicists tell us that the build-up of greenhouse gases in the atmosphere is warming the planet, driving sea level rise and extreme weather events. Dentists tell us to floss our teeth. But how do they know these things? How do we know they’re not wrong? Each of these claims is disputed in the popular press and on the internet, sometimes by people who claim to be scientists. Can we make sense of competing claims?

  Consider three recent examples.

  One: In a 2016 presidential debate, Donald Trump rejected the position of medical professionals—including that of fellow candidate physician Ben Carson—on the safety of vaccination. Recounting the experience of an employee whose child was vaccinated and later diagnosed as autistic, Mr. Trump stated his view that vaccines should be given at lower doses and be more widely spaced. Few medical professionals share his view.2 They consider delaying vaccination to increase the risk that infants and children will contract dangerous and otherwise preventable diseases such as measles, mumps, diphtheria, tetanus, and pertussis. Some of the children who contract these diseases will become gravely ill or die. Others will survive but pass on the infections to others. Yet, Mr. Trump is not alone in making this suggestion; prominent celebrities have made similar exhortations. Many parents now reject the advice of their physicians and choose to have their children vaccinated on a delayed schedule—or not at all. As a result, morbidity and mortality from preventable infectious diseases are on the rise.3

  Two: The vice president of the United States, Mike Pence, is a young Earth creationist, meaning that he believes that God created the Earth and all it contains less than ten thousand years ago. The consensus of scientific opinion is that Earth is 4.5 billion years old, that the genus Homo emerged two to three million years ago, and that anatomically modern humans appeared about two hundred thousand years ago. While science cannot answer the question of whether God (or any supernatural being or force) guided the process, most scientists are persuaded that life on Earth evolved largely through the process of natural selection over the course of Earth’s history, that humans share a common ancestor with chimpanzees and other primates, and that divine intervention is not required to explain the existence of Homo sapiens sapiens.4

  Do Americans lean toward the scientific view or the Pencian view? The answer depends a bit on how you ask the question, but if you are a religious person in America who attends church regularly, the chances are high that you agree with Mike Pence: 67% of regular churchgoers believe that God created humans in their present form within the last ten thousand years. Some of us may think that these people are all Republicans, but we would be wrong. According to the Gallup polling organization, while 58% of Republicans agreed with the statement that “God created humans in their present form, within the last 10,000 years,” so did 39% of independents and 41% of Democrats.5 Given this popular support for creationism, it is perhaps unsurprising that in 2012, the state of Tennessee enacted what some have called a “twenty-first-century Monkey Law,” empowering teachers to teach creationism in science classrooms.6 Despite repeated rejection of previous laws of this type by US courts, many states continue to attempt to enact comparable laws.7

  Three: The American Enterprise Institute (AEI) is a long-established and well-funded think tank in Washington, DC, committed to principles of laissez-faire economics, market-based solutions to social problems, limited (federal) government, and low rates of taxation. The Institute has long promoted skepticism about the scientific evidence for anthropogenic climate change and disparaged the conclusions of the scientific community, including the Intergovernmental Panel on Climate Change (IPCC).8 AEI scholars have suggested that climate scientists are suppressing dissent within their community; the Institute at one point offered a cash incentive to anyone willing to search for errors in IPCC reports. Jeffrey Sachs, head of the Earth Institute at Columbia University from 2002–16 and special advisor to UN secretary-general António Guterres on the Sustainable Development Goals, has said of one well-known AEI scholar that he “distorts, misrepresents, or simply ignores” relevant scientific conclusions.9 In 2016, this particular scholar referred to scientists as an “interest group,” demanding to know why “scientific analysis conducted or funded by an agency headed by political appointees buffeted by political pressures … [should] be viewed ex ante as any more authoritative than that originating from, say, the petroleum industry?”10

 
