A Woman Looking at Men Looking at Women


by Siri Hustvedt


  Goethe was penetrating about how a hypothetical idea can take hold and infect one generation after another:

  A false hypothesis is better than none; there is no danger in a false hypothesis itself. But if it affirms itself and becomes generally accepted and turns into a creed which is no longer doubted and which no one is allowed to investigate: this is the evil from which centuries will suffer.28

  In his introduction to The Structure of Scientific Revolutions (1962), Thomas Kuhn asserted that no group of scientists could work without “some set of received beliefs” about what the world is like.29 He argued that before any research can be done, every scientific community must have agreed on the answers to a number of fundamental questions about the world and that these answers are embedded in the institutions that train scientists for their work. Kuhn, who began his career as a physicist, continues to distress his fellow scientists because the notion that the foundations of scientific work may be shaky remains a subversive position. In fact, the hostility toward Kuhn among scientists has often startled me. The mention of his name is enough to cause a bristling response. He is often viewed as someone who wanted to undermine science altogether, but I have never read him that way.

  Like Whitehead, Kuhn understood that science rests on a foundation that is assumed and does not begin at the beginning. If every graduate student in biology were presented with Descartes’s first question and asked to confirm her own existence and the existence of the world beyond her, she would be stopped in her tracks. “Normal science,” for Kuhn, consisted of “a strenuous and devoted attempt to force nature into the conceptual boxes supplied by professional education.” He went on to wonder “whether research could proceed without such boxes, whatever the element of arbitrariness in their historic origins and, occasionally, in their subsequent development.”30 Whitehead, Goethe, and Kuhn agree that there are received beliefs in science. Whitehead challenges the received truths about material reality established in the seventeenth century and the tendency in science for misplaced concreteness, mistaking the mathematical abstraction for the actuality it represents. The danger for Goethe is that an enduring hypothesis becomes truth and therefore goes unquestioned. For Kuhn, normal science floats along on the consensual, often unexamined beliefs he calls paradigms until some discovery, some intractable problem, explodes those same foundational convictions. He sees paradigm change as the upheaval that causes scientific revolutions.

  What Are We? Nature/Nurture: Hard and Soft

  The sense of entitlement M believed was inborn in male persons (which he might have confused with studies on dominance in male rats or other mammals lifted from evolutionary psychology, whose proponents argue that male dominance is a Darwinian sexually selected trait) is an old idea. Thoughts about male superiority have been framed in any number of ways since the Greeks, but in contemporary culture this still robust notion is usually understood in terms of a more recent history. Since the late nineteenth century in the West, human development has often been seen as a tug-of-war between nature and nurture. M was convinced that a feeling he called entitlement came by way of male nature, not nurture. What are the assumptions that make this nature/nurture division possible? Despite continual announcements from people in many disciplines that the divide is a false one or that it has been resolved, it continues to haunt scientists, scholars, and the popular imagination. In its starkest terms, the opposition looks like this: If people are destined at birth to turn out one way or another, then our social arrangements should reflect that truth. On the other hand, if people are largely creatures of their environments, then society should acknowledge this fact. The potential consequences of deciding that one is more important than the other are immense. Asking how much nature and how much nurture are involved in our development assumes that the two are distinct, that a neat line can be drawn between them and, with some luck, be measured.

  A cartoon version of nature and nurture involves simple notions of inside/outside or person/environment. Two children are born and set out on the path of life. Along the way, they both learn to talk and sing and dance and read. They both encounter obstacles and trip over them. Both are nearly drowned in a flood, but one child grows into a strong, resilient adult and the other withers, becomes ill, and dies young. Why? A current and popular idea, articulated by M at the dinner and by many others, is that there is something inside the strong child, some hard or soft hereditary substance that helps him to survive external buffeting that the weak child does not have or has less of. That stuff is usually called our genes. In everyday conversation, people often refer to good genes and bad genes. The person who has lived on legumes and never touched a cigarette or a glass of wine and runs fervently every day but who drops dead at thirty-eight is often explained as a case of bad genes. Every day we read about scientists who have found a new “gene” for this or that illness. Not wholly unrelated to genes and equally popular in the media is the idea that our brains have been “hardwired” for this or that behavior. Choosing spouses, believing in God, what we find beautiful, even asking for directions have all been explained through brain wiring, hard or otherwise: “It Turns Out Your Brain Might Be Wired to Enjoy Art,” “Is the Human Brain Hardwired for God?” “Male and Female Brains Wired Differently, Scans Reveal.”31 Whether the reference is to genes or to brains, the implication is that an identifiable biological material is shaping our destiny.

  Of course, forms of nature are also outside our two cartoon characters, and they themselves are natural beings, part of nature, but what the division is trying to sort out is what is inherited in people and what is made through experience and what we call the environment. In this view, the organism is in its environment but can be neatly separated from it. It is a discrete, contained entity with definable borders. In her small, brilliant book The Mirage of a Space Between Nature and Nurture (2010), the geneticist and philosopher of science Evelyn Fox Keller investigates the semantic ambiguities of the nature/nurture conflict and argues that before Charles Darwin, the opposition, as we now understand it, did not exist. It is certainly true that despite the long history of innate ideas in philosophy, the poles between what has come to be called nature and nurture took on a new appearance when Darwin’s theory of evolution hypothesized how specific traits were passed on from one generation of a species to the next. For Darwin, the stuff of heredity was not genes but submicroscopic “gemmules.” Keller identifies Sir Francis Galton as the thinker who hardened the nature/nurture distinction by isolating gemmules into bounded, stable units of biological inheritance. Indeed, he was the one who came up with the “jingle” that is still with us: nature versus nurture.

  As I will show, Darwin’s gemmules take on an even more particulate (or atomic) character for Galton than they did for Darwin, becoming more independent, but also invariant. Where Darwin held that these particles might be shaped by the experience of an organism, beginning with Galton, they were reconceived as fixed entities that were passed on from generation to generation without change. The relevance of this difference to the causal separation of nature and nurture is that although the effects of malleable (or soft) hereditary particles might be regarded as separable from the effects of “nurture” within a single generation, over the course of many generations, their influence would become hopelessly entangled with the influence of experience (or “nurture”).32

  For Galton, then, hereditary nature travels inside the body through reproduction over the generations, while nurture remains mostly outside it. It is not that the flood has no effect on our two cartoon characters who nearly drown, but rather that some internal substance of nature can explain why one character rebounds and the other sinks after the frightening experience and when, in Galton’s words, “nature and nurture compete for supremacy . . . the former proves the stronger.”33 Galton coined the word “eugenics” in 1883. His understanding of heredity led him to call for social policy to strengthen the future of the race by encouraging some people to mate and others to desist from mating in the interest of the greater good.

  The eugenics movement that grew out of Galton’s ideas and continued into the twentieth century had many faces, including progressive ones. The American birth-control advocate Margaret Sanger, for example, was a staunch supporter of eugenics. The horrors of National Socialism, which came later, would taint the word forever. What I am most interested in here, however, is not the well-known political applications of these ideas but Keller’s distinction between particulate (hard) and malleable (soft) hereditary substances, how the harder, atomic characterization served as a vehicle for a definitive boundary between nature and nurture, organism and environment, and how many people like M continue to attribute these hard qualities to what turns out to be softer and less definitive than many would like to think.

  Naming and conceptualization are vital to understanding, but meanings in language are not fixed. Despite great efforts to define and isolate them in biological science, semantics are rarely controllable. The idea of the gene originated with Gregor Mendel and his research on plants in the 1860s. The word “gene” itself, however, was invented by the Danish biologist Wilhelm Johannsen in 1909, who used it to refer to “germ cells” that somehow act to determine the characteristics of an organism. He declared that the word “gene” should be completely free of any hypothesis.34 Instead, it was an abstract concept about some hereditary biological substance that traveled in time over the generations. Later, it was applied to a material reality, although the characterization of that material reality, the gene, has changed over the course of the twentieth and now twenty-first centuries due to an explosion of research in genetics.

  The sensational discovery by Watson and Crick of DNA’s double-helix structure in 1953, which is often still described as one that uncovered “the secret of life,” gave rise to what became known as “the central dogma”: the sequence of bases in DNA is transcribed into RNA, which is then translated into a sequence of amino acids in a protein. In their model, the information moved according to a linear logic. Genes were seen as potent beads on a string, agents that largely determined what the organism would turn out to be. They were variously described as the engine, program, or blueprint for the organism, a hard, active unit of heredity. This idea resonates beautifully with a long history of thought that advanced irreducible atomic bits of nature. The central dogma is Hobbesian, both because it has a hard material reality and because it proceeds in a clear step-by-step, mechanical manner. The Master Molecule explained not just how you got your mother’s nose but a great many other traits as well. However, soon after Watson and Crick’s discovery and decades before the grand genome project was completed, the model began to look a lot more complicated, less unidirectional, less neat, less atom-like and more gemmule-like. Discovery after discovery in molecular biology created a need to alter the admittedly elegant, simple model until the central dogma became untenable.

  Genomes are now understood as systems under cell control. Without its relation to the cellular environment, the gene isn’t viable. In fact, it is inert. It is neither autonomous nor particulate.35 Some have proposed dropping the word “gene” altogether because its history has given it meanings that it simply does not have any longer. The interactions among DNA, proteins, and the development of traits (such as noses) are tremendously complex and dependent on context.36 The popular idea that the fate of our cartoon people, marching through life and its inevitable hardships, is determined by their genes, which hold context-independent information like blueprints that directly code for those strong and weak traits that make one swim and the other sink, rests on an erroneous notion of what genes are. And although molecular genetics is a highly specialized, complicated field, the basic message that genes are dependent on their cellular environment is not all that difficult to grasp.

  Why then does the myth continue? Mary Jane West-Eberhard frames the problem this way: “The idea that genes can directly code for complex structures has been one of the most remarkably persistent misconceptions in modern biology. The reason is that no substitute idea for the role of genes has been proposed that would consistently tie genes both to the visible phenotype and to selection.”37 To put it another way, the problem of genotype (the inherited genes of an organism) and phenotype (all of its many characteristics, including its behavior) is not direct and does not lend itself to a reductive, simple formula such as the central dogma. West-Eberhard argues for developmental plasticity and genetics, “the universal environmental responsiveness of organisms alongside genes,” as the path of “individual development and organic evolution.”38

  The story of how a fertilized zygote turns into a complex organism is not fully understood, but studies in epigenetics are growing. The field was named by the developmental biologist, embryologist, evolutionist, and geneticist C. H. Waddington in the early 1940s. Before the discovery of DNA’s structure, Waddington hoped to capture biological occurrences in embryology that go beyond the unit of the gene. Waddington’s interests were interdisciplinary and included poetry, philosophy, and art. He drew well, and one of his drawings of what he called “the epigenetic landscape” is particularly evocative. With its billowing folds, slopes, and plateaus, his landscape has an organic, anatomical, almost erotic quality. In a curved crevice of a mountaintop sits a ball—a cell. The landscape is undergirded by intersecting guy ropes, which are in turn attached to a series of black pegs inserted into the ground. The pegs represent genes and the ropes their “chemical tendencies.” This visual map was meant to show that there is no simple relation between the gene and the finished organism. The ball’s path is dependent on what is happening in the terrain below. As he explained, “If any gene mutates, altering the tension in a certain set of guy ropes, the result will not depend on that gene alone, but on its interactions with all the other guys.”39

  Waddington’s landscape is a metaphor, and there are scientists who believe its utility is long past. Others have expanded on it to include later science and elucidate the still opaque relationship between genotype, the developing embryo, and phenotype.40 Since the 1990s, epigenetics has expanded as a field, and it is now often described in terms narrower than Waddington’s. Genetics is the study of heritable changes in gene activity due to changes in the DNA sequence itself, such as mutations, insertions, deletions, and translocations. Epigenetics is the study of changes in DNA function and activity that do not alter the DNA sequence itself but can affect a gene’s expression or suppression, alterations that can be inherited by succeeding generations of the organism.

  Methylation is a biochemical process through which a methyl group (CH3) is added to cytosine or adenine nucleotides, two of the four nucleotides—cytosine, guanine, thymine, and adenine—that are part of the DNA structure. What is important for this discussion is simply to understand that these studies demonstrate a new twist in the story of how cells roll down Waddington’s landscape, variations that affect the genes but are not in the genes themselves. Cytosine methylation inhibits or “silences” gene expression. Further, it seems that methylation patterns can be affected by environmental factors, such as diet, stress, and aging. Research by Michael Meaney and Moshe Szyf has linked methylation in rats to early stress in the pups. By separating pups from their mothers, depriving them of care, and by studying differences in maternal behavior—some rat mothers lick and groom their offspring much more than others—the researchers found higher incidences of methylation in stressed and “low-licked” pups than in the nonstressed, “high-licked” pups. The methylation changes were then inherited by the next generation of rats even though they were not exposed to the same “stressors.” The title of a 2005 paper by Michael Meaney tells the story: “Environmental Programming of Stress Responses Through DNA Methylation: Life at the Interface Between a Dynamic Environment and a Fixed Genome.”41

  I have never liked the word “interface,” perhaps because I find it vague. It is another word connected to computers, but it seems to mean a “common boundary” that can be applied to machines, concepts, or human beings. It is hard to picture methylation changes in the DNA of an organism related to anxiety experienced through parental neglect, for example, as a shared boundary between environment and genome, as if the two met at a border. It is much easier to borrow Waddington’s drawing for a new purpose. A shock to the landscape—a big storm, for example, that doesn’t uproot the gene pegs holding up those undulating hills and dales but rather causes one of the chemical-tendency ropes to become tangled up tightly around a peg—will alter the course of the rolling ball.

  Much work remains to be done in epigenetics, and some molecular biologists remain skeptical about Meaney’s findings and wait for more research to replicate the experiments. Rats are not people, and people cannot be subjected to the bruising experiments routinely done on laboratory creatures, but there are some studies that suggest that early trauma, for example, affects methylation and gene expression in human beings, too. More remarkable perhaps is that the long discredited idea that parents hand down characteristics acquired during their lifetimes to their children, an idea that turned the French naturalist and evolutionary theorist Jean-Baptiste Lamarck (1744–1829) into the laughingstock of science, has been resurrected in epigenetics. Although Darwin and Lamarck are often seen as champions of conflicting ideas, Darwin did not oppose Lamarck. He too believed that some acquired characteristics were heritable. As for “stress,” that word now used for all manner of afflictions, it has long been known that neglect and shocks of various kinds affect a child’s development. What no one knew was that these experiences might affect nongenetic factors that nevertheless have an influence on gene behavior and how a person’s genes are expressed, and that this could be passed on to the next generation.

 
