Hacking Darwin
Even in this highly contentious environment, however, a significant number of researchers sought to defend the principle that general cognitive ability was largely genetic and heritable while trying their best to avoid the clear danger of linking the science of genetics with the contentious politics of race and gender. Fifty-two leading professors published a joint statement in the Wall Street Journal in December 1994 asserting that intelligence can be defined and accurately tested, that it is largely genetic and heritable, and that “IQ is strongly related, probably more so than any other single measurable human trait, to many important educational, occupational, economic, and social outcomes.”18 Although Herrnstein and Murray were in many ways flawed messengers making some scurrilous policy recommendations, the idea that IQ had a real, measurable, and heritable genetic component was harder to dispel.
Identical twin and other sibling studies have for decades helped illustrate how much of our IQ is based on our genes versus our experience. The Minnesota Twin Family Study, conducted between 1979 and 1990, followed 137 pairs of twins—81 identical and 56 fraternal—separated early in life and raised apart. Similar to the findings of the researchers on the genetic makeup of height, the Minnesota scientists found that 70 percent of IQ was based on genetics, with only 30 percent resulting from different life experiences, a finding in line with many other twin studies.19 A more recent literature review of many different studies on the genetics of intelligence estimated IQ to be about 60 percent heritable.20 Because IQ is a measure of something expressed and realized as we learn and grow (which is why we don’t do IQ tests on newborns), the genetic heritability of IQ tends to increase as people get older.21
Nevertheless, the very concept of an even partly genetic-based IQ remains sensitive. Questions have repeatedly been raised about whether research like the Minnesota Twin Family Study might have had a socioeconomic bias. In 2003, University of Virginia psychology professor Eric Turkheimer decided to test whether the same percent heritability of intelligence held as true for poorer people as it did for the more middle-class groupings of twins tested in many of the other studies. Interestingly, he found that genetic predictors of IQ were less accurate for children from less advantaged families.22 One explanation for this could be that IQ tests are fatally biased and flawed. Another might be that negative environmental factors had a greater relative impact than positive ones in these disadvantaged situations—similar to how the food scarcity in North Korea made environmental factors more important for children born in the 1990s than for the rest of the population.
If we accept the findings of the vast majority of studies that IQ is mostly but not entirely genetic, the question then becomes how we can understand the specific genetics underlying IQ. To answer this question, scientists have carried out a large number of studies over recent years trying to identify the thousands of genes influencing IQ. Although only a few genetic variants23 linked to higher-than-average intelligence had been identified as of 2016, nearly 200 have been identified since that time.24 While this progress is astounding, each of the identified genes and gene variants accounts for only a tiny percentage of the estimated IQ differences between people.25 In the grand scheme of things, identifying a few hundred genes out of the potentially thousands that influence intelligence tells us very little about an actual or potential person's intelligence.
And yet…
That we are identifying these genes at such a rapid clip at this early stage of the genetic revolution strongly suggests we will find even more as the number of people sequenced, and access to their life records, increases; as computing tools and big-data analytic capabilities expand; and as more money, people, and educational, business, and governmental resources are brought to bear.26 The total number of genes influencing IQ would by definition need to be significantly smaller than twenty-one thousand (the number of genes in the human genome). Our path to identifying the first two hundred genes has accelerated exponentially in the past few years, so even a number in the low thousands appears reachable.
Major progress has been made in cracking the genetic code of IQ, and we still have a long way to go, but how much more genetically complicated is IQ than height? Two times more, three times more, five times? It is certainly not ten times more complex.
Steve Hsu's preliminary analysis, and the impressive work of biostatistician Yan Zhang and her colleagues at Johns Hopkins University, suggest that most traits with an average heritability of 50 percent or higher could be predicted with relative accuracy once about a million people have had their genomes sequenced and their life records made accessible.27 Let's say for the moment that human intelligence is five times more complicated than Hsu and Zhang account for. That would mean five million sequenced genomes and records. That is a lot of people by today's standards but not compared to the two billion people likely to be sequenced over the coming decade.
Perhaps an even more complicated human trait to understand and quantify is personality style. Personality is among the most intimate human traits, which is why you gulped when the fertility doctor raised it as an option for selection. Although tests like Myers-Briggs have attempted to quantify different personality styles, the process still seems more an art than a science.
A leading theory of personality dating from the 1950s divides human personality styles into five major categories: extroversion, neuroticism, openness, conscientiousness, and agreeableness. Each of us, of course, is somewhere along the spectrum of each of these styles, and measures like these are necessarily relative and subjective. But we can all easily rank our friends and family members from most to least extroverted, neurotic, etc., so the categories are certainly not meaningless.
Per usual, the twin studies also have a lot to say about personality style. “On multiple measures of personality and temperament, occupational and leisure-time interests, and social attitudes,” the Minnesota Twin researchers remarkably found, “monozygotic [identical] twins reared apart are about as similar as are monozygotic twins reared together…the effect of being reared in the same home is negligible for many psychological traits.”28
Twin studies aren't the only way researchers are digging into the genetics of personality style. In a study reported in 2016 in the journal Nature Genetics, scientists at the University of California, San Diego, compared hundreds of thousands of people's genomes with information those same people provided in questionnaires about their personality styles. When they compared the sequenced genomes of people who described themselves as having similar personality types, the researchers identified six genetic markers that were significantly associated with personality traits. Extraversion was associated with variants in the genes WSCD2 and PCDH15, and neuroticism with variants in the L3MBTL2 gene.29 From just a few points in the genome, the UC San Diego researchers felt they could loosely predict what personality style a person would have.
There is almost no chance that our complex personality styles, in all but a small number of rare cases, are solely or even significantly influenced by single-gene mutations like these, but there is a high likelihood—a certainty even—that more genetic underpinnings of personality will be identified in the coming years.
Studies exploring the genetics of height, intelligence, and personality style show that we will increasingly be able to decode the genetic components of even our most intimate, complex, and human traits with growing accuracy. Even if we don’t fully uncover the genetics underpinning these traits for many decades, we won’t need such a complete understanding to deploy our limited but growing knowledge of genetics in both our health care and reproduction with increasing confidence. We will shift gradually from our generalized health care based on population averages to precision health care based on responding proactively to our genetic predispositions. Perhaps even more significantly, we will begin to integrate these genetic predictions into our reproductive decision-making.
The human implications of having these choices were what made you uncomfortable as the doctor outlined your options in the fertility clinic based on her advanced but incomplete knowledge of how genes function. Even with this limited knowledge, however, each of us will, in one way or another, need to decide whether or not we will sign the doctor's tablet.
Making these kinds of decisions in the 2035 clinic will have a relatively low safety threshold compared to what will come after. Each of these embryos, in 2035, will be the natural and unedited offspring of genetic parents like you. Even if our understanding of genetics proves completely wrong, all parents using IVF and traditional preimplantation embryo selection will wind up with a child as "natural" as any other.* But at a time when hacking biology is becoming the new operating mode of our species, there is little chance our application of genetic technologies to the reproduction process will end with just selecting from a few pre-implantation embryos.
*This calculus would change if it were proven that having children through IVF and PGT was less safe than sexual reproduction. Because traditional IVF and PGT have been carried out on women who are older and at higher pregnancy risk, making this assessment would require comparing "like with like" potential mothers. So far, there is very little indication that this is the case.
Chapter 4
The End of Sex
I didn’t confess everything in the introduction to this book. I stopped the tale prematurely. I hid a word.
The doctor at the cryobank told me that the sprightly receptionist, who also functioned as the nurse, would take me back to the masturbatorium. It’s a real term.1
I blushed a little as she led me through the corridor and handed me a plastic container. She opened the door to a small room that must have once been a broom closet. Sterile white and dominated by a medical supply cabinet, the only indication of the room’s purpose was a porn video already playing on the TV. I wondered if someone had profiled me when choosing which DVD to slot in. Did something about me suggest tattoos?
“The magazines are on this rack,” she said in a professional tone, pointing to a stack of well-worn magazines. “If you don’t like this one, the other DVDs are in this drawer organized by type. Just place your deposit in the container and seal the lid tightly. Leave the specimen on the counter when you are done, and I’ll pick it up. Anything else you need?”
I shook my head. A hole to crawl into?
As she stepped out and closed the door behind her, I looked around the strange room feeling ill at ease amid the faint drone of low-volume grunting. Was this really what evolution had in mind for me? There were few environments I could imagine less conducive to the task at hand than this. A dentist’s chair came to mind.2*
But as awkward as it was, I’d returned to work before my lunch break ended. It really wasn’t all that bad. That’s because I’m a man.
The average healthy male produces more than 500 billion sperm cells over the course of his life, with between 40 million and 1.2 billion sperm cells being released each ejaculation. Like many other mammalian species, we produce so much sperm because we’ve been competing with other males for hundreds of millions of years to get our sperm as close as possible to the desired female’s eggs. The way our species is built, male sperm is a dime a dozen. That’s why the masturbatorium is a thinly disguised broom closet with no technology to speak of other than the television and a Y2K-era DVD player.
But for women the story is different.
Human women are born with about two million egg follicles that will, over time, turn into eggs. Most of these follicles close up before puberty, leaving only about three hundred to four hundred with the potential over the coming few decades to mature into eggs available for fertilization. That’s a big difference: about five hundred billion total sperm cells for men, up to four hundred total eggs for women. In contrast to the ease of extracting male sperm, extracting a human female egg is anything but.
When having her eggs retrieved for IVF, a woman must first endure up to five weeks of sex hormone pills and/or injections to make sure she develops a maximal number of follicles and eggs for retrieval. These ovary-stimulating drugs ramp up estrogen levels and not only can cause nausea, bloating, headaches, blurred vision, and hot flashes, but also increase the woman’s risk for dangerous blood clots and ovarian hyperstimulation.
Using ultrasound to monitor the development of the woman's eggs attached to the follicle wall in her ovaries, IVF doctors wait until the follicles have reached maturity before anesthetizing the woman for the egg-retrieval procedure. A needle is then passed through the top of the vagina into the ovary to vacuum out the egg, which is then placed in a plastic dish. After the woman emerges from sedation and rests, she is usually sent home, where she can expect some cramping for at least a few hours but sometimes longer.
Millions of women endure the pain and the potential, though rare, danger of egg retrieval because, in their minds, the promise of motherhood outweighs this inconvenience. But that doesn't make the process any less difficult.
IVF is not impossible, but it is certainly not easy. It is also less emotionally appealing and more clinical than conception by sex. Conception by sex will always have emotional appeal, and some of us will always trust 3.8 billion years of evolution more than we do our local fertility doctors. But with its mix of costs and benefits, IVF will increasingly compete with sex as the primary way we humans procreate, particularly as it increasingly comes to be seen as safer and more versatile.
Since the birth of Louise Brown in 1978, more than eight million IVF babies have been born, with no discernibly different health outcomes from other babies. In the United States, around 1.5 percent of all babies are born through IVF. In Japan, the number has risen to nearly 5 percent.3 The use of IVF hasn't just helped older or higher-risk mothers conceive but has also helped new categories of people, including many gay and lesbian couples, have their own biologically related children.
Starting in the 1980s and ’90s, fertility services, donor insemination, surrogacy, and changing social norms paved the way for more gay and lesbian couples to have children of their own in the United States and some other countries. The historic 2015 U.S. Supreme Court decision in Obergefell v. Hodges declaring gay marriage protected by the Constitution further facilitated this increasing normalization of the gay family.
With today’s technology, gay men who want to have a biologically half-related child generally need to find a donor for the egg that will often be fertilized by one of the men’s sperm and then gestated in a surrogate. Lesbian couples need, of course, a male sperm donor. This means that for the estimated 4 percent of the population who self-described to Gallup as being lesbian, gay, bisexual, or transgender (LGBT), assisted reproduction is already not the exception but the norm.4 As in other areas, the LGBT community stands at the vanguard of social change.
For IVF to become a more common way for all women to conceive, this kind of different thinking about conception will need to go even more mainstream. The IVF model will also need to become easier, less clinical, and less painful for women, a process already well underway.
Imagine you want another child in 2045, a decade after your second was born. You are a busy executive with every moment of your day tightly scheduled. You are proud of your and your partner's ancestry and want to have your own 100 percent biologically related child. Your experience selecting what still feel like positive traits for your second child reaffirmed your data-driven instincts, so this time you don't have many reservations about selecting your embryo prior to implantation with as much genetic information as possible. You are willing to get pregnant because you want to make sure your future child is looked after in utero, even though lots of other people are hiring gestational surrogates and a few are experimenting with synthetic wombs. But you'd still like to make the process as easy as possible, so long as it delivers the maximum benefit. Luckily, you don't have to go through the difficult egg-retrieval process, because now you can hack your reproductive process more than you'd ever imagined possible.
You are leading your morning staff meeting at work when your assistant comes into your office and places a small Touch Activated Phlebotomy, or TAP, device on your arm. You don’t feel a thing and continue with your presentation as the device suctions one hundred milliliters of your blood, the equivalent of about seven tablespoons, into a small receptacle. That’s it. In just this moment, you’ve achieved more than was ever possible with the old IVF egg retrieval model, based on another wonder of human imagination and scientific progress: unlocking the magic of stem cells.
The work of scientists like "Magnifico" Spallanzani showed that human reproduction happens when a male sperm cell fertilizes a female egg cell. If we start as one fertilized egg cell, which then differentiates into the many different types of cells that make up a person, then it logically follows that those early cells must be able to grow out of the initial cell, like plants from a seed. But how?
It was not coincidental that when, in 1908, Russian scientist Alexander A. Maximow described a precursor cell that could develop into any other type of cell, he used a plant analogy. Over the ensuing years, scientists learned more about these amazing “stem cells” that both grow into the many different types of cells it takes to make an organism and then, in a more specialized form, continue regenerating each type of cell to keep the organism going. Stem cells are the seeds that later become the enablers of the branches.
For decades, researchers struggled trying to understand, identify, and isolate stem cells. Then in 1981, British researchers Martin Evans and Matthew Kaufman became the first to extract these stem cells from mouse embryos, a process first carried out on human embryos by two different teams of American researchers seventeen years later. The discovery of human embryonic stem cells launched a tidal wave of anticipation, first among scientists and then among the general public.