Imagine yourself in the position of a couple in which one member is a well-educated woman in her mid-thirties who has delayed having children in order to pursue her career and now earns a comfortable salary; the husband has a similar type of job. They own a house in a safe suburb with good schools and are health-conscious—they are exactly the sort of parents that most state child welfare administrators would wish on a child. The couple will likely not have more than two children, and may opt for a singleton in order to devote their time and resources to giving the child every advantage they can. They may employ a well-qualified nanny to look after the child if the mother returns to work soon after the birth, and will certainly seek out good preschools and kindergartens. As the child gets older they will find tutors if the child needs them—and often even if he or she doesn’t—and enroll him or her in summer enrichment courses when school isn’t in session. Their thoughts will also turn to sports successes, or perhaps music lessons, always with the goal (either stated or tacit) that the child should be encouraged to succeed to the best of his or her abilities. Ultimately, college preparation will rear its head, in some cases as early as primary school, as the parents try to anticipate which secondary school will best prepare their offspring for a good university. Once a suitable institution is found, the parents may feel that they have fulfilled their role of sending their child on the way to a happy and successful life.
Even if this journey unfolds smoothly, it will cost hundreds of thousands of dollars over the child’s lifetime. Add to that the cost of conception (if assisted) and pregnancy, and it’s understandable if some parents start to see the whole process as an investment—a worthwhile one, to be sure, but a very real expenditure of scarce resources for the next generation. There is also, perhaps, a certain hubris—a desire to see one’s own offspring reflect well upon oneself—which, compounded with the other factors, can produce the complex path to adulthood taken by many upper-middle-class children today.
Along this well-trodden path, if any sign appears that the desired result might not come to pass, the parents may take action. A young boy’s rambunctious personality may be calmed by daily doses of Ritalin, a little girl’s crooked teeth corrected with braces, and any sign of a serious medical condition attacked with the latest high-tech diagnostic tools and a dedicated medical team. We all know that life’s uncertainties can blindside even those with the best-laid plans. But what if there was a way to avoid this, and stack the deck in your favor, by choosing the genes in your offspring? While some people might find this unethical, as we saw with the Whitakers, others simply see it as an intelligent use of modern technology.
This is the future promised by new genetic technologies. Although we know a fair amount about the genetic basis for many rare disorders—those that occur in less than 1 percent of the population, like Tay-Sachs or cystic fibrosis—geneticists are still very much in the process of discovering the genetic factors that influence the common diseases we learned about in Chapter 3, such as hypertension and diabetes. The environment might play the most important role in determining whether you get these diseases, but there are certainly genetic factors as well. And for psychological traits like schizophrenia, bipolar disorder, and alcoholism, even less is known. We are currently witnessing a revolution in our understanding of the genetic bases for such illnesses, and we will be discovering many of them in the next decade. In an interview in The Scientist, Eric Lander, one of the leaders of the Human Genome Project and the director of the Broad Institute in Cambridge, Massachusetts, has described this period in the field of genetics as “a tremendously exciting time to study human variation, because we finally have enough tools and infrastructure to correlate genotype to phenotype” (phenotype is the ultimate effect of a gene—its influence on appearance, disease state, behavior, and other outwardly visible characteristics). Clearly, we are on the cusp of a revolution in our understanding of our own genetic predispositions, and once we have the information, many people will want to use it, in the same way that other technologies have been embraced. But what will this information look like, and how will we make these decisions?
TWENTY-FIRST-CENTURY NEEDLES
There are around 23,000 genes in the human genome. This figure still seems startlingly low to me, since throughout my career as a graduate student and postdoctoral research fellow we had always estimated the number to be close to 100,000. When the draft of the human genome was announced in 2000, though, it was clear that we are far less complex at the genetic level than we’d thought. Or are we?
What the gene number revealed (incidentally, fruit flies have nearly as many genes in a genome that is one-twentieth the size of our own) was that it wasn’t simply large numbers of genes that produced human complexity; what was significant was the way the genes were being turned on and off to create our cells, tissues, and other characteristics. As the saying goes, it’s not about the size, it’s how you use it. Clearly, the human brain is far more complex than that of an insect; the difference lies not in having lots of brain-specific genes (memory genes, speech genes, genes that cause us to appreciate jazz, and so on) but, rather, in using the genes we have in exceedingly complex ways to produce these traits. The complexity of these gene interactions points to an important genetic phenomenon known as pleiotropy.
Pleiotropy refers to the multiple effects a single gene can have on an organism’s phenotype, beyond those expected from the gene’s primary function. For instance, a single letter change in the sequence of the hemoglobin gene that codes for the oxygen-carrying protein in our red blood cells can cause the symptoms that we know as sickle-cell anemia; the mutation is carried by around 8 percent of people of African descent. These symptoms include kidney failure, stroke, and liver damage. All are the result of a single nucleotide change that causes the hemoglobin molecule, under certain circumstances, to change its shape and form crystalline structures in the blood cells, rendering them rigid and unable to squeeze through the smallest capillaries of the body. A seemingly small change, but one with significant effects.
Interestingly, this same change also renders people who carry the mutation less susceptible to infection by the malaria-causing parasite, in much the same way as the G6PD mutations we learned about in Chapter 3. In one of those odd twists of evolutionary fate, sickle-cell carriers—people with only one mutated copy of the gene, rather than the two necessary for full-blown anemia—are actually at an advantage in the malarial jungles of central Africa. This accounts for the relatively high frequency (around 25 percent) of carriers in this population, as thousands of years of malaria exposure have selected for this variant in the people living there. Something that is bad in one context turns out to be good in another.
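To see how an allele that is harmful in a double dose but protective in a single dose can persist in a population, here is a minimal sketch (not from the book) of heterozygote advantage, the balancing-selection mechanism described above. The fitness values are illustrative assumptions, chosen only so that the carrier frequency settles near the roughly 25 percent figure cited in the text.

```python
# A minimal sketch of heterozygote advantage (balancing selection).
# The fitness values below are illustrative assumptions, not measured data.

def next_allele_freq(q, s_malaria=0.15, s_anemia=0.80):
    """One generation of selection on the sickle allele S, currently at frequency q.

    Assumed relative fitnesses:
      AA (no protection, exposed to malaria): 1 - s_malaria
      AS (carrier: protected, healthy):       1
      SS (sickle-cell anemia):                1 - s_anemia
    """
    p = 1.0 - q
    w_AA, w_AS, w_SS = 1.0 - s_malaria, 1.0, 1.0 - s_anemia
    w_bar = p * p * w_AA + 2 * p * q * w_AS + q * q * w_SS   # mean fitness
    return (p * q * w_AS + q * q * w_SS) / w_bar             # S frequency next generation

q = 0.01                      # start with the allele rare
for _ in range(200):          # let selection act for many generations
    q = next_allele_freq(q)

print(f"equilibrium S allele frequency: {q:.2f}")        # ~ s_malaria / (s_malaria + s_anemia), about 0.16
print(f"carrier (AS) frequency: {2 * q * (1 - q):.2f}")  # roughly a quarter of the population
```

Under these assumed values the allele rises until carriers make up about a quarter of the population, even though the homozygous form remains severely harmful.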
Similarly, trisomy 21, the presence of three copies of chromosome 21, causes the congenital disorder known as Down’s syndrome. As far as we can tell, the genes present in the extra copy are identical to those on one of the other two copies—it’s simply a duplicate of one of them. Why this duplication causes the suite of symptoms (lowered intelligence, motor dysfunction, facial abnormalities) seen in Down’s cases is unknown, but it almost certainly has something to do with the number of copies of certain key genes, which affects the way in which they function. These genes, located in the so-called Down’s critical region, are sufficient to cause the disorder even if they are the only part of the chromosome that is duplicated and inserted into an otherwise normal genome. Why these genes are so finely tuned as to require exactly two copies, but not three, in order to function properly is currently unknown; clearly, seemingly minor genetic changes can have large unintended effects. In this case it is probably a change in gene regulation: with three copies, one (or more) of the genes in this region produces its encoded proteins at too high a level, and developmental problems ensue.
Disorders such as sickle-cell anemia and Down’s syndrome, despite the complexity of the processes by which small genetic lesions are converted into the final phenotype, are still relatively simple by the standards of most diseases affecting humans. The big killers we learned about in Chapter 3 also have genetic components, but instead of a single gene or chromosomal region, they involve small changes all over the genome, like tiny needles hidden in a gigantic haystack. Heart disease, for instance, has had dozens of genetic mutations associated with it in the scientific literature over the past twenty years. Some of these have stood up to scrutiny, while others have not been confirmed by other research teams and remain uncertain. And even for the confirmed associations, the increase in risk is typically modest, raising the odds of hypertension or a heart attack by 50 percent or so. In other words, if a person not carrying the associated genetic variant has, say, a 10 percent lifetime risk of having a heart attack, then someone carrying the variant has a 15 percent risk. That still leaves an 85 percent chance that he or she won’t have a heart attack, which makes the genetic effect difficult to interpret.
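To make the arithmetic behind “a 50 percent higher risk” concrete, here is a minimal sketch using the figures from the paragraph above; the point is simply that a relative increase can leave the absolute risk, and the chance of staying healthy, largely unchanged.

```python
# A minimal sketch of the arithmetic above: a "50 percent higher risk" is a
# relative increase, so the absolute change in risk can still be modest.
baseline_risk = 0.10       # lifetime heart-attack risk without the variant (figure from the text)
relative_increase = 0.50   # risk is 50 percent higher for carriers (figure from the text)

carrier_risk = baseline_risk * (1 + relative_increase)
print(f"risk for carriers: {carrier_risk:.0%}")                          # 15%
print(f"chance of never having a heart attack: {1 - carrier_risk:.0%}")  # 85%
```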
The environment is another complicating factor. As we saw with the Pima Indians, people who are very similar genetically can have vastly different disease rates, depending on their lifestyle. Even if you do have genes that predispose you to diabetes, for example, staying active and eating a diet low in high-calorie foods like sugar and fat will make you much less likely to develop the disease. These environmental effects further complicate the analysis of so-called complex, or multifactorial, diseases like hypertension, diabetes, stroke, and the other big killers in the developed world. It is incredibly difficult to tease apart genetic factors from shared environmental factors. Suppose, for instance, that you are a researcher who has noticed a high incidence of heart disease among people of Scandinavian descent living in one part of Nebraska. Is this because of shared genetic factors—after all, they all come from the same part of Europe—or shared environmental factors, such as diet and lifestyle? Genetic studies need to be designed very carefully in order to separate these various influences, and then large numbers of people need to be analyzed in order to find a genetic association that is statistically significant.
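As a rough illustration of why such large numbers are needed, here is a hypothetical sketch of the kind of case-control comparison these studies rely on; all of the counts are invented for illustration, and a real study would also have to control for the environmental confounders described above.

```python
# A hypothetical sketch of a case-control comparison: do people with heart
# disease carry a candidate variant more often than healthy controls?
# All counts below are invented for illustration.
from scipy.stats import chi2_contingency

# rows: cases, controls; columns: variant carriers, non-carriers
small_study = [[60, 140], [45, 155]]         # 200 cases and 200 controls
large_study = [[3000, 7000], [2250, 7750]]   # same carrier rates (30% vs 22.5%), 10,000 per group

for name, table in [("small study", small_study), ("large study", large_study)]:
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"{name}: p = {p_value:.3g}")
# The same modest difference in carrier rates is not statistically significant
# in the small study but is overwhelmingly so in the large one -- hence the
# need for very large, carefully designed cohorts.
```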
The end result is that genes alone don’t tell the whole story. Ultimately, your DNA influences your health, but it doesn’t determine it. The real advantage of genetic testing, at least if done early enough in life, is that it can provide an insight into some of your risk factors, in the same way that analyzing your diet and lifestyle can. Armed with more information, you can make intelligent choices about the way you live. Carrying risk factors for heart disease? Probably even more important for you to exercise regularly, eat right, and never start smoking. High risk of prostate cancer? Maybe you should start getting screened at age forty, rather than the currently recommended fifty. Low risk of diabetes? Maybe you don’t have to be as careful as some other people about eating sugary foods (though you should still watch what you eat).
Such testing is becoming more commonplace as more of the genetic factors are discovered. Several companies now offer genetic testing to assess your risks of developing various disorders. All of the tests are predicated on the notion that knowledge is power, and that knowing your risks will allow you to make lifestyle changes to reduce your odds of developing the diseases in question. If you are over the age of sixty, you already have a lifetime of behaviors that can’t be undone, so the testing is less useful for you than it would be for someone in their twenties. Behavioral recommendations have changed dramatically over the past fifty years as epidemiologists have uncovered various environmental risk factors. Smoking was thought to be only vaguely unhealthy in the 1950s, a suntan was a key component of a healthy appearance in the 1970s, margarine was touted as the healthful alternative to butter in the 1980s, and a high-carbohydrate diet was considered desirable in the 1990s. We have all made lifestyle decisions based on these recommendations, often blindly following medical advice doled out with little underlying data to support it.
Our genetic risks have been a big unknown through all of this. While family history is often taken into account when medical recommendations are made, a lot of advice is based on the assumption that everyone responds in much the same way to whatever the factor is. Clearly, though, this isn’t the case—we all know people who smoke for years and never develop lung cancer, or who can eat anything they want and never become fat, or who do exactly what they should do and still die young. Jim Fixx, for instance, the man whose Complete Book of Running popularized jogging in the 1970s, died at fifty-two of a massive heart attack despite being extremely fit and having competed in dozens of marathons and other long-distance races. His father had died of a heart attack at forty-two, and clearly Jim was carrying genetic variants that predisposed him to heart disease despite his healthy lifestyle.
Progress in the field of genetic disease associations has been extraordinary since the completion of the Human Genome Project. In a March 2008 American Heart Association press conference to announce a number of new associations, Maren Scheuner of the RAND Corporation noted that fifteen years earlier there were only one hundred known genetic associations with disease (and many of these, such as cystic fibrosis, were for relatively rare disorders), while the number at the time of the press conference was around fifteen hundred—and growing. The rate of discovery is so high that health-care professionals simply cannot keep up, and one of the major challenges of the next phase of genetic medicine will be educating the health-care workforce.
The most useful time to test people, of course, would be at birth. This will probably start to happen in the next decade, once the efficacy of such testing becomes clear to the medical community. Lifetime risk could be calculated and a lifestyle “prescribed” that takes into account your genetic risk factors. It’s also possible, though, for testing to be carried out even earlier, at the embryo’s eight-cell stage, before implantation, using PGD. It’s now clear that many diseases are influenced by the environment in the womb, and such knowledge would allow the mother to modify her pregnancy to suit her offspring’s genetic makeup. For instance, Ellen Ruppel Shell, in her book The Hungry Gene, discusses the Dutch “hunger winter” babies. These children were born to women who were pregnant during the winter of 1944–45, when a wartime famine affected the population of Holland. Those whose mothers had lived through the famine during the first two trimesters of their pregnancy were 80 percent more likely to be obese as adults, and they also showed higher rates of diabetes and other chronic diseases. Scientists now understand that the fetal environment can affect how genes are expressed, and knowing the risks prior to becoming pregnant might help a woman tailor the womb environment to the child’s genetic inheritance.
There is, however, another possibility—one depicted in the 1997 science fiction film Gattaca, in which most children are created via IVF and a genetic scan allows parents to choose the genes that are most desirable. While such a scenario still seems futuristic, the Whitakers have shown that in fact it is not only possible but already happening. Michelle and Jayson simply tested for genes that would determine the success of Charlie’s stem cell transplant, but it’s possible to test for any of the genetic variants that might contribute to disease. Why saddle your offspring with the burden of being genetically predisposed to Alzheimer’s, for instance, when you can simply select for the embryos that don’t carry the predisposing variant in the ApoE gene?
Although some people may object to such genetic selection on the grounds that it is unethical, we already allow many other diagnoses to be carried out prior to birth, in the name of prevention. This is the thinking behind amniocentesis and chorionic villus sampling in mothers over the age of thirty-five. If a disorder like Down’s is seen in the fetus, the pregnancy can be terminated and the couple can try again. More than 90 percent of couples faced with such a diagnosis choose this option. With the application of PGD, of course, such invasive procedures and emotionally difficult decisions can be avoided altogether. A 2001 court case in France, where a child born with Down’s syndrome successfully sued his mother’s physician for allowing him to be born with the disorder, even raised the possibility that such testing could eventually be required—if not legally, then at least de facto, in order to avert the risk of a costly lawsuit.
With health-care costs in America currently increasing at 11 percent per year (and at least half that amount in most other developed countries), a rate far beyond the broader rate of inflation, in the future such testing may also be required by health insurance companies if you want your child to have insurance coverage. As a result of rising health insurance costs, some American employers, such as the Michigan-based health-care company Weyco, have begun firing their employees for engaging in risky behavior like smoking. Will willfully ignoring your family’s genetic risks eventually become grounds for dismissal as well? Even in countries that have universal health care, there is still a strong incentive to prevent rather than treat disease, and the ultimate form of prevention is to avoid risks altogether.
Even though we can enact legislation to protect the rights of people with genetic disorders, such as the Genetic Information Nondiscrimination Act, which was passed by the U.S. government in 2008, I suspect there will eventually be social pressure to test for and act on genetic information. Such pressure—and the desire for a healthy baby, of course—is the reason most pregnant women in the developed world over the age of thirty-five choose to get their fetuses tested for chromosomal abnormalities; nearly all have ultrasounds and other tests. Will significant numbers of people choose to use PGD in the future in order to ensure that they have a normal child? It certainly seems possible, as a recent U.S. survey suggested: 52 percent of those polled said they would use a prenatal genetic test for susceptibility to heart disease, and 10 percent and 13 percent would test for genes associated with height and intelligence, respectively. A new project in China aims to identify gifted children and nurture their future development based on the results of genetic tests. As the project’s director says, “Nowadays, competition in the world is about who has the most talent. We can give Chinese children an effective, scientific plan at an early age.” And the California-based clinic Fertility Institutes even announced in early 2009 that it would begin offering PGD for traits such as hair and eye color. Clearly, the genetic future has arrived.