Inheritance: How Our Genes Change Our Lives--and Our Lives Change Our Genes
Given the number of medications we prescribe and the spectrum of genetics involved, codeine use in the pediatric population is likely only one of many instances in which drugs meant to help people heal are having the opposite effect.
We now have the tools to identify ultrarapid and ultraslow metabolizers of certain medications, including opiates, through relatively simple genetic tests. But there’s a good chance that if you were recently prescribed an opiate like codeine in the form of Tylenol 3, you didn’t get checked this way.
So why aren’t those tests being used more proactively? That’s a great question—and one I absolutely urge you to bring up with your physician before you let yourself or your children be treated with certain medications.*
Of course, a risk to some isn’t a risk to all. For some people, codeine can be a perfectly safe and effective choice for pain relief.
So what we’re moving toward, I hope sooner rather than later, is a world where there is no average recommended dose of any drug that is sensitive to your genetic inheritance but rather a personalized prescription, one that takes into account a myriad of genetic factors and results in dosages that are just right—just for you.
Beyond dosing recommendations that work best for most but not all people, we’re beginning to understand that our genomes also play a significant role in how we respond to preventive health strategies. To appreciate what this might mean for you and the health recommendations you’re being given, I’d like to introduce you to Geoffrey Rose and acquaint you with his aptly named Prevention Paradox.
***
Some doctors are clinicians. Others are researchers. Not everyone can be both—and not everyone who could be both actually wants to be.
But for some doctors, myself included, the chance to see laboratory research reflected in the lives of patients offers incredible opportunities, enormous insights, and the absolute privilege of being in a front-line position to help people.
That’s what kept Geoffrey Rose going, too. As one of the world’s foremost experts on chronic cardiovascular diseases and one of the preeminent epidemiologists of his time, Rose certainly wasn’t required by the research community to do any clinical work at St. Mary’s Hospital in London’s historic Paddington district. But Rose continued to see patients for decades, even after a brutal car accident nearly claimed his life and resulted in loss of vision in one eye. He kept going, he told his colleagues, because he wanted to ensure his epidemiological theories were always grounded in clinical relevance.4
Rose is perhaps best known for his work highlighting the need for population-wide prevention strategies, such as the educational and interventional measures we’ve applied to the epidemic of heart disease. But he also fully recognized the public health failings of such programs. He called this the Prevention Paradox, which states that a lifestyle measure that reduces risk for the entire population may offer little or no benefit to any given individual.5 This approach privileges the success of the whole while neglecting the needs of the few who don’t quite fit into the rubric of the genetic majority.
Put plainly, the wonder drug for the 5-foot-10, 185-pound white male may not do a thing for you. As we saw with Meghan’s prescription for codeine in the beginning of the chapter, it may even kill you.
Even so, we’ve made incredible gains in health outcomes by treating entire populations with vaccinations, such as those given against smallpox. Physicians, however, don’t usually treat entire populations but rather individuals within them. Yet the guidelines for how we practice medicine are derived from evidence garnered from population studies composed of individuals from an eclectic mix of genetic backgrounds. Which is why codeine was used for so long for pain relief after pediatric tonsillectomies—because it worked on most of the kids, most of the time.
One example of the Prevention Paradox occurs in the first weeks after people with high LDL or “bad” cholesterol start taking fish oil supplements. Researchers found that using fish oil (which is high in omega-3 fatty acids from mackerel, herring, tuna, halibut, salmon, cod liver, and even whale blubber) is associated with a wide range of changes in LDL levels across the population, from a 50 percent decrease to a whopping 87 percent increase.6 Researchers have dug deeper to demonstrate that people who supplemented their diets with the so-called healthy fats found in fish oil actually saw their cholesterol levels worsen if they were carriers of a gene variant called APOE4. Meaning that supplementing with fish oil may be good for some people’s cholesterol levels and very bad for others’, depending on which genes they’ve inherited.
Fish oil is far from the only supplement that millions of people worldwide are consuming daily. More than half of Americans are thought to be popping supplements, to the tune of $27 billion a year in sales, hoping to prevent illnesses and treat diseases in what seems to be a simple and natural way.7
And there are not many medical guidelines or recommendations when it comes to supplements or vitamins, which is likely one of the reasons I often get asked whether there’s any benefit in taking them at all and, if so, at what dose. My answer usually comes with the qualifying word “depends” attached. There are many reasons to take or avoid supplements and vitamins. Have you been told that you are deficient in something in particular? Do you have a genetic inheritance that requires an increased intake of certain vitamins? Or, most importantly, are you pregnant?
When it comes to fetal development, there’s no better place to appreciate how vitamins and genes can work together to prevent serious birth defects. To deepen our appreciation, we’ll need to take a trip back to the early part of the twentieth century, where there’s a certain sneaky monkey I’d like you to meet.
***
One of the biggest advances in the eradication of birth defects worldwide started with Lucy Wills and her monkey. And it’s a great example of how the old model of “what’s best for most people most of the time” has been incredibly effective at saving and improving lives but has also been ineffective at best (and dangerous at worst) for certain segments of the population.
Like many of the bright young doctors-to-be of the generation born just before the turn of the twentieth century, Wills was fascinated by the cutting-edge field of Freudian thought and had been considering spending her career pursuing the science and art of psychiatry. But while training at London University’s School of Medicine for Women, which maintained a close relationship with several hospitals in India, Wills received a grant to travel to what was then Bombay to investigate a little-understood condition called macrocytic anemia of pregnancy, which can cause weakness, fatigue, and numbness of the fingers in some pregnant women.8 Wills quickly learned something about herself: She loved a good mystery.
At the time, all that was known about the cause of macrocytic anemia of pregnancy was that sufferers had bloated and pale red blood cells. But why? Given that the disease seemed to disproportionately impact poor women, Wills suspected it might be related to their diets. In Wills’ day, as in our own, people who were poor and underprivileged typically had less access to fresh fruits and vegetables, and that was certainly the case with the Indian textile workers Wills went to study.
To test her hypothesis, Wills tried feeding pregnant rats a diet similar to what the textile workers were eating. Sure enough, the rats began showing similar changes to their red blood cells, and Wills soon found she could get similar results in other lab animals, too.
With that, Wills began to “build back up” the animals’ diets, in much the same way as modern parents are encouraged to introduce new foods to their infants, one by one, in order to make adverse reactions easier to pinpoint.
Wills knew that a complete healthy diet would likely eliminate the problem, but she also knew that she didn’t have the power to make that happen for every woman in India. What she needed to do, then, was to identify the exact dietary element that was missing from the women’s diets so that it could be supplemented during pregnancy. Despite considerable efforts, though, that exact element remained elusive—until one fateful day, when one of her test monkeys got its hands on some Marmite.
If you’re British or live in a country that’s a part of the former British Empire, you probably know about Marmite—a sticky, salty, dark brown paste with a love-it-or-hate-it taste made from concentrated brewer’s yeast—and its many brand incarnations, including Vegemite, Vegex, and Cenovis. It’s certainly not for everyone, but some folks won’t leave home without it. Marmite was a staple in British military rations through two world wars. When it ran low in army food supply chains during the conflict in Kosovo, back in 1999, soldiers and their families staged a successful letter-writing campaign to get it back on mess tent tables.9
Wills took meticulous notes on everything she did, but there’s no record whatsoever of how, exactly, the monkey got its hands on some Marmite. Monkey business being what it is, it’s possible the mischievous little creature just stole part of Wills’ breakfast.
“Tar in a jar,” as it’s both fondly and derisively known, is also chock-full of folic acid. And that, Wills discovered as her monkey staged a remarkable medical recovery in the wake of its Marmite feast, was the secret to curing macrocytic anemia of pregnancy.
It took another two decades for researchers to understand exactly why folic acid was such a powerful cure. Since then we’ve learned that it is crucial for rapidly dividing cells, which explains why women who don’t get enough of it during pregnancy can become anemic: Their babies are consuming all their folic acid in order to grow.
In the 1960s, a connection was also made between folic acid deficiency and neural tube defects, or NTDs—abnormal openings in the central nervous system, such as appear in sufferers of spina bifida—that can run the gamut from relatively benign to deadly. This is the reason that physicians often recommend folic acid supplementation for women of childbearing age even before pregnancy, because the crucial window for its ability to protect against NTDs is in the first 28 days of gestation, a time when many women do not even know that they’re pregnant. Folic acid is also associated with reductions in preterm births, congenital heart disease, and, according to one recent study, possibly even autism.10
Now, even knowing this, if you still can’t bring yourself to spread a glob of Marmite on your breakfast toast, don’t worry—folic acid is also naturally found in lentils, asparagus, citrus fruits, and many leafy greens.
The American College of Obstetricians and Gynecologists recommends that all women of childbearing age get at least 400 micrograms of folic acid a day. But that amount is based on the average woman, with average genes. And as we know, there’s really no such thing as an average patient.
The recommendation also doesn’t account for one of the most common genetic variations out there. About a third of the population carries variant versions of a gene called methylenetetrahydrofolate reductase, or MTHFR, which is extremely important for folic acid metabolism in the body.
What we don’t yet understand is why certain women who have been diligent about taking supplemental folic acid before they conceive still have babies with NTDs.11 It seems that for some women with certain mutations in MTHFR, or in other genes involved in folic acid metabolism, 400 micrograms of folic acid simply isn’t enough. Because of this, they’d likely benefit from taking even more, which some physicians are now recommending, especially when trying to prevent the recurrence of an NTD.
Thinking that it might just be better to be safe than sorry?
Before you run out to the drugstore, though, you might want to take something else into consideration. Too much folic acid can mask a different problem: a deficiency of cobalamin, or vitamin B12. In short, seeking to head off one problem could hide another. And since we’re still in the very early stages of understanding the short- and long-term risks associated with large doses of supplemental folic acid, a better-safe-than-sorry approach might actually mean not introducing additional chemical compounds into your body unless you know for certain that you and your baby-to-be need them. Which is exactly why getting a thorough look at your genome can help.
Until recently, though, there wasn’t a good way to know which version of MTHFR people carry. Now there is. Testing for the common versions, or polymorphisms, of the MTHFR gene is now available and is being included in some types of prenatal testing. These screens, known as carrier testing, look for thousands of mutations in a few hundred genes. If you’re thinking about getting pregnant, it’s a good item to add to the long list of questions to ask your doctor.
Don’t be surprised, though, if your doctor doesn’t have an immediate or authoritative answer on the availability of commercial prenatal genetic testing for different versions of genes such as MTHFR. As the cost of testing has plummeted, a considerable lag has opened between the availability of testing and our understanding of what to do with the information it provides.
In particular, many physicians are still trying to determine the proper steps for effectively counseling women about individualized care—they simply haven’t had to do it before. But as doctors learn more about all the different genes we can inherit, such as APOE4, and all the things we can do to impact those genes during our lifetimes, such as taking fish oil, things are changing. And fast.
The importance of many of these findings has led to the creation of new fields such as pharmacogenetics, nutrigenomics, and epigenomics, which aim to build a better understanding of how our lives are affected by our genes and how our genes, in turn, are changed by our lives.
Now that you know that genetics plays a role in your nutritional needs, there’s one more thing you may want to consider before reaching for your next supplement.
Permit me to take you on an important side trip to explore where our vitamin supplements come from.
***
Maybe you’ve been on a health kick, or maybe it’s a New Year’s resolution, or maybe you’ve just reached that point in your life when you feel it’s time to make a change. Or maybe all this talk about nutrition is making you think about your weight, and so you’re trying to shed a few pounds or attempting to get a little more sleep. Whatever your plan, there’s a good chance that you’ve either considered using or are already taking a vitamin or herbal supplement.
Or two. Or three. Or seven.
But have you ever wondered about the origins of all those tablets and capsules? Where does the vitamin C in that adorable little chewy bear come from?
I’ll bet some of you just said “an orange.”
And that’s not surprising. After all, the companies that market these products often use oranges and other citrus fruits on the labels of their vitamin C products, as if their employees woke up this morning in a Florida orange grove, picked a few plump, juicy fruits off a tree, and, through some magical process, shrank each one of them to look like an edible teddy bear.
The truth is, though, many of the vitamins you and your children might have taken this morning have been created through a process very similar to prescription drug manufacturing. And by one way of thinking, that’s good. Consistent manufacturing processes for vitamins and supplements mean you generally are getting the same thing today that you got yesterday, and that you’ll definitely get the same thing again tomorrow.
Indeed, aside from differing government regulations, the only real difference between prescription drugs and many vitamins is that the latter are based on chemicals that are usually found naturally in food.
But that’s not the same thing as ingesting vitamins that are in food. Because when we eat an orange, we’re not just eating a fruit made purely of vitamin C but rather something that is composed of fiber, water, sugar, calcium, choline, thiamine, and thousands of phytochemicals that are not limited to that single vitamin.
In this way, taking vitamins is a little bit like listening to just the piano loop from “Empire State of Mind.” Without Jay-Z’s staccato rhymes, Alicia Keys’s supporting vocals, the rhythm tracks, and the guitar riffs, you’d be left with no more than the same few measures of repetitive keyboard pounding.
What’s missing is the entirety of symphonic nutrition—all of the other phytochemicals and phytonutrients that are in a real orange, the purposes of which, as of yet, we don’t even fully understand.
That’s not to say that vitamin supplementation can’t be helpful in certain circumstances, as we’ve already seen with the use of folic acid to prevent neural tube defects. But if you are taking supplements, or giving them to your children, instead of ingesting something you could obtain so much more naturally, you may be missing the true nutritional majesty of consuming vitamins in their most natural form.
Now, if you’re committed to applying the latest in nutrigenomics and pharmacogenetics research into your daily regimen, where to begin?
Well, to start, as we’ve discussed before, you should learn as much as you can about your own genetic inheritance. You may even consider getting your entire exome or genome sequenced. It’s much better to access and use your genetic information while you’re still living, though being alive is not really necessary to get results. As you’re going to see, when it comes to your genes, even the dead can speak.
***
The body was disfigured and terribly decomposed. So when a small group of hikers stumbled upon it while trekking through the Ötztal Alps, near the border of Austria and Italy, they initially assumed that they’d discovered the remains of another mountaineer—perhaps someone who had died several seasons back.
It took several days to get the body down from the mountain, but once that happened, it was clear that this was no ordinary hiker. Rather, the corpse was an exceptionally well-preserved mummified body that was thought to be at least 5,300 years old.
In the decades since Ötzi’s discovery, we’ve learned a tremendous amount about his life and death. For starters, it appears that he was murdered—his violent demise seems to have been caused by an arrowhead lodged in the soft tissue of his left shoulder and a subsequent blow to the head. Analysis of his stomach and intestinal contents show he’d eaten well in his final days—Ötzi had dined on grains, fruits, roots, and several types of red meat.