by Judy Jones
Helper T cells then produce yet another lymphokine that tells B cells it’s time to stop reproducing and start making antibodies, proteins tailored to fight specific antigens. The B cells are capable of creating millions of different antibodies, which either fight the invaders or flag them so phagocytes can recognize and devour them. The helper Ts, bless ’em, also secrete gamma interferon, a lymphokine that boosts T cell activation, assists B cells in producing antibodies, and helps macrophages digest the enemy.
When the invaders are vanquished, suppressor T cells tell the rest of the immune system to call it quits. Turning off the immune response is as important as launching it; some forms of blood cancer seem to be the result of T and B cells gone wild. And the immune system can mistakenly attack the body’s own cells, as in such autoimmune diseases as rheumatoid arthritis and systemic lupus erythematosus. (Allergies are basically an immune-system overreaction to harmless invaders like dust and pollen.)
After the battle’s over, phagocytes clean up the debris—dead cells, spilled protein fragments. Certain B and T cells, called memory cells, remain in the blood and the lymphatic system, on guard against renewed attack by the same antigen. Vaccination is a sort of basic training for memory cells; it introduces dead or weakened disease-causing agents into the body, priming memory cells so that they’ll recognize the invader should it ever appear again in full force.
The good guys don’t always win, of course. Cold viruses, for example, are constantly mutating to escape detection, and mineral fibers like asbestos overwhelm macrophages, which can’t digest them. But for every bout with flu or herpes you suffer, thousands of sneak attacks have been thwarted.
There’s one attacker against which the body seems to be powerless. HIV, the virus that causes AIDS, enters the body hidden inside helper T cells and macrophages in blood, semen, or vaginal fluid from an infected person. The virus apparently hijacks the victim’s helper T cells when they come to investigate the intrusion. It keeps more or less quiet for months or years in these hostage T cells until they begin to divide, perhaps in response to some subsequent infection. Not only do the helper T cells fail to sound the alarm that would activate killer T and B cells, but they are themselves spreading the disease. Second-wave invaders, in the form of opportunistic infections, meet no opposition when they march in for the kill.
There is, however, good news on other fronts. Researchers are harnessing the arsenal of the immune system to fight cancer. By fusing antibody-producing spleen cells with cancer cells, they’ve cloned the resulting hybrids to produce antibodies that scout the body for incipient tumors. These so-called monoclonal antibodies may someday be routinely used to deliver drugs or radiation to a specific diseased site in the body, bypassing healthy tissue.
Perhaps the most dramatic advances come from the relatively new field of psychoneuroimmunology, the study of interactions between the mind and the nervous, endocrine, and immune systems. Scientists have found, for example, that a body under stress produces an excess of the steroid cortisol. Because macrophages coping with all that cortisol can’t seem to handle other infections, we catch more colds when we’re stressed out. We know that exercise seems to stimulate the brain to produce natural painkillers, called endorphins and enkephalins, and that it might even affect T cell production. Why? And how do the brain and immune system communicate? It looks as though the brain may speak to the white blood cells through the cytokines, the protein messengers—and the white blood cells may talk back. The immune system may ultimately be found to work not just as an army, but as part of an elaborate feedback loop. Is it the body’s discussion group? Its town meeting? Stand by for new metaphors.

GENES “R” US
OK, here’s a quiz: The Human Genome Project was (a) a vast government-funded science project that began in 1990 and ended in 2000 with a fancy joint press conference hosted by then-president Bill Clinton and British prime minister Tony Blair; (b) a vast government-funded science project that began in 1990 and ended in April 2003, when the National Institutes of Health decided that it had closed many gaps in the previous calculations; (c) a vast government-funded science project that began in 1990 and ended in October 2004 when the journal Nature published what it called the most complete sequence of the human genome; or (d) all of the above.
You guessed it: d. And that’s the beauty—and complexity—of the Human Genome Project. Its mission, essence, and conclusion are all in the eyes of the beholder.
But let’s back up. You’ve read, or at least we’ve talked, about genes and DNA already and how there are thousands of genes in each of us, crammed along a tightly coiled strand of DNA in the nucleus of every one of the one hundred trillion cells that make up the human body, including hair and nails (but not red blood cells, which have no nucleus). And as we’ve mentioned, the information contained in that strand equals the data in a thousand encyclopedia volumes or, if you prefer, 125 Manhattan phone books. It programs the birth, development, growth, and death of each of us. And like any good computer program, it’s written in a relentlessly simple code: three billion pairs of chemical letters—A always paired with T, G always with C—form the rungs of the famous spiral staircase, a.k.a. the double helix, and spell out the instructions for making proteins, the basic building blocks of life.
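A quick back-of-the-envelope check on that comparison—assuming two bits of information per base pair (each rung is one of four possibilities) and roughly a megabyte of plain text per printed volume, both ballpark assumptions rather than measured figures:

\[
3 \times 10^{9}\ \text{base pairs} \times 2\ \text{bits per pair} = 6 \times 10^{9}\ \text{bits} \approx 750\ \text{megabytes}
\]

Spread over a thousand volumes, that’s about three-quarters of a megabyte—several hundred pages of text—per volume, so the encyclopedia comparison is in the right ballpark.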
The Human Genome Project (HGP) was originally set up in 1990 by the U.S. Department of Energy and the U.S. National Institutes of Health to identify the thousands of human genes and map the exact sequence of the three billion base pairs (or nucleotides) that make them up. (It’s worth noting that about 97 percent of the nucleotides are “junk”—that is, they don’t correspond to protein coding.) It was a complicated job, and in some ways a humbling experience. For one thing, as scientists got closer to completing the sequence, they realized that the number of genes humans have is considerably lower than original estimates: from upward of a hundred thousand back in the mid-1990s to a (more or less) final count of twenty-five thousand, which puts us on par with the puffer fish and the mustard weed. As Dr. Francis Collins, director of the HGP, put it: “If we wanted to claim our special attributes on this planet come from the gene count, we’ve taken a serious hit.” (Since this discovery, scientists hoping to explain what makes humans different from other living creatures have begun focusing on the aforementioned “junk” parts of the genome, which, many believe, might turn out to be some kind of complicated genetic circuitry. If so, it could mean that human complexity arises not from the basic building blocks themselves but from the way those building blocks—the proteins—are wired together.)
For a government project, the HGP did a pretty good job. Regardless of which of the aforementioned dates counts as the completion, it came in ahead of schedule (the project had been slated to take at least fifteen years) and, depending on your accounting system, probably under budget.
The mapping of the human genome is seen as the first step in tracking information on, among other things, four thousand genetically linked diseases. Theoretically, identifying the specific gene for a disease could lead to new therapies, which in turn could lead to cures for everything from Lyme disease to Alzheimer’s to ALS (Lou Gehrig’s disease) to severe combined immunodeficiency—the so-called boy-in-the-bubble syndrome (the genes for all of which, and many others, have in fact been identified, thanks to the HGP). It could also give doctors a leg up in devising treatment plans for particular individuals. According to its proponents, someday your personal genome might be interpreted and the information transferred to a smart card. Then, when you got sick, you could walk into any hospital and hand over the card; doctors would be able to see all the glitches in your code (don’t take it personally, we all have them), identify precisely what the problem is, and prescribe just the right treatment for your specific genetic makeup.
But as you may have guessed, we’re still a long way from such a scenario. Yes, there have been major advances in the number of disease genes identified (from about 150 in 1990 to about 1,500 in 2002), and even in their application to cancer research (in 2004, scientists located a region of genes that increases a person’s risk of getting lung cancer). But once you’ve discovered the elusive gene, the next steps—understanding the pathogenesis of the disease, then treating, and eventually preventing, it—are difficult, time-consuming, and costly.
There are also ethical complications, which tend to multiply, not disappear, as the science evolves. Do you really want to be in on that biggest of secrets, your biological fate? What would happen to unborn babies who were found to have the gene for, say, Huntington’s chorea or cystic fibrosis? Would their parents decide to abort? What if you were told that your fetus had a better-than-50-percent chance of developing breast cancer or Alzheimer’s in middle age? That’s an awful lot of good years to decide to cut off in the name of a mere probability. And there are legions of parents who wouldn’t be happy to hear that their child carried the gene for obesity—or homosexuality.
Granted, we’ve calmed down about some of these issues. For one thing, scientists have explained that it’s rarely a single gene that makes Jimmy gay. Instead, they believe it has to do with how numerous genes interact with each other, how that genome “junk” works—and how closely the prepubescent Jimmy identifies with Judy Garland. Meanwhile, the focus has been shifting from the genes themselves to the proteins they make. (We may have the same number of genes as a field mouse, but our superior protein production means we’re the ones exploiting Stuart Little and Mickey, not vice versa.) In any case, current thinking leans toward the conclusion that it’s neither nature nor nurture that decides our fate, but rather some of both. After all, we know that cultural and economic factors—not to mention the dearth or glut of fast-food chains—have as much to do with obesity as the fat gene.
Nowadays, the ethical debate over genetic research revolves around more mundane questions; for instance, how will we keep personal genetic information from being used against us by employers and insurance companies? (It wouldn’t be hard for them to obtain, given how it’s all right there, in blood, fingernail clippings, and the hair on the barbershop floor.) Some U.S. politicians have tried to push a Genetic Nondiscrimination Act through Congress, but so far it has failed to gain passage.
On a more philosophical level, all of this genetic scrutiny has made one thing clear: Genetically speaking, people are a lot more alike than they are different. Scientists now say that 99.9 percent of any two human beings’ DNA sequence is identical. Not only that, but 50 percent of our genes are the same as a banana’s. And we’re even closer to the genetic makeup of a fruit fly. You might want to think about that the next time you reach for the flyswatter.

CLONING AND THE STEM CELL DEBATE
It all started with a little lamb named Dolly. Born on July 5, 1996, the ewe caught the world’s attention as the first mammal to be cloned from the adult body cells of another. Until then, most scientists didn’t believe that cloning from adult cells was possible. But Dolly, named after country singer Dolly Parton, opened up a whole new scientific realm—and a big can of worms.
But we should back up here. For all its cutting-edge, sci-fi associations, “cloning” is actually a pretty old term. Coined by a horticulturalist in 1903, it originally described an exact genetic copy of an individual organism, asexually produced, such as when a strawberry plant sends out runners that take root at an appropriate distance from the parent plant. Cloning took on new connotations in 1952, when biologists made new frogs by transplanting nuclei from embryonic frog cells into egg cells. After that, the floodgates opened, and within two decades scientists had cloned mice, pigs, sheep, and cattle. All of these clones were cultured from embryo cells. Dolly, however, was in a class by herself, since she was cloned from an adult cell—and her birth demolished the then-prevailing theory that once cells had specialized, they were no longer totipotent (that is, able to develop into a complete organism).
Somatic cell nuclear transfer—the fancy name for cloning—is pretty simple, at least in theory. A cell is taken from an adult animal. The nucleus, along with the DNA inside it, is pulled out and placed beside an egg cell that has had its own nucleus removed. (Think of it as sort of a blind date in a petri dish.) The egg and the nucleus are then nudged together by a mild electrical current and bathed in a kind of chemical love potion. The egg is essentially duped into thinking it has been fertilized, and in the best-case scenario it starts dividing like crazy. Although we like to think of clones as mad-scientist experiments created exclusively in a test tube, the fact is that only the very beginnings of the process actually happen in vitro. Soon after its creation, the blastocyst (the ball of dividing cells that develops after conception) is transferred to the womb of a surrogate mom. (Dolly—or her cellular beginnings, anyway—was only “grown” in the lab for seven days before being inserted into a blackface ewe’s womb. The surrogate then carried her to term.)
As simple as all this sounds, cloning is notoriously inefficient. For every happy clone that survives beyond birth, there have been, according to some estimates, as many as a thousand unsuccessful ones. Obviously, those aren’t great numbers. There are also questions about whether clones are genetically dysfunctional. Many cloned animals have been born ill or deformed, or have died prematurely. This includes, alas, our friend Dolly, who developed lung disease along with a number of other ailments and had to be put down in 2003 at the relatively young age (for a sheep) of six. Some scientists suggested that the genetic blueprint actually wears out, meaning that cloning has less in common with a Xerox copier than with the old ditto machine in the office of your junior high. Still, that hasn’t stopped enterprising entrepreneurs from cloning other animals, including, notably, champion racehorses. And as of this writing, one company, Genetic Savings and Clone (don’t look at us), will create a carbon copy of your cat for only $32,000.
All these precedents have made it clear that the reproductive cloning of human beings is now within the realm of the possible. Whether or not it is desirable is another story, and a hot topic. Duplicating humans in the laboratory has an unnerving, Frankenstein (or at the very least Boys from Brazil) aspect to it that makes many people queasy. Though there have been no successful births of cloned humans (despite a number of unsubstantiated claims), in 2005 an alarmed United Nations General Assembly adopted a largely symbolic, nonbinding declaration calling on member states to prohibit all forms of human cloning.
But cloning isn’t only, or even chiefly, for reproductive purposes. Many scientists are far more interested in the fact that cloning can be used to create stem cells, the jacks-of-all-trades of the cell world. Stem cells are part of the body’s repair system, dividing without limit and replenishing other cells in the organism. These remarkable creatures can either stay as they are or morph into something more specialized, such as a muscle cell, a brain cell, or a red blood cell. Advances in this kind of technology—called therapeutic cloning—may hold the key to curing diseases such as Alzheimer’s, Parkinson’s, and diabetes.
Unfortunately, therapeutic cloning and reproductive cloning are often conflated in the public debate—particularly by those who believe that life begins at conception. (To them, extracting stem cells from cloned embryos—which destroys the embryo in the process—is simply high-tech abortion.) The moral complexities have resulted in restrictions in many countries, including Japan, where anyone found guilty of attempting to clone a human being faces up to ten years in prison, and the United States, where federal funding for the creation or manipulation of embryos is prohibited. These restrictions have frustrated many scientists and their supporters (including former First Lady Nancy Reagan, who became an outspoken advocate of stem cell research as her husband succumbed to Alzheimer’s), who see embryonic stem cell research as the key to future advances in medicine. But even advocates urge caution.
One popular scenario warns of the experimental procedure in which human stem cells are inserted into an animal embryo to see how they divide and change in a living system. The research may be necessary—you would probably want to test the technology on lab rats before you tried it on a human friend—but what do we make of the chimera (named after the creature in Greek mythology that is part lion, part goat, and part serpent) that has been created? Since it possesses human qualities (at least on a cellular level), does that mean it has human rights? And what happens if a male lab-created mouse with human cells mates with a female lab-created mouse that also has human cells? As one science writer put it, there is the chance we could have “a sort of Stuart Little scenario, but in reverse and not so cute.”
Making a Name for Yourself in Science

ARCHIMEDES’ PRINCIPLE
Holds that buoyancy is the loss of weight an object seems to incur when it is placed in a liquid, and that the loss is equal to the weight of the liquid the object displaces. Takes off from (but is not restricted to) the “Eureka!” business.
To recap, it’s the third century b.c.; we’re in Syracuse, in Sicily (for centuries an important Greek outpost); and the local king, Hiero II, has reason to suspect that the royal jeweler has sneaked some silver into the new, and supposedly 100 percent gold, royal crown. Hiero calls in Archimedes, who for a while is stumped. He knows that gold is denser than silver—a given weight of gold takes up less space—and that, consequently, a piece of pure gold and a crown of pure gold weighing the same amount would have the same volume. But how to measure the volume of a strange-shaped thing like a crown? Then, pondering the problem one day in the tub, Archimedes realizes that a body immersed in liquid displaces exactly its own volume of that liquid. Measure the volume of the water that’s spilled over the side of the tub and you’ve got the volume of the thing in the tub. At which point, Archimedes shouts “Eureka!” and runs home naked, where he puts first the piece of pure gold, then the crown said to be of pure gold, in a basin of water. The crown causes the water to rise higher, revealing itself to have a greater volume (and hence to be less dense, i.e., not pure gold). And revealing the jeweler to be guilty.
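To put illustrative numbers on the test—using standard densities for gold (about 19.3 g/cm³) and silver (about 10.5 g/cm³), and assuming, hypothetically, a one-kilogram crown that is 30 percent silver by weight (figures not from the story itself):

\[
V_{\text{pure gold}} = \frac{1000\ \text{g}}{19.3\ \text{g/cm}^{3}} \approx 52\ \text{cm}^{3}
\qquad
V_{\text{adulterated}} = \frac{700\ \text{g}}{19.3\ \text{g/cm}^{3}} + \frac{300\ \text{g}}{10.5\ \text{g/cm}^{3}} \approx 65\ \text{cm}^{3}
\]

The debased crown of the same weight pushes roughly a quarter more water over the rim than the pure gold piece—a difference plainly visible in a basin, no instruments required.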