by Judy Jones
NAME: Homo floresiensis (from Flores, the Indonesian island where they were discovered)
TIME FRAME: 94,000 to 12,000 years ago.
VITAL STATS: Nicknamed the Hobbits, and for good reason; these were wee creatures. The most complete skeleton was from a thirty-year-old female who stood a little taller than three feet and weighed a slight fifty-five pounds. Their brains were only about as big as grapefruits. Still, they made a variety of tools and clearly used fire.
COMMENTS: The Hobbits have put a lot of scientists into a quandary. Traditional paleontology has it that hominid intellect is directly correlated with brain size. So, what to make of these guys, who existed at the same time as modern Homo sapiens and were clearly smart enough to make weapons to hunt a variety of wildlife (including their favorite prey, a rat the size of a small golden retriever)? Even weirder, local Indonesian myths mention hairy little creatures called Ebu Gogo, who had a reputation as the worst sort of dinner guests, eating everything from the serving plates to the village infants.
NAME: Homo sapiens sapiens (“wise, wise man”)
TIME FRAME: From about 195,000 years ago to today and perhaps even tomorrow.
VITAL STATS: Look in the mirror. Your brain probably logs in at about 1,350 cubic centimeters and you wouldn’t kick yourself out of bed.
COMMENTS: Our subspecies, right or wrong. Its members were hunter-gatherers until relatively recently; they domesticated animals 18,000 years ago, invented agriculture 12,000 years ago, founded cities only about 5,000 years ago, and didn’t invent the Slinky until the mid-twentieth century.

THE SPLICE OF LIFE
Genetic engineering is the process that inserts genes from one living organism into the cells of another, thereby custom-tailoring them to do work they weren’t designed for. (But they still won’t do windows.) For example, thanks to genetic engineering, or recombinant-DNA technique, millions of bacteria are kept busy churning out precious human insulin. Scientists built the microfactories by slipping the human gene responsible for the creation of insulin into E. coli, a mild-mannered bacterium found in our intestinal tract. (It beat waiting around until a pig died, then harvesting its pancreas.) So far, genetic engineering has been most successful with microorganisms, plants, mice (whose immune systems have been made to mimic those of human beings), and—Stephen King, take note—livestock: pigs that gain weight faster, cows that give more milk. Science is still working on the problem of getting genetically altered DNA back into a human cell and, we’re told, isn’t even close to a solution. Someday, however, we may be able to replace or repair bad genes, like the ones responsible for such diseases as cystic fibrosis, sickle-cell anemia, and perhaps even cancer.
To understand how genetic engineering works, you’ve got to know about deoxyribonucleic acid, or DNA, three letters even more important than MTV. First, go back to the days when you were a single cell—a fertilized egg, or embryo, whose nucleus contained forty-six packages called chromosomes, twenty-three from each parent, carrying the coded information for all your inherited characteristics, from hair color to susceptibility to stress to, some would argue, sexual orientation and propensity for violence. As your cells divided, each new cell was issued forty-six identical chromosomes (except for your reproductive cells, which, in anticipation of future mating, have only twenty-three apiece). The strandlike chromosomes are made up of several thousand genes, each responsible for a particular trait. And the genes are composed of DNA, the chemical that runs the show, programming and operating all life processes. All living things have DNA in every cell. (The exception: Some viruses contain only a chemical cousin of DNA called RNA.) Indeed, it has been said that if the instructions coded in one person’s DNA were written out, they would fill one thousand encyclopedia-sized volumes. You might want to wait for the movie.
In 1953 Francis Crick and James Watson discovered that DNA is shaped like a spiral staircase, the famous double helix, a finding that later earned them a Nobel Prize. Phosphates and sugars form the railing of the staircase, and pairs of four nitrogenous bases (adenine, thymine, guanine, and cytosine) in various combinations form the steps—up to about three billion of them in human DNA. The order of the base pairs determines the particular characteristics of any shrub, egret, or stand-up comic.
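For readers who like to see a rule spelled out, here is a tiny sketch of the base-pairing principle that makes the staircase work: adenine always pairs with thymine, guanine with cytosine, so either strand determines the other. The sketch is ours, not Watson and Crick’s, and the eight-base sequence is invented purely for illustration.

```python
# Toy illustration of base pairing: A pairs with T, G with C,
# so one strand of the double helix dictates the sequence of the other.
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand: str) -> str:
    """Return the base sequence that would pair with `strand`."""
    return "".join(PAIRS[base] for base in strand)

# An invented eight-base snippet; real human DNA runs to about three billion pairs.
print(complementary_strand("ATGCCGTA"))  # prints TACGGCAT
```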
Twenty years after this discovery, two California researchers, Stanley Cohen and Herbert Boyer, found a way to perform DNA transplants. It was their technique that turned bacteria into human-insulin factories: DNA is removed from a human cell and cut with special enzymes at the spot where the needed gene is found; then, using more enzymes, the human gene is snapped into a plasmid, a small loop of extra DNA found in bacteria. When the host bacterium reproduces, it creates millions of copies of itself, each carrying the new gene. In this way scientists produce not only insulin but growth hormone and cancer-fighting interferon.
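If it helps to picture the cut-and-paste step, here is a cartoon of it in code. The plasmid and the “insulin gene” sequences below are made up, though GAATTC really is the short sequence recognized by one such cutting enzyme (the restriction enzyme EcoRI), which snips between the G and the first A.

```python
# Cartoon of recombinant-DNA splicing, with invented sequences.
RECOGNITION_SITE = "GAATTC"            # EcoRI's recognition sequence (cut: G^AATTC)
plasmid = "CCTTGAATTCGGAA"             # made-up bacterial plasmid
human_gene = "ATGGCCCTGTGG"            # made-up stand-in for the insulin gene

cut = plasmid.index(RECOGNITION_SITE) + 1          # cut just after the G
recombinant = plasmid[:cut] + human_gene + plasmid[cut:]
print(recombinant)

# Each time the host bacterium divides, the recombinant plasmid is copied
# along with it, yielding many copies that each carry the new gene.
colony = [recombinant for _ in range(8)]
```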
There’s plenty of controversy and intrigue in the world of genetically altered organisms. In 1985, such an organism was tested for the first time outside the lab—semiofficially. A vaccine made by genetically altering a herpes virus was injected into 250 piglets. The test may have launched a golden age of agriculture—or a regulatory nightmare. What if the gene-altered vaccine goes wild, mutates, affects humans? And, not to slip into anticlimax or anything, will small family farmers be able to afford the new technology? You see the problem: legislators who have a hard time formulating a policy on Haiti set loose on something like this.
Gene splicing is not to be confused with cloning. And don’t get genetic engineering mixed up with in-vitro fertilization, the creation of test-tube babies. In that case, an ovum (more likely, several ova, to maximize the chances for success) is removed from the mother, fertilized in a petri dish by sperm from the father, and returned to the mother’s womb. Today there are more than a million in-vitro children in the world; in fact, the first, Louise Brown of England, is now all grown up. And the term is unfair: The kid spends only a couple of days in a dish, as an egg about 1/250 of an inch in diameter.
In a newer procedure, so-called prenatal adoption, a woman needn’t even produce her own egg. Rather, the fertilized ova of another woman, who has been artificially inseminated with sperm from the first woman’s mate, can be implanted in the womb of the first woman and the resulting fetus carried to term.
Dr. James Watson, left, and Dr. Francis Crick with their Nobel Prize–winning model of the DNA molecule
Yet another technique allows the freezing of eggs, sperm, and full-fledged embryos for future use. A woman might deposit a fertilized egg, meant to be reimplanted later, once she’s sown her wild oats or launched her career. Or she could rent a womb, paying someone else to carry her child.
But mightn’t that lead to the creation of a class of drone-moms, employed by people who want to buy their way out of morning sickness? Ah, that’s just one of the ethical problems of biotechnology, the blanket term for all this fiddling around with Life Itself. Other sticklers: What if the wrong people start tampering with characteristics like physique, building a brave new world of superjocks and ultrawimps? What if a genetically engineered microbe escapes from the lab or the test area? There are still some eensy bugs to be worked out.

MANY ARE COLD BUT FEW ARE FROZEN
Yes, the Big Chill is coming, but you won’t need your industrial-strength thermal underwear for another 3,000 to 20,000 years.
Over the past billion years, the earth has experienced three long periods during which ice built up at its poles, each period made up of several 100,000-year “ice ages,” when glaciers advanced to cover much of the world. These ice ages were punctuated by 10,000-year “interglacials,” warm spells marked by the melting of the vast ice sheets. We live at the end of such a temperate time-out; the last great ice age wound down about 7,000 years ago. At its peak, 20,000 years ago, glaciers encased much of North America, Europe, and Asia. Days were about eleven degrees colder than they are now, forcing humans and animals southward.
It’s not hard to see how an ice age is caused by a temperature drop, creating summers cool enough that the previous winter’s snow never melts. Several seasons’ snows accumulate and compact to form glaciers. But what turns down the thermostat? The cold facts have been hotly debated, but the theory most widely accepted—the “astronomical” theory—states that three periodic changes in the earth’s position relative to the sun seem to have launched ice ages by influencing the amount of solar radiation the earth receives.
Because of the gravitational pull of the sun and moon on the equator, the earth wobbles on its axis like a toy top slowing down. Every 22,000 years or so, the axis traces out a complete circle in space. The axis also tilts, causing the seasons. When the North Pole tips away from the sun, it’s winter in the Northern Hemisphere. Today, the angle of tilt is 23½°, but every 41,000 years it moves from 22° to 24° and back again. Perhaps the most important cycle is a change in the shape of the earth’s orbit—from nearly circular to highly elliptical and back to circular—every 100,000 years due to the gravitational tug of fellow planets. The combined effect of these three cycles is to place the earth farther away from the sun at certain times, cooling the planet into an ice age.
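For the arithmetically inclined, here is a toy rendering of the idea, strictly ours: the three slow cycles, with the periods given above, superimposed as simple sine waves. The amplitudes and the sine-wave form are invented for illustration; a real calculation of the astronomical theory involves genuine orbital mechanics, not three wiggles added together.

```python
import math

# Cycle periods from the text; everything else is an invented simplification.
PRECESSION = 22_000      # years: the wobble of the axis
OBLIQUITY = 41_000       # years: the tilt swinging between about 22 and 24 degrees
ECCENTRICITY = 100_000   # years: the orbit stretching from circular to elliptical

def toy_forcing(years_ago: float) -> float:
    """Crude combined index; a bigger value stands in for less summer sunlight."""
    return sum(math.sin(2 * math.pi * years_ago / period)
               for period in (PRECESSION, OBLIQUITY, ECCENTRICITY))

for t in range(0, 200_001, 25_000):
    print(f"{t:7d} years ago: {toy_forcing(t):+.2f}")
```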
If the astronomical theory makes sense to most scientists, why do some of them get all worked up about an imminent ice age? They insist that a veil of dust thrown up by man-made pollution and increased volcanic activity, together with thicker cloud cover generated by jet vapor, will lessen the amount of heat we get from the sun. (As so often happens in the wacky world of science, other people worry that some of these very same problems will make the earth too hot, not too cold.) Even if pollution and volcanoes do kick up enough dust to cause a slight drop in temperature (and we did find ourselves wearing sweaters a lot in the summer of 1991, right after Mount Pinatubo erupted in the Philippines), most scientists agree that this will not alter the primordial rhythms of the cosmos. It’ll be thousands of years before the ice age gathers steam, to mix a metaphor. In the meantime, ours may ultimately be the warmest interglacial in history—dangerously warm, thanks to the greenhouse effect, up next.

SOME LIKE IT HOT
Leave your car parked all day in the sun with the windows rolled up and you’ll return to an interior much hotter than the air outside. That’s because window glass admits sunlight but traps heat. Carbon dioxide (CO2), the gas we exhale, has a similar physical property; it is permeable to solar radiation (sunlight) but opaque to infrared radiation (heat). We need some CO2 to act as insulation, along with water vapor and ozone, an unstable form of oxygen. No sweat, as long as CO2 stays in its current proportions, i.e., only about 0.03 percent of the total atmosphere.
In the last 100 years, however, the amount of atmospheric CO2 has increased about 15 percent, thanks to the burning of coal, oil, and natural gas and to widespread deforestation, which has eliminated plants that would have photosynthesized thousands of tons of CO2 into oxygen and carbohydrates. The more CO2 in the atmosphere, the hotter we get. A rise in the global temperature of a mere 2°C to 4°C (3.6°F to 7.2°F), which some climatologists and a brace of computers think could happen within this century, would melt polar ice caps, causing seas to rise and submerge coastal cities; upset weather patterns, triggering droughts and floods; and decrease wind circulation, playing havoc with ocean currents and depleting the fish supply. Already global average temperatures are 1.1°F warmer than they were a century ago, and virtually all the hottest years on record have taken place in the last two decades.
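The numbers in that paragraph are easy to check for yourself. Here is the back-of-the-envelope version; the figures are the passage’s, the code is ours.

```python
# Redoing the passage's arithmetic.
baseline_co2 = 0.03      # percent of the atmosphere (the text's figure)
rise = 0.15              # "about 15 percent" increase over the last century
print(f"CO2 today: roughly {baseline_co2 * (1 + rise):.4f} percent of the atmosphere")

# The projected warming, Celsius converted to Fahrenheit (F = C * 9/5):
for delta_c in (2, 4):
    print(f"a {delta_c} degree C rise = {delta_c * 9 / 5:.1f} degrees F")
```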
As is usual in the apocalypse biz, a few people see the greenhouse effect as a boon, not a bane. First of all, they insist, it won’t get all that hot, and since CO2 stimulates photosynthesis and increases plant yields, agricultural productivity could soar. Others think the warming trend could extend our interglacial period (see preceding entry) and even stave off the next ice age. More people, however, side with the British scientific journal Nature, which observed some time ago that the release of CO2 from all those fossil fuels we’ve been burning may be the most important environmental issue in the world today.
Don’t confuse the greenhouse effect with deodorant doom, the notion that such man-made chemical compounds as jet exhaust and chlorofluorocarbons used as spray-can propellants and in air-conditioning (and now being phased out in the United States and much of Europe but not in most of the rest of the world) will destroy the ozone layer (ten miles up and twenty miles thick) that protects the planet from deadly ultraviolet and other high-energy radiation and you from skin cancer and cataracts. But feel free to fret. In 1985, scientists reported a large, mysterious hole in the ozone layer over the South Pole. The hole, which has opened up every southern spring since, is as big as the continental United States and growing. It could be a transient, fluky weather phenomenon, a previously unnoticed cyclical event—or the awful thing everyone said would happen if we overdid the Aquanet.

ONE SINGS, THE OTHER DOESN’T
As the host said, rolling his eyes at the departing couple, “I don’t know how they stay together. She’s so left hemisphere, and he’s so right hemisphere.” He meant not that one of them’s from Peshawar and the other’s from Pittsburgh, but that their personalities epitomize the different ways the two halves of our brain seem to deal with the world. The left hemisphere, which controls the right half of the body, is responsible for language and logic; the right hemisphere, which controls the left half of the body, handles such intuitive, nonverbal processes as emotions and spatial relationships.
Uncovering this dichotomy earned a 1981 Nobel Prize for Roger Sperry, the Caltech psychobiologist who experimented with people whose hemispheres had stopped speaking to each other. Ordinarily, the two halves of the brain are connected by a bundle of millions of nerve fibers, the corpus callosum, that allows signals to pass between the hemispheres and enables us to function as an integrated unit. By observing epileptics whose corpora callosa had been severed to prevent the spread of seizures, Sperry learned how the hemispheres divvy up the chores. In one classic experiment, he showed a different picture to each hemisphere simultaneously. (What falls in the right visual field is processed in the left hemisphere, and vice versa.) The left, verbal hemisphere was shown a picture of a knife and the right, nonverbal hemisphere was shown a picture of a spoon. Asked to name what he saw (to use language, a left-hemisphere skill), the subject said knife. Asked to feel about with his left hand (a spatial skill, controlled by the right hemisphere) and pick up what was in the picture, the subject chose a spoon from a group of objects. The right brain didn’t know what the left brain was doing.
Until recently, the division of labor between left and right seemed fairly clear-cut and accounted for a variety of functions: intellect versus intuition, concrete versus abstract thinking, objective versus subjective understanding, linear versus holistic perception. But now the head honchos themselves are split on a number of issues. Most agree that in 95 percent of right-handers and 67 percent of left-handers, the left hemisphere is specialized for language. But new evidence suggests that the right brain handles some important language functions: recognizing narrative and humor (storyline and punchline), interpreting tone of voice, making metaphors. Scientists also thought that the rational left brain was definitely boss, relinquishing control to the right only during sleep and dreams. Now it appears that there may be complementary dominance: Each hemisphere has a specialty and kicks in when duty calls.
The specialization may have less to do with what the task is than with how it’s done. The right hemisphere seems to be a creative generalist, trying lots of solutions until one works. The left brain seems to be a plodding specialist, good at problems that are already familiar. Physical differences between the hemispheres support this hypothesis: The right hemisphere has long fibers plugging into many different areas of the brain, so it can schmooze and tune into the grapevine to solve a problem. The left brain contains shorter fibers tapping a smaller space, allowing it to do more detailed work. Still more studies (don’t these neurologists ever go to the movies?) show that the right brain handles novelty, the left brain what’s old hat. Unfamiliar faces are registered with the right hemisphere, familiar faces recognized with the left. Seasoned telegraphers interpret Morse code with the left brain, young whippersnappers, the right. Nonmusicians recognize melodies with the right hemisphere, but musicians use the left.
One interesting bit of conjecture to come from split-brain theory: the idea that déjà vu—the feeling that what’s happening now has already happened in the past—may be nothing more than a neurological glitch that causes information to reach one hemisphere a split second before it reaches the other, so that a single event is processed twice.

STATE OF SIEGE
In the last thirty years, researchers have pieced together a picture of what happens when a foreign invader (e.g., a virus, bacterium, or parasite) enters the body. First, the army—our trillion white blood cells, born in the bone marrow—is alerted. These cells are of two major kinds: phagocytes (“cell eaters”) that constantly patrol the body looking for intruders to devour, and lymphocytes, little commandos called T and B cells (T is for thymus, a small gland behind the breastbone where T cells mature, B for bone marrow, the birthplace of B cells).
The soldiers in this army communicate via special proteins called cytokines. (Yes, of course, the military metaphor is tired, but “the body’s bowling team” just doesn’t cut it.) At the first sign of attack, special phagocytes called macrophages begin to swallow up the enemy, taking from each the equivalent of its dog tag—a tiny piece called the antigen, which a macrophage then displays on its own surface to attract helper T cells trained to recognize that particular antigen. (The thymus apparently has on hand a T cell able to recognize each of nature’s hundreds of millions of antigens.) The macrophage secretes a cytokine called interleukin-1, which stimulates the helper T cells to reproduce. The helper Ts in turn secrete interleukin-2, which causes the creation of still more helper T cells and of killer T cells, whose job it is to clobber body cells already taken over by the invader. The new helper T cells release a lymphokine that orders the production of B cells in the spleen and lymph nodes. (Gives a whole new meaning to the term “sick day,” doesn’t it?)