Heart

by Sandeep Jauhar


  Within a decade after Forssmann’s epoch-making experiment, the taboo about touching the heart had been demolished. Scientists explored every avenue of access to animal and human hearts—under the breastbone; through the ribs; just below the nipple; through the left atrium; through the aorta; through the suprasternal notch, the soft spot above the breastbone and below the throat; and even through the back—giving them unprecedented access to the physiology of a once mysterious organ.

  But as is so often the case in science, once the taboo about touching the heart was breached, it was transformed into something equally inviolable. Accessing the heart’s apple-sized chambers was one thing. Inserting needles into the coronary arteries that supply blood to those chambers was a different challenge altogether. Coronary arteries are small, hardly five millimeters in diameter. When they are diseased with fatty plaque, their diameter can shrink to microns. No one thought that dye could be safely injected into these vessels because it was feared that occluding a coronary artery with a catheter for even a few seconds would precipitate a fatal arrhythmia. Even the fearless Forssmann never messed with the coronaries; they were only studied at autopsy. Though animal studies did not validate doctors’ widespread fears, the human heart was once again believed to be uniquely impervious to intervention. But did it have to remain so? After World War II, the coronary arteries became the new frontier in cardiac medicine, and the holy grail.

  *Forssmann told the journalist Lawrence Altman that this story, which Forssmann himself publicized, was apocryphal.

  *Progress was slower abroad. John McMichael, a doctor in London, wanted to use catheterization in his own shock study. He contacted Cournand, who shared information on the technique. However, a colleague of McMichael’s warned him that the technique was dangerous and that he would not be defended in court if a patient died and he were charged with manslaughter.

  7

  Stress Fractures

  Every affection of the mind that is attended either with pain or pleasure, hope or fear, is the cause of an agitation whose influence extends to the heart.

  —William Harvey, De Motu Cordis (1628)

  In the catheterization lab, I was able to visualize the consequences—stony plaque, obstructive clot—of coronary artery disease. But why did the disease develop in the first place? This was a question that vexed scientists at mid-century, even as the heart-lung machine was being developed and cardiac catheterization techniques were being refined. (As is so often true in medicine, treatment outpaced understanding.) But by the 1960s, doctors had an idea—albeit incomplete—of the answer. And it came from a study begun in a small town in Massachusetts shortly after World War II that almost single-handedly defines the modern science of heart disease.

  The impetus for the Framingham Heart Study was obvious. In the 1940s, cardiovascular disease was the main cause of mortality in the United States, accounting for nearly half of all deaths. However, what was known about heart disease wasn’t enough to fill even a slim chapter in a modern textbook. Doctors, for example, did not know that myocardial infarction was caused by total or near-total obstruction of a coronary artery. (This mechanism wasn’t even mentioned in popular literature until 1955, when Humbert Humbert in Lolita is said to die of a “coronary thrombosis.”) The jury was also out on whether angina, chest pain caused by decreased coronary blood flow, was a psychological syndrome or a disease based on an organic cause. “Prevention and treatment were so poorly understood,” Dr. Thomas Wang and his colleagues wrote in The Lancet a few years ago, “that most Americans accepted early death from heart disease as unavoidable.”

  One who fell victim to this ignorance was our thirty-second president. Franklin Delano Roosevelt was in poor health for much of his presidency, even though his doctors, his family, and even journalists colluded to portray him as the picture of health. (Few in the public knew, for example, that Roosevelt was essentially confined to a wheelchair after contracting polio when he was thirty-nine.) Roosevelt’s personal physician, Admiral Ross McIntire, an ear, nose, and throat specialist, seemed to hardly pay attention to the president’s blood pressure as it rose over his four terms. When Roosevelt began his second term in 1937, his blood pressure was 170/100 (normal today is considered less than 140/90). When the Japanese bombed Pearl Harbor in 1941, it was 190/105. By the time American soldiers landed in Normandy in June 1944, it was 226/118, life-threateningly high. At the Yalta Conference in February 1945, Winston Churchill’s doctor wrote that Roosevelt “has all the symptoms of hardening of the arteries” and “I give him only a few months to live.” Yet McIntire insisted that the president was healthy and that his problems were “no more than normal for a man of his age.”*

  Within a month of Roosevelt’s last State of the Union address, in which he declared that “1945 can be the greatest year of achievement in human history,” his condition had visibly deteriorated. Roosevelt had already been admitted to Bethesda Naval Hospital with shortness of breath, profuse sweating, and abdominal swelling: classic signs of congestive heart failure. Howard Bruenn, one of only a few hundred cardiologists in the country at the time, gave the president a diagnosis of “hypertensive heart disease and cardiac failure.” However, there were few treatments available to him. He put Roosevelt on digitalis and a salt-restricted diet, but Roosevelt’s blood pressure continued to rise. It remained life-threateningly high until April 12, 1945, when Roosevelt died at the age of sixty-three from a stroke and brain hemorrhage. His last words, sitting for a portrait in Warm Springs, Georgia, where he had gone to rehabilitate, were “I have a terrific headache.”

  Though it was a national tragedy, Roosevelt’s death was not in vain. In 1948, Congress passed the National Heart Act, declaring “the Nation’s health [to be] seriously threatened by diseases of the heart and circulation.” Signing the bill into law, President Harry Truman called heart disease “our most challenging public health problem.” The law established the National Heart Institute (NHI) within the National Institutes of Health to promote research into the prevention and treatment of cardiovascular disease. One of the first grants was for an epidemiological study to be conducted by the U.S. Public Health Service.

  Epidemiology is about the ecology of disease: where and when it is found, or not. In 1854, John Snow, physician to Queen Victoria, performed the world’s first epidemiological study when he investigated a major cholera outbreak in London’s Soho. Snow was born in the town of York, at the intersection of two rivers contaminated by dung and sewage. His childhood likely sensitized him to a community’s need for clean water. Based on studies nearly ten years before the Soho epidemic, Snow had concluded that cholera was transmitted by “morbid matter,” not foul air, as his colleagues at the London Medical Society believed. He based his theory in part on the fact that workers in slaughterhouses, thought to be a font of cholera, were afflicted no more than the general population. So, when cholera broke out in London in 1854, Snow set his sights on a well. He went to the General Registry Office and mapped the addresses of all the cholera deaths in Soho, discovering that most deaths had occurred near a water pump on Broad Street. True to his meticulous nature, Snow also studied Soho residents who did not contract the disease—for example, inmates at a nearby prison that did not use the Broad Street pump, as well as brewery workers whose supervisor, a Mr. Huggins, told Snow that his men drank only water from the brewery’s own well (when they weren’t consuming the malt liquor they produced).

  Though Snow knew nothing of germs, he was nevertheless able to contain the epidemic, which caused 616 deaths, by persuading the board of governors of the local parish to remove the handle on the well’s pump, making it impossible to draw water. Only later, by studying water samples, did London authorities show that the pump was contaminated with sewage from a nearby cesspool, setting off what Snow called “the most terrible outbreak of cholera which ever occurred in this kingdom.” Snow’s investigation saved many lives. Just as important, it showed that an epidemic could be controlled without a precise understanding of its cause.*

  After Snow’s study and the subsequent development of epidemiological techniques, public health authorities in the United States focused their attention on acute infectious diseases like cholera, tuberculosis, and leprosy. Chronic noninfectious ailments—the long-term hard hitters like heart disease—received little attention. But after Roosevelt’s death, Assistant Surgeon General Joseph Mountin, a founder of the Office of Malaria Control in War Areas (later known as the Centers for Disease Control, or CDC), was eager to correct this disparity. As was the case with cholera in the mid-nineteenth century, very little was known about the determinants of heart disease. Could risk factors be identified by studying people who developed the disease, just as Snow had studied the victims of the cholera epidemic?

  The national climate after World War II was favorable for such an investigation. New hospitals were being built, the National Institutes of Health was expanding, and there was increased federal commitment to basic and clinical research. Moreover, a beloved president had just died. In this environment, things moved quickly. By the summer of 1948, the U.S. Public Health Service had already negotiated the basic framework of an epidemiological study of heart disease with the Massachusetts Department of Health. The commonwealth was a natural choice for the project, with top medical schools, such as Harvard, Tufts, and the University of Massachusetts, in and around Boston. The commissioner of health was “warmly enthusiastic” about a pilot study to develop heart-screening tools. With the support of Harvard physicians, the town of Framingham, about twenty miles west of Boston, was chosen as the site.

  In the late seventeenth century, Framingham was a farming community and a haven for those trying to escape the witch hunts in nearby Salem; it later became home to the nation’s first teachers’ college and first women’s prison. During the Civil War, it was the first town in Massachusetts to establish a volunteer battalion. However, by the 1940s, Framingham had turned into a middle-class industrial town. Children played with garden hoses on tree-lined streets. The 28,000 townsfolk lived mostly in single-family homes and earned a median family income of about $5,000 per year (about $50,000 today). (There were exceptions, of course, such as James Roosevelt, the president’s son, who owned a large estate on Salem End Road.) Most Framingham residents ate a typical meat-and-potatoes diet. Like the rest of the country, about half of them smoked. Predominantly white and of western European descent, they were believed to be representative of America after World War II.

  The key question at the heart of the Framingham study was this: Can the risk of a heart attack be predicted in a person with no overt heart disease? The plan was to follow approximately five thousand healthy patients between the ages of thirty and fifty-nine for twenty years until enough of them developed heart disease. Meanwhile, factors associated with the development of the disease would be identified (and later, hopefully, modified to prevent disease in healthy patients). At the time, hypothetical factors were “nervous and mental states,” occupation, economic status, and use of stimulants like Benzedrine. Though research linking heart disease with cholesterol had been available for decades (in 1913, researchers in St. Petersburg had demonstrated that feeding rabbits large quantities of cholesterol-rich foods, such as meat and eggs, caused atherosclerotic plaques), this information was not yet widely known to doctors or the American public.

  The initial outlay for the Framingham study was modest: about $94,000, mostly to cover office supplies (including ashtrays for the study researchers who smoked). Mountin, the assistant surgeon general, selected Gilcin Meadors, a young U.S. Public Health Service officer, as the first director. Born in Mississippi, Meadors had graduated from medical school at Tulane only eight years earlier. When he was tapped by Mountin, he was still completing a master’s degree in public health at Johns Hopkins. Besides a lack of experience, Meadors faced many challenges. He had to persuade local physicians, many of them suspicious of the federal government, to cooperate with the U.S. Public Health Service. Moreover, because of the long period required for heart disease to develop in healthy people, nearly half the eligible townsfolk would have to agree to participate, and their attrition rate would have to be almost vanishingly low.

  The study was announced in a small advertisement in the local newspaper on October 11, 1948. Then Meadors, the young upstart epidemiologist, went into action. Not your typical pocket-protector bureaucrat, Meadors was charming and sociable. He attended town meetings and befriended civic leaders. Flattered by his attention, a whole network of veterans, lawyers, and housewives sprang up to spread word about the study. Meadors’s recruits knocked on doors, staffed telephone banks, and appeared at churches, parent-teacher organizations, and community groups. Their mission was to help him enroll subjects willing to reveal intimate information to federal officials with no promise of any direct benefit (though Meadors said the study would eventually lead to “recommendations for the modification of personal habits and environment”). Within weeks, Meadors’s staff had filled appointment slots through the spring.

  The first study questionnaires included items about personal and family history, parents’ age at the time of death, habits, mental state, and medication use. Government-appointed doctors peered into subjects’ eyes and palpated livers and lymph nodes. Blood and urine tests were taken; X-rays and electrocardiograms were performed. Though cholesterol testing had been considered before the initiation of the study, it was only added after the research had begun.

  After a year, control of the study shifted to the newly established National Heart Institute. The NHI changed the character of the project, making its methodology more rigorous. Instead of enrolling volunteers, it now randomly selected subjects, eliminating a source of bias. The focus also shifted toward investigating biological rather than “psychosocial” risk factors. Questions about sexual dysfunction, psychiatric problems, emotional stress, income, and social class were discarded. Statisticians at the NHI invented something called multivariate analysis, a method of calculating the relative importance of each of several factors that coexist in the expression of a disease. (In the beginning, Framingham scientists focused on age, serum cholesterol, weight, electrocardiographic abnormalities, red blood cell count, number of cigarettes smoked, and systolic blood pressure.) Therefore, the Framingham study, as it emerged in the 1950s, was “clinically narrow,” as one researcher put it, “with little interest in investigating psychosomatic, constitutional, or sociological determinants of heart disease.” This would turn out to be a major flaw.
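  (For readers who want to see the mechanics, the sketch below shows in rough outline what a multivariable analysis of coexisting risk factors looks like: a logistic regression that weighs several factors at once against a binary outcome. The subjects, factor values, and coefficients here are invented placeholders for illustration only; they are not Framingham data, and the code is not the NHI statisticians’ actual method.)

```python
# Illustrative sketch of "multivariate analysis" in the Framingham sense:
# estimating the relative weight of several coexisting risk factors in
# predicting a binary outcome with a multivariable logistic regression.
# All subjects, values, and coefficients below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical risk factors for n synthetic subjects.
sbp = rng.normal(140, 20, n)             # systolic blood pressure, mmHg
chol = rng.normal(220, 40, n)            # serum cholesterol, mg/dL
cigs = rng.choice([0, 10, 20], size=n)   # cigarettes smoked per day

# Toy "true" relationship used only to generate outcomes (True = event).
logit = -12 + 0.04 * sbp + 0.02 * chol + 0.05 * cigs
events = rng.random(n) < 1 / (1 + np.exp(-logit))

# Fit all factors jointly, so each coefficient reflects that factor's
# contribution with the other factors held constant.
X = np.column_stack([sbp, chol, cigs])
model = LogisticRegression(max_iter=1000).fit(X, events)

for name, coef in zip(["systolic BP", "cholesterol", "cigarettes/day"],
                      model.coef_[0]):
    print(f"{name:>15}: {coef:+.4f} per unit")
```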

  After nearly ten years of closely monitoring approximately fifty-two hundred patients, Framingham researchers published a key paper in 1957 (out of the nearly three thousand produced to date) showing that patients with high blood pressure had a nearly fourfold increase in the incidence of coronary heart disease. A few years later, hypertension was also shown to be a major cause of stroke. Referring to President Roosevelt’s premature death, Framingham scientists commented on the “mounting evidence that many of the commonly accepted beliefs concerning hypertension and its cardiovascular consequences may be in error.” Even Dr. Bruenn, Roosevelt’s cardiologist, wrote, “I have often wondered what turn the subsequent course of history might have taken if the modern methods for the control of hypertension had been available.”

  Later Framingham publications identified additional coronary risk factors, including diabetes and high serum cholesterol. One paper found that nearly one in five heart attacks present with sudden death as the first and only symptom, a discovery that ratified the tremendous fear that millions of Americans were living with. By the early 1960s, a definitive association had also been made between cigarette smoking and heart disease. (Smokers in previous studies hadn’t lived long enough to draw definitive conclusions.) This led to the first surgeon general’s report detailing the health hazards of smoking. In 1966, the United States became the first country to require warning labels on cigarette packages. Four years later, primarily because of Framingham, President Nixon signed legislation banning cigarette ads on television and radio, one of the great public health triumphs of the second half of the twentieth century.

  The Framingham study was nearly shut down in the late 1960s for lack of funding. There was no dearth of events—assassinations, riots, civil rights protests, and the Vietnam War—to occupy policy makers, and an epidemiological study in a small town in Massachusetts hardly seemed to warrant much attention. So, Framingham investigators went around the country trying to raise private money. Donors included some unexpected contributors, including the Tobacco Institute and the Oscar Mayer Company, which manufactured luncheon meats. In the end, only after President Nixon’s personal physician, the cardiologist Paul Dudley White, lobbied for the study was federal support revived.

  The Framingham study shifted the focus of medicine from treating cardiovascular disease to preventing it in those at risk. (Indeed, the term “risk factor” was introduced by Framingham researchers in 1961.) In 1998, while I was still in medical school, Framingham researchers published a formula, based on the major independent cardiac risk factors that had been identified—family history, smoking, diabetes, high serum cholesterol, and hypertension—to calculate a patient’s risk of getting heart disease within ten years. (This is the formula I used after my first CT scan showing that I had developed coronary plaque.) Today we know that programs that target such risk factors improve public health. For example, a recent twelve-year study of 20,000 Swedish men showed that almost four out of five heart attacks could be prevented through Framingham-inspired lifestyle changes, such as a healthy diet, moderate alcohol consumption, no smoking, increased physical activity, and maintaining a normal body weight. Men who adopted all five changes were 86 percent less likely to have a heart attack than those who did not. An earlier study of about 88,000 young female nurses found that participants who followed a healthy lifestyle—didn’t smoke, had normal body weight, exercised at least two and a half hours each week, had moderate alcohol consumption, followed a healthy diet, and watched little television—had almost no heart disease after twenty years of follow-up.
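  (A rough sketch of how such a risk equation is typically structured appears below: each factor contributes a weighted term to a score, and the score is converted into a ten-year probability. The weights and baseline values are arbitrary placeholders chosen for illustration; they are not the published Framingham coefficients, and the function is not the calculator I used.)

```python
# Sketch of a Framingham-style ten-year risk equation (structure only).
# The weights and baseline values are placeholders, NOT published estimates.
import math

# Placeholder weights for a few classic risk factors.
WEIGHTS = {
    "age": 0.05,               # per year
    "systolic_bp": 0.015,      # per mmHg
    "total_cholesterol": 0.01, # per mg/dL
    "smoker": 0.6,             # 1 if current smoker, else 0
    "diabetes": 0.5,           # 1 if diabetic, else 0
}
BASELINE_SCORE = 8.0       # placeholder average score in the population
BASELINE_SURVIVAL = 0.95   # placeholder ten-year event-free probability

def ten_year_risk(age, systolic_bp, total_cholesterol, smoker, diabetes):
    """Return an illustrative ten-year risk (0..1) from a Cox-style equation."""
    score = (WEIGHTS["age"] * age
             + WEIGHTS["systolic_bp"] * systolic_bp
             + WEIGHTS["total_cholesterol"] * total_cholesterol
             + WEIGHTS["smoker"] * smoker
             + WEIGHTS["diabetes"] * diabetes)
    return 1 - BASELINE_SURVIVAL ** math.exp(score - BASELINE_SCORE)

# Example: a 55-year-old smoker with BP 150 and cholesterol 240.
print(f"{ten_year_risk(55, 150, 240, smoker=1, diabetes=0):.1%}")
```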

 
