Monoculture: How One Story is Changing Everything


by F. S. Michaels


  Even so, doctors weren’t always respected. In Roman times, doctors were decidedly low in status: they were slaves, freedmen, or foreigners. Until as late as 1745, surgeons were considered craftspeople who belonged to the same guild as barbers since both worked with their hands. A medical journal of the time remarked that when a promising young man chose to become a doctor, “the feeling among the majority of his cultivated friends is that he has thrown himself away.”2

  In the 1800s in England, doctors hovered around the edges of the gentry, trying to look and act like the upper class since professional success was about having the right aristocratic patrons and displaying the right social graces. In America, the aristocracy didn’t exist, so medical schools and societies were launched, often by doctors themselves, to bolster the status of the profession. At the same time, legislation was enacted that controlled who could and couldn’t open a medical practice.

  As a result, in the 1800s, being a doctor was a hard way to make a living. Americans were wary of medical authority. Doctors didn’t have stores of medical knowledge or techniques to pull from, and most families, isolated in rural areas with low incomes, could only afford to call a doctor if the situation was desperate. Doctors charged for mileage on top of the fee for a medical visit, and five to ten miles of travel meant the travel fee could be four or five times as high as the visitation fee. They ended up working long hours and traveling long distances to see patients. The image we still have today of the dedicated, selfless doctor comes from that era of medicine.3

  During the Industrial Revolution, work that used to be done at home started moving into the factories, making it harder for family members to care for the sick at home. As steamboats and railways were built, cities began to develop. Better mobility meant that family members were more spread out than they had been, so they weren’t always available to care for the sick. As cities grew, property values also started to rise, and many families could only afford to live in apartments, which left less space at home to care for the sick. More people were also living alone in cities, which meant the need for hospitals was growing along with the demand for doctors. At one time, few people used hospitals voluntarily because of the risk of infection; hospitals were more about charity than medical expertise and most were run by religious orders where nuns, doctors and nurses volunteered their time to care for the sick. You went to the hospital to die, or when you didn’t have family or friends to care for you. If you were sick, you were simply safer at home.4

  At the same time, doctors were also becoming more mobile. The invention of the telephone meant patients could call the doctor instead of sending for him, and the invention of the car meant doctors could reach patients faster; doctors were among the first car buyers. As doctors began to travel farther and faster, they saw more patients, increasing from an average of five to seven patients a day in the mid-1800s, to 18 to 22 patients a day by the early 1940s. As travel costs went down, medical care became more affordable. Doctors became more accessible, and people became more dependent on their services.5

  Still, in 1900, medical practice was unsophisticated. New ideas were slow to be adopted. Most surgeons still used their bare hands when operating, and few pharmaceutical drugs existed. A medical education meant you’d sat through two years of mostly lectures at one of over 150 schools, many of which were for-profit and had low entrance standards.

  Then medical knowledge started to grow. From the early 1900s to the early 1940s, x-rays, ECGs, and the four major blood groups were discovered, along with insulin, sulfa, penicillin and anaesthetics. Doctors became a symbol of healing. The growing demand for medical care meant that doctors could afford to give up lower-paying services and focus on higher-paying, more complex services that involved things like diagnostic labs, radiology, and surgical suites. Those complex services were often offered in hospitals now that medicine had advanced to the point where a doctor’s expertise no longer fit into a black bag, and where the services offered were too expensive to be maintained in every doctor’s office.6

  As urbanization shifted care of the sick from families and neighbors to doctors and hospitals, health care became a commodity, something that was bought and sold. At the same time, though, medicine wasn’t thought of as just another thing for sale. It was regulated because it dealt with serious issues like the relief of human suffering. Bad health care could have drastic consequences like disability or death, and most people who needed medical help weren’t in a position to evaluate the kind of help they were getting.

  The buying and selling of health care was also softened by the ideals that dominated the culture of the medical profession.7 In 1934, the ethics code of the American Medical Association (AMA) said non-doctors (outside investors) profiting from medical work was “beneath the dignity of professional practice, is unfair competition within the profession at large, is harmful alike to the profession of medicine and the welfare of the people, and is against sound public policy.”8

  Before World War II, then, medicine was a cottage industry financed mostly by wealthy patients and philanthropists. Not enough medical technology existed to support a health manufacturing industry, and the government was uninvolved in health care other than via licensing and tax laws. In 1946, most American citizens were uninsured and paid for medical services out of their own pockets, or sometimes paid in kind. But in 1946, medicine was also viewed as a profession, not a business. A patient’s medical needs, by and large, were put ahead of a doctor’s financial gain.

  After the Second World War, funding that had gone to the atom bomb was redirected to medical research, and in the 1950s and ’60s, major advances were made in surgery, radiation, chemotherapy, organ transplants, and tranquilizers. Medical knowledge had now grown too large for a single doctor to learn during training, and doctors increasingly began to specialize. In 1923, 11 percent of American doctors were specialists; in 1989, over 70 percent were. Specialists were paid more than generalists and enjoyed more prestige, but specialization also meant that a doctor’s once-holistic view of you as a patient became fragmented, and personalized medical care started to fade.9

  With the rise of new medical technology, along with specialization, insurance coverage, and unregulated payments for doctors’ fees, medicine started looking attractive to outside investors. In the late 1960s and early 1970s, Wall Street started investing in for-profit health care facilities like investor-owned hospitals, nursing homes, home care, labs, and imaging services.10 After an advertising ban in medicine was lifted, doctors and hospitals started advertising their services. Where open and public competition between doctors and hospitals had once been considered unethical and unprofessional, advertising now made that competition public, which strained collegiality.11

  As investors started showing interest in health care, medical costs started to spiral due to inflation, growing research expenses, rising doctors’ fees, higher hospital costs, more health benefits for employees, and an aging population (medical advances had lengthened our lives but now we faced the complications of chronic disease which we just hadn’t survived to experience before). Malpractice suits were also rare until the twentieth century, when a growing number of lawsuits created “defensive medicine”: doctors did everything they possibly could in a medical situation to avoid being sued for negligence.12

  Technological advances in medicine were also proving expensive. Though new technology usually pays for itself because machines replace workers, in medicine that didn’t happen. Instead, medical advances involving complicated equipment and procedures required additional experts to be trained in the technology, raising costs rather than lowering them.13

  The market was presented as a solution to all of these problems. The economic story says that a health care market will bail the government out of health care support it can no longer afford. Medicine started taking on the management practices of large businesses, and industrialization techniques were applied in the field. Private capital became a major player in the system, and much of the money was tied up in insurance companies and manufacturers of health technology. For-profit health services appeared in home care, kidney dialysis centers, care centers, and hospitals. Multinational health care companies grew and were said to be “to the old ‘doctor’s hospitals’ what agribusiness is to the family farm.”14 An original $8 share in Humana, a multinational health care company, in 1968 was worth $336 by 1980; investments in hospital systems during those years returned almost 40 percent more in earnings than the average for other industries.15

  In the early decades of the twenty-first century, health care is a multi-billion dollar industry. Medical schools now offer joint MD-MBA degrees and business school graduates hold top positions in medical organizations, even though as recently as 1978, doctors weren’t expected to understand health care financing and organization.16 Managers of both not-for-profit and for-profit hospitals, who earn salaries as hefty as those in the private sector, are rewarded based on the net income of the hospital, and hospital CEOs or presidents “are clearly accountable to their boards as business experts.”17 Health care policies are laid out by business school professors and economists.18

  Arnold Relman, former editor-in-chief of the New England Journal of Medicine and the man who coined the term medical-industrial complex, says the most important socioeconomic change in a hundred years of American health care is the movement from “a professional service for the sick and injured into one of the country’s largest industries” — a transformation of health care from the compassionate relief of suffering to a profit-oriented business. Relman admits, “I am not saying that business considerations were never a part of the medical profession…or that physicians were in the past unconcerned about their income…But the commitment to serve patients’ medical needs (as well as the needs of public health) and the special nature of the relation between doctor and patient placed a particularly heavy obligation on physicians that was expected to supersede considerations of personal gain — and usually did.”19

  Biomedical ethicist Callahan agrees. “[T]here is an enormous difference,” he says, “between a discipline and a profession whose practitioners do not resist the personal good life when it comes their way, and one which has that life as its purpose.”20 Paul Starr, a Pulitzer Prize-winning sociologist, adds, “The contradiction between professionalism and the rule of the market is long-standing and unavoidable. Medicine and other professions have historically distinguished themselves from business and trade by claiming to be above the market and pure commercialism. In justifying the public’s trust, professionals have set higher standards of conduct for themselves than the minimal rules governing the marketplace.”21

  Back in the 1960s, the norm as a doctor, according to the AMA, was to limit your professional income to “reasonable levels” because the “charging of an excessive fee is unethical…[the] fee should be commensurate with…the patient’s ability to pay…”22 Health care was considered to be about need, not someone’s ability to pay, since health care dealt in quality of life as well as life itself.23

  The economic story, on the other hand, says health care services are products, hospitals and doctors are sellers, and you as a patient, your government, and your insurance company are buyers. Your doctor is an entrepreneur competing with other doctors for your business. As a business, the health care industry promotes ever-changing products, “medicalizes” problems by advertising all kinds of conditions, stimulates interest in cures, builds consumer demand, and tries to get you out to the doctor more. Doctors can now be hired by insurers, which creates a conflict of interest between you and your doctor; an insurer typically wants to pay out as little as possible, so your doctor is caught in the middle, wanting to do what’s medically necessary for you as a patient while being aware that his or her employer is eyeing the cost.24

  In short, health care as a business is profoundly different from health care as a profession. Health care as a profession was founded on the relationship between you and your doctor; you trusted that your doctor was acting in your best interests.25 But by the 1990s, that trust was starting to erode. In the United States, hospitals were commonly paid a lump sum per patient by insurance programs, and so could grow their profits by keeping their costs down; doctors could order fewer tests, hospital workers could be paid less, and patients who were critically ill could even be shipped to other hospitals, thereby creating a financial incentive for hospitals to treat the least sick and discharge them as fast as possible, keeping turnover high. Simply, it was more profitable to keep people out of the hospital than in. By 1997, over one-third of hospital revenues in America were realized from outpatient services.26

  Conflicts of interest were also being pointed out in medical research. Before the economic story spread, research was supposed to be performed by disinterested parties according to the traditional norms of science. The Journal of the American Medical Association (JAMA), a leading publisher of medical research, saw its industry-financed research submissions drop 21 percent after it instituted a policy that required that data in company-sponsored medical trials be independently verified by university researchers. Still, since medical journals rely on corporate dollars (companies buy reprints of articles that support their products), JAMA “could face significant financial pressure to abandon the policy.”27 Another study released by JAMA’s editors found that in 2008, six of the top medical journals published a significant number of articles that were ghostwritten; other studies have shown that medical ghostwriters, whose work is hidden behind academic authors, are often sponsored by drug or medical device companies.28

  That conflict of interest was also spreading to the medical classroom. In 2009, Harvard Medical School students questioned the influence of pharmaceutical companies in what was being taught, pressing for faculty to disclose their industry ties after pharmacology students discovered that a professor promoting cholesterol drugs and disparaging students’ questions about side effects was a paid consultant to 10 drug companies, half of which produced cholesterol treatments. One Harvard Medical School student, then 24 years old, admitted, “We are really being indoctrinated into a field of medicine that is becoming more and more commercialized.”29

  A national survey of physicians published in the New England Journal of Medicine in 2007 found that 94 percent have “a relationship” with the pharmaceutical, medical device, or other related industries. That figure has contributed to concerns about how financial ties affect doctors’ prescribing habits and has led to calls for transparency regarding the financial relationship between doctors and medical industries.30

  As a consumer of medical services, in the economic story you are free to enter or exit the medical market as you please. You are, the story says, a knowledgeable buyer, and when the price of a health product or service gets too high, you are less likely to want to buy it. Most of us, though, don’t choose to be sick, and medical professionals say you are anything but knowledgeable because you don’t have their extensive training. And if your life is at stake, chances are you will still buy whatever medical care you can get, no matter how much it costs; by 2007, over 60 percent of all U.S. bankruptcies were related to medical expenses.31

  The economic story says that in the market, no one is dependent; as a buyer, you have a free choice of sellers, and as a seller, you have a free choice of buyers, so no unevenness in power is involved.32 In reality, as a patient, you are heavily dependent on your doctor, and if you’re sick, it’s hard to “shop around” for a better deal like you’re supposed to as a consumer. Finally, when the trust between you and your doctor begins to erode because you begin to suspect that your doctor, as a rational economic individual, is looking out for his or her own interests instead of yours, who is left to advocate for you in the health care system?

  What do we gain in return for allowing the economic story in medicine and creating a medical marketplace, a health care industry? Do we enjoy better health or better health care? Relman doesn’t think so. He points out that almost all of the reliable research points toward higher overhead and administrative costs in for-profit health care facilities than in not-for-profit facilities, and that the health service in those for-profit facilities is equal to that of non-profit facilities — or worse.33

  Callahan points out that the market model of health care will never encourage us to use less medical care, will never put limits on our desire for ever better health, and will never limit the development and use of health care technology, no matter how expensive it becomes or how incremental the health gains might be. The economic story will never encourage us to accept our own inevitable aging and death. Instead, the economic story in medicine orients us away from all of that, keeps us struggling for ever-longer life through advances in medical technology that simultaneously produce billions of dollars for the medical industry.34

  The economic story orients us not just physically, but spiritually, in matters of religious faith. Faith, said Wilfred Cantwell Smith, a scholar of comparative religions, is part of the human quest for transcendence. Faith is an orientation toward oneself, others, and the world, “a total response; a way of seeing whatever one sees and of handling whatever one handles; a capacity to live at more than a mundane level; to see, to feel, to act in terms of, a transcendent dimension.”35 Adhering to a certain religion is considered to be an expression of faith. In the United States, one of the most religious countries in the world, roughly 80 percent of Americans do just that, and identify themselves as Christians.36

  Historically, Christianity is the dominant religion of the Western world. As a religion, Christianity encompasses specific beliefs and ideas. It’s also an umbrella term that covers a variety of groups, some of which are convinced the others are misled at best and heretical at worst. Roughly 600 years after Christianity was made the official religion of Rome, the Christian church split into the Roman Catholic Church in the Western world and the Eastern Orthodox Church in the East. Then, during the Protestant Reformation, the Roman Catholic branch split again into Roman Catholicism and Protestantism. Across these centuries, says religious scholar Diana Butler Bass, Christian beliefs and values changed with the times, in distinct historical periods.37

 
