The War on Normal People: The Truth about America’s Disappearing Jobs and Why Universal Basic Income Is Our Future

by Andrew Yang


  I have two little boys and am not eager to see them become antisocial homebody zombies trying to set new high gaming scores. Yet, in observing parents interact with their children, I can see how easily it happens. Parents know there are endless hours to fill. Change won’t happen without some regulation, because the gaming and social media companies, many of them publicly traded, have strong financial incentives to maximize engagement.

  THE NEW CITIZENSHIP

  A renewal of citizenship and humanity will require a different experience at the user level. What do I mean by that? The state has a few big responsibilities. Keeping people healthy and educating them are two of the main ones, which we’ll turn to shortly.

  Another aspect of citizenship is a sense of belonging and commonality. Most Americans have less and less exposure to those in other walks of life as we increasingly diverge into rural and urban enclaves. This leads to increasingly fraught politics as gaps become harder to bridge. Many of my friends advocate service year opportunities to foster more of a sense of unity. One idea is instituting an American Exchange Program or Citizenship Trip, during which all graduating high school seniors go on a month-long trip to several different parts of the country, staying with host families, with the trip paid for by the government. They would volunteer for a local organization and participate in programming with 24 other high school graduates from diverse regions and backgrounds. The 25 young people would get to know each other in structured yet personal ways. It could be run by the top-rated schoolteachers and professors in the region each August and take place at high schools or community colleges. There would be some required programming on the basics of citizenship and civic investment.

  Afterward, everyone would have at least a few friends from vastly different backgrounds. Young people have the potential to develop significant relationships in short periods of time in the right context. It would permanently alter our politics by making it impossible to cast other Americans as anything other than fellow citizens who want better lives for themselves and their loved ones.

  People can tell when you actually invest in them—it’s one reason all of the high-end companies conduct elaborate trainings. Done well, the American Exchange Program would give people more of a reason to explore other parts of the country and maybe even move someplace different if an opportunity calls for it. It would open minds and hearts.

  In order for our society to prosper through the automation wave, the state must become a newly invigorated force. Citizenship must grow to mean something again. And we have to make clear that we value people intrinsically, independent of any qualities or qualifications.

  TWENTY-ONE

  HEALTH CARE IN A WORLD WITHOUT JOBS

  As jobs disappear and temporary employment becomes more prevalent, reforming our health care system will be more and more crucial. Right now, many of us rely upon our employers to pay for and provide health insurance, in whole or in part. This will be increasingly difficult to sustain as jobs with benefits become harder and harder to come by. On the consumer side, spiraling health care costs have already become a crushing burden for Americans. Health care bills were the number one cause of personal bankruptcy in 2013, and a study that year found that 56 million Americans—over 20 percent of the adult population—struggled with health care expenses they couldn’t afford to pay. We’ve all seen and heard the horror stories of people coming back from the hospital with a bill for tens of thousands of dollars. For many Americans, getting sick is a double whammy—you not only have to deal with the illness or injury, but you also have to figure out how to pay for treatment.

  I worked at a health care software startup based in New York from 2002 to 2005. I was 27 years old at the time. Our CEO was a talented former physician named Manu Capoor. We were one of the early electronic medical records companies that specialized in taking paper info and digitizing it. Our niche was presurgical info, so our clients were large hospitals that hosted a lot of surgeries. I was the head of client engagement—I led small teams that rolled out our software to clerks, doctors’ offices, secretaries, nurses, office managers, residents, anesthesiologists, and the occasional surgeon. It took a while to train the dozens or hundreds of people who could be touching a particular patient file—we were modifying behavior from paper to digital. I would spend weeks and months in urban hospitals in the Bronx, Morningside Heights, or West Palm Beach distributing usernames and passwords, training people, troubleshooting, and answering the occasional angry phone call. I hung out outside operating rooms at seven a.m. because surgeons like to start early, and I ate at the IHOP across the street from the UMDNJ hospital in Newark so many times that I still can’t set foot in one today.

  Although we passionately believed in the benefits of our product, we found that it was difficult to make what seemed like a pretty simple and straightforward process change. There were many reasons for this, not the least of which was that the hospitals had a limited ability to police the behavior of doctors’ offices. The surgeons were in charge; their procedures were the big money generators. Each surgeon’s office was its own business with different systems and practices. Some doctors liked to invest in technology and people, while others were clearly happy spending very little in order to maximize profitability. They just wanted to come in 3.5 days a week and get to their golf courses or boats as fast as possible. It was like a fast-moving assembly line, filled with people scrambling to get through each day, with little accountability or incentive to improve.

  We used to joke around as the months wore on that “health care is where good ideas go to die.” We never did accomplish our heady goals, as adoption was painstaking and difficult. I left after four years, having helped build a client base of about a dozen hospitals. I learned firsthand how even a change that made sense and should have made things more efficient would be slow going in the health care industry. There was no real reason for hospitals to change.

  I know my experiences in the early 2000s have played out over and over again for others who have tried to improve the health care system with technology. In general, the use of technology has not transformed health care the way that optimists would hope. Health care costs have continued to climb to a record 17.8 percent of the economy in 2016, up from 11.4 percent in 1989 and less than 6 percent in 1960. We spend about twice what other industrialized countries do on health care per capita, with worse results. According to a 2014 Commonwealth Fund report, we are last among major industrialized nations in efficiency, equity, and health outcomes attributable to medical care despite spending much more than anyone else. Another study had the United States last among developed countries in basic measurements such as the rate of women dying due to pregnancy or childbirth and the rate of survival to age five. To the extent that new technology is used, it tends to be expensive new devices and implants that drive costs ever higher. The basic practice of medicine, as well as the training, is the same as it’s been for decades.

  Our job-based health insurance system does the very thing we most want to avoid—it discourages businesses from hiring. I’ve now run a couple of companies, and if I hire a full-time entry-level worker in New York at $42,000, I have to factor in an additional $6,000 for health care insurance costs. For employers, company-subsidized health insurance costs are a major impediment to hiring and growth. The costs get a lot higher for senior people with families—my last company was spending more than $2,500 a month on certain people’s insurance plans. If these costs weren’t on our books we definitely would have hired more people.

  Health insurance also pushes companies to make as many employees as possible into part-time gig workers or contractors. The organizations I ran were generous—my education company made instructors who worked more than 20 hours a week full-time employees and provided benefits accordingly. This was highly unusual in our industry and very expensive—we could do it because we were growing and profitable, and it was always important to me to take care of people. For many companies, insurance costs are increasingly out of control, and they can make or break a business. It’s very difficult to pass increased costs on to employees or take back benefits after they’ve been provided, so you’re setting yourself up for increased costs in good times and bad.

  On the worker side, I know tons of people who hang on to jobs that they do not want to be in just for the health insurance. Economists refer to this as “job lock”; it makes the labor market much less dynamic, which is bad in particular for young workers. Replacing health insurance is a major source of discouragement for people striking out on their own and starting a new business, especially if they have families. In a world where we’re trying to get more people to both create jobs and start companies, our employer-based health insurance system serves as a shackle holding us in place and a reason not to hire.

  As jobs disappear, having one’s health care linked to employment will become increasingly untenable. The need for a different approach is growing.

  Health care is not truly subject to market dynamics for a host of reasons. In a normal marketplace, companies compete for your business by presenting different value propositions, and you make an informed choice. With health care, you typically have only a few options. You have no idea what the real differences are between different providers and doctors. Costs are high and extremely unpredictable, making it hard to budget for them. The complexity leaves many Americans overwhelmed and highly suggestible to experts or institutions. When you actually do get sick or injured, you become cost-insensitive, just trying to get well. Hospitals often employ opaque pricing, resulting in patient uncertainty over what their insurance will actually cover. Moreover, when you’re ill, your faculties may be impaired by the illness itself, emotional distress, or even unconsciousness.

  As Steven Brill wrote in his seminal Time magazine article on health care costs, “Unless you are protected by Medicare, the health care market is not a market at all. It’s a crapshoot.” The lack of real market discipline or cost control incentives has driven costs ever higher. Technology that should decrease costs has been kept at the door, because for most actors in the system, the goal is to increase revenue and profitability. The more services, tests, appointments, procedures, and expensive gadgets you use, the better. The system rewards activity and output over health improvements and outcomes.

  Changing these incentives is key. The most direct way to do so would be to move toward a single-payer health care system, in which the government both guarantees health care for all and negotiates fixed prices. Medicare—the government-provided health care program for Americans 65 and over—essentially serves this role for senior citizens and has successfully driven down costs and provided quality care for tens of millions. Most everyone loves Medicare—it’s politically bulletproof. Sam Altman, the head of Y Combinator, suggests rolling out Medicare across the population by gradually lowering the eligibility age over time. A gradual phase-in would give the industry time to plan and adjust. This is an excellent way forward, and a “Medicare-for-all” movement is currently gathering steam. There would inevitably remain a handful of private options for the super-affluent, but most everyone would use the generalized care.

  One harsh reality is that any rationalization of health care costs will hit tons of resistance because it’s going to reduce a lot of people’s incomes. Dean Baker, co-director of the Center for Economic and Policy Research, has written about the high cost of health care, including doctor salaries. “We do waste money on insurance, but we also pay basically twice as much for everything,” he writes. “We pay twice as much to doctors. Would single-payer get our doctors to accept half as much in wages?” Moving toward a single-payer system, Baker says, would mean “fights with all of these powerful interest groups.”

  At least some doctors have been voicing their discontent with the current arrangement that puts money and efficiency over time spent with patients. Dr. Sandeep Jauhar, a cardiologist and author, writes that doctors today see themselves not as “pillars of any community” but as “technicians on an assembly line” or “pawn[s] in a money-making game for hospital administrators.” Jauhar notes that only 6 percent of doctors “described their morale as positive” in a 2008 survey, and most are pessimistic about the future of the medical profession.

  A 2016 survey of American doctors by the Physicians Foundation found that 63 percent have negative feelings about the future of the medical profession, 49 percent said they often or always experience feelings of burnout, and 49 percent would not recommend a career in medicine to their children. The same survey found that excessive paperwork and regulation was a consistent burden, with only 14 percent of doctors believing they had enough time to provide patients the highest quality of care. Almost half were planning on retiring, taking a nonclinical position, going part-time, or reducing their patient hours due to various frustrations. The low amount of time spent per patient makes doctors unhappy, cuts patients short, and drives up costs. Jauhar notes that many doctors work at “hyperspeed” and call in specialists just to “cover their ass” in case they missed something, resulting in ever more tests and costs.

  When I went to Brown in the mid-1990s, about half of the people around me were pre-med. I remember how hard they all studied for organic chemistry, which was the weed-out class that separated the people who were going to go on to med school from those who were going to have to rethink their ambitions. Many people who wanted very badly to be doctors didn’t make it. I remember one friend in particular being crushed by the realization that she wasn’t going to fulfill her childhood ambition. None of my friends who are doctors today actually use organic chemistry for anything. It was just intended to make things really difficult.

  Becoming a doctor is a climb through a very competitive and hierarchical system. You study to get a high GPA in your pre-med coursework, take the MCAT, spend a summer caddying for a doctor or researcher, compete in med school to graduate with honors, apply to match with a desirable residency, then pursue the right internship and fellowship. At every level, the people become smarter and smarter. Some specializations take as many as six years after medical school, or 10 full years after college. Different specialties take on different personalities—the anesthesiologists are mellow, the orthopedic surgeons are jocks, the pediatricians love children, and so on. The amount of money you make largely corresponds to how many years you spend in specialized training. Family medicine doctors make about $200,000 a year on average, while orthopedic surgeons make more than $500,000 per year. The average educational debt load for a medical school graduate is $180,000, with 12 percent of doctors owing a whopping $300,000 for their training.

  In part as a result of this system, there’s a national shortage of both primary care doctors and doctors who practice in rural areas. About 65 million Americans live in what one expert called basically “a primary care desert.” The Association of American Medical Colleges estimated that the number of additional doctors necessary to provide appropriate care to underserved areas was 96,200 in 2014, with a gap of about 25,000 in primary care alone. Many states are offering grants and incentives to address doctor shortages, as 12 states have fewer than half the number of primary care doctors necessary to provide adequate coverage. After all of the competition, schooling, and debt, many doctors don’t want to sign up for less pay and prestige to work in underserved areas.

  The process is also not selecting people for empathy. Most medical schools apply a mechanical screen to determine whom to interview based solely on college GPA, course of study, and MCAT score. Though some schools say they are trying to identify applicants who display various personal traits, we’re still talking about 21,030 people per year attending med school who studied science and did well on the MCAT—a very restricted group of people.

  Martin Ford, the author of Rise of the Robots, suggests that we create a new class of health care provider armed with AI—college graduates or master’s students unburdened by additional years of costly specialization, who would nonetheless be equipped to head out to rural areas. They could help people monitor chronic conditions like obesity and diabetes and refer particularly hairy problems to more experienced doctors. Call them primary care specialists. AI will soon be at a point where technology, in conjunction with a non-doctor, could offer the same quality of care as a doctor in the vast majority of cases. In one study, IBM’s Watson made the same recommendation as human doctors did in 99 percent of 1,000 medical cases and made suggestions human doctors missed in 30 percent of them. AI can reference more cases than the most experienced physician while keeping up to date with the latest journals and studies.

  Predictably, doctors have lobbied against nurse practitioners and unsupervised residents seeing patients, and they would doubtless feel even more negatively about this new class of primary care specialist. But this change would make health care much more widely available, open up a new employment category for smart and empathetic college graduates who genuinely want to spend time with patients, and eventually lighten the time burden on individual physicians.

  This brings us back to how to implement a new single-payer system. We need to do more than rationalize current costs—we need to transform the way that doctors get paid.

  Adopting Medicare-for-all or a single-payer system will solve the biggest problems of rampant overbilling and ever-increasing costs. But Medicare still generally reimburses based on individual appointments, procedures, and tests, which maintains the incentive for doctors to do more in order to get paid more. There is a movement toward “value-based” or “quality-based” reimbursement, which tries to measure patient outcomes, readmission rates, and the like and reward providers accordingly. One startup based in Maryland, Aledade, is having success by giving primary care doctors incentives to reduce costs. But these “pay for performance” plans are tough to measure, affect only a small proportion of the funds providers currently receive, and have had mixed results.

 
