The Accidental Theorist: And Other Dispatches from the Dismal Science

by Paul Krugman


  This trend should have been obvious even in 1996. After all, even then America’s richest man was Bill Gates, a college dropout who didn’t seem to need a lot of formal education to build the world’s most powerful information technology company.

  Or consider the panic over “downsizing” that gripped America in 1996. As economists quickly pointed out, the rate at which Americans were losing jobs in the nineties was not especially high by historical standards. Why, then, did downsizing suddenly become news? Because for the first time white-collar, college-educated workers were being fired in large numbers, even while skilled machinists and other blue-collar workers were in high demand. This should have been a clear signal that the days of ever-rising wage premia for people with higher education were over, but somehow nobody noticed.

  Eventually, of course, the eroding payoff to higher education created a crisis in the education industry itself. Why should a student put herself through four years of college and several years of postgraduate work in order to acquire academic credentials with hardly any monetary value? These days jobs that require only six or twelve months of vocational training—paranursing, carpentry, household maintenance (a profession that has taken over much of the housework that used to be done by unpaid spouses), and so on—pay nearly as much as one can expect to earn with a master’s degree, and more than one can expect to earn with a Ph.D. And so enrollment in colleges and universities has dropped almost two-thirds since its turn-of-the-century peak. Many institutions of higher education could not survive this harsher environment. The famous universities mostly did manage to cope, but only by changing their character and reverting to an older role. Today a place like Harvard is, as it was in the nineteenth century, more of a social institution than a scholarly one—a place for the children of the wealthy to refine their social graces and make friends with others of the same class.

  The celebrity economy. The last of this century’s great trends was noted by acute observers in 1996, yet somehow most people failed to appreciate it. Although business gurus were proclaiming the predominance of creativity and innovation over mere routine production, in fact the growing ease with which information could be transmitted and reproduced was making it ever harder for creators to profit from their creations. Today, if you develop a marvelous piece of software, by tomorrow everyone will have downloaded a free copy from the Net. If you record a magnificent concert, next week bootleg CDs will be selling in Shanghai. If you produce a wonderful film, next month high-quality videos will be available in Mexico City.

  How, then, can creativity be made to pay? The answer was already becoming apparent a century ago: Creations must make money indirectly, by promoting sales of something else. Just as auto companies used to sponsor Grand Prix racers to spice up the image of their cars, computer manufacturers now sponsor hotshot software designers to build brand recognition for their hardware. And the same is true for individuals. The royalties the Four Sopranos earn from their recordings are surprisingly small; mainly the recordings serve as advertisements for their arena concerts. The fans, of course, go to these concerts not to appreciate the music (they can do that far better at home) but for the experience of seeing their idols in person. Technology forecaster Esther Dyson got it precisely right in 1996: “Free copies of content are going to be what you use to establish your fame. Then you go out and milk it.” In short, instead of becoming a Knowledge Economy we have become a Celebrity Economy.

  Luckily, the same technology that has made it impossible to capitalize directly on knowledge has also created many more opportunities for celebrity. The 500-channel world is a place of many subcultures, each with its own culture heroes; there are people who will pay for the thrill of live encounters not only with divas but with journalists, poets, mathematicians, and even economists. When Andy Warhol predicted a world in which everyone would be famous for fifteen minutes, he was wrong: If there are indeed an astonishing number of people who have experienced celebrity, it is not because fame is fleeting but because there are many ways to be famous in a society that has become incredibly diverse.

  Still, the celebrity economy has been hard on some people—especially those of us with a scholarly bent. A century ago it was actually possible to make a living as a more or less pure scholar: Someone like myself would probably have earned a pretty good salary as a college professor, and been able to supplement that income with textbook royalties. Today, however, teaching jobs are hard to find and pay a pittance in any case; and nobody makes money by selling books. If you want to devote yourself to scholarship, there are now only three options (the same options that were available in the nineteenth century, before the rise of institutionalized academic research). Like Charles Darwin, you can be born rich, and live off your inheritance. Like Alfred Wallace, the less fortunate co-discoverer of evolution, you can make your living doing something else, and pursue research as a hobby. Or, like many nineteenth-century scientists, you can try to cash in on scholarly reputation by going on the paid lecture circuit.

  But celebrity, though more common than ever before, still does not come easily. And that is why writing this article is such an opportunity. I actually don’t mind my day job in the veterinary clinic, but I have always wanted to be a full-time economist; an article like this might be just what I need to make my dream come true.

  1 Recent research has shown that the great majority of ulcers are the result of a bacterial infection.
