Antifragile: Things That Gain from Disorder


by Nassim Nicholas Taleb


  Scranton showed that we have been building and using jet engines in a completely trial-and-error experiential manner, without anyone truly understanding the theory. Builders needed the original engineers who knew how to twist things to make the engine work. Theory came later, in a lame way, to satisfy the intellectual bean counter. But that’s not what you tend to read in standard histories of technology: my son, who studies aerospace engineering, was not aware of this. Scranton was polite and focused on situations in which innovation is messy, “distinguished from more familiar analytic and synthetic innovation approaches,” as if the latter were the norm, which it is obviously not.

  I looked for more stories, and the historian of technology David Edgerton presented me with a quite shocking one. We think of cybernetics—which led to the “cyber” in cyberspace—as invented by Norbert Wiener in 1948. The historian of engineering David Mindell debunked the story; he showed that Wiener was articulating ideas about feedback control and digital computing that had long been in practice in the engineering world. Yet people—even today’s engineers—have the illusion that we owe the field to Wiener’s mathematical thinking.

  Then I was hit with the following idea. We all learn geometry from textbooks based on axioms, like, say, Euclid’s Elements, and tend to think that it is thanks to such learning that we today have these beautiful geometric shapes in buildings, from houses to cathedrals; to think the opposite would be anathema. So I speculated immediately that the ancients developed an interest in Euclid’s geometry and other mathematics because they were already using these methods, derived by tinkering and experiential knowledge; otherwise they would not have bothered at all. This is similar to the story of the wheel: recall that the steam engine had been discovered and developed by the Greeks some two millennia before the Industrial Revolution. It is just that things that are implemented tend to want to be born from practice, not theory.

  Now take a look at architectural objects around us: they appear so geometrically sophisticated, from the pyramids to the beautiful cathedrals of Europe. So a sucker problem would make us tend to believe that mathematics led to these beautiful objects, with exceptions here and there such as the pyramids, as these preceded the more formal mathematics we had after Euclid and other Greek theorists. Some facts: architects (or what were then called Masters of Works) relied on heuristics, empirical methods, and tools, and almost nobody knew any mathematics—according to the medieval science historian Guy Beaujouan, before the thirteenth century no more than five persons in the whole of Europe knew how to perform a division. No theorem, shmeorem. But builders could figure out the resistance of materials without the equations we have today—buildings that are, for the most part, still standing. The thirteenth-century French architect Villard de Honnecourt documents with his series of drawings and notebooks in Picard (the language of the Picardie region in France) how cathedrals were built: experimental heuristics, small tricks and rules, later tabulated by Philibert de l’Orme in his architectural treatises. For instance, a triangle was visualized as the head of a horse. Experimentation can make people much more careful than theories.

  Further, we are quite certain that the Romans, admirable engineers, built aqueducts without mathematics (Roman numerals did not make quantitative analysis very easy). Otherwise, I believe, these would not be here, as a patent side effect of mathematics is making people over-optimize and cut corners, causing fragility. Just look at how the new is increasingly more perishable than the old.

  And take a look at Vitruvius’ manual, De architectura, the bible of architects, written about three hundred years after Euclid’s Elements. There is little formal geometry in it, and, of course, no mention of Euclid, mostly heuristics, the kind of knowledge that comes out of a master guiding his apprentices. (Tellingly, the main mathematical result he mentions is Pythagoras’s theorem, amazed that the right angle could be formed “without the contrivances of the artisan.”) Mathematics had to have been limited to mental puzzles until the Renaissance.

  Now I am not saying that theories or academic science are not behind some practical technologies at all, directly derived from science for their final use (not for some tangential use)—what the researcher Joel Mokyr calls an “epistemic base,” or propositional knowledge, a sort of repository of formal “knowledge” that embeds the theoretical and empirical discoveries and becomes a rulebook of sorts, used to generate more knowledge and (he thinks) more applications. In other words, a body of theories from which further theories can be directly derived.

  But let’s not be suckers: following Mr. Mokyr would make one want to study economic geography to predict foreign exchange prices (I would have loved to introduce him to the expert in green lumber). While I accept the notion of epistemic base, what I question is the role it has really played in the history of technology. The evidence of a strong effect is not there, and I am waiting for someone to show it to me. Mokyr and the advocates of such a view provide no evidence that it is not epiphenomenal—nor do they appear to understand the implications of asymmetric effects. Where is the role of optionality in this?

  There is a body of know-how that was transmitted from master to apprentice, and transmitted only in such a manner—with degrees necessary as a selection process or to make the profession more respectable, or to help here and there, but not systematically. And the role of such formal knowledge will be overappreciated precisely because it is highly visible.

  Is It Like Cooking?

  Cooking seems to be the perfect business that depends on optionality. You add an ingredient and have the option of keeping the result if it is in agreement with Fat Tony’s taste buds, or fuhgetaboudit if it’s not. We also have wiki-style collaborative experimentation leading to a certain body of recipes. These recipes are derived entirely without conjectures about the chemistry of taste buds, with no role for any “epistemic base” to generate theories out of theories. Nobody is fooled so far by the process. As Dan Ariely once observed, we cannot reverse engineer the taste of food from looking at the nutritional label. And we can observe ancestral heuristics at work: generations of collective tinkering resulting in the evolution of recipes. These food recipes are embedded in cultures. Cooking schools are entirely apprenticeship based.
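  The optionality at work here is simply a convex payoff from trial and error: you keep the favorable outcome and discard the rest, so the downside of any single experiment is capped. A minimal sketch of that asymmetry, not from the book (the recipe quality score, the Gaussian tweaks, and the tinker function are all illustrative assumptions):

```python
import random

def tinker(steps=1000, optionality=True, seed=42):
    """Trial-and-error tinkering on a hypothetical recipe-quality score.

    Each step is a random tweak. With optionality we keep a tweak only
    if it improves the dish (the downside is capped at zero); without
    it we must live with whatever the tweak does.
    """
    rng = random.Random(seed)
    quality = 0.0
    for _ in range(steps):
        tweak = rng.gauss(0.0, 1.0)      # a random experiment, mean zero
        if optionality:
            quality += max(tweak, 0.0)   # keep only what tastes better
        else:
            quality += tweak             # forced to absorb every outcome
    return quality

print(f"with optionality:    {tinker(optionality=True):.1f}")
print(f"without optionality: {tinker(optionality=False):.1f}")
```

  Run it and the asymmetry is plain: the tinkerer with optionality accumulates quality steadily, while the one forced to keep every change merely wanders. Disorder is fuel for the first and noise for the second.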

  On the other side, we have pure physics, with theories used to generate theories with some empirical validation. There the “epistemic base” can play a role. The discovery of the Higgs boson is a modern case of a particle entirely expected from theoretical derivations. So was Einstein’s relativity. (Prior to the Higgs boson, one spectacular case of a discovery made with little existing external data is the French astronomer Le Verrier’s derivation of the existence of the planet Neptune. He did that on the basis of solitary computation, from the behavior of the surrounding planets. When the planet was actually sighted he refused to look at it, so comfortable was he with his result. These are exceptions, and they tend to take place in physics and other places I call “linear,” where errors are from Mediocristan, not from Extremistan.)

  Now use this idea of cooking as a platform to grasp other pursuits: do other activities resemble it? If we put technologies under scrutiny, we would see that most do in fact resemble cooking a lot more than physics, particularly those in the complex domain.

  Even medicine today remains an apprenticeship model with some theoretical science in the background, but made to look entirely like science. And if it ever leaves the apprenticeship model, it will be for the “evidence-based” method that relies less on biological theories and more on the cataloging of empirical regularities, the phenomenology I explained in Chapter 7. Why is it that science comes and goes while technologies remain stable?

  Now, one can see a possible role for basic science, but not in the way it is intended to be.3 For an example of a chain of unintended uses, let us start with Phase One, the computer. The mathematical discipline of combinatorics, here basic science, derived from propositional knowledge, led to the building of computers, or so the story goes. (And, of course, to remind the reader of cherry-picking, we need to take into account the body of theoretical knowledge that went nowhere.) But at first nobody had any idea what to do with these enormous boxes full of circuits: they were cumbersome and expensive, and their applications were not widespread outside of database management; they were good only for processing quantities of data. It is as if one needed to invent an application for the thrill of technology. Baby boomers will remember those mysterious punch cards. Then someone introduced the console, allowing input by keyboard with the aid of a screen monitor. This led, of course, to word processing, and the computer took off because of its fitness to word processing, particularly with the microcomputer in the early 1980s. It was convenient, but not much more than that, until some other unintended consequence came to be mixed into it. Now Phase Two, the Internet. It had been set up as a resilient military communication network, developed by a research unit of the Department of Defense called DARPA, and got a boost in the days when Ronald Reagan was obsessed with the Soviets. It was meant to allow the United States to survive a generalized military attack. Great idea, but add the personal computer plus the Internet and we get social networks, broken marriages, a rise in nerdiness, the ability for a post-Soviet person with social difficulties to find a matching spouse. All that thanks to initial U.S. tax dollars (or rather budget deficit) during Reagan’s anti-Soviet crusade.

  So for now we are looking at the forward arrow, and although science was of some use along the way, since computer technology relies on science in most of its aspects, at no point did academic science serve to set the direction; rather, it served as a slave to chance discoveries in an opaque environment, with almost no one but college dropouts and overgrown high school students along the way. The process remained self-directed and unpredictable at every step. And the great fallacy is to make it sound irrational—the irrational resides in not seeing a free option when it is handed to us.

  China might be a quite convincing story, told through the works of a genius observer, Joseph Needham, who debunked quite a few Western beliefs and figured out the powers of Chinese science. As China became a top-down mandarinate (that is, a state managed by Soviet-Harvard centralized scribes, as Egypt had been before), the players somehow lost the zest for bricolage, the hunger for trial and error. Needham’s biographer Simon Winchester cites the sinologist Mark Elvin’s description of the problem: the Chinese did not have, or, rather, no longer had, what he called the “European mania for tinkering and improving.” They had all the means to develop a spinning machine, but “nobody tried”—another example of knowledge hampering optionality. They probably needed someone like Steve Jobs—blessed with an absence of college education and the right aggressiveness of temperament—to take the elements to their natural conclusion. As we will see in the next section, it is precisely this type of uninhibited doer who made the Industrial Revolution happen.

  We will next examine two cases, first, the Industrial Revolution, and second, medicine. So let us start by debunking a causal myth about the Industrial Revolution, the overstatement of the role of science in it.

  The Industrial Revolution

  Knowledge formation, even when theoretical, takes time, some boredom, and the freedom that comes from having another occupation, therefore allowing one to escape the journalistic-style pressure of modern publish-or-perish academia to produce cosmetic knowledge, much like the counterfeit watches one buys in Chinatown in New York City, the type that you know is counterfeit although it looks like the real thing. There were two main sources of technical knowledge and innovation in the nineteenth and early twentieth centuries: the hobbyist and the English rector, both of whom were generally in barbell situations.

  An extraordinary proportion of work came out of the rector, the English parish priest with no worries, erudition, a large or at least comfortable house, domestic help, a reliable supply of tea and scones with clotted cream, and an abundance of free time. And, of course, optionality. The enlightened amateur, that is. The Reverends Thomas Bayes (as in Bayesian probability) and Thomas Malthus (Malthusian overpopulation) are the most famous. But there are many more surprises, cataloged in Bill Bryson’s Home, in which the author found ten times more vicars and clergymen leaving recorded traces for posterity than scientists, physicists, economists, and even inventors. In addition to the previous two giants, I randomly list contributions by country clergymen: Rev. Edmund Cartwright invented the power loom, contributing to the Industrial Revolution; Rev. Jack Russell bred the terrier; Rev. William Buckland was the first authority on dinosaurs; Rev. William Greenwell invented modern archaeology; Rev. Octavius Pickard-Cambridge was the foremost authority on spiders; Rev. George Garrett invented the submarine; Rev. Gilbert White was the most esteemed naturalist of his day; Rev. M. J. Berkeley was the top expert on fungi; Rev. John Michell helped discover Uranus; and many more. Note that, just as with our episode documented with Haug, organized science tends to skip the “not made here,” so the list of visible contributions by hobbyists and doers is most certainly shorter than the real one, as some academic might have appropriated the innovation of his predecessor.4

  Let me get poetic for a moment. Self-directed scholarship has an aesthetic dimension. For a long time I had on the wall of my study the following quote by Jacques Le Goff, the great French medievalist, who believes that the Renaissance came out of independent humanists, not professional scholars. He examined the striking contrast in period paintings, drawings, and renditions that compare medieval university members and humanists:

  One is a professor surrounded and besieged by huddled students. The other is a solitary scholar, sitting in the tranquility and privacy of his chambers, at ease in the spacious and comfy room where his thoughts can move freely. Here we encounter the tumult of schools, the dust of classrooms, the indifference to beauty in collective workplaces,

  There, it is all order and beauty,

  Luxe, calme et volupté (luxury, calm, and pleasure)

  As to the hobbyist in general, evidence shows him (along with the hungry adventurer and the private investor) to be at the source of the Industrial Revolution. Kealey, who we mentioned was not a historian and, thankfully, not an economist, in The Economic Laws of Scientific Research questions the conventional “linear model” (that is, the belief that academic science leads to technology)—for him, universities prospered as a consequence of national wealth, not the other way around. He even went further and claimed that, like naive interventions, such funding had iatrogenics that provided a negative contribution. He showed that in countries in which the government intervened by funding research with tax money, private investment decreased and moved away. For instance, in Japan, the almighty MITI (Ministry of International Trade and Industry) has a horrible record of investment. I am not using his ideas to prop up a political program against science funding, only to debunk causal arrows in the discovery of important things.

  The Industrial Revolution, for a refresher, came from “technologists building technology,” or what he calls “hobby science.” Take again the steam engine, the one artifact that more than anything else embodies the Industrial Revolution. As we saw, we had a blueprint of how to build it from Hero of Alexandria. Yet the theory didn’t interest anyone for about two millennia. So practice and rediscovery had to be the cause of the interest in Hero’s blueprint, not the other way around.

  Kealey presents a convincing—very convincing—argument that the steam engine emerged from preexisting technology and was created by uneducated, often isolated men who applied practical common sense and intuition to address the mechanical problems that beset them, and whose solutions would yield obvious economic reward.

  Now, second, consider textile technologies. Again, the main technologies that led to the jump into the modern world owe, according to Kealey, nothing to science. “In 1733,” he writes, “John Kay invented the flying shuttle, which mechanized weaving, and in 1770 James Hargreaves invented the spinning jenny, which as its name implies, mechanized spinning. These major developments in textile technology, as well as those of Wyatt and Paul (spinning frame, 1758), Arkwright (water frame, 1769), presaged the Industrial Revolution, yet they owed nothing to science; they were empirical developments based on the trial, error, and experimentation of skilled craftsmen who were trying to improve the productivity, and so the profits, of their factories.”

  David Edgerton did some work questioning both the link between academic science and economic prosperity and the idea that people in the past believed in the “linear model” (that is, that academic science was at the source of technology). People were no suckers in the nineteenth and twentieth centuries: we believe today that they believed in the said linear model then, but they did not. In fact academics were mostly just teachers, not researchers, until well into the twentieth century.

  Now, instead of looking into a scholar’s writings to see whether he is credible or not, it is always best to consider what his detractors say—they will uncover what’s worst in his argument. So I looked for the detractors of Kealey, or people opposing his ideas, to see if they address anything of merit—and to see where they come from. Aside from some comments by Joel Mokyr, who, as I said, has not yet discovered optionality, and an attack by an economist of the type that doesn’t count, given the devaluation of the currency of the economics profession, the main critique against Kealey, published in the influential journal Nature by a science bureaucrat, was that he uses data from government-sponsored agencies such as the OECD in his argument against tax-funded research. So far, no substantive evidence that Kealey was wrong. But, let us flip the burden of evidence: there is zero evidence that the opposite of his thesis is remotely right. Much of all of this is a religious belief in the unconditional power of organized science, one that has replaced unconditional religious belief in organized religion.

 
