Merton further suggested that “all scientific discoveries are in principle ‘multiples.’ ” In other words, when a scientific discovery is made, it is made by more than one person. Sometimes a discovery is named after the person who develops it rather than after the original discoverer.
The world is full of difficulties in assigning credit for discoveries. Some of us have personally seen this in the realm of patent law, in business ideas, and in our daily lives. Fully appreciating the concept of the kaleidoscopic discovery engine adds to our cognitive toolkits because the kaleidoscope succinctly captures the nature of innovation and the future of ideas. If schools taught more about kaleidoscopic discovery, even in the context of everyday experience, then innovators might enjoy the fruits of their labor and still become “great” without a debilitating concern to be first or to crush rivals. The great eighteenth-century anatomist William Hunter frequently quarreled with his brother about who was first in making a discovery. But even Hunter admitted, “If a man has not such a degree of enthusiasm and love of the art, as will make him impatient of unreasonable opposition, and of encroachment upon his discoveries and his reputation, he will hardly become considerable in anatomy, or in any other branch of natural knowledge.”
When Mark Twain was asked to explain why so many inventions were invented independently, he said, “When it’s steamboat time, you steam.”
Inference to the Best Explanation
Rebecca Newberger Goldstein
Philosopher; novelist; author, 36 Arguments for the Existence of God: A Work of Fiction
I’m alone in my home, working in my study, when I hear the click of the front door, the sound of footsteps making their way toward me. Do I panic? That depends on what I—my attention instantaneously appropriated to the task and cogitating at high speed—infer as the best explanation for those sounds. My husband returning home, the housecleaners, a miscreant breaking and entering, the noises of our old building settling, a supernatural manifestation? Additional details could make any one of these explanations, except the last, the best explanation for the circumstances.
Why not the last? As Charles Sanders Peirce, who first drew attention to this type of reasoning, pointed out: “Facts cannot be explained by a hypothesis more extraordinary than these facts themselves; and of various hypotheses the least extraordinary must be adopted.”
“Inference to the best explanation” is ubiquitously pursued, which doesn’t mean that it is ubiquitously pursued well. The phrase, coined by the Princeton philosopher Gilbert Harman as a substitute for Peirce’s term, “abduction,” should be in everybody’s toolkit, if only because it forces one to think about what makes for a good explanation. There is that judgmental phrase, the best, sitting out in the open, shamelessly invoking standards. Not all explanations are created equal; some are objectively better than others. And the phrase also highlights another important fact. The best means the one that wins out over the alternatives, of which there are always many. Evidence calling for an explanation summons a great plurality (in fact, an infinity) of possible explanations, the great mass of which can be eliminated on the grounds of violating Peirce’s maxim. We decide among the remainder using such criteria as: Which is the simpler, which does less violence to established beliefs, which is less ad hoc, which explains the most, which is the loveliest?
There are times when these criteria clash with one another. Inference to the best explanation is certainly not as rule-bound as logical deduction nor even as enumerative induction, which takes us from observed cases of all a’s being b’s to the probability that unobserved cases of a’s are also b’s. But inference to the best explanation also gets us a great deal more than either deduction or enumerative induction does.
It’s inference to the best explanation that gives science the power to expand our ontology, giving us reasons to believe in things we can’t directly observe, from subatomic particles—or maybe strings—to the dark matter and dark energy of cosmology. It’s inference to the best explanation that allows us to know something of what it’s like to be other people on the basis of their behavior. I see the hand draw too near the fire and then quickly pull away, tears starting in the eyes while an impolite word is uttered, and I know something of what that person is feeling. It’s on the basis of inference to the best explanation that I can learn things from what authorities say and write, my inferring that the best explanation for their doing so is that they believe what they say or write. (Sometimes that’s not the best explanation.) In fact, I’d argue that my right to believe in a world outside my own solipsistic universe, confined to the awful narrowness of my own immediate experience, is based on inference to the best explanation. What best explains the vivacity and predictability of some of my representations of material bodies and not others, if not the hypothesis of actual material bodies? Inference to the best explanation defeats mind-sapping skepticism.
Many of our most rancorous scientific debates—say, over string theory or the foundations of quantum mechanics—have been over which competing criteria for judging explanations the best ought to prevail. So, too, have debates that many of us have conducted over scientific versus religious explanations. These debates could be sharpened by bringing to bear on them the rationality-steeped notion of inference to the best explanation, its invocation of the sorts of standards that make some explanations objectively better than others, beginning with Peirce’s enjoinder that extraordinary hypotheses be ranked far away from the best.
Cognitive Load
Nicholas Carr
Science and technology journalist; author, The Shallows: What the Internet Is Doing to Our Brains
You’re sprawled on the couch in your living room, watching a new episode of Justified on the tube, when you think of something you need to do in the kitchen. You get up, take ten quick steps across the carpet, and then just as you reach the kitchen—poof!—you realize you’ve forgotten what it was you got up to do. You stand befuddled for a moment, then shrug your shoulders and head back to the couch.
Such memory lapses happen so often that we don’t pay them much heed. We write them off as “absentmindedness” or, if we’re getting older, “senior moments.” But the incidents reveal a fundamental limitation of our minds: the tiny capacity of our working memory. Working memory is what brain scientists call the short-term store of information where we hold the contents of our consciousness at any given moment—all the impressions and thoughts that flow into our mind as we go through a day. In the 1950s, Princeton psychologist George Miller famously argued that our brains can hold only about seven pieces of information simultaneously. Even that figure may be too high. Some brain researchers now believe that working memory has a maximum capacity of just three or four elements.
The amount of information entering our consciousness at any instant is referred to as our cognitive load. When our cognitive load exceeds the capacity of our working memory, our intellectual abilities take a hit. Information zips into and out of our mind so quickly that we never gain a good mental grip on it. (Which is why you can’t remember what you went to the kitchen to do.) The information vanishes before we’ve had an opportunity to transfer it into our long-term memory and weave it into knowledge. We remember less, and our ability to think critically and conceptually weakens. An overloaded working memory also tends to increase our distractedness. After all, as the neuroscientist Torkel Klingberg has pointed out, “We have to remember what it is we are to concentrate on.” Lose your hold on that and you’ll find “distractions more distracting.”
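To make the bottleneck concrete, here is a toy sketch of my own (not a model from Carr or from cognitive science; the capacity of four follows the “three or four elements” estimate above, and the items are invented): new arrivals displace old ones before they can be moved into long-term memory.

```python
# A crude metaphor for the working-memory bottleneck described above: a buffer
# that holds only ~4 items, so each new distraction silently evicts the oldest.
# (Toy illustration only; the items and the capacity are assumptions, not data.)
from collections import deque

working_memory = deque(maxlen=4)  # "three or four elements" at most

working_memory.append("get scissors from the kitchen")  # the errand you stood up for
for distraction in ["TV plot twist", "phone buzz", "cat on the rug", "snack craving"]:
    working_memory.append(distraction)  # each arrival pushes out the oldest item

# The errand never reached long-term memory, and it is gone from working memory too.
print("get scissors from the kitchen" in working_memory)  # False
print(list(working_memory))  # the last four things that flew by
```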
Developmental psychologists and educational researchers have long used the concept of cognitive load in designing and evaluating pedagogical techniques. They know that when you give a student too much information too quickly, comprehension degrades and learning suffers. But now that all of us—thanks to the incredible speed and volume of modern digital communication networks and gadgets—are inundated with more bits and pieces of information than ever before, everyone would benefit from having an understanding of cognitive load and how it influences memory and thinking. The more aware we are of how small and fragile our working memory is, the better we’ll be able to monitor and manage our cognitive load. We’ll become more adept at controlling the flow of the information coming at us.
There are times when you want to be awash in messages and other info-bits. The resulting sense of connectedness and stimulation can be exciting and pleasurable. But it’s important to remember that when it comes to the way your brain works, information overload is not just a metaphor, it’s a physical state. When you’re engaged in a particularly important or complicated intellectual task, or when you simply want to savor an experience or a conversation, it’s best to turn the information faucet down to a trickle.
To Curate
Hans Ulrich Obrist
Curator, Serpentine Gallery, London
Lately, the term “to curate” seems to be used in a greater variety of contexts than ever before, in reference to everything from an exhibition of prints by old masters to the contents of a concept store. The risk, of course, is that the definition may expand beyond functional usability. But I believe “to curate” finds ever wider application because of a feature of modern life impossible to ignore: the incredible proliferation of ideas, information, images, disciplinary knowledge, and material products we all witness today. Such proliferation makes the activities of filtering, enabling, synthesizing, framing, and remembering more and more important as basic navigational tools for twenty-first-century life. These are the tasks of the curator, who is no longer understood simply as the person who fills a space with objects but also as the person who brings different cultural spheres into contact, invents new display features, and makes junctions that allow unexpected encounters and results.
Michel Foucault once wrote that he hoped his writings would be used by others as a theoretical toolbox, a source of concepts and models for understanding the world. For me, the author, poet, and theoretician Édouard Glissant has become this kind of toolbox. Very early, he noted that in our phase of globalization (which is not the first one), there is a danger of homogenization but at the same time a countermovement, the retreat into one’s own culture. And against both dangers he proposes the idea of mondialité—a global dialog that augments difference.
This inspired me to handle exhibitions in a new way. There is a lot of pressure on curators not only to do shows in one place but also to send them around the world, by simply packing them into boxes in one city and unpacking them in the next. This is a homogenizing sort of globalization. Using Glissant’s idea as a tool means to develop exhibitions that build a relation to their place, that change with their different local conditions, that create a changing dynamic complex system with feedback loops.
To curate in this sense is to refuse static arrangements and permanent alignments and instead to enable conversations and relations. Generating these kinds of links is an essential part of what it means to curate, as is disseminating new knowledge, new thinking, and new artworks in a way that can seed future cross-disciplinary inspirations.
But there is another case for curating as a vanguard activity for the twenty-first century. As the artist Tino Sehgal has pointed out, modern societies find themselves today in an unprecedented situation: The problem of lack, or scarcity, which has been the primary factor motivating scientific and technological innovation, is now joined and even superseded by the problem of the global effects of overproduction and resource use. Thus, moving beyond the object as the locus of meaning has a further relevance. Selection, presentation, and conversation are ways for human beings to create and exchange real value, without dependence on older, unsustainable processes. Curating can take the lead in pointing us toward this crucial importance of choosing.
“Graceful” SHAs
Richard Nisbett
Social psychologist; codirector, Culture and Cognition Program, University of Michigan; author, Intelligence and How to Get It: Why Schools and Cultures Count
1. A university needs to replace its aging hospital. Cost estimates indicate that it would be as expensive to remodel the old hospital as to demolish it and build a new one from scratch. The main argument offered by the proponents of the former is that the original hospital was very expensive to build and it would be wasteful to simply demolish it. The main argument of the proponents of a new hospital is that a new hospital would inevitably be more modern than a remodeled one. Which seems wiser to you—remodel or build a new hospital?
2. David L., a high school senior, is choosing between two colleges, equal in prestige, cost, and distance from home. David has friends at both colleges. Those at College A like it from both intellectual and personal standpoints. Those at College B are generally disappointed on both grounds. But David visits each college for a day, and his impressions are quite different from those of his friends. He meets several students at College A who seem neither particularly interesting nor particularly pleasant, and a couple of professors give him the brush-off. He meets several bright and pleasant students at College B and two professors take a personal interest in him. Which college do you think David should go to?
3. Which of the cards below should you turn over to determine whether the following rule has been violated? “If there is a vowel on the front of the card, then there is an odd number on the back.”
Some considerations about each of these questions:
Question 1: If you said that the university should remodel on the grounds that it was expensive to build the old hospital, you have fallen into the “sunk-cost trap” shorthand abstraction (SHA) identified by economists. The money spent on the hospital is irrelevant—it’s sunk—and has no bearing on the present choice. Amos Tversky and Daniel Kahneman pointed out that people’s ability to avoid such traps might be helped by a thought experiment like the following:
Imagine that you have two tickets to tonight’s NBA game in your city and that the arena is forty miles away. But it’s begun to snow, and you’ve found out that your team’s star has been injured and won’t be playing. Should you go, or just throw away the money and skip it?
To answer that question as an economist would, ask yourself a different one: Suppose you didn’t have tickets to the game, and a friend called you up and said he had two tickets to tonight’s game which he couldn’t use, and asked whether you’d like to have them. Would you go? If your answer is “You’ve got to be kidding. It’s snowing and the star isn’t playing,” then you shouldn’t go. That answer shows you that the fact that you paid good money for the tickets you have is irrelevant—their cost is sunk and can’t be retrieved by doing something you don’t want to do anyway. Avoidance of sunk-cost traps is a religion for economists, but I find that a single college course in economics actually does little to make people aware of the sunk-cost trap. It turns out that exposure to a few basketball-type anecdotes does a lot.
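The economist’s habit can be put in a line or two of arithmetic. The sketch below is my own illustration with made-up numbers (nothing here comes from Tversky, Kahneman, or the essay): the decision compares only the future benefit of going against the future cost of the trip, and the price already paid for the tickets never appears.

```python
# Toy sketch of sunk-cost-free decision making (illustrative, assumed numbers).
def should_go(value_of_evening, cost_of_trip):
    """Compare only what still lies ahead; money already spent is ignored."""
    return value_of_evening > cost_of_trip

value_of_evening = 10  # a star-less game on a snowy night isn't worth much to you
cost_of_trip = 40      # forty unpleasant miles each way

# Whatever you paid for the tickets is absent from the comparison: it is sunk
# whether you stay home or go, so it cannot change the answer.
print(should_go(value_of_evening, cost_of_trip))  # False -> skip the game
```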
Question 2: If you said that “David is not his friends; he should go to the place he likes,” then the SHA of the law of large numbers (LLN) has not been sufficiently salient to you. David has one day’s worth of experiences at each; his friends have had hundreds. Unless David thinks his friends have kinky tastes, he should ignore his own impressions and go to College A. A single college course in statistics increases the likelihood of invoking LLN. Several courses in statistics make LLN considerations almost inevitable.
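A quick simulation makes the sample-size point. This is my own sketch with assumed numbers, not anything from the essay: a college’s “true” quality is fixed, each day’s impression of it is noisy, and an average over one day wanders far more than an average over hundreds.

```python
# Law-of-large-numbers sketch (assumed numbers): one day's impression of a
# college is a single noisy draw; hundreds of days average out the noise.
import random

random.seed(0)
TRUE_QUALITY = 7.0  # hypothetical "true" quality of College A
NOISE = 3.0         # how much any single day's impression can swing

def average_impression(days: int) -> float:
    """Mean of `days` noisy observations of the college's quality."""
    return sum(random.gauss(TRUE_QUALITY, NOISE) for _ in range(days)) / days

print(round(average_impression(1), 2))    # David's one visit: can land far from 7
print(round(average_impression(300), 2))  # his friends' accumulated experience: close to 7
```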
Question 3: If you said anything other than “Turn over the U and turn over the 8,” psychologists P. C. Wason and P. N. Johnson-Laird have shown that you would be in the company of 90 percent of Oxford students. Unfortunately, you—and they—are wrong. The SHA of the logic of the conditional has not guided your answer. “If P, then Q” is violated only when a P occurs together with a not-Q, so checking the rule means examining the P card and the not-Q card. A course in logic actually does nothing to make people better able to answer questions such as number 3. Indeed, a PhD in philosophy does nothing to make people better able to apply the logic of the conditional to simple problems like Question 3 or meatier problems of the kind one encounters in everyday life.
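The conditional’s logic can be checked mechanically. Here is a short sketch of that check; only the U and the 8 are named in the text, so the K and the 5 below are stand-in faces I have added to complete a four-card example. A card must be turned over only if what is showing could conceal a violation: a vowel (P) might hide an even number, and an even number (not-Q) might hide a vowel.

```python
# Wason selection task for the rule "If there is a vowel on the front, then
# there is an odd number on the back." ('K' and '5' are assumed stand-ins;
# only 'U' and '8' are given in the text.)
def must_flip(face: str) -> bool:
    is_vowel = face.isalpha() and face.upper() in "AEIOU"  # P: back could be an even number
    is_even = face.isdigit() and int(face) % 2 == 0        # not-Q: back could be a vowel
    return is_vowel or is_even

cards = ["U", "K", "8", "5"]
print([card for card in cards if must_flip(card)])  # ['U', '8']
```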
Some SHAs apparently are “graceful,” in that they are easily inserted into the cognitive toolbox. Others appear to be clunky and don’t readily fit. If educators want to improve people’s ability to think, they need to know which SHAs are graceful and teachable and which are clunky and hard to teach. An assumption of educators for centuries has been that formal logic improves thinking skills—meaning that it makes people more intelligent in their everyday lives. But this belief may be mistaken. (Bertrand Russell said, almost surely correctly, that the syllogisms studied by the monks of medieval Europe were as sterile as they were.) Yet it seems likely that many crucially important SHAs, undoubtedly including some that have been proposed by this year’s Edge contributors, are readily taught. Few questions are more important for educators to study than which SHAs are teachable and how they can be taught most easily.
Externalities
Rob Kurzban
Psychologist, University of Pennsylvania; director, Pennsylvania Laboratory for Experimental Evolutionary Psychology (PLEEP); author, Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind
When I go about doing what I do, frequently I affect you as an incidental side effect. In many such cases, I don’t have to pay you to compensate for any inadvertent harm done; symmetrically, you frequently don’t have to pay me for any inadvertent benefits I’ve bestowed upon you. The term “externalities” refers to these cases, and they are pervasive and important because, especially in the modern, interconnected world, when I pursue my own goals I wind up affecting you in any number of different ways.