by James Geary
A common metaphorical gesture is the “thumbs-up” sign, in which we indicate our state of general well-being by closing the fist and extending the thumb upward at a 90-degree angle. Visual metaphors like these also follow Aristotle’s definition. The only difference is that the thing is given an image or a gesture rather than a name that belongs to something else.
Metaphor is so essential that it is impossible to describe emotions, abstract concepts, complex ideas, or practically anything else without it, as philosopher and connoisseur of metaphor Nelson Goodman wrote in Languages of Art:
Metaphor permeates all discourse, ordinary and special, and we should have a hard time finding a purely literal paragraph anywhere. This incessant use of metaphor springs not merely from love of literary color but also from urgent need of economy. If we could not readily transfer schemata to make new sortings and orderings, we should have to burden ourselves with unmanageably many different schemata, either by adoption of a vast vocabulary of elementary terms or by prodigious elaboration of composite ones.
Shakespeare’s description of Juliet is a marvel of metaphorical economy. On the surface, Juliet is nothing like the sun. Nevertheless, she shines. Romeo is inexorably drawn by her gravitational pull. She is the center of his universe. She radiates heat. And her brightness can, of course, burn. In these particulars at least, she is indeed the sun. Shakespeare’s schematic transfer tells us everything we need to know about Juliet—and Romeo’s feelings for her—in just four simple words.
After hundreds of years of constant use, this comparison has become something of a cliché. But the metaphorical thinking that enabled the equation to be made in the first place is the essence of creativity in the sciences as well as the arts. Whenever we solve a problem, make a discovery, or devise an innovation, the same kind of metaphorical thinking takes place.
Scientists and inventors compare two things: what they know and what they don’t know. The only way to find out about the latter is to investigate the ways it might be like the former. And whenever we explore how one thing is like another, we are in the realm of metaphorical thinking, as in this comparison, another literary staple, from Scottish poet Robert Burns:
My love is like a red, red rose.
By drawing our attention to the similarities between the object of his affections and a perennial flowering shrub of the Rosaceae family, Burns exquisitely—and economically—tells us about the unknown (his love) by comparing it with the known (a red, red rose). We can therefore be reasonably sure that the beauty of Burns’s beloved is flush and full (and fleeting), her perfume is sweet, and she can be very prickly. And we know all this without ever having laid eyes on her.
The paradox of metaphor is that it tells us so much about a person, place, or thing by telling us what that person, place, or thing is not. Understanding a metaphor (like reading a book about that process, in fact) is a seemingly random walk through a deep, dark forest of associations. The path is full of unexpected twists and turns, veering wildly off into the underbrush one minute and abruptly disappearing down a rabbit hole the next. Signposts spin like weather vanes. You can’t see the wood for the trees. Then, suddenly, somehow, you step into the clearing. A metaphor is both detour and destination, a digression that gets to the point.
Aristotle identified the mastery of metaphorical thinking as “a sign of genius, since a good metaphor implies an intuitive perception of the similarity in dissimilars.” French mathematician Henri Poincaré found an ingenious metaphor for metaphorical thinking in the theories of one of Aristotle’s successors, Epicurus.
According to the Greeks, the world was made up of just two basic things: atoms and the void. “Atoms are unlimited in size and number,” wrote Democritus, the fifth-century B.C.E. philosopher who formulated ancient Greece’s version of atomic theory, “and they are borne along in the whole universe in a vortex, and thereby generate all composite things—fire, water, air, earth; for even these are conglomerations of given atoms.”
To the Greeks, the physical universe was, quite literally, an atomic shower, a steady downpour of tiny, indivisible particles falling through empty space. All the objects in the world—all the things we see, hear, smell, touch, and taste—were made up of atoms combining and recombining in every conceivable way.
In some of the wilder expositions of the theory, thinkers imagined a distant time when the body parts of every living thing tumbled through the void. The early universe was a cascade of arms and legs, feet and paws, fins and wings, hands and claws. Every limb connected randomly with every other until it met its corresponding shape and clicked into place. Through this process of trial and error, the world as we know it was made.
But Epicurus, who was born around 341 B.C.E., spotted a flaw in the theory. In order to meet its match, an atom could not simply fall through the void like rain. It must veer from the vertical path and waft its way down like a feather. Otherwise, he reasoned, it would never bump into any other atoms and thus never form the conglomerations Democritus described.
So Epicurus came up with the clinamen—the unpredictable moment during which each atom deviates from its course, creating the opportunity for a chance encounter with another atom. It was only through these “clinamactic” collisions, Epicurus believed, that change, surprise, and variety entered the world.
Like most ancient Greek philosophers, Epicurus left behind very few of his own words and even less about his own life. We know about the clinamen largely thanks to the first-century B.C.E. Roman poet Lucretius, whose epic poem On the Nature of the Universe is an encyclopedic exposition of Epicurean philosophy.
Not much is known about Lucretius, either, except that, according to Saint Jerome, he was driven insane by a love potion and killed himself at the age of forty-four. Whether his love resembled the sun, a red, red rose, or something else entirely, we do not know.
Still, for a love-crazed, suicidal poet, Lucretius summed up Epicurus’s ideas quite lucidly. Without the clinamen, he wrote, “No collision would take place and no impact of atom upon atom would be created. Thus nature would never have created anything.” Some 2,000 years after the composition of Lucretius’s poem, Poincaré used Epicurean atomic theory to explain the nature of mathematical discovery and, by extension, the nature of metaphorical thinking.
Born in Nancy, France, in 1854, Poincaré was a cross between a dandy and a distracted professor. He was “short and plump, carried an enormous head set off by a thick spade beard and splendid moustache, was myopic, stooped, distraught in speech, absent-minded and wore pince-nez glasses attached to a black silk ribbon.” He was also intensely interested in the sources of creativity.
In The Foundations of Science, Poincaré set out his general theory of ingenuity. Based on his own experience as well as his interviews with other mathematicians, Poincaré concluded that great creative breakthroughs occur unexpectedly and unconsciously after an extended period of hard, conscious labor. He invoked an Epicurean analogy to explain this, describing ideas as being like Epicurus’s atoms:
During the complete repose of the mind, these atoms are motionless; they are, so to speak, hooked to the wall . . . During a period of apparent rest and unconscious work, certain of them are detached from the wall and put in motion. They flash in every direction through the space . . . as would, for example, a swarm of gnats . . . Their mutual impacts may produce new combinations. What is the role of the preliminary conscious work? It is evidently to mobilize certain of these atoms, to unhook them from the wall and put them in swing. After this shaking-up imposed upon them by our will, these atoms do not return to their primitive rest. They freely continue their dance.
Poincaré’s atomic two-step is a deft analogy for how mathematical creativity—indeed, all creativity—lies in the dance of metaphorical thought, the tumultuous tango that ensues when idea rubs up against idea, when thought grapples with thought.
Metaphor is the mind’s great swerve. Creativity don’t mean a thing if it ain’t got that clinamactic swing.
This same idea is contained in the three most famous words in all of Western philosophy, Descartes’s “Cogito ergo sum.” This phrase is routinely translated as:
I think, therefore I am.
But there is a better translation.
The Latin word cogito is derived from the prefix co (with or together) and the verb agitare (to shake). Agitare is the root of the English words “agitate” and “agitation.” Thus, the original meaning of cogito is “to shake together,” and the proper translation of “Cogito ergo sum” is:
I shake things up, therefore I am.
Metaphor shakes things up, producing everything from Shakespeare to scientific insight in the process.
The mind is a plastic snow dome: most beautiful, most interesting, and most itself when, as Elvis put it, it’s all shook up. And metaphor keeps the mind shaking, rattling, and rolling long after Elvis has left the building.
Metaphor and Etymology
Language Is Fossil Poetry
When Elvis appeared on The Ed Sullivan Show for the first time, on September 9, 1956, his pelvic undulations on other programs had already unsettled television executives and nervous parents across the country. Elvis performed two sets that night. For the first, the camera remained fixed above his waist. For the second, it pulled back far enough for the world to see the gyrations that earned him the moniker Elvis the Pelvis. The uproar occasioned by Elvis’s early TV appearances is not unlike that which has periodically attended metaphor. Elvis was condemned for promoting immorality and licentiousness; metaphor has been condemned for promoting deception and subversion.
Historically, metaphor has often been considered a devious use of language, an imprecise and vaguely suspicious linguistic trick employed chiefly by charlatans, faith healers, snake oil salesmen, and poets. Many philosophers regarded metaphorical language as at best a harmless diversion and at worst a deliberate and potentially dangerous obfuscation. As a result, not many serious thinkers took metaphor at all seriously.
In Leviathan, Thomas Hobbes classified metaphor as one of the “abuses of speech” and accused people of lying when they “use words metaphorically; that is, in other sense than that they are ordained for; and thereby deceive others . . . Reasoning upon [metaphor] is wandering amongst innumerable absurdities; and their end, contention and sedition, or contempt.”
The Anglo-Irish philosopher George Berkeley advocated going cold turkey to protect against the errors of metaphorical thinking. “A philosopher should abstain from metaphor,” he urged.
In An Essay Concerning Human Understanding, John Locke was equally unsympathetic:
If we would speak of things as they are, we must allow that all the art of rhetoric, besides order and clearness; all the artificial and figurative application of words eloquence hath invented, are for nothing else but to insinuate wrong ideas, move the passions, and thereby mislead the judgement; and so indeed are perfect cheats.
As Hobbes, Berkeley, and Locke plunged the dagger of reason into metaphor’s cheating heart, they were presumably unaware that the weapon they wielded was, in fact, metaphor itself.
Hobbes, in his brief denunciation, repeatedly abuses speech by using words in senses other than that for which they were ordained. The meaning of the word “ordain,” for example, comes from roots that mean “to set in order,” not “to designate,” “to decree,” or even “to admit to the Christian ministry.” Similarly, “deceive” literally meant “to catch or ensnare” before it meant “to make a person believe what is not true.”
And can we really construe the phrase “wandering amongst innumerable absurdities” as anything other than a seditious, contemptible deployment of figurative language when perfectly rational language would do? After all, how can one wander amongst absurdities? The idea itself is absurd.
Berkeley shamelessly indulged in metaphor by using the word “abstain,” which is derived from the Latin verb tenere (to hold): to abstain is literally to hold oneself back. The same root gives us “untenable,” anything that cannot be held, such as a mistaken opinion about metaphor.
Even Locke’s seemingly innocuous choice of “insinuate” harbors a hidden metaphor. The word comes from the Latin sinus, meaning “a bay, gulf, or cove”; only much later was it used to describe the introduction of a thought or a thing through a winding, circuitous route, as seafaring smugglers shift contraband by hugging the shore.
Metaphor got up so many philosophical noses because it seemed so imprecise. Comparing your beloved to a red, red rose might be fine if you’re writing a poem, but these thinkers believed more exact language was needed to express the “truth”—a term, by the way, distilled from Icelandic, Swedish, Anglo-Saxon, and other non-English words meaning “believed” rather than “certain.”
The truth is, metaphor is astonishingly precise. Nothing is as exact as an apt metaphor. Even the most mundane metaphors contain finely detailed descriptions, hidden deposits of knowledge that a quick dig into a word’s etymology will turn up.
Open a dictionary at random; metaphors fill every page. Take the word “fathom,” for example. The meaning is clear. A fathom is a measurement of water depth, equivalent to six feet. But fathom also means “to understand.” Why?
Scrabble around in the word’s etymological roots. “Fathom” comes from the Anglo-Saxon fæthm, meaning “the two arms outstretched.” The term was originally used as a measurement of cloth, because the distance from fingertip to fingertip for the average man with his arms outstretched is roughly six feet. This technique was later extended to sounding the depths of bodies of water, since it was easy to lower a cord divided into six-foot increments, or fathoms, over the side of a boat. But how did fathom come to mean “to understand,” as in “I can’t fathom that” or “She’s unfathomable”? Metaphorically, of course.
You master something—you learn to control or accept it—when you embrace it, when you get your arms around it, when you take it in hand. You comprehend something when you grasp it, take its measure, get to the bottom of it—fathom it.
Fathom took on its present significance in classic Aristotelian fashion: through the metaphorical transfer of its original meaning (a measurement of cloth or water) to an abstract concept (understanding). This is the primary purpose of metaphor: to carry over existing names or descriptions to things that are either so new that they haven’t yet been named or so abstract that they cannot be otherwise explained.
This ferrying back and forth happens all the time. What accounts for the amazing acceleration of a sports car? Horsepower. What happens to an economy when growth falls and unemployment rises? A depression. What do you see when you switch on a computer? A desktop. These are all metaphors, names taken from one thing and applied to a completely different thing because someone somewhere once noticed a resemblance.
The English literary critic, philosopher, and evangelical etymologist Owen Barfield picked up Aristotle’s definitive insight when he wrote in History in English Words:
When a new thing or a new idea comes into the consciousness of the community, it is described, not by a new word, but by the name of the pre-existing object which most closely resembles it.
Look at and listen to the language around you and you will discover a moveable feast of metaphor. Let me run this idea by you; ideas do not have legs (nor, by the way, do tables or chairs), but “run” is used metaphorically to request a brisk consideration of a proposal. Similarly, combs do not have teeth; books do not have spines; and mountains do not have feet.1
The markets are jittery today; markets don’t get the jitters, investors do, but the phrase metaphorically expresses the reigning uncertainty.
I see what you mean; you “see” absolutely nothing when you say this, but you do convey quite clearly that you understand what someone else is saying.
Etymologies make perfect poetic sense. The word “emotion,” for example, comes from the Latin verb “to move,” movere. How do we describe the emotional state occasioned by a poignant encounter, a beautiful film, or a powerful piece of music? We are moved. Movement is even visibly ensconced in the word “emotion” itself.
Even the word “literal”—derived from the Latin litera, meaning “letter”—is a metaphor. “Literal” means “according to the letter”; that is, actual, accurate, factual. But litera is, in turn, derived from the verb linire, meaning “to smear,” and was transferred to litera when authors began smearing words on parchment instead of carving them into wood or stone. The roots of linire are also visible in the word “liniment,” which denotes a salve or ointment. Thus, the literal meaning of “literal” is to smear or spread, a fitting metaphor for the way metaphor oozes over rigid definitional borders.
It is impossible to pinpoint the first use of most words as metaphors. It happened far too long ago and, in most cases, long before reading and writing were commonplace. But it is possible to pinpoint the metaphorical debuts of some words, thanks to the Oxford English Dictionary.
* * *
1 Notes do not have feet, either, just as lines do not have heads. The words “footnote” and “headline” are metaphors for the text that appears at the bottom and the top of a page, respectively, thereby occupying positions analogous to the corresponding body parts.
* * *
The first recorded literal use of the word “hot,” for example, occurred in 1000, according to the OED. Its first recorded metaphorical use in relation to taste (a hot, or spicy, food) occurred in 1390, to sound (a hot musical passage) in 1876, and to color (a hot red) in 1896. The first literal meaning of the word “bridge” dates back to the eleventh century, but the common figurative use of the word (to bridge our differences) didn’t occur until the middle of the eighteenth century.