Umber
On October 18, 1969, under cover of a vicious storm, a group of men broke into the Oratory of San Lorenzo in Palermo and stole a priceless Nativity scene by Michelangelo Merisi da Caravaggio. Caravaggio was by many accounts a violent, troubled man, but no one who stands before one of his few remaining paintings can doubt his genius. Nativity with St. Francis and St. Lawrence, a huge work in oil created 360 years before its theft, shows the birth of Christ as a grim scene of impoverishment and exhaustion. The surviving photos show a very dark composition: just a few figures, heads bowed and disheveled, picked out against a muddy ground. Like many of Caravaggio’s other works, this painting probably owed its dark drama to his use of umber.1
Although some believe that umber’s name is, like sienna’s, geographical in origin—one source is Umbria in Italy—it is more likely that the word comes from the Latin umbra, meaning “shadow.” Like hematite [here] and sienna, umber is one of the iron oxide pigments commonly called ochers. But while hematite is red and sienna, when raw or unheated, is a yellow-tinged brown, umber is cooler and darker, perfect as a dark glaze.2 It is also, like its fellows, a very stable and reliable pigment, and was considered an essential part of every artist’s palette up until the twentieth century. However, it is also profoundly unglamorous. George Field, the nineteenth-century chemist and author of Chromatography, wrote that “it is a natural ocher, abounding with oxide of manganese . . . of a brown-citrine color, semi-opaque, has all the properties of a good ocher, is perfectly durable both in water and oil.”3 One can almost hear the yawn in his words.
Umber is one of the oldest known pigments used by humans. Ochers were used on cave walls at Altamira in Spain and at Lascaux, a trove of cave art in southwest France, which was rediscovered by a dog called Robot in September 1940.4 As Robot snuffled around the roots of a fallen tree, he uncovered a small opening underneath. His owner, the 18-year-old Marcel Ravidat, returning with three friends and some lamps, squeezed through a 40-foot-long shaft and into a vast chamber filled with Late Stone Age paintings.
Umber really came into its own, however, with the dramatic tenebrism of late Renaissance and baroque painters, and of later artists of candlelit scenes such as Joseph Wright of Derby. Many of these painters, sometimes called Caravaggisti because of their admiration for their predecessor, gloried in the drama of strong contrasts between spots of light and deepest shadow, a technique also known as chiaroscuro, from the Italian words chiaro (“bright”) and oscuro (“dark”). Rembrandt van Rijn, particularly in his impoverished later years after his bankruptcy in 1656, used a startlingly small range of pigments to produce the effect, relying heavily on the cheap, somber ochers, especially umber.5 It is there in the backgrounds and heavy clothing of his late self-portraits, the ones in which he is so uniquely expressive: sometimes thoughtful, or wounded, or quizzical, but always catching and holding the viewer’s gaze, his face strongly lit against the pools of darkness.
In 1996 the fate of Caravaggio’s masterpiece was revealed in a spectacular trial. Francesco “Mozzarella” Marino Mannoia, a Sicilian Mafia specialist in refining heroin, became a government informer after the death of his brother. He told the court that he had sawed the painting from its frame above the altar and bundled it up to deliver to the man who had commissioned the theft. Tragically, though, he had no experience with valuable works of art and no idea of the care with which they needed to be handled. When the patron saw the condition of the painting after its rough treatment, he wept. “[I]t was not . . . in a usable condition anymore,” Marino Mannoia admitted at the trial almost 30 years later.6 Many still refuse to believe the painting has been destroyed,7 continually appealing for its safe return, ever hopeful that it will, one day, emerge from the shadows.
Mummy
On July 30, 1904, O’Hara and Hoar placed an unusual advertisement in the Daily Mail. What they wanted—“at a suitable price”—was an Egyptian mummy. “It may appear strange to you,” the notice read, “but we require our mummy for making color.” Then, to stave off any pricks of public conscience, they continued: “Surely a 2,000-year-old mummy of an Egyptian monarch may be used for adorning a noble fresco in Westminster Hall or elsewhere without giving offense to the ghost of the departed gentleman or his descendants.”1
By then such a plea was unusual enough to raise comment, but mummies had been dug up and reused in various ways for centuries without much fuss. Mummification had been common burial practice in Egypt for over 3,000 years. Internal organs were removed before the body was washed and embalmed using a complex mixture of spices, as well as preservatives, including beeswax, resins, asphalt, and sawdust.2 Although mummies—particularly those of the rich and distinguished, whose wrappings were likely to contain gold and trinkets3—could be valuable in themselves, those who dug them up were more often after something else: bitumen. The Persian word for bitumen was mum or mumiya, which, along with the fact that mummified remains were very dark, led to the belief that all mummies contained the substance.4 Bitumen—and by extension mummies—had been used as a medicine from the first century A.D. Ground-up mummy, or “mummia,” was applied topically or mixed into drinks to swallow, and it seemed there was almost nothing it could not cure. Pliny recommended it as toothpaste; Francis Bacon for the “stanching of blood”; Robert Boyle for bruises; and John Hall, Shakespeare’s son-in-law, for a troubling case of epilepsy. Catherine de’ Medici was a devotee, as was François I of France, who carried a little pouch of powdered mummy and rhubarb on him at all times.5
Trade was brisk. John Sanderson, an agent for the English trading firm known as the Turkey Company, vividly described an expedition to a mummy pit in 1586:
We were let down by ropes, as into a well, with wax candles burning in our hands, and so walked upon the bodies of all sorts and sizes . . . they gave no noisome smell at all . . . I broke off all of the parts of the bodies to see how the flesh was turned to drugge, and brought home divers heads, hands, armes and feet.6
Mr. Sanderson actually returned to England with one complete mummy and 600 pounds of sundry parts to refresh the supplies of the London apothecaries.7 Demand, however, outpaced supply, and there are numerous reports of replacements being hastily made from the bodies of slaves and criminals. While on a visit to Alexandria in 1564, the physician to the king of Navarre interviewed one mummy dealer who showed him 40 mummies he claimed to have manufactured himself over the previous four years.8
Because apothecaries often dealt in pigments too, it is not so surprising that the rich brown powder also found itself on painters’ palettes. Mummy, also known as Egyptian brown and caput mortuum (“dead man’s head”), was used as paint—usually mixed with a drying oil and amber varnish—from the twelfth to the twentieth century.9 It was well known enough for an artists’ shop in Paris to call itself—tongue, presumably, in cheek—À la Momie. Eugène Delacroix used it in 1854 when painting the Salon de la Paix at the Hôtel de Ville in Paris; his fellow countryman Martin Drölling favored it too, as did the British portraitist Sir William Beechey.10 There was some debate as to which bits of the mummy to use to get the best and richest browns—recommended for translucent glazing layers for shadows and skin tones. Some suggested using just the muscle and flesh, while others thought that the bones and bandages should also be ground up to get the best out of this “charming pigment.”11
Gradually though, toward the end of the nineteenth century, fresh supplies of mummies, authentic or otherwise, dwindled. Artists were becoming dissatisfied with the pigment’s permanency and finish, not to mention more squeamish about its provenance.12 The Pre-Raphaelite painter Edward Burne-Jones hadn’t realized the connection between “mummy brown” and real mummies until one Sunday lunch in 1881, when a friend related having just seen one ground up at a colorman’s warehouse. Burne-Jones was so horrified he rushed to his studio to find his tube of mummy brown, and “insisted on our giving it a decent burial there and then.”13 The scene made a great impression on the teenage Rudyard Kipling, Burne-Jones’s nephew by marriage, who was also a guest at lunch. “[T]o this day,” he wrote years later, “I could drive a spade within a foot of where that tube lies.”14
By the beginning of the twentieth century demand was so sluggish that a single mummy might provide a paint manufacturer with pigment for a decade or more. C. Roberson, a London art shop that had first opened its doors in 1810, finally ran out in the 1960s. “We might have a few odd limbs lying around somewhere,” the managing director told Time magazine in October 1964, “but not enough to make any more paint. We sold our last complete mummy some years ago for, I think, £3. Perhaps we shouldn’t have. We certainly can’t get any more.”15
Taupe
Sometime in 1932 the British Color Council (BCC) began working on a special project. The idea was to create a standardized catalog for colors, complete with dyed silk ribbons to show exactly what color was meant by each term. It would, they hoped, do for color “what the great Oxford Dictionary has done for words.”1 It “will mark,” they wrote, “the greatest achievement of modern times in assisting British and Empire industries with color definition,” thereby giving British trade a competitive edge.
What wasn’t mentioned was that Britain was behind the curve. Albert Henry Munsell, an American artist and teacher, had been working on a way of three-dimensionally mapping color since the 1880s; his system was fully fledged by the first decade of the twentieth century and has been used with minor tweaks ever since.2 A. Maerz and M. R. Paul, who built on Munsell’s work but also wished to incorporate the common names for colors, published their Dictionary of Color—modeled on Samuel Johnson’s idiosyncratic Dictionary of the English Language—in New York in 1930. It included pages of color chips, a comprehensive index, and snippets of information on many common colors.
They all developed a fine appreciation for the difficulty of the task. Colors were hard to pin down; they could change name over time, or the shade associated with a name might morph alarmingly from one decade or country to the next. Chasing down and collating color terms and samples had taken the BCC 18 months. Messrs. Maerz and Paul had labored over the task for years.3 One color that vexed both sets of researchers was taupe. It is actually a French word, meaning “mole.” However, while the color of a mole was, by broad consensus, “a deep gray on the cold side,” taupe was all over the place—the only thing consistently agreed upon was that it was generally browner than a mole had a right to be.4 The BCC’s assumption was that the confusion was due to ignorant English-speakers not realizing that taupe and mole were different words for the same thing. Maerz and Paul were rather more thorough. They set out on an expedition around the zoological museums of the United States and France to look at foreign specimens from the genus Talpa, to determine whether there was a logical reason for using both terms. “[I]ts color certainly varies,” they concluded, but what was generally understood by the term taupe “represents a considerable departure from any color a mole might possess.” The sample they included in their book, therefore, was “a correct match for the average actual color of the French mole.”5
Despite transatlantic efforts to return this color to something approximating the hue of its parent mammal, taupe has since continued to run wild. Beloved by the makeup and bridal industries, it is roomy enough to contain a plethora of pastel brown-grays while still managing to sound refined and elegant. If only these intrepid color cartographers had taken all the lessons of Samuel Johnson’s great enterprise to heart, they might have been spared the wild-mole chase. Johnson, even as he set down definitions next to words in his dictionary in 1755, was realistic enough to appreciate the ultimate futility of his task. This rueful reflection from his preface could just as easily apply to colors: “sounds are too volatile and subtile for legal restraints; to enchain syllables, and to lash the wind, are equally the undertakings of pride.”
Kohl
Payne’s gray
Obsidian
Ink
Charcoal
Jet
Melanin
Pitch black
Black
What do you think of when you see the color black? Perhaps a better question would be: what don’t you think of when you see black? Few colors are more expansive and capacious. Like the dark obsidian mirror [here] that once belonged to Dr. Dee, look into black and you never know what might look back. It is, simultaneously, the color of fashion and of mourning, and has symbolized everything from fertility to scholarship and piety. With black, things are always complicated.
In 1946 Galerie Maeght, an avant-garde Parisian gallery on rue du Bac on the Left Bank, staged an exhibition called “Black Is a Color.” It was a statement intended to shock: this was the precise opposite of what was then taught at art schools.1 “Nature knows only colors,” Renoir once declared. “White and black are not colors.”2 In one sense, this is right. Like white, black is an expression of light, in this case its absence. A true black would reflect no light whatsoever—the opposite of white, which reflects all light wavelengths equally. On an emotional level, this has not affected our experience or use of blacks as colors; on a practical level, it has so far proved impossible to find or create a black that reflects no light at all. Vantablack, a carbon nanotube technology created in Britain in 2014, absorbs 99.965 percent of visible light, making it the blackest substance in the world. In person it is so dark it fools the eyes and brain, rendering people unable to perceive depth and texture.
A whiff of death has clung to black as far back as records reach, and humans are fascinated and repelled by it. Most of the gods associated with death and the underworld—such as the jackal-headed Egyptian god Anubis, Christianity’s devil, and the Hindu goddess Kali—are depicted with truly black skin, and the color has long been associated with both mourning and witchcraft.
However, while black is so often linked with endings, it is present at the start of things too. It reminded the ancient Egyptians of the rich silt that the Nile deposited after the floods each year, making the land fertile. Black’s potential for creation is there in the opening passages of Genesis—it is, after all, out of the darkness that God conjures light. Nighttime too has a peculiar fecundity, for all the obvious reasons, and because of dreams that blossom only when we close our eyes to shut out the light. A piece of artists’ charcoal [here] is a perfect emblem for beginnings. The outline—usually black—was invented over 30,000 years ago. It may be the ne plus ultra of artistic artifice, but this has never mattered to artists, and the black line is art’s foundation stone. It was to hand when early men and women first began expressing themselves by leaving their marks on the world around them, and it has been used at the inception of nearly all artistic endeavors since.3 Some 12,600 years after Paleolithic fingers and pads of soft leather daubed fine charcoal powder onto the cave walls at Altamira, Leonardo da Vinci favored fine sticks of the stuff. It was one of these that he used to sketch out a softly blended sfumato—from fumo, for smoke—version of what would become his simultaneously mysterious and expressive painting The Virgin and Child with St. Anne (1503–19), now in the Louvre.
It was during Leonardo’s lifetime, too, that black reached its zenith as the color of fashion. His near contemporary, Baldassare Castiglione, wrote in his Book of the Courtier that “black is more pleasing in clothing than any other color,” and the Western world agreed.4 Its rise as the most fashionable of colors had three causes. The first was practical: sometime around 1360, new methods were found for dyeing fabrics true black, rather than dirty brown-grays, which made them more luxurious. A second reason was the psychological impact of the Black Death, which decimated the population of Europe and led to a desire for greater austerity and collective penitence and mourning.5 Philip the Good (1396–1467), who famously was rarely seen out of black clothing, favored it to honor the memory of his father, John the Fearless, who had been assassinated in 1419.6 The third reason was the wave of laws that sought to codify social strata through dress: wealthy merchants were forbidden to wear colors reserved for the old money, such as scarlet [here], but they could wear black.7 The obsession lasted until the first decades of the eighteenth century. Estate inventories show that in around 1700, 33 percent of nobles’ and 44 percent of officers’ clothing was black; it was popular with domestics too, making up 29 percent of their wardrobes.8 At times the streets must have resembled Rembrandt’s paintings The Sampling Officials (1662) and The Anatomy Lesson of Dr. Nicolaes Tulp (1632): crowds of identical black-clad folk jostling for space.
Despite this ubiquity, black has retained both its popularity and its fresh, challenging modernity.9 Kazimir Malevich’s Black Square, for example, is believed to be the first purely abstract painting. Accustomed as we are to abstract art, we find the magnitude of this work difficult to grasp. For Malevich, though, Black Square (he created four different versions between 1915 and 1930) was a statement of intent. He desperately wanted, as he put it, “to free art from the dead weight of the real world,” and so “took refuge in the form of the square.”10 This, for the first time, was art for art’s sake, and a revolutionary idea needed a revolutionary color: black.
Kohl
Lurking in the Egyptology section of the Louvre in Paris is a curious object. It is a squat, sparkling white statuette of a bowlegged creature, whose red tongue lolls from a mouth lined with sharp teeth; it has pendulous, triangular breasts; a fierce blue V for eyebrows; and a long shaft of a tail that dangles rudely between its legs. Made between 1400 and 1300 B.C., it depicts the god Bes, who, while he may look terrifying, was actually rather sweet: a fearsome fighter, he was popular with ordinary Egyptians because he was a protector, particularly of homes, women, and children. What he was protecting in this case, though, was rather different: hidden in the statuette’s hollow head is a small container intended for kohl eyeliner.