The exploding gas Henry Cavendish had isolated was a different element (contained within the acid, not the metal) and, when heated with oxygen, combined to form water. Lavoisier named this gas hydrogène from the Greek hydro-genes (water-maker), which translates into English as hydrogen.14
This new way of looking at things also explained why you couldn’t breathe in a room after a fire had been burning. It wasn’t because the fire was giving out a toxic substance: it was because air was partly made from oxygen and fires absorbed it, leaving the other gas behind.
This useless gas was eventually shown to react under extreme conditions to make niter, one of the key ingredients in gunpowder, so the French statesman Jean-Antoine Chaptal named it nitrogène—nitrogen.
Science always progresses when a hypothesis is proven wrong, and Lavoisier’s experiments signed the death warrant for phlogiston. Air was an unreacted mixture of nitrogen and oxygen, water was a fused compound of hydrogen with oxygen, and fire was a reaction between oxygen and any available chemical. None of them was an element.
For his efforts, Lavoisier was taken to the guillotine in May 1794. Possibly because he worked as a taxman in pre-revolutionary France (never a good idea), but more likely because he criticized the inferior science of Jean-Paul Marat, who became a leading figure of the revolution. An unlucky end for a great mind, although that’s nothing compared to the bad luck of a chemist named Carl Scheele.
THE UNLUCKIEST MAN IN THE HISTORY OF CHEMISTRY
Cavendish, Lavoisier, and Priestley were geniuses of a new science and other people quickly joined the hunt. Everyone wanted the glory of discovering a new element, although agreeing on who made a discovery isn’t always easy.
Some elements have been around since antiquity so it’s impossible to know who originally discovered them. The Old Testament contains passages dating back three thousand years that refer to gold, silver, iron, copper, lead, tin, sulfur (correctly spelled with an f—see Appendix I), and possibly antimony.15
Then there are instances of someone predicting an element without actually obtaining a sample. Johan Arfwedson deduced there was an element hidden within petalite rock and named it lithium from the Greek lithos (rock), but it wasn’t until 1821 that William Brande extracted it.16
In order to avoid confusion and settle debates we tend to talk about the first person to isolate an element rather than discover it. Credit goes to the first person who manages to hold a pure sample of an element and recognize it as such. Which brings us to the Swedish chemist Carl Scheele.
In 1772, Scheele successfully made a brown powder, which he named baryte from the Greek barys, meaning heavy. He knew there was an element hidden inside (barium) but it was Humphry Davy who isolated it and got the glory.
In 1774, Scheele discovered the gas chlorine (from the Greek chloros, meaning green) but didn’t realize it was an element. It was again Humphry Davy who made this link in 1808, thus getting the credit.
That same year, Scheele discovered the calx of pyrolusite but failed to isolate the elemental manganese inside; Johan Gahn achieved it a few months later.
Then it happened again in 1778 when Scheele identified molybdenum, before it was isolated by Peter Hjelm. And then again in 1781 when he deduced the existence of tungsten but failed to isolate it before Fausto Elhuyar, who got the credit.17
Scheele even discovered oxygen in 1771—three years before Priestley—but his manuscript was delayed at the printers and, by the time it was published, Priestley had got his results out.18
To commemorate his many contributions to chemistry, the mineral Scheelite was named after him … until it was officially renamed calcium tungstate and Scheele was once again nudged out of the history books. If there is a god of chemistry, he apparently hates Carl Scheele.
CHAPTER TWO
Uncuttable
DIAMONDS, PEANUTS, AND CORPSES
In 1812 the German chemist Friedrich Mohs invented a 1 to 10 scale to classify the hardness of minerals. Tooth enamel has a score of 5, for example, while iron ranks as a 4. This means your teeth will technically dent a lump of iron but not the other way around. Although I don’t recommend you try it because if you accidentally bite steel (iron with carbon impurity), which has a hardness of around 7.5, you’ll regret it.
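Since the Mohs scale is purely ordinal (a material scratches anything ranked below it and nothing ranked above), the comparison logic fits in a few lines of Python. The numbers are the ones quoted above; the dictionary and helper function are just an illustration:

```python
# Mohs hardness is ordinal: a material scratches (dents) anything softer than itself.
MOHS = {"iron": 4, "tooth enamel": 5, "steel": 7.5}

def scratches(a, b):
    """True if material a can dent material b on the Mohs scale."""
    return MOHS[a] > MOHS[b]

print(scratches("tooth enamel", "iron"))   # True: teeth can dent iron
print(scratches("tooth enamel", "steel"))  # False: biting steel ends badly
```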
Diamonds were given a value of 10 because they were the hardest things known at the time. Their claim to the crown was only overthrown in 2003 when a group of researchers from Japan managed to make something even harder—a hyperdiamond.
The most common explanation given for how diamonds form is that coal (fossilized plant) gets compressed underground until it turns hard and transparent. It’s what everyone gets told in primary school but it’s a complete myth. Diamonds are made in a much more extreme environment.
The same year hyperdiamonds were manufactured, Hollywood birthed its own unbelievable creation: The Core, a sci-fi film, which has to be seen to be believed. A few highlights from the movie involve a man hacking the entire global internet from a laptop, sunlight melting the Golden Gate Bridge, and Hilary Swank landing a space shuttle in the San Fernando Valley.
One scene in particular stands out for me. A team of scientists is launched into the Earth’s mantle in order to nuke the Earth’s core and finds itself dodging diamonds the size of buildings.1
What’s interesting about this scene is that, while giant diamonds are unlikely, it’s otherwise fairly accurate. Diamonds really are made in the Earth’s mantle, not in the crust.
A diamond is made solely from carbon and it takes billions of years to grow one. Plants do contain carbon but haven’t been around long enough to create the gems we extract from mines today. Fusing carbon into a crystal also takes staggeringly high pressures and temperatures—far more than you could achieve in a planetary crust.
Diamonds are really made a few hundred kilometers down in the upper mantle, where pressures are hundreds of thousands of times greater than atmospheric pressure and temperatures are comparable to the surface of the Sun. Once they’ve been made, the crystals are vomited toward the surface in volcanic eruptions; the erupted rock solidifies and we eventually dig them up.
The compressed-plant myth probably arises because we also mine coal and that is made from heat-compressed plant, but it forms at wussy temperatures and pressures, inadequate for diamonds.
It is also true that one naturally turns into the other, but it’s the opposite of what the myth claims. Diamonds are slightly unstable and will decay into coal over thousands of years. So, the obvious question is: could we reverse the process?
In 2003, Tetsuo Irifune from the Tokyo Institute of Technology decided to try compressing coal into a diamond for real. By using the engineer’s equivalent of an extreme pressure cooker, Irifune took a lump of coal-like carbon and subjected it to pressures far in excess of what you’d get in the mantle. The result was a hyperdiamond, a chemical never seen before in nature.2
Hyperdiamonds presumably have a Mohs value greater than 10, but the precise number hasn’t been measured because the original piece of carbon is compressed so much that the resulting hyperdiamond is tiny. We’re talking a few millionths of a gram.
But we don’t have to use coal as our starting material. Dan Frost from the Bavarian Geological Institute in Germany managed to make a diamond by compressing peanut butter,3 and the Illinois-based company LifeGem can make artificial diamonds by compressing your deceased loved one’s ashes. Provided you’ve got the carbon, it can be crystallized.
The fact that coal, diamond, and hyperdiamond are all made from the same element yet have different properties (we refer to them as “allotropes of carbon”) suggests that elements can somehow arrange themselves in different ways.
In order to explain this phenomenon, we’re going to have to look closely at the notion of something being diamond-like or “uncuttable.” And in ancient Greek the word for uncuttable, atomos, is one you probably know already: atom.
THE MAN WHO PROVED GOD
Imagine holding a grain of sand between your fingertips. It’s hard to make out details with the naked eye but logically the grain would have two halves: a left hemisphere and a right one. You could imagine a knife small enough to chop the grain right down the middle, splitting it in two. Then, once you had these half grains, you could repeat the process, slicing to quarter grains and so on.
Theoretically, you could do this forever. No matter how small the grain fragment, you’d always be able to zoom in and divide in half again.
The alternative would make no sense. Imagine chopping a grain up so small that it no longer had a left or right half. A piece so small it didn’t have any size and just was. For an object like this, the very concept of dividing by two would be meaningless. It would be like trying to divide by two on a calculator and the calculator replying with “Sorry, you have reached the smallest thing, you can’t divide anymore.” You’d have to be crazy to suggest the existence of a smallest object. Cue Democritus.
Democritus was a philosopher/stand-up comedian living in the fifth century BCE and he took the idea of elemental substances very seriously. He believed everything was made of microscopic uncuttable pieces (atoms) that combined to make the world around us.
Say you’ve got a packet of M&M’s. Rather than eating them in mixed handfuls, every sane human being divides them into piles organized by color and eats them one pile at a time. Don’t trust anybody who does otherwise.
This sifting of a mixture into purity is what we’re really doing when we break a substance down into its elements; we’re grouping the atoms according to type. This would also explain where allotropes come from. Diamond, coal, and hyperdiamond could all be made from carbon atoms stacked and arranged differently, leading to a variety of properties.
And, as if the atom hypothesis wasn’t strange enough, Aristotle later used Democritus’s idea to prove the existence of God. Because atoms were constantly in motion, bouncing off each other and flying through the emptiness between, every atom’s movement could be back-tracked to a collision with an earlier atom, whose movement could be explained as a collision with an earlier one still. Cause led to effect and every effect had a preceding cause.
If you went back far enough there must have been a first movement that caused everything but had no cause itself. Such a thing (an uncaused cause) would be outside the normal laws of nature while still being able to influence them. God, in other words.4 Make of that what you will.
LORD OF THE SWAMP
Sadly, along with many other great ideas, Democritus’s atomic hypothesis was shelved as the Holy Roman Empire took hold of intellectual Europe. It wasn’t until the late 1700s that atoms were given serious attention thanks to the work of an English scientist named John Dalton.
At the age of twelve, most children in England are getting used to being students in high school. John Dalton was teaching at one. The son of a weaver, Dalton had already taught himself science, mathematics, English, Latin, Greek, and French, and achieved the rank of headmaster by his late teens.5
Don’t be fooled though. While a fierce academic, Dalton still knew how to have a good time and, like any youngster, spent his free moments collecting samples of swamp gas from local bogs. Surprisingly, he never married.
It was while burning these samples of gas that Dalton learned gases don’t react all willy-nilly but combine in specific ratios. Hydrogen and oxygen, for instance, always combine in a two-to-one mix and nothing else. If you have three times as much hydrogen as oxygen, you end up with a third of your hydrogen left at the end. It’s as if there’s only a limited amount of oxygen “bits” to go around.
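If you want to see the bookkeeping, here is a minimal sketch of that ratio argument in Python. The 2:1 figure is Dalton’s measurement; the counting function and sample quantities are my own illustration:

```python
def combine(hydrogen, oxygen):
    """Combine hydrogen and oxygen in a fixed 2:1 ratio.

    Illustrative only: treats the gases as integer counts of "bits".
    Returns (units of water made, leftover hydrogen, leftover oxygen).
    """
    # Each unit of oxygen consumes exactly two units of hydrogen.
    reactions = min(hydrogen // 2, oxygen)
    return reactions, hydrogen - 2 * reactions, oxygen - reactions

# Three times as much hydrogen as oxygen: a third of the hydrogen is left over.
print(combine(3, 1))  # (1, 1, 0) -> one unit of hydrogen out of three remains
```

Whatever is present in excess of the fixed ratio simply sits out the reaction.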
Dalton decided the best way of explaining these findings was to assume there were tiny particles making up each elemental gas. Thanks to his proficiency in Greek he was familiar with the work of Democritus and began referring to these particles as atoms.
The idea, however, was not widely accepted. Dalton had a habit of overcomplicating things and the book he published in 1808 to outline his atomic hypothesis was a notoriously difficult read.6 His ideas were rigorous but his explanations were boring and his chemistry was cumbersome.
Nevertheless, Dalton was greatly respected and was eventually given the privilege of being presented to King William IV. This also led to him committing the biggest faux pas of his career because Dalton was a Quaker and forbidden to wear scarlet clothing, which happened to be the color of robes required for meeting the king. Dalton was color-blind (incidentally, he was the first person to document its existence), and the event’s organizers “forgot” to tell him he was wearing robes that would offend his fellow Quakers.7
So Dalton went parading around in front of other Quakers in the most outrageous clothing imaginable. To be simultaneously color-blind, a Quaker, and publicly dressed in scarlet is a remarkable run of bad luck. Somewhere, in a dark corner of purgatory, Carl Scheele is probably cackling to himself.
UNDER PRESSURE
The real watershed for the atomic hypothesis came in 1899 when the French physicist Émile Amagat began experimenting with pressure chambers. Amagat had spent his youth lowering samples of gas into mineshafts to measure how much they got compressed, and by adulthood had designed sophisticated machinery capable of compressing gases to three thousand times atmospheric pressure.
Through these experiments he discovered there was a limit to how far a gas could be squeezed. Once you got to a certain point, the gas fought back and refused to get smaller.8
This couldn’t be explained with the infinitely-smaller-particles hypothesis. If matter were made from infinitely small chunks, then any gas would contain an infinite number of gaps between them as well. No matter how far you compressed a gas, there would always be enough space for the matter to fall into.
The physicist Robert Boyle, son of the Earl of Cork, had conducted experiments on gas pressure and argued, for this very reason, that it was possible to compress a gas forever. Amagat’s research showed otherwise. A gas had a fixed amount of matter, which meant it probably wasn’t made from an infinity of smaller bits.
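A later formalism makes that “fighting back” concrete: in the van der Waals picture, each mole of gas carries a fixed excluded volume b that no amount of squeezing can remove. What follows is a sketch under that assumption (it is not Amagat’s own analysis), using standard textbook constants for nitrogen:

```python
R = 8.314         # gas constant, J/(mol*K)
a = 0.137         # van der Waals attraction for nitrogen, Pa*m^6/mol^2 (textbook value)
b = 3.87e-5       # van der Waals excluded volume for nitrogen, m^3/mol (textbook value)
T = 293.0         # room temperature, K

def pressure(Vm):
    """Pressure of one mole of van der Waals nitrogen at molar volume Vm."""
    return R * T / (Vm - b) - a / Vm**2

# Squeeze the gas: each step closes half the remaining gap to the floor at b.
Vm = 0.024  # roughly the molar volume at 1 atm, in m^3/mol
while Vm > 1.05 * b:
    print(f"Vm = {Vm:.3e} m^3/mol  ->  P = {pressure(Vm):.3e} Pa")
    Vm = (Vm + b) / 2
print("The molar volume can never drop below b: the gas refuses to shrink further.")
```

The pressure required explodes as the molar volume approaches b, a hard floor that an infinitely divisible gas would not have.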
Combined with Dalton’s swamp-gas discoveries, Amagat’s results made the idea of atoms look less like a hypothesis and more like a theory—something that has evidence in its favor. However, there was one big problem or, rather, a very small one. In order to make sense of Amagat’s readings you had to accept that atoms were tiny. Unthinkably so.
Imagine looking at planet Earth from space and trying to pick out a single grape on its surface. That is the equivalent of looking at a grape and trying to pick out a single atom on its skin.
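That comparison holds up to order of magnitude. Using rough sizes that I am assuming for illustration (Earth about 1.3 × 10⁷ meters across, a grape about 2.5 centimeters, an atom a few tenths of a nanometer):

```python
earth = 1.3e7     # Earth's diameter in meters (approximate)
grape = 2.5e-2    # a grape's diameter in meters (approximate)
atom  = 2.5e-10   # a typical atom's diameter in meters (approximate)

print(f"grapes across the Earth: {earth / grape:.1e}")  # ~5e8
print(f"atoms across a grape:    {grape / atom:.1e}")   # ~1e8
```

The two ratios land within a factor of a few of each other, which is all the analogy needs.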
If atoms were real they would need to be so small that even waves of visible light would be too big to bounce off them. It wouldn’t matter how powerful your microscope was: atoms would be impossible to discern by their very nature.
Scientists are in the business of testing theories once they’ve been established, but how could you test this one? How could you see the unseeable?
EINSTEIN WAS HERE
Albert Einstein was a legend in his own lifetime. What’s more impressive is that he deserved the reputation. Publishing over three hundred scientific papers and essentially inventing the landscape of modern physics, Einstein was the epitome of genius.
It would be foolish to summarize his many achievements in a few paragraphs, so we’ll focus on the one most relevant to chemistry: a paper he published on July 18, 1905, in which he made the atomic hypothesis testable rather than speculative.
While working at the Swiss patent office, Einstein stumbled across some research from 1827 by the Scottish botanist Robert Brown. Brown had noticed that grains of pollen floating on water appeared to jiggle in random patterns. Originally, he had assumed the grains were alive but found the same thing happened with sand or dust. The phenomenon was known as Brownian motion and, although unexplained, it was nothing more than a curiosity.
Einstein decided to model the pollen’s trajectory through the water and found it could only be explained as the result of bombardment from water particles.
To accurately describe how the pollen moved, you had to factor in the friction of pollen against water, which meant you had to accept the existence of “water atoms.”
Despite the persistent rumors that he failed math in school, Albert Einstein was a mathematician par excellence and drew up an equation that related water temperature to the pollen grain’s likely movement. By introducing an equation with a measurable outcome, Einstein changed the game completely. An idea can be debated but a number cannot, so if you can predict a specific value from your hypothesis you have something to search for directly.
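The relation Einstein derived survives today as the Einstein–Smoluchowski equation: a suspended grain’s mean squared displacement grows linearly with time, ⟨x²⟩ = 2Dt, where the diffusion coefficient D = kT/(6πηr) is set by the temperature T, the water’s viscosity η, and the grain’s radius r. Here is a minimal sketch of the prediction, with the grain size and viscosity as assumed, illustrative values:

```python
import math

k_B = 1.381e-23   # Boltzmann constant, J/K
T   = 293.0       # water temperature, K
eta = 1.0e-3      # viscosity of water, Pa*s (approximate)
r   = 0.5e-6      # radius of a pollen-sized grain, m (assumed for illustration)

# Einstein's 1905 result: the diffusion coefficient from temperature and drag.
D = k_B * T / (6 * math.pi * eta * r)

# Mean squared displacement along one axis after time t: <x^2> = 2 * D * t
t = 1.0  # seconds
rms = math.sqrt(2 * D * t)
print(f"D = {D:.2e} m^2/s, rms displacement after {t:.0f} s = {rms * 1e6:.2f} micrometers")
```

That works out to roughly a micrometer of jiggle per second for a pollen-sized grain: tiny, but comfortably measurable through a microscope, which is exactly what made the idea testable.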
He finished his paper with the phrase, “It is to be hoped that some enquirer may succeed shortly in solving the problem suggested here.”9 As was usually the case with Einstein, his equation was soon tested and confirmed. The zigzagging wasn’t random at all, but the result of minor fluctuations in water movement on either side of the grain. Pollen looked like it was undergoing constant collisions because it genuinely was.
In finding this, Einstein did for the atomic hypothesis what Lavoisier did for the elemental one: he provided indisputable, quantitative evidence. You couldn’t sensibly discuss elements without atoms anymore, or vice versa. There was no argument to be had. Atoms were real.
CHAPTER THREE
The Machine Gun and the Pudding
THE SMALLEST MOVIE IN HISTORY
In 1989 researchers at IBM pushed the boundaries of marketing by creating a sculpture of their company logo using only thirty-five atoms. Then in 2013 they went even further and created a sixty-second film, A Boy and His Atom, by drawing images with atoms and animating them through stop-motion, earning a Guinness World Record for the world’s smallest stop-motion film.