by Bill Bryson
Congress refused to heed Washington’s request and insisted he take a salary of $25,000 a year. It also did him the honour of allowing him to choose the site of the nation’s permanent capital – not so much out of altruism as because it couldn’t decide on a location itself. At least forty sites had been considered and argued over, from Germantown, Pennsylvania, to Kingston, New York, before Washington was authorized to make his choice. He selected a ten-mile square alongside the Potomac River’s head of navigation. (In 1846 Virginia reclaimed the portion on its side of the river, which explains why the modern Washington has ruler-straight boundaries on three sides but an irregular wriggle on the fourth.) In 1791 the city-to-be was named Washington; the 6,100-acre tract within which it was situated was called the Territory of Columbia (eventually of course changed to District of Columbia), thus neatly enshrining in one place the two great mythic names of the age.
Two years later, Washington laid the cornerstone for the Capitol, and in 1800 the city of Washington opened for business. America was on its way.
5
By the Dawn’s Early Light: Forging a National Identity
Bombardments in the early nineteenth century provided a spectacle that must have been quite thrilling to anyone not on the receiving end. The art of the matter was to cut fuses to just the right length so that they would detonate at or near the moment of impact. In practice, they went off all over the place. Hence the ‘bombs bursting in air’ of the American national anthem. As most people know, the words to the anthem were inspired by the bombardment of Fort McHenry in Baltimore Harbour during the War of 1812. Francis Scott Key, a young lawyer, had been sent to try to negotiate the release of an American prisoner, and found himself detained aboard a British man-of-war.
Through the night Key watched as the British fleet, ranged round the harbour, threw a colourful fusillade of explosives at the embattled fort. When dawn broke and Key saw the American flag still flying, tattered but defiant, he was sufficiently moved to dash off a poem. The poem was frankly terrible, but it had an emotional impact that is easily forgotten at this remove. Published under the title ‘Defence of Fort M’Henry’, and set to the decidedly funereal tune of an English song called ‘To Anacreon in Heaven’ (the beat has since been considerably enlivened), it became a sensation. Soon almost everyone had forgotten its original title and was calling it ‘The Star-Spangled Banner’, by which name it has been known ever since.
The flag that Key saw flying over Fort McHenry had fifteen stars and fifteen stripes. In the early years of independence, the custom was to add a star and a stripe to the flag each time a state joined the Union. By 1818 Congress was flying a flag with no fewer than eighteen stripes and it was becoming evident that the practice would soon become unsustainable. Congress decided that enough was enough and officially decreed that henceforth flags should have thirteen stripes (one for each of the original colonies) and as many stars as there were states.
The War of 1812 also saw the rise of another American icon: Uncle Sam. He appears to have arisen in 1813 in Troy, New York, but little more than that is known.1 Previously America had been personified by a character of obscure origins called Brother Jonathan, who usually appeared in opposition to the English John Bull. The inspiration for Uncle Sam is sometimes traced to one Samuel Wilson, an army inspector of Troy, but it seems more probable that the name was merely inspired by the initials US. The top-hatted, striped-trousered figure that we associate with the name came much later. It was popularized in the 1860s in the cartoons of Thomas Nast, and later reinforced by the famous I WANT YOU recruiting posters of the artist James Montgomery Flagg, in which Uncle Sam lost his genial sparkle and took on a severe, almost demonic look, which he has generally kept to this day.
Thus by the end of the second decade of the nineteenth century America had a national anthem (though it would not be officially recognized as such until 1931), a more or less fixed flag and a national symbol in the form of Uncle Sam. It was, in short, beginning to accumulate the rudiments of a national identity.
But in other ways America remained a collection of disparate parts, each following its own course. This was most arrestingly seen in the absence of uniform times. Until as late as 1883, there were no fixed times in America. When it was midnight in New York, it was 11.47 in Washington, and 11.55 in Philadelphia. In 1869, when Leland Stanford struck the golden spike that marked the completion of America’s first transcontinental railway (in fact he couldn’t manage to drive the spike in; the work had to be completed by someone more adept with a manual implement), the news was instantly telegraphed to a breathlessly waiting nation. In Promontory, Utah, the great event happened at 12.45, but in nearby Virginia City it was deemed to be 12.30. In San Francisco it was 11.46 or 11.44, depending on which authority you believed, and in Pittsburgh the information was simultaneously received at six places and logged in at six different official times.
In an age when most information arrived by horseback, a few minutes here and there hardly mattered. But as the world became more technologically sophisticated, the problem of variable timekeeping did begin to matter. It was a particular headache for the railways and those who travelled on them. In an effort to arrive at some measure of conformity, most railway companies synchronized the clocks along their own lines, but often these bore no relationship to the times used either locally or by competing railways. Stations would often have a multiplicity of clocks – one showing the station time, another the local time and the rest showing the times on each of the lines serving that station. Passengers unfamiliar with local discrepancies would often arrive to catch a train only to find that it had recently departed. Making connections in a place like Chicago, where fifteen lines met, required the careful study of fat books of algorithms showing all the possible permutations.
Clearly something needed to be done. The first person to push for uniform time throughout the country was the rather unlikely figure of Charles F. Dowd, head of the Temple Grove Ladies’ Seminary in Saratoga Springs, New York. In 1869, the year of Leland Stanford and the golden spike, Dowd began agitating for the adoption of four time zones very much along the lines of those used today. The idea met with surprisingly heated objections. Many thought it somehow ungodly to tinker with something as elemental as time, ignoring the consideration that clocks are not a divine concept. Some communities saw it as an impudence to expect them to change their clocks for the benefit of commercial interests like the railways and telegraph companies. Almost everyone found the entire notion strange and puzzling, particularly those who lived on or near the prospective time zone borders. People in a place like North Platte, Nebraska, couldn’t for the life of them understand why their neighbours down the road in Ogallala should get to rise an hour later than they did each day. The objections extended even to groups of the greatest eminence. The British Association for the Advancement of Science for one dismissed the idea as ‘too utopian’.2
Finally in November 1883, after a meeting called the National Railway Time Convention, it was agreed to introduce time zones and synchronize clocks. The date 18 November, dubbed ‘the day of two noons’, was set for its inception. For two weeks, people everywhere fretted and fussed as if the country were about to be struck by an outsize meteor. Farmers worried that their hens would stop laying or that their cows would go dry. Workers in Chicago, suspecting they were to be compelled to work an extra nine minutes on the big day, threatened to strike. By the dawn of the appointed day, the nation was in a fever of uncertainty. Just before noon people everywhere began silently gathering by town halls and court-houses to watch the clocks change.
Although the time change had no legal authority – it was done solely at the behest of the railways – it was introduced almost everywhere, and almost everywhere the event proved to be disappointingly anticlimactic. Millions watched as the hands on their court-house clocks were summarily advanced or moved back a few notches, then pursed their lips and returned to business as it dawned on them that that was as exciting as it was going to get. Here and there local difficulties cropped up. In Washington, a disagreement between the US Attorney General and the head of the Naval Observatory meant that for several years government clocks in the city showed a different time to all others.3 For the most part, however, America took to uniform timekeeping with barely a flutter and life grew easier because of it.
The fuss over introducing time zones was as nothing compared with the push, half a century later, for summer time, or daylight-saving time as it quickly came to be called. The driving force behind this idea was a businessman named William Willett, who wanted it primarily so that he would have more daylight to play golf in the evenings. Again the outcries were vociferous. The New York Times called it ‘an act of madness’ and others seriously suggested that one might equally change thermometers to make summers appear cooler and winters warmer. As one historian has put it, ‘the idea of altering clocks to suit some human whim made daylight saving seem both unnatural and almost monstrous to its many opponents’.4 Although America briefly instituted daylight saving as a way of conserving energy supplies during World War I, such was the antagonism to the idea in some quarters that it wasn’t until 1966 that America got universal summer time.
Money, too, was a feature of American life that did not become standardized until relatively late in the day. Until the issuing of the first ‘greenbacks’ during the Civil War, the federal government in Washington produced no paper money, but only coins. Paper money was left to banks. As recently as the first half of the nineteenth century banks – and the word is used loosely to describe some of these institutions – were in the happy position of being able to print their own money. Types of bills proliferated wildly. In Zanesville, Ohio, to take one example, no fewer than thirty banks churned out money under such colourful appellations as the Virginia Saline Bank and the Owl Creek Bank. The bills were often of such dubious value that they were referred to as shinplasters.5 Some banks’ money was more respected than others’. The Citizens’ Bank of New Orleans issued a particularly sought-after $10 bill. Because the French word for ten, dix, was inscribed on the back, the bills became known as Dixies. As a descriptive term for the whole of the South, the word didn’t really catch on until Daniel Decatur Emmett, a northerner, wrote the immensely successful song ‘Dixie’s Land’ (which almost everyone thinks, wrongly, is called ‘Dixie’) in 1859.6
With so many types of money floating about, the situation would appear to have been hopelessly confused, but in fact it was a huge improvement on what had gone before. Throughout the long colonial period, the British had allowed very little British specie to circulate in the colonies. Though businesses kept their accounts in pounds, shillings and pence, they had to rely on whatever tender came to hand. A bewildering mixture of home-minted coins and foreign currency – Portuguese johanneses (familiarly known as joes), Spanish doubloons and pistoles, French sous and picayunes, Italian and Flemish ducatoons, American fugios (so called because the Latin fugio, ‘I fly’, was inscribed on one side) and other coins almost without number – circulated throughout the colonies, and business people had to know that 1s. 4d. was equal in value to one-sixth of a milled peso (the original ‘piece of eight’), that a Spanish or Mexican real was worth twelve and a half cents, that a Portuguese johannes traded for $8.81, that 2s. 3d. was equivalent to half a Dutch dollar. Along the eastern seaboard, a real was generally called a shilling, but elsewhere it was more racily known as a bit. First found in English in 1688, bit may be a translation of the Spanish pieza, ‘piece’ (which metamorphosed into peso), or it may be that the early coins were literally bits broken from larger silver coins. Because a bit was worth twelve and a half cents, a quarter dollar naturally became known as two bits, a half dollar as four bits, particularly west of the Mississippi. Ten cents was a short bit; a long bit was fifteen cents. Even after America began minting its own coins, foreign coins remained such an integral part of American commerce that they weren’t withdrawn from circulation until 1857.
To add to the confusion, values varied from place to place. In Pennsylvania and Virginia, a half-real went by the alternative name fipenny bit or fip because it was equivalent in worth to an English five-penny piece. But in New York it was worth sixpence and in New England fourpence halfpenny. It is something of a wonder that any business got done at all – and even more wondrous when you consider that until after the Revolution there wasn’t a single bank in America. Philadelphia got the first, in 1781; Boston and New York followed three years later.7
Not surprisingly, perhaps, many people dispensed with money and relied instead on barter, or country pay as it was often called. The goods used in barter were known as truck (from the Old French troquer, meaning to peddle or trade), a sense preserved in the expression to have no truck with and in truck farm, neither of which has anything to do with large wheeled vehicles. (In the vehicular sense, truck comes from the Latin trochus, ‘wheel’.)
The decimalized monetary system based on dollars and cents was devised by Gouverneur Morris as assistant to the superintendent of finances, in consultation with Thomas Jefferson, and adopted in 1784 against the protests of bankers and businessmen, most of whom wanted to preserve English units and terms such as pounds and shillings. The names given to the earliest coins were something of an etymological rag-bag. In ascending order they were: mill, cent, dime, dollar and eagle. Dollar comes ultimately from Joachimstaler, a coin that was first made in the Bohemian town of Joachimstal in 1519 and then spread through Europe as daler, thaler and taler. In an American context dollar is first recorded in 1683.8 Dime, or disme as it was spelled on the first coins, is a corruption of the French dixième, and was intended to be pronounced ‘deem’, though it appears that hardly anyone did. The word is not strictly an Americanism. Dime had been used occasionally in Britain as far back as 1377, though it had fallen out of use there long before, no doubt because in a non-decimal currency there was no use for a term meaning one-tenth. Cent of course comes from the Latin centum, ‘one hundred’, and was rather an odd choice of term because initially there were two hundred cents to a dollar.9 The custom of referring to a single cent as a penny is a linguistic hold-over from the days of British control. No American coin has ever actually been called a penny. (The term appears to come from the Latin pannus, ‘a piece of cloth’, and dates from a time when cloth was sometimes used as a medium of exchange.)
A notable absentee from the list of early American coins is nickel. There was a coin worth five cents but it was called a half dime or jitney, from the French jeton, signifying a small coin or a token. When early in this century American cities began to fill with buses that charged a five-cent fare, jitney fell out of use for the coin and attached itself instead to the vehicles. Nickel didn’t become synonymous with the five-cent piece until 1875; before that nickel signified either a one-cent or three-cent piece. The phrase ‘don’t take any wooden nickels’ dates only from 1915 – and, no, there never was a time when wooden nickels circulated. Such a coin would have been immediately recognizable as counterfeit and in any case would have cost more to manufacture than it was worth.
One of the more durable controversies in the world of numismatics is where the dollar sign comes from. The first use of $ in an American context is in 1784 in a memorandum from Thomas Jefferson suggesting the dollar as the primary unit of currency, and some have deduced from this that he made it up there and then, either as a monogram based on his own initials (improbable; he was not that vain) or as a kind of doodle (equally improbable; he was not that unsystematic). A more widely held notion is that it originated as the letters U and S superimposed on each other and that the U eventually disintegrated into unconnected parallel lines. The problem with this theory is that $ as a symbol for peso far outdates its application to US dollars. (It is still widely used as a peso sign throughout Latin America.) The most likely explanation is that it is a modified form of the pillars of Hercules, wrapped around with a scroll, to be found on old Spanish pieces of eight.
Many slang terms and other like expressions associated with money date from the nineteenth century. Americans have been describing money as beans (as in ‘I haven’t got a bean’) since 1810 and as dough since at least 1851, when it was first recorded in the Yale Tomahawk. Small change has been around since 1819, not worth a cent since the early 1820s, and not worth a red cent since 1839. Upper crust dates from 1832, easy money from 1836, C-note (short for century note) for a $100 bill from 1839, flat broke and dead broke from the 1840s. Americans have been referring to a dollar as a buck since 1856 (it comes from buckskin, an early unit of exchange). Sound as a dollar, bet your bottom dollar, strike it rich, penny-ante and spondulicks or spondulix (a term of wholly mysterious origin) all date from the 1850s. A $10 bill has been a sawbuck since the early 1860s. It was so called because the original bills had a roman numeral X on them, which brought to mind a saw-horse, or sawbuck. Mazuma, from a Yiddish slang term for money, dates from 1880, and simoleon, another word of uncertain provenance, meaning $1, dates from 1881.