By the 1830s, the growing demand for data of all sorts and about all things was well on its way to becoming the data-intensive model of knowledge of today. Although there is no magic year or decade in which the current information landscape was sown with the seeds of digital data, we can see the 1830s as a juncture when all the elements necessary for a great information inflation were in place. We can call the 1830s the golden spike—the marker geologists use to establish boundaries between geological time periods—to demarcate the boundary zone between an artisanal information landscape marked by scarcity and the technologically intensive forensic landscape that followed. Before the 1830s, only a handful of people saw the Earth as old and none saw humans as products of evolution. But in the 1830s we see—at least in the rearview mirror—two events that historians note as landmarks of the new knowledge landscape: the publication of Lyell’s Principles of Geology and Darwin’s assimilation of deep time into his theories about the origins of species. As J. W. Burrow points out, “Evidence of the antiquity of man in the conjunction of human artefacts and remains of extinct animals, which had been accumulating from excavations in France and England from the 1830s onwards, was finally accepted, after long skepticism, at the end of the 1850s.”
THE INVENTION OF THE SCIENTIST
By the 1830s, the enterprise of science had begun its slow transformation from an amateur pursuit of knowledge, accessible largely to men of means, to a profession open to men and women from different ranks and economic classes. In this decade natural philosophy and natural history converged to form the domain we now know (in English) as science, meaning the physical sciences. The merging of natural philosophy and natural history, together with the realignment of the natural sciences around evidence-based practices, was codified with the invention of the word “scientist” in 1833.
Though ubiquitous today, the word is relatively new in the English-speaking world. It was coined by William Whewell, an eminent man of science who convened the third meeting of the British Association for the Advancement of Science. This group was formed to “promote science in every part of the empire” and was open to all “cultivators of science.” After a stem-winding speech of welcome extolling the union of facts and theory, of natural history and philosophy, Whewell was publicly confronted by the renowned Samuel Taylor Coleridge. The man of letters strenuously objected to these humble practitioners, with dirt on their hands and mud on their shoes (the stigmata of fieldwork), assuming the mantle of philosopher. Very well, Whewell amiably and shrewdly agreed. If philosopher is a term “too wide and lofty” for the likes of the assembled, then “by analogy with artist, we may form scientist” as one who is “a student of the knowledge of the material world collectively.”
The word came only slowly into common parlance, not showing up often in written sources until the 1860s—and even then mostly in America. (For a long time, the English thought the word a vulgar American neologism.) Its invention nonetheless signaled that a scientist would be expected, like the artist, to have exquisitely keen powers of observation, to record those observations faithfully, and to command a minimum of manual skills: to craft tools of observation, to set up finely calibrated experiments, and to present the results with graphical as well as verbal precision. The term “scientist” grew in usage along with the professionalization of the field. Though the pace and complexion of the change varied by country and class, by the 1870s the physical sciences were fully fledged professions supported by an educational system in most European and Anglo-American countries.
NEW MACHINES FOR NEW DATA
The forensic imagination demanded better instruments of investigation and lots of them. That desire to read Nature’s archives drove—and still drives—the invention of new technologies to observe, measure, record, play back, analyze, compare, and synthesize information. The body of information spewing out of telescopes, microscopes, cameras, X-ray machines, lithographic printing presses, and calculating machines grew at a constantly accelerating pace. Libraries, museums, and archives were built with startling speed, filled up, added new storage units, and began to swell with books and scientific journals, maps and charts, photographs and sound recordings, natural specimens and fossils, cargo containers full of scientific booty from numerous expeditions to ever more remote corners of the world and beyond into space. The unified body of knowledge encompassed in Jefferson’s library fractured into multiplying domains of specialized expertise that today can barely understand each other’s methods and terminologies. The world of knowledge fissured, like the ancient landmass Pangaea that broke into seven continents. And like tectonic plates in constant motion, domains of knowledge collide and break into smaller areas of expertise, each supporting rapidly evolving, highly diversified forms of knowledge and producing more and more data every year.
Information technologies began their dizzying cycles of innovation in media and data compression at this time. Before the 1830s, books were largely printed on linen and cotton rag paper and produced in small-scale printing shops. By the 1830s, the mechanical mass production of books using cheap wood pulp paper was under way. Before the 1830s the palette of recording technologies was limited to paper, ink, watercolors, and oils. Beginning in the 1830s, new technologies appeared in rapid succession: image capture (the first daguerreotype was taken in 1839); sound recording (the first recording of a human voice was made in 1860 by Édouard-Léon Scott de Martinville, and the first playback machine was built by Thomas Edison in 1877); and X-rays (discovered by Wilhelm Röntgen in 1895).
Before the Internet, distributing knowledge required the transportation of physical objects. In 1830, the first intercity railroad, between Liverpool and Manchester, opened, inaugurating the age of rapid overland routes that carried not only coal and grain but also books and magazines. At the same time, immaterial modes of communication were under development. Electrical telegraphy was developed in several places in the 1830s (St. Petersburg, Göttingen, and London), and the telephone was invented in the 1870s by Alexander Graham Bell.
The growth of new information technologies in turn created demand for a new information infrastructure—more libraries, archives, museums, collecting depots, specimen repositories, and skilled staff to manage all the assets they housed. By the 1830s, the volume of books to be collected had grown to such a degree that these institutions became something quite different from the libraries that Montaigne or Jefferson knew. In 1836, the Library of Congress had 24,000 volumes, four times the number it had held twenty years earlier. But it was a newcomer to the game. The British Museum (whose library is now part of the British Library) had 180,000 titles, the imperial library at St. Petersburg 300,000, the Vatican 400,000, and the royal library in Paris was nearing 500,000.
The proliferation of specialized libraries and archives in turn demanded a new architecture, purpose-built to store collections at scale and to provide reading rooms that could accommodate hundreds of bodies in chairs. The sheer quantity of information being assembled also demanded specialization, and again in the 1830s, libraries began to focus on specific types of content for specific users. In 1832, the Library of Congress had accumulated so many books, manuscripts, and maps that its primary function—to serve the legislative needs of Congress—was overwhelmed by the volume of materials. Congress formalized a separate Law Library of Congress, moved it into its own space in the Capitol, and hired special staff to attend to it.
The nineteenth century was marked by a series of crises around physical and intellectual control of all the evidence streaming in. Emboldened by the dream of accelerating the rate of human progress and well-being, expanding our control over the natural world, and freeing ourselves from drudge labor, we went to work on documenting the world. We built more infrastructure to manage the documents, supported the growth of highly specialized fields of knowledge to keep pace with the incoming data, and trained cadres of skilled professionals who in turn spontaneously differentiated themselves, like Darwin’s finches developing differently shaped beaks. Technologies and tools coevolve with the ideas and cultural practices of their users. And then the users outgrow them and want more. Science itself cannot advance without ever finer tools of observation, measurement, experimentation, and the communication of these results to other scholars. (The World Wide Web was devised to speed communication of research results among collaborating scientists at distant sites.) Thus the happy relay race between demand for and supply of new technologies to observe, measure, and record accelerates into the giddy pace of the Red Queen and Alice running faster and faster just to keep up.
THE ART OF DEDUCTION
The art of forensics is in seeing matter as the footprint of the past. This is an aesthetic intuition based on the recognition of certain patterns that recur in Nature and are often described by scientists and mathematicians as elegant, beautiful, or simple. The science of forensics lies in the patient application of refined skills to decipher the clues left by previous states of existence. It takes decades of education to develop the scientific sensibility to detect significant patterns and acquire skills to decode the message in the matter. But the premise itself is breathtakingly simple, something anyone can grasp intuitively, without any specialized knowledge. The aesthetic appeal and metaphysical profundity of the forensic insight are the reasons the whodunit-and-how fiction of detection arose in the nineteenth century and still reigns as the grand narrative of our technological age, the Age of Matter.
By the 1840s the forensic imagination had already penetrated popular culture. In 1841, Edgar Allan Poe published the first story of detection, a “tale of ratiocination.” “The Murders in the Rue Morgue” features Monsieur C. Auguste Dupin, a man of science who uses the scantest of physical evidence to arrive at an improbable yet true deduction about the murderer of two women—an orangutan. Arthur Conan Doyle acknowledged Dupin as an inspiration for Sherlock Holmes. Beyond the intrinsic charm of an eccentric detective, what makes Dupin, Holmes, and legions of their descendants so compelling is the near-mystical marriage of knowledge and practice, the interpretation of the world put in the service of moral action: the catching of criminals and transgressors. Their zealous, even ascetic, dedication and single-mindedness of purpose became the hallmark of the professional man (and eventually woman). Immersed in the data-dense environment of a crime scene, Sherlock Holmes always brought laser-like focus and purpose. He believed that being a specialist meant that he had to pick and choose what he paid attention to. “You say that we go round the sun,” he said to Watson in one of his periodic fits of pique at the obtuseness of his friend. “If we went round the moon it would not make a pennyworth of difference to me or to my work.”
The main reason materialist science triumphed over all competing models of knowledge lies in its effectiveness in prediction. By eliding a deity who can intervene at will to suspend natural laws, we have gained considerable control over our own destiny. Science gains its powers of explanation by conscientiously separating “the objects of natural knowledge from the objects of moral discourse.” Scientists separate how questions from why questions and dwell exclusively on what is, not what ought to be. This is the moral hazard Socrates warned against—that by alienating our knowledge, making it “external to us,” we have bought an immense measure of power over the world at the expense of having power over ourselves. The nineteenth century saw the rapid rise of specialization, and not only in domains of knowledge. Labor, too, grew increasingly specialized as factory production adopted the assembly line. When Karl Marx described the “alienation of labor,” he was not just talking about economic theory but also about the growing perception that laborers were losing a sense of autonomy. As we outsource more of the most intimate part of ourselves—our personal memory and identity—to computer code, the fear of losing our autonomy—the alienation of our data, so to speak—increases, because in the digital age only machines can read our memory and know what we know at scale. As we gain mastery over our machines, this anxiety will lessen. But it will never go away, for the trade-offs we make between our appetite for more knowledge and our need for autonomy and control will continue to keep us on the alert for unintended consequences.
Science can create very powerful technologies, but it is not science alone that can help us manage them. As the historian Steven Shapin says, “The most powerful storehouse of value in our modern culture is the body of [scientific] knowledge we consider to have least to do with the discourse of moral value.” Today we ascribe virtue to those who advance the progress of humankind rather than the salvation of souls. The elite vanguard of virtue are no longer priests and clerics dressed up in black and crimson robes, but rather scientists, engineers, and technology entrepreneurs dressed down in white lab coats and denim jeans. The tale of progress from ignorance to enlightenment follows the same plotline as the narrative from sin to salvation. This is a distinctly Western perspective, quite different from the cyclical view of time typical of Hindu and Buddhist world views. But it is the one that undergirds the global language of digital memory.
The conversion to materialism was the critical change in consciousness that led to our current dominion over the world. The cluster of causes that reinforced the forensic shift is not hard to isolate: the embrace of empirical methods and materialist theories to understand natural causes and effects; the harnessing of that understanding by economic systems that apply this knowledge to create ever finer instruments and accumulate greater amounts of evidence; the dedication of resources to educating an expert workforce to generate and apply more knowledge; and political regimes that keep the pursuit of scientific knowledge well resourced and safe from either religious or ideological interference. Take any of these four factors away, and science and technology are crippled. But taken together, these forces create a runaway effect that results in an inflation of information. Such an outcome was probably inconceivable to Thomas Jefferson, and no aspect of it would be more confounding than its effect on his library.
CHAPTER SEVEN
THE SCIENCE OF MEMORY AND THE ART OF FORGETTING
What I perceive are not the crude and ambiguous cues that impinge from the outside world onto my eyes and my ears and my fingers. I perceive something much richer—a picture that combines all these crude signals with a wealth of past experience … Our perception of the world is a fantasy that coincides with reality.
—CHRIS FRITH, COGNITIVE PSYCHOLOGIST
LIFE WITH NO LIMITS
In the nineteenth century, we renounced intellectual limits, a priori censors, and religious filters on what we allowed ourselves to know. Emboldened by the Enlightenment cult of reason, we saw curiosity no longer as a vice but as a civic virtue. We institutionalized public support of libraries, schools, and a free press to propagate progress and freedom. By the end of the twentieth century, digital computers had vaulted us over the physical barrier to ideas spreading like fire “from one to another over the globe.” Information can now be shared ubiquitously and nearly instantaneously.
Now we confront a new barrier—the natural limits of human attention and ability to absorb information. These are biological constraints shared by all living creatures. The analog memory systems we have superseded were well adapted to this limitation, constraining the production of and access to print and audiovisual content. To be equally effective, the digital memory systems we build will need to acknowledge and compensate for our natural limits. The biology of memory is a young science, still in its salad days. Yet even preliminary findings suggest how to work with, rather than try to defeat, our brain’s architecture of constraints to enhance personal and collective memory. Memory sets up a series of gates and controls to turn away the trivial or distracting and allow ready admittance of valuable information for conversion to long-term memory.
Scientists learn about how memory works in large part by studying what happens when memory breaks down. Two types of memory failure have particular implications for memory in the digital age. The first failure is the interruption of long-term memory formation. Without the conversion of short-term memories into long-term ones, we are not able to make sense of the world, recognize patterns, or make inferences and conjectures; in short, we cannot learn. Without long-term memory, we would be stuck in the present. Every day would be Groundhog Day, only with no happy Hollywood ending in sight. The second failure is the loss or disintegration of memory. Amnesia robs us of the ability not only to remember the past, but also to imagine the future, make predictions, and engage in mental time travel. If we lose the long-term memory of humanity, we will be like amnesiacs, not knowing who we are, where we have been, or where we are going.