By that time, it was 1971, and Liz Straus was now Liza Loop. She was married with a young son, living in the placid farm town of Cotati, California, amid Sonoma County wine country. She’d missed the fieriest days of countercultural protest, but she’d become a passionate crusader nonetheless: for educational reform. She wanted all children to have a learning experience as stimulating, and empowering, as her own had been at Dana Hall.
Liza Loop had plenty of company. Student demonstrations and teacher strikes had become a global phenomenon, as newly politicized young people pushed against a rote and rule-bound system seemingly trapped in the nineteenth century. Educational experts questioned the relevance of teaching with paper and textbooks in an age when nearly every American family had a television in their living room, and enthusiastically endorsed the idea of introducing programmable “teaching machines” into the classroom. On top of it all, American schools remained civil rights battlegrounds, as continuing integration struggles had given way to fiercely contested court-ordered busing. These measures prompted some white parents to violently resist, and many others to opt out of public school systems altogether. Alternative schooling movements like Montessori and Waldorf spiked in popularity.8
A growing number of educational advocates had begun talking about computers as critical features of the new and improved schools of the future. The newly installed Nixon Administration began to investigate how it could build, and fund, a network of computers in schools. Major electronics firms, including IBM, joined in the effort to create specialized hardware and software for education. The Ford Foundation was so committed to the enterprise that it established a separate nonprofit dedicated to the cause. “Learning is the new growth industry,” exulted its president, Harold Gores. Berkeley dean Robert Tschirgi proclaimed that the computer was “the greatest thing to hit education since Johann Gutenberg invented movable type.”9
When the first computers actually landed in the classroom, however, high-flying corporate rhetoric didn’t come remotely close to educational reality. The software was clunky. The hardware was as rigid and bureaucratic as the analog systems it replaced. Most of all, teachers didn’t know enough about computers to teach students how to use them, and the user interface wasn’t designed for students to learn it on their own. Fancy hardware ended up in school basements and storage closets, gathering dust.
Things worked out differently, however, when computer specialists themselves brought the machines into schools, and worked with teachers and students to create more-individualized curricula. This happened in the case of Dean Brown, a Stanford psychologist who’d started working with Montessori teachers to teach very young children how to program devices and play rudimentary computer learning games. Brown’s vision of how computers might transform education was miles away from—and conceptually far more audacious than—the ideas bandied about by political and corporate leaders. “Education is the realization and the unfolding of that which is latent, already there,” he wrote in 1970. “The teacher is a creative artist and the computer can be a chisel in his hands.”10
Shortly after she enrolled in Sonoma State’s master’s program, Liza Loop found herself in a class led by Brown. Everything changed. “I spent five minutes in the room with Dean and said, ‘That’s my career, that’s where I’m going,’” Loop remembered. Intensely sociable and expansively friendly, this Sonoma County housewife didn’t fit any tech stereotype. “I’m not particularly interested in computers,” she jauntily confessed. “It’s humans that turn me on.” But Loop’s lack of formal training in computing ultimately became an asset, enabling her to push past technical jargon and explain to ordinary people—kids, teachers, and especially girls and women like her—how computers could become part of their lives.11
THE NEW GUARD
Blending the change-the-world politics of the counterculture with the technophilic optimism of the Space Age, Felsenstein and Loop became two members of a steadily enlarging techno-tribe that emerged in the Bay Area and other college towns and aerospace hubs at the turn of the 1970s.
Many were like Lee Felsenstein: Sputnik-generation boys with science-fair ribbons who’d collided head-on with the cultural liberation of the Vietnam era. They proudly called themselves “hackers,” relentlessly future-focused, suspicious of centralized authority, pulling all-nighters to write the perfect string of code. They demonstrated superior technical talent by infiltrating (and sometimes deliberately crashing) institutional computer networks. Overlapping their ranks were the renegade “phone phreaks” who discovered how to use high-pitched signals to break into AT&T’s networks and enjoy long-distance calls for free.12
But a good number were also like Liza Loop: baby boomers drawn to computers by a passion to change the way society worked, especially in how it educated a new generation. There was Pam Hardt, a Berkeley computer science dropout and co-founder of a San Francisco commune called Resource One; she secured a “long-term loan” of an aging SDS minicomputer, settled it in the commune’s living room, and made it the mothership of a time-shared bulletin-board system called Community Memory. There was Bob Albrecht, an engineer who quit his corporate gig at supercomputer maker Control Data Corporation to join an educational nonprofit called the Portola Institute, a far-ranging collective operated on a shoestring. Portola spawned the bible of the techno-counterculture, the Whole Earth Catalog, created by artist, utopian, and “happening” impresario Stewart Brand. High-tech met hippiedom on the Catalog’s pages, which featured fringed buckskin jackets and camp stoves alongside scientific calculators. Its motto: “Access to Tools.”13
Albrecht’s project was the People’s Computer Company, started in 1972 as a walk-in storefront for computer training, accompanied by a loose and loopy newsletter “about having fun with computers.” Festooned with hand-drawn dragons and off-kilter typesetting, the PCC had the rangy look and feel of an underground tabloid like the Berkeley Barb (where Felsenstein had become a staff writer). Instead of columns decrying Nixon’s bombing of Cambodia, the PCC had features on how to learn computer languages, with titles like “BASIC! Or, U 2 can control a computer.”14
If Bob Albrecht was the revolution’s Ben Franklin, then Ted Nelson became known as its Tom Paine. A computing-entranced former graduate student in sociology with a prep-school accent and the manners to match, Nelson considered himself a specialist in ideas “too big to get through the door.” In the mid-1960s, he came up with a nonlinear system for organizing writing and reading he called “hypertext.” In 1974, he applied the concept in a self-published book titled Computer Lib: You Can and Must Understand Computers NOW! (Flip the volume over, and there was a second book, Dream Machines, which talked about computers as media platforms. Nelson was thinking well ahead of his time.)
“Knowledge is power and so it tends to be hoarded,” exhorted Nelson. “Guardianship of the computer can no longer be left to a priesthood” who refused to build computers that could be understood by ordinary people. Released into the world as Richard Nixon was helicoptering away from the White House in disgrace, Computer Lib made clear who the enemies were. “Deep and widespread computer systems would be tempting to two dangerous parties, ‘organized crime’ and the Executive Branch of the Federal Government (assuming there is still a difference between the two),” he wrote. “If we are to have the freedoms of information we deserve as a free people, the safeguards have to be built in at the bottom, now.”15
In that drop-out-and-tune-in place and moment, these men and women began to think and talk about how computers could transform from fearsome weapons of the establishment into tools of personal empowerment and social change. Of course, the fact that Northern California had been such a hub of Cold War science was why many of them were there in the first place. They’d migrated west for college and graduate school, or jobs in government labs and industrial research operations. Their knowledge of computing came from their prior participation in the technocratic system they criticized. And they weren’t all kids. Many were professionals in their twenties and thirties, with children, mortgages, and graduate degrees.
Thus the gulf between the scientific Cold Warriors and the techno-utopians was not as great as it seemed. Many of the ideas that animated the personal-computer crusade, like human-computer interaction and networked collaboration, were the same ones that had consumed the Cambridge seminars of Norbert Wiener in the 1940s and the labs of McCarthy and Minsky and Licklider in the 1950s. The new generation believed in the same principle that had animated government science ever since Vannevar Bush celebrated its “endless frontier” in 1945: technological innovation would cure society’s problems and build a better American future.16
Such technophilia also made this change-the-world movement oddly conservative when it came to disrupting conventional gender roles, reckoning with society’s racism, or acknowledging yawning economic and educational inequalities. In this crowd, the Liza Loops and Pam Hardts remained in the distinct minority. Nonwhite faces almost never appeared. Radical feminism’s impact was brief and glancing; Black Power and other civil rights movements were rarely given a nod. For some of these technologists, a singular focus on computing was an escape from identity politics. For others, tech was an answer to social inequities. The overwhelmingly white and middle-class group had faith that “access to tools” would fix it all.
For all their blind spots, the new generation’s minds were blown open by the political and cultural earthquakes of the 1960s. And technology did set them free, for the existence of the microprocessor allowed them to bring their ambitious and intensely personal vision of a thinking machine far closer to reality. Their core idea—that the computer could be used by anyone for creation and communication and work and play—became the right idea at exactly the right time.
INFORMATION OVERLOAD
The metaphor of the mainframe had coursed throughout 1960s politics as eloquent shorthand for American political and social institutions—governments, armies, corporations, universities—and the stifling conformity they imposed. “There’s a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can’t take part! You can’t even passively take part!” famously declared the Berkeley Free Speech Movement’s Mario Savio in late 1964. “And you’ve got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you’ve got to make it stop.” Student protesters that autumn pinned signs to their chests bearing a riff on the prim warning that appeared on every IBM punch card: “I am a UC student. Please don’t bend, fold, spindle or mutilate me.”17
Students decried computers as tools that stripped them not only of their individuality but of their privacy as well. Only a few months before Savio and his compatriots mobilized in Berkeley, investigative journalist Vance Packard hit the best-seller lists with The Naked Society, which outlined the fearsome extent of electronic snooping in several hundred ulcer-inducing pages. “If Mr. Orwell were writing [1984] today rather than in the 1940s,” Packard wrote, “his details would surely be more horrifying . . . There are banks of giant memory machines that conceivably could recall in a few seconds every pertinent action—including failures, embarrassments, or possibly incriminating acts—from the lifetime of each citizen.”18
Right on Packard’s heels came the publication of the English-language translation of The Technological Society, a grim assessment of the modern condition by French sociologist Jacques Ellul. The transistor and computer had locked modern society in a battle between individual agency and machine conformity. The machine seemed to be winning. When that happens, Ellul concluded darkly, “everything will be ordered, and the strains of human passions will be lost amid the chromium gleam.”19
Perhaps no author captured the information-age zeitgeist more thoroughly than a journalist and self-appointed futurist named Alvin Toffler. The hyperkinetic New Yorker had started his adult life as a Marxist civil rights activist, which he followed with several years experiencing the workingman’s life as a welder in Cleveland. Eventually he moved into journalism, ultimately leaving Marx behind to become an editor at the buoyantly capitalist Fortune magazine. With his wife, Heidi, he started up a consulting practice and began writing books (she did not get authorial credit until many volumes later). Prolific, prone to hyperbole, and relentlessly self-promoting, Toffler burst onto the best-seller lists in the spring of 1970 with Future Shock, a 500-page treatise on how technology was transforming everything—and blowing everyone’s minds in the process.
Forty-one years old and sympathetic to the “post-materialist” priorities and open-minded sexual mores of the younger generation, Toffler released a thundering waterfall of prose designed to hook and excite—and frighten—a general audience. “What is occurring now is not a crisis of capitalism, but of industrial society itself,” he wrote. “We are simultaneously experiencing a youth revolution, a sexual revolution, a racial revolution, a colonial revolution, an economic revolution, and the most rapid and deep-going technological revolution in history.”20
If you already thought life was chaotic, Toffler argued, then you didn’t know the half of it. “Change is avalanching down on our heads,” he said, “and most people are grotesquely unprepared to cope with it.” It was, Toffler argued, a question of too much information: “The entire knowledge system in society is undergoing violent upheaval. The very concepts and codes in terms of which we think are turning over at a furious and accelerating pace.” The modern world, he memorably concluded, had an acute case of “information overload.”21
In a nation already deeply anxious about technology and nearly everything else, Toffler’s book was a hit straight out of the gate. Three major book-of-the-month clubs chose Future Shock, and it had buoyant sales despite withering reviews. (One called it “a high school term paper gone berserk.”) Style aside, the book’s dystopic overtones were certainly a lot to swallow; Toffler’s own mother remarked: “If that’s the way it’s going to be, I don’t want to be here.” Nonetheless, Future Shock ultimately sold five million copies, and made Alvin Toffler into an inescapable seer of the information age.22
For all its wilder ideas and overstuffed prose, Toffler’s book was stunningly prescient. He predicted that technology would break down large bureaucracies into an “ad-hocracy” of many smaller, more agile units that could grow and shrink on demand. He talked about how electronic communications would enable a splintering of mass culture into thousands of different, specialized channels where everyone could get their own, specially tailored news. He talked of how inundation by information would reduce attention spans and increase skepticism toward expert authority. He pointed out how much the U.S. already had shifted toward a service economy, and how information technologies accelerated that shift.
From the Manhattan Project to manned spaceflight to massive Cold War–era interventions in the Third World, Americans had understood technology as a big-organization tool to solve large-scale problems—war, famine, poverty, education, transportation, and communication. Most academic seers of postindustrial society generally operated on the presumption that bigness would still prevail, even if the means of production would change. Future Shock reflected a shift in a different direction. Technology might instead become a way to fix the problems of the world, to push against social institutions, and achieve self-actualization. But the path to do so would be by going small. One of the few optimistic notes sounded by Future Shock had to do with the destiny of big and unfeeling organizations. “The bureaucracy,” wrote Toffler, “the very system that is supposed to crush us all under its weight, is itself groaning with change.” Ultimately, he asserted, technology would break down big institutions, restoring individual autonomy in the process.23
THE COMPUTER NEVER FORGETS
It wasn’t only student radicals and grandiloquent futurists questioning the mainframe status quo—it was establishment politicians too. From the mid-1960s glory days of the Great Society to the scandal-wracked last months of the Nixon era, Capitol Hill lawmakers devoted hundreds of hours to fiery floor speeches and hearings on computers and privacy. The target of their ire wasn’t IBM or the corporations that used its products, but government bureaucracies with rapidly growing electronic databases enumerating everything from a person’s age and marital status to their medical history and draft number.
Senator Sam Ervin was one of the most prominent and consistent of these critics. A strict Constitutionalist (and the ardent states’-rights segregationist who had so deeply disliked 1965’s immigration reforms), Ervin gained enduring fame as the folksy chair of the Senate Watergate Committee. Before that, however, he spent the first years of the 1970s helming an investigation into government computers, and his hearings generated juicy headlines. With chairman’s gavel in one hand and a densely printed sheet of microfilm in the other, Ervin railed against the encroaching “dossier dictatorship” in Washington, warning darkly, “The computer never forgets.”24
On the House side, the crusader-in-chief was Neil Gallagher, Democrat of New Jersey, who parlayed a Kennedyesque demeanor and a knack for comparably pungent soundbites into a reputation as a privacy advocate. He started holding hearings on “Computers and the Invasion of Privacy” in 1966, bringing in witnesses like Vance Packard. By 1970, Gallagher was taking his pitch directly to the computer professionals themselves, describing “the computer as ‘Rosemary’s Baby’” in an anguished essay for the journal Computers and Society. “The whir of the computer as it digests and disseminates dossiers . . . is frequently the sound of flesh and blood being made soulless,” he wrote. “Raw data are now extracted in much the same way teeth are pulled: either under the ether of uninformed consent or ripped out by the roots.”25