Few events darkened the canvas of 1995 like the atrocity at Srebrenica. In July 1995, Bosnian Serb forces entered the U.N.-designated “safe area” around the town and rounded up and executed more than 8,000 Muslim men and boys in the worst single outrage of the long and brutal war in Bosnia. The massacre’s mercilessness and scale stirred the United States, belatedly, to confront the horrors of Bosnia. The aftermath of Srebrenica brought diplomatic moves, backed by air strikes on Serb positions, that cleared the way for peace talks at Dayton, where the war was brought to a close after nearly four years.
“Srebrenica” lives on as a point of reference, a grim reminder of the hazards of inaction in the face of barbarity. “Srebrenica” has found resonance and relevance in the prolonged civil war in Syria, especially after the slayings of more than 100 people, most of them women and children, near the town of Houla in 2012. “Syria’s Srebrenica,” the Wall Street Journal declared.26 “Houla: Shadows of Srebrenica” was the headline in the Washington Post.27
But Houla gave rise to no international intervention in Syria, its similarities to Bosnia notwithstanding. The parallels are striking: the ruthlessness, random shelling, sectarian violence, gruesome imagery, and feeble international response have all made Syria seem much like a grim replay of the Bosnian war. Moreover, the diplomatic démarche that ended the war in Bosnia has been studied for insights that might be applied to Syria, where a despotic regime—backed by Russia and Iran—is pitted against the splintered forces of loosely allied rebels.28
Despite surface similarities, Syria is no Bosnia. The differences are profound, and even a ghastly “Srebrenica moment” is unlikely to force NATO to intervene, as it did in Bosnia in 1995. The images of Syria’s horror have had strikingly little effect on Western public opinion even though, as one observer has noted, they “evoke the horror of the Holocaust. They are a call to action.”29 It may be that the images from Syria—including those of children killed in nerve gas attacks in 2013—are unbearable, or that they demand more potent responses than their audiences can muster.30
No matter how disturbing the imagery, Americans have demonstrated little appetite for another military intervention in the Middle East, not after the agony of the Iraq War and the bursting of the post-Dayton “hubris bubble.” Perhaps the most decisive difference between Bosnia and Syria is that Russia is far less accommodating nowadays than it was in 1995, when it was still reeling from the breakup of the Soviet Union four years earlier. The autocratic Vladimir Putin is intent on restoring Russia’s standing as an international power, and crucial to his ambitions is blocking U.S. initiatives in Syria and elsewhere. So the civil war in Syria may grind on as a war of attrition, producing more horrors and bloodshed. But Syria is not likely to have its Dayton.
In the twenty years since entering the American mainstream, the Internet has extended its reach deep into the economic, political, social, and cultural facets of contemporary life—so much so that it has come to be taken almost for granted. Occasionally, though, there are echoes of Bob Metcalfe’s famously wrong prediction in late 1995 that the Internet was headed for a catastrophic collapse.31 For example, Network World published a column a few years ago by Johna Till Johnson, an engineer and technologist. The column appeared beneath the headline “Is the Internet Doomed to Fail?” It suggested that demand for Internet access would soon exceed capacity. Johnson acknowledged that predicting the Internet’s impending failure seemed “crazy . . . in this era of Facebook, Twitter and a ‘digital millennial’ generation that’s grown up never not knowing the Internet. But there are worrying signs that the Internet’s architecture may not be able to scale effectively much longer. . . . Demand for access bandwidth is growing exponentially, while provider investment is growing linearly. The lines cross—demand exceeds capacity—sometime around 2012.”32 The year came and went, and the Internet lives on.
Probably no one has defined the era of the Internet as consistently and singularly as Marc Andreessen, an inventor of the Mosaic browser and cofounder of Netscape whom a Newsweek report in 1995 declared to be “the über-super-wunder whiz kid of cyberspace.”33 Andreessen is only in his forties, but he has become a kind of éminence grise in Silicon Valley—an éminence grise who shaves his head. He is respected more than ever. Wired magazine said in 2012 that no one in the preceding twenty years had “done more than Marc Andreessen to change the way we communicate.”34
Andreessen these days heads a venture capital firm, Andreessen Horowitz, that has backed such winners as Facebook, Twitter, Skype, Groupon, and Pinterest. He also is much sought after for his predictions and assessments, which tend to be sweeping and colorful. He marvels still at the unfolding digital world. In a commentary in the Wall Street Journal a few years ago, he declared that “software is eating the world”35—meaning that the Internet would continue to exert its disruptive effects on industry after industry.36 And in 2014, Andreessen observed: “We’re just now starting to live in the world where everybody has a supercomputer in their pocket and everybody’s connected. And so we’re just starting to see the implications of that.”37
He does not talk much about Netscape these days, but the memory of Netscape, that swaggering exponent of the early Web, lives on. Its dazzling initial public offering of stock in August 1995—when the Internet seized the attention of the financial world—is a significant moment in digital lore as well as a telling point of reference. Netscape’s IPO was recalled, for example, when LinkedIn, the social-networking platform popular among business professionals, scored a highly successful public debut in 2011. LinkedIn’s shares soared 109 percent on the day they opened for trading, an introduction that was evocative of Netscape’s head-turning, first-day success in 1995.38 It was a “Netscape moment,” said the San Francisco Chronicle, noting that the IPOs of Netscape and LinkedIn both “delivered nine-figure paydays to top executives and minted millionaires throughout the ranks,” on paper, at least.39
The Netscape browser, which introduced the Web to millions of people, has its heir in Firefox, the open-source browser that was developed in a collaborative, nonprofit venture that Netscape set up. The project was called Mozilla. Several months before its acquisition by America Online in 1998, Netscape opened the source code of its Navigator Web browser—making it available online and effectively inviting anyone with the technical know-how to revise and improve the software. Netscape set up Mozilla to manage the project.40 In 2003, America Online spun off the Mozilla project, and, the following year, Firefox 1.0 was introduced. It swiftly gained support among consumers and Web developers who had grown disenchanted with Internet Explorer, the victor in the 1990s “browser war” with Netscape.41 The introduction of Firefox shook up a then-dormant browser market and signaled the emergence of other competitors, such as Google’s Chrome browser and Apple’s Safari. By 2014, Chrome had supplanted Internet Explorer as the most popular Web browser, commanding a market share of more than 40 percent. Explorer’s market share had receded to 23 percent, a fraction of its dominance after vanquishing Netscape.42
To identify 1995 as a hinge moment of the recent past is also to say that the time has come for a searching reappraisal of the 1990s, that the time is ripe to confront and flatten the caricatures so often associated with the decade. The 1990s may still seem recent, but ample time has passed to allow the decade to be examined critically and with detachment. After all, as John Tosh noted in his well-regarded study The Pursuit of History, “it is the recent past on which people draw most for historical analogies and predictions, and their knowledge of it needs to be soundly based if they are to avoid serious error. The recent past has also proved a fertile breeding ground for crude myths—all the more powerful when their credibility is not contested by scholarly work.”43
Crude myth has begun to define the American 1990s. In recent years, the conservative syndicated columnist Charles Krauthammer scoffed at the 1990s as a “holiday from history” and a “soporific Golden Age,”44 a time when the United States largely ignored the gathering threat of Islamic terrorism only to pay a staggering price in 2001. The decade was, Krauthammer said, “our retreat from seriousness, our Seinfeld decade of obsessive ordinariness.” His critique is aimed principally at Bill Clinton, whom he dismissed as “a president perfectly suited to the time—a time of domesticity, triviality and self-absorption.”45 Krauthammer’s criticism of Clinton is not entirely without merit, although recent scholarship has argued that Clinton recognized the growing terrorist threat and took steps, if not decisive ones, to meet it.46
Admirers of Clinton, such as journalist Haynes Johnson, have likened the 1990s to “the best of times”—words in the title of Johnson’s book published in 2001.47 The “best of times” interpretation sees the American 1990s through a lens of a booming economy at home and unrivaled power abroad. But neither “holiday from history” nor “the best of times” is very accurate or nuanced. They are more like expedient labels than telling summaries. The American 1990s were a complex period when the United States grew hesitantly and fitfully into the role of the world’s lone superpower, when a dazzling communication technology went from obscurity to near-ubiquity, and when a rising tide of democratization abroad reached, at least briefly, into once-inhospitable lands. For a heady moment in the first years of the decade, it seemed as if Western liberal democracy had triumphed everywhere, signaling an end point in mankind’s long quest for rational and stable government.48
The 1990s were a searching time, rich in promise, in portent, and in disappointment. And squarely in the midstream of the decade was its most decisive year.
The Timeline of a Watershed Year: 1995
January 1
The last original Far Side, a popular, single-panel cartoon by Gary Larson, appears in U.S. newspapers, closing a fifteen-year run.
January 2
Marion Barry is inaugurated mayor of Washington, D.C., an office he left in 1991 following an FBI undercover operation that captured him on videotape smoking crack cocaine. He served a six-month prison term. Barry won the district’s mayoral election in November 1994.
January 4
For the first time in four decades, Congress convenes under Republican control. Newt Gingrich becomes speaker of the House.
January 5
The New York Times, in a report citing “several senior American and Israeli officials,” says Iran “could be less than five years away from having an atomic bomb.”
January 8
After 1,143 performances, the musical Guys & Dolls closes at the Martin Beck Theater in New York City.
January 9
Sheik Omar Abdel-Rahman and eleven other defendants go on trial in federal court in New York City. They are accused of conspiring to wage terrorist attacks across the city.
January 11
Oprah Winfrey, host of the country’s most-watched daytime television talk show, breaks down during the taping of a program about drug abuse, saying she had smoked cocaine twenty years earlier. The program aired January 13.
January 12
A daughter of Malcolm X, Qubilah Shabazz, is arrested in Minneapolis and accused of attempting to hire a hitman to kill Louis Farrakhan, leader of the Nation of Islam.
January 16
A prosecutor in Union, S.C., says he would seek the death penalty for Susan Smith, accused in the car drowning of her sons, 3-year-old Michael and 14-month-old Alex. The boys died strapped in their car seats.
January 17
A magnitude 6.9 earthquake strikes Kobe, Japan, killing more than 6,400 people. It is Japan’s worst earthquake in more than seventy years.
January 18
The French minister of culture, Jacques Toubon, announces the discovery of Stone Age paintings in a cave in southern France. The more than 300 images of animals and human hands are believed to have been made 20,000 years earlier.
January 22
Rose Fitzgerald Kennedy, mother of President John F. Kennedy and two U.S. senators, dies at Hyannis Port, Mass., at age 104.
January 24
The prosecution begins opening statements at the O. J. Simpson double-murder trial in Los Angeles. Deputy District Attorney Christopher Darden tells the jury: “I think it’s fair to say that I have the toughest job in town today. Except for the job that you have. Your job may just be a little bit tougher.” Darden adds, in something of an understatement: “It’s going to be a long trial.”
January 25
The defense begins opening statements in the Simpson trial. Lead defense lawyer Johnnie L. Cochran, Jr. tells jurors: “The evidence in this case, we believe, will show that O. J. Simpson is an innocent man wrongfully accused.”
January 27
A quickly compiled book by O. J. Simpson, I Want to Tell You, arrives in stores. Proceeds from sales will help cover legal expenses incurred by Simpson in his high-profile trial in Los Angeles on charges of killing his former wife and her friend. Simpson says in the book that he is eager to testify in his own defense.
January 29
The high-powered San Francisco 49ers overwhelm the San Diego Chargers, 49–26, to become the first team to win five National Football League Super Bowl titles.
January 31
President Bill Clinton announces he will act unilaterally to provide Mexico with a $20 billion loan to support the beleaguered peso.
February 2
Prosecutors at the O. J. Simpson trial play for jurors a tape of an emergency 911 telephone call, in which Simpson is heard screaming, cursing, and threatening to harm his former wife, Nicole Brown Simpson. She placed the call from her home in October 1993.
February 3
Air Force Lt. Col. Eileen Collins becomes the first woman to pilot a NASA mission as space shuttle Discovery blasts off from Cape Canaveral, Fla.
February 6
Siddig Ibrahim Siddig Ali pleads guilty in federal court and implicates his former codefendants standing trial on charges of plotting elaborate terrorist attacks across New York City. Siddig Ali was accused of taking a central role in the conspiracy.
February 7
Ramzi Ahmed Yousef, suspected mastermind of the 1993 World Trade Center bombing and other terrorist plots, is arrested in Pakistan.
February 9
Former U.S. Senator J. William Fulbright, 89, an outspoken opponent of America’s war in Vietnam, dies of a stroke in Washington, D.C.
February 10
Sweden’s prime minister, Ingvar Carlsson, says the Russian submarines that were believed to have entered Swedish territorial waters from time to time since the 1980s were really minks.
February 12
Jurors in the O. J. Simpson double-murder trial are taken to the site in the Brentwood section of Los Angeles where Nicole Brown Simpson and Ronald L. Goldman were fatally stabbed in June 1994. They also visit Simpson’s estate nearby.
February 13
House Speaker Newt Gingrich says he will not seek the Republican nomination for president in 1996.
February 15
FBI agents arrest Kevin Mitnick, a 31-year-old computer hacker and suspected cyberthief accused of stealing data files and credit card numbers from computer systems across the country. Mitnick would spend nearly five years in prison.
February 19
In an event signaling the start of the long run to the 1996 presidential election, nine Republican candidates who have entered or are likely to enter the race deliver remarks at a fund-raising dinner in New Hampshire. The event falls a year and a day before the state’s first-in-the-nation presidential primary election.
February 21
Steve Fossett, a 50-year-old Chicago stockbroker, becomes the first person to fly solo in a balloon across the Pacific Ocean. His balloon lands in western Canada after a four-day flight from South Korea.
February 22
France accuses five Americans, including four diplomats, of espionage and reportedly asks them to leave the country.
February 23
The Dow Jones industrial average surges past the 4,000 mark, closing at 4,003.33, a record high.
February 26
Britain’s oldest investment bank, Barings PLC, is forced into bankruptcy protection after a 28-year-old trader in its Singapore office, Nick Leeson, loses $1.38 billion speculating on Tokyo stock prices.
February 27
Publication date of a now-famous Newsweek commentary, “The Internet? Bah!” in which the author, Clifford Stoll, dismisses the emergent digital world as over-hyped and over-sold—“a wasteland of unfiltered data.”
March 1
A relative newcomer, Sheryl Crow, wins “record of the year” honors at the annual Grammy Awards ceremony in Los Angeles for “All I Wanna Do.” Tony Bennett’s MTV Unplugged wins the “album of the year” award.
March 2
The U.S. Senate narrowly turns down a constitutional amendment to require a balanced federal budget. The vote was 65–35, just shy of the two-thirds majority required for passage.