As the Industrial Age gathered steam, more products—even more disconnected from their producers—needed to be sold. Ad agencies developed powerful brands to camouflage the factory-based origins of most of what people consumed. Industrial agriculture became the valley of a green giant, and factory-made cookies became the work of little elves working in a hollow tree. Mass media arose to disseminate all of these new myths, utterly devoid of facts. And as long as media remained a top-down proposition, there was very little fact-based, peer-to-peer communication to challenge any of it. People were working hard on assembly lines or in cubicles anyway, no longer experiencing themselves in their multiple social roles simultaneously. They were workers on the job trying to earn a paycheck, and consumers at home relaxing to the mythological drone of mass media.
Digital technology broke this.
The fundamental difference between mass media and digital media is interactivity. Books, radio, and television are “read only” media. We watch, but only have a choice over how we will react to the media someone else has made. This is why they are so good for storytelling: We are in the storyteller’s world and can’t talk back. Digital media, on the other hand, are “read-write.” Any digital file that is playable is also sharable and changeable. (Files can be locked, at least until hackers figure out how to break the lock, but such protection is ultimately against the bias of the medium. That’s why it so rarely works.) As a result, we are transitioning from a mass media that makes its stories sacred, to an interactive media that makes communication mutable and alive.
Likewise, and in stark opposition to the media monopolies of broadcast radio and television, digital communications technologies are based on networks and sharing. The original reason computers were networked to one another was so that they could share processing resources. This makes them biased toward peer-to-peer activity. Mass media respected only the law of gravity: The people with the presses or broadcast facilities dropped their myths down onto the masses. Digital media go up, down, and sideways. In a sense there is no longer any up or down at all, as each node in the network can receive the message or refuse it, change it or leave it alone, and delete it or pass it on.
We’re back in the bazaar. Only instead of individuals conversing one-on-one with our local friends and associates, each of us has a global reach greater than that of most broadcast television networks. Any one of our blog posts or tweets can end up becoming the next runaway meme, reaching millions of fellow users in hours—each of whom is free to comment, remix, and mutate the original. And once again, we are no longer confined to our arbitrarily limited and distinct roles of workers or consumers. We are at once consumers, producers, investors, critics, and more, capable of breaking down the myths of mainstream media and revealing truths to one another. People are connected to one another on more than one basis again.
It’s hard for any company to maintain its mythology (much less its monopoly) in such an environment. As we transform from media consumers back to cultural communicators, we message one another seeking approval and reinforcement. Myths and narratives will always be deconstructed, and mistruths eventually corrected. The bias of our interactions in digital media shifts back toward the nonfiction on which we all depend to make sense of our world, get the most done, and have the most fun. The more valuable, truthful, and real our messages, the more they will spread and the better we will do. We must learn to tell the truth.
Sometimes it’s the most negative truths that spread the best and fastest: a sports hero in a sex scandal, a celebrity porn tape, a terrible crime, or an urban legend that goes “viral.” Yet even in the worst of those cases, the rumor is usually based either on an underlying truth or a cultural issue that has not been adequately addressed by its target. That’s why people are compelled to repeat it when they hear it. Whether it’s the news of a disowned princess dying in a car crash or a presidential candidate whose father is not a citizen, the untruths that spin out from there are just the uncontrolled mutations of people succumbing to some of the other biases of digital media. The information is still being presented and accepted as fact by newly minted digital citizens working against centuries of mythological control. They are not yet particularly adept at discerning the truth. Even though the facts they believe may be wrong, they are still committed to the nonfiction style of communication.
The same is true for traditional media, where “reality” programs now outnumber scripted shows. Instead of watching situation comedies, we watch real people placed in outrageous situations: geeks trying to woo models, women competing to marry a millionaire who is actually a poor construction worker, or dwarves doing almost anything. By aping the nonfiction bias of net entertainment, television and other traditional formats end up reflecting only the worst side of each of digital technology’s other biases. The result is violent spectacle, dehumanizing humiliation, and collective cruelty. But the underlying urge is to participate and capitalize on a culture returning to fact-based exchange. It is not an exact science.
As a person’s value and connections in the digital realm become dependent on the strength of their facts and ideas, we return to a more memetic, fertile, and chaotic communications space. Once a message is launched—whether by an individual or the CEO of a Fortune 500 company—it is no longer in that person’s control. How it is received, how it is changed, and whether it is replicated and transmitted is up to the network. May the best meme win.
Advertising agencies believe they have all this interactivity in hand. They look at the digital communications space as a “conversation” through which the newly empowered consumer can speak her mind to the company, ask for what she wants, and then see herself reflected in the brand. Back and forth, call and response. Of course that’s just the wishful thinking of mass media professionals who have no great facts to transmit, and it’s wrong on both counts: It’s not a two-way conversation, and the person on the other end is no longer identifying herself as a consumer.
The digital bazaar is a many-to-many conversation among people acting in one or more of their many cultural roles. It is too turbulent to be directed or dominated—but totally accessible to the memes of almost anyone, companies included. And since big companies, nations, and organizations generally produce things that affect a lot of people, the memes they release will tend to have more relevance and replicate better. Just not predictably. So when a car company decides to give its customers the online tools to make their TV commercials for a new vehicle, the most popular videos end up being anti-commercials, critical of the gas-guzzling SUVs. These scathing satires are the ones that get passed around the net, and even rebroadcast on television. It’s news. The company gets a conversation—just not the one it wants. That’s because on the net, mythologies fall apart and facts rise to the surface.
Many are dedicated to promoting this phenomenon. Technology sites sponsor contests to see who can reveal the inner workings of upcoming products before the manufacturers release them—much to the consternation of Silicon Valley CEOs and their marketing departments. Meanwhile, and much more significantly, sites like WikiLeaks and Memory Hole provide cover for activists with information they want to release to the public. Whether it’s damning transcripts from a corporation’s board meeting or the Afghan War policy documents of the Pentagon, the real facts now have a way to rise to the surface. We may hear what these institutions are saying to us, but now we also know what they actually did last summer. . .
The beauty—and, for many, the horror—is that actions are even more memetic than words. In a digital communications space, the people do the talking. If a company wants to promote conversation about itself, all it really needs to do is something, anything, significant. There are companies who get on the front page of the newspaper simply for releasing an upgrade to a phone. This is less about their ability to communicate than the power and importance of their actions to so many people.
In advertising terms, this means abandoning brand mythology and returning to attributes. It may sound obvious to those of us in the real world, but marketers need to learn that the easiest way to sell stuff in the digital age is to make good stuff. The fictional story that cookies were baked by elves is no longer as important as whether the cookies are healthy, have natural ingredients, are sourced appropriately, involve slave labor, or are manufactured in an environmentally friendly fashion. The facts about the cookies—particularly the facts that are socially relevant—are what will spread online, and it will happen quite naturally as employees share these facts with their friends on social networks, and consumers share these facts with potential shareholders, and so on. Ads based on brand image will only have staying power if they happen to be contradicted by some real fact about the company; then they will be valued by bloggers as terrific, visual evidence of corporate duplicity.
Likewise, people will thrive in a digital mediaspace as they learn to share the facts they’ve discovered and disregard the nonsense. We all have relatives who mistakenly pass on ridiculous viral emails about corporations that will give a donation of a million dollars if you pass the email on to others, or a kid in a hospital who needs a blood transfusion, or a threatening virus that will erase your data if you don’t shut down your computer immediately. It’s sweet that they want to share with us; it’s just a shame they don’t have anything real to share. Viral media fills this need for them, giving them fake facts with which to feed digital media’s bias for nonfiction contact.
Those who succeed as communicators in the new bazaar will be the ones who can quickly evaluate what they’re hearing, and learn to pass on only the stuff that matters. These are the people who create more signal and less noise, and become the most valued authorities in digital media. But the real winners will once again be those who actually discover and innovate—the people who do and find things worthy of everyone else’s attention. They’re the ones who give us not only good excuses to send messages to one another, but also real ways for us all to create more value for one another.
The way to flourish in a mediaspace biased toward nonfiction is to tell the truth. This means having a truth to tell.
IX. OPENNESS
Share, Don’t Steal
Digital networks were built for the purpose of sharing computing resources by people who were themselves sharing resources, technologies, and credit in order to create them. This is why digital technology is biased in favor of openness and sharing. Because we are not used to operating in a realm with these biases, however, we often exploit the openness of others or end up exploited ourselves. By learning the difference between sharing and stealing, we can promote openness without succumbing to selfishness.
No matter how private and individual we try to make our computers, our programs, and even our files, they all slowly but surely become part of the cloud. Whether we simply back up a file by sending it to the server holding our email, or go so far as to create a website archive, we all eventually make use of computing resources we don’t actually own ourselves. And, eventually, someone or something else uses something of ours, too. It’s the natural tug of digital technology toward what may well be its most essential characteristic: sharing.
From the CPU at the heart of a computer distributing calculations to various coprocessors, to the single mainframe at a university serving hundreds of separate terminals, computer and network architecture has always been based on sharing resources and distributing the burden. This is the way digital technology works, so it shouldn’t surprise us that the technologists building computers and networks learned to work in analogous ways.
Perhaps because they witnessed how effective distributed processing was for computers, the builders of the networks we use today based both their designs as well as their own working ethos on the principles of sharing and openness. Nodes on the Internet, for example, must be open to everyone’s traffic for the network to function. Each node keeps the packets that are addressed to it and passes on the others—allowing them to continue their journey toward their destination. Servers are constantly pinging one another, asking questions, getting directions, and receiving the help they need. This is what makes the Internet so powerful, and also part of what makes the Internet so vulnerable to attack: Pretty much everything has been designed to talk to strangers and offer assistance.
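The forwarding rule described above—keep what is addressed to you, pass everything else along—can be sketched in a few lines of Python. All names here (the nodes, the `handle` and `route` functions) are purely illustrative, not any real networking API:

```python
# A minimal sketch of store-and-forward routing: each node keeps packets
# addressed to it and relays the rest toward their destination.

def handle(packet: dict, node_id: str, next_hop):
    """Decide what this node does with an incoming packet."""
    if packet["dest"] == node_id:
        return ("keep", packet["payload"])   # addressed to us: keep it
    return ("forward", next_hop)             # not ours: pass it on

def route(packet: dict, chain: list):
    """Walk a packet hop by hop along a chain of cooperating nodes."""
    for i, node in enumerate(chain):
        nxt = chain[i + 1] if i + 1 < len(chain) else None
        action, detail = handle(packet, node, nxt)
        if action == "keep":
            return f"{node} kept: {detail}"
    return "undelivered"

print(route({"dest": "C", "payload": "hello"}, ["A", "B", "C"]))
# → C kept: hello
```

The point of the sketch is the openness: nodes A and B relay a stranger's packet without inspecting or refusing it, which is exactly what makes the network both powerful and vulnerable.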
This encouraged network developers to work in the same fashion. The net was built in a “gift economy” based more on sharing than profit. Everyone wanted a working network, everyone was fascinated by the development of new software tools, so everyone just did what they could to build it. This work was still funded, if indirectly. Most of the programmers were either university professors or their students, free to work for credit or satisfaction beyond mere cash.
Pretty much everything we use on the Internet today—from email and the web to streaming media and videoconferencing—was developed by this nonprofit community, and released as what they called freeware or shareware. The thrill was building the network, seeing one’s own innovations accepted and extended by the rest of the community, and having one’s lab or school get the credit. The boost to one’s reputation could still bring financial reward in the form of job advancement or speaking fees, but the real motivator was fun and pride.
As the net became privatized and commercialized, its bias for openness and sharing remained. Only now it is often people and institutions exploiting this bias in order to steal or extract value from one another’s work. Digital technology’s architecture of shared resources, as well as the gift economy through which the net was developed, have engendered a bias toward openness. It’s as if our digital activity wants to be shared with others. As a culture and economy inexperienced in this sort of collaboration, however, we have great trouble distinguishing between sharing and stealing.
In many ways—most ways, perhaps—the net’s spirit of openness has successfully challenged a society too ready to lock down knowledge. Teachers, for example, used to base their authority on their exclusive access to the information their pupils wished to learn. Now that students can find out almost anything they need to online, the role of the teacher must change to that of a guide or coach—more of a partner in learning who helps the students evaluate and synthesize the data they find. Similarly, doctors and other professionals are encountering a more educated clientele. Sure, sometimes the questions people ask are silly ones, based on misleading ads from drug companies or credit agencies. Other times, however, clients demonstrate they are capable of making decisions with their professionals rather than surrendering their authority to them—often leading to better choices and better results.
The net’s bias toward collaboration has also yielded some terrific mass participatory projects, from technologies such as the Firefox browser and Linux operating system to resources like Wikipedia. As examples of collective activity, they demonstrate our ability to work together and share the burden in order to share yet again in the tool we have gained. For many, it is a political act and a personal triumph to participate in these noncommercial projects and to do so for reasons other than money.
These experiences and tools have, in turn, engendered an online aesthetic that is itself based in sharing and repurposing the output of others. As early as the 1920s, artists called the Dadaists began cutting up text and putting it together in new ways. In the 1960s, writers and artists such as William Burroughs and Brion Gysin were experimenting with the technique, physically cutting up a newspaper or other text object into many pieces and then recombining them into new forms. They saw it as a way to break through the hypnosis of traditional media and see beyond its false imagery to the real messages and commands its controllers were trying to transmit to us without our knowledge. Digital technology has turned this technique from a fringe art form to a dominant aesthetic.
From the record “scratching” of a deejay to the cut-and-paste functions of the text editor, our media is now characterized by co-opting, repurposing, remixing, and mashing-up. It’s not simply that a comic book becomes a movie that becomes a TV series, a game, and then a musical on which new comic books are based. Although slowly mutating, that’s still a single story or brand moving through different possible incarnations. What we’re in the midst of now is a mediaspace where every creation is fodder for every other one.
Kids repurpose the rendering engines in their video games to make movies, called “machinima,” starring the characters in the game. Movies and TV shows are re-edited by fans to tell new stories and then distributed on free servers. This work is fun, creative, and even inspiring. But sometimes it also seems to cross lines. Books are quoted at length or in decontextualized pieces only to be included as part of someone else’s work, and entire songs are repurposed to become the backing tracks of new ones. And almost none of the original creators—if that term still means anything—are credited for their work.
In the best light, this activity breaks through sacrosanct boundaries, challenging monopolies on culture held by institutions from the church to Walt Disney. After all, if it’s out there, it’s everyone’s. But what, if anything, is exempt from the churn? Does committing a piece of work to the digital format mean turning it over to the hive mind to do with as it pleases? What does this mean for the work we have created? Do we have any authority over it, or the context in which it is used? We applaud the teenager who mashes up a cigarette commercial to expose the duplicity of a tobacco company. But what about when a racist organization mashes up some video of your last speech to make a false point about white supremacy?