Carrier also discovered that many of the difficulties in combining tasks were similar across age groups. For example, reading for pleasure was the activity least likely to be combined with other tasks (although 46 percent of the Net Geners reportedly tried to do it often anyway). That’s not surprising in light of the depth of thought that reading requires. When you read, “you recruit more of your senses, you recruit more of your higher-level reasoning processes, your imagination gets more involved. If you do it right, it’s a mentally intensive task. It requires paying attention to the material and accessing your long-term memory,” Carrier said. Much of the information in a text requires you to make inferences and draw conclusions in your imagination. All of that makes it very hard to read while sending texts or answering e-mail.
But one side effect may be that more difficult tasks, the ones that really require your brain to kick into third or fourth gear, may become less appealing. Carrier says research shows that “traditional reading, print reading is not as engaging” to certain younger groups anymore. That’s not true of all reading or all groups of young people. But once students have had the chance to experience multimedia approaches, they may find the results much more interesting. Younger kids, who are exposed to that kind of stimulation more often, now think and work with different kinds of visual and auditory stimulation. “They didn’t grow up thinking print reading was the end-all, be-all of the highest level of scholarly pursuit,” he said.
Carrier’s points about reading are at the heart of the debate over multitasking. Kids come home from school and open their laptops (purportedly to do homework) but may also watch movies, chat with friends, or update their status on a social network. Then, when they sit down with a book, their brains say, “Hey, wait a minute, I’m not used to just sitting here with words. Where are the images, where’s the conversation, where are the pop-up windows?”
Reading is an extremely demanding task; done correctly, it exercises the imagination and numerous areas of the brain. Reading also forces the brain to think deeply, wiring it for introspection and sustained thought, and it is an essential part of intellectual growth and ingenuity. But this doesn’t mean that all forms of reading and learning need to happen this way: there is a balance of other media that we can swirl into the brain’s learning apparatus.
Innovations in electronic books may well change the way we look at reading in the future. A history book about, say, the Civil War could include a video game instead of just having words and maps. After reading about the Battle of Gettysburg, for example, you might go into the battle as a soldier or a general and experience this turning point of the war “firsthand.”
Or a biography of Albert Einstein could include an interactive avatar-like program of him. You could ask him questions about his life or about the theory of relativity. You could engage in an interactive conversation with an actor or read his papers together. To me, that sounds like a pretty compelling form of learning.
This is the type of stimulation and learning the next generation may demand. In the media survey for the Kaiser Family Foundation, a seventeen-year-old girl explained, “I get bored if it’s not all going at once, because everything has gaps—waiting for a website to come up, commercials on TV, etc.”
As we will see in chapter 8, the experience will drive the success of future stories. People who make their living telling stories will feel more and more pressure to create experiences that offer multiple layers of content, additional social feedback from a community with shared interests, threaded topics, and true interaction. If they don’t, they may capture only their audience’s partial attention.
From a scientific and research-based perspective, Carrier believes that as you add more simultaneous media to the way we teach and tell stories, you’ll “recruit more of your senses, you’ll recruit more of your higher level kind of reasoning processes. Your imagination gets more involved, and you get higher motivation levels.”
From a personal perspective, especially as I think about what I learned while researching this book, I believe Carrier is right. My own exploration included conducting interviews, watching videos, listening to lectures, and reading research papers and books; I created my own form of interactive learning. Future students and researchers will go further, expecting their sources to be archived, searchable, and available in multiple formats. And if a story is told in the fashion the multitasking generation is used to, they will give its subject more than one ear, or at least their partial attention.
City Multitasker/Country Multitasker
All the studies discussed above show how rapidly our brains are capable of adapting to and melding with new environments. Some of these changes are iterative, happening gradually as new technologies enter our lives, and some are sudden and explosive, but our arguably underutilized brains simply morph and readjust to the new experiences.
If Johannes Gutenberg had invented the Internet five hundred years ago instead of inventing the printing press, our brains wouldn’t have exploded and turned into sloppy green goo. We would have figured out how to use the new technology and harness it to share information and tell stories, just as we are doing today.
Do we create the technologies to cater to our brains’ thirst for stimulation, or are our brains just doing what they need to do to keep up? Most of the scientists I’ve interviewed agree that the brain’s thirst for stimulation drives each new technological innovation. We want to know more, and we want to see it, smell it, feel it, and hear it, engaging all of our senses in the experience. Young people are getting a taste of this in their own learning and exploration, and they will want more, both for themselves and for their children.
Reading and imagination remain important. But how can we expect a child who goes online three or four hours a day, pecking and clicking, defining her own media path, delving into immersive and fully interactive storytelling experiences, to sit still and read a book or watch a movie if the experience does not stimulate her brain properly? Some will say, of course, that these children are spoiled (or stupid) and that they have lost the ability to concentrate. Or some may assume they have ADD and shouldn’t spend as much time online, since it only compounds the problem.
One initial response is to limit the amount of time spent playing video games, the hours online, or the number of text messages. But it’s a mistake to think that this ricochet behavior is a “problem” that needs to be “solved.” What if we look at it from the other point of view? The problem isn’t the multitasking generation but the media it is consuming. Maybe these older types of content—books, movies, television, and newspapers—aren’t adapting appropriately to the technologies and expectations of young and old alike, of today’s adapted and more demanding brains.
Video games and the Internet are not a detriment to our brains and society. Learning how to manage fourteen buttons at once or navigate content-rich websites is a benefit, not a hindrance to further learning. As I’ve spelled out, research shows that video-game players have superior eye-hand coordination, an increased capacity for visual attention, and a vastly superior set of spatial visualization skills.
This doesn’t mean that all books and TV shows need to become storytelling carnivals full of color, noise, and moving pictures with embedded crawlers along the bottom of the page. There should be a balance, and the result should be relevant to the content and the viewer who is consuming it.
John Medina emphatically explains in his book Brain Rules that no two brains are the same. He cites Michael Jordan, widely considered the best basketball player of all time. Jordan’s brain is built and attuned to basketball more than that of any other human being on the planet. But as Medina points out, when Jordan decided to switch to baseball full-time, he was one of the worst players in his league.
The same thing applies to the way we consume media. “The bottom line,” as Richard Haier said to me, is that “if you think of the brain like a thermostat, some people have their thermostat for stimulation set very high and others have it very low. So you may love rock music, but you may hate going to concerts because you find them way overstimulating—too many people, too loud—even though you appreciate the music. Or think about people who enjoy a quiet weekend in the country. They find it relaxing and stimulating. And other people, city people, can’t wait to get out of the country and back to the city because they’re not stimulated enough.”
The beauty of the next ten years, as more and more content types begin to move permanently onto screens of all shapes and sizes, will be the ability to pick the experience that’s right for you—to engage in the type of stimulation that fits your hyperpersonalized preferences.
If you want to consume a more prosaic type of storytelling, that should be your option. If, for you as for me, that wouldn’t be stimulating enough, an immersive supplementary experience should be available. And if the content creators don’t tell the story in this new immersive way, you may well be able to create a substitute yourself.
It won’t have to be all or nothing. People who live in the city still like driving in the country on weekends—even if they drive a little faster than the people who live there year-round.
8
what the future will look like
a prescription for change
The future is already here—it is just unevenly distributed.
—William Gibson
What the Future Will Look Like: Dinner on the Moon
On the run from the police in the science-fiction movie Minority Report, the character played by Tom Cruise decides to take cover in a Gap clothing store. There he is greeted not by a happy, breathing Gap employee but by the digital avatar of a helpful clerk. In a flash, the see-through saleswoman recognizes him via an eyeball scanner and instantly remembers his recent purchases.
“Hello, Mr. Yakamoto!” she greets him. “Welcome back to the Gap!
“How’d those assorted tank tops work out for you?” the digital sales associate asks.
The scene is just sixteen seconds long, but it has achieved near-cult status among advertising executives, designers, and technology nerds. The moment is at once comical and realistic. Through this quick exchange you get an exciting, and maybe frightening, glimpse of the future. Implicit in that brief meeting of eye and scanner are all the possibilities of an entirely new way of shopping. But even more alluring is the potential for radically new daily experiences.
Ultimately, that’s where all this technological upheaval is taking us: to a world rich in new and different experiences. Already, the Web and digital devices have changed how and where you read, watch, and listen and what you read, watch, and listen to. They have changed the communities you interact with. They have rearranged your brain cells and the way you think about everything from maps and locations to friends and relationships. They have shifted your approach to the world from a third-person perspective to a first-person one—and a hyperpersonal one at that. Most of this sea change has bubbled up from users as they brought the new technologies into their lives and adapted as the technologies changed them.
Now the companies must figure out how they will adapt and sell products in this shifting environment. As was fictionalized in Minority Report, it will be up to the Gaps, the Starbucks, the automakers, newspapers, book publishers, and moviemakers to decide which technologies to adopt and how to use them to their fullest advantage in moving their products, shows, and content. Ultimately, some companies will win, and those winners will be the ones that create the best and most meaningful experiences for their customers.
A lot of visionaries and futurists worked on the concepts in Minority Report. Steven Spielberg, the movie’s director, asked his team of designers to envision what the year 2054 might look like. Spielberg tapped the creative talents of famous writers such as Douglas Coupland and Stewart Brand, and also worked with interface designers from the Massachusetts Institute of Technology, including John Underkoffler, the movie’s science adviser.
A creative team involved in the retail experience of the movie said that “customers did not actually have to try on the clothes in the store but could do so in a virtual way.” A three-dimensional representation of your body would be stored in your mobile phone or wristwatch. The information would then be transmitted to a life-sized “virtual mirror.” Dale Herigstad, a designer on the movie’s retail concept, said, “You could then see specific sizes and styles of clothes on your virtual self in the mirror” and even place yourself in different environments, such as a park or office space, so that you would have an idea of what that red dress or blue suit would look like at a dimly lit party. Then you could send images of the scene to a friend and ask whether “my butt looks big in these jeans.”
Some people might find all this a little creepy and invasive. A computer system would know who you are, when you bought your last T-shirt, the true size of your belly or butt (and whether they had changed since your last shopping spree), and the socks and underwear you prefer. Already, there have been some initial efforts. For a time, Levi’s made “perfect fit” jeans based on a person’s measurements, but it stopped in 2004 when it closed its last domestic manufacturing plants. Lands’ End offered a digital look at how clothing would fit if you entered your own measurements. But those attempts were crude compared with the possibilities of today’s digital technology, which can account for your unique curves and angles and can even tell you what your date might be wearing and how he or she will look. Imagine carrying your exact measurements and preferred styles on your phone rather than trying on thirty different styles of jeans to find the most comfortable and flattering fit.
Other ideas that were presented, but not used, in Minority Report are also in nascent stages, such as wallpaper that is actually a flexible screen. The team once mocked up a restaurant setting with those digital walls. If you want to eat in Venice in real time, you tell the restaurateur, and voilà, your booth looks out on Italian gondolas floating by as you sit along the canals. Or if you prefer a chicken sandwich for dinner on the moon, no problem; that exists, too—and you don’t even have to change into a space suit. Or maybe you’re in New York and a family member is in LA. The two of you could eat dinner together through the virtual wallpaper. Of course, you couldn’t pass the ketchup, but you could enjoy each other’s company and feel as if you were in the same room together.
Other concepts that didn’t make it into the final edit of the movie included an “oasis” that would allow some respite for people experiencing information overload. Designers imagined an option to pay to enter a totally controlled space to relax and completely shut out the information chaos outside. There you could reboot your mind in a controlled setting that matched your interests aurally or visually. Beach lovers could experience the quiet calm of the Caribbean for an hour, and those who preferred the mountains might do the same at the top of Mount Everest. Concepts like these aren’t meant to replace a trip to the beach but instead are meant to give you a calm break from the unstoppable flow of information in an ever-more-connected world.
Not only are we starting to see experiences like these that bring us into other worlds, but we’re also seeing them arrive at a much quicker pace than Spielberg imagined in his 2002 movie. We’re seeing a society in which our lives are augmented by ever-smaller and more powerful mobile devices, and our online preferences accompany us wherever we go—even into the real world. The next challenge is to convert these technological abilities into profitable businesses that serve the consumnivore’s growing appetite. Of course, that’s much more easily imagined than accomplished. But the good news is that, as the porn industry showed us at the beginning of this book, it’s not one size fits all; a number of products are likely to fill the bill.
What the Future Will Look Like: Does Screen Size Matter?
Whenever people find out what I do for a living, they ask the same questions: First, how long will paper be around? Quickly followed by, Which device will replace paper for reading books and newspapers? Will it be the Kindle? The Nook? The iPad? Or something yet to be announced, or maybe not even envisioned yet? Will it be flexible? Will it be as small as a deck of cards or as big as a broadsheet newspaper?
I hear the question everywhere I go—at conferences, at dinner parties, even around the office. People inevitably want to know which specific device will solve all the problems that will come with phasing out paper-based media. For a long time, I have to admit, I thought one gizmo would hold all the answers as well: one device to rule them all.
One of my first tasks at the Times’s research and development lab was trying to answer this question and assess what the next generation of the newspaper might look like: what might become the iPod of electronic readers. Part of my job was to keep track of upcoming new reading devices and screen technologies and to understand where digital readers and screens would be in the next two years to two decades. It was not an easy task, but it was definitely a fun one.
The best part of the job was ordering new gadgets. We tracked just about anything with a button and a power source: motion-tracking remote controls that let you navigate your TV as if you were holding a video-game controller. Completely flexible screens that respond to multiple fingers at the same time. Virtual keyboards that wirelessly link to your phone or computer and project keys onto any surface you want to type on, from a desk tabletop to a sidewalk. Microprojectors the size of the tip of your little finger that can throw an incredibly bright display up to thirty inches wide, larger than most standard monitors. Then there were the e-readers, devices such as the Sony Reader, Amazon’s Kindle, and the Apple iPad, along with book applications for mobile phones and a wide range of funky-looking European and Japanese reading devices that were never intended for the U.S. market.