by Matt Richtel
This is a big part of every day for Dr. Atchley, if not the walk, then the retreat. He lives here, on twenty-six acres, about a ten-minute drive from campus, in an underground house that he designed a few years ago. It’s around 2,600 square feet, covered by hardscrabble brown dirt, with nearly eight feet of earth above the roof. The south-facing side is lined with windows, inviting light and heat.
His wife, Ruthann, is chair of the psychology department at Kansas. The Drs. Atchley think of this place as their quiet, protective cocoon, and also as an architectural novelty, one they still seem bemused the bank ever lent them the money to build.
Underground, inside the house, it’s cool in the winter, and mostly in the summer, too. Year-round, there is no cell phone service, at least not inside, though they do get Internet access. Outside there are owls, frogs, deer, the occasional bobcat, and the small colony of bees Dr. Atchley keeps.
In the garage, he parks a Subaru with a personalized license plate. It reads: ATTEND. When asked to explain, he jokes: “Because ‘Turn off your fucking cell phone’ is too long for a license plate.”
ON THE DAY OF the conversation about intelligent design, Dr. Atchley walked and thought, and later that night, he crafted an email back to his student in which he quoted Hebrews 11:1.
“Now faith is the substance of things hoped for, the evidence of things not seen.”
And then Dr. Atchley continued his note in his own words: “If you try to play the game of using a method that relies on testing observable evidence, it seems to me you ignore the message of faith, which is that faith does not require evidence and should be strong in the face of evidence to the contrary (reread Job for a better lesson on faith despite contrary evidence.)”
In other words: Don’t expect science to prove your faith.
Religion doesn’t come up much for Dr. Atchley, and when it does, he can call on the years he spent at a Jesuit high school. Usually, he’s exploring people’s dedication to a different idol, that of technology.
Why are we so drawn to our devices?
What makes us check them all the time? When sitting at dinner? When behind the wheel of a car?
“Are these devices so attractive that, despite our best intentions, we cannot help ourselves?”
He believes that to be true, but is not relying on his gut. He wants to prove it. “Some of the questions appear to be difficult, even impossible to answer. But they might not be impossible to answer. We may not be able to point to the exact mechanism in the brain, but we can infer it with the right kinds of experiments.”
For Dr. Atchley, who’s doing some of the foremost research in the field, the questions suggest he’s awoken from his own blind faith in technology.
BEFORE THERE WERE COMPUTERS in Silicon Valley, before the land gave way to industrial design, there was fruit. Oranges, pomegranates, and avocados. Acre after acre. Trees and dust under a hot sun. Perfect farming conditions. Before Silicon Valley was Silicon Valley, it was one big farm, the Valley of the Heart’s Delight.
Talk about innovation. The fruit cup was invented in the Valley of the Heart’s Delight, by Del Monte.
Then World War II came along. And the Varian brothers and Hewlett and Packard arrived. They got their early funding from the federal government. Defense contractors with a high-tech bent.
This intersection of computers, telecommunications, and the military would yield a change arguably as significant and characteristic of modern life as anything in medicine and the industrialization of food. It was the birth of the Internet, the product of a research program initiated in 1973 by a branch of the military called the Defense Advanced Research Projects Agency (DARPA). The aim was to create a communications system that would go across multiple networks, making it less vulnerable to attack or instability. It was, if nothing else, very hardy.
Silicon Valley became an engine for its growth, serving it and feeding from it. Still, there were orchards left, a dwindling handful. Computers and communications technology commingling with open spaces.
A perfect place to be a thirteen-year-old. Particularly one with a bike, an innate curiosity, and a latchkey.
A younger Dr. Atchley, an only child, with working parents, had to entertain himself. His mother, a hippie by orientation, worked as a legal secretary; his stepfather was a physicist and engineer who designed machines that made silicon wafers, which computer microprocessors are built on.
Left to himself, Paul, a slight boy with dark hair, spent hours trucking through the patchwork of fields and residences. He looked for rocks to turn over and ditches to explore. “I can still smell what it smelled like in those summertime dried-up algae-frog-filled catching places,” he says.
He was also fascinated by computers. He vividly remembers the day he went with his dad (actually his stepfather, who had legally adopted him) to the local Byte Shop to look at the first Apple computer. It cost more than $1,000, way out of reach. Paul’s own at-home technology was limited to a black-and-white television he had in his room, on which he watched local channels and monster movies. He loved fantasy and sci-fi, and read books about how to survive a nuclear attack; he imagined he’d live in an underground house.
Finally, he got a computer. It was called a TI-99, made by Texas Instruments, and it was one of the first home computers. Kids used them to play games. Paul did that. But his interest went beyond games.
“It wasn’t that it gave me the ability to play games. It was that it gave me the ability to do anything I wanted with it—program my own games, use the tape drive to store secret information. What it really represented was limitless potential.
“At that point in my life, I was convinced I’d become a botanist on a space station.”
He was serious. Technology had been moving so quickly, doubling, tripling, quadrupling in power. “We were expanding beyond ourselves, beyond our own planet, pushing back frontiers, bettering ourselves.”
And communicating across geographic barriers. From his room, he could reach out across the city, the country, the globe.
He didn’t realize it, but he was in the midst of an extraordinary time.
ONE REASON THINGS WERE changing so quickly was a familiar technology maxim called Moore’s law. It essentially says that computing power doubles every eighteen months to two years.
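The compounding that Moore's law describes can be sketched in a few lines; this is only an illustration of the doubling rule as stated above, with the eighteen-month period chosen as an assumption:

```python
# Moore's law as a rough growth rule: computing power doubles every
# ~18-24 months. With an 18-month doubling period, power after a span
# of `years` has multiplied by 2 ** (number of doublings).

def moore_growth(years: float, doubling_months: float = 18) -> float:
    """Multiplier on computing power after `years`, given the doubling period."""
    doublings = years * 12 / doubling_months
    return 2 ** doublings

# Three years is two doublings (4x); fifteen years is ten doublings (~1,000x).
print(moore_growth(3), moore_growth(15))
```

The point of the sketch is the shape of the curve: over a childhood like Paul's, roughly a thousandfold increase.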
But there is another key technology axiom. It defines a different kind of change to the world and our lives, and indirectly would drive Paul’s eventual research: Metcalfe’s law. It defines the value of a telecommunications network, say, the Internet, as proportional to the square of the number of users. The more people, the more valuable the network.
It was named after Robert Metcalfe, an electrical engineer and innovator who helped develop the Ethernet computing standard used to connect computers over short distances. According to a history published by Princeton University, Metcalfe’s law was officially christened in 1993, but the principle was first identified in 1980, when Paul was getting his TI-99 and dividing his time between it and his bike. It’s not that the concepts eloquently captured by Metcalfe’s law were entirely new; networks had been developing, and their potential significance had been articulated, through the latter half of the century. But Metcalfe put a fine point on what had become a core attribute of media by the end of the twentieth century.
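The proportionality behind Metcalfe's law can be made concrete with a small sketch. The usual reasoning, assumed here, is that a network of n users contains n(n−1)/2 possible pairwise connections, which grows roughly as the square of n:

```python
# Metcalfe's law treats a network's value as proportional to the square
# of its user count, because the number of distinct pairs of users who
# can reach each other grows as n * (n - 1) / 2, i.e. roughly n^2 / 2.

def possible_connections(users: int) -> int:
    """Number of distinct user pairs in a network of `users` people."""
    return users * (users - 1) // 2

# Each doubling of the user base roughly quadruples the connection count.
for n in (10, 20, 40):
    print(n, possible_connections(n))
```

This is why each new user makes the network more valuable for everyone already on it, the dynamic the chapter goes on to describe.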
One way to grasp how much had changed, how much power had come to personal communications, is a simple comparison. In World War II, an extraordinary calculating machine commissioned by the U.S. military, the Electronic Numerical Integrator and Computer (ENIAC), could perform around 350 multiplications or 5,000 simple additions each second. By 2012, the iPhone 4 made by Apple could execute two billion instructions per second. The iPhone 5, in 2013, even more.
The ENIAC weighed thirty tons. The iPhone 5 weighs less than four ounces. It carries voice communications and the Internet, a crystallization of all the wondrous powers of the previous millennia, a machine in our pockets that, on its face, works fully in people’s service, the ultimate entertainment and productivity machine.
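The scale of that comparison is worth spelling out. Using the figures from the text (and noting that additions per second and instructions per second measure different operations, so this is only an order-of-magnitude comparison):

```python
# Rough scale of the change described above, using the figures in the text.
eniac_additions_per_sec = 5_000               # ENIAC, simple additions per second
iphone4_instructions_per_sec = 2_000_000_000  # iPhone 4, instructions per second

# The two rates count different kinds of operations, so this is an
# order-of-magnitude comparison, not a benchmark.
speedup = iphone4_instructions_per_sec / eniac_additions_per_sec
print(f"roughly {speedup:,.0f}x")
```

By that crude measure, the phone in a pocket outpaces the thirty-ton wartime machine by a factor of several hundred thousand.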
By the iPhone’s standard, when Dr. Atchley was just teenage Paul, the pace of his communications and the amount and variety of information available—print, voice, video—were relatively limited (maybe not by telephone, but certainly by computer or mobile phone).
As Paul was growing up, a half generation before Reggie came of age, there was a coming together of these two fundamental computing principles, Moore’s law and Metcalfe’s law. One defined the acceleration of computer processing power, which allowed not just speed, but so many capabilities, including, at its core, interactivity; the other captured the rapid expansion of the communications network and its value.
In combination, they provided unprecedented service to humans. But they were also putting a new kind of pressure on the human brain: Moore bringing increased information, ever faster, and Metcalfe making that information so personal that the gadgets became extraordinarily seductive, even addictive.
A FEW MONTHS BEFORE Dr. Atchley took that February-morning walk with his dogs, he attended a first-of-its-kind conference in Southern California. About two hundred neuroscientists gathered with support from the National Academy of Sciences to confront a new question: What is technology doing to our brains?
The introductory lecture was given by Clifford Nass, the provocative Stanford University sociologist who two years earlier had gotten Dr. Strayer and Dr. Gazzaley together to think about the science of multitasking. Now, Dr. Nass was pushing scientists to go beyond the existing science and ask hard questions about whether the ubiquity of constantly connected mobile devices could, ultimately, hamper the things that make us most human: empathy, conflict resolution, deep thinking, and, in a way, progress itself.
Near the front sat Dr. Gazzaley, mentally preparing for his own talk to be given shortly. Seated in the upper right, Dr. Strayer wore glasses, his neck slightly hunched.
And in the back row, Dr. Atchley. His wire-rimmed glasses were perched on his nose and his Macintosh laptop was shut beneath him. That was somewhat noteworthy; many in the crowd had laptops open, including a guy just in front of Dr. Atchley who had four windows open, checking email, the news, and a shopping site.
In explaining why he doesn’t open his laptop, Dr. Atchley calls upon a phrase from his Jesuit training. “Lead me not into temptation but deliver me from evil,” he says, paraphrasing Matthew 6:13.
He thinks that if he opens his laptop, he’ll start checking things. Get distracted from the lecture, and his own analysis of it. He doesn’t trust himself to be disciplined, and he says some fundamental neuroscience has emerged to support his fears. There’s plenty of anecdotal evidence, too.
At the University of Kansas, the journalism school does a periodic “media fast,” in which students aren’t allowed to use their devices for twenty-four hours. When the fast took place in the fall of 2011, students reflected afterward about their experience.
To wit:
“How could I abandon my closest friend, my iPhone?”
“My media fast lasted fifteen minutes before I forgot that I was fasting and checked my phone.”
“The withdrawals were too much for me to handle.”
“Five minutes without checking a text message is like the end of the world.”
“I don’t want to do this assignment again.”
Why? Why is this stuff so compelling?
Dr. Atchley says one thing that makes the question fascinating to him is that when people multitask, they often do so in situations that defy common sense, say, trying to concentrate on an in-person conversation while checking a sports score or attempting to drive while dialing a phone. He believes that there are impulses driving these multitaskers that don’t meet the eye. In fact, he says that increasingly technology is appealing to and preying upon deep primitive instincts, parts of us that existed aeons before the phone.
For one: the power of social connection, the need to stay in touch with friends, family, and business connections. Simple, irresistible. “It’s a brain hijack machine,” he says. He’s trying to prove it.
This is what comes next in the study of the science of attention, the latest wave. Are there some things that can so overtake our attention systems as to be addicting? Is one of those things personal communications technology?
A trip to his lab, he says, will help illustrate his quest.
CHAPTER 17
TERRYL
TOWARD THE END OF 2006, when Terryl first approached Jackie outside of gymnastics and asked how she might help, Jackie showed typical stoicism and said: “I think we’re good.”
Terryl, respecting Jackie’s strength and privacy, let it go. Besides, she still thought the accident had happened in the adjoining county, Box Elder. So she settled in for being a friend.
Just before Christmas that year, the Furfaros and the Warners went to a gymnastics meet in Park City. Jackie piled the girls into the Saturn. Terryl and Alan drove their van, carrying the brood of four: Jayme, the oldest, then twelve years old; Taylor, the boy, age ten at the time; Allyssa, just shy of five years old; and Katie, who was three and suffered from cystic fibrosis and autism.
At the meet, and over meals, talk between the Warners and Furfaros was about the kids and gymnastics, not the accident.
A FEW DAYS LATER, Jackie took Stephanie and Cassidy to her mother’s house in Nevada for the Christmas holiday. That meant driving on Valley View Drive.
As they left home, she tried to avoid the thought that within ten minutes she’d be at the spot where Jim had died. She put on the radio and, for the girls, started a Disney movie, which played on monitors attached to the backs of the front seats, absorbing Stephanie and Cassidy. The aftermarket DVD player had initially been a sore spot between her and Jim.
“Why do they need this?” Jim had asked.
She told him sternly: “You’re not always with us on the drive; you don’t know how hard it is to go six hours without entertainment.”
Altogether, they were at least a nine-screen household at the time of Jim’s death. There were two computers downstairs and a third on the dining room table that the girls sometimes used. The movie screens in the backseat of her Saturn. A phone for each parent. Two televisions. That count doesn’t include the various GPS units and other devices Jim played around with.
Soon after the accident, Stephanie, the older of the girls, had begun to play World of Warcraft, using her father’s account and his computer and desk, peering into his hefty NEC monitor. Jim and Jackie had decided that she could use World of Warcraft when she turned six.
After Jim’s death, Jackie and the girls also got lost in films, holding movie nights. In particular, they’d gather around and watch The Sound of Music, or another classic, and eat a take-out pizza. For a while, Stephanie suspended her playing of Dance Dance Revolution because it reminded her of her father. A few days after Jim died, at the viewing of the body, Jackie had asked the girls to look at their father so they could know and understand that this was a real thing. As Stephanie would later recall in a school essay, her mother came up to her afterward and said: “I am so sorry this happened, and I know you can be strong and get through it.”
Media seemed to help.
As Jackie drove that morning on their Christmas journey, the girls watching Disney, she tried to lose herself in the radio. She thought: If I get teary, my vision gets bleary and then it’ll be hard to drive and then I’ll have to explain to the kids why I’m stopping.
For Christmas that year, she felt she overdid it a bit with the presents. “I got the girls too many books.”
FOR CHRISTMAS, JACKIE WAS in Nevada, Leila was at home, Terryl was in Mexico—at an orphanage.
On that Sunday before Christmas, Terryl and Alan surprised the family with the trip to Mexico. But it wasn’t exactly a vacation, Terryl told the kids. She brought out a bunch of seemingly random supplies: toothbrushes, combs, little deodorant sticks. They were the makings of “hygiene” kits to be handed out, along with puzzles, books, and sun hats, at an orphanage in Puerto Peñasco.
The next day, they piled back into the Ford Windstar van and drove south, stopping for the night in Searchlight, Nevada.
Puerto Peñasco is about a hundred miles south of the Arizona border on a strip of land that connects the Baja Peninsula to the rest of Mexico. The Warners stayed free in a condo that belonged to Neal Harris, Terryl’s old friend from Southern California who had been engaged to April before she died of cancer.
Neal had hit it big in the technology field, really big. He held top sales jobs at four different technology companies that were ultimately sold or went public for more than a billion dollars each. There was SynOptics Communications, which was among the pioneers in creating technology to allow for faster, more efficient delivery of data over Ethernet lines. It merged in 1994 with another company in a 2.4-billion-dollar deal, a precursor to the dot-com boom.
Neal moved to Ascend Communications, which built little boxes to terminate Internet signals and was acquired by Lucent Technologies in 1999 for $24 billion, one of the largest acquisitions in history. And, after that, Neal went to Foundry Networks, which made routers and switches—essential pieces of technology to deliver Internet traffic. It would be acquired by Brocade for $3 billion in July 2008.
He was, like so many fortunate Americans, a big beneficiary of Metcalfe’s law. More connections, faster connections, more efficient connections—Neal and others in the booming tech industry serving a seemingly insatiable drive to communicate with one another and trade at ever-increasing rates. They were creating the most powerful robots and conduits the world had ever seen, things that each year were making the tools of the previous year seem slow by comparison. They were creating wealth. This was another side of the technology revolution, big, big money. Neal had amassed a multimillion-dollar fortune.