The most important job of childhood and adolescence is to learn attachment to and trust in other people. That happens through human attention, presence, and conversation. When we think about putting children in the care of robots, we forget that what children really need to learn is that adults are there for them in a stable and consistent way.
From Better than Nothing to Better than Anything
The bonds of attachment and the expression of emotion are one for the child. When children talk with people, they come to recognize, over time, how vocal inflection, facial expression, and bodily movement flow together. Seamlessly. Fluidly. And they learn how human emotions play in layers, again seamlessly and fluidly.
Children need to learn what complex human feelings and human ambivalence look like. And they need other people to respond to their own expressions of that complexity. These are the most precious things that people give to children in conversation as they grow up. No robot has these things to teach.
These are the things that we forget when we think about children spending any significant amount of time talking with machines, looking into robotic faces, trusting in their care. Why would we play with fire when it comes to such delicate matters?
But we do. It’s part of a general progression that I’ve called “from better than nothing to better than anything.” We begin with resignation, with the idea that machine companionship is better than nothing, as in “there are no people for these jobs.” From there, we exalt the possibilities of what simulation can offer until, in time, we start to talk as though what we will get from the artificial may actually be better than what life could ever provide. Child-care workers might be abusive. Nurses or well-meaning mothers might make mistakes. Children say that a robotic dog like the AIBO pet will never get sick, and can be turned off when you want to put your attention elsewhere. And, crucially, it will never die. Grown-ups have similar feelings. A robot dog, says an older woman, “won’t die suddenly, abandon you, and make you very sad.”
In our new culture of connection, we are lonely but afraid of intimacy. Fantasies of “conversation” with artificial beings solve a dilemma. They propose the illusion of companionship without the demands of friendship. They allow us to imagine a friction-free version of friendship. One whose demands are in our control, perhaps literally.
I’ve said that part of what makes our new technologies of connection so seductive is that they respond to our fantasies, our wishes, that we will always be heard, that we can put our attention wherever we want it to be, and that we will never have to be alone. And, of course, they respond to an implied fourth fantasy: that we will never have to be bored.
When people voice these fantasies, they are also describing, often without realizing it, a relationship with a robot. The robot would always be at attention, and it would be tolerant of wherever your attention might take you. It certainly wouldn’t mind if you interrupted your conversation to answer a text or take a call. And it would never abandon you, although there is the question of whether it was ever really there in the first place. As for boredom, well, it would do its best to make boredom, for you, a thing of the past.
If, like Tara, we choose to share our frustrations with robot friends because we don’t want to upset our human friends with who we really are and what we’re really feeling, the meaning of human friendship will change. It may become the place you go for small talk. You’d be afraid that people would be tired out by big talk. This means that there won’t be any more big talk because robots won’t understand it.
Yet so many people talk to me about their hope that someday, not too far down the road, an advanced version of Siri will be like a best friend. One who will listen when others won’t. I believe this wish reflects a painful truth I’ve learned in my years of research: The feeling that “no one is listening to me” plays a large part in our relationships with technology. That’s why it is so appealing to have a Facebook page or a Twitter feed—so many automatic listeners. And that feeling that “no one is listening to me” makes us want to spend time with machines that seem to care about us. We are willing to take their performances of caring and conversation at “interface value.”
When roboticists show videos of people happy to engage with sociable robots, the tendency is to show them off as moments of exalted play. It is as though a small triumph is presented: We did it! We got a person to talk happily with a machine! But this is an experiment in which people are the “reengineered” experimental subjects. We are learning how to take as-if conversations with a machine seriously. Our “performative” conversations begin to change what we think of as conversation.
We practice something new. But we are the ones who are changing. Do we like what we are changing into? Do we want to get better at it?
Turning Ourselves into Spectators
In the course of my research, there was one robotic moment that I have never forgotten because it changed my mind.
I had been bringing robots designed as companions for the elderly into nursing homes and to elderly people living on their own. I wanted to explore the possibilities. One day I saw an older woman who had lost a child talking to a robot in the shape of a baby seal. It seemed to be looking in her eyes. It seemed to be following the conversation. It comforted her. Many people on my research team and on the nursing home staff thought this was amazing.
This woman was trying to make sense of her loss with a machine that put on a good show. And we’re vulnerable: People experience even pretend empathy as the real thing. But robots can’t empathize. They don’t face death or know life. So when this woman took comfort in her robot companion, I didn’t find it amazing. I felt we had abandoned this woman. Being part of this scene was one of the most wrenching moments in my then fifteen years of research on sociable robotics.
For me, it was a turning point: I felt the enthusiasm of my team and of the staff and the attendants. There were so many people there to help, but we all stood back, a room of spectators now, only there to hope that an elder would bond with a machine. It seemed that we all had a stake in outsourcing the thing we do best—understanding each other, taking care of each other.
That day in the nursing home, I was troubled by how we allowed ourselves to be sidelined, turned into spectators by a robot that understood nothing. That day didn’t reflect poorly on the robot. It reflected poorly on us and how we think about older people when they try to tell the stories of their lives. Over the past decades, when the idea of older people and robots has come up, the emphasis has been on whether the older person will talk to the robot. Will the robot facilitate their talking? Will the robot be persuasive enough to do that?
But when you think about the moment of life we are considering, it is not just that older people are supposed to be talking. Younger people are supposed to be listening. This is the compact between generations. I was once told that some older cultures have a saying: When a young person misbehaves, it means that “they had no one to tell them the old stories.” When we celebrate robot listeners that cannot listen, we show too little interest in what our elders have to say. We build machines that guarantee that human stories will fall upon deaf ears.
There are so many wonderful things that robots can do to help the elderly—all those things that put the robot in the role of the cavalry. Robots can help older people (or the ill or homebound) feel greater independence by reaching for cans of soup or articles of clothing on high shelves; robots can help shaky hands cook. Robots can help to lower an unsteady body onto a bed. Robots can help locate a mislaid pair of glasses. All of these things seem so much for the good. Some argue that a robot chatting with an older person is also unequivocally for the good. But here, I think we need to carefully consider the human specificity of conversation and emotional care.
Sociable robots act as evocative objects—objects that cause us to reflect on ourselves and our deepest values. We are in the domain of that fourth chair where we consider nature—our natures and the second natures we have built. Here, talking with machines forces the question: What is the value of an interaction that contains no shared experience of life and contributes nothing to a shared store of human meaning—and indeed may devalue it? This is not a question with a ready answer. But it is a question worth asking and returning to.
It is not easy to have this kind of conversation once we start to take the idea of robotic companionship seriously. Once we assume it as the new normal, this conversation begins to disappear.
Right now we work on the premise that putting in a robot to do a job is always better than nothing. The premise is flawed. If you have a problem with care and companionship and you try to solve it with a robot, you may not try to solve it with your friends, your family, and your community.
The as-if self of a robot calling forth the as-if self of a person performing for it—this is not helpful for children as they grow up. It is not helpful for adults as they try to live authentically.
And to say that it is just the thing for older people who are at that point where they are often trying to make sense of their lives is demeaning. They, of all people, should be given occasions to talk about their real lives, filled with real losses and real loves, to someone who knows what those things are.
Finding Ourselves
We are positioned to have these conversations. Sometimes I fear they may not happen.
As I was concluding work on this book I attended a large international meeting that had a session called “Disconnect to Connect.” There, psychologists, scientists, technologists, and members of the business community considered our affective lives in the digital age. There was widespread agreement that there is an empathy gap among young people who have grown up emotionally disconnected while constantly connected to phones, games, and social media. And there was much enthusiasm in the room for how technology might help. Now, for people who show little empathy, there will be “empathy apps” to teach compassion and consideration. There will be computer games that will reward collaboration rather than violence.
The idea is that we’ve gotten ourselves into trouble with technology and technology can help us get out of it. It’s that image of the cavalry. Where we once dreamed of robots that would take care of our physical vulnerabilities, now apps will tend to our emotional lapses. If we have become cold toward each other, apps will warm us. If we’ve forgotten how to listen to each other, apps will teach us to be more attentive. But looking to technology to repair the empathy gap seems an ironic rejoinder to a problem we perhaps didn’t need to have in the first place.
I have said that it can be easier to build an app than to have a conversation. When I think of parents who are drawn to their email instead of a dinner conversation with their children, I am not convinced that there is a technological fix for the emotional distance that follows. Yes, we should design technology to take account of our vulnerabilities—those phones that release us rather than try to hold us—but to bridge the empathy gap, I think of things that people can do. I think of parents who experiment with sacred spaces and technology time-outs to reclaim conversation with their children and each other. I think of the college students and CEOs who put their phones away to pay full attention to friends and colleagues. I think of the new enthusiasm for meditation as a way to be present in the moment and discover the world we hold within. When people give themselves the time for self-reflection, they come to a deeper regard for what they can offer others.
The moment is right. We had a love affair with a technology that seemed magical. But like great magic, it worked by commanding our attention and not letting us see anything but what the magician wanted us to see. Now we are ready to reclaim our attention—for solitude, for friendship, for society.
Caring machines challenge our most basic notions of what it means to commit to each other. Empathy apps claim they will tutor us back to being fully human. These proposals can bring us to the end of our forgetting: Now we have to ask if we become more human when we give our most human jobs away. It is a moment to reconsider that delegation. It is not a moment to reject technology but to find ourselves.
This is our nick of time and our line to toe: to acknowledge the unintended consequences of technologies to which we are vulnerable, to respect the resilience that has always been ours. We have time to make the corrections. And to remember who we are—creatures of history, of deep psychology, of complex relationships. Of conversations artless, risky, and face-to-face.
Acknowledgments
In this book, I study something I think is slipping away: a certain kind of face-to-face talk. Unplanned. Open-ended. The kind that takes time. You study what isn’t there by studying what is. So to investigate the conversations that were not happening, I asked people what they were talking about, whom they were talking with, and how it was going. To answer these questions, a lot of people reached for their laptops and phones to show me their latest exchanges. But then, when I said I also wanted to talk with them, they were gracious. My argument about conversation is based on talking to people, face-to-face, many of whom admitted that this usually wasn’t easy for them. My thanks to them is all the more heartfelt.
In this work, I had help over the years from two research colleagues. For interviews of students and young adults, I worked with Emily Carlin. For interviews within the business community, I worked with Erica Keswin. In many places, the scaffolding of my argument grew out of conversations with them, and then even more conversations contributed immeasurably to the interpretation of what we found.
Additionally, Carlin was my research assistant throughout this project; she broadened the scope of materials I read as well as serving as the best kind of dialogue partner. In this collegial group was also Kelly Gray, whose taste and wonderful ideas have sustained my efforts to understand things and thinking since I began the MIT Initiative on Technology and Self in 2001. Gray was crucial to that effort and to the shaping of every one of the books that have emerged from it. This is the sixth.
I thank Katinka Matson, Susan Pollak, Nancy Rosenblum, Merilyn Salomon, Natasha Schüll, Susan Silbey, Daniel Stern, and Susan Stern, who helped in the formulation of this project. I also thank Mel Blake, Rogers Brubaker, Jackson Davidow, Amira Eltony, Emily Grandjean, Alice Kurtz, Herb Lin, Nelly Mensah, Chris Meyer, Stan Rogow, Benjamin Sherman, Elizabeth Thys, Rodanthi Vardouli, and Theodora Vardouli for helpful comments as the writing progressed. Conversations with Richard Giglio and Diane Hessan were an inspiration; Jean Rhodes was a generous friend to this project with practical help and new ideas. A conversation with Paul Reitter was the best kind of academic exchange: I left it with new questions and new ideas! I thank Aziz Ansari for our conversations about romance in the digital age. I thank Louis C.K. for giving me permission to cite his poetry about solitude, empathy, and cell phones. Judith Spitzer and Randyn Miller provided the competent and calming administrative backup that every author dreams of. Additionally, the meticulous Ms. Spitzer was able to track down the peskiest of citations that I had noted on little slips of yellow paper whose location became elusive at critical moments. My editor at Penguin Press, Virginia Smith, responded to a first draft with a letter for which I shall always be grateful, one that offered clear directions for what to do next. The review of the manuscript by Veronica Windholz at Penguin Press was a treasured gift. To Drs. Andrew Chen and Leslie Fang I owe deep gratitude for their tenacious work on my migraines so that I could be equally tenacious about my book.
My daughter, Rebecca, read an early draft with a stern and constructive editorial eye. And then she read a final draft and held me to a higher standard. I marvel that I have raised a loving daughter who is also a fearless editor.
MIT and my home in the STS program have been a wonderful environment in which to work on this project. The students in my STS seminar, Technology and Conversation, were a sounding board as I shaped this book. I thank the students in that class (and all of my students from 2010 to 2015 who lived with a professor preoccupied with talk!) and hope they recognize how seriously I took their ideas.
As I worked on this project I was faced daily with the irony that for someone writing about a flight from conversation, this book brought me some of the most memorable conversations of my life.
Sherry Turkle
Boston, May 2015
Notes
THE EMPATHY DIARIES
we often find ourselves bored: A 2015 Pew Research study reported that younger users of mobile phones “stand out prominently when it comes to using their phones for two purposes in particular: avoiding boredom, and avoiding people around them.” Aaron Smith, “U.S. Smartphone Use in 2015,” Pew Research Center for Internet, Science, and Technology, April 1, 2015, http://www.pewinternet.org/2015/04/01/us-smartphone-use-in-2015.
a word in the dictionary called “phubbing”: Macmillan Dictionary, BuzzWord section, “Phubbing,” http://www.macmillandictionary.com/us/buzzword/entries/phubbing.html.
we find traces of a new “silent spring”: Rachel Carson, Silent Spring (Boston: Houghton Mifflin, 1962).
moment of recognition: I Forgot My Phone, a short film directed by Miles Crawford, written by and starring Charlene deGuzman, exemplifies the new recognition. It was posted online in August 2013. It presents the following narrative, a cautionary tale about our flight from conversation:
Imagine a day when a young woman’s daily routine unfolds normally, with one exception: She forgot her phone. She wakes up in the arms of her lover who idly strokes her arm as he does his email. At a birthday party, guests fuss over getting a picture of the cake. When it’s time for a celebratory toast, the focus is on taking photographs of the champagne. A lunch with friends is silent—everyone is on a phone. When she goes bowling and makes a strike, none of her friends give her a high five; they’re all texting. She can’t share a moment of laughter with her boyfriend when they go out to a comedy club. He has replaced actual laughter with a post “about laughter” that he shares with his online friends.