by David Levy
(b) A corollary of this positive feature of attachment is the feeling of “separation distress” that occurs when an attachment relationship is disrupted, as when the mother figure is absent.
(c) A further positive feature, closely related to proximity maintenance, is the role of the attachment figure as a “safe haven,” allowing a person who is distressed to find contact (i.e., close proximity), assurance, and safety. This role is not one for which the attachment figure is uniquely suited, but just as an infant is more easily calmed by its mother than by another adult, so an adolescent or adult is normally calmed more easily by their attachment figure than by an alternative.
(d) An attachment figure has a role as a “secure base” from which to explore the world. A young child whose attachment figure is nearby and accessible will feel relatively comfortable in exploring strange and new environments but uncomfortable when lacking the proximity of their attachment figure. Similarly, an adult will normally feel more secure when exploring a new career opportunity or an unusual leisure activity if their romantic partner is accessible.
Without stretching the bounds of credulity too far, it isn’t difficult to see how each of these four features can apply not only to human attachment figures but also to artifacts that serve the role of attachment figures, such as teddy bears, dolls, and computers. A young child likes to cuddle its doll or teddy bear (proximity); the child dislikes having its beloved toy taken away from it (separation distress); if the toy or doll is not actually within the child’s reach, it is at least comforting for the child to know that it is nearby and accessible (providing a safe haven); and with the knowledge that the doll or bear is at hand or accessible, a young child will feel more confident about activities that involve exploration and discovery, activities that start from a “secure base.” Replace “young child” with “adult,” replace “doll or teddy bear” with “computer,” and any of you who are regular computer users will most likely be able to sympathize with the following rationalization: You like to interact with your computer because it responds to your input on the keyboard and with the mouse (proximity); you do not like being unable to access your computer (separation distress) because you rely on it to help you with certain tasks, such as checking your e-mail; if you are not actually using your computer, you feel more comfortable when it is near enough for you to access it when it is needed (a safe haven in the event of a storm of tasks); and you feel confident about playing a new game, deciding on the menu for a dinner party, or choosing a vacation destination, because you know that the computer is there to be asked (i.e., Google or some other search engine) if advice is needed. These are all symptoms of attachment.
Since the attachment process begins in infancy,* it is perhaps only natural that children generally exhibit stronger feelings of attachment to their computers than do adults. While young children bond with their blankets and toys, older children are bonding in large numbers with their computers. A MORI (Marketing & Opinion Research International) survey of children in Britain, conducted in December 2003, found that 45 percent of the children surveyed considered their computer to be a trusted friend, while 60 percent responded that they were extremely fond of their computer. The corresponding figures for adults were lower, at 33 percent and 28 percent respectively, but still a significant proportion of the population. Furthermore, 16 percent of adults and 13 percent of children aged eleven to sixteen responded that they often talk to their computer. And evidencing a general belief in the future of emotional relationships with computers, 34 percent of the adults surveyed and 37 percent of the children thought that by the year 2020 computers will be as important to them as are their own family and friends. The strength of this appeal has been described by Cary Cooper, professor of organizational psychology and health at Lancaster University, as a “technological umbilical cord.”
These findings would appear to indicate a shift in values in modern society, from a norm where the lives and well-being of family members are paramount to an entirely different scale of values, a scale on which a serious computer crash is deemed more important than the illness of a family member. Should we be so surprised that in some individuals and in some families such a different scale of values might exist? I think not. We have already seen, in the previous chapter, that some dog owners value their relationship with their pet more highly than their relationship with their spouse. So why should we not expect similar feelings to be expressed by some people for computers, and in the future for robots? Readers who are horrified at the fact that more than 30 percent of those surveyed held such opinions can take comfort from one very important factor that will, to some extent at least, militate in favor of the importance of the human family member relative to that of the computer or the robot: Humans are irreplaceable; computers and robots are replicable.* Hopefully, this factor will sustain a reasonable measure of balance in the minds of the majority.
In exploring the type of relationship that develops between humans and objects, it is important to understand exactly what we mean by “relationship” in this context. The contemporary view of relationships held by social psychologists is that the partners in a relationship are fundamentally interdependent—that is to say that a change in one of the relationship partners will bring about a change in the other. Mihaly Csikszentmihalyi and Eugene Rochberg-Halton have shown that our daily lives are influenced to quite a significant extent by man-made objects and that through these influences we establish a sense of connectedness with those objects.* A relationship with an object is one in which the experience we have with that object brings about a change in us, while what we do with that object will usually bring about a change in the object itself, even if it is a very small change, such as having experienced some wear through being used or simply having its location changed by being moved. The form of connectedness (for which read “attachment”) that Csikszentmihalyi and Rochberg-Halton maintain we develop with objects is thus derived from the influence on our daily lives and on our identities brought about by our interactions with those objects. In the case of computers, “interaction” is certainly the operative word. Whereas our interaction with most objects is limited to what we do with the object, and is therefore a one-way street, our interactions with computers are two-way (or multiway) interactions, during the course of which what we do to the computer (typing on the keyboard, clicking the mouse, and thereby participating with the computer in whatever task it is accomplishing) is part of a genuinely interactive process.
A novel approach in the investigation of attachment to computers is expounded by John McCarthy† and Peter Wright in their delightful paper “The Enchantments of Technology.” They argue that the attachment some people experience toward computers is one born of an enchantment with the technology. Each of us has the capacity to be enchanted by different things—some of us by a painting, some by a string quartet, some by the smile of a child, some by a motorbike. Just as different people can be enchanted by different things, so different things have the power to enchant different people, and technology is one of those things that has the power to enchant.
This view of enchantment as the basis of attachment to computers is due partly to the ideas of John Dewey, arguably the most influential thinker on education in the twentieth century. Dewey’s 1934 book Art as Experience asserts that experience is created by the relationship between a person and the tools that they use, the tools that form part of their environment. Dewey discusses a kind of sensual development of the relationship between a person and their environment, a development derived from a combination of the senses that familiarize the person with their environment. He uses as an example a mechanic working on an engine. When the mechanic is totally absorbed in his work, he sees, hears, smells, and touches the engine, and through these senses he diagnoses what is wrong. Being completely immersed in his work, totally focused on the task at hand, the mechanic develops a relationship with the engine. Because of his senses, he is caught up with what we might call the “personality” of the engine.
Another researcher who turned his attention to the enchantment of technology was the anthropologist Alfred Gell, who views the cause of this form of enchantment as being the power behind the enchanter. Gell suggests that the power of technology to enchant derives from our sense of wonder at the skill of that technology’s creator—the process of creating the technology being more enchanting than the technology itself. But without any pleasure or similar emotions coming from the experience of a technology, McCarthy and Wright doubt the capacity of that technology to delight. To them enchantment also involves a sense of pleasure that is derived from the experience of novelty.* When your computer does something clever for the first time, something that satisfies you, there is a heightened feeling of pleasure. The satisfaction contributes to a state of enchantment, but it is the pleasure of novelty that turns satisfaction into enchantment. This is why working with computers and with software holds great potential for enchantment, because software is not always repetitive and boring—it often has the capacity to surprise, to create the unexpected. Consider, for example, a program designed to compose music. You might sit and listen to one new composition after another, with little to arouse your interest for a while. But then, out of the blue as it were, the program produces a composition that you find very much to your liking. This new piece of music not only pleases you as a piece of music, but also surprises you by the power of the technology to compose, and it is in this surprise, and in the technology that creates this surprise, that the power of enchantment resides. This capacity to surprise is also evident to some extent in the behavior of virtual pets—an owner can rarely be certain how a virtual pet will behave.
McCarthy and Wright also explore the vicarious, visceral, and voyeuristic pleasure we derive from technology and how these forms of pleasure contribute to our feeling of enchantment. They use film as their example, but it could equally be computers or another form of technology. Transposing their exposition into a world with robots, the voyeur in us will experience the joy of watching the new and the wonderful, our vicarious nature will endear a robot to us if it repeatedly performs in novel and amazing ways that fit together as a coherent behavior pattern, while our visceral selves will rejoice in the thrill, joy, and apprehension of interacting with robots.*
On Anthropomorphism
Some people regard anthropomorphism as a belief, even if only a transitory one, that the object in question is essentially human, a definition with which I disagree. In my view it is important to recognize the distinction between believing that a computer, for example, is essentially human and merely treating it in the same or similar ways as one might if it were human.
In the early days of artificial intelligence, Joseph Weizenbaum was perplexed by the extent to which people would anthropomorphize his computer program ELIZA:
I knew of course that people form all sorts of emotional bonds to machines, for example to musical instruments, motorcycles, and cars. And I knew from long experience that the strong emotional ties many programmers have to their computers are often formed after only short exposures to their machines. What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.
What caused many of ELIZA’s users to anthropomorphize was the humanlike snatches of conversation that the program generated:
USER: Men are all alike.
ELIZA: In what way?
USER: They’re always bugging us about something or other.
ELIZA: Can you think of a specific example?
USER: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here.
USER: He says I’m depressed much of the time.
ELIZA: I’m sorry to hear you are depressed.7
It is easy to understand how someone could quickly begin to ascribe human characteristics to such a conversation partner.
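ELIZA achieved this illusion with nothing more sophisticated than pattern matching: each rule pairs a keyword pattern with a response template, and first-person words in the user’s phrase are “reflected” into the second person before being echoed back. The following Python sketch illustrates the idea; the rules, names, and reflection table here are invented for this illustration and are not Weizenbaum’s original DOCTOR script.

```python
import re

# Invented pattern-to-template rules, loosely in the spirit of ELIZA's
# DOCTOR script (a hypothetical miniature, not Weizenbaum's code).
RULES = [
    (r".*\bmy (boyfriend|girlfriend|mother|father)\b (.*)", "Your {0} {1}."),
    (r".*\b(?:I am|I'm) (depressed|sad|unhappy)\b.*",
     "I am sorry to hear you are {0}."),
    (r".*\ball alike\b.*", "In what way?"),
    (r".*\balways\b.*", "Can you think of a specific example?"),
]

# Swap first- and second-person words so the reply echoes the speaker
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}


def reflect(fragment: str) -> str:
    """Reflect pronouns in a captured fragment and drop end punctuation."""
    words = fragment.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)


def respond(utterance: str) -> str:
    """Return the first matching template, filled with reflected captures."""
    for pattern, template in RULES:
        m = re.match(pattern, utterance, re.IGNORECASE)
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return "Please go on."  # default reply keeps the conversation moving
```

Even these four rules reproduce the exchange quoted above, which is precisely Weizenbaum’s point: the apparent understanding is an artifact of a handful of string transformations, yet it was enough to induce what he called “powerful delusional thinking in quite normal people.”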
The first grandmaster loss to a chess program led to an amusing example of anthropomorphism with computers. During a tournament in London in 1977, the English grandmaster Michael Stean was defeated by the program Chess 4.6, which had been developed at Northwestern University.* At one moment in the game, when the program had found a tactic that he’d overlooked, Stean referred to the computer as a “bloody iron monster,” and some moves later he exclaimed, “This computer is a genius.” Stean was not explicitly attributing humanlike qualities to the computer but employing the descriptive term “monster” as one might for a naughty child and “genius” as the ultimate intellectual compliment regardless of to whom (or to what) it is being paid. It was the program’s remorseless performance in this intellectually difficult task that so impressed Stean, a world-class player, as to cause him to anthropomorphize.*
If you are a computer user, you will most likely have complained at some time or other that your computer refuses to work. In doing so you have attributed to your computer one of the characteristics of a living being, and you will have started to regard it as having some sort of relationship with you—a master-slave relationship in which you expect it to do your every bidding. The ease with which we slip into such a frame of mind has long been the subject of investigation by psychologists and anthropologists, but it is only relatively recently, with the advent of intelligent computers, that it has been recognized that the level of such relationships can rise to the point where, instead of being our slave, we think more in terms of the computer as a kind of friend.
In their book The Media Equation, Byron Reeves and Clifford Nass describe interaction with computers as being fundamentally a social tendency, but in their view it is not consciously anthropomorphic. They regard such interaction as automatic and subconscious, a view that stems from the general denial by most people that they treat computers as social entities. Yet despite this common denial, people do interact with computers according to normal human social conventions—by being polite, for example—and if a computer violates such a convention, it is usually regarded by its human operator as being deliberately offensive or obstructive, clearly an example of anthropomorphism. I believe that it matters not if the anthropomorphism of computers is subconscious. What I feel is important is the effect that anthropomorphism has on the emotional attachment felt toward a computer. It is the combination of attachment and anthropomorphism that, in my view, facilitates in us the creation of a human-computer relationship. As computers become increasingly accepted through the process of anthropomorphism, so will computer users come to treat them more like partners than work tools. For “computer” read “robot” and the mental leap is made—robots as partners.
But we are getting ahead of ourselves. Before we examine why humans develop relationships with computers, let us first explore in more detail Yeager’s comment (see footnote on page 76), identifying the anthropomorphism of computers as a subconscious reaction.
Following the publication of The Media Equation, now widely regarded as a classic in the field of human-computer relationships, Reeves and Nass extended their experimental research in collaboration with Youngme Moon. Their studies investigated how people apply the rules of human social interaction in their interactions with computers. What their research results demonstrated was a marked difference between how people say they regard computers and how they behave toward computers. Their results are based on some of the thirty-five experimental studies they carried out, studies that recreated a broad range of social and natural experiences in which computers often took the place of one of the humans in the interaction.
Nass and Moon’s paper, “Machines and Mindlessness: Social Responses to Computers,”* makes clear at the outset that:
of the thousands of adults who have been involved in our studies, not a single participant has ever said that a computer should be understood in human terms or should be treated as a person.8
In the light of this unanimity, the actual behavior of these thousands paints a stark contrast, leading Nass and his group to conclude that there is clear evidence that people subconsciously treat computers as having personality and “apply social rules and expectations to computers.” The experiments they carried out were mainly based on situations described in the literature of experimental psychology. The same social situations were replicated, as were the same experimental stimuli, but instead of monitoring a human-human social situation, the experimenters replaced one of the humans with a computer. Before you start to think that this replacement creates a completely different form of interaction, pause for a moment to consider some important similarities: (a) Humans communicate using words—so do computers; (b) humans are interactive in that they respond to a social situation based on all of their prior “inputs” from the person with whom they are interacting—computers are also interactive, in that the way they respond is based on their prior inputs from the user during that session (and possibly during earlier sessions as well, if the software has been programmed to learn); and (c) computers fill many roles that have traditionally been filled by humans. It is against this background of similarity, rather than a background of complete difference, that the results of these experiments should be interpreted.
One series of experiments carried out by Reeves, Nass, and Moon investigated whether computer users attribute gender to a computer.* Three stereotypical attitudes were investigated: (a) Dominant men are assertive and independent—positive attributes—while dominant women are pushy or bossy—negative attributes; (b) people are more likely to accept an evaluation of their own performance if it comes from a man rather than from a woman; and (c) people assume that men know more about certain topics, thought of as “masculine” topics, than do women, while women know more than men about certain “feminine” topics. The experiment, designed to test whether these stereotypical attitudes extend to “male” and “female” computers, employed programs that incorporated male and female recorded voices saying exactly the same things.