by David Levy
Maintaining the enjoyment of a relationship can also come in a variety of ways: (a) the use of humor, which makes computers appear more likable, competent, and cooperative than computers that lack humor; (b) talking about shared past experiences and the expectations of future togetherness, especially when making use of reference to mutual knowledge; and (c) “continuity behaviors” related to the time people are apart, talking about the time spent apart and using appropriate greetings and farewells. All these are important strategies in maintaining a sense of persistence in a relationship.
Conversation in general is also an important element of relationships and has formed one of the biggest challenges to the AI community, ever since Alan Turing proposed his famous test for intelligence in 1950.* Most human relationships develop during the course of face-to-face conversations, and even small talk, such as regular use of the greeting “Good morning,” can influence the development of a conversation, since it has been found to increase the trust of some computer users.† And a lesson learned from the development of “expert systems” software is that another way for an intelligent computer to garner a user’s trust is by explaining and justifying its beliefs, decisions, and conclusions during a conversation.‡
It is not only what we say in conversation that affects people’s reactions to us; how we speak is also important. The way we address someone will usually depend on the form of our relationship with them: “David” is friendly; “Mr. Levy” is less so. “Hello” is friendly; “Good morning” is less so. Thus the forms of language used in a computer application, even if they are only in menus or some other form of text, signal a certain set of relational expectations on the part of the user. And the tone of voice produced by a computer’s speech synthesizer can also be an important factor in shaping the attitude of a user to that computer. The more frequently a computer matches the user in intonation, the higher the user rates the computer on measures of familiarity, such as comfortableness, friendliness, and perceived sympathy.
In summary, it would appear that all of the emotional benefits we have considered here, deriving from human-human relationships, could also be provided by computers. Similarly, the behaviors we have discussed here, those necessary to endear one human being to another, appear already to be capable of simulation and in some cases have been simulated, using conversational and other techniques that are the subjects of research in the AI community.
Virtual Pets—the Tamagotchi
In 1975 a fad for Pet Rocks was started in California by Gary Dahl, a salesman. Here was a pet that required no care, no food, no walking and yet gave its owner a few moments of pleasure. The idea spread like wildfire, and within a few weeks of the inception of the idea Dahl was selling rounded gray pebbles at the rate of ten thousand per day, together with a Pet Rock Training Manual—a step-by-step guide to having a happy relationship with your geological pet, including instructions for how to make it roll over and play dead and how to house-train it. “Place it on some old newspapers. The rock will never know what the paper is for and will require no further instruction.”
In the light of the widespread enthusiasm for Dahl’s completely inanimate, amorphous pets with which their owners could enjoy no real interaction, the advent and huge commercial success of the Tamagotchi* should have come as no great surprise. The idea for this product was conceived by a Japanese mother for her children, to counter their problem of being unable to own a real pet due to lack of space at home. Depending on which reports one believes, the number of Tamagotchis sold during its heyday varied between 12 million and 40 million.†
The Tamagotchi fits into the palm of the hand and is shaped like a flattened egg, with a small LCD‡ screen on which a simple graphical representation of the virtual pet is displayed. The idea is that the owner must care for the Tamagotchi in its virtual world, by pressing buttons to simulate the giving of food and drink, the playing of games, and other behaviors that are typical of a mother-child relationship, ensuring that the Tamagotchi will survive and thrive. When the Tamagotchi “wants” something, it sounds an electronic beep to alert its owner and indicates its particular needs at that moment by displaying appropriate icons on the LCD. If the Tamagotchi is neglected, it can “grow ill” and “die,” often causing heartbreak to its owner. The creature’s behavior patterns were programmed to change with time, in order to give the owners the sense that each Tamagotchi is unique and therefore provides a unique relationship for the owner, just as each pet animal and each human child are unique.
A remarkable aspect of the Tamagotchi’s huge popularity is that it possesses hardly any elements of character or personality, its great attraction coming from its need for almost constant nurturing. It is this nurturing theme that engenders, in many Tamagotchi owners, a feeling of love for their virtual pet, an experience that can substitute for the experience of owning and caring for a real pet or even a human baby. In Japan the biggest group of Tamagotchi owners has been women in their twenties, most of whom purchased their toy because they craved the experience of nurturing. In the mother-child and other relationships between humans, the nurturer nurtures as a natural consequence of her love for the nurtured one, and of the object’s need for her nurturing. In the human-Tamagotchi relationship, the same elements of a human relationship exist, but they act in the reverse direction—it is the need to nurture the virtual pet that engenders the emotion of love, not the love that impels the nurturing instinct. And this desire to nurture creates in many Tamagotchi owners what Sherry Turkle calls “the fantasy of reciprocation.”12 Tamagotchi owners also want their virtual pets to care about them in return.
This nurturing instinct is a significant feature in human-pet relationships as well. In referring to the role of a pet as a surrogate child in a childless relationship, or as an additional child for parents, Marvin Koller explains that yet another role of pets is to prolong the parenthood process for middle-aged and elderly parents whose children have flown the nest:
The family pet always needs attention, and the pleasure it brings its keepers derives partly from the sustained dominance and importance of those who take care of it. The need to be needed is powerful, and parents whose children have grown up are gratified by this sustained dependence of their family pet over the years.13
The literature abounds with anecdotes about Japanese Tamagotchi owners who go to great lengths to preserve the life and well-being of their virtual pet—businessmen who postpone or cancel meetings so as to be able to feed their Tamagotchi and attend to its other essential needs at appropriate times, women drivers who are momentarily distracted in traffic while responding to the beeping of their needy electronic creature, a passenger who had boarded a flight but felt compelled to leave the aircraft prior to takeoff—and vowed never to fly with that airline again—because a flight attendant insisted she turn off her Tamagotchi, which the passenger felt was akin to killing it. Every example reflects the attitude of devoted Tamagotchi owners that their lovable egg is alive, and a logical corollary of this virtual life is that the Tamagotchi can virtually “die.” When death occurs, the owners can arrange for the virtual “birth” of a new creature, and in addition many owners pay proper respect to their departed creature by logging on to a Web site that offers virtual cemeteries where the owners can post eulogies to their departed ones. The belief that their Tamagotchi had died is a further indication that the owner has somehow regarded it as having been alive.
It was not only in Japan that the Tamagotchi craze gave rise to important life decisions such as whether to miss a business meeting or to take one’s eyes off the road while driving. In Israel an important religious question arose that depended for its answer on whether a Tamagotchi was deemed to be alive. Orthodox Jews are not permitted to do anything on the Sabbath that constitutes “work,” and in the strictest of Orthodox households this includes such acts as switching on and off the lights and other electrical and electronic equipment, unless the act of work is necessary for pikuach nefesh—“the saving of souls,” an act of life or death. The question therefore arose, is the pressing of the buttons on a Tamagotchi, an act carried out in order to sustain the Tamagotchi’s virtual life, covered by the “saving of souls” exception? The position of Tamagotchi owners on this issue is clear, but the rabbinate in Israel took a different view—namely, that it is not a real soul being saved by pressing the buttons, and therefore interaction with a Tamagotchi is forbidden on the Sabbath. Despite this ruling, the very fact that the rabbinate had to make a decision on the Tamagotchi issue underlines the widespread feeling that the Tamagotchi is alive and has a right to life.
The effect of the Tamagotchi and Furby crazes has been to spawn a culture in which electronic products are accepted as having lifelike properties. Sherry Turkle describes how children have been affected by this realization of some sort of life in man-made objects:
A generation of children is growing up who grant new capacities and privileges to the machine world on the basis of its animation. Today’s children endow the category of made objects with properties such as having intentions and ideas. These were things previously reserved for living beings. Children come up with the new category “sort of alive” for describing computational animation, and they are increasingly softening the boundaries between artifact and flesh, as well as blurring boundaries between the physical real and simulation.14
But even though Turkle, when researching for her 1984 book The Second Self, grew to expect that children “might come to take the intelligence of artifacts for granted, to understand how they were created, and be gradually less inclined to give them importance,” she was surprised at “how quickly robotic creatures that presented themselves as having both feelings and needs would enter mainstream American culture,” remarking that “by the mid-1990s, as emotional machines, people were not alone.”15
Turkle explains that as a result of this change in perception as to the aliveness of artifacts, “people are learning to interact with computers through conversation and gesture. People are learning that to relate successfully to a computer you have to assess its emotional state;…you take the machine at interface value, much as you would another person.” And she discovered that in some people this change in perception can lead to a preference for interacting with an artificial creature rather than a real one, quoting children who, on seeing a pair of Galápagos turtles at the American Museum of Natural History in Boston, remarked that robot turtles would have been just as good, cleaner, and would have saved transporting the real ones thousands of miles. Turkle also observes that “when Animal Kingdom opened in Orlando, populated by ‘real’—that is, biological—animals, its first visitors complained that they were not as ‘realistic’ as the animatronic creatures in the other parts of Disney World. The robotic crocodiles slapped their tails, rolled their eyes—in sum, displayed archetypal ‘crocodile’ behavior. The biological crocodiles, like the Galapagos turtle, pretty much kept to themselves.”16
The relationship between Tamagotchi owners and their virtual pet has been compared to “parasocial” relationships. The term “parasocial” was coined by Donald Horton and Richard Wohl to represent the type of interaction that TV viewers have in mind when they imagine themselves becoming closely acquainted with the personalities of characters on their favorite shows: “After watching a television series for a period of time, viewers come to feel that they know the characters as well as friends or neighbors.”
It has been found that the process of developing parasocial relationships bears many similarities to the process of developing real-life relationships. But Linda-Renée Bloch and Dafna Lemish assert that the development of an owner-Tamagotchi relationship is quite different from a parasocial relationship because, in the case of the Tamagotchi, it is not a human (TV) personality with which the relationship is developed, but the personification of a machine. They support their assertion with the argument that in the Tamagotchi relationship the owners can affect the life of the creature by their actions: “The very existence of the virtual partner to the interaction depends on responding to its demands.” I take the opposite view. Precisely because the owner can affect the virtual life of the Tamagotchi, the relationship is an even stronger form of parasocial interaction than that between a TV viewer and a favorite character. The dream of having an intimate closeness with that character is better realized in the case of the Tamagotchi, because its owner controls, and has the power to enhance, the creature’s virtual life—just as a human has the power to enhance and to some extent control (or at least affect) the lives of friends and loved ones. This type of power can already be seen in some interactive TV systems that allow a viewer to determine what happens next in a story line—should she kiss him passionately, slap his face, or run out of the room crying? Such systems enhance TV viewers’ parasocial-relationship experience by adding the element of control, allowing them to gain an increased level of intimacy with the TV character in much the same way that Tamagotchi owners relate to their virtual pet.
Virtual Pets That Live on the Screen
Handheld virtual pets such as the Tamagotchi are the simplest form of the genre, based on low-cost electronics that allow a retail price of fifteen dollars or less. The next step up in complexity is the virtual pet that “lives” on the TV or computer screen, usually a cartoonlike character. The most believable and lifelike of these characters exhibit a variety of social cues: intelligence, individuality, sociability, variability, coherence, and some conversational ability. Add the ability to recognize the user’s emotional state and other social cues and they will become utterly compelling.
Sherry Turkle notes that the behavior of a character in a computer game impels some computer users to anthropomorphize not only the virtual character but also the computer itself. This is hardly surprising, given that computer users often anthropomorphize their computer even when the task it is executing is not one involving any virtual characters. When a believable character appears on the screen, the tendency to anthropomorphize must surely be greater.
A popular example of a screen-based character that encourages anthropomorphism is the virtual girlfriend. A character of this sort was first announced in a 1994 advertisement in PC Magazine:*
Now You Can Have Your Own GIRLFRIEND
…a sensuous woman living in your computer!
GIRLFRIEND is the first VIRTUAL WOMAN. You can watch her, talk to her, ask her questions and relate with her. Over 100 actual VGA photographs allow you to see your girlfriend as you ask her to wear different outfits, and guide her into different sexual activities. As a true artificial intelligence program, GIRLFRIEND starts with a 3000 word vocabulary and actually GROWS the more you use it. She will remember your name, your birthday, and your likes and dislikes. GIRLFRIEND comes with the base software [sic] and GIRLFRIEND LISA. Additional girls will be added. This program requires 7–10 MB of free space.
This type of character has recently been metamorphosed to create a new twist on the Tamagotchi concept. Rather than the user’s lavishing care on the virtual character as the path to giving her a long and happy life, the key with this virtual girlfriend, launched by the Hong Kong company Artificial Life in the autumn of 2004, is much simpler. It is money. For a monthly fee of six dollars (real money, not virtual dollars), customers can download an image of “Vivienne,” a slim, talking brunette, to their cell phones and then spend much more (real) money sending her virtual flowers, virtual chocolates, and other virtual gifts, not to mention the essential spending on the cell-phone calls necessary to interact with Vivienne. In return for their generosity, customers are made privy to different aspects of Vivienne’s life, such as meeting her virtual female friends, who also appear as images on the display screen of the cell phone. But if a customer neglects Vivienne, she refuses to speak.
Vivienne was followed in January 2006 by a virtual boyfriend for women, with other characters being planned by Artificial Life to cater to gay and lesbian customers.
Robotic Virtual Pets
The highest form of virtual pet is one that moves around your room—for example, Sony’s AIBO, a robot dog. AIBO’s design was based on the ethology* of canine behavior patterns, and in particular on the research conducted by John Scott and John Fuller, and also that of Michael Fox. This body of research has provided a comprehensive categorization of canine behavior patterns that covers the whole range of a dog’s activities and forms the basis for the AIBO’s own behavior patterns, which include expressions of anger, disgust, fear, happiness, sadness, and surprise.
AIBO comes with a number of preprogrammed behavior patterns that encourage owners to project humanlike attributes onto their virtual pets. The AIBO plays, it sleeps, it wags its tail, it simulates feelings of affection and unhappiness. Sony describes the AIBO as “a true companion with real emotions and instinct.”† Not everyone will embrace this concept, but to a large extent any argument over this point is not of great import. What is important is that many people, especially children and the elderly, have been found by psychologists to behave with AIBO in the same way they would interact with real animals. And as the technology improves and robot pets become increasingly lifelike, the boundary between people’s perceptions of robotic pets and their perceptions of real animals will become increasingly blurred.
As a result of its animal-like behavior, AIBO engenders feelings of love in many of its owners similar to those felt by the owners of real pets. Children’s interactions with AIBO were investigated in a comparative study of seven- to fifteen-year-olds, which compared their AIBO interactions to their interactions with a real Australian shepherd dog. The majority of children in this study treated AIBO in ways one would treat a dog. As one child said, when asked how she would play with AIBO, “I would like to play with him and his ball and just give him lots of attention and let him know he’s a good dog.” Fifty-six percent of those surveyed by Gail Melson believed that AIBO had mental states (for example, feeling scared), 70 percent said that AIBO had personality, and 76 percent asserted that AIBO had moral standing (i.e., it could be held morally responsible or blameworthy for its actions and could have rights and deserve respect). Given how rudimentary AIBO is in terms of its capabilities, it is remarkable that so many children treated it not only as if it were a social agent (the focus of research by Reeves and Nass, albeit with humans, not dogs) but also as having mental states and moral standing. It is therefore reasonable to conclude that as robots become increasingly lifelike in their behavior, and as these children influence the adults around them and grow into adults themselves, more and more people will treat robots as if they are mental, social, and moral beings—thus raising the perception of robotic creatures toward the level of biological creatures.