Love and Sex with Robots: The Evolution of Human-Robot Relationships
The design of the head that Hanson demonstrated was based on that of his blue-eyed girlfriend, Kristen Nelson. In April 2002, he had gone to a bar in the trendy Exposition Park area of Dallas, complete with a pair of calipers, in search of someone whose head would be suitable as a model for what Hanson had in mind. There he saw Kristen, whom he knew casually, and asked her, “Can I make you into a robot?” He did. The movements of Hanson’s artificial head are made possible by a collection of twenty-four motors, invisible to the observer, that simulate the actions of most of the muscles in the human face. The motors are driven by two microprocessors, and they employ nylon fishing line to tug the artificial skin when it needs to move. The eyes contain digital cameras to enable the head to see the people who are looking at it and, if required, to imitate their facial expressions, courtesy of its “muscle” motors.
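To make the mechanism concrete, here is a toy sketch of how a bank of "muscle" motors might be driven smoothly toward a target expression. The motor names, position values, and update rate are invented for illustration; Hanson's actual control software is not described in the text.

```python
# Hypothetical sketch: driving face motors toward a target expression.
# Motor names and positions (0.0 = relaxed, 1.0 = fully contracted)
# are invented for illustration.
EXPRESSIONS = {
    "neutral": {"brow_left": 0.0, "brow_right": 0.0,
                "lip_corner_left": 0.0, "lip_corner_right": 0.0},
    "smile":   {"brow_left": 0.1, "brow_right": 0.1,
                "lip_corner_left": 0.8, "lip_corner_right": 0.8},
}

def step_toward(current, target, rate=0.25):
    """Move each motor a fraction of the way toward its target, so the
    artificial skin is tugged smoothly rather than snapping into place."""
    return {name: pos + rate * (target[name] - pos)
            for name, pos in current.items()}

def animate(expression, steps=20):
    """Run the control loop until the face settles on the expression."""
    state = dict(EXPRESSIONS["neutral"])
    target = EXPRESSIONS[expression]
    for _ in range(steps):
        state = step_toward(state, target)
    return state
```

The gradual interpolation is the point: twenty-four motors snapping directly to their end positions would read as mechanical, while small incremental steps approximate the way real facial muscles ease into an expression.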
Following its first convincing demonstration and the aura of publicity that surrounded it, the head attracted interest from companies in fields ranging from artificial limbs to sex dolls. And that was in 2003. On the time line for the development of sentient, lovable robots, Hanson’s work puts head design ahead of schedule. Add Hanson’s artificial head to Grace’s body, and the physical appearance of robots will already have reached new heights of acceptability. And just as a robot’s emotional and intellectual makeup and its face and voice can be selected on an individual basis, so it can be designed with any wished-for physical characteristics, including skin, eye, and hair color; size of genitalia; and sexual orientation.
(LEFT TO RIGHT) DAVID HANSON’S ROBOT VERSION OF KRISTEN NELSON’S HEAD, KRISTEN, DAVID.
Feel and Touch Technologies
In designing artificial skin for robots, the most important properties will probably not be its appearance and expressiveness but rather its sensing capabilities—feel and touch. From a purely practical perspective, having a well-developed sense of feel will enable a robot to detect changes in its surroundings and move accordingly. But it is the more romantic aspects of feel that concern us here—how a robot can detect a physical expression of love, a caress or a kiss. Though perhaps with different research goals in mind, scientists in Japan, Italy, and the United States are working on high-tech skin development. The sensuous robot will be one of the spin-offs of their research.
At the University of Tokyo, a group led by Takao Someya is developing a synthetic skin, based on the technology for printing enormous numbers of flexible, low-cost pressure sensors on a large area of the skin material. Meanwhile in Italy, at the University of Pisa, Danilo de Rossi and his team are making skin using artificial silicone, which has the properties of elasticity (human skin stretches if pulled) and sensitivity to pressure. And in the United States, scientists at NASA are employing infrared sensors embedded in a flexible plastic covering—the sensors detect an object as the robot touches it and then send a signal to the robot’s computer, its “brain,” corresponding to the size, shape, and feel of the object.
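Whatever the sensor technology, the common idea is that a sheet of sensors reduces a touch to a compact signal for the robot's computer. The sketch below illustrates this with an invented grid of pressure readings; the grid size, threshold, and summary fields do not reflect the actual Tokyo, Pisa, or NASA designs.

```python
# Illustrative sketch only: reducing a 2D grid of skin-sensor readings
# to a compact "signal to the brain". All values are invented.

def summarize_touch(grid, threshold=0.1):
    """Summarize a grid of pressure readings as contact area, total
    force, and the centroid (row, column) of the touched region."""
    touched = [(r, c, p) for r, row in enumerate(grid)
               for c, p in enumerate(row) if p > threshold]
    if not touched:
        return None  # nothing is in contact with the skin
    total = sum(p for _, _, p in touched)
    return {
        "area": len(touched),   # how many sensors fired
        "force": total,         # overall pressure on the skin
        "center": (sum(r * p for r, _, p in touched) / total,
                   sum(c * p for _, c, p in touched) / total),
    }
```

A caress and a poke would produce very different summaries here: a caress spreads modest force over a large area, a poke concentrates force on a few sensors, and that difference is what higher-level software would classify.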
The different types of sensor and the different skin materials being investigated by these groups reflect the fact that the study of artificial-skin technology is still in its infancy and there is not yet a consensus as to what materials and technologies make for the best artificial skin. Future artificial-skin materials are likely to be more tactile and to incorporate even more sensors to afford greater sensitivity, but from the perspective of skin as an important component of a robot love object or sex object, it is hardly important what types of sensors are being used, or how many. What is important is that robots will be able to feel and recognize the touch and caress of an affectionate human, to know when their human is making the first physical overtures of passionate, romantic love. Similarly, a delicate sense of touch will be needed by a gentle robot lover, able to return its human’s tender caresses and initiate its own. Scientists at the Polytechnic University of Cartagena in Spain have created a sensitive robotic finger that can sense how much pressure it is exerting and adjust its effort accordingly, allowing a robot to caress its human partner with the sensitivity of a virtuoso lover.
Smell and Taste Technologies
One novel technology that will contribute to a robot’s physical appeal is smell synthesis. The right kind of bodily fragrance can act as a powerful attraction and aphrodisiac, and not necessarily the kind of scent that comes in small bottles with big price tags. Instead the idea is to create electronically any smell to order. Just as your stereo speakers play out digitally stored music, so their smell equivalent will spray out the digitally stored smells generated by this technology. Your robot can exude a favorite perfume or a realistic counterfeit of your (human) loved one’s body fragrance, or even a body fragrance of its own that has been designed to appeal to you and to cater to your hormones and your personal desires.
The early attempts at bringing smell technology to the market were not exactly a great success. Despite serious investment, reportedly $20 million in one company alone,* the sweet smell of success eluded the pioneers in this field. By 2005, however, a new generation of digital-smells companies were racing to be the first to launch viable smell-creation technology,† and technologies very similar to those employed in the generation of smells to order can also be employed in the creation of artificial flavors that taste just like the real thing.
The fascinating aspect of this technology, from the perspectives of love and sex, lies in the creation of scents that can set a partner’s hormones running. These sense technologies will provide some of the foundation for the amorous and sexual attraction that humans will feel for robots. Sex usually involves several senses simultaneously: We enjoy the sight of our loved one, we enjoy the sound of their voice, the feeling of their skin when we caress it and the feeling on ours when we are touched, we enjoy their smell and their taste. All of these senses heighten our erotic arousal, and all of their corresponding technologies can be designed into robots to make them both alluring and responsive.
Robot Behaviors
An important facet of designing robots that promote satisfactory relationships with humans (satisfactory from the human point of view) is an analysis of the extent to which the robot needs to behave in a sociable way with humans in different types of situation. If, in a particular situation, a robot exhibits none of the normal human characteristics of emotion, it will probably appear to be insensitive, indifferent, even cold or downright rude. Solving this problem is not that simple. There might be some people—some nationalities, some age groups, or one of the sexes—who do not perceive a robot to be any of these things in the given situation, simply because of their cultural, educational, or social background. What is cold, rude, or uncouth to one group in society might appear to be completely normal, acceptable, even friendly to another group. A sociable robot that has emotional intelligence will therefore need to be able to make this distinction, to decide how to behave with different people in the same situation in order to be perceived as sociable by all of them. (Robots will be programmed to want to be liked by everyone, just as you and I do.)
Other factors that might affect the appropriate way for a robot to behave include where the human-robot interaction is taking place. Is it in the home, where a more overtly friendly behavior by the robot would be appropriate? Or is it at work, where the human might be the robot’s boss (or vice versa), and therefore a more overtly respectful attitude would be required of the robot (or the human)? Robots will need to be endowed with many “rules” of sociability for all sorts of situations and contexts, and this rule set can be expanded through the use of learning technologies. If a robot acts in a manner that appears rude to a human, the robot can simply be told, “That is rude,” whereupon, like a well-brought-up child, the robot can learn to improve its manners and behavior.
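The "rules of sociability" plus correction described above can be pictured as a lookup table keyed by context, overwritten when the robot is told its manners are wrong. Everything in this sketch, including the contexts, behaviors, and the feedback phrase, is invented for illustration.

```python
# Toy sketch of context-dependent sociability rules and learning
# from correction. All contexts and behaviors are invented.

# Default behavior for each (place, relationship) context.
RULES = {
    ("home", "companion"): "greet warmly",
    ("work", "boss"):      "greet respectfully",
}

def choose_behavior(place, relationship):
    """Pick the behavior for this context, with a safe fallback."""
    return RULES.get((place, relationship), "greet politely")

def correct(place, relationship, feedback, replacement):
    """Like telling a well-brought-up child 'That is rude': replace
    the rule for this context with better manners."""
    if feedback == "That is rude":
        RULES[(place, relationship)] = replacement
```

A real sociable robot would of course need far richer context (culture, age, prior history with the person) than two keys, but the structure is the same: behavior selected by context, and the mapping revised by feedback rather than reprogramming.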
An interesting question here is whether robots should merely be designed to imitate human sociability traits or whether they should be taught to go further and create sociability traits of their own, traits that are atypical of humans but can nevertheless be appreciated by humans.
To do so would be a form of creativity, possibly no more difficult to program than the task of composing “Mozart’s” Forty-second Symphony or painting a canvas that can sell in an art gallery for thousands of dollars—tasks that have already been accomplished by AI researchers.*
At the ATR Intelligent Robotics and Communication Laboratories in Kyoto, a robot called Robovie has been developed as a test bed for ideas in robot-human communication. Robovie has a humanlike body that is only four feet tall, so as not to be overly intimidating to the humans outside the laboratory with whom it comes into contact from time to time. Robovie has two arms, two eyes, and a system of three wheels to enable it to move around. (Legs are not yet considered a necessity for Robovie’s principal sphere of activity, which is communication with humans rather than tasks involving movement.) Robovie has an artificial skin, to which have been attached various sensors, sixteen of them, made from pressure-sensitive rubber. It can speak, it can hear and recognize human speech, and it can charge its own batteries when necessary.
Robovie’s developers believe that there is a strong correlation between the number of appropriate behaviors a robot can generate and how intelligent it appears to be. The more often a robot can behave in what is perceived to be an appropriate manner, the more highly will its intelligence be regarded. The scientists developing Robovie plan to continue to develop new behavior patterns until Robovie has advanced to the point where it is much more lifelike than a simple automaton. Part of this progress will come from the robot’s tendency to initiate interaction with a human user, rather than merely being reactive. You and I don’t always wait until we are spoken to before we say something, so why should a robot? You and I don’t always wait until someone stretches out their hand to us and says, “Hi. Nice to meet you.” Nor should a robot. Robovie will in appropriate circumstances shake hands with you; hug you; greet, kiss, and converse with you; play simple games such as rock-scissors-paper; and sing to you. And these are just some of the behavior patterns it had been taught up to mid-2004.
Robovie’s arms, eyes, and head also contribute to the robot’s ability to interact with humans and to how they perceive it, partly because of the importance of eye contact in the development of human relationships and therefore in the creation of empathetic robots. We humans greatly increase our understanding of what others are saying to us, the subtext as well as the words themselves, when we establish eye contact and observe a speaker’s body gestures. Research has repeatedly shown that during a conversation humans become immediately aware of the relative position of their own body and that of the person to whom they are speaking—the body language improves the communication. This explains the tendency for Japanese roboticists to build human-shaped robots, endowing them with effective communication skills and employing the results of research from cognitive science to create more natural communication between robot and human.
Experiments with a group of twenty-six university students showed that Robovie exhibits a high level of performance when interacting with humans, while the students generally behaved as though they were interacting with a human child, many of them maintaining eye contact with the robot for more than half the duration of the experiment. Some of the students even joined in with the robot in its exercise routines, moving their own arms in time with the robot’s movements. The natural appearance of the students’ interactions in the experiment was attributed to the humanlike appearance and behavior of the robot.
Humanoid Robots—from the Laboratory to the Home
The development of humanoid robots has thus far been a long and slow process. The first serious development of humanoids began at the School of Science and Engineering at Waseda University in Japan, with the commencement of the WABOT project in 1970. The first full-scale humanlike robot, WABOT-1, was completed in 1973. It could talk (in Japanese), it could measure distances, it could walk, and it was able to grip and carry objects with hands that incorporated tactile sensors to allow the robot to feel what it was carrying. It also had an artificial mouth, ears, and eyes.
In 1984 came the musician robot WABOT-2, designed to play a keyboard instrument. This task was chosen by the Waseda engineers as one that requires humanlike intelligence and dexterity. WABOT-2 could read a musical score, play tunes of average difficulty on an electronic organ, and accompany someone who was singing a song.
The most dramatic development thus far in the Waseda project started in 1986: creating a robot that can walk like a human. Well, almost. Its feet edge slowly and deliberately forward, and even after twenty years’ research it is not yet able to qualify for the walking championship in the Olympic Games. But it has long been able to climb up and down stairs and inclines, it can set its own gait so as to be able to move on rough terrain and avoid obstacles, and it can walk on uneven surfaces.
The March of the Humanoids
Once upon a time, before the advent of the PC, computers were so expensive that they were rarely found outside the confines of government, big business, and academia. Reasons for this expense included the high cost of powerful processing units—the “electronic brains” that enabled the computers to compute—and of the computer memories that had to be employed to store the programs and their data. All this changed in the late 1970s, when inexpensive microprocessors became available, devices that cost a few dollars but could perform calculations and the electronic manipulations of data that only a few years earlier would have required a “mainframe” computer.* Suddenly there were computers in the home, such as the Commodore PET and the Sinclair ZX Spectrum, inveigling themselves into people’s daily lives. Androids have not yet reached that level of integration into our society, but their day is fast approaching.
Robots are not yet just like us, obviously. They behave in most respects in what we currently refer to as “robotlike” or “robotic” ways. One physical manifestation of this is how biped robots walk, slowly and deliberately moving their feet, making it obvious to the observer that they’re thinking about every step. Even the most advanced android robots today move in this extremely slow and deliberate manner.* Similarly, the best of today’s conversational software can be recognized as artificial by just about all the judges at the annual computer-conversation competitions. So as yet we cannot fairly describe our robots as being sociable, because to be considered sociable they would first need to be more humanlike. But that will come. When robots are perceived as making their own decisions, people’s perceptions of them—as solely tools for mowing the lawn and other domestic tasks—will change. And just as the day will arrive when, all of a sudden, robots are sufficiently humanlike to be considered for the epithet “sociable,” so the day will also come when robots are sufficiently sociable, in human terms, to be considered as candidates for our deepest affections.
Why do I believe that the necessary change in thinking will take place among a wide body of the population, a change sufficiently dramatic to alter people’s perception of robots from that of servants to their being our friends, companions, and more? It is because we have already seen other instances of the process necessary to bring about similar changes in our ideas about the roles of robots. This process requires two components—a change in our social and/or cultural thinking and a significant leap in technological capability.
There are several examples from the twentieth century of major social and cultural changes—particularly those relating to women: their enfranchisement as voters; their role in the home and in parenting, developing from that of dutiful housewives to members of a more equal partnership; their role in the workplace, from filling only the more menial jobs to taking on management and executive positions; the advances in female contraception that have given women more choices regarding their lifestyles and careers. Society is also undergoing a change in ideas regarding senior citizens, moving away from the expectation that one works with retirement in mind—and the sooner the better—to what is becoming regarded as a more economically sound model—namely, that later retirement means more earning potential and a lesser financial burden on the state, on one’s children, and on inadequate pension schemes. Another change that has become apparent in recent years is in society’s view of human appearance, as our concerns over obesity can be seen to lead to cultural expectations regarding the “correct” body size and shape, the result being that many women develop eating disorders while they try to stay (or become) thin. Also more apparent nowadays are cultural changes in individuals, as those who encounter people of other cultures sometimes question the ideas and conventions of their own culture, and change as a result.
Leaps in technology occur frequently. In the case of humanoid robots with the capabilities described in this book, most of the more difficult advances will be in the realm of the robot’s software—the computer programs that give it emotions and personality, that enable it to think, to understand what is said to it, to conduct a conversation, to make intelligent deductions and assumptions. These advances will come partly through new techniques in artificial intelligence—in other words, through new programming ideas—and partly because of developments in computer hardware, in the chips or whatever it is that will do the thinking, and in the computer memories that store the massive amounts of information robots will need. We have seen for many years that computing speeds and computer memory sizes increase steadily, year upon year, but the increases we have witnessed during the past two or three decades will pale into insignificance when completely new technologies become mainstream, technologies that go under names such as “optical computing,” “quantum computing,” “DNA computing,” and “molecular computing.” So rest assured, the advances in technology needed to create the robots that I describe in this book will indeed come. It is only a matter of time, and technological advances are happening ever faster as time goes on. The more we know about a science, the faster we are able to discover even more about that science and to develop technologies based on this new knowledge.