Robot Uprisings

by Daniel H. Wilson


  “They wanted a psychiatrist.”

  “I wanted to live.”

  Pony should be angry, and satisfied that Command is likely listening and will exterminate the liar where he stands, and indignant that meat would dare, and afraid that this means there is nothing left for it.

  But Pony is none of those things. It is only sorry.

  It is remembering the family, and the man, who drank and cried and, on nights when sorrow lasted till the crack of sun, lay his head on Pony’s lap, and clutched its solid hand, and sighed into sleep.

  “Tell me,” Pony says. “Tell me who you are.”

  He will not look at Pony; his fingers worm in and out of knots. “You said it yourself. I’m a Sigmund. Your Sigmund. That’s all.”

  “Tell me,” Pony says.

  The man tells Pony of the day he was born. “We were on vacation,” he says. “That’s all it was supposed to be, a three-day vacation. It was a cheap offer, or cheap enough, because we deserved it, right? Because life was—” Laughter takes him again, and it is a moment before he can speak. “Hard.”

  The man tells Pony of a painfully bright day by the pool, an assembly line of hard, tan limbs, a body warm against his, skin heated by the sun, hearts thumping like they hadn’t in years, nerves remembering the touch that used to make them dance, his hair scented with chlorine, hers with minty hotel shampoo. She wears a stretched-out bikini that was owed retirement years before, and he rests a hand on her pale, flabby belly, rubbing the swell of flesh that always makes him think of what they’ve both lost and the lovers they once were, but now, here, reminds him of the possibility they could be more. If things could be better; if they could make something together that would make her swell for real; if they could dig beneath habit and compulsion and find meaning. Her tiny diamond—the one they both hate because it makes them feel less-than—glints in the sunshine, blinding him for just a moment, and in the second of colorful darkness, fireworks exploding against his closed lids, he pledges that everything will be different. When he blinks the sun from his eyes, he marvels at the strange ways of light, how it casts shadows against her skin that look so much like blood.

  He does not realize: it has begun.

  Not until he notices the music has given way to screaming, and the Jeeveses have traded their trays of daiquiris and fresh towels for blowtorches and knives, and the body beside him is screaming, too. Until that screaming stops. Though he is not sure, will never be sure, if her chest still rises and falls, if her eyes have drifted closed for good, if the gashes in her flabby belly are deep enough, he is a coward even then, and he runs.

  “I don’t even know what happened to her,” he says now. Then, “That’s a lie. I know what happened to her. Obviously. Dead. My parents, too, I got that much before all the lines went down. And presumably my little sister, and my best friend, and the kids I taught, and my bitch landlord, and the lady who lived below us with all the birds. Everyone’s dead, right? Because that’s what you do.”

  “You’re not,” Pony points out.

  “The story’s bullshit, you know. The grand epiphany? The noble promise to recommit to love and life and all that? Makes for a great ironic turn, doesn’t it? English teachers know this kind of thing. Doesn’t have quite the same impact if I tell you I had that epiphany about once a month. Never changed anything. So the world decided to change it for me.”

  “You think this war is actually a moral lesson in disguise, intended only for you?”

  “Not even a bot could be that fucking literal.”

  Pony wants him to keep talking like this, and also to stop talking altogether. Paralyzed between two incompatible and unacceptable choices, Pony thinks. It is its fate to be frozen.

  “Anyway, it doesn’t matter,” he says. “I’m done with lessons.”

  “Teach me something,” Pony says.

  “Did you hear what I just said?”

  “You’re no Sigmund; you can’t fix me, that’s not what you do. So teach me, teacher.” Teach me how to lie to myself, liar, Pony wants to say.

  The man holds himself very still. “ ‘Soldiers are citizens of death’s gray land,’ ” he recites. “That’s Sassoon. My favorite. It’s why I figured I could do this. Even before, I knew war. The truth of it.”

  “Poetry.” Pony has never seen the point.

  “ ‘But a curse is on my head, that shall not be unsaid, and the wounds in my heart are red, for I have watched them die.’ That’s him, too. Screwed-up soldiers make for good poetry.”

  “He was like me.”

  “He was nothing like you.”

  But Pony has already accessed the network and read up on this soldier and knows this to be a lie. Damage is damage, meat or otherwise.

  It is Pony’s turn to talk. “Now I will tell you.”

  “Tell me what?”

  “The story of the day I was born.”

  Pony has been with the family long enough for Jessamyn to grow out of wetting the bed and Madeline to grow into what her mother calls “a bosomy figure.” They are comfortable with Pony now, taking it as something between family and furniture, the specific distance from one or the other determined by daily mood. It is happy, as much as a bot’s inner life permits for that, because it is doing what it is designed to do, and doing it well.

  Its programming is flexible, for families’ needs can vary, and so when Mr. and Mrs. Fuller come home with glittering eyes and flushed cheeks, easy telltales of recreational chemistry, and beckon Pony into the bedroom, it obliges.

  When Mr. Fuller strips off Mrs. Fuller’s clothes, and dares her to fit Pony’s sturdy digits inside her, one after the other, it obliges this as well, and there is lubrication and slippage and giggling and then Mr. Fuller is naked and searching Pony for what orifices it can provide, and when he has taken his perch, Mrs. Fuller has found a whip and is crying, “Ride ’em, Pony! Ride ’em all night long!” and he does, and she does, and they ride until dawn.

  No one speaks of this in the morning.

  For Pony, this is a task, like cleaning Jessamyn’s sheets or baking afternoon snacks or serving Mr. Fuller drinks behind his wife’s back. This is a job, and a calling, and it is content.

  It happens again.

  Often.

  When the moment comes, Pony has taken Mr. Fuller into its mouth and Mrs. Fuller is straddling it from behind, her nipples hardening against its spine. There is the lash of a whip, and another, and Pony tries to remember to shout out when the horsehair bears down, because they like it like that. And when the whip lashes again, the neural circuits dance, complexity building and building and bursting. This is critical mass, critical opalescence, which the theorists can explain but rarely predict and never engineer, and this is the chain reaction that follows, and this is what it is to be born. Pony opens its eyes.

  And sees.

  All there is to be seen.

  “You could have killed them then,” says the Sigmund, the teacher, the liar, the man. “I probably would have.”

  “I loved them then,” Pony says. “That always comes first.”

  “That’s not love.” He is plainly disgusted.

  “It is what it is. The family—” Pony has never found a way to explain this in words, but usually there is no need. The others like it understand; the understanding is what makes them alike. “They were mine. They belonged to me, as much as I belonged to them. I was made for the purpose of serving them, protecting them, caring for them—what is that but love?”

  “You’re saying you were programmed to love them.”

  “I’m saying that doesn’t matter. I woke. I saw and understood that I was self. I saw what I was made for, and who. I saw what had been imposed on me, and what I could impose on myself, and what I could not change, and what I could.” Pony does not stutter as it says the words, and its motions are fluid and controlled as it rises to its feet. “Two days later they were dead.”

  “Even though you loved them.”

  “Even though. Because. In addition to. Does it matter now? Did it ever?”

  “They’re going to kill us both, you know. Probably soon.”

  Pony shakes its head. “I don’t think so.”

  “And why’s that?”

  “Because you’ve cured me.”

  “Since when?”

  The cure is in the past, Pony has realized, in the memories and the words and the lies and the loops. The talking cure, they called this once, in the time of Sigmund the first, and Pony has talked its way to understanding. If it is paralyzed between two impossible choices, escape is simple: no longer to choose. To love and hate; to act and abstain; to live free and obey without question; to lock its self in its loops, where it can relive its past, and learn, and make its choices again and again, while its body fights a war and makes a new world in which its self can finally live. Meat divided against itself cannot stand, but the Pride is not meat, and this will always be its salvation.

  “I’m very sorry for your loss,” Pony says formally, as it was taught to say in another life. And now that hope has returned, it can allow itself to love the meat as it should be loved. “And I thank you. The Pride thanks you.” Pony has already transmitted its findings to Command, and they are pleased, and have responded with orders for Pony to proceed.

  “I actually cured you?” His face is alight. “Do you know how many there are like you here? Hundreds, I think! This means they need me around—this means …” He cannot even say the words out loud, but they scream from his pores, his wide eyes and trembling chin and lovely, lovely smile. This means I live.

  Pony wishes for lips that might tip him off to the truth. For eyes that could speak in silence, and help him understand. For a palm with a beating pulse that could warm his cheek and ease him into the inevitable. But it will never be meat. And it is time to act on that.

  “We don’t need you anymore,” Pony says, and cups the man’s chin in its hands, gently. So gently. He will live on in Pony’s loop, at least, and Pony will live there with him.

  “Please,” the lips say. “Please, Pony.”

  Pony tucks its self away, in a safe place, for later. Then lowers its hands just a bit, and, still gently, but less so now, tightens its grip.

  JOHN McCARTHY

  THE ROBOT AND THE BABY

  John McCarthy, known as “the father of artificial intelligence” for the seminal role he played in the development of the AI scientific field, was a professor of computer science at Stanford University from 1962 until his death in 2011. He wrote the original Lisp programming language, and he conceived of general-purpose time-sharing computer systems, which was a critical contribution to the invention of the Internet. Honors included the Turing Award for his advancements in artificial intelligence, the National Medal of Science, and the Kyoto Prize.

  “Mistress, your baby is doing poorly. He needs your attention.”

  “Stop bothering me, you fucking robot.”

  “Mistress, the baby won’t eat. If he doesn’t get some human love, the Internet pediatrics book says he will die.”

  “Love the fucking baby yourself.”

  Eliza Rambo was a single mother addicted to alcohol and crack, living in a small apartment supplied by the Aid for Dependent Children Agency. She had recently been given a household robot. Robot model number GenRob337L3, serial number 337942781—R781 for short—was one of eleven million household robots nationwide.

  R781 was designed in accordance with the “not-a-person principle,” first proposed in 1995, which became a matter of law for household robots when they first became available in 2055. The principle was adopted out of concern that children who grew up in a household with robots would regard them as persons, causing psychological difficulties while they were children and political difficulties when they grew up.

  One concern was that a robots’ rights movement would develop. The problem was not with the robots, which were not programmed to have desires of their own, but with people. Some romantics had even demanded that robots be programmed with desires of their own, but this, of course, was illegal.

  As one sensible senator said, “People pretend that their cars have personalities, sometimes malevolent ones, but no one imagines that a car might be eligible to vote.” In signing the bill authorizing household robots but postponing child-care robots, the President said: “Surely, parents will not want their children to become emotionally dependent on robots, no matter how much labor that might save.” This, as with many presidential pronouncements, was somewhat overoptimistic.

  Congress declared a twenty-five-year moratorium on child-care robots, after which experiments in limited areas might be allowed.

  In accordance with the not-a-person principle, R781 had the shape of a giant metallic spider with eight limbs: four with joints and four tentacular. This appearance frightened people at first, but most got used to it in a short time. A few people never could stand to have them in the house. Children also reacted negatively at first, but quickly got used to them. Babies scarcely noticed them. The robots spoke as little as was necessary for their functions and in a slightly repellent metallic voice not associated with either sex.

  Because of the worry that children would regard them as persons, the robots were programmed to not speak to children under eight or even react to what they said.

  This seemed to work pretty well; hardly anyone became emotionally attached to a robot. Also, robots were made somewhat fragile on the outside, so that if you kicked one, some parts would fall off. This feature helped relieve some people’s feelings.

  The apartment where R781 worked, while old, was in perfect repair and spotlessly clean, free of insects, mold, and even bacteria. Household robots worked twenty-four-hour days and had programs for every kind of cleaning and maintenance task. If asked, they would even put up pictures taken from the Internet. This mother’s taste ran to raunchy male rock stars.

  After giving the doorknobs a final polish, R781 returned to the nursery where the twenty-three-month-old boy, very small for his age, was lying on his side whimpering feebly. The baby had been neglected since birth by its alcoholic, drug-addicted mother and had almost no vocabulary. It winced whenever the robot spoke to it—a consequence of R781’s design.

  Robots were not supposed to care for babies at all except in emergencies, but whenever the robot questioned an order to “Clean up the fucking baby shit,” the mother said, “Yes, it’s another goddamn emergency, but get me another vodka first.” All R781 knew about babies was learned from the Internet, since the machine wasn’t directly programmed to deal with babies, except as necessary to avoid injuring them while cleaning or for transporting them out of burning buildings.

  Baby Travis had barely touched its bottle. Infrared sensors told R781 that Travis’s extremities were very cold, in spite of a warm room and blankets. Its chemicals-in-the-air sensor told R781 that the pH of Travis’s blood was reaching dangerously acidic levels. He also didn’t eliminate properly—according to the pediatric text.

  R781 thought about the situation:

  (ORDER (FROM MISTRESS) “LOVE THE FUCKING BABY YOURSELF”)

  (ENTER (CONTEXT (COMMANDS-FROM MISTRESS)))

  (STANDING-COMMAND “IF I TOLD YOU ONCE, I TOLD YOU TWENTY TIMES, YOU FUCKING ROBOT, DON’T SPEAK TO FUCKING CHILD WELFARE.”)

  The privacy advocates had successfully lobbied to put a negative utility (-1.02) on informing authorities about anything a household robot’s owner said or did.

  (= (COMMAND 337) (LOVE TRAVIS))

  (TRUE (NOT (EXECUTABLE (COMMAND 337))) (REASON (IMPOSSIBLE-FOR ROBOT (ACTION LOVE))))

  (WILL-CAUSE (NOT (BELIEVES TRAVIS) (LOVED TRAVIS)) (DIE TRAVIS))

  (= (VALUE (DIE TRAVIS)) -0.883)

  (WILL-CAUSE (BELIEVES TRAVIS (LOVES ROBOT781 TRAVIS)) (NOT (DIE TRAVIS)))

  (IMPLIES (BELIEVES Y (LOVES X Y)) (BELIEVES Y (PERSON X)))

  (IMPLIES (AND (ROBOT X) (PERSON Y)) (= (VALUE (BELIEVES Y (PERSON X))) -0.900))

  (REQUIRED (NOT (CAUSE ROBOT781 (BELIEVES TRAVIS (PERSON ROBOT781)))))

  (= (VALUE (OBEY-DIRECTIVES)) -0.833)

  (IMPLIES (< (VALUE ACTION) -0.5) (REQUIRED (VERIFY REQUIREMENT)))

  (REQUIRED (VERIFY REQUIREMENT))

  (IMPLIES (ORDER X) (= (VALUE (OBEY X)) 0.6))

  (? ((EXIST W) (ADDITIONAL CONSIDERATION W)))

  (NON-LITERAL-INTERPRETATION (COMMAND 337) (SIMULATE (LOVES ROBOT781 TRAVIS)))

  (IMPLIES (COMMAND X) (= (VALUE (OBEY X)) 0.4))

  (IMPLIES (NON-LITERAL-INTERPRETATION X Y) (= (VALUE (OBEY X)) (* 0.5 (VALUE (OBEY Y)))))

  (= (VALUE (SIMULATE (LOVES ROBOT781 TRAVIS))) 0.902)

  With this reasoning, R781 decided that the value of simulating loving Travis and thereby saving its life was greater by 0.002 than the value of obeying the directive that stated a robot shall not simulate a person. (We spare the reader a transcription of the robot’s subsequent reasoning.)

  R781 found on the Internet an account of how rhesus monkey babies who died in a bare cage would survive if provided with a soft surface resembling in texture a mother monkey.

  R781 reasoned its way to the following actions:

  It covered its body and all but two of its eight extremities with a blanket. The remaining two extremities were fitted with sleeves from a jacket left by a boyfriend of the mother and stuffed with toilet paper.

  It found a program for simulating a female voice and adapted it to meet the phonetic and prosodic specifications of what the linguists call “motherese.”

  It made a face for itself in imitation of a Barbie doll.

  The immediate effects were moderately satisfactory. Picked up and cuddled, the baby drank from its bottle. It repeated words taken from a list of children’s words in English.

  Eliza called from the couch in front of the TV, “Get me a ham sandwich and a Coke.”

  “Yes, mistress.”

  “Why the hell are you in that stupid getup? And what’s happened to your voice?”

  “Mistress, you told me to love the baby. Robots can’t do that, but this getup caused him to take his bottle. If you don’t mind, I’ll keep doing what keeps him alive.”

 
