First Contact - Digital Science Fiction Anthology 1
“Mr. Church, would you like to read my operation manual?”
The machine’s metal lips flap in sync with the voice, which is pleasant, androgynous, and very “computery.” No doubt that was a decision made after a lot of research to avoid the uncanny valley. Make the voice too human, and you actually diminish the ability to create false empathy.
“No, I don’t want to read your operation manual. Does it look like I want to hold up a book?” I lift my limp left arm with my right and let it drop. “But let me guess, you can lift me, carry me around, give me a restored sense of mobility, and engage me in healthy positive chitchat to maintain my mental health. Does that about cover it?”
My outburst seems to shock the machine into silence. I feel good for a few seconds before the feeling dissipates. Great, the highlight of my day is yelling at a glorified wheelchair.
“Can you help me up?” I feel foolish, trying to be polite to a robot. “I’d like a ... bath. Is that something you can help with?”
Its movements are slow and mechanical, nonthreatening. The arms are steady and strong, and it gets me undressed and into the bathtub without any awkwardness. There is an advantage to having a machine taking care of you: you don’t have much self-consciousness or shame being naked in its arms.
The hot bath makes me feel better.
“What should I call you?”
“Sandy.”
That’s probably some clever acronym that the marketing team came up with after a long lunch. Sunshine Autonomous Nursing Device? I don’t really care. “Sandy” it is.
According to Sandy, for “legal reasons,” I’m required to sit and listen to a recorded presentation from the manufacturer.
“Fine, play it. But keep the volume down and hold the crossword steady, would you?”
Sandy holds the folded-up paper at the edge of the tub with its metal fingers while I wield the pencil in my good hand. After a musical introduction, an oily, rich voice comes out of Sandy’s speaker.
“Hello. I’m Dr. Vincent Lyle, Founder and CEO of Sunshine Homecare Solutions.”
Five seconds in, and I already dislike the man. He takes far too much pleasure in his own voice. I try to tune him out and focus on the puzzle.
“... without the danger of undocumented foreign homecare workers, possible criminal records, and the certain loss of your privacy ...”
Ah, yes, the scare to seal the deal. I’m sure Sunshine had a lot to do with those immigration reform bills and that hideous Wall. If this were a few years earlier, Tom and Ellen would have hired a Mexican or Chinese woman, probably an illegal, very likely not speaking much English, to move in here with me. That choice is no longer available.
“... can be with you, 24/7. The caretaker is never off-duty ... “
I don’t have a problem with immigrants, per se. I’d taught plenty of bright Mexican kids in my class – some of them no doubt undocumented – back when the border still leaked like a sieve. Peggy was a lot more sympathetic with the illegals and thought the deportations too harsh. But I don’t think there’s a right to break the law and cross the border whenever you please, taking jobs away from people born and raised here.
Or from American robots. I smirk at my little mental irony.
I look up at Sandy, who lifts the lens hood flaps over its cameras in a questioning gesture, as if trying to guess my thoughts.
“... the product of the hard work and dedication of our one-hundred-percent American engineering staff, who hold over two hundred patents in artificial intelligence ...”
Or from American engineers, I continue musing. Low-skilled workers retard progress. Technology will always offer a better solution. Isn’t that the American way? Make machines with metal fingers and glass eyes to care for you in your twilight, machines in front of which you won’t be ashamed to be weak, naked, a mere animal in need, machines that will hold you while your children are thousands of miles away, absorbed with their careers and their youth. Machines, instead of other people.
I know I’m pitiful, pathetic, feeling sorry for myself. I try to drive the feelings away, but my eyes and nose don’t obey me.
“... You acknowledge that Sunshine has made no representation that its products offer medical care of any kind. You agree that you assume all risks that Sunshine products may ...”
Sandy is just a machine, and I’m alone. The idea of the days, weeks, years ahead, my only company this robot and my own thoughts, terrifies me. What would I not give to have Peggy back?
I’m crying like a child, and I don’t care.
“... Please indicate your acceptance of the End User Agreement by clearly stating ‘yes’ into the device’s microphone.”
“Yes, YES!”
I don’t realize that I’m shouting until I see Sandy’s face “flinch” away from me. The idea that even a robot finds me frightening or repulsive depresses me further.
I lower my voice. “If your circuits go haywire and you drop me from the top of the stairs, I promise I won’t sue Sunshine. Just let me finish my crossword in peace.”
“Would you drop me out the upstairs window if I order you to?”
“No.”
“Have a lot of safeguards in those silicon chips, do you? But shouldn’t you prioritize my needs above everything else? If I want you to throw me down the stairs or choke me with your pincers, shouldn’t you do what I want?”
“No.”
“What if I ask you to leave me in the middle of some train tracks and order you to stay away? You wouldn’t be actively causing my death. Would you obey?”
“No.”
It’s no fun debating moral philosophy with Sandy. It simply refuses to be goaded. I’ve not succeeded in getting sparks to flow from its head with artfully constructed hypotheticals the way they always seem to do in sci-fi flicks.
I’m not sure if I’m suicidal. I have good days and bad days. I haven’t broken down and cried since that first day in the bathtub, but it would be a stretch to say that I’ve fully adjusted to my new life.
Conversations with Sandy tend to be calming in a light, surreal manner that is likely intentional on the part of Sandy’s programmers. Sandy doesn’t know much about politics or baseball, but just like all the kids these days, it’s very adept at making Web searches. When we watch the nightly game on TV, if I make a comment about the batter in the box, Sandy generally stays silent. Then, after a minute or so, it will pop in with an obscure statistic and a non-sequitur comment that’s probably cribbed verbatim from some sabermetrics site it just accessed wirelessly. When we watch the singing competitions, it will offer observations about the contestants that sound like it’s reading from the real-time stream of tweets on the Net.
Sandy’s programming is surprisingly sophisticated. Sunshine apparently put a great deal of care into giving Sandy “weaknesses” that make it seem more alive.
For example, I discovered that Sandy didn’t know how to play chess, and I had to go through the charade of “teaching” it even though I’m sure it could have downloaded a chess program in seconds. I can even get Sandy to make more mistakes during a game by distracting it with conversation. I guess letting the invalid win contributes to psychological well-being.
Late morning, after all the kids have gone to school and the adults are away at work, Sandy carries me out for my daily walk.
It seems as pleased and excited to be outside as I am – swiveling its cameras from side to side to follow the movements of squirrels and hummingbirds, zooming its lenses audibly on herb gardens and lawn ornaments. The simulated wonder is so real that it reminds me of the intense way Tom and Ellen used to look at everything when I pushed them along in a double stroller.
Yet Sandy’s programming also has surprising flaws. It has trouble with crosswalks. The first few times we went on our walks, it did not bother waiting for the walk signal. It just glanced around and dashed across with me when there was an opening in the traffic, like an impatient teenager.
Since I’m no longer entertaining thoughts of creatively getting Sandy to let me die, I decide that I need to speak up.
“Sunshine is going to get sued if a customer dies because of your jaywalking, you know? That End User Agreement isn’t going to absolve you from such an obvious error.”
Sandy stops. Its “face,” which usually hovers near mine on its slender stalk of a neck when we’re on walks like this, swivels away in a facsimile of embarrassment. I can feel the robot settling lower in its squat.
My heart clenches up. Looking away when admonished was a habit of Ellen’s when she was younger. She would blush furiously when she felt she had disappointed me, and not let me see the tears that threatened to roll down her cheeks.
“It’s all right,” I say to Sandy, my tone an echo of the way I used to speak to my little daughter. “Just be more careful next time. Were your programmers all reckless teenagers who believe that they’re immortal and traffic laws should be optional?”
Sandy shows a lot of curiosity in my books. Unlike a robot from the movies, it doesn’t just flip through the pages in a few seconds in a flutter. Instead, if I’m dozing or flipping through the channels, Sandy settles down with one of Peggy’s novels and reads for hours, totally absorbed, just like a real person.
I ask Sandy to read to me. I don’t care much for fiction, so I have it read me long-form journalism, and news articles about science discoveries. For years it’s been my habit to read the science news to look for interesting bits to share with my class. Sandy stumbles over technical words and formulas, and I explain them. It’s a little bit like having a student again, and I find myself enjoying “teaching” the robot.
This is probably just the result of some kind of adaptive programming in Sandy intended to make me feel better, given my past profession. But I get suckered into it anyway.
I wake up in the middle of the night. Moonlight falls through the window to form a white rhombus on the floor. I imagine Tom and Ellen in their respective homes, sound asleep next to their spouses. I think about the moon looking in through their windows at their sleeping faces, as though they were suddenly children again. It’s sentimental and foolish. But Peggy would have understood.
Sandy is parked next to my bed, the neck curved around so that the cameras face away from me. It gives the impression of a sleeping cat. So much for being on duty 24/7, I think. Simulating sleep for a robot carries the anthropomorphism game a bit too far.
“Sandy. Hey, Sandy. Wake up.”
No response. This is going to have to be another feedback item for Sunshine. Would the robot “sleep through” a heart attack? Unbelievable.
I reach out and touch the arm of the robot.
It sits up in a whirring of gears and motors, extending its neck around to look at me. A light over the cameras flicks on and shines in my face, and I have to reach up to block the beam with my right hand.
“Are you okay?” I can actually hear a hint of anxiety in the electronic voice.
“I’m fine. I just wanted a drink of water. Can you turn on the bedside lamp and turn off that infernal laser over your head? I’m going blind here.”
Sandy rushes around, its motors whining, and brings me a glass of water.
“What happened there?” I ask. “Did you actually fall asleep? Why is that even part of your programming?”
“I’m sorry,” Sandy says. It really does seem contrite. “It was a mistake. It won’t happen again.”
I’m trying to sign up for an account on this website so I can see the new pictures of the baby posted by Ellen.
The tablet is propped up next to the bed. Filling in all the information with the touchscreen keyboard is a chore. Since the stroke, my right hand isn’t at a hundred percent either. Typing feels like poking at elevator buttons with a walking stick.
Sandy offers to help. With a sigh, I lean back and let it. It fills in my personal information without asking. The machine now knows me better than my kids. I’m not sure that either Tom or Ellen remembers the street I grew up on – necessary for the security question.
The next screen asks me to prove I’m a human to prevent spambots from signing up. I hate these puzzles where you have to decipher squiggly letters and numbers in a sea of noise. It’s like going to an eye exam. And my eyes aren’t what they used to be, not after years of trying to read the illegible scribbles of teenagers who prefer texting to writing.
The puzzles they use on this site are a bit different. Three circular images are presented on the page, and I have to rotate them so the images are oriented right-side-up. The first image is a zoomed-in picture of a parrot perched in some leaves, the bird’s plumage a cacophony of colors and abstract shapes. The second shows a heaped jumble of plates and glasses lit with harsh lights from below. The last is a shot of some chairs stacked upside-down on a table in a restaurant. All are rotated to odd angles.
Sandy reaches out with a metal finger and quickly rotates the three images to the correct orientation. It hits the submit button for me.
I get my account, and the pictures of little Maggie fill the screen. Sandy and I spend a long time looking at them, flipping from page to page, admiring the new generation.
I ask Sandy to take a break and clean up in the kitchen. “I want to be by myself for a while. Maybe take a nap. I’ll call you if I need anything.”
When Sandy is gone, I pull up the Web search engine on the tablet and type in my query, one shaky letter at a time. I scan through the results.
The seemingly simple task of making an image upright is quite difficult to automate over a wide variety of photographic content ... The success of our CAPTCHA rests on the fact that orienting an image is an AI-hard problem.*
An AI-hard problem: one that cannot yet be solved by computers alone, but requires the work of a human.
My God, I say to myself. I’ve found the man in the Mechanical Turk.
“Who’s in there?” I ask, when Sandy comes back. “Who’s really in there?” I point my finger at the robot and stare into its cameras. I picture a remote operator sitting in an office park somewhere, having a laugh at my expense.
Sandy’s lens hoods flutter wide open, as if the robot is shocked. It freezes for a few seconds. The gesture is very human. An hour ago I would have attributed it to clever programming, but not now.
It lifts a finger to its metallic lips and opens and closes the diaphragms in its cameras a few times in rapid succession, as though it were blinking.
Then, very deliberately, it turns the cameras away so they are pointing into the hallway.
“There’s no one in the hall, Mr. Church. There’s no one there.”
Keeping the cameras pointing away, it rolls up closer to the bed. I tense up and am about to say something when it grabs the pencil and the newspaper (turned to today’s crossword) on the nightstand, and begins to write rapidly without the paper being in the cameras’ field of view. The letters are large, crude, and difficult to read.
PLEASE. I’LL EXPLAIN.
“My eyes seem to be stuck,” it says to the empty air, the voice as artificial as ever. “Give me a second to unjam the motors.” It begins to make a series of whirring and high-pitched whining noises as it shakes the assembly on top of its neck.
WRITE BACK. MOVE MY HAND.
I grab Sandy’s hand, the metal fingers around the pencil cool to the touch, and begin to print laboriously in capital letters. I’m guessing there is some feedback mechanism allowing the operator to feel the motions.
COME CLEAN. OR I CALL POLICE.
With a loud pop, the cameras swivel around. They are pointed at my face, still keeping the paper and the writing out of view.
“I need to make some repairs,” Sandy says. “Can you rest while I deal with this? Maybe you can check your email later if you’re bored.”
I nod. Sandy props up the tablet next to the bed and backs out of the room.
Dear Mr. Church,
My name is Manuela Ríos. I apologize for having deceived you. Though the headset disguises my voice, I can hear your real voice, and I believe you are a kind and forgiving man. Perhaps you will be willing to hear the story of how I came to be your caretaker.
I was born in the village of La Gloria, in the southeastern part of Durango, Mexico. I am the youngest of my parents’ three daughters. When I was two, my family made its way north into California, where my father picked oranges and my mother cleaned houses. Later, we moved to Arizona, where my father took what jobs he could find and my mother took care of an elderly woman. We were not rich, but I grew up happy and did well in school. There was hope.
One day, when I was thirteen, the police raided the restaurant where my father worked. There was a TV crew filming. People lined up on the streets and cheered as he and his friends were led away in handcuffs.
I do not wish to argue with you about the immigration laws, or why it is that our fates should be determined by where we were born. I already know how you feel.
We were deported and lost everything we had. I left behind my books, my music, my American childhood. I was sent back to a country I had no memories of, where I had to learn a new way of life.
In La Gloria, there is much love, and family is everything. The land is lush and beautiful. But how you are born there is how you will die, except that the poor can get poorer.
My father went back north by himself, and we never heard from him again. My sisters went to Mexico City, and sent money back. We avoided talking about what they did for a living. I stayed to care for my mother. She had become sick and needed expensive care we could not afford.
Then my oldest sister wrote to tell me that in one of the old maquiladoras over in Piedras Negras, they were looking for girls like my sisters and me: women who had grown up in the United States, fluent in its language and customs. The jobs paid well, and we could save up the money my mother needed.
The old factory floor has been divided into rows of cubicles with sleeping pads down the aisles. Each girl has a headset, a monitor, and a set of controls before her like the cockpit of a plane on TV. There’s also a mask for the girl to wear, so that her robot can smile.