Uncanny Valley


by C. A. Gray


  “Whatcha doin’ there?”

  I jumped, and gasped, a hand flying to my chest as the other covered my pages as best I could. “Geez! You scared me.”

  “I see that. You might wanna lay off the espresso, I think it’s not good for your nerves.” Liam, the post-doc I worked under, raised an eyebrow at me, his lips curled in that characteristic smirk of his. I shut the notebook immediately. “What are you writing?”

  I shrugged. “Just killing time,” I told him evasively. If he knew, I’d never hear the end of it.

  “I guess it’s too much to hope that you might be brainstorming new experiments…”

  “No, just…” I decided it would be best to change the subject. “What are you doing here, Liam?” He usually didn’t come by during my experiments—human psychology wasn’t his thing. He waited for me to bring him conclusions, and then translated them into algorithms. I’m technical—I don’t do people, was the way he usually put it. It was sort of tongue-in-cheek though, since his social skills were fine, as far as I could tell. Although he was undeniably eccentric.

  He took a deep breath, his face suddenly serious. I knew what he was going to say before he said it—only one subject rendered him so somber. “You heard Halpert’s address?”

  I nodded. I had watched it after Mom called last night, even though I’d just sent her a quick comm about it and gone right back to bed after that.

  “This is bad,” he said, in a tone of voice that prompted agreement.

  I opened my mouth and closed it again, waiting for him to elaborate. Which he would, if I didn’t do it for him. He couldn’t help himself. “You’re afraid that if the bots get creativity…”

  “They’ll become superintelligent, way surpassing humans!” he finished. “Once they become creative, they can devise ways to make themselves smarter, without our intervention at all… and that’s an exponential curve. This was exactly what the Council of Synthetic Reason was supposed to prevent! I wrote a manifesto about it on my locus as soon as he’d finished and issued a counter-call to Halpert’s. We have to band together across the globe and hound our senators to stop the challenge, or else this collaboration will lead to the extinction of the human race—”

  “Oh, Liam,” I rolled my eyes. Among his many eccentricities, Liam had a conspiracy theory labyrinth locus with apparently millions of followers, or so he was always telling me. He was a sort of B-list celebrity in that way—in pseudonym only, since I never saw his picture on there anywhere, and he didn’t use his real name. “I don’t want to become a target!” was his explanation for this. I’d gone to his locus once before, expecting to see skulls and crossbones and apocalyptic imagery, but I had to admit I was grudgingly impressed. It looked classy enough, and the one post I’d read was remarkably well-written, cited, and devoid of groundless assertions. I also saw that it had thousands of shares, and had only been posted that morning.

  “It’s true, if bots gain the creative ability to problem-solve like we have, one of the most probable futures is our extinction!” Liam insisted. “Don’t roll your eyes at me!”

  “You’re such an extremist. It’s quite a leap from giving bots creativity to the extinction of the human race.” I emphasized each word, trying to make him recognize how silly he sounded.

  Undaunted, Liam raised his index finger as he counted off his points. “One: bots are single-minded. They exist only to fulfill their programmed core purpose. Two: they are ruthless logicians. Within the parameters of their abilities, they perform that core purpose as efficiently as possible. Three: at present, they can only learn to do procedural tasks; they cannot yet solve complex and interconnected problems. Once they have creativity, though, they’ll be able to do that just like we can. So what happens if you have a bot who has creativity, and whose goal is to make itself smarter? It does nothing else, 24 hours a day, seven days a week. Trial and error, trial and error.” He traced an exponential curve with his hand, eyes wide as he shook his head at me for emphasis. “Exponential growth!”

  I stared at him, unimpressed. “Okay fine, so they become superintelligent. How does that lead to our—”

  “Because!” he cut me off. “Maybe the bot who first becomes superintelligent has no interest in us at all, but it is interested in spreading its own upgrades to other bots. Now every other bot in the world, with every other possible core purpose, has infinite creativity at its disposal in order to fulfill its purpose! Just take one of those bots: let’s say one of them manufactures shoes, okay? Its goal is to manufacture as many high quality shoes in as short a period of time as possible. But it’s ruthlessly logical, and narrowly focused. What if it decides that humans are using too many resources that it needs to make its shoes? Logical solution? Kill all the humans! Then it can mine all the resources on earth and the moon and Mars and any other planet they colonize after we’re gone, and voila! The entire universe becomes a giant shoe closet!”

  I cracked up a little. I couldn’t help it. “All right, I see your point that they might carry out their core purposes in unexpected ways. But it seems like we should be able to safeguard against that…”

  “You mean, predict all the possible ways that a superintelligence leagues beyond human understanding might interpret their core purposes?” Liam countered. “How would we do that, exactly? That would be like an ant trying to fathom the mind of a man! The only possible way to approach this is to stop it before it can happen. QED,” he added with a triumphant flourish, sitting back in his chair. He always said ‘QED’ when he felt like he’d made an unassailable point. All I knew was that it was a math term in Latin, and it basically meant the same thing as, “infinity times, the end.”

  I looked up before I could shoot back something sarcastic at Liam, sensing that we were not alone. Carrie, my first test subject, stood meekly in the door frame. I saw her mouth fall open a little when she saw Liam, and she colored a bit. I knew why: Liam was your classic tall, dark, and handsome, with blue eyes and that angular jaw that drove women crazy. It annoyed me. All right, he’s good looking, people. Get over it. If they knew him the way I did, it wouldn't take long.

  “Done?” I said to Carrie, leaning past Liam with a firm smile and forcing her to look at me as I pointed down the hall. “The medic bot will take one more blood sample in the next room over. Then come back here and I’ll get you paid.”

  She vanished, and I turned back to Liam. One thing I had to give Liam: he never seemed to get that self-complacent air that most attractive men get when they find themselves admired. I’m not sure he even noticed.

  “But there might be one thing we can do as a potential safeguard, just in case Halpert’s challenge is answered…” Liam went on, as if the interruption hadn’t happened. “Program them with a moral code.”

  “You’re talking about Isaac Asimov’s Three Laws of Robotics?” I raised my eyebrows, trying unsuccessfully not to smirk. I only knew these existed in the first place because of something I read on Liam’s locus, and I couldn’t exactly recall what they were—something about the hierarchy of robotic priorities as a way to keep them from becoming dangerous to humans.

  “That’s fiction,” Liam scoffed. “This is reality.”

  I shrugged. “Okay. So program them with a moral code, then. What do you need me for?”

  “I need you,” he leaned forward, his eyes staring melodramatically into mine, “because everybody thinks we won’t get creativity except through emotion. They go together, peas in a pod.” He crossed two fingers together to illustrate. “Humans have a moral code too, but when we break it, why do we break it?”

  “Emotion,” I sat back in my chair, catching his drift now.

  He nodded. “So let’s recap.” He held up his fingers one at a time. “One: Superintelligent bots with unlimited creativity. Two: Emotional superintelligent bots… who can presumably therefore overwrite or ignore their programming if they don’t happen to like it. So sure, we could give them a moral code, but who’s to say they’ll follow it?”

  “If they can overwrite their programming, maybe they decide they don’t want to manufacture shoes, then, either,” I pointed out. “Maybe they want to… I don’t know, optimize the fertility of giraffes instead.”

  Liam shrugged. “Maybe, but both will be equally fatal to us—too many humans means not enough space for giraffes, either. Point being, we need to instill at least some semblance of morality in the bots that they won’t just overwrite at will. Maybe they can, but they’d choose not to. So what does that take? Empathy or something? I don’t know, this is your department. I’m just speculating.”

  I bit my lip, eyes absently scanning the air as I tried to mentally retrieve what I’d studied about this. “Empathy comes from mirror neurons, which cause us to share the emotional experience of another person. Emotion comes from the limbic system, and it’s mediated by a whole lot of neurotransmitters…”

  “But sociopaths don’t have emotion or morality, right?” Liam cut me off. “Seems like they’d be a good group to study, to see what’s missing…”

  “They have emotion,” I corrected, “they just lack empathy.”

  “Perfect!” Liam said. “That’s exactly what we’re looking to avoid: a race of superintelligent sociopaths! So you’re on that. Design me a human experiment to define the relationship between emotion, empathy, and morality.”

  I laughed, incredulous. “Oh, is that all! What about my thesis?” I gestured at the lab where we currently sat.

  “You’re already studying human emotion, aren’t you? Something about that new neuropeptide?”

  “Yes!” Then I reminded him, since I knew he wouldn’t recall the particulars, nor would he care, “We’re shooting VMI images of the brain when participants see an image of someone they love, and then drawing pre- and post-blood samples to see if salheptonin increases simultaneously. My theory is, it’s the neuropeptide responsible for the physical sensations of desire.”

  “Okay, so this is just a pivot,” Liam shrugged.

  “This is a completely new project!” I protested. “And it’s enormous, by the way! If I ‘pivot’ like you’re suggesting, someone else will complete my research before I do, and I’ll have to start all over with something else if I want to graduate!”

  “Rebecca!” He made exploding movements with his hands. “End of the world! Who cares if you have a degree if the entire human race is extinct?”

  I shook my head, incredulous. “I hate you.”

  “No you don’t,” he flashed me a rakish smile. “Besides, like you’re having so much fun working on this thesis of yours, anyway?” he gestured at the stark white room in which we sat, and at the notebook I’d closed. “Every time I come upon you unannounced, you’re writing the next Gone with the Wind. Or reading a novel from the Second Age. Or humming to yourself to learn the harmony for your next musical performance—you never told me when the performance was, by the way.”

  I flushed. “Just because I have lots of interests—”

  “Hey, hey. It’s okay,” Liam cut me off, hands in the air with an expression of mock seriousness. “I’ve got millions of people following my labyrinth locus who agree with me. I don’t need you to be one of them, although it would be very nice. But you remember what Dr. Yin said about you in your review?”

  I sighed, and quoted Dr. Yin: “‘Technically very competent, but her heart does not seem to be in her work.’” I hadn’t known whether to be flattered or insulted when I read that.

  He nodded. “Yeah. And you know what she said to me privately? She asked if you were planning to go for your Ph.D., because if you were, she’d like to offer you a spot in her lab. You’re that good, Bec. And considering you’d rather be doing something else, in a way that’s even more impressive. Who knows, this project might even turn into your Ph.D. thesis!”

  “Ahem.”

  I looked up, startled. One of my middle-aged male test subjects stood in the door frame, ready for his exit blood draw. Carrie queued up behind him, waiting to get paid. I sent the man to the medic bot and beckoned Carrie forward, scanning her fingerprint again to deposit her compensation and trying to ignore the way her eyelashes shyly fluttered at Liam.

  Liam never complimented me. He usually restricted his communications to fluent sarcasm punctuated by obsessive rants. It’s only because he’s buttering me up to do something for him, I thought as I dismissed Carrie.

  When I’d finished, Liam resumed his unnerving stare deep into my eyes, triggering involuntary heat in my cheeks. I wished he’d blink, or look away occasionally, or something. He went on, “What I’m saying is, I need someone on the human psych end to help me program a morality the bots will actually follow, as a safeguard in case Halpert’s challenge does succeed. At least then, even if superintelligence does emerge, we’ll have ‘warm and fuzzy’ superintelligent bots, with a soft spot for humans!” He mimed his impression of ‘warm and fuzzy’ as he said this, which turned out to look a lot like jazz hands. I shook my head at him with a suppressed smile, and his eyes twinkled at me in response. “Once we have your succinct definition, then Nilesh and Larissa and I can reverse engineer it in mathematical terms and make it into an algorithm. Then you can go back to working on whatever you want, I promise.”

  “And if someone answers Halpert’s challenge before we get the chance?” I asked. “Because what you’re asking me to do is borderline impossible, I’m just saying…”

  Liam shrugged, unconcerned. “Then we’ll kill him, and bury his research before anybody else can use it to buy us more time. Please be quick though, would you? I’d like to kill as few people as possible.”

  “Glad to hear you have a backup plan,” I nodded, trying to match his deadpan delivery without success.

  Chapter 2

  When I arrived in my flat, I dropped my coat and hat on my twin bed. I had another hour to kill after the experiment finished and before my Creative Writing class, and probably should have started working on Liam's new project. But impossible requests breed apathy (I think I read that on a plaque somewhere once), so I didn't feel especially motivated to try. Plus, I wanted to show Madeline the video of the performance from last night—or at least the parts when I was on stage. She had been my main study partner to help me learn my lines, as always, and I wanted her to appreciate the final product.

  Madeline was plugged into the charging station and powered down. She was about a foot tall, so I could easily transport her in my backpack, and made of a lightweight aluminum alloy. Her face was metal, too, and yet humanoid—but with eyes that took up about half her face. I pressed the power button behind her neck, and her eyes lit up.

  “How did it go?” she burst out when she saw me.

  “Come here and I’ll show you.” I pulled up my netscreen and pressed the play button with a flourish of anticipation. Since all of us (humans and bots alike) could access the labyrinth via our A.E. chips, the only legitimate purpose remaining for netscreens was for sharing a mutual experience, such as watching a film or a video, or sharing information from the labyrinth with someone else. I used my A.E. chip as little as possible, but then, I was odd.

  “Oh Becca,” Madeline breathed when I stopped the recording, her digital eyes shining with the most perfect admiration I could possibly ask for. “You were absolutely incredible!”

  “Thanks,” I grinned, dispensing with the necessary false humility that I would have employed with anyone else. With Madeline, I could just admit that I knew I was good.

  “When do rehearsals start for The Tempest?” she asked me eagerly. I had been cast as Miranda—my third leading role in a row.

  “Next week is the first one,” I told her. “This will be my first weekend off in awhile—I was thinking of asking Julie to come with me to London to meet Jake.” Jake was one of my best friends from Casa Linda, practically like a brother to me. We had gone to high school together, and had been part of the same group of friends for the last five years now. I'd met that whole group shortly after my dad's death. Including Andy.

  Madeline knew where my mind went next; I'd become predictable, I suppose. “Are you gonna try to get Andy to go too?”

  “I'll ask him,” I murmured, blushing. “He won't go, though. Quantum Track tickets are a lot more expensive from New York to London than they are from Dublin.”

  “He's got student loans too!” Madeline pointed out. “Besides, ever since last summer, I keep expecting him to show up at your doorstep one day, and sweep you into his arms, and tell you he can't live without you…"

  I giggled, delighted, but with a tinge of chagrin. Madeline was the only one who still indulged such fantasies with me. It was true that last summer, Andy commed me almost nonstop while we were home in Casa Linda, between hangout sessions with our friends. He told me all kinds of leading things—like, “I don’t know why such a beautiful girl would think such-and-such” (clearly implying he meant me), or, “You know guys are super intimidated by you, right?” Which I of course took as the reason why he hadn't come right out and directly told me how he felt about me yet, nor had he ever asked to hang out with me alone: he was too shy. I just had to be more encouraging; then he'd come around.

  Maybe he'd convinced himself that I was out of his league or something. Maybe that was why he'd basically stopped talking to me this semester.

  Unfortunately, Madeline was the only “person” besides me who could believe that. Which made her pretty much the only one I could talk to about Andy at all, at this point.

  “I'll invite him,” I reiterated. “But I've gotta ask Julie first. I'm meeting her for lunch after Creative Writing.”
