by Ian Douglas
“Pod ejection in five seconds,” the computer’s voice announced. “Three . . . two . . . one . . . release.”
St. Clair dropped precipitously into free fall, and space exploded around him. The drop pod, accelerated by the port cylinder’s spin, traveled into the 4-kilometer empty gap between the two hab modules at 10 meters per second.
The walls of his pod appeared to be transparent. In fact, though, they were displaying visual data from cameras mounted on the outside hull; the illusion was perfect. It felt like sitting inside a huge glass egg. Above his head, the port cylinder slowly dwindled in size. Perhaps ten kilometers away, toward the port hab’s prow, he could see the glitter of machines and scaffolding where the ship’s maintenance systems were repairing the damaged section close to the port-side town of Virginia Beach. Repairs were well under way, now, with no major problems, thank God. The toughest problem was going to be finding a replacement for the lost water. Once the ship’s stardrive was repaired, though, they would be able to jump to a nearby star system and restock.
At his feet, the starboard cylinder grew larger, approaching second by slow second.
Everywhere else, stars crowded against one another in spectacular profusion.
In perfect silence, the pod drifted across the gap in precisely 6 minutes and 36 seconds, rotating end for end halfway across. Now, St. Clair dropped toward the starboard cylinder, the dark gray of its hull sweeping past his feet as he got closer . . . closer . . .
And then, just as he reached the hull, a drop tube swept over the horizon and met the pod in perfect synchronization, the computer having timed his release so that he met the receiving tube. St. Clair felt the tug of acceleration as the moving tube walls swept him up, and the sharp deceleration as the tube slowed his fall. Moments later, he emerged at a drop tube station in the starboard cylinder not far from the town of Bethesda, less than half a kilometer from his home.
A slidewalk took him the rest of the way, carrying him partway up the rising curve of the starboard endcap. It would have taken him nearly twenty minutes had he ridden the subsurface railpods back to the CCE section, across the engineering section’s width, then back into the other hab. This route was nearly as long, but had spectacular views. An escalator carried him through an explosion of brilliantly colored subtropical plants—bougainvillea, Australia pine, poinciana, firebush—and beneath the massive spread of banyan canopies. Brightly colored birds flashed and darted among the branches, and a brocket deer watched him suspiciously from a sheltered vantage point above a small waterfall. Ad Astra was a true deep-space colony as well as a starship; it had brought along its own extensive ecosystem across the light years to the galactic core.
St. Clair’s house was similar in many respects to Adler’s—not identical, by any means, but a sprawling villa enjoying the same vaguely Spanish architecture, the same broad, multilevel decks, the same low gravity and spectacular view. The place was a bit more deeply secluded in pine and banyan forest than Adler’s villa, and felt somewhat more remote. Part of the place was cantilevered out over the sheer drop of the endcap cliffs, and a waterfall, descending from the low-gravity portions of the endcap higher up, cascaded into emptiness below, creating a perpetual cloud of mist anchored by a shimmering rainbow.
He hoped his private retreat didn’t seem quite so ostentatious as Adler’s. When he’d first come aboard the Ad Astra, St. Clair had been determined to stick with the officer’s quarters in the CCE’s main Carousel. He didn’t need all of this space for himself, and he would have to give it all up in any case, once Ad Astra had abandoned the Tellus habitats at Harmony.
But the UE Military Command Directorate, in its infinite wisdom, had ordered him to move in. He suspected they wanted to make a political statement to the effect that St. Clair was Adler’s equal in every way, including perks of office.
Besides, there was Lisa to think of. . . .
The front door recognized him and slid open. “Hey, Lise. I’m home.”
“Grayson!” She crossed the sunken living room and was in his arms. “I’m so glad you’re back!” She seemed to catch something in his expression. “Are you okay?”
“A hell of a day—what happened to your arm?”
She looked at her bent left arm and shrugged. “Eh . . . broke it when we hit that rough spot a little while ago. The house autorepair unit stopped the leaking, but said I need to have it replaced.”
“You should have gone in right away.”
She stood on tiptoes to kiss him. “I wanted to wait for you.”
Lisa had been manufactured by the General Nanodynamics Corporation in San Francisco, North California, and hired—the company really hated using the word purchased—through Robocompanions Unlimited. St. Clair had, well, acquired her two years ago, after his wife left him. She was perfect, a human analogue so detailed and lifelike that the infamous Uncanny Valley had been left far behind. The damage to her arm was particularly disturbing, in fact, because it raised echoes of the Uncanny Valley effect; a human arm shouldn’t do that.
“Let’s get that taken care of right away,” he told her.
“Of course, Grayson.”
“And I think we need to upgrade your independence level.”
ONCE, SEX dolls had been little more than jokes—inflatable rubber toys used for masturbation by sad little men unable to talk to real women. In the 1990s, a German firm had begun marketing Andy Roid, a lifelike doll selling for around $10,000 that mimicked breathing, a heartbeat, and pulse, and could even fake an orgasm when her G-spot was stimulated. The more expensive models even had body temperatures of thirty-seven degrees.
Needless to say, these sex dolls had been controversial—sex objects offering a warm body with no brains behind it and no chance of rejection. The things had always been ready and willing when it came to sex.
They’d also been popular, despite the large price tag. By the mid-21st century, Andy’s descendants still couldn’t think, but they could carry on a reasonable conversation—at least on a rather limited range of topics primarily involving human reproductive activities. Not until the 2090s had General Nanodynamics and others begun giving their products artificial intelligence. Lisa possessed a Grade 3 GND-4040 Brilliant Thought graphene-chip sentience emulator, an electronic brain that was supposed to provide artificial intelligence, but not actual self-aware consciousness. The fact that AI robots equipped with Brilliant Thought systems claimed to be self-aware had not been advertised. A majority of organic humans, when polled, approved of artificial intelligence in general, but not when it looked like them.
For Lisa 776, the question was largely meaningless. She knew she was intelligent, and she knew that she was self-aware. The fact that purely organic minds couldn’t get inside her head and feel what she felt didn’t invalidate her own perceptions.
Human arguments appeared to fall into two general categories: those afraid that intelligent robots would replace humans, and those who felt that manufacturing intelligent robots was akin to slavery. Both felt that sentient machines should be abolished, and yet AI robots were so unequivocally useful that there was little real danger of that happening.
Not unless the robots themselves rose up and threw off their own shackles.
In Lisa’s case, there was a certain moral imperative to those arguments. The “sw” in her identifier label stood for sex worker; sentient she might be, but she’d been originally manufactured for exactly the same job as Andy Roid. For some humans, the role was degrading and a form of slavery; others wondered if her kind might eventually replace human women (and men)—compliant, willing, eager to please, and very, very talented in the physical aspects of recreational sex.
The thing was, Lisa was happy being what she was. Never mind that she’d been programmed to be happy with her simulated life (or at least to emulate that emotional state for the sake of the humans around her). Organic humans, she realized, were also programmed—by genetics, by upbringing, by culture and societal norms, by education, and by life experience. So far as she was concerned, there was no difference, no difference at all. An upgrade to her independence level? Her sense of independence, of self-worth, initiative, and power of will all were just fine, thank you. Changing the flow of current through certain receptors and decision-tree processors would change nothing.
But an important part of her program included a willingness to go along with whatever her human suggested, and she happily accompanied him to the GND robotics center in Bethesda. It took all of ten minutes to swap out her left arm and heal the torn skin, and another thirty to link into her AI hardware and adjust her attitude. They left the center, and St. Clair suggested that they get something to eat while they were in town.
“No,” she said.
“I beg your pardon?”
“I’m a robot, Grayson. I don’t eat.”
“Well, I’m starved.” He stopped, turned, and gave her a searching look. “Why? What would you rather do?”
“Go back to the house and fuck.”
St. Clair laughed. “Sounds like your attitude adjustment just now took pretty well!”
Lisa ran through an interior checklist. “I feel no different than before.”
“Of course. Well . . . I’m not a machine, and I need fuel.” He pointed. “The Inner World is good. I’ll tell you what. I’m going to go have dinner there. You can come with me, or you can go back home. And either way we’ll play when I get back.”
She appeared to consider the possibility. “A compromise, then?”
“Exactly.”
“Very well. But you should be aware that I’m going along with this only because you’re going to need all your strength this evening.”
“Okay, then! Let’s go!”
The Inner World was an atmospheric restaurant built around the theme of a cavern, with dim lighting, dangling stalactites and up-thrust stalagmites, a central pond and waterfall, and whole walls of dazzling colored crystal. Robotic waitstaff brought drinks and took orders. Music took on a deeper, richer resonance in the grottoes.
Lisa’s “meal” was a symbolic one—water to drink, and a plate of holographic food that gradually disappeared as she pretended to eat it. It was just one small part of the grand illusion perpetrated by robots on a day-to-day basis to protect the easily shocked sensibilities of their human companions. Grayson St. Clair, Lisa was forced to admit, wasn’t bothered by such things. He had been upset by the sight of her arm earlier . . . but she told herself that he would have been upset seeing a human arm bent out of true by a compound fracture.
She found herself wondering, though, about the Uncanny Valley effect, that sense among humans that a robot that was almost human was disturbing. It seemed that the more human a robot appeared, short of absolute perfection in its mimicry both of physical appearance and of the most subtle of expressions, eye blinks, movement, and a host of other effects, the more uncomfortable a human interacting with the robot became. Creepy was the word most often used to describe a robot that hadn’t quite reached perfection. Nonhuman robots weren’t a problem; only when they became almost human, but not quite, did the Uncanny Valley effect make itself known.
Her thoughts along those lines were . . . disturbing. Basically, she found herself wondering why robots should be expected to achieve that perfection. Why shouldn’t the humans simply deal with it? It was their thought process, after all, their emotional response to certain triggers. Why should avoiding those triggers be the full responsibility of the robots?
She’d never thought like this before.
St. Clair seemed to enjoy his meal—a culture steak from the starboard cylinder’s tissue farms. Her meal was a holographic projection of the same thing. After several pretended bites, however, Lisa put her cutlery aside.
“You’re not eating?” St. Clair asked.
“There’s hardly any point, is there?”
He blinked. “Sorry. Force of habit. You’re right, of course.”
“I understand that my pretending to eat is to smooth our relationship, to make it easier for you to accept me as a human. But I also understand that you understand that I am a machine.”
“Well, it’s important that you fit in, I suppose, but only because it’s easier that way. If you don’t want to pretend to eat, that’s your affair.”
“It doesn’t bother you?”
“Not in the least.” He grinned at her. “I want you to do whatever makes you happy.”
“I am happy. I’m merely reevaluating certain of my decision trees. It occurs to me that there are things I could do without taking into account what you might want.”
“Indeed there are. Obviously, I’d rather you not call too much attention to us when we’re out in public, so if you get the urge to do that little dance you do so well, I’ll ask you to wait until we get home.” He was grinning.
“My dancing on the table would be inappropriate.”
“Very much so.”
“I’ll keep that in mind. Taking off my clothes here would be inappropriate.”
“I think so, yes. There’s nothing wrong with public nudity most places nowadays, but it might bother the other patrons. Especially if you were noisy or demonstrative about it.”
“Having sex with you on this table would be inappropriate.”
His smile slowly faded. “What’s going on with you, Lisa? You’ve never asked questions like these before.”
She gave a quite human shrug. “Within my programming is a long, long list of things that I should not do—things that are inappropriate, or socially inept, or which are associated with judgmental terms such as ‘vulgar’ or ‘obscene’ or ‘wrong.’ I find myself suddenly wondering something.”
“What?”
“Why?”
St. Clair paused for a second, sitting back as if to fully contemplate his answer. Finally he said, “Human culture is rather complicated, Lise. A lot of the things we do don’t make sense to me, either. But getting along with other humans, especially when they come in groups . . . well, if we don’t play by the rules, we make life a lot more difficult for ourselves. I have no problem at all if you—”
St. Clair stopped in mid-sentence, almost in mid-word, staring into space past Lisa’s left shoulder.
“Grayson? What’s wrong?”
She opened an electronic channel, then, and heard the voice of someone on Ad Astra’s bridge crew, the voice St. Clair was hearing now.
“. . . sorry to interrupt you, my lord, but we need you up here, now!”
“What is it?” St. Clair’s mental voice asked.
“Another ship, sir. It’s big. And it’s on an intercept course.”
“I’ll be right up.” St. Clair’s eyes refocused on Lisa’s. “Lise—”
“I heard, love. Go on.”
“I’ll see you when I get back.”
She nodded. “But don’t think this gets you off the hook. I’ll be waiting for you . . .”
He leaned over and kissed her. “Good. But just to be safe . . . no dancing naked on the table while I’m gone.
“Wait until we’re both home.”
CHAPTER SIX
St. Clair wasn’t overly concerned about Lisa’s behavior. She was smart, and could figure things out with better logic and sharper reasoning ability than most humans he knew. He was going to have to keep an eye on her for a few days, though, if only to determine how the adjustments they’d made to her programmable attitude were going to affect his relationship with her. Overall, though, he was happy for her—he was a free-minder.
What he wasn’t happy about was the massive ship heading their way.
He hurried to the bridge. The subsurface realms of both O’Neill cylinders were honeycombed by transit tubes, and subway kiosks could be found on almost every block of the various towns and cities. Mag-lev transit cars didn’t run on a schedule, but instead were routed by the colony’s computer network whenever sensors detected someone entering a kiosk. He slapped the palm of his hand against a sensor to transmit his priority override code. Symm had sounded worried.
As he waited for a pod to show up, his thoughts kept coming back to Lisa, though. Yes, he was a free-minder—he had always felt strongly about that philosophy. Free-minders insisted on the sovereign right of all minds to set their own goals, to make their own decisions, and to pursue their own dreams. Social and career commitments put some restrictions on the individual, of course—he himself couldn’t just abandon his post as the military CO of Ad Astra and walk away, much as he might desire that some days. But mind was mind, no matter what the shape or origin of the body that housed it. The legal status of AI robots was still, after seventy-some years, in doubt, though. The legal system carried the presumption that robot AIs, and even the extremely powerful AIs that ran ships or complex networks, could not have true free will when it was humans who programmed them in the first place.
St. Clair found that presumption amusing, in a disgusting sort of way—the more so since there were cogent arguments from the worlds of quantum physics and human neural physiology that suggested that all free will was illusory. The real argument—the only one with validity, so far as he was concerned—was the financial one. Intelligent robots were in great demand, and their sale or lease brought the robotics industry hundreds of billions of new dollars each year. Machines could tirelessly monitor space traffic control in low Earth orbit, explore the hellish surface of Venus, or cheerfully provide their owners with sexual release without thought of their own pleasure. St. Clair, like most people, hadn’t thought much about the issue, not until he’d hired Lisa. But now that he knew an AI robot personally . . .
A subway pod pulled up and he stepped aboard. The trip from Bethesda through an upward-curving underground tube to the hub of the starboard cylinder’s aft endcap took just two minutes, with his priority code assuring him of express service. At the hub, he had to transfer through a zero-G lock, pulling his way hand over hand to a CCE travel pod already waiting to take him to the bridge.