by Ted Chiang
FROM: Helen Costas
I don’t like the idea of anyone having sex with my digient, but then I remember that parents never want to think about their kids having sex, either.
FROM: Maria Zheng
That’s a false analogy. Parents can’t stop their children from becoming sexual, but we can. There’s no intrinsic need for digients to emulate that aspect of human development. Don’t go overboard with the anthropomorphic projection.
FROM: Derek Brooks
What’s intrinsic? There was no intrinsic need for digients to have charming personalities or cute avatars, but there was still a good reason for it: they made people more likely to spend time with them, and that was good for the digients.
I’m not saying we should accept Binary Desire’s offer. But I think what we need to ask ourselves is, if we make the digients sexual, would that encourage other people to love them, in a way that’s good for the digients?
Ana wonders if Jax’s asexuality means he’s missing out on things that would be beneficial for him to experience. She likes the fact that Jax has human friends, and the reason she wants Neuroblast ported to Real Space is so he can maintain those relationships, strengthen them. But how far could that strengthening go? How close a relationship could one have before sex became an issue?
Later that evening, she posts a reply to Derek’s comment:
FROM: Ana Alvarado
Derek raises a good question. But even if the answer is yes, that doesn’t mean we should accept Binary Desire’s offer.
If a person is looking for a masturbatory fantasy, he can use ordinary software to get it. He shouldn’t buy a mail-order bride and slap a dozen InstantRapport patches on her, but that’s essentially what Binary Desire wants to give its customers. Is that the kind of life we want our digients to have? We could dose them with so much virtual endorphin that they’d be happy living in a closet in Data Earth, but we care about them too much to do that. I don’t think we should let someone else treat them with less respect.
I admit the idea of sex with a digient bothered me initially, but I guess I’m not opposed to the idea in principle. It’s not something I can imagine doing myself, but I don’t have a problem if other people want to, so long as it’s not exploitative. If there’s some degree of give and take, then maybe it could be like Derek said: good for the digient as well as the human. But if the human is free to customize the digient’s reward map, or keep rolling him back until he finds a perfectly tweaked instantiation, then where’s the give and take? Binary Desire is telling its customers that they don’t have to accommodate their digients’ preferences in any way. It doesn’t matter whether it involves sex or not; that’s not a real relationship.
#
Any member of the user group is free to accept Binary Desire’s offer individually, but Ana’s argument is persuasive enough that no one does so for the time being. A few days after the meeting, Derek tells Marco and Polo about Binary Desire’s offer, figuring that they deserve to be kept informed of what’s going on. Polo is curious about the modifications Binary Desire wanted to make; he knows he has a reward map, but has never thought about what it would mean to edit it.
“Might be fun editing my reward map,” says Polo.
“You not able edit your reward map when you working for someone else,” says Marco. “You only able do that when you corporation.”
Polo turns to Derek. “That true?”
“Well, that’s not something I would let you do even when you are a corporation.”
“Hey,” protests Marco. “You said when we corporations, we make all our own decisions.”
“I did say that,” admits Derek, “but I hadn’t thought about you editing your own reward map. That could be very dangerous.”
“But humans able edit own reward maps.”
“What? We can’t do anything like that.”
“What about drugs people take for sex? Ifridisics?”
“Aphrodisiacs. Those are just temporary.”
“InstantRapport temporary?” asks Polo.
“Not exactly,” says Derek, “but a lot of the time when people take that, they’re making a mistake.” Especially, he thinks, if a company is paying them to take it.
“When I corporation, I free make own mistakes,” says Marco. “That whole point.”
“You’re not ready to be a corporation yet.”
“Because you not like my decisions? Ready mean always agree with you?”
“If you’re planning on editing your own reward map as soon as you’re a corporation, you’re not ready.”
“I not said want,” says Marco emphatically. “I don’t want. I said when corporation, I free do that. That different.”
Derek stops for a moment. It’s easy to forget, but this is the same conclusion the user group came to during forum discussions about incorporating the digients: if legal personhood is to be more than a form of wordplay, it has to mean granting a digient some degree of autonomy. “Yes, you’re right. When you’re a corporation, you’ll be free to do things that I think are mistakes.”
“Good,” says Marco, satisfied. “When you decide I ready, it not because I agree you. I can be ready even if I not agree you.”
“That’s right. But please, tell me you don’t want to edit your own reward map.”
“No, I know dangerous. Might make mistake that stop self from fixing mistake.”
He’s relieved. “Thank you.”
“But let Binary Desire edit my reward map, that not dangerous.”
“No, it’s not dangerous, but it’s still a bad idea.”
“I not agree.”
“What? I don’t think you understand what they want to do.”
Marco gives him a look of frustration. “I do. They make me like what they want me like, even if I not like it now.”
Derek realizes Marco does understand. “And you don’t think that’s wrong?”
“Why wrong? All things I like now, I like because Blue Gamma made me like. That not wrong.”
“No, but that was different.” He thinks for a moment to explain why. “Blue Gamma made you like food, but they didn’t decide what specific kind of food you had to like.”
“So what? Not very different.”
“It is different.”
“Agree wrong if they edit digients not want be edited. But if digient agree before be edited, then not wrong.”
Derek feels himself growing exasperated. “So do you want to be a corporation and make your own decisions, or do you want someone else to make your decisions? Which one is it?”
Marco thinks about that. “Maybe I try both. One copy me become corporation, second copy me work for Binary Desire.”
“You don’t mind having copies made of you?”
“Polo copy of me. That not wrong.”
At a loss, Derek brings the discussion to a close and sends the digients off to work on their studies, but he can’t easily dismiss what Marco has said. On the one hand Marco made some good arguments, but on the other Derek remembers his college years well enough to know that skill at debate isn’t the same as maturity. Not for the first time, he thinks of how much easier it would be if there were a legally mandated age of majority for digients; without one, it will be entirely up to him to decide when Marco is ready to be a corporation.
Derek’s not alone in having disagreements in the wake of Binary Desire’s offer. The next time he talks to Ana, she complains about a recent fight with Kyle.
“He thinks we should accept Binary Desire’s offer,” she says. “He said it’s a much better option than me taking the job at Polytope.”
It’s another opportunity to be critical of Kyle; how should he handle it? All he says is, “Because he thinks modifying the digients isn’t that big a deal.”
“Exactly.” She fumes a bit, and then continues. “It’s not as if I think wearing the InstantRapport patch is no big deal. Of course it is. But there’s a big difference between me using InstantRapport voluntarily and Binary Desire just imposing their bonding process on the digients.”
“A huge difference. But you know, that raises an interesting question.” He tells her about his conversation with Marco and Polo. “I’m not sure if Marco was just arguing for the sake of arguing, but it made me think. If a digient volunteers to undergo the changes that Binary Desire wants to make, does that make a difference?”
Ana looks thoughtful. “I don’t know. Maybe.”
“When an adult chooses to use an InstantRapport patch, we have no grounds to object. What would it take for us to respect Jax’s or Marco’s decisions the same way?”
“They’d have to be adults.”
“But we could file articles of incorporation tomorrow, if we wanted to,” he says. “What makes us so sure we shouldn’t? Suppose one day Jax says to you he understands what he’d be getting into by accepting Binary Desire’s offer, just like you with the job at Polytope. What would it take for you to accept his decision?”
She thinks for a moment. “I guess it would depend on whether or not I thought he was basing his decision on experience. Jax has never had a romantic relationship or held a job, and accepting Binary Desire’s offer would mean doing both, potentially forever. I’d want him to have had some experience with those matters before making a decision where the consequences are so permanent. Once he’s had that experience, I suppose I couldn’t really object.”
“Ah,” says Derek, nodding. “I wish I’d thought of that when I was talking to Marco.” It would mean modifying the digients into sexual beings, but without the intention of selling them; another expense for the users’ group, even after they got Neuroblast ported. “That’s going to take a long time, though.”
“Sure, but there’s no hurry to make the digients sexual. Better to wait until we can do it properly.”
Better to set an older age of majority than risk setting it too young. “And until then, it’s up to us to look after them.”
“Right! We have to put their needs first.” Ana looks grateful for the agreement, and he’s glad he can provide it. Then frustration returns to her face. “I just wish Kyle understood that.”
He searches for a diplomatic response. “I’m not sure anyone can, if they haven’t spent the time we have,” he says. It’s not intended as a criticism of Kyle; it’s what he sincerely believes.
Chapter Nine
A month has passed since Binary Desire’s presentation, and Ana is in the private Data Earth with a few of the Neuroblast digients, awaiting the arrival of visitors. Marco tells Lolly about the latest episode of his favorite game drama, while Jax practices a dance he’s choreographed.
“Look,” he says. She watches him rapidly cycle through a sequence of poses.
“Remember, when they get here, you have to talk about what you built.”
“I know, you said and said already. I stop dancing soon they here. Just having fun.”
“Sorry, Jax. I’m just nervous.”
“Watch me dancing. Feel better.”
She smiles. “Thanks, I’ll try that.” She takes a deep breath and tells herself to relax.
A portal opens and two avatars walk through. Jax promptly stops dancing, and Ana walks her avatar over to greet the visitors. The onscreen annotations identify them as Jeremy Brauer and Frank Pearson.
“I hope you didn’t have any trouble getting in,” says Ana.
“No,” says Pearson, “the logins you gave us worked fine.”
Brauer is looking around. “Good old Data Earth.” His avatar pulls on the branch of a shrub and then lets go, watching the way it sways. “I remember how exciting it was when Daesan first released it. It was state of the art.”
Brauer and Pearson work for Exponential Appliances, maker of household robots. The robots are examples of old-fashioned AI; their skills are programmed rather than learned, and while they offer some real convenience, they aren’t conscious in any meaningful sense. Exponential regularly releases new versions, advertising each one as being a step closer to the consumer’s dream of AI: a butler that is utterly loyal and attentive from the moment it’s switched on. To Ana this upgrade sequence seems like a walk to the horizon, providing the illusion of progress while never actually getting any closer to the goal. But consumers buy the robots, and they’ve given Exponential a healthy balance sheet, which is what Ana’s looking for.
Ana isn’t trying to get the Neuroblast digients jobs as butlers; it’s obvious that Jax and the others are too willful for that type of work. Brauer and Pearson don’t even work for the commercial division of the company; instead, they’re part of the research division, the reason that Exponential was founded. The household robots are Exponential’s way of funding its efforts to conjure up the technologist’s dream of AI: an entity of pure cognition, a genius unencumbered by emotions or a body of any kind, an intellect vast and cool yet sympathetic. They’re waiting for a software Athena to spring forth fully grown, and while it’d be impolite for Ana to say she thinks they’ll be waiting forever, she hopes to convince Brauer and Pearson that the Neuroblast digients offer a viable alternative.
“Well, thank you for coming out to meet me,” says Ana.
“We’ve been looking forward to it,” says Brauer. “A digient whose cumulative running time is longer than the lifespan of most operating systems? You don’t see that very often.”
“No, you don’t.” Ana realizes that they came more for nostalgia’s sake than to seriously entertain a business proposal. Well, so be it, as long as they’re here.
Ana introduces them to the digients, who then give little demonstrations of projects they’ve been working on. Jax shows a virtual contraption he’s built, a kind of music synthesizer that he plays by dancing. Marco gives an explanation of a puzzle game he’s designed, one that can be played cooperatively or competitively. Brauer is particularly interested in Lolly, who shows them a program she’s been writing; unlike Jax and Marco, who built their projects using toolkits, Lolly is writing actual code. Brauer’s disappointment is evident when it becomes clear that Lolly is just like any other novice programmer; it’s clear he was hoping her digient nature had given her a special aptitude for the subject.
After they’ve talked with the digients for a while, Ana and the visitors from Exponential log out of Data Earth and switch to videoconferencing.
“They’re terrific,” says Brauer. “I used to have one, but he never got much beyond baby talk.”
“You used to have a Neuroblast digient?”
“Sure, I bought one as soon as they came out. He was an instance of the Jax mascot, like yours. I named him Fitz, kept him going for a year.”
This man had a baby Jax once, she thinks. Somewhere in storage is a baby version of Jax that knows this man as his owner. Aloud, she says, “Did you get bored with him?”
“Not so much bored as aware of his limitations. I could see that the Neuroblast genome was the wrong approach. Sure, Fitz was smart, but it would take forever before he could do any useful work. I’ve got to hand it to you for sticking with Jax for so long. What you’ve achieved is impressive.” He makes it sound like she’s built the world’s largest toothpick sculpture.
“Do you still think Neuroblast was the wrong approach? You’ve seen for yourself what Jax is capable of. Do you have anything comparable at Exponential?” It comes out more sharply than she intended.
Brauer’s reaction is mild. “We’re not looking for human-level AI; we’re looking for superhuman AI.”
“And you don’t think that human-level AI is a step in that direction?”
“Not if it’s the sort that your digients demonstrate,” says Brauer.
“You can’t be sure that Jax will ever be employable, let alone become a genius at programming. For all you know, he’s reached his maximum.”
“I don’t think he has–”
“But you don’t know for certain.”
“I know that if the Neuroblast genome can produce a digient like him, it can produce one as smart as you’re looking for. The Alan Turing of Neuroblast digients is just waiting to be born.”
“Fine, let’s suppose you’re right,” says Brauer; he’s clearly indulging her. “How many years would it take to find him? It’s already taken you so long to raise the first generation that the platform they run on has become obsolete. How many generations before you come up with a Turing?”
“We won’t always be restricted to running them in real time. At some point there’ll be enough digients to form a self-sufficient population, and then they won’t be dependent on human interaction. We could run a society of them at hothouse speeds without any risk of them going feral, and see what they produce.” Ana’s actually far from confident that this scenario would produce a Turing, but she’s practiced this argument enough times to sound like she believes it.
Brauer isn’t convinced, though. “Talk about a risky investment. You’re showing us a handful of teenagers and asking us to pay for their education in the hopes that when they’re adults, they’ll found a nation that will produce geniuses. Pardon me if I think there are better ways we could spend our money.”
“But think about what you’re getting. The other owners and I have devoted years of our attention to raising these digients. Porting Neuroblast is cheap compared to what it’d cost to hire people to do that for another genome. And the potential payoff is exactly what your company’s been looking for: programming geniuses working at high speed, bootstrapping themselves to superhuman intelligence. If these digients can invent games now, just imagine what their descendants could do. And you’d make money off every one of them.”
Brauer is about to reply when Pearson interjects. “Is that why you want Neuroblast ported? To see what superintelligent digients might invent one day?”