
Science Fiction: The Best of the Year, 2007 Edition


by Rich Horton


  Then he put the selected animal into a transparent box the size of a shipping crate. The glass walls of the box were pierced with ventilator holes and inlaid with a mesh of ultrafine inductors. A cable as thick as my arm snaked from the box to the rack of electronic instrumentation. “You recognize the devices on the dog's collar?"

  "Neuroprostheses,” I said. “The kind they attach to old people.” The kind they had attached to Grandfather back when he was merely dying, not entirely dead.

  "Right,” Bloom said, his face simmering with enthusiasm. “The mind, your mind, any mind—the dog's mind, in this case—is really a sort of parliament of competing neural subroutines. When people get old, some or all of those functions start to fail. So we build various kinds of prostheses to support aging mentation. Emotive functions, limbic functions, memory, the senses: we can sub for each of those things with an external device."

  That was essentially what Grandfather had done for the last five years of his life: shared more and more of his essential self with a small army of artificial devices. And when he eventually died much of him was still running in these clusters of epibiotic prostheses. But eventually, over time, without a physical body to order and replenish them, the machines would drift back to simple default states, and that would be the end of Grandfather as a coherent entity. It was a useful but ultimately imperfect technology.

  "Our setup's a little different,” Bloom said. “The prostheses here aren't subbing for lost functions—the dog isn't injured or old. They're just doubling the dog's ordinary brainstates. When I disconnect the prostheses the dog won't even notice; he's fully functional without them. But the ghost in the prostheses—the dog's intellectual double—goes on without him."

  "Yeah, for thirty seconds or so,” I said. Such experiments had been attempted before. Imagine being able to run a perfect copy of yourself in a digital environment—to download yourself to an electronic device, like in the movies. Wouldn't that be great? Well, you can, sort of, and the process worked the way Bloom described it. But only briefly. The fully complex digital model succumbs to something called “Shannon entropy” in less than a minute. It's not dynamically stable.

  (Postmortem arrays like Grandfather last longer—up to a couple of years—but only because they're radically simplified, more a collection of vocal tics than a real personality.)

  "Thirty seconds is enough,” Bloom said.

  "For what?"

  "You'll see."

  About this time the evening's audience began to drift in. Or maybe “audience” is too generous a word. It consisted of five furtive-looking guys in cloaks and rags, each of whom slipped Bloom a few bills and then retreated to the shadows. They spoke not at all, even to each other, and they stared at the dog in its glass chamber with strange, hungry eyes. The dog paced, understandably nervous.

  Now Bloom rolled out another, nearly identical chamber. The “death chamber.” It contained not a dog but a sphere of some pink, slightly sparkly substance.

  "Electrosensitive facsimile gel,” Bloom whispered. “Do you know what that is?"

  I'd heard of it. Facsimile gel is often used for stage and movie effects. If you want an inert duplicate of a valuable object or a bankable star, you scan the item in question and map it onto gel with EM fields. The gel expands and morphs until it's visually identical to the scanned object, right down to color and micron-level detail if you use the expensive stuff. Difference was, the duplicate would be rigid, hollow, and nearly massless—a useful prop, but delicate.

  "You duplicate the dog?” I asked.

  "I make a dynamic duplicate. It changes continuously, in synch with the real thing. I've got a patent application on it. Watch.” He dimmed the lights and threw a few switches on his bank of homemade electronics.

  The result was eerie. The lump of gel pulsed a few times, expanded as if it had taken a deep breath, grew legs, and became ... a dog.

  Became, in fact, the dog in the adjacent glass cage.

  The real dog looked at the fake dog with obvious distress. It whined. The fake dog made the same gesture simultaneously, but no sound came out.

  Two tongues lolled. Two tails drooped.

  Now the freaks in the audience were almost slavering with anticipation.

  I whispered, “And this proves what?"

  Bloom raised his voice so the ginks could hear—a couple of them were new and needed the explanation. “Two dogs,” he said. “One real. One artificial. The living dog is fitted with an array of neuroprostheses that duplicate its brain states. The dog's brain states are modeled in the electronics, here. Got that?"

  We all got it. The audience nodded in unison.

  "The dog's essence, its sense of self, is distributed between its organic brain and the remote prostheses. At the moment it's controlling the gel duplicate, too. When the real dog lifts his head and sniffs the air—like that: see?—he lifts the fake dog's head simultaneously. The illusion mimics the reality. The twinned soul operates twin bodies, through the medium of the machine."

  His hand approached another switch.

  "But when I throw this switch, the living dog's link to the prosthetics is severed. The original dog becomes merely itself—it won't even notice that the connection has been cut."

  He threw the switch; the audience gasped—but again, nothing obvious happened.

  Both dogs continued to pace, as if disturbed by the sharp smell of sweat and ionization.

  "As of now,” Bloom said, “the artificial animal is dynamically controlled solely by the neuroprostheses. It's an illusion operated by a machine. But it moves as if it had mass, it sees as if it had eyes, it retains a capacity for pleasure or pain."

  Now the behavior of the two dogs began to fall out of synchronization, subtly at first, and then more radically. Neither dog seemed to like what was happening. They eyed each other through their respective glass walls and backed away, snarling.

  "Of course,” Bloom added, his voice thick with an excitement he couldn't disguise, “without a biological model the neuroprostheses lose coherence. Shannon entropy sets in. Ten seconds have passed since I threw the final switch.” He checked his watch. “Twenty."

  The fake dog shook its head and emitted a silent whine.

  It moved in a circle, panting.

  It tried to scratch itself. But its legs tangled and bent spasmodically. It teetered a moment, then fell on its side. Its ribs pumped as if it were really breathing, and I guess it thought it was breathing—gasping for air it didn't really need and couldn't use.

  It raised its muzzle and bared its teeth.

  Its eyes rolled aimlessly. Then they turned opaque and dissolved into raw gel.

  The artificial dog made more voiceless screaming gestures. Other parts of it began to fall off and dissolve. It arched its back. Its flanks cracked open, and for a moment I could see the shadowy hollowness inside.

  The agony went on for what seemed like centuries but was probably not more than a minute or two. I had to turn away.

  The audience liked it, though. This was what they had come for, this simulation of death.

  They held their breath until the decoherent mass of gel had stopped moving altogether; then they sighed; they applauded timidly. It was only when the lights came up that they began to look ashamed. “Now get out,” Bloom told them, and when they had finished shuffling out the door, heads down, avoiding eye contact, he whispered to me, “I hate those guys. They are truly fucking demented."

  I looked back at the two glass cages.

  The original dog was trembling but unhurt. The duplicate was a quiescent puddle of goo. It had left a sharp tang in the air, and I imagined it was the smell of pain. The thing had clearly been in pain. “You said there was no cruelty involved."

  "No cruelty to animals,” Bloom corrected me.

  "So what do you call this?"

  "There's only one animal in the room, Mr. Paczovski, and it's completely safe, as you can see. What took shape in the gel box was an animation controlled by a machine. It didn't die because it was never alive."

  "But it was in agony."

  "By definition, no, it wasn't. A machine can only simulate pain. Look it up in the statutes. Machines have no legal standing in this regard."

  "Yeah, but a complex-enough machine—"

  "The law doesn't make that distinction. The EES is complex. Aibots are complex: they're all linked together in one big neural net. Does that make them people? Does that make it an act of sadism if you kick a vacuum cleaner or default on a loan?"

  Guess not. Anyway, it was his show, not mine. I meant to ask him if the dog act was the entire substance of his proposed Cartesian Theater ... and why he thought anyone would want to see such a thing, apart from a few unmedicated sadists.

  But this wasn't about dogs, not really. It was a test run. When Bloom turned away from me I could see a telltale cluster of bulges between his shoulder blades. He was wearing a full array of neuroprostheses. That's what he meant when he said the dogs were experiments. He was using them to refine his technique. Ultimately, he meant to do this to himself.

  * * * *

  "Technically,” Grandfather said, “he's right. About the law, I mean. What he's doing, it's ingenious and it's perfectly legal."

  "Lada's lawyers told her the same thing."

  "A machine, or a distributed network of machines, can be intelligent. But it can never be a person under the law. It can't even be a legal dog. Bloom wasn't shitting you. If he'd hurt the animal in any way he would have been remanded for treatment. But the fake dog, legally, is only a representation of an animal, like an elaborate photograph."

  "Like you,” I pointed out.

  He ignored this. “Tell me, did any of the ginks attending this show look rich?"

  "Hardly."

  "So the anonymous investor isn't one of them."

  "Unless he was in disguise, no. And I doubt Bloom would have turned down a cash gift even if it came from his creepy audience—the investor wouldn't have needed me or Lada if he had a direct line to Bloom."

  "So how did your investor hear about Bloom in the first place, if he isn't friendly with him or part of his audience?"

  Good question.

  I didn't have an answer.

  * * * *

  When I told Lada what I'd seen she frowned and ran her gold finger over her rose-pink lower lip, a signal of deep interest, the kind of gesture professional gamblers call a “tell."

  I said, “I did what you asked me to. Is there a problem with that?"

  "No—no problem at all. You did fine, Toby. I just wonder if we should have taken a piece for ourselves. A side agreement of some kind, in case this really does pan out."

  "If what pans out? When you come down to it, all Bloom has to peddle is an elaborate special effect. A stage trick, and not a very appealing one. The ancillary technology might be interesting, but he says he already filed patents."

  "The investor obviously feels differently. And he probably didn't get rich by backing losers."

  "How well do you know this investor?"

  She smiled. “All honesty? I've never met him. He's a text-mail address."

  "You're sure about his gender, at least?"

  "No, but, you know, death, pain—it all seems a little masculine, doesn't it?"

  "So is there a next step or do we just wait for Bloom to put together his show?"

  "Oh,” and here she grinned in a way I didn't like, “there's definitely a next step."

  She gave me another name. Philo Novembre.

  * * * *

  "Rings a bell,” Grandfather said. “Faintly. But then, I've forgotten so much."

  * * * *

  Philo Novembre was easier to find than Jafar Bloom. At least, his address was easier to find—holding a conversation with him was another matter.

  Philo Novembre was ten years short of a century old. He lived in an offshore retirement eden called Wintergarden Estates, connected to the mainland by a scenic causeway. I was the most conspicuously youthful visitor in the commute bus from the docks, not that the sample was representative: there were only three other passengers aboard. Aibot transports hogged the rest of the road, shuttling supplies to the Wintergarden. Their big eyes tracked the bus absently and they looked bored, even for machines.

  Novembre, of course, had not invited me to visit, so the aibot staffing the reception desk asked me to wait in the garden while it paged him—warning me that Mr. Novembre didn't always answer his pages promptly. So I found a bench in the atrium and settled down.

  The Wintergarden was named for its atrium. I don't know anything about flowers, but there was a gaudy assortment of them here, crowding their beds and creeping over walkways and climbing the latticed walls, pushing out crayon-colored blooms. Old people are supposed to like this kind of thing. Maybe they do, maybe they don't; Grandfather had never demonstrated an interest in botany, and he had died at the age of a century and change. But the garden was pretty to look at and it flushed the air with complex fragrances, like a dream of an opium den. I was nearly dozing when Philo Novembre finally showed up.

  He crossed the atrium like a force of nature. Elderly strollers made way for his passage; garden-tending aibots the size of cats dodged his footfalls with quick, knowing lunges. His face was lined but sharp, not sagging, and his eyes were the color of water under ice. His left arm was unapologetically prosthetic, clad in powder-black brushed titanium. His guide, a thigh-high aibot in brown slacks and a golf shirt, pointed at me and then scuttled away.

  I stood up to meet him. He was a centimeter or two taller than me. His huge gray gull-winged eyebrows contracted. He said, “I don't know you."

  "No sir, you don't. My name is Toby Paczovski, and I'd be honored if you'd let me buy you lunch."

  It took some haggling, but eventually he let me lead him to one of the five restaurants in the Wintergarden complex. He ordered a robust meal, I ordered coffee, and both of us ignored the elderly customers at the adjoining tables, some so extensively doctored that their physical and mental prostheses had become their defining characteristics. One old gink sucked creamed corn through a tube that issued from his jaw like an insect tongue, while his partner glared at me through lidless ebony-black eyes. I don't plan ever to get old. It's unseemly.

  "The reason I'm here,” I began, but Novembre interrupted:

  "No need to prolong this. You bought me a decent meal, Mr. Paczovski. I owe you a little candor, if nothing else. So let me explain something. Three or four times a year somebody like yourself shows up here at the Wintergarden and flatters me and asks me to submit to an interview or a public appearance. This person might represent a more or less respectable agency or he might be a stringer or a media pimp, but it always comes down to the same pitch: once-famous enemy of automated commerce survives into the golden age of the EES. What they want from me is either a gesture of defiance or a mumbled admission of defeat. They say they'll pay generously for the right note of bathos. But the real irony is that these people have come on a quest as quixotic as anything I ever undertook. Because I don't make public appearances. Period. I don't sign contracts. Period. I'm retired. In every sense of the word. Now: do you want to spend your time more profitably elsewhere, or shall we order another round of coffee and discuss other things?"

  "Uh,” I said.

  "And of course, in case you're already recording, I explicitly claim all rights to any words I've spoken or will speak at this meeting or henceforth, subject to the Peking Accords and the Fifty-second Amendment."

  He grinned. His teeth looked convincingly real. But most people's teeth look real these days, except the true ancients, like the guy at the next table.

  * * * *

  "Well, he knows his intellectual property law,” Grandfather said. “He's got you dead to rights on that one."

  "Probably so,” I said, “but it doesn't matter. I wasn't there to buy his signature on a contract."

  "So what did you want from him? Or should I say, what did Lada want from him?"

  "She wanted me to tell him about Jafar Bloom. Basically, she wanted me to invite him to opening night at the Cartesian Theater."

  "That's it?"

  "That's it."

  "So this client of hers was setting up a scenario in which Novembre was present for Bloom's death act."

  "Basically, yeah."

  "For no stated reason.” Grandfather's photograph was motionless a few moments. Implying deep thought, or a voltage sag.

  I said, “Do you remember Philo Novembre back when he was famous? The eighties, that would have been."

  "The 2080s,” Grandfather mused. “I don't know. I remember that I once remembered those years. I have a memory of having a memory. My memories are like bubbles, Toby. There's nothing substantial inside, and when I touch them they tend to disappear."

  * * * *

  Philo Novembre had been a celebrity intellectual back in the 2080s, a philosopher, a sort of 21st century Socrates or Aristotle.

  In those days—the global population having recently restabilized at two billion after the radical decline of the Plague Years—everyday conveniences were still a dream of the emerging Rationalization. Automated expert systems, neuroprostheses, resource-allocation protocols, the dole: all these things were new and contentious, and Philo Novembre was suspicious of all of them.

  He had belonged to no party and supported no movement, although many claimed him. He had written a book, The Twilight of the Human Soul, and he had stomped for it like a backwoods evangelist, but what had made him a media celebrity was his personal style: modest at first; then fierce, scolding, bitter, moralistic.

  He had claimed that ancient virtues were being lost or forgotten in the rush to a rationalized economy, that expert systems and globally-distributed AI, no matter how sophisticated, could never emulate true moral sensitivity—a human sense of right and wrong.

  That was the big debate of the day, simplistic as it sounds, and it ultimately ended in a sort of draw. Aibots and expert systems were granted legal status in loco humanis for economic purposes but were denied any broader rights, duties, privileges, or protection under the law. Machines aren't people, the courts said, and if the machines said anything in response they said it only to each other.

 
