by Greg Egan
“Operation Butterfly is only the beginning. Crisis management, for a tiny part of the planet. Imagine how much computing power it would take to render sub-Saharan Africa free from drought.”
“Why should I imagine that, when the most modest schemes are still unproven? And even if weather control turns out to be viable, more supercomputers can always be built. It doesn’t have to be a matter of Copies versus flood victims.”
“There’s a limited supply of computing power right now, isn’t there? Of course it will grow – but the demand, from Copies, and for weather control, is almost certain to grow faster. Long before we get to your deathless utopia, we’ll hit a bottleneck – and I believe that will bring on a time when Copies are declared illegal. Worldwide. If they’ve been granted human rights, those rights will be taken away. Trusts and foundations will have their assets confiscated. Supercomputers will be heavily policed. Scanners – and scan files – will be destroyed. It may be forty years before any of this happens – or it may be sooner. Either way, you need to be prepared.”
Thomas said mildly, “If you’re fishing for a job as a futurology consultant, I’m afraid I already employ several – highly qualified – people who do nothing but investigate these trends. Right now, everything they tell me gives me reason to be optimistic – and even if they’re wrong, Soliton is ready for a very wide range of contingencies.”
“If your whole foundation is eviscerated, do you honestly believe it will be able to ensure that a snapshot of you is hidden away safely – and then resurrected after a hundred years or more of social upheaval? A vault full of ROM chips at the bottom of a mine shaft could end up taking a one-way trip into geological time.”
Thomas laughed. “And a meteor could hit the planet tomorrow, wiping out this computer, all of my backups, your organic body … anything and everything. Yes, there could be a revolution which pulls the plug on my world. It’s unlikely, but it’s not impossible. Or there could be a plague, or an ecological disaster, which kills billions of organic humans but leaves all the Copies untouched. There are no certainties for anyone.”
“But Copies have so much more to lose.”
Thomas was emphatic; this was part of his personal litany. “I’ve never mistaken what I have – a very good chance of a prolonged existence – for a guarantee of immortality.”
Durham said flatly, “Quite right. You have no such thing. Which is why I’m here offering it to you.”
Thomas regarded him uneasily. Although he’d had all the ravages of surgery edited out of his final scan file, he’d kept a scar on his right forearm, a small memento of a youthful misadventure. He stroked it, not quite absentmindedly; conscious of the habit, conscious of the memories that the scar encoded – but practiced at refusing to allow those memories to hold his gaze.
Finally, he said, “Offering it how? What can you possibly do – for two million euros – that Soliton can’t do a thousand times better?”
“I can run a second version of you, entirely out of harm’s way. I can give you a kind of insurance – against an anti-Copy backlash … or a meteor strike … or whatever else might go wrong.”
Thomas was momentarily speechless. The subject wasn’t entirely taboo, but he couldn’t recall anyone raising it quite so bluntly before. He recovered swiftly. “I have no wish to run a second version, thank you. And … what do you mean, ‘out of harm’s way’? Where’s your invulnerable computer going to be? In orbit? Up where it would only take a pebble-sized meteor to destroy it, instead of a boulder?”
“No, not in orbit. And if you don’t want a second version, that’s fine. You could simply move.”
“Move where? Underground? To the bottom of the ocean? You don’t even know where this office is being implemented, do you? What makes you think you can offer a superior site – for such a ridiculous price – when you don’t have the faintest idea how secure I am already?” Thomas was growing disappointed, and uncharacteristically irritable. “Stop making these inflated claims, and get to the point. What are you selling?”
Durham shook his head apologetically. “I can’t tell you that. Not yet. If I tried to explain it, out of the blue, it would make no sense. You have to do something first. Something very simple.”
“Yes? And what’s that?”
“You have to conduct a small experiment.”
Thomas scowled. “What kind of experiment? Why?”
And Durham – the software puppet, the lifeless shell animated by a being from another plane – looked him in the eye and said, “You have to let me show you exactly what you are.”
Chapter 3
(Rip, tie, cut toy man)
June 2045
Paul – or the flesh-and-blood man whose memories he’d inherited – had traced the history of Copies back to the turn of the century, when researchers had begun to fine-tune the generic computer models used for surgical training and pharmacology, transforming them into customized versions able to predict the needs and problems of individual patients. Drug therapies were tried out in advance on models which incorporated specific genetic and biochemical traits, allowing doses to be optimized, and any idiosyncratic side-effects anticipated and avoided. Elaborate operations were rehearsed and perfected in Virtual Reality, on software bodies with anatomical details – down to the finest capillaries – based on the flesh-and-blood patient’s tomographic scans.
These early models included a crude approximation of the brain, perfectly adequate for heart surgery or immunotherapy – and even useful to a degree when dealing with gross cerebral injuries and tumors – but worthless for exploring more subtle neurological problems.
Imaging technology steadily improved, though – and by 2020, it had reached the point where individual neurons could be mapped, and the properties of individual synapses measured, non-invasively. With a combination of scanners, every psychologically relevant detail of the brain could be read from the living organ – and duplicated on a sufficiently powerful computer.
At first, only isolated neural pathways were modeled: portions of the visual cortex of interest to designers of machine vision, or sections of the limbic system whose role had been in dispute. These fragmentary neural models yielded valuable results, but a functionally complete representation of the whole organ – embedded in a whole body – would have allowed the most delicate feats of neurosurgery and psychopharmacology to be tested in advance. For several years, though, no such model was built – in part, because of the prohibitive cost of supercomputer time; in part because of a scarcely articulated unease at the prospect of what it would mean. There were no formal barriers standing in the way – government regulatory bodies and institutional ethics committees were concerned only with human and animal welfare, and no laboratory had yet been fire-bombed by activists for its inhumane treatment of physiological software – but still, someone had to be the first to break all the unspoken taboos.
Someone had to make a high-resolution, whole-brain Copy – and let it wake, and talk.
In 2024, John Vines, a Boston neurosurgeon, ran a fully conscious Copy of himself in a crude Virtual Reality. After slightly less than three hours of real time (pulse racing, hyperventilating, stress hormones elevated), the first Copy’s first words were: “This is like being buried alive. I’ve changed my mind. Get me out of here.”
His original obligingly shut him down – but then later repeated the demonstration several times, without variation, reasoning that it was impossible to cause additional distress by running exactly the same simulation more than once.
When Vines went public, the prospects for advancing neurological research didn’t rate a mention; within twenty-four hours – despite the Copy’s discouraging testimony – the headlines were all immortality, mass migration into Virtual Reality, and the imminent desertion of the physical world.
Paul was twenty-four years old at the time, with no idea what to make of his life. His father had died the year before – leaving him a modest business empire, centered on a thriving retail chain, which he had no interest in managing. He’d spent seven years traveling and studying – science, history, and philosophy – doing well enough at everything he tried, but unable to discover anything that kindled real intellectual passion. With no struggle for financial security ahead, he’d been sinking quietly into a state of bemused complacency.
The news of John Vines’ Copy blasted away his indifference. It was as if every dubious promise technology had ever made to transform human life was about to be fulfilled, with a vengeance. Longevity would only be the start of it; Copies could evolve in ways almost impossible for organic beings: modifying their minds, redefining their goals, endlessly transmuting themselves. The possibilities were intoxicating – even as the costs and drawbacks of the earliest versions sank in, even as the inevitable backlash began. Paul was a child of the millennium; he was ready to embrace it all.
But the more time he spent contemplating what Vines had done, the more bizarre the implications seemed to be.
The public debate the experiment had triggered was heated, but depressingly superficial. Decades-old arguments raged again, over just how much computer programs could ever have in common with human beings (psychologically, morally, metaphysically, information-theoretically …) and even whether or not Copies could be “truly” intelligent, “truly” conscious. As more workers repeated Vines’ result, their Copies soon passed the Turing test: no panel of experts quizzing a group of Copies and humans – by delayed video, to mask the time-rate difference – could tell which were which. But some philosophers and psychologists continued to insist that this demonstrated nothing more than “simulated consciousness”, and that Copies were merely programs capable of faking a detailed inner life which didn’t actually exist at all.
Supporters of the Strong AI Hypothesis insisted that consciousness was a property of certain algorithms – a result of information being processed in certain ways, regardless of what machine, or organ, was used to perform the task. A computer model which manipulated data about itself and its “surroundings” in essentially the same way as an organic brain would have to possess essentially the same mental states. “Simulated consciousness” was as oxymoronic as “simulated addition.”
Opponents replied that when you modeled a hurricane, nobody got wet. When you modeled a fusion power plant, no energy was produced. When you modeled digestion and metabolism, no nutrients were consumed – no real digestion took place. So, when you modeled the human brain, why should you expect real thought to occur? A computer running a Copy might be able to generate plausible descriptions of human behavior in hypothetical scenarios – and even appear to carry on a conversation, by correctly predicting what a human would have done in the same situation – but that hardly made the machine itself conscious.
Paul had rapidly decided that this whole debate was a distraction. For any human, absolute proof of a Copy’s sentience was impossible. For any Copy, the truth was self-evident: cogito ergo sum. End of discussion.
But for any human willing to grant Copies the same reasonable presumption of consciousness that they granted their fellow humans – and any Copy willing to reciprocate – the real point was this:
There were questions about the nature of this shared condition which the existence of Copies illuminated more starkly than anything which had come before them. Questions which needed to be explored, before the human race could confidently begin to bequeath its culture, its memories, its purpose and identity, to its successors.
Questions which only a Copy could answer.
#
Paul sat in his study, in his favorite armchair (unconvinced that the texture of the surface had been accurately reproduced), taking what comfort he could from the undeniable absurdity of being afraid to experiment on himself, further. He’d already “survived” the “transition” from flesh-and-blood human to computerized physiological model – the most radical stage of the project, by far. In comparison, tinkering with a few of the model’s parameters should have seemed trivial.
Durham appeared on the terminal – which was otherwise still dysfunctional. Paul was already beginning to think of him as a bossy little djinn trapped inside the screen – rather than a vast, omnipotent deity striding the halls of Reality, pulling all the strings. The pitch of his voice was enough to deflate any aura of power and grandeur.
Squeak. “Experiment one, trial zero. Baseline data. Time resolution one millisecond – system standard. Just count to ten, at one-second intervals, as near as you can judge it. Okay?”
“I think I can manage that.” He’d planned all this himself, he didn’t need step-by-step instructions. Durham’s image vanished; during the experiments, there could be no cues from real time.
Paul counted to ten. The djinn returned. Staring at the face on the screen, Paul realized that he had no inclination to think of it as “his own.” Perhaps that was a legacy of distancing himself from the earlier Copies. Or perhaps his mental image of himself had never been much like his true appearance – and now, in defense of sanity, was moving even further away.
Squeak. “Okay. Experiment one, trial number one. Time resolution five milliseconds. Are you ready?”
“Yes.”
The djinn vanished. Paul counted: “One. Two. Three. Four. Five. Six. Seven. Eight. Nine. Ten.”
Squeak. “Anything to report?”
“No. I mean, I can’t help feeling slightly apprehensive, just knowing that you’re screwing around with my … infrastructure. But apart from that, nothing.”
Durham’s eyes no longer glazed over while he was waiting for the speeded-up reply; either he’d gained a degree of self-discipline, or – more likely – he’d interposed some smart editing software to conceal his boredom.
Squeak. “Don’t worry about apprehension. We’re running a control, remember?”
Paul would have preferred not to have been reminded. He’d known that Durham must have cloned him, and would be feeding exactly the same sensorium to both Copies – while only making changes in the model’s time resolution for one of them. It was an essential part of the experiment – but he didn’t want to dwell on it. A third self, shadowing his thoughts, was too much to acknowledge on top of everything else.
Squeak. “Trial number two. Time resolution ten milliseconds.”
Paul counted. The easiest thing in the world, he thought, when you’re made of flesh, when you’re made of matter, when the quarks and the electrons just do what comes naturally. Human beings were embodied, ultimately, in fields of fundamental particles – incapable, surely, of being anything other than themselves. Copies were embodied in computer memories as vast sets of numbers. Numbers which certainly could be interpreted as describing a human body sitting in a room … but it was hard to see that meaning as intrinsic, as necessary, when tens of thousands of arbitrary choices had been made about the way in which the model had been coded. Is this my blood sugar here … or my testosterone level? Is this the firing rate of a motor neuron as I raise my right hand … or a signal coming in from my retina as I watch myself doing it? Anybody given access to the raw data, but unaware of the conventions, could spend a lifetime sifting through the numbers without deciphering what any of it meant.
And yet no Copy buried in the data itself – ignorant of the details or not – could have the slightest trouble making sense of it all in an instant.
Squeak. “Trial number three. Time resolution twenty milliseconds.”
“One. Two. Three.”
For time to pass for a Copy, the numbers which defined it had to change from moment to moment. Recomputed over and over again, a Copy was a sequence of snapshots, frames of a movie – or frames of computer animation.
But … when, exactly, did these snapshots give rise to conscious thought? While they were being computed? Or in the brief interludes when they sat in the computer’s memory, unchanging, doing nothing but representing one static instant of the Copy’s life? When both stages were taking place a thousand times per subjective second, it hardly seemed to matter, but very soon—
Squeak. “Trial number four. Time resolution fifty milliseconds.”
What am I? The data? The process that generates it? The relationships between the numbers?
All of the above?
“One hundred milliseconds.”
“One. Two. Three.”
Paul listened to his voice as he counted – as if half expecting to begin to notice the encroachment of silence, to start perceiving the gaps in himself.
“Two hundred milliseconds.”
A fifth of a second. “One. Two.” Was he strobing in and out of existence now, at five subjective hertz? The crudest of celluloid movies had never flickered at this rate. “Three. Four.” He waved his hand in front of his face; the motion looked perfectly smooth, perfectly normal. And of course it did; he wasn’t watching from the outside. “Five. Six. Seven.” A sudden, intense wave of nausea passed through him, but he fought it down, and continued. “Eight. Nine. Ten.”
The djinn reappeared and emitted a brief, solicitous squeak. “What’s wrong? Do you want to stop for a while?”
“No, I’m fine.” Paul glanced around the innocent, sun-dappled room, and laughed. How would Durham handle it, if the control and the subject had just given two different replies? He tried to recall his plans for such a contingency, but couldn’t remember them – and didn’t much care. It wasn’t his problem anymore.
Squeak. “Trial number seven. Time resolution five hundred milliseconds.”
Paul counted – and the truth was, he felt no different. A little uneasy, yes – but factoring out any squeamishness, everything about his experience seemed to remain the same. And that made sense, at least in the long run – because nothing was being omitted, in the long run. His model-of-a-brain was only being fully described at half-second (model time) intervals – but each description still included the results of everything that “would have happened” in between. Every half second, his brain was ending up in exactly the state it would have been in if nothing had been left out.