Quarantine

by Greg Egan


  'For what? Keeping my mouth shut?'

  'For the result.'

  She scowls. 'Don't be so reasonable, it makes me sick. You didn't want us to be right. I don't expect you to slit your wrists, but can't you at least be a little . . . sullen?'

  'Not on duty.'

  She leans against the doorframe, sighing. 'Sometimes, I really do wonder which of us is the least human - you on duty, or me when I'm smeared.'

  'Smeared?'

  'Uncollapsed; in multiple states. That's what we call it: smeared.' She laughs. 'That will be my claim to fame: the first human being in history to smear at will.'

  The opportunity to contradict her, to mention Laura, hangs in the silence, tantalizing for a moment - but the risk of what it could lead to is too great. Which doesn't mean that I can't still probe around the edges. 'At will, yes - but couldn't someone have suffered neurological damage, and lost their ability to collapse the wave?'

  She nods. 'Good point. That might well have happened. The thing is, nobody would ever know, nobody could ever tell. Every time such a person interacted with someone who did collapse the wave, they'd be reduced to a single history, a single set of memories - and they wouldn't even know, themselves, that anything was different.'

  'But - while they were alone . . . ?'

  She shrugs. 'I don't know what it means to ask that. I've told you, I end up with just one set of memories myself. The effects prove that I've been smeared, but of course someone with brain damage wouldn't have the mod's control over the eigenstates - so other people would collapse them according to exactly the same probability distribution that would have applied if they'd collapsed themselves. The end result would be the same.' She laughs. 'I expect Niels Bohr would have said that such a person was the same as everyone else. If there's no way for anyone, the person included, to know what they "experienced" while they were unobserved, how can it be considered real? And I'd half agree with him: I mean, however long they went between contact with other people, each time an observation actually took place, all the states they'd occupied - all the multiple thoughts and actions they'd "experienced" - would collapse into one, perfectly mundane, linear sequence.'

  'What if they were left alone often? Left unobserved, most of the time? Do you think they could learn, somehow, to take advantage of what was happening? Force it to make a real, permanent difference - the way you can, with the mod?'

  She seems about to dismiss the idea, but then she hesitates, ponders the question seriously - and suddenly smiles. 'I wonder. How improbable is the configuration of neurons in the mod? If someone was smeared for long enough, they'd evolve all kinds of weird and unlikely neural structures - along with a whole lot of highly probable ones. Normally, that would have no effect - the most probable configurations would still be the ones chosen when the collapse took place; everything else would vanish. But if one of these unlikely versions of the brain had some ability to meddle with the eigenstates, maybe it could bootstrap itself to a higher probability.'

  'And once a version which could do that had been made "real" -'

  '- then the next time the person smeared, they'd have a double advantage. Not only would they have the eigenstate meddling ability, per se, but they'd be starting from a new baseline - other states with even greater skills would now be far more probable, far easier to reach. The whole thing could snowball.' She shakes her head, enchanted. 'Evolution in a single lifetime! Emergent probability with a vengeance! I love it!'

  'So it really could happen?'

  'I doubt that very much.'

  'What? You just said -'

  She pats my shoulder sympathetically. 'It's a beautiful idea. So beautiful I'd say it just about disproves itself. If it really could happen, where are the end results? Where are all the case histories of brain-damaged people juggling eigenstates at will? The first stage must be too hard to reach in any reasonable time. Eventually, I'm sure, someone will get around to calculating just how long it would take to perform the initial bootstrap - but the answer could easily be months, years, decades . . . it could be longer than a human lifetime. And how long does anyone spend alone?'

  'I suppose you're right.'

  'Well, I have to defend my place in history, don't I? Such as it is.'

  Karen says, 'I like her. She's intelligent, cynical, and only a little naive; the best friend you've made in years. And I think she can help you.'

  I blink at her, and moan softly. The strange thing is, I don't feel at all like I've suffered a sudden loss of control; rather, my featureless memories of the last three hours in stake-out mode seem to have evaporated, as if they'd never been anything but a delusion.

  I say, 'What do you want?'

  She laughs. 'What do you want?'

  'I want everything to go on as normal.'

  'Normal! First you were a slave to a bunch of kidnappers, and now you're apparently worshipping the thing that enslaves you. The Ensemble in the head! It's bullshit.'

  I shrug. 'I have no choice. The loyalty mod isn't going to vanish. What do you expect me to do? Drive myself insane, trying to fight it? I don't want to fight it. I know precisely what's been done to me. I don't deny that without the mod, I would want to be free of it - but where does that leave me? If I was free, I'd want to be free. And if I was someone else entirely, I'd want completely different things. But I'm not, and I don't. It's irrelevant. It's a dead end.'

  'It doesn't have to be.'

  'What's that supposed to mean?'

  She doesn't reply; she turns and looks 'out' across the city, then raises a hand and - impossibly - signals the window to enhance the hologram's contrast, cutting back the spill from the advertising signs, darkening the empty sky to the deepest black imaginable.

  Karen controlling RedNet? Or has the hallucinatory process which conjures up her body started manipulating the rest of my visual field? I contemplate these equally improbable explanations with equally numb resignation. There's no point hoping any more that this problem will cure itself. The neurotechnicians are going to have to take me apart.

  I stare at the perfect darkness of the Bubble, unwillingly entranced by the sight of it, whatever kind of illusion - contrast-enhanced hologram, or pure mental fabrication - 'the sight of it' is.

  136

  A faint pinprick of light appears in the blackness. Assuming that it's nothing but a flaw in my vision, I blink and shake my head, but the light stays fixed in the sky. A high, slow-moving satellite, just emerged from the Earth's shadow? The point grows brighter, and then another appears close by.

  I turn to Karen. 'What are you doing to me?'

  'Sssh.' She takes my hand. 'Just watch.'

  Stars keep appearing, doubling and redoubling in number like phosphorescent celestial bacteria, until the sky is as richly populated as I remember it from the darkest nights of my childhood. I hunt for familiar constellations, and for a fleeting instant I recognize the saucepan shape of Orion, but it's soon gone, drowned in the multitude of new stars coming into being around it. My eye finds exotic new patterns - but they're as transitory as the rhythms in Po-kwai's random chant, vanishing the moment they're perceived. The satellite views on Bubble Day, the most baroque space operas of the forties, never had stars like this.

  A dazzling tract of light - like an impossibly opulent version of the Milky Way - thickens to the point of solidity, then grows steadily brighter.

  I whisper, 'What are you saying? That the damage we've done can be . . . undone? I don't understand.'

  The band of light explodes, spreading across the sky until the perfect blackness becomes perfect, blinding white. I turn away. Po-kwai cries out. Karen vanishes. I spin back to face the hologram. The sky above the towers of New Hong Kong is empty and grey.

  I hesitate at the door to the apartment, just listening for a while. I don't want to startle her again, but I have no intention of becoming complacent. Nobody could have reached her without passing me . . . but what kind of state was I in, hallucinating cosmic visions, to know who or what might have walked right by me, unseen? The whole episode already seems completely unreal; if not for a lingering vision of the blazing sky, I'd swear that I had a seamless recollection of standing guard in stake-out mode, from the time I bid Po-kwai good night to the instant I heard her scream.

  As I open the door, she's stepping into the living room, hugging herself. She says drily, 'Well, you're not much use. I could have been murdered in my bed by now.' Despite the joke, she seems far more shaken than last time.

  'Another nightmare?'

  She nods. 'And this time, I remember . . . what it was about.'

  I say nothing. She scowls at me. 'So stop being a fucking robot, and ask me what I dreamt.'

  'What did you dream?'

  'I dreamt that I lost control of the mod. I dreamt that I smeared. I dreamt that I . . . filled . . . the whole room, the whole apartment. And I don't sleepwalk, you know -' Suddenly, she starts shivering violently.

  'What -'

  She reaches out and grabs me by the arm, leads me down the corridor towards the bedroom. The door is closed. She points me at it bodily, takes a second to catch her breath, then says, 'Open it.'

  I try to turn the handle. It doesn't move.

  'It's locked. That's how paranoid I am. I lock it every night now.'

  'And you woke . . . ?'

  'Outside. Half-way up the corridor.' She positions herself at the spot. 'After hitting one eight-digit combination to open the thing, and another to lock it behind me.'

  'Did you . . . dream of doing that? Did you dream of operating the lock?'

  'Oh, no. In the dream, I didn't need to touch the lock - I was already outside the room. Inside and outside. I didn't need to move . . . I just had to strengthen the eigenstate.'

  I hesitate, then say, 'And do you think -'

  She says firmly, 'I think my subconscious must have it in for me, that's all I can say. I must have hit the right codes in my sleep, however hard that is to believe. Because if you're wondering if the mod might have let me tunnel through a closed door - like an electron through a voltage barrier - the answer is, it can't. Even if that were possible in theory, this mod was not designed to do any such thing. It was designed to work on microscopic systems. It was designed to demonstrate the simplest effects - nothing more.'

  I imagine my reply so vividly that I can almost hear the words: 'It wasn't designed at all.'

  But the machinery in my skull keeps me silent, and instead I nod and say, 'I believe you - you're the expert. And it was your dream, not mine.'

  9

  Lui says, 'We can use this.'

  'Use it? I don't want to use it, I want to put a stop to it! I want the Canon's blessing to tell Po-kwai exactly what's happening. I want to get the whole thing under control.'

  He frowns. 'Under control, yes, but you mustn't tell Po-kwai about Laura. Suppose Chen found out that you'd disobeyed her? Where would that leave us? Right now, I'm sure nobody even suspects the existence of the Canon; they have far too much confidence in the loyalty mod. Or far too little respect for it. They don't seem to have realized just how powerful a combination intelligence and its antithesis could be. You know, in formal logic, an inconsistent set of axioms can be used to prove anything at all. Once you have a single contradiction, A and not A, there's nothing you can't derive from it. I like to think of that as a metaphor for our distinctive kind of freedom. Forget Hegelian synthesis; we have pure Orwellian doublethink.'

  I look past him irritably, across the crowded lawns of Kowloon Park, to a flower bed shimmering in the heat. I have no one else to turn to, and I don't seem to be getting through to him.

  I say, 'Po-kwai deserves to know the truth.'

  'Deserves? It's not a matter of what she deserves, it's a matter of what the consequences would be. I have the greatest respect and admiration for her, believe me. But do you really want to sacrifice the Canon, just to let her know that she's been deceived? The sham Ensemble wouldn't simply impose harsher mods on us, if that's what you're thinking; they'd write off their losses - they'd kill us. And what do you think they'd do to her, if she tried to back out now?'

  'Then we have to protect her, and protect ourselves. We have to bring the sham Ensemble down.'

  Even as I say it, I realize how ludicrous a suggestion it is, but Lui says, 'Eventually, yes. But that's not going to happen on a whim. We need to act from a position of strength. We need to exploit whatever opportunities present themselves.' He pauses - just long enough for my hesitant silence to sound like implicit consent - then adds, 'Like this one.'

  'Po-kwai is losing control of her mod. I'm going insane. How is that an opportunity?'

  He shakes his head. 'You're not "going insane". Some of your mods are failing, that's all. Why? P3 is designed to act as a barrier, confining you to certain, useful states of mind - and yet somehow you're tunnelling through that barrier, into states that are supposedly inaccessible: boredom, distraction, emotional agitation. That ought to be highly unlikely - and yet you're doing it. All the diagnostics tell you that the mod is physically intact. Which means the system itself is undamaged . . . but the probabilities of the system are being changed. Remind you of anything?'

  I shudder. 'If you're saying Po-kwai is manipulating me the way she manipulates the ions . . . how can she? Okay, she can alter the probabilities of a smeared system - like a silver ion whose spin is still a mixture of up and down - but what's that got to do with me? I'm the very opposite of a smeared system: I collapse the wave, don't I?'

  'Of course you do - but how often?'

  'All the time.'

  'What do you mean, "all the time"? Do you think you're permanently collapsed? The collapse is a process - a process that happens to a smeared system. You think smearing is an exotic state - something that only happens in laboratories?'

  'Isn't it?'

  'No. How can it be? Your whole body is built out of atoms. Atoms are quantum mechanical systems. Suppose - conservatively - that the average atom in your body, left uncollapsed for a millisecond, might do one of ten different possible things. That means, in a millisecond, it will smear into a mixture of ten eigenstates: one for each of the things it might have done. Some states will be more probable than others - but until the system is collapsed, all these possibilities will co-exist.

  'After two milliseconds, there'd be a hundred distinct combinations of things this atom might have done: any of the ten possibilities, followed by the same choice again. That means smearing into a mixture of a hundred different eigenstates. After three milliseconds, a thousand. And so on.

  'Add a second atom. For each possible state of the first atom, the second could be in any one of its own states. The numbers multiply. If one atom, alone, could have smeared into a thousand states, a system of two would have smeared into a million. Three atoms, and it's a billion. Keep that up until you get to the size of a visible object - a grain of sand, a blade of grass, a human body - and the numbers are astronomical. And constantly increasing with time.'

  I shake my head numbly. 'So, how does it ever stop?'

  'I'm getting to that. When one smeared system interacts with another, they cease to be separate entities. Quantum mechanics says they have to be dealt with as a single system - you can't lay a finger on one part without affecting the whole thing. When Po-kwai observes a smeared silver ion, a new system is formed: Po-kwai-plus-the-ion, which has twice as many states as Po-kwai had, alone. When you observe a blade of grass, a new system is formed: you-plus-the-blade-of-grass, which has as many states as you had, alone, multiplied by however many states the blade of grass had.

  'But a system which includes you includes the collapse-inducing part of your brain - which ends up smeared into countless different versions, representing all the different possible states of everything else: the rest of your brain, the rest of your body, the blade of grass, and anything else you've observed. When this part of your brain collapses itself - making one version of itself real - it can't help collapsing the whole, combined system: the rest of your brain, the rest of your body, the blade of grass, and so on. They all collapse to a single state, in which just one of all the countless billions of possibilities actually "happens". Then, of course, they all start smearing again . . .'

  I say, 'All right, I understand: people have to smear, in order to collapse. All the possibilities have to be there - in a sense - in order for one to be chosen. The collapse is like . . . drastically pruning a tree - which has to grow out a little, in all directions, before we can choose which branch we leave uncut. But we must collapse so often that we don't have time to be aware of being smeared, in between. Hundreds of times a second, at least.'

  Lui frowns. 'Why do you say that? How would we be "aware of being smeared"? Consciousness seems like a smooth flow, but that's just the way the brain organizes perceptions; reality isn't created continuously, it comes in bursts, in spasms. Experience must be constructed retrospectively; there's no such thing as the present - it's only the past that we succeed in making unique. The only question is the time scale. You say that if it was anything more than a few milliseconds, we'd somehow be aware of the process . . . but that's simply not true. This is how subjective time arises, how the future turns into the past, for us. We're in no position to discern how, or when, it happens.

  'Granted, in the experiments with Po-kwai when she didn't use the collapse-inhibiting part of the mod, she was unable to influence the eigenstates - but that's no proof of anything. Even if she failed because she collapsed herself-plus-the-ions before she could change the probabilities - and that's by no means the only explanation - you can't generalize from one person, in a laboratory, to the whole human race, all of the time. Depending on their state of mind, depending on whether they're in groups or alone, people might go for seconds, or even minutes, between collapsing. There is no way of knowing.'

 
