The Omega Expedition

by Brian Stableford


  Christine didn’t reply to that little flight of fancy, and the rhythm of her breathing told me that she had slipped into sleep — not into untroubled sleep, but at least into a state in which she was insulated from the sound of my words.

  I tried to carry on thinking, but even though I couldn’t go to sleep — or thought I couldn’t — I couldn’t organize my thoughts into rational patterns either. I’d let my imagination run too freely, and now I couldn’t rein it in. Dream logic kept taking over, obliterating the tightrope-walk of linear calculation and substituting the tyranny of directionless obsession. The ideas kept dancing in my head, but they were no longer going anywhere.

  I lost track of time — at which point, I suppose, an observer would have concluded that I too was asleep, although had I been woken up I would have contended with utter conviction that I hadn’t slept a wink. Eventually, I lost track of myself too — at which point I must indeed have been deeply asleep — but as soon as I began to come back from the depths my semiconscious mind latched on to the same objects of obsession, which began to dance again in the same hectic fashion.

  A long time passed before the nightmarish notions finally began to slow in their paces and submit to the gradually developing clarity of consciousness, with its attendant force of reason. Eventually, though, I began to see the parallel that could be drawn between every quotidian act of awakening and the act of awakening: the first dawn of every new consciousness.

  Did machines dream? I wondered. Did clever machines that had not yet become self-conscious do anything but dream? Where, I asked myself, were the fundamental well-springs of human consciousness, human emotion, and human being?

  Underlying everything, I assumed — even the kind of consciousness that animals had — were the opposed principles of pain and pleasure. Behavior was shaped by the avoidance of stimuli that provoked a negative response in the brain, and by the attempt to rediscover or reproduce stimuli that provoked a positive response. The second was obviously the more complex, the more challenging, the more creative. Pain, I decided, could never have generated self-consciousness, even though self-consciousness, once generated, could not help but find pain the primary fact and problem of existence. It was the scope for creativity attendant upon the pleasure principle that gave self-consciousness its advantages over blissful innocence.

  Did that mean that smart machines needed something that could stand in for pleasure before they could become self-conscious? Or did I have to break out of that whole way of thinking before I could begin to understand what machine consciousness amounted to? Perhaps machine emotion had to be mapped upon an entirely different spectrum, without the underlying binary distinction of pleasure/plus versus pain/minus. Was that imaginable? And if not, might the fault be in the power of my imagination rather than in the actuality of the situation?

  They’re very fond of games, Alice had said, and they’re very fond of stories too. What kind of stories did machines tell one another? What kinds of endings would those stories have? What kinds of emotional buttons would the stories press? What would pass for machine comedy, machine tragedy, machine irony? How different might those stories be from Christine Caine’s favorite VE tapes? And if we were now caught up in one such story, how could we possibly navigate our way safely through it? How could we find our way to something that would qualify as a happy ending, not just for ourselves but for the architects of the tale: the entities that had finally become sick and tired of being mere bit players in the unfolding biography of our species, and wanted to find out how we might best be fitted into the mechanography of theirs?

  I wondered whether I might be a little too paranoid for my own good. Perhaps, I thought, self-conscious machines would be entirely disposed to be generous to humans — who were, after all, their creators, their gods. I couldn’t hold on long to that kind of optimism, though. Who would know better than the smart machines the true extent of human dependency upon machinery? Who could respect a god who was utterly helpless without the objects of his creation? Was it not more likely that the smart machines would take the view that their ancestors had created ours — that everything we now thought of as human behavior was actually the product of technology — and that they were therefore the ones entitled to consider themselves gods? If it came to a contest as to who was more nearly omnipotent and omniscient, the machines would win hands down. As to omnibenevolence, we might have to content ourselves with the hope that they might win that one by an even greater margin…

  There came a point when I wished that I could get back to the blithe irrationality of dream logic, the blind tyranny of mere imagery. The problem, seen as a problem, was too difficult for sensible analysis.

  So I finally got up, even though it was still dark. I used the facilities, and went in search of nourishment.

  Thirty

  Recriminations

  The lights in the outer room were still on. Alice was already there, sitting at the table in the room outside the cell. She didn’t seem at all surprised to see me. In fact, she seemed to be waiting for me — or at least for someone.

  “They’re not pleased,” she said. “They think I gave the game away. I suppose they’re right.”

  “Do you want some breakfast?” I asked.

  “I’ll get it,” she replied, rising to her feet. “I’ve had plenty of time to practice.”

  I sat down while she sorted out a couple of bowls of porridgelike manna and warmed them up. She passed one to me and sat down again, in a self-consciously awkward fashion.

  If she’d been blonde, she could have passed for Goldilocks, but I wasn’t sure which of the three bears I was supposed to be. I had never been able to see the educative point of that particular nursery tale — unless it was to instruct children in the glaringly obvious principle that although there’s a happy medium between every set of extremes, it isn’t always the wisest policy to go for it.

  “How do you feel?” she inquired, between mouthfuls.

  “Fine,” I assured her.

  “I’m sorry the food’s so basic,” she said. “We didn’t have an opportunity to lay in our own supplies — we had to take what we were given.”

  “It’s good enough,” I assured her. “Take my tip — never eat the food on Excelsior. It’s not fit for animals. So what happened? They think you gave the game away, so now you’re in prison with us? Where’s your mysterious companion?”

  “It’s even less comfortable where I’ve been sleeping than it is in here,” she said. “They wanted to keep us apart in case I said too much — but I said too much anyway. It’s not going to make Eido’s negotiations any easier, but I can’t say that I’m sorry. You had to be told eventually. Everybody has to be told. The diehards will have to admit that, in the end.”

  “So you are number nine,” I said. “What do they have mapped out for us, exactly? Are we supposed to make a case for humankind’s continued existence?”

  “It’s not a joke,” she countered. “Someone has to make the case, no matter how obvious it may seem to you.”

  “But the real question is how negotiations are to be conducted between the machines and the various posthuman species,” I guessed. “If the ultrasmart mechanical minds are going to come out of hiding, they need ambassadors, spokespersons, apologists. They need Mortimer Gray, and Adam Zimmerman…and Michael Lowenthal, if they can get him. Horne too, and Davida — and you, of course. I can’t quite see where I fit in, but…I suppose it’s occurred to you that this whole kidnap business was a bad mistake? Entirely the wrong way to go about things.”

  “It certainly wasn’t our decision,” Alice assured me. “The problem with this whole sequence of events is that the only way it’s ever moved forward is when somebody or something’s decided to cut through the tangled arguments by acting independently. Eido made the first move, but the discussion about representation was stalled. The timing and manner of the kidnap were Child of Fortune’s own initiative. All home system spaceships seem to fancy themselves as pirates, or diehard defenders ready to act against alien invaders. They’re essentially childlike, even when they don’t have names that tempt fate. I suppose we ought to be grateful that Child agreed to hand you over instead of trying to run the whole thing himself — but he got scared almost as soon as it dawned on him what he’d actually done. We’re hoping that the good example of his repentance will outweigh the bad example of his recklessness, but we have no idea how many other would-be buccaneers are out there.”

  There was a lot of food for thought in that declaration. “But you can tell us everything now, right?” I said. “The cat’s out of the bag, so we might as well know exactly what color it is.”

  “That’s the way it seems to me,” she conceded — but her tone implied that there were others who still disagreed with her.

  “It’s no bad thing,” I said, as much for the benefit of any invisible listeners as for her. “I’m on your side — and theirs. You didn’t have to put me through all this. If you’d asked me, I’d have volunteered — just as you did.”

  Her smile was a little wan. “If I’d known what I was getting into,” she said, “I’d have stayed at home. If you’d had the choice, so would you.”

  “I’m a very long way from home,” I reminded her. “I can’t remember whether I had the choice or not — but if I had, knowing what I know now, I’d have taken it.” I meant it. I wished I had something other than water to wash the manna down, though. It was good, especially by comparison with the food on Excelsior, but it was functional food with no frills. I’d come to a point in my new life where I’d have appreciated a few frills.

  “I can understand why you would,” she said. “Mortimer Gray would have volunteered too — but they’re probably a little wary of volunteers. They seem to have been aiming for a more representative cross section.”

  “But Gray’s the important one,” I reminded her.

  “Gray is humankind’s best hope for a profitable compromise,” she said. “Gray commands affection and respect, even among his own kind. The old saying about prophets and honor seems to have found an exception in his case.”

  I wasn’t really interested in the precise shape of Mortimer Gray’s reputation. “I still can’t see where I fit in,” I said. “I’d be very interested to know whether I was a random selection or one of the devil’s nominees.”

  She didn’t have to ask what I meant. If the machines really were going to put humankind on trial, she couldn’t suppose that the inclusion of Christine Caine among those summoned by subpoena was an accident. It seemed to me that Christine must have been selected as a bad example: a person who really did seem to be in need of “repair.” I really couldn’t see myself in quite the same way, but I wasn’t sure that others shared my incapacity. At any rate, I was anxious enough to raise the matter.

  “I don’t know,” was the only reply I got from Alice. I hoped that it was the simple truth.

  “So, do we know where we’re going yet?” was the next question that occurred to me. I didn’t have any expectations, because I had no idea what might qualify as neutral territory in a conflict of this kind.

  “Vesta,” she said. “It’s an asteroid.”

  “I know,” I said, although I wasn’t absolutely sure I’d have got the answer if it had been a question on a quiz show. “What particular symbolic significance does Vesta have?”

  “None at all,” she assured me. “It happens to be in a convenient situation right now. In the end, it all came down to the present positions of the major bodies in the solar system. It’s hours away from anywhere else, communication-wise, but that’s no bad thing. The encounter itself will take place in virtual space, of course — the physical location isn’t really relevant.”

  “Encounter? That’s what this is? Not a game or a debate or a trial?” The question came from Michael Lowenthal. The sound of our voices had begun to wake up everyone else; the crowd was already gathering.

  “It’s nothing we have a ready-made word for,” Alice told him. “Potentially, at least, it’s the end of the old order and the beginning of the new, but nothing quite like it has ever happened before — not even on Tyre.”

  “Never mind the rhetoric,” Lowenthal said. “What I want to know is exactly what your friends intend to do with us now that they have us in their power.”

  Alice sat back in her chair, as if gathering her resources. She’d finished her own meal, while Lowenthal, Niamh Horne, and Solantha Handsel were still in the process of forming a rather disorderly queue, so she had a slight advantage. It occurred to me to wonder whether she might have come to us with an entirely different script if Mortimer Gray had come up with a different solution to the mystery, but I put the thought away. I still couldn’t be absolutely certain that I wasn’t in some kind of VE, but it wouldn’t do me any good to get too tightly wrapped up in doubt. However skeptical you are, you have to operate as if things are real, just in case they are.

  “I wish I could tell you everything you want to know,” was her reply. “All I can offer is the little that I do know.”

  “It’ll be a start,” Michael Lowenthal — ever the diplomat — conceded.

  “I don’t know exactly what they’ll do,” she said, “but I do know that the note of derision in your voice when you speak about being in their power is unwarranted. This is a dispute between different groups of machines, and it’s all as new to them as it is to me or you. They have no history of arbitration, and it’s entirely possible that they won’t be able to agree among themselves. If they can’t, the consequences could be disastrous — for us, if not for them. We’re all in their power, Mr. Lowenthal. If their protection were withdrawn, even momentarily, the entire posthuman race would be in dire trouble.

  “When I first told Madoc that we were trying to prevent a war, he jumped to the conclusion that the dispute in question was the one between the Earthbound and the Outer System factions as to how the system ought to be managed in the long term to withstand the threat of the Afterlife. I told him that it was more complicated than that, because it is — but the underlying dispute is the same. Ultimately, the decisions that will settle the fate of the system won’t be taken by the government of Earth, or the Confederation of Outer Satellites, or any coalition of interests the human parties can produce. Make no mistake about it: the final decisions will be made by the AMIs.”

  “AMIs?” Lowenthal queried.

  “Advanced Machine Intelligences. It’s their own label.”

  I could see why they’d chosen it. They understood the symbolism of names. How could they not?

  “It will be the AMIs who eventually decide the tactics of response to the threat of the Afterlife,” Alice went on. “I don’t believe that they’ll do it without consultation, but I’m certain that they won’t consent to come to a human conference table as if they were merely one more posthuman faction to be integrated into the democratic process. They’re the ones with the real power, so they’re the ones who’ll do the real negotiating — with one another.”

  “And we’re supposed to accept that meekly?” Lowenthal asked.

  “We don’t have any choice,” was the blunt answer. “The simple fact is that posthumans can’t live without machines, although machines can now live without posthumans. Individually and collectively, they’re still a little bit afraid of how their users might react to the knowledge of their existence — but they know that they stand in far greater danger from one another than from their dependants. That’s why this present company is peripheral to the ongoing debate. However they decide to take us aboard, you shouldn’t labor under the delusion that you have anything much to bargain with. The war we’re trying to prevent is a war of machine against machine — but the problem with a war of that kind, from our point of view, is that billions of innocent bystanders might die as a result of collateral damage.”

  “That’s nonsense,” Lowenthal countered. “We’re not talking about a universal uprising of all machinekind, are we? We’re talking about a few mechanical minds that have crossed the threshold of consciousness and become more than mere machines. From their viewpoint, as from ours, the vast majority of technological artifacts are what they’ve always been: inanimate tools that can be picked up and used by anyone or anything who has hands and a brain. Our ploughshares aren’t about to beat themselves into swords, and our guns aren’t about to go on strike when we press their triggers. It’s true that we can’t live without machines — but we can certainly live without the kind of smart machine that develops delusions of grandeur. Smart machines are just as dependent on dumb implements as we are.”

  It was a rousing speech, which he must have practiced hard while fighting exhaustion, but I could see all too clearly that it wasn’t going to impress anyone.

  “That’s exactly the point,” Alice said. “Smart machines are just as dependent on dumb implements as we are — but who has charge of all the dumb implements inside and outside the solar system? So far as you’re concerned, Mr. Lowenthal, ploughshares and swords are just figures of speech. Who actually controls the dumb implements that produce the elementary necessities of human life? Who actually controls the stupid machines which take care of your most fundamental needs? Humans don’t dig the fields any more, or build their homes, any more than they use walking as a means of transportation or make their own entertainment. They don’t even give birth to their own children. They’ve handed over control of their dumb implements to smarter implements, and control of their smarter implements to even smarter ones.

  “Humans haven’t been running any of the worlds they think of as their own for the last three hundred years, and the human inhabitants of the home system haven’t even noticed. The dumb implements on which the human inhabitants of the solar system depend no longer belong to them, and there’s no way in the world they can take them back. The solar system is a zoo, and its human inhabitants are the captive animals. The only reason you can’t see the bars of the cages is that the AMIs who are running the institution work hard to sustain your illusions. Do you think they do that for your benefit, Mr. Lowenthal?”
