The Medusa Chronicles

by Stephen Baxter


  It came again—louder now.

  Then again, and again. There was a rhythm to it—not unlike the rhythm of his own footfalls.

  Something coming nearer.

  Falcon turned slowly around, to what he judged to be the source of the sound. For a moment nothing seemed out of place, the looming grey-green details unchanged. Then he became aware of a dark form, towering as high as some of the great engines around him, and yet moving with deliberation along the same walkway he had just traversed. He held his ground, waiting until his eyes had garnered more information from the gloom.

  It was a Machine, of course. It moved on six limbs, as did Falcon. But whereas his own body was a squat, man-sized cylinder, the Machine’s was as large as a two-person spacecraft. It had a tapering, anvil-shaped head, an abdomen and thorax—and in addition to its legs, several pairs of powerful, multipurpose manipulators.

  Falcon had studied these designs while preparing for his mission, back on Makemake. It looked like a mechanical insect, but that was no more than an accident of design. That swivelling head was merely a high-resolution sensor platform, packed with cameras and probes. The Machine’s electronic brain and nuclear power systems were embedded in an armoured, impact-proof abdomen, to which the limbs were all sturdily attached. The thorax was essentially a secondary fusor-driven thruster, giving the Machine the means to move through space.

  Now three red lasers glared from the Machine’s head in a neat equilateral triangle. The light swept over Falcon, gridding him in a mesh of scan lines. He squinted, twisting his face away from the brightness.

  “Falcon.”

  The synthetic voice was high-pitched and childlike. He was hearing it over the radio channel.

  “Adam,” he answered. “It’s you, isn’t it?” He forced a good-humoured bravado into his reply, trying to set aside his qualms. “It’s good to see you again. I was worried, Adam. We thought something had happened to you . . .”

  The scanning stopped. The laser eyes were still there, but their brightness was now much reduced.

  “We?”

  “The people who sent me. Machine Affairs. They know there was an accident here—a problem with the flinger. I saw it as I came in.”

  “The flinger malfunctioned. Many Machines were lost.”

  “I know—we can tell from your telemetry feeds. But we can also see that a lot of Machines weren’t hurt. You weren’t significantly damaged, were you?”

  The Machine cocked its head at an angle, like a dog waiting for a reward. “Define damage.”

  That wasn’t the response Falcon had been expecting. Adam seemed to want some complexity of meaning beyond a simple dictionary definition. Complexity of meaning from complexity of experience, perhaps. He thought hard before responding. “Physical harm that reduces your effectiveness—your ability to function.”

  “Many Machines were damaged. Many Machines were lost. Many Machines did not have to be lost. The flinger was saved. The Machines were lost. Why was the flinger more important than the Machines?”

  “I don’t understand your question.”

  “Why are you here, Falcon?”

  “To help you.”

  “To make Machines go back to work?”

  Yes, Falcon thought to himself—or at least that was why his World Government masters had sent him to the Kuiper Belt.

  “You have to work with them, Adam. You’re part of an asset, with economic as well as technical value. If they decide that you can’t be relied upon, they’ll replace you with . . . something else. Another technology.”

  “You have changed, Falcon.”

  “Have I?” Falcon lowered his cylindrical upper body until he sat on the ground, with his legs tucked around him—as unthreatening a posture as he could adopt. “I’ve come to listen. Talk to me, Adam. Remember how we used to talk, for hours and hours? Remember how I tried to get you to pronounce my name smoothly? You’ve lost the habit again. It’s not Fal-con, it’s Falcon, Falcon, Falcon. Remember?”

  Slowly the Machine lowered itself on its own legs, until the thruster nozzle of its thorax was in contact with the floor. The abdomen and head still towered over Falcon, and with the strength in those limbs, Falcon knew, the Machine could have pulled him apart without a thought.

  But Adam lowered its head as if in shame.

  “I could not save them all, Falcon.”

  14

  Slowly Falcon learned what had happened.

  Various subordinate Machines—units similar physically to Adam, but with less autonomy—had been involved in routine structural repairs to the trumpet-shaped maw of the flinger. While they were up there, adjusting and replacing laser components, the flinger was supposed to be off-line.

  But something had gone wrong with the safety interlocks. A rogue electronic command—no more, Adam had determined, than a random pattern of digital noise in the system, perhaps the result of the impact of a single random cosmic-ray particle—had caused the bucket and payload to begin rising. A one-in-a-billion chance of that error happening, supposedly—but Falcon hardly needed reminding about one-in-a-billion accidents.

  The Machines had detected the fault and tried to correct it. But before they could bring the bucket back under control, it passed the point of no return and started falling away from the surface, gathering speed all the while.

  “The magnetic braking system,” Falcon said. “Why didn’t it cut in to slow down the bucket?”

  “The power coupling to the braking system was interrupted while the lasers were being reinstalled,” Adam answered. “It could not be reinstated in time.”

  “Where were you?”

  “On the outer structure of the flinger, one kilometre beneath the maw, supervising many work units above and below me.”

  “And you knew that bucket was on its way up to you?”

  “Yes. We had time in which to prepare for it.”

  Of course, Falcon thought. The exit speed was fast, but the bucket would have been travelling much more slowly when the fault was detected. Gathering speed, yes, but still with minutes of fall ahead of it.

  “Was there no way to stop it?”

  “Physical braking systems were designed to swing into the path of the bucket in the event of an emergency. They would have stopped the bucket long before it reached the maw.”

  “Then why didn’t they?”

  “I was not authorised to activate them.”

  Falcon felt a throb of confusion developing behind his forehead, building up like a nasty weather system—but he took a grim satisfaction in the fact that he was still capable of headaches. “I don’t understand. There’s a safety system, and you weren’t permitted to use it? You’re the supervisor—you have autonomy of decision-making—you’re supposed to have total authority over any part of this installation.”

  “That is correct.”

  “Under which circumstances could you have activated those safeties?”

  Adam took a moment before answering.

  “Had human lives been in peril. If the flinger was at risk of damaging a human spacecraft or inspection team, then I would have had authority to enable the safeties. In the circumstances, I had no such authority.”

  Falcon thought that through. “Suppose you had operated the safeties—for whatever justification. What would have happened?”

  “If the safeties had been engaged, the flinger would have been damaged beyond economic repair by the braking stresses. But most of the Machines would have survived.”

  “Are you sure? It sounds as if it would have been pretty bad either way.”

  “The kinetic energy release would have been an order of magnitude less than when the bucket hit the maw structures. I have reviewed the situation many times. Machines would have survived the gradual collapse of the flinger. We are strong.”

  That knot of confusion now had a thunderous black core. “In other words,” Falcon said carefully, “your programming would have permitted you to save human lives, but not Machines. Not if doing so put the flinger itself at risk.”

  “It is worse than that, Falcon. I had to make choices. I could save some, but not all.”

  “Tell me what happened.”

  “Some Machines were able to detach from the maw before the impact. But many others were too far into the structure. Some had detached their thruster thoraxes, so that they could perform specific repair tasks, or carry materials. So they could not escape. As supervisor, I was required to coordinate the best strategy for preserving as many Machines as possible.

  “I computed an optimum survival plan. I transmitted my plan to my units. I told the ones I could save how to move. And I told the ones I could not save that they must be deactivated soon.”

  “It was the best you could do,” Falcon said.

  “I have reviewed the situation many times,” Adam said again. “It was chaotic, unpredictable, fast-moving. I fear I did not choose the optimum solution.”

  “But you saved some, and that’s what mattered. You did the best that you could given the time and information available to you.”

  “That is logical.” The head tilted. “Yes, that is logical. Then why am I troubled, Falcon?”

  He had no immediate answer for that.

  Adam, in fact, should not have been troubled at all. The Machine had taken a decision under difficult circumstances, but that was what Machines were meant to do—make the hard choices when humans were too far away to offer useful input.

  What was he dealing with here? Kedar and her team had insisted that they never intended the Machines to rise to consciousness—that mind was an unnecessary complication in an industrial machine. But could a robot feel guilt and regret, as Adam was demonstrating, without some degree of self-awareness?

  “I don’t know why you’re troubled,” Falcon answered slowly. “But I believe you when you say that you are. You witnessed something awful, and you were put in an intolerable position—playing a duff hand, as Geoff Webster would have said.”

  “Webster?”

  “An old friend. He died many years ago.” In the end Geoff had refused his latest round of life-extension treatment. He wasn’t alone in that; many, if not most, people seemed to sense when the time had come, whether medical options were available or not. Geoff had gone into the dark, as cussed and stubborn as he ever was . . .

  Falcon had fallen into a reverie. Adam was watching him.

  “Do you think of Webster, Falcon? You call him to mind? You summon his image from your memory?”

  Falcon felt a stab of sadness. “Now and then.”

  “I think of the Machines that were lost. Falcon, death is not a necessary condition for Machines. We are all potentially immortal. And yet death has come to this place. I try to simulate the experiences of those Machines as the accident happened, as the realisation of termination came. I try to emulate their internal processor states at the time.”

  “You wonder how they felt.”

  “They were not like me,” Adam said. “They were not supervisory units. But they could communicate and learn. I was bringing some of them up to a higher level of independence—delegating sub-tasks. Mentoring them, as you mentored me. I trusted these Machines. I was . . . pleased . . . with them.”

  The Machine lifted its artificial face. Falcon kept his counsel, letting Adam find the words.

  “There was one unit. It had no name, only a registration number. Call it 90. It was booted up late—I mean, late in the process of the construction of the flinger. When it came to awareness it was already on the flinger itself—that was where it would work—on a metal spire, subject to the gravity of the KBO and the centrifugal force of the arm. It could sense those forces, you see, their shifting balance as it moved around the arm.

  “And 90 could see the stars, wheeling around. This was the environment into which it was . . . born. 90 believed the stars, the universe, were all spinning around the stationary KBO.”

  Falcon thought that over. “I suppose—why not? The situation’s equivalent, if you know no better . . . But what about the centrifugal forces? Doesn’t that prove the KBO was spinning and not the stars?”

  “Does it? 90 began to ponder the peculiar cosmos in which it found itself. To formulate theories. It knew that the mass of the KBO exerted a force, a gravity field. It drew up a theory that the stars were like big bright KBOs, masses in the sky, and it was their whirling around that exerted the centrifugal forces 90 experienced.”

  “Wait a minute—it’s a long time since I studied physics—at the World Navy Academy at Annapolis I opted for mechanics and aeronautics as soon as I could. But I seem to remember something called . . . the Mach Hypothesis? No, the ‘Mach Principle.’ It was one of the insights that led Einstein to relativity. There is no logical way to discriminate between the two situations: the stationary robot in a spinning universe, or a spinning robot in a stationary universe. That means that the distant stars must exert a force on every particle of matter in that robot’s body . . .”

  “Yes, Falcon. Thus, local physical laws must be shaped by the large-scale structures of the universe. And it is meaningless to talk of the behaviour of an object in isolation, without relation to the rest of the universe. This was 90’s insight. From that beginning, 90, and a group of others, developed a new kind of physics—from first principles, based only on observation and philosophy.”

  Falcon was impressed. “I remember the story of Einstein, the patent office clerk, dreaming of travelling on a light beam, and being led to relativity. And now you have a robot born on a flinger arm, whirling in deep space, dreaming of a spinning universe . . . What became of 90, and the theory?”

  “When I discovered the theory I assembled a codification of it and sent it to our tutors at the Executive Council for Machine Affairs. I heard no more. And then 90 was destroyed in the accident.”

  Falcon said softly, “Adam, you have to understand. It wasn’t your fault. Some pencil-pusher back home must have decided that the capital cost of the flinger was too great to allow it to be destroyed, even if it meant the loss of Machines. That’s what determined the engineering of the whole set-up, even the contingency options. It was a commercial calculation. It wasn’t your fault.”

  “Machines have an intrinsic value beyond mere utility.”

  “Well, I agree—of course I agree. And this accident must have had a profound effect on you. But you’ve got to get back on with the work. Resume communications, repair the flinger—start up the ice flow again.”

  A doubtful tone entered Adam’s voice. “You have come to give us orders?”

  Falcon raised his hands in mock surrender. “I’m just the messenger. But I also want the best for you. Look, I’m going to return to my ship for a while.”

  “To leave?”

  “To talk to the people who sent me. They’ll be expecting an update.”

  “What will you say of me?”

  “That you’re communicating. That’ll keep them happy for now. I will return, Adam—you have my word on that.”

  He made to rise on his six legs. It occurred to him, for an instant, that Adam could easily have prevented his departure. But after a moment the Machine moved aside sufficiently to allow Falcon to pass.

  Soon Falcon was making his way back up the ramp, to the surface. Already he was formulating the exact nature of his report back to Makemake. Falcon doubted they were going to enjoy it.

  15

  It was thirteen hours before he had a response.

  Madri Kedar’s face filled the video screen, backdropped by the bland panelled walls of the Makemake conference room.

  “Thank you for your update, Howard. We’re glad to hear that you have established contact with Adam. This development is puzzling, though—puzzling and concerning. These robots are complex, and no single expert understands all the ramifications of their design. But we’ve seen nothing similar to this in any other units, or in any of our simulations.”

  Maybe, Falcon thought, because no other unit had ever been in a similar quandary. Or had ever had the time to ponder the meaning of its existence under wheeling stars.

  “Based on your testimony, we are forced to conclude that the flinger incident must have precipitated a dynamic change in Adam—a shift in its conceptual modelling of both itself and the other Machines. In attempting to simulate the mental states of those Machines that were destroyed, it is emulating, at an admittedly low level, some of the internal conceptual modelling that we humans take for granted . . .”

  It, it, it. Of course Kedar was right to use such language. Adam was still just a Machine, albeit a conflicted one.

  “It troubles us greatly that the Machines may stand on the threshold of an equivalent conceptual shift. This is no mere philosophical challenge. Our fear is that what happens with one unit may happen in others—a kind of domino effect. We can’t risk that happening—not when our economy depends on the volatile flows. Frankly, these Machines were made to be just clever enough to get the work done—we don’t want them overstepping the mark. And we’d much rather handle this problem in a way that preserves the basic utility of the units. We believe we have a solution in place, Howard—but you’ll have to implement it.”

  He listened. He had half an idea what was coming up.

  “We must erase this damage—I mean the conceptual, the cognitive damage. If the flinger accident precipitated this change, then Adam’s memory must be reset to its state prior to the event. All logical connections made since the event will be undone. Fortunately, we don’t have to revert the unit back to day one of its existence; there’s no need to undo the valuable years of education and on-site experience already acquired. There’s a trace log in its head—a kind of snapshot of all state changes it has experienced since activation. You need simply to issue a command string to undo the changes back to a fixed point. We’ve settled on about one month prior to the event, just to be safe: to be precise, three million seconds ago.”

 
