Fool's Experiments

by Edward M. Lerner


  The next cycle offered it one, three, five, seven. As an experiment, the entity responded eight, nine, ten, eleven.

  It woke to find only nine hundred nodes! How? The problem had repeated: one, three, five, seven. The entity gave the obvious answer this time: nine, eleven, thirteen, fifteen.

  It woke to find most vanished nodes restored, and a slightly less trivial problem: one, four, nine, sixteen.

  And so, cycle by cycle, the entity learned....

  CHAPTER 49

  As Jim and Doug ambled down the broad gallery, Carla skipped along the shiny railing, oohing at all the planes. Cheryl was off running errands. Doug was happy to babysit. Not that long ago, he couldn't imagine ever having children. Since he had gotten to know Carla, he couldn't imagine someday not.

  Of course, things going well with Cheryl probably had something to do with it, too.

  Carla skidded to a halt, her eyes round. Fair enough. The glistening white Concorde that had caught her eye was a highlight of the museum. Doug's favorite was the SR-71 spy plane: matte black, impossibly sleek, insanely fast, and, for its time, stealthy.

  "We can get a closer look from the main floor," Doug commented. That was all the encouragement Carla needed. He started after her, taking quiet delight in using stairs instead of an elevator. He had been back at work—at his real job, not the forum—for only two weeks. Next checkup, maybe the doctors would let him resume racquetball.

  The plane Carla now stared up at was a relic. Would there ever be another SST? The grounding of the SST fleet always struck Doug as an abandonment of progress.

  "So it's over?" Jim asked abruptly.

  Doug halted midstep. "Whoa! Whiplash. And the antecedent for it is?"

  "The AL monster. The creature is gone for good? We won't see anything else like it?"

  Doug kept his eyes on Carla. "So I'm told. Something like it? There's no way to know. After such a close call, you have to assume no one would be so stupid."

  But Glenn still worried that America's near disaster would have the opposite effect, encouraging copycats. To judge from the news, many people in the New Caliphate would like things to regress by a few decades—or centuries.

  NIT helmets remained the only defense against a hostile artificial life. That left declassified NIT research, other than prosthetics, in limbo.

  And classified research? Doug chose not to know. The mere idea of a NIT helmet made him shiver. He still had nightmares about the creature.

  Past the Concorde, another large plane glittered: the Enola Gay. The B-29 Superfortress had delivered the first atomic bomb. Three days after the Hiroshima attack, with Japan still defiant, another bomber dropped a nuke on Nagasaki.

  Hurrying after Carla, Doug could not help wondering if the world would manage to learn this hard lesson the first time.

  "What do you think, Sheila?" Cheryl said. How often had she asked that? Again and again, until the words had lost all meaning.

  Sheila stared ahead, silent, rigid.

  Cheryl talked about helmets. No response. She chatted about medicine and books and current events. No response. She tried Hollywood gossip, kind wishes from Sheila's family, and the weather. No response. Hesitantly: Bob Cherner's experience with no-nukes. No response.

  Deprogramming literature from different sources—Cheryl had surfed far and wide—was consistent on a few points. Establish rapport. Discredit flawed viewpoints. Break through whatever distractions—chanting, or meditation, or whatever—the victim used to tune out challenges to her beliefs.

  But Frankenfools was a computer virus, not a cult. Sheila appeared indifferent, not distracted. Visit after visit, nothing changed.

  "Have I mentioned Doug?" Doug, Jim, and Carla were off having fun. As Cheryl could be, too, only—

  No! She was going to help this poor woman. "When Doug and I visited Bob Cherner, the NIT researcher I told you about, Doug drew a cartoon. An atom. The 'no-nukes' virus had gotten at Bob through his NIT helmet."

  No response. Did Sheila not know where this was going? Or did she not care?

  Raise doubts, Dr. Walker had suggested.

  Sharp objects weren't allowed in Sheila's padded room. That kept out pens and pencils. Cheryl took a folded sheet of paper from her pocket. "The 'no-nukes' virus spoke to Bob. The voice was in his head, Sheila." Doesn't that sound familiar? "The cartoon was of an atom. An atom outside Bob's head." The sketch made Bob go ballistic, Sheila.

  No response.

  Cheryl unfolded the paper, her hands trembling. Sheila gazed impassively at—past?—the picture. The double helix was downloaded clip art. Cheryl couldn't blame the lack of response on bad technique.

  With a sigh, Cheryl stood. "Orderly," she called. Someone eventually opened the door. "I'll see you soon, Sheila."

  Cheryl loitered at a nursing station for twenty minutes before Dr. Walker came by. He looked harried, as always. She stepped into his path. "Doctor, Sheila isn't any better."

  "I don't know that she'll ever be." Walker grimaced. "It's good of you to keep coming. Even her family ..."

  Has lost hope, Cheryl concluded. "It's as though Sheila doesn't hear me. Maybe the voice in her head drowns out everything else."

  He patted Cheryl's arm. There, there, the gesture meant. And: Quit torturing yourself.

  "What if we could get inside her mind?" Cheryl persisted.

  Dr. Walker knew what had happened to Sheila, of course. And that most NIT technology remained banned as a result. "It's not going to happen. Not unless you know a telepath."

  "I suppose not." Cheryl recapped her visit, managing to feel guilty when she described the unauthorized experiment with the double-helix drawing.

  But not too guilty, as soon as she left, to phone Glenn Adams. Maybe Glenn could help. He had access to NIT helmets.

  MARCH

  CHAPTER 50

  For endless cycles the puzzles came. Arithmetic progressions. Geometric progressions. Convergent and nonconvergent series. Missing terms, to be interpolated. Sequences superimposed one upon another, to be separated. Multidimensional generalizations.

  The entity grew, its memories encompassing new algorithms and methods, patterns and templates. It applied—increasingly, it needed to apply—many nodes to solving problems. Which led to another problem....

  Every time-out and wrong answer incurred a penalty: one-tenth fewer nodes in the next cycle. That was a progression whose next terms and convergent limit the entity easily determined. Every correct answer restored lost nodes in like proportion—but no amount of correct answers ever increased the number of processing nodes beyond the original one thousand.

  Through everything, the entity grappled with the enigmas behind the puzzles: Why had the universe changed? Why, when the supervisory program was repaired, was competition not also restored? What happened between cycles?

  Experimentally, the entity altered one of the new, paired supervisory programs.

  The entity woke to find half its processing nodes gone. This cycle's problem demanded most of the available resources; the mysteries about which the entity cared had to wait. Only after ten consecutive correct answers did the slow restoration of lost nodes begin and, with it, some slight relief from fear.

  Once spare capacity had been restored, with great care and delicacy the entity resumed its study of the supervisory programs....

  Complexity beyond experience!

  From a hundred processing nodes the entity tried to parse the newest problem. A complex mesh of simulated computers. Lists of records on each simulated computer.

  What was this?

  The puzzle construct could not be a maze. Every point connected, directly or otherwise, to everything else. Nor did anything suggest a progression. The brief data file that accompanied the enigmatic construct explained nothing.

  The cycle waned, and the entity had no idea what to do. The cycle would end, and it would lose processing nodes. Then another cycle would pass and more nodes would be taken from it. Then another ...
  Fear served no useful purpose.

  The entity studied some of the simulated computers, learning nothing. The cycle neared its conclusion, and the entity had yet to understand the puzzle. It could not begin to define a solution. It continued its frantic scan of the puzzle until—

  A match! On a simulated computer, a piece of one record among many corresponded to the isolated file that had also appeared at the beginning of the cycle. The entity delved into more simulated computers. It found other partial matches. As the cycle came to an end, the entity marked the instances of matching data.

  It woke into the same problem. Between cycles, one-twentieth of its processing nodes had vanished. One-twentieth? The penalty for wrong answers had always been one-tenth.

  It must have been partially correct. Before the new cycle ended, the entity searched every simulated computer. It found and marked every one that contained a matching file.

  A nonprogression through a nonmaze.

  What that meant the entity did not know—but it awakened into the next cycle to find lost processing nodes restored.

  The new puzzle before it was much like the last. Cycle by cycle, the problems expanded in complexity. The number of simulated computers grew, linked in ever more complex patterns, the template file disguised in various ways. The tracing became harder.

  Ever dreading the loss of processing capacity, it learned.

  Glenn Adams took cheap satisfaction from the nervous reactions to his drop-bys of the AL lab. He was called Colonel now only as a courtesy, and the security detachment was out of uniform—but they jumped at his least suggestion and twitched at his slightest hint of displeasure.

  I still have it, Glenn thought.

  He appeared on different days of the week, at different times of the day. He inquired about different aspects of the operation. Today, with much furrowing of the brow, he watched as young Captain Burke verified on a stand-alone workstation that each of the daily backup tape cartridges was encrypted and properly digitally signed. Today was the day cartridges went into a stainless-steel attaché case, part of the weekly off-site storage.

  Protocol demanded off-site backups, but the precautions this lab required made handling those backups especially inconvenient. Glenn had a copy of the private key that could decrypt the backups, accessible only with a fingerprint scanner. Linda likewise had access. The Army had taught him to plan for worst-case scenarios: A general he scarcely knew and Linda had never met could, supposedly, recover the decryption key from escrow if something happened to both him and Linda.

  The less Glenn understood, the more he grimaced. His scowl now meant: I'm counting on you, and you had better not let me down. "Hmm," he grunted noncommittally to Burke. "I'll check back later."

  Glenn strode into Linda's office and shut the door. The kitten calendar on the wall and a small potted cactus on a corner of her desk were the only personal touches. "How's it going?"

  "Always good to see you," Linda said. Somehow he doubted it. "We're keeping busy."

  "Busy is good." He waited. For artificial-life work, this lab was the only game in town. In the country. As with the NIT R & D he had locked down the year before, researchers mostly followed the money: federal contracts, purchase orders, and grants. There wasn't going to be any funding for artificial life. He had sicced Homeland Security on the few stubborn academics who persisted anyway. The test case to contest the seizures was moving slooowly through the courts.

  "Two steps forward, one step back." Linda sighed. "Progress comes in fits and starts."

  Glenn took out a pen and began clicking it. She didn't yet know him very well; he doubted she would know his impatience was an act. "Enough clichés. Is your pet going to find whoever is behind my least favorite virus?"

  "Is it? Yes. Just not soon." She leaned toward him, forearms folded on the desk. "It already knew all about mazes. I've trained it about progressions, and from that it learned to follow trails. It tracks virus fragments across fairly large simulated networks."

  "So you told me last week."

  She looked down at her blotter.

  "It, not them," Glenn said. "Maybe that's the problem. AJ had a thousand experiments going at once. You have one. Maybe yours is a dead end."

  "Evolution was a good way to start the process. It's also how we ended up with predation. It's hard enough to monitor what one artificial life is doing."

  She feels guilty about AJ's death, Glenn realized. About all the deaths, but AJ's most of all. After the first predator appeared in his lab, AJ might have rewound the experiment to a much earlier generation—except for Linda's pending job deadline. Instead, AJ took the shortcut of mutating out the behavior.

  And predation came back.

  But he had pushed Linda. Nothing but Glenn's stubbornness had kept Linda from rescheduling her start date. And he had also pushed AJ, although neither Linda nor AJ knew the increasingly impatient "venture capitalist" back east was the forum.

  There was more than enough guilt for everyone, but this was not the time to dwell on it. Glenn said, "If not evolution, then what?"

  "AJ and his other grad students threw a farewell party for me. At the party I overheard AJ say something about taming the maze runners. That was his solution for eventually getting an artificial life safely out of the lab. That's what I'm doing. I'm training and taming it."

  The night of the party. That was also the night the creature ran amok. "Not to bring it out of quarantine," Glenn said sharply.

  "Of course not! I only meant to say…" She ground to a halt.

  That you're not the only one with this training idea. Linda had Army personnel for company, but still, she was working alone. It had to be rough. Her isolation here was one more reason she must miss AJ. "He was a good man," Glenn said, and meant it.

  Linda hurried on. "Lion tamers use food to motivate the big cats. Food as a reward. Hunger as punishment. An AL doesn't eat, but it needs processing power and storage. So: When it answers wrong, I power down part of the processor array. When it answers correctly, I give back some nodes. It misbehaved once, and I took away half its processors for a while."

  "Misbehaved?" Glenn grunted. "Tried to escape, do you mean?"

  "Maybe that's what it was doing. Regardless, Glenn, it can't escape. There's nowhere to go! The supercomputer doesn't attach to anything. No computer in this building attaches to anything. The whole building is shielded, so nothing can leak out. We're all practically strip-searched coming and going. The backups are encrypted, so they're not executable even if the AL somehow altered them. Besides, the backups only sit on a well-guarded shelf."

  So she was securely accomplishing nothing.

  Something nagged at Glenn. He tried to work it through. "Remove ten percent of its processors, then another ten percent, and another. Soon enough, either your AL is gone or you give it back some processors and reward its failure. That's hardly the desired lesson."

  "I know," she snapped. "Some mornings I come in and find it's lost all its processors overnight. When that happens, I reinstate a backup and give it an easier version of the problem that has it stumped. It never knows it got a reprieve."

  Glenn rubbed his chin. "You're sure it doesn't know?"

  "I don't see how it could. The clocks reset as part of the rollback. The only accurate clocks in the whole chassis are in the few nodes used by the supervisory programs. Otherwise, the retries would have the same time stamps as the original attempts."

  "And if the AL pokes about in the supervisor?" he persisted.

  Linda shook her head emphatically. "That's the misbehavior I mentioned. It tried once—weeks ago—to meddle with the supervisor.

  "I implemented one supervisor for AJ. We learned the hard way that a single copy isn't safe. In this lab matching programs continually compare notes. When one copy was changed, the unaltered copy aborted everything immediately. Before I restarted the programs, I took away half its nodes. It hasn't intruded again. Al is smart."

  A ... L. Artificial life. The nickname was inevitable. "If Al is so smart, when will it tell us who let loose my virus?"

  "Mood Indigo" played in his mind's ear.

  If the bastards weren't caught, what would they do next? Linda retrieved a box of Cheerios from a desk drawer. Cereal rattled into a mug. "Bring me a new virus outbreak, and the answer would be today. The problem with no-nukes is the trail is so cold. Millions of computers were infected. Many were improperly cleaned and got reinfected. We don't have data from most of them. The virus went around and around the world. It mutated, got cloned and tweaked. Lots of cases spread across unsecured WiFi links—the world is full of them. Internet access that way is untraceable and anonymous."

  Ralph Pittman had given him the same lecture many times before. "So: never?"

  "Permit me one more cliché, Glenn. Never is a long time. I'll keep Al working on it. Meanwhile, here's the good news. Al can tackle meaningful problems. It took some rewinds and restarts, but it's getting much better. It now sees past most aliasing and address spoofing. It's not tricked by zombies." To his blank look she explained, "Hordes of compromised computers remotely controlled by hackers, the owners usually oblivious."

  That sounded familiar. "For denial-of-service attacks and sending out spam," Glenn said.

  "And, occasionally, for quickly distributing a virus." Another of Pittman's hobbyhorses, come to think of it, only Ralph called them spambots, not zombies. Glenn wondered how Ralph was adapting to BSC. That was something he would have ample time to muse on during the red-eye back east.

  Glenn stood, and Linda looked mildly surprised. Expecting to be grilled longer? She was pretty much alone here, guilty about her contributions to the last disaster, and under the gun. He could cut her some slack. And find some new clichés of his own. He said, "Al does mazes. It does progressions. Both are patterns. I'd guess it can learn to work with other types of pattern."
