“Is philosophy a hobby of yours, Eric?” Evelyn asked. She had been running her eyes over the shelves of books around the study.
“Oh, I dabble in a bit of everything,” Shipley replied affably.
“So the universe is inductive,” Corrigan concluded.
“Isn’t it obvious?” Shipley said.
“I thought that philosophers have been having a problem with induction for centuries,” Evelyn commented.
Shipley shrugged. “It’s of their own making—as are most of humanity’s problems. They started by assuming that the universe couldn’t work inductively—because they couldn’t reduce it to formal rules—when it obviously does.”
“So we have to teach the simulator how to be inductive,” Corrigan said. “How does real-world logic work, then, Eric?”
“Being ninety percent right, ninety percent of the time,” Shipley replied. “It’s what science, business, war, and evolution are all about.”
“What about sex?” Hatcher asked, looking away from the Colt and taking a swig from his can.
“Oh.” Shipley smiled. “That’s made up of all of the above.”
A thoughtful expression came over Hatcher’s face. “Maybe the way isn’t to try and teach the system how to be inductive at all,” he said. “I mean, if we’re not really sure how we do it ourselves, we’re hardly in a position to spell out the rules, are we?”
“What other way is there?” Corrigan asked.
“Maybe the thing to do is turn it the other way around.”
There was silence for a couple of seconds while the others puzzled over this. “How do you mean?” Shipley asked finally.
“Let it learn in the same way as we do: by observing the behavior of real people in the environments that it creates. With EVIE, we’ve got all the technology you need.” Hatcher paused, then went on, visibly more excited as he warmed to the idea, “Instead of the inhabitants of a world evolving in response to the environment, the environment learns to get better by watching the reactions of the inhabitants. See what I mean—it would be turning nature upside down.”
“Hmm.” Shipley drew back, frowning. “I’ll have to give that some thought. . . . It’s interesting, that, Tom. Very interesting.”
Corrigan had taken down one of the books that Evelyn had noticed and was turning the pages idly. “Epictetus? I’ve heard of him.”
“Greek slave, taken to Rome,” Shipley said, moving over.
“Got freed and became a philosopher,” Corrigan completed.
“He’s the reason why I’ve never been interested in politics or prestige,” Shipley told them.
“Really?” Evelyn said.
Shipley grinned. “Oh, I was kidding. But he does say some interesting things.”
“Such as?” Corrigan asked curiously.
“That you shouldn’t seek happiness through things that other people have control over,” Shipley answered. “Otherwise you end up being enslaved to them.”
That didn’t seem to leave very much, as far as Corrigan could see. “What else is it that you should want, then?” he asked.
“Live for your own values and beliefs: things that nobody can take away,” Shipley answered. “Then nobody can own you.”
The veiled reference to their private conversation earlier would have been enough on its own to goad Corrigan into dissenting, even without his innate Irish argumentativeness. “It sounds like a pretty empty cop-out, if you ask me, Eric,” he opined. “The kind of thinking of somebody who would never try going for anything for fear of losing it. Where’s the challenge and satisfaction in living a life like that?”
“It’s being free, Joe. Fearing nobody. Look at the antics of some of the people we see every day and ask how many of them can say they have that.”
Corrigan shook his head. “You live your way, I’ll live mine. I couldn’t accept a philosophy like that.”
Shipley seemed unperturbed. “Maybe you should go back and get in touch with your roots again, sometime, Joe,” he said. “To Ireland. There’s a tradition there, too, that understands the kind of things I’m talking about.”
“Oh, you don’t buy that load of rot, too, do you, Eric?” Corrigan groaned. “Thieves, rogues, and scoundrels, the lot of ’em. They’d sell their grandmothers for the price of a pint—and then leave you stuck with the tab if you look the other way.”
“I’d still like to go there,” Evelyn said. It was something they had talked about a number of times.
Corrigan looked at her. “Well, maybe it is about time that you and I took a break somewhere.” He raised his eyebrows. Her face split into a smile, and she nodded eagerly. “How about Florida, or maybe Mexico?” he suggested.
“Somewhere a bit sunnier than Pennsylvania in December, anyway,” she said. “It’s no better than Boston.”
Corrigan thought for a few seconds longer. “Then let’s make it California,” he said. “There’s a string of places dabbling in neural stuff on the West Coast that I’ve always been meaning to check out. And there’s an old friend of mine from MIT called Hans Groener who’s doing things at Stanford on sleep and dreams that sound interesting, but I’ve never had a chance to see them.”
“Sure, California’ll do. Why not?” Evelyn said. “I’ve never seen Yosemite.”
“Do it,” Shipley told them. “Everything’s slowing down here for the holidays. And you’ve probably got some leave that you need to take before the year’s out, Joe.”
Why not? Corrigan thought. “I’ll call Hans tomorrow and see what we can do,” he promised.
CHAPTER SIXTEEN
Despite his fatigue and having been up all night, Corrigan did not sleep well. He awoke halfway through the afternoon, still feeling woolly-headed and groggy. All he could remember from his disjointed recollections of the early-morning hours was that Lilly’s place was north of the river, somewhere near the Allegheny Center. He cleaned up and put on some fresh clothes, then fixed himself a snack. Computer-injected hunger signals felt just the same, even if his real body was in repose, getting its nutrients from dermally transfused solutions. After that, he left without turning Horace on again, and caught a bus to the North Side.
But nothing that he saw jogged his memory as he wandered up and down the streets of the district contained in the crook of the I-279 Expressway, north of Three Rivers Stadium. Any of a score of apartment-block entrances that he passed could have been hers. Any of the streets that he walked along could have left the hazy image that was all he could piece together of unremarkable frontages glimpsed in predawn shadows.
It made sense to him now why recent years should have seen so much redevelopment around Pittsburgh. For every part of the old city that was “demolished,” new, simulated scenery could be substituted that would not have to conform to anybody’s real-world experiences. Nobody could walk around inside the Camelot, for example, and be puzzled by not finding things the way they used to be. The “realscaping” task was thus considerably eased.
He wanted to tell Lilly that she had been right, but everything was okay—the experiment was going as it should. Yes, their memories of the actual commencement of Oz had been suppressed, and alternative stories given to mask the transition from the real world to the illusory. But it didn’t follow that something sinister was going on. Some such provision would have been necessary to ensure that the responses of the surrogates—the real-world participants coupled into the simulation—would be natural and valid.
And boy, had that part of the scheme worked as planned! Until Lilly waved the facts in front of his nose, he himself—one of the principal creators of the simulation—had failed to realize that he was inside it. She had thought to question where he had not, because she had known less. He had been involved in the planning of Oz. Hence, if any deception were intended, he would have known about it. Since he didn’t, there couldn’t be any; and once the impossibility was established in his mind, there was no place for the possibility to coexist. The irony was that it had been able to work in his case only because of his knowledge that it couldn’t work.
The main cause of Lilly’s distress and anger was not so much the deception—she was a military volunteer, and things like that happened and could be compensated for—but the twelve years that she saw as stolen from her life. And who could blame her for that? But what he knew, and she almost certainly would not, was that those twelve years were also an illusion. The system coupled directly into post-sensory brain centers, which enabled data to be coded in a prereduced, highly compressed form that eliminated delays associated with preprocessing in the perceptual system. This meant that time inside the simulation ran about two hundred times faster than real time in the world outside. Hence, the actual time that they had spent hooked into the virtual world would be closer to three weeks than the twelve years that they remembered subjectively. Although even that was longer than the durations projected for the test runs that Corrigan had expected to be taking part in, it wasn’t outrageous. They were all scientists and volunteers, after all. They would have had little problem agreeing to something like that.
He hoped that if he could find her and reassure her of at least that much, she would see things in a different light and be less likely to start doing anything rash that might disrupt the experiment. There was no reason for the test conditions to be affected by the mere fact of their knowing what they knew now, as long as they continued to act as if nothing had changed. The system could only monitor external behavior: what a surrogate did and said. Since nobody possessed the knowledge to tell it how, the system was not able to decode inner thought processes from deep inside the brain and read minds. If it could, there would have been no need for Project Oz in the first place.
The whole idea had been that the system would learn to make its animations more lifelike by imitating the behavior of real people injected as surrogate selves into the simulation. It had no way of knowing why the surrogates that it watched behaved in the ways they did—any more than they frequently did themselves. At the end of the experiment nobody would know, let alone have been able to specify beforehand, the precise configuration of software structures and linkages that had self-organized to make such mimicking possible. The neural structures responsible for the complexity of human behavior in the real world had evolved by principles that were appropriate to carbon chemistry. Trying to duplicate them in code would have been as misguided as building airplanes that flapped feathers. Oz was designed to build, in ways appropriate to software, whatever structures it needed to achieve similar results. Nobody needed to know exactly what the final structures were, or how they worked. The aim was to achieve directed coevolution: the end-product, not the mechanism for attaining it, was the important thing.
That had been the theory, anyway. Whether it would work was what Oz had been set up to test. And from the bizarre goings-on in the world around him, Corrigan’s first conclusion had to be that as far as its prime goal was concerned, the project had wandered somewhat off the rails. For, far from modeling themselves on the surrogates, the system animations seemed to be going off into self-reinforcing behavior patterns of their own, while—if his own and Lilly’s cases were anything to go by—the surrogates had become misfits. That in itself didn’t trouble him unduly. This was research, after all; perfection could hardly be expected from a first-time run—and especially in an undertaking as unprecedented and as ambitious as this.
Hence, it was no surprise that the animations fell short of true human emulation in some respects. What was astounding was that they came so close. The empty stares and “flatness” were minor flaws compared to the extraordinary degree of realism—even if it did tend somewhat toward the eccentric—with which the personas that he encountered daily were able to act out their affairs and sustain the continuity of leading consistent background existences offstage. So what if the system had overstepped the boundary into neurosis when it tried to make Jonathan Wilbur an embodiment of human criteria for personal success and failure; or if Maurice at the Camelot couldn’t master a value system that didn’t reduce to a simple profit-and-loss calculus? They had fooled Corrigan. It was sobering to realize just how effective the combined weight of suggestion and authority had been in persuading him that the defects he had perceived in the early stages were in himself and not in the world around him. Now so much seemed so obvious.
The universal ineptness at fathoming humor and metaphor that he had observed for years—processes that involved the associative genius of human intellect at its subtlest—should have given the game away. And if not that, then surely the curious and unnatural persistence of people like Sherri and Sarah Bewley when they pressed him for explanations of where they had missed the point. Of course. All the time it was the system—wanting to know how it could do better. Mind reading was not an option.
In the real world, when people acted strangely or unsociably, others tried to gain some insight into why by getting them to talk. In the same way, when the system sought a deeper understanding of what motivated the surrogates, it put animations around them to ask its questions for it. And the closer the relationship, the more personal—and hence relevant to the purpose—their questions could naturally be. Maurice, his boss at work; Sarah, his rehabilitation counselor (and the earlier attempt in the form of Simon, which had failed)—both were examples of the system trying to get close, wanting to discover what made him behave as he did. No wonder he sometimes found himself reflecting that Horace, Sarah, and Maurice sounded the same. They were the same. His house manager was the system in disguise, too.
And even his wife! For hadn’t it been Sarah who first came up with the suggestion that marrying Muriel might be a good move—for “therapeutic” reasons? Weird but frighteningly effective, he had to concede. Acting through one manufactured personality, the system had insinuated itself into his personal world in the form of another. Corrigan could only marvel at the ingenuity of it. Already the project was surpassing anything he had imagined, even in his wildest moments of selling it to others.
And now there was the risk that just at this crucial stage when Oz was surpassing all expectations, Lilly, unless he could get to her, might jeopardize the whole thing. But he was not going to get to her this way, he admitted to himself finally. Until he figured out another way, or until she got to him again, the thing was to carry on acting normally. That meant going in to work today, just as if nothing had happened. He crossed back over the river to Downtown and decided that it wasn’t worth going out to Oakland. On the other hand, it was too early to go straight to the Camelot just yet.
There was a bar not far from the Vista—the hotel where Evelyn had stayed when she came down from Boston for her job interview—that he used to frequent a lot during his time with CLC, but which he hadn’t been in since his “breakdown.” He knew every scratch on the countertop in that place, the prints and curios on the walls, all the scuff marks and stains in the pattern on the wallpaper. There was no way that the realscaping crews could have covered every place in town.
Out of curiosity, he made his way there. The street had acquired its share of changes over the years, but apart from a new door and a coat of paint, the bar still looked pretty much the same—outside. But that was the easy part. He went inside. . . .
And sure enough, it had all been remodeled. New counter, new walls, new everything. Corrigan sighed and ordered a Bushmills, straight up. It could have been his imagination, but he was sure he detected a hint of a knowing smirk on the face of the pudgy, balding bartender.
“Okay, you got me,” Corrigan conceded, raising his glass.
“Ah . . . pardon, sir?”
“It doesn’t matter.”
There was a pay phone in an alcove by the cigarette machine. Corrigan changed a twenty into quarters and sauntered over. He set his drink on top of the phone and tapped in the number for Information International. The codes were all different from the ones used in the real world—the change had been explained as necessitated by procedural changes at the phone companies. Corrigan couldn’t remember offhand why the Oz designers had done that.
“How can I help you?” a voice inquired.
“I’d like a number in Ireland, please.”
There was a confused pause. Corrigan smiled at the thought of the drastic axing and reassembly of a whole section of the system’s pointer tree that those few simple words would have caused.
“Why do you want a number there?” the voice demanded, sounding belligerent. Oh, yes, Corrigan thought to himself, it all seemed so obvious now.
“What the hell does it matter why I want it?” he retorted. “Would you please just do your job.”
Another pause, then a different voice, a woman’s, with a believable brogue: “Directory, which town, please?”
“Dun Laoghaire.”
“Yes. And who would you be wanting there?”
“There’s a grocer’s shop on the corner of Clarinda Park and Upper George’s Street, called Ansell’s, that stays open late. What’s their number?” It would be approaching ten P.M. in Ireland—five hours ahead of Pittsburgh time.
A long pause. “Ah, I don’t seem to have them listed anywhere. Are you sure it’s still there? It might have changed.”
“How about the New Delhi? It’s an Indian restaurant along the street.”
“No. I don’t have that either.”
Corrigan grinned. The system was throwing every obstacle at him that it could come up with. “Then tell me the number of the Kingston Hotel at the bottom of Adelaide Street.”
“A hotel is it, you said?” The system was trapped. Corrigan could sense it, there in the voice.