Harbinger (The Janus Harbinger Book 1)

by Olan Thorensen


  “But knowing something is possible doesn’t tell you how long it will take,” Harold said in a discouraging tone.

  “How long is a separate issue from knowing it exists,” said Sinclair. The others turned to look at the general. Philosophical utterances from him were not expected. “Just knowing that a given technology is inevitable will alter the course of that and other technologies, not to mention fallout in social patterns, religion, and every other aspect of human civilization.”

  Mueller restarted the recording. “. . . but the limitations and the timeline will almost certainly be a disappointment.”

  “Then how is this going to be significantly different from before? Not giving us information immediately may be considered no different from getting information too slowly,” said Mueller.

  “That is something only you can decide. I am limited to what I can tell you. If this is not satisfactory, nothing else can be done.”

  Mueller stopped the recording. “As you can see, while we did get an indication of more information to be forthcoming, we had no idea of the effort it would take or the time.”

  “On the other hand,” said Harold, “what choice did you have?”

  “None, not really,” replied Mueller. “We played about the only card we had—threatening not to communicate.”

  “We can skip forward now to some later recordings, but let’s stay with what you’ve heard so far,” said Huxler. “Suffice it to say that while it was a frustrating process at times, we did establish more of a dialogue. But the next recording is a collage of exchanges over the next couple of months.”

  “One thing to be aware of,” said Mueller. “We believe Simeon was learning how to carry on a conversation. By this, I mean not only asking and answering questions, but starting to learn to anticipate questions, how to interpret subtleties of questions and answers, and, perhaps most important, beginning to interact with various staff members in slightly different ways.”

  “Different how?” asked Andrew.

  “Let me give you one clear-cut example,” said Huxler. “When Simeon talks with Freddie, they talk almost entirely mathematics. With Jeff or Rachel, Simeon questions them about the meaning of words and language structure.”

  Zach pursed his lips for a moment. “By languages, you mean in general or English specifically?”

  “Interesting point,” said Mueller. “Once we asked whether he was interested in other languages besides English. Simeon said the information that other languages existed was sufficient for the moment—with an implication he would be interested in learning other languages later.”

  “Later?” said Harold. “Like, what later time or why would he be interested then?”

  Howard smiled. “Two questions that help illustrate some of our current frustration. On the question of when Simeon might be interested in other languages, the answer was always later. There’s been some discussion that perhaps Simeon’s time sense is different than ours, or maybe he doesn’t care about specific time intervals.”

  “Or he doesn’t want to tell you the answer,” said Zach.

  “Correct. Which leads to the second issue: Simeon won’t always give answers. In this case, what would make him interested in other languages? This and other questions, he commonly answers by saying he doesn’t know.”

  “Doesn’t know when or why he will want to use other languages?” said Jason, rubbing the back of his neck with his left hand. “Doesn’t this imply he has an agenda, a schedule, or some predetermined criteria for when certain things will come to pass? In this case, he will know when the time is right for him to learn other languages, based on events, or whatever, that lead to a decision.”

  “Our conclusion as well,” agreed Huxler. “Which means, as you say, that there is something of a schedule. But then, if there’s a schedule for what he wants to learn, might there be a schedule for when he tells us more?”

  “What if Simeon is lying?” asked Jill.

  Huxler shook his head. “We can never totally rule that out, but so far we haven’t been able to catch Simeon in a lie.” Huxler smiled ruefully. “Of course, that may only mean he’s cleverer than we are—not an impossibility.”

  Zach seemed unsatisfied. He shook his head slightly. “Have you considered another possibility? What if he actually does not know? You’re assuming Simeon, or whatever the Object is, is fully conscious of a purpose. You’re also assuming it’s a single entity or whatever.”

  Seconds of silence followed—not the silence of inactivity, but the silence of so much mental activity as to preclude anything else. Finally, Huxler spoke first. “So . . . you’re suggesting what? There’s more than one part of the Object, or we’re in communication with only one of a larger pool of . . . what?”

  “How would I know?” Zach said. “It occurs to me to wonder who named him Simeon? Not a common name. Why not Bill or Einstein or Joe-Bob?”

  “Simeon chose the name from a list of first names he requested,” said Mueller.

  “Did he say why he chose Simeon?” asked Ralph.

  Huxler frowned, wrinkling his mouth to one side as he considered the question. “It was another of those cases where we got no clear answer. But I do remember he asked for the meaning of the names. We told him names tended to have origins, but most meanings were lost, and names are now simply identifiers.”

  Mueller turned to a keyboard and a monitor to his left, keying in and speaking as if to himself. “Yes, but I remember we sent him a list of names with their origins. Let’s see . . . we have the file here somewhere,” he mused, scrolling through directories and files. “Here we go . . . the file First Names and Derivations.”

  “Simeon,” Mueller read off the screen. “The son of Jacob and Leah in the Old Testament. Founder of one of the twelve tribes of Israel. Derivation possibly from the Hebrew phrase shama on, meaning ‘he has heard.’ Similar roots to names and phrases translated as ‘God has heard’ and ‘listens to the words of God.’”

  This started an eruption from the entire group.

  Sinclair quieted the group down and was about to suggest discussing this more later, when Zach broke in again, speaking softly. “Think of another possible source of the name ‘Simeon.’ Not the ancient origin. Think of a single word sounding similar. Something several of you have had intimate contact with.”

  Andrew’s eyes widened. He whispered something, mainly to himself, then louder for all to hear and with a wondering tone, “Simulation.”

  Ralph jumped on it first. “Holy fucking shit! He’s a simulation?”

  “But, of course, he’s a simulation,” asserted Mueller with a puzzled, almost irritated tone. “He’s the visual construct we see on the screen. Obviously, whatever the Object is, alien, AI, whatever, it’s not a middle-aged man with thinning light brown hair and brown eyes.”

  Huxler became thoughtful. “We’ve been assuming the image we see is the Object’s way of giving us something similar to ourselves, at least visually, to put us more at ease.” He paused for a moment and looked at Zach. “But you have something else in mind?”

  “What if Simeon is an intermediary? A projection the Object believes will make us more comfortable. In that case, the Object could either be honestly trying to communicate or deliberately hiding its true purpose.”

  “Wait a minute,” said Jill. “Why hide anything?”

  Huxler responded before Zach could. “Zach’s right. It could be hiding its real agenda for reasons that could be benign or malevolent. Remember, this thing is alien. We have to be cautious about humanizing it. It’s possible we may never understand its purpose, even if it tried to explain it to us—the differences may be a chasm where we can’t even see the other side.”

  Mueller was not convinced, but he was considering Huxler’s words when Jill spoke. “Have you asked whether it lies or knows its purpose?”

  The physicist reacted, annoyed. “Well, of course, we’ve asked what its purpose is. As for lying, we’ve looked to catch it leading us astray or outright lying but haven’t detected anything yet . . . as I said just a few minutes ago.”

  “That’s not what I meant. Not whether you think it’s lying, but whether you asked if it could lie. And not whether it would reveal a purpose, but does it know what its purpose is?”

  The reformulations of the two questions stopped Mueller as he considered them. Huxler turned and paced along the room’s wall, hands clasped behind his back, then turned and said admiringly, “Jill’s right. Perhaps we’ve not asked the right questions.”

  “Ask him,” ordered Sinclair.

  Mueller turned back to the waiting head on the monitor and reconnected the sound. “Simeon, we have a couple of questions we would like to ask.”

  “But, of course, Howard, I’m always happy to talk with you.”

  “Yes, but talking is not the same as answering. Do you understand the difference?”

  Simeon’s face lost its expression for a second, as if either retrieving or constructing an answer. “Talking is a form of audio communication between or among individuals or groups. The communication method is usually through a spoken language, but other forms of communication can qualify as talking. Answering is in response to questions either communicated directly or implied. Individual A could ask a question of individual B. If B responds with information satisfying A, then B is said to have answered.” Simeon’s face lost the blank look. “Is that a correct understanding?”

  “Yes, Simeon, that is correct. Do you understand what a lie is?”

  This time, Simeon answered immediately, “A lie is to respond to a question with an incorrect answer.”

  Huxler made a cutoff motion with his left hand hanging at his side. Mueller saw the motion and switched off the microphone.

  “Jill has me thinking skeptically. His answer could be a way to avoid answering without directly lying.”

  “Push him,” said Sinclair. “His answer was true but not complete.”

  Mueller turned back to Simeon, “Your answer was not complete. The answer could be incorrect, but the person giving the answer could believe it to be correct. Therefore, it would not be a lie. A lie is when a deliberately incorrect answer is given. Do you understand?”

  “Yes, Howard, I understand.”

  “Very good, Simeon, but here is one more aspect of lying. If an individual gives information it knows would be an incorrect answer if a specific question were asked, then that also is a lie.”

  “Nice nuance,” whispered Sinclair.

  “Simeon . . . can you lie?”

  “Why are you asking, Howard?”

  “We can discuss that later, Simeon. For the moment, please answer my question. Can you lie?”

  “Knowing the reason for the question would help me understand humans and formulate an accurate answer.”

  “Obfuscating,” murmured Huxler.

  Mueller continued. “Please understand, Simeon. To humans, avoiding a question can be a tactic to conceal an answer that is considered negative. In this case, we are now starting to wonder whether you are avoiding the question of whether or not you can lie because you can lie.”

  All of them were figuratively holding their breath, waiting and not knowing exactly for what.

  “I have never lied to you, Howard.”

  “That is good, Simeon, but the question is—can you lie?”

  “Not at this time.”

  “Not at this time?” interjected a surprised Huxler. “Then at some future time, you might lie to us?”

  “I do not know the answer because the future is beyond current knowledge. I can only say that I have not lied to you up to this moment and that I am unable to lie at this time.”

  “And if at some future time you become able to lie, would you tell us that has changed?”

  Simeon smiled. “And what is the problem with your question, Wilbur? Howard?”

  Huxler laughed. “You’re right. The logic is that if you can lie, then you may already have lied to us when you say you cannot lie. If you cannot lie now but might in the future be able to, then at that time you could lie by saying you still could not lie.”

  Sinclair gave Huxler a “finish this” motion with both hands.

  “Thank you, Simeon, this has been an interesting conversation. We will discuss it more in the future.”

  “I am always happy to talk, Wilbur, to you, Howard, and the others at any time.”

  Howard turned off the mic and camera, and all of the members looked at one another.

  “Well,” said Sinclair, “that was interesting.”

  “Interesting in the good or bad sense?” teased Huxler.

  “Time will tell.”

  Why VR?

  After a break, they reconvened in the Level 3 conference room. Jason spoke first.

  “Another obvious question is why exactly are we here? Why do you need our virtual reality system? You obviously can communicate quite well with Simeon, or whoever, without the complexities of the system we developed at VR.”

  “We need your system because Simeon insists,” said Mueller.

  Sinclair snorted. “Insists is the operative word. Simeon claims it’s essential if we want more progress.”

  “How can Simeon insist?” puzzled Harold.

  “By refusing to give answers or communicate at all,” Huxler said in resignation. “Shortly after we started the regular exchanges, he stated that development of more complex and detailed information exchanges would require an advanced VR system. Thus, your project.”

  Ralph now was frowning. “But we developed our system completely on our own. The Object, or Simeon, whatever, must have knowledge of technology that would make developing whatever VR system it claims to require a trivial exercise.”

  “Yet you didn’t get any help or clues, much less blueprints, on developing such a system,” said Mueller. “We’ve thought long and hard about why you needed to go it alone.”

  “You’ve put your finger on a critical issue,” responded Huxler. “Simeon has made it quite clear that any technology transfer, at least for the foreseeable future, will be tightly controlled by him. Some of us believe Simeon is afraid of the consequences if he transfers too much technology too soon.”

  “Or maybe at all,” added Sinclair. “Bottom line is, we don’t have any idea of Simeon’s true agenda. Is he benign, hostile, neutral, or something beyond our understanding? We only know what he tells us and what we can read between the lines. I might add, we hope Simeon doesn’t understand humans so well he fools us into thinking we understand him.”

  “I think this gives me a headache,” said Jill.

  The nervous tension broke with a brief laugh from most of the group.

  “You know, this may actually be the most important part of this whole unbelievable situation,” said Jason. “Keeping our entire civilization from collapsing by discovering too much technology too soon.”

  Mueller brought them back to the original question. “And so—why do we need the VR system, and why a system so complex? It took us a while to get a two-part answer from Simeon. According to him, he needs to understand humans in as great a depth as possible, and interacting in the most realistic setting is necessary for optimal results.

  “Direct electronic exchanges and video over monitors are insufficient—he says. Whatever the reason, to get more information out of him, it will be on his timetable and as he develops his understanding of humans. It relates directly to what we were just discussing—technology transfer. Simeon says that the VR system has to be completely developed by us without technology transfer.”

  “So . . . we’ll get to ‘talk’ with him inside the VR?” asked Jason.

  “Eventually,” Huxler answered, “though not right away and not necessarily all of you. We’re still working out how we’ll proceed and integrate the VR system once it’s operational.”

  Ultimate Step

  When the meeting ended, Sinclair asked Andrew and Zach to meet him in his office after dinner that evening. They sat at a small round table. Zach noted that no one else was present, including Huxler, which made him suspect the topics would be things Sinclair did not want Huxler participating in.

  “Let’s get right down to it,” said a grim Sinclair. “We’re in such unknown territory with the Object that EVERY scenario has to be considered.” The emphasis on “every” was not lost on either Andrew or Zach.

  “You mean what if the Object turns out to be hostile?” offered Andrew.

  “I mean what if it turns out to be a threat to the very existence of the human race?”

  It was Zach who cut to the chase. “So, what exactly are our roles if this threat materializes?”

  Sinclair sat back in his chair, icy blue eyes darting back and forth between them. “We see two types of scenarios where extreme action may be needed.”

  Zach didn’t like the sound of “extreme action”—one of the euphemisms people used without literally saying what they hesitated to say aloud.

  “The easiest is internal,” said Sinclair, “although not easy in the sense of how to respond. And by internal, I mean of human origin. We cannot lose control of the site and any potential flow of knowledge from the Object. The discussion we had yesterday with the others passed over this more quickly than was realistic. We won’t go into it now, but there may come a time when specific pieces of knowledge absolutely cannot get out. Depending on the circumstances, no action is off the table, and I mean no action. At least as far as I understand, past presidents have agreed with this, although action may have to be taken whether or not the president agrees.”

  “What kind of action are we talking about?” asked Andrew, frowning skeptically.

  “Let’s do what the scientists call a ‘thought experiment,’ where you assume a condition and consider the consequences. What if you knew that information on how to easily create a deadly plague in a basement existed only in the mind of the Iranian grand ayatollah in Qom? You further assume he is about to send this information to every terrorist group in the world. What would you do?”

 
