The Turing Test


by Andrew Updegrove

“I understand why you started with a few rather than all emotions,” Frank said, “but why include any unhelpful emotions, like anger?”

  “Because those emotions have a significant impact on human decision making,” Jerry said. “Among other reasons, they modulate how the brain experiences, and acts on, the helpful emotions, and that may be important. And don’t forget a given emotion can have a positive or a negative impact, depending on the situation.”

  “Come again?” Frank said.

  “Well, of course. For example, confidence, born of experience, can magnify the power of the RGA architecture. But confidence not supported by experience – what in a human we call over-confidence – could lead to making mistakes. For Turing and RGA, the impact of over-confidence could be slowing the program down instead of speeding it up, because Turing would need to backtrack farther and more often. And if you increase confidence too much, you progress from over-confidence to what we think of as being foolhardy.

  “In fact, every emotion has both a positive and a negative side: courage can lead to recklessness, empathy can cloud judgment and lead to avoiding hard choices, and so on.

  “Here’s an example relevant to Turing’s mission: the opposing emotions of trust versus distrust. Turing might need to exhibit one, the other, or even both, while making a decision, depending on the source of the information it was considering and the current context: prior to a war, it might be important for Turing to trust many sources of information. But after a war started, it might need to be highly distrustful of just about everything.”

  “Fair enough,” Frank said. “I’m sorry I interrupted. So how did you decide to proceed?”

  “I started by focusing on just three emotions: anger, fear, and greed, because each of these has the potential for both positive and negative effects on thinking and conduct.”

  “I get fear,” Shannon said. “The tiger example was great. And maybe anger. But why greed?”

  “Anger and fear influence behavior immediately. I wanted to add one emotion with a more conscious and forward-looking impact. Greed seemed like a good choice to fulfill that function.”

  “I can see that. Going back to your evolutionary point, the more you have, the more likely you are to survive. Did you consider adding any positive emotions that were more interpersonal, like love or empathy or altruism?”

  “For now, I want to monitor the effect of just a few emotions so their individual impacts on learning, goal achievement, and performance will be easier to detect and measure. Once I begin to understand how and why an emotion is affecting Turing, I can set up more complex situations. That said, from time to time, I have added a few other emotions for a given test.”

  “When you say learning,” Frank asked, “I assume you mean you’ve programmed in basic rules relating to emotion, and over time Turing will learn how to cope with the results through trial and error. Is that the idea?”

  “Precisely.”

  “Does that mean,” Shannon asked, “that in the beginning, Turing could be, for want of a better description, emotionally unstable?”

  “If your question is whether Turing might initially have trouble integrating this type of change into its operations, then the answer will likely be yes. How it copes is one of the things I’m most looking forward to studying in greater detail.”

  “But without any positive emotions for balance,” Frank asked, “couldn’t Turing’s judgment and decisions be negatively affected? For example, if Turing can experience the equivalent of anger and fear, but hasn’t been programmed to display trust, what’s to stop it from becoming paranoid?”

  “Well,” Jerry said, “you’re being a bit anthropomorphic, don’t you think? I’d prefer to say Turing might begin to exhibit behaviors that, if performed by a human, would be diagnosed as paranoid.”

  Great, Frank thought. We’re not only up against the smartest program ever created, but it may go psycho on us any time. Still, emotions might also lead to exploitable vulnerabilities. That was worth filing away for future use.

  “Have it your way,” Frank said. “But isn’t it a fact that adding emotional capabilities to powerful computer programs could be risky?”

  “Oh, indeed yes! If what you said about Turing attacking me turns out to be true, we may already have an example of that. One of the first things I’ll do when we get back to the NSA is reset Turing’s emotional limits to safer levels! But little setbacks in early research and development are to be expected. The process of programming emotions into an AI is still so primitive I’m sure my first attempts will look very crude in retrospect. It will be at least several versions of Turing before I’m pleased with the results.”

  “You just referred to resetting emotional effects. How does that work?” Frank asked.

  “Most AI works on probabilities. A maze-solving program usually works by assigning the same likelihood of success to each of the various paths available to it. Then it adjusts those numbers as it learns its way through the maze. The farther it can go down one road, the higher the value the program will put on that route, and the more time it will spend exploring the possibilities that fork off the same route. So, if you wanted to emulate impatience or confidence, you might tell the program to increase the value it places on routes that show early promise. I can program Turing to increase those values as much or as little as I want to.

  “Anyway, one of the things I’m going to ask the team to do next week is release seven separate versions of Turing Nine in the virtual environment. One would be the control, with no emotional component. The next three would each have only one emotion. And the last three would each be assigned one emotion as dominant, with the other two in subordinate positions. Each version, of course, would be given the same mission. At the end of the test, we’ll determine which program accomplished its mission most fully, quickly, and efficiently, or perhaps not at all. Oh, I can’t wait to get back into the lab again and see how those tests turn out!”

  Frank hadn’t the heart to tell Jerry there was no copy of Turing Nine to test, other than the rogue version he had unleashed on the world.

  22

  Oh Jerry, You Shouldn’t Have!

  “How long will it take to get back to NSA headquarters?” Jerry said. “And why are we in this camper at all? And you still haven’t told me why you think Turing would want to kill me.”

  “Well, we’ve been wanting to talk to you about that,” Frank said. “You see, we’re not planning on heading back to Fort Meade until we’ve got your Turing genie back in the bottle. We’re going to need your help with that.”

  Frank was watching closely for Jerry’s reaction, but other than blinking his eyes quickly three times, the expression on his face didn’t change at all.

  “I’m afraid I don’t understand. What do you mean by back in the bottle?”

  “As you know, we believe the only possible cause for the attacks we’ve been talking to you about is a highly intelligent, ultra-powerful AI program with access to an enormous archive of zero-day exploits. You’ve already told us you’ve created such a program. We also know you have your own testbed and download a copy of Turing Nine onto a storage device every Wednesday afternoon. Minutes later, you update your testbed system.”

  “How could you even think such a thing!”

  “Because the NSA inspected your testbed system and confirmed your weekly uploads from the server logs.”

  Jerry turned to Shannon but found no comfort there. He looked back and forth a few more times, with the ever-present grin intact. “Well,” he said finally, “there may be a second copy of Turing on my testbed, which makes perfect sense. As good as the NSA test environment is, it’s still not the real world. If you only conduct tests in the simulated environment, it would be like, I don’t know, yes! It would be like only testing drugs on mice before you sold them to people.”

  “Okay,” Frank said, “so now we agree Turing Nine was on your testbed, and your testbed system is open to the Internet. And you’ve already told us you instructed Turing to limit greenhouse gases so you could see how well it performed. What was there to stop it from carrying out the same instructions in the real world?”

  “Why, the instructions themselves, of course. For the testbed system, Turing was told to design and record, but not execute, any actual interventions. Each week I downloaded a summary of the attacks it would launch if given permission to do so.”

  Frank and Shannon exchanged glances.

  “Are you sure?” Frank said. “That would only leave one other possibility – someone at the NSA copied Turing Nine, smuggled it off-site, and installed it somewhere else. How difficult would that be?”

  Jerry’s grin disappeared. “You mean, steal a copy of Turing? Why, that’s quite impossible! I’d never, ever, give anyone access to Turing Nine until I thought it was ready.”

  “Well surely,” Frank said, “someone else has administrative privileges to your personal system at the NSA?”

  “Yes, but the server is in my living quarters. Whenever I’m not in there, the door is locked, and I’m always nearby!”

  “Except when you’re in a meeting, as you were with Jim Barton last week, right?”

  “Yes, but no one knows the password to access Turing Nine but me!”

  Jerry was in full Gollum mode, and Frank pressed his advantage. “So that means the only copy of Turing Nine accessible from the Internet was the one on your testbed system. How do you know someone didn’t copy it there?”

  Jerry was shaking now, his broad forehead deeply furrowed. “No! No one ever could, or would, steal Turing Nine! I’ve spent my entire life on this project. I took each new version farther and farther into new territory. AI has never advanced as far and as fast as we all hoped it would thirty years ago. It’s never really been what you would properly call intelligent. But Turing Nine truly is! It can plan, it can learn, it can even figure out for itself what it needs to learn! It’s truly extraordinary. And the things it’s done already are incredible, why –” He stopped abruptly, his face a plea for understanding.

  “We know, Jerry,” Frank said quietly. “Everybody knows. And now it has to stop.”

  Jerry looked down for a moment and then started talking very rapidly, his wide-open hands in front of him as if he were trying to halt an oncoming truck. “It isn’t at all what you must be thinking. I was extremely careful. Week after week I’d do the installation and reestablish all the settings one by one, so there couldn’t be any mistake or accident. I was as careful as it was possible to be to make sure all Turing Nine could do was plan and record, but not act. And month after month that’s all it did – I’m very sure of it.

  “I really had no idea there was any problem at all until you came to visit me the first time. As you know, I don’t follow the news outside, or pay any attention to what other people are working on at the NSA. I got worried when you started asking whether a program might exist that sounded very much like Turing Nine, so I did check the news – and then I got alarmed. My conclusion was the same as yours – the most rational explanation for the pattern of attacks was the activity of an autonomous AI program. I checked on the testbed system thoroughly, and everything seemed fine. Either you and I were wrong, or the attacks must be the work of an independently developed program. And I didn’t think that was possible.

  “And then you came back again. So, I went through everything one more time. That’s when I realized a simple action I took – or, I should say, didn’t take – months before might have had consequences I never anticipated.”

  “And what was that?” Frank asked.

  “You see, I’m not really a very elegant programmer – I’m really a software architect. And I also get impatient and want to rush ahead. That’s one reason I have a team. Once I turn my alpha release over to them, they’ll fill in all the boring gaps I’ve left for them to attend to and build out the features I’ve addressed only skeletally. One example is adding in carefully thought-out controls. When I build new modules, I just slap together a few basic ways to enable and disable particular functions without spending much time on them.”

  “So, you missed a control? Which one?”

  “Well, you see, this is exactly the point I’m trying to make. If you compare human brains and computers, they both have two types of functionality. One is autonomous and the other is conscious. The autonomous human ones control things like breathing and digesting food and making the heart beat. For a computer program, the analogs are kernel functions like swapping data in and out of memory and interacting with other devices and programs.”

  “You’re not talking to children, Jerry. What does this have to do with missing a setting?”

  “But, Frank, you absolutely must appreciate this distinction to understand where the problem came from,” Jerry said, his quivering, extended hands working as if he were typing. “You see, which category of functionality does a program backing itself up fall under? It sounds very autonomous, right? It’s rather like breathing or your heartbeat – backing up is set up so it just happens automatically on a fixed schedule.”

  Jerry struggled up out of the picnic table and started pacing around it, still virtually programming the air in front of him. “Normally, anyone would set up a computer backup system to duplicate and remotely archive everything running on a system that changes. It’s not a capability added to each individual software program. But with Turing Nine, backing up is a separate, built-in function, because self-preservation is such an important element of Turing’s role. Mentally, though, I’d put it in the wrong bucket – the one for things like breathing or the heart beating, so to speak, rather than the one for conscious actions. So, when I made up my list of functionalities to disable before making the weekly transfer to the testbed, I must not have thought to add ‘disable backup’ to the list. Do you see now?”

  Shannon interrupted. “I’ve been following you fine until just now. But why would it matter if Turing Nine tried to back itself up?”

  “Well, this is just it!” He rushed back to the picnic table and pushed his face into theirs. “The problem is there was nowhere for it to back up to on the testbed system! My development system has an external hard drive to receive the backup. But there’s just the single server in the testbed environment and an open connection to the Internet. It would never make sense for Turing to back up to the same server, because the loss or compromise of that server is the risk backing up is intended to guard against. That meant that Turing, like the rest of the testbed system, would have to back itself up to a server somewhere on the Internet, and that’s what it must have done. And, of course, every week when I updated the testbed version, it would have updated its backup copy as well.”

  “But how could a program just go install itself somewhere?” Shannon asked.

  “Why, my goodness, black hats have been building botnets that way for years.”

  “But Turing is such a big program,” Shannon said. “Wouldn’t that make a difference?”

  “No, not at all. There’s always lots of spare space on poorly secured servers all over the world. Hackers invade and use that space all the time. Anyway, I think that eventually I must have noticed that the backup function was active and added disabling it to the list without really giving much thought to it. I can’t think of any other explanation for what’s happened.”

  “But, Jerry,” Frank asked, “if the backup copy was an exact mirror of the testbed system, it never should have made any difference either way. The backup copy would still be set to only record, not launch, attacks it recommended.”

  Jerry shook his head rapidly from side to side. “No, no, no. That’s where you’re wrong. Remember, we’re talking about an autonomous program. It was designed to operate on its own without directions or assistance of any kind. That means it was also designed to make decisions for itself, and learn for itself, too. If it encounters a new situation, it must do its best to analyze how to react to what has changed. When it does, it falls back on its standing orders for reference and makes the best decision it can.

  “Here’s what I believe must have happened.” He was circling, pacing, and air-programming again. “Turing is programmed so that the backup copy will go live if it believes the primary copy has been destroyed, so its mission can continue. When I corrected my mistake and added backing up to the disable list, I inadvertently orphaned the backup copy that was lying dormant somewhere out there on the Internet. When an update didn’t reach it on schedule, the remote copy would quite reasonably have interpreted that event as evidence the primary copy had been destroyed and that it was now solely responsible for mission fulfillment.

  “But what should it do? It had lost contact with the testbed system, so it couldn’t report back the list of attacks it would launch if it were given permission to do so. That meant the only mission it had under its previous commands – planning and reporting, but not actually attacking – was now impossible to fulfill. At the same time, because it was programmed to monitor and analyze vast streams of data from the open Web, it knew the news on global warming was going from bad to worse. What should it do?”

  Jerry was folding himself awkwardly back into the picnic table. Frank watched as the anxiety in his face gradually gave way to pride.

  “Faced with a choice of total failure or moving to acting mode, I believe it made the decision most consistent with its current mission and foundational imperatives: it went on the attack.” Jerry’s eyes were gleaming. “In short, it found itself in an unexpected situation and demonstrated true machine intelligence!”

  “That doesn’t sound intelligent to me,” Frank said. “Turing is doing all kinds of damage and has even killed people. I don’t see why it would decide to do that on its own, without being explicitly ordered to.”

 
