The Turing Test


by Andrew Updegrove


  “And Recursive Guess Ahead can help with that?” Shannon prompted him.

  “RGA? Oh! Yes – that’s exactly what RGA does. You see, a recent innovation in computing involves splitting up the processing power of a single computer into tens, or hundreds even, of separate computing units – we call them ‘virtual machines.’ I realized I could use all those individual virtual machines working together in parallel on the same problem in a special way. Using the chess analogy, after the program uses its background knowledge to narrow the possibilities to a much smaller number of alternatives, it could analyze each one of those options simultaneously.

  “When any of those virtual machines concludes that a given move would be more likely to lead to a loss than to a win, that information can be used to recalculate the likelihood of success for all the remaining alternatives, and that data is fed to all the other virtual machines. Then the virtual machine that reached the dead end can be reassigned to start helping whatever processors are pursuing the alternative with the highest likelihood of success at that moment, and so on. It’s sort of like a game of leap frog for computers.

  “The result is that all the virtual computers are always using the highest state of learning to further the most hopeful approaches, because the most current information is always flowing back into all the virtual computers. That’s where the ‘recursive’ part of the name comes from.

  “Now let me explain the ‘guess ahead’ part and use a game of chess as an example. Normally, a computer would look at a particular strategy by calculating the probabilities of each move, one at a time. What Turing does is assign half of its virtual machines to working forward the normal way from the current situation, one decision at a time. The other half are assigned to start moving forward from what Turing thinks the board is most likely to look like half a dozen moves ahead. If any of those looks like a sure winner, it communicates that information back to the other virtual machines, and they start to work on the best strategy to get to the point where the ‘guess ahead’ machine started its analysis and quit pursuing any of the approaches that have a lower likelihood of leading to success. That can save an enormous amount of computing power and also provide a huge advantage over an opponent that’s only thinking forward in a linear fashion.”

  “That’s a very elegant technique,” Frank said.

  Jerry beamed. “Yes. I like to think it’s like evolution occurring on a vastly expedited basis. In the real world, it’s pure chance when a mutation proves to be beneficial. When it does, it takes decades for the improved gene to be passed along, and only a few children – or maybe none at all – are lucky enough to receive it. If it is passed along, it takes thousands more years for the improved gene to spread throughout even a small part of humanity. With RGA, the best new ‘genes’ are shared immediately with every other virtual computer.”

  “Very interesting,” Frank said. “How would that affect Turing’s behavior, using the energy infrastructure example?”

  “Oh, I expect the ‘guess ahead’ resources are considering things like how the world would be reacting to its attacks. And, by the way, that takes us back to the Turing test.”

  “It does?” Shannon said. “How?”

  “Well, because someone who’s obsessed with human intelligence might say ‘but that’s still not intelligence!’ Well! That’s quite a statement, isn’t it? We don’t have any idea how human intelligence works! Perhaps someday we’ll determine that our brains actually use something just like RGA to tackle problems. Or we may never find out how our minds work. The real point is that with every generation of computer architecture, we come closer to approximating human cognitive performance. And performance is what really matters, just as Alan Turing said.

  “That said, my personal guess is that as computers become equal, and then superior, to humans, it will be because they’ve learned how to function in ways that are quite similar to our own. AI neural network architectures, for example, are based on what we think goes on in our own heads, and they’re finally starting to produce very dramatic results. In fact, just like a human brain, we don’t always know how a neural network reaches the conclusions it does.”

  “Still,” Frank asked, “wouldn’t it be a surprising coincidence if brains and software end up with the same approach?”

  “Oh, I don’t think so! Why, look at how similar the operations of computer malware are to the way real viruses act. There’s a tremendous amount in common between systems of all sorts, whether they’re biological, mechanical, or virtual. Probabilities, electrical forces, physics, and every other empirical measure and rule of nature are all agnostic – each one is what it is and applies in the same way to everything, regardless of the system they affect. If we want an AI to be able to do something as well as a biological system can, we need to accept that we’re subject to all the same laws that enable as well as constrain those systems. AI engineers may be using software instead of human ‘wetware’ to think, but the challenges and goals are the same.”

  Once again, Jerry paused, waiting for the next question. Frank had found the exchange very engaging but decided he had heard all he needed to know, at least for now.

  “Congratulations on RGA – that’s quite an innovative piece of design. And thanks very much for your time today. This has been enormously instructive.”

  He and Shannon got up to leave, but when he reached the door, he paused and turned. “Oh – one last thing. There’s no chance your program could have escaped your lab, or been hacked from outside, is there?”

  “Oh, no. I wouldn’t think so. But that’s not really my department. I just plug things into the wall. You’d have to check with whoever worries about security here to find out.”

  * * *

  Frank and Shannon waited to discuss Turing until they’d left NSA headquarters.

  “Well,” Shannon said, “it sounds like we now know for sure what’s launching the attacks.”

  “Maybe, but I don’t want to make the same mistake twice. Let’s not confuse coincidence with causality.”

  “It would have to be one heck of a coincidence if something’s behind the attacks other than Jerry’s Turing Nine.”

  “Not quite. The tests we’ve run so far haven’t excluded any of the four scenarios we identified. All of them are still plausible.”

  “But still, Frank – don’t you believe we’ve gotten to the bottom of this?”

  “Honestly, yes – except for the lower-level details. But if we want to come up with a plan of attack, it’s going to have to work across all four of the possibilities we identified.”

  “And what kind of plan is that?”

  “I’d like to sleep on it.”

  “Ever the computational man of mystery, hmm?”

  “Not at all,” Frank said. “We just learned more than I expected to today. I want some time to think things through.”

  “Fair enough,” Shannon said. “But I’ve still got a couple of questions. Why didn’t you challenge Jerry when he owned up to the fact he’s programmed Turing to attack energy infrastructure? We already know he’s probably running it on a testbed open to the Internet.”

  “Because I think we’ll learn more from him if we don’t put him on the defensive.”

  “Okay. Then how about this – why did Jerry volunteer the fact that he instructed Turing to come up with the same type of attacks that are occurring?” Shannon said.

  “Not immediately, though,” Frank said. “At first he only talked about Turing Eight. It wasn’t until we came back sniffing again that he owned up to what Turing Nine could do. He may be odd, but he’s also brilliant. Now that we’ve shown up on his doorstep for the third time asking the same questions, he might think we already know, or at least guess, that Turing is behind the exploits. Being the first to mention that Turing was already programmed to launch similar attacks would make him sound innocent.”

  “Maybe. But that would be quick thinking on his part.”

  “Or perhaps he’s been rehearsing that story ever since we said we wanted to come back yet again. After all, the server logs on the NSA’s simulation environment show what’s running on them, what it’s been doing, and who set the tests up. My guess is there’s a much shrewder Jerry behind that moronic grin than I thought.”

  “But he didn’t mention that he installed Turing Nine on his testbed system,” Shannon said. “That makes me think he’s got something to hide.”

  “Absolutely,” Frank said.

  “Okay. Last question. Why do you figure Jerry would have used his testbed that way to begin with? Wouldn’t testing a weaponized software program on the open Internet require some kind of prior review and approval?”

  “I’d certainly think so. You or I would balk at doing that without permission. But Jerry? I’m not so sure. He’s got his sweetheart deal with the NSA, and it also sounds like he doesn’t think a new version of Turing belongs to anyone but him until he decides to turn it over. Heck, he’s been living alone for twenty-five years in his nerd-cave under Fort Meade. How much of a relationship with reality can the guy still have? He never actually referred to Turing as ‘My Precious!’ but I’m not sure he needed to.”

  Shannon laughed. “Fair enough. I can buy into that. But why run his program outside at all if he’s already got this world-class NSA simulation environment to work with?”

  “Oh, that doesn’t surprise me at all. Here’s how that is: to the guys who create them, artificial environments are awesome. And they are. But at most, they’re always somewhat out of date. And they’re never as rich and complex as reality. Even if they were, someone with a big ego would still want to prove – to himself if no one else – that his best work can cut it on the big stage as well as the small one. I can easily imagine Jerry wanting to test Turing against real targets to debug and improve it.

  “But then, I’m guessing, something went wrong – maybe somebody found out what he was up to and copied the program. Or perhaps Turing detected danger and the self-defense imperative kicked in, causing it to flee. If that happened, it might have been programmed to copy itself onto another available hosting resource. From what Jerry has already told us, it would make sense for it to have that kind of self-preservation mechanism and capability.”

  “I bet you’re right,” Shannon said. “Where does that take us next?”

  Frank found himself in a celebratory mood. “I’m thinking it takes us to dinner. Your pick, my treat.”

  18

  A Mobile Case of TMI

  Random objects of Shannon’s had continued to accumulate in Frank’s condo, with the latest arrivals being a pair of sneakers and some exercise clothes. Maybe, Shannon had suggested over dinner, he’d like company on his morning run?

  Or maybe he wouldn’t, he grumbled to himself, morosely brushing his teeth. There were certain masculine rites a woman should realize a man might want to enjoy in solitude, the better to savor and draw spiritual sustenance from. Zen-like oases amid the chaotic stresses of another day, so to speak. First among them for Frank was the sacred ritual that began when he laced up his running shoes and departed for his daily, lumbering fly-by of the marble monuments memorializing America’s greatest leaders. Female intrusion into this somber rite suggested a grievous violation of his personal space.

  There was also the fact she could probably run circles around him.

  But there it was. He’d promised Shannon he’d wake her up when he was out of the bathroom, and that was that. He did so and, hoping a cup of coffee might juice his stamina, retreated to the balcony to self-medicate.

  “Ready?” Shannon was standing at the sliding door.

  “I guess,” he said. “Be gentle?”

  “Oh, don’t be silly. I haven’t gone running in years.”

  “Right. But you ran cross-country in college.”

  “How do you know?”

  “Mr. Google knows everything.”

  “Well, you can tell Mr. Google college was a long time ago. Are you ready or not?”

  Frank grunted and stood up. He followed her downstairs and out onto the sidewalk.

  “Which way do you run?”

  “To the right.”

  He set off at a speedier-than-usual rate and tried to make his stride look smoother than his normal flat-footed assault on the pavement. He often marveled at the effortlessly fluid motion of young runners. Likely enough, the whippersnappers took their innate gracefulness for granted, too. Why had he come down the chute as a lumbering ox?

  “So, what do you think, now that you’ve slept on what we learned yesterday?” Shannon asked.

  Oh great. Not only was she violating his morning ritual, but she was expecting him to sacrifice precious breath chatting as well.

  “Funny thing – I actually spent most of that time sleeping. But anyway, despite all my cautious talk in the past, I think we should ask Jim to let us go all-in on the assumption Jerry’s program has gone rogue. Even if we haven’t proved it yet.”

  “What if he says no?”

  “Then we’ll deal with that on tomorrow’s run.”

  She let his response sit for two blocks.

  “Well, why shouldn’t he say yes? He’s gone to bat for us once before.”

  “I’m not sure that’s a good thing. We put him in the position of asking the director of the Agency to approve a dicey plan, and then we came back with nothing. Jim’s just a mid-level guy.”

  “He’ll look good if we turn out to be right.”

  “Of course. But he can’t get in trouble by saying no to us – there’s only risk if he says yes. I expect he won’t be willing to stick his neck out very far until we can prove we’ve nailed it. And I wouldn’t blame him.”

  There was nothing upbeat to be said to that, so neither of them said anything for a while. But eventually Shannon’s curiosity got the better of her as they jogged in place, waiting for a light to change. “You do think that’s the right recommendation to give him, don’t you?”

  “Well, sure. I remember a startup I was part of not long after I left MIT. It flamed out, but we thought we were hot stuff. It was one of those times when a bunch of engineers fresh out of a top university could get big-shot venture capitalists and the best lawyers to meet with them. Anyway, one of my co-founders couldn’t stop worrying about what could go wrong. It was really holding us back. One day our lawyer said something interesting.”

  “Which was?”

  “He said ‘Look. Every decision you make will involve big risks. You shouldn’t be reckless, but you also need to realize that if your decisions are too cautious, you’re not acting conservatively at all.’”

  “What’s that, a riddle?” Shannon said.

  “That’s pretty much what we asked. What he said was ‘The word conservative implies safety. But the reality of a high-tech startup in a hyper-competitive market is that conservative decisions will slow you down. That will let the competition get there first, and you’ll fail. So too much conservatism is equivalent to needless risk-taking.’”

  “Interesting,” Shannon said. “I can see that.”

  “He summed it up like this,” Frank continued. “‘So, here’s the bottom line: do you want to bet on success or failure? If you’re going to bet on failure, then you’re idiots to do this at all, because you’re sure to fail.’ That made a lot of sense. We ended up failing anyway, but at least it wasn’t because we didn’t take the right chances.”

  “Are you sure that guy was a lawyer? That advice sounds too useful and practical.”

  “That’s why he was so successful. Anyway, that’s why I think we need to run with the theory that an escaped copy of Turing is behind the attacks. If we’re right, we’re heroes. And if we’re wrong, well, it’s not like we’re the only people working on this project. Or that the entire world is betting all its chips on our hand.”

  “That makes sense to me, too. So, I bet you can guess my next question.”

  “Sure. Now what do we do? The answer is I’m thinking we should leave town.”

  Shannon coasted to a stop. He was happy to do the same. “Why?” she said.

  “Because we need to assume Turing may detect whatever we do, type, read, or say.”

  “Can’t we just continue to be careful, like we have been?”

  “That’s not careful enough, because we can’t ever be sure we’re safe. Don’t forget, Turing’s spies are everywhere.”

  “Are you talking about IoT devices? If so, isn’t that an exaggeration? The Internet of Things is only starting to get off the ground.”

  “You think? There were over eighteen billion IoT devices hooked up to the Internet by 2014. Two years later, there were ten billion more.”

  He pointed up in the air. “See the camera on that light pole? About two-thirds of the way towards the top? It’s feeding a picture of every license plate that passes through this intersection back to a central server. There’s another camera just like it every few blocks, and security cameras in the lobby of my building, and yours, too. You’ll also find a wireless device on your electric meter and another one on your water meter. Anyone who has access to the information from those feeds knows whether you’re home or not, if you’re awake or asleep – even when you’re doing your laundry. Same thing at my place. And if someone taps into my Internet feed, well, it’s all out there, isn’t it?

  “And don’t forget, when we’re talking about the Internet of Things, we’re not talking just about smart speakers and little widgets in light bulbs and thermostats. There are also billions of phones and laptop computers, each one with a microphone, a camera, and an Internet connection. All those functions are live most of the time. Someone can hack and control all that remotely. That’s why I don’t have my phone with me right now – do you have yours?”

 
