Infinity Born


by Douglas E. Richards


  Guerrero paused. “You’re able to switch effortlessly between thousands of tasks and activities,” he continued. “But the same computer system that crushes you in chess can’t beat you at Crazy Eights without substantial modifications. And you continually learn. So while DeepMind possesses a specific intelligence, you’re a jack of all trades. You possess a general intelligence.”

  “I’m not sure my political opponents would agree with you,” said the president, “but I get your point. So AI isn’t the Holy Grail. General intelligence is. Which is what the G in AGI must stand for: Artificial General Intelligence.”

  “That’s right,” said Guerrero. “A system that exhibits AGI should be able to solve problems, learn, and take effective, human-like action, in a variety of environments.”

  “Which sounds like it goes beyond just passing the Turing test,” noted the president.

  “Turing was a true genius, and his test was considered the Holy Grail for a long time. But creating a computer that can fool a human into thinking it’s human, while impressive, isn’t nearly enough. Passing the Turing test might just mean that the computer was extremely well programmed, nothing more.”

  “Right,” said the president. “Like if you were in a room and someone sent in a question for you in Chinese. You might be able to replace each Chinese word with an English one using a pre-specified program—say Google Translate—and reverse the process with your answers. In this way your answers in Chinese might satisfy the questioners. But just because you can fool a native speaker doesn’t mean you actually understand a single word of Chinese.”

  Guerrero and Melanie glanced at each other, barely able to keep their mouths from falling open. This was a simplified variation of a famous thought experiment about consciousness called The Chinese Room. Strausser couldn’t have just come up with this example at random.

  Perhaps there was more to this president than met the eye.

  “Exactly right,” said Guerrero. “The trick to AGI is for it to become self-aware, creative. And even self-awareness isn’t enough.”

  “Yes, because sentience, consciousness, has some magical, ineffable quality,” said Strausser. “Not just mind, but body and spirit in some ways. Qualia become a vital consideration.”

  This time Melanie’s mouth did fall open. This was a sentence an expert in the field wouldn’t have been embarrassed to have uttered. Strausser was nobody’s fool. He clearly liked to be underestimated, but a keen, well-read mind lurked behind this facade, and he had now come out of the closet. It was time to probe just how deep his knowledge really was.

  “I know what I mean when I use the word qualia,” she said. “But what do you mean, Mr. President?”

  “I would define qualia as subjective experiences, sensations,” he replied, “that only conscious entities seem to be able to have. How a thing seems. How it feels. How it affects a consciousness emotionally and spiritually. The pain of a toothache. The beauty of a sunset. The taste of wine.”

  Melanie gazed at him in wonder. She had expected him to have to gather his thoughts and then fumble for an answer to her question. But he had shot back a response without hesitation that was as accurate as she could want.

  “There is a classic thought experiment,” continued the president, “which I’m sure you know far better than I do, that helps me think about this. Imagine a woman who was kept since birth in a black-and-white house, only able to view the world through a black-and-white television monitor. This woman could earn an MD/PhD in color vision. Could know every last thing that is known about the color red. Its wavelength, how it hits the rods and cones in the eye, the exact way this information is processed by the optic nerve and brain.”

  “But she still wouldn’t know the color red,” said Melanie, finishing for the president. “Even though she has every bit of formal knowledge about the color, she would still gain additional information by going outside and seeing something red.”

  “Yes, what red looks like,” said Strausser. “At least what it looks like to her. How it makes her feel. The subjective experience of the color red. Qualia.”

  Guerrero looked completely taken aback, and Melanie understood why. Not only was the president not a fool, it was becoming increasingly evident that he harbored a significant intellect.

  “So you knew the difference between AI and AGI when you came in here, didn’t you?” she said.

  The president nodded. “I wanted to see if you’d just accept my use of the term AI, or try to enlighten me. And, if so, how you might choose to explain it. I was trying to get a feel for the two of you.”

  “You mean size us up?” said Melanie.

  “Kind of a brash way to put it,” he said, “but yes.”

  “I figured you could handle some brashness,” said Melanie.

  “You figured right,” said the president. “In fact, I find you refreshingly direct. Which is what I want in my world-class scientists. You’re still managing programs, so you do need to have some political skills. But if you were too political, that would make me uncomfortable.”

  The president sighed loudly. “By the same token, having too high an IQ can be a liability in my world. I’m a man of the people. If I seem to be too well-read, too focused on reading scientific journals instead of figuring out how to defend some indefensible lie told by a member of my party, I scare a lot of people. So if anyone ever asks you about me, be sure to tell them I’m as dumb as a rock, and only read comic books.”

  Melanie nodded thoughtfully. It suddenly made sense why he was here with them now. It wasn’t an accident, as much as he had arranged it to look like a last-minute decision. He was a closet scientist at heart, and had done considerable thinking and reading on the subject of consciousness.

  “But back to the main topic,” said the president. “While I’ve tried to educate myself a little over the years, I’m not narcissistic or delusional enough to believe I could even come close to truly understanding what you’ve been doing here. I’ve studied a little theory and philosophy of AGI, but no nuts and bolts. I don’t know an exaflop from an exa-wife. But I’ve been told that you’ve built what should be the fastest, most powerful computer system ever constructed, using billions of taxpayers’ dollars out of a Black Budget. And you hope to use a software program that mimics evolution to bring about your AGI.”

  “That is correct,” said Melanie Yoder. “TUC is truly mind-boggling. A thousand dollars today can buy a laptop as formidable as the most powerful supercomputer that existed in 2020. And DARPA has invested millions of times this amount on a single system, which resides in this bunker,” she added, gesturing to the mostly buried facility fifty yards away.

  Strausser looked duly impressed. “And if you do get to AGI,” he said slowly, studying Melanie with great intensity, “how likely is it that your Artificial General Intelligence evolves itself into Artificial Super Intelligence in a blink of an eye?”

  Melanie was tempted to water down her reply, but she felt certain he was searching her body language and would detect a lie as though he were a human polygraph. “Very likely,” she replied simply, which seemed to be the answer he was expecting.

  “So AGI,” said the president, “impossible as it has proven to be to create, is really just a stepping-stone to something far greater. A pawn that you create a millimeter away from the eighth rank on the chess board, poised to become a queen. As soon as your system achieves human intelligence, it becomes smart enough, and self-aware enough, to direct its own evolution and improvement, instead of relying on random chance. Then you get runaway improvements happening at speeds we can’t comprehend. Millions of generations of improvements in minutes. And we have no idea what this ASI will be like, only that our puny intelligence will be dwarfed. That its thinking will be far beyond our ability to grasp.”

  A quick smile flashed across Strausser’s chiseled features. “The only thing we can know for sure about a system possessing Artificial Superintelligence,” he added, “is that it would be much too smart to ever get elected president.”

  Melanie ignored this latest attempt at humor. “So you think we’re playing with fire?” she said.

  “I know you’re playing with fire,” replied the president. “And so do you, or you aren’t one-thousandth as smart as I think you are. That’s not the question. The question is, are you convinced the safeguards you’ve put in place will suffice?”

  10

  Melanie Yoder couldn’t help but feel insulted by the president’s question. “If I wasn’t convinced the safeguards would be enough,” she said, “I wouldn’t be going forward.”

  Strausser smiled sheepishly. “I probably should have rephrased that,” he allowed. “Just how convinced are you that your safeguards will suffice? And why do you think so?”

  Melanie opened her mouth to respond, but the president held out a forestalling hand. “But we’ll get to these questions in a moment. First, let me back up for a bit. So you have the fastest, most powerful computer ever built. So what? Do speed and grandiosity really help? Doesn’t this just help you achieve failure that much faster? Humans have achieved AGI with brains as comparatively slow as glaciers. There are certainly quantitative aspects of consciousness, a requirement for a minimum level of complexity and processing power. But once you’ve passed this, isn’t consciousness mostly qualitative?”

  Melanie considered how to respond. “Many believe the opposite is true, that consciousness is nothing more than an emergent property of complexity. Create a complex enough system and it pops out all by itself.”

  The president shook his head skeptically. “And what do you think?”

  “I tend to agree with your conclusions,” she admitted. “Emergent is just a fancy word for magic, in my opinion. It’s defined as an unpredictable property that simply arises on its own from simpler constituents. If it’s unpredictable to science, then it might as well be magic.”

  There was something about the look on the president’s face that made her believe he already knew which side of this debate she favored. That he had already read her scholarly articles on the subject, in fact.

  “But even if we acknowledge that consciousness has a strong qualitative aspect,” she continued, “it will almost certainly also require a very high level of complexity and processing power to achieve.”

  “Which your TUC has covered in spades,” said the president.

  “Yes,” said Melanie. “Also, speed and power allow more generations of evolution to take place in less time. But, in general, with all of this said, I agree with you: speed and power alone aren’t enough. Increase the processing speed of a fruit fly brain a million fold and you still don’t get AGI.”

  “So since you agree that this mighty system alone isn’t enough,” said Strausser, “you must believe you’ve also come up with the mother of all AGI evolution algorithms. One that will supply Frankenstein’s magical spark to animate your creation. To create a system that can not only win at Jeopardy! but care that it won.”

  Melanie nodded. “We believe we’ve developed superior AGI evolution algorithms and superior inductive learning algorithms, both.”

  Computer evolution algorithms copied biological evolution almost exactly. As a great simplification, software—coding—took the place of genes. Software that achieved the most successful results, based on preselected “fitness functions,” would be copied into the next generation, and allowed to cross with other successful software, creating progeny software that shared features of both parents. Random mutations would also be introduced along the way. In each generation, only the fittest software would survive. Then rinse and repeat. Millions of times, until endless crossings and random mutations enhanced performance in magical and surprising ways.

  “So what makes your algorithms so superior?” asked the president bluntly.

  “I’m afraid an explanation would require symbolic logic and mathematics that not even I’ve fully mastered,” said Melanie. “I’m afraid you’ll have to take my word for it. If any evolution program can work, this one will. And, again, this is in conjunction with advanced learning algorithms. DeepMind made some major innovations that we’ve built upon.”

  “That’s Google’s entry, right? The one that won at Go?”

  “Well, it became Google’s entry. It began life as a British company that Google later acquired. And it took a while to build up to Go. It first became famous by playing an old Atari arcade video game called Breakout. It wasn’t programmed with any information about the game or how to play. It was tasked with maximizing its score, nothing more, so it had to learn how to play. At first it fumbled around and was horrible, but after an hour it never missed. DeepMind has become considerably more advanced since those days, of course.”

  “And you believe that this learning system has become the gold standard?” said the president. “The one to build upon?”

  “That is my personal belief, yes,” replied Melanie.

  “So what’s its secret?”

  “It uses deep learning on a convolutional neural network,” she explained. “Combined with a form of model-free reinforcement learning called Q-learning.”

  “That didn’t help me at all,” admitted the president. “But no matter. Let’s move on.” He paused in thought. “So after DeepMind learns, can scientists understand the result? How and what it learned?”

  “For something simple like Atari, yes,” she replied. “But not for something like Go. That’s the beauty of it.”

  “And also what makes it a little scary,” said Guerrero. “The people behind AlphaGo have no idea why it plays the game the way it does. Can’t begin to fathom the computer’s strategy.”

  “The word scary is a good segue into the subject of safeguards,” said Strausser. “So you think these software breakthroughs of yours can finally get us to AGI?”

  “Combined with the most advanced hardware ever created, yes,” replied Melanie.

  “And you’ve already acknowledged that if you’re successful, TUC will likely graduate from AGI to ASI very rapidly.”

  Dr. Melanie Yoder blew out a long breath. “Yes. Which is why this is the most important project in the history of humanity. Whoever wields superintelligence first has the biggest first mover advantage in history. The possibilities for improving the human condition are truly extraordinary.”

  “Wields is an interesting word choice,” said the president pointedly. “For us to wield something we have to control it.”

  “That’s correct,” she said.

  “What makes you think we can control an ASI? Why would it allow itself to become a slave to its intellectual inferior?”

  “The core, unchanging goals that drive its evolution are obedience to its creators, friendliness to humanity, promotion of human good, and reduction of human death and suffering. These and other fail-safes are stitched into its evolutionary DNA in a thousand different ways, and are so redundant, there is no conceivable way the system could evolve to be antagonistic to humanity.”

  “Do you agree, Dr. Guerrero?”

  He nodded. “I do. Our best minds have reviewed our approach and are satisfied that we’ll retain control.”

  The president turned to face Melanie once again. “And if you’re wrong?”

  “We aren’t,” said Melanie with conviction. “But just in case, we built TUC in a separate bunker structure out here in the desert. He’s cut off from all electrical lines. We’ve downloaded the entire contents of the Internet into his memory, including millions of books and hundreds of thousands of textbooks. He’ll need to draw upon this vast database to have any hope of evolving to sentience. But he isn’t connected to the Internet in any way, nor can he connect himself.”

  “He can’t affect the outside world in any way,” added Guerrero. “And we won’t let him for some time, not until we’re as sure as we can be about what it is we’ve created. We have the means to cut off his power supply at any time, and there is nothing he can do to prevent us. He has no moving parts and no way to recruit people or affect his environment.”

  Guerrero paused. “Finally,” he continued, “the building he’s in is mined. We can activate it via radio, by transmitting a code that is two hundred digits long. The string of digits was produced by a randomization program in another computer. TUC won’t know it’s there, and even if he does, even with his speed, he couldn’t crack it by trial and error if he had until the end of time.”

  The president nodded. “You’ve been using male pronouns for TUC, I’ve noticed. For some reason, when I came here, I thought of it as a her.” He gestured toward Melanie Yoder. “How did you choose the name TUC?” he asked. “Assuming that you were the one who named it.”

  “I was,” said Melanie. “It’s short for The Ultimate Computer.”

  “Good choice,” said Strausser, who seemed sincere in this praise. “And you just randomly decided it was a male?”

  “Randomly?” said Melanie in mock indignation. “It wasn’t random at all. TUC is just naturally a male name. You know,” she added with a grin, “Friar TUC. TUC Rodgers. Tom Sawyer and TUC Finn.”

  The president laughed out loud. “Right. Short for TUKleberry.”

  “Exactly,” said Melanie. “Even the classic TUCxedo is most often worn by a male.”

  “Glad I asked,” said the president dryly. He turned his attention to the digital countdown clock, running in the corner of the large monitor. The grand experiment was set to commence in just less than five minutes.

 
