“A backup means a duplicate. When you do get a functioning humanoid artificial intelligence—do you think that you will be able to copy it as well?”
“Of course. Whatever it does—it will still just be a program. Every copy of a program is absolutely identical. Why do you ask?”
“It’s a matter of identity, I guess. Will the second AI be the same as the first?”
“Yes—but only at the instant it is copied. As soon as it begins to run, to think for itself, it will start changing. Remember, we are our memories. When we forget something, or learn something new, we produce a new thought or make a new connection—we change. We are someone different. The same will apply to an AI.”
“Can you be sure of that?” she asked doubtfully.
“Positive. Because that is how mind functions. Which means I have a lot of work to do in weighting memory. It’s the same reason why so many earlier versions of Robin failed. The credit assignment problem that we talked about before. It is really not enough to learn just by short-term stimulus-response-reward methods—because this will solve only simple, short-term problems. Instead, there must be a larger-scale reflective analysis, in which you think over your performance across a longer span of time, to recognize which strategies really worked, and which of them led to sidetracks, moves that seemed to make progress but eventually led to dead ends.”
“You make the mind sound like—well—an onion!”
“It is.” He smiled at the thought. “A good analogy. Layer within layer and all interconnected. Human memory is not merely associative, connecting situations, responses and rewards. It is also prospective and reflective. The connections made must also be involved with long-range goals and plans. That is why there is this important separation between short-term and long-term memory. Why does it take about an hour to fix anything in long-term memory? Because there must be a buffer period to decide which behaviors were actually beneficial enough to record.”
Sudden fatigue hit him. The coffee was cold; his head was beginning to ache; depression was closing in. Shelly saw this, lightly touched his hand.
“Time to retire,” she said. He nodded sluggish agreement and struggled to push back the chair.
26
June 19, 2024
Shelly opened her apartment door when Benicoff knocked. “Brian just came in,” she said, “and I’m getting him a beer. You too?”
“Please.”
“Come in and take a look—after all you paid for it.”
She led the way into the living room where all traces of the army barracks had been carefully removed. The floor-to-ceiling curtains that framed the window were made from colorful handwoven fabric. The carpeting picked up the dark orange from the curtain pattern. The slim lines of the Danish teak furniture blended pleasantly with this, providing a contrast to the spectacular colors of the post-Cubist painting that covered most of one wall.
“Most impressive,” Ben said. “I can see now why the accounts department was screaming.”
“Not at this—the fabric and rugs are Israeli-designed but Arab-manufactured and not at all expensive. The painting is on loan from an artist friend of mine, to help her sell it. Most of the money went for the high-tech kitchen. Want to see it?”
“After the beer. I better brace myself for it.”
“Going to explain the mystery of your invitation to a Thai lunch today?” Brian said, lolling back comfortably in the depths of a padded armchair. “You know that Shelly and I are prisoners of Megalobe until you run down the killers. So how do we get out to this Thai restaurant of yours?”
“If you can’t get to Thailand, why Thailand will come to you. As soon as you told me you wanted to bring me up to date on your AI I thought we ought to make a party of it. Thanks, Shelly.”
Ben took a deep swig of cold Tecate and sighed. “Good stuff. It all began with a security check last week. I sit in with Military Intelligence when they vet any soldiers to be transferred here. That was when I discovered that Private First Class Lat Phroa had joined the army to get away from his father’s restaurant. He said he’d had enough of cooking and wanted some action. But after a year of army food he was more than happy to cook a real Thai meal in the kitchen here, if I could get the ingredients. Which I did. The cooks went along with it and the troops are looking forward to the change. We’ll have the mess hall to ourselves after two. We’ll be the guinea pigs and if we approve, Lat promised to feed everyone else tonight.”
“I can’t wait,” Shelly said. “Not that the food here is bad—but I would love a change.”
“How is the investigation going?” Brian asked. It was never far from his thoughts. Ben frowned into his beer.
“I wish I could bring some good news, but we seem to have hit a dead end. We have Alex Toth’s military record. He was an outstanding pilot, plenty of recommendations for that. But he is also a borderline alcoholic and a troublemaker. After the war they threw him out as fast as they could. No trace of him at the address he gave at the time. The FBI has found some records of his employment through his pilot’s license, kept up to date. But the man himself has vanished. The trail is ice cold. Dusty Rhodes’ story checks out. He was conned into it and then hung out to dry. There is absolutely no way to trace the money that was paid into his account.”
“What’s going to happen to Rhodes?” Shelly asked.
“Nothing now. The remaining money they gave him has been sequestered for the crime victims’ fund and he signed a complete statement of everything that happened, everything he did. He’ll keep his nose clean in the future or will be hit with a number of charges. We want to keep this thing as quiet as we can while the investigation is still in progress.”
Shelly nodded and turned to Brian. “You must bring me up to date. Did you ever get that B-brain to work?”
“Indeed I did, and sometimes it works amazingly well. But not often enough to trust very far. It keeps breaking down in fascinating and peculiar ways.”
“Still? I thought that using LAMA-5 made debugging easier.”
“It certainly does—but I think that this is more a problem of design. As you know, the B-brain is supposed to monitor the A-brain, make changes when needed to keep it out of various kinds of trouble. Theoretically this works best when the A-brain is unaware of what is happening. But it seems that as Robin’s A-brain became smarter it learned to detect that tampering—then tried to find ways to change things back. This ended up in a struggle for power as the two brains fought for control.”
“It sounds like human schizophrenia or multiple personalities!”
“Exactly so. Human insanity is mirrored in machine madness and vice versa. Why not? A malfunctioning brain will have the same symptoms from the same cause, machine or man.”
“It must be depressing, being set back by lunatic brains in a box.”
“Not really. In a way, it’s actually encouraging! Because the more the robot’s foul-ups resemble human ones, the closer we are getting to humanlike machine intelligence.”
“If it is going that well—why are you so upset?”
“Is it obvious? Well, it’s probably because I’ve finally come to the end of the notes we retrieved. I’ve worked through just about everything that those notes described. So much so that now I am swimming out into uncharted seas.”
“Is there any rule that the AI in your lab must be the same as the one that was stolen?”
“Yes, pretty much so, except for some minor details. And the trouble is that it has so many bugs that I am afraid that we’re stuck on a local peak.”
“What do you mean?” Ben said.
“Just a simple analogy. Think of a scientific researcher as a blind mountain climber. He keeps climbing up the mountain and eventually reaches a peak and can climb no higher. But because he can’t see anything he has no way of knowing that he’s not at the top of the mountain at all. It is merely the peak of a local hill—a dead end. Success is then not possible—unless he goes back down the mountain again and looks for another path.”
“Makes sense,” Ben said. “Are you telling me that the AI you have just built—which is probably almost the same as the one that was stolen—may be stuck on a local peak of intelligence and not on some much higher summit?”
“I’m afraid that’s it.”
Ben yodeled happily. “But that is the best news ever!”
“Have you gone around the twist?”
“Think for a second. This means that whoever stole your old model must also be stuck in about the same way—but he won’t even know it. While you can go and perfect your machine. When that happens we’ll have it—and they won’t!”
As this sank in, a broad grin spread across Brian’s face. “Of course you’re right. This is the best news ever. Those crooks are stuck—while I’m going to push right ahead with the work.”
“Not at this moment you’re not—after lunch!” Shelly said, putting down her wineglass and pointing to the door. “Out. It’s after two and I’m starving. Eat first, talk later.”
After eating See Khrong Moo Sam Rot—sweet, sour and salty spareribs that, despite the name, were absolutely delicious—they even managed some custard steamed in pumpkin for dessert.
“I’ll never eat army chow again,” Brian groaned happily and rubbed his midriff.
“Tell that to the cook—make his day,” Shelly said. “That’s what I’m going to do.”
Lat Phroa took their praise as his due, nodding in agreement. “It was pretty good, wasn’t it? If the rest of the troops like it I’m going to work hard to get this kind of chow in the regular menu. If only for my own sake.”
Ben left them there and they walked off some of the lunch by strolling back to the lab.
“I’m enthusiastic—but apprehensive,” Brian said. “Swimming out into uncharted seas. Up until now I have been following the charts, my own notes—but they have just run out. It’s a little presumptuous of fourteen-year-old me to think that I can succeed where the twenty-four-year-old me pooped out.”
“Don’t be so sure. Dr. Snaresbrook maintains that you’re smarter now than ever before—your implants have given you some outstanding abilities. And furthermore, in the work you’ve done with Snaresbrook—analyzing your own brain—you’ve probably discovered more about yourself than a squad of psychologists ever could. It’s clear to me that you’re getting there, Brian. Bringing something new into the world.
“A truly humanlike machine intelligence.”
27
July 22, 2024
Ben found the message in his phone when he woke up. It was Brian’s voice.
“Ben—it’s four in the morning and we have it at last! The data in Robin was almost enough, and Dr. Snaresbrook finished the job by decoding some more material from my brain. It was an awful job, but we managed to get it done. So now, theoretically, Robin contains a copy of my superego and I’ve set the computer to reassembling all of Robin’s programs to try to integrate the old stuff with the new. Need some sleep. If you can make it please come to the lab after lunch for a demo. Over and out—and good night.”
“We’ve done it,” Brian said when they met in the laboratory. “The data already downloaded into Robin was almost enough. It was Dr. Snaresbrook who finished the job, adding what might be called a template, a downloaded copy of my superego. You could say that it was a copy of how the highest-level control functions of my brain operate. All memory that was not associated with control was stripped away until we had what we hoped would be a template of a functioning intelligence. Then came the big job of integrating these programs with the AI programs that were already running. This was not easy but we prevailed. But along the way we had some spectacular failures—some of which you already know about.”
“Like the lab wreck last week.”
“And the one on Tuesday. But that is all in the past. Sven is now a real pussycat.”
“Sven?”
“Really Robin number 7, after we found out that 6.9 couldn’t access all the memory we needed.”
“Blame Shelly for that,” Brian said. “She claims that when I say ‘seven’ it sounds more like ‘sven.’ So when I wasn’t looking she programmed in a Swedish accent. The name Sven stuck.”
“I want to hear your Swedish AI talk!”
“Sorry. We had to take the accent out. Too much hysteria and not enough work getting done.”
“Sounds good to me. When do I get to meet your AI?”
“Right now. But first I’ll have to wake Sven up.” Brian pointed to the motionless telerobot.
“Wake up or turn on?” Ben asked.
“The computer stays on all the time, of course. But the new memory management scheme turned out to be very much like human sleep. It sorts through a day’s memories to resolve any conflicts and to delete redundancies. No point in wasting more memory on things that you already know.” Brian raised his voice. “Sven, you can wake up now.”
The three lens covers clicked open and the legs stirred as Sven turned toward them.
“Good afternoon, Brian and Shelly. And stranger.”
“This is Ben.”
“A pleasure to meet you, Ben. Is that your given name or family name?”
“Nickname,” Ben said. Robin had forgotten him again—for the third time—as its memory was changed. “Complete name, Alfred J. Benicoff.”
“A pleasure to meet you, Ms. or Mr. Benicoff.”
Ben raised his eyebrows and Brian laughed.
“Sven has still not integrated all the social knowledge involved with recognizing sexual distinctions. In fact, in many ways, it is starting from scratch, with entirely new priorities. The main thing is completeness first. I want Sven to have as well rounded an intelligence as that of a growing child. And right now, like a child, I want to teach him how to safely cross streets. We’re going for a walk now—would you like to come?”
Ben looked at the clutter of electronic machinery and his eyebrows shot up. Brian laughed at his expression and pointed to the other end of the lab.
“Virtual reality. I can’t believe how much it’s improved in the last ten years. We’ll get into those datasuits and Sven will join us electronically. Shelly will supervise the simulators.”
The suits opened at the back; Brian and Ben took off their shoes and stepped in. They were suspended at the waist so they could turn and twist as they walked. The two-dimensional treadmill floor panels let their feet move in any direction, while other effectors inside the boots reproduced the shapes and textures of whatever terrain was being simulated. The featherweight helmets turned with their heads, while the screens they looked into displayed the totally computer-generated scene. Ben looked up and saw the Washington Monument above the treetops.
“We’re in Foggy Bottom,” he said.
“Why not? Details of the city are in the computer’s memory—and this gives Sven a chance to deal with the rotten District drivers.”
The illusion was almost perfect. Sven stood erect next to him, swiveling its eyes to look around. Ben turned to the image of Brian—only it wasn’t Brian.
“Brian—you’re a girl—a black girl!”
“Why not? My image here in virtual reality is computer generated so I can be anything. This gives Sven an extra bonus of meeting new people, women, minority groups, anyone. Shall we go for a walk?”
They strolled through the park, hearing the sound of distant traffic, pigeons cooing in the trees above them. A couple came the other way, passed them, talking together and completely ignoring the shambling tree robot. Of course—they were computer-generated images as well.
“We haven’t tried crossing any streets yet,” Brian said, “so why don’t we do that now? Make it easy the first time, will you, Shelly?”
Shelly must have worked a control because the heavy traffic in the street ahead began to lighten up. Fewer and fewer cars passed and by the time they had reached the curb there were none in sight. Even the parked cars had driven away, all the pedestrians had turned corners and none had returned.
“Want to keep it as simple as possible. Later on we can try it with cars and people,” Brian explained. “Sven, think you can step down off the curb all right?”
“Yes.”
“Good. Shall we cross now?”
Ben and Brian stepped into the road.
“No,” Sven said. Brian turned to look at the unmoving figure.
“Come on—it’s all right.”
“You explained that I was to cross the road only when I was sure a car was not coming.”
“Well, look both ways, nothing in sight, let’s go.”
Sven did not move. “I’m still not sure.”
“But you’ve already looked.”
“Yes, there was no car then. But now is now.”
Ben laughed. “You are very literal, Sven. There is really no problem. You can see both ways for a kilometer at least. Even if a car turned the corner doing one hundred kilometers an hour we could get across well before it reached us.”
“It would hit us if it were going five hundred kilometers per hour.”
“All right, Sven—that does it for today,” Brian said. “Switching off.”
The street vanished as the screen went dark; the backs of the suits swung open.
“Now, what was that about?” Ben asked as he backed out and bent to pick up his shoes.
“A problem that we’ve seen before. Sven still doesn’t know when to stop reasoning, to stop being outlandishly logical. In the real world we can never be one hundred percent sure of anything, so we have to use only as much knowledge and reasoning as is appropriate to the situation. And in order to reach a decision there must be a point at which thinking has to stop. But doing that itself requires inhibition skills. I think the reason Sven got stuck was that his new superego was inhibiting the use of those very skills.”
“You mean it turned off the very process that was supposed to do the stopping? Sounds suspiciously like a paradox. How long will it take to fix?”