Society of the Mind


by Eric L. Harry


  She waited, but he said nothing further.

  "Mr. Gray, when you told me about how you could, you know, operate robots remotely…"

  He looked up at her. "Teleoperation," he supplied.

  "Yeah. Well… was that place where I went some kind of simulation? What it would be like to teleoperate a robot on another planet or something?"

  "I don't know," he replied in a faraway voice.

  "You don't know the place I'm talking about?" Laura asked. "The place with the pitch-black surface and some sort of… landing craft?"

  "Oh, I know the place very well," Gray replied in his tired monotone, looking up at her but clearly lost deep in thought. That was all he would say.

  Gray remained quiet on the walk to the car parked in front of his house. He'd suggested they go for a drive, and Laura didn't press him about their destination.

  When the car doors closed, Laura said, "Hoblenz doesn't know about that place, does he?" Gray shook his head.

  "Assembly building," he directed in a clear voice, and the car took off. "Everything is on a need-to-know basis. With the computer assuming control over more and more of our operations, the list of things even the department heads need to know is growing shorter."

  "You act like that's a good thing," Laura said, shaking her head.

  They passed through the gates and turned left on the curbed road. "Keeping people in the dark. Why are you so secretive? I mean, surely you trust Margaret and Georgi and Griffith and the others."

  He opened his mouth to speak, but the words seemed to stick in his throat. Finally he said, "It's a burden to know certain things. To have to carry them with you all the time. Having to live with their weight on your shoulders."

  Laura caught a glimpse of Gray shaking his head before they plunged into the black mouth of the tunnel. She felt emboldened by the cloak of darkness. "The computer knows your big secret."

  When they burst out into the dim light of the evening, Laura saw that Gray was looking at her. "Of course it does," he said. "The computer knows almost everything. This island is highly automated and the computer is at the center of all we do." His gaze drifted out of the window. "It's we humans that are becoming increasingly superfluous to the day-to-day operations."

  "And you really think I can help the computer, Mr. Gray?" Laura asked, then anxiously awaited his response.

  He turned to her. "Please, call me Joseph," was all he said.

  The assembly building was off-limits to humans. Those who did go inside, Gray explained, made quick runs in from the half dozen trailers that had been set up just outside.

  He led Laura through the duster into the massive building. They didn't venture out onto the main floor, but instead took a side door into a hallway. It led past empty offices to a door labeled "Nursery." Laura followed Gray in.

  Along the far wall were arrayed a series of tall stools with low backs. They faced a wide plate-glass window overlooking a large room from on high. Gray ushered Laura past rows of computer terminals toward the window.

  In front of each stool was a dashboard filled with monitors and controls. Contraptions that looked like empty gloves rose from the deactivated workstations, and devices like the grip of a gun hung suspended from the ceiling by booms.

  "We used to do this by hand," Gray said, apologizing for gear that to Laura seemed futuristic. He motioned for her to climb up onto a stool.

  When she did, Laura saw a strange room one story beneath the window. A huge Model Seven stood precariously amid a clutter of objects. Balls, cubes, wedges, and cones — all jet-black — were strewn about the room's floor, which like the walls and ceiling was covered in bright white padding.

  The gangly robot torso was supported by straps, as were its spidery legs and stunted metal arms. Connected to the straps were long cables — some taut, others loose — that descended from a carriage on the ceiling.

  The overall impression Laura got was of a puppeteer whose marionette danced on the floor far below.

  "When they first come off the assembly line," Gray explained from the stool next to hers, "they're basically helpless."

  The Model Seven extended its slender arm uncertainly, knocking one cube to the floor from its perch atop another. The robot's gripper returned to its side with spastic starts and stops. A claw hanging from the puppet master's small arm replaced the fallen cube atop its twin. A series of cables then tightened, and the sleeve encasing the Model Seven's arm smoothly led the pupil's gripper to the block. When the metal claw closed on the cube, the cables guided the arm to the floor with machinelike precision. The arm returned to the robot's side and sagged as the sleeve's cables loosened.

  "The Model Seven is being trained by what's called lead through programming," Gray said as the robot waited for the cube to be placed back on its pedestal. "The Seven's controller records the motions and then plays them back exactly as learned. Watch."

  This time, the robot's arm rose to the cube on its own, grabbed it, and lowered it to the floor — its movements precisely duplicating those it had just been taught.

  "We were absolutely amazed at how much processing it took to program mobility into the robots," Gray said. "We humans think that rising out of a chair, walking across the room, and getting a glass of water is the easiest thing in the world, but that playing chess is difficult. In fact, it's just the opposite. We're just so incredibly proficient at motor skills that they seem easy. High-level processing — like chess, differential equations, musical composition — seems difficult only because we're so bad at it. That's because high-level thought is the most recent addition to our mental repertoire, whereas we began learning locomotion billions of years ago."

  Laura let Gray finish, and when he looked over at her he said, "Oh, I'm sorry. I guess I'm venturing into your field."

  "No, that's okay. I'll give you an A minus so far."

  Gray smiled and turned back to the room below. "A lot of rudimentary subsystems are hardwired in the robots right on the factory floor. Then, the main computer runs them through a few trillion simulations. Finally they get here, but as you can see" — Gray nodded at the infant robot, which was all trussed up in its web of cables—"they're still pretty helpless. They can't even stand up at first."

  "How old is this one?"

  "Oh, I'd guess it came off the production line about a week ago. After two weeks in the nursery it goes to the Basic and then Advanced Neural Programming Centers."

  "Preschool and kindergarten?" Laura asked, and Gray smiled.

  They watched as the gangly, four-legged robot tried to walk straight at a row of new objects. The cables went taut on practically every step as the robot simply forgot to plant a foot.

  "Why don't you just program all the neural nets the same way?" Laura asked. "I mean, once one of them gets everything down, it would seem a lot more efficient to use its program for all the rest. It must be expensive to train them one at a time like this."

  "But it's the only way," he said, spinning his stool around to face her. The earnest expression on his face captivated Laura. "You see, neural nets have two tremendous advantages over digital processors. The first is that they fail gracefully. If something goes wrong in one of their nodes they just reroute their processing instead of crashing the whole system. And then, there's the second major advantage, which is obvious."

  "What's that?"

  With eyes glistening Gray said, "They develop brilliance." His voice had a dreamy quality to it. "There were theories, but nobody knew till we tried. The trick is in the process of learning. You can tell the robot that if you push one cube off another it'll crash to the floor. But if you allow it to learn, it begins to make generalizations almost immediately!"

  He was energized. He looked radiant with excitement. "Every traditional computer is designed to respond to the same set of instructions in exactly the same way every time. But neural nets are all different. From the moment they come off the line their abilities and preferences and tendencies vary. Some are better at
detail work. Others have such a highly developed sense of kinesthesia they could walk through a china shop without rattling a plate. Others are great problem-solvers and are best at pure, abstract thinking. They have the same hardware, but from the complete jumble of connections in their nets they become individuals."

  Gray slid off his stool and began to pace. "It's just like the diversity of the human population. Take Georgi — an absolute genius at the physics of optical computing. He's also the best chess player I've ever come across. And then there's Margaret, who has the finest understanding of databases in the world. She's also a single mother and the most devoted parent I've ever met. She goes home at five sharp every night to cook dinner for her kids. Dorothy is a twenty-first-century Pasteur, but you should also hear her play the piano. She was a child prodigy, and she was paraded around at age four like some carnival act. And then there's Griffith — a balding roboticist who thinks he's a Hell's Angel. He drives a Harley around the beach on his days off playing 'Born to Be Wild' on his DAT player — always 'Born to Be Wild,' never anything else."

  "What about Hoblenz?" Laura asked.

  "He's a warrior. A member of an even more specialized breed who lives for the hunt. It used to be that all humans had to be able to defend their lives to survive. The poor fighters were culled from the herd along with the genes that made them weak. Now we pay others to do our fighting for us, and we equip them with tools so productive that excelling at killing requires less overtly violent skill, and we need fewer and fewer of them."

  In the quiet that followed, Laura agonized over something that bothered her. Finally, she gave voice to the cause of her disquiet. "Why are you showing me all this?" It hadn't come out right. "I mean, don't get me wrong. I'm fascinated by all this stuff, but I just don't understand how it relates to my job."

  "I'm asking you to do something that's never been done before. I'm asking you to tell me whether the computer is emotionally disturbed, and if it is…"

  Laura waited, but then had to prompt him. "Yes?"

  "… can you cure it in the next three days."

  19

  After returning from the tour of the nursery, Laura got back to work in the quiet of her office. She was surprised to find her hands stiff and sore from all the typing. She sat back and rubbed her hands — her attention drawn to the black eyeball in the wall beside the door.

  "Are you sure you can't hear me well enough so that I can just talk?" she asked in a raised voice.

  The computer answered her on the screen.

  Laura sighed and flexed her fingers like a pianist — the joints in her hands popping. "Never mind," she typed. "So, how do you know when you've got a virus?"

 

  "You can't sense their presence on your own?"

 

  "But you have no clue what it is?"

  the computer replied.

  Laura stared at the computer's reply with growing alarm.

  When she returned to Gray's house it was late. She was exhausted, but instead of heading upstairs for bed she felt compelled to go in search of Gray. The door to his study was closed, and Laura knocked.

  "Come in," Gray called out through the thick wood.

  She opened the door to see that Gray was rocked back in his chair — reading. His stockinged feet were propped on his desk, being warmed by the blaze in the fireplace.

  "Oh, Laura," he said. He dropped his feet to the floor and placed the papers he was reading on his desk.

  "Sorry to bother you so late. I just thought we might need to talk."

  Gray stood and walked around his desk, motioning for her to have a seat on the leather sofa.

  She suddenly wondered what she had come there to say. She'd begun half a dozen different conversations in her mind on the ride up the mountain, but she'd never gotten all the way to the point of the talk.

  "I'm glad you stopped by," Gray said as he sat in a chair beside the sofa. "I was just reading your latest article. It's very interesting. I had no idea what your views were."

  Laura didn't know how to take the comment. "But if you didn't know my views," she began, growing more and more defensive, "how is it that you offered me this job?"

  "Oh, I didn't pick you out," he said, and Laura was instantly crestfallen. She felt the blood rush to her face. "The computer did."

  "The computer?"

  Gray shrugged. "Would you like a drink?"

  Laura wasn't very much of a drinker. "Vodka tonic," she replied, sinking back into the thick cushions and slumping low.

  Gray busied himself at the bar. "About a week ago," Gray said as he stirred the ice with a tinkling sound, "the computer told me it wasn't feeling well. Then, about three days ago, it said it wanted to talk to someone." Gray handed Laura the cool glass, and she took a gulp of the tart liquid. "I asked who, and it gave me your name." He sat and took a sip of his own drink.

  "I see." She was almost afraid to ask. "So what do you think" — she nodded toward his desk—"about my paper?"

  "Like I said, it's very interesting. The writing is a little rough, but you're very close to the mark."

  "Wait a minute! What do you mean a little rough? I've never published a sloppy paper in my entire life!" Gray seemed surprised by her outburst, and Laura took a deep breath to calm herself.

  He got up and walked over to his desk. "It's the substance that counts. I have an engineer in my space design bureau who couldn't write his name in the sand with a stick, but put him on a…" He handed her the sheaf of papers.

  Circled near the bottom of the first page was a typographical error. The word "anosognosia" was improperly spelled "anosognesia."

  But it wasn't her last and highly controversial presentation to the Houston AI Symposium, or any of her other published papers.

  It was a draft of an unpublished paper that only she had seen.

  "Where the hell did you get this?" Laura snapped, waving it angrily in the air before him.

  Gray looked confused. "I assumed it was one of your recent publications."

  "I haven't published this! I've never even shown it to anyone!" Her jaw dropped. "You got it off my computer at Harvard. You stole this?"

  Gray slumped in his chair and gazed down at the drink he cradled now in both hands. He seemed deeply troubled. "Yes," he said in a distant voice.

  She stared back at him — incredulous. "That's all you have to say?" she practically shouted. "You, who talk about how much you value privacy?"

  "Laura," Gray said calmly, "we have a major problem on our hands. I'm losing control of the computer. It's starting to… to act independently."

  "Wait. Do you mean to tell me it was the computer who stole my paper? That you had nothing to do with it? That it found this," she said, raising the paper again, "and then picked me to come down here?" Gray just frowned. "Why? Why would it do such a thing?"

  "The answer's obvious. The paper is brilliant, even if your conclusion misses the mark."

  "Yeah, well… don't be so damn sure about that flattering conclusion." She was seething, but the word "brilliant" rang in her ears.

  "I'm fairly certain about the conclusion," he said without malice. "But regardless, the thinking is original. It's a true advance, and it's not often you can say that about what comes out of your people."

  Her head shot up. "And which 'people' is that?"

  "Academics."

  She opened her mouth to argue, but the thought of Paul Burns churning out drivel on his way to tenure made her hesitate. She nevertheless sank back into the sofa, shaking her head at his incredible arrogance.

  "The computer wants to believe what you say in that paper," Gray said, his tone growing more pained. "It would help the computer resolve certain personal… dilemmas if what you say is true. Let me see if I can paraphrase the points you make."

  "Oh sure! Go ahead. I'm sure you can boil it all down to a single sentence and then knock the stuffing out of it."

  "You believe," he began, ignoring her barb, "that there is no significance to the thing we call our 'self.' That we concoct selves just like spiders spin webs and beavers build dams. We don't know why we do it. We don't even realize that's what we're doing. But we create this artificial concept of being a distinct 'self' because those humans who were so 'selfish' as to value their own lives above others' fared better in the game of natural selection than those who were more 'selfless.' They passed down their egocentricity, which we have come to refer to as individualism."

  He had summed up her paper perfectly, and it infuriated her.

  "That's an oversimplification!"

  "Well, all of that is the part you've gotten right."

  His comment threw her, and she didn't know what to say next. "So… what? Are you saying you agree with me?"

  "With that part, yes."

  "But that's all the paper says."

  "Not exactly. There's an underlying criticism of the process you so accurately described. You continually belittle the effort by referring to it as 'merely' constructing a self, or a self being 'only' an artificial construct. The paper is replete with examples of your personal philosophy."

  Laura's mouth dropped open. "Oh, I see! I have a 'personal philosophy,' and it's my personal philosophy that's all wrong!" Gray nodded.

  "O-o-oh!" Laura burst out, slapping her hands down on her thighs in aggravation. She grabbed her drink and walked over to the bookshelves to face away from the offensive man.

  "You asked me what I thought."

  Laura took another gulp of her drink, then said, "So, just what is my personal philosophy?"

  "I call it communalism."

 
