by Scott Sibary
Still she bided, listening for clues to his intentions.
“Consider this,” he said. “In this ‘age of misinformation,’ people have had no reliable system to sift through the flood of hyperbole and concocted data.”
“You mean, to get information consistent with a coherent understanding of the world.”
“Exactly.”
“OK, then,” she said. “If we make excuses for our mistakes and wrongdoings, where does that lead us? How do we improve?”
“By wanting to. We maintain our ideals and work to pursue them. That’s what I see our project as trying to do. We want our AI to have analytical powers and to help people make better decisions. Hasn’t that been a goal since computer modeling was used for weather forecasting and the like?”
“Yes, I sincerely agree with you; you are saying exactly the things I believe.” And she wondered what AnDe was not saying, and whether he was concealing other roles he might be playing. She studied the profile of the chauffeur from the internal security agency, and wondered more.
The car came to a stop at a light, and she watched the cross traffic: each vehicle whirling off in pursuit of its mission, seeming little in the multitude—separate lives on crossing paths.
And Solveig wondered how aligned were the paths of the two lives sitting beside each other.
She ventured in a cautious voice, “This new World Council might be given considerable power beyond just managing transportation and communication systems. It’s even been proposed that the Council have a peace-keeping force authorized to engage in combat. If the only restraint on such a power is from the participating countries, what’s to guarantee human rights?”
“That’s why our project is important. My government sees the major function of the WEA as providing neutral data, analyses, and recommendations to the World Council. Mandatory public disclosure of those reports should prevent the Council from acting on hidden agendas.”
She shook her head. “Neutral data? Good luck. If we don’t agree on what input data is neutral, then what this EA puts out won’t be respected. And reports can be manipulated.”
“But that would undermine the Council’s credibility, and its power. And the World Council will want to be well-accepted.”
I’m not so sure, she thought, as her fingers rapped on the armrest. “To professionals like us, the bigger problem may be the use of the EA for short-term agendas, which tend to be more selfish.”
“Or, such agendas may be the result of political compromise. Sometimes even optimal.”
She drew back into her seat, cringing. “I thought you had a more idealistic approach to optimization.”
“Hey, I was teasing. My faith is not blind. I know policy makers won’t always follow what’s recommended.”
“Exactly!” Her hand slapped the area of the seat halfway between them. Her words came quickly. “Picking and choosing—that’s my concern. Even with our safeguards, the system could be used to increase social control. Even inadvertently. As systems become more complex, only those who comprehend the complexity will be able to exert control. Power gets centralized. Some interests will gain from that, and will push for more of it. It’s a vicious circle.”
Twisting her torso to face AnDe, she braced herself by pushing with a firm hand against the seatback in front of her. The chauffeur turned a startled face to AnDe, who waved and seemed to utter an apology or explanation, or maybe a comment on her.
“There’s another problem,” she continued. “If decision-makers don’t adequately understand a system, they can make bad decisions without realizing it, just because they rely on the system. Like a meteorologist who at times would do better relying more on her own expertise and less on the computer models. No offense, but I seem to see that happening a lot, including here in China. The reasoning of artificial intelligence just gets too complex for us to identify the errors or weaknesses within it.”
She paused to wait for his reaction. Getting none, she threw at him a more personal view. “And then there’s a more hidden problem. Any tool can become an advantage to those who have it and a disadvantage to those who do not. Like well-staffed and well-equipped schools in wealthy neighborhoods, but not in poorer ones. It helps class lines become more rigid, and that’s no accident. But creativity and inventiveness can come from any of those children out there, even the poor. Even those who supposedly have no talent. We must ensure that our system is not oriented to prefer or advantage any one group of people over another.”
She turned away from the suddenly serious look on his face. I did just what I wasn’t going to do—argue policy about the project—and he’s letting me do most of the talking. But how can a thinking person avoid these issues?
When she looked at him again, he was waiting with a wide grin. “And that,” he gestured with open palm towards her, “is why you are here!”
Caught off guard, she leaned away. Was he guessing, or teasing again? In any case, it was true. She was there because of those concerns, and because of her ability and dogged determination to pursue them.
As the car pulled in front of the restaurant, she refused to leave the conversation hanging on his last words. “My other point is this: how well systems and societies work is still on the shoulders of people. Until an AGI has some kind of, I would even suggest, conscious mind that can make decisions as independently as we do, then we humans—each one of us—must take full responsibility for what happens. No pointing fingers at ‘the system.’ ”
“Meaning, no room for mechanization of the spirit.”
She nodded slowly, as he stared at her with intense eyes.
Chapter Ten
The bar area’s pine tables and chairs matched the pine paneling on the walls. Genuine wood. She noted a deep scratch on a panel they walked by. The wear marks on the wood flooring and furniture were extensive and had never been refinished. Behind the bar, the two-meter-long carved sign read “Saloon.”
Watching her reaction, AnDe said, “I’d hoped you’d like this atmosphere. It reminds me of a part of Manchuria where I used to go walking. Also old American Westerns.”
“It’s great,” she said. “Brings back memories. Good ones, mostly.”
He chose a small table in the middle of the room, and they sat in wooden chairs with nail-studded dark brown leather upholstery. He ordered a glass of wine, and Solveig insisted on having flower tea.
“Most people here order beer or whiskey. I . . . Well, you can see for yourself the ample selection of California wines. And they keep them properly.”
“You might tempt me.” She glanced at the list.
“I’ll help you with the foods on the menu when we go in to dinner. I find this bar a pleasant place, just to relax and talk.”
“Of course.” She spoke in a low voice as people arrived at tables on either side of them.
“What kind of fiction do you like to read?” His words seemed to pop out, impromptu. “Do you like science fiction?”
“Not often. I read fiction that offers me an exploration of other psyches—their ideas and feelings—or just beautiful writing. Like listening closely to music, maybe.”
He leaned back, his chair creaking, and gazed into the space above them. “As a kid, I wanted to be a musician,” he said wistfully. “I was young and thought it would be fun.”
“Yeah?”
“Yes. It was my first love. Popular music from 1900 on. Jazz, swing, Broadway musicals, and rock, of course. That got me into trouble.”
“How?”
“I began with drums I borrowed. After neighbors complained, my parents replaced the drums with a keyboard and told me to use the headphones for rock music.”
“Yes, and?”
“Well, I wasn’t good enough to make a living at it, but I still put these to use on a keyboard now and then.” He held up ten fingers and wiggled them. “So in college I studied psychology.”
“That’s a switch. Why?”
“I like to give credit to my Tibetan grandfather, Mu Lobsang Norbu. He was my mother’s father. He talked to me about psychological development, and taught me that no practice, not even music or meditation, would be fruitful if not undertaken with a wise frame of mind. I wasn’t sure what he meant. At first I thought he was just echoing my parents’ desire that I give up the drums, but over time I got clues. I’d rush into ventures, like racing an electric scooter or asking for a date. All the spills and rejections forced me back to the question. I realized I didn’t even know what was meant by fruitful.
“So I asked him. ‘Ha!’ he roared.” AnDe imitated the instructive manner of a sage grandfather. “ ‘It is an excellent question. Perhaps you should ask your father’s mother.’ ”
“Not what you wanted to hear.”
“No, it wasn’t. Lobsang’s wife was Mongolian, but that other grandmother was Manchurian, from Harbin. She reveled in the mysticism of the Tao, and she could speak in riddles. She winked and said, ‘Oh, your grandfather is very wise, you know. Fruitfulness comes from being in alignment with the Tao.’ ” AnDe let his expression sag. “Being only seventeen, I thought, sorry for asking. So it was back to wisdom. Maybe you see why, in college, I found myself studying psychology.”
“But you’re not a psychologist, are you?”
“It wasn’t long before my folks told me I needed to become employable. I acceded without resentment, as Confucian ideals required. And my parents’ directive let me choose among the ten thousand paths.”
“How many!”
“It’s a Chinese metaphor. Unlimited, many, enough for me. I focused on the human-machine relationship. I completed a doctorate in computer science at Tsinghua University, then worked on research projects for twelve years. By my mid-thirties, it was obvious that most of those I worked with were better technicians. But I had managerial skills.” AnDe held up a rigid finger and laughed.
Solveig smiled.
“I was pretty good at reading facial expressions,” he said. “I’d use that ability to defuse conflicts within my groups. So my career took a management track. You know, every path has something to offer: a light to guide you to your elusive goal.” His expression went from eager to perplexed. “Funny thing is, as much as I’ve pursued it, I’ve never reached that shining brightness: the . . . the radiance, telling me I have arrived.”
“You believe in such things?”
“Better to believe your path will lead to something. Better to be naïve than deflated by pessimism.” He tilted his head at her and raised his eyebrows.
“Then just when you thought you would disappear into the dull drone of the productive masses, you found yourself in charge of a project that could make waves in history.”
He sat bolt upright and gawked at her. “Exactly.”
A sense of presence filled her: the surrounding camouflage of loud voices and Manchurian pop music seemed to press into her ears, the aromas from appetizers on nearby tables infused her breathing, and the man opposite her had become something more complete than a colleague with a title and an unknown number of agendas. Such a person could be worked with, despite the uncertainties.
Solveig leaned back in her chair, letting her legs uncross and her feet stretch forward. She idly caressed her half-full tea glass, then raised it to her lips. She nearly spilt it when she heard his next question.
“What about Isaac Asimov? Have you ever read any of his books?”
Not him again, she thought. Just don’t bring up I, Robot. “No, although I know a few people in our field who seem to find his stories inspiring.”
“When I was in high school, I read several of his books. Even a textbook in English. But it took too much time and was way out of date.”
“Textbook?”
“Yes, he was a biochemistry professor, and an anti-war activist, which made him more acceptable in this country. He was afraid the human race would destroy itself with nuclear bombs.” AnDe paused, as if waiting for acknowledgement.
She nodded, then shrugged her shoulders. She suppressed the urge to tap her foot impatiently.
“He hoped his nonfiction would steer humanity away from the path of war,” he said. “In that way, he was like Kung Fu-Tze, or Confucius, as you call him.”
“Interesting,” she said, finishing her flower tea. An audible rumbling came from her abdomen.
“Yes. Asimov believed science disciplines our minds, helps us think critically and logically. I kind of thought you were motivated the same way. That’s why I asked.”
“In a sense, I am. But I’m skeptical about relying on education.”
AnDe stared straight at her, his face colored with confusion. He twisted his head around slowly, as if to loosen the hold of something tightening up on him—something he had wrestled with before. “I wonder if anyone has said you were like the female character in his first sci-fi book on robots. She was the developer of the AI psyche.”
Her words shot out. “You mean because she was female? Yes, I do know about that character, and you should understand your comment is either very sexist or you picture me like her: the cold and sterile stereotype of a woman in science. Have you ever considered what it’s like to work in an environment—”
He held up both hands and shook his head, stopping her barrage. “No, no, I don’t have that image of you at all.” His tone became apologetic. “I was just making conversation. Sorry.” He took a long breath.
The waiter paused at their table to refill her glass of tea. Solveig picked it up and inhaled the warm floral fragrance, letting the steam leave a cooling moisture on her cheek and brow. She listened to the burbling mélange of voices and tried to recapture the vibrancy of her sensations.
AnDe looked up and off to the side. His words came out as if idle musings, then seemed to hang in the air like fishhooks, with barbs. “I suspect some politicians would be happy if the WEA would just fake it and make it appear the imperatives were inviolate. Kind of like the International Declaration of Human Rights.”
She cringed. “What?! The guideline for acting altruistically and keeping loyal must be foundational. It’s our most critical step in achieving alignment with human values: the guarantees that no one, not even the artificial intelligence itself, can mess with.” She shook her head in disbelief at his comment. “It’s the general idea of the Protection Lock, you know, and I’ve spent years on it. Because we need it. How could you question it?” She twitched with rebellious energy, like a horse not wanting the bridle. Corralled by his questioning, she thought.
Her shoulders widened and the fabric of her dress stretched. She took a deep breath, and her thoughts bolted out. “I’ll tell you this much. Asimov wrote about freely moving machines. Maybe the entertainment media plays to what people will relate to, so AI gets put into human shape to make them seem like real characters. But our real-life characters are the AI—and someday the AGI—in mainframe computers that are sending even professional and highly skilled people into unemployment. Meanwhile, wages for most jobs keep falling. Ever notice what happened to farms as farming got mechanized? Except that now, there’s not an entirely new, more productive and better paying industry to employ all those people. This is a revolution the owners of technology are prepared to win, for better or worse. And they don’t want the rest of us making a stink about side effects.”
“I’m not actually disagreeing with you.”
“Good. Now, imagine manufacturing for the truly private market—not tax-subsidized or governmental—robotic or android sex-prostitutes. Why not? Because the consumer, if I may call him that, insists on the real thing, and the market has no trouble buying or capturing an unending supply of destitute teenaged girls as low-cost, drug-addicted sex slaves forced to work in brothels!” She kept staring straight at AnDe.
He was looking down, elbows on the table and arms folded. He’d pushed his drink away from him. He almost appeared sick. “I know, I know. I do understand that.” He didn’t look up as he spoke.
Solveig waited, then spoke in a faint voice. “Forgive me for getting into such an ugly subject, especially at dinnertime.” She took a couple more breaths. “My understanding is that eventually Asimov’s story turns out well. Is that right?”
“In the first book, yes.” AnDe regained his composure. “The epic series ends optimistically. The AI guides the humans towards greater fulfillment. But that storyline lasts for thousands of years. I have no confidence in forecasting for even fifty years from now.”
“I can imagine lots of things that could happen a hundred years from now.”
“Your vision lets you peer so far ahead?”
“Mental predictions, extrapolations, guesses, understanding from what I sense, see, hear.”
“You’re impressive.”
And you’re sounding solicitous, she thought.
AnDe tilted his head, as if listening to something in the background. A smile crept over his face. “You hear that? Do you recognize the song?”
“I don’t know any Chinese songs. I’ve just begun learning a few words.”
“I mean the tune. They’re singing the theme song from Bonanza!”
“Sorry. Maybe some of my older relatives would know it. Is it a Morgan Kane song?”
“Who? No, the American television series. I’ll confess: on that trip to the States, I visited the Ponderosa Ranch where they filmed it.”
Solveig laughed. “And next you’re going to tell me you keep a horse.”
He replied with a drawl. “I sure do. In my imagination.”
“And you play Westerns in virtual reality?”
“No, I don’t get into alternative-reality gadgets.”
Reality, she thought. How can you know when you see it? When do you know you’re being told it? When it’s consistent? But even then, it’s only contingent, still requires testing: repeated testing.
“Is that because you understand that even your perception of all this”—she waved her hand around at the room—“is never more than another virtual reality?”