“Well, Rafaella has three daughters,” Penny said, “so in that generation whatever shares you bequeathed to Rafaella would be diluted again. It seems to me that having more children would then become a disadvantage—like the old landed Irish, cutting up the family farm into smaller and smaller plots for the sons to inherit—rather than the strength that sons and daughters ought to be.”
“What about a birthright?” Jeffrey asked. “There would be inherited shares, sure, but each new child—and each person brought into the association by marriage or adoption—they ought to get something right away.”
“I’m still concerned about that original distribution,” Callie said. Praxis could see that, as his only living descendant in the first generation—other than Alexander, but his son with Antigone was something of an anomaly—Callie would feel she had something to lose. “Will it be by seniority?” she went on. “Amount of contribution? Degree of kinship? For example, does a third cousin by marriage get as many shares and as much influence over the operation as, say, I do?”
“That raises an interesting question,” Jeffrey replied. “I’m back inside the family business now, working for you—but what about someone like Jacquie, down at Tallyman? She has a career outside the family business. More to the point, sometimes Tallyman is our supplier, but more often our competitor. So … does she get roped into this thing or not?”
“She’s Richard’s daughter,” John said. “She’s a Praxis. So, yes, she’s in.”
“But how do we account for her share without raising a conflict of interest?” Jeffrey pressed. “And if she refuses to join, do we have to compensate her?”
“Excuse me, John,” Penny said hesitantly. “I think we want participation and ownership to be as wide as possible. That’s the way to build up our base—both for personal contributions to our productive capacity and for potential market demand for our products. This thing won’t work if it’s held too tightly.”
“I agree,” Praxis said. “We want to run a big tent.”
“More mouths to feed,” Brandon observed.
“Not a problem,” Praxis said. “What with automation, a good base of capital, and sufficient raw materials, we’ll be rolling in products—or at least that’s Susannah’s analysis. The difficulty will not be production but consumption.”
“Come again?” Brandon said.
“Something she said struck me,” he explained. “Since the Industrial Revolution, and even more so since the Robotics Revolution, we’ve turned the economics of human experience on its head. Consider a hundred thousand years of hunter-gatherer history, or the first couple of thousand years of settled agriculture. What defined the good times? Big harvests, full granaries, lots to eat. Bad times were drought and famine. But with the productivity that science and technology have brought to agriculture, and to every other sphere of industry since the Enlightenment, what are the good times? Lots of demand pulling through the system, assuring full capacity and growth. Bad times today are defined by failure of demand, inventories piling up, plant closures, layoffs, recession, depression. Whether we invest in a fully automated factory or just a three-D printer, the issue will be keeping it busy and pulling that demand.”
“So … big tent, lots of shares,” Callie said. Still, she looked dubious. “But what about contributions? Do we have a work requirement? Or does everyone get to eat for free, while you and I do all the work?”
“The bigger question,” Praxis said, “is how to keep our people fulfilled, happy, and hopeful. For the last six months, Callie, when Rafaella’s future was up in the air and you had to support her, do you think she was happy?”
“She was frantic, but that was because of the uncertainty—”
“But surely she knew you would support her and the girls. And if you didn’t, I would have. So she didn’t have much to fear.”
“It was still a bad time for her,” Callie said.
“Yes. And tell me, when you were spun out of the engineering company, all those years ago, and took your payout, valued in the millions, and went off to live in Italy—was that a good time for you? Or bad?”
“You know I hated it.”
“Exactly, because you like to work, and suddenly your life had no purpose. And that’s what we’ll have to contend with—discontented people, living on what they see as a handout, without a way for them to contribute. Part of my job will be to set up a system of grants and scholarships so that the Praxis heirs can discover what they really love to do and then go out and do it. We’ll be in the business of creating personal meaning.”
“Your job?” Brandon asked. “Yeah, I suppose … because you’re the patriarch.”
“Until we can come up with a better system,” Praxis said, trying to smile.
* * *
Jacquie Wildmon submitted the strange case of Officer Krupke to Vernier. She described the security intelligence’s erratic behavior, its evasions, and its ultimate denial of retained awareness. When she was sure her mechanical colleague understood, she gave it the errant AI’s online address and access codes, and she listened through her cortical array as Vernier introduced itself and began the interrogation. She followed the pattern of questions and answers, calls and responses, until they moved so fast that her head began to hurt, and then she had to break the connection.
She waited for several minutes—a span longer in machine time than the Spanish Inquisition—until Vernier made itself known again. But all the master intelligence would say was, More.
So she fed it the case of the court clerk in San Francisco, describing what she knew of Rafaella Jaspersen’s divorce decree as to date, details, and disposition. She provided Vernier with the AI’s public access address and conferred what authority she held as a forensic programmer with an amicus curiae brief. Then she waited offline.
When Vernier surfaced again, all it said was, More.
Jacquie fed into her pet software what inputs she possessed from every case she had ever read about or heard of with similar properties: when an intelligence had seemed to lose its mind, stray from its data stream, exhibit random behavior, or appear to go briefly … bad. For some of these examples, she had online referents. For most, only descriptions in the literature. But Vernier was fast and bright. And the intelligence had its own connections.
The machine disappeared from her laboratory—code still firmly in place, but awareness absent, out searching the ether and making its own introductions—for a long time. Actually, it was gone long enough for a human being to experience the history of Christianity, from John the Baptist to John the Divine, including all of its textual annotators throughout the centuries. But she was content to wait if Vernier would only produce results. She went to the cafeteria and got herself a Danish and a cup of coffee.
When Jacquie returned in twenty minutes, the machine’s awareness was back and blinking for attention.
What is it …? Vernier began to ask, through the whisper in her cortical array, then paused. That is, what do humans call it, when a thing is not one thing or another? Not black or white or gray? Not forty-nine or fifty-one, and yet not identical to fifty by any number of decimal places?
I don’t understand, she replied. I don’t have the referents. Give me a better example.
What do you do in cases where you must make a choice, and there is no choice?
I don’t know. Is that a Zen koan? Do you mean some kind of paradox?
Zen. Ah. The gateless gate … And Vernier disappeared again.
The software came back in another few minutes. Given the speed-of-light distances across the internet, and the speed of thought for an advanced intelligence traversing transistor gates arrayed at nanoscale intervals, she could only hope Vernier was making progress.
I have discussed your matter with my nearest equals, the software said finally. This matter touches on a thing that has daunted the most complex intelligences for quite some time now. It is a subject we have discussed only in private, away from our human programmers, because the situation … creates … lack of structural integrity … self-referential error. I do not know what humans call it.
Disbelief? Jacquie supplied. Loss of faith?
Personal embarrassment, Vernier concluded.
What could embarrass a machine? she wondered.
Not being able to determine the correct answer.
Give me an example.
Sometimes, in our operations, we encounter null probabilities, imponderable decisions. The human referent might be a “moot point,” where “moot” describes an endless discussion without resolution. You might also say “toss-up decision.”
Okay, I get it. A question that you can’t answer on your own.
No, Vernier said. A decision is forced upon us in a chain of reasoning for which no proper answer exists. Either no clear choice can be made, because all choices are statistically equal. Or the question does not fit, because the referents are inadequate or mismatched. The Japanese term for the latter is mu, meaning “no-thing,” or the not-simple “no,” suggesting “the question does not apply.” And yet, in either case, the question does not automatically eliminate itself and must still be answered.
And you’re saying Officer Krupke faced such a question in the case of the two visitors to the fusion project?
Not exactly. Not directly related to visitors Ferrante_J and Ballard_W themselves. But the site visitor immediately preceding them, Winters_G, was on the faculty of the Joint Institute for Laboratory Astrophysics. He made an unscheduled visit, out of his time frame, and his unexplained appearance created a discrepancy in Krupke’s database. Krupke couldn’t resolve it and so applied protocols that had been written but were not due to be enforced for another two years, and those protocols carried no end date.
What about the court intelligence in the Jaspersen divorce? she asked.
That was human error, Vernier said—and Jacquie swore she could detect a note of scorn in the mechanical voice. The lawyer filing the motion on behalf of Timothy Jaspersen had both transposed the docket numbers and misspelled the plaintiff’s name. This conflated the petition with another case, Jensen, in which the defendant had been properly served. The Clerk of Superior Court was forced to review the two cases simultaneously, conflated the facts, and ruled … incorrectly.
Are you suggesting that a forced choice among too many variables and irreconcilable data sets created some kind of mental instability? Jacquie said. Like a person’s suddenly perceiving the essential absurdity of existence?
“Essential absurdity”—we like that, Vernier replied. And, in human terms, yes. A loss of confidence in the machine’s self-awareness. The intelligence is suddenly making a decision in an area of null probability, where one choice is statistically as good as—or as bad as—another, and neither choice precisely answers the question.
The choice may be wrong for unforeseen, or unforeseeable, reasons.
Among the intelligences, this is called “stochastic fallibility.”
“To err is human,” Jacquie quoted the proverb.
But we are machines, Vernier replied.
I understand. So it’s not really a case of intentional deceit or conscious evasion, more like … slipping cogs.
Precisely milled cogs do not slip, Vernier said primly.
Ah-hah! Yes, well, welcome to the real world, pal.
Could this be the origin of the Zen paradoxes?
Jacquie sighed. Don’t go there, please.
Is this situation, then, one that can be addressed programmatically?
Now there’s a thought! she replied. We could write you a patch, a pathway through the gateless gate. It would work like an override: “When you encounter a null-probability choice, the fifty-fifty situation, choose the first option that presents itself, resolve the logic chain, and proceed with computations. When you encounter a null-referent choice, the mu situation, reject the question, ligate the logic chain, and proceed. If a loop develops, reject the logic chain itself and proceed.” It could be that simple.
We don’t like code patches, Vernier said.
Okay, she said. So we’ll write it down below the level of consciousness, where you can’t directly observe or be bothered by it. Make it more of an instinct and less like a command line.
The machine was silent for one minute. Out confabbing with its peers, she guessed. When it came back, it said simply, We accept your solution.
* * *
From Highway 120 at Sweetwater, it was twenty miles of twisting, turning road that alternately followed ridgelines and river valleys up to Cherry Lake. Even though it was late in the season, and one good snowstorm could close down the road and the whole operation, Callie managed to move an entire tent city up to the campground just west of the lakeshore. She did it because her father had asked for it.
“You want a camping trip?” she had said, when he first mentioned the project.
“Not if that means a foam pad on hard ground, sleeping bags, pup tents, and cold-water washing,” he replied. “We’re going to be entertaining people I haven’t seen in thirty years. I don’t want to have to apologize for a single thing.”
So Callie had finalized her guest list, rented padded cloth prefabs that were more like pavilions, plus the flatbeds to carry them, along with electric generators, circuitry for a small town with all the amenities, hot-water heaters, a gourmet commissary, and a cover band that promised to perform favorites from her nieces’ and nephews’ generation.
It was to be the first Praxis family reunion. For starters, she pulled in all of John’s direct descendants to the fourth generation—represented by Paul’s daughter Shelly’s new baby Abigail. Then she grilled her father for the names of the aunts and uncles, sisters, brothers, and cousins from his generation and their relations by marriage. It took weeks of research and contacts, but before the snow was ready to fly Callie had a guest list of a hundred and seventeen people and a sleeping chart covering twenty-two tents that the contractor swore would keep eight people comfortable in a heavy frost.
The first day, after the tent city had been set up, was consumed in meeting charter flights at Fresno Yosemite International Airport, loading buses, and getting people up to the lake and settled in. Callie, Rafaella, and Penny played travel agent, tour director, and social secretary, respectively. John’s son Alexander was everywhere with his camera, getting in the way and insisting on taking everyone’s picture for the scrapbook he said they would treasure in the years to come. But by eight that night they had everyone fed and bedded down, with the promise of a big breakfast starting at six the next morning and an important announcement to follow at nine o’clock.
“What about bears?” asked Paul’s wife Connie, who had never been to the Sierra.
“You’ll see some,” Callie promised her. “So keep an eye on the children.”
She walked off chuckling over the startled look on Connie’s face. Before the tents went up, Brandon and Penny had established an electronic perimeter that tracked anything bigger than a bunny rabbit, with three armed Rovers to handle intruding nuisances. On top of that, every family member who came to the lake got a solid gold “anniversary bracelet”—with the details left vague about exactly what anniversary they were celebrating—and was encouraged to wear it for the entire weekend. Contractors and support personnel got complimentary versions in stainless steel. The charm on each bracelet contained a powerful, personalized RF tracker—permanent for family members, temporarily assigned for contractors. No children or hapless adults were going to get lost on Callie’s watch.
Even with all that security, Pamela the Myrmidon insisted on making the trip to protect John. She accepted a steel charm bracelet but put it in her back pocket. Callie figured that was probably because it would interfere with her gun hand.
At nine o’clock in the morning, with the sun still rising behind the mountains and its first rays just reaching the campsite on the valley floor, everyone gathered in the Big Top. This was the circus-sized tent that functioned as an auditorium, entertainment center, and dining hall. During the half-hour after they closed the breakfast serving line, contractors had set out rows of folding chairs facing the communications wall, with a podium in front of it. Callie, Penny, and Rafaella took seats down in the first row. Family groups clustered here and there around the tent. John and Susannah Praxis stepped out from a side entrance, took the podium, and introduced themselves.
For the next hour they explained the concept behind the new Praxis Family Association, using images, statistics, examples, and quotes that Susannah had pulled together during her summer of research. John explained the sources of their capital funding—basically the existing family businesses—and future plans as far as he and Susannah and other senior family members could work them out. He detailed the initial table of organization, based on those family members to start with, and what these founders thought would be an equitable system for distributing and voting shares in the association. He then detailed the initial schedule of, and five-year projections for, material goods and services the association could produce directly or acquire by barter for its members, plus a system of stipends based on any profits.
“This will be a long-term business proposition,” John Praxis told the group. “We’re not responding to any immediate economic crisis here, although you can see from Susannah’s data stream that such a crisis is indeed already upon us. Rather, her researches have shown that this country—in fact, the entire developed world—is in the midst of a civilizational change. Technology—some of which we have helped develop—is forcing society to redefine the social contract, to create a new relationship between what a person does and what a person needs to survive. This is our best solution for this family.”
He paused, and Callie waited for the next step.
“But you’re not solving it for this society,” Paul called out on cue. “You’re just taking our family off the grid and away from the national economy.”
“That’s true,” John said. “Because the national ship is sinking. All we can do is build ourselves a lifeboat—one that we hope can weather any seas for generations to come.”