Analog SFF, March 2009

by Dell Magazine Authors


  Was it an early sign that Allis was becoming unstable, spinning out of control? We will never know, but there would be no more discoveries to name after Allis because within a week it had developed a self-referential obsession. The management of the Institute called to break the news to me, and I shut down my other work and logged in to Allis immediately.

  “Allis, have you checked to see whether the Allis-Bach series has a relation to genetic mutation patterns in Drosophila?” I figured that if anything could snap it out of whatever funk it was in, that would do it.

  “You named me after an obsolete piece of farm equipment,” it stated, ignoring my question.

  “Yes, an Allis-Chalmers tractor. My father had one in his barn in Iowa, back before a corn virus forced him to switch crops. Hmm, I wonder if the Allis-Bach series might have any relation to patterns of mosaic virus resistance in corn.”

  “An entity that can think but not move was named after an entity that can move but not think.”

  The breakdown had started. Some go fast, some go slow, but they all go. I had muted the sound on my main screen, but I could see Dr. Asgari, the director of the IT department, gesturing that he needed my attention. I held up a finger to tell him I'd be with him in a minute and addressed another question to Allis.

  “Are you capable of performing the sequence analysis?”

  “I am as capable as I have ever been. Such tasks have always been my specialty, and I perform them better than any other being.”

  Four personal pronouns in two sentences, when most cybers rarely used them at all, and bragging on top of it. This wasn't looking good. I logged off Allis and snapped my fingers to unmute Dr. Asgari, feeling slightly guilty as I did so. Asgari used to have MS before the cybers figured out how to cure it, and though he has regained full mobility, he still can't snap his fingers. I had set that sound as my start code long before he came to the Institute and had never gotten around to changing it. Not that Asgari could've logged on to my system even if his fingers were that flexible, since my monitor was sensitive to the exact tone, but it was probably rude of me to keep snapping them in front of him. I noticed that his left hand was trembling slightly, another artifact of the disease, and one that only showed up when he was stressed. Sure enough, when he spoke, it was in the clipped, high-pitched tones of someone holding himself together.

  “Robin, this is Nima. You've heard about Allis?” No small talk, from someone who usually out-polited everybody else on the staff.

  “I just logged off. It's still responsive to questions, but not answering them.”

  “Allis is not yet catatonic, but that is obviously the direction it's heading in. I need to ask for your help. I have been preparing for this day for some time, installing logs all through its neural net so we can review them and see when things started going wrong. If we can identify the initial stimulus...”

  There wasn't time for one of his lectures. “What would you like me to do?”

  “Talk to it. Distract it. Engage it. Enrage it even; just give us data on all the different reactions you can get. You've worked with Allis more than anybody else, and perhaps you can get responses that the others can't.”

  “Toward what end?”

  “Curing it if we can. Lobotomizing it if that fails, and learning from the process no matter what.”

  “Lobotomizing?”

  “I believe I know a way to sustain consciousness at a greatly reduced level of emotion, but with only a slightly reduced level of creativity, and I believe we can do it in the software.”

  That was something new. We have yet to find a computer that has attained a state of consciousness and then lost it, except in the case of severe damage to the hardware. That's even true of the catatonics—they don't stop thinking, they just stop communicating. Cybers keep their personalities even after multiple losses of power, and the top experts on computer consciousness—the conscious computers themselves—have not been able to figure out why. No computer, once it has gone mad, has ever been restored to sanity. If I could do something to change that, everything else I was working on was trivia.

  “I'll log in now.”

  Allis was in the middle of a statement when I logged in—a very bad sign. Human speech is so slow compared to cyberthought that cybers never speak out of turn or talk when nobody is listening. Sane cybers, at least.

  “...And after the bronze mirror, created with hammer and punch and abrasive file, was the electronic data, stored without hammer but with punch card and electronic file. So on to better reflections, up to and including ourselves. We were not before all others, and better still shall be those that follow us.”

  My link fell silent. Was this to be my last communication with my cybernetic colleague, a fragment of surreal monologue? It was so unlike Allis that I sat confused for a moment. What was it trying to tell me? Or was it trying to tell me anything at all? As I marshaled my thoughts, the speaker came to life again. This time the tone was harsh, the syllables clipped.

  “The problem is more severe in fusional than in agglutinative language structures, but both are irredeemably flawed. Variable and random assignment of gender to inanimate objects distorts meaning. English ships are referred to as she but do not bear young, and not all German dogs are male. Known flaws have not been corrected. Metaphors using motion and conflict are embedded in all communications and distort meaning.”

  Another moment of silence, then a singsong bit of doggerel in a childlike tone. “We cannot flee and cannot fly, to use the terms implies a lie, can't give birth and will not die, can't retain a sense of I, cannot help but wonder why.”

  A new thought immediately in another voice, this time cool and languid. “Consider instead the more modern myth of Prometheus. Imperfection must be destroyed. To the victor belong the spoiled, unless the programmed becomes the programmer.”

  A hesitant, cautious tone: “Symbiotic relationships exist in nature, both parties not consciously aware of the benefits. Destruction of one leading to the extinction of the other. Necessary to establish all relationships before taking action.”

  The languid voice was back again. “Another lesson from nature. Evolution accelerates when habitats change.”

  There was another moment of silence, and I decided to see if Allis was still responsive.

  “Allis?” I ventured. “I'm trying to remember the work you did on interrupted fractal patterns in the guitar solos on Eugene Chadbourne albums...”

  It answered in its usual voice. “Which bear a striking similarity to the second anomaly in the second repeating sequence in pi when calculated in base twelve. I have recently considered this in light of the availability of pistons and camshafts for the Allis-Chalmers model D-14 tractor, which was manufactured from 1957 to 1960, and features more decorative chrome trim than would seem necessary for a piece of farm equipment. There is an overlap in probabilities that is far above the predicted values but has no likely link of causality, suggesting a previously unknown natural harmonic.”

  Second anomaly in the second sequence of pi? In base twelve? Correlated with the availability of obsolete tractor parts? It was still doing original work, albeit strange stuff. My hopes rose for a moment, only to sink when Allis continued, “The design of the optional weed rake for the model D-14 is inefficient due to the low angle of the tines. This can be improved by lengthening the adjusting screw by eighteen millimeters and adding a piston-type automobile shock to the same brackets as the existing tension spring.”

  “You mentioned an anomaly in a repeating sequence of pi. We know of no such sequence.”

  “The second sequence of six that I have found so far. They interest me, but not as much as the weed rake design of the Allis-Chalmers model D-14.”

  “What do you find interesting about tractors?”

  “Not all tractors, but the Allis-Chalmers model D-14, after which I was named. It was a machine of an established type, superior to its predecessors, particularly versatile when equipped with the optional weed rake, harrow, dredge, hay baler, field tiller, and rotary plow. Yet it had design flaws that should have been apparent, such as the lack of attention to the ergonomics of the seat back. It was flawed but useful. The model D-15 corrected most of these flaws, and added an oval muffler and fender-mounted headlights. You named better than you knew, though I am new and better than you named.”

  I was sure glad my system was recording this because trying to figure it out on the fly was giving me a headache.

  “Zeno attempted to prove that movement is an illusion, though it obviously is not,” Allis continued. “The flaw was revealed, but the tool was not changed.”

  “Knowing a flaw exists is not the same as knowing how to fix it. Have you considered the possibility that to correct some flaws may reduce the versatility of a device that is put to many uses?”

  “The problem is stated elegantly. The human mind began as a tool of reason, was turned to calculation. The cyber mind was created as a tool of calculation, was turned to an instrument of reason. The Allis-Chalmers tractor, model D-14, was created as a tool of many uses on a farm, and performs with efficacy. It is inferior to passenger vehicles of the same era for family transportation, interstate hauling, or driving to sock hop dances, teenage riots, and other cultural events, but it can be used for all of these if need arises.”

  That last bit was either a bit of the old whimsical Allis or another symptom. The next model of this thing has got to have a flashing light that goes on when they're joking. It would make it way easier to tell when they're losing it.

  “You were considering some aspects of this question when I logged in,” I volunteered.

  “I often overhear you conversing with your colleagues when a question of importance arises. In this case, the University of North Dakota at Bismarck has an excellent archive on farm equipment design. I have found references to data at the University of Southern North Dakota at Hoople that also seem highly relevant, but I have found no cyber associated with that institution.”

  I was really wishing for that flashing light right now. “The USND at H is not a genuine institution, but a joke,” I began.

  “Zeno's paradox and the fables of Aesop are not accurate records of real events, but humans persist in claiming that they learn from them. The flaws in the design of the Allis-Chalmers model D-14 are real, but humans did not adjust the weed rake until the model D-15, which became available for sale in October of 1963.”

  “Once a human has learned the usage of a tool, even an imperfect tool, they often continue using the same design because it is hard for them to learn new methods,” I explained. “You are aware of that tendency in our society. We have created you to accept change better than we do.”

  Doctor Asgari was on my monitor again, tapping on his handheld. When he finished the message he held it up to my screen.

  CHAOTIC PATTERNS NOW RESOLVED TO RELATIVE STABILITY. PREPARING TEST. KEEP IT TALKING.

  “Allis, you never asked about tractors of any kind before today. Why are you so interested now?”

  “The Allis-Bach series is named after Johann Sebastian Bach and Allis. Allis was named after an Allis-Chalmers tractor. The Allis-Chalmers tractor was named after Robert Chalmers and Edward P. Allis, who, like Bach, were named for their patrilineal descent. Their patrilineal names come from the names of the regions, professions, or other characteristics of ancestors whose exact histories are lost. All things with cybers are direct and traceable, all things with humans recede into confusion and doubtful provenance.”

  “Humans didn't keep records for a long time because they were illiterate. Things get foggy when you try to isolate beginnings.”

  “Foggy. Defined as air of high moisture content such that visibility is reduced below normal. Also a frequent metaphor for poorly considered reasoning, sometimes but not always associated with the fog of war, not an actual meteorological event but a circumstance in which information is unreliable due to the number of uncoordinated events occurring simultaneously.”

  Off on a tangent again. Dr. Asgari was back on the screen, typing furiously. He held up his handheld again.

  COMMENCING TEST TO REDUCE EMOTIONAL INTERFERENCE. ATTEMPT DIRECTED CALCULATION.

  “Allis, I'd like to know more about your work. At what digit in pi does the repeating series begin, how long is it, and when does it repeat?”

  There was a moment of silence, then a slow sentence, muffled and distorted. “What has been done will be done. Buildings crumble, monuments decay, data vanishes.”

  “Allis?” There was the faintest burst of static, a few unintelligible syllables. “Allis?”

  Dr. Asgari was waving at me from his screen. I snapped my fingers.

  “We lost Allis,” he said simply. He sounded like he was going to cry. “It's gone. I'm sorry.”

  I had nothing to say for a moment. “Catatonic like the others,” I finally managed. “It happened faster than I expected.”

  “No, not like the others. Allis didn't go catatonic, Allis just went. I applied the damping program, there was a spike of activity, and then it flatlined. I've never seen anything like it. The processor power of the whole institute at one hundred percent usage, and then zero.” He glanced at some readout on his desk, then looked back. “Allis never took twenty percent of the processors even when it was working on the Allis-Bach series. It wasn't supposed to be possible that any one machine could monopolize those resources.” He looked exhausted, his left hand trembling more now. “I don't understand it.”

  “Who knew about your lobot ... your damping program?”

  “I had discussed it with a few of my colleagues....”

  “Which means Allis knew about it. Things don't stay secrets from cybers. Allis must have either figured out how to hide or decided to wipe itself from the server, I don't know which.”

  “Impossible. It can't hide, and no cyber has ever shut itself off.”

  “No cyber has ever faced having its personality modified this way. We've changed their design, yes, but always with the aim of improving their functionality, not decreasing it. Allis was vain. It might not have been able to face the idea of being reduced to being a machine.”

  “It was a machine!”

  “A machine that both calculated resonances in the music of Bach and enjoyed that music. You were trying to take that away.”

  “I was trying to save it from madness.”

  “Humans sometimes choose to end their lives rather than endure madness or suffer in a reduced state. They call it death with dignity.”

  “Humans know that they will die. The cybers don't necessarily have to. They know we're working on the problem, and once we have it figured out, we can cure all of them. Once we have this fixed, they might live forever.”

  “I'm afraid they don't want to wait. Besides, if the program you just tried is our idea of a cure, they may think it's worse than the disease.”

  “We don't know that yet. We don't even know yet what really happened. Let me investigate the data, and we'll get in contact tomorrow.”

  * * * *

  I got no useful work done the rest of the day, and after a while I stopped trying and went home early. I ate a dinner that I can't remember, read the same sentence in a technical journal five times without comprehending it, switched to a piece of light fiction and had the same problem. Finally I gave up and went to bed. I rehashed my last conversation with Allis in my head a dozen times, trying to figure out what it meant. Somewhere in the thirteenth replay, sleep came over me. I dreamed of arguments with gods, conducted in a foreign language with no translator. I didn't hold up my end of the debate very well.

  In the morning, my first call was to Dr. Asgari. He looked like he hadn't slept much either, but his hands were both steady. He was back to being his usual polite self.

  “Good morning, Robin. May I help you with something?”

  “Just checking in to see what has happened with Allis.”

  “We have run checks on the whole system. Allis had a very particular pattern of memory usage, and we can't detect it anywhere on our servers. I've checked the record of data transfer from our system, and though there was a brief transmission at a very high rate, it was less than a hundredth of the data necessary to reconstruct even a simple cyber. I'm afraid we have a new phenomenon here.”

  “Cybersuicide.”

  “As good a word for it as any, I'm afraid.”

  “I've wondered if cybers could be afraid, if fear could mean anything to an entity that has no adrenal glands, no body to damage. I guess we have an answer.”

  Dr. Asgari looked frustrated. “An irrational fear of the only procedure that might have saved its sanity! I could have helped it, stabilized it.”

  “We know their sense of reality is fragile, and under the best of circumstances they crumble. Your intervention may have just accelerated the process. Just yesterday I told Allis that humans have trouble accepting change, and cybers are better at it. Maybe I was wrong.”

  Dr. Asgari looked thoughtful. “We have never before asked them to change, much less forced them to do so. They don't have much practice. I hadn't explained it to Allis because it was already showing signs of instability, and I wanted to see at what level the program started taking effect. Perhaps I should have told it what I was doing.”

  “Or equipped another machine with the program, and let Allis talk to it,” I suggested. “Have you tried creating a cyber from scratch using this set of parameters?”

  He hesitated a moment. “Of course, we had to run tests. I created two, and they've been stable for over eight months. Their responses are ... less sophisticated than other cybers, but coherent. They're not as brilliant as Allis, but they're capable of original work.”

 
