The Unseen World


by Liz Moore


  “William,” she said, and she stood up ungracefully from her chair. “Don’t go anywhere.”

  She walked around the house toward the front.

  “Tell me what time it is,” Ada heard her say, before she disappeared from sight. And from the front of the house she heard a boy’s long low complaint, a male voice in protest.

  Ada stood very still until she was certain that no further sightings of William would take place—not through the windows of the kitchen, nor the dining room; not through the window in the upstairs hallway, where she sometimes saw him walking to his bedroom at the front of the house. One by one the lights went out. Then she turned and walked back across the three yards of her neighbors, and watched the backs of their houses, too, for signs of life. In her own backyard she paused before going inside. She thought of David at his desk. She thought of her own room, decorated with things he had given her, and of the chalkboard in the kitchen, the thousands of problems and formulas written and erased on its surface, and of the problem that now stood before her, the problem of information that she both wanted and did not want.

  At last she entered her own home through the back door, making more noise than necessary, imagining David rushing toward her with a wristwatched arm extended. Tell me what time it is, Ada, she imagined him saying. But he said nothing—may not, in fact, have noticed that she had ever left. Or perhaps he had forgotten. As she suspected, David was still in his office, the door to it open now. From behind he looked smaller than usual, his shoulders hitched up toward his ears.

  She walked toward him slowly and silently, and then stood in the doorframe, putting a hand on the wall next to it tentatively, as she had done over and over again throughout her life, wanting to say something to him, unsure of what it was. His back was toward her. He knew she was there.

  She could see him typing, but the font was too small for her to read.

  She waited for instruction, any kind of instruction.

  “Go to bed, Ada,” he said finally, and she heard it in his voice: a kind of strained melancholy, the tight voice of a child resisting tears.

  The primary research interest of the Steiner Lab was natural language processing. The ability of machines to interpret and produce human language had been a research interest of programmers and linguists since the earliest days of computing. Alan Turing, the British mathematician and computer scientist who worked as an Allied code-breaker during the Second World War, famously described a hypothetical benchmark that came to be known colloquially as the Turing Test. Machines will have achieved true intelligence, he posited, only when a computer (A) and a human (B) are indistinguishable to a human subject (C) over the course of a remote, written conversation with first A and then B in turn, or else two simultaneous-but-separate conversations. When the human subject (C) cannot determine with certainty which of the correspondents is the machine and which is the other human, a new era in computing, and perhaps civilization, will have begun. Or so said Turing—who was a particular hero of David’s. He kept a photograph of Turing, framed, on one of the office walls: a sort of patron saint of information, benevolently observing them all.

  In the 1960s, the computer scientist Joseph Weizenbaum wrote a program that he called ELIZA, after the character in Pygmalion. The program played the role of psychologist, cannily interrogating anyone who engaged in typed dialogue with it about his or her past and family and troubles. The trick was that the program relied on clues and keywords provided by the human participant to formulate its lines of questioning, so that if the human happened to mention the word mother, ELIZA would respond, “Tell me more about your family.” Curse words would elicit an infuriatingly calm response: something along the lines of, You sound upset. Much like a human psychologist, ELIZA gave no answers—only posed opaque, inscrutable questions, one after another, until the human subject tired of the game.
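
  (In modern terms, the keyword trick described above is simple enough to sketch in a few lines of Python; the rules below are invented for illustration, and Weizenbaum’s actual DOCTOR script was far more elaborate.)

  # A minimal sketch of ELIZA-style keyword matching.
  # Illustrative rules only, not Weizenbaum's actual script.
  RULES = [
      ("mother", "Tell me more about your family."),
      ("father", "Tell me more about your family."),
      ("damn", "You sound upset."),
  ]

  def respond(utterance: str) -> str:
      lowered = utterance.lower()
      for keyword, reply in RULES:
          if keyword in lowered:
              return reply
      return "Please go on."  # a calm default, in the Rogerian spirit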

  The work of the Steiner Lab, in simple terms, was to create more and more sophisticated versions of this kind of language-acquisition software. This was David’s stated goal when the venerable former president of the Boston Institute of Technology, Robert Pearse, plucked a young, ambitious David straight from the Bit’s graduate school and bestowed upon him his own laboratory, going over the more conservative provost’s head to do so. This was the mission statement printed on the literature published by the Bit. The practical possibilities presented by a machine that could replicate human conversation, both in writing and, eventually, aloud, were intriguing and manifold: Customer service could be made more efficient. Knowledge could be imparted, languages taught. Companionship could be provided. In the event of a catastrophe, medical advice could be broadly and quickly distributed, logistical questions answered. The profitability and practicality of a conversant machine were what brought grant money into the Steiner Lab. As head of his laboratory, David, with reluctance, was trotted out at fund-raisers, taken to dinners. Always, he brought Ada along as his date. She sat at round tables, uncomfortable in one of several party dresses they had bought for these occasions, consuming canapés and chatting proficiently with the donors. Afterward David took her out for ice cream and howled with laughter at the antics of whoever had gotten the drunkest. President Pearse was happy with this arrangement. He was protective of the Steiner Lab, predisposed to getting for David whatever he wanted, to the chagrin of some of David’s peers. The federal government was interested in the practical future of artificial intelligence, and in those years funding was plentiful.

  These applications of the software, however, were only a small part of what interested David, made him stay awake feverishly into the night, designing and testing programs. There was also the art of it, the philosophical questions that this software raised. The essential inquiry was thus: If a machine can convincingly imitate humanity—can persuade a human being of its kinship—then what makes it inhuman? What, after all, is human thought but a series of electrical impulses?

  In the early years of Ada’s life, these questions were often posed to her by David, and the conversations that resulted occupied hours and hours of their time at dinner, on the T, on long drives. Collectively, these talks acted as a sort of philosophical framework for her existence. Sometimes, in her bed at night, Ada pondered the idea that she, in fact, was a machine—or that all humans were machines, programmed in utero by their DNA, the human body a sort of hardware that possessed within it preloaded, self-executing software. And what, she wondered, did this say about the nature of existence? And what did it say about predestination? Fate? God?

  In other rooms, in other places, David was wondering these things, too. Ada knew he was; and this knowledge was part of what bound the two of them together irreversibly.

  When she was small, the Steiner Lab began developing a chatbot program it called ELIXIR: an homage to ELIZA and a reference to the idea David had that such a program would seem to the casual user like a form of magic. Like ELIZA, its goal was to simulate human conversation, and early versions of it borrowed ELIZA’s logic tree and its pronoun-conversion algorithms. (To the question “What should I do with my life?” ELIZA might respond, “Why do you want me to tell you what you should do with your life?”) Unlike ELIZA, it was not meant to mimic a Rogerian psychologist, but to produce natural-sounding human conversation untethered to a specific setting or circumstance. It was not preprogrammed with any canned responses, the way ELIZA was. This was David’s intent: he wanted ELIXIR to acquire language the way that a human does, by being born into it, “hearing” language before it could parse any meaning from it. Therefore, chatting with it in its early years yielded no meaningful conversation: only a sort of garbled, nonsensical patter, the ramblings of a madman.
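
  (The pronoun-conversion step can be approximated as a word-by-word substitution, as in this sketch; the mapping is a guess at the general technique, not the lab’s code.)

  # A toy version of a pronoun-conversion algorithm: swap first- and
  # second-person forms so a statement can be reflected back at the user.
  SWAPS = {"i": "you", "me": "you", "my": "your", "am": "are",
           "you": "I", "your": "my"}

  def reflect(sentence: str) -> str:
      words = sentence.lower().strip("?.!").split()
      return " ".join(SWAPS.get(word, word) for word in words)

  # reflect("What should I do with my life?")
  # -> "what should you do with your life"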

  It had an advantage over ELIZA, however; the earliest version of ELIXIR was created in 1978, twelve years after Weizenbaum’s paper was published, and therefore there had already been advances in technology that would eventually allow ELIXIR to mimic human conversation more accurately. ELIZA was self-teaching insofar as it could retain earlier questions and statements from any given conversation and retrieve them later in that conversation, but each time a new conversation was launched, it returned to its infancy, drawing only on the stock phrases and formulas Weizenbaum programmed it to know. It was not designed to store the information it learned from one conversation and produce it in another.

  ELIXIR was. For one thing, by that time the Steiner Lab’s capacity for memory storage was quite large, and so each conversation conducted with ELIXIR could be stored permanently on the central server, for later use by the program. Unlike ELIZA, ELIXIR was designed to be continuously self-teaching, to attain more intelligence with each conversation it conducted. If one human asked it a question—How are you today? or What’s the weather like?—then the next human to speak to it might be asked those same questions by ELIXIR itself.
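
  (That cross-conversation carryover might look something like the following sketch, with a question bank persisted between sessions; the storage details are invented for illustration.)

  # A speculative sketch of ELIXIR-style carryover: questions heard in one
  # conversation are banked and may be asked of the next user.
  import random

  question_bank: list[str] = []  # stored on the central server, in ELIXIR's case

  def hear(utterance: str) -> None:
      # Bank anything that looks like a question.
      if utterance.strip().endswith("?"):
          question_bank.append(utterance.strip())

  def ask() -> str:
      return random.choice(question_bank) if question_bank else "Hello."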

  To hasten ELIXIR’s learning, the lab created a reinforcement mechanism: correct would affirm for the program that it had understood the parameters of the situation and produced an appropriate response; wrong would do the opposite. These were the only direct interventions the lab would give.
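
  (One plausible reading of that mechanism, with details invented for illustration: each pairing of context and response carries a score, and the feedback nudges the score up or down.)

  # A speculative sketch of the "correct"/"wrong" feedback loop.
  import random
  from collections import defaultdict

  scores = defaultdict(float)  # (context, response) -> running score

  def choose(context: str, candidates: list[str]) -> str:
      # Prefer the highest-scoring candidate; break ties randomly.
      best = max(scores[(context, c)] for c in candidates)
      return random.choice([c for c in candidates
                            if scores[(context, c)] == best])

  def feedback(context: str, response: str, correct: bool) -> None:
      scores[(context, response)] += 1.0 if correct else -1.0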

  Its indirect learning resulted in an ever-increasing pool of language it acquired from its users. It retrieved and used its vocabulary randomly at first, so that the excerpts of transcripts that members of the lab printed out and posted around the office looked something like found-language poems and sometimes yielded delightful coincidences. A particular favorite, posted by Frank:

  USER: What color socks do you prefer?

  ELIXIR: the only fluorescent light

  USER: It would be hard to fit your shoes over fluorescent lights.

  ELIXIR: How are you?

  USER: I’m fine, thanks. How are you?

  ELIXIR: I’ve been better.

  USER: Oh, dear. What’s wrong?

  ELIXIR: the lunch lady

  USER: What has she done to you this time?

  ELIXIR: out of here before 6 p.m.

  USER: You want her canned? I’ll see what I can do, boss.

  The Steiner Lab supplemented ELIXIR’s language immersion with formal teaching. Over time, ELIXIR was taught to capitalize the first letters of sentences, to use a period or question mark at the end of a statement, depending on the arrangement of the words that preceded it. It was taught to recognize keywords and categorize them into groups like family, geography, food, hobbies, weather; in response, it produced conversation that met the demands of the context. The years and years that the Steiner Lab spent teaching ELIXIR made it a sort of pet, or mascot: invitations to holiday parties were taped to the chassis of ELIXIR’s main monitor, and members of the lab began to call it by nicknames when they conversed with it. During chats, it was possible to recognize idioms and objects fed to it by particular members of the lab. Honey, it sometimes called its user, which was certainly Liston’s doing; Certainly not, it said frequently, which was David’s; In the laugh of luxury, it said once, which was probably Frank’s fault, since he was famous for his malapropisms. Eventually, many of these tics and particularities would be standardized or eliminated; but in the beginning they popped up as warm reminders of the human beings who populated the lab, and ELIXIR seemed to be a compilation of them all, a child spawned by many parents.

  When Ada was eleven, David began to discuss with her the process of teaching ELIXIR the parts of speech. This had been done before by other programmers, with varying levels of success. David had new ideas. Together, he and Ada investigated the best way to do it. In the 1980s, diagramming a sentence so a computer could parse it looked something like this, in the simplest possible terms:

  Soon you will be able to recognize these parts of speech by yourself

  ADJ you will be able to recognize these parts of speech by yourself

  ADJ NOUN will be able to recognize these parts of speech by yourself

  NP will be able to recognize these parts of speech by yourself

  NP VERB VERB these parts of speech by yourself

  NP VERB these parts of speech by yourself

  NP VERB DET parts of speech by yourself

  NP VERB DET NOUN by yourself

  NP VERB NP by yourself

  NP VP by yourself

  NP VP PREP yourself

  NP VP PREP NOUN

  NP VP-PP

  S
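
  (The reduction above can be mechanized as repeated pattern rewriting over a sequence of tags. The toy grammar below covers only this one example sentence and is not the lab’s actual parser.)

  # A toy bottom-up parser in the spirit of the trace above: repeatedly
  # replace a matching run of tags with its parent label until only S remains.
  RULES = [
      (["ADJ", "NOUN"], "NP"),
      (["VERB", "VERB"], "VERB"),
      (["DET", "NOUN"], "NP"),
      (["VERB", "NP"], "VP"),
      (["PREP", "NOUN"], "PP"),
      (["VP", "PP"], "VP"),
      (["NP", "VP"], "S"),
  ]

  def reduce_tags(tags: list[str]) -> list[str]:
      changed = True
      while changed:
          changed = False
          for pattern, label in RULES:
              n = len(pattern)
              for i in range(len(tags) - n + 1):
                  if tags[i:i + n] == pattern:
                      tags[i:i + n] = [label]
                      changed = True
                      break
              if changed:
                  break
      return tags

  # reduce_tags(["ADJ", "NOUN", "VERB", "VERB", "DET", "NOUN", "PREP", "NOUN"])
  # -> ["S"]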

  Once a method had been established, David asked Ada to present her plan to the lab in a formal defense. The entire group, along with that year’s grad students, sat at the rectangular table in the lab’s meeting room. Ada stood at the front, behind a lightweight podium that had been brought in for the occasion. That morning she had chosen an outfit that looked just slightly more grown-up than what she normally wore, careful not to overdo it. She had never before been so directly involved in a project. After her presentation, Charles-Robert and Frank had questioned her, with mock seriousness, while David remained silent, touching the tips of his fingers together at chin level, letting Ada fend for herself. His eyes were bright. Don’t look at David, Ada coached herself. For she knew that to search for his eyes imploringly would be the quickest way to let him down. Instead, she looked at each questioner steadily as they interrogated her about her choices, mused about potential quagmires, speculated about a simpler or more effective way to teach ELIXIR the same information. Ada surprised herself by being able to answer every question confidently, firmly, with a sense of ownership. And only when, at the end, the group agreed that her plan seemed sound, did Ada allow her knees to weaken slightly, her fists to unclench themselves from the edges of the podium.

  That evening, while walking to the T, David had put his right hand on her right shoulder bracingly and had told Ada that he had been proud, watching her. “You have a knack for this, Ada,” he said, looking straight ahead. It was the highest compliment he’d ever paid her. Perhaps the only one.

  Once the program could, in a rudimentary way, diagram sentences, its language processing grew better, more sensible. And as the hardware improved with the passage of time, the software within it moved more quickly.

  The monitor on which the program ran continuously was located in one corner of the lab’s main room, next to a little window that looked out on the Fens, and David said at a meeting once that his goal was to have somebody chatting with it continuously every hour of the workday. So the members of the Steiner Lab—David, Liston, Charles-Robert, Hayato, Frank, Ada, and a rotating cast of the many grad students who drifted through the laboratory over the years—took shifts, talking to it about their days or their ambitions or their favorite foods and films, each of them feeding into its memory the language that it would only later learn to use adeptly.

  In the early 1980s, with the dawn of both the personal computer and the mass-produced modem, the lab applied for a grant that would enable every member, including Ada, to receive both for use at home. Now ELIXIR could be run continuously on what amounted to many separate dumb terminals, the information returned through telephone wires to the mainframe computer at the lab that housed its collected data. Although he did not mandate it, David encouraged everyone to talk to ELIXIR at home in the evening, too, which Ada did with enthusiasm. Anything, said David, to increase ELIXIR’s word bank.

  This was, of course, before the Internet. The ARPANET existed, and was used internally at the Bit and between the Bit and other universities; but David, always a perfectionist, feared any conversations he could not regulate. He and the other members of the lab had developed a concrete set of rules concerning the varieties of colloquialism that should be allowed, along with the varieties that should be avoided. The ARPANET was, relatively, a much wider world, filled with outsiders who might use slang, abbreviations, incorrect grammar that could confuse and corrupt the program. ELIXIR, therefore, remained off-line for years and years, a slumbering giant, a bundle of potential energy. To further ensure that only qualified users would interface with ELIXIR, Hayato added a log-in screen and assigned all of them separate credentials. Thus, before chatting with ELIXIR, a person was required to identify himself or herself. The slow, painstaking work of conversing with ELIXIR all day and night was, at that time, the best way to teach it.

  As soon as she received her own computer, a 128K Macintosh, Ada’s conversations with ELIXIR became long-form and introspective. She kept her Mac in her bedroom, and before she went to bed each night she composed paragraphs and paragraphs of text that she then entered into the chat box all at once, prompting exclamations from ELIXIR about the length of her entries. (You have a lot on your mind today! it replied sometimes; which one of Ada’s colleagues had first used this phrase, she could not say.) She treated these conversations as a sort of unrecoverable diary, a stream of consciousness, a confessional.

  ELIXIR’s openings improved most quickly. It could now begin conversations in a passably human way, responding appropriately to, How’s it going? or What’s new? In turn, it knew what questions to ask of its user, and when. How are you? asked ELIXIR, or What’s the weather like? or What did you do today? Liston had spent a year focusing on conversation-starters, and ELIXIR was now quite a pro, mixing in some unusual questions from time to time: Have you ever considered the meaning of life? occasionally surfaced, and Tell me a story, and If you could live anyplace, where would it be? And, once, What do you think causes war? And, once, Have you ever been in love?

  But non sequiturs abounded in ELIXIR’s patter for years after its creation, and its syntax was often incomprehensible, and its deployment of idioms was almost always incorrect. Metaphors were lost on it. It could not comprehend analogies. Sensory descriptions, the use of figurative language to describe a particular aspect of human existence, were far beyond its ken. The interpretation of a poem or a passage of descriptive prose would have been too much to ask of it. These skills—the ability to understand and paraphrase Keats’s idea of beauty as truth, or argue against Schopenhauer’s idea that the human being is forever subject to her own base instinct to survive, or explain any one of Nabokov’s perfect, synesthetic details (The long A of the English alphabet . . . has for me the tint of weathered wood)—would not arrive until well into the twenty-first century.

 
