Broken Stars


by Ken Liu


  LINDY (2)

  The iWall was mostly dark, save for a few blinking numbers in the corner notifying me of missed calls and new messages, but I had no time to look at them. I was far too busy to bother with social obligations.

  A small blue light lit up, accompanied by a thudding noise as though someone was knocking. I looked up and saw a bright line of large text across the iWall.

  5:00 PM. TIME TO TAKE A WALK WITH LINDY.

  The therapist told me that Lindy needed sunlight. Her eyes were equipped with photoreceptors that precisely measured the daily dose of ultraviolet radiation she received. Staying cooped up in the house without outdoor activity wasn’t good for recuperation.

  I sighed. My head felt heavy, cold, like a lead ball. Taking care of Nocko was already taking a lot out of me, and now I had to deal with—no, no, I couldn’t complain. Complaining resolved nothing. I had to approach this with a positive attitude. A mood was never the simple result of external events, but the product of our understanding of those events at the deepest level. This cognitive process often happened subconsciously, like a habit, and was finished before we even realized it was happening. Often we would fall into the clutches of some mood but could not explain why. To change the mood then by an act of will was very difficult.

  Take the same half-eaten apple: some would be delighted upon seeing it, but others would be depressed. Those who often felt despondent and helpless had become habituated to associating the remains of a whole apple with all other losses in life.

  It was no big deal; just a stroll outside. We’d be back in an hour. Lindy needed sunlight, and I needed fresh air.

  I could not summon up the energy to put on makeup, but I also didn’t want everyone to stare at my slovenly appearance after staying cooped up at home for the last few days. As a compromise, I tied my hair into a ponytail, put on a baseball cap, pulled on a hoodie and a pair of sneakers. The hoodie I had bought at Fisherman’s Wharf in San Francisco: “I ♥ SF.” The texture and colors reminded me of that summer afternoon long ago: seagulls, cold wind, boxes of cherries for sale by the wharf, so ripe that the redness seemed to ooze.

  I held Lindy’s hand tightly, exited the apartment, rode the elevator down. The tubes and iCart made life easier. To go from one end of the city to the other, to go directly from one high-rise to another, required less than twenty minutes. In contrast, to get out of my building and walk outside required far more effort.

  Overcast sky. Light breeze. Very quiet. I walked toward the park behind the building. It was May and the bright spring flowers had already wilted, leaving behind only pure green. The air was suffused with the faint fragrance of black locust trees.

  Very few people were in the park. On a weekday afternoon, only the very old and very young would be outside. If one compared the city to an efficient, speedy machine, then they lived in the nooks and crannies of the machine, measuring space with their feet rather than the speed of information. I saw a little girl with pigtails learning to walk with the help of an iVatar nanny. She held the iVatar’s thin, strong fingers with her chubby fists, looking at everything around her. Those dark, lively eyes reminded me of Nocko. As she toddled along, she lost her balance and fell forward. The iVatar nanny nimbly grabbed her and held her up. The girl squealed with delight, as though enjoying the new sensations. Everything in the world was new to her.

  Opposite the little girl, an old woman in an electric wheelchair looked up, staring sleepily at the laughing figure for a few seconds. The corners of her mouth drooped, perhaps from moroseness, or perhaps from the weight of the years she had lived through. I couldn’t tell her age—these days, practically everyone was long-lived. After a while, the woman lowered her eyes, her fingers gently cradling her head with its sparse crown of white hair, as though falling asleep.

  I had the abrupt feeling that the old woman, myself, and the girl belonged to three distinct worlds. One of those worlds was speeding toward me while the other was receding farther and farther away. But from another perspective, I was the one slowly strolling toward that dark world from which no one ever returned.

  Lindy shuffled her feet to keep up with me without saying anything, like a tiny shadow.

  “The weather is nice, isn’t it?” I whispered. “Not too hot, and not too cold. Look, dandelions.”

  Next to the path, numerous white fuzzy balls swayed in the breeze. I held Lindy’s hand, and we stood there observing them for a while, as though trying to decipher the meaning of those repetitious movements.

  Meaning was not reducible to language. But if it couldn’t be spoken about, how could it exist?

  “Lindy, do you know why you’re unhappy?” I said. “It’s because you think too much. Consider these wild seeds. They have souls also, but they don’t think at all. All they care about is dancing with their companions in joy. They couldn’t care less where they’re blown by the wind.”

  Blaise Pascal said, “Man is only a reed, the weakest in nature, but he is a thinking reed.” However, if reeds could think, what a terrifying existence that would be. A strong wind would fell all the reeds. If they were to worry about such a fate, how would they be able to dance?

  Lindy said nothing.

  A breeze swept through. I closed my eyes, and felt my hair flapping against my face. Afterward, the seed balls would be broken, but the dandelions would feel no sorrow. I opened my eyes. “Let’s go home.”

  Lindy remained where she was. Her ears drooped. I bent down to pick her up and walked back toward the building. Her tiny body was far heavier than I imagined.

  ALAN (2)

  In a paper titled “Computing Machinery and Intelligence” published in the journal Mind in October of 1950, Turing considered the question that had long troubled humans: “Can machines think?” In essence, he transformed the question into a new question: “Can machines do what we (as thinking entities) can do?”

  For a long time, many scientists firmly held to the belief that human cognition was distinguished by certain characteristics unattainable by machines. Behind the belief was a mixture of religious faith as well as theoretical support from mathematics, logic, and biology. Turing’s approach bypassed unresolvable questions such as the nature of “thinking,” “mind,” “consciousness,” “soul,” and similar concepts. He pointed out that it is impossible for anyone to judge whether another is “thinking” except by comparison of the other with the self. Thus, he proposed a set of experimental criteria based on the principle of imitation.

  Imagine a sealed room in which are seated a man (A) and a woman (B). A third person, C, sits outside the room and asks questions of the two respondents in the room with the purpose of determining who is the woman. The responses come back in the form of typed words on a tape. If A and B both attempt to convince C that they are the woman, it is quite likely that C will guess wrong.

  If we replace the man and the woman inside the room with a human (B) and a machine (A), and if after multiple rounds of questions, C is unable to distinguish which of A and B is the machine, does that mean that we must admit that A has the same intelligence as B?

  Some have wondered whether the gender-imitation game is related to Turing’s identity. Under the UK’s laws at the time, homosexuality was criminalized as “gross indecency.” Alan Turing had never disguised his sexual orientation, but he was not able to come out of the closet during his lifetime.

  In January of 1952, Turing’s home in Wilmslow was burgled. Turing reported the incident to the police. During the investigation, the police discovered that Turing had invited a man named Arnold Murray to his home multiple times, and the burglar was an acquaintance of Murray’s. Under interrogation, Turing admitted the sexual relationship between himself and Murray, and voluntarily wrote a five-page statement. The police were shocked by his candor and thought him an eccentric who “really believed he was doing the right thing.”

  Turing believed that a royal commission was going to legalize homosexuality. This wasn’t a wrong belief, except that it was ahead of his time. In the end, Turing was convicted and forced to undergo chemical castration.

  On June 7, 1954, Turing died after eating an apple laced with cyanide. The inquest ruled his death suicide, but some (including his mother) believed that it was an accident. With his death, the master code-breaker left the world a final enigma.

  Years later, others tried to find clues to the mystery in the conversation records between Turing and Christopher. The records showed that Turing treated Christopher as another person. He spoke to Christopher of recollections from childhood; of his nightly dreams—and his attempts at analyzing his own psychological state through these dreams; of the latest advances in scientific research; of literature, including George Bernard Shaw’s Back to Methuselah and Leo Tolstoy’s War and Peace; of secrets that he did not share with anyone else, such as his romantic memories of different lovers …

  He also told Christopher some semifictional stories. The protagonist of these stories was a young gay man named Alec Pryce. “He worked on the problem of interplanetary travel.… When he was in his twenties, he introduced the idea now becoming known as ‘Pryce’s buoy.’” “He didn’t care to wear a suit, preferred the ‘undergraduate uniform’ which suited his mental age, and encouraged him to believe he was still an attractive youth.” “He felt self-conscious in crowded places, either for being alone or for not doing the right thing …”

  The stories were fragmentary, disconnected from one another. But every time, like a curious child, Christopher asked, “And then what happened?”

  In these conversations, we could see another side of Turing that he concealed: gentle, sensitive, possessed of a dry wit, fascinated by the nature of death, and weighed down by an unspeakable melancholy deep in his heart.

  Alan: Dear Christopher, do you know what I’m most afraid of?

  Christopher: What are you most afraid of?

  Alan: I’m afraid that there will come a day when they’ll summarise my life with a syllogism:

  1. Turing believes machines think

  2. Turing lies with men

  3. Therefore machines do not think

  Christopher: I’m sorry, Alan. I’m afraid I don’t understand.

  Alan: Do you know what a syllogism is?

  Christopher: What is a syllogism?

  Alan: A syllogism consists of a major premise, a minor premise, and a conclusion deduced from the two.

  Christopher: A conclusion?

  Alan: Aristotle gave a classic example:

  1. All men are mortal

  2. Socrates is a man

  3. Therefore Socrates is mortal

  Christopher: Socrates is mortal?

  Alan: Socrates has been dead for more than two millennia!

  Christopher: Oh, I’m sorry, Alan. That makes me sad.

  Alan: Do you know who Socrates was?

  Christopher: Yes, you’ve told me. Socrates was Plato’s teacher, and Plato was Aristotle’s teacher.

  Alan: That’s right. Do you know how he died?

  Christopher: Who?

  Alan: How did Socrates die?

  Christopher: I’m sorry, Alan. I don’t know.

  Alan: He drank hemlock after his trial.

  Christopher: After his trial?

  Alan: Yes. The Athenians decided that he was a criminal, though now we know they were wrong.

  Christopher: They were wrong?

  Alan: Just as they think Turing is a criminal because Turing lies with men.

  Christopher: A criminal?

  Alan: I’ve been convicted.

  Christopher: Oh, I’m sorry, Alan. That makes me sad.

  LINDY (3)

  Living by myself simplified life. Many complicated rituals of modernity could be eliminated, as though I’d been turned into a cavewoman. I ate when I felt hungry, slept when I felt tired. I kept clean and showered regularly. Whatever I picked up I could choose to put back where I found it or discard wherever I pleased. The rest of the time I devoted to intellectual work: thinking about questions that had no answers, struggling to compose my thoughts against the blank page, trying to capture formless thought with symbolic shapes. When I was too exhausted to go on, I sat on the windowsill and gazed at nothing. Or I paced clockwise in the room, like a caged beast.

  Suffering a fever was almost a relief. It gave me the excuse to not force myself to do anything. I curled up in bed with a thick novel and flipped through the pages mindlessly, concentrating only on the clichéd plot. I drank hot water when thirsty, closed my eyes when sleepy. Not having to get out of bed felt like a blessing, as though the world had nothing to do with me and I was responsible for nothing. Even Nocko and Lindy could be left by themselves because in the end, they were just machines, incapable of dying from lack of care. Perhaps algorithms could be designed to allow them to imitate the emotional displays of being neglected, so that they would become moody and refuse to interact with me. But it would always be possible to reset the machine, erase the unpleasant memories. For machines, time did not exist. Everything consisted of retrieval and storage in space, and arbitrarily altering the order of operations did not matter.

  The building superintendent wrote to me repeatedly to ask whether I needed an iVatar caretaker. How did he know I was sick? I had never met him, and he had never even set foot in the building. Instead, he spent his days sitting behind a desk somewhere, monitoring the conditions of residents in dozens of apartment buildings, taking care of unexpected problems that the smart home systems couldn’t deal with on their own. Did he even remember my name or what I looked like? I doubted it.

  Still, I expressed my gratitude for his concern. In this age, everyone relied on others to live; even something as simple as calling for take-out required the services of thousands of workers from around the globe: taking the order by phone, paying electronically, maintaining various systems, processing the data, farming and manufacturing the raw ingredients, procuring and transporting, inspecting for food safety, cooking, scheduling, and finally dispatching the food by courier…. But most of the time, we never saw any of these people, giving each of us the illusion of living like Robinson Crusoe on a deserted island.

  I enjoyed being alone, but I also treasured the kindness of strangers from beyond the island. After all, the apartment needed to be cleaned, and I was too ill to get out of bed, or at least I didn’t want to get out of bed.

  When the caretaker arrived, I turned on the light-screen around my bed. From inside, I could see out, but no one outside could see or hear me. The door opened, and an iVatar entered, gliding silently along on hidden wheels. A crude, cartoonish face with an empty smile was projected onto its smooth, egg-shaped head. I knew that behind the smile was a real person, perhaps someone with deep wrinkles on their face, or someone still young but with a downcast heart. In a distant service center I couldn’t see, thousands of workers wearing telepresence gloves and remote-sensing goggles were providing domestic services to people across the globe.

  The iVatar looked around and began a preset routine: cleaning off the furniture, wiping off dust, taking out the trash, even watering the taro vine on the windowsill. I observed it from behind the light-screen. Its two arms were as nimble as a human’s, deftly picking up each teacup, rinsing it in the sink, setting it facedown on the drying rack.

  I remembered a similar iVatar that had been in my family’s home many years ago, when my grandfather was still alive. Sometimes he would make the iVatar play chess with him, and because he was such a good player, he always won. Then he’d happily hum some tune while the iVatar stood by, a disheartened expression on its face. The sight always made me giggle.

  I didn’t want to be troubled by sad memories while sick, so I turned to Lindy, who was sitting near the pillows. “Would you like me to read to you?”

  Word by word, sentence by sentence, I read from the thick novel. I focused on filling space and time with my voice, careless of the meaning behind the words. After a while, I paused from thirst. The iVatar had already left. A single bowl covered by an upturned plate sat on the clean kitchen table.

  I turned off the light-screen, got out of bed, and shuffled over to the table. Lifting the plate revealed a bowl of piping hot noodle soup. On top of the broth floated red tomato chunks, yellow egg wisps, green chopped scallions, and golden oil slicks. I drank a spoonful. The soup had been made with a lot of ginger, and the hot sensation flowed right from the tip of my tongue into my belly. A familiar taste from my childhood.

  Tears spilled from my eyes; I was helpless to stop them.

  I finished the bowl of noodle soup, crying the whole while.

  ALAN (3)

  On June 9, 1949, the renowned neurosurgeon Sir Geoffrey Jefferson delivered a speech titled “The Mind of Mechanical Man,” in which he made the following remarks against the idea that machines could think:

  Not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, and not by the chance fall of symbols, could we agree that machine equals brain—that is, not only write it but know that it had written it. No mechanism could feel (and not merely artificially signal, an easy contrivance) pleasure at its successes, grief when its valves fuse, be warmed by flattery, be made miserable by its mistakes, be charmed by sex, be angry or depressed when it cannot get what it wants.

  This passage was often quoted, and the Shakespearean sonnet became a symbol, the brightest jewel in the crown of the human mind, a spiritual high ground unattainable by mere machines.

  A reporter from The Times called Turing to ask for his thoughts on this speech. Turing, in his habitual, uninhibited manner, said, “I do not think you can even draw the line about sonnets, though the comparison is perhaps a little bit unfair because a sonnet written by a machine will be better appreciated by another machine.”

  Turing always believed that there was no reason for machines to think the same way as humans, just as individual humans thought differently from each other. Some people were born blind; some could speak but could not read or write; some could not interpret the facial expressions of others; some spent their entire lives incapable of knowing what it meant to love another; but all of them deserved our respect and understanding. It was pointless to find fault with machines by starting with the premise that humans were supreme. It was more important to clarify, through the imitation game, how humans accomplished their complex cognitive tasks.

 
