Philip K. Dick and Philosophy

by D. E. Wittkower


  The objection to Turing’s behavioral approach is that a machine passing the Turing Test may be able to simulate human conversational behavior while merely following some cleverly devised rules. That is the view of Bryant, Deckard’s supervisor in Blade Runner, who refers to the androids as “skin jobs.” The Turing Test, according to this objection, fails to factor in human consciousness. The indifference of both Philip Dick’s novel and the movie Blade Runner to whether other human beings and machines are conscious is consistent with the spirit of philosophical positivism, which rejects as meaningless anything that cannot be verified observationally.
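
  The objection can be made concrete with a toy program. The sketch below (in Python; the patterns and canned replies are invented for illustration, not drawn from the novel or the film) produces passable conversational responses by nothing more than pattern-matching, exactly the sort of “cleverly devised rules” the objection has in mind:

```python
import re

# A toy, ELIZA-style responder: conversational behavior from rules alone.
# The patterns and replies are invented for illustration.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\byou\b", re.IGNORECASE), "We were discussing you, not me."),
]
DEFAULT = "Please, go on."

def reply(utterance: str) -> str:
    """Return a 'conversational' reply by applying the first matching rule."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return DEFAULT

print(reply("I feel empty"))     # Why do you feel empty?
print(reply("I am an android"))  # How long have you been an android?
```

  Nothing in the program is a plausible candidate for consciousness; it only rewrites strings. The objection holds that a machine passing the Turing Test might, at bottom, be doing no more than this.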

  . . . You Ever Take That Test Yourself?

  In Blade Runner, Deckard sits at the piano with Rachael. Next to the sheet music are pictures of Deckard when he was a child. They are the same sort of slightly worn black-and-white photographs that Rachael offered Deckard as proof of her humanness. The comparison goes unnoticed by Deckard and Rachael. Moments earlier Rachael had asked Deckard, “You know that Voight-Kampff test of yours, you ever take that test yourself?” Suddenly the viewer is placed in the same quandary as the characters in the movie and is led to wonder whether Deckard is an android and whether his memories are also artificial implants.

  In the novel, Luba Luft, the opera singer, also asks whether Deckard has tested himself. Deckard answers her by explaining that he took the test a long time ago. “Maybe it is a false memory,” she says. But, he says, his superiors knew about the test. She suggests that there might once have been a human Deckard, whom he, android-Deckard, killed without his superiors knowing of the switch. Rachael, at first, had no idea that she was an android, and the same may be true of Deckard.

  The similarities between humans and androids are striking throughout Dick’s novel. When the reader first meets Rick Deckard, he is debating with his wife, Iran, about what setting to dial on their Penfield Artificial Mood Simulator. She accuses Deckard of killing those “poor andys.” Her expression of sympathy for androids sets Deckard to wondering whether he should dial the mood simulator to suppress his anger at her comment. She says that she has dialed six hours of despair because it is unhealthy not to react to the devastation from “World War Terminus,” which has left Earth toxic and has caused most of the human population to emigrate to extraterrestrial colonies. She says she wants to avoid what was called an “absence of appropriate affect,” once treated as a mental illness.

  It is, in fact, the very absence of appropriate affect that is believed to distinguish androids from humans. Iran explains that she has programmed the mood simulator to follow the despair automatically with a hopeful mood, to avoid the dangers of remaining in despair. Deckard recommends instead that she dial a 104, which is the desire to watch TV. She says that she does not want to dial anything. He recommends that she dial mood 3, which is the desire to dial a mood. Iran points out that there is a logical contradiction in dialing the desire to dial when you don’t feel like dialing. She turns the TV off and reluctantly agrees to dial mood 594: “ecstatic sexual bliss.” Deckard dials for himself a “creative and fresh attitude towards his work.”

  Paradoxically, feelings, particularly empathy, are thought to distinguish humans from androids. The Voight-Kampff test is based on such a theory, yet here moods are artificially induced in humans by a machine. Dick has made the artificial simulation of feelings and emotions even more impersonal by assigning each mood a number. If machines can artificially induce moods, then using the feeling of empathy to distinguish between humans and machines seems pointless.

  The use of the mood simulator is one of many images in Dick’s novel that blur the distinction between human and android. The androids have created a phony police agency headed by a police chief named Garland. Shortly after Deckard meets him, Garland is exposed as an android. Garland had tried to deceive Deckard into thinking that another bounty hunter, Phil Resch, who works for the phony police agency, was also an android. Resch had thought that Garland was human and was fearful that he himself was an android with false memory implants. Prior to being tested, he tries to prove his humanness by pointing out that he owns a live squirrel that thrives under his care, something that androids were thought incapable of managing.

  The Voight-Kampff test proved that Resch was not an android, but only after Resch had been deeply tormented by the thought that he was not human. At one point Resch compares “how an andy must feel” to the painting by Munch entitled “The Scream,” in which a figure “screamed in isolation. Cut off by—or despite—its outcry.” Resch’s comparison of an android with the figure in Munch’s painting is filled with feeling and emotion, hardly the language that would be used to describe a “thing” or an “it.”

  I Love You, My Artificial Construct

  Deckard feels regret at having retired the android Luba Luft. He reflects on the impracticality of the loss, saying, “She was a wonderful singer. The planet could have used her. This is insane.” The distinction between androids and humans becomes unclear for Deckard when he realizes he is feeling “empathy toward an artificial construct.” He says, “But Luba had seemed genuinely alive; it had not worn the aspect of simulation.” Deckard registers an empathetic 4.0 out of 6.0 on the Voight-Kampff test at the thought of killing a female android. He thinks that it may be an anomaly and wonders “if any human has ever felt this way before about an android.” He considers that it may have been because she was an opera singer and because of his love of the opera The Magic Flute. Phil Resch suggests it was sex, because she was physically attractive. We are told in the novel that, “for the first time in his life, he had begun to wonder.” The cause of his wonder was whether the distinction between android and human signified any real difference. He is brought to the point of saying, “So much for the distinction between authentic living human and humanoid construct.”

  Perhaps Phil Resch gives the best account of the difference between androids and humans when he says, “If we included androids in our range of empathetic identification, as we do animals, we could not protect ourselves.” Deckard agrees and says, “The Nexus-6 types,” like Rachael, “they’d roll all over us and mash us flat. You and I, all the bounty hunters—we stand between the Nexus-6 and mankind, a barrier which keeps the two distinct.” It is not because the two are distinct that there are bounty hunters; it is because there are bounty hunters that the two are distinct.

  Postmodernism, and in particular the philosophy of Michel Foucault, has claimed that the distinctions between what is true and what is not, what is real and what is not, reduce to the struggle for power and the domination of one social class over another. In this case, it is humans over androids, even as the two become indistinguishable because of technological development. Technological changes invariably bring ideological changes. Technology alters the balance of power, renders obsolete traditional modes of production and consumption, and moves wealth across different segments of society. The institution of the bounty hunter who retires androids is a form of reactionary humanism that is being rendered obsolete by advances in cybernetics.

  The ideological superstructure of the post-apocalyptic society in which Deckard lives values the living over the artificial. The most striking example is its attitude toward animals. Deckard keeps an electric sheep on the roof of his apartment complex. He is envious of his neighbor who has a real live horse. Deckard feels like a “fraud.” He describes having had a real sheep that died of tetanus and was replaced with an electric one. His neighbor says to him, “It’s not the same.” Deckard replies, “But almost.” He explains, “You feel the same doing it, you have to keep your eye on it exactly as you did when it was really alive.”

  The real sheep gets sick and dies; the electric one has a mechanical breakdown. The amount of care is the same. The difference between the real and the artificial is mainly a matter of what other people think. It is important in Deckard’s society that the truck coming to repair artificial animals is marked “animal hospital,” so that no one suspects the animal is artificial. The same deception lies behind the phrase “pet hospital” in the name of the “Van Ness pet hospital . . . in the competitive field of false-animal repair.” Social pressure, rather than an irreducible difference between the natural and the artificial, causes Deckard to feel a need for a “real” animal.

  In the closing chapter of the book, Deckard finds a toad that he at first thinks is alive. He says that it made him feel like a kid again. But when he shows the toad to Iran, she finds a tiny control panel on its abdomen that he had not noticed. He is visibly disappointed, but says to himself, “But it does not matter. The electric things have their lives, too. Paltry as those lives are.” If electric things as inanimate and unresponsive as the toad Deckard found in the desert have their lives, then so also do androids, whose behaviors, like Rachael’s, are indistinguishable from humans’.

  Has Deckard had some great metaphysical realization about the nature of mind and matter? Contemporary philosophy, and postmodernism in particular, would suggest not. What has changed is the way in which he talks about the living and its simulation, as a result of the convergence of social, economic, and technological forces. Only linguistic norms have changed, such as those governing the use of the personal and impersonal pronouns ‘he’, ‘she’, and ‘it’.

  . . . Because My Furnace Believed the House Was Cold

  For the philosopher Daniel Dennett, whether behavior is conscious or not is a matter of the way we choose to talk about it. He calls the three ways of representing behavior “stances.”

  When we talk about beliefs, thoughts, and intentions we are using what he calls “the intentional stance.” To say that a bird flies because it knows or thinks it is in danger and feels a need to escape is the intentional stance. To predict that a bird will fly when it flaps its wings, on the basis of the aerodynamic shape of the wings, is the stance of the designer and engineer. To describe the free fall of a bird in the language of mechanics, using terms like mass, velocity, and distance, is the stance of the physicist.

  From the point of view of the intentional stance, the disk drive of the computer is “reading” the disk. The engineer takes the design stance in saying, of the same activity, that laser light is reflecting off grooves in the rotating disk. The physicist takes the physical stance in saying that light is being converted into a flow of electrons.

  For the non-technical user, the vocabulary of “searching and reading” may be the most convenient way of describing the computer. However, it would be odd to say that the house thermostat believed the room was cold and decided to turn the furnace on because it wanted to keep the temperature constant. By contrast, it would be entirely appropriate to say that John chose to leave the room because he felt too hot. For Dennett, what stance we adopt is not a matter of a metaphysical difference between the nature of humans and machines, but a difference in the choice of vocabularies. There are circumstances in which it would be entirely appropriate to describe John’s behavior in terms of biochemical and neurological processes without mentioning beliefs, decisions, intentions, or volitions.
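
  Dennett’s point can be illustrated with a minimal sketch. In the toy Python model below (the setpoint and temperatures are made up), the mechanism is a single comparison; the three stances are three vocabularies for describing that same comparison, and nothing in the code changes as we move between them:

```python
# A toy thermostat. The code is one mechanism; the three "stances" are
# three ways of describing what it does. Names and numbers are invented.

SETPOINT = 20.0  # degrees Celsius

def thermostat_step(room_temp: float) -> bool:
    """Return True if the furnace should run.

    Intentional stance: the thermostat 'believes' the house is cold and
      'wants' to keep the temperature constant, so it 'decides' to heat.
    Design stance: the device was designed to close a switch whenever
      the sensed temperature falls below the setpoint.
    Physical stance: a bimetallic strip contracts as it cools, closing
      an electrical contact that completes the furnace circuit.
    """
    return room_temp < SETPOINT

print(thermostat_step(17.5))  # True: 'my furnace believed the house was cold'
print(thermostat_step(22.0))  # False
```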

  The gradual acceptance of androids and machines into the family of things that we call persons is a matter of a change in linguistic conventions. Whether we regard humans, animals, or machines as having rights and treat them with moral regard is the result of changes in patterns of communication necessitated by social forces. Philip Dick has represented a world that exists in the aftermath of nuclear war, when technology has become sophisticated enough to convincingly simulate human beings and animals, and where economic conditions have become such that the use of androids is a necessity. Under such extreme conditions we can expect the linguistic practices of a culture to change with accompanying changes in its metaphysical, moral, and epistemological beliefs. This would explain Deckard’s profound change in attitude toward androids in his attraction to Luba Luft and his sexual encounter with Rachael Rosen.

  Shaky Theological Foundations

  Mercerism is a religion that employs a device called an Empathy Box, allowing users to experience fusion with other human beings and with the Sisyphus-like sufferings of Mercer. Mercerism holds that empathy with others is distinctly human. This belief system enables the character Isidore to deal with his loneliness. He’s a so-called “chickenhead” and “special,” whose IQ is too low to permit him to emigrate to an off-world colony. Mercerism serves as the metaphysical underpinning that justifies hunting down and killing androids. However, when that ideological foundation is exposed as a fraud by Buster Friendly, empathy, as the necessary condition that defines humans and serves as the theoretical basis of the Voight-Kampff test, loses its quasi-theological basis. For Deckard, the loss of that theological foundation, his attraction to and admiration of the talents of Luba Luft, his sexual encounter with Rachael, the callousness of his fellow bounty hunter Phil Resch, his sympathy for the artificial toad that he finds in the desert, and the difficulty of detecting the new Nexus-6 androids all contribute to undermining his conviction of the difference between humans and androids, and to deconstructing it for the reader.

  The Right Stuff

  Is functional equivalence to human behavior sufficient to say that androids are sentient? John Searle, who rejects the positivism of Turing and the linguistic critique of Dennett, thinks not. Searle imagines someone who knows no Chinese locked in a room with a rulebook, written in English, for manipulating Chinese symbols. By following the rules, the person can return answers to incoming Chinese questions that are indistinguishable from a native speaker’s. Searle claims that no matter how clever the rules, the mere mechanical manipulation of symbols according to syntax never yields meaning and understanding: the Chinese characters remain meaningless to the person in the room.
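
  A trivial sketch dramatizes Searle’s point. The Python program below (the rulebook entries are invented, and vastly simpler than anything Searle envisions) returns fluent-looking Chinese answers by pure lookup; like the person in the room, it attaches no meaning to any symbol it manipulates:

```python
# The 'room': a rulebook pairing input symbols with output symbols.
# Neither the table nor the function attaches any meaning to the symbols;
# the entries are illustrative, not a real conversation system.
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",    # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "会。",        # "Do you speak Chinese?" -> "Yes."
}

def room(symbols: str) -> str:
    """Match the input's shape against the rulebook; no understanding involved."""
    return RULEBOOK.get(symbols, "请再说一遍。")  # "Please say that again."

print(room("你好吗？"))  # Looks like comprehension; it is only lookup.
```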

  Even if a machine could equal the human brain and pass the Turing Test, functional equivalence is not a sufficient condition for intentional content. For Searle, passing the Turing Test is not enough to say that the machine has a mental life. Searle’s position is still a thoroughgoing materialism: hydrogen and carbon molecules are insentient, but something special in the biological mix of the human brain produces awareness. Even this position, however, is consistent with Dick’s representation of androids. After all, androids are not like electric sheep.

  Electric animals are mechanical artifacts that only appear biological. Isidore looks for a “concealed control panel . . . plus quick charge battery terminals” on a cat that was picked up for repair. Milt, one of the repairmen, comments, “It has been our experience that the owner of the animal is never fooled.” By contrast, androids are, as Rachael explains to Deckard moments before they go to bed, “chitinous reflex machines.” They are biological, and detecting them requires technical subtlety. A bone marrow test has to be performed on the body of Mr. Polokov to verify that he was a humanoid robot. The Boneli test measures “the reflex-arc response taking place in the upper ganglia of the spinal column,” which requires “several microseconds more in the humanoid robot than in a human nervous system.”

  Androids are clearly a different order of artificial life than electric sheep. So, together with the functional equivalence to humans, androids may have the right stuff to experience mental life, and satisfy Searle’s condition for intentionality.

  The position of Turing, and the view of Philip Dick’s novel and the movie Blade Runner, is that if it behaves intentionally, it is intentional; if it behaves consciously, it is conscious; and if it functions thoughtfully, it thinks. From the point of view of postmodernism and the philosophy of language, the difference between humans and machines that simulate humans is a matter of choosing what language game to play in describing their behavior.

  Either way, whether it is human or machine, we have the option of the vocabulary of intentionality or the vocabulary of mechanical causality. As represented by Philip Dick, this is the most humane, indeed the most humanistic, view of the relation between the human and the machine.

  Soul Against the Archons

  09

  Matt Damon Is a Vast Sinister Conspiracy

  D.E. WITTKOWER

  Matt Damon is not your friend. Hell, he isn’t even human. Matt Damon is a product on the market for our enjoyment. Matt Damon is a story we like to tell ourselves about talent leading to success in a free society. Matt Damon is someone we all know and can use as a connection between us, even though we’ve never met him. Matt Damon is a tool to keep us in line; a simulacrum for us to look up to; a replicant who lives an interesting life so that we don’t see that ours is so dull and so prefabricated. Matt Damon is a vast, sinister conspiracy.

  There are a lot of people behind Matt Damon—executives, agents, writers, PR teams for all the films he’s signed on to. One more-or-less central figure behind Matt Damon is a forty-year-old guy who happens to have the same name and, for his part, seems like a nice guy. But Matt Damon is a nexus of power and control much larger than this particular guy who “is” Matt Damon.

  To the credit of that guy who plays the role of (“is”) Matt Damon, he seems to be aware of this and to have a good sense of humor about it. If you’re not familiar with the song he did with Sarah Silverman, you should look it up. But not at work, or near children.

 
