In the 1950s and 1960s, psychologists turned up, along with their new findings about cognitive influences on motivation, a wealth of evidence that the mind, rather than the viscera, thalamus, or limbic system, is often the major source of emotional experiences and their physical symptoms. Some of that evidence:
—For half a century it had been known that when a person guilty of some crime is read a list of words or asked questions, some of which are neutral and others of which relate to the crime, the latter often cause a rise in the suspect’s blood pressure and galvanic skin response. In the 1950s and 1960s further research found other telltale symptoms and improved the technology of lie detection equipment. The premise that the conscious mind influences the emotions—or at least guilty anxiety and its associated physical symptoms—was confirmed.51
—In 1953 Howard S. Becker, a sociologist, studied fifty people who had become marijuana users. He found, among other things, that new users have to be taught to notice and identify what they feel, label the state as “high,” and identify it as pleasant. The physiological feelings of the high acquire their meaning in considerable part from cognitive and social factors.52
—In 1958, in a celebrated experiment, Joseph Brady subjected pairs of monkeys to regular stress in the form of electric shock. One monkey of each pair could postpone the shock for twenty seconds by pressing a lever; the other monkey’s experiences were linked to the first one’s. (It was shocked or not, according to what the first one did or failed to do.) Surprisingly, the monkeys who could avoid the shock developed ulcers; the passive ones did not. Evidently the anticipation and burden placed on the first monkey by the ability to control the shock produced anxiety and its somatic symptoms. Those in the shock-controlling group were soon dubbed “executive monkeys,” their situation being likened to that of human executives working under high pressure and constant anticipation of crisis.53 It was not, however, only anticipation that caused ulcers; it was also uncertainty about when to take action. When a researcher named Jay Weiss repeated Brady’s experiment (with rats instead of monkeys), he added a warning tone that signaled the executive rats (but not the passive ones) to take action. Both groups developed ulcers, but the executive rats, thanks to the security of the warning tone, developed distinctly fewer than the passive rats.54
—In 1960, Eckhard Hess (whom we saw, a while back, imprinting mallard ducklings on a mechanical mother) photographed the eyes of volunteers while they looked at different pictures. The pupils of the men widened when they saw pictures of women, especially pin-ups; the pupils of the women did so when they saw pictures of babies, particularly of one with his mother. The mind, recognizing and evaluating the content of the pictures, sent signals to the limbic system, which then generated both peripheral and central responses, namely, the pupillary widening and a sense of heightened interest.55
By far the most impressive experiment on cognitive influences on the emotions was conducted in 1962 by Stanley Schachter (1922–1997) and Jerome Singer; it yielded a theory that dominated emotion research for twenty years. Schachter, whom we last saw enacting with gusto the role of a true believer in a cult expecting a worldwide flood, was a bluff, craggy-faced man with a zany sense of humor and, in the 1960s, a taste for daring and deceptive experimentation. Only such a person could have conceived of and coolly carried out the historic work in question.
Schachter, after reviewing the evidence for and against the James-Lange and Cannon-Bard theories, had concluded that “the variety of emotion, mood, and feeling states are by no means matched by an equal variety of visceral patterns,” and, like a number of other psychologists, came to believe that cognitive factors might be the major determinants of emotional states. He and Singer hypothesized that human beings cannot identify an emotion from the physical symptoms they are experiencing but must rely on external clues. The mind, using those clues, labels what the body is feeling as anger, joy, fear, and so on.
To test their hypothesis, Schachter and Singer asked volunteers for permission to inject them with Suproxin, supposedly a vitamin preparation, to investigate its effects on vision. In reality, the material injected was adrenaline, which causes the heart to race, the face to flush, and the hands to tremble—as do certain strong emotions. Some subjects were told in advance that Suproxin had these side effects, others were not.
Just before each subject began to feel the effects of the adrenaline, he was ushered into a room where he and another student (a confederate), who supposedly had also just had the vitamin injection, had to fill out a five-page questionnaire. The confederate enacted one of two scripts that he had rehearsed. In the presence of some subjects he would act giddy, silly, and happy. He would doodle, pitch crumpled paper into a distant wastebasket in a “basketball game,” make a paper airplane and sail it around the room, play with a hula hoop, and so on, meanwhile saying things like “This is one of my good days. I feel like a kid again.” With other subjects he would grumble about the length of the questionnaire and become annoyed by the questions (which grew ever more personal and insulting, one of the last being, “With how many men has your mother had extramarital relations?”—to which the lowest multiple-choice answer was “4 and under”). Finally he would rip up the questionnaire, throw the pieces on the floor, and storm out of the room.
Through a one-way screen, the researchers observed and rated each volunteer’s behavior and afterward had him fill out a mood scale indicating how irritated, angry, or annoyed, or conversely how good or happy, he felt. The results were arresting. Of the volunteers who had not been told about the injection’s effects, those who had seen the confederate being euphoric had also behaved, and said they felt, euphoric, and those who had seen him being irritated and angry had also behaved, and said they felt, irritated and angry. But volunteers who had been told in advance about Suproxin’s physiological side effects gave no such responses; they already had an adequate cognitive explanation for their feelings. Schachter and Singer’s historic conclusion:
Given a state of physiological arousal for which an individual has no immediate explanation, he will label this state and describe his feelings in terms of the cognitions available to him. To the extent that cognitive factors are potent determiners of emotional states, it should be anticipated that precisely the same state of physiological arousal could be labeled “joy” or “fury” or “jealousy” or any of a great diversity of emotional labels depending on the cognitive aspects of the situation.56
The cognitive theory of emotional arousal was a smashing success. It not only illustrated the importance of cognition, the new passion of psychologists, but made sense of a mass of previously bewildering findings. Over the next two decades a huge amount of related research was conducted by psychologists, some of which qualified or contradicted the Schachter-Singer theory but most of which confirmed and added to it. A few highlights of the findings:
—Schachter and his colleague Larry Gross enlisted volunteers, some obese and some normal, in what they were told was a study of how somatic reactions are related to psychological traits. The experimenters conned each volunteer into handing over his watch when electrode paste was applied to his wrists; the electrodes being attached to him served only to disguise the removal of the watch. The researchers also left a box of crackers in the room and told the volunteer—who was alone during the experiment—to help himself. In the room was a doctored clock, running either at half speed or at double speed. After a while some volunteers thought it was their dinnertime although it was still early, others that it was not yet their dinnertime although it was late. The obese volunteers ate more crackers when they thought it was beyond their regular dinner hour than when they thought it was not yet their dinner hour; normal volunteers ate the same number no matter what time they thought it was. The conclusion: Not the stomach but the mind of the obese volunteers determined their feelings of hunger.57
—Another research team had an attractive female confederate approach men students as they were crossing either a swaying suspension bridge over a deep canyon or a low, solid bridge. In each situation the confederate’s cover story was that for a research project she wanted them to fill out a questionnaire and make up a brief story about a picture. She gave each man her name and telephone number so that he could call her if he wanted to know more about the project. The men she approached on the frightening suspension bridge wrote stories with more sexual imagery and were more likely to call her and ask for a date than the men she approached on the low, firm bridge. The experimenters concluded that the men approached on the frightening bridge interpreted their anxiety as the first stage of sexual attraction. As per Schachter-Singer theory, the men had taken an external clue—the presence of the attractive woman—as the explanation of their physical feelings.58
—In the late 1970s Paul Rozin and Deborah Schiller of the University of Pennsylvania undertook an inquiry into how and why human beings develop a liking for a painful stimulus, in this case chili pepper in food. Rozin and Schiller interviewed college students in Philadelphia and Mexicans in a highlands village near Oaxaca, and discovered that initially the response to chili pepper by children is almost always strongly negative; this ruled out the possibility that chili lovers are relatively insensitive to the irritant. They found that the initial dislike of the painful sensation changed because of the mother’s training and the social situation (especially in Mexico). The recognition that the burning sensation is considered desirable led the children to develop a liking for it—again, evidence that the mind decides how a sensation is to be interpreted.59
—Sexual arousal and mating behavior are automatically triggered in insects by pheromones (attractant secretions); even in mammals, odors produced by a female in heat switch on sexual desire and activity in the male, as every dog owner knows. Moreover, in many mammals hormone levels in both male and female determine when they have the mating urge. But in human beings, pheromones and hormone levels have only a limited relation to sexual interest. A vast amount of anthropological, historical, and psychological research data attests that human sexual arousal is largely a matter of cognitive responses—reactions to clues specific to each culture.60 Three scraps of evidence, out of thousands available:
In some cultures, the female breast, normally concealed, is powerfully exciting to men; in those where it is routinely exposed, it is not. Similarly, at the turn of the twentieth century, a woman’s ankle was an erotic sight for Western men; by the 1980s, in magazines like Playboy and Penthouse, photographs of completely nude women were considered marginally erotic and only those featuring a clear view of the pudenda, preferably tumescent and open, were thought of as highly arousing.
Alfred Kinsey’s historic surveys of American sexual behavior, conducted during the 1940s and published in 1948 and 1953, found that women were much less often stimulated by erotic materials than men. But a national survey made nearly three decades later found that the sexual revolution and the women’s movement had made women far more arousable by erotic material than formerly. Again, in Kinsey’s era women were much less likely than men to experience orgasm in intercourse; by the time of the later survey they had become considerably more orgasmic.61
Volunteers in an experiment were exposed to erotica while carrying out difficult arithmetical tasks; although they were aware of the erotic stimuli, they did not become aroused. Apparently, to become excited by erotic material, the viewer or reader must fantasize himself or herself as part of the action; the participants in the experiment were too preoccupied by their work to do so.62
As early as the 1930s, but chiefly from the 1950s on, researchers in other areas of psychology were also turning up evidence that cognitive processes are a major source of human emotions and motivations. To do justice to the diverse research would require volumes; we will content ourselves with a few paragraphs about each of four examples:
I
In the mid-1930s, as we have seen, the Harvard personality researcher Henry Murray developed the Thematic Apperception Test (TAT) to measure aspects of personality, especially unconscious ones. Drawing on psychoanalytic theory, he framed these in the form of thirty-five needs: for orderliness, dominance, deference, aggression, abasement, nurturance, affiliation (belonging and friendship), and others. Each of the thirty-five was a motivating force, and many were investigated from that angle in the following years.
Perhaps the most intensively researched was the need for achievement, or, as it is referred to in psychological literature, nAch. In the 1950s and 1960s, David McClelland and his colleagues at Wesleyan University in Connecticut produced a number of valuable studies of the personality and behavior of people with high nAch and of its sources. Among their findings: Persons high in nAch prefer tasks that offer concrete feedback and hence tend to choose work in which it is possible to see growth and expansion … Boys high in nAch had mothers who expected them, from an early age, to be independent and self-reliant, and who put fewer restrictions on them than mothers of low-nAch boys … A survey of twenty-three modern societies found that the value a society places on achievement is reflected in its children’s stories and is correlated with its increase in electrical production in recent years.63
All of which indicates that motivation to achieve is acquired from one’s parents and society, and is thus cognitive in nature.
II
Freud held that the ego or largely conscious self develops as the child learns to control his or her impulse to obtain immediate gratification, and to postpone seeking satisfaction for the sake of greater reward or social acceptability. Thus, motivation in the older child and adult, though powered by the drive to obtain pleasure, is cognitively directed.
In the 1950s and beyond, experimental evidence gathered by developmental psychologists supported Freud’s ego-development theory. Walter Mischel and his collaborators, for example, offered children the choice of an immediate small reward or a delayed larger one. At seven, most children chose the immediate reward; at nine, most of them chose the delayed larger one.64
Meanwhile, the writings of the psychoanalysts Anna Freud and Heinz Hartmann had been bringing about a change in the focus of psychodynamic psychology. The ego was found to be more powerful and influential, the id less so, than had been thought. To psychologists who were psychodynamically oriented, this meant that in large part the human adult is motivated by conscious wishes, ego defense mechanisms, and values. By the 1950s, therefore, psychotherapists and academic psychologists were actively exploring positive cognitive forces used by the ego to combat stress, in particular hope to counteract anxiety when facing uncertainties, and coping mechanisms to deal with problems rather than irrational reactions and self-defenses.65
III
Most twentieth-century psychologists across the spectrum from Freud to Skinner were determinists. As scientists, they believed that human behavior, like all events in the real world, is caused; every thought and act is the result of antecedent events and forces. This premise seemed to them essential to the status of psychology as a science. In this view, if individuals could behave as they chose—if some or much of their behavior were determined by their will, operating freely, rather than by past experiences and present forces—there could not be a body of rigorous laws concerning behavior. Accordingly, the term “will” had largely disappeared from psychology by midcentury and has not been mentioned since, even in passing, in most textbooks, although an excellent current one does list it in the index as “will, illusion of conscious.”66
But the concept has refused to die; it lives on in altered form and under other names, and for good reason.
For one thing, the goal of psychotherapy is to liberate the patient from the control of unconscious forces. This can only mean that the patient becomes capable of consciously weighing and judging the alternatives and deciding how to behave. But what is a decision if not a volitional act?
For another, developmental psychologists had found that a crucial feature of children’s mental development is the gradual appearance of “metacognition”—awareness of their own thought processes and ability to manage them. Children slowly discover that there are ways to remember things, to formulate problem-solving strategies, to categorize objects; they begin to exercise conscious and voluntary control of their own thought processes.67
For yet another, cognitive psychology had to devise a modern equivalent of will to account for the phenomenon of decision making, observed in innumerable studies of thinking and problem solving. Artificial intelligence experts refer to the “executive functions” of programs that simulate thinking; that is, the parts of such programs that weigh the results achieved at any point and determine what steps to take next. Some theorists say that the human mind, likewise, has executive machinery that makes decisions. But the decisions made by an AI program are fully predictable, while predictions of a human being’s decisions are often wrong. Why? Is there, after all, some area of freedom within human choice, some kind of free will within voluntary control? We will look further into this enigma in the final chapter; for now, it is enough to note that whether one views decision making as a fully predictable executive process or as a voluntary act, its motivation is of cognitive origin.68
IV
Murray suggested in the 1930s that social factors are often a source of motivation, but the suggestion lay fallow; in the 1950s, with the growth of social psychology and humanistic psychology, psychologists became interested in “social motivation.”69 This was an important component of an integrated theory of motivation put forward in 1954 by Abraham Maslow, the leader of the humanistic psychology movement of the 1950s and 1960s.