Conformity


by Cass R. Sunstein


  Reasons and Blunders

  Why do people sometimes ignore the evidence of their own senses? The two principal explanations involve information and peer pressure. Some of Asch’s subjects seem to have thought that the unanimous confederates must be right. But other people, though believing that group members were unaccountably mistaken, were unwilling to make, in public, what those members would see as an error. They falsified their own views. They said something they believed to be untrue.

  In Asch’s own studies, several conformists said, in private interviews, that their own opinions must have been wrong22—suggesting that information, rather than peer pressure, was what moved them.23 This informational account is strengthened by a study in which people recorded their answers anonymously but gave nearly as many wrong answers as they had under Asch’s own conditions.24 A similar study finds that conformity is not much lower when the subject’s response is unavailable to the majority.25

  On the other hand, these are unusual results, and experimenters generally do find significantly reduced error, in the same basic circumstances as Asch’s experiments, when subjects are asked to give a purely private answer.26 This finding suggests that people did not really believe their own senses were misleading them; they were trying instead not to look stupid in front of other people. And in experiments in which conformity or deviation is made very visible, conformity grows.27 The finding also suggests that peer pressure matters a great deal and helps explain Asch’s findings.

  Asch’s own conclusion was that his results raised the possibility that “the social process is polluted” by the “dominance of conformity.”28 He added, “That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern.”29 As I have noted, Asch’s experiments produce broadly similar findings across nations, and so in Asch’s sentence just quoted, the word “society” could well be replaced with the word “world.”

  We should stress a separate point here: many people are not willing to disclose their own information to the group, even though it is in the group’s interest to learn what is known or thought by individual members. To see this point, imagine a group almost all of whose members believe something to be true even though it is false. Imagine too that one member of the group, or a very few members, know the truth. Are they likely to correct the dominant view? If Asch’s findings generalize, the answer is that they may not be. They are not reticent because they are irrational. They are making a perfectly sensible response to the simple fact that the dominant view is otherwise—a fact suggesting either that the small minority is wrong or that its members will risk their own reputations if they insist they are right. As we shall see, Asch’s findings help explain why groups can end up making unfortunate and even self-destructive decisions.

  There have been significant developments, of course, in the decades since Asch did his original research. Some of the most interesting work makes a sharp distinction between compliance and acceptance.30 People comply when they defer to others whom they believe to be wrong. In that case, they will conform in public but not in private. People accept when they internalize the view of the group. As we have seen, Asch’s findings involve a degree of both compliance and acceptance. More recent empirical work also finds evidence of both, with a further finding that as the size of the majority expands, compliance increases.31

  Both theoretical and empirical research has also explored whether conformity works by changing people’s beliefs or instead their preferences and tastes, finding that researchers have focused excessively on the former.32 There has been important clarifying work about the kinds of activities that will see high levels of conformity, and thus fads and fashions.33 We also know more about the kinds of people who are most likely to conform34 and the circumstances that heighten or diminish conformity in Asch-like settings. If, for example, people are reminded of circumstances in which they have acted without inhibition, they are more likely to conform.35 In general, and with qualifications that are not central to my argument here, Asch’s central findings have held up.

  Would those findings apply to judgments about morality, policy, and law? It seems jarring to think that people would yield to a unanimous group when the question involves a moral, political, or legal issue on which they have great confidence. But if Asch is correct, such yielding should be expected, at least some of the time. We will find powerful evidence that this happens within federal courts of appeals in the United States. The deadening effect of public opinion was of course a central concern of John Stuart Mill, who insisted that protection “against the tyranny of the magistrate is not enough” and that it was also important to protect “against the tyranny of the prevailing opinion and feeling; against the tendency of society to impose, by other means than civil penalties, its own ideas and practices as rules of conduct on those who dissent from them.”36

  Mill’s focus here is on the adverse effects of conformity not only on the individuals who are thus tyrannized but also on society itself, which is deprived of important information. I do not think it irrelevant that the love of Mill’s life started as an illicit affair. (His lover and eventual wife, Harriet Taylor, was married when their relationship began.) Mill’s relationship with Taylor produced widespread opprobrium in their circles and a rupture from his own family. In his writing, Mill celebrated freedom from social convention and “experiments of living.” His attack on conformity was general, emphasizing as it did the importance of following one’s own path, free of “the tyranny of the prevailing opinion and feeling.” But Mill practiced what he preached. The idea of “experiments of living” deserves emphasis in the annals of freedom.

  How to Increase (or Decrease) Conformity

  What factors increase or decrease conformity? Consistent with Sherif’s findings, people are less likely to conform if they have high social status or are extremely confident about their own views.37 They are more likely to conform if the task is difficult or if they are frightened.38 Consider some other ways to make conformity more or less likely.

  Financial Rewards

  Financial rewards for correct answers affect performance, and in two different ways.39 When people stand to make money if they are right, the rate of conformity is significantly decreased in the same basic condition as the Asch experiments, so long as the task is easy. People are less willing to follow group members when they stand to profit from a correct answer. We can see why that happens. If you know what is right, and if you will make money by saying what is right, you will probably say what is right, even if people around you are blundering.

  But there is a striking difference when the experiments are altered to make the underlying task difficult. In that event, a financial incentive, rewarding correct answers, actually increases conformity. People are more willing to follow the crowd when they stand to profit from a correct answer if the question is hard. Perhaps most strikingly, the level of conformity is about the same, when financial incentives are absent, in low-difficulty and high-difficulty tasks—but the introduction of financial rewards splits the results on those tasks dramatically apart, with significantly decreased conformity for low-difficulty tasks and significantly increased conformity for high-difficulty tasks.40

  These results have straightforward explanations. A certain number of people, in the Asch experiments, actually know the right answer and give conforming answers only because it is not worthwhile to reject the shared view of others in public. But when a financial incentive is offered, peer pressure is outweighed by the possibility of material gain. The implication is that an economic reward can counteract the effects of social pressures. There is a lesson here for groups of all kinds—schools, private employers, and governments. If people know they will gain if they say what they know, then groups are more likely to obtain crucial information.

  By contrast, difficult tasks leave people with a great deal of uncertainty about whether they are right. In such circumstances, people are all the more likely to give weight to the views of others, simply because those views may well be the most reliable source of information. If you are asked to solve a difficult math problem, or to describe the most sensible approach for reducing deaths on the highways, you might defer to the wisdom of the room. Consider in this regard the parallel finding that people’s confidence in their own judgments is directly related to the confidence shown by the experimenter’s confederates.41 When the confederates act with confidence and enthusiasm, subjects also show heightened confidence in their judgments, even when they are simply following the crowd. Consider also the broad claim that imitation of most other people can operate as a kind of “fast and frugal” heuristic, one that works well for many creatures, including human beings, in a wide variety of settings.42 If you are not sure what to do, you might well do what others do. Like most heuristics, the imitation heuristic, while generally sensible and often the best available, produces errors in many situations.43

  There is a disturbing implication. A “majority consensus” is “often capable of misleading individuals into inaccurate, irrational, or unjustified judgments.” Such a consensus “can also produce heightened confidence in such judgments as well.”44 It follows that “so long as the judgments are difficult or ambiguous, and the influencing agents are united and confident, increasing the importance of accuracy will heighten confidence as well as conformity—a dangerous combination.”45 As we shall see, the point very much bears on the sources of unjustified extremism, especially under circumstances in which countervailing information is unavailable. Extremists are often following one another.

  The Size of the Group

  Asch’s original studies found that varying the size of the group of confederates, unanimously making the erroneous decision, mattered only up to a group size of three; increases beyond that point had little effect.46 Using one confederate did not increase subjects’ errors at all; using two confederates increased errors to 13.6 percent; and using three confederates increased errors to 31.8 percent, not substantially different from the level that emerged from further increases in group size. But Asch’s own findings appear unusual on this count. Subsequent studies have usually found that, contrary to Asch’s results, increases in the size of the group of confederates do increase conformity.47

  A Voice of Sanity

  More significantly, a modest variation in the experimental conditions makes all the difference. The existence of at least one compatriot, or voice of sanity, dramatically reduces both conformity and error. When one confederate made a correct match, errors were reduced by three-quarters, even if there was a strong majority the other way.48 There is a clear implication here: If a group is embarking on an unfortunate course of action, a single dissenter might be able to turn it around, by energizing ambivalent group members who would otherwise follow the crowd.

  It follows that affective ties among members, making even a single dissent less likely, might well undermine the performance of groups and institutions. Consider here Brooke Harrington’s brilliant study of the performance of investment clubs—small groups of people who pool their money to make joint decisions about stock market investments.49 The worst-performing clubs were built on affective ties and were primarily social; the best-performing clubs had limited social connections and were focused on increasing returns. Dissent was far more frequent in the high-performing clubs. The low performers usually had unanimous votes, with little open debate. Harrington found that the votes in low-performing groups were “cast to build social cohesion rather than to make the best financial choice.”50 In short, conformity resulted in significantly lower returns.

  Being In or Out

  Much depends on the subjects’ perceived relationship to the experimenters’ confederates and in particular on whether the subjects consider themselves part of the same group as those confederates. If the subjects identify themselves as members of a different group from the majority, the conformity effect is greatly reduced.51 People are especially likely to conform when the group consists of people whom subjects like or admire or with whom they otherwise feel connected.52 The general point explains why group membership is often emphasized by those who seek to increase or decrease the influence of a certain point of view—such as conservatives, liberals, Catholics, Jews, socialists, Democrats, and Republicans. Perhaps advocates can be discredited, with the relevant group, by showing that they are “conservative” or “leftists,” and so prone to offer unacceptable views. I have referred to the phenomenon of “reactive devaluation,” by which people devalue arguments and positions simply because of their source.

  Thus conformity—and potentially error—is dramatically increased, in public statements, when subjects perceive themselves as part of a reasonably discrete group that includes the experimenter’s confederates (all of whom are psychology majors, for example).53 By contrast, conformity is dramatically decreased, and error correspondingly reduced, in public statements when subjects perceive themselves as in a different group from the experimenter’s confederates (all of whom are ancient history majors, for example).54

  Notably, private opinions, expressed anonymously afterward, were about the same whether or not the subjects perceived themselves as members of the same group as others in the experiment. And people who thought they were members of the same group as the experimenter’s confederates gave far more accurate answers, and far less conforming answers, when they were speaking privately.55 In the real world, would-be dissenters might silence themselves when and because they are in a group of like-minded others—partly because they do not want to risk the opprobrium of those others and partly because they fear they will, through their dissent, weaken the effectiveness and reputation of the group to which they belong.

  There is a large lesson here. Publicly expressed statements, showing agreement with a majority view, may be both wrong and insincere, especially when people think of themselves as members of the same group as the majority.56 The finding of heightened conformity is linked with evidence of poor performance by groups whose members are connected by affective ties; in such groups, people are less likely to say what they know and more likely to suppress disagreement. A system of checks and balances, attempting to ensure that ambition will check ambition, can be understood as a way of increasing the likelihood of dissent and of decreasing the likelihood that members of any particular group, or institution, will be reluctant to disclose what they think and know.57

  Shocks, Authority, and Expertise

  In the Sherif and Asch experiments, no particular person has special expertise. No member of the group shows unusual measurement abilities or wonderful eyesight. But we might safely predict that subjects would be even more inclined to blunder if they had reason to believe that one or more of the experimenters’ confederates was particularly likely to be correct. This hypothesis receives support from a possible interpretation of one of the most alarming findings in modern social science, involving conformity not to the judgments of peers but to the will of an experimenter.58 These experiments are of independent interest, because they have implications for social influences on judgments of morality, not merely facts.

  The experiments, conducted by the psychologist Stanley Milgram, ask people to administer electric shocks to a person sitting in an adjacent room.59 Subjects are told, falsely, that the purpose of the experiment is to test the effects of punishment on memory. Unbeknownst to the subject, the victim of the electric shocks is a confederate and there are no real shocks. The apparent shocks are delivered by a simulated shock generator, offering thirty clearly delineated voltage levels, ranging from 15 to 450 volts, accompanied by verbal descriptions ranging from “slight shock” to “danger: severe shock.”60 As the experiment unfolds, the subject is asked to administer increasingly severe shocks for incorrect answers, up to and past the “danger: severe shock” level, which begins at 400 volts.

  In Milgram’s original experiments, the subjects included forty men between the ages of twenty and fifty. They came from a range of occupations, including engineers, high school teachers, and postal clerks.61 They were paid $4.50 for their participation—and were also told they could keep the money no matter how the experiment went. The “memory test” involved remembering word pairs; every mistake by the confederate/victim was to be met with an electric shock and a move one level higher on the shock generator. To ensure that everything seems authentic, the subject is, at the beginning of the experiment, given an actual sample shock at the lowest level. But the subject is also assured that the shocks are not dangerous, with the experimenter declaring, in response to a prearranged question from the confederate, “Although the shocks can be extremely painful, they cause no permanent tissue damage.”62

  In the original experiments, the victim does not make any protest until the 300-volt shock, which produces a loud kick, by the victim, on the wall of the room where he is bound to the electric chair. After that point, the victim does not answer further questions and is heard from only after the 315-volt shock, when he pounds on the wall again—and is not heard from thereafter, even with increases in shocks to and past the 400-volt level. If the subject indicates an unwillingness to continue, the experimenter offers prods of increasing firmness, from “Please go on” to “You have no other choice; you must go on.”63 But the experimenter has no power to impose sanctions on subjects.
