The upshot is that dissenters, disclosing their own private information, need to be encouraged to speak out, simply because they confer benefits on those who observe them. The point applies to many organizations. And if the point is put together with an emphasis on the risk of cascades on courts, there is fresh reason to appreciate judicial dissents, if only because they increase the likelihood that majority decisions will receive critical scrutiny. Note here that within the U.S. Supreme Court alone, dissenting opinions have frequently become the law, indeed have become the law on well over 130 occasions—a point to which I will return.
This claim has an implication for appropriate institutional arrangements: any system that creates incentives for individuals to reveal information to the group is likely to produce better outcomes. A system of majority rule in which individuals know their well-being will be promoted (or not) depending on the decision of the group therefore has significant advantages. Well-functioning organizations, public as well as private, are likely to benefit from this insight. In this light, we might even offer a suggestion about the nature of civic responsibility: in case of doubt, citizens should reveal their private signal, rather than disguising that signal and agreeing with the crowd. Perhaps counterintuitively, this kind of behavior is not optimal from the point of view of the individual who seeks to get things right, but it is best from the point of view of a group or nation that seeks to use all relevant information.
It is important to make some distinctions here. The majority-rewarding variation on the urn experiment gives people an incentive to disclose accurate information that they have. This is the information from which the group benefits, and this is the information that does not emerge if people are rewarded for correct individual decisions. Full disclosure of accurate information is a central goal of institutional design. But the experiment does not suggest that a group is better off if people always disagree, or even if they always say what they think. In the tale of “The Emperor’s New Clothes,” the boy is not a skeptic or a malcontent. On the contrary, he is a particular kind of dissenter; he is a discloser, revealing the information that he actually holds. The majority-rewarding variation of the urn experiment encourages subjects to act like that boy.
By contrast, we can imagine a different kind of person, the contrarian, who thinks he will be rewarded, financially or otherwise, simply for disagreeing with others. There is no reason to celebrate the contrarian. In many cases, contrarians are unlikely to give any help to the group. If contrarians are known as such, their signals will be very noisy and not very informative. If contrarians are not known as such, they are often failing to disclose accurate information, simply because they are contrarians rather than disclosers; in that sense, they are not helping the group to arrive at correct decisions. We could imagine a variation on the urn experiment in which a contrarian confederate regularly announced the opposite of what his predecessor announced. It is safe to predict that such behavior would reduce cascades, but it would not reduce errors by individuals or groups. On the contrary, it would increase them.
Dissenters who are disclosers, then, are to be prized, at least if they are disclosing some important truth about the issue at hand. By contrast, dissenters who are contrarians are at best a mixed blessing. And we can also imagine dissenters who do not disclose a missing fact but instead simply state a point of view that would otherwise be missing from group discussion. Such dissenters might urge, for example, that a lot of immigration increases economic growth, that animals should have rights, that school prayer should be permitted, or that capital punishment should be banned. In the domains of politics and law, cascade-type behavior typically leads people to be silent not about facts but about points of view. It is obvious that groups need relevant facts; do they need to know about privately held opinions as well?
They certainly do, and for two different reasons. First, those opinions are of independent interest. If most or many people favor school prayer or believe that capital punishment is morally unacceptable, it is valuable to know that fact. Other things being equal, both individuals and governments do better if they know what their fellow citizens really think. Second, people with dissenting opinions might well have good arguments. Those arguments might depend, in the end, on judgments of fact; they might depend on purely normative claims. It is important for those who conform, fall into a cascade, or independently concur to hear those arguments. This is a standard Millian point,26 to which I will shortly return.
On the federal courts in the United States, some judges suggest that they often offer a "go-along concurrence," joining the majority though they privately disagree. Such judges give a false signal about their actual opinions and, very possibly, their future votes. That is true not only on federal courts. Many people offer "go-along concurrences," in companies, in legislatures, and in the White House. I was privileged to work for President Barack Obama, in the Executive Office of the President, and I saw some "go-along concurrences." When things were working best, people revealed what they thought.
Suppose, as is often the case, that people are rewarded not only or not mostly for being correct but also or mostly for doing what other people do. The reward might be material, in the form of more cash or improved prospects, or it might be nonmaterial, in the form of more and better relationships. In the real world, people are often punished for nonconformity and rewarded for conformity. People who reject the views of leaders or of the majority might well find themselves less likely to be promoted and more likely to be disliked. Organizations, groups, and governments often prize harmony, and nonconformists tend to introduce disharmony. Sometimes it is more important to be “on the team” than to be right. “Sometimes cultural groups adopt very high levels of norm enforcement that severely suppress the individual variations, innovations, and ‘errors’ that innate cultural transmission mechanisms require to generate adaptive evolutionary processes within groups.”27
The likely result should be clear. If rewards come to those who conform, cascade-like behavior will increase, simply because the incentive to be correct is strengthened or replaced by the incentive to do what others do. The magnitude of this effect will depend on the size of the incentive to conform. But whenever the incentive is positive, people will be all the more likely to ignore their private information and to follow others. The opposite result should be expected if people are penalized for following others or rewarded for independence; if so, cascade-like behavior should be reduced or even eliminated. I am now emphasizing the incentive to conform, but in some settings, independence is prized. I will offer a few remarks on that possibility below.
If conformity is rewarded, the problem is especially severe for the earliest disclosers or dissenters, who “may bear especially high costs because they are conspicuous, individually identified, and easy to isolate for reprisals.”28 And if the earliest dissenters are successfully deterred, dissent is likely to be exceedingly rare. Authoritarian governments are well aware of that fact; they try to nip dissent in the bud. But once the number of disclosers or dissenters reaches a certain level, there may be a tipping point, producing a massive change in behavior.29 Indeed a single discloser, or a single skeptic, might be able to initiate a chain of events by which a myth is shattered.
Return to the tale of “The Emperor’s New Clothes”: “A child, however, who had no important job and could only see things as his eyes showed them to him, went up to the carriage. ‘The Emperor is naked,’ he said. . . . The boy’s remark, which had been heard by the bystanders, was repeated over and over again until everyone cried: ‘The boy is right! The Emperor is naked! It’s true!’”30 The power of the tale stems from its familiarity in ordinary life. All of us have seen situations in which someone says the emperor is naked or in which someone might (and should) have done so. The challenge is that it might be very difficult to initiate this process, especially if early disclosers are subject to social or legal sanctions.
Here we can see a potentially beneficial role of misfits and malcontents, who can perform a valuable function in getting otherwise neglected information and perspectives to others. Consider the suggestion that harmful obstacles to cultural improvement come from a "social structure" that eliminates "valuable innovators, experimenters, and error-makers from being viewed as people to copy."31 The qualification, noted above, is that contrarians might help to reduce cascades without reducing errors.
With respect to conformity, these speculations are supported by an ingenious variation on the urn experiment mentioned above.32 In this experiment, people were paid twenty-five cents for a correct decision but seventy-five cents for a decision that matched the decision of the majority of the group. There were punishments for incorrect and nonconforming answers as well. If people made an incorrect decision, they lost twenty-five cents. If their decision failed to match the group’s decision, they lost seventy-five cents.
In this experiment, cascades appeared almost all of the time! No fewer than 96.7 percent of rounds resulted in cascades, and 35.3 percent of announcements did not match the announcer’s private signal, that is, the signal given by his or her own draw. And when the draw of a subsequent person contradicted the announcement of the predecessor, 72.2 percent of people matched the first announcement. Consider, as a dramatic illustration, this period of the experiment (the actual urn for this period was B):33
Subject         1   2   3   4   5   6   7   8   9   10
Private draw    A   B   B   B   A   B   B   B   A   B
Decision        A   A   A   A   A   A   A   A   A   A
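For readers who want to see the mechanics, here is a minimal simulation sketch, in Python, of an incentive structure of this general kind. The twenty-five-cent and seventy-five-cent rewards mirror the experiment described above, but the two-thirds signal accuracy, the Bayesian updating rule, the omission of the penalties, and the treatment of earlier announcements as fully informative signals are simplifying assumptions of mine, not features reported by the experimenters.

```python
# A minimal sketch of a conformity-rewarded urn experiment. The reward
# amounts mirror the text; the signal accuracy (2/3), the flat prior,
# and the naive treatment of earlier announcements as independent
# signals are assumptions made for illustration only.
import random

SIGNAL_ACCURACY = 2 / 3      # assumed chance a private draw matches the true urn
REWARD_CORRECT = 0.25        # payoff for announcing the true urn
REWARD_CONFORM = 0.75        # payoff for matching the majority of prior answers


def posterior_A(num_a_signals, num_b_signals):
    """P(urn is A) given counts of A- and B-indicating signals,
    assuming a flat prior and conditionally independent signals."""
    p, q = SIGNAL_ACCURACY, 1 - SIGNAL_ACCURACY
    like_a = p ** num_a_signals * q ** num_b_signals
    like_b = q ** num_a_signals * p ** num_b_signals
    return like_a / (like_a + like_b)


def announce(private_draw, prior_announcements):
    """Choose an announcement by comparing expected payoffs; earlier
    announcements are (naively) counted as informative signals."""
    a = prior_announcements.count("A") + (private_draw == "A")
    b = prior_announcements.count("B") + (private_draw == "B")
    p_a = posterior_A(a, b)

    n_a = prior_announcements.count("A")
    n_b = len(prior_announcements) - n_a
    majority = "A" if n_a > n_b else "B" if n_b > n_a else None

    def expected_payoff(choice):
        p_correct = p_a if choice == "A" else 1 - p_a
        conform = REWARD_CONFORM if choice == majority else 0.0
        return REWARD_CORRECT * p_correct + conform

    return max(("A", "B"), key=expected_payoff)


def run_round(num_subjects=10, true_urn="B"):
    """Simulate one period: each subject draws privately, then announces."""
    announcements = []
    for _ in range(num_subjects):
        draw = true_urn if random.random() < SIGNAL_ACCURACY else (
            "A" if true_urn == "B" else "B")
        announcements.append(announce(draw, announcements))
    return announcements


if __name__ == "__main__":
    random.seed(0)
    print(run_round())
```

In this stripped-down model, the conformity payoff dominates as soon as any strict majority of earlier announcements exists, so the sequence typically locks in behind the very first announcement; the real experiment shows a softer but recognizably similar pattern.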
The lesson is that institutions that reward conformity and punish deviance are far more likely to produce worse decisions and to reveal less in the way of private information. And here there is a link to the earlier suggestion that serious mistakes are committed by groups whose members are connected by bonds of affection, friendship, and solidarity. In such groups, members are usually less willing, or even unwilling, to state objections and counterarguments, for fear that these will violate generally held norms. Cascades and bad decisions are likely; return to the investment clubs discussed above. We can see here that an organization that depends on affective ties is likely to stifle dissent and to minimize the disclosure of private information and belief; some religious and political organizations are obvious illustrations. A socially destructive norm of conformity aggravates people’s tendency to ignore their private information and to say and do what others do.
If an organization wants to avoid error, it should make clear that it welcomes the disclosure of private signals, simply because that is in the organization’s own general interest. This point might seem counterintuitive, because in most well-functioning societies, conformity to the majority’s view seems to be the civil thing to do. What I am suggesting here is that from the social standpoint, it is better to behave in the way that one would if being right were all that mattered and better still to behave as one would if a correct group decision were all that mattered.
Of course, the normative issues are not always simple. Bonds of affection and solidarity are often important to group members, and many people do not appreciate dissent and disagreement. Perhaps the real point of the relevant group or organization is not to perform well but to foster an optimistic outlook and good relationships. Conformists avoid creating the difficulties that come from contestation but at the expense, often, of a good outcome; dissenters tend to increase contestation while also improving performance.
In the abstract, it is not easy to specify the optimal tradeoffs between the various goods. Everything depends on the group’s goals—on what it is trying to maximize. If the only goal is to arrive at the right decisions, groups need to encourage disclosers and dissenters. If the central goal of group members is to maintain and improve social bonds or to have a good time, and not to carry out some task, conformity might be just fine, at least if nonconformists introduce tension and hostility. Or consider the question of dissent in wartime. It is important for those who wage war to know what citizens really think and also to have a sense of actual and potential errors. But it is also important, especially in wartime, for citizens to have a degree of solidarity, to be broadly optimistic, and to believe they are involved in a common endeavor; this belief can help solve collective action problems that otherwise threaten success. Some forms of dissent might correct mistakes while also weakening social bonds. Of course, freedom of speech should be the rule, but there is no simple solution to this dilemma. We might simply notice that those who are inclined to dissent must decide whether it is worthwhile to create the disruption that comes from expressing their views.
It is also possible that dissenters will be wrong, especially—but not only—if they are contrarians, and if they are wrong, they might spread errors through the same processes discussed here. They might be sources of fake news. Nothing in the discussion thus far shows that conformity and cascades are bad as such. The only suggestions have been that the underlying mechanisms increase the likelihood that people will not reveal what they know or believe and that this failure to disclose can produce social harm. It would not be difficult to generate experiments in which informational and reputational influences produce fewer mistakes than independence—if, for example, the task is especially difficult and if the experimenter introduces confident confederates equipped with the correct answer. When specialists have authority, and when people listen carefully to them, it is generally because errors are minimized through this route. But reputational influences carry serious risks insofar as they lead people, including specialists, not to disclose what they actually know. Indeed, this is the most troublesome implication of the conformity experiments.
When Silence Is Golden
I have been stressing cases in which disclosure is in the group’s interest, but the discussion also suggests the opposite possibility, certainly when group members might go public and say what they know to the world at large.34 Confidentiality can be essential. If group members reveal information that is embarrassing or worse, they might assist a competitor or an adversary. They might also make it harder for the group to have candid discussions in the future, simply because everyone knows that whatever is said might be made public. Strong norms against “leaking” are a natural corrective. And if some members of the group have engaged in wrongdoing, revelation of that fact might injure many or all group members.
Apart from confidentiality, anyone who has ever attended a workplace meeting is aware of the possibility that speakers receive the full benefits of the time they use, while inflicting costs on others. This unfortunate state of affairs can lead to unduly long meetings. The same problem can afflict the deliberations of both legislatures and courts. Conformity to a group norm, involving silence or informal time constraints, can be extremely valuable.
It is important to acknowledge that the problem I am emphasizing—the failure to disclose accurate information that will benefit the public—is closely paralleled by the problems raised in many cases in which silence, not revelation, is a collective good. And if disclosure will spread inaccurate information, it is unlikely to be beneficial, especially if it negates the beneficial effects of previous decisions or produces a cascade of its own (recall the spread of fake news). Because my focus is on the failure to disclose information, I will not devote attention to situations in which silence is golden, except to note that the basic analysis of those situations is not so different from the analysis here.35
The conformity experiment could itself be varied in many ways, with predictable results. If financial rewards were solely or almost solely for conformity, cascade behavior would increase; if the seventy-five-cent reward were cut in half, it should decline. Of course, it is possible to imagine many mixed systems. An obvious example is a system of majority rule in which people are not only rewarded when the group's majority reaches the right result but also rewarded for conformity or punished for nonconformity. Will cascades develop in such cases?
The answer will depend on the size of the two sets of incentives. If the accuracy of the group’s decision will greatly affect individuals’ well-being—if their lives will get much better as a result of good results—cascades are less likely. But if the ultimate outcome has little effect and if conformity will carry high rewards, cascades are inevitable. A system in which individuals receive two dollars for a correct majority decision and twenty-five cents for conforming will produce different (and better) results from a system in which individuals receive twenty-five cents for a correct majority decision and two dollars for conformity.
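The arithmetic behind this comparison can be spelled out in a few lines. The sketch below assumes, purely for illustration, a subject whose private signal points away from the current majority and is right 60 percent of the time (a figure of my own choosing), and it treats the correctness reward as if it attached directly to the subject's own announcement rather than to the group's eventual decision, which is a simplification of the system described above.

```python
# A back-of-the-envelope comparison of the two mixed reward systems
# mentioned above. The 60 percent confidence figure is an illustrative
# assumption, and attaching the correctness reward to the individual's
# own announcement is a simplification of the majority-rule setup.
def expected_payoff(follow_signal, reward_correct, reward_conform,
                    p_signal_right=0.6):
    """Expected payoff of announcing one's own signal versus conforming
    to a majority that the signal contradicts."""
    if follow_signal:
        return reward_correct * p_signal_right
    return reward_correct * (1 - p_signal_right) + reward_conform


for correct, conform in [(2.00, 0.25), (0.25, 2.00)]:
    speak = expected_payoff(True, correct, conform)
    follow = expected_payoff(False, correct, conform)
    print(f"correct=${correct:.2f}, conform=${conform:.2f}: "
          f"announce signal={speak:.2f}, conform={follow:.2f}")
```

Under the first schedule, announcing one's own signal has the higher expected value (1.20 against 1.05); under the second, conforming dwarfs candor (2.10 against 0.15), which is the asymmetry the text describes.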
The real world of groups and democracy offers countless variations on these rewards, and often the rewards are highly indeterminate; people do not know what they are or have a hard time quantifying them. But there can be little doubt that conformity pressures actually result in less disclosure of information. Consider the words of a medical researcher who questions a number of Lyme disease diagnoses: "Doctors can't say what they think anymore. . . . If you quote me as saying these things, I'm as good as dead."36 When privately interviewed, gang members express considerable discomfort about their antisocial behavior, but their own conduct suggests a full commitment, leading to a widespread belief that most people approve of what is being done.37 Or consider the remarks of a sociologist who publicly raised questions about the health threats posed by mad cow disease, suggesting that if you raise those doubts publicly, "you get made to feel like a pedophile."38