Real cults are vastly worse. “Love bombing” as a recruitment technique, targeted at people going through a personal crisis. Sleep deprivation. Induced fatigue from hard labor. Distant communes to isolate the recruit from friends and family. Daily meetings to confess impure thoughts. It’s not unusual for cults to take all the recruit’s money—life savings plus weekly paycheck—forcing them to depend on the cult for food and clothing. Starvation as a punishment for disobedience. Serious brainwashing and serious harm.
With all that taken into account, I should probably sympathize more with people who are terribly nervous, embarking on some odd-seeming endeavor, that they might be joining a cult. It should not grate on my nerves. Which it does.
Point one: “Cults” and “non-cults” aren’t separated natural kinds like dogs and cats. If you look at any list of cult characteristics, you’ll see items that could easily describe political parties and corporations—“group members encouraged to distrust outside criticism as having hidden motives,” “hierarchical authoritative structure.” I’ve written on group failure modes like group polarization, happy death spirals, uncriticality, and evaporative cooling, all of which seem to feed on each other. When these failures swirl together and meet, they combine to form a Super-Failure stupider than any of the parts, like Voltron. But this is not a cult essence; it is a cult attractor.
Dogs are born with dog DNA, and cats are born with cat DNA. In the current world, there is no in-between. (Even with genetic manipulation, it wouldn’t be as simple as creating an organism with half dog genes and half cat genes.) It’s not like there’s a mutually reinforcing set of dog-characteristics, which an individual cat can wander halfway into and become a semidog.
The human mind, as it thinks about categories, seems to prefer essences to attractors. The one wishes to say “It is a cult” or “It is not a cult,” and then the task of classification is over and done. If you observe that Socrates has ten fingers, wears clothes, and speaks fluent Greek, then you can say “Socrates is human” and from there deduce “Socrates is vulnerable to hemlock” without doing specific blood tests to confirm his mortality. You have decided Socrates’s humanness once and for all.
But if you observe that a certain group of people seems to exhibit ingroup-outgroup polarization and see a positive halo effect around their Favorite Thing Ever—which could be Objectivism, or vegetarianism, or neural networks—you cannot, from the evidence gathered so far, deduce whether they have achieved uncriticality. You cannot deduce whether their main idea is true, or false, or genuinely useful but not quite as useful as they think. From the information gathered so far, you cannot deduce whether they are otherwise polite, or if they will lure you into isolation and deprive you of sleep and food. The characteristics of cultness are not all present or all absent.
If you look at online arguments over “X is a cult,” “X is not a cult,” then one side goes through an online list of cult characteristics and finds one that applies and says “Therefore it is a cult!” And the defender finds a characteristic that does not apply and says “Therefore it is not a cult!”
You cannot build up an accurate picture of a group’s reasoning dynamic using this kind of essentialism. You’ve got to pay attention to individual characteristics individually.
Furthermore, reversed stupidity is not intelligence. If you’re interested in the central idea, not just the implementation group, then smart ideas can have stupid followers. Lots of New Agers talk about “quantum physics” but this is no strike against quantum physics. Of course stupid ideas can also have stupid followers. Along with binary essentialism goes the idea that if you infer that a group is a “cult,” therefore their beliefs must be false, because false beliefs are characteristic of cults, just like cats have fur. If you’re interested in the idea, then look at the idea, not the people. Cultishness is a characteristic of groups more than hypotheses.
The second error is that when people nervously ask, “This isn’t a cult, is it?,” it sounds to me like they’re seeking reassurance of rationality. The notion of a rationalist not getting too attached to their self-image as a rationalist deserves its own essay (though see Twelve Virtues, Why Truth? And . . ., and Two Cult Koans). But even without going into detail, surely one can see that nervously seeking reassurance is not the best frame of mind in which to evaluate questions of rationality. You will not be genuinely curious or think of ways to fulfill your doubts. Instead, you’ll find some online source which says that cults use sleep deprivation to control people, you’ll notice that Your-Favorite-Group doesn’t use sleep deprivation, and you’ll conclude “It’s not a cult. Whew!” If it doesn’t have fur, it must not be a cat. Very reassuring.
But Every Cause Wants To Be A Cult, whether the cause itself is wise or foolish. The ingroup-outgroup dichotomy etc. are part of human nature, not a special curse of mutants. Rationality is the exception, not the rule. You have to put forth a constant effort to maintain rationality against the natural slide into entropy. If you decide “It’s not a cult!” and sigh with relief, then you will not put forth a continuing effort to push back ordinary tendencies toward cultishness. You’ll decide the cult-essence is absent, and stop pumping against the entropy of the cult-attractor.
If you are terribly nervous about cultishness, then you will want to deny any hint of any characteristic that resembles a cult. But any group with a goal seen in a positive light is at risk for the halo effect, and will have to pump against entropy to avoid an affective death spiral. This is true even for ordinary institutions like political parties—people who think that “liberal values” or “conservative values” can cure cancer, etc. It is true for Silicon Valley startups, both failed and successful. It is true of Mac users and of Linux users. The halo effect doesn’t become okay just because everyone does it; if everyone walks off a cliff, you wouldn’t too. The error in reasoning is to be fought, not tolerated. But if you’re too nervous about “Are you sure this isn’t a cult?” then you will be reluctant to see any sign of cultishness, because that would imply you’re in a cult, and It’s not a cult!! So you won’t see the current battlefields where the ordinary tendencies toward cultishness are creeping forward, or being pushed back.
The third mistake in nervously asking “This isn’t a cult, is it?” is that, I strongly suspect, the nervousness is there for entirely the wrong reasons.
Why is it that groups which praise their Happy Thing to the stars, encourage members to donate all their money and work in voluntary servitude, and run private compounds in which members are kept tightly secluded, are called “religions” rather than “cults” once they’ve been around for a few hundred years?
Why is it that most of the people who nervously ask of cryonics, “This isn’t a cult, is it?” would not be equally nervous about attending a Republican or Democratic political rally? Ingroup-outgroup dichotomies and happy death spirals can happen in political discussion, in mainstream religions, in sports fandom. If the nervousness came from fear of rationality errors, people would ask “This isn’t an ingroup-outgroup dichotomy, is it?” about Democratic or Republican political rallies, in just the same fearful tones.
There’s a legitimate reason to be less fearful of Libertarianism than of a flying-saucer cult, because Libertarians don’t have a reputation for employing sleep deprivation to convert people. But cryonicists don’t have a reputation for using sleep deprivation, either. So why be any more worried about having your head frozen after you stop breathing?
I suspect that the nervousness is not the fear of believing falsely, or the fear of physical harm. It is the fear of lonely dissent. The nervous feeling that subjects get in Asch’s conformity experiment, when all the other subjects (actually confederates) say one after another that line C is the same size as line X, and it looks to the subject like line B is the same size as line X. The fear of leaving the pack.
That’s why groups whose beliefs have been around long enough to seem “normal” don’t inspire the same nervousness as “cults,” though some mainstream religions may also take all your money and send you to a monastery. It’s why groups like political parties, that are strongly liable for rationality errors, don’t inspire the same nervousness as “cults.” The word “cult” isn’t being used to symbolize rationality errors; it’s being used as a label for something that seems weird.
Not every change is an improvement, but every improvement is necessarily a change. That which you want to do better, you have no choice but to do differently. Common wisdom does embody a fair amount of, well, actual wisdom; yes, it makes sense to require an extra burden of proof for weirdness. But the nervousness isn’t that kind of deliberate, rational consideration. It’s the fear of believing something that will make your friends look at you really oddly. And so people ask “This isn’t a cult, is it?” in a tone that they would never use for attending a political rally, or for putting up a gigantic Christmas display.
That’s the part that bugs me.
It’s as if, as soon as you believe anything that your ancestors did not believe, the Cult Fairy comes down from the sky and infuses you with the Essence of Cultness, and the next thing you know, you’re all wearing robes and chanting. As if “weird” beliefs are the direct cause of the problems, never mind the sleep deprivation and beatings. The harm done by cults—the Heaven’s Gate suicide and so on—just goes to show that everyone with an odd belief is crazy; the first and foremost characteristic of “cult members” is that they are Outsiders with Peculiar Ways.
Yes, socially unusual belief puts a group at risk for ingroup-outgroup thinking and evaporative cooling and other problems. But the unusualness is a risk factor, not a disease in itself. Same thing with having a goal that you think is worth accomplishing. Whether or not the goal really is worth accomplishing, having a nice goal always puts you at risk of the happy death spiral. But that makes lofty goals a risk factor, not a disease. Some goals are genuinely worth pursuing.
On the other hand, I see no legitimate reason for sleep deprivation or threatening dissenters with beating, full stop. When a group does this, then whether you call it “cult” or “not-cult,” you have directly answered the pragmatic question of whether to join.
Problem four: The fear of lonely dissent is something that cults themselves exploit. Being afraid of your friends looking at you disapprovingly is exactly the effect that real cults use to convert and keep members—surrounding converts with wall-to-wall agreement among cult believers.
The fear of strange ideas, the impulse to conformity, has no doubt warned many potential victims away from flying-saucer cults. When you’re out, it keeps you out. But when you’re in, it keeps you in. Conformity just glues you to wherever you are, whether that’s a good place or a bad place.
The one wishes there was some way they could be sure that they weren’t in a “cult.” Some definite, crushing rejoinder to people who looked at them funny. Some way they could know once and for all that they were doing the right thing, without these constant doubts. I believe that’s called “need for closure.” And—of course—cults exploit that, too.
Hence the phrase, “Cultish countercultishness.”
Living with doubt is not a virtue—the purpose of every doubt is to annihilate itself in success or failure, and a doubt that just hangs around accomplishes nothing. But sometimes a doubt does take a while to annihilate itself. Living with a stack of currently unresolved doubts is an unavoidable fact of life for rationalists. Doubt shouldn’t be scary. Otherwise you’re going to have to choose between living one heck of a hunted life, or one heck of a stupid one.
If you really, genuinely can’t figure out whether a group is a “cult,” then you’ll just have to choose under conditions of uncertainty. That’s what decision theory is all about.
Problem five: Lack of strategic thinking.
I know people who are cautious around Singularitarianism, and they’re also cautious around political parties and mainstream religions. Cautious, not nervous or defensive. These people can see at a glance that Singularitarianism is obviously not a full-blown cult with sleep deprivation etc. But they worry that Singularitarianism will become a cult, because of risk factors like turning the concept of a powerful AI into a Super Happy Agent (an agent defined primarily by agreeing with any nice thing said about it). Just because something isn’t a cult now, doesn’t mean it won’t become a cult in the future. Cultishness is an attractor, not an essence.
Does this kind of caution annoy me? Hell no. I spend a lot of time worrying about that scenario myself. I try to place my Go stones in advance to block movement in that direction. Hence, for example, the series of essays on cultish failures of reasoning.
People who talk about “rationality” also have an added risk factor. Giving people advice about how to think is an inherently dangerous business. But it is a risk factor, not a disease.
Both of my favorite Causes are at-risk for cultishness. Yet somehow, I get asked “Are you sure this isn’t a cult?” a lot more often when I talk about powerful AIs, than when I talk about probability theory and cognitive science. I don’t know if one risk factor is higher than the other, but I know which one sounds weirder . . .
Problem #6 with asking “This isn’t a cult, is it?” . . .
Just the question itself places me in a very annoying sort of Catch-22. An actual Evil Guru would surely use the one’s nervousness against them, and design a plausible elaborate argument explaining Why This Is Not A Cult, and the one would be eager to accept it. Sometimes I get the impression that this is what people want me to do! Whenever I try to write about cultishness and how to avoid it, I keep feeling like I’m giving in to that flawed desire—that I am, in the end, providing people with reassurance. Even when I tell people that a constant fight against entropy is required.
It feels like I’m making myself a first dissenter in Asch’s conformity experiment, telling people, “Yes, line X really is the same as line B, it’s okay for you to say so too.” They shouldn’t need to ask! Or, even worse, it feels like I’m presenting an elaborate argument for Why This Is Not A Cult. It’s a wrong question.
Just look at the group’s reasoning processes for yourself, and decide for yourself whether it’s something you want to be part of, once you get rid of the fear of weirdness. It is your own responsibility to stop yourself from thinking cultishly, no matter which group you currently happen to be operating in.
Once someone asks “This isn’t a cult, is it?” then no matter how I answer, I always feel like I’m defending something. I do not like this feeling. It is not the function of a Bayesian Master to give reassurance, nor of rationalists to defend.
Cults feed on groupthink, nervousness, desire for reassurance. You cannot make nervousness go away by wishing, and false self-confidence is even worse. But so long as someone needs reassurance—even reassurance about being a rationalist—that will always be a flaw in their armor. A skillful swordsman focuses on the target, rather than glancing away to see if anyone might be laughing. When you know what you’re trying to do and why, you’ll know whether you’re getting it done or not, and whether a group is helping you or hindering you.
(PS: If the one comes to you and says, “Are you sure this isn’t a cult?,” don’t try to explain all these concepts in one breath. You’re underestimating inferential distances. The one will say, “Aha, so you’re admitting you’re a cult!” or “Wait, you’re saying I shouldn’t worry about joining cults?” or “So . . . the fear of cults is cultish? That sounds awfully cultish to me.” So the last annoyance factor—#7 if you’re keeping count—is that all of this is such a long story to explain.)
*
Part K
Letting Go
121
The Importance of Saying “Oops”
I just finished reading a history of Enron’s downfall, The Smartest Guys in the Room, which hereby wins my award for “Least Appropriate Book Title.”
An unsurprising feature of Enron’s slow rot and abrupt collapse was that the executive players never admitted to having made a large mistake. When catastrophe #247 grew to such an extent that it required an actual policy change, they would say, “Too bad that didn’t work out—it was such a good idea—how are we going to hide the problem on our balance sheet?” As opposed to, “It now seems obvious in retrospect that it was a mistake from the beginning.” As opposed to, “I’ve been stupid.” There was never a watershed moment, a moment of humbling realization, of acknowledging a fundamental problem. After the bankruptcy, Jeff Skilling, the former COO and brief CEO of Enron, declined his own lawyers’ advice to take the Fifth Amendment; he testified before Congress that Enron had been a great company.
Not every change is an improvement, but every improvement is necessarily a change. If we only admit small local errors, we will only make small local changes. The motivation for a big change comes from acknowledging a big mistake.
As a child I was raised on equal parts science and science fiction, and from Heinlein to Feynman I learned the tropes of Traditional Rationality: theories must be bold and expose themselves to falsification; be willing to commit the heroic sacrifice of giving up your own ideas when confronted with contrary evidence; play nice in your arguments; try not to deceive yourself; and other fuzzy verbalisms.
A traditional rationalist upbringing tries to produce arguers who will concede to contrary evidence eventually—there should be some mountain of evidence sufficient to move you. This is not trivial; it distinguishes science from religion. But there is less focus on speed, on giving up the fight as quickly as possible, integrating evidence efficiently so that it only takes a minimum of contrary evidence to destroy your cherished belief.