The Confidence Game

by Maria Konnikova

  Later that afternoon, he went to visit Captain Dick Slaughter, the man who owned that choice tract of land. They drew up a contract. Things were going well for Norfleet: a buyer for his land and the potential for some successful investments, all in one. He would, he confidently told Slaughter, be able to pay the full $90,000 in forty-five days. For today, he put down a $5,000 deposit, shook hands, and went to enjoy the rest of the day. The streets of Dallas seemed friendlier by the minute.

  Early the next morning, Norfleet and Spencer returned once more to the Adolphus. While Spencer went to buy the morning papers, Stetson made Norfleet a proposition. On certain days of the month, he explained, his company controlled the market for certain stocks. They would advise him on the optimal moment to buy and sell—and if he executed everything properly, he would make a large sum of money. All he needed was a smart, discreet, honest, and straightforward man on whom he could rely, to use his name to make the trades. And Norfleet—well, he was a fellow lodge member. What more could you require?

  Norfleet had never had anything to do with the markets before. The mechanics of the thing seemed close to gibberish. But he was a good businessman, and he knew that, in the business world, getting something for nothing simply didn’t happen. And if you got something by knowing something someone else didn’t—well, in some cases, you were smart. In others, you were shady. “Is this legitimate?” he asked Stetson. He wanted no part of anything that was even remotely gray.

  “Absolutely,” Stetson assured him. “It’s strictly business. We do it every day.”

  Norfleet agreed. He felt he could trust Stetson. The bonds of brotherhood run deep. There was only one problem. “I have no money,” he told the two men. (By this time, Spencer had returned, papers in hand, and been apprised of the business dealings.)

  Not a problem at all, Stetson assured him. No money was needed. His membership in the United Brokers, which allowed him to trade in the stock exchange, was backed by $100,000.

  The stock exchange was impressive. A large stone building with multiple offices, hallways, people milling around, and money flying about. The men approached a glass window. Here, explained Stetson, was where buy orders were placed and money returned.

  Norfleet felt a hand on his shoulder. “Excuse me,” an official-looking man said. His name was E. J. Ward. He was the stock exchange’s secretary. “Are you a member of the exchange?”

  Norfleet was not.

  “I am very sorry to do so, but I shall be forced to invite you outside, as only members are allowed here.”

  Norfleet quickly made his exit; he didn’t want to run afoul of any regulations, he assured the secretary. Wait in my hotel, Stetson instructed him. Norfleet demurred. He wanted to drop out; it just didn’t seem right.

  Spencer chimed in. He understood how the markets worked. He’d accompany Stetson. All Norfleet would have to do was wait. In fact, Stetson immediately offered, how did Norfleet feel about investing his $800 winnings?

  The day drew to an end. Norfleet had spent the better part of the afternoon wandering the town, seeing some cattle, eyeing the competition. He was now back at the Jefferson, lounging by the window, thinking back on the wondrous few days he’d had. New friends. New experiences. A new type of finance he didn’t quite grasp but that seemed awfully impressive. The door burst open.

  Spencer was ebullient—$68,000, cash. He threw it on the bed. Stetson, ever the more composed, simply smiled. Meticulously, he counted out Norfleet’s share from his $800: $28,000. This was truly cause for celebration.

  Norfleet was shocked. Pleasantly so. All he’d done was behave like a decent human being, and yet here he was, a far richer man than he’d been a mere twenty-four hours earlier. It was a fortune.

  Someone knocked on the door. It was Ward—the man who’d just that morning ejected him from the exchange building. Had they had, he asked, sufficient funds to cover the orders in the event the market moved against them? As nonmembers, they had to guarantee all trades in advance.

  No, they both replied. They didn’t have the requisite cash.

  Stetson rose from the chair. According to the exchange rules, he told the secretary, they had until the following Monday to come up with the call money.

  Ward agreed. In the meantime, though, he’d need to hold on to the cash they’d won. They would get a receipt, but the money itself would have to wait.

  The three friends conferred. How to raise the funds? In the end, they came up with an agreement. Spencer would raise $35,000. His agency business was thriving—“Not so bad for a young man just out of the army, huh?”—and he could come up with it in short order. Norfleet’s share: $20,000. Stetson would supply the remainder.

  The next day, Norfleet left for home, accompanied by Spencer—Spencer’s money was already on the way, and he’d take the opportunity to inspect the farmland. Norfleet, on his end, would need to borrow money from his bank, where they knew him and trusted his word.

  Three days later, he was ready to reclaim his winnings.

  It was at the Fort Worth Cotton Exchange that their fortunes took a downward turn. Or, rather, that stupidity got the best of the moment. For when Stetson instructed Spencer to sell “Mexican petroleum” at a two-point margin, Spencer mangled the instructions. He’d lost Stetson’s original note, and when he re-created it from memory, instead of selling, he had placed an order to buy. Stetson’s information was good; Spencer’s execution was shoddy.

  For the first time in their weeklong acquaintance, Norfleet saw Stetson lose his cool. “Spencer, you have ruined us!” he screamed, flinging the receipt in his face. His skin was red, almost purple. His eyes bulged. Rage seemed to radiate from every pore. “You have lost every dollar we have and that we had coming to us.”

  Spencer became hysterical. He’d lost his mother’s estate, he cried. He was ruined. Norfleet couldn’t quite grasp it. Twenty thousand dollars—gone. All because of a stupid, stupid, stupid error.

  After a while, Stetson calmed down. He’d make things right, he vowed. He would go back to the exchange and try to hedge the loss.

  The two men waited in silence. They were both ruined. And it seemed an awful lot to hope for that Stetson would succeed.

  But again, their luck seemed to have turned. Stetson returned, triumphant. He had gotten in a sell order, and their losses were covered. Indeed, shortly after, the exchange secretary arrived; $160,000 was theirs—their initial capital, and then some. But, as before, a cash guarantee was required.

  On the morning of November 20, Norfleet again rode for home. He had lost $20,000, true, but if he now raised an additional $25,000, he would recoup all losses and even come out ahead. His credit tapped, he turned to his brother-in-law.

  * * *

  There’s a sense in which any decision of any weight concerning the future is a gamble. It’s inherently risky because the future is inherently uncertain. And so, in the interim—the period after we’ve made our choice but before we know the final outcome—we wait, we look, we evaluate the evidence, we calculate the odds that things will turn out as expected. In other words, we form what is known as an expectancy: an expectation of how things will progress. It can range from the very basic—I’ve settled on a restaurant for this evening and expect my meal to be delicious—to the more complex—I’ve decided to invest in this real estate venture and expect building to commence in 2015, conclude by the end of the year, cost $20 million, and, by 2017, bring in $10 million a year. (Clearly, I’ve never invested in real estate in my life.) That initial set of expectations will in turn affect how we think, feel, and act as new evidence comes in. It will, as well, affect how we interpret and evaluate that evidence in the first place.

  When we committed to the confidence game, we had formed a very particular expectancy: that of eventual success. And at this stage in the game, everything has been going according to the exact expectations we’ve set out; our plans seem to be well laid indeed. We’re coming off some heady wins. We have money in hand. We have fine lab results. We have a solid piece of reporting. We have a rare, genuine bottle of wine or work of art. We’ve established a bond of trust with our deceiver—he’s been good to his word so far. The convincer has done its work. We think we’re in the home stretch; just a little more of the same, and our initial confidence, trust, and judgment will be completely verified.

  From the confidence man’s perspective, this is the ideal moment to make a killing: pull the plug just when your mark is at his most convinced. The mark has already tasted victory and lauded himself for his discernment and prowess. He’s already hooked. If the grifter lets him keep winning, it doesn’t do him any additional good. Everything that goes to the mark, after all, is less for the confidence man. Instead, what if the grifter now makes the mark lose? At least a bit? In other words, what do we do when reality suddenly doesn’t match the expectancy we’ve built?

  That’s the question at the heart of the breakdown, the moment when the con artist sees just how far he can take us. In the put-up, he picked us out of the crowd with care. In the play, he established a bond through some emotional wrangling and expert storytelling. In the rope, he laid out his persuasive pitch for our already-willing ears. In the tale, he’s told us how we will personally benefit, relying on our belief in our exceptionalism. In the convincer, he’s let us win, persuading us that we’d been right in going along with him. And now comes the breakdown. We start to lose. How far can the grifter push us before we balk? How much of a beating can we take? Things don’t completely fall apart yet—that would lose us entirely, and the game would end prematurely—but cracks begin to show. We lose some money. Something doesn’t go according to plan. One fact seems to be off. A figure is incorrectly labeled. A wine bottle is “faulty.” The crucial question: do we notice, or do we double down? High off the optimism of the convincer, certain that good fortune is ours, we often take the second route. When we should be cutting our losses, we instead recommit—and that is entirely what the breakdown is meant to accomplish.

  Leon Festinger first proposed the theory of cognitive dissonance, today one of the most famous concepts in psychology, in 1957. When we experience an event that counteracts a prior belief, he argued, the resulting tension is too much for us to handle; we can’t hold two opposing beliefs at the same time, at least not consciously. “The individual strives,” Festinger wrote in A Theory of Cognitive Dissonance, “toward consistency within himself.” True, here and there one might find exceptions. But overall, “It is still overwhelmingly true that related opinions or attitudes are consistent with one another. Study after study reports such consistency among one person’s political attitudes, social attitudes, and many others.” He continued, “There is the same kind of consistency between what a person knows or believes and what he does.” If we believe in education, we send our children to college. If a child knows something is bad but can’t quite resist it, she’ll try to avoid getting caught if she does it. So when something goes awry—someone knows smoking is bad for him but smokes anyway, for instance—we work to reduce the tension, through a process Festinger called dissonance reduction.

  Festinger first observed the tendency not in a lab, but rather in the actions of a cult he had been following, which believed that an alien-led rapture would, on a certain date, at a certain time, lead the members to the alien world as a reward for their goodliness. When the date and time came and went, though, and no aliens were forthcoming, Festinger expected the cult would disperse. Instead, the group promptly reformulated their understanding of the alien plan.

  While Festinger was surprised, the behavior wasn’t a new one—and was actually to be expected of a mind that has fallen under the sway of a con as strong as the cult. Writing several centuries earlier, Francis Bacon wouldn’t have been at all shocked. He would likely have anticipated correctly just how the whole thing would play out. “And such is the way of all superstitions, whether in astrology, dreams, omens, divine judgments, or the like,” he wrote, “wherein men, having a delight in such vanities, mark the events where they are fulfilled, but where they fail, although this happened much oftener, neglect and pass them by.” In other words, they work to minimize the discord in their minds—the exact tendency Festinger would term dissonance reduction.

  To reduce dissonance, Festinger argued, we can do several things. We can revise our interpretation of the present reality: there actually isn’t any inconsistency; we were just looking at it wrong. We achieve this by selectively looking for new, confirming information or selectively ignoring disconfirming information. The study on smoking was flawed. The sample was biased. It doesn’t apply to me. We can revise our prior expectation: I thought this would happen all along, so it’s actually not discordant. I always knew they would try to convince me smoking was bad, so learning that fact isn’t actually jarring. I was prepared all along and made the decision anyhow: I think my experience will defy the odds. Or we can alter the reality itself: stop smoking. Generally, the first two approaches are easier to accomplish. Changing your perception or your memory is easier than changing behavior. It’s easier to change what we believe about smoking than to actually quit.

  Even as conflicting evidence comes in, expectancies tend to be sticky, especially when they’ve been confirmed in the past. “Once useful expectancies have developed,” write psychologists Neal Roese and Jeffrey Sherman, “our cognitive system is rather conservative about altering or replacing them.” We don’t altogether ignore new inputs—that would be maladaptive and stupid—but we err on the side of what we’ve already decided was true. After all, we did a lot of work to get to that point. And what we’ve already decided was true can color how we view the new event: even as a conflicting piece of information is coming to our attention, we are already revising our interpretation of it to fit with our expectancy.

  Our prior expectations act as a heuristic of sorts: they give us a basic cognitive road map for how we should look at what’s going on, so that with every new piece of information we don’t have to reinvent the wheel. The stronger the expectation and the greater the opportunity for ambiguity, the more likely we are to experience the so-called expectancy assimilation effect, that is, assimilating new data to fit old views rather than revising the old views themselves.

  “When men wish to construct or support a theory,” wrote Charles Mackay in Extraordinary Popular Delusions and the Madness of Crowds, his popular 1852 exposé of hoax-like behaviors among his more devious-minded compatriots, “how they torture facts into their service!” Psychologists have since termed that tendency the confirmation bias, our predisposition to take in and sift through evidence selectively, so as to confirm what we’re already expecting to be the case. Our desire to avoid dissonance in the first place has an in-the-moment impact on how we evaluate what’s happening—and what we choose to evaluate or ignore in the first place. It’s a sort of unconscious equivalent to what a lawyer does as part of his job description: collect evidence and present it in a way that sheds the best possible light on your side of the case, one very particular, and particularly selective, version of events that presents the cleanest and most convincing picture.

  Franz Friedrich Anton Mesmer was used to performing miracles. A physician by training, he had over the years developed an approach to therapy that could cure the trickiest, most intractable of ailments. It was based on the theory of animal magnetism. Naturally occurring magnetic fluids, Mesmer argued, could be used to cure both body and mind. His first therapeutic breakthrough came in the case of Franzl Oesterline. She had a “convulsive malady” that necessitated around-the-clock care, and no traditional remedies seemed to work. Mesmer decided to test his theory: he set up a magnet that would alter the “gravitational tides” that were having such a severe effect on the young woman. The cure worked. It was as if a fluid had drained from her body, Oesterline recounted. She recovered almost instantly. In short order, Mesmer’s Vienna practice became known for its incredible cures. A blind pianist was once more able to see. A paralytic was able to walk.

  Next Mesmer took his trade to Paris, where he became a favorite of Marie Antoinette and Wolfgang Amadeus Mozart. His mesmerizing salons were the talk of the town. Sometimes he used magnets. Other times he would ask visitors to sit in magnetized water or hold a magnetized pole. He could mesmerize a roomful of people at a time: they would faint, have epiphanies, be cured of whatever ailed them. A Magnetic Institute soon followed.

  King Louis XVI, however, had his doubts. He appointed a commission from the French Academy of Sciences to investigate Mesmer’s claims. Benjamin Franklin, Joseph Guillotin, Jean Bailly, Antoine Lavoisier: the prominent men of Paris set about verifying the practice of “mesmerism.” At the time, Franklin was quite ill. The tests, it was decided, would be held at his residence. Rather than go himself, Mesmer sent an assistant—or, if you will, a possible scapegoat should things not go according to plan. It turned out to be a rather shrewd move. The assistant “magnetized” a tree to see whether a blindfolded twelve-year-old could tell it from the other trees. He could not. There was no basis for animal magnetism, the commission reported back. The whole thing was a sham—at least from a scientific standpoint.

  From what standpoint was it not? If it was all a con, how had it had physical effects on so many people? Mesmerism is one of the earliest examples of the power of our beliefs to change reality: the placebo effect, or dissonance reduction at its finest, in full action. We want to believe something works, and so we will it to work. Our mind literally changes the reality of our body’s health. Mesmer clearly possessed strong powers of suggestion, and people really did get better in his presence. Scientifically, what he was doing was worthless. But people latched on to his claims, and the more popular his successes became, the more conveniently they forgot the patients he wasn’t able to help. His reputation grew apace.

  One of the earliest scientific demonstrations of the power of belief to change reality came again not from a lab, but this time from the classroom. In 1965, Harvard psychologist Robert Rosenthal joined with an elementary school principal, Lenore Jacobson, to determine whether a teacher’s expectations of how a student would perform would, in turn, affect how she saw that student’s performance. At the Oak School, Rosenthal and Jacobson gathered a small group of elementary school teachers and told them about a test that measured intellectual capacity, the Harvard Test of Inflected Acquisition. They had given the exam, they said, to Oak School students. Now they would share the scores, so that the teachers would have the added information to better teach their classes. Some students, the researchers said, were “growth spurters.” They could be expected to show significant improvements that year. The “spurters,” of course, were actually chosen at random; the Harvard test in question did not even exist.

 
