Believing What You Don't Believe

This article was originally published in the New York Times on October 30, 2015.

HOW is it that people can believe something that they know is not true?

For example, Kansas City Royals fans, sitting in front of their television sets in Kansas City, surely know that there is no possible connection between their lucky hats (or socks, or jerseys) and the outcome of a World Series game at Citi Field in New York, 1,200 miles away. Yet it would be impossible to persuade many of them to watch the game without those lucky charms.

It’s not that people don’t understand that it’s scientifically impossible for their lucky hats to help their team hit a home run or turn a double play — all but the most superstitious would acknowledge that. It’s that they have a powerful intuition and, despite its utter implausibility, they just can’t shake it.

Consider a 1986 study conducted by the psychologist Paul Rozin and his colleagues at the University of Pennsylvania. The participants were asked to put labels on two identical bowls of sugar. The labels read “sucrose” and “sodium cyanide (poison).” Even though the participants were free to choose which label to affix to which bowl, they were nevertheless reluctant, after labeling the bowls, to use sugar from the one that they had just labeled poison. Their intuition was so powerful that it guided their behavior even when they recognized that it was irrational.

Psychologists who study decision making and its shortcomings often rely on the idea, popularized by the psychologist Daniel Kahneman in his book “Thinking, Fast and Slow,” that there are two modes of processing information. There is a “fast system” that is intuitive and quickly generates impressions and judgments, and a “slow system” that operates in a deliberate and effortful manner, and is responsible for overriding the output of the fast system when the slow system detects an error.

Much of the time, the fast system is good enough. When you’re deciding whether to grab your umbrella when leaving the house, you can glance up at the sky to see how gray and cloudy it is. You’re using a shortcut based on similarity (does it look like it is going to rain?) as a substitute for thinking about probability — and generally, this is a good rule of thumb.

But the fast system is also prone to systematic biases and errors. If a gray sky makes you think it will rain and you don’t take into account that you’re visiting San Diego (rather than Seattle), then your judgment is likely to be biased. (Technically speaking, you’re neglecting the base rate that is necessary for a sound probability judgment.)

This is when the slow system can step in. If someone points out that rain in San Diego is very rare, even when the sky looks gray, you might revise your guess and leave your umbrella at home. Your slow system detects an error — and corrects it.

But as one of us, Professor Risen, discusses in a paper just published in Psychological Review, many instances of superstition and magical thinking indicate that the slow system doesn’t always behave this way. When people pause to reflect on the fact that their superstitious intuitions are irrational, the slow system, which is supposed to fix things, very often doesn’t do so. People can simultaneously recognize that, rationally, their superstitious belief is impossible, but persist in their belief, and their behavior, regardless. Detecting an error does not necessarily lead people to correct it.

This cognitive quirk is particularly easy to identify in the context of superstition, but it isn’t restricted to it. If, for example, the manager of a baseball team calls for an ill-advised sacrifice bunt, it is easy to assume that he doesn’t know that the odds indicate his strategy is likely to cost his team runs. But the manager may have all the right information; he may just choose not to use it, based on his intuition in that specific situation.

In fact, sometimes the slow system can exacerbate the problem rather than fix it. Instead of making the manager’s decision more rational, the slow system may double down by trying to rationalize the intuition, generating reasons that it is correct to bunt, at least in this particular case.

Once we realize that detecting an error does not necessarily result in correcting that error — they are two separate processes, not one process, as most “dual system” models assume — then we are in a better position to fix those errors. For example, rather than pointing out to the baseball manager that calling for a sacrifice bunt is irrational (as if he didn’t know that already), you might have him devise a policy ahead of time for what he should do in all such situations, and encourage him to stick to it. It is easy to rationalize a powerful but misguided intuition in a specific situation. But it is much harder to concoct such rationalizations when setting a policy for, say, a whole baseball season.

When the manager of a baseball team chooses a strategy despite knowing that, statistically, it will cost his team runs, he’s not being superstitious. He may even be able to rationalize his decision, convincing himself that his decision is correct. But what he’s doing is pretty similar, psychologically, to the fan who wears a lucky hat or the fielder who won’t step on the foul line: He’s got a powerful intuition and he just can’t shake it.

— Jane L. Risen and A. David Nussbaum are, respectively, an associate professor of behavioral science and an adjunct associate professor of behavioral science at the Booth School of Business at the University of Chicago.