Wednesday, January 20, 2016

On (Some of) the Varieties of Superstitious Experience

Have you ever noticed that people believe a lot of weird stuff? Superstitious stuff, ranging from religious beliefs to crackpot medicine to talismans and good-luck charms and all that? That vaccines cause autism (they don't!)? It's not surprising. I want to talk some about why.

To write Everybody Is Wrong About God, I spent a couple of years studying religious and moral psychology. The main theme of the book speaks to what I found as it applies to the word "God" and to corollaries. One of the most important facts I learned in studying the psychology of religion is that people turn to religions for what seems obvious: to meet psychological and social needs. Particularly, they turn to religion to meet needs for meaning making, control, and sociality. These are tied up in core human needs for esteem, security, and connection, plus the need to understand.

One of my primary resources while reading and writing was a psychology of religion textbook, The Psychology of Religion: An Empirical Approach, 4th edition, by Ralph Hood, Jr., Peter Hill, and Bernard Spilka. I'll call this book HHS for convenience. One thing HHS does almost immediately is link "meaning making" to attributional needs. "Attributional" roughly means explanatory. Religions, among other things, offer people attributional schema, ways by which they make sense of the world by telling themselves and each other how it works.

Then again, we all do this, religious or not, and we do it all the time. What matters to us is some grasp on how things work, and that grasp only has to be as good an explanation as seems to work. Human beings, if little else, are very fond of heuristics, mental shortcuts that are "good enough" rules of thumb for solving our problems. As it happens, a lot of superstitions are, or at least seem, good enough, especially if we don't know what else to do.

This state of affairs generates a lot of superstitions and is the basis for philosopher Alvin Plantinga's so-called "Evolutionary Argument Against Naturalism," which posits that evolution would only select for getting good-enough answers, not right answers, and that the fact that we humans can get to right answers at all must count against a naturalistic universe. Sigh.... Plantinga's obvious irony aside--he hopes to defend a superstitious set of beliefs by arguing that evolution should make people superstitious--a real take-home point here is that people are always seeking attributions.

Science is humanity's best attempt at finding good attributions for phenomena, even if it may not be equipped to provide "ultimate" attributions. What really matters, though, is that we all take and run with explanatory guesses that seem good enough, and part of what defines "good enough" for us is that they comport with how we think and feel.

To explain a bit, one of the most important observations I gleaned from reading HHS is the following:
Our theoretical position asserts that attributions are triggered when meanings are unclear, control is in doubt, and self-esteem is challenged. There is, as suggested, much evidence that these three factors are interrelated.

Given these three sources of motivations for attributions, an individual may attribute the causes of events to a wide variety of possible referents (oneself, others, chance, God, etc.). For the psychologist of religion, these referents may be classified into two broad categories: "naturalistic" and "religious." The evidence is that most people in most circumstances initially employ naturalistic explanations and attributions, such as references to people, natural events, accidents, or chance. Depending on a wide variety of situational and personal characteristics, there is a good likelihood of shifting to religious attributions when naturalistic ones do not satisfactorily meet the needs for meaning, control, and esteem. (p. 45)
I think this observation is very important for a number of reasons that I discuss in Everybody Is Wrong About God, but my intention in this essay is not to touch upon those. Instead, I want to mention a bone I pick in a footnote and run with that: the "broad categories" in HHS are too broad and mislabeled.

The correct labels for the two broad categories are "naturalistic" and "superstitious." Religious attributions are a particular kind of superstitious attribution, and not all superstitious attributions are religious in nature. Thinking one has bad luck, including if that bad luck is believed to be a result of some mistake (like breaking a mirror, for an old-fashioned one), would represent a common superstitious attribution that isn't religious (as almost every sentence beginning with "Knowing my luck..." attests). There are others, and they're nearer than you may think.

What binds them together is that they are beliefs about how the world works--that is, attributional schema--that either lack evidence supporting them, have unworkable theory underlying them, or both. In other words, they're superstitious. They sometimes seem to work and get repeated because of that, but that seeming to work is a misattribution of observations, usually to an extant attributional schema supported by little more than confirmation bias, cherry picking, and a lot of cultural momentum--but we'll come back to this.

What, in fact, I want to talk about most here are a few particular kinds of nonreligious superstitious attributions, which in a way is saying a few kinds of "woo." I'm leaving out the usual kinds of things we call "superstitions" for the obvious reason--we already understand them as superstitions.
  1. Stuff like astrology.
  2. Quackery.
  3. Nearly natural superstitions.
I think it's probably best simply to highlight what I mean by each of these categories and then talk broadly about the big point, why people adopt such attitudes and what we can do about them.

Stuff like astrology:

Almost everything New Age falls into this category of superstitious attributions, and as long as they don't get too religious in nature, these can be distinguished from religious attributions in a substantive way (however fuzzy the boundary is). Some things in this category, like astrology, are older and more venerated, and some things are newer, like "the Law of Attraction" and whatever Deepak Chopra has been spewing over the last decade or two. These are rather like belief systems, and they often have to pull upon pretty dualistic or otherwise out-there sets of beliefs to be anchored as attributional in nature. ("God" is the religious anchor in theistic religions, for comparison.) They often benefit from their sheer size and, sadly, wide cultural acceptability.

What's relevant here is that these beliefs are not religious and often point to things that are legitimately natural, like planetary behavior, the mysterious nature of the mind, and so on (as opposed to transcendent super-reality) and thus feel more naturalistic than many religious attributions. They're really not.

Quackery (and "alternative" medicine, partly minus herbal):

Again, we're going to deal with fuzzy boundaries here, especially to my next category, but included here is quackery like (most) chiropractic, (most) acupuncture, all homeopathy, and many other brands of bullshit dedicated to a misguided (and often well-intentioned, yet often opportunistic) attempt to improve health or life-experience (or make money from that hope). I think some elaboration will be needed, and I really want to urge you not to care too much about the rather arbitrary division between quackery and what I'm calling "nearly natural superstitions."

Particularly, I want to qualify my two parenthetical "mosts" above. For example, chiropractic is a more subtle superstition than astrology, if for no other reason than that when you go to the chiropractor, your back probably is going to pop, you're probably going to feel somewhat better, and there are definite reasons to think that something real and possibly beneficial happened (if the doctor didn't injure you...). In fact, I'd argue that chiropractic is probably legitimately medically useful about ten percent of the time it is applied, but its theory is utterly crap. Chiropractors often bill themselves as "nerve doctors" who manipulate the spine so that the nerves aren't impeded, so that the body can use the better, more natural nerve conduction to facilitate healing, cure sicknesses, reverse serious disorders, and so on. That's crap, but sometimes bones are out of place and benefit from chiropractic adjustment.

Things are similar with acupuncture. The theory is crap. The vast majority of what it does is crap. However, dry needling and certain types of massage that are often employed by acupuncturists can be genuinely effective (sometimes for medically understood reasons and sometimes not). I don't have a good estimate for what proportion of acupuncture application is legitimately useful, but it's probably lower than that for chiropractic. Still, it's very probably true that under very specific circumstances (like when dry needling is medically indicated and happens to get applied by the acupuncturist in the relevant way) it can really work.

In both of those cases, and in other similar ones, the relevant issue is that, however successful the treatment might be in certain cases, the underlying mechanism is misattributed. That's not good, and it's ultimately sad because whatever legitimacy is there promotes the modalities while being obscured by the crap theory, which makes it harder to pry out of the dungheap and improve upon.

Nearly natural superstitions:

What I'm talking about here is really anything that isn't systematic or out-there enough to stick into one of the other categories. Critically, these superstitions present in a way that seems to rely upon naturalistic attributions but don't. A lot of "detox" stuff, many herbal remedies, and dietary and fitness advice or other health advice fall under this umbrella. This will probably require some elaboration as well before proceeding.

What really stands out in this category is that we're working with a genuinely complicated system in many cases (like human bodies or society), and it's really hard to tell what's working and what isn't. The supplements industry, and a lot of the fitness industry, and much of the wellness industry, rely pretty heavily upon this epistemic fog, and they make billions off it. These are really subtle things, too, in a lot of cases.

What makes them "nearly natural" is that almost everything about them seems natural but is inaccurate. Not to pick on ginseng (which may or may not have therapeutic benefits), but taking a ginseng supplement is intentionally consuming a substance full of chemicals that plausibly could have some effect (many drugs are very effective in very small doses). We also usually really want to feel better. Whether by placebo, by some real effect, by the change in habit, by some ancillary aspect (like drinking a glass of water to take your ginseng), by a sunk-cost commitment, or by some combination of any or all of these, it can be pretty easy to convince yourself that you've taken something beneficial. To the degree that it isn't really beneficial, congratulations: you've just installed a nearly natural superstitious attribution.

Terrifyingly, anti-vaccination superstitions probably fall into this category. In that seemingly bizarro world, no one is claiming that there is anything magical going on leading from vaccines to autism. All of the purported (and false) mechanisms are entirely naturalistic.

Lots of other examples exist too, especially in the social arena. Almost every ideological exaggeration in almost every political direction qualifies (examples from the Regressive Left, the Whackadoodle Right, and any other highly motivated political zealots should spring to mind without my having to name them). Society exists, and so do the people in it, and superstitions about what's going on in it are rife.

...and in the Darkness, bind them

So, what's going on with these non-religious, not-overtly-superstitious superstitious attributions?

First, let's remove the obvious: people simply not knowing better yet. Even if the best guesses we have about how things work are ultimately superstitious, there's some reasonableness in not branding them so in cases where the person legitimately doesn't know better. That is, I'm particularly interested in when people are choosing superstitious attributions when naturalistic ones are available. The critical difference is that people will often change beliefs readily when simply mistaken, but they will not do so when there's some psychological investment in the beliefs.

That said, what's going on here is roughly the same thing, I think, as with religious and more overtly superstitious attributions. Let's turn back to HHS for an explanation, repeating what I think is most relevant:
Our theoretical position asserts that attributions are triggered when meanings are unclear, control is in doubt, and self-esteem is challenged. ... Depending on a wide variety of situational and personal characteristics, there is a good likelihood of shifting to religious attributions when naturalistic ones do not satisfactorily meet the needs for meaning, control, and esteem. (p. 45)
So, what's going on is that people want to understand things. With astrology, for example, they want to understand people and how to have effective relationships with them. This includes understanding themselves--Why do I seem so reluctant to commit to something...? Oh, but could it be that I'm a Gemini? They also want to understand how things are going in life. That meeting didn't work out, but of course not, Mercury is in retrograde, so I shouldn't have bothered (and won't next time). The superstitious astrological attributional framework gives a context in which one can better understand, and darkly, perhaps control others and oneself. Instead of relying upon deities and magical entities, however, it references nonsense about the real planets and stars (even if the asterisms are fictions).

Feelings of meaning, control, and esteem are pretty central to all superstitious attributions, and I think control is often the most fundamental. Notice how readily examples concerning health and well-being came up in the discussion above, and probably in your own mind if you try to think of more examples. The reason is probably what I already said: we all want to feel better but don't really know what works. If we're afraid (of feeling bad, or being ill, or living poorly, or having autistic kids), control is in doubt, and we're likely to latch onto beliefs that restore some illusion of control. HHS discusses this important point too:
Though the ideal in life is actual control, the need to perceive personal mastery is often so great that the illusion of control will suffice. Lefcourt (1973) even suggests that this illusion "may be the bedrock upon which life flourishes." ... The attribution process described earlier represents not just a need for meaning, but also for mastery and control. Especially when threatened with harm or pain, all higher organisms seek to predict and/or control the outcomes of the events that affect them. This fact has been linked by attribution theorists and researchers with novelty, frustration or failure, lack of control, and restriction of personal freedom. It may be that people gain a sense of control by making sense out of what is happening and being able to predict what will occur, even if the result is undesirable. (p. 17, italics in original, bold added)
What do we do with this?

What this tells us is that we've really got to think about how we're going to handle certain kinds of nonsense if we want to try to diminish its harmful impacts (and we should). If people are holding superstitious attributions, ranging from strongly religious ones to nearly natural ones, because those satisfy some deeper psychological or social needs, we're not likely to argue them out of those beliefs, or not easily. We may, in fact, increase their commitment to those beliefs via the Backfire Effect, which in this case might arise along a vector of cognitive dissonance trying to protect the deeply needed beliefs and those related to them. A different approach than arguing is indicated, and the entire skeptic (and reasonable) community should be curious about finding ways to do better in this regard.

It isn't hard to tell the difference between an error and a superstitious attributional schema. An easy litmus test is simply attempting a factual correction of incorrect information. People aren't generally dumb. Thus, people who aren't holding superstitious beliefs for deeper psychological reasons will often simply change their minds about their erroneous beliefs. If they argue back, however, they've told us as plainly as they probably can (literally) that something deeper is at work, so we shouldn't argue with them. We need a different way that either unseats their ability to cling to that belief (and we know arguing isn't usually one of those) or that helps them identify the underlying needs and meet them differently (and, in the case of superstitious attributions, more successfully).

Usually, two things seem true about people psychologically attached to superstitious attributions: they can't really justify their beliefs, and they're (often) afraid (or, possibly, angry, but those aren't far apart in this regard). Asking them to justify their beliefs, say via Socratic dialogue and actual interest in their thoughts, and leaving enough safety to escape can help introduce doubt and unseat such beliefs, or at least make room for competing ideas that aren't so superstitious. Seeing why people believe as they do may help us understand what might help them find better attributions than the ones they hold. Failing to recognize the (probable) fear at the root of their adherence to those beliefs is an almost sure recipe for failure, though.
