Tuesday, February 25, 2014

On religion, partial inoculation, and treatment-resistant disease.

It was the mid-1960s, and the NASA programs that would put a handful of human beings on the moon for the first time within the decade were in full swing. Following Kennedy's famous pronouncement, this fact was a major element of the era's decidedly scientific zeitgeist. The Second Great War was behind us, and partially because of it, the economy was booming. Money, though, didn't all go to the top, even if lots of it did. Economic inequality in the United States in the 1960s was low and still steadily cruising toward its eventual nadir. In other huge news, widespread application of antibiotics had changed the face of disease seemingly overnight just a couple of decades earlier. And vaccines had just entered the scene, dropping the incidence and death rates of many serious diseases, again seemingly overnight. Life was good, and on April 8, 1966, TIME magazine's cover asked, "Is God dead?"

Not quite. God was not dead, though he was knocking at death's door. Though the 1960s were far from idyllic, all of the necessary conditions were in place to inoculate modern citizens against what some would later call "the faith virus." Science was big and in the public eye, satisfying our innate needs to understand the world and to feel that our unpredictable circumstances can be controlled. A fair degree of economic equality gave enough opportunity to enough people to feel hopeful and secure without God. At the same time, we were crushing disease, which had traditionally been believed to be the wrath of God enacted upon us lowly sinners. In the 1960s, it wasn't only scientists who agreed with Laplace's famous observation that there was no need for "that hypothesis"--God--in the model. It was popular sentiment.

But we didn't know an awful lot about these things in the 1960s, and I'd dare to suggest that we know better now, largely because of that failure, and hopefully it's not too late. We have a burden upon us, though, not to repeat the mistakes of our past.


Antibiotics and vaccines did what Jesus could not. They formed the nearest thing we can probably imagine to a miracle. Even taken at their ridiculous face value, Jesus' miracles cured only tens of people; antibiotics and vaccines took common serious illnesses and dropped their mortality rates almost to zero, and they did it blindingly fast. Where Jesus is said to have cured tens with his miraculous powers, antibiotics and vaccines cured tens of millions, in far less time and without a shred of doubt. Scourges of mankind like smallpox, many a theologian's delight for the fear it commanded, were effectively eradicated from the planet in a time comparable to the whole of Jesus' purported ministry.

But we didn't realize our danger. Evolution works quickly in extreme circumstances, allowing lifeforms to cling to existence beyond any hope. In rapidly reproducing bacteria and viruses, in which whole generations can be measured in minutes or hours instead of decades, the opportunity to evolve resistance to antibiotics and vaccines is stunning--and one of our greatest contemporary perils. Many disease-causing bacteria are evolving in response to our crusade against them, and in some cases they have evolved antibiotic-resistant strains (that's the R in the flesh-eating MRSA). Likewise, there is a serious threat that some of our vaccines may be obsolete within a few decades as new strains emerge beyond our protection. It is hard to imagine a worse situation than the resurgence of horrendous illnesses that are both unpreventable and untreatable.

There's a certain trick about evolution, though. Extinct species do not evolve. Unless it is somehow released from one of the handful of laboratories in which it still resides, smallpox isn't likely to make a comeback. Indeed, even within local populations--say, the population of a particular pathogen within a particular host--at least some individuals must survive the onslaught that besets them to have a chance to adapt to the stress. The ones that survive, of course, are the ones hardiest to the adversity, which is one reason that antibacterial soaps promising to kill 99.9% of germs are a little frightening. The hardiest 0.1% are the survivors that go on to reproduce, and the proportion of their offspring resistant to the would-be toxic chemistry goes up.
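The selection dynamic described here can be sketched as a toy simulation. All of the numbers below are illustrative assumptions of mine, not epidemiological data: a small resistant fraction starts out in the population, each round of treatment kills nearly all susceptible microbes but spares most resistant ones, and the survivors regrow to the original population size.

```python
def simulate(rounds=5, pop_size=100_000, resistant_frac=0.001,
             kill_susceptible=0.999, kill_resistant=0.10):
    """Toy model of repeated partial treatment of a microbial population.

    Illustrative parameters only: 0.1% of the population starts out
    resistant; each treatment kills 99.9% of susceptible microbes but
    only 10% of resistant ones; survivors then regrow to the original
    population size while keeping their resistant/susceptible ratio.
    Returns the resistant fraction observed after each round.
    """
    resistant = int(pop_size * resistant_frac)
    susceptible = pop_size - resistant
    history = []
    for _ in range(rounds):
        # Partial treatment: resistant cells survive preferentially.
        susceptible = int(susceptible * (1 - kill_susceptible))
        resistant = int(resistant * (1 - kill_resistant))
        total = susceptible + resistant
        if total == 0:
            history.append(0.0)
            break
        # Survivors regrow to carrying capacity, in proportion.
        frac = resistant / total
        susceptible = int(pop_size * (1 - frac))
        resistant = pop_size - susceptible
        history.append(frac)
    return history

print(simulate())
```

With these made-up numbers, the resistant fraction jumps from 0.1% to nearly half the population after a single round of partial treatment, and the population is essentially all-resistant within a few rounds--the "hardiest 0.1%" taking over.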

With antibiotics, it is less their application than their misapplication that has led to horror stories like treatment-resistant tuberculosis arising in India. One must apply the right drugs and then see them completely through so that, between the drug and the sick person's own immune system, there aren't enough survivors to be concerned with. This does not always happen, for a variety of reasons, most of them bad or downright heinous (as with the state of medical treatment for Indian "Untouchables," for the worst of all possible reasons--religious ones). What happens reliably instead is that antibiotic-resistant strains of diseases evolve, and dimly remembered horrors threaten to reawaken in our future.

With vaccines, the matter is similar. The reason smallpox was all but eradicated, much as polio has nearly been, is that the vaccines for these illnesses were spread globally via concerted efforts to stamp out the diseases. Nearly everyone was vaccinated, so the disease could hardly get a toehold, and where it did, it could not spread. The application was complete. This, though, is not always possible. The diseases we've vaccinated nearly out of existence often have no hosts other than humans, but others can be carried by other animals in addition to us. Against these, we must continually immunize ourselves and our newborn babies.

That issue had a simple solution: it became standard practice to provide a series of vaccinations to children starting almost from the hour of their birth. It worked wonderfully, but because of unscientific yammering, this progress is being reversed. Many parents now refuse to vaccinate their children, and for very bad reasons. Predictably, these diseases--mumps, measles, rubella, pertussis, and so on--are making a roaring comeback. And the prediction is dire: within a few decades, it seems, our vaccines may be mostly useless.

Incomplete application of an inoculation leaves open a dangerous door. The pathogens evolve, and the forms that survive are often more dangerous and considerably less treatable. Our God-is-dead hope of the 1960s teeters on this balance, because when the specter of deadly infectious disease comes back onto the scene, so will a desperate belief in God (and a manipulative one about his wrath).


Since the 1960s, we have learned that pushing for socioeconomic equality causes certain vectors of the inequality virus to work overtime, seeking out ways to exercise their sociopathic privilege to the ruin of many. The New Deal and ensuing Great Society spawned a cult of individual sovereignty that served as the perfect cover for the societal sickness known as plutocracy to creep back upon us, an effort whose full force can be traced to roughly 1971.

Somehow, American culture, enjoying the fruits of the Great Society, was never put in a position to understand clearly that it is the society that makes great societies work, nor did it properly understand the role that wealth and income inequality play in them. Particularly, as those forms of inequality increase, the society becomes sick in profound ways. For the individual, the trend is not at all clear--more money means more opportunity means good things are on track. For the society, though, now that we've looked, the fact stands out like a sore thumb. Wealth and income inequality are societal diseases, and many of the social ills we now bear witness to are symptoms. One of those is a return to distrustful individualism--a rejection of society--and thus the symptom exacerbates the cause.

As with disease, the inoculation against this sickness was not complete in the US, and the reforms of the Roosevelt through Eisenhower eras served as the basis for plutocratic greed to evolve. Evolve it did, and by embracing the anti-New-Deal reactionary "value" of individual sovereignty, it brought itself back from the edge of death with more popular appeal than ever. In 1980, with Ronald Reagan as its figurehead, government--which is to say society--became "the problem," and the cult of the rugged individual rocketed into the mainstream. American society at large was reinfected with the plutocracy virus that now threatens to tear it apart, and the virus is spreading abroad. European and Australian plutocrats, among others, are picking up the thought processes that began dividing America in the 1970s, and now their own great societies are being torn apart. The inoculation of progressive liberalism was not fully administered, largely because it poisoned itself with ridiculous postmodern relativism.

We know better now, of course, but in the 1960s we did not realize fully enough that wealth and income inequalities are such pathogenic socioeconomic forces. Again, the God-is-dead hope of the 1960s is threatened by this because desperate economic situations, which leave people feeling less autonomous however intensely they pretend to be ruggedly individually sovereign, lead to a resurgence of desperate beliefs in God (and again, a ripe opportunity for religious leaders to capitalize on the problem).


God wasn't dead in 1966, but a great deal of what went by that name was. In isolated corners of American culture, a new and decidedly fundamentalist, evangelical variant on Protestant Christianity was initiating a Great Revival. Others, notably the nearly immutable Roman Catholic Church, plodded along as ever. For typical Americans, belief in God might have been largely irrelevant, perhaps even quaint, but the inoculation against the faith virus was not made complete, and the surviving religious cells were positioning themselves to go big-time.

And they did. And they still are. A variety of forces--social, political, economic, and even theological--contributed to this effect, but over the last few decades of the last millennium, America underwent a huge religious revival, turning back from the attitudes that characterized the middle of the twentieth century. The problem was that the faith virus was not treated fully. God became irrelevant before faith itself was exposed as a contagious cognitive flaw, and so susceptible minds were swept up as the revival came to full force. Some of this revival is accounted for by what appears to be a natural boomerang effect in religious attitudes, but for that to have happened at all requires an incomplete inoculation against the faith virus in the first place.

New Atheism

"New Atheism," as it is called, is an enough-is-enough response to this revival, which took place not only within American Christianity but also in other faiths throughout the world, most notably Islam. The landmark event, of course, took place on September 11, 2001, which could be taken as the first exclamation point in the story of the world's reinfection with an intense, recalcitrant strain of the faith virus after a brief remission toward enlightenment. Since the World Trade Center came down in flame, smoke, ash, and death, the religious story has been told more and more fervently, often in capslock, and we are left wondering how grim the situation is. A very hard-to-treat, profoundly virulent strain of the faith virus has taken root, and the medicine of "New Atheism" is bitter: an unwavering rejection of religious authority, and a certain steeling of the nerves against the cries of butt-hurt offense that flow from it.

Here, then, we see a difference between the God-is-dead hope of a half-century past and the "New Atheism" of today. "New Atheists" are profoundly less likely ever to be taken by the faith virus again because they understand both that God is irrelevant and that faith is a cognitive flaw loaded with pretense. "New Atheists" have been properly inoculated. The mental infrastructure that keeps the demon out is robust, solid, and clear, and it is held for clearly articulated reasons.

We need to take our medicine,...

...but not everyone wants to.

A growing movement of accommodationist atheists--faitheists, in the phrasing of Chris D. Stedman, author of a book called Faitheist--seems to prefer a kinder, gentler, less rebellious attitude from atheists toward faith. They, like NYU professor and social psychologist Jonathan Haidt, are "not anti-religion," and they beg "New Atheists" to see--and honor--what we have to learn from the faithful instead of making a hard-nosed stand against the faith virus. To see what I mean, consider this recent piece from Stedman (dubbed a "must read for ALL atheists" by one of Stedman's fans on Twitter)--or read his book--and this one from Haidt.

Stedman and Haidt, and their growing group of followers (particularly among the Progressive Left), prefer the obvious, shortsighted path. Taking antibiotics often makes one feel ill and is a hassle, and the feeling of being sick often subsides days or weeks before the full course of the prescription is run. Vaccines require getting an injection that is sometimes painful, can cause mild symptoms, and can also be a hassle--even setting aside the ignorant, unscientific fear that they cause autism--and who on earth really gets whooping cough? (N.B.: A friend of mine just did, thanks to some unvaccinated kids at the playground where his (vaccinated) kids play.) Standing up in a hard-line fashion to religious authority and privilege hurts people's feelings and is mentally and emotionally draining. Stedman's position asks: can't we all just get along instead?

Sure, we can. And, if we prefer to, we can partially inoculate against the faith problem--one with a potent capacity to lead to calamity (take, for example, evangelical Christianity's near-universal, religiously "justified" denial of climate change, of all non-religious things). What we cannot do, though, is delude ourselves into believing that this will solve the unique problem religious faith presents in our world. And we should recognize that if faith survives its brush with "New Atheism," it will be stronger than ever for the encounter and at least as much of a problem.

"New Atheism" has shaken the world of faith like a cultural antibiotic, and in its wake it provides a potent vaccine. The disease isn't going easily, though, and neither is the treatment, as is sometimes the case in medicine (the reader is encouraged to investigate the full treatment experience for hepatitis or tuberculosis, not to mention chemotherapy for cancer). There is a desire for harmony and peace that calls some to a mission to abandon the "New Atheist" treatment protocol in favor of the kind of deference that feeds religion forever. I understand that desire and share the wish, and I am quite sure it is wrong.

Ameliorative measures

There are ways to polish the rough edges of "New Atheism" to make the medicine a little easier to swallow. Up until now, it has admittedly at times been quite a blunt instrument, but as it matures, it is being refined. One of the best and most obvious suggestions has been brought to light by hard-liner Peter Boghossian, author of A Manual for Creating Atheists. Boghossian calls above all for authenticity and honesty. Be real. Be honest. Be willing to change your mind when the reasons are good. But do not mistake authenticity and honesty for deference to delusion. These achieve all of the goals of the anti-"New Atheist" crowd without their main failures. Authenticity of this kind does not condescend to the faithful by assuming that they need their faith to get through the day, and it does not give unwarranted deference to religious privilege. Authenticity and honesty are not merely palliative measures but are clear refinements of the "New Atheist" medicine, a course of treatment that we need to see through. Failure will preserve the disease in a state more resistant to future treatment.

Here's how to do it. Be ruthlessly honest with yourself, even if you do not have the nerve to do it with other people. By applying relentless honesty to one's own positions, neither faith nor deference to it is possible, and the anti-"New Atheism" confusion is revealed as a way to walk away from the cure before it is effected while nourishing the disease. People deserve dignity, and this obviously includes religious people, but ideas do not. Honesty and authenticity, even if unappreciated, are the best ways to dignify a person regardless of the quality of their ideas. It does nothing like offering dignity to a person to coddle their bad ideas, and so we shouldn't. And there is "New Atheism" in a sentence. So ask yourself, what are people who are against "New Atheism" really against?
