Is There Something Missing in Our Lives? Is Nietzsche to Blame?
By the summer of 1990 the author Salman Rushdie had been living in hiding for more than a year. This had followed a fatwa, an Islamic juristic ruling, issued by the Iranian supreme cleric, Ayatollah Khomeini, on February 14, 1989, in which he had said, “I inform the proud Muslim people of the world that the author of the Satanic Verses book, which is against Islam, the Prophet and the Quran, and all those involved in its publication who were aware of its content, are sentenced to death. I ask all Muslims to execute them wherever they find them.”
This was by any standard a monstrous event, made all the more terrible by Khomeini’s claim of authority over all Muslims. But, however wrong, the threat had to be dealt with and Rushdie was given police protection and the use of a bulletproof Jaguar, though he had to find his safe houses himself. In July of that year, the police had suggested a further refinement for his safety—a wig. “You’ll be able to walk down the street without attracting attention,” he was told. The Metropolitan Police’s best wig man was sent to see him and took away a sample of his hair. The wig was made and arrived “in a brown cardboard box looking like a small sleeping animal.” When he put it on, the police said it “looked great” and they decided to “take it for a walk.” They drove to Sloane Street in London’s Knightsbridge and parked near the fashionable department store Harvey Nichols. When he got out of the Jaguar “every head turned to stare at him and several people burst into wide grins or even laughter. ‘Look,’ he heard a man’s voice say, ‘there’s that bastard Rushdie in a wig.’”1
It is a funny story, despite the grim circumstances in which it took place, and Rushdie tells it again himself in his memoir, Joseph Anton (the cover name he adopted), which he felt safe to publish only in 2012, nearly a quarter of a century after the original fatwa.
There was certainly something missing in his life during those anxious times, the most precious thing of all—his liberty. But that is not exactly what the German philosopher Jürgen Habermas had in mind when he wrote his celebrated essay, “An Awareness of What Is Missing: Faith and Reason in a Post-secular Age” (2008). He, too, was concerned with the impact of religion on our lives but meant something no less precious perhaps, and far more difficult to pin down.
NO “AMEN”: THE TERMS OF OUR EXISTENCE AND THE IDEA OF A MORAL WHOLE
This something had first occurred to him after he attended a memorial service for Max Frisch, the Swiss author and playwright, held in St. Peter’s Church in Zurich on April 9, 1991. The service began with Karin Pilliod, Frisch’s partner, reading out a brief declaration written by the deceased. It said, among other things: “We let our nearest speak, and without an ‘amen.’ I am grateful to the ministers of St. Peter’s in Zurich . . . for their permission to place the coffin in the church during our memorial service. The ashes will be strewn somewhere.” Two friends spoke, but there was no priest and no blessing. The mourners were mostly people who had little time for church and religion. Frisch himself had drawn up the menu for the meal that followed.
Habermas wrote much later (in 2008) that at the time the ceremony did not strike him as peculiar, but that, as the years passed, he came to the view that the form, place and progression of the service were odd. “Clearly, Max Frisch, an agnostic, who rejected any profession of faith, had sensed the awkwardness of non-religious burial practices and, by his choice of place, publicly declared that the enlightened modern age has failed to find a suitable replacement for a religious way of coping with the final rite de passage which brings life to a close.”
And this more than a hundred years since Nietzsche announced the death of God.
Habermas used this event—Frisch’s memorial—as the basis for “An Awareness of What Is Missing.” In that essay he traces the development of thought from the Axial Age to the Modern period and argues that, while “the cleavage between secular knowledge and revealed knowledge cannot be bridged,” the fact that religious traditions are, or were in 2008, an “unexhausted force,” must mean that they are based more on reason than secular critics allow, and this “reason,” he thinks, lies in the religious appeal to what he calls “solidarity,” the idea of a “moral whole,” a world of collectively binding ideals, “the idea of the Kingdom of God on earth.” It is this, he says, that contrasts successfully with secular reason, and provides the “awkward” awareness of something that is missing. In effect, he says that the main monotheisms had taken several ideas from classical Greece—Athens as much as Jerusalem—and based their appeal on Greek reason as much as on faith: this is one reason why they have endured.
Habermas has one of the most fertile, idiosyncratic and provocative minds in the post–Second World War conversation, and his ideas on this score are underlined by the similar notions of his American contemporaries Thomas Nagel and Ronald Dworkin. In his recent book, Secular Philosophy and the Religious Temperament, Nagel puts it this way: “Existence is something tremendous, and day-to-day life, however indispensable, seems an insufficient response to it, a failure of consciousness. Outrageous as it sounds, the religious temperament regards a merely human life as insufficient, as a partial blindness to or rejection of the terms of our existence. It asks for something more encompassing without knowing what it might be.”
The most important question for many people, Nagel says, is this: “How can one bring into one’s individual life a full recognition of one’s relation to the universe as a whole?” [Italics added.] Among atheists, he says, physical science is the primary means whereby we understand the universe as a whole, “but it will seem unintelligible [as a means] to make sense of human existence altogether. . . . We recognize that we are products of the world and its history, generated and sustained in existence in ways we hardly understand, so that in a sense every individual life represents far more than itself.” At the same time he agrees with the British philosopher Bernard Williams that the “transcendent impulse,” which has been with us since at least Plato, “must be resisted,” and that the real object of philosophical reflection must be the ever more accurate description of the world “independent of perspective.” He goes on: “The marks of philosophy are reflection and heightened self-awareness, not maximal transcendence of the human perspective. . . . There is no cosmic point of view, and therefore no test of cosmic significance that we can either pass or fail.”2
In a later book, Mind and Cosmos (2012), he goes further, arguing that the neo-Darwinian account of the evolution of nature, life, consciousness, reason and moral values—the current scientific orthodoxy—“is almost certainly false.” As an atheist, he nonetheless felt that both materialism and theism are inadequate as “transcendent conceptions,” but at the same time acknowledged that it is impossible for us to abandon the search “for a transcendent view of our place in the universe.” And he therefore entertained the possibility (on virtually no evidence, as he conceded) that “life is not just a physical phenomenon” but includes “teleological elements.” According to the hypothesis of natural teleology, he wrote, there would be “a cosmic predisposition to the formation of life, consciousness, and the value that is inseparable from them.” He admitted: “In the present intellectual climate such a possibility is unlikely to be taken seriously”; and indeed, he has been much criticized for this argument.
The argument itself will be discussed more fully in chapter 26, but it fits in here because it shows that, 130-odd years after Nietzsche famously announced “the death of God,” many people (though by no means all) are still trying to find other ways to look out upon our world than the traditional religious viewpoints.
Almost simultaneously, Nagel was joined by his fellow American philosopher Ronald Dworkin in his Religion Without God (2013). Here, too, the main thrust of the argument will be discussed in chapter 26, but Dworkin’s chief point was that “religious atheism” is not an oxymoron (not anymore, anyway); that religion, for him and others like him, “does not necessarily mean a belief in God”—rather, “it concerns the meaning of human life and what living well means”; and life’s intrinsic meaning and nature’s intrinsic beauty are the central ingredients of the fully religious attitude to life. These convictions cannot be isolated from the rest of one’s life—they permeate existence, generate pride, remorse and thrill, mystery being an important part of that thrill. And he said that many scientists, when they confront the unimaginable vastness of space and the astounding complexity of atomic particles, have an emotional reaction that many describe in almost traditional religious terms—as “numinous,” for example.
This feels new, though, as we shall see in chapter 15, some of it at least was presaged by John Dewey between the two world wars and hinted at by Michael Polanyi in the late 1950s and early 1960s.3 The significant factor, for now, is that these three philosophers—on either side of the Atlantic and each at the very peak of his profession—are all saying much the same thing, if in different ways. They share the view that, five hundred and more years after science began to chip away at many of the foundations of Christianity and the other major faiths, there is still an awkwardness, as Habermas put it, or a blindness or “insufficiency” (Nagel); a mystery, thrilling and numinous, as Dworkin characterized it, in regard to the relationship between religion and the secular world. All three agree with Bernard Williams that the “transcendent” impulse must be resisted, but they acknowledge—ironically—that we cannot escape the search for transcendence and that, as a result, many people feel “something” is missing. This is, in effect, they say, the modern secular predicament.
It is in many ways extraordinary that these three individuals—all hugely respected—should, within a few months of each other, but independently, come to similar conclusions: that, depending on where you start counting—from the time of Galileo and Copernicus, four or five hundred years ago, or from Nietzsche, 130 years ago—secularization is still not fitting the bill, is still seriously lacking in . . . something.
The Canadian philosopher Charles Taylor has no doubt what that something is. In two very long books, Sources of the Self (1989) and A Secular Age (2007), he repeatedly charges that people today who inhabit a secular world and lack faith are missing out, missing out on something important, vital—perhaps the most important something there is—namely, as he puts it, a sense of wholeness, fulfillment, fullness of meaning, a sense of something higher. They suffer from an incompleteness, from what he calls “a massive blindness” in the modern world to the fact that there is “some purpose in life beyond the utilitarian.”4
Human flourishing, Taylor maintains—a fulfilled life—can be achieved only via religion (Christianity, in his case). Otherwise, the world is “disenchanted,” life is a “subtraction story” with important parts missing. With no sense of “transcendence,” no sense of the “cosmic sacred,” we are left with “merely human values,” which he finds “woefully inadequate.” The “higher times,” he says, have faded, we are imbued with “a sense of malaise, emptiness, a need for meaning”; there is a terrible sense of flatness in the everyday, the emptiness of the ordinary, and this need for meaning can be met only “by a recovery of transcendence.”5
POROUS VERSUS BUFFERED SELVES
Taylor pursues this argument further than any of the others. He says that humanism has failed, that the “pursuit of happiness,” a current concern, is a much thinner idea or ideal than “fulfillment” or “flourishing” or transcendence; that it uses a “less subtle language,” giving rise to less subtle experiences; that it is lacking in “spiritual insight,” spontaneity or immediacy, is devoid of “harmony” and “balance,” and is ultimately unhealthy.
The modern individual, he says, is a “buffered” self rather than a “porous” self. A porous self is open to all the feelings and experiences of the world “out there,” while the modern buffered self is denied these experiences because our scientific education teaches us only concepts, our experiences are intellectual, emotional, sexual and so on, rather than “whole.” Modern individuals have been denied a “master narrative” in which they may find their place, and without which their “sense of loss can perhaps never be stilled.” Without these factors, he goes on, there is no scope for any human life to achieve a “sense of greatness” out of which a “higher” view of fulfillment arises. The sense that there is “something more” presses in on us, and, therefore, we can never be “comfortable” with unbelief.
Phew. Skeptics may raise their eyebrows at these claims but there is no doubt that they chime with what many people feel or think. And the likes of Taylor find support for their arguments in the statistical fact that, after the high point of secularization in the 1960s and 1970s, at the beginning of the twenty-first century more and more people are turning—or returning—to religion. Richard Kearney has even given it a name, Anatheism.6 We shall return to the (ambiguous) meaning of these statistics presently, but it is certainly true that in 2014 the battle between religious thinkers and atheists is as fierce (and indeed as bitter) as it has been for many a year.
For their part, the militant atheists, as they have been described, largely occupy a Darwinian position. Richard Dawkins, Daniel Dennett, Sam Harris and Christopher Hitchens, to name only the best known, follow Charles Darwin in seeing human beings as an entirely naturally occurring biological species, which has slowly evolved from “lower” animals, in a universe that has likewise evolved over the past 13.8 billion years from a “singularity,” or “Big Bang,” itself a naturally occurring process (albeit one where the laws of nature break down) that we shall understand someday. This process has no need of any supernatural entity.
In the latest rounds of this debate, Dawkins and Harris have used Darwinian science to explain the moral landscape in which we live, and Hitchens has described such institutions as the library, or “lunch with a friend,” as episodes in a modern life just as fulfilling as prayer or church- or synagogue- or mosque-going.
The average reader—especially the average young reader—could be forgiven for thinking that this is all there is to the debate: either we embrace religion, or we embrace Darwinism and its implications. Steve Stewart-Williams has taken this reasoning to its logical conclusion when he says, in Darwin, God and the Meaning of Life (2010), that there is no God, that the universe is entirely natural and in that sense accidental, so that there can be no purpose to life, and no ultimate meaning other than that which we work out for ourselves as individuals.
But though it is the Darwinists who, among atheists, are making the most noise at the moment (and with good reason, given the amount of biological research that has accumulated in the past decades), theirs is not the only game in town. The fact is that, since the advance of religious doubt gathered pace in the seventeenth and eighteenth centuries, and in particular since Nietzsche announced “the death of God” in 1882 (adding, moreover, that it was we humans who had killed him), many people have addressed themselves to the difficult question of how we are to live without a supernatural entity on whom we can rely.
Philosophers, poets, playwrights, painters, psychologists, to name only those whose professions begin with the same letter of the alphabet, have all sought to think through just how we might live, individually and communally, when we have only our own selves to fall back on. Many—one thinks of Dostoevsky, T. S. Eliot, Samuel Beckett—have expressed their horror at what they see as the bleak world that is left once the idea of God leaves it. Perhaps because horror claims all the best tunes, these Jeremiahs have caught the popular imagination, but The Age of Atheists will concentrate instead on the other—in some ways braver—souls who, instead of waiting and wallowing in the cold, dark wastelands of a Godless world, have devoted their creative energies to devising ways to live on with self-reliance, invention, hope, wit and enthusiasm. Who, in Wordsworth’s words, “grieve not, rather find / Strength in what remains behind.”
This aspiration, how to live without God, how to find meaning in a secular world, is—once you put your mind to it—a grand theme that has been touched on by a number of the more daring modernist writers, artists and scientists but has never before been gathered together, so far as I know, into a master narrative. When that is done, it provides a rich and colorful story, as I hope to show, a set of original yet overlapping ideas which I am sure many readers will find thrilling, provocative, yet commonsensical and even consoling.
Some consolation is especially called for because the debate over faith, over what is missing in people’s lives, has degenerated in recent years into a bizarre mix of the absurd and the deadly.
ARE WE IN A SPIRITUAL RECESSION? OR, ARE WE AS FURIOUSLY RELIGIOUS AS WE EVER WERE?
Twice in recent years, religious figures predicted that the world would end—on May 21, 2011, and December 21, 2012. Nothing of the kind happened either time, but none of the figures concerned felt a need to acknowledge that their predictions were . . . well, plain wrong. Pakistan has experienced numerous assassinations of individuals seen—by fellow members of the public—to be contravening its relatively new Islamic blasphemy laws. Tunisia has seen two prominent secular politicians assassinated. Sexual and child abuse cases by Muslims in Britain and Holland, or by Catholic priests in a whole raft of countries worldwide, have become virtually part of the furniture of our lives; the abuse of young white girls, by Muslim men, in Britain has been described as a “tidal wave of offending.”7
These events, coming in the wake of other, even more spectacular, atrocities (the devastation of 9/11, the bombings in Bali, Madrid and London, all committed by Muslims), may not have been quite as bloody in terms of the numbers killed. But they do mark an extension of religiously motivated criminal behavior into ever widening areas of human intolerance—and therein lies what is arguably the most important intellectual, political—even existential—paradox facing us in the young twenty-first century.
An atheist observing this set of absurd and deadly behaviors could be forgiven for grimacing in chastened satisfaction. After centuries of religious strife, after more than two hundred years of deconstruction of the factual historical basis of the Bible, after a plethora of new gods has emerged in the most unlikely, mundane and prosaic of ways and places—the Duke of Edinburgh is worshipped as a god on the Pacific island of Vanuatu, a Royal Enfield motorcycle is revered as a deity in parts of India, there is now a website, godchecker.com, listing more than three thousand “supreme” beings—humans everywhere seem to have learned next to nothing. They are still locked into ancient enmities, still espouse outdated and disproved doctrines, still fall for shabby con tricks, allowing themselves to be manipulated by religious showmen and charlatans.
And yet, and yet . . . The blunt (and to many the perplexing) truth appears to be that, despite the manifest horrors and absurdities of many aspects of religion, despite the contradictions, ambiguities and obvious untruths embodied by all major and minor faiths, it is—according to a number of distinguished authorities—atheism that appears to be in retreat today.
One of the first to point this out was the sociologist Peter Berger. His view might be seen as poignant because it had some of the characteristics of a conversion. Berger, an Austrian émigré who became professor of sociology and theology at Boston University, was in the 1950s and 1960s a keen advocate of “secularization theory.” This theory, which was at its strongest in the mid–twentieth century and could be traced back to the Enlightenment, held that modernization “necessarily” leads to the decline of religion, both in society and in the minds of individuals. On this analysis, secularization was and is a good thing, in that it does away with religious phenomena that are “backward,” “superstitious” and “reactionary.”
That was then. In the opening decades of the twenty-first century, however, the picture appears very different, at least to some people. As mentioned above, Peter Berger was one of the first to draw attention to the change which brought about, on his part, a famous recantation. In 1996, he accepted that modernity had, “for fully understandable reasons,” undermined all the traditional certainties, but he insisted that uncertainty “is a condition that many people find very hard to bear.” Therefore, he pointed out, “any movement (not only a religious one) that promises to provide or to renew certainty has a ready market.”8 And, looking about him, he concluded that the world today “is as furiously religious as it ever was . . . is anything but the secularized world that had been predicted (whether joyfully or despondently),” that whatever religious color people have, they are all agreed upon “the shallowness of a culture that tries to get along without any transcendent point of reference.”9
Berger is not alone. There is no question that the spirits of religious authors are on the rise. In 2006, John Milbank, professor of religion at the University of Nottingham, sought to explain how theology can lead us “beyond secular reason.” In The Language of God (2006), Francis S. Collins, the geneticist who led the American government’s effort to decipher the human genome, described his own journey from atheism to “committed Christianity.” In God’s Universe (2006), Owen Gingerich, professor emeritus of astronomy at Harvard, explained how he is “personally persuaded that a superintelligent Creator exists beyond and within the cosmos.” And in Evolution and Christian Faith, published the same year, Joan Roughgarden, an evolutionary biologist at Stanford University, recounted her struggles to fit the individual into the evolutionary picture—complicated in her case by the fact that she is transgender and so has views at odds with some conventional Darwinian thinking about sexual identity.
In 2007, Antony Flew, professor of philosophy at various universities in Britain and Canada, explained in There Is a God how “the world’s most notorious atheist [himself] changed his mind.” Also in 2007, Gordon Graham examined whether art, for all its advantages, can ever “re-enchant” the world the way religion did, concluding that it couldn’t. In 2008, Dr. Eben Alexander suffered bacterial meningitis and went into a deep coma for a week. Recovering, he wrote a best-selling memoir, Proof of Heaven: A Neurologist’s Journey to the Afterlife, in which he described heaven as full of butterflies, flowers, and blissful souls and angels.10
RELIGION AS SOCIOLOGY, NOT THEOLOGY
There is another perplexing side to this—namely, that in the past decade some new and sophisticated arguments have been made for understanding religion as a natural phenomenon. Some of these arguments, moreover, have arisen as a result of new scientific findings that have changed the nature of the debate. What are we to make of this state of affairs, in which atheism has the better case—its evidence involves new elements and introduces new arguments—but religion, so its adherents claim, has the numbers, despite its manifest horrors and absurdities?
The most convincing argument I have encountered—certainly the one with the most substantial and systematic evidence to support it—is that offered by Pippa Norris and Ronald Inglehart in Sacred and Secular: Religion and Politics Worldwide (2004). Their book draws on a massive base of empirical evidence generated by the four waves of the World Values Survey, carried out from 1981 to 2001, which has conducted representative and sophisticated national surveys in almost eighty societies, covering all of the world’s major faiths. Norris and Inglehart also used Gallup International Polls, the International Social Survey Program and Eurobarometer surveys. While, they say, “it is obvious that religion has not disappeared from the world, nor does it seem likely to do so,” they insist that the concept of secularization “captures an important part of what is [still] going on.”
Their study identifies a core sociological factor, something they term “existential security,” which they say rests on two simple axioms and which “prove[s] extremely powerful in accounting for most of the variations in religious practices found across the world.”11
The first basic building block in their theory is the assumption that rich and poor nations around the globe differ sharply in their levels of sustainable human development and socioeconomic inequality and thus in the basic living conditions of human security and vulnerability to risks. The idea of human security has emerged in recent years, they say, as an important objective of international development. At its simplest, the core idea of human security rejects military strength as the guarantor of territorial integrity and replaces it with freedom from various risks and dangers, ranging from environmental degradation to natural and man-made disasters such as floods, earthquakes, tornadoes and droughts, and to epidemics, violations of human rights, humanitarian crises and poverty.
The past thirty years have seen dramatic improvements in some parts of the developing world. Nevertheless, the United Nations Development Program (UNDP) reports that worldwide progress has been erratic during the last decade, with some reversals: fifty-four countries (twenty of them in Africa) are poorer now than in 1990; in thirty-four countries, life expectancy has fallen; in twenty-one, the Human Development Index declined. In Africa, trends in HIV/AIDS and hunger are worsening. The gap between living conditions in rich and poor societies is growing.12
Analysis of data from societies around the world has revealed that the extent to which people emphasize religion and engage in religious behavior could, indeed, be predicted with considerable accuracy from a society’s level of economic and other development. Multivariate analysis (a statistical technique) has demonstrated that a few basic developmental indicators, such as per capita GNP, rates of HIV/AIDS, access to improved water sources and the number of doctors per hundred thousand people, predict “with remarkable precision” how frequently the people of a given society worship or pray. The most crucial explanatory variables are those that differentiate between vulnerable societies and societies in which survival is so secure that people take it for granted during their formative years.13
In particular, Norris and Inglehart hypothesize that, all things being equal, the experience of growing up in less secure societies will heighten the importance of religious values, while, conversely, experience of more secure societies will lessen it. The main reason, they say, is that “the need for religious reassurance becomes less pressing under conditions of greater security.” It follows that people living in advanced industrial societies will often grow increasingly indifferent to traditional religious leaders and institutions and become less willing to engage in spiritual activities. “People raised under conditions of relative security can tolerate more ambiguity and have less need for the absolute and rigidly predictable rules that religious sanctions provide.”
It seems plain that improving conditions of existential security erode the importance of religious values but—and here is the rub—at the same time reduce the rates of population growth in postindustrial societies. So rich societies are becoming more secular in their values but shrinking in population. In contrast, poorer nations remain deeply religious in their values and will also have much higher fertility rates, producing ever larger populations (and therefore tending to remain poor).14 A core aim of virtually all traditional religions is to maintain the strength of the family, “to encourage people to have children, to encourage women to stay home and raise the children, and to forbid abortion, divorce, or anything that interferes with high rates of reproduction.” It should be no surprise, then, that these two interlinked trends mean that rich nations are becoming more secular, but the world as a whole is becoming more religious.
TRANSCENDENCE VERSUS POVERTY
A number of things follow from this analysis. In the first place, we can say that the original secularization theory was right all along, but that many societies did not follow (or failed to follow) the same industrialization/urbanization path as did the West. Second, and conceivably more important, we can now see that religion is best understood “as a sociological rather than a theological phenomenon.”15 Far from “transcendence” being the fundamental ingredient or experience related to belief, as Peter Berger and others claim, poverty and existential insecurity are the most important explanatory factors. Given all this, and combined with the UNDP findings—that the gap between rich and poor countries continues to widen, and “existential insecurity in some fifty or more countries is likewise growing”—the “success” of religion is actually a by-product of the failure of some countries to modernize successfully and reduce the insecurities of their people. On this reading, the expansion of religion is nothing for us, as a world community trying to help each other, to be proud of—triumphalism concerning the religious revival is therefore, on this account, misplaced.
A final point is more subtle. When we actually look at the “flavor” of the religions that are flourishing now, when we look at their theological, intellectual and emotional characteristics, what do we find? We find, first, that it is the established churches—those with the most elaborately worked out theologies, theologies as often as not about transcendence—that are losing adherents, to be replaced by evangelicals, Pentecostals, “health-and-wealth” charismatics and fundamentalists of one kind or another. In 1900, 80 percent of the world’s Christians lived in Europe and the United States; today, 60 percent of them live in the developing world.16
What are we to make of evangelical healing and prophecy? If these worked often enough, they would surely take over the world far more than they have done, offering a better explanation for disease, say, than any scientifically derived view. What are we to make of “speaking in tongues,” a biblical phrase that confers a would-be dignity on a phenomenon that, under any rational light, borders on mental illness? When, in February 2011, a reporter on live television in the United States suddenly broke into gibberish for a few moments, it attracted wide interest on other TV stations and on the Internet, and both ribald and sympathetic comment, but no one suggested for a moment that she had had a religious experience (and she didn’t say that herself). Discussion centered on which regions of her brain might have caused such an “epileptic-type” outburst.
What are we to make of health-and-wealth churches? What role does “transcendence” play in their ideology? Health and wealth directly address existential insecurity.
To the atheist mind, these developments—the violent intolerance of fundamentalist Islam, the willful ignorance of the creationists in certain regions of the United States, speaking in tongues by evangelicals, charismatic “healing,” the worship of motorcycles in India—suggest nothing less than a turning-back of the clock. The simple, obvious and rational sociological explanation for these events only underlines their crudity.
Alongside the sociological explanations for the religious revival, the psychological ones seem—to an extent—almost beside the point. In their book God Is Back, John Micklethwait and Adrian Wooldridge hold that there is “considerable evidence that, regardless of wealth, Christians are healthier and happier than their secular brethren.” David Hall, a doctor at the University of Pittsburgh Medical Center, maintains that weekly church attendance can add two to three years to someone’s life. A 1997 study of seven thousand older people by the Duke University Medical Center found that religious observance “might” enhance immune systems and lower blood pressure. In 1992 there were just three medical schools in the United States that had programs examining the relationship between spirituality and health; by 2006 the number had increased to 141.17
Micklethwait and Wooldridge state: “One of the most striking results of the Pew Forum [Research Center]’s regular survey of happiness is that Americans who attend religious services once or more a week are happier (43 percent very happy) than those who attend monthly or less (31 percent) or seldom or never (26 percent). . . . The correlation between happiness and church attendance has been fairly steady since Pew started the survey in the 1970s; it is also more robust than the link between happiness and wealth.”18
Studies also show, they say, that religion can combat bad behavior as well as promote well-being. “Twenty years ago, Richard Freeman, a Harvard economist, found that black youths who attended church were more likely to attend school and less likely to commit crimes or use drugs.” Since then, a host of further studies, including the 1991 report by the National Commission on Children, have concluded that religious participation is associated with lower rates of crime and drug use. James Q. Wilson (1931–2012), perhaps America’s pre-eminent criminologist, succinctly summarized “a mountain of [social-scientific] evidence”: “Religion, independent of social class, reduces deviance.” Finally, Jonathan Gruber, “a secular-minded economist” at the Massachusetts Institute of Technology, has argued “on the basis of a mass of evidence” that churchgoing produces a boost in income.
Two observations are pertinent here. The first is that these examples are taken from the United States and, as is becoming clear, that country is exceptional in all sorts of ways and not at all typical of what is happening elsewhere. The second observation is, perhaps, more relevant to our subject. Even if some of these surveys showing the benefits of belief are true, what exactly is being argued here? That God rewards people who go to church regularly and often by making them happier, healthier and, to an extent, richer? But if so, and if God is omnipotent and beneficent, what about the 57 percent of regular churchgoers who are not happy? They go to church—so why has (an omnipotent and benevolent) God discriminated against them? By the same token, why are any non-churchgoers happy? Twenty-six percent say they are, yet seldom or never go to church. How do we know that these people weren’t happy or unhappy to begin with, irrespective of their churchgoing behavior? And in any case, these figures show that, even among the churchgoers, the unhappy outweigh the happy by a significant majority. What, we may ask, is God playing at?
Still more to the point, and revealingly, these are arguments for the psychological benefits of faith, not for theological ones. One could argue—theologians in the past have argued—that happiness is not the aim for religious people, certainly not for pious Christians, the crux of their belief system being that they can hope for salvation only in the next life. There is thus something in this whole exercise, of trying to prove the benefits of faith at every level, that smacks of . . . well, shaping the evidence to fit the conclusion that was wanted in the first place. Jonathan Haidt in The Righteous Mind argues further that “human flourishing requires social order and embeddedness,” which is best obtained by religion, which is the “handmaiden of groupishness, tribalism and nationalism.” But he also adds that research shows that religious people are better neighbors and citizens not because they pray or read the scriptures or believe in hell (“These beliefs and practices turned out to matter very little”) but because they were “enmeshed” with others of similar religion. Here, too, religion is conceived of as a psychological phenomenon, not a theological one.
The psychological evidence, however, is really overwhelmed by the much wider picture as described by Norris and Inglehart’s sociology. Their conclusion is worth giving in full:
“The critique [of secularization theory] relies too heavily on selected anomalies and focuses too heavily on the United States (which happens to be a striking deviant case) rather than comparing systematic evidence across a broad range of rich and poor societies. . . . Philosophers and theologians have sought to probe into the meaning and purpose of life since the dawn of history; but for the great majority of the population, who lived at the margin of subsistence, the need for reassurance and a sense of certainty was the main function of religion.”19
Point one in the argument of this book, then, is that although for some people in the early twenty-first century “God is back!” the actual situation is rather more complex and considerably more fraught than that simple statement suggests. Contrary to what many religious people would like to believe, atheism is not in retreat either, at least not in the developed world.
At the same time, for many people, Charles Taylor had a point when in his 2007 book A Secular Age he wrote that modernity involves in some sense a “subtraction story,” a loss or narrowing of experience, a “disenchantment” with the world that “leaves us with a universe that is dull, routine, flat, driven by rules rather than thoughts, a process that culminates in bureaucracy run by ‘specialists without spirit, hedonists without heart,’” that atheists lead impoverished lives that are somehow less “full” than the lives of believers, that atheists “yearn” for something more than can be provided by the self-sufficient power of reason, and that they are blind and deaf to the miraculous moments when “God breaks in,” in the works of Dante or Bach, or Chartres Cathedral, say.20
Many atheists would dismiss Taylor out of hand, but he is not entirely alone in this, either. Here is another raft of books published since the millennium: Luc Ferry, Man Made God: The Meaning of Life (2002); John Cottingham, On the Meaning of Life (2003); Julian Baggini, What’s It All About? Philosophy and the Meaning of Life (2004); Richard Holloway, Looking in the Distance: The Human Search for Meaning (2004); Roy F. Baumeister, The Cultural Animal: Human Nature, Meaning and Social Life (2005); John F. Haught, Is Nature Enough? Meaning and Truth in the Age of Science (2006); Terry Eagleton, The Meaning of Life (2007); Owen J. Flanagan, The Really Hard Problem: Meaning in a Material World (2007); Claire Colebrook, Deleuze and the Meaning of Life (2010).
Now, at one stage such a phrase as “the meaning of life” could have been used only in an ironical or jokey way. Its serious use would have been seen as embarrassing. The 1983 Monty Python film The Meaning of Life had several answers, including “be kind to fish,” “wear more hats” and “avoid eating fat.” But “the meaning of life” is no longer an embarrassing subject, it would seem, in the twenty-first century.
Why should that be? Could it be that Taylor has at least part of a point, in that many of the ways of thought conceived over the past 130 years have proved not to have all the answers? Certainly, many ideologies and “isms” of the modern world have either collapsed or become dead ends: imperialism, nationalism, socialism, Marxism, communism, Stalinism, fascism, Maoism, materialism, behaviorism, apartheid. Most recently, with the “credit crunch” of 2008 and its turbulent wake, even capitalism has come under the spotlight.
“THE THINGS WE HAVE ARE DEVALUED BY THE THINGS WE WANT NEXT”
The impact of the credit crunch was much more than economic. Writing in the [London] Times, the author Jeanette Winterson argued that the “so-called civilized West, at its most materialistic, has failed to deliver the goods . . . we are in a terrible mess”; “the way out” is through art, she concluded. In a later article in the same newspaper, she wrote: “We have created a society without values, that believes in nothing.” Other aspects of the crisis were highlighted, again in the Times, which reported that a survey of Faithbook—a new multi-faith page on Facebook—showed that 71 percent of those surveyed thought that we are today in a “spiritual recession” and that that is more worrying than the material recession. (Another survey showed there had been a 27 percent increase in praying since the credit crunch began, yet more evidence of religious behavior having to do with existential insecurity.) In November 2008 it was reported that in Britain more people believed in aliens and ghosts than believed in God: of the three thousand surveyed (not a small sample), 58 percent believed in supernatural entities against 54 percent who believed in God.I The subscribers to Faithbook hold that “any faith is better than none.”
Despite some of the cathedrals of capitalism having gone under, or been rescued by nationalization or government bailouts, capitalism hasn’t yet, in 2014, collapsed. It certainly got a fright, and is still in intensive care, but its obituary hasn’t yet been published. More to the point, all this has provoked, and will continue to provoke, a change of attitude, or perspective: we now appear to be entering a more serious, more reflective time when, as a result of the financial collapse, people are seriously reassessing the values and ideas by which we live. Nigel Biggar, regius professor of moral and pastoral theology at Oxford, told the Financial Times that, having taught many students who went into the City or big law firms, he has observed a recent change. “I kept in touch with some of them. When they were young, the 24/7 life was stimulating. It became a burden later on when they had a family, but then they were trapped by wealth. I see a move away from that now: more interest in teaching and other forms of public service.”21
Several things are conflated here. Religious belief and unbelief are two of them. The failure of science to engage the enthusiasm of many is another. And the psychological dimension is yet another, in which the chief objects of attention have been happiness and loneliness, different sides of the same coin when it comes to fulfillment.
A survey published in Britain in 2008 showed that people across the country were “increasingly lonely,” and that the predicament had been accelerating in the previous decade. The increase in loneliness had started, the survey reported, in the late 1960s, when neighborhoods had been progressively weakened by increased rates of divorce, immigration, the need to move house for job-related reasons and the growth of transitory student populations (the number of British universities has grown from twenty-three in 1963 to more than a hundred). Thomas Dumm’s Loneliness as a Way of Life (2008) characterizes America as the archetypal lonely society of the future, typified by a “possessive individualism” in which “personal choice” is a cloak rather than an opportunity.22
Happiness, touched on a few pages back, has received, perhaps inevitably, even more attention. Confining ourselves only to twenty-first-century sources, there has been a wave of books exploring happiness—how to achieve it, its links to the latest brain science, what gets in the way of it, how it varies around the world, why women are (in general) less happy than men.
One well-publicized finding is that although the developed Western nations have become better off in a financial and material sense, they are not any happier than they were decades ago. In fact, in The Age of Absurdity: Why Modern Life Makes It Hard to Be Happy (2010), Michael Foley argues that modern life has made things worse, “deepening our cravings and at the same time heightening our delusions of importance as individuals. Not only are we rabid in our unsustainable demands for gourmet living, eternal youth, fame and a hundred varieties of sex, we have been encouraged—by a post-1970s ‘rights’ culture that has created a zero-tolerance sensitivity to any perceived inequality, slight or grievance—into believing that to want something is to deserve it.”23 Moreover, “the things we have are devalued by the things we want next”—another consequence of capitalism.
On the other hand, the latest World Values Survey, published in August 2008, found that over the past twenty-five years, in forty-five out of fifty-two countries where polling took place, happiness had risen. But the research also showed that economic growth boosts happiness noticeably only in countries with per capita GDP of less than $12,000. Happiness had fallen in India, China, Australia, Belarus, Hungary, Chile, Switzerland (Switzerland!) and Serbia. Happiness appeared more related to democratization, greater variety and opportunities in the workplace, access to travel and the opportunity to express oneself. Other research showed that individualistic nations, especially in the West, “were particularly susceptible to negative emotions,” whereas Asian or Latin American countries were less so “because they consider their individual feelings less important than the collective good.”24
Let us be honest. These are all fascinating findings, and many of them are salutary and worrying in equal measure. But they are also contradictory and paradoxical. In America it is the churchgoers who are happiest, but worldwide it is those who are existentially insecure (and therefore extremely unlikely to be happy) who most attend church; religion is associated in America with less criminality, but worldwide with more; in America attendance at church boosts income, but worldwide a rise in income fails to increase happiness and it is the poorest who most attend church. Peter Berger says we are as furiously religious as ever but the members of Faithbook think we are in a spiritual recession; Peter Berger says it is the absence of transcendence that people miss but the World Values Survey shows that it is instead the absence of bread, water, decent medication and jobs that people miss, and which leads them to religion.
Despite the contradictions in these findings, amid the atavistic, violent and absurdly incoherent nature of many recent religious manifestations, and although the sociological explanations for both religious and non-religious orientations seem—rationally and convincingly—to outweigh theological ones, it is clear that many religious souls refuse to accept such a state of affairs.
Charles Taylor and the other authors referred to above lead the way in arguing that atheists suffer impoverished lives. But the Norris-Inglehart survey indicates that once existential insecurity is relieved, faith disappears. This sociological transformation is still occurring—it is even beginning to occur in the United States. A Pew Research Center poll published in 2012 reveals that the number in the United States with no religious affiliation has risen from 16 percent in 2008 to 20 percent four years later. Church attendance has dropped from around 40 percent in 1965 to under 30 percent now.25
• • •
One book cannot hope to have much of an impact when set against the absurd, tragic and horrific dimensions of recent religious history, but this one at least aims to offer something that hasn’t, to my knowledge, been done before. It aims to be an extensive survey of the work of those talented people—artists, novelists, dramatists, poets, scientists, psychologists, philosophers—who have embraced atheism, the death of God, and have sought other ways to live, who have discovered or fashioned other forms of meaning in the world, other ways to overcome the great “subtraction,” the dreadful impoverishment that so many appear to think is the inevitable consequence of losing the idea of supernatural transcendence.
I hope to show that such an eventuality is far from inevitable. In fact, when you look at our recent history you encounter quite a lot of surprises in the works of luminaries you thought you knew; you make some unusual (and revealing) juxtapositions; and you discover that the search for other ways to live has been one of the core components—part of the DNA, to use a modern metaphor—of modern culture. You also realize that, far from atheists leading less than full lives, neither God nor the Devil has all the best tunes.
One more point, but an important one. Is Nietzsche to blame for our current predicament? Why is it that his intervention has caught our attention above all others? And what does that tell us?
THE PHENOMENON THAT WAS NIETZSCHE
Toward the end of March 1883, Friedrich Nietzsche, then aged thirty-eight and staying in Genoa, was far from well. He had recently returned from Switzerland to his old lodgings on the Salita delle Battistine but this brought no immediate relief from his migraines, stomach troubles and insomnia. Already upset (but also relieved) by the death the previous month of his erstwhile great friend the composer Richard Wagner, with whom he had fallen out, he came down with a severe attack of influenza for which the Genoese doctor prescribed daily doses of quinine. Unusually, a heavy snowfall had blanketed the city, accompanied by “incongruous thunderclaps and flashes of lightning,” and this too seems to have affected his mood and hindered his recovery. Unable to take the stimulating walks that were part of his routine and helped his thinking, by the 22nd of the month, he was still listless and bedridden.26
What added to his “black melancholy,” as he put it, was that it was four weeks since he had sent his latest manuscript to his publisher, Ernst Schmeitzner, in Chemnitz, who seemed in no hurry to bring out this new book, entitled Thus Spake Zarathustra. He sent Schmeitzner a furious letter of reproach, which brought an apologetic reply, but a month later Nietzsche learned the real reason for the delay. As he said in a letter: “The Leipzig printer, Teubner, has shoved the Zarathustra manuscript aside in order to meet a rush order for 500,000 hymnals, which had to be delivered in time for Easter.” This rich irony was not lost on Nietzsche, of course. “The realization that his fearless Zarathustra, the ‘madman’ who had the nerve to proclaim to the somnambulists around him that ‘God is dead!’ should have been momentarily smothered beneath the collective weight of 500,000 Christian hymnbooks struck Nietzsche as downright ‘comic.’”27
The response of the first readers of the work was mixed. Heinrich Köselitz, Nietzsche’s friend, who by long tradition was sent the proofs to read and correct, was rapturous, and he expressed the hope that “this extraordinary book” would one day be as widely distributed as the Bible. Very different was the reaction of the typesetters in Leipzig, who were so frightened by what they read that they considered refusing to produce the book.
The world has never forgotten—and some have never forgiven—Nietzsche for saying “God is dead,” and then going on to add that “we have killed him.” He had actually said it before, in The Gay Science, published the previous year, but the pithy style of Zarathustra attracted much more attention.
What is it with Nietzsche? Why is it his phrase above all others that has been remembered and has stuck? After all, belief in God had been declining for some time. For some, perhaps even many, belief in God—or gods, supernatural entities of any kind—had never seemed right. In most histories of unbelief, or doubt, the account begins in the eighteenth century with Edward Gibbon and David Hume, moving through Voltaire and the French Revolution, taking in Kant, Hegel and the Romantics, German biblical criticism, Auguste Comte and the “positivist breakthrough.” In the mid–nineteenth century came Ludwig Feuerbach and Karl Marx, Søren Kierkegaard, Arthur Schopenhauer, and the ravages of geological and biological science brought about by Charles Lyell, Robert Owen, Robert Chambers, Herbert Spencer and, above all, Charles Darwin.
These accounts, as often as not, add for good measure stories of celebrated individuals who lost their faith—George Eliot, Leslie Stephen, Edmund Gosse. And those who didn’t, but who heard the signals, among them Matthew Arnold, who, in the decade following Darwin’s Origin, lamented in his poem “Dover Beach” “the melancholy, long, withdrawing roar” of the sea of faith. Other accounts stress the sheer antiquity of unbelief, and here the cast includes Epicurus and Lucretius, Socrates and Cicero, Al-Rawandi and Rabelais. Here is not the place to rehearse these narratives. Our concern will be with the timing and the circumstances which culminated in Nietzsche’s notably bold proclamation (albeit, we should always remember, one made by a madman).
THE WHIFF OF DANGER AND THE CARGO OF LIFE
One of those circumstances was Nietzsche himself. He was a thoroughly unusual man—quixotic, contradictory, a young meteor who shone with an incandescent writing style but who burned out quickly and went mad at the age of forty-five. His aphoristic style lent itself to easy assimilation, by the public as well as by other philosophers, and was designed to be provocative and incendiary, succeeding only too well, as the reservations of those typesetters in Leipzig show. His madness, too, added a colorful salting to his biography, and to the biography of his ideas after his death in 1900. Were his extreme views “the uninterrupted consequence of his reason,” or were they flavored (distorted?) by his illness, an affliction that has grown more—not less—notorious since his death, as it has become clear that he was suffering from syphilis?
The uses to which his ideas have been put, or are said to have been put, since his death, are also a source of continuing notoriety. Nietzsche’s concept of nihilism caught the imagination of the world, one of its consequences being that he is the only person whose ideas, as Steven Aschheim points out, have been blamed for two world wars. This is a burdensome—and enduring—legacy.
His core insight—and the most dangerous—was that there does not exist any perspective external to or higher than life itself. There cannot exist any privileged viewpoint, any abstraction or force outside the world as we know it; there is nothing beyond reality, beyond life itself, nothing “above”; there is no transcendence, nothing metaphysical. As a result, we can make no judgment on existence that is universally valid or “objective”: “the value of life cannot be assessed.” As Nietzsche famously insisted, “There are no facts, only interpretations.”28
From this, certain things follow. We are solely the product of historical forces. Contrary to what the scientists say, the world is a chaos of multiple forces and drives “whose infinite and chaotic multiplicity cannot be reduced to unity.”29 We must learn to situate ourselves in this multiplicity and chaos and the way we do so is via the “will to power,” by which we seek to gain control over inanimate nature. Our history, especially that of the great religions, Christianity in particular, has given us a “hidden prejudice” in favor of the “beyond” at the expense of the “here and now,” and this must be changed. This very likely means that much of our activity will be in refuting what has gone before, a task made no easier by the competing forces within us, a jostling, which is our natural state and requires us to be spirited in making sense out of this jostling.30
Importantly, Nietzsche tells us that this struggle to achieve mastery over the chaos that is both outside and inside us—the “cargo of life”—leads to a more intense form of existence, and it is the only aim we can have in life, in this life here and now. Our ethical stance should be to achieve this intensity at whatever cost: our only duty is to ourselves.31
The role of reason in our lives is to enable us to realize that many of our urges are irrational, and no less powerful or valuable for that: we must harness them, and unlock them intelligently, so that they do not continue to thwart one another. This rationalization of the passions in our life he defines as the spiritual quality of existence. We should seek harmony, but we should recognize that some passions are not what the traditional religions have approved of. For example, enmity is one of the passions; it should be accepted and lived with as much as any of the others.32
All this naturally affected Nietzsche’s idea of salvation. Salvation, he holds, cannot apply to some ideal “beyond” the here and now. “God becomes the formula for every slander upon the ‘here and now,’ and for every lie about the ‘hereafter.’” And he goes so far as to propose putting what he called the “doctrine of eternal recurrence” in the place of “metaphysics” and “religion.” This was his idea that salvation cannot be other than resolutely earthly, “sewn into the tissue of forces that are the fabric of life.” The doctrine of eternal recurrence holds that you must live your life in such a way that you would wish to live it again. “All joy wants eternity,” he says; this is the criterion for deciding which moments in a life are worth living and which are not. “The good life is that which succeeds in existing for the moment, without reference to past or future, without condemnation or selection, in a state of absolute lightness, and in the finished conviction that there is no difference therefore between the instant and eternity.”
We must make a “Dionysiac affirmation” and “stand in a Dionysian relationship to existence.”