How the Internet Is Taking Away America’s Religion

Back in 1990, about 8 percent of the U.S. population had no religious preference. By 2010, this percentage had more than doubled to 18 percent. That’s a difference of about 25 million people, all of whom have somehow lost their religion.
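The arithmetic behind that figure is easy to check. A minimal sketch, assuming a base population of roughly 250 million (an illustrative round number; the article does not say what population base it used):

```python
# Back-of-envelope check of the "25 million people" figure.
# The population value is an assumption for illustration only.
population = 250_000_000  # assumed rough U.S. population over the period

unaffiliated_1990 = 0.08 * population  # 8 percent in 1990
unaffiliated_2010 = 0.18 * population  # 18 percent in 2010

difference = unaffiliated_2010 - unaffiliated_1990
print(f"{difference / 1e6:.0f} million")  # → 25 million
```

Ten percentage points of a quarter-billion people is where the 25 million comes from.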

That raises an obvious question: how come? Why are Americans losing their faith?

Today, we get a possible answer thanks to the work of Allen Downey, a computer scientist at the Olin College of Engineering in Massachusetts, who has analyzed the data in detail. He says the decline is the result of several factors, the most controversial of which is the rise of the Internet. He concludes that the increase in Internet use in the last two decades has caused a significant drop in religious affiliation.

Downey’s data comes from the General Social Survey, a widely respected sociological survey carried out by the University of Chicago, which has regularly measured people’s attitudes and demographics since 1972.

In that time, the General Social Survey has asked people questions such as: “what is your religious preference?” and “in what religion were you raised?” It also collects data on each respondent’s age, level of education, socioeconomic group, and so on. And in the Internet era, it has asked how long each person spends online. The total data set that Downey used consists of responses from almost 9,000 people.

Downey’s approach is to determine how the drop in religious affiliation correlates with other elements of the survey such as religious upbringing, socioeconomic status, education, and so on.

He finds that the biggest influence on religious affiliation is religious upbringing—people who are brought up in a religion are more likely to be affiliated to that religion later.

However, the number of people with a religious upbringing has dropped since 1990. It’s easy to imagine how this inevitably leads to a fall in the number who are religious later in life, and Downey’s analysis shows that this is indeed an important factor. But it cannot account for anywhere near all of the fall: the data indicates that it explains only about 25 percent of the drop.

He goes on to show that college-level education also correlates with the drop. Once again, it’s easy to imagine how contact with a wider group of people at college might contribute to a loss of religion.

Between the 1980s and the 2000s, the fraction of people receiving a college-level education increased from 17.4 percent to 27.2 percent. So it’s not surprising that this is reflected in the drop in numbers claiming religious affiliation today. But although the correlation is statistically significant, it can only account for about 5 percent of the drop, so some other factor must also be involved.

That’s where the Internet comes in. In the 1980s, Internet use was essentially zero, but in 2010, 53 percent of the population spent two hours per week online and 25 percent surfed for more than 7 hours.

This increase closely matches the decrease in religious affiliation. In fact, Downey calculates that it can account for about 25 percent of the drop.

That’s a fascinating result. It implies that since 1990, the increase in Internet use has had as powerful an influence on religious affiliation as the drop in religious upbringing.

At this point, it’s worth spending a little time talking about the nature of these conclusions. What Downey has found is correlations and any statistician will tell you that correlations do not imply causation. If A is correlated with B, there can be several possible explanations. A might cause B, B might cause A, or some other factor might cause both A and B.

But that does not mean that it is impossible to draw conclusions from correlations, only that they must be properly guarded. “Correlation does provide evidence in favor of causation, especially when we can eliminate alternative explanations or have reason to believe that they are less likely,” says Downey.

For example, it’s easy to imagine that a religious upbringing causes religious affiliation later in life. However, it’s impossible for the correlation to work the other way round. Religious affiliation later in life cannot cause a religious upbringing (although it may color a person’s view of their upbringing).

It’s also straightforward to imagine how spending time on the Internet can lead to religious disaffiliation. “For people living in homogeneous communities, the Internet provides opportunities to find information about people of other religions (and none), and to interact with them personally,” says Downey. “Conversely, it is harder (but not impossible) to imagine plausible reasons why disaffiliation might cause increased Internet use.”

There is another possibility, of course: that a third unidentified factor causes both increased Internet use and religious disaffiliation. But Downey discounts this possibility. “We have controlled for most of the obvious candidates, including income, education, socioeconomic status, and rural/urban environments,” he says.
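What “controlling for” a candidate confounder means can be sketched with synthetic data. Everything below (the variable names, the effect sizes, the partial-correlation shortcut) is illustrative and is not taken from Downey’s actual analysis, which works directly on the GSS responses:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 9_000  # roughly the size of the sample Downey used

# Illustrative scenario: a hidden factor (call it "education") that
# drives both Internet use and disaffiliation, with no direct link
# between the latter two.
education = rng.normal(size=n)
internet_use = 0.5 * education + rng.normal(size=n)
disaffiliation = 0.5 * education + rng.normal(size=n)

def partial_corr(x, y, z):
    """Correlation of x and y after regressing z out of both."""
    x_res = x - np.polyval(np.polyfit(z, x, 1), z)
    y_res = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(x_res, y_res)[0, 1]

raw = np.corrcoef(internet_use, disaffiliation)[0, 1]
controlled = partial_corr(internet_use, disaffiliation, education)
print(f"raw r = {raw:.2f}, controlling for education r = {controlled:.2f}")
```

In this toy setup the raw correlation is substantial but vanishes once the confounder is regressed out. Downey’s point is the converse: the Internet-use correlation survives controls for the obvious candidates, which is what makes a causal reading plausible.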

If this third factor exists, it must have specific characteristics. It would have to be something new that was increasing in prevalence during the 1990s and 2000s, just like the Internet. “It is hard to imagine what that factor might be,” says Downey.

That leaves him in little doubt that his conclusion is reasonable. “Internet use decreases the chance of religious affiliation,” he says.

But there is something else going on here too. Downey has found three factors—the drop in religious upbringing, the increase in college-level education and the increase in Internet use—that together explain about 50 percent of the drop in religious affiliation.

But what of the other 50 percent? In the data, the only factor that correlates with this is date of birth—people born later are less likely to have a religious affiliation. But as Downey points out, year of birth cannot be a causal factor. “So about half of the observed change remains unexplained,” he says.

So that leaves us with a mystery. The drop in religious upbringing and the increase in Internet use seem to be causing people to lose their faith. But something else about modern life that is not captured in this data is having an even bigger impact.

What can that be? Answers please in the comments section.

Ref: Religious Affiliation, Education and Internet Use

Why Atheists Like Dawkins and Hitchens Are Dead Wrong

Acolytes of Dawkins & Hitchens pretend that ignorant evangelicals represent all of religion. Here’s what they miss.


I’m supposed to hate science. Or so I’m told.

I spent my childhood with my nose firmly placed between the pages of books on reptiles, dinosaurs, marine life and mammals. When I wasn’t busy wondering if I wanted to be more like Barbara Walters or Nancy Drew, I was busy digging holes in my parents’ backyard hoping to find lost bones of some great prehistoric mystery. I spent hours sifting through rocks that could possibly connect me to the past or, maybe, a hidden crystalline adventure inside. Potatoes were both a part of a delicious dinner and batteries for those ‘I got this’ moments; magnets repelling one another were a sorcery I needed to, somehow, defeat. The greatest teachers I ever had were Miss Frizzle and Bill Nye the Science Guy.

I also spent my childhood reciting verses from the Qur’an and a long prayer for everyone — in my family and the world — every night before going to bed. I spoke to my late grandfather, asking him to save me a spot in heaven. I went to the mosque and stepped on the shoes resting outside a prayer hall filled with worshippers. I tried fasting so I could be cool like my parents; played with prayer beads and always begged my mother to tell me more stories from the lives of the Abrahamic prophets.

With age, my wonder with religion and science did not cease. Both were, to me, extraordinary portals into the life around me that left me constantly bewildered, breathless and amazed.

Science would come to dominate my adolescent and early teenage years: papier mache cigarettes highlighting the most dangerous carcinogens, science fair projects on the virtues of chocolate consumption during menstruation; lamb lung and eye dissections, color coded notes, litmus tests on pretty papers, and disturbingly thorough study guides for five-question quizzes. My faith, too, remained operational in my day-to-day life: longer conversations with my late grandfather and all 30 Ramadan fasts, albeit with begrudging pre-dawn prayers. I attended Qur’anic recitation classes where I could not, for the life of me, recite anything that was not in English. I still read and listened to the stories of the prophets, with perhaps a greater sense of historical wonder and on occasion I would perform some of the daily prayers. Unsupervised access to the internet also led to the inevitable debates in Yahoo chat rooms about how Islam did not subjugate me as a woman. At the age of 16, I was busting out Quranic verses and references from the traditions of the Prophet Muhammad to shut up internet dwellers like Crusade563 and PopSmurf1967.

It never once occurred to me during those years, and later, that there could be any sort of a conflict between my faith and science; to me both were part of the same things: This universe and my existence within it.

And yet, here we are today being told that the two are irreconcilable; that religion begets an anti-science crusade and science pushes anti-religion valor. When did this become the only conversation on religion and science that we’re allowed to have?

This current discourse that pits faith and science against one another like Nero’s lions versus Christians — inappropriate analogy intended — borrows directly from the conflation of all religious traditions with the history and experience of Euro-American Christianity, specifically of the evangelical variety.

In my own religious tradition, Islam, there is a vibrant history of religion and science not just co-existing but informing one another intimately. Astrophysicists, chemists, biologists, alchemists, surgeons, psychologists, geographers, logicians and mathematicians – amongst so many others – would often function as theologians, saints, spiritual masters, jurists and poets as much as they would as scientists. Indeed, a quick survey of some of the most well known Muslim intellectuals of the past 1,400 years illustrates their masterful polymathy, their ability to reach across fields of expertise without blinking at any supposed “dissonance.” And, of course, this is not something exclusive to Islam; across the religious terrain we can find countless polymaths who delved into the worlds of God and science.

Despite the history of the intellectual output of, well, the whole rest of the world, contemporary discussions in this country on the relationship between science and religion take religion to consist solely, again, of Euro-American Evangelical Christianity.  Thus “religious perspectives on human origins” are not really all that encompassing. Muslims, for instance, do not believe in Christian creationism and, actually, have differences on the nature of human origin. The Muslim creationism movement, headed by Turkish author and creationist activist Adnan Oktar (known popularly by the pseudonym Harun Yahya), is actually relatively recent and borrows much from Christian creationism – including even directly copied passages and arguments from anti-evolution Christian literature.

The absence of a centralized religious clergy and authority in Sunni Islam allows for individual and scholarly theological negotiation – meaning that there is not, necessarily, a “right” answer embedded in Divine Truth to social and political questions. Some of the most influential and fundamental Islamic legal texts are filled with arguments and counter-arguments which all come from the same source (divine revelation), just different approaches to it.

In other words: There’s plenty of wiggle room and then some. On anything that is not established as theological Truth (e.g. God’s existence, the finality of Prophethood, pillars and articles of faith), there is ample room for examination, debate and disagreement, because it does not undercut the fabric of faith itself.

Muslims, generally, accept evolution as a fundamental part of the natural process; they differ, however, on human evolution – specifically the idea that humans and apes share an ancestor in common.  In the 13th century, Shi’i Persian polymath Nasir al-din al-Tusi discussed biological evolution in his book “Akhlaq-i-Nasri” (Nasirean Ethics). While al-Tusi’s theory of evolution differs from the one put forward by Charles Darwin 600 years later and the theory of evolution that we have today, he argued that the elemental source of all living things was one. From this single elemental source came four attributes of nature: water, air, soil and fire – all of which would evolve into different living species through hereditary variability. Hierarchy would emerge through differences in learning how to adapt and survive. Al-Tusi’s discussion on biological evolution and the relationship of synchronicity between animate and inanimate (how they emerge from the same source and work in tandem with one another) objects is stunning in its observational precision as well as its fusion with theistic considerations. Yet it is, at best, unacknowledged today in the Euro-centric conversation on religion and science. Why?

My point here in this conversation about religion and science’s falsely created incommensurability isn’t about the existence of God – I would like to think that ultimately there is space for belief and disbelief. I would like to also believe, however, that the conversation on belief and disbelief can move beyond the Dawkinsean vitriol that disguises bigotry as a self-righteous claim to the sanctity of science; a claim that makes science the proudly held property of the Euro-American civilization and experience.

Hoisted into popular culture by the Holy Trinity of Dawkins-Hitchens-Harris, New Atheism mirrors the very religious zealotry it claims is at the root of so much moral, political and social decay. In particular, these authors and their posse of followers have – as Nathan Lean characterized it in this publication back in March of last year – taken a particular penchant for “flirting with Islamophobia.” Instead of engaging with Islamic theology, New Atheists – the most prominent figurehead being Richard Dawkins – are more interested in ridiculing Muslims and Islam by employing the same tired, racist talking points and images that situate Muslims as in need of ‘enlightenment’ – or, salvation.

The Evangelical Christian Right is a formidable force to be reckoned with in American national politics; there are legitimate fears among believing, non-believing and non-caring Americans that the course of the nation, from women’s rights to education, can and will be significantly set back because of the whims of a loud and large group of citizens who refuse to acknowledge certain facts and changing realities and want the lives of all citizens to be subservient to their own will. This segment of the world’s religious topography, however, does not represent Religion or, in particular, Religion’s relationship with science.

Religion is a vast historical experience between human communities, its individual parts, the environment and something Sacred that acts as that elemental glue between everything. Science and religion are not incommensurable – and it’s time we stop treating them like they are.


Sana Saeed is a writer on politics with an interest in minority politics, media critique and religion in the public sphere. Follow her on Twitter: @SanaSaeed.

A Hard Rain: Noah, Revised


More Tolkien than Torah, Darren Aronofsky’s “Noah” is a cinematic tour de force that combines breathtaking CGI-based imaginary landscapes with a film score by Clint Mansell that hearkens back to Hollywood’s golden age of Bernard Herrmann and Max Steiner. Even without a single minute of dialog, the film would achieve the mesmerizing quality of Godfrey Reggio’s Qatsi trilogy, especially the last installment Naqoyqatsi, the Hopi word for “Life at War”.

Like other films that view the bible as a theme to riff on in the manner of Miles Davis improvising on a banal tune like “Billy Boy”, Aronofsky takes the material of Genesis 5:32-10:1 and shapes it according to his own aesthetic and philosophical prerogatives. As might be expected, the Christian fundamentalists are not happy with the film since it turns Noah into something of a serial killer on an unprecedented scale, acting on what he conceives of as “the Creator’s” instructions, namely to bring the human race to an end. Religious Jews who have a literalist interpretation of the bible have been far less vocal, no doubt a function of the Hasidic sects viewing all movies as diversions from Torah studies. (For those unfamiliar with Jewish dogma, the Torah encompasses the first five books of the Old Testament, which are replete with fables such as the Great Flood, many of which have inspired some classic cinematography, such as Charlton Heston splitting the Red Sea.)

Unlike the fable it is based on, Aronofsky’s Noah never received instructions about being fruitful and multiplying. His intention is to leave the planet to the animals and wind down the human race’s participation in the tree of life, to use the title of Terrence Malick’s overrated 2011 film. In my view, Aronofsky has much deeper thoughts and more sure-handed cinematic instincts than Malick could ever hope for. To pick only one scene, the massive moving carpet of animals headed toward the Ark is a CGI tour de force. Instead of a stately procession in circus parade fashion, it is more like a zoological tsunami that anticipates the great tsunami soon to follow.

Clint Mansell, whose orchestral accompaniment to this and other key scenes is so effective, has an interesting background. He was the lead singer and guitarist for the band Pop Will Eat Itself, a group that originated in 1981 and whose style incorporated hip-hop and industrial rock at one point or another. Mansell made the transition to film score composer in 1998, working on Aronofsky’s first film “Pi”, a surrealist thriller about a character named Maximillian Cohen who believed that everything in nature could be understood through numbers.

Speaking of numbers, Russell Crowe was cast perfectly as Noah given his past leading roles. As John Nash, the schizophrenic mathematician of A Beautiful Mind, he played a man hearing voices after the fashion of Noah. The voices in Nash’s head told him that he had to save the world from the Commies, while those in Noah’s assured him that “the Creator” needed to kill everybody on earth except Noah and his immediate family. Which character was more insane? That’s the real question.

Another role that prepared Crowe for his latest was as Captain of the HMS Surprise, a British warship led on an Ahab-like pursuit of a French rival during the Napoleonic wars. As Captain Jack Aubrey, Crowe was ready to sacrifice his crew and himself for the greater glory of the British monarchy just as Noah was ready to do for “the Creator”, an entity that never makes much of an appearance in Aronofsky’s film, unlike the typical Biblical epic.

One of the two revisionist elements of Aronofsky’s film that have merited the most controversy is his inclusion of a character named Tubal-Cain, who is a descendant of Adam’s bad son just as Noah is a descendant of the good son Seth. Played by Ray Winstone, Tubal-Cain is the warlord ruling over all those wicked people the Creator is bent on destroying, just like an artist who burns a painting from earlier in his career that he deems inferior to his latest. Unlike a movie based on the tale of “Sodom and Gomorrah”, it is not quite clear what enraged God. After all, there are no sadomasochistic orgies going on in Tubal-Cain’s camp as he lays siege to Noah’s Ark (not that there is really anything wrong with sadomasochistic orgies). All we know from the Torah is that “The Lord saw how great the wickedness of the human race had become on the earth.” If you read the bible carefully, you’ll understand that the deity gets much more pissed off at the worship of false idols than he does over murder, theft, rape, and other acts normal people consider far more wicked. Indeed, Tubal-Cain is convinced that Noah is a madman, since his fundamentally “deep ecology” views on the need to rid the planet of the pestilent homo sapiens are at odds with God making man in his own image and giving him “dominion over the fish of the sea and over the birds of the heavens and over the livestock and over all the earth and over every creeping thing that creeps on the earth.” What’s wrong with that? Animal rights lovers and vegetarians need not apply.

The other element is “the watchers”, who are Ent-like creatures that help Noah and his family ward off Tubal-Cain’s warriors while serving as carpenters on the Ark. Instead of being tree-like monsters, they are giants made of stone who happen to be “fallen angels” trying to get on the Creator’s good side after their past transgressions. Unlike the characters in John Milton’s “Paradise Lost”, these angels seem perfectly reasonable and no threat to the established order. As is the case throughout the film and the Old Testament itself, the Creator’s moral compass often seems more broken than those he holds dominion over.

That fundamentally strikes me as the underlying philosophical issue of Aronofsky’s film, namely the impossibility of living a “good life” on the basis of biblical myths, legends, and fables. The moral relativism of “Noah” was likely to have angered those who believe that the bible was literally written by God, even if it was close to the mark.

The film also resonates with current-day concerns over a new threat to the continued existence of humanity, namely the climate change that is capable of a new Great Flood that will unfortunately only kill the innocent rather than the wicked. What the bible never makes clear is that god is merciful to those who have capital rather than pure hearts.

Unlike the past five extinctions, the sixth, posed by climate change and other looming environmental disasters, will come as a result of human intervention rather than a deus ex machina like a meteor.

Interestingly enough, there is some scholarly support for the idea that a great flood occurred in the distant past, one that is evoked not only in the Torah but in the Babylonian Epic of Gilgamesh and Plato’s Timaeus as well.

In an article titled “Noah’s Flood Reconsidered” for the autumn 1964 issue of Iraq, a scholarly journal, M.E.L. Mallowan concluded that the flood depicted in the Epic of Gilgamesh—the obvious inspiration for Noah—occurred some time prior to 2650 BC.

Indeed, archaeologists working in the ancient city of Ur in 1928-29 found evidence of two deep pits that exposed a stratum of “clean water-laid clay”, proving in their eyes that a Noachian-type flood had occurred. However, neither the Epic of Gilgamesh nor the archaeologists viewed the flood as impacting all of humanity, only a great city and civilization that existed at the dawn of history. Despite Iraq’s reputation as desert-like, it is also subject to powerful storms that wash away everything in their path—a natural catastrophe rivaling the man-made catastrophe of George W. Bush.

It has been many years since I looked at Plato’s Timaeus—48 in fact, when I was avoiding the draft in the New School Graduate Philosophy program—but I took a quick look in preparing this article.

Like the rest of his work, this is a Socratic dialog in which the principals are sounding boards for Plato’s idealism. One of them, an Athenian named Timaeus, describes a Creator who is a lot more human than the cruel and capricious figure of the Old Testament: “Why did the Creator make the world?…He was good, and therefore not jealous, and being free from jealousy he desired that all things should be like himself.” And, like the hero of Darren Aronofsky’s “Pi”, Plato’s creator sees the natural world as one based on numbers. After creating three major entities of the existing world—body, soul, and essence—god proceeded to divide the entire mass into portions related to one another in the ratios of 1, 2, 3, 4, 9, 8, and 27.

Once Timaeus establishes the ratios that govern the known universe, he drills down into the less-than-perfect realities that govern our daily lives, such as those inflicted on our bodies: “When on the other hand the body, though wasted, still holds out, then the bile is expelled, like an exile from a factious state, causing associating diarrhoeas and dysenteries and similar disorders.”

Critias, another Athenian, weighs in on the ever-present danger of natural catastrophes including the one that befell Atlantis:

Now in this island of Atlantis there was a great and wonderful empire which had rule over the whole island and several others, and over parts of the continent, and, furthermore, the men of Atlantis had subjected the parts of Libya within the columns of Heracles as far as Egypt, and of Europe as far as Tyrrhenia….But afterwards there occurred violent earthquakes and floods; and in a single day and night of misfortune all your warlike men in a body sank into the earth, and the island of Atlantis in like manner disappeared in the depths of the sea. For which reason the sea in those parts is impassable and impenetrable, because there is a shoal of mud in the way; and this was caused by the subsidence of the island.

Perhaps someday archaeologists will discover evidence of a great flood that destroyed Atlantis just as they have found evidence of the flood depicted in the Epic of Gilgamesh. In late January divers discovered perfectly preserved stone-age tools between 10,000 and 11,000 years old in the Swedish bay of Hanö. Södertörn University’s Björn Nilsson, the leader of the research team, was annoyed by comparisons made in the popular press to Atlantis:

Nilsson admitted that “lousy Swedish tabloids” had blown the story out of the water by labelling the find “Sweden’s Atlantis”, even though the remnants never belonged to an actual village. The people were all nomadic at the time, he explained, so there was no village. He trumpeted, however, that the finds so far were “world-class” and “one-of-a-kind”. He added that it was extremely rare to find evidence from the Stone Age so unspoiled.

We’ll probably never know what caused these nomads to be swept away by floods but we will know what might cover Manhattan under the Atlantic in the not too distant future. We cannot go back in history to change the circumstances that led to such disasters but we can control our own fate in order to save both animals and the human race. For that effort we need to rely on science and radical politics, not the Creator.

Louis Proyect blogs at and is the moderator of the Marxism mailing list. In his spare time, he reviews films for CounterPunch.


Religion Is Good For Your Brain


“Sheila M. Eldred writes in Discovery Magazine that a recent study has found that people at risk of depression were much less vulnerable if they identified as religious. Brain MRIs revealed that religious participants had thicker brain cortices than those who weren’t as religious. ‘One of the worst killers of brain cells is stress,’ says Dr. Majid Fotuhi. ‘Stress causes high levels of cortisol, and cortisol is toxic to the hippocampus. One way to reduce stress is through prayer. When you’re praying and in the zone you feel a peace of mind and tranquility.’ The report concluded that a thicker cortex associated with a high importance of religion or spirituality may confer resilience to the development of depressive illness in individuals at high familial risk for major depression. The social element of attending religious services has also been linked to healthy brains. ‘There’s something magical about socializing,’ says Fotuhi. ‘It releases endorphins in the brain. It’s hard to know whether it’s through religion or a gathering of friends, but it improves brain health in the long term.’”

“Listening to sermons and reading religious works like the Bible may also invoke a cognitive benefit. ‘You’re exercising your higher cortical function, thinking about complex concepts that require some imagination,’ says Harold G. Koenig, director of the Center for Spirituality, Theology, and Health at Duke University and a professor of psychiatry. According to Koenig, the benefits of devout religious practice, particularly involvement in a faith community and religious commitment, are that people cope better. ‘In general, they cope with stress better, they experience greater well-being because they have more hope, they’re more optimistic, they experience less depression, less anxiety, and they commit suicide less often. They don’t drink alcohol as much, they don’t use drugs as much, they don’t smoke cigarettes as much, and they have healthier lifestyles. They have stronger immune systems, lower blood pressure, probably better cardiovascular functioning, and probably a healthier hormonal environment physiologically—particularly with respect to cortisol and adrenaline. And they live longer.’ So where does that leave non-believers? ‘Out of luck, I guess,’ Koenig jokes. ‘Actually, I would suspect that people doing the types of things religious people do, such as socializing and doing similarly complex cognitive tasks, would have similar benefits. But it is interesting that religion provides that whole package of things that people can adopt and pursue over time.’ Dr. Dan Blazer says the study is very interesting but is still exploratory, and that spirituality may be a marker of something else, such as socioeconomic status. ‘It’s hard to study these things,’ concludes Fotuhi. ‘It’s why research has stayed away from them. But there does seem to be a strong link between spirituality and better brain health.’”

Don’t Want to Die? Just Upload Your Brain

March 5, 2014, 11:32 PM

I haven’t seen “Her,” the Oscar-nominated movie about a man who has an intimate relationship with a Scarlett Johansson-voiced computer operating system. I have, however, read Susan Schneider’s “The Philosophy of ‘Her’,” a post on The Stone blog at the New York Times looking into the possibility, in the pretty near future, of avoiding death by having your brain scanned and uploaded to a computer. Presumably you’d want to Dropbox your brain file (yes, you’ll need to buy more storage) to avoid death by hard-drive crash. But with suitable backups, you, or an electronic version of you, could go on living forever, or at least for a very, very long time, “untethered,” as Ms. Schneider puts it, “from a body that’s inevitably going to die.”

This idea isn’t the loopy brainchild of sci-fi hacks. Researchers at Oxford University have been on the path to human digitization for a while now, and way back in 2008 the Future of Humanity Institute at Oxford released a 130-page technical report entitled Whole Brain Emulation: A Roadmap. Of the dozen or so benefits of whole-brain emulation listed by the authors, Anders Sandberg and Nick Bostrom, one stands out:

If emulation of particular brains is possible and affordable, and if concerns about individual identity can be met, such emulation would enable back‐up copies and “digital immortality.”

Scanning brains, the authors write, “may represent a radical new form of human enhancement.”

Hmm. Immortality and radical human enhancement. Is this for real? Yes:

It appears feasible within the foreseeable future to store the full connectivity or even multistate compartment models of all neurons in the brain within the working memory of a large computing system.
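To get a feel for what “the working memory of a large computing system” would have to hold, here is a back-of-envelope sketch. Every number below is a commonly quoted ballpark assumption, not a figure taken from the roadmap itself:

```python
# Rough storage estimate for a connectome-level model of one brain.
# All quantities are ballpark assumptions for illustration only.
neurons = 86e9             # assumed ~86 billion neurons in a human brain
synapses_per_neuron = 7e3  # assumed a few thousand synapses per neuron
bytes_per_synapse = 8      # assumed: connection target plus weight/state

total_bytes = neurons * synapses_per_neuron * bytes_per_synapse
petabytes = total_bytes / 1e15
print(f"about {petabytes:.0f} PB")
```

A few petabytes of connectivity data is enormous by desktop standards but already within reach of a large cluster, which is one reason the authors can call mere storage "feasible" while the harder problems lie in scanning and simulation fidelity.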

Foreseeable future means not in our lifetimes, right? Think again. If you expect to live to 2050 or so, you could face this choice. And your beloved labrador may be ready for upload by, say, 2030:

A rough conclusion would nevertheless be that if electrophysiological models are enough, full human brain emulations should be possible before mid‐century. Animal models of simple mammals would be possible one to two decades before this.

Interacting with your pet via a computer interface (“Hi Spot!”/“Woof!”) wouldn’t be quite the same as rolling around the backyard with him while he slobbers on your face or watching him dash off after a tennis ball you toss into a pond. You might be able to simulate certain aspects of his personality with computer extensions, but the look in his eyes, the cock of his head and the feel and scent of his coat will be hard to reproduce electronically. No longer having to scoop up his messes or feed him heartworm pills would probably not make up for all these limitations. The electro-pet might also make you miss the real Spot unbearably as you try to recapture his consciousness on your home PC.

But what about you? Does the prospect of uploading your own brain allay your fear of abruptly disappearing from the universe? Is it the next best thing to finding the fountain of youth? Ms. Schneider, a philosophy professor at the University of Connecticut, counsels caution. First, she writes, we might find our identity warped in disturbing ways if we pour our brains into massive digital files. She describes the problem via an imaginary guy named Theodore:

If Theodore were to truly upload his mind (as opposed to merely copy its contents), then he could be downloaded to multiple other computers. Suppose that there are five such downloads: Which one is the real Theodore? It is hard to provide a nonarbitrary answer. Could all of the downloads be Theodore? This seems bizarre: As a rule, physical objects and living things do not occupy multiple locations at once. It is far more likely that none of the downloads are Theodore, and that he did not upload in the first place.

This is why the Oxford futurists included the caveat “if concerns about individual identity can be met.” It is the nightmare of infinitely reproducible individuals — a consequence that would, in an instant, undermine and destroy the very notion of an individual.

But Ms. Schneider does not come close to appreciating the extent of the moral failure of brain uploads. She is right to observe an apparent “categorical divide between humans and programs.” Human beings, she writes, “cannot upload themselves to the digital universe; they can upload only copies of themselves — copies that may themselves be conscious beings.” The error here is screamingly obvious: brains are parts of us, but they are not “us.” A brain contains the seed of consciousness, and it is both the bank for our memories and the fount of our rationality and our capacity for language, but a brain without a body is fundamentally different from the human being that possessed both.

It sounds deeply claustrophobic to be housed (imprisoned?) forever in a microchip, unable to dive into the ocean, taste chocolate or run your hands through your loved one’s hair. Our participation in these and infinite other emotive and experiential moments is the bulk of what constitutes our lives, or at least our meaningful lives. Residing forever in the realm of pure thought and memory and discourse doesn’t sound like life, even if it is consciousness. Especially if it is consciousness.

So I cannot agree with Ms. Schneider’s conclusion when she writes that brain uploads may be choiceworthy for the benefits they can bring to our species or for the solace they provide to dying individuals who “wish to leave a copy of [themselves] to communicate with [their] children or complete projects that [they] care about.” It may be natural, given the increasingly virtual lives many of us live in this pervasively Internet-connected world, to think of ourselves mainly in terms of avatars and timelines and handles and digital faces. Collapsing our lives into our brains and offloading the contents of our brains to a supercomputer is a fascinating idea. It does not sound to me, though, like a promising recipe for preserving our humanity.


Sex at the Satan Club

At the height of ’60s counterculture, the sexual revolution found itself an unexpected bedfellow: Devil worship


A still from “Eyes Wide Shut”

A mining expedition in the South American jungle: Edward MacKensie, jealous of his business partner’s lover and wanting to keep the expedition’s riches for himself, engineers an “accident” that kills the partner and his lover. Twenty years later, MacKensie is a rich and successful man, married with a teenage daughter. Despite (or perhaps because of) his wealth and success, MacKensie finds himself bored with life, in particular, his sex life. He pays the office boy and secretary to have sex in front of him, and then cruelly mocks them when they do not perform to his expectations. He searches for hookers who might better understand his peculiar “tastes,” which center on sadistic forms of torture and humiliation, and longs for the Victorian era for the fabled abandon of its sexual underground. “Now there was an era,” he laments to himself, “when a woman like Mrs. Berkeley would earn a thousand pounds for inventing a whipping horse on which a pretty girl could be postured in a thousand different lascivious ways for the lash.” After another humiliating failure with a prostitute, MacKensie meets the mysterious Carlos Sathanas, a worldly, rich sophisticate. Their conversation quickly turns to “unusual pleasures.” “To put it bluntly,” he tells MacKensie, “for all this talk about the new sexual freedom, I for one fail to perceive it except in the huge dissemination of titillatory books and magazines and movies, which are nothing more or less than pure psychic masturbation. They depict fantasies that are not in existence, but perhaps were in another century.” Sathanas confides that he is the founder and sole proprietor of “the Satan Club,” an organization devoted to fulfilling the most bizarre sexual desires of its secret, exclusive membership. MacKensie joins eagerly and soon finds himself participating in a series of increasingly exotic sexual scenarios.

Three weeks into his membership, MacKensie anticipates what promises to be the most provocative show yet, the one that will make him an official member of the Satan Club for life. Encouraged to partake of a very special mixture of Spanish fly—a hallucinatory blend discovered by Sathanas himself—a blindfolded MacKensie is escorted into a basement and strapped into a strange device called “the chair of Tantalus,” guaranteed by Sathanas to enhance his sexual arousal to unprecedented heights. With the blindfold now removed, a curtain parts to reveal two nude women intertwined on a couch. Aroused to the point of physical pain, MacKensie looks down to see there is a collar device attached to his penis making orgasm impossible: the chair of Tantalus! But his horror and despair are only beginning. As the effects of the Spanish fly begin to wane, he recognizes the two women on the couch as his wife and her recently hired personal masseuse. They mock him with contemptuous laughter as their sexual escapades become more intense. Worse yet, his teenage daughter now enters the tableau on all fours, eagerly mounted by the family dog! The agony of arousal and humiliation is overwhelming, and MacKensie begs for release. Calm and collected, Sathanas appears on stage to explain. He is in fact the business partner MacKensie left for dead twenty years ago in the jungle. Having been told of MacKensie’s murderous past and philandering ways, his family now hates him—utterly. All money and property have been transferred to the wife, who plans to divorce him and run away with the masseuse. His daughter no longer has any interest in men, only her beloved German Shepherd. His former partner’s revenge is complete. The show is over. Later, as the lights go up, MacKensie is alone but still strapped into the chair of Tantalus. He realizes the night’s spectacle has unfolded in the basement of his very own Long Island home—of which he is now dispossessed.
Destroyed by material and erotic greed, he stares “unseeingly at that stage where all his life had collapsed about him.”

As a book trading in sexual fantasy, the very “psychic masturbation” so deplored in the text by Sathanas, “The Satan Club” is rather relentless in its emphasis on frustration, failure, and damnation. As one would expect from a “dirty book,” MacKensie’s saga links a number of extended and graphically rendered sexual interludes clearly crafted for the reader’s arousal. Yet the overall structure of the book, despite its “immoral” status as pornography, is strangely, even prudishly moral in its actual execution. We must assume until the very last page that Sathanas is in fact Satan himself, tempting MacKensie’s desire for ever more perverted sexual scenarios in order to take possession of his soul. In any case, MacKensie’s lust does lead to his “damnation,” broke and humiliated in Long Island if not actually burning in hell. Sexually adrift through most of the novel, MacKensie learns a powerful lesson about fantasy and desire, a lesson, in turn, that one would think might prove unsettling to the man who would seek out and buy a copy of “The Satan Club” for his own arousal. What exactly is the pleasure to be had in following the inexorable downward spiral of a man seeking to realize his own sexual fantasies? Moreover, what is gained by situating this prurient yet prudish narrative within the “satanic” conventions of temptation, trickery, and damnation?

“The Satan Club” serves as a reminder that of all the various avenues of morality policed by religion, none absorbs more mental and social energy than sexuality. Innumerable historians of religion, culture, and sexuality have discussed how civilization emerged (at least in part) from the social regulation of unfettered sexual expression, leading in the West to the eventual ascendance of property relations, heteronormative monogamy, and reproductive futurism—as well as all of this social order’s attending “discontents.” Playing on these repressions, Lucifer’s role within modernity has focused most intently on tempting the chaste to overthrow their superego masters, profane their faith, and reclaim forbidden desires and practices, forsaking the stabilizing institution of monogamous reproductive marriage for the entropic energies of “unbridled” lust. In modern fiction, this template is at least as old as J. K. Huysmans’s scandalous account of fin de siècle Satanism, “La Bas” (1891). Huysmans’s narrator, Durtal, a bored author interested in learning more about satanic sects said to be proliferating within the Catholic Church, infiltrates a Black Mass presided over by one of Paris’s most respected priests. Like any good decadent, he assumes the rite will at least be diverting. Attending with his lover—the wife of a rival author—his bemusement turns to horror as the priest “wipes himself” with the Eucharist, women writhe in ecstasy on the floor, and the choirboys “give themselves” to the men.
Escaping this “monstrous pandemonium of prostitutes and maniacs,” Durtal flees with his mistress (a possible succubus) to a seedy hotel, where he is then seduced (seemingly against his will) in a bed “strewn with fragments of hosts.” Satan makes no definitive appearance in “La Bas”—like much nineteenth-century fiction, Huysmans’s realism emphasizes the plausible horrors of clerical contamination over the gothic pyrotechnics of supernatural intervention—but the novel’s interlinking of power, profanity, sexual transgression, and shame remains central to the genre even today.

Published in 1970, “The Satan Club” stands at the threshold of the most recent wave of popular interest in Satanism, one that traces its beginnings to the social transformations of the 1960s, especially the baby boomer alignment of sexual, spiritual, and psychedelic politics attending the so-called hippie counterculture. By the end of the 1960s, “Satanism” assumed an increasingly public identity, traceable in large part to the efforts of Anton Szandor LaVey. Although neither a hippie nor a baby boomer, this former carnie and crime-scene photographer exploited the countercultural currents of San Francisco when he founded the Church of Satan in 1966 (see figure 9.1). Fluent in the art of self-promotion, LaVey garnered international press in founding the church, including pieces in such journalistic mainstays as Time, Life, Look, and McCall’s. LaVey also appeared as a guest on “The Tonight Show” with Johnny Carson and as the devil himself in Roman Polanski’s “Rosemary’s Baby” (1968), a film that ushered in a decade-long wave of satanic fictions. “The Exorcist” (1973), “The Omen” (1976), and their various sequels further mined this vein, as did a made-for-TV movie asking the question: “Look What’s Happened to Rosemary’s Baby?” (1976). By the mid-1970s, Satan had become such big business that Alan Ladd Jr., then president of Fox’s film division, noted that “almost every movie company has five or six Devil movies in the works,” a sentiment echoed by Ned Tanen of MCA: “Devil movies” have “eclipsed the western in popularity all over the world.” The reason, for Tanen, was clear, a logic still invoked to explain any and all trends in moviemaking: “Devil movies play equally well in Japan, Ecuador, and Wisconsin,” he observed. A more “pop” Satan also became a staple of the Christian-publishing industry in this period, most notoriously in the widely read screeds of Hal Lindsey, including “The Late Great Planet Earth” (1970) and “Satan Is Alive and Well on Planet Earth” (1972). 
Long before the “Left Behind” series transformed the Book of Revelation into an epic soap opera, Lindsey scoured the headlines for signs of the antichrist’s arrival and the onset of the apocalypse. Flirtations between rock music and Satanism are well known in this period, from Led Zeppelin guitarist Jimmy Page’s purchase of Aleister “the Beast” Crowley’s Boleskin House to the coded imagery of the Rolling Stones’ album, “Goats Head Soup.” The devil was such a ubiquitous presence in the American popular culture of the 1970s that minister C. S. Lovett even penned a diet book in 1977 under the alarming title: “Help Lord—the Devil Wants Me Fat!” “When you’re watching TV, the commercial break is one of the devil’s favorite moments,” warns Lovett. He then suggests a script for warding off Satan’s “food attacks”: “I know you’re trying to dominate me with food, Satan. So, in the name of Jesus, Go . . . get off my back!”

Beneath this sheen of Hollywood “black horror,” devil rock, and mass-market Satanism, however, lurked another circle of hellish cultural production. Shadowing “mainstream” Satanism was a cycle of sexploitation films, pornographic magazines, and adult paperbacks that—like “The Satan Club”—centered not so much on the gravitas of demon possession, the antichrist, and the apocalypse, but on a more licentious engagement of sexual tourism and erotic experimentation. As the dark overlord of a larger interest in occult sexuality, Satan presided over a ludic proliferation of transgressive temptation and “forbidden” pleasures in adult media of the 1960s and 1970s. Explicit paperbacks of the era promoted Satanism as a nonstop orgy in such titles as “Infernal Affair” (1967), “Devil Sex” (1969), “Sex Slaves of the Black Mass” (1971), and “Satan, Demons, and Dildoes” (1974), to name only a few. At the grind house, sexploitation movie titles also foregrounded the lure of satanic spectacle with such offerings as “The Lucifers” (1971), “Satanic Sexual Awareness” (1972), “Sons of Satan” (1973), “The Horny Devils” (1971, aka “Hotter Than Hell”), and the perhaps inevitable Exorcist knock-off: “Sexorcism Girl” (1975). In the increasingly targeted market for print pornography, magazines such as Sexual Witchcraft and Bitchcraft specialized in provocative images of occultists staging sexualized rituals (“Nudity in Witchcraft! The True Inside Story,” proclaims one banner headline). Even the infamous Ed Wood Jr. threw his hat into the occult-sex ring by appearing (most painfully) in the 1971 cheapie, “Necromania.”

Already a central figure in the West’s psychic economy of sexual prohibition (at least in its religious iterations), the devil’s historical relation to God, religion, and faith made “occult sex” a fundamentally perverse genre, even when tales such as “The Satan Club” ultimately sided with “real-world” explanations over the supernatural. As the Christian embodiment of evil temptation, Satan promised access to any and all sensual pleasures—an invitation to lustful exploration that resonated within the postwar era’s ongoing disarticulation of sex, marriage, and reproduction. And yet, as a product of the authority of religious morality, this eroticized occult could not, by definition, escape the very moral order it sought to evade, undermine, or destroy. Satan (or a surrogate such as Sathanas) is both a saboteur of morality and its most damning enforcer, the ambassador of temptation and the executioner of guilt. Drawing his prey from their moral orbit by appealing to their most base and selfish of desires, Satan—in his supernatural, dialectic relation to God—ultimately reasserts the very repression that a bored MacKensie foolishly believes might be overcome. Such is the essence of “taboo” pleasure—a desire to violate convention and custom that ultimately reaffirms the authority of the law on which the taboo depends. This dynamic made satanic sexploitation a doubly perverse genre—“perverse” in its appetites and its effects. Although such fare offered the lure of ever-more “exotic” sexual adventures, for both protagonist and audience, the horned ambassador of such indulgence demanded nothing less than the sexual adventurer’s eternal soul!

The “Black Pope”

As the author of “The Satanic Bible” and self-appointed spokesman of modern Satanism, LaVey frequently spoke to the press as the authority on Satanism’s history and future—a heritage LaVey often cast in terms of sexual indulgence. “The Satanic Age started in 1966,” LaVey explained. “That’s when God was proclaimed dead, the Sexual Freedom League came into prominence, and the hippies developed as a free sex culture.” Within the sweeping social transformations of the postwar era, LaVey’s brand of Satanism contributed to a significant rewriting of the devil, one that cast Satan more as a dandy or libertine than the Lord of Darkness. This “urbane” Satan was largely a function of growing secularization and new strategies for organizing erotic and social life within the so-called sexual revolution. Hoping to compete with the growing popularity of Hugh Hefner’s Playboy, Stanley Publications introduced Satan magazine in 1957, billing it as “Devilish Entertainment for Men.” The magazine only survived for six issues, leading historian Bethan Benwell to speculate, “There were limits to how far the ‘playboy ethic’ could be pushed.

Perhaps . . . the magazine’s title and allusions flaunted the libertine ideal a little too brazenly.” Certainly, not everyone saw this new sexual “ethic” as progress—satanic or otherwise. “Increasing divorce and desertion and the growth of prenuptial and extramarital sex relations are signs of sex addiction somewhat similar to drug addiction,” accused Pitirim Sorokin in his book “The American Sex Revolution” (1956). It was a claim that has resonated with moral reformers to this very day. Responding to Sorokin, Edwin M. Schur commented in 1964 that many sociologists of the era believed “there really may not have been any startling change in sexual behavior in the very recent years.” Schur located perceptions of a sexual revolution more in an ongoing redefinition of the socioeconomic relationship between the individual and the family, pushing this “revolution” back even further in time by citing Walter Lippmann’s observation in 1929 that once “chaperonage became impossible and the fear of pregnancy was all but eliminated, the entire conventional sex ethic was shattered.” Whether sexual practices were actually changing across the 1950s and 1960s was less important than the widely held perception that more people were having more sex in more “liberated” scenarios. This sense that individual desire, expressed in sexuality and selfishness, had eclipsed familial and social responsibility would remain a core moral debate of the twentieth century, creating the conditions not only for LaVey’s Satanism, but also the Moynihan Report, Thomas Wolfe’s “The Me Decade,” and Christopher Lasch’s “The Culture of Narcissism.”

Promoting the Church of Satan in 1966, LaVey frequently invoked the libertine connotations already attached to such satanic sophistication, even as he attempted to distance his new religion from mere hedonism. Sex might lure converts to the church, but LaVey’s ambitions for his “religion” were more about philosophical empowerment than licentious abandon. In truth, LaVey’s Satanism had little to do with Satan. Although he was never reticent to appear in the trappings of Christianity’s satanic dramaturgy—donning capes, horns, and pentagrams for the camera—LaVey took great pains to divorce his version of Satanism from any actual biblical entity, his devil having more in common with Zarathustra and Ayn Rand than Lucifer the fallen angel. Although aspiring to provide a new philosophy of the mind, LaVey’s background in carnie ballyhoo made him more than willing to hustle some flesh in publicizing the church. An early promotional event involved LaVey booking a San Francisco nightclub to stage an eroticized witches’ Sabbath, a theatrical piece concluding with then stripper and soon-to-be Manson murderer Susan Atkins emerging nude from a coffin. Ever the showman, LaVey sparked another round of national press by performing a satanic wedding ceremony in 1967, complete with a nude redhead serving as the altar. “The altar shouldn’t be a cold unyielding slab of sterile stone,” reasoned LaVey, but “a symbol of enthusiastic lust and indulgence.” He also cultivated a public relationship with sex symbol Jayne Mansfield, leading to the rumors of her conversion to Satanism, amplified in the wake of her untimely and gruesome death in a car accident in the summer of 1967. Yet despite the salacious aspects of the early church (“Phase One . . . 
the nudie stuff,” LaVey would later call it), LaVey also made several attempts to deemphasize the sexual abandon seemingly promised by the “religion,” no doubt to defend against the many “sex criminals” who apparently contacted him just prior to their release from prison in hopes of joining the congregation. Many potential converts, he reported, were disappointed to discover there were no “orgies” in the ceremonies; indeed, LaVey appears to have had only contempt for the type of orgiastic ritual imagined by Huysmans and, according to LaVey, allegedly still practiced in the “amateur” Satanist congregations of Los Angeles (presided over, according to LaVey, by “dirty old men”). The church made no judgment about the morality of any sexual pursuit, advocating “the practice of any type of sexual activity which satisfy man’s individual needs, be it promiscuous heterosexuality, strict faithfulness to a wife or lover, homosexuality, or even fetishism,” in short, “telling each man or woman to do what comes naturally and not to worry about it.” Those looking to affirm their sexual appetites, whatever they might be, were welcome at the church; those actually looking to have sex were not. “There are some beautiful women that belong to the Church,” claimed LaVey, “but they don’t have to come here to get laid. They could go down to any San Francisco bar and get picked up.”

Building on fantasies of libertine conquest and masculine sophistication, LaVey was savvy enough to recognize that one growth market would be sexual empowerment for women. Toward that end, he published “The Compleat Witch” in 1971, a manual teaching women how to seduce or otherwise manipulate men through witchcraft. Writing at the high-water mark of second-wave feminism, LaVey’s advice is strangely prescient of Camille Paglia and other postfeminist provocateurs. “Any bitter and disgruntled female can rally against men, burning up her creative and manipulative energy in the process,” he writes. “She will find the energies she expends in her quixotic cause would be put to more rewarding use, were she to profit by her womanliness by manipulating the men she holds in contempt, while enjoying the ones she finds stimulating.” No doubt such advice was appealing to women hoping to find a strategy for sexual success, and male readers fantasizing that they themselves might become the prey of such “sexual witchcraft.” LaVey’s practical advice for the aspiring witch included such tactics as positive visualization (“Extra Sensory Projection”), “indecent exposure” (showing as much flesh as legally possible—a “power” denied to men, notes LaVey), and not “scrubbing away your natural odors of seduction” (including keeping a swatch of dried menstrual blood in an amulet). As this is a book about witchcraft, LaVey includes some thoughts on the art of “divination,” but even here his comments are more in line with the art of the con than the art of the occult. A woman willing to follow LaVey’s sartorial and psychic program was promised an enhanced sense of personal power over the weak-minded male of the species, the book combining a rather conservative view of feminine seduction with a sexual will to power. Here LaVey put an occult spin on Helen Gurley Brown’s “Sex and the Single Girl,” another book notorious for allegedly empowering women by cultivating their essentialized wiles. 
Indeed, Dodd, Mead’s print campaign for “The Compleat Witch” dubbed it a study of “hex and the single girl,” suggesting the publisher saw the book more as a “relationship” title than a primer in black magic.

LaVey may have had his own detailed ideas about the philosophy of his religion and great ambitions for the future of Satanism, but he ultimately had little control over how the satanic 1960s and 1970s would play in the popular imagination; indeed, much of LaVey’s time as Satanism’s “official” spokesman appears to have been consumed in distancing his church from the atrocities of Satan-linked killers such as Charles Manson, “Nightstalker” Richard Ramirez, and dozens of cat-killing teenage boys in the Midwest—not to mention the general religious competition offered by the Process, the Raelians, the People’s Temple, and California’s other proliferating sects, cults, and “kooks.” Satan may have just been a convenient symbol for LaVey, but Lucifer’s very real presence in the lives of those hoping to either invoke or avoid him made it difficult for LaVey’s more “magical” form of Randian Objectivism to gain traction. Moreover, by building his church’s public facade, not on rock or sand but on images of a devilish libido and fantasies of a guilt-free eroticism, LaVey’s brand of Satanism could not help but be linked to the era’s larger transformations in sexuality, especially among those already intrigued or repulsed by the highly visible growth of various “countercultures” of the 1960s. As a “hot” new scenario promising unlimited sexual action and erotic power, LaVey’s bid to resurrect self-interested materialism became more naughty than Nietzschean, emerging as a prominent subgenre in the era’s developing and increasingly brazen pornography industry.

Excerpted from “Altered Sex: Satan, Acid, and the Erotic Threshold” by Jeffrey Sconce in “Sex Scene: Media and the Sexual Revolution,” edited by Eric Schaefer. Copyright Duke University Press, 2014. All rights reserved.


Whole Foods: America’s Temple of Pseudoscience



Americans get riled up about creationists and climate change deniers, but lap up the quasi-religious snake oil at Whole Foods. It’s all pseudoscience—so why are some kinds of pseudoscience more equal than others?

If you want to write about spiritually motivated pseudoscience in America, you head to the Creation Museum in Kentucky. It’s like a Law of Journalism. The museum has inspired hundreds of book chapters and articles (some of them, admittedly, mine) since it opened up in 2007. The place is like a media magnet. And our nation’s liberal, coastal journalists are so many piles of iron filings.

But you don’t have to schlep all the way to Kentucky in order to visit America’s greatest shrine to pseudoscience. In fact, that shrine is a 15-minute trip away from most American urbanites.

I’m talking, of course, about Whole Foods Market. From the probiotics aisle to the vaguely ridiculous Organic Integrity outreach effort (more on that later), Whole Foods has all the ingredients necessary to give Richard Dawkins nightmares. And if you want a sense of how weird, and how fraught, the relationship between science, politics, and commerce is in our modern world, then there’s really no better place to go. Because anti-science isn’t just a religious, conservative phenomenon—and the way in which it crosses cultural lines can tell us a lot about why places like the Creation Museum inspire so much rage, while places like Whole Foods don’t.

My own local Whole Foods is just a block away from the campus of Duke University. Like almost everything else near downtown Durham, N.C., it’s visited by a predominantly liberal clientele that skews academic, with more science PhDs per capita than a Mensa convention.

Still, there’s a lot in your average Whole Foods that’s resolutely pseudoscientific. The homeopathy section has plenty of Latin words and mathematical terms, but many of its remedies are so diluted that, statistically speaking, they may not contain a single molecule of the substance they purport to deliver. The book section—yep, Whole Foods sells books—boasts many M.D.’s among its authors, along with titles like “The Coconut Oil Miracle” and “Herbal Medicine, Healing, and Cancer,” which was written by a theologian and based on what the author calls the Eclectic Triphasic Medical System.

You can buy chocolate with “a meld of rich goji berries and ashwagandha root to strengthen your immune system,” and bottles of ChlorOxygen chlorophyll concentrate, which “builds better blood.” There’s cereal with the kind of ingredients that are “made in a kitchen—not in a lab,” and tea designed to heal the human heart.

Whenever we talk about science and society, it helps to keep in mind that very few of us are anywhere near rational.

Nearby are eight full shelves of probiotics—live bacteria intended to improve general health. I invited a biologist friend who studies human gut bacteria to come take a look with me. She read the healing claims printed on a handful of bottles and frowned. “This is bullshit,” she said, and went off to buy some vegetables. Later, while purchasing a bag of chickpeas, I browsed among the magazine racks. There was Paleo Living, and, not far away, the latest issue of What Doctors Don’t Tell You. Pseudoscience bubbles over into anti-science. A sample headline: “Stay sharp till the end: the secret cause of Alzheimer’s.” A sample opening sentence: “We like to think that medicine works.”

At times, the Whole Foods selection slips from the pseudoscientific into the quasi-religious. It’s not just the Ezekiel 4:9 bread (its recipe drawn from the eponymous Bible verse), or Dr. Bronner’s Magic Soaps, or Vitamineral Earth’s “Sacred Healing Food.” It’s also, at least for Jewish shoppers, the taboos that have grown up around the company’s Organic Integrity effort, all of which sound eerily like kosher law. There’s a sign in the Durham store suggesting that shoppers bag their organic and conventional fruit separately—lest one rub off on the other—and grind their organic coffees at home—because the Whole Foods grinders process conventional coffee, too, and so might transfer some non-organic dust. “This slicer used for cutting both CONVENTIONAL and ORGANIC breads” warns a sign above the Durham location’s bread slicer. Synagogue kitchens are the only other places in which I’ve seen signs implying that level of food-separation purity.

Look, if homeopathic remedies make you feel better, take them. If the Paleo diet helps you eat fewer TV dinners, that’s great—even if the Paleo diet is probably premised more on The Flintstones than it is on any actual evidence about human evolutionary history. If non-organic crumbs bother you, avoid them. And there’s much to praise in Whole Foods’ commitment to sustainability and healthful foods.

Still: a significant portion of what Whole Foods sells is based on simple pseudoscience. And sometimes that can spill over into outright anti-science (think What Doctors Don’t Tell You, or Whole Foods’ overblown GMO campaign, which could merit its own article). If scientific accuracy in the public sphere is your jam, is there really that much of a difference between Creation Museum founder Ken Ham, who seems to have made a career marketing pseudoscience about the origins of the world, and John Mackey, a founder and CEO of Whole Foods Market, who seems to have made a career, in part, out of marketing pseudoscience about health?

Well, no—there isn’t really much difference, if the promulgation of pseudoscience in the public sphere is, strictly speaking, the only issue at play. By the total lack of outrage over Whole Foods’ existence, and by the total saturation of outrage over the Creation Museum, it’s clear that strict scientific accuracy in the public sphere isn’t quite as important to many of us as we might believe. Just ask all those scientists in the aisles of my local Whole Foods.

So, why do many of us perceive Whole Foods and the Creation Museum so differently? The most common liberal answer to that question isn’t quite correct: namely, that creationists harm society in a way that homeopaths don’t. I’m not saying that homeopathy is especially harmful; I’m saying that creationism may be relatively harmless. In isolation, unless you’re a biologist, your thoughts on creation don’t matter terribly much to your fellow citizens; and unless you’re a physician, your reliance on Sacred Healing Food to cure all ills is your own business.

The danger is when these ideas get tied up with other, more politically muscular ideologies. Creationism often does, of course—that’s when we should worry. But as vaccine skeptics start to prompt public health crises, and GMO opponents block projects that could save lives in the developing world, it’s fair to ask how much we can disentangle Whole Foods’ pseudoscientific wares from very real, very worrying antiscientific outbursts.

Still, we let it off the hook. Why? Two reasons come to mind. The first is that Whole Foods is a for-profit business, while the Creation Museum is the manifestation of an explicitly religious and political movement. For some reason, there’s a special stream of American rage directed at ideological attacks on science that seems to evaporate when the offender is a for-profit corporation. It wasn’t especially surprising that Bill Nye would go and debate Ken Ham; it would have been unusual had he, say, challenged executives at the biotech company Syngenta—which has seemingly been running a smear campaign against a Berkeley biologist—to a conversation about scientific integrity, or challenged Paleo Magazine’s editors to a debate about archaeology. For those of us outside the fundamentalist world, I imagine that the Creation Museum gift shop is the one part of the museum that makes some kind of sense. Well, okay, they’re trying to make money with this stuff. Meanwhile, Whole Foods responds to its customers, as any good business should.

And, second, we often have it stuck in our heads that science communicators have only failed to speak to the religious right. But while issues of science-and-society are always tied up, in some ways, with politics, they’re not bound to any particular part of the spectrum. Just ask Robert F. Kennedy, Jr., liberal political scion and vaccine skeptic extraordinaire, or Prince Charles, who pushed British health ministers to embrace homeopathic medicine.

Bringing sound data into political conversations and consumer decisions is a huge, ongoing challenge. It’s not limited to one side of the public debate. The moral is not that we should all boycott Whole Foods. It’s that whenever we talk about science and society, it helps to keep two rather humbling premises in mind: very few of us are anywhere near rational. And pretty much all of us are hypocrites.

Apocalypses Everywhere

Is There Any Hope in an Era Filled with Gloom and Doom? 
By Ira Chernus

Wherever we Americans look, the threat of apocalypse stares back at us.

Two clouds of genuine doom still darken our world: nuclear extermination and environmental extinction. If they got the urgent action they deserve, they would be at the top of our political priority list.

But they have a hard time holding our attention, crowded out as they are by a host of new perils also labeled “apocalyptic”: mounting federal debt, the government’s plan to take away our guns, corporate control of the Internet, the Comcast-Time Warner mergerocalypse, Beijing’s pollution airpocalypse, the American snowpocalypse, not to speak of earthquakes and plagues. The list of topics, thrown at us with abandon from the political right, left, and center, just keeps growing.

Then there’s the world of arts and entertainment where selling the apocalypse turns out to be a rewarding enterprise. Check out the website “Romantically Apocalyptic,” Slash’s album “Apocalyptic Love,” or the history-lite documentary “Viking Apocalypse” for starters. These days, mathematicians even have an “apocalyptic number.”

Yes, the A-word is now everywhere, and most of the time it no longer means “the end of everything,” but “the end of anything.” Living a life so saturated with apocalypses undoubtedly takes a toll, though it’s a subject we seldom talk about.

So let’s lift the lid off the A-word, take a peek inside, and examine how it affects our everyday lives. Since it’s not exactly a pretty sight, it’s easy enough to forget that the idea of the apocalypse has been a container for hope as well as fear. Maybe even now we’ll find some hope inside if we look hard enough.

A Brief History of Apocalypse

Apocalyptic stories have been around at least since biblical times, if not earlier. They show up in many religions, always with the same basic plot: the end is at hand; the cosmic struggle between good and evil (or God and the Devil, as the New Testament has it) is about to culminate in catastrophic chaos, mass extermination, and the end of the world as we know it.

That, however, is only Act I, wherein we wipe out the past and leave a blank cosmic slate in preparation for Act II: a new, infinitely better, perhaps even perfect world that will arise from the ashes of our present one. It’s often forgotten that religious apocalypses, for all their scenes of destruction, are ultimately stories of hope; and indeed, they have brought it to millions who had to believe in a better world a-comin’, because they could see nothing hopeful in this world of pain and sorrow.

That traditional religious kind of apocalypse has also been part and parcel of American political life since, in Common Sense, Tom Paine urged the colonies to revolt by promising, “We have it in our power to begin the world over again.”

When World War II — itself now sometimes called an apocalypse — ushered in the nuclear age, it brought a radical transformation to the idea. Just as novelist Kurt Vonnegut lamented that the threat of nuclear war had robbed us of “plain old death” (each of us dying individually, mourned by those who survived us), the theologically educated lamented the fate of religion’s plain old apocalypse.

After this country’s “victory weapon” obliterated two Japanese cities in August 1945, most Americans sighed with relief that World War II was finally over. Few, however, believed that a permanently better world would arise from the radioactive ashes of that war. In the 1950s, even as the good times rolled economically, America’s nuclear fear created something historically new and ominous — a thoroughly secular image of the apocalypse.  That’s the one you’ll get first if you type “define apocalypse” into Google’s search engine: “the complete final destruction of the world.” In other words, one big “whoosh” and then… nothing. Total annihilation. The End.

Apocalypse as utter extinction was a new idea. Surprisingly soon, though, most Americans were (to adapt the famous phrase of filmmaker Stanley Kubrick) learning how to stop worrying and get used to the threat of “the big whoosh.” With the end of the Cold War, concern over a world-ending global nuclear exchange essentially evaporated, even if the nuclear arsenals of that era were left ominously in place.

Meanwhile, another kind of apocalypse was gradually arising: environmental destruction so complete that it, too, would spell the end of all life.

This would prove to be brand new in a different way. It is, as Todd Gitlin has so aptly termed it, history’s first “slow-motion apocalypse.” Climate change, as it came to be called, had been creeping up on us “in fits and starts,” largely unnoticed, for two centuries. Since it was so different from what Gitlin calls “suddenly surging Genesis-style flood” or the familiar “attack out of the blue,” it presented a baffling challenge. After all, the word apocalypse had been around for a couple of thousand years or more without ever being associated in any meaningful way with the word gradual.

The eminent historian of religions Mircea Eliade once speculated that people could grasp nuclear apocalypse because it resembled Act I in humanity’s huge stock of apocalypse myths, where the end comes in a blinding instant — even if Act II wasn’t going to follow. This mythic heritage, he suggested, remains lodged in everyone’s unconscious, and so feels familiar.

But in a half-century of studying the world’s myths, past and present, he had never found a single one that depicted the end of the world coming slowly. This means we have no unconscious imaginings to pair it with, nor any cultural tropes or traditions that would help us in our struggle to grasp it.

That makes it so much harder for most of us even to imagine an environmentally caused end to life. The very category of “apocalypse” doesn’t seem to apply. Without those apocalyptic images and fears to motivate us, a sense of the urgent action needed to avert such a slowly emerging global catastrophe lessens.

All of that (plus of course the power of the interests arrayed against regulating the fossil fuel industry) might be reason enough to explain the widespread passivity that puts the environmental peril so far down on the American political agenda. But as Dr. Seuss would have said, that is not all! Oh no, that is not all.

Apocalypses Everywhere

When you do that Google search on apocalypse, you’ll also get the most fashionable current meaning of the word: “Any event involving destruction on an awesome scale; [for example] ‘a stock market apocalypse.’” Welcome to the age of apocalypses everywhere.

With so many constantly crying apocalyptic wolf or selling apocalyptic thrills, it’s much harder now to distinguish between genuine threats of extinction and the cheap imitations. The urgency, indeed the very meaning, of apocalypse continues to be watered down in such a way that the word stands in danger of becoming virtually meaningless. As a result, we find ourselves living in an era that constantly reflects premonitions of doom, yet teaches us to look away from the genuine threats of world-ending catastrophe.

Oh, America still worries about the Bomb — but only when it’s in the hands of some “bad” nation. Once that meant Iraq (even if that country, under Saddam Hussein, never had a bomb and in 2003, when the Bush administration invaded, didn’t even have a bomb program). Now, it means Iran — another country without a bomb or any known plan to build one, but with the apocalyptic stare focused on it as if it already had an arsenal of such weapons — and North Korea.

These days, in fact, it’s easy enough to pin the label “apocalyptic peril” on just about any country one loathes, even while ignoring friends, allies, and oneself. We’re used to new apocalyptic threats emerging at a moment’s notice, with little (or no) scrutiny of whether the A-word really applies.

What’s more, the Cold War era fixed a simple equation in American public discourse: bad nation + nuclear weapon = our total destruction. So it’s easy to buy the platitude that Iran must never get a nuclear weapon or it’s curtains. That leaves little pressure on top policymakers and pundits to explain exactly how a few nuclear weapons held by Iran could actually harm Americans.

Meanwhile, there’s little attention paid to the world’s largest nuclear arsenal, right here in the U.S. Indeed, America’s nukes are quite literally impossible to see, hidden as they are underground, under the seas, and under the wraps of “top secret” restrictions. Who’s going to worry about what can’t be seen when so many dangers termed “apocalyptic” seem to be in plain sight?

Environmental perils are among them: melting glaciers and open-water Arctic seas, smog-blinded Chinese cities, increasingly powerful storms, and prolonged droughts. Yet most of the time such perils seem far away and like someone else’s troubles. Even when dangers in nature come close, they generally don’t fit the images in our apocalyptic imagination. Not surprisingly, then, voices proclaiming the inconvenient truth of a slowly emerging apocalypse get lost in the cacophony of apocalypses everywhere. Just one more set of boys crying wolf and so remarkably easy to deny or stir up doubt about.

Death in Life

Why does American culture use the A-word so promiscuously? Perhaps we’ve been living so long under a cloud of doom that every danger now readily takes on the same lethal hue.

Psychiatrist Robert Lifton predicted such a state years ago when he suggested that the nuclear age had put us all in the grips of what he called “psychic numbing” or “death in life.” We can no longer assume that we’ll die Vonnegut’s plain old death and be remembered as part of an endless chain of life. Lifton’s research showed that the link between death and life had become, as he put it, a “broken connection.”

As a result, he speculated, our minds stop trying to find the vitalizing images necessary for any healthy life. Every effort to form new mental images only conjures up more fear that the chain of life itself is coming to a dead end. Ultimately, we are left with nothing but “apathy, withdrawal, depression, despair.”

If that’s the deepest psychic lens through which we see the world, however unconsciously, it’s easy to understand why anything and everything can look like more evidence that The End is at hand. No wonder we have a generation of American youth and young adults who take a world filled with apocalyptic images for granted.

Think of it as, in some grim way, a testament to human resiliency. They are learning how to live with the only reality they’ve ever known (and with all the irony we’re capable of, others are learning how to sell them cultural products based on that reality). Naturally, they assume it’s the only reality possible. It’s no surprise that “The Walking Dead,” a zombie apocalypse series, is their favorite TV show, since it reveals (and revels in?) what one TV critic called the “secret life of the post-apocalyptic American teenager.”

Perhaps the only thing that should genuinely surprise us is how many of those young people still manage to break through psychic numbing in search of some way to make a difference in the world.

Yet even in the political process for change, apocalypses are everywhere. Regardless of the issue, the message is typically some version of “Stop this catastrophe now or we’re doomed!” (An example: Stop the Keystone XL pipeline or it’s “game over”!) A better future is often implied between the lines, but seldom gets much attention because it’s ever harder to imagine such a future, no less believe in it.

No matter how righteous the cause, however, such a single-minded focus on danger and doom subtly reinforces the message of our era of apocalypses everywhere: abandon all hope, ye who live here and now.

Doom and the Politics of Hope

Significant numbers of Americans still hold on to the hope that comes from the original religious version of the apocalypse. Millions of evangelical Christians seem ready to endure the terrors of the destruction of the planet, in a nuclear fashion or otherwise, because it’s the promised gateway to an infinitely better world. Unfortunately, such a “left behind” culture has produced an eerie eagerness to fight both the final (perhaps nuclear) war with evildoers abroad and the ultimate culture war against sinners at home.

This “last stand” mentality, deeply ingrained in (among others) some uncompromising tea partiers, seems irrational in the extreme to outsiders. It makes perfect sense, however, if you are convinced beyond a scriptural doubt that we’re heading for Armageddon.

A version of plain old apocalypse was once alive on the political left, too, when there was serious talk of a revolution that would tear down the walls and start rebuilding from the ground up. Given the world we face, it may at least be time to bring back the hope for a better future that lay at its heart.

With doom creeping up on us daily in our environmental slow-motion apocalypse, what we may well need now is a slow-motion revolution. Indeed, in the energy sphere it’s already happening. Scientists have shown that renewable sources like sun and wind could provide all the energy humanity needs. Alternative technologies are putting those theories into practice around the globe, just not (yet) on the scale needed to transform all human life.

Perhaps it’s time to make our words and thoughts reflect not just our fears, but the promise of the revolution that is beginning all around us, and that could change in a profound fashion the way we live on (and with) this planet. Suppose we start abiding by this rule: whenever we say the words “Keystone XL,” or talk about any environmental threat, we will follow up with as realistic a vision as we can conjure up of “Act II”: a new world powered solely by renewable sources of energy, free from all carbon-emitting fuels, and inhabited in ingeniously organized new ways.

In an age in which gloom, doom, and annihilation are everywhere, it’s vital to bring genuine hope — the reality, not just the word — back into political life.

Ira Chernus, a TomDispatch regular, is professor of religious studies at the University of Colorado Boulder and author of the online “MythicAmerica: Essays.” He blogs at

Copyright 2014 Ira Chernus

Why I Disrupted the Wisdom 2.0 Conference


February 19, 2014

The organizer behind the demonstration speaks out

Amanda Ream

The invisibility of the crisis in San Francisco right now is reminiscent of that of the AIDS epidemic. To quote from Vito Russo, a founder of the AIDS activist group ACT UP, film historian, and rabble rouser, it’s “like living through a war which is happening only for those people who happen to be in the trenches.” He lived in this city when it was a haven for political radicals, queer people, artists, and immigrants, when it was America’s great city of sanctuary.

“You look around and you discover that you’ve lost more of your friends, but nobody else notices,” he said. “It isn’t happening to them.”

People are not dying, but they’re disappearing every day, from all over the city. The tech industry’s great economic boom is driving a housing crisis, with no-fault evictions increasing 175% since last year. The city doesn’t keep track of how many people live in these apartments, but the Anti-Eviction Mapping Project estimates up to 3580 residents were no-fault evicted in 2013.

I came to San Francisco like generations of people before me because I wanted to find the freedom to live out my ideals. And to practice the dharma—no other city has so many teachers and centers. It’s a great place to find the teachings of the Buddha. The tech industry, Google and Facebook and their peers, have adopted the culture of this place.

Just like the gentrification of a neighborhood where new, wealthy people displace people who have lived there longer, the dharma is undergoing a process of gentrification in San Francisco today. Lost is the bigger picture of the teachings that asks us to consider our interdependence and to move beyond self-help and addressing only our own suffering. The dharma directs us to feel the suffering of others.

The pace of displacement in the city’s Mission District makes whole sections of the neighborhood unrecognizable to people who lived there just a year before. With great respect for Sharon Salzberg, Konda Mason, and Shinzen Young, who taught this year at Wisdom 2.0, I ask the following question about the dharma on display at this conference: To whom is it recognizable?

While members of Eviction Free San Francisco held a banner across the stage, I handed out leaflets to the more than 500 attendees that read, “Thank you for your practice. We invite you to consider the truth behind Google and the tech industry’s impact on San Francisco.”

At a conference like this, our action—a banner and a chant of “San Francisco Not for Sale” on a bullhorn—is only meaningful in the context of the larger movement to keep families in their homes, to save the city and the diversity we love, and to repeal the state law that allows for no-fault evictions, which create conditions for speculators and evictors to run wild for profits. We want to preserve an economically diverse city that works for all of us, not just the tech industry.

When I zipped up that banner in a bag to sneak it into the conference, I thought about the ways this action could contribute to a larger conversation among people of conscience about how to stop this crisis of economic inequality. But like our Mission District neighbors, the activists and the message of Eviction Free San Francisco were disappeared without a word, censored from the livestream of the event. As we were marched out of the hall by angry conference staff, the Google presentation carried on, asking the audience to “check in with their body” about the conflict. No one addressed the issues we were raising, not then or later on in the conference. It was a case study in spiritual bypassing.

It’s almost too easy to point this out at Wisdom 2.0. Most of the workshops offer lifestyle and consumer choices that are meant to help people heal from the harm, emptiness, and unsustainability associated with living under capitalism, but they do so without offering an analysis of where this disconnection comes from. The conference presents an evolution in consciousness of the wealthiest among us as the antidote to suffering rather than the redistribution of wealth and power.

We disrupted Wisdom 2.0 to make visible the struggle of eviction and gentrification that we and our neighbors are facing. The invitation still stands for the organizers, presenters, and attendees of this conference, as well as our new neighbors who work for the companies that put it on, to recognize our demands and engage with these social issues.

Before Google’s talk on corporate mindfulness at Wisdom 2.0, I sat there in my chair, a participant in a centering practice alongside other conference attendees. I felt connected. We were only different from them because we were preparing ourselves to take the stage as uninvited guests in order to ask the question that most needs asking in San Francisco right now: Who is included and who is excluded from this community?

Amanda Ream is a member of Oakland’s East Bay Meditation Center and is in the Dedicated Practitioner’s Program at Spirit Rock. She works as a union organizer in Bernal Heights, San Francisco.


Ancient Church Discovered By Archaeologists In Israel

Including This Remarkable Symbol Of Jesus Christ

February 4, 2014



Archaeologists in Israel have uncovered intricate mosaics on the floor of a 1,500-year-old Byzantine church, including one that bears a Christogram (symbol of Jesus Christ) surrounded by birds.


The Christogram / Davida Eisenberg Degen, Israel Antiquities Authority


The ruins were discovered during a salvage excavation ahead of a construction project in Aluma, a village about 30 miles (50 kilometers) south of Tel Aviv, the Israel Antiquities Authority (IAA) announced Wednesday (Jan. 22). Excavator Davida Eisenberg Degen said the team used an industrial digger to probe a mound at the site, and through a 10-foot (3-meter) hole, they could see the white tiles of an ancient mosaic.


Much of the church was revealed during excavations over the past month. The basilica was part of a local Byzantine settlement, but the archaeologists suspect it also served as a center of Christian worship for neighboring communities because it was next to the main road running between the ancient seaport city of Ashkelon in the west and Beit Guvrin and Jerusalem in the east.


Animal Mosaic / Davida Eisenberg Degen, Israel Antiquities Authority


“Usually a Byzantine village had a church, but the size of this church and its placement on the road makes it more important,” Degen told LiveScience.


Remarkable finds


The excavators plan to keep working on the site for another week, but one of the most remarkable finds so far was a mosaic containing a Christogram, or a “type of monogram of the name of Jesus,” Degen said.


At the time, Byzantine Christians wouldn’t have put crosses on their mosaic floors so as not to step on the symbol of Christ, Degen explained. The Christogram in the mosaic may look like a cross, but it’s actually more like a “chi rho” symbol, which puts together the first two capital letters in the Greek word for Christ, and often looks like an X superimposed on a P. There is an alpha and omega (the first and last letters of the Greek alphabet) on either side of the chi rho, which is another Christian symbol, as Christ was often described as “the beginning and the end.” Four birds also decorate the mosaic, and two of them are holding up a wreath to the top of the chi rho.


Inside the 72-by-39-foot (22-by-12-meter) basilica, archaeologists also found marble pillars and an open courtyard paved with a white mosaic floor, said Daniel Varga, director of the IAA’s excavations.


Just off the courtyard, in the church’s narthex, or lobby area, there is “a fine mosaic floor decorated with colored geometric designs” as well as a “twelve-row dedicatory inscription in Greek containing the names ‘Mary’ and ‘Jesus’, and the name of the person who funded the mosaic’s construction,” Varga said in a statement.


The 1,500-year-old mosaics / Photo: Yoli Shwartz, Israel Antiquities Authority


The mosaics in the main hall, or nave, meanwhile, are decorated with vine tendrils in the shape of 40 medallions, one of which contains the Christogram. Many of the other medallions contain botanical designs and animals such as zebras, peacocks, leopards, and wild boars, the excavators said. Three contain inscriptions commemorating two heads of the local regional church named Demetrios and Herakles.


The IAA plans to remove the mosaic for display at a regional museum or visitors’ center; the rest of the site will be covered back up.